View Issue Details

ID: 0003402    Project: OpenFOAM    Category: Bug    View Status: public    Last Update: 2019-12-10 10:30
Reporter: rucky96    Assigned To: will
Status: closed    Resolution: suspended
Platform: Dell Vostro 15 3558    OS: Microsoft Windows 10 Pro    OS Version: 10.0.18362
Product Version: 7
Fixed in Version: (none)
Summary: 0003402: Bad parallelization. The values at the boundary of each subdomain are not shared at all.
Description: I was trying to solve the problem of a magnetically confined conductive fluid in a periodic cylinder. A serial simulation gives reasonable results, but when I run the simulation on 2 processors one can distinguish 2 zones in the cylinder (one per processor); with 3 processors one can distinguish 3 zones, and the same holds for 4.

I suppose that there is a problem in the communication between the cells at the boundaries of each processor's zone. I am using the scotch decomposition method.
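The suspected failure mode, subdomains advancing without exchanging their boundary (halo) values, can be illustrated outside OpenFOAM. The following is a hypothetical Python/NumPy sketch, not OpenFOAM code: a 1D periodic diffusion solve split across two fake "processors", run with and without halo exchange. Without the exchange, each subdomain evolves on its own and artefacts appear at the subdomain boundaries, exactly the symptom described above.

```python
import numpy as np

def serial(u, nsteps):
    # explicit diffusion on the full periodic domain (reference result)
    for _ in range(nsteps):
        u = u + 0.4 * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))
    return u

def step_sub(u, left, right):
    # one explicit step on a subdomain, given halo values from the neighbour
    ul = np.concatenate(([left], u[:-1]))
    ur = np.concatenate((u[1:], [right]))
    return u + 0.4 * (ul - 2.0 * u + ur)

def parallel(u0, nsteps, exchange):
    # split the periodic domain into two "processor" subdomains
    n = u0.size // 2
    a, b = u0[:n].copy(), u0[n:].copy()
    for _ in range(nsteps):
        if exchange:
            # correct halo exchange across both processor interfaces
            a_new = step_sub(a, left=b[-1], right=b[0])
            b_new = step_sub(b, left=a[-1], right=a[0])
        else:
            # broken communication: each rank reuses its own edge values,
            # so artefacts develop at the subdomain boundaries
            a_new = step_sub(a, left=a[0], right=a[-1])
            b_new = step_sub(b, left=b[0], right=b[-1])
        a, b = a_new, b_new
    return np.concatenate((a, b))
```

With `exchange=True` the split run reproduces the serial run exactly; with `exchange=False` it diverges at the interfaces, mimicking the per-processor zones visible in the attached pictures.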

I tried changing the decomposition method and adding a correctBoundaryConditions() call in the equation for B in a modified mhdFoam.
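For reference, the decomposition method is selected in system/decomposeParDict. A minimal sketch for the scotch run described above (the usual FoamFile header is omitted; scotch needs no further method-specific coefficients):

```
// system/decomposeParDict (minimal sketch)
numberOfSubdomains  4;
method              scotch;
```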

I was using a modified mhdFoam which takes into account an external B. But the same issue appears when using the standard mhdFoam (with external B = 0), so it is not a problem with the new solver.
Steps To Reproduce: Run wmake on cyIniB and modifiedMhdFoam, then execute
$ foamJob mpirun -np 4 modifiedMhdFoam -parallel &
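Spelled out, and assuming a sourced OpenFOAM environment plus a case directory containing the custom cyIniB and modifiedMhdFoam sources (names taken from the report; the exact paths are assumptions), the reproduction sequence would look roughly like:

```
# build the custom initialisation utility and solver
wmake cyIniB
wmake modifiedMhdFoam

# decompose the case, then run the solver in parallel on 4 processors
decomposePar
foamJob mpirun -np 4 modifiedMhdFoam -parallel &

# after the run, reassemble the fields for post-processing
reconstructPar
```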

Additional Information: If you want to see the same issue with the standard mhdFoam, just add the sigma, rho and mu parameters and change the units for B and pB.
Tags: No tags attached.



2019-11-30 15:41


2p.png (84,976 bytes)
3p.png (94,221 bytes)
(36,818 bytes)


2019-11-30 18:46

manager   ~0010946

Try running the tutorials/electromagnetics/mhdFoam/hartmann case in parallel and report the results. I just ran it on 4 processors and reconstructed and the results look the same as the serial run.


2019-12-01 01:47

reporter   ~0010947

Hi henry,

Thanks for answering.

I did not find any problem with the tutorial case, nor in a case of a periodic cubic domain or a toroid, but I have seen differences in the cylinder (as you can see from the pictures). It seems as if each subdomain were solving its region without sharing information. I tried with a mesh made with blockMeshDict and one made with snappyHexMeshDict, and the results are the same. Have you run the case and seen the problems?



2019-12-01 09:11

manager   ~0010948

Can you provide a simple mhdFoam case which reproduces the problem?


2019-12-01 16:10

reporter   ~0010949

Hi henry,

At the 0.1 write time you should see something like the pattern in the picture (Uz scaled from -0.1 to 0.1); it does not appear in a serial run. The cyclicAMI patch is also working very poorly (neither in serial nor in parallel).

Regards.

(29,279 bytes)
bug2.png (158,584 bytes)
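For context on the cyclicAMI remark above: a cyclic patch couples a conformal (face-matching) pair exactly, while cyclicAMI interpolates between non-conformal patches and is therefore not exactly conservative. A hedged sketch of the constant/polyMesh/boundary entries involved (the patch names here are assumptions, not taken from the case):

```
periodic_half0                     // hypothetical patch name
{
    type            cyclicAMI;     // non-conformal: interpolated coupling
    neighbourPatch  periodic_half1;
}
```

With a conformal mesh, `type cyclic;` with the same `neighbourPatch` entry gives an exact coupling instead.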


2019-12-01 16:53

manager   ~0010950

Can you provide a SIMPLE mhdFoam case which reproduces the problem?


2019-12-01 17:23

reporter   ~0010951

Hi henry,

What counts as simple for you? As I said, I tried it in a cubic domain and saw nothing strange (nor in a toroidal domain). It is only in this specific geometry that I found this strange behavior. I tried two different meshes and both show the bad parallelization.

This is all I know, I have nothing else.


2019-12-01 17:26

manager   ~0010952

A simple geometry with the case setup so that it can just be run with mhdFoam rather than requiring a complex meshing and initialization sequence.
As far as I can see there is nothing wrong with mhdFoam in either serial or parallel; the problems are with your case setup and/or initialisation procedure and/or your specific solver, for which you will need to arrange user support.


2019-12-03 16:28

guin (reporter)   ~0010953


In case you don't have time to figure out how to generate cylindrical geometries with blockMesh, here is one that should resemble your geometry:


2019-12-05 11:11

reporter   ~0010960

Hi guin,

Thank you very much for trying to help. Unfortunately, one of the meshes I was testing is already like the one in the link, but made with an m4 file. Anyway, thanks for the try :)


2019-12-05 11:13

reporter   ~0010961

Hi henry,

If mhdFoam only works on simple meshes, and everything that is not a box fails in parallel, then it is not a code suitable for high performance simulations, which is what I was trying to demonstrate in my thesis.


2019-12-05 11:14

reporter   ~0010962

**high performance simulations oriented to plasma confinement


2019-12-06 12:43

manager   ~0010966

Last edited: 2019-12-06 12:45

View 2 revisions

We are not saying that "because it works on a box" everything is fine. We are saying that your case is complicated and there are many reasons why you might be getting the poor results that you are. It might be a bug, or it might just be an error that you have made in the case setup. Your report does not narrow it down. Note: "Reports that indicate a significant possibility of user error will be closed or deleted."

If we were to investigate this we would begin by simplifying the case as much as possible whilst retaining the erroneous behaviour. That way all the possibilities that are not responsible for the problem get removed and we can identify the root cause. That takes a lot of time, and you are not paying us, so you have to do it.


2019-12-06 12:44

manager   ~0010967

Last edited: 2019-12-06 12:46

View 2 revisions

For reference, I think the problem is that you are simulating a closed domain (i.e., no pressure boundaries) with AMI, which is non-conservative. The pressure system (and/or the pB system) won't be able to solve this accurately, and that's why you get all the noise around the AMI and the processor interfaces. It may be more apparent in a parallel simulation, but I'd bet the serial runs aren't valid either. I also think that the pB system will need pressure referencing, something mhdFoam does not currently support.

All of that is supposition. You will have to investigate whether it's correct or not, or pay someone to do so.
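The pressure-referencing point can be illustrated outside OpenFOAM: in a closed domain with no pressure boundaries, the discrete Poisson matrix has pure Neumann conditions and is singular (any constant can be added to the solution) until one cell is pinned to a reference value. A minimal NumPy sketch, not OpenFOAM code:

```python
import numpy as np

# 1D Laplacian with zero-gradient (Neumann) conditions at both ends,
# as in a closed domain with no pressure boundaries
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = -2.0
    if i > 0:
        A[i, i - 1] = 1.0
    if i < n - 1:
        A[i, i + 1] = 1.0
A[0, 0] = -1.0     # Neumann end
A[-1, -1] = -1.0   # Neumann end

# singular: the constant vector is in the null space (every row sums to zero)
print(abs(np.linalg.det(A)))       # ~0

# "pressure referencing": replace one row to pin cell 0 to a reference value
A_ref = A.copy()
A_ref[0, :] = 0.0
A_ref[0, 0] = 1.0
print(abs(np.linalg.det(A_ref)))   # non-zero: the system is now solvable
```

This is what setting a reference cell and value does for a pressure equation in general; whether that is the actual fix for mhdFoam's pB system is, as the note says, supposition.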


2019-12-07 17:35

reporter   ~0010971

Hi will,

First of all, thank you for answering. Yes, I understand you. I am just saying that I was not able to see that noise at the boundaries of the subdomains in a simpler geometry, like a box.

Following your advice I used a cyclic patch (not cyclicAMI) with a mesh made with blockMesh (similar to that of @guin). Since you said that the "pB system will need pressure referencing, something mhdFoam does not currently support", I changed the solver to one based on Elsasser variables; this means I am just solving two equations completely analogous to the momentum equation, reusing the PISO algorithm. So the pB problem is now removed.
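For reference, a sketch of the standard incompressible MHD formulation being alluded to (taken from the textbook form, not from the solver's source): the Elsasser variables combine velocity and the Alfvén-normalised magnetic field, and the momentum and induction equations collapse into two momentum-like equations that a PISO loop can handle without a separate pB equation:

```latex
z^{\pm} = \mathbf{u} \pm \frac{\mathbf{B}}{\sqrt{\mu_0 \rho}},
\qquad
\frac{\partial z^{\pm}}{\partial t}
  + \left( z^{\mp} \cdot \nabla \right) z^{\pm}
  = -\nabla p^{*}
  + \nu^{+} \nabla^{2} z^{\pm}
  + \nu^{-} \nabla^{2} z^{\mp}
```

where p* is the total (thermal plus magnetic) pressure divided by density and nu± = (nu ± eta)/2 combine the viscosity and the magnetic diffusivity.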

The solver with Elsasser variables gives the same result as mhdFoam in the cube (a good sign) and sadly also in the cylinder, so I still see the noise near the limits of the subdomains. Therefore, I do not believe that cyclicAMI or the reference cell of pB was the problem.

I would leave this thread open because I will continue to look for the solution, and if someone finds it before me, welcome. Also, it may turn out not to be a bug, just something specific to my case.


2019-12-10 10:30

manager   ~0010976

Last edited: 2019-12-10 10:30

View 2 revisions

This is not a forum for you to post your queries for others to investigate; if that is your need, I suggest you look elsewhere. This is a bug reporting system, and a bug in a released version of OpenFOAM has thus far not been identified, so I am closing the report.

If you do figure out what the cause of the issue is, and you can demonstrate it to be a bug in a version of OpenFOAM released by the Foundation, then you will be entirely justified in opening another report.

Issue History

Date Modified Username Field Change
2019-11-30 15:41 rucky96 New Issue
2019-11-30 15:41 rucky96 File Added: 2p.png
2019-11-30 15:41 rucky96 File Added: 3p.png
2019-11-30 15:41 rucky96 File Added:
2019-11-30 18:46 henry Note Added: 0010946
2019-12-01 01:47 rucky96 Note Added: 0010947
2019-12-01 09:11 henry Note Added: 0010948
2019-12-01 16:10 rucky96 File Added:
2019-12-01 16:10 rucky96 File Added: bug2.png
2019-12-01 16:10 rucky96 Note Added: 0010949
2019-12-01 16:53 henry Note Added: 0010950
2019-12-01 17:23 rucky96 Note Added: 0010951
2019-12-01 17:26 henry Note Added: 0010952
2019-12-03 16:28 guin Note Added: 0010953
2019-12-05 11:11 rucky96 Note Added: 0010960
2019-12-05 11:13 rucky96 Note Added: 0010961
2019-12-05 11:14 rucky96 Note Added: 0010962
2019-12-06 12:43 will Note Added: 0010966
2019-12-06 12:44 will Note Added: 0010967
2019-12-06 12:45 will Note Edited: 0010966 View Revisions
2019-12-06 12:46 will Note Edited: 0010967 View Revisions
2019-12-07 17:35 rucky96 Note Added: 0010971
2019-12-10 10:30 will Assigned To => will
2019-12-10 10:30 will Status new => closed
2019-12-10 10:30 will Resolution open => suspended
2019-12-10 10:30 will Note Added: 0010976
2019-12-10 10:30 will Note Edited: 0010976 View Revisions