View Issue Details

ID: 0001868    Project: OpenFOAM    Category: [All Projects] Bug    View Status: public    Last Update: 2015-10-17 20:25
Reporter: user1248    Assigned To: henry
Priority: normal    Severity: crash    Reproducibility: always
Status: resolved    Resolution: fixed
Platform: x86_64    OS: GNU/Linux Debian    OS Version: Jessie 8.2
Product Version:
Fixed in Version:
Summary: 0001868: boundaryFoam in parallel
Description

Errors occur when running the solver boundaryFoam in parallel:

If I decompose the case into more than 2 domains, the solver crashes. With a 2-domain decomposition, the solver runs.


mpirun -np 4 boundaryFoam -parallel
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.4.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.4.0-dcea1e13ff76
Exec : boundaryFoam -parallel
Date : Oct 15 2015
Time : 11:00:54
Host : "xxxxxxxxxxxx"
PID : 1230
Case : /.xxxxxxxxxxxxxxxxxxxx/boundaryLaunderSharma
nProcs : 4
Slaves :
3
(
"xxxxxxxxxxx.1231"
"xxxxxxxxxxx.1232"
"xxxxxxxxxxx.1233"
)

Pstream initialized with:
    floatTransfer : 0
    nProcsSimpleSum : 0
    commsType : nonBlocking
    polling iterations : 0
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field U

Creating face flux

Selecting incompressible transport model Newtonian
Selecting RAS turbulence model LaunderSharmaKE
LaunderSharmaKECoeffs
{
    Cmu 0.09;
    C1 1.44;
    C2 1.92;
    sigmaEps 1.3;
}

Generating wall data for patch: lowerWall
    Height to first cell centre y0 = 2.80937e-05

Starting time loop

[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] No wall patches identified
[1]
[1] From function boundaryFoam
[1] in file interrogateWallPatches.H at line 57.
[1]
FOAM parallel run exiting
[1]
[2]
[2]
[2] --> FOAM FATAL ERROR:
[2] No wall patches identified
[2]
[2] From function boundaryFoam
[2] in file interrogateWallPatches.H at line 57.
[2]
FOAM parallel run exiting
[2]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Time = 1

--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 1232 on
node xxxxxxxxxx exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[xxxxxxxxxxx:01229] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[xxxxxxxxxxx:01229] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Steps To Reproduce

source openfoam_bashrc

cp -r $FOAM_TUTORIALS/incompressible/boundaryFoam/boundaryLaunderSharma .
cd boundaryLaunderSharma
blockMesh

(use the attached decomposeParDict in the system directory)

decomposePar
mpirun -np 4 boundaryFoam -parallel



Additional Information

The decomposeParDict file:

/*--------------------------------*- C++ -*----------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.7.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version 2.0;
    format ascii;
    class dictionary;
    location "system";
    object decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 4;

method scotch;
preservePatches (front back);
Tags: No tags attached.

Activities

wyldckat

2015-10-15 11:21

updater   ~0005405

AFAIK, boundaryFoam is mostly a proof-of-concept solver, designed to generate a 2D flow profile for testing that a turbulence model and wall treatment are working as intended.
It's fairly hard-coded to work as a single process and, as far as I can remember, making it work in parallel isn't straightforward.

One workaround is to use the "simple" or "hierarchical" decomposition method and make sure that every processor has access to the reference wall. For example, in the "boundaryLaunderSharma" case, divide the domain only along the length of the wall/channel.
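
As an illustration of this workaround, a minimal decomposeParDict sketch using the "simple" method is shown below. It assumes the channel length runs along the x axis and the wall-normal direction is y, so the split is along x only and every processor keeps a piece of the lowerWall patch; the direction counts in n would need to be adapted to the actual mesh orientation:

// Hypothetical decomposeParDict fragment (sketch, not verified against
// the tutorial geometry): split only along the channel length so that
// each of the 4 subdomains still touches the reference wall patch.
numberOfSubdomains 4;

method simple;

simpleCoeffs
{
    n       (4 1 1);   // 4 subdomains along x, no splitting in y or z
    delta   0.001;
}

With such a decomposition, checkMesh or a quick look at each processor*/constant/polyMesh boundary file would confirm that lowerWall appears in every subdomain.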

henry

2015-10-17 20:24

manager   ~0005421

Currently boundaryFoam is set up for serial operation only. Given that it is a 1D solver, parallelization would not be beneficial unless VERY high resolution is used. It would be possible to write a parallel version, but that is quite a bit of work. If this is a particular requirement, you could have a go yourself or fund us to write the solver for you.

To avoid further confusion I have explicitly removed the 'parallel' option from boundaryFoam:

OpenFOAM-dev: commit 18376369645f3cd7784c0e11c7818ef0294bf65c

Issue History

Date Modified       Username   Field          Change
2015-10-15 10:27    user1248   New Issue
2015-10-15 11:21    wyldckat   Note Added:    0005405
2015-10-17 20:24    henry      Note Added:    0005421
2015-10-17 20:24    henry      Status         new => resolved
2015-10-17 20:24    henry      Resolution     open => fixed
2015-10-17 20:24    henry      Assigned To    => henry