/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  5.x                                   |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 5.x-3fe7aa77e620
Exec   : pimpleFoam -parallel
Date   : Aug 28 2017
Time   : 10:22:23
Host   : "node144.service"
PID    : 13620
I/O    : uncollated
Case   : /beegfs/testcase/IOTestCase
nProcs : 4
Slaves :
3
(
"node144.service.13621"
"node144.service.13622"
"node144.service.13623"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

Create time

Overriding DebugSwitches according to controlDict
    decomposedBlockData 0;
    OFstreamCollator 1;
Overriding OptimisationSwitches according to controlDict
    maxThreadFileBufferSize 2e+09;
    maxMasterFileBufferSize 2e+09;
Overriding fileHandler to collated
I/O    : collated (maxThreadFileBufferSize 2e+09)
         Threading activated since maxThreadFileBufferSize > 0.
         Requires thread support enabled in MPI, otherwise the simulation
         may "hang". If thread support cannot be enabled, deactivate threading
         by setting maxThreadFileBufferSize to 0 in $FOAM_ETC/controlDict
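For reference, the override messages above correspond to controlDict entries
along these lines (a minimal sketch reconstructed purely from this log; whether
the entries live in the case's system/controlDict or in $FOAM_ETC/controlDict
may differ on a given installation):

DebugSwitches
{
    decomposedBlockData     0;
    OFstreamCollator        1;      // enables the verbose collator output seen below
}

OptimisationSwitches
{
    fileHandler             collated;
    maxThreadFileBufferSize 2e9;    // > 0 activates the threaded collated writer
    maxMasterFileBufferSize 2e9;
    // Per the warning above: set maxThreadFileBufferSize to 0 to deactivate
    // threading if MPI thread support cannot be enabled.
}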
"/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" to thread [0] OFstreamCollator : Started write thread [0] OFstreamCollator : Writing 12268085 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [1] OFstreamCollator : Started write thread [1] OFstreamCollator : Writing 12544682 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [3] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" to thread [3] OFstreamCollator : Started write thread [3] OFstreamCollator : Writing 13028027 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [1] OFstreamCollator : Finished writing 12544682 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [1] OFstreamCollator : Exiting write thread [2] OFstreamCollator : Finished writing 11562830 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [2] OFstreamCollator : Exiting write thread [3] OFstreamCollator : Finished writing 13028027 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [3] OFstreamCollator : Exiting write thread [0] OFstreamCollator : Finished writing 12268085 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/nut" using comm 1 [0] OFstreamCollator : Exiting write thread [3] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" to thread [3] OFstreamCollator : Started write thread [3] OFstreamCollator : Writing 43239935 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [1] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" to thread [0] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" to thread [1] OFstreamCollator : Started write thread [1] OFstreamCollator : Writing 43544183 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [0] OFstreamCollator : Started write thread [0] OFstreamCollator : Writing 43622996 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [2] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" to thread [2] OFstreamCollator : Started write thread [2] OFstreamCollator : Writing 43475086 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [1] OFstreamCollator : Finished writing 43544183 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [1] OFstreamCollator : Exiting write thread [2] OFstreamCollator : Finished writing 43475086 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [2] OFstreamCollator : Exiting write thread [3] OFstreamCollator : Finished writing 43239935 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [3] OFstreamCollator : Exiting write thread [0] OFstreamCollator : Finished writing 43622996 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/U" using comm 1 [0] OFstreamCollator : Exiting write thread [3] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" to thread [2] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" to thread [3] OFstreamCollator : Started write thread [3] OFstreamCollator : Writing 14176288 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1 [2] OFstreamCollator : Started write thread [2] OFstreamCollator : Writing 14726717 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1 [1] OFstreamCollator 
[1] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" to thread
[1] OFstreamCollator : Started write thread
[1] OFstreamCollator : Writing 14614333 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1
[0] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" to thread
[0] OFstreamCollator : Started write thread
[0] OFstreamCollator : Writing 14843280 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1
[1] OFstreamCollator : Finished writing 14614333 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1
[1] OFstreamCollator : Exiting write thread
[2] OFstreamCollator : Finished writing 14726717 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1
[2] OFstreamCollator : Exiting write thread
[3] OFstreamCollator : Finished writing 14176288 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1
[3] OFstreamCollator : Exiting write thread
[0] OFstreamCollator : Finished writing 14843280 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/k" using comm 1
[0] OFstreamCollator : Exiting write thread
[2] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" to thread
[2] OFstreamCollator : Started write thread
[2] OFstreamCollator : Writing 51180546 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" using comm 1
[1] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" to thread
[1] OFstreamCollator : Started write thread
[1] OFstreamCollator : Writing 52081538 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" using comm 1
[3] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" to thread
[3] OFstreamCollator : Started write thread
[3] OFstreamCollator : Writing 51714076 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" using comm 1
[2] #0  Foam::error::printStack(Foam::Ostream&) at ??:?
[0] OFstreamCollator : relaying write of "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" to thread
--------------------------------------------------------------------------
A process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          [[4098,1],2] (PID 13622)

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[0] OFstreamCollator : Started write thread
[0] OFstreamCollator : Writing 52132478 bytes to "/beegfs/testcase/IOTestCase/processors/1.2e-05/phi" using comm 1
[2] #1  Foam::sigSegv::sigHandler(int) at ??:?
[2] #2  ? in "/usr/lib64/libc.so.6"
[2] #3  ? at pml_cm.c:?
[2] #4  ompi_mtl_psm_progress at ??:?
[2] #5  opal_progress in "/cluster/mpi/gcc/openmpi/2.1.1/lib/libopen-pal.so.20"
[2] #6  sync_wait_mt in "/cluster/mpi/gcc/openmpi/2.1.1/lib/libopen-pal.so.20"
[2] #7  ? at pml_cm.c:?
[2] #8  MPI_Recv in "/cluster/mpi/gcc/openmpi/2.1.1/lib/libmpi.so.20"
[2] #9  Foam::UIPstream::read(Foam::UPstream::commsTypes, int, char*, long, int, int) at ??:?
[2] #10  void Foam::Pstream::gatherList<int>(Foam::List<Foam::UPstream::commsStruct> const&, Foam::List<int>&, int, int) at ??:?
[2] #11  Foam::decomposedBlockData::writeBlocks(int, Foam::autoPtr<Foam::OSstream>&, Foam::List<long>&, Foam::UList<char> const&, Foam::UPstream::commsTypes, bool) at ??:?
[2] #12  Foam::OFstreamCollator::writeFile(int, Foam::word const&, Foam::fileName const&, Foam::string const&, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType, bool) at ??:?
[2] #13  Foam::OFstreamCollator::writeAll(void*) at ??:?
[2] #14  ? in "/usr/lib64/libpthread.so.0"
[2] #15  clone in "/usr/lib64/libc.so.6"
[node144:13622] *** Process received signal ***
[node144:13622] Signal: Segmentation fault (11)
[node144:13622] Associated errno: Unknown error 11085 (11085)
[node144:13622] Signal code:  (0)
[node144:13622] Failing at address: (nil)
[node144:13622] [ 0] /usr/lib64/libc.so.6(+0x35250)[0x2b4d35fc5250]
[node144:13622] [ 1] /usr/lib64/libc.so.6(gsignal+0x37)[0x2b4d35fc51d7]
[node144:13622] [ 2] /usr/lib64/libc.so.6(+0x35250)[0x2b4d35fc5250]
[node144:13622] [ 3] /cluster/mpi/gcc/openmpi/2.1.1/lib/openmpi/mca_pml_cm.so(+0x2405)[0x2b4d490b1405]
[node144:13622] [ 4] /cluster/mpi/gcc/openmpi/2.1.1/lib/openmpi/mca_mtl_psm.so(ompi_mtl_psm_progress+0x75)[0x2b4d496e9d05]
[node144:13622] [ 5] /cluster/mpi/gcc/openmpi/2.1.1/lib/libopen-pal.so.20(opal_progress+0x5c)[0x2b4d3a0e2c8c]
[node144:13622] [ 6] /cluster/mpi/gcc/openmpi/2.1.1/lib/libopen-pal.so.20(sync_wait_mt+0xc5)[0x2b4d3a0e7e35]
[node144:13622] [ 7] /cluster/mpi/gcc/openmpi/2.1.1/lib/openmpi/mca_pml_cm.so(+0x275c)[0x2b4d490b175c]
[node144:13622] [ 8] /cluster/mpi/gcc/openmpi/2.1.1/lib/libmpi.so.20(MPI_Recv+0x175)[0x2b4d3882ff85]
[node144:13622] [ 9] /cluster/engineering/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so(_ZN4Foam9UIPstream4readENS_8UPstream10commsTypesEiPclii+0x1b5)[0x2b4d363557c5]
[node144:13622] [10] /cluster/engineering/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4Foam7Pstream10gatherListIiEEvRKNS_4ListINS_8UPstream11commsStructEEERNS2_IT_EEii+0xee)[0x2b4d3227964e]
[node144:13622] [11] /cluster/engineering/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam19decomposedBlockData11writeBlocksEiRNS_7autoPtrINS_8OSstreamEEERNS_4ListIlEERKNS_5UListIcEENS_8UPstream10commsTypesEb+0x1f9)[0x2b4d34d658e9]
[node144:13622] [12] /cluster/engineering/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam16OFstreamCollator9writeFileEiRKNS_4wordERKNS_8fileNameERKNS_6stringENS_8IOstream12streamFormatENSA_13versionNumberENSA_15compressionTypeEb+0xb7)[0x2b4d34c9f577]
[node144:13622] [13] /cluster/engineering/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam16OFstreamCollator8writeAllEPv+0xbb)[0x2b4d34c9f93b]
[node144:13622] [14] /usr/lib64/libpthread.so.0(+0x7dc5)[0x2b4d38acbdc5]
[node144:13622] [15] /usr/lib64/libc.so.6(clone+0x6d)[0x2b4d3608776d]
[node144:13622] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 13622 on node node144 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
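The backtrace shows rank 2 segfaulting inside MPI_Recv, reached from the
OFstreamCollator write thread via Pstream::gatherList and
decomposedBlockData::writeBlocks, i.e. MPI is being driven from a second
thread. That is exactly the situation the startup warning cautions about: the
threaded collated writer needs an MPI build with full thread support
(MPI_THREAD_MULTIPLE), and the log's own stated fallback is to set
maxThreadFileBufferSize to 0. One way to check what this Open MPI 2.1.1 build
actually provides is a small standalone program; the sketch below uses only
the standard MPI API (nothing OpenFOAM-specific, and the file name is
arbitrary). Compile with mpicxx and run it under the same mpirun used for
the case:

// check_mpi_threads.cpp -- query the MPI thread-support level.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv)
{
    int provided = 0;

    // Request full multithreaded support; 'provided' reports the level the
    // library can actually grant, which may be lower than requested.
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
    {
        if (provided >= MPI_THREAD_MULTIPLE)
        {
            std::printf("MPI_THREAD_MULTIPLE available: "
                        "threaded collated I/O should be usable\n");
        }
        else
        {
            std::printf("MPI_THREAD_MULTIPLE NOT available: "
                        "deactivate threading via maxThreadFileBufferSize 0\n");
        }
    }

    MPI_Finalize();
    return 0;
}

Note that for the Open MPI 2.x series, MPI_THREAD_MULTIPLE support typically
has to be compiled in when Open MPI itself is configured
(--enable-mpi-thread-multiple); if the check above reports a lower level,
rebuilding MPI or disabling the collator's write thread are the two options
the log's startup warning describes.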