View Issue Details
Field | Value
---|---
ID | 0000579
Project | OpenFOAM
Category | Bug
View Status | public
Date Submitted | 2012-07-09 15:11
Last Update | 2012-08-03 10:42
Reporter |
Assigned To |
Priority | normal
Severity | crash
Reproducibility | always
Status | resolved
Resolution | no change required
Platform | Linux
OS | OpenSuse
OS Version | 11.3
Summary | 0000579: LTSReactingParcelFoam crashes running in parallel (with MPI)
Description | Both tutorials (verticalChannel and counterFlowFlame2D) crash when running in parallel with LTSReactingParcelFoam.
Steps To Reproduce | Copy the decomposeParDict from /OpenFOAM/OpenFOAM-2.1.x/tutorials/lagrangian/coalChemistryFoam/simplifiedSiwek/system/ to /OpenFOAM/OpenFOAM-2.1.x/tutorials/lagrangian/LTSReactingParcelFoam/verticalChannel/system/, then run blockMesh and mpirun (see the command sketch below the table).
Tags | No tags attached.
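A minimal shell sketch of those reproduction steps, assuming the usual OpenFOAM-2.1.x environment variable $FOAM_TUTORIALS points at the tutorials tree; the decomposePar step and the process count of 2 are assumptions on my part (the count has to match numberOfSubdomains in the copied decomposeParDict):

```
# Sketch only: adjust the paths to your installation.
cd $FOAM_TUTORIALS/lagrangian/LTSReactingParcelFoam/verticalChannel
cp $FOAM_TUTORIALS/lagrangian/coalChemistryFoam/simplifiedSiwek/system/decomposeParDict system/

blockMesh                                      # generate the mesh
decomposePar                                   # split the case into sub-domains
mpirun -np 2 LTSReactingParcelFoam -parallel   # -np must match numberOfSubdomains
```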
What error message(s) do you receive? Running locally on an OpenSUSE 11.3 system, both cases worked well.
I receive this message:

Creating multi-variate interpolation scheme
Selecting radiationModel none
Constructing reacting cloud
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] read failed
[1]
[1] From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[1] in file UIPread.C at line 114.
[1]
FOAM parallel run aborting
Creating sources
Creating field source list from sourcesProperties
Creating porous zones
Starting time loop
[1]
Time = 1
Time scales min/max:
[1] #0 Foam::error::printStack(Foam::Ostream&) in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1 Foam::error::abort() in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2 Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[1] #3 Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #4 Foam::IOdictionary::readFile(bool) in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #5 Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #6 Foam::IObasicSourceList::IObasicSourceList(Foam::fvMesh const&) in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[1] #7
[1] in "/home/usr/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/LTSReactingParcelFoam"
[1] #8 __libc_start_main in "/lib64/libc.so.6"
[1] #9
[1] at /usr/src/packages/BUILD/glibc-2.11.2/csu/../sysdeps/x86_64/elf/start.S:116
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 12051 on
node linux-62ig exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
In your etc/controlDict, what settings are you using for fileModificationChecking and commsType? E.g. I'm using:

fileModificationChecking timeStampMaster;
commsType nonBlocking;
The same as you are:

fileModificationChecking timeStampMaster; // inotify; timeStamp; inotifyMaster;
commsType nonBlocking; // scheduled; // blocking;
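For context, a minimal sketch of how these two entries typically appear in the global controlDict of an OpenFOAM-2.1.x installation ($WM_PROJECT_DIR/etc/controlDict); the surrounding OptimisationSwitches entries are omitted here:

```
// Sketch of $WM_PROJECT_DIR/etc/controlDict (other entries omitted)
OptimisationSwitches
{
    fileModificationChecking timeStampMaster; // inotify; timeStamp; inotifyMaster;

    commsType   nonBlocking;  // scheduled; blocking;
}
```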
@GenaRisch: which commit are you using? Run "git log -1" to check which one. Also try the latest git commit of OpenFOAM 2.1.x. Additionally, check the attached prepared case, based on your instructions, just to make 100% certain the same test is being performed.

@andy: Attached are both the said prepared case for parallel execution, using 2 sub-domains and following the "Steps To Reproduce", and the logs (serial_and_parallel_logs.tar.gz) for the serial and parallel executions. There is a significant difference in the minimum value of the temperature, seen in the logs, possibly related to the original issue. (edit: forgot to point out the temperatures) On the last iteration:

Serial - T gas min/max = 389.555623, 573
Parallel - T gas min/max = 337.0483616, 573

Specs I used:

$ git log -1:
commit 334ec14f742f95b1198bc0de318d625fc724b5e7
Author: Henry <Henry>
Date: Thu Aug 2 14:13:23 2012 +0100

Test system: Ubuntu 11.10 x86_64, Gcc 4.6.1.
By the way, I now remember seeing a commit that might be the one that fixed the originally reported issue, namely commit 7d703d585daf11438fbc4ad3bae01199675e7f78: https://github.com/OpenFOAM/OpenFOAM-2.1.x/commit/7d703d585daf11438fbc4ad3bae01199675e7f78
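As a side note, one quick way to check whether that commit is already contained in a local OpenFOAM-2.1.x checkout is a standard git ancestry query; the sketch below assumes the clone lives at $WM_PROJECT_DIR:

```
cd $WM_PROJECT_DIR    # root of the OpenFOAM-2.1.x git clone
git log -1            # commit currently checked out
# Succeeds only if the fix commit is an ancestor of HEAD.
git merge-base --is-ancestor 7d703d585daf11438fbc4ad3bae01199675e7f78 HEAD \
    && echo "fix commit is included" \
    || echo "fix commit not included - update with git pull"
```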
On the local workstation I use:

commit 223b594ad0fff19dcd066fd9e7ede4200e1e95ca
Author: sergio <sergio>
Date: Tue May 8 17:27:57 2012 +0100

but this error also occurs on other machines with the newest commit at the date I submitted the report. Now, with the changes in dictionaryIO.C and UIPread.C, parallel running works.

Thanks
@wyldckat: I think it is just an unlucky "Time" you compared; as you can see in the attached plot, both the parallel and the serial T gas min are approximately in the same range.
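For anyone wanting to reproduce such a comparison plot, a small sketch of how the T gas min history could be pulled out of the two solver logs; the log file names (log.serial, log.parallel) are placeholders, and the line format is assumed to match the "T gas min/max = ..." lines quoted above:

```
# Extract "iteration  T_gas_min" pairs from each log (hypothetical file names).
grep 'T gas min/max' log.serial   | awk -F'[=,]' '{print NR, $2}' > Tmin_serial.dat
grep 'T gas min/max' log.parallel | awk -F'[=,]' '{print NR, $2}' > Tmin_parallel.dat
# The two .dat files can then be plotted together (e.g. with gnuplot) to compare the curves.
```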
Thanks for the responses.
Date Modified | Username | Field | Change |
---|---|---|---|
2012-07-09 15:11 | | New Issue |
2012-08-01 13:26 | | Note Added: 0001528 |
2012-08-01 14:19 | | Note Added: 0001529 |
2012-08-01 16:02 | | Note Added: 0001531 |
2012-08-02 07:12 | | Note Added: 0001534 |
2012-08-02 10:13 | | Note Edited: 0001534 |
2012-08-02 22:57 | wyldckat | Note Added: 0001545 | |
2012-08-02 22:57 | wyldckat | File Added: verticalChannelPar.tar.gz | |
2012-08-02 22:57 | wyldckat | File Added: serial_and_parallel_logs.tar.gz | |
2012-08-02 22:58 | wyldckat | Note Edited: 0001545 | |
2012-08-02 23:04 | wyldckat | Note Added: 0001546 | |
2012-08-02 23:07 | wyldckat | Note Edited: 0001545 | |
2012-08-03 08:47 | | Note Added: 0001547 |
2012-08-03 09:30 | | File Added: T gas min.png |
2012-08-03 09:30 | | Note Added: 0001549 |
2012-08-03 09:34 | | Note Edited: 0001549 |
2012-08-03 10:42 | | Note Added: 0001550 |
2012-08-03 10:42 | | Status | new => resolved
2012-08-03 10:42 | | Resolution | open => no change required
2012-08-03 10:42 | | Assigned To | => user2