/*---------------------------------------------------------------------------*\
  =========                 |
  \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox
   \\    /   O peration     | Website:  https://openfoam.org
    \\  /    A nd           | Version:  6
     \\/     M anipulation  |
\*---------------------------------------------------------------------------*/
Build  : 6-eb0ce5d792a1
Exec   : myicoUncoupledKinematicParcelFoam -parallel
Date   : Apr 05 2019
Time   : 14:04:32
Host   : "backend-1-04.local"
PID    : 132006
I/O    : uncollated
Case   : /home/ua18762/hopperInitialState
nProcs : 2
Slaves : 1("backend-1-04.local.132007")
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading g

Reading field U

Reading/calculating face flux field phi

Selecting incompressible transport model Newtonian
Selecting turbulence model type laminar
Selecting laminar stress model Stokes
Constructing kinematicCloud kinematicCloud
Constructing particle forces
    Selecting particle force sphereDrag
    Selecting particle force gravity
Constructing cloud functions
    none
Constructing particle injection models
    Creating injector: model1
    Selecting injection model manualInjection
        Constructing 3-D injection
        Choosing nParticle to be a fixed value, massTotal variable now does not determine anything.
[backend-1-04:132006] *** An error occurred in MPI_Send
[backend-1-04:132006] *** reported by process [6229983233,139732466008064]
[backend-1-04:132006] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[backend-1-04:132006] *** MPI_ERR_COUNT: invalid count argument
[backend-1-04:132006] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[backend-1-04:132006] ***    and potentially your MPI job)
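
For context, the injector messages above ("Creating injector: model1", "Selecting injection model manualInjection", "Choosing nParticle to be a fixed value ...") are produced by the injectionModels entry of constant/kinematicCloudProperties. Below is a minimal sketch of such an entry, assuming a positions-file based manual injection in the style of the standard hopperInitialState tutorial; the file name and numeric values are placeholders, not the actual settings of this case.

injectionModels
{
    model1
    {
        type            manualInjection;    // inject all parcels at SOI from a list of positions
        massTotal       0;                  // ignored when parcelBasisType is fixed (hence the log note)
        parcelBasisType fixed;              // fixed number of particles per parcel
        nParticle       1;                  // particles per parcel
        SOI             0;                  // start of injection [s]
        positionsFile   "positions";        // placeholder: file holding one position per parcel
        U0              (0 0 0);            // initial parcel velocity
        sizeDistribution
        {
            type            fixedValue;
            fixedValueDistribution
            {
                value       0.006;          // placeholder particle diameter [m]
            }
        }
    }
}

With parcelBasisType fixed, the solver prints the "massTotal variable now does not determine anything" message seen in the log just before the MPI failure.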