October 24, 2013, 10:28 |
How to run icoFsiFoam in parallel mode?
#1 |
New Member
Join Date: Oct 2013
Posts: 5
Rep Power: 13
Hello,
I'm new to OpenFOAM and have a question about the solver icoFsiFoam in OpenFOAM-1.6-ext. So far I have been able to run some FSI test cases with icoFsiFoam on one processor, but as soon as I try to run the cases on 4 processors by starting the job via mpirun, I get the following error message: Code:
Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplaceFaceDecomposition
Selecting motion diffusivity: quadratic
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] Cannot find file "points" in directory "constant/solid/polyMesh"
[0]
[0]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption)
[0]     in file db/Time/findInstance.C at line 148.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] Cannot find file "points" in directory "constant/solid/polyMesh"
[1]
[1]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption)
[1]     in file db/Time/findInstance.C at line 148.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 9104 on node
itlrstud053 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[itlrstud053:09090] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[itlrstud053:09090] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

It would be great if someone could help me run the solver in parallel mode!
Greetings
Michael
February 21, 2014, 08:03 |
#2 |
Senior Member
Join Date: Jan 2014
Posts: 179
Rep Power: 12
Hi
Did you ever find a solution to this? I am struggling with the same error and don't really know how to handle the solid mesh decomposition. Is it necessary?
February 21, 2014, 08:56 |
#3 |
New Member
Join Date: Oct 2013
Posts: 5
Rep Power: 13
Hey,
no, I did not manage to find a solution for this problem. But openfoam-extend-3.0 ships a new FSI solver that can run in parallel mode. The solver is called icoFsiElasticNonLinULSolidFoam; maybe you can try that one. Its tutorial even uses solid mesh decomposition. If you don't want to decompose the solid mesh as shown in the tutorial, you have to link the fluid processor folders directly to the solid folder.
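[Editor's note] The linking workaround described above can be sketched as a small shell snippet. This is a minimal sketch, not taken from the icoFsiElasticNonLinULSolidFoam tutorial: the case name `fsiCase` and the mocked-up directory layout are assumptions, and the snippet creates a fake decomposed case just to demonstrate the step. The idea is that after `decomposePar` has split only the fluid mesh into `processorN` folders, each MPI rank still looks for `constant/solid/polyMesh/points`, so a symlink makes the undecomposed solid mesh visible inside every processor directory.

```shell
#!/bin/sh
# Mock a case layout like the one decomposePar would leave behind
# (in a real case these directories already exist).
set -e
mkdir -p fsiCase/constant/solid/polyMesh
touch fsiCase/constant/solid/polyMesh/points
mkdir -p fsiCase/processor0/constant fsiCase/processor1/constant

# Link the shared, undecomposed solid mesh into each processor folder,
# so every rank can resolve constant/solid/polyMesh relative to its own
# processorN directory. The link target is relative to the link location.
for proc in fsiCase/processor*; do
    ln -s ../../constant/solid "$proc/constant/solid"
done

# Each rank now finds the "points" file through the symlink.
ls fsiCase/processor0/constant/solid/polyMesh/points
```

Note that any change to `constant/solid` is then automatically seen by all processor folders, since they all point at the same directory.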
Tags |
icofsifoam, parallel execution |
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
stop when I run in parallel | Nolwenn | OpenFOAM | 36 | March 21, 2021 05:56 |
simpleFoam in parallel issue | plucas | OpenFOAM Running, Solving & CFD | 3 | July 17, 2013 12:30 |
Script to Run Parallel Jobs in Rocks Cluster | asaha | OpenFOAM Running, Solving & CFD | 12 | July 4, 2012 23:51 |
Fluent Parallel Mode | Dinocrack | FLUENT | 0 | May 16, 2011 08:26 |
batch mode - parallel run | turbotel | CFX | 2 | March 29, 2011 17:53 |