Problems running in parallel - missing controlDict
June 6, 2012, 20:51
Problems running in parallel - missing controlDict
#1
Member
Join Date: Jan 2010
Posts: 44
Rep Power: 16
I tried to run a tutorial case ($FOAM_RUN/tutorials/multiphase/interFoam/laminar/damBreak) in parallel, but it failed with the errors below. Any idea what's wrong?
$ mpirun -np 4 interFoam -parallel > log &
[1] 17036
~/OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine$
[3] --> FOAM FATAL IO ERROR:
[3] cannot open file
[3] file: /OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor3/system/controlDict at line 0.
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 61.
[3] FOAM parallel run exiting
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2] --> FOAM FATAL IO ERROR:
[2] cannot open file
[2] file: OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor2/system/controlDict at line 0.
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 61.
[2] FOAM parallel run exiting
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1] file: /OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor1/system/controlDict at line 0.
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 61.
[1] FOAM parallel run exiting
[0] --> FOAM FATAL ERROR:
[0] interFoam: cannot open case directory "/OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor0"
[0] FOAM parallel run exiting
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 17044 on node xxx exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[xxx:17036] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[xxx:17036] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
June 6, 2012, 20:59
#2
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Greetings Argen,
You should read the User Guide very carefully, especially the paragraphs before that mpirun command: 2.3.11 Running in parallel

Notice that they say something about decomposePar? Specifically in the second sentence after the "setFields" command!?

Keep in mind that with OpenFOAM it's not just a matter of "click-and-go" or "copy-paste-go". You must pay very close attention to details at all times!

Best regards,
Bruno
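To make Bruno's point concrete: the errors above mean the processor0..processor3 directories that decomposePar creates do not exist. A small pre-flight check like the following would catch that before mpirun is even launched. This is a hypothetical helper written for illustration, not an OpenFOAM utility:

```shell
# Hypothetical pre-flight check (not an OpenFOAM utility): make sure
# decomposePar has actually produced the processor directories before
# launching "mpirun -np N solver -parallel".
assert_decomposed() {
    case_dir=$1
    nprocs=$2
    i=0
    while [ "$i" -lt "$nprocs" ]; do
        if [ ! -d "$case_dir/processor$i" ]; then
            echo "missing $case_dir/processor$i - run decomposePar first" >&2
            return 1
        fi
        i=$((i + 1))
    done
    echo "found $nprocs processor directories in $case_dir"
}

# Typical use in a case directory:
#   assert_decomposed . 4 && mpirun -np 4 interFoam -parallel > log &
```

If the check fails, running decomposePar in the case directory (after the mesh and fields are set up) is the fix the User Guide describes.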
June 6, 2012, 22:00
#3
Member
Join Date: Jan 2010
Posts: 44
Rep Power: 16
Thanks for your advice, wyldckat. I checked the decomposePar utility and it showed no problem. I still have no clue why running in parallel doesn't work.
June 7, 2012, 04:00
#4
Senior Member
Dr. Fabian Schlegel
Join Date: Apr 2009
Location: Dresden, Germany
Posts: 222
Rep Power: 18
Did you try to run exactly the same case serially (on one processor)?

I would try this, and if the case starts successfully, run decomposePar. Check the decomposeParDict in the system directory to make sure the number of processors is exactly the same as the one you use in your mpirun command. The decomposeParDict should look like this:

numberOfSubdomains 4;

method          scotch;

distributed     no;

roots           ( );

Then try it again.

Best regards,
Fabian
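The consistency check Fabian describes can be scripted. A minimal sketch, assuming the usual system/decomposeParDict layout; the dict below is a stand-in written to a temp directory so the snippet is self-contained:

```shell
# Compare the numberOfSubdomains entry in decomposeParDict against
# the count you intend to pass to "mpirun -np".
np_intended=4

# Stand-in dict (in a real case, point sed at system/decomposeParDict).
tmp=$(mktemp -d)
cat > "$tmp/decomposeParDict" <<'EOF'
numberOfSubdomains 4;

method          scotch;

distributed     no;

roots           ( );
EOF

# Pull the integer out of the dict entry.
np_dict=$(sed -n 's/^numberOfSubdomains[[:space:]]*\([0-9][0-9]*\);.*/\1/p' "$tmp/decomposeParDict")

if [ "$np_dict" = "$np_intended" ]; then
    echo "decomposeParDict and mpirun agree on $np_dict subdomains"
else
    echo "mismatch: dict has $np_dict, mpirun would use $np_intended" >&2
fi
```

A mismatch here is a common cause of parallel startup failures, since interFoam -parallel expects exactly one processorN directory per MPI rank.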
June 7, 2012, 04:50
#5
Disabled
Join Date: Mar 2011
Posts: 174
Rep Power: 15
I tried the same tutorial as you, using the commands:

blockMesh
cp 0/alpha1.org 0/alpha1
setFields
decomposePar
mpirun -np 4 interFoam -parallel

without touching ANYTHING else. I found the commands through the Allrun script (one directory up). It seemed to work fine.

The message you get means that the "processor*" directories are not properly set up, which means that decomposePar did not run successfully. That probably means something else did not run correctly earlier (such as blockMesh). Assuming, of course, that by "The decomposePar utility was checked and has no problem" you mean that you actually executed decomposePar.
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post
OF 2.0.1 parallel running problems | moser_r | OpenFOAM Running, Solving & CFD | 9 | July 27, 2022 04:15 |
simpleFoam parallel | AndrewMortimer | OpenFOAM Running, Solving & CFD | 12 | August 7, 2015 19:45 |
parallel running | student666 | OpenFOAM Running, Solving & CFD | 7 | May 21, 2014 15:55 |
Losing Log when running in parallel | djh2 | OpenFOAM Running, Solving & CFD | 4 | February 28, 2014 10:41 |
Problems running in parallel - Pstream not available | dark lancer | OpenFOAM Installation | 14 | October 13, 2013 15:13 |