Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Problems running in parallel - missing controlDict

June 6, 2012, 20:51   #1
Member
 
Join Date: Jan 2010
Posts: 44
I am trying to run a tutorial case ($FOAM_RUN/tutorials/multiphase/interFoam/laminar/damBreak) in parallel, but the errors below were reported. Any idea what's wrong?





$ mpirun -np 4 interFoam -parallel > log &



[1] 17036
~/OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine$ [3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot open file
[3]
[3] file: /OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor3/system/controlDict at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 61.
[3]
FOAM parallel run exiting
[3]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot open file
[2]
[2] file: OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor2/system/controlDict at line 0.
[2]
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 61.
[2]
FOAM parallel run exiting
[2]
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor1/system/controlDict at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting
[1]
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] interFoam: cannot open case directory "/OpenFOAM/tmm-1.7.1/run/tutorials/multiphase/interFoam/laminar/damBreakFine/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 17044 on
node xxx exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[xxx:17036] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[xxx:17036] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

June 6, 2012, 20:59   #2
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Greetings Argen,

You should read the User Guide very carefully, especially the paragraphs just before that mpirun command: 2.3.11 Running in parallel.

Notice how it says something about decomposePar? Specifically, in the second sentence after the "setFields" command!

Keep in mind that with OpenFOAM it's not just a matter of "click-and-go" or "copy-paste-and-go". You must pay very close attention to details at all times!
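The sequence the User Guide describes can be sketched as a short script (a minimal sketch, assuming a standard OpenFOAM 1.7.x environment has been sourced and the stock damBreak tutorial layout; it bails out politely if the environment is not loaded):

```shell
# Minimal sketch of the parallel workflow from User Guide section 2.3.11.
# Assumes the OpenFOAM environment (etc/bashrc) has already been sourced.
if command -v blockMesh >/dev/null 2>&1; then
    cd "$FOAM_RUN/tutorials/multiphase/interFoam/laminar/damBreak"
    blockMesh                          # generate the mesh
    setFields                          # initialise the alpha1 field
    decomposePar                       # write the processor* directories
    mpirun -np 4 interFoam -parallel   # only now launch the solver
else
    echo "OpenFOAM environment not active"
fi
```

The point is the ordering: decomposePar must run, and succeed, before mpirun is ever invoked.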

Best regards,
Bruno

June 6, 2012, 22:00   #3
Member
 
Join Date: Jan 2010
Posts: 44
Thanks for your advice, wyldckat. I checked the decomposePar utility and it has no problem. I still have no clue why running in parallel doesn't work.

June 7, 2012, 04:00   #4
Senior Member
 
Dr. Fabian Schlegel
Join Date: Apr 2009
Location: Dresden, Germany
Posts: 222
Did you try to run exactly the same case serially (on one processor)?
I would try that first, and if the case starts successfully, run "decomposePar". Check the decomposeParDict in your system directory to ensure that the number of processors is exactly the same as the one you use in your mpirun command. The decomposeParDict should look like this:

numberOfSubdomains 4;

method scotch;

distributed no;

roots
(
);

Then try it again.
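That mismatch can be caught before launching. The following pre-flight check is a hedged sketch, not an OpenFOAM utility (the function name check_np is made up for illustration); it compares the numberOfSubdomains entry against the intended -np value:

```shell
# check_np: compare numberOfSubdomains in system/decomposeParDict with the
# intended mpirun -np value. Hypothetical helper, not part of OpenFOAM.
check_np() {
    dict="$1/system/decomposeParDict"
    want="$2"
    # pull the integer out of a line like "numberOfSubdomains 4;"
    have=$(sed -n 's/^ *numberOfSubdomains *\([0-9][0-9]*\) *;.*/\1/p' "$dict")
    if [ "$have" = "$want" ]; then
        echo "OK: $have subdomains match -np $want"
    else
        echo "Mismatch: dict says '$have', mpirun -np $want" >&2
        return 1
    fi
}
```

Usage would be something like `check_np . 4 && mpirun -np 4 interFoam -parallel` from inside the case directory.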

Best regards,
Fabian

June 7, 2012, 04:50   #5
Disabled
 
Join Date: Mar 2011
Posts: 174
I tried the same tutorial as you, using the commands:

blockMesh
cp 0/alpha1.org 0/alpha1
setFields
decomposePar
mpirun -np 4 interFoam -parallel

without touching ANYTHING else. I found the commands through the Allrun script (one directory up).

It seemed to work fine. The message you got means that the "processor*" directories are not properly set up, which means that decomposePar did not run successfully. That in turn probably means that something else did not run correctly earlier (such as blockMesh). Assuming, of course, that by the phrase "The decomposePar utility was checked and has no problem" you mean that you executed decomposePar.
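That diagnosis can be checked directly before launching mpirun. This is an illustrative sketch only (the function name have_processor_dirs is invented, not an OpenFOAM tool); it confirms the decomposed directories exist:

```shell
# have_processor_dirs: confirm processor0..processor(N-1) exist in a case
# directory before attempting a parallel run. Illustrative helper only.
have_processor_dirs() {
    case_dir="$1"
    np="$2"
    i=0
    while [ "$i" -lt "$np" ]; do
        if [ ! -d "$case_dir/processor$i" ]; then
            echo "missing $case_dir/processor$i -- run decomposePar first" >&2
            return 1
        fi
        i=$((i + 1))
    done
    echo "found $np processor directories"
}
```

If this reports a missing directory, the fix is to go back and re-run decomposePar (and whatever step failed before it), not to retry mpirun.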
