|
August 27, 2010, 04:42 |
MPI InterFoam
|
#1 |
New Member
Join Date: May 2010
Posts: 27
Rep Power: 16 |
Hey All,
I am trying to run interFoam (on a single node) using MPI. I am running OpenFOAM 1.7.1 on Ubuntu Lucid. My PC is an AMD Phenom X6 (6-core processor) with 8 GB of RAM, but I keep getting the following error:
Code:
metro@ubuntu:~/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong$
[3] --> FOAM FATAL IO ERROR:
[3] cannot open file
[3] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor3/system/controlDict at line 0.
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 61.
[3] FOAM parallel run exiting
[4] --> FOAM FATAL IO ERROR:
[4] cannot open file
[4] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor4/system/controlDict at line 0.
[4] From function regIOobject::readStream()
[4] in file db/regIOobject/regIOobjectRead.C at line 61.
[4] FOAM parallel run exiting
[5] --> FOAM FATAL IO ERROR:
[5] cannot open file
[5] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor5/system/controlDict at line 0.
[5] From function regIOobject::readStream()
[5] in file db/regIOobject/regIOobjectRead.C at line 61.
[5] FOAM parallel run exiting
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] --> FOAM FATAL ERROR:
[0] interFoam: cannot open case directory "/home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor0"
[0] FOAM parallel run exiting
[2] --> FOAM FATAL IO ERROR:
[2] cannot open file
[2] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor2/system/controlDict at line 0.
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 61.
[2] FOAM parallel run exiting
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor1/system/controlDict at line 0.
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 61.
[1] FOAM parallel run exiting
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 11141 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:11138] 5 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:11138] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I have set it up so that it will divide my mesh into 6 pieces along the z axis (my decomposition settings are sketched below).
Can anyone please help me? Are there any additional settings I am meant to set up before using MPI?
Regards,
Metro
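Roughly, my system/decomposeParDict looks like this (a sketch from memory, not the exact file):
Code:
numberOfSubdomains 6;

method          simple;

simpleCoeffs
{
    n               (1 1 6);    // 1 x 1 x 6: split only along z
    delta           0.001;
} |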
|
October 15, 2010, 04:59 |
|
#2 |
New Member
nuria llamas
Join Date: Jul 2010
Posts: 2
Rep Power: 0 |
Hi!
Did you succeed in resolving the problem? I have the same one... Thank you! |
|
November 7, 2010, 10:49 |
|
#3 |
Senior Member
Robert Sawko
Join Date: Mar 2009
Posts: 117
Rep Power: 22 |
Hello,
I am not sure if this is relevant, but I have seen similar errors when I forgot to run decomposePar. Have you executed decomposePar before running the interFoam solver? |
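For a parallel run the usual sequence is roughly the following (a sketch, assuming the mesh is already built and decomposeParDict requests 6 subdomains):
Code:
decomposePar                                # writes processor0 ... processor5
mpirun -np 6 interFoam -parallel > log &    # each rank reads its own processorN directory
reconstructPar                              # merge the decomposed results after the run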
|
December 14, 2010, 23:41 |
|
#4 |
Member
Paul Reichl
Join Date: Feb 2010
Location: Melbourne, Victoria, Australia
Posts: 33
Rep Power: 16 |
I saw this error when I incorrectly set the directories in the roots section of the system/decomposeParDict file (when running in parallel on a distributed system).
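If it helps, the relevant entries in decomposeParDict for a distributed run look roughly like this (a sketch; the paths are placeholders for the case root on each remote machine, with one entry per slave process):
Code:
distributed     yes;

roots
(
    "/home/user/OpenFOAM/user-1.7.1/run"    // case root on processor1's host
    "/home/user/OpenFOAM/user-1.7.1/run"    // case root on processor2's host
);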
|
|
March 26, 2013, 07:15 |
|
#5 |
Member
Malik
Join Date: Dec 2012
Location: Austin, USA
Posts: 53
Rep Power: 13 |
I found a similar error when I wanted to run a case with a bash script.
When I run decomposePar and then mpirun from the script, I receive: cannot open case directory "/media/windows/OpenFOAM/vent-2.1.1/run/cylindre/turbulence/Spalart/pimple/domaine_elargi/cylindreRE1000000_53/processor0"
However, when I run decomposePar by hand in the case directory and then run mpirun with the script, everything is fine. I just don't understand it.
Here is what I do in the script: Code:
cd $2    # $2 is a parameter the user gives when calling the script
if [ ! -d $chemin/processor0 ]
then
    decomposePar > logdecompose &
fi
mpirun -np $nbCPU pimpleFoam -parallel > log &
|
|
March 26, 2013, 16:19 |
|
#6 |
Member
Paul Reichl
Join Date: Feb 2010
Location: Melbourne, Victoria, Australia
Posts: 33
Rep Power: 16 |
Hi Malaboss,
Have you checked that the script is doing what you think it is doing? I would try using echo statements to get it to write something to the screen when it enters each section.
Does this error message show up in the logdecompose file or in the log file, and, more importantly, does the directory in the error message exist?
My suspicion is that the decompose part of the script is not actually being run, but that is only a guess. I would try a simpler bash script with just the two commands in it (see the sketch below) and see how that goes. I assume that the roots section of system/decomposeParDict has been set correctly. The other possibility is that the bash script is not inheriting settings, but I think the other possibilities are more likely.
Cheers,
Paul. |
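A minimal version with echo statements, using the same variables as your script (a sketch): Code:
#!/bin/bash
cd "$2" || exit 1                  # stop early if the case directory is wrong
echo "case directory: $(pwd)"
echo "running decomposePar..."
decomposePar > logdecompose        # foreground, so it finishes before mpirun starts
echo "processor directories now present:"
ls -d processor*
mpirun -np $nbCPU pimpleFoam -parallel > log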
|
March 26, 2013, 18:36 |
|
#7 |
Member
Malik
Join Date: Dec 2012
Location: Austin, USA
Posts: 53
Rep Power: 13 |
Hi Paul, and thanks for answering.
My variables are well initialized. As proof, my code works only if I run decomposePar before running the script (and, obviously, if I remove the decomposePar part from my code).
Also, the directory in the error message exists.
I am thinking about something else: I ran decomposePar with "&" at the end, which means the decomposition starts in the background and the following command lines run immediately. So I may not be waiting for the end of the decomposition, and may be running the mpirun part too early. That could explain why OpenFOAM refuses to open the processor directory (a possible fix is sketched below).
I will check tomorrow morning (in France^^) and tell you if that is it. Thank you again! |
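PS: if the background "&" really is the culprit, adding a wait before mpirun should also fix it while keeping the decomposition in the background (a sketch): Code:
if [ ! -d $chemin/processor0 ]
then
    decomposePar > logdecompose &
fi
wait                                       # block until decomposePar has finished
mpirun -np $nbCPU pimpleFoam -parallel > log &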
|
March 27, 2013, 04:24 |
|
#8 |
Member
Malik
Join Date: Dec 2012
Location: Austin, USA
Posts: 53
Rep Power: 13 |
Hi Paul,
I just checked my code today and dropped the "&" at the end of the decomposePar line. Now everything works. The problem was that I was trying to run the case before the decomposition of the case had finished (the fixed part of the script is below for reference). Thanks for everything! |
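For anyone finding this thread later, the working version of the snippet from post #5 is: Code:
cd $2
if [ ! -d $chemin/processor0 ]
then
    decomposePar > logdecompose    # no trailing "&": wait for the decomposition to finish
fi
mpirun -np $nbCPU pimpleFoam -parallel > log &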
|
|
|