|
error while running in parallel using openmpi on local mc 6 processors |
|
May 20, 2012, 06:02 |
error while running in parallel using openmpi on local mc 6 processors
|
#1 |
New Member
Nitin Suryawanshi
Join Date: Mar 2009
Location: Pune, India
Posts: 28
Rep Power: 17 |
When I run my case for parallel processing with:

mpirun -np 6 pisoFoam -parallel > log &

I get the following error:

neptune@ubuntu:~/tutorials/incompressible/icoFoam/cavity$ mpirun -np 6 pisoFoam -parallel > log &
[1] 12387
neptune@ubuntu:~/tutorials/incompressible/icoFoam/cavity$
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] number of processor directories = 2 is not equal to the number of processors = 6
[0]
[0] FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 12388 on node ubuntu
exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

After this I ran the parallel test mentioned in one of Bruno's links. Up to step 2 everything works, but when I run parallelTest I get:

neptune@ubuntu:~/tutorials/incompressible/icoFoam/cavity$ parallelTest
parallelTest: command not found

Thanks in advance... please help me on this. |
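[Editor's note: for anyone hitting this mismatch, a quick way to check the decomposition before launching mpirun. This is only a sketch using standard shell commands and the stock decomposePar utility; the case path is simply the tutorial directory from the post above.]

cd ~/tutorials/incompressible/icoFoam/cavity
ls -d processor*              # directories created by decomposePar
ls -d processor* | wc -l      # this count must match the -np value given to mpirun
# if it does not match, fix numberOfSubdomains in system/decomposeParDict,
# remove the old directories and decompose again:
rm -rf processor*
decomposePar
mpirun -np 6 pisoFoam -parallel > log &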
|
May 20, 2012, 09:28 |
|
#2 |
Senior Member
Adhiraj
Join Date: Sep 2010
Location: Karnataka, India
Posts: 187
Rep Power: 16 |
Why does it complain that you have 2 processor directories and are trying to run with 6 processors?
|
|
May 20, 2012, 11:45 |
|
#3 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Greetings to both!
@suryawanshi_nitin: Adhiraj is right, the decomposition apparently didn't go as you expected. Check your "system/decomposeParDict".

As for parallelTest: as of 2.0.0 it has been renamed to Test-parallel.

Best regards,
Bruno
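[Editor's note: for reference, a minimal system/decomposeParDict for a 6-processor run might look roughly like the sketch below. The simple method and its coefficients are illustrative choices, not taken from this case.]

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  6;      // must match the -np value passed to mpirun

method              simple; // illustrative; scotch, hierarchical, etc. also exist

simpleCoeffs
{
    n               (3 2 1); // 3 x 2 x 1 = 6 subdomains
    delta           0.001;
}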
__________________
|
|
May 20, 2012, 14:37 |
|
#4 |
New Member
Nitin Suryawanshi
Join Date: Mar 2009
Location: Pune, India
Posts: 28
Rep Power: 17 |
Thanks for your valuable replies.
Yes, you are right. In my actual case I am working with 6 processors, but it was giving this issue, so I thought of checking with a simple case first. Below is the error message from the actual case. I had already solved this case in the OF 2.0.1 Debian pack, up to 1.7 s, with the domain decomposed for 6 processors. Now I am using the same data to continue the run in an OF 2.1.0 source-pack installation, but when running it beyond 1.7 s I get the following error (and Test-parallel is working now):

neptune@ubuntu:~/nitin/s$ mpirun -np 6 pisoFoam -parallel > log &
[1] 2865
neptune@ubuntu:~/nitin/s$
[5]
[5]
[5] --> FOAM FATAL IO ERROR:
[5] essential value entry not provided
[5]
[5] file: /home/neptune/nitin/s/processor5/1.77/phi::boundaryField::symmetryBottom from line 59453 to line 59453.
[5]
[5] From function fvsPatchField<Type>::fvsPatchField
(
    const fvPatch& p,
    const DimensionedField<Type, surfaceMesh>& iF,
    const dictionary& dict
)
[5]
[5] in file lnInclude/fvsPatchField.C at line 110.
[5]
[5] FOAM parallel run exiting
[5]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 5 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 2871 on node ubuntu
exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

Kindly waiting for your replies. Thanks in advance.
Nitin |
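[Editor's note: one way to carry decomposed results from one installation to another, sketched here but not something confirmed in this thread, is to reconstruct the fields with the version that wrote them and decompose them again with the version that will continue the run.]

# in the OF 2.0.1 environment that produced the processor* data:
reconstructPar -latestTime   # rebuild the 1.7 s fields in the case root

# then, in the OF 2.1.0 environment:
decomposePar -force          # re-split the reconstructed fields (check decomposePar -help for the exact flag in your version)
mpirun -np 6 pisoFoam -parallel > log &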
|
May 21, 2012, 09:31 |
|
#5 |
Member
|
What OpenFOAM is trying to tell you: Your "symmetryBottom" has a missing value in your boundary setup. Be sure to provide all needed variables!
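[Editor's note: for illustration only, with made-up patch names rather than Nitin's actual setup, a boundaryField block in a field file such as 0/U. Patch types like symmetryPlane generally carry no value entry, while value-carrying types such as fixedValue or calculated must have one; its absence is what produces the "essential value entry not provided" error.]

boundaryField
{
    symmetryBottom
    {
        type            symmetryPlane;   // symmetry patches need no value entry
    }

    inlet
    {
        type            fixedValue;
        value           uniform (1 0 0); // the value entry that must not be missing
    }
}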
__________________
~~~_/)~~~ |
|
May 21, 2012, 15:13 |
|
#6 |
New Member
Nitin Suryawanshi
Join Date: Mar 2009
Location: Pune, India
Posts: 28
Rep Power: 17 |
Sir, thanks for your reply.
It is working well now, but I started the case from the start time, i.e. 0.0 s in the controlDict file, as a completely new simulation. What I understood from this is: if we have solution data from the OF 2.0.1 Debian pack and want to continue that case in an OF 2.1.0 source-pack installation, then OF 2.1.0 is unable to understand/handle the old decomposed data from the old version, especially in a parallel case. That is my interpretation. Thanks, sir, for your valuable time. If anyone has a clearer view of this, they are most welcome to share it; that is the way to learn faster and with more clarity. Regards, Nitin Suryawanshi. |
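[Editor's note: for reference, the restart point is set in system/controlDict. A minimal sketch, with illustrative times and intervals, of starting from 0 versus continuing from the last saved time directory:]

startFrom       startTime;   // use latestTime to continue from the last written time directory
startTime       0;

stopAt          endTime;
endTime         5;

deltaT          0.001;
writeControl    timeStep;
writeInterval   100;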
|
February 18, 2017, 23:11 |
|
#7 |
Member
sibo
Join Date: Oct 2016
Location: Chicago
Posts: 55
Rep Power: 10 |
Hi Nitin,
I read your post "error while running in parallel using openmpi on local mc 6 processors" and noticed you solved this problem. I have exactly the same error. I was trying to run a case in parallel with 4 processors on a cluster; the task stops right after it starts, and I found this error message in the log file. But when I try to run the same case in parallel on my own laptop, it works fine. Can you please share in more detail how you solved this problem? Thanks a lot! |
|
February 22, 2017, 00:21 |
|
#8 |
Member
Maria
Join Date: Jul 2013
Posts: 84
Rep Power: 13 |
Have you solved the problem? I have the same one!!! Maria |
February 22, 2017, 10:04 |
|
#9 |
Member
sibo
Join Date: Oct 2016
Location: Chicago
Posts: 55
Rep Power: 10 |
Hi Maria,
I am still working on that! So annoying! What I did is: I installed OpenFOAM again and paid special attention to the step of loading OpenMPI into the environment. Also make sure your OpenFOAM installations are the same version. Hope this works. Thanks. |
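[Editor's note: a quick sanity check along those lines. The commands are standard OpenFOAM/Open MPI tools, but the bashrc path is illustrative and depends on where and which version you installed.]

source $HOME/OpenFOAM/OpenFOAM-2.1.0/etc/bashrc   # load the OpenFOAM environment (path depends on your install)
which mpirun                 # should point to the MPI that OpenFOAM was built against
mpirun --version             # compare with the MPI on the other machine/cluster
echo $WM_PROJECT_VERSION     # OpenFOAM version currently loaded
foamInstallationTest         # basic check of the installation and environment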
|
February 22, 2017, 10:34 |
|
#10 |
Member
sibo
Join Date: Oct 2016
Location: Chicago
Posts: 55
Rep Power: 10 |
Hi Maria,
It works now!!! |
|
February 22, 2017, 22:33 |
|
#11 |
Member
Maria
Join Date: Jul 2013
Posts: 84
Rep Power: 13 |
Thanks, Sibo.
Mine is also working, and I didn't reinstall it. I had just made some mistakes. |
|
|
|