July 25, 2015, 19:52 |
damBreak case parallel run problem [solved]
#1
Member
behzad Ghasemi
Join Date: Sep 2013
Location: Iran
Posts: 56
Rep Power: 13
Hi dear Foamers,
I have a problem with parallel processing in OpenFOAM. I saw some threads, but I think my problem is a little different, so I posted a new one. Every time I try to run the damBreak case in parallel I get the error below: Code:
(OF:2.4.0-Opt) behzad@behzad:~/Documents/damBreak$ mpirun -np 4 interFoam -parallel > log &
[1] 14702
(OF:2.4.0-Opt) behzad@behzad:~/Documents/damBreak$ [0]
[0]
[0] --> FOAM FATAL ERROR:
[0] interFoam: cannot open case directory "/home/behzad/Documents/damBreak/processor0"
[0]
[0] FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 14703 on
node behzad exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[1]+  Exit 1    mpirun -np 4 interFoam -parallel > log
I'm running on my Dell XPS L502x laptop: Core i7-2630QM CPU, 12 GB of RAM, 10 GB of swap space, Ubuntu 12.04.2 LTS, Linux 3.16.0-30-generic (x86_64). My Open MPI version is 1.6.5. I tested this on several versions of OpenFOAM (2.4, 2.3, 3.1-Extend) and every time I got errors like that! I've done a few searches in the forum and found a couple of old threads, but couldn't solve the problem. Please give me some steps; I'm not a Linux expert. This is the log file: https://www.dropbox.com/s/9p4l4qxepkqdz7c/log?dl=0
Regards
Last edited by behzad-cfd; August 3, 2015 at 05:29.
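For reference, interFoam -parallel expects one processorN directory per MPI rank, created beforehand by decomposePar, so the error above means those directories are missing. A quick way to confirm (a minimal sketch, using the case path from the log above): Code:
ls -d ~/Documents/damBreak/processor*   # with -np 4 this should list processor0..processor3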
July 26, 2015, 16:05 |
#2
Member
behzad Ghasemi
Join Date: Sep 2013
Location: Iran
Posts: 56
Rep Power: 13
Any idea?
Why does nobody answer my questions?! It's very disappointing ...
July 27, 2015, 04:48 |
#4
Member
behzad Ghasemi
Join Date: Sep 2013
Location: Iran
Posts: 56
Rep Power: 13
August 2, 2015, 10:11 |
#5
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Hi behzad,
I've finally managed to take a quick look at your problem... and this is disconcerting. I was expecting that you had provided the case on Dropbox, but instead you provided the log file. From what I can figure out, based on the little information you've provided, it seems that you didn't notice the complaints that decomposePar gave you, because if interFoam is complaining that: Quote:
interFoam: cannot open case directory "/home/behzad/Documents/damBreak/processor0"
then the processor0 directory doesn't exist, which means that decomposePar either wasn't run or didn't finish successfully.
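For reference, the usual way to run this tutorial in parallel looks roughly like this (a minimal sketch, assuming the case sits in ~/Documents/damBreak and system/decomposeParDict requests 4 subdomains): Code:
cd ~/Documents/damBreak
blockMesh                                # the mesh must exist before decomposing
setFields                                # initialise the water column for damBreak
decomposePar > log.decomposePar 2>&1     # should create processor0..processor3
mpirun -np 4 interFoam -parallel > log 2>&1
reconstructPar                           # merge the per-processor results afterwards
If log.decomposePar reports errors, fix those first; the mpirun step cannot work until decomposePar has created the processor* directories.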
Please provide the following details (which are somewhat implied in this thread: http://www.cfd-online.com/Forums/ope...-get-help.html):
If you're not familiar with how to use the command line in a Linux system, please study one or two tutorials about it. This page might help you get started: http://openfoamwiki.net/index.php/In...with_the_Shell
Best regards,
Bruno
August 2, 2015, 18:18 |
#6
Member
behzad Ghasemi
Join Date: Sep 2013
Location: Iran
Posts: 56
Rep Power: 13
Hi Bruno,
I appreciate you accepting my request, and thank you for your kind answer. I searched OpenFOAM's bug page a few days ago and saw a bug exactly the same as my problem, which you had answered (bug ID 0000301): http://www.openfoam.org/bugs/ The problem was about an illegal machine name. I had reinstalled my Linux for some reasons and changed my machine name too, but I hadn't tested a parallel run again until you asked me to create the log files. I tested it, doing the same things that hadn't worked before, and this time it worked without any problem. So I think it was the inappropriate machine name, and it's solved now. Thank you again, Bruno.
Best regards,
Behzad
Last edited by behzad-cfd; August 3, 2015 at 05:34.
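For anyone who hits the same thing, this is roughly how to check and change the machine name on Ubuntu 12.04 (a minimal sketch; "mybox" is just a placeholder name, so adapt it before use): Code:
hostname                                  # show the current machine name
cat /etc/hostname /etc/hosts              # the name should appear in both files
sudo sh -c 'echo mybox > /etc/hostname'   # pick a plain alphanumeric name
sudo sed -i 's/^127\.0\.1\.1.*/127.0.1.1\tmybox/' /etc/hosts
sudo hostname mybox                       # apply the new name without a reboot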
Tags |
dambreak, mpirun error, openfoam, parallel error |