February 16, 2016, 11:29 |
Problem with running simpleFoam in parallel
|
#1 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
I'm trying to run a turbulent pipe flow using simpleFoam, but unfortunately I get some errors; please see below. I also attach some files used for running the case in parallel. Any thoughts will be appreciated. Thanks
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  3.0.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 3.0.0-6abec57f5449
Exec   : simpleFoam -parallel
Date   : Feb 16 2016
Time   : 09:09:35
Host   : "compute-0-1.local"
PID    : 44738
Case   : /home/mazdak/Ex3
nProcs : 4
Slaves : 3
(
"compute-0-1.local.44739"
"compute-0-1.local.44740"
"compute-0-1.local.44741"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

[0]
[0] --> FOAM FATAL ERROR:
[0] simpleFoam: cannot open case directory "/home/mazdak/Ex3/processor0"
[0]
[0] FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
|
February 16, 2016, 13:25 |
|
#2 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
# Get application name
##application=`getApplication`
##runApplication blockMesh
##runApplication setFields
##runApplication $application
runApplication decomposeParDict

Try just "decomposePar" instead.
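For reference, a corrected Allrun could look roughly like this. This is only a sketch, assuming the standard tutorial RunFunctions are available and a 4-way decomposition; the setFields step is kept because it appears in the script above, and note that the argument order of runParallel differs between OpenFOAM versions:

#!/bin/sh
cd ${0%/*} || exit 1                  # run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

application=`getApplication`          # reads the solver name from system/controlDict

runApplication blockMesh
runApplication setFields
runApplication decomposePar           # reads system/decomposeParDict, creates processor* dirs
runParallel $application 4            # OpenFOAM 2.x/3.x syntax: solver name, then nProcs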
|
February 16, 2016, 13:35 |
|
#3 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
It gives the same error. The application name is "decomposeParDict", so I need to use "runApplication decomposeParDict".
|
|
February 16, 2016, 15:19 |
|
#4 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
Hi,
Here is a quote from the User Guide: http://cfd.direct/openfoam/user-guide/damBreak/#x7-610002.3.11

"The first step required to run a parallel case is to decompose the domain using the decomposePar utility. There is a dictionary associated with decomposePar named decomposeParDict which is located in the system directory."

Could you post your log.decomposePar file?
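For reference, a minimal system/decomposeParDict for a 4-way decomposition could look roughly like this. This is only a sketch; the method and counts are illustrative and not taken from the poster's case:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains  4;

method              scotch;     // simple/hierarchical are common alternatives

// ************************************************************************* //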
|
February 16, 2016, 15:28 |
|
#5 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
Thanks for your response. I have attached the file.
|
|
February 16, 2016, 15:34 |
|
#6 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
As you can see in the log file, there is no command called decomposeParDict.
/share/apps/OpenFOAM/OpenFOAM-2.4.0/bin/tools/RunFunctions: line 52: decomposeParDict: command not found

Therefore, you did not succeed in decomposing your computational domain. Please try "runApplication decomposePar" and then post your log.decomposePar again. Thanks
|
February 16, 2016, 15:35 |
|
#7 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
So what should I do?
|
|
February 16, 2016, 15:42 |
|
#8 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
On the console, I ran this command directly: decomposePar
The decomposition was performed. I then started submitting the job again, but got this message this time:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.4.0-dcea1e13ff76
Exec   : simpleFoam -parallel
Date   : Feb 16 2016
Time   : 13:39:05
Host   : "compute-3-2.local"
PID    : 109277
Case   : /home/mazdak/Ex3
nProcs : 4
Slaves : 3
(
"compute-3-2.local.109278"
"compute-3-2.local.109279"
"compute-3-2.local.109280"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

Create time

Create mesh for time = 0

[3]
[3] --> FOAM FATAL IO ERROR:
[3] Expected a ')' while reading binaryBlock, found on line 20 an error
[3]
[3] file: /home/mazdak/Ex3/processor3/constant/polyMesh/faces at line 20.
[3]
[3]     From function Istream::readEnd(const char*)
[3]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[3]
[3] FOAM parallel run exiting
[3]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] Expected a ')' while reading binaryBlock, found on line 20 an error
[0]
[0] file: /home/mazdak/Ex3/processor0/constant/polyMesh/faces at line 20.
[0]
[0]     From function Istream::readEnd(const char*)
[0]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[0]
[0] FOAM parallel run exiting
[0]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] Expected a ')' while reading binaryBlock, found on line 20 an error
[1]
[1] file: /home/mazdak/Ex3/processor1/constant/polyMesh/faces at line 20.
[1]
[1]     From function Istream::readEnd(const char*)
[1]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[1]
[1] FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Expected a ')' while reading binaryBlock, found on line 20 an error
[2]
[2] file: /home/mazdak/Ex3/processor2/constant/polyMesh/faces at line 20.
[2]
[2]     From function Istream::readEnd(const char*)
[2]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[2]
[2] FOAM parallel run exiting
[2]
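For reference, once decomposePar has created the processor* directories, a parallel run of this kind is typically launched with a command along these lines. This is only a sketch; the processor count must match numberOfSubdomains and the log file name is just a convention:

# launch simpleFoam on the 4 sub-domains created by decomposePar
mpirun -np 4 simpleFoam -parallel > log.simpleFoam 2>&1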
|
February 16, 2016, 15:48 |
|
#9 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
Hi,
Could you please post your log.blockMesh file?
|
February 16, 2016, 15:57 |
|
#10 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
Attached is the list of files I got; I don't have the file you mentioned. In the "Allrun" I removed the command "runApplication decomposeParDict", and below is the new error. When I open one of the processor* directories, I only see a "constant" folder containing polyMesh... (see pic. 2). Should there be any other folders?
|
February 16, 2016, 15:58 |
|
#11 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
new error:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.4.0-dcea1e13ff76
Exec   : simpleFoam -parallel
Date   : Feb 16 2016
Time   : 13:50:34
Host   : "compute-3-4.local"
PID    : 9299
Case   : /home/mazdak/Ex3
nProcs : 4
Slaves : 3
(
"compute-3-4.local.9300"
"compute-3-4.local.9301"
"compute-3-4.local.9302"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

Create time

Create mesh for time = 0

[0]
[0] --> FOAM FATAL IO ERROR:
[0] Expected a ')' while reading binaryBlock, found on line 20 an error
[0]
[0] file: /home/mazdak/Ex3/processor0/constant/polyMesh/faces at line 20.
[0]
[0]     From function Istream::readEnd(const char*)
[0]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[0]
[0] FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Expected a ')' while reading binaryBlock, found on line 20 an error
[2]
[2] file: /home/mazdak/Ex3/processor2/constant/polyMesh/faces at line 20.
[2]
[2]     From function Istream::readEnd(const char*)
[2]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[2]
[2] FOAM parallel run exiting
[2]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] Expected a ')' while reading binaryBlock, found on line 20 an error
[1]
[1] file: /home/mazdak/Ex3/processor1/constant/polyMesh/faces at line 20.
[1]
[1]     From function Istream::readEnd(const char*)
[1]     in file db/IOstreams/IOstreams/Istream.C at line 111.
[1]
[1] FOAM parallel run exiting
[1]
|
February 16, 2016, 16:06 |
|
#12 | |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
Quote:
|
||
February 16, 2016, 16:17 |
|
#13 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
Hi,
It would be easier if you could post your test case.
|
February 16, 2016, 16:27 |
|
#14 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
I sent you a Dropbox link in a private message. The files include the mesh. Thanks for spending time on this.
|
|
February 16, 2016, 16:33 |
|
#15 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
I noticed that before running "decomposePar" I needed to have a folder named "0", not "0.org". With that change I got a "0" folder inside each "processor*" folder. But I was still not able to run the case.
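As an aside, many tutorial Allrun scripts keep the clean starting fields in 0.org and copy them to 0 just before decomposing; a minimal sketch of that pattern, assuming the initial fields really live in 0.org:

# restore a clean 0 directory from the template, then decompose
rm -rf 0
cp -r 0.org 0
decomposePar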
|
|
February 16, 2016, 16:41 |
|
#16 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
||
February 16, 2016, 16:42 |
|
#17 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
You can download it from the above link. Please let me know when you have downloaded it.
|
|
February 16, 2016, 16:59 |
|
#18 |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
Hi,
Please use the scripts in the attachment. Let me know if you still get error messages.
|
February 16, 2016, 17:06 |
|
#19 |
Senior Member
MAZI
Join Date: Oct 2009
Posts: 103
Rep Power: 17 |
It doesn't work. It says:

/opt/gridengine/default/spool/compute-2-6/job_scripts/1688: line 15: ./Allrun: Permission denied

As I mentioned, I am able to decompose the mesh and get those "0" folders, but I have problems afterwards.
|
February 16, 2016, 17:11 |
|
#20 | |
Member
Kaufman
Join Date: Jul 2013
Posts: 55
Rep Power: 13 |
The Allrun script just lacks execute permission:

chmod u+rwx Allrun

I can run your case without any error.
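For completeness, the usual fix is run once from the case directory before resubmitting the queue job. A sketch, using the case path from the thread:

cd /home/mazdak/Ex3        # the case directory
chmod u+rwx Allrun         # give the script execute permission
./Allrun                   # or resubmit the queue job that calls ./Allrun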
||
|
|
Similar Threads | ||||
Thread | Thread Starter | Forum | Replies | Last Post |
OF 2.0.1 parallel running problems | moser_r | OpenFOAM Running, Solving & CFD | 9 | July 27, 2022 04:15 |
Error running simpleFoam in parallel | Yuby | OpenFOAM Running, Solving & CFD | 14 | October 7, 2021 05:38 |
simpleFoam parallel solver & Fluent polyhedral mesh | Zlatko | OpenFOAM Running, Solving & CFD | 3 | September 26, 2014 07:53 |
problem with running in parallel | dhruv | OpenFOAM | 3 | November 25, 2011 06:06 |
parallel mode - small problem? | co2 | FLUENT | 2 | June 2, 2004 00:47 |