Problem with running simpleFoam in parallel

#1 | mazdak (MAZI), Senior Member | February 16, 2016, 11:29
I'm trying to run turbulent pipe flow with simpleFoam, but unfortunately I get some errors; please see below. I have also attached the files used for running the case in parallel. Any thoughts will be appreciated. Thanks.

/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 3.0.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 3.0.0-6abec57f5449
Exec : simpleFoam -parallel
Date : Feb 16 2016
Time : 09:09:35
Host : "compute-0-1.local"
PID : 44738
Case : /home/mazdak/Ex3
nProcs : 4
Slaves :
3
(
"compute-0-1.local.44739"
"compute-0-1.local.44740"
"compute-0-1.local.44741"
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] simpleFoam: cannot open case directory "/home/mazdak/Ex3/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Attached Files
File Type: txt Allrun.txt (527 Bytes, 11 views)
File Type: txt runHOU.sh.txt (501 Bytes, 6 views)
File Type: txt decomposeParDict.txt (958 Bytes, 7 views)

#2 | kmefun (Kaufman), Member | February 16, 2016, 13:25
Your Allrun contains:

# Get application name
##application=`getApplication`

##runApplication blockMesh
##runApplication setFields
##runApplication $application

runApplication decomposeParDict

Try just "runApplication decomposePar" instead.
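For reference, a complete minimal Allrun for a case like this would look roughly as follows (a sketch only; it assumes the mesh comes from blockMesh and the solver runs on 4 processors, as in your log, and that RunFunctions lives in the standard location):

#!/bin/sh
# Source the OpenFOAM run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

runApplication blockMesh      # build the mesh
runApplication decomposePar   # split the case into processor* directories
runParallel simpleFoam 4      # run the solver on 4 MPI ranks

Note that in OpenFOAM 2.4.x runParallel takes the processor count as an argument; newer versions read it from decomposeParDict.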

#3 | mazdak (MAZI), Senior Member | February 16, 2016, 13:35
It gives the same error. The application name is "decomposeParDict", and I need to use "runApplication decomposeParDict".

#4 | kmefun (Kaufman), Member | February 16, 2016, 15:19
Hi,

Here is a quote from the User Guide: http://cfd.direct/openfoam/user-guide/damBreak/#x7-610002.3.11

The first step required to run a parallel case is to decompose the domain using the decomposePar utility. There is a dictionary associated with decomposePar named decomposeParDict which is located in the system directory.
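For a 4-processor run like yours, a minimal system/decomposeParDict looks roughly like this (a sketch; the simple method and its coefficients are illustrative assumptions, so adapt them to your pipe geometry):

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 4;      // must match the number of MPI ranks

method          simple;    // plain geometric decomposition

simpleCoeffs
{
    n           (2 2 1);   // 2 x 2 x 1 = 4 subdomains
    delta       0.001;
}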

Could you post your decompose log file?

#5 | mazdak (MAZI), Senior Member | February 16, 2016, 15:28
Thanks for your response. I have attached the file.
Attached Files
File Type: txt log.decomposeParDict.txt (105 Bytes, 8 views)

#6 | kmefun (Kaufman), Member | February 16, 2016, 15:34
As you can see in the log file, there is no command called decomposeParDict:

/share/apps/OpenFOAM/OpenFOAM-2.4.0/bin/tools/RunFunctions: line 52: decomposeParDict: command not found

Therefore, the decomposition of your computational domain did not succeed.

Please try "runApplication decomposePar" and then post your log.decomposePar again. Thanks.

#7 | mazdak (MAZI), Senior Member | February 16, 2016, 15:35
So what should I do?

#8 | mazdak (MAZI), Senior Member | February 16, 2016, 15:42
On the console, I ran the command decomposePar and the decomposition was performed. I then submitted the job, but this time got this message:

/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.4.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.4.0-dcea1e13ff76
Exec : simpleFoam -parallel
Date : Feb 16 2016
Time : 13:39:05
Host : "compute-3-2.local"
PID : 109277
Case : /home/mazdak/Ex3
nProcs : 4
Slaves :
3
(
"compute-3-2.local.109278"
"compute-3-2.local.109279"
"compute-3-2.local.109280"
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] Expected a ')' while reading binaryBlock, found on line 20 an error
[3]
[3] file: /home/mazdak/Ex3/processor3/constant/polyMesh/faces at line 20.
[3]
[3] From function Istream::readEnd(const char*)
[3] in file db/IOstreams/IOstreams/Istream.C at line 111.
[3]
FOAM parallel run exiting
[3]
[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] Expected a ')' while reading binaryBlock, found on line 20 an error
[0]
[0] file: /home/mazdak/Ex3/processor0/constant/polyMesh/faces at line 20.
[0]
[0] From function Istream::readEnd(const char*)
[0] in file db/IOstreams/IOstreams/Istream.C at line 111.
[0]
FOAM parallel run exiting
[0]
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] Expected a ')' while reading binaryBlock, found on line 20 an error
[1]
[1] file: /home/mazdak/Ex3/processor1/constant/polyMesh/faces at line 20.
[1]
[1] From function Istream::readEnd(const char*)
[1] in file db/IOstreams/IOstreams/Istream.C at line 111.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Expected a ')' while reading binaryBlock, found on line 20 an error
[2]
[2] file: /home/mazdak/Ex3/processor2/constant/polyMesh/faces at line 20.
[2]
[2] From function Istream::readEnd(const char*)
[2] in file db/IOstreams/IOstreams/Istream.C at line 111.
[2]
FOAM parallel run exiting
[2]

#9 | kmefun (Kaufman), Member | February 16, 2016, 15:48
Hi,

Could you please post your log.blockMesh file?

#10 | mazdak (MAZI), Senior Member | February 16, 2016, 15:57
Attached is the list of files I got; I don't have the file you mentioned, and below is the new error. In the Allrun, I removed the command "runApplication decomposeParDict".

When I open one of the processor* directories, I just see a "constant" folder containing polyMesh (see pic. 2). Should there be any other folders?
Attached Images
File Type: jpg 1.jpg (99.1 KB, 16 views)
File Type: jpg 2.jpg (102.3 KB, 13 views)

#11 | mazdak (MAZI), Senior Member | February 16, 2016, 15:58
New error:


/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.4.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.4.0-dcea1e13ff76
Exec : simpleFoam -parallel
Date : Feb 16 2016
Time : 13:50:34
Host : "compute-3-4.local"
PID : 9299
Case : /home/mazdak/Ex3
nProcs : 4
Slaves :
3
(
"compute-3-4.local.9300"
"compute-3-4.local.9301"
"compute-3-4.local.9302"
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] Expected a ')' while reading binaryBlock, found on line 20 an error
[0]
[0] file: /home/mazdak/Ex3/processor0/constant/polyMesh/faces at line 20.
[0]
[0] From function Istream::readEnd(const char*)
[0] in file db/IOstreams/IOstreams/Istream.C at line 111.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Expected a ')' while reading binaryBlock, found on line 20 an error
[2]
[2] file: /home/mazdak/Ex3/processor2/constant/polyMesh/faces at line 20.
[2]
[2] From function Istream::readEnd(const char*)
[2] in file db/IOstreams/IOstreams/Istream.C at line 111.
[2]
FOAM parallel run exiting
[2]
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] Expected a ')' while reading binaryBlock, found on line 20 an error
[1]
[1] file: /home/mazdak/Ex3/processor1/constant/polyMesh/faces at line 20.
[1]
[1] From function Istream::readEnd(const char*)
[1] in file db/IOstreams/IOstreams/Istream.C at line 111.
[1]
FOAM parallel run exiting
[1]

#12 | kmefun (Kaufman), Member | February 16, 2016, 16:06
Quote: Originally Posted by mazdak (post #10)
When I open one of the processor* directories, I just see a "constant" folder containing polyMesh. Should there be any other folders?
In each processor* directory there should be a "0" directory holding the initial and boundary conditions and a "constant" directory holding the decomposed mesh.
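The expected layout after a successful decomposePar on 4 subdomains looks roughly like this (a sketch; the field files under 0/ depend on your case setup):

processor0/
    0/                 # decomposed initial and boundary fields
    constant/
        polyMesh/      # this rank's piece of the mesh
processor1/
    ...
processor3/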

#13 | kmefun (Kaufman), Member | February 16, 2016, 16:17
Hi,

It would be easier if you could post your test case.

#14 | mazdak (MAZI), Senior Member | February 16, 2016, 16:27
I sent you a Dropbox link in a private message. The files include the mesh. Thanks for spending the time.

#15 | mazdak (MAZI), Senior Member | February 16, 2016, 16:33
I noticed that before running "decomposePar" I needed to have a folder named "0", not "0.org". Once I did that, I got a "0" folder inside each "processor*" folder. But I still was not able to run the case.
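For the record, the usual pattern when a case ships with a "0.org" directory is to copy it to "0" before decomposing (a sketch, assuming the common tutorial convention of keeping pristine fields in 0.org):

cp -r 0.org 0    # restore clean initial and boundary fields
decomposePar     # then split the case for the parallel run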

#16 | kmefun (Kaufman), Member | February 16, 2016, 16:41
Quote: Originally Posted by mazdak (post #14)
I sent you a Dropbox link in a private message.

I didn't receive any message from you. Could you send it again?

#17 | mazdak (MAZI), Senior Member | February 16, 2016, 16:42
Download it from the above link, and please let me know when you have downloaded it.

#18 | kmefun (Kaufman), Member | February 16, 2016, 16:59
Hi,

Please use the attached scripts. Let me know if you still get error messages.
Attached Files
File Type: gz runHOU.sh.tar.gz (735 Bytes, 25 views)

#19 | mazdak (MAZI), Senior Member | February 16, 2016, 17:06
It doesn't work. It says:

/opt/gridengine/default/spool/compute-2-6/job_scripts/1688: line 15: ./Allrun: Permission denied

As I mentioned, I am able to decompose the mesh and get those "0" folders, but I have problems afterwards.

#20 | kmefun (Kaufman), Member | February 16, 2016, 17:11
Quote: Originally Posted by mazdak (post #19)
./Allrun: Permission denied
Make the Allrun file executable:

chmod u+rwx Allrun

I can run your case without any error.
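With that fixed, a submission script only needs something like the following (a sketch; the processor count of 4 comes from your earlier logs, and the log file name is just a convention):

chmod u+rwx Allrun    # one-time: make the script executable
./Allrun              # mesh, decompose, and launch the solver
# or run the solver step directly:
mpirun -np 4 simpleFoam -parallel > log.simpleFoam 2>&1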
