
simpleFoam parallel

November 7, 2013, 14:53   #1
Andrew Mortimer (New Member)
Hi,

I'm having a bit of trouble running simpleFoam in parallel. I am using the motorBike tutorial and trying to run it on 6 cores (the processor is an i7-4930K).

I ran blockMesh, surfaceFeatureExtract and snappyHexMesh, then commented out the functions section of the controlDict file (following a tutorial from a lecturer). After that I ran decomposePar and viewed the individual meshes in paraFoam; everything seemed to have been split up evenly.
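In other words, roughly this sequence (utility options left at whatever the tutorial case provides; treat this as a sketch rather than my exact terminal history):

Code:
blockMesh                # background hex mesh
surfaceFeatureExtract    # extract feature edges for snapping
snappyHexMesh            # mesh around the motorbike geometry
decomposePar             # split the case for 6 processors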

Next, I ran:

Code:
 mpirun -np 6 simpleFoam -parallel
This gave an error message and the solver did not run:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.2                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.2-9240f8b967db
Exec   : simpleFoam -parallel
Date   : Nov 07 2013
Time   : 18:47:22
Host   : "andrew-pc"
PID    : 620
Case   : /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel
nProcs : 6
Slaves : 
5
(
"andrew-pc.621"
"andrew-pc.622"
"andrew-pc.623"
"andrew-pc.624"
"andrew-pc.625"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 3

Reading field p

[0] 
[0] 
[0] --> FOAM FATAL IO ERROR: 
[0] cannot find file
[0] 
[0] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor0/3/p at line 0.
[0] 
[0]     From function regIOobject::readStream()
[0]     in file db/regIOobject/regIOobjectRead.C at line 73.
[0] 
FOAM parallel run exiting
[0] 
[1] 
[1] 
[1] --> FOAM FATAL IO ERROR: 
[1] cannot find file
[1] 
[1] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor1/3/p at line 0.
[1] 
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 73.
[1] 
FOAM parallel run exiting
[1] 
[2] 
[2] 
[2] --> FOAM FATAL IO ERROR: 
[2] cannot find file
[2] 
[2] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor2/3/p at line 0.
[2] 
[2]     From function regIOobject::readStream()
[2]     in file db/regIOobject/regIOobjectRead.C at line 73.
[2] 
FOAM parallel run exiting
[2] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 5 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[3] 
[3] 
[3] --> FOAM FATAL IO ERROR: 
[3] cannot find file
[3] 
[3] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor3/3/p at line 0.
[3] 
[3]     From function regIOobject::readStream()
[3]     in file db/regIOobject/regIOobjectRead.C at line 73.
[3] 
FOAM parallel run exiting
[3] 
[4] 
[4] 
[4] --> FOAM FATAL IO ERROR: 
[4] cannot find file
[4] 
[4] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor4/3/p at line 0.
[4] 
[4]     From function regIOobject::readStream()
[4]     in file db/regIOobject/regIOobjectRead.C at line 73.
[4] 
FOAM parallel run exiting
[4] 
[5] 
[5] 
[5] --> FOAM FATAL IO ERROR: 
[5] cannot find file
[5] 
[5] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor5/3/p at line 0.
[5] 
[5]     From function regIOobject::readStream()
[5]     in file db/regIOobject/regIOobjectRead.C at line 73.
[5] 
FOAM parallel run exiting
[5] 
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 621 on
node andrew-pc exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[andrew-pc:00619] 5 more processes have sent help message help-mpi-api.txt / mpi-abort
[andrew-pc:00619] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
andrew@andrew-pc:~/OpenFOAM/andrew-2.2.2/run/motorBikeParallel$ mpirun -np 6 simpleFoam -parallel > simpleFoamParallel.log
[... two further attempts produce the same "cannot find file .../processorN/3/p" error from every rank, followed by MPI_ABORT and the same mpirun exit messages as above ...]
I have had a look and cannot find any 'p' files in any of the processor folders.

Thanks for any help,

Andrew

November 8, 2013, 11:33   #2
Lieven (Senior Member, Leuven, Belgium)
Hi Andrew,

Is the p file present in the 0 folder of the un-decomposed case? decomposePar simply takes all the field files it can find there and decomposes them. Could you post the output of decomposePar?
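As a quick check (just a sketch, using the paths from your log), the field files decomposePar distributes live in 0/, and after decomposing, each processor folder should contain constant/ plus a time directory with the same fields:

Code:
cd /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel
ls 0/            # expect p, U, k, omega, nut, ...
ls processor0/   # expect constant/ plus a time folder (0/, or 3/ if the case starts from the latest mesh time)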

Cheers,

L

November 8, 2013, 14:09   #3
Andrew Mortimer (New Member)
Hi Lieven,

No, there are no files in the processor directories other than a constant folder containing the polyMesh directory, with all of the dictionaries usually found within polyMesh.

This is the output from decomposePar:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.2                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.2-9240f8b967db
Exec   : decomposePar
Date   : Nov 08 2013
Time   : 18:07:53
Host   : "andrew-pc"
PID    : 3455
Case   : /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel2
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod hierarchical

Finished decomposition in 0.26 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
    Number of cells = 58130
    Number of faces shared with processor 1 = 1993
    Number of faces shared with processor 3 = 4080
    Number of processor patches = 2
    Number of processor faces = 6073
    Number of boundary faces = 10495

Processor 1
    Number of cells = 58130
    Number of faces shared with processor 0 = 1993
    Number of faces shared with processor 2 = 1310
    Number of faces shared with processor 3 = 43
    Number of faces shared with processor 4 = 3157
    Number of processor patches = 4
    Number of processor faces = 6503
    Number of boundary faces = 12452

Processor 2
    Number of cells = 58131
    Number of faces shared with processor 1 = 1310
    Number of faces shared with processor 4 = 38
    Number of faces shared with processor 5 = 4835
    Number of processor patches = 3
    Number of processor faces = 6183
    Number of boundary faces = 2017

Processor 3
    Number of cells = 58130
    Number of faces shared with processor 0 = 4080
    Number of faces shared with processor 1 = 43
    Number of faces shared with processor 4 = 1980
    Number of processor patches = 3
    Number of processor faces = 6103
    Number of boundary faces = 10818

Processor 4
    Number of cells = 58130
    Number of faces shared with processor 1 = 3157
    Number of faces shared with processor 2 = 38
    Number of faces shared with processor 3 = 1980
    Number of faces shared with processor 5 = 1390
    Number of processor patches = 4
    Number of processor faces = 6565
    Number of boundary faces = 12158

Processor 5
    Number of cells = 58131
    Number of faces shared with processor 2 = 4835
    Number of faces shared with processor 4 = 1390
    Number of processor patches = 2
    Number of processor faces = 6225
    Number of boundary faces = 1921

Number of processor faces = 18826
Max number of cells = 58131 (0.00114685% above average 58130.3)
Max number of processor patches = 4 (33.3333% above average 3)
Max number of faces between processors = 6565 (4.61596% above average 6275.33)

Time = 0

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer
Processor 4: field transfer
Processor 5: field transfer

End.
Thanks,
Andrew

November 8, 2013, 18:10   #4
Lieven (Senior Member, Leuven, Belgium)
Ok, that doesn't explain a lot. Could you enter the following in the terminal and post the output:
Code:
ls /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/0/
Cheers,

L

November 8, 2013, 18:39   #5
Andrew Mortimer (New Member)
Apologies, I misread your post. In the undecomposed case, yes, the k, nut, omega, p and U files are present, as well as the include folder containing fixedInlet, frontBackUpperPatches and initialConditions.

ls /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/0.org gives:

Code:
include
k
nut
omega
p
U
ls /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/0.org/include gives:

Code:
fixedInlet
frontBackUpperPatches
initialConditions
Cheers,
Andrew

November 8, 2013, 18:49   #6
Lieven (Senior Member, Leuven, Belgium)
Ah, OK, I think I understand: the 0.org folder is not recognized by decomposePar; instead you need a 0 folder. Try again with the following series of commands:
Code:
cd /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/
cp -r 0.org 0
decomposePar
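One more note (assuming the first decomposePar attempt left them behind): the old, field-less processor* folders will make decomposePar complain that the case is already decomposed, so clear them first, either by hand or with the -force option:

Code:
rm -rf processor*    # discard the old decomposition
decomposePar         # or: decomposePar -force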
Cheers,

L

November 8, 2013, 19:25   #7
Andrew Mortimer (New Member)
Great! It is running now. Many thanks for the help; it is quite frustrating when something so small is wrong and nothing will run.

Andrew

November 8, 2013, 19:56   #8
Andrew Mortimer (New Member)
I'm afraid I may have posted a bit prematurely.

After running

Code:
mpirun -np 6 simpleFoam
the time directories 100, 200, 300 and 400 are not going into the processor folders; they are simply being put in the case folder.

ls /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel gives:

Code:
0    300  constant  motorBikeParallel.foam  processor2  processor5      system
100  400  log       processor0              processor3  simpleFoam.log
200  500  mesh.run  processor1              processor4  solver.run
reconstructPar -latestTime also does not work; I assume this is because the solver has not actually written the solution into the processor directories, i.e. it has not run in parallel at all.

November 9, 2013, 04:35   #9
Lieven (Senior Member, Leuven, Belgium)
Hi Andrew,

Indeed, we're not there yet, but at least we're getting closer ;-)

You need to add the -parallel option to the solver when you want to run it in parallel. So try:
Code:
mpirun -np 6 simpleFoam -parallel
I guess that should do the trick. And it is as you say: if there are no time folders in the processor directories to reconstruct, reconstructPar won't do a lot.

Also try running the simulation with foamJob, a script provided by OpenFOAM:
Code:
foamJob -p simpleFoam
which basically does the same thing but automatically detects the number of processors, creates a log file, etc. You can then use this log file with foamLog to extract convergence information.
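For example, once the run has finished (foamJob writes its output to a file called log by default; the exact file names under logs/ are from memory, so treat them as an assumption):

Code:
foamLog log      # extract residuals, continuity errors, ... from ./log into ./logs/
head logs/p_0    # p residual history, two columns: time and residual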

Cheers,

L

November 9, 2013, 13:32   #10
Andrew Mortimer (New Member)
I'm not sure what the problem is now, but I keep getting a floating point error! This seems strange, as it would suggest there's something wrong with the boundary conditions, but I am using the tutorial files. I also spoke to a friend who has successfully run it in parallel, and I have used exactly the same commands as him; the only difference is that he has 8 cores and I have 6, so I edited the decomposeParDict file to reflect this.
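For reference, a 6-core hierarchical decomposition in system/decomposeParDict would look something like this (the (3 2 1) split is just one possible factorisation of 6; I'm not certain it is the best choice for this mesh):

Code:
numberOfSubdomains 6;

method          hierarchical;   // matches "Selecting decompositionMethod hierarchical" in the output above

hierarchicalCoeffs
{
    n           (3 2 1);        // 3 x 2 x 1 = 6 subdomains
    delta       0.001;
    order       xyz;
}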

Both
Code:
mpirun -np 6 simpleFoam -parallel
and
Code:
foamJob -p simpleFoam
fail on timestep 267.

The log from this can be found here:
https://www.dropbox.com/s/ol8git043gbz1ic/log
(I couldn't seem to upload it to the forum, and it has too many characters to post inline.)

Cheers,
Andrew

November 12, 2013, 18:03   #11
Andrew Mortimer (New Member)
Just a note to say I solved this problem by doing an mpirun of potentialFoam before running simpleFoam.

That is, I ran:

Code:
mpirun -np 6 potentialFoam -parallel
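So, putting the whole thread together, the parallel part of the sequence that ended up working for me was roughly:

Code:
cp -r 0.org 0                            # field files must sit in a 0/ directory
decomposePar                             # (or decomposePar -force if processor* folders already exist)
mpirun -np 6 potentialFoam -parallel     # initialise the flow field with a potential-flow solution
mpirun -np 6 simpleFoam -parallel
reconstructPar -latestTime               # merge the processor results for post-processing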
Thanks for your help, Lieven.

August 7, 2015, 19:08   #12
Mike "saeedi" (Member, Canada)
Hi,

I am new to OpenFOAM and I am having a similar problem.

The point is that when I run decomposePar, it does not create a "0" folder for each processor. I have tested it on 2D cases and another 3D case, and for those decomposePar works OK, but now that I want to start running a real 3D case, decomposePar does not do the job.

I do not know what I am missing.

BTW, I am running it on a public (shared) machine, and here is the error message when I run decomposePar:

Code:
Time = 0


--> FOAM FATAL IO ERROR:
keyword type is undefined in dictionary "/global/home/saeedi/bluffbody-OP/Re300-AR4/0/p.boundaryField.outlet"

file: /global/home/saeedi/bluffbody-OP/Re300-AR4/0/p.boundaryField.outlet from line 34 to line 34.

From function dictionary::lookupEntry(const word&, bool, bool) const
in file db/dictionary/dictionary.C at line 437.

FOAM exiting

August 7, 2015, 19:45   #13
Mike "saeedi" (Member, Canada)
I think I figured out the problem.

I looked at my "p" file at the line which specifies the outlet. I think I had missed a line there (type fixedValue;).
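For anyone else who hits this: every patch in 0/p needs a type entry, so the outlet block should look something like the following (the uniform 0 value is just an example, use whatever your case needs):

Code:
outlet
{
    type            fixedValue;
    value           uniform 0;
}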
