
LTSReactingParcelFoam doesn't work in parallel


August 2, 2012, 07:41   #1
Chrisi1984 (Senior Member, Stuttgart; joined Jan 2010)
Hi all,

I have figured out that the LTSReactingParcelFoam solver does not run in parallel.

When I start the simulation, I receive the following error message:

Quote:
[1] --> FOAM FATAL ERROR:
[1] read failed
[1]
[1] From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[1] in file UIPread.C at line 114.
[1]
FOAM parallel run aborting
[1]
(ranks 2 and 4 abort with the same "read failed" error; their output was interleaved with rank 1's)
[1] #0 Foam::error::printStack(Foam::Ostream&) in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1 Foam::error::abort() in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2 Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[1] #3 Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #4 Foam::IOdictionary::readFile(bool) in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #5 Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #6 Foam::IObasicSourceList::IObasicSourceList(Foam::fvMesh const&) in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[1] #7 main in "/usr/local/openfoam/OpenFOAM-2.1.0/platforms/linux64GccDPOpt/bin/porousExplicitSourceReactingParcelFoam_new"
[1] #8 __libc_start_main in "/lib64/libc.so.6"
[1] #9 _start at /usr/src/packages/BUILD/glibc-2.9/csu/../sysdeps/x86_64/elf/start.S:116
(ranks 2 and 4 print identical stack traces)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 4 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 4 with PID 3790 on
node fe-z0bb9 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
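
For completeness, this is roughly how I launch the case (a sketch, not my exact script: the core count matches the five ranks [0]..[4] above, and the log file name is just a placeholder):

Code:
# decompose the case as set up in system/decomposeParDict
decomposePar

# run the solver on 5 processes with Open MPI
mpirun -np 5 LTSReactingParcelFoam -parallel > log.LTSReactingParcelFoam 2>&1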
Has anybody faced the same problem?

What could be the reason for it?

Kind regards

Christian

August 2, 2012, 16:24   #2
wyldckat - Bruno Santos (Retired Super Moderator, Lisbon, Portugal; joined Mar 2009)
Greetings Christian,

This issue is currently being tracked here: http://www.openfoam.org/mantisbt/view.php?id=579

Best regards,
Bruno

August 5, 2012, 07:19   #3
Chrisi1984 (Senior Member, Stuttgart; joined Jan 2010)
Hello Bruno,

Thank you for the link!

I tried to implement those recommendations.

After changing the files as described in your link, I recompiled src/Pstream with "Allwmake".
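
Concretely, what I ran was something like this (from memory; the install path is the one from the trace in my first post):

Code:
cd /usr/local/openfoam/OpenFOAM-2.1.0/src/Pstream
./Allwmake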

But the solver LTSReactingParcelFoam is still not running in parallel. Now I receive the following error message:
Quote:
Creating field source list from sourcesProperties

[1] --> FOAM FATAL IO ERROR:
[1] Istream not OK for reading dictionary
[1]
[1] file: IOstream at line 0.
[1]
[1] From function dictionary::read(Istream&, bool)
[1] in file db/dictionary/dictionaryIO.C at line 85.
[1]
FOAM parallel run exiting
[1]
(ranks 2 and 4 exit with the same IO error; their output was interleaved with rank 1's)

Creating porous zones

Starting time loop

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 4 with PID 18035 on
node christian-laptop exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[christian-laptop:18030] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[christian-laptop:18030] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
What was my mistake in implementing the described changes?

Kind regards

Christian

August 5, 2012, 09:01   #4
wyldckat - Bruno Santos (Retired Super Moderator, Lisbon, Portugal; joined Mar 2009)
Hi Christian,

Mmm... there might have been other modifications along the way that are also necessary. The simplest way would be for you to build the latest OpenFOAM 2.1.x; that way you'll be certain that no changes were missed.
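
If you go that route, it is roughly the following (a rough sketch; the clone URL and paths are from memory, adjust them to your machine):

Code:
cd $HOME/OpenFOAM
git clone git://github.com/OpenFOAM/OpenFOAM-2.1.x.git
cd OpenFOAM-2.1.x
source etc/bashrc    # activate the new build's environment
./Allwmake > log.Allwmake 2>&1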

Best regards,
Bruno

August 6, 2012, 04:08   #5
dima - Dima Risch (New Member, Cologne; joined Jun 2011)
Hello Christian,

I think you have to recompile src/OpenFOAM/db/dictionary/dictionaryIO.C and the solver too.

Of course Bruno's way is the safest, but if you have not done it yet, try the following (the commands are collected below):

in /OpenFOAM-2.1.x/src/OpenFOAM/
type: wmake libso (maybe Allwmake in /OpenFOAM-2.1.x/src is necessary)

and in /OpenFOAM-2.1.x/applications/solvers/lagrangian/LTSReactingParcelFoam/
type: wmake (maybe wclean before)
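
Put together (untested; assumes your 2.1.x environment is sourced so that $WM_PROJECT_DIR points at your OpenFOAM-2.1.x):

Code:
# rebuild the core library that contains dictionaryIO.C
cd $WM_PROJECT_DIR/src/OpenFOAM
wmake libso

# then rebuild the solver against it
cd $WM_PROJECT_DIR/applications/solvers/lagrangian/LTSReactingParcelFoam
wclean
wmake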

Kind regards,
Dima
