[snappyHexMesh] troubles with sHM and parallel

August 30, 2012, 18:00   #1
Tobias Holzmann (Tobi), Super Moderator
Hey guys,

I have been using sHM for a year now, and since last week I get the following error when running sHM in parallel:

Code:
Create mesh for time = 1

[1] 
[1] 
[1] --> FOAM FATAL ERROR: 
[1] read failed
[1] 
[1]     From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[1]     in file UIPread.C at line 114.
[1] 
FOAM parallel run aborting
[1] 
[2] 
[2] 
[2] --> FOAM FATAL ERROR: 
[2] read failed
[2] 
[2]     From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[2]     in file UIPread.C at line 114.
[2] 
FOAM parallel run aborting
[2] 
Read mesh in = 0.12 s
[2] #0  [1] #0  Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1  Foam::error::abort() in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1  Foam::error::abort() in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2  Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2  Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[1] #3  Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[2] #3  Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #4  Foam::IOdictionary::readFile(bool) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #4  Foam::IOdictionary::readFile(bool) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #5  Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #5  Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #6  Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #6  Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #7  Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #7  Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[1] #8   in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[2] #8  

[1]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[1] #9  __libc_start_main[2]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[2] #9  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #10   in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #10  

[1]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 13823 on
node cfd exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[cfd:13821] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[cfd:13821] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
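
For context, I start the run the usual way, roughly like this (a sketch; the three ranks are just an example and must match numberOfSubdomains in system/decomposeParDict):

Code:
# decompose the case; system/decomposeParDict sets the subdomain count
decomposePar

# run sHM in parallel; the rank count must equal numberOfSubdomains
mpirun -np 3 snappyHexMesh -parallel -overwrite

# merge the processor meshes back together afterwards
reconstructParMesh -constant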
I compiled another sHM version, but removed it again. Is it possible that I broke my 2.1.x build of sHM by doing that?

Or what else could cause the error?
Thanks for reading and helping.

PS: Running solvers in parallel still works fine, by the way.
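
A quick way to check whether a leftover custom build is still being picked up (a rough sketch; on a standard installation $FOAM_USER_APPBIN precedes $FOAM_APPBIN on the PATH, so a user-side binary shadows the stock one):

Code:
# which binary does the shell find first?
which snappyHexMesh

# a user-side build would live here and shadow the stock installation
ls -l $FOAM_USER_APPBIN/snappyHexMesh

# the stock 2.1.x binary
ls -l $FOAM_APPBIN/snappyHexMesh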

Tobi

August 30, 2012, 18:54   #2
Tobias Holzmann (Tobi), Super Moderator
Okay, it seems that more than just sHM is broken. I'll recompile OpenFOAM.
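
I will rebuild roughly like this (a sketch, assuming the standard 2.1.x source layout: sHM alone first, then the whole tree if that is not enough):

Code:
# rebuild only the snappyHexMesh application
cd $WM_PROJECT_DIR/applications/utilities/mesh/generation/snappyHexMesh
wclean
wmake

# if that does not cure it, rebuild the complete installation
cd $WM_PROJECT_DIR
./Allwmake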
Tobi

