
Encountering an error while running a case on a cluster using OpenFOAM

July 31, 2018, 02:29   #1
New Member

Suman Dathathreya
Join Date: Jul 2018
Posts: 13
Hi
I am currently working on a project related to aircraft wings, and while running the analysis I get the error below.


To run the case I used:



Code:
/usr/bin/mpirun --hostfile machines -np 8 simpleFoam -parallel
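For reference, an Open MPI hostfile like "machines" just lists one node per line, optionally with a slot count. A minimal sketch (the hostnames and slot counts here are placeholders, not taken from this thread):

Code:
# machines -- Open MPI hostfile; hostnames and slots are illustrative
node1 slots=4
node2 slots=4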

Code:
[1]
[1] --> FOAM FATAL ERROR:
[1] UPstream::init(int& argc, char**& argv) : environment variable MPI_BUFFER_SIZE not defined
[1]
[1] From function static bool Foam::UPstream::init(int&, char**&)
[1] in file UPstream.C at line 103.
[1]
FOAM parallel run aborting
[1]
(ranks [0] through [7] all print the same error)

[1] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[1] #1 Foam::error::abort() at ??:?
[1] #2 Foam::UPstream::init(int&, char**&) at ??:?
[1] #3 Foam::argList::argList(int&, char**&, bool, bool, bool) at ??:?
[1] #4 ? at ??:?
[1] #5 __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #6 ? in "/usr/bin/simpleFoam"
(each rank prints the same stack trace)

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[linux:18216] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[linux:18216] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages





I've set up the NFS server and client and also enabled passwordless SSH. I need help, as I am new to OpenFOAM.
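In case it helps with diagnosis: the error message itself says that MPI_BUFFER_SIZE is missing from the environment of the launched processes. OpenFOAM's etc/bashrc normally exports this variable, so a common cause is that the OpenFOAM environment is sourced in your interactive login shell but not in the non-interactive shells that mpirun opens on the remote nodes over SSH. Two common workarounds, as a hedged sketch (the value 20000000 and the installation path are illustrative assumptions, not taken from this thread):

Code:
# Option 1: set the variable yourself and ask Open MPI to forward it to all ranks
export MPI_BUFFER_SIZE=20000000
/usr/bin/mpirun -x MPI_BUFFER_SIZE --hostfile machines -np 8 simpleFoam -parallel

# Option 2: source the OpenFOAM environment in ~/.bashrc on every node,
# so that non-interactive SSH shells also export MPI_BUFFER_SIZE
# (adjust the path to your installation)
echo 'source /opt/openfoam/etc/bashrc' >> ~/.bashrc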

August 3, 2018, 17:20   #2
Senior Member

Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,764
Are you sure that is the right command to run on your cluster? I.e., try instead:

Code:
/usr/bin/mpirun -np 8 -machinefile machines simpleFoam -parallel
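If the error persists, it may be worth checking whether the variable reaches the remote processes at all. A quick diagnostic sketch, assuming Open MPI (printenv prints nothing and exits nonzero wherever the variable is unset):

Code:
# Each rank should print a value; missing lines mean the variable
# is not being propagated to that node.
/usr/bin/mpirun -np 8 -machinefile machines printenv MPI_BUFFER_SIZE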

August 6, 2018, 02:35   #3
New Member

Suman Dathathreya
Join Date: Jul 2018
Posts: 13
Hi,
Sorry for the late reply. I tried the command you mentioned, but I got the same error as in my post above.

