
MPI run fails on quad-core OpenFOAM 2.4


Old   January 25, 2016, 07:33
MPI run fails on quad-core OpenFOAM 2.4
  #1
New Member
 
Priya Somasundaran
Join Date: Oct 2015
Posts: 6
I am trying to run the case in parallel with mpirun; decomposePar worked fine:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build : 2.4.0-f0842aea0e77
Exec : decomposePar
Date : Jan 25 2016
Time : 19:28:34
Host : "GEOSCIENCE-PC"
PID : 20725
Case : /home/priya/OpenFOAM/priya-2.4.0/run/simulation
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod scotch

Finished decomposition in 19.3 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
Number of cells = 1532953
Number of faces shared with processor 1 = 6519
Number of faces shared with processor 2 = 1733
Number of faces shared with processor 3 = 4880
Number of processor patches = 3
Number of processor faces = 13132
Number of boundary faces = 330832

Processor 1
Number of cells = 1540649
Number of faces shared with processor 0 = 6519
Number of faces shared with processor 2 = 4783
Number of faces shared with processor 3 = 709
Number of processor patches = 3
Number of processor faces = 12011
Number of boundary faces = 333490

Processor 2
Number of cells = 1653908
Number of faces shared with processor 0 = 1733
Number of faces shared with processor 1 = 4783
Number of faces shared with processor 3 = 5770
Number of processor patches = 3
Number of processor faces = 12286
Number of boundary faces = 353823

Processor 3
Number of cells = 1604279
Number of faces shared with processor 0 = 4880
Number of faces shared with processor 1 = 709
Number of faces shared with processor 2 = 5770
Number of processor patches = 3
Number of processor faces = 11359
Number of boundary faces = 348006

Number of processor faces = 24394
Max number of cells = 1653908 (4.48282% above average 1.58295e+06)
Max number of processor patches = 3 (0% above average 3)
Max number of faces between processors = 13132 (7.66582% above average 12197)

Time = 0

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer

End.
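As a sanity check (not shown in the log above), a 4-way decomposition should leave four processor directories in the case, which can be confirmed with

ls -d processor*    # expected: processor0  processor1  processor2  processor3

before attempting the parallel run.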

priya@GEOSCIENCE-PC:~/OpenFOAM/priya-2.4.0/run/simulation$ mpirun –np <nProcs> buoyantBoussinesqSimpleFoam -parallel >& log &
[1] 20734
bash: nProcs: No such file or directory
priya@GEOSCIENCE-PC:~/OpenFOAM/priya-2.4.0/run/simulation$ mpirun –np 4 buoyantBoussinesqSimpleFoam -parallel >& log &
[2] 20735
[1] Exit 1 mpirun –np -parallel < nProcs > buoyantBoussinesqSimpleFoam &> log


Nothing seems to happen. I am using the OpenFOAM 2.4.0 version.
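Since the run is sent to the background with &, the quickest way to see whether anything is actually happening is to check the job and follow the log file (just standard bash commands, nothing OpenFOAM-specific):

jobs -l        # lists background jobs and shows whether mpirun is still running
tail -f log    # follows the solver output being written to the log file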

Any help will be appreciated
Thanks

Last edited by Priya Somasundaran; January 25, 2016 at 07:40. Reason: spelling mistake

Old   January 25, 2016, 08:58
  #2
Senior Member
 
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Your first command is wrong. The second command might have worked, but you need to post the contents of the log file.
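To spell out the first error: <nProcs> is only a placeholder (presumably copied from a guide). Left in literally, bash reads "< nProcs >" as input/output redirection involving a file called nProcs, which is exactly the "No such file or directory" message you got. With the real subdomain count substituted, a sketch of the intended command looks like

# replace the placeholder with the number of subdomains used by decomposePar (4 here)
mpirun -np 4 buoyantBoussinesqSimpleFoam -parallel >& log &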
__________________
*On twitter @akidTwit
*Spend as much time formulating your questions as you expect people to spend on their answer.

Old   January 25, 2016, 09:39
Log file
  #3
New Member
 
Priya Somasundaran
Join Date: Oct 2015
Posts: 6
Thanks for the quick reply.

Sorry, I forgot to post the log file.

###### my command :

mpirun –np 4 buoyantBoussinesqSimpleFoam -parallel >& log &


###### the log file

--------------------------------------------------------------------------
mpirun was unable to launch the specified application as it could not find an executable:

Executable: –np
Node: GEOSCIENCE-PC

while attempting to start process rank 0.
--------------------------------------------------------------------------

I tried to check whether there is any other MPI installation on the system using the command:

update-alternatives --config mpirun
There is only one alternative in link group mpirun (providing /usr/bin/mpirun): /usr/bin/mpirun.openmpi
Nothing to configure.
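For completeness, a few generic Open MPI checks that could help rule out the MPI installation itself (nothing OpenFOAM-specific):

which mpirun             # should resolve to /usr/bin/mpirun (the Open MPI wrapper)
mpirun --version         # reports the Open MPI version
mpirun -np 2 hostname    # minimal launch test: should print the hostname twice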

Any clues?

Old   January 25, 2016, 09:50
  #4
Senior Member
 
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
I bet you copied and pasted that command. Type it in properly and all will be good (i.e. I believe the hyphen is not what you think it is).
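In other words: the pasted command starts the option with an en dash ("–np"), so mpirun does not treat it as an option at all and instead tries to launch it as an executable, which is exactly the "could not find an executable: –np" message in your log. Retyped with a plain ASCII hyphen it becomes

mpirun -np 4 buoyantBoussinesqSimpleFoam -parallel >& log &

and the only difference is the character in front of "np".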
__________________
*On twitter @akidTwit
*Spend as much time formulating your questions as you expect people to spend on their answer.

