
Calling MPI routines in OpenFOAM

July 19, 2022, 07:15   #1
Doğukan Teber
New Member
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
I have been trying to parallelize the decomposePar utility in OpenFOAM (for more information, see https://www.cfd-online.com/Forums/openfoam-programming-development/243964-parallelizing-decomposepar-utility.html). To do that, I need to call MPI routines in my new utility, and I learned that OpenFOAM provides the Pstream wrapper around MPI. However, my program throws an error when I do not call decomposePar before running it. For instance, here is a basic hello-world program:


Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    #include "setRootCase.H"   // parses -parallel and initialises Pstream
    #include "createTime.H"

    // cannot run in parallel without first creating the processor0 directory
    Pout << "Hello from process " << Pstream::myProcNo() << endl;

    return 0;
}
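For context, the kinds of Pstream calls I ultimately want to use in the utility look roughly like this. This is only a sketch with placeholder values, not code from my actual utility:

Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    #include "setRootCase.H"

    // Global sum over all ranks via OpenFOAM's reduce wrapper;
    // every rank ends up with the combined value
    scalar localValue = scalar(Pstream::myProcNo());
    reduce(localValue, sumOp<scalar>());
    Pout << "Sum of rank numbers: " << localValue << endl;

    // Broadcast a value from the master rank to the others
    label nItems = 0;
    if (Pstream::master())
    {
        nItems = 42;   // placeholder value
    }
    Pstream::scatter(nItems);

    return 0;
}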
Here are the commands that I execute in order (assume I have already compiled my source file; a sketch of the wmake setup follows the commands):


Code:
blockMesh
mpirun -np 4 <program-name> -parallel
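For completeness, I build the utility as a standard wmake application; the setup looks roughly like this. The source-file name parallelDecompose.C is chosen here only to match the executable name in the error output below, and the /* */ labels just mark which file each part belongs to:

Code:
/* Make/files */
parallelDecompose.C

EXE = $(FOAM_USER_APPBIN)/parallelDecompose

/* Make/options */
EXE_INC = \
    -I$(LIB_SRC)/finiteVolume/lnInclude \
    -I$(LIB_SRC)/meshTools/lnInclude

EXE_LIBS = \
    -lfiniteVolume \
    -lmeshTools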
And here is the error message:

Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
[0]
[0]
[0] --> FOAM FATAL ERROR: (openfoam-2106)
[0] parallelDecompose: cannot open case directory "/home/dogukan/Documents/hydrology/tutorials/permaFoam/demoCase/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on exactly
when Open MPI kills them.
When I run the commands below instead, everything works as expected:

Code:
blockMesh
decomposePar
mpirun -np 4 <program-name> -parallel
I want to run my program directly after blockMesh, so that the program itself can do the decomposition in parallel. Is it possible to use Pstream methods without calling the decomposePar utility first? A sketch of what I have in mind is shown below.
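One idea (untested): redistributePar seems to disable this directory check through argList. The following is only a sketch; it assumes argList::noCheckProcessorDirectories() is available in v2106 and that it is actually sufficient to get past the error above:

Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    // Assumption: tell argList not to insist on existing processor*
    // directories (the hook redistributePar appears to use); this must
    // be called before setRootCase.H constructs the argument list
    argList::noCheckProcessorDirectories();

    #include "setRootCase.H"
    #include "createTime.H"

    Pout << "Hello from process " << Pstream::myProcNo() << endl;

    return 0;
}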




OpenFOAM version: v2106


Thanks in advance


Tags
decomposepar, mpi, openfoam


