July 19, 2022, 07:15 |
Calling MPI routines in OpenFOAM
#1
New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Rep Power: 4
I have been trying to parallelize the decomposePar utility in OpenFOAM (for more information, see https://www.cfd-online.com/Forums/openfoam-programming-development/243964-parallelizing-decomposepar-utility.html). Therefore, I need to call MPI routines in my new utility. I learned that OpenFOAM has a Pstream wrapper for MPI routines. However, my program throws an error when I do not call decomposePar before running it. For instance, below is a basic hello world program:
Code:
#include "fvCFD.H" int main(int argc, char* argv[]) { #include "setRootCase.H" #include "createTime.H" // cannot run in parallel without creating processor0 directory Pout << "Hello from process " << Pstream::myProcNo() << endl; return 0; } Code:
I compile the utility and run it with:
Code:
blockMesh
mpirun -np 4 <program-name> -parallel
and I get the following error:
Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
[0]
[0]
[0] --> FOAM FATAL ERROR: (openfoam-2106)
[0] parallelDecompose: cannot open case directory "/home/dogukan/Documents/hydrology/tutorials/permaFoam/demoCase/processor0"
[0]
[0] FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
The program runs fine only when I call decomposePar first:
Code:
blockMesh
decomposePar
mpirun -np 4 <program-name> -parallel

OpenFOAM version: v2106

Thanks in advance
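Edit: one direction I am experimenting with (untested, so treat it as a sketch) is telling argList not to insist on the processorN directories before setRootCase.H runs. I believe v2106 provides argList::noCheckProcessorDirectories() for this (redistributePar appears to use it), but I am not sure whether createTime.H will then cope without the decomposed case:
Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    // Sketch: skip the check that every rank's processorN directory exists.
    // Assumes argList::noCheckProcessorDirectories() is available in v2106.
    argList::noCheckProcessorDirectories();

    #include "setRootCase.H"

    // createTime.H may still expect the decomposed case to be present;
    // this part is exactly what I am unsure about.
    #include "createTime.H"

    Pout << "Hello from process " << Pstream::myProcNo() << endl;

    return 0;
}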
Tags |
decomposepar, mpi, openfoam |