
Calling MPI routines in OpenFOAM

July 19, 2022, 07:22   #1
Calling MPI routines in OpenFOAM

New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
I have been trying to parallelize the decomposePar utility in OpenFOAM, so I need to call MPI routines from my new utility. I learned that OpenFOAM provides the Pstream wrapper around MPI. However, my program throws an error if I do not run decomposePar before launching it. For instance, below is a basic hello-world program:


Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    #include "setRootCase.H"
    #include "createTime.H"

    // cannot run in parallel without creating the processor0 directory
    Pout << "Hello from process " << Pstream::myProcNo() << endl;

    return 0;
}
Here are the commands that I execute, in order (assume I have already compiled my source file):

Code:
blockMesh
mpirun -np 4 <program-name> -parallel
And here is the error message:

Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
[0]
[0]
[0] --> FOAM FATAL ERROR: (openfoam-2106)
[0] parallelDecompose: cannot open case directory "/home/dogukan/Documents/hydrology/tutorials/permaFoam/demoCase/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
When I run the following commands instead, everything works as expected:
Code:
blockMesh
decomposePar
mpirun -np 4 <program-name> -parallel
I want to run my program right after blockMesh so that I can parallelize the decomposePar utility itself. Is it possible to use Pstream methods without calling decomposePar first?


OpenFOAM version: v2106


Thanks in advance


July 19, 2022, 07:28   #2

Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,715
Disable the processor directory checks:

https://www.openfoam.com/documentati...d8510fdfc1ab7e

Code:
argList::noCheckProcessorDirectories();
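For reference, a minimal sketch of the hello-world above with that call added (untested; note that it has to come before setRootCase.H so the flag is set before the arguments are parsed):

Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    // Skip the check for processor* directories. Assumption: this must be
    // set before setRootCase.H, which constructs the argList and would
    // otherwise run the check.
    argList::noCheckProcessorDirectories();

    #include "setRootCase.H"
    #include "createTime.H"

    Pout << "Hello from process " << Pstream::myProcNo() << endl;

    return 0;
}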

July 19, 2022, 09:23   #3

New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Thank you so much, olesen. That solved my problem. I do have another question, though. When I run the program below, it hangs waiting for process 0 and never terminates:

Source Code:

Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    argList::noCheckProcessorDirectories();
    argList::addNote
    (
        "Decompose a mesh and fields of a case in parallel execution"
    );
    argList::addOption
    (
        "decomposeParDict",
        "file",
        "Use specified file for decomposePar dictionary"
    );

    #include "setRootCase.H"

    // myProcNo() returns a rank index, so label (not scalar) is the right type
    label p_no = UPstream::myProcNo();
    if (p_no == 0)
    {
        #include "createTime.H"  // only the master rank constructs Time
    }

    Pout << "Hello from process " << p_no << endl;

    return 0;
}
Output:

Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
[1] Hello from process 1
[2] Hello from process 2
[3] Hello from process 3
Create time

--> FOAM Warning :
    From static Foam::IOstreamOption::compressionType Foam::IOstreamOption::compressionEnum(const Foam::word&, Foam::IOstreamOption::compressionType)
    in file db/IOstreams/IOstreams/IOstreamOption.C at line 115
    Unknown compression specifier 'uncompressed', using compression off
^C^C
Commands:

Code:
blockMesh
mpirun -np 4 <program-name> -parallel
You may ask, "Why did you include createTime.H only in process 0?" The reason is that when I remove the if statement, the program throws a stack trace that I do not understand:


Source code:


Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    argList::noCheckProcessorDirectories();
    argList::addNote
    (
        "Decompose a mesh and fields of a case in parallel execution"
    );
    argList::addOption
    (
        "decomposeParDict",
        "file",
        "Use specified file for decomposePar dictionary"
    );

    #include "setRootCase.H"

    #include "createTime.H"  // now created on every rank

    Pout << "Hello from process " << UPstream::myProcNo() << endl;

    return 0;
}
Output:
Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

--> FOAM Warning : 
    From static Foam::IOstreamOption::compressionType Foam::IOstreamOption::compressionEnum(const Foam::word&, Foam::IOstreamOption::compressionType)
    in file db/IOstreams/IOstreams/IOstreamOption.C at line 115
    Unknown compression specifier 'uncompressed', using compression off
[dogukan:06483] *** An error occurred in MPI_Recv
[dogukan:06483] *** reported by process [1006436353,2]
[dogukan:06483] *** on communicator MPI_COMM_WORLD
[dogukan:06483] *** MPI_ERR_TRUNCATE: message truncated
[dogukan:06483] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[dogukan:06483] ***    and potentially your MPI job)
[3] #0  Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/printStack/printStack.C:237
[3] #1  Foam::sigSegv::sigHandler(int) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/signals/sigSegv.C:51
[3] #2  ? in /lib/x86_64-linux-gnu/libpthread.so.0
[3] #3  ? in ~/OpenFOAM/dogukan-v2106/platforms/linux64GccDPInt32Debug/lib/libhydrology.so
[0] #0  Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/printStack/printStack.C:237
[0] #1  Foam::sigSegv::sigHandler(int) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/signals/sigSegv.C:51
[0] #2  ? in /lib/x86_64-linux-gnu/libpthread.so.0
[0] #3  ? in ~/OpenFOAM/dogukan-v2106/platforms/linux64GccDPInt32Debug/lib/libhydrology.so
[1] #0  Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/printStack/printStack.C:237
[1] #1  Foam::sigSegv::sigHandler(int) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/signals/sigSegv.C:51
[1] #2  ? in /lib/x86_64-linux-gnu/libpthread.so.0
[1] #3  Foam::refCount::refCount() in ~/OpenFOAM/dogukan-v2106/platforms/linux64GccDPInt32Debug/bin/parallelDecompose
[1] #4  Foam::function1Base::function1Base(Foam::word const&, Foam::dictionary const&) at /opt/OpenFOAM-v2106/src/OpenFOAM/primitives/functions/Function1/Function1/function1Base.C:47
[1] #5  Foam::Function1<Foam::Tensor<double> >::Function1(Foam::word const&, Foam::dictionary const&)[dogukan:06477] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[dogukan:06477] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[warn] Epoll MOD(1) on fd 25 failed. Old events were 6; read change was 0 (none); write change was 2 (del); close change was 0 (none): Bad file descriptor
Why do you think that happens?


Thank you.

July 19, 2022, 09:47   #4

Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,715
Start with a simpler case (i.e., with really basic BCs) - it looks like you have something complex going on with the Function1 creation.

If you try to avoid the problem by calling a parallel-aware routine, like creating Time, on only one rank, you should not be overly surprised that it blocks indefinitely. It is no better or worse than calling MPI_Reduce from rank 0 only.
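To make that concrete, here is a minimal illustrative sketch of the same mistake using OpenFOAM's reduce() wrapper (a hypothetical snippet, not from any actual utility):

Code:
// reduce() is collective: every rank must call it, or the callers block
label n = 1;
if (Pstream::master())
{
    reduce(n, sumOp<label>());  // rank 0 waits here forever for the others
}

// Correct form: call the collective unconditionally on all ranks
// reduce(n, sumOp<label>());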


EDIT: don't be surprised or annoyed that I will very likely stop following up on your questions. As I said before, trying to parallelise decomposePar without a deeper understanding of the OpenFOAM internals is not a task that can be recommended. If, however, you are to continue pursuing this course, you should find someone locally to help you.

July 19, 2022, 09:54   #5

New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
At this point, I am just trying to replicate decomposePar and to replace its for loops with parallel processes for better performance. So I do not know exactly what is going on behind the scenes. May I ask what BCs stands for? Also, could you point me to some examples of simple cases?

Thank you for your fast responses. I appreciate it.

July 19, 2022, 09:55   #6

New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Quote:
Originally Posted by olesen
EDIT: don't be surprised or annoyed that I will very likely stop following up on your questions. As I said before, trying to parallelise decomposePar without a deeper understanding of the OpenFOAM internals is not a task that can be recommended. If, however, you are to continue pursuing this course, you should find someone locally to help you.
Okay sir. Thank you for your time.

January 2, 2023, 10:46   #7

New Member
Volkan Atar
Join Date: Oct 2022
Posts: 23
Quote:
Originally Posted by dogukanteber
[...] When I run the program below, it hangs waiting for process 0 and never terminates: [...] Why do you think that happens?

Could you find any solution, Doğukan? I'm having the same problem.

Tags
decomposepar, mpi, openfoam