July 19, 2022, 07:22 |
Calling MPI routines in OpenFOAM
|
#1 |
New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Rep Power: 4 |
I have been trying to parallelize the decomposePar utility in OpenFOAM, so I need to call MPI routines in my new utility. I learned that OpenFOAM provides the Pstream wrapper around the MPI routines. However, my program throws an error when I do not run decomposePar before running it. For instance, below is a basic hello-world program:
Code:
#include "fvCFD.H" int main(int argc, char* argv[]) { #include "setRootCase.H" #include "createTime.H" // cannot run in parallel without creating processor0 directory Pout << "Hello from process " << Pstream::myProcNo() << endl; return 0; } Code:
blockMesh mpirun -np 4 <program-name> -parallel Code:
Pstream initialized with: floatTransfer : 0 nProcsSimpleSum : 0 commsType : nonBlocking polling iterations : 0 trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE). fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20) allowSystemOperations : Allowing user-supplied system call operations // * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * // [0] [0] [0] --> FOAM FATAL ERROR: (openfoam-2106) [0] parallelDecompose: cannot open case directory "/home/dogukan/Documents/hydrology/tutorials/permaFoam/demoCase/processor0" [0] [0] FOAM parallel run exiting [0] -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1. NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. Code:
blockMesh decomposePar mpirun -np 4 <program-name> -parallel OpenFOAM version: v2106 Thanks in advance |
|
July 19, 2022, 07:28 |
|
#2 |
Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,715
Rep Power: 40 |
Disable the processor-directory checks:
https://www.openfoam.com/documentati...d8510fdfc1ab7e

Code:
argList::noCheckProcessorDirectories(); |
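A minimal sketch of where that call would go (based on the hello-world code from the first post; the placement matters, since the flag must be set before setRootCase.H parses the arguments and constructs the argList):

Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    // Tell argList not to require processorN directories in a parallel run.
    // Must be called before setRootCase.H constructs the argList.
    argList::noCheckProcessorDirectories();

    #include "setRootCase.H"
    #include "createTime.H"

    Pout << "Hello from process " << Pstream::myProcNo() << endl;

    return 0;
}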
|
July 19, 2022, 09:23 |
|
#3 |
New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Rep Power: 4 |
Thank you so much, olesen. That solved my problem. I do have another question, though. When I run the program below, it hangs waiting on process 0 and never terminates:
Source code:
Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    argList::noCheckProcessorDirectories();

    argList::addNote
    (
        "Decompose a mesh and fields of a case in parallel execution"
    );

    argList::addOption
    (
        "decomposeParDict",
        "file",
        "Use specified file for decomposePar dictionary"
    );

    #include "setRootCase.H"

    scalar p_no = UPstream::myProcNo();

    if (p_no == 0)
    {
        #include "createTime.H"
    }

    Pout << "Hello from process " << p_no << endl;

    return 0;
}
Output:
Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

[1] Hello from process 1
[2] Hello from process 2
[3] Hello from process 3
Create time

--> FOAM Warning :
    From static Foam::IOstreamOption::compressionType Foam::IOstreamOption::compressionEnum(const Foam::word&, Foam::IOstreamOption::compressionType)
    in file db/IOstreams/IOstreams/IOstreamOption.C at line 115
    Unknown compression specifier 'uncompressed', using compression off
^C^C
Commands:
Code:
blockMesh
mpirun -np 4 <program-name> -parallel
The version below, which includes createTime.H on every process, crashes instead.

Source code:
Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    argList::noCheckProcessorDirectories();

    argList::addNote
    (
        "Decompose a mesh and fields of a case in parallel execution"
    );

    argList::addOption
    (
        "decomposeParDict",
        "file",
        "Use specified file for decomposePar dictionary"
    );

    #include "setRootCase.H"
    #include "createTime.H"

    Pout << "Hello from process " << UPstream::myProcNo() << endl;

    return 0;
}
Code:
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

Create time

--> FOAM Warning :
    From static Foam::IOstreamOption::compressionType Foam::IOstreamOption::compressionEnum(const Foam::word&, Foam::IOstreamOption::compressionType)
    in file db/IOstreams/IOstreams/IOstreamOption.C at line 115
    Unknown compression specifier 'uncompressed', using compression off
[dogukan:06483] *** An error occurred in MPI_Recv
[dogukan:06483] *** reported by process [1006436353,2]
[dogukan:06483] *** on communicator MPI_COMM_WORLD
[dogukan:06483] *** MPI_ERR_TRUNCATE: message truncated
[dogukan:06483] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[dogukan:06483] ***    and potentially your MPI job)
[3] #0  Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/printStack/printStack.C:237
[3] #1  Foam::sigSegv::sigHandler(int) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/signals/sigSegv.C:51
[3] #2  ? in /lib/x86_64-linux-gnu/libpthread.so.0
[3] #3  ? in ~/OpenFOAM/dogukan-v2106/platforms/linux64GccDPInt32Debug/lib/libhydrology.so
[0] #0  Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/printStack/printStack.C:237
[0] #1  Foam::sigSegv::sigHandler(int) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/signals/sigSegv.C:51
[0] #2  ? in /lib/x86_64-linux-gnu/libpthread.so.0
[0] #3  ? in ~/OpenFOAM/dogukan-v2106/platforms/linux64GccDPInt32Debug/lib/libhydrology.so
[1] #0  Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/printStack/printStack.C:237
[1] #1  Foam::sigSegv::sigHandler(int) at /opt/OpenFOAM-v2106/src/OSspecific/POSIX/signals/sigSegv.C:51
[1] #2  ? in /lib/x86_64-linux-gnu/libpthread.so.0
[1] #3  Foam::refCount::refCount() in ~/OpenFOAM/dogukan-v2106/platforms/linux64GccDPInt32Debug/bin/parallelDecompose
[1] #4  Foam::function1Base::function1Base(Foam::word const&, Foam::dictionary const&) at /opt/OpenFOAM-v2106/src/OpenFOAM/primitives/functions/Function1/Function1/function1Base.C:47
[1] #5  Foam::Function1<Foam::Tensor<double> >::Function1(Foam::word const&, Foam::dictionary const&)
[dogukan:06477] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[dogukan:06477] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[warn] Epoll MOD(1) on fd 25 failed.  Old events were 6; read change was 0 (none); write change was 2 (del); close change was 0 (none): Bad file descriptor
Thank you. |
|
July 19, 2022, 09:47 |
|
#4 |
Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,715
Rep Power: 40 |
Start with a simpler case (i.e. with really basic BCs); it looks like you have something complex going on with the Function1 creation.

If you try to avoid the problem by calling a parallel-aware routine, such as creating Time, on only one rank, you should not be overly surprised that it blocks indefinitely. It is no better or worse than trying to call an MPI_Reduce from rank 0 only.

EDIT: don't be surprised or annoyed that I will very likely stop following up on your questions. As I said before, trying to parallelise decomposePar without a deeper understanding of the OpenFOAM internals is not a task that can be recommended. If, however, you continue to pursue this course, you should find someone locally to help you. |
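To illustrate the point with something smaller than Time construction, here is a hedged sketch (not code from this thread, assuming the usual Pstream reduce()/sumOp helpers that come in through fvCFD.H):

Code:
#include "fvCFD.H"

int main(int argc, char* argv[])
{
    argList::noCheckProcessorDirectories();
    #include "setRootCase.H"

    scalar myRank = UPstream::myProcNo();

    // Correct: every rank enters the collective reduction
    reduce(myRank, sumOp<scalar>());

    // Deadlocks: only the master would reach the collective call,
    // exactly like including createTime.H on rank 0 only:
    // if (Pstream::master()) { reduce(myRank, sumOp<scalar>()); }

    Pout << "Sum of ranks: " << myRank << endl;

    return 0;
}
Time construction behaves the same way: it reads files and synchronises across ranks, so it has to be executed on every rank or on none of them.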
|
July 19, 2022, 09:54 |
|
#5 |
New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Rep Power: 4 |
At this point, I am just trying to replicate decomposePar and to replace its for loops with parallel processes for better performance, so I do not know exactly what is going on behind the scenes. May I ask what BCs stands for? Also, could you give me some examples of really simple cases?
Thank you for your fast responses. I appreciate it. |
|
July 19, 2022, 09:55 |
|
#6 | |
New Member
Doğukan Teber
Join Date: Jul 2022
Location: Izmir, Türkiye
Posts: 13
Rep Power: 4 |
Quote:
|
January 2, 2023, 10:46 |
|
#7 | |
New Member
Volkan Atar
Join Date: Oct 2022
Posts: 23
Rep Power: 4 |
Quote:
Could you find any solution, Doğukan? I'm having the same problem.
Tags |
decomposepar, mpi, openfoam |
|