Problem with Parallel simulation in OpenFoam v9 |
April 14, 2024, 05:09
#1
New Member
Pawan Yadav
Join Date: Aug 2023
Posts: 2
Rep Power: 0
Dear all,

I am trying to run a parallel simulation in OpenFOAM v9, but I get the error attached below. The same case runs in parallel without problems on another system, and even the damBreak tutorial fails in parallel with the same error on this machine. I am using an Ubuntu 22.04 server. Could you please help me resolve this problem? Thank you.

*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[che-epic-ws02:565996] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "(null)" (-43) instead of "Success" (0)
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
[the same MPI_Init_thread abort and MPI_INIT failure message are repeated for the second process, che-epic-ws02:565997]
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [[36180,1],0]
  Exit code: 1
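For completeness, this is roughly how I have been checking whether Open MPI itself works outside OpenFOAM (a sketch only; the package names assume Ubuntu 22.04's stock Open MPI packages):

```shell
# Sanity-check Open MPI independently of OpenFOAM.

# Which mpirun is on the PATH, and which version? Several MPI
# installations shadowing each other is a common cause of
# "Local abort before MPI_INIT".
command -v mpirun && mpirun --version || echo "mpirun not on PATH"

# Run a trivial non-OpenFOAM program in parallel. If this also
# aborts before MPI_INIT, the Open MPI installation itself is
# broken and the OpenFOAM case setup is not at fault.
mpirun -np 2 hostname || echo "plain mpirun run failed"

# If the plain run fails, reinstalling Open MPI is worth trying:
# sudo apt-get install --reinstall openmpi-bin libopenmpi-dev
```

If `mpirun -np 2 hostname` aborts the same way, the problem is in the MPI installation or environment rather than in the case files.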