
mpirun suddenly doesn't work

December 26, 2020, 07:38   #1
mpirun suddenly doesn't work
Kahnbein.Kai (Member, Join Date: Feb 2018)
Hello,
my mpirun on Ubuntu 20.04 doesn't work anymore.
I used it a few weeks ago and it worked fine. I just tried to use it a few minutes ago and it no longer works.


I decompose the case first and then run the
Code:
mpirun -np 4 interFoam -parallel
command.
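For context, the full sequence run from inside the case directory is roughly the following (it is assumed here that system/decomposeParDict already requests four subdomains):
Code:
decomposePar                        # split the case into processor0..processor3
mpirun -np 4 interFoam -parallel    # run the solver on the four subdomains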
Then this message appears:
Code:
kai@Kai-Desktop:~/OpenFOAM/kai-7/run/tutorials_of/multiphase/interFoam/laminar/damBreak_stl_II/damBreak$ mpirun -np 4 interFoam -parallel
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "(null)" (-43) instead of "Success" (0)
--------------------------------------------------------------------------
[... the same MPI_INIT failure block is printed three more times, once for each remaining rank ...]
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[Kai-Desktop:3304] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[... the same MPI_Init_thread abort message is repeated for PIDs 3305, 3306 and 3307 ...]
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[48186,1],0]
  Exit code:    1
--------------------------------------------------------------------------
 kai@Kai-Desktop:~/OpenFOAM/kai-7/run/tutorials_of/multiphase/interFoam/laminar/damBreak_stl_II/damBreak$

Does anyone know what to do to get it working again?
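For reference, the MPI setup involved can be checked with commands like these (FOAM_MPI is the MPI flavour variable set by the OpenFOAM environment scripts; the values naturally depend on the local install):
Code:
which mpirun          # which mpirun binary is found on the PATH
mpirun --version      # Open MPI version of that binary
echo $FOAM_MPI        # MPI flavour OpenFOAM was configured for, typically openmpi-system for the Ubuntu packages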


Best regards
Kai

December 28, 2020, 02:33   #2
Update
Kahnbein.Kai (Member, Join Date: Feb 2018)
Good morning,
I'm trying to fix this error myself and have a small update.
The Open MPI version I use is:
Code:
kai@Kai-Desktop:~/Dokumente$ mpirun --version
mpirun (Open MPI) 4.0.3
If I create a *.c file with the following content:
Code:
#include <mpi.h>
#include <stdio.h>

int main(int argc, char** argv) {
    // Initialize the MPI environment
    MPI_Init(NULL, NULL);

    // Get the number of processes
    int world_size;
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    // Get the rank of the process
    int world_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    // Get the name of the processor
    char processor_name[MPI_MAX_PROCESSOR_NAME];
    int name_len;
    MPI_Get_processor_name(processor_name, &name_len);

    // Print off a hello world message
    printf("Hello world from processor %s, rank %d out of %d processors\n",
           processor_name, world_rank, world_size);

    // Finalize the MPI environment.
    MPI_Finalize();

    return 0;
}
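(For reference, the program was presumably built with the MPI compiler wrapper, e.g.:)
Code:
mpicc hello_world.c -o hello_world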

After compiling and running it:
Code:
kai@Kai-Desktop:~/Dokumente$ mpirun -np 4 ./hello_world -parallel
Hello world from processor Kai-Desktop, rank 0 out of 4 processors
Hello world from processor Kai-Desktop, rank 1 out of 4 processors
Hello world from processor Kai-Desktop, rank 2 out of 4 processors
Hello world from processor Kai-Desktop, rank 3 out of 4 processors

So MPI does work on my computer, doesn't it?

Why does OpenFOAM not work with it, then?
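One way to narrow this down is to compare which libmpi the working test program and interFoam are actually linked against; if they resolve to different Open MPI libraries (for example after the system ones changed during an upgrade), interFoam can fail in MPI_Init while a freshly compiled test program runs fine. A quick check, assuming hello_world sits in the current directory:
Code:
ldd $(which interFoam) | grep -i mpi    # MPI libraries the OpenFOAM solver uses
ldd ./hello_world      | grep -i mpi    # MPI libraries the test program uses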


Best regards
Kai

December 29, 2020, 07:26   #3
Kahnbein.Kai (Member, Join Date: Feb 2018)
Update:


I installed OpenFOAM 8 and mpirun works again....


I don't know why this error still occurs in v7.
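For anyone taking the same route on Ubuntu 20.04, installing OpenFOAM 8 from the openfoam.org repository went roughly like this at the time; the authoritative commands are on https://openfoam.org/download/ and may have changed since:
Code:
sudo sh -c "wget -O - https://dl.openfoam.org/gpg.key | apt-key add -"
sudo add-apt-repository http://dl.openfoam.org/ubuntu
sudo apt-get update
sudo apt-get -y install openfoam8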

March 18, 2021, 15:25   #4
evrenykn (Evren Yilmaz Yakin; New Member; Join Date: Feb 2016; Location: Ankara, Turkey)
Hi,

I have the same error now.

It started after the upgrade from 18.04 to 20.04.

SOLVED

I uninstalled MPI and reinstalled it, and that solved it.
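On Ubuntu this typically means purging and reinstalling the system Open MPI packages, roughly as follows (the exact package list depends on what was installed; openmpi-common may or may not be present):
Code:
sudo apt-get remove --purge openmpi-bin libopenmpi-dev openmpi-common
sudo apt-get install openmpi-bin libopenmpi-dev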
__________________
Best Regards,

Evren

Last edited by evrenykn; March 19, 2021 at 02:03. Reason: Solved

March 24, 2021, 14:07   #5
Kahnbein.Kai (Member, Join Date: Feb 2018)
Hey evrenykn,
which packages did you uninstall?

Best regards
Kai

November 23, 2021, 23:10   #6
sjlouie91 (Jin Zhang; New Member; Join Date: May 2018; Location: Germany)
Quote:
Originally Posted by Kahnbein.Kai
Hey evrenykn,
which packages did you uninstall?

Best regards
Kai
Hi, have you solved this problem in the meantime? I have the same problem you did.

Best regards,
Jin

November 24, 2021, 11:58   #7
Kahnbein.Kai (Member, Join Date: Feb 2018)
Hey Jin,

yes, you can update your OpenFOAM version to 8 or 9, or update your Linux operating system and mpirun.



After an update it works flawlessly again.
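After updating, a quick way to confirm that the combination works again is to check the MPI version and rerun a case in parallel from its case directory, for example:
Code:
mpirun --version
decomposePar
mpirun -np 4 interFoam -parallel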


Best regards
Kai
