
Open MPI error

August 3, 2020, 09:33   #1
Grivalszki Péter (GrivalszkiP)
Member, Budapest, Hungary
Hi!

I get this error message whenever I try to run anything in parallel:

Code:
vituki@VHDT08:/mnt/d/griva_modellek/wavebreak$ mpirun -np 8 snappyHexMesh -parallel
--------------------------------------------------------------------------
There are not enough slots available in the system to satisfy the 8
slots that were requested by the application:

  snappyHexMesh

Either request fewer slots for your application, or make more slots
available for use.

A "slot" is the Open MPI term for an allocatable unit where we can
launch a process.  The number of slots available are defined by the
environment in which Open MPI processes are run:

  1. Hostfile, via "slots=N" clauses (N defaults to number of
     processor cores if not provided)
  2. The --host command line parameter, via a ":N" suffix on the
     hostname (N defaults to 1 if not provided)
  3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
  4. If none of a hostfile, the --host command line parameter, or an
     RM is present, Open MPI defaults to the number of processor cores

In all the above cases, if you want Open MPI to default to the number
of hardware threads instead of the number of processor cores, use the
--use-hwthread-cpus option.

Alternatively, you can use the --oversubscribe option to ignore the
number of available slots when deciding the number of processes to
launch.
--------------------------------------------------------------------------
I use OpenFOAM v2006 and Open MPI v4.0.3. My computer has 8 threads (4 cores), and decomposePar ran without problems. There weren't any problems like this before. Thank you in advance!
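
For reference, the hostfile route from point 1 in the message above would look something like this (the file name myhostfile is hypothetical):

Code:
# myhostfile: grant 8 slots on the local machine
localhost slots=8

It would then be passed as mpirun --hostfile myhostfile -np 8 snappyHexMesh -parallel.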

August 17, 2020, 03:30   #2
hokhay
Member
Have you checked whether multi-threading is enabled?
Check how many CPUs you have with the command 'htop'.
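
If htop is not available, nproc and lscpu (both standard on Linux) give the same information non-interactively; a minimal sketch:

Code:
# logical CPUs (hardware threads) visible to the OS
nproc
# core/thread topology; "Thread(s) per core: 2" means multi-threading is on
lscpu | grep -E '^(CPU\(s\)|Thread\(s\) per core|Core\(s\) per socket)'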

August 18, 2020, 12:01   #3
Grivalszki Péter (GrivalszkiP)
Member, Budapest, Hungary
Thank you, I have fixed it.

Newer versions of Open MPI reject the request when users ask for more processes than there are physical cores, e.g. 8 when only 4 are available. You have two options (see the sketch after this list):
- switch to 4 processes and gain more performance per process
- add the option --use-hwthread-cpus and run your simulation with 8 processes at lower per-process performance
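
Applied to the snappyHexMesh run from the first post, the two options would look roughly like this (option 1 assumes numberOfSubdomains is also set to 4 in system/decomposeParDict before re-running decomposePar):

Code:
# option 1: one process per physical core
mpirun -np 4 snappyHexMesh -parallel

# option 2: count hardware threads as slots and keep 8 processes
mpirun --use-hwthread-cpus -np 8 snappyHexMesh -parallel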

April 27, 2021, 09:41   #4
Emre (Turin Turambar)
New Member
Hello,

I had the same error before. My computer has 8 cores and I was not able to run the simulation in parallel on more than 4 cores. I used the "-oversubscribe" flag, which allows more processes on a node than there are processing elements (from the mpirun man page), and it solved my problem. Here is an example:

Code:
mpirun -oversubscribe -np 8 interFoam -parallel | tee log.interFoam

I hope it is also a convenient way of solving this issue.

Best regards

October 24, 2022, 09:35   #5
multithreading with runParallel
Klaus Rädecke (shamantic)
New Member, Rüsselsheim, Germany
In OpenFOAM-v2206/bin/tools/RunFunctions, I added --use-hwthread-cpus to the lines starting with $mpirun, like this:

Code:
$mpirun --use-hwthread-cpus -n $nProcs $appRun $appArgs "$@" </dev/null >> $logFile 2>&1
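
With that change, any tutorial-style Allrun script picks the flag up automatically, since runParallel builds the mpirun command from these lines. A minimal sketch (case layout and solver are assumptions):

Code:
#!/bin/sh
# minimal Allrun sketch; runParallel reads numberOfSubdomains
# from system/decomposeParDict and invokes the patched $mpirun line
. ${WM_PROJECT_DIR:?}/bin/tools/RunFunctions

runApplication decomposePar
runParallel snappyHexMesh -overwrite
runParallel $(getApplication)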

December 9, 2022, 08:07   #6
Thiago Parente Lima (thiagopl)
Member, Diamantina, Brazil
Quote:
Originally Posted by Turin Turambar
Hello,
I had the same error before. My computer has 8 cores and I was not able to run the simulation in parallel on more than 4 cores. I used the "-oversubscribe" flag, which allows more processes on a node than there are processing elements (from the mpirun man page), and it solved my problem. Here is an example:
Code:
mpirun -oversubscribe -np 8 interFoam -parallel | tee log.interFoam
I hope it is also a convenient way of solving this issue.
Best regards
I had the same problem and your solution worked for me as well. Thank you.
__________________
Fields of interest: buoyantFoam, chtMultiRegionFoam.

January 13, 2023, 16:38   #7
Klaus Rädecke (shamantic)
New Member, Rüsselsheim, Germany
I guess that --oversubscribe does not throw an error if you request more processes than your CPU provides threads, whereas --use-hwthread-cpus will still limit you to the threads the CPU provides. I prefer the latter, because I want to avoid overprovisioning beyond the hardware capabilities, as this reduces efficiency.
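
On the 4-core/8-thread machine from the first post, the difference would show up like this (flags as documented in the Open MPI 4.x mpirun man page):

Code:
# --use-hwthread-cpus: slots = hardware threads, so -np 8 launches,
# but anything above 8 is still rejected
mpirun --use-hwthread-cpus -np 8 interFoam -parallel

# --oversubscribe: the slot check is skipped, so even -np 16 launches
# (likely with reduced efficiency)
mpirun --oversubscribe -np 16 interFoam -parallel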


Tags
error, mpirun, openmpi, parallel, slots

