|
August 26, 2022, 06:48 |
MPI issues on Debian 11 (bullseye)
|
#1 |
New Member
Flavio Giannetti
Join Date: Mar 2021
Location: Italy
Posts: 14
Rep Power: 5 |
Hi guys
I have compiled SU2 on a new machine running Debian GNU/Linux 11 (bullseye). I installed Open MPI 4.1.0 with apt-get and compiled everything. Whenever I try to run the program I get an error concerning MPI_Win_create. I tried both SU2 versions 7.3 and 7.4, and I tried both Open MPI and MPICH, getting the same problem. The versions of the libraries installed on the machine are libmpi.so.40.30.0 and libmpich.so.12.1.10.

This is weird! I have another machine running Linux Mint 19 (Tessa) on which I compiled SU2 version 7.3 without problems. The only difference I can see is the MPI version, which on the old machine is Open MPI 3. Has anyone seen similar behaviour? Any hints on how to solve the problem? Thanks in advance for any help you can give me.

Flavio

Here is the message I get:
Code:
flavio@cfd1 ~/prova $ mpirun -n 2 SU2_CFD inv_ONERAM6.cfg
[cfd1:151413] *** An error occurred in MPI_Win_create
[cfd1:151413] *** reported by process [1424949249,0]
[cfd1:151413] *** on communicator MPI_COMM_WORLD
[cfd1:151413] *** MPI_ERR_WIN: invalid window
[cfd1:151413] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[cfd1:151413] ***    and potentially your MPI job)
[cfd1:151409] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[cfd1:151409] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

PS: I also tried the pre-compiled version of SU2, which uses MPICH. The program starts but then it always crashes!
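For anyone who wants to reproduce this outside SU2: a minimal program that exercises the same MPI_Win_create call should show whether one-sided communication is broken in the MPI installation itself. This is only a sketch for testing, not code taken from SU2:
Code:
cat > win_test.c << 'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, buf;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    buf = rank;

    /* one-sided window creation, the call that aborts in the SU2 run above */
    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    MPI_Win_free(&win);

    if (rank == 0)
        printf("MPI_Win_create OK\n");

    MPI_Finalize();
    return 0;
}
EOF
mpicc win_test.c -o win_test
mpirun -n 2 ./win_test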
|
August 27, 2022, 14:08 |
|
#2 |
Senior Member
Pedro Gomes
Join Date: Dec 2017
Posts: 466
Rep Power: 14 |
Hello,
I've had some problems on Ubuntu 22 with OpenMPI 4, related to HWLOC and something about 32-bit PCI devices *shrug*. Maybe it's the same for you but with the warnings silenced, see here: https://github.com/open-mpi/hwloc/issues/354
With mpich 4 I get the warnings but the code runs fine.

How did you build SU2 with mpich? Be careful if you have Open MPI installed alongside mpich. This is my build command for mpich:
Code:
export CC=mpicc.mpich
export CXX=mpicxx.mpich
export CXXFLAGS="-march=native -funroll-loops -ffast-math -fno-finite-math-only"
./meson.py build --optimization=2 --warnlevel=3 --prefix=$PWD/build -Dcustom-mpi=true

If you find out the issue with Open MPI, please update this thread.
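If you have both MPI stacks installed, it can also be worth confirming what the mpich wrappers and the installed binary actually point to. A quick check along these lines (the binary path assumes the --prefix=$PWD/build used above):
Code:
# show the real compiler and flags behind the mpich wrapper
mpicc.mpich -show
# after installing, confirm SU2_CFD links against libmpich and not libmpi (Open MPI)
ldd $PWD/build/bin/SU2_CFD | grep -i mpi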
|
August 28, 2022, 07:26 |
mpich ok !
|
#3 |
New Member
Flavio Giannetti
Join Date: Mar 2021
Location: Italy
Posts: 14
Rep Power: 5 |
Hi Pedro, thanks a lot for your support.
I tried again, implementing your hints. I had no success with Open MPI; I always get the same output without additional hints. However, I used your example to recompile SU2 with mpich and it's now working!!! I have one last question concerning the -Dwith-omp option: can I recompile the code with mpich and -Dwith-omp=true, or is that option just for Open MPI? Thanks a lot for your help. Flavio
|
|
August 29, 2022, 20:10 |
|
#4 |
Senior Member
Pedro Gomes
Join Date: Dec 2017
Posts: 466
Rep Power: 14 |
Hi Flavio,
Glad it works. Yes, you can use mpich together with OpenMP; the -Dwith-omp=true option is not tied to Open MPI.
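A build and hybrid run would look roughly like this. Treat it as a sketch based on the options above; the -t thread flag and the binary path are assumptions to check against the SU2 docs for your version:
Code:
export CC=mpicc.mpich
export CXX=mpicxx.mpich
./meson.py build -Dcustom-mpi=true -Dwith-omp=true --prefix=$PWD/build
./ninja -C build install
# hybrid run: 2 MPI ranks, each with 4 OpenMP threads
mpirun -n 2 $PWD/build/bin/SU2_CFD -t 4 inv_ONERAM6.cfg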
|
November 29, 2022, 21:04 |
|
#5 |
New Member
Brandon Gleeson
Join Date: Apr 2018
Posts: 26
Rep Power: 8 |
Just tagging onto this thread; I observe the same fatal error when running the Quickstart in serial mode, but it runs just fine in parallel.
|
|
July 12, 2023, 05:54 |
|
#6 |
New Member
Seungtae Kim
Join Date: Jun 2023
Posts: 7
Rep Power: 3 |
Quote:
Code:
ERROR : You are trying to launch a computation without initializing MPI but the wrapper has been built in parallel. Please add the --parallel option in order to initialize MPI for the wrapper.

It seems awkward, since the last time I used SU2 (somewhere between 2018 and 2020) I could compile it with parallel support and then run it in serial by simply typing SU2_CFD, without any verbose mpirun command.
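Going by the error message, the Python wrapper now expects MPI to be initialized explicitly. Something along these lines should work; the --parallel flag is quoted from the error, while the -f config flag is an assumption based on the other SU2 Python scripts:
Code:
mpirun -n 4 SU2_CFD.py --parallel -f inv_ONERAM6.cfg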
|
December 25, 2023, 11:03 |
|
#7 |
Member
Hüseyin Can Önel
Join Date: Sep 2018
Location: Ankara, Turkey
Posts: 47
Rep Power: 8 |
Quote:
|
|
March 2, 2024, 03:18 |
|
#8 |
New Member
Vidhan Kashyap
Join Date: Feb 2024
Posts: 1
Rep Power: 0 |
I encountered the same fatal error following the quick compilation guide.
Code:
./meson.py build -Dcustom-mpi=true -Dextra-deps=mpich
sudo ./ninja -C build install

After installing pkg-config and resolving the libfabric linking issue, the build succeeded and the issue was resolved.
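For reference, the missing pieces were presumably these packages on Debian/Ubuntu (package names assumed, adjust for your distribution):
Code:
sudo apt-get install pkg-config libfabric-dev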
|
April 3, 2024, 11:02 |
|
#9 |
Member
na
Join Date: Jul 2018
Posts: 90
Rep Power: 8 |
@DrRedskull, concerning:
Quote:
I always just use `--prefix=$(pwd)`, which expands to your SU2 code repository. That way your code directory will contain a `bin` folder with the binaries and you should not have to deal with sudo.

For what it is worth, I am dealing with the same issue as the OP. Forcing `--mca osc ucx` fixes the problem but is not really satisfying. I am on Open MPI 4.1.2 on WSL (Ubuntu). |
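For completeness, the workaround amounts to forcing Open MPI's UCX one-sided component at launch, e.g. with the run command from the first post:
Code:
mpirun --mca osc ucx -n 2 SU2_CFD inv_ONERAM6.cfg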
|
Tags |
su2 and openmpi gcc |
|
|
Similar Threads | ||||
Thread | Thread Starter | Forum | Replies | Last Post |
nspan Turbomachinery MPI issues | AmarotoS7 | SU2 | 3 | May 29, 2022 05:25 |
MPI issues with my own solver | davide_c | OpenFOAM Running, Solving & CFD | 1 | March 23, 2012 09:57 |
Sgimpi | pere | OpenFOAM | 27 | September 24, 2011 08:57 |
Error using LaunderGibsonRSTM on SGI ALTIX 4700 | jaswi | OpenFOAM | 2 | April 29, 2008 11:54 |
Is Testsuite on the way or not | lakeat | OpenFOAM Installation | 6 | April 28, 2008 12:12 |