[OpenFOAM.org] Compile OpenFOAM with OpenMPI+FCA v4.0.2
September 11, 2019, 20:59
Compile OpenFOAM with OpenMPI+FCA v4.0.2
#1
New Member
Guillermo Tessi
Join Date: Sep 2019
Location: Argentina
Posts: 6
Rep Power: 7
Hello FOAMers,
Hope you are well. I'll describe an issue I came across while trying to compile OpenFOAM-dev (downloaded on August 28th) with the Mellanox HPC-X toolkit v2.4.0 and its OpenMPI+FCA library, on Ubuntu 18.04 x64. These are the steps I took to link OpenFOAM against this modded version of OpenMPI.

- Firstly, I edited the WM_MPLIB variable in the etc/bashrc file, setting it to "HPCXMPI".
- Secondly, I added the following block in etc/config.sh/mpi:
Code:
HPCXMPI)
    export FOAM_MPI=openmpi-4.0.2a1
    export MPI_ARCH_PATH=$HPCX_MPI_DIR
    _foamAddPath $MPI_ARCH_PATH/bin
    _foamAddLib  $MPI_ARCH_PATH/lib
    ;;
- Thirdly, I added the following at the end of my $HOME/.bashrc:
Code:
# HPC-X
export HPCX_HOME="/share/hpcx-2.4.0"
source $HPCX_HOME/hpcx-init.sh
hpcx_load

# OpenFOAM
source /share/OpenFOAM/OpenFOAM-dev/etc/bashrc
- Fourthly, I followed the install-from-source tutorial (https://openfoam.org/download/source), downloaded everything, compiled Scotch and ParaView without issues, and then ran the main OpenFOAM compile script.

Now, here are the issues I came across. Everything goes well until it runs the applications' Allwmake script. Here is the console output:
Code:
Allwmake applications
wmake solvers
wmake basic
make[1]: going inside folder '/share/OpenFOAM/OpenFOAM-dev/applications/solvers/basic'
wmake laplacianFoam
make[2]: going inside folder '/share/OpenFOAM/OpenFOAM-dev/applications/solvers/basic/laplacianFoam'
Making dependency list for source file laplacianFoam.C
g++ -std=c++11 -m64 -Dlinux64 -DWM_ARCH_OPTION=64 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -O3 -DNoRepository -ftemplate-depth-100 -I/share/OpenFOAM/OpenFOAM-dev/src/finiteVolume/lnInclude -I/share/OpenFOAM/OpenFOAM-dev/src/meshTools/lnInclude -IlnInclude -I. -I/share/OpenFOAM/OpenFOAM-dev/src/OpenFOAM/lnInclude -I/share/OpenFOAM/OpenFOAM-dev/src/OSspecific/POSIX/lnInclude -fPIC -c laplacianFoam.C -o /share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/applications/solvers/basic/laplacianFoam/laplacianFoam.o
g++ -std=c++11 -m64 -Dlinux64 -DWM_ARCH_OPTION=64 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -O3 -DNoRepository -ftemplate-depth-100 -I/share/OpenFOAM/OpenFOAM-dev/src/finiteVolume/lnInclude -I/share/OpenFOAM/OpenFOAM-dev/src/meshTools/lnInclude -IlnInclude -I. -I/share/OpenFOAM/OpenFOAM-dev/src/OpenFOAM/lnInclude -I/share/OpenFOAM/OpenFOAM-dev/src/OSspecific/POSIX/lnInclude -fPIC -fuse-ld=bfd -Xlinker --add-needed -Xlinker --no-as-needed /share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/applications/solvers/basic/laplacianFoam/laplacianFoam.o -L/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib \
    -lfiniteVolume -lfvOptions -lmeshTools -lOpenFOAM -ldl \
    -lm -o /share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/bin/laplacianFoam
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Comm_rank'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Abort'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_comm_null'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_group_null'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Alltoallv'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Scatterv'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Comm_group'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Comm_split'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Allreduce'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Bsend'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Buffer_detach'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_byte'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Recv'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Wait'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Test'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Alltoall'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Comm_size'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Waitall'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Get_count'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_double'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Comm_create'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Group_incl'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Probe'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Send'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_op_min'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Irecv'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Comm_free'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_comm_world'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Gatherv'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Init_thread'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Isend'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Finalize'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `ompi_mpi_op_sum'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Buffer_attach'
/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/lib/openmpi-4.0.2a1/libPstream.so: undefined reference to `MPI_Group_free'
collect2: error: ld returned 1 exit status
/share/OpenFOAM/OpenFOAM-dev/wmake/makefiles/general:140: recipe for target '/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/bin/laplacianFoam' failed
make[2]: *** [/share/OpenFOAM/OpenFOAM-dev/platforms/linux64GccDPInt32Opt/bin/laplacianFoam] Error 1
make[2]: going out of folder '/share/OpenFOAM/OpenFOAM-dev/applications/solvers/basic/laplacianFoam'
/share/OpenFOAM/OpenFOAM-dev/wmake/makefiles/apps:39: recipe for target 'laplacianFoam' failed
make[1]: *** [laplacianFoam] Error 2
make[1]: going out of folder '/share/OpenFOAM/OpenFOAM-dev/applications/solvers/basic'
/share/OpenFOAM/OpenFOAM-dev/wmake/makefiles/apps:39: recipe for target 'basic' failed
make: *** [basic] Error 2
guillote@guillote-VB:/share/OpenFOAM/OpenFOAM-dev$
Finally, here are the main checks and their outputs:
Code:
which mpirun
/share/hpcx-2.4.0/ompi/bin/mpirun
Code:
which mpicc
/share/hpcx-2.4.0/ompi/bin/mpicc
Code:
echo $HPCX_MPI_DIR
/share/hpcx-2.4.0/ompi
Code:
echo $WM_MPLIB
HPCXMPI
Code:
echo $MPI_ARCH_PATH
/share/hpcx-2.4.0/ompi
Code:
echo $FOAM_MPI
openmpi-4.0.2a1

Thank you very much in advance.
Guille
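The undefined references above all come from libPstream.so, which points to the MPI library never making it onto the link line rather than a broken HPC-X install. One way to confirm this (not from the original post; it just inspects the already-built Pstream library using the standard OpenFOAM environment variables) is:
Code:
# Does libPstream.so record a runtime dependency on libmpi?
# $FOAM_LIBBIN and $FOAM_MPI are set by the OpenFOAM environment (etc/bashrc).
ldd $FOAM_LIBBIN/$FOAM_MPI/libPstream.so | grep -i mpi

# List the MPI symbols that are still unresolved ('U' = undefined)
nm -D $FOAM_LIBBIN/$FOAM_MPI/libPstream.so | grep ' U MPI_'
If the ldd output shows no libmpi entry, the -lmpi flag was never applied when Pstream was built, which is exactly what the wmake rules file discussed later in this thread (post #4) provides.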
September 14, 2019, 10:17
#2
New Member
David
Join Date: Oct 2015
Location: mostly in France & Germany. Currently Palermo (Sicilia, Italy)
Posts: 4
Rep Power: 11
I also tried this (without going into as much detail) and would love to see it solved.

Here is why it is likely to bring a 50% speed gain on clusters with Mellanox InfiniBand: Mellanox's FCA, part of the HPC-X package, uses the processor on the Mellanox InfiniBand network adapters to offload some of the MPI calls, most notably MPI_Allreduce, which in the study linked below accounts for about 80% of all MPI time. See this study for details: https://www.hpcadvisorycouncil.com/p...l_2680_FCA.pdf And this link for the FCA package, part of the Mellanox HPC-X toolkit: https://www.mellanox.com/page/produc...s3imr630i3s2q5

A great day to you all!
September 16, 2019, 06:55
#3
Member
Join Date: Oct 2015
Location: Finland
Posts: 39
Rep Power: 11
Hey,
I would be interested in the outcome of this effort too. Our new cluster came with Mellanox HPC-X MPI 2.4.0, using OpenMPI 4.0.2. Although we have working OpenFOAM builds for several OF versions (4 to 7) and the stock solvers run fine on the tutorials, on larger cases with a higher number of nodes allocated we are getting MPI errors such as this one:
Code:
[r14c17.bullx:229996] pml_ucx.c:686 Error: bsend: failed to allocate buffer
[r14c17.bullx:229996] pml_ucx.c:797 Error: ucx send failed: No pending message
[r14c17:229996] *** An error occurred in MPI_Bsend
[r14c17:229996] *** reported by process [672351551,0]
[r14c17:229996] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[r14c17:229996] *** MPI_ERR_OTHER: known error not in list
[r14c17:229996] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[r14c17:229996] *** and potentially your MPI job)
srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
slurmstepd: error: *** STEP 205544.0 ON r14c17 CANCELLED AT 2019-09-16T12:32:20 ***

Best,
Bulut

EDIT: Possible cause for this error (ompi version): https://bugs.openfoam.org/view.php?id=3071

Last edited by blttkgl; September 23, 2019 at 04:41. Reason: Added a bug report
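The linked bug report points at the Open MPI/UCX version as the likely culprit, but for the "bsend: failed to allocate buffer" message specifically it can also be worth checking the buffered-send buffer size. OpenFOAM sizes the buffer it attaches for MPI_Bsend from the MPI_BUFFER_SIZE environment variable (a default is set by the etc/config.sh scripts); the value and solver below are only illustrative guesses, not a recommendation from this thread:
Code:
# Enlarge the buffer OpenFOAM attaches for MPI_Bsend (value is a guess; tune per case)
export MPI_BUFFER_SIZE=200000000

# Make sure the variable reaches every rank, e.g. with Open MPI
# (solver and rank count are placeholders):
mpirun -np 256 -x MPI_BUFFER_SIZE interFoam -parallel > log 2>&1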
October 13, 2019, 12:17
#4
New Member
Guillermo Tessi
Join Date: Sep 2019
Location: Argentina
Posts: 6
Rep Power: 7
Hello FOAMers,
I've finally managed to solve this issue. The problem was that the linker could not find the MPI library: I assumed it was already available through the LD_LIBRARY_PATH set when loading the HPC-X environment, but that is not how wmake resolves it. The wmake tool uses rules files to work out the compile and link flags for the dependencies needed by the source code. When a new MPI entry key is added to OpenFOAM-dev/etc/config.sh/mpi (as I did), wmake looks for a corresponding rules file under OpenFOAM-dev/wmake/rules/General. The rules file must be named "mplib" followed by the new MPI entry key; in my case, since I chose "HPCXMPI" as the key, the file is named mplibHPCXMPI. HPC-X uses OpenMPI, so I reused the contents of the mplibOPENMPI file. The mplibHPCXMPI file must contain:
Code:
PFLAGS = -DOMPI_SKIP_MPICXX
PINC   = -isystem $(MPI_ARCH_PATH)/include
PLIBS  = -L$(MPI_ARCH_PATH)/lib$(WM_COMPILER_LIB_ARCH) -L$(MPI_ARCH_PATH)/lib -lmpi
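For anyone reproducing this, the file can simply be created by copying the stock OpenMPI rules, since the contents above are the same (paths assume the usual OpenFOAM environment variables are loaded):
Code:
# Create the wmake rules file for the new WM_MPLIB key by copying the OpenMPI one
cd $WM_PROJECT_DIR/wmake/rules/General
cp mplibOPENMPI mplibHPCXMPI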
Now, once everything has compiled correctly, run the following commands in order to enable HPC-X and OpenFOAM:
Code:
ofdev-hpcx
hpcx_load
Code:
mpirun -x LD_LIBRARY_PATH -x PATH -x WM_PROJECT_DIR -x WM_PROJECT_INST_DIR -x WM_OPTIONS -x FOAM_APPBIN -x MPI_BUFFER_SIZE -hostfile machines -np 128 --bind-to socket --bind-to core -mca coll_hcoll_enable 1 -x HCOLL_MAIN_IB=mlx4_0:1 interFoam -case /10TB/KNLhcollgh88wingTest/ -parallel > log &

Guille

Last edited by guilleTessi; October 16, 2019 at 14:25.
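A side note on the run command: the -hostfile machines argument refers to a plain-text Open MPI hostfile that is not included in the thread. A minimal sketch of what such a file could look like (hostnames and slot counts are placeholders, not taken from the original post):
Code:
# Write a hypothetical 'machines' hostfile for Open MPI
# (hostnames and slots are placeholders - adapt to your cluster)
cat > machines << 'EOF'
node01 slots=64
node02 slots=64
EOF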
October 13, 2019, 12:24
#5
New Member
Guillermo Tessi
Join Date: Sep 2019
Location: Argentina
Posts: 6
Rep Power: 7
I forgot to attach a zip with the already-modified files in my previous post. Sorry.
October 13, 2019, 15:19
#6
New Member
David
Join Date: Oct 2015
Location: mostly in France & Germany. Currently Palermo (Sicilia, Italy)
Posts: 4
Rep Power: 11
Isn't there a post missing above, Guillermo?

EDIT: No worries, post #4 didn't show this morning but it appears now! Congrats Guillermo, I will test and post my timings.

Last edited by davidtimide; October 13, 2019 at 18:03.
October 18, 2019, 15:13
#7
New Member
Guillermo Tessi
Join Date: Sep 2019
Location: Argentina
Posts: 6
Rep Power: 7
UPDATE
The same setup is fully compatible with HPC-X v2.5.0.
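When moving to a newer HPC-X release it may be worth checking which Open MPI build it bundles, since the FOAM_MPI string in the HPCXMPI block of etc/config.sh/mpi names the MPI-specific build directory. This check is only a suggestion, not something from the original post:
Code:
# With the HPC-X environment loaded, report the bundled Open MPI version
mpirun --version
ompi_info | head -n 3

# If it no longer matches openmpi-4.0.2a1, update FOAM_MPI in the HPCXMPI
# block of etc/config.sh/mpi and rebuild the MPI-dependent libraries.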
Tags
fca, hpc-x, laplacianfoam, openmpi 4