November 6, 2018, 17:36 |
compiling with openmpi 2018
|
#1 |
Member
Tom Jentink
Join Date: Jan 2013
Posts: 61
Rep Power: 13 |
Getting an error on csr.c in the METIS routines. It's just a warning, but my parallel config is not running in parallel, so I'm wondering if this is the problem.
CC GKlib/libmetis_a-csr.o
GKlib/csr.c(796): warning #3180: unrecognized OpenMP #pragma
  #pragma omp parallel private(i, j, ncand, rsum, tsum, cand)
GKlib/csr.c(800): warning #3180: unrecognized OpenMP #pragma
  #pragma omp for schedule(static) |
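Warning #3180 from the Intel compiler means the `#pragma omp` lines in csr.c are parsed but ignored because OpenMP support is not enabled for that sub-build. A minimal sketch of passing an OpenMP flag through to the bundled METIS sources (the exact configure option and flag spelling are assumptions; GCC accepts -fopenmp, while Intel 2018 compilers prefer -qopenmp):

```shell
# Pass the OpenMP flag through to the bundled METIS/GKlib compile.
# -fopenmp is the GCC spelling (also accepted by icc as a compatibility alias);
# -qopenmp is the native flag on Intel 2018 compilers.
./configure --with-metis-cppflags="-fopenmp"
# or, with the Intel-native spelling:
./configure --with-metis-cppflags="-qopenmp"
```

Either spelling should silence the warning; whether the parallel loops in csr.c actually matter for the partitioning performance is a separate question.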
|
November 6, 2018, 18:11 |
|
#2 |
Member
Tom Jentink
Join Date: Jan 2013
Posts: 61
Rep Power: 13 |
I got rid of the warning with --with-metis-cppflags="-fopenmp".
Now to see if it will run in parallel. |
|
November 7, 2018, 12:06 |
|
#3 |
Member
Tom Jentink
Join Date: Jan 2013
Posts: 61
Rep Power: 13 |
It won't run the partitioning step now. It gets to "Read Grid File Information", lists the number of points, and then dies with "could not launch executable".
I'm using a PBS scheduler, and I'm fairly sure my paths and environment are set up properly. |
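"Could not launch executable" under a batch scheduler often means the MPI launcher on the compute nodes cannot resolve the binary or its libraries, since batch shells do not always inherit the login environment. A hedged sketch of a PBS job script that sets the paths explicitly (the module names, resource request, process count, and config file name are all placeholders for illustration):

```shell
#!/bin/bash
#PBS -l select=2:ncpus=16:mpiprocs=16   # hypothetical resource request
#PBS -l walltime=01:00:00
cd "$PBS_O_WORKDIR"
# Load the same toolchain used at build time and export the install path
# explicitly, so the compute nodes can find SU2_CFD and its MPI libraries.
module load openmpi/3.0.1 intel/2018.2.199   # hypothetical module names
export PATH="$HOME/SU2/bin:$PATH"            # hypothetical install prefix
mpirun -np 32 SU2_CFD config.cfg             # config.cfg is a placeholder
```

If the executable still cannot be launched, running `which SU2_CFD` inside the job script is a quick way to confirm whether the path is visible on the compute nodes at all.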
|
November 8, 2018, 06:53 |
|
#4 |
Senior Member
Pedro Gomes
Join Date: Dec 2017
Posts: 466
Rep Power: 14 |
What version of openmpi are you using specifically (x.y.z)?
How are you compiling the code? Before using a scheduler, have you tried running on a local machine? |
|
November 8, 2018, 09:41 |
|
#5 |
Member
Tom Jentink
Join Date: Jan 2013
Posts: 61
Rep Power: 13 |
openmpi_3.0.1_intel_2018
intel_2018.2.199
Haven't tried serial, since it's really of no use to me and I can't run interactively on our main cluster. Right now it seems to be a path issue to whatever is called to do the grid partitioning. But when I had that step working, it would load the entire ParMETIS job onto one CPU and eventually die from maxing out the memory. |
|
November 8, 2018, 16:58 |
Solved
|
#6 |
Member
Tom Jentink
Join Date: Jan 2013
Posts: 61
Rep Power: 13 |
I went with MPT instead of OpenMPI and it is working!
./configure --prefix=/u/tjentink/SU2 \
  --exec-prefix=/u/tjentink/SU2 \
  --docdir=/u/tjentink/SU2 \
  CXXFLAGS=-O2 \
  --enable-metis --enable-parmetis --enable-cgns \
  --enable-mpi --with-cc=mpicc --with-cxx=mpicxx \
  --enable-tecio \
  --libdir=/opt/hpe/hpc/mpt/mpt-2.16 |
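For completeness, HPE MPT jobs are typically launched with mpiexec_mpt rather than OpenMPI's mpirun; a minimal launch sketch under that assumption (the process count and config file name are placeholders):

```shell
# Inside the PBS job script, after loading the MPT module:
mpiexec_mpt -np 32 SU2_CFD config.cfg
```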