October 7, 2017, 18:35
SU2 MPI Problem
#1 |
Member
Mehdi Mortazawy
Join Date: Mar 2017
Posts: 30
Rep Power: 9
Hello All,
I was wondering if there is any control over how MPI divides the mesh when running on an HPC cluster. Sometimes when I run a ~7 million cell 2D mesh, it works with 200 cores, but when I go to any value above that (let's say 360) it either gets stuck at the "Communicating ... halo layers" step or throws an MPI fatal error. Any comments or suggestions are appreciated.

Mehdi

Last edited by mhd_mrt; November 4, 2017 at 09:47. Reason: Error changed
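For reference, a minimal sketch of how a run like this is typically launched in parallel (assuming SU2_CFD was built with MPI support and is on the PATH; the core count and config file name below are just placeholders):

```python
# Minimal sketch of a parallel SU2 launch from a job script (assumed setup:
# SU2_CFD built with MPI and on PATH; the config file name is a placeholder).
import subprocess

n_cores = 200                     # runs fine; pushing toward ~360 triggers the hang / MPI error
config_file = "mesh_2d_7M.cfg"    # placeholder name for the actual SU2 config

# Standard SU2 parallel launch: mpirun -n <N> SU2_CFD <config>
subprocess.run(["mpirun", "-n", str(n_cores), "SU2_CFD", config_file], check=True)
```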
Tags
error, mpi, parallel computation, su2 |