February 9, 2018, 11:06
1712 finiteArea Parallel bug?
#1
Member
Andrew Coughtrie
Join Date: May 2011
Posts: 51
Rep Power: 15
Hi everyone,
I'm not sure whether this is a bug or whether parallel running simply hasn't been implemented for the finiteArea method in 1712 yet. Whenever I try to run one of the finiteArea solvers in parallel I get the following error (I know it isn't very helpful, but it's all I've got):
Code:
[ceg-w617:9451] *** An error occurred in MPI_Recv
[ceg-w617:9451] *** reported by process [2378104833,0]
[ceg-w617:9451] *** on communicator MPI_COMM_WORLD
[ceg-w617:9451] *** MPI_ERR_TRUNCATE: message truncated
[ceg-w617:9451] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[ceg-w617:9451] ***    and potentially your MPI job)
[ceg-w617:9452] *** An error occurred in MPI_Recv
[ceg-w617:9452] *** reported by process [2378104833,1]
[ceg-w617:9452] *** on communicator MPI_COMM_WORLD
[ceg-w617:9452] *** MPI_ERR_TRUNCATE: message truncated
[ceg-w617:9452] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[ceg-w617:9452] ***    and potentially your MPI job)
Thanks,
Andy
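For context on this class of error: MPI_ERR_TRUNCATE is raised when a receiver posts a buffer smaller than the incoming message. The standalone sketch below (not OpenFOAM code; the file name and values are made up for illustration) reproduces the same fatal truncation by sending 8-byte labels and receiving them into 4-byte slots, which is consistent with a 64-bit/32-bit label-size mismatch somewhere in the parallel exchange:
Code:
// truncate_demo.cpp -- hypothetical standalone example, not from OpenFOAM.
// Build:  mpicxx truncate_demo.cpp -o truncate_demo
// Run:    mpirun -np 2 ./truncate_demo
#include <mpi.h>
#include <cstdint>
#include <cstdio>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);
    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
    {
        // Sender packs labels as 8-byte integers (think WM_LABEL_SIZE=64).
        int64_t labels[4] = {0, 1, 2, 3};
        MPI_Send(labels, 4*sizeof(int64_t), MPI_BYTE, 1, 0, MPI_COMM_WORLD);
    }
    else if (rank == 1)
    {
        // Receiver expects 4-byte labels (think WM_LABEL_SIZE=32), so the
        // incoming message is larger than the buffer: MPI_Recv fails with
        // MPI_ERR_TRUNCATE and, under the default MPI_ERRORS_ARE_FATAL
        // handler, the job aborts just like the log above.
        int32_t labels[4] = {0};
        MPI_Recv(labels, 4*sizeof(int32_t), MPI_BYTE, 0, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        std::printf("received %d\n", (int)labels[0]);
    }

    MPI_Finalize();
    return 0;
}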
February 14, 2018, 12:02
64 Bit Labels
#2
Member
Andrew Coughtrie
Join Date: May 2011
Posts: 51
Rep Power: 15
Seems to work fine when I recompile with 32-bit labels.
Andy
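For anyone searching later, the rebuild looks roughly like this, assuming an OpenFOAM.com (v1712) source install whose etc/bashrc accepts key=value overrides; the install path is only an example, adjust it to your setup:
Code:
# Select 32-bit labels via an etc/bashrc override (path is an example).
source $HOME/OpenFOAM/OpenFOAM-v1712/etc/bashrc WM_LABEL_SIZE=32

# Recompile so the libraries and solvers pick up the new label size.
cd $WM_PROJECT_DIR
./Allwmake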
Tags: finite area method, mpi error, parallel