February 7, 2022, 07:43 |
MPI Error in Custom Utility
#1
New Member
Join Date: Nov 2016
Posts: 8
Rep Power: 10
Hi everyone,
I've written a bespoke utility that identifies droplets and other structures in a VOF field, and I want to parallelise it so that it can be incorporated into a multiphase solver. The parallelisation is almost complete; however, I've run into some edge cases that I cannot solve.

Without going into endless detail, the parallelisation works by passing droplet IDs across processor boundaries using a volScalarField that tracks which droplet each cell belongs to. The utility then writes out a connectivity file for each processor, so that droplets crossing processor boundaries can be reconstructed in a post-processing step. My current issue seems to occur only when a processor domain contains no droplets requiring identification.

The following loop runs over all processor patches and, wherever the adjacent cell value is greater than 0, sets the patch face value to that cell-centre value:
Code:
forAll(mesh.boundaryMesh(), patchi)
{
    if (isA<processorPolyPatch>(mesh.boundaryMesh()[patchi]))
    {
        forAll(id.boundaryField()[patchi], facei)
        {
            label adjacentCell = mesh.boundaryMesh()[patchi].faceCells()[facei];

            if (id[adjacentCell] > 0)
            {
                id.boundaryFieldRef()[patchi][facei] = id[adjacentCell];
            }
        }

        id.boundaryFieldRef()[patchi].initEvaluate(Pstream::commsTypes::nonBlocking);
        id.boundaryFieldRef()[patchi].evaluate(Pstream::commsTypes::nonBlocking);
    }
}
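As an aside, to make the intent clearer: if I understand syncTools correctly, the same neighbour-cell lookup could presumably also be written along the lines of the untested sketch below (cellIds and nbrCellIds are just illustrative names), so that every boundary face ends up holding the droplet ID of the cell on the other side of a coupled boundary.
Code:
#include "syncTools.H"

// Copy the cell-based droplet IDs into a plain labelList
labelList cellIds(mesh.nCells());
forAll(cellIds, celli)
{
    cellIds[celli] = label(id[celli]);
}

// Swap across coupled boundaries: nbrCellIds[facei - mesh.nInternalFaces()]
// then holds the droplet ID of the cell on the far side of each coupled
// face (and the local cell's ID on non-coupled patches).
labelList nbrCellIds;
syncTools::swapBoundaryCellList(mesh, cellIds, nbrCellIds);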
Running the patch loop gives the following error:
Code:
[proteus:25213] *** An error occurred in MPI_Wait
[proteus:25213] *** reported by process [1815216129,7]
[proteus:25213] *** on communicator MPI_COMM_WORLD
[proteus:25213] *** MPI_ERR_TRUNCATE: message truncated
[proteus:25213] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[proteus:25213] *** and potentially your MPI job)

In an attempt to synchronise the processors before this loop, I added the following reduce call, but this also throws an error.
Code:
label tmp = Pstream::myProcNo();
reduce(tmp, maxOp<label>());
Code:
[proteus:25771] Read -1, expected 86400, errno = 14
[proteus:25771] *** An error occurred in MPI_Recv
[proteus:25771] *** reported by process [4918845867728568321,7]
[proteus:25771] *** on communicator MPI_COMM_WORLD
[proteus:25771] *** MPI_ERR_TRUNCATE: message truncated
[proteus:25771] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[proteus:25771] *** and potentially your MPI job)
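For completeness, the pattern I believe the library itself uses for non-blocking exchanges (a rough, untested sketch based on my reading of the boundary-field evaluate() code) is to post every initEvaluate() first, wait on the outstanding requests, and only then call evaluate(), rather than interleaving the two per patch.
Code:
// Post all the non-blocking sends/receives for the processor patches first
const label nReq = Pstream::nRequests();

forAll(id.boundaryField(), patchi)
{
    if (isA<processorPolyPatch>(mesh.boundaryMesh()[patchi]))
    {
        id.boundaryFieldRef()[patchi].initEvaluate(Pstream::commsTypes::nonBlocking);
    }
}

// Block until every transfer posted above has completed
Pstream::waitRequests(nReq);

// Only now consume the received values
forAll(id.boundaryField(), patchi)
{
    if (isA<processorPolyPatch>(mesh.boundaryMesh()[patchi]))
    {
        id.boundaryFieldRef()[patchi].evaluate(Pstream::commsTypes::nonBlocking);
    }
}
I assume calling id.correctBoundaryConditions() after the assignment loop would do the same exchange in a single call, at the cost of also re-evaluating the non-processor patches.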
Tags |
mpi error, openfoam v5.0 |