
MPI truncation error when running in Parallel

May 14, 2020, 07:26   #1
ConorMD (Conor) - New Member
Join Date: Oct 2016 | Posts: 14
Hi there,

I am encountering an MPI error when running an FSI model in parallel, coupling OpenFOAM with CalculiX through preCICE. I have tried a variety of decomposition strategies without success, and I do not see this error when running the CFD solver alone in parallel. Does anyone know what might be causing it? The full log is below, and the launch commands are sketched after it.

Thanks,
Conor


Code:
PIMPLE: Converged in 5 iterations
ExecutionTime = 138.33 s  ClockTime = 138 s

---[preciceAdapter] [DEBUG] Writing coupling data...
---[preciceAdapter] [DEBUG] Advancing preCICE...
(2) 11:18:05 [impl::SolverInterfaceImpl]:1307 in mapReadData: Compute read mapping from mesh "Calculix_Mesh" to mesh "Fluid-Mesh-Nodes".
(2) 11:18:05 [mapping::NearestProjectionMapping]:134 in computeMapping: WARNING: 3D Mesh "Calculix_Mesh" does not contain triangles. Nearest projection mapping will map to primitives of lower dimension.
(1) 11:18:05 [impl::SolverInterfaceImpl]:1307 in mapReadData: Compute read mapping from mesh "Calculix_Mesh" to mesh "Fluid-Mesh-Nodes".
(0) 11:18:05 [impl::SolverInterfaceImpl]:1307 in mapReadData: Compute read mapping from mesh "Calculix_Mesh" to mesh "Fluid-Mesh-Nodes".
(1) 11:18:05 [mapping::NearestProjectionMapping]:134 in computeMapping: WARNING: 3D Mesh "Calculix_Mesh" does not contain triangles. Nearest projection mapping will map to primitives of lower dimension.
(0) 11:18:05 [mapping::NearestProjectionMapping]:134 in computeMapping: WARNING: 3D Mesh "Calculix_Mesh" does not contain triangles. Nearest projection mapping will map to primitives of lower dimension.
(2) 11:18:05 [mapping::NearestProjectionMapping]:210 in computeMapping: Mapping distance min:883.084 max:883.087 avg: 883.085 var: 9.58797e-07 cnt: 545
(2) 11:18:05 [impl::SolverInterfaceImpl]:378 in advance: it 2 of 4 | dt# 1 | t 0 of 15 | dt 0.0001 | max dt 0.0001 | ongoing yes | dt complete no | read-iteration-checkpoint | 
(0) 11:18:05 [mapping::NearestProjectionMapping]:210 in computeMapping: Mapping distance min:883.08 max:883.103 avg: 883.092 var: 3.08115e-05 cnt: 6106
(0) 11:18:05 [impl::SolverInterfaceImpl]:378 in advance: it 2 of 4 | dt# 1 | t 0 of 15 | dt 0.0001 | max dt 0.0001 | ongoing yes | dt complete no | read-iteration-checkpoint | 
---[preciceAdapter] [DEBUG] Reading a checkpoint...
---[preciceAdapter] [DEBUG] Reloaded time value t = 0.000000
---[preciceAdapter] [DEBUG] Moving mesh points to their previous locations...
(1) 11:18:05 [mapping::NearestProjectionMapping]:210 in computeMapping: Mapping distance min:883.079 max:883.104 avg: 883.089 var: 2.8458e-05 cnt: 14477
(1) 11:18:05 [impl::SolverInterfaceImpl]:378 in advance: it 2 of 4 | dt# 1 | t 0 of 15 | dt 0.0001 | max dt 0.0001 | ongoing yes | dt complete no | read-iteration-checkpoint | 
AMI: Creating addressing and weights between 2622 source faces and 2622 target faces
AMI: Patch source sum(weights) min/max/average = 0.999999999999996, 1, 1
AMI: Patch target sum(weights) min/max/average = 0.999999999999923, 1.00000000000005, 1
---[preciceAdapter] [DEBUG] Moved mesh points to their previous locations.
---[preciceAdapter] [DEBUG] Checkpoint was read. Time = 0.000000
---[preciceAdapter] [DEBUG] Reading coupling data...
---[preciceAdapter] [DEBUG] Adjusting the solver's timestep...
---[preciceAdapter] [DEBUG] The solver's timestep is the same as the coupling timestep.
forces foil1_forces write:
    sum of forces:
        pressure : (0 0 0)
        viscous  : (10.1662639064385 -3.61537727703529e-11 -8.42274018146461e-11)
        porous   : (0 0 0)
    sum of moments:
        pressure : (0 0 0)
        viscous  : (-1.05289660430678e-08 4.44774051343602 5.08313195340344)
        porous   : (0 0 0)

[orpc-cfd:11463] *** An error occurred in MPI_Recv
[orpc-cfd:11463] *** reported by process [1067515905,0]
[orpc-cfd:11463] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[orpc-cfd:11463] *** MPI_ERR_TRUNCATE: message truncated
[orpc-cfd:11463] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[orpc-cfd:11463] ***    and potentially your MPI job)
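
For reference, the coupled case is launched with commands along these lines; the solver name, core count and CalculiX input deck are placeholders rather than my exact files:

Code:
# decompose the fluid case (decomposition method set in system/decomposeParDict)
decomposePar -force

# fluid participant: OpenFOAM solver with the preCICE adapter loaded as a functionObject
mpirun -np 3 pimpleFoam -parallel > log.fluid 2>&1 &

# solid participant: CalculiX with the preCICE adapter (participant name from precice-config.xml)
ccx_preCICE -i solid -precice-participant Solid > log.solid 2>&1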

May 20, 2020, 19:04   #2
HPE - Senior Member
Join Date: Sep 2019 | Posts: 931
Hi,

- Have you filed any bug tickets with preCICE (https://github.com/precice/precice/issues)? The error might originate in a component other than OpenFOAM.
- Were you able to run the coupling with a single processor? A serial test is sketched below.
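
Something along these lines, assuming a pimpleFoam-based fluid case and a CalculiX input deck named `solid`; the solver name, deck name and participant name are assumptions on my side, not taken from your case:

Code:
# serial sanity check: undecomposed fluid case, no mpirun
pimpleFoam > log.fluid 2>&1 &

# CalculiX with the preCICE adapter; participant name must match precice-config.xml
ccx_preCICE -i solid -precice-participant Solid > log.solid 2>&1

If the serial run completes cleanly, the truncation error is more likely tied to the parallel communication setup than to the case itself.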
HPE is offline   Reply With Quote


Tags
fsi, mpi, parallel





