[snappyHexMesh] snappyHexMesh in parallel with cyclics
June 27, 2011, 17:47
snappyHexMesh in parallel with cyclics
#1
Member
Tony
Join Date: Jun 2010
Posts: 54
Rep Power: 16
Hi all,
I am trying to use snappyHexMesh in parallel to refine a region in a large domain. The domain has the following boundary conditions:
Code:
boundaryField
{
    recycle_1
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }
    inflow
    {
        type            fixedValue;
        value           uniform (8.0 0.0 0.0);
    }
    outflow
    {
        type            inletOutlet;
        inletValue      uniform (0.0 0.0 0.0);
        value           uniform (8.0 0.0 0.0);
    }
    recycle_2
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }
}
The parallel run fails with:
Code:
using 64 processors

[38]
[38]
[38] --> FOAM FATAL IO ERROR:
[38] size 0 is not equal to the given value of 96
[38]
[38] file: /scratch/lmartine/grid_Resolution/turbineMesh/processor38/0/p::boundaryField::recycle_1 from line 26 to line 28.
[38]
[38] From function Field<Type>::Field(const word& keyword, const dictionary&, const label)
[38] in file /projects/nrel/apps/openfoam/src/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/Field.C at line 236.
[38]
[38] FOAM parallel run exiting
[38]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 38 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 38 with PID 21329 on
node rr124 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
I'm running OpenFOAM v1.7.1. The case is attached (case.zip).
Thanks!
Tony
June 29, 2011, 11:43
#2
Member
Tony
Join Date: Jun 2010
Posts: 54
Rep Power: 16
Hi all,
I was able to get it to work by running reconstructParMesh and then decomposePar again. This is not an efficient fix, since these utilities will become a bottleneck as the grids get bigger. As far as I can tell, the problem is that after snappyHexMesh refines the mesh in parallel, the decomposed 0/ field files no longer match the new patch face counts, which is why it complains that "size 0 is not equal to the given value of 96". I found the temporary fix in this bug report: Bug. A rough sketch of the sequence is below.
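This is only a sketch of the workaround I mean, assuming a 64-processor run like the one above; the mergeTol value here is just an example, not something I tuned:
Code:
#!/bin/sh
# Decompose and refine in parallel as usual
decomposePar
mpirun -np 64 snappyHexMesh -overwrite -parallel

# Workaround: rebuild the full mesh from the processor directories,
# then decompose again so the 0/ fields match the refined mesh
reconstructParMesh -constant -mergeTol 1e-6
rm -rf processor*
decomposePar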
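Something else that might be worth trying, though I have not verified it on 1.7.1, is telling decomposePar to keep both halves of each cyclic pair on the same processor with preservePatches in system/decomposeParDict, along these lines:
Code:
// system/decomposeParDict (fragment)
numberOfSubdomains 64;

method          simple;

simpleCoeffs
{
    n           (4 4 4);
    delta       0.001;
}

// keep the cyclic patch pairs on a single processor
preservePatches (recycle_1 recycle_2);
If that works, it would avoid the reconstruct/decompose round trip entirely.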