
[snappyHexMesh] snappyHexMesh in parallel with cyclics


June 27, 2011, 17:47 | #1
Tony (tonyuprm)
Hi all,

I am trying to use snappyHexMesh in parallel to refine a region in a large domain. The domain has the following boundary conditions:

Code:
boundaryField
{
    recycle_1
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }

    inflow
    {
        type            fixedValue;
        value           uniform (8.0 0.0 0.0);
    }

    outflow
    {
        type            inletOutlet;
        inletValue      uniform (0.0 0.0 0.0);
        value           uniform (8.0 0.0 0.0);
    }

    recycle_2
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }
}
I am able to run blockMesh, decompose the case with decomposePar (using parMetis), and refine the mesh with snappyHexMesh in parallel. I am using the preservePatches option in decomposeParDict for the cyclic patches, and checkMesh reports that the mesh is OK. The relevant decomposeParDict entries are sketched below.
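For readers without the attachment, the decomposeParDict entries in question look roughly like this (a sketch; the attached file is the real thing, and the patch names are those of my case):

Code:
numberOfSubdomains  64;

// parMetis decomposition, as used for this case
method              parMetis;

// Keep each cyclic patch together on one processor so the
// coupled faces are not split across processor boundaries.
preservePatches     (recycle_1 recycle_2);
After meshing, however, I am not able to run my solver; it fails with the following errors: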

Code:
using  64  processors
[38]
[38]
[38] --> FOAM FATAL IO ERROR:
[38] size 0 is not equal to the given value of 96
[38]
[38] file: /scratch/lmartine/grid_Resolution/turbineMesh/processor38/0/p::boundaryField::recycle_1 from line 26 to line 28.
[38]
[38]     From function Field<Type>::Field(const word& keyword, const dictionary&, const label)
[38]     in file /projects/nrel/apps/openfoam/src/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/Field.C at line 236.
[38]
FOAM parallel run exiting
[38]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 38 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 38 with PID 21329 on
node rr124 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
Attached are my blockMeshDict, decomposeParDict and snappyHexMeshDict files.
I'm running OpenFOAM v1.7.1.

case.zip

Thanks!

Tony

June 29, 2011, 11:43 | #2
Tony (tonyuprm)
Hi all,

I was able to get it to work by running reconstructParMesh and then decomposePar again. This is not an efficient fix, since these two serial utilities will become a bottleneck as the grids get bigger. I found the temporary fix in this bug report:

Bug
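For anyone hitting the same error, the workaround amounts to the following sequence (a sketch; the mergeTol value here is an assumption, tune it to your case):

Code:
# Rebuild the complete mesh from the processor directories
# (snappyHexMesh writes the refined mesh to the latest time).
reconstructParMesh -mergeTol 1e-6

# Throw away the old decomposition and decompose again; this
# rewrites the processor-local 0/ fields with patch value
# sizes that match the new mesh, including the cyclics.
rm -rf processor*
decomposePar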
