Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Meshing & Mesh Conversion

[snappyHexMesh] reconstructPar after running snappyhexmesh in parallel

Old   October 17, 2020, 09:53
Default reconstructPar after running snappyhexmesh in parallel
  #1
New Member
 
David
Join Date: Oct 2020
Posts: 21
Hi, I would like to reconstruct my case after running snappyHexMesh in parallel, but I am unable to do so. Whenever I try, I get the following error message:
Code:
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

--> FOAM Warning :
    From function int main(int, char**)
    in file reconstructPar.C at line 256
I would like to be able to check the mesh before submitting it to the solver buoyantBoussinesqSimpleFoam. However, I would already be satisfied if I could directly submit the case to the solver in parallel after running snappyHexMesh with the same number of cores. The problem there is that my boundary conditions are messed up, as the following error message shows:
Code:
--> FOAM FATAL IO ERROR:
[5] Cannot find patchField entry for heating
[5]
[5] file: /cluster/scratch/kaeserd/07_againg_new_coordinates_infolow/processor5/0/T.boundaryField
[5]
[5]     From function void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[5]     in file /dev/shm/spackapps/spack-stage/spack-stage-7EXqC3/OpenFOAM-v1806/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 191.
[5]
FOAM parallel run exiting
I assume this is because my boundary conditions are split up across the processors (example from processor0/0/T) like this:
Code:
boundaryField
{
    inlet
    {
        type            fixedValue;
        value           uniform 298.15;
    }

    outlet
    {
        type            zeroGradient;
    }

    ground
    {
        type            fixedValue;
        value           uniform 333.15;
    }

    frontAndBack
    {
        type            zeroGradient;
    }

    procBoundary0to1
    {
        type            processor;
        value           uniform 298.15;
    }

    procBoundary0to3
    {
        type            processor;
        value           uniform 298.15;
    }

    procBoundary0to12
    {
        type            processor;
        value           uniform 298.15;
    }

    procBoundary0to15
    {
        type            processor;
        value           uniform 298.15;
    }
}
My snappyHexMeshDict file looks like this:
Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      snappyHexMeshDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

#includeEtc "caseDicts/mesh/generation/snappyHexMeshDict.cfg"

castellatedMesh on;
snap            on;
addLayers       off;

geometry
{
    heating.stl
    {
        type triSurfaceMesh;
	    scale 0.001;
        name heating;
    }
    buildings.stl
    {
        type triSurfaceMesh;
	    scale 0.001;
        name buildings;
    }

};

castellatedMeshControls
{
//    nCellsBetweenLevels 3;

    features
    (
      { file  "heating.eMesh"; scale 0.001; level 3; }
      { file  "buildings.eMesh"; scale 0.001; level 3; }
    );

    refinementSurfaces
    {
        buildings
        {
            level (2 3);
            patchInfo { type wall; }
        }
        heating
        {
            level (2 3);
            patchInfo { type wall; }
        }
    }

    refinementRegions
    {
        heating
        {
            mode distance;
            levels ((0.077 2)); 
        }
        buildings
        {
            mode distance;
            levels ((0.077 2)); 
        }
    }

    locationInMesh (2 1 1);

    resolveFeatureAngle 60;
    allowFreeStandingZoneFaces true;

}

snapControls
{
    //- Number of patch smoothing iterations before finding correspondence
    //  to surface
    nSmoothPatch 3;

    //- Relative distance for points to be attracted by surface feature point
    //  or edge. True distance is this factor times local
    //  maximum edge length.
    tolerance 2.0;

    //- Number of mesh displacement relaxation iterations.
    nSolveIter 30;

    //- Maximum number of snapping relaxation iterations. Should stop
    //  before upon reaching a correct mesh.
    nRelaxIter 5;


    // Feature snapping

        //- Number of feature edge snapping iterations.
        //  Leave out altogether to disable.
        nFeatureSnapIter 10;

        //- Detect (geometric) features by sampling the surface (default=false)
        implicitFeatureSnap true;

        //- Use castellatedMeshControls::features (default = true)
        explicitFeatureSnap false;
}

addLayersControls
{
    layers
    {
        "CAD.*"
        {
            nSurfaceLayers 2;
        }
    }

    relativeSizes       true;
    expansionRatio      1.2;
    finalLayerThickness 0.5;
    minThickness        1e-3;
}

meshQualityControls
{

maxConcave 70;
minTetQuality 1E-12;
maxInternalSkewness 5;
maxBoundarySkewness 25;
}

writeFlags
(
    scalarLevels
    layerSets
    layerFields
);

mergeTolerance 1e-6;

// ************************************************************************* //
My decomposeParDict file is this:
Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 48;

method          hierarchical;
// method          ptscotch;

simpleCoeffs
{
    n               (4 1 1);
    delta           0.001;
}

hierarchicalCoeffs
{
    n               (3 4 4);
    delta           0.001;
    order           xyz;
}

manualCoeffs
{
    dataFile        "cellDecomposition";
}


// ************************************************************************* //
Until now I have run the case as follows:
blockMesh
surfaceFeatureExtract
decomposePar
mpirun -np 48 snappyHexMesh -overwrite -parallel
(I would like to reconstruct here)
mpirun -n 48 buoyantBoussinesqSimpleFoam -parallel
(and here )

Until now I had run snappyHexMesh in serial, and in that case it worked.

Thank you, and sorry if my question is missing some important information. Thanks a lot in advance.

best regards

fidu13

Last edited by fidu; October 18, 2020 at 16:06.

Old   October 18, 2020, 15:10
Default
  #2
Senior Member
 
Zander Meiring
Join Date: Jul 2018
Posts: 125
Have you tried making use of the scotch method for decomposition? It could be that your structured decomposition breaks some of the boundaries in a strange way.

Old   October 18, 2020, 16:11
Default
  #3
New Member
 
David
Join Date: Oct 2020
Posts: 21
Thanks a lot for your reply! Not yet, but I will try it as soon as possible. However, what I meant by "messed up" were the following boundary conditions:
Code:
procBoundary0to1
{
    type            processor;
    value           uniform 298.15;
}
I would like these boundary conditions to be set back to my initial boundary conditions after the reconstruction, so that I can decompose the case again and submit it to the solver.

thanks again and sorry for the confusion

Old   October 18, 2020, 20:33
Default
  #4
Member
 
Hasan Celik
Join Date: Sep 2016
Posts: 64
I don't fully understand the problem, and I am sorry if you have already tried this and it failed, but have you tried reconstructParMesh after the parallel run of snappyHexMesh? For instance, something like this:

Code:
reconstructParMesh -latestTime -mergeTol 1E-06 -noZero

Old   October 18, 2020, 23:23
Default
  #5
Senior Member
 
Join Date: Aug 2013
Posts: 407
Hi,

Hasan is right. If you have meshed in parallel, then you must do
Code:
reconstructParMesh
to reassemble the mesh.

Code:
reconstructPar
is typically used to reassemble the solution at all (or particular) time steps.

So coming back to your workflow:
Quote:
mpirun -np 48 snappyHexMesh -overwrite -parallel
(I would like to reconstruct here)
mpirun -n 48 buoyantBoussinesqSimpleFoam -parallel
(and here )
What you would need to do is to run
Code:
reconstructParMesh
(with arguments that Hasan has stated) after running
Code:
snappyHexMesh
in parallel.

Once you have run your solver, you can run
Code:
reconstructPar
for the solution. Note that as long as you haven't remeshed, you don't need to run
Code:
reconstructParMesh
again (the one you do after the
Code:
snappyHexMesh
already has the updated mesh). And so only
Code:
reconstructPar
suffices.
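The sequence described above can be sketched as a minimal run script. This is only a sketch based on the commands already posted in this thread (the core count and reconstructParMesh arguments are the original poster's, and a later post in this thread refines the reconstructParMesh options), so adjust it to your own case:

```shell
#!/bin/sh
# Sketch of the workflow discussed in this thread.
# Assumes an OpenFOAM environment is already sourced.

blockMesh
surfaceFeatureExtract
decomposePar

# Mesh in parallel
mpirun -np 48 snappyHexMesh -overwrite -parallel

# Reassemble the MESH (not the solution fields)
reconstructParMesh -latestTime -mergeTol 1e-06 -noZero

# Run the solver in parallel
mpirun -np 48 buoyantBoussinesqSimpleFoam -parallel

# Reassemble the SOLUTION fields afterwards; as long as the mesh
# has not changed since the last reconstructParMesh, this suffices.
reconstructPar
```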

Hope this helps

Cheers,
Antimony

Old   October 19, 2020, 06:24
Default
  #6
New Member
 
David
Join Date: Oct 2020
Posts: 21
Hi, and thanks a lot for your help. I have now managed to run snappyHexMesh and reconstructParMesh. However, if I now try to run the solver, I still get the following error message from each processor:

Code:
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

[eu-g1-029-4:29347] 47 more processes have sent help message help-mpi-btl-openib.txt / error in device init
[eu-g1-029-4:29347] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

SIMPLE: convergence criteria
    field p_rgh  tolerance 0.0001
    field Ux     tolerance 0.0001
    field Uy     tolerance 0.0001
    field Uz     tolerance 0.0001
    field T      tolerance 0.0001
    field "(k|epsilon|omega)"    tolerance 0.0001

Reading thermophysical properties

Reading field T

[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] Cannot find patchField entry for heating
[3]
[3] file: /cluster/scratch/kaeserd/parallel/1/new_case/05_new_coordinates_inflow/processor3/0/T.boundaryField
[3]
[3]     From function void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[3]     in file /dev/shm/spackapps/spack-stage/spack-stage-7EXqC3/OpenFOAM-v1806/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 191.
[3]
FOAM parallel run exiting
[3]
heating is an STL file with which I also define the boundary conditions. If I have a look in the processor0/0/T directory, I see that the boundaries are defined as follows:
Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  v1806                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       volScalarField;
    location    "0";
    object      T;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

dimensions      [0 0 0 1 0 0 0];


internalField   uniform 298.15;

boundaryField
{
    inlet
    {
        type            fixedValue;
        value           uniform 298.15;
    }
    outlet
    {
        type            zeroGradient;
    }
    ground
    {
        type            fixedValue;
        value           uniform 333.15;
    }
    frontAndBack
    {
        type            zeroGradient;
    }
    procBoundary0to1
    {
        type            processor;
        value           uniform 298.15;
    }
    procBoundary0to3
    {
        type            processor;
        value           uniform 298.15;
    }
    procBoundary0to12
    {
        type            processor;
        value           uniform 298.15;
    }
    procBoundary0to15
    {
        type            processor;
        value           uniform 298.15;
    }
}
How can I make sure that the initial boundary conditions are updated, or that the solver knows which condition it has to apply in each processor?

Thanks again for your help.

Old   October 19, 2020, 06:58
Default
  #7
Senior Member
 
Yann
Join Date: Apr 2012
Location: France
Posts: 1,236
Hello Fidu,

Try opening the boundary file located in processor*/constant/polyMesh/boundary

There you will see the names of your domain's boundaries. Each boundary should have a boundary condition defined for each variable in the 0 folder.

Looking at the error message in your last post, you have a boundary named "heating", and the solver crashes because there is no boundary condition defined for this patch.

As long as you use decomposePar to distribute your 0 folder, you should not have to worry about the "procBoundary*" patches. Those are the interface patches between processors, and they are created automatically by decomposePar.
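As a quick check, you can list the patch names present in each decomposed mesh region from the shell. This is just a sketch (the grep pattern assumes the usual four-space indentation of OpenFOAM boundary files); every name printed here should also appear in each processor*/0 field file:

```shell
# List patch names in each decomposed mesh region (sketch).
# In OpenFOAM v1806 you can also query a field's patches directly:
#   foamDictionary -entry boundaryField -keywords processor0/0/T
for d in processor*/constant/polyMesh; do
    echo "== $d =="
    # patch names are the indented lines containing only an identifier
    grep -E '^    [A-Za-z][A-Za-z0-9_]*$' "$d/boundary"
done
```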

I hope this helps,
Yann

Old   October 19, 2020, 10:54
Default
  #8
New Member
 
David
Join Date: Oct 2020
Posts: 21
Hi Yann

Thanks for your reply. I checked my boundary file in processor*/constant/polyMesh/boundary, and heating is defined there. However, when I checked processor*/0/T, heating was not defined. I also noticed that in the other files (U, p, p_rgh, alphat, epsilon, etc.) the heating condition was not defined, even though I did define it in the original 0 directory. When I tried to change it manually (opening the files with vi processor*/0/T) and running the solver again, I got the same error message.

How can I solve this?

Old   October 19, 2020, 10:55
Default
  #9
Member
 
Hasan Celik
Join Date: Sep 2016
Posts: 64
Quote:
Originally Posted by fidu View Post
[...] when I checked processor*/0/T, heating was not defined. [...] How can I solve this?

Could you share your case files here? Did you write the result of the parallel mesh to the constant folder, or did it get written as another time step such as 1 or 2?

Old   October 19, 2020, 11:47
Default
  #10
New Member
 
David
Join Date: Oct 2020
Posts: 21
Sure. I have attached my entire case as a zip. So far my workflow was this:
  1. blockMesh
  2. surfaceFeatureExtract
  3. decomposePar
  4. mpirun -n 48 snappyHexMesh -overwrite -parallel
  5. reconstructParMesh -latestTime -mergeTol 1E-06 -noZero
  6. buoyantBoussinesqSimpleFoam -parallel

When I remove all processor* directories after reconstructParMesh and then decompose the case again, I get almost the right boundary conditions. Only in processor*/0/U did the condition change from uniform (0 0 0); to nonuniform 0(); for the heating boundary field.

Last edited by fidu; October 20, 2020 at 10:01.

Old   October 19, 2020, 13:22
Default
  #11
Member
 
Hasan Celik
Join Date: Sep 2016
Posts: 64
Quote:
Originally Posted by fidu View Post
Sure. I have attached my entire case as a zip. [...] When I remove all processor* directories after reconstructParMesh and then decompose the case again, I get almost the right boundary conditions. [...]
I think you have a custom boundary condition, so I couldn't run the solver part. But modify your script as follows:

Code:
 reconstructParMesh -constant -mergeTol 1E-06 -noZero
Then run decomposePar once more and run your solver. Up to this point I have checked it and it works, but I couldn't run the solver part as I have OpenFOAM v7.
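Putting the fix together, the full sequence that worked in this thread looks roughly like this. It is a sketch assembled from the commands posted above (note that the old processor* directories are removed before decomposing again, as the original poster did, so that decomposePar rebuilds them from the reconstructed mesh and the original 0 folder):

```shell
#!/bin/sh
# Mesh in parallel, reconstruct the mesh into constant/, then
# decompose again so the 0 fields are redistributed against the
# snapped mesh (sketch based on this thread).

blockMesh
surfaceFeatureExtract
decomposePar
mpirun -np 48 snappyHexMesh -overwrite -parallel

# Write the reconstructed mesh to constant/polyMesh
reconstructParMesh -constant -mergeTol 1e-06 -noZero

# Drop the stale decomposition and redo it with the new mesh,
# so each processor*/0 gets the full set of patch entries
rm -rf processor*
decomposePar

mpirun -np 48 buoyantBoussinesqSimpleFoam -parallel
```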

Old   October 20, 2020, 10:01
Default
  #12
New Member
 
David
Join Date: Oct 2020
Posts: 21
Thanks a lot, it is working now!
