
wrong zero boundary dir in parallel after snappyHexMesh

#1 | March 8, 2017, 04:41
HagenC (New Member, Join Date: Nov 2016, Posts: 16)
Hi everyone,
I am probably missing something basic, but I have no clue. I am running a parallel case with OpenFOAM-v1612+ on 3 processors. The procedure is:
Code:
blockMesh
mpirun -np 3 redistributePar -decompose -parallel
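For completeness: the decomposition settings come from system/decomposeParDict. A minimal sketch (scotch is just an assumed method here, use whatever you prefer):
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 3;

method          scotch;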
In the boundaryField of processor0/0/p I have procBoundary0to1 but not procBoundary0to2, which I think is fine, since the decomposed blockMesh has no faces shared between the first and the last processor. This agrees with processor0/constant/polyMesh/boundary, where also only procBoundary0to1 exists. But then after
Code:
mpirun -np 3 snappyHexMesh -overwrite -parallel
I get procBoundary0to2 in processor0/constant/polyMesh/boundary, but still only procBoundary0to1 in the boundaryField of processor0/0/p. So trying to run
Code:
mpirun -np 3 simpleFoam -parallel
gives the error message:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  v1612+                                |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : v1612+
Exec   : simpleFoam -parallel
Date   : Mar 08 2017
Time   : 09:30:46
Host   : "simmachine"
PID    : 20842
Case   : /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady
nProcs : 3
Slaves : 
2
(
"simmachine.20843"
"simmachine.20844"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0


SIMPLE: convergence criteria
    field p     tolerance 0.01
    field U     tolerance 0.001
    field "(k|epsilon|omega)"     tolerance 0.001

Reading field p

[0] 
[0] 
[0] --> FOAM FATAL IO ERROR: 
[0] size 0 is not equal to the given value of 784
[0] 
[0] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor0/0/p.boundaryField.inlet from line 26 to line 28.
[1] [2] 
[2] 
[2] --> FOAM FATAL IO ERROR: 
[2] Cannot find patchField entry for procBoundary2to0
[2] 
[2] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor2/0/p.boundaryField from line 26 to line 42.
[2] 
[2]     From function void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[2]     in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 191.
[2] 
FOAM parallel run exiting
[2] 
[0] 
[1] 
[1] 
--> FOAM FATAL IO ERROR: 
[1] size 0 is not equal to the given value of 1704
[1] 
[1] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor1/0/p.boundaryField.outlet from line [0] 32 to line 33.
[1] 
[1]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[1]     in file     From function /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/Field.C at line 304.
[1] 
FOAM parallel run exiting
Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int][1] 

[0]     in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/Field.C at line 304.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] 
FOAM parallel run exiting
[0] 
[simmachine:20840] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[simmachine:20840] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
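For context, the processor patch that snappyHexMesh adds to processor0/constant/polyMesh/boundary looks roughly like this (face counts are illustrative), while the boundaryField in processor0/0/p has no matching entry:
Code:
procBoundary0to2
{
    type            processor;
    inGroups        1(processor);
    nFaces          128;        // illustrative
    startFace       45120;      // illustrative
    matchTolerance  0.0001;
    myProcNo        0;
    neighbProcNo    2;
}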
I can solve that by reconstructing and decomposing again:
Code:
mpirun -np 3 redistributePar -reconstruct -constant -parallel
mpirun -np 3 redistributePar -decompose -parallel
but there must be an easier way!
If I only decompose again after snappyHexMesh, without reconstructing first, I end up with nothing but the blockMesh again.
I appreciate any recommendations.
Thank you.

#2 | March 8, 2017, 10:30 | Solved
HagenC
Ok, I found a solution for this.
It seems that OpenFOAM somehow gets confused by the information in the zero folder.
If you use an empty dummy zero folder in your case directory, no zero folder is created in the processor folders. Then, after snappyHexMesh, you can use
Code:
restore0Dir -processor
to create the processor 0 directories from a 0.org folder in the case directory. (Remember to source the run functions with . $WM_PROJECT_DIR/bin/tools/RunFunctions.)
Afterwards simpleFoam runs just fine.
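Putting everything together, the whole workflow as a small run script (assuming the field files live in 0.orig, or 0.org for older versions, and an empty dummy 0 folder exists in the case directory):
Code:
#!/bin/sh
# assumes: empty dummy 0/ folder and field files in 0.orig/ (0.org for older versions)
. $WM_PROJECT_DIR/bin/tools/RunFunctions    # provides restore0Dir

blockMesh
mpirun -np 3 redistributePar -decompose -parallel
mpirun -np 3 snappyHexMesh -overwrite -parallel

restore0Dir -processor    # copies the field files into each processor*/0

mpirun -np 3 simpleFoam -parallel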
By the way, with an older version (v1606+) I still struggled with some procAddressing issues. This can be solved by using
Code:
mpirun -np 3 renumberMesh -overwrite -constant -parallel
after snappyHexMesh. The mesh will be renumbered and the stale procAddressing files will be deleted.
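If you want to check whether such addressing files are present, they sit next to the decomposed mesh, e.g. something like:
Code:
# lists cellProcAddressing, faceProcAddressing, ... if they exist
ls processor*/constant/polyMesh/*ProcAddressing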
Best,
Hagen

#3 | March 13, 2017, 05:47 | Solved 2
HagenC
I realized I forgot to mention one important thing:
You need to have a
Code:
"proc.*"
{
    type    processor;
}
boundary condition in each of your p, U, omega, etc. files in the 0.org folder (0.orig in version v1612+).
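As an example, the boundaryField in 0.orig/p could then look something like this (the inlet/outlet/wall entries are just placeholders for your actual conditions):
Code:
boundaryField
{
    inlet
    {
        type            zeroGradient;   // placeholder
    }
    outlet
    {
        type            fixedValue;     // placeholder
        value           uniform 0;
    }
    walls
    {
        type            zeroGradient;   // placeholder
    }

    "proc.*"    // matches procBoundary0to1, procBoundary2to0, ...
    {
        type            processor;
    }
}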
Best,
Hagen
