August 4, 2013, 00:52
neighbour boundary has no faces in parallel
#1
New Member
David
Join Date: Mar 2010
Location: Vancouver, Canada
Posts: 13
Rep Power: 16
I have a multi-region domain consisting of three blocks: block_left, block_centre, block_right. There are the obvious exterior boundaries, plus four mapped interface patches between the blocks:

for block_left: block_left_to_block_centre
for block_centre: block_centre_to_block_left, block_centre_to_block_right
for block_right: block_right_to_block_centre

I have a custom boundary condition between the blocks. It works perfectly in serial; in parallel, however, it stops working. The issue appears to be that some of the interface patches end up with no faces after decomposition. Here is a section of code from the boundary condition, followed by the corresponding output from the solver log file.

** CODE **

// Bump the message tag so this exchange does not clash with other
// outstanding communication
int oldTag = UPstream::msgType();
UPstream::msgType() = oldTag + 1;

// The mapping engine behind this mapped patch
const mappedPatchBase& mpp = refCast<const mappedPatchBase>
(
    this->patch().patch()
);
const polyMesh& neighbourMesh = mpp.sampleMesh();
const fvPatch& neighbourPatch = refCast<const fvMesh>
(
    neighbourMesh
).boundary()[mpp.samplePolyPatch().index()];
const fvMesh& principalMesh = patch().boundaryMesh().mesh();

Info<< tab << "******************************" << endl;
Info<< "PRINCIPAL NAME" << nl << principalMesh.name() << endl;
Info<< "NEIGHBOUR NAME" << nl << neighbourMesh.name() << endl;
Info<< "PRINCIPAL PATCH" << nl << this->patch().patch() << endl;
Info<< "NEIGHBOUR PATCH" << nl << neighbourPatch.patch() << endl;
Info<< tab << "******************************" << endl;

// Force a recalculation of mapping and schedule
const mapDistribute& distMap = mpp.map();
(void)distMap.schedule();

vectorField pUnitNormal = this->patch().Sf()/this->patch().magSf();
vectorField nUnitNormal = neighbourPatch.Sf()/neighbourPatch.magSf();

Info<< "p Unit Normal" << nl << pUnitNormal << endl;
Info<< "n Unit Normal" << nl << nUnitNormal << endl;

** OUTPUT **

PRINCIPAL NAME
block_r
NEIGHBOUR NAME
block_c
PRINCIPAL PATCH
type            mappedWall;
nFaces          0;
startFace       125;
sampleMode      nearestPatchFace;
sampleRegion    block_c;
samplePatch     block_c_to_block_r;
offsetMode      uniform;
offset          (0 0 0);
NEIGHBOUR PATCH
type            mappedWall;
nFaces          6;
startFace       119;
sampleMode      nearestPatchFace;
sampleRegion    block_r;
samplePatch     block_r_to_block_c;
offsetMode      uniform;
offset          (0 0 0);
******************************
p Unit Normal
0()
n Unit Normal
6{(1 0 0)}

From patch to patch, this behaviour alternates between the principal side and the neighbour side having no unit normals, and it is entirely correlated with nFaces being 0 for the given patch after decomposition.

Does anyone have some insight? Your help is much appreciated.

PG
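Edit: for completeness, below is a minimal sketch of the pattern the stock coupled conditions (e.g. turbulentTemperatureCoupledBaffleMixed) use: instead of reading the neighbour patch field at its neighbour-side size, the neighbour-side values are passed through mappedPatchBase::distribute(). As far as I understand it, distribute() is a collective call, so a processor whose local piece of the patch has nFaces 0 still participates and simply receives an empty field. This is untested against my case and assumes the mpp and neighbourPatch variables from the snippet above; it is just the direction I am looking in:

** CODE **

// Sketch only (not verified on the case above): exchange the neighbour-side
// unit normals through the mappedPatchBase mapping
const mappedPatchBase& mpp = refCast<const mappedPatchBase>
(
    this->patch().patch()
);

// Computed on whichever processors own neighbour faces; may be empty locally
vectorField nUnitNormal(neighbourPatch.Sf()/neighbourPatch.magSf());

// After distribute() the field holds one value per local face of the
// principal patch (empty where nFaces == 0), so it can be compared
// face-by-face with pUnitNormal
mpp.distribute(nUnitNormal);

One further note on the log above: Info only writes from the master processor, so it shows only the master's view of the decomposed patches; Pout would print one block per processor.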
Tags: boundary, boundary conditions, normal, openfoam, parallel