[snappyHexMesh] Problem with boundaries with sHM in parallel running
January 24, 2019, 08:56
Problem with boundaries with sHM in parallel running
#1
New Member
Loek
Join Date: Oct 2018
Posts: 14
Rep Power: 8
Dear Foamers,
I have a question about using snappyHexMesh in parallel. When I snap my mesh, I create a boundary patch which I call 'bubble'. If I don't run in parallel, this patch can be used as expected. But when I run sHM in parallel, a couple of new boundaries are introduced (at the places where the mesh is cut to be split over the processors), and these new boundary patches overwrite my 'bubble' patch. So when I then want to run my solver in parallel, it cannot find the 'bubble' patchField for p or U. The p boundary conditions for the non-parallel snapped mesh and the parallel snapped mesh are given below.

The non-parallel case

The commands I run are:

    surfaceCheck constant/triSurface/bubble.stl
    cp -r 0.orig 0
    surfaceOrient constant/triSurface/bubble.stl "(1e10 1e10 1e10)" constant/triSurface/bubble.stl
    surfaceFeatureExtract
    surfaceFeatureConvert constant/triSurface/bubble.eMesh edges.vtk
    blockMesh
    snappyHexMesh -overwrite
    extrudeMesh

which result in the following boundaries for p:

    dimensions      [0 2 -2 0 0 0 0];

    internalField   uniform 0;

    boundaryField
    {
        stillWall
        {
            type zeroGradient;
        }
        movingWall
        {
            type zeroGradient;
        }
        cylinderCenter
        {
            type zeroGradient;
        }
        bubbleWall
        {
            type zeroGradient;
        }
        front
        {
            type empty;
        }
        back
        {
            type empty;
        }
        bubble
        {
            type zeroGradient;
        }
    }

The parallel case

The commands I run are:

    surfaceCheck constant/triSurface/bubble.stl
    cp -r 0.orig 0
    surfaceOrient constant/triSurface/bubble.stl "(1e10 1e10 1e10)" constant/triSurface/bubble.stl
    surfaceFeatureExtract
    surfaceFeatureConvert constant/triSurface/bubble.eMesh edges.vtk
    blockMesh
    decomposePar
    mpirun -np 4 snappyHexMesh -overwrite -parallel
    mpirun -np 4 extrudeMesh -parallel

which result in the following boundaries for p:

    dimensions      [0 2 -2 0 0 0 0];

    internalField   uniform 0;

    boundaryField
    {
        stillWall
        {
            type zeroGradient;
        }
        movingWall
        {
            type zeroGradient;
        }
        cylinderCenter
        {
            type zeroGradient;
        }
        bubbleWall
        {
            type zeroGradient;
        }
        front
        {
            type empty;
        }
        back
        {
            type empty;
        }
        procBoundary0to1
        {
            type processor;
            value uniform 0;
        }
        procBoundary0to2
        {
            type processor;
            value uniform 0;
        }
    }

As you can see, in the second case the boundary patch for 'bubble' is gone. Does anyone know how to solve this problem?

Sincerely,
Loek
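P.S. For context, the 'bubble' patch is created from the STL via a refinementSurfaces entry in snappyHexMeshDict roughly along these lines (a sketch only; the refinement levels and patch type here are placeholders, not my exact settings):

    geometry
    {
        bubble.stl
        {
            type triSurfaceMesh;
            name bubble;        // snapped faces end up in a patch named 'bubble'
        }
    }

    castellatedMeshControls
    {
        refinementSurfaces
        {
            bubble
            {
                level (2 3);    // placeholder min/max surface refinement levels
                patchInfo
                {
                    type wall;  // patch type assigned to the snapped 'bubble' faces
                }
            }
        }
    }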
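As far as I understand, the procBoundary* patches that decomposePar adds can also be matched with a regular-expression entry in the field files, so the per-processor 0 files stay valid without listing every processor patch by hand. A minimal sketch for p:

    boundaryField
    {
        // ... physical patches as above, including 'bubble' ...

        "procBoundary.*"        // matches procBoundary0to1, procBoundary0to2, ...
        {
            type  processor;
            value uniform 0;
        }
    }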
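And to see whether the 'bubble' patch actually survives the parallel snap, the per-processor boundary files can be inspected directly (assuming the -overwrite run wrote the mesh to processor*/constant/polyMesh):

    grep -l bubble processor*/constant/polyMesh/boundary   # which processor meshes carry the patch
    reconstructParMesh -constant                           # merge the processor meshes back
    grep -A 4 bubble constant/polyMesh/boundary            # inspect the merged boundary file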