Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Bugs

moveDynamicMesh in parallel crashes


May 17, 2011, 13:06   #1
moveDynamicMesh in parallel crashes
Senior Member
 
Arne Stahlmann
Join Date: Nov 2009
Location: Hanover, Germany
Posts: 209
Rep Power: 18
Hi,

I'm testing the moveDynamicMesh utility in OF-1.6-ext. I ran the circCylinder3d tutorial in mesh/moveDynamicMesh. Running it in serial works fine, but it crashes in parallel. The errors are the following:

- When decomposing it using four processors and simple decomposition (no matter in which direction), I get:
Quote:
~~~ Mesh Quality Statistics ~~~
Min: 0.592052
Max: 0.999858
Mean: 0.898061
Cells: 8406
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

[3]
[3]
[3] --> FOAM FATAL ERROR:
[3]
Cannot insert 606 in list: 5(596 593 597 -1 -1)
Labels: 565 and 566 were not found in sequence.
[3]
[3] From function inline void meshOps::insertLabel(const label, const label, const label, labelList&)
[3] in file dynamicTopoFvMesh/meshOpsI.H at line 579.
[3]
FOAM parallel run aborting
[3]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.
When using metis decomposition, I get

Quote:
mpirun noticed that process rank 1 with PID 4013 on node PC12 exited on signal 11 (Segmentation fault).
Do I have to apply any different settings when running in parallel, or does this only work in serial mode?

Arne
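For orientation, the two decomposition setups compared above are configured in system/decomposeParDict. A minimal sketch (the subdomain count and split direction here are illustrative, not necessarily the tutorial's actual settings):

```
// system/decomposeParDict (sketch)
numberOfSubdomains  4;

method              simple;     // or: metis;

simpleCoeffs
{
    n       (1 1 4);    // four slabs along z
    delta   0.001;
}
```

With `method metis;` the processor distribution is computed by METIS rather than taken from a fixed directional split.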

May 17, 2011, 14:28   #2
Senior Member
 
Sandeep Menon
Join Date: Mar 2009
Location: Amherst, MA
Posts: 403
Rep Power: 25
Arne,

Unfortunately, the version of dynamicTopoFvMesh in OF-1.6-ext is not parallel-aware. I have a local version that is, but it may contain a few bugs that need to be ironed out before release. I suppose it could go into a branch on the git repository; I'll work on that if there's sufficient community interest.
__________________
Sandeep Menon
University of Massachusetts Amherst
https://github.com/smenon

May 17, 2011, 15:33   #3
Arne Stahlmann
Thanks Sandeep, I had already read about your local parallel version in another thread. I had hoped that the lack of parallelization had been solved in general... Nevertheless, could you provide me with a copy of your version?

May 17, 2011, 16:43   #4
Sandeep Menon
I've started a git branch for parallel support (feature/parallelDynamicTopoFvMesh).

Please check:
http://openfoam-extend.git.sourcefor...amicTopoFvMesh

Please check out this branch, re-compile and let me know. You will also need to patch this branch with (feature/mesquiteHexPrismSupport), if you wish to use the mesquiteMotionSolver in parallel.

May 18, 2011, 06:34   #5
Arne Stahlmann
Thanks, although it does not work yet. I'm not familiar with git, so maybe I made a mistake in this step... This is what I did to get your parallel version:

Code:
git pull
. /etc/bashrc
./Allwmake
which had no influence on the error. After that I tried

Code:
git pull
git checkout -f HEAD
. /etc/bashrc
./Allwmake
which recompiled the whole of OpenFOAM, but also had no effect. The error messages are the same as those above, after using
Code:
mpirun -np 4 moveDynamicMesh -parallel
BTW: do any of the moving-mesh solvers/applications run in parallel? I tried the surfaceTracking/interTrackFoam/tank3D tutorial in OF-1.6-ext and got error messages in parallel there as well. In serial, everything runs fine again.

May 18, 2011, 11:14   #6
Sandeep Menon
I'll bet that you're still on the master branch.

Switch to the new branch using:

git checkout feature/parallelDynamicTopoFvMesh

After this, issue an Allwmake. If you want mesquite working in parallel, you will need to copy the mesquiteMotionSolver.H/.C files from the other branch (or use a 'git merge').

I would suggest brushing up on git basics, since you'll need it.
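Since all of this is plain git branching, the checkout-and-merge sequence can be rehearsed on a throwaway repository before touching the real tree. A sketch in which only the two branch names come from this thread; the repository, file names and commits are stand-ins:

```shell
# Rehearse the branch-switch/merge workflow on a throwaway repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email you@example.com
git config user.name "you"
echo base > base.txt
git add base.txt
git commit -qm "base"
git branch -M master                 # normalise the default branch name

# stand-in for the parallel topology branch
git checkout -qb feature/parallelDynamicTopoFvMesh
echo topo > topo.txt && git add topo.txt && git commit -qm "parallel topo changes"

# stand-in for the mesquite branch, diverging from master
git checkout -q master
git checkout -qb feature/mesquiteHexPrismSupport
echo mesquite > mesquite.txt && git add mesquite.txt && git commit -qm "mesquite changes"

# the workflow suggested above: switch to the parallel branch,
# then merge the mesquite branch INTO it (not into master)
git checkout -q feature/parallelDynamicTopoFvMesh
git merge -q --no-edit feature/mesquiteHexPrismSupport

current_branch=$(git rev-parse --abbrev-ref HEAD)
echo "$current_branch"               # feature/parallelDynamicTopoFvMesh
ls                                   # both branches' files are now present
```

In the real tree, the merge would be followed by an ./Allwmake, as described above.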

May 18, 2011, 12:04   #7
Arne Stahlmann
You were right, I was still working on the master branch. As suggested, I did:

Code:
git checkout feature/parallelDynamicTopoFvMesh
./Allwmake
git merge feature/parallelDynamicTopoFvMesh master

Using simple decomposition (decomposed in z direction) and 2 processors, I get the following error:

Quote:
[1] Patch: 4 Processor: 0 mSize: 274 sSize: 274 failed on match for face centres.
Mapping time: 0.076667 s
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] Matching for processor patches failed.
Patch faces written out to disk.
[1]
[1]
[1] From function
void dynamicTopoFvMesh::syncCoupledBoundaryOrdering
(
List<pointField>& centres,
List<pointField>& anchors,
labelListList& patchMaps,
labelListList& rotations
) const
[1]
[1] in file dynamicTopoFvMesh/dynamicTopoFvMeshCoupled.C at line 8660.
[1]
FOAM parallel run aborting
[1]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.
Any ideas? And yep, I will take a closer look at git basics!

May 18, 2011, 12:08   #8
Sandeep Menon
I meant merging the feature/mesquiteHexPrismSupport branch into the feature/parallelDynamicTopoFvMesh branch, not into master. The problem you're getting is because mesquite is not parallel-aware.

Merge the branches and re-compile; your problem should go away.

May 18, 2011, 12:24   #9
Arne Stahlmann
Fortunately, you were right! I merged the right branches, and now it seems to be running... at least up to a certain time; then it crashes again. But this seems to be a problem with the decomposition... I will try to figure it out later.

Thanks for your help!

May 19, 2011, 06:18   #10
Arne Stahlmann
Coming back to the circCylinder3d tutorial: I made a few runs using simple and metis decomposition with 2 to 4 processors. In parallel the runs sometimes crash, but I can't really figure out why or when.

An example: using metis decomposition and two processors, everything works fine. Using simple decomposition, the simulation crashes after a while, and the time of the crash depends on the decomposition direction. When the decomposition cuts the patch listed in fixedValuePatches (dynamicMeshDict), it runs fine at first, but then suddenly crashes again.

Two examples of error messages after a crash:

Quote:
[1] --> FOAM FATAL ERROR:
[1] Encountered negative cell-quality!
Edge: 3922: (67 692)
vertexHull: 6(425 219 153 697 13 695)
Minimum Quality: -0.244762
[1]
[1] From function scalar dynamicTopoFvMesh::computeMinQuality(const label eIndex, labelList& hullVertices) const
[1] in file dynamicTopoFvMesh/edgeSwap.C at line 2246.
[1]
FOAM parallel run aborting
Quote:
[1] Face: 5705 :: 3(857 809 919) Patch: topWall Proc: 1
[1] Face: 6896 :: 3(857 919 779) Patch: procBoundary1to2 Proc: 1
[1] Face: 4844 :: 3(807 919 857) Patch: Internal Proc: 1
[1] >> Edge: 1207:11 492) mapped: (857 919)
[1] Face: 1990 :: 3(11 492 17) Patch: topWall Proc: 2
[1] Face: 2179 :: 3(11 105 492) Patch: procBoundary2to1 Proc: 2
[1] * * * Error in fillTables * * *
[1] Edge: 2831 :: (857 919)
[1] minQuality: -0.00629178
[1] Closed: false
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1]
[1]
FOAM parallel run aborting

May 19, 2011, 10:44   #11
Sandeep Menon
Hmm... I have similar issues, and I'm looking into them at the moment. Can you add this line to the mesquiteOptions dict and tell me if it improves anything:

relaxationFactor 0.1;

You can play around with the factor a little bit (range [0-1]). Meanwhile, I'll look into a fix.

Also, could you post your dynamicMeshDict and decomposeParDict for the failing case?
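For reference, the entry goes inside the mesquiteOptions sub-dictionary of the case's dynamicMeshDict; a sketch, with placeholder comments standing in for the existing entries:

```
mesquiteOptions
{
    // ... existing smoother settings ...

    // under-relaxation factor for the smoother's point motion, range [0-1]
    relaxationFactor    0.1;
}
```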

May 19, 2011, 11:18   #12
Arne Stahlmann
Hi, you can find the dicts attached, for the case mentioned above (#2). It crashes after about 10 seconds.

I put in relaxationFactor 0.1, but for the same case it now crashes at t=6.2 s, giving the following error. Setting the relaxationFactor to 0.5 leads to a crash at t=8.5 s, and 0.9 to a crash at t=8.4 s.

Quote:
Topo modifier time: 0.042774 s
Bisections :: Total: 0, Surface: 0
Collapses :: Total: 1, Surface: 1
Swaps :: Total: 3, Surface: 0
[2] Patch: 4 Processor: 1 mSize: 458 sSize: 458 failed on match for face centres.
Mapping time: 0.004613 s
[2]
[2]
[2] --> FOAM FATAL ERROR:
[2] Matching for processor patches failed.
Patch faces written out to disk.
[2]
[2]
[2] From function
void dynamicTopoFvMesh::syncCoupledBoundaryOrdering
(
List<pointField>& centres,
List<pointField>& anchors,
labelListList& patchMaps,
labelListList& rotations
) const
[2]
[2] in file dynamicTopoFvMesh/dynamicTopoFvMeshCoupled.C at line 8660.
[2]
FOAM parallel run aborting
[2]
[0] --------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

[0]
[0] --> FOAM FATAL ERROR:
[0] Matching for processor patches failed.
Patch faces written out to disk.
[0]
[0]
[0] From function
void dynamicTopoFvMesh::syncCoupledBoundaryOrdering
(
List<pointField>& centres,
List<pointField>& anchors,
labelListList& patchMaps,
labelListList& rotations
) const
[0]
[0] in file dynamicTopoFvMesh/dynamicTopoFvMeshCoupled.C at line 8660.
[0]
FOAM parallel run aborting
[0]
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] Matching for processor patches failed.
Patch faces written out to disk.
[1]
[1]
[1] From function
void dynamicTopoFvMesh::syncCoupledBoundaryOrdering
(
List<pointField>& centres,
List<pointField>& anchors,
labelListList& patchMaps,
labelListList& rotations
) const
[1]
[1] in file dynamicTopoFvMesh/dynamicTopoFvMeshCoupled.C at line 8660.
[1]
FOAM parallel run aborting
[1]
Attached Files
File Type: txt decomposeParDict.txt (1,011 Bytes, 14 views)
File Type: txt dynamicMeshDict.txt (5.5 KB, 29 views)

May 19, 2011, 12:59   #13
Sandeep Menon
I've made a few modifications to the feature/mesquiteHexPrismSupport branch. Could you merge the changes and try again?

Once you've merged the changes, add this to the mesquiteOptions dictionary:

checkTetValidity true;

You should no longer require a relaxationFactor entry.
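Spelled out, the mesquiteOptions sub-dictionary would then carry the new switch instead of the relaxation entry (a sketch, with placeholders for the existing entries):

```
mesquiteOptions
{
    // ... existing smoother settings ...

    // enable the tet-validity check introduced in this branch
    checkTetValidity    true;

    // relaxationFactor no longer required
}
```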

May 19, 2011, 13:26   #14
Arne Stahlmann
Ok, I did a git pull, merged the branches, recompiled, put your new entry in and tried again: error message at t=5.4 s, given below. In ParaView one can see that near one of the processor boundaries the mesh (outer patch of the cylinder) is getting deformed (a bump in the wall patch).

Quote:
Mesh OK.
ExecutionTime = 36.79 s ClockTime = 37 s

Time = 5.5
[PC12:17448] *** Process received signal ***
[PC12:17448] Signal: Floating point exception (8)
[PC12:17448] Signal code: (-6)
[PC12:17448] Failing at address: 0x3e800004428
[PC12:17449] *** Process received signal ***
[PC12:17449] Signal: Floating point exception (8)
[PC12:17449] Signal code: (-6)
[PC12:17449] Failing at address: 0x3e800004429
[PC12:17450] *** Process received signal ***
[PC12:17450] Signal: Floating point exception (8)
[PC12:17450] Signal code: (-6)
[PC12:17450] Failing at address: 0x3e80000442a
Solving for point motion: Initial residual: 1 Final residual: 0.131885 No Iterations: 774
[PC12:17448] [ 0] /lib/libc.so.6(+0x33af0) [0x7f27c517faf0]
[PC12:17448] [ 1] /lib/libc.so.6(gsignal+0x35) [0x7f27c517fa75]
[PC12:17448] [ 2] /lib/libc.so.6(+0x33af0) [0x7f27c517faf0]
[PC12:17448] [ 3] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver2CGERKNS_5FieldINS_6VectorIdEEEERS4_S7_S7_S 7_+0x20f) [0x7f27c39501ef]
[PC12:17448] [ 4] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver14smoothSurfacesEv+0x1fa) [0x7f27c39508da]
[PC12:17448] [ 5] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver5solveEv+0x555) [0x7f27c3966895]
[PC12:17448] [ 6] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libdynamicMesh.so(_ZN4Foam12motionSolver9newPoints Ev+0x1d) [0x7f27c6dd80ed]
[PC12:17448] [ 7] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libdynamicFvMesh.so(_ZN4Foam17dynamicTopoFvMesh6up dateEv+0xa0) [0x7f27c772d370]
[PC12:17448] [ 8] moveDynamicMesh() [0x408567]
[PC12:17448] [ 9] /lib/libc.so.6(__libc_start_main+0xfd) [0x7f27c516ac4d]
[PC12:17448] [10] moveDynamicMesh() [0x407f49]
[PC12:17448] *** End of error message ***
[PC12:17450] [ 0] /lib/libc.so.6(+0x33af0) [0x7f342d782af0]
[PC12:17450] [ 1] /lib/libc.so.6(gsignal+0x35) [0x7f342d782a75]
[PC12:17450] [ 2] /lib/libc.so.6(+0x33af0) [0x7f342d782af0]
[PC12:17450] [ 3] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver2CGERKNS_5FieldINS_6VectorIdEEEERS4_S7_S7_S 7_+0x20f) [0x7f342bf531ef]
[PC12:17450] [ 4] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver14smoothSurfacesEv+0x1fa) [0x7f342bf538da]
[PC12:17450] [ 5] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver5solveEv+0x555) [0x7f342bf69895]
[PC12:17450] [ 6] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libdynamicMesh.so(_ZN4Foam12motionSolver9newPoints Ev+0x1d) [0x7f342f3db0ed]
[PC12:17449] [ 0] /lib/libc.so.6(+0x33af0) [0x7f067a612af0]
[PC12:17449] [ 1] /lib/libc.so.6(gsignal+0x35) [0x7f067a612a75]
[PC12:17449] [ 2] /lib/libc.so.6(+0x33af0) [0x7f067a612af0]
[PC12:17449] [ 3] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver2CGERKNS_5FieldINS_6VectorIdEEEERS4_S7_S7_S 7_+0x20f) [0x7f0678de31ef]
[PC12:17449] [ 4] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver14smoothSurfacesEv+0x1fa) [0x7f0678de38da]
[PC12:17450] [ 7] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libdynamicFvMesh.so(_ZN4Foam17dynamicTopoFvMesh6up dateEv+0xa0) [0x7f342fd30370]
[PC12:17450] [ 8] moveDynamicMesh() [0x408567]
[PC12:17450] [ 9] /lib/libc.so.6(__libc_start_main+0xfd) [0x7f342d76dc4d]
[PC12:17450] [10] moveDynamicMesh() [0x407f49]
[PC12:17450] *** End of error message ***
[PC12:17449] [ 5] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libmesquiteMotionSolver.so(_ZN4Foam20mesquiteMotio nSolver5solveEv+0x555) [0x7f0678df9895]
[PC12:17449] [ 6] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libdynamicMesh.so(_ZN4Foam12motionSolver9newPoints Ev+0x1d) [0x7f067c26b0ed]
[PC12:17449] [ 7] /opt/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libdynamicFvMesh.so(_ZN4Foam17dynamicTopoFvMesh6up dateEv+0xa0) [0x7f067cbc0370]
[PC12:17449] [ 8] moveDynamicMesh() [0x408567]
[PC12:17449] [ 9] /lib/libc.so.6(__libc_start_main+0xfd) [0x7f067a5fdc4d]
[PC12:17449] [10] moveDynamicMesh() [0x407f49]
[PC12:17449] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 17448 on node PC12 exited on signal 8 (Floating point exception).

May 19, 2011, 13:33   #15
Sandeep Menon
One step ahead of you. The CG solver is hitting a solution singularity. I've posted a fix, so please repeat the pull/merge and re-compile.

May 19, 2011, 13:44   #16
Arne Stahlmann
Nope. We're getting closer (further along), but:

Quote:
Time = 11.6
Solving for point motion: Initial residual: 1 Final residual: 0.00976717 No Iterations: 20
Solving for point motion: Initial residual: 1 Final residual: 0.00987566 No Iterations: 29
--> FOAM Warning :
From function bool dynamicTopoFvMesh::meshQuality(bool outputOption)
in file dynamicTopoFvMesh/dynamicTopoFvMeshCheck.C at line 213

Minimum cell quality is: -0.158568 at cell: 2610

~~~ Mesh Quality Statistics ~~~
Min: -0.158568
Max: 0.998581
Mean: 0.87132
Cells: 9252
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

[1] Face: 6991 :: 3(809 917 334) Patch: procBoundary1to2 Proc: 1
[1] Face: 6992 :: 3(917 809 916) Patch: procBoundary1to2 Proc: 1
[1] Face: 5539 :: 3(917 809 430) Patch: Internal Proc: 1
[1] >> Edge: 1148:137 47) mapped: (917 809)
[1] Face: 827 :: 3(137 58 47) Patch: Internal Proc: 2
[1] Face: 2261 :: 3(47 113 137) Patch: procBoundary2to1 Proc: 2
[1] Face: 2264 :: 3(137 49 47) Patch: procBoundary2to1 Proc: 2
[1] * * * Error in fillTables * * *
[1] Edge: 4765 :: (917 809)
[1] minQuality: -0.158568
[1] Closed: true
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1]
[1]
FOAM parallel run aborting
[1]

May 19, 2011, 14:11   #17
Sandeep Menon
Hmm... so the boundary appears to be moving too quickly for the adaptation to keep up. I reduced the omega value in the fixedValuePatches dictionary from 0.15 to 0.1, which seemed to improve things a little. But the smoother seems to be distorting feature edges, which I will need to look into.

May 19, 2011, 14:52   #18
Arne Stahlmann
Ok, if I can help with it, let me know.

Another question, as you seem to be quite familiar with mesh motion solvers: at the moment I am trying to figure out the similarities, differences and dependencies of the various mesh motion solvers that come with OF-1.6-ext.
The src/dynamicMesh directories contain a lot of solvers, but unfortunately no descriptions. Do you know of any summary describing all this?
My question is related to this topic http://www.cfd-online.com/Forums/ope...condition.html, as I need to couple an interFoam-related solver with mesh motion (which is what interDyMFoam does) and the FAM framework. Right now I'm struggling with the question of which mesh motion solvers work with interDyMFoam and do what I need...

Arne

May 19, 2011, 14:57   #19
Sandeep Menon
They all do one thing: given a set of points, solve for new point positions that provide optimal cell quality. The various solvers differ mainly in terms of efficiency and robustness.

As with everything else in OpenFOAM, you have the freedom to choose what suits you best.
