Can run setFields in parallel while decomposed? |
December 9, 2013, 05:34, #1
Can run setFields in parallel while decomposed?
New Member
Youngkook Kim
Join Date: Jul 2013
Location: Singapore and South Korea
Posts: 20
Rep Power: 13
Hi all,
I am running interFoam with quite a heavy mesh on 24 CPUs, and I need to test several meshing cases. What I am doing now is:

1. blockMesh
2. decomposePar (mesh is decomposed)
3. mpirun -np 24 snappyHexMesh.exe -overwrite -parallel
4. reconstructParMesh
5. (prepare fields, including alpha1, in the 0 directory)
6. setFields
7. decomposePar
8. run the solver in parallel

If only I could run setFields while decomposed, I could save more time. However, an error occurs at step 5 of the following sequence:

1. blockMesh
2. (prepare fields, including alpha1, in the 0 directory)
3. decomposePar (mesh and fields in 0 are decomposed)
4. mpirun -np 24 snappyHexMesh.exe -overwrite -parallel
5. mpirun -np 24 setFields.exe -parallel -> ERROR
6. run the solver

Does anyone know what is wrong with setFields in parallel?
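For reference, a minimal Allrun-style sketch of the first (working) sequence above, assuming a standard Linux OpenFOAM install where the binaries carry no .exe suffix; -constant and -force are standard options of reconstructParMesh and decomposePar:

    #!/bin/sh
    blockMesh
    decomposePar                      # decompose the background mesh
    mpirun -np 24 snappyHexMesh -overwrite -parallel
    reconstructParMesh -constant     # rebuild the full mesh in constant/polyMesh
    # fields (alpha1 etc.) are prepared in 0/ at this point
    setFields                         # runs in serial on the reconstructed mesh
    decomposePar -force               # re-decompose; -force clears the old processor* dirs
    mpirun -np 24 interFoam -parallel

The two decomposePar passes plus the reconstruction are exactly the overhead the question is trying to avoid.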
December 9, 2013, 23:39, #2
New Member
Youngkook Kim
Join Date: Jul 2013
Location: Singapore and South Korea
Posts: 20
Rep Power: 13
I found that the patch created by SHM is not shown in ParaView. That is because the patch does not yet exist when I decompose (this happens before running SHM). I added the patch with type empty to 'boundary' before decomposing, and that part is settled. But the error still persists with setFields in parallel. The error message is:

keyword procBoundary16to12 is undefined in dictionary "C:\~~~~~\alpha1::boundaryField"

The boundaryField for the SHM patch is defined in the 0 folder, but I cannot read alpha1 in the decomposed 0 directories; it is probably in binary. I hope someone can give advice.

Last edited by totalart; December 10, 2013 at 01:00.
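For context: when decomposePar splits the mesh, it creates processor patches named procBoundaryXtoY on each inter-processor boundary, and any field file read in parallel must provide a boundaryField entry for them. A sketch of the single entry the error above is complaining about, using the patch name taken from the message:

    // inside boundaryField of processor16's alpha1
    procBoundary16to12
    {
        type    processor;
        value   uniform 0;
    }

Listing every procBoundaryXtoY by hand does not scale, which is where the wildcard entry in the next reply comes in.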
August 21, 2018, 00:07, #3
anw
New Member
Chaewoong Ban
Join Date: Jun 2013
Posts: 18
Rep Power: 13
1. blockMesh
2. (prepare fields, including alpha1, in the 0 directory) ** just put a boundaryField entry such as "procBoundary.*" in your alpha1 file:

       "procBoundary.*"
       {
           type    processor;
           value   uniform 0;
       }

3. decomposePar -copyZero (mesh and fields in 0 are decomposed)
4. mpirun -np 24 snappyHexMesh -overwrite -parallel
5. mpirun -np 24 setFields -parallel (the step that gave the ERROR before)
6. run the solver
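A minimal sketch of how the whole 0/alpha1 could then look; the physical patch name (walls) is a placeholder, and only the "procBoundary.*" entry is taken from the reply above:

    FoamFile
    {
        version     2.0;
        format      ascii;
        class       volScalarField;
        object      alpha1;
    }

    dimensions      [0 0 0 0 0 0 0];

    internalField   uniform 0;

    boundaryField
    {
        walls                       // placeholder physical patch
        {
            type    zeroGradient;
        }

        "procBoundary.*"            // regex matching every processor patch
        {                           // created by decomposePar
            type    processor;
            value   uniform 0;
        }
    }

With -copyZero, decomposePar copies 0/ verbatim into each processor*/0 instead of decomposing the fields, so the wildcard entry is what lets each processor read the same file across its own procBoundary patches.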