|
[snappyHexMesh] Running snappyHexMesh in parallel creates new time directories |
|
October 4, 2018, 14:22 |
Running snappyHexMesh in parallel creates new time directories
|
#1 |
Member
Hüseyin Can Önel
Join Date: Sep 2018
Location: Ankara, Turkey
Posts: 47
Rep Power: 8 |
I want to run snappyHexMesh in parallel and then pimpleFoam, also in parallel. I'm using the following script for mesh preparation:
Code:
blockMesh > log.blockMesh
decomposePar > log.decomposePar.1
mpirun -np $nProc snappyHexMesh -latestTime -parallel > log.snappyHexMesh
reconstructParMesh -latestTime > log.reconstructPar1
renumberMesh -latestTime > renumberMesh.log
rm -rf processor*
topoSet > log.topoSet
and this one for the solver run:
Code:
decomposePar > log.decomposePar.2
ls -d processor* | xargs -I {} rm -rf ./{}/0
ls -d processor* | xargs -I {} cp -r 0.org ./{}/0
mpirun -np $nProc pimpleFoam -parallel > log.pimpleFoam
reconstructPar > log.reconstructPar.2
Thanks.
|
October 4, 2018, 17:59 |
|
#2 |
Member
Luis Eduardo
Join Date: Jan 2011
Posts: 85
Rep Power: 15 |
Try using "mpirun -np $nProc snappyHexMesh -latestTime -parallel -overwrite > log.snappyHexMesh". I use the "-overwrite" option and I don't get new time directories.
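(Not part of the original reply, just an illustration: a minimal sketch of how the mesh-preparation lines from the first post could look with -overwrite, assuming the same $nProc variable and log file names; whether -latestTime is still needed then depends on your setup.)
Code:
decomposePar > log.decomposePar.1
# -overwrite writes the final snapped mesh into constant/polyMesh
# instead of creating new time directories 1/, 2/, ...
mpirun -np $nProc snappyHexMesh -overwrite -parallel > log.snappyHexMesh
# the mesh now sits in processor*/constant, so reconstruct from there
reconstructParMesh -constant > log.reconstructParMesh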
What I use to run my cases is an Allrun file:
Code:
#!/bin/sh
cd ${0%/*} || exit 1    # Run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

# Make dummy 0 directory
mkdir 0

runApplication blockMesh
cp system/decomposeParDict.hierarchical system/decomposeParDict
runApplication decomposePar
runParallel snappyHexMesh -overwrite
find . -type f -iname "*level*" -exec rm {} \;
ls -d processor* | xargs -I {} cp -r 0.org ./{}/0 $1
runParallel topoSet
runParallel `getApplication`
runApplication reconstructParMesh -constant
runApplication reconstructPar
cp -a 0.org/. 0/
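(Not from the original post: since the script copies system/decomposeParDict.hierarchical into place, here is a minimal sketch of what such a dictionary might contain; the subdomain count and the n vector are illustrative values, not values from this thread. In recent OpenFOAM versions runParallel picks the processor count up from numberOfSubdomains, so no explicit -np is needed.)
Code:
// system/decomposeParDict.hierarchical  (illustrative values)
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  4;

method              hierarchical;

hierarchicalCoeffs
{
    n           (2 2 1);    // processors per direction; product must equal numberOfSubdomains
    delta       0.001;
    order       xyz;
}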
|
October 5, 2018, 07:01 |
|
#3 |
Member
Hüseyin Can Önel
Join Date: Sep 2018
Location: Ankara, Turkey
Posts: 47
Rep Power: 8 |
Hi Luis,
I do not have the runParallel and runApplication commands; I guess they come with another application you have. Also, qsub and PBS scripts do not accept them, I think. Is there a way to do it with mpirun?
|
October 5, 2018, 17:03 |
|
#4 |
Member
Luis Eduardo
Join Date: Jan 2011
Posts: 85
Rep Power: 15 |
Hi,
These commands become available once you execute the line ". $WM_PROJECT_DIR/bin/tools/RunFunctions", but you can probably use mpirun as well (I have never used it, so I can't give you more information about it, sorry!).
When I run "runApplication blockMesh" I get the same result as your "blockMesh > log.blockMesh" command, so you can probably keep that. I guess you could replace "runParallel" with "mpirun -np $nProc". For example, runParallel topoSet should give the same result as mpirun -np $nProc topoSet -parallel > log.topoSet.
Also, I don't need to reconstruct my case before running the solver, because topoSet can be run in parallel. Your case seems to be the same: you only reconstruct it to run topoSet and then decompose it again, correct?
Best Regards,
Luis
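(Purely as an illustration of that substitution, not something from this thread and untested: a sketch of a PBS-style job script using plain mpirun calls, assuming $nProc matches numberOfSubdomains in system/decomposeParDict. The resource request line and the value of nProc are placeholders.)
Code:
#!/bin/sh
#PBS -l nodes=1:ppn=8       # illustrative resource request
cd $PBS_O_WORKDIR           # PBS starts jobs in the home directory
nProc=8                     # must match numberOfSubdomains in system/decomposeParDict

mkdir -p 0                  # dummy time directory so decomposePar does not complain (as in the Allrun above)
blockMesh > log.blockMesh
decomposePar > log.decomposePar
mpirun -np $nProc snappyHexMesh -overwrite -parallel > log.snappyHexMesh
ls -d processor* | xargs -I {} rm -rf ./{}/0
ls -d processor* | xargs -I {} cp -r 0.org ./{}/0
mpirun -np $nProc topoSet -parallel > log.topoSet
mpirun -np $nProc pimpleFoam -parallel > log.pimpleFoam
reconstructParMesh -constant > log.reconstructParMesh
reconstructPar > log.reconstructPar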
|
October 5, 2018, 21:40 |
|
#5 |
Member
Hüseyin Can Önel
Join Date: Sep 2018
Location: Ankara, Turkey
Posts: 47
Rep Power: 8 |
Hi,
Thanks to you, sourcing that file let me use the runApplication and runParallel commands in PBS job submissions! My idea was to first generate the mesh in parallel, error free, and then continue solving without intermediate reconstruction steps. I'm still not sure how to do it with plain mpirun commands, but the runParallel function has taken care of everything smoothly! I'll post the final script when I get on my PC.
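(The final script never made it into the thread. Purely as an illustration, and not the poster's actual script: a sketch of what a RunFunctions-based version along those lines might look like, mirroring the Allrun shown earlier.)
Code:
#!/bin/sh
cd ${0%/*} || exit 1                        # run from the case directory
. $WM_PROJECT_DIR/bin/tools/RunFunctions    # provides runApplication / runParallel

mkdir -p 0                                  # dummy time directory for decomposePar
runApplication blockMesh
runApplication decomposePar
runParallel snappyHexMesh -overwrite
ls -d processor* | xargs -I {} rm -rf ./{}/0
ls -d processor* | xargs -I {} cp -r 0.org ./{}/0
runParallel topoSet
runParallel $(getApplication)               # reads the solver (pimpleFoam) from system/controlDict
runApplication reconstructParMesh -constant
runApplication reconstructPar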
|
September 27, 2022, 16:20 |
Can you post your script?
|
#6 |
New Member
Diego Andrade
Join Date: Aug 2022
Posts: 1
Rep Power: 0 |
Do you mind posting your script?
|
|
|
|