[snappyHexMesh] shm in parallel with simple decomposition |
July 2, 2013, 11:36 |
shm in parallel with simple decomposition
|
#1 |
Senior Member
Mihai Pruna
Join Date: Apr 2010
Location: Boston
Posts: 195
Rep Power: 16 |
Hi, I need some help getting SHM to run in parallel on OF 2.1.1.
Here is my script:

Code:
#!/bin/sh
echo "Started at $(date)"

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar
mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructPar
decomposePar
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar
sample
sample -dict sampleDictSTL ptot
echo "Finished at $(date)"

My decomposeParDict:

Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

// assume 4 cores
numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    n               (4 1 1);
    delta           0.001;
}

// ************************************************************************* //

And the errors I get:

Code:
--> FOAM FATAL ERROR: No times selected
    From function reconstructPar
    in file reconstructPar.C at line 139.
FOAM exiting

--> FOAM FATAL ERROR: Case is already decomposed with 4 domains,
    use the -force option or manually remove processor directories
    before decomposing.
    e.g., rm -rf /media/data/sduct1mil-parallel/processor*
    From function decomposePar
    in file decomposePar.C at line 253.
FOAM exiting

[1] --> FOAM FATAL IO ERROR:
[1] keyword vol1face1 is undefined in dictionary
    "/media/data/sduct1mil-parallel/processor1/0/p::boundaryField"
[1] file: /media/data/sduct1mil-parallel/processor1/0/p::boundaryField
    from line 26 to line 57.
[1] From function dictionary::subDict(const word& keyword) const
[1] in file db/dictionary/dictionary.C at line 461.
[1] FOAM parallel run exiting

(the same "keyword vol1face1 is undefined" FATAL IO ERROR is reported by
processors 0, 2 and 3)

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 23606 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:23602] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:23602] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

--> FOAM FATAL ERROR: No times selected
    From function reconstructPar
    in file reconstructPar.C at line 139.
FOAM exiting
|
July 3, 2013, 09:05 |
|
#2 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
Hello,
You should try the command "reconstructParMesh" instead of "reconstructPar".

Regards,
Aurelien
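For example, the call could look like this (a sketch on my part, not from the thread; the exact flags depend on the OpenFOAM version, and the -mergeTol value is an assumption). The snippet only writes the command to a script and syntax-checks it, since actually running it needs an OpenFOAM installation:

```shell
# After a parallel "snappyHexMesh -overwrite" run there is no new time
# directory, so reconstructPar reports "No times selected"; the mesh
# itself is merged with reconstructParMesh instead.
cat > mergeMesh.sh <<'EOF'
#!/bin/sh
reconstructParMesh -constant -mergeTol 1e-6
EOF
sh -n mergeMesh.sh && echo "mergeMesh.sh syntax OK"
```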
|
July 3, 2013, 10:21 |
|
#3 | |
Senior Member
Mihai Pruna
Join Date: Apr 2010
Location: Boston
Posts: 195
Rep Power: 16 |
Running it with -time 0 as a parameter does not work.
July 3, 2013, 10:31 |
|
#4 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
You can't recreate the 0 folder from the parallel output of snappyHexMesh (or at least I'm not aware of such a capability in OpenFOAM). You have to build it by hand before the second call to decomposePar:

Code:
blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar
# you may need to copy the capri.eMesh file into the processor* folders
mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructParMesh -constant
# (not sure about the -constant option; this command gives you the whole
# mesh in ./constant/polyMesh)
# here, check that your ./0 folder is OK
decomposePar
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar -latestTime    # this option is optional
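Putting those steps together, the corrected script might look like the sketch below (assumptions: reconstructParMesh -constant merges the mesh, and the case-specific capri.* names are taken from the original post). The snippet only writes the script and syntax-checks it, because running it requires OpenFOAM itself:

```shell
# Write the revised workflow to runCase.sh and verify its shell syntax.
cat > runCase.sh <<'EOF'
#!/bin/sh
blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar                       # copy capri.eMesh into processor* if needed
mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructParMesh -constant       # merge the mesh into constant/polyMesh
# check that ./0 matches the new mesh before re-decomposing
decomposePar -force
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar -latestTime
EOF
sh -n runCase.sh && echo "runCase.sh syntax OK"
```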
|
July 10, 2013, 04:40 |
|
#5 |
Senior Member
Artur
Join Date: May 2013
Location: Southampton, UK
Posts: 372
Rep Power: 20 |
Not sure if the previous answers solved your problem, but I had the same error when trying to decompose a case that already contained processor0, processor1, etc. folders in it. Removing them fixed it for me.
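That fix can be demonstrated on a mock case directory (mockCase and its contents are made up for the demo; in a real case you would delete the processor* folders that decomposePar complains about):

```shell
# decomposePar aborts with "Case is already decomposed" when leftover
# processor* folders are present; removing them clears the error.
mkdir -p mockCase/processor0 mockCase/processor1
rm -rf mockCase/processor*
[ ! -d mockCase/processor0 ] && echo "processor folders removed"
```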
|
|
July 10, 2013, 05:25 |
|
#6 |
Senior Member
Join Date: Aug 2010
Location: Groningen, The Netherlands
Posts: 216
Rep Power: 19 |
If I understood your problem correctly, the leftover processor folders are causing the error messages. You can use the -force flag so you don't have to delete them separately; they will be overwritten automatically:

Code:
decomposePar -force

For further hints on what flags are available, type:

Code:
decomposePar -help

regards
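As a toy illustration of what -force effectively does (plain shell only; mock_decomposePar and demoCase are made up for this sketch, not the real utility):

```shell
# Mock of decomposePar's behaviour: without -force it aborts when a
# decomposition exists; with -force it clears processor* first.
mock_decomposePar() {
    case_dir=$1; force=$2
    if ls "$case_dir"/processor* >/dev/null 2>&1; then
        if [ "$force" = "-force" ]; then
            rm -rf "$case_dir"/processor*   # overwrite old decomposition
        else
            echo "FATAL: case is already decomposed; use -force" >&2
            return 1
        fi
    fi
    mkdir "$case_dir"/processor0 "$case_dir"/processor1  # stand-in for real decomposition
}

mkdir -p demoCase/processor0
mock_decomposePar demoCase        || echo "aborted without -force"
mock_decomposePar demoCase -force && echo "re-decomposed with -force"
```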
|
July 16, 2015, 05:55 |
|
#7 |
Senior Member
|
Take care: the -force option will delete all your processor* directories, even if the times to be decomposed do not overlap those already decomposed.
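A possible safeguard (my own suggestion, not from the thread; caseDir and the file names are made up for the demo) is to archive the processor* directories before letting -force wipe them:

```shell
# Back up the decomposed data, then it is safe to overwrite it.
mkdir -p caseDir/processor0/0 caseDir/processor1/0   # mock decomposed case
tar -czf processor_backup.tar.gz -C caseDir processor0 processor1
rm -rf caseDir/processor*                            # what -force would do
tar -tzf processor_backup.tar.gz                     # contents survive in the archive
```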
|
|
|
|