www.cfd-online.com
Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Meshing & Mesh Conversion

[snappyHexMesh] shm in parallel with simple decomposition



Old   July 2, 2013, 11:36
Default shm in parallel with simple decomposition
  #1
Senior Member
 
Mihai Pruna
Join Date: Apr 2010
Location: Boston
Posts: 195
Rep Power: 16
mihaipruna is on a distinguished road
Hi, I need some help getting SHM (snappyHexMesh) to run in parallel on OpenFOAM 2.1.1.

Here is my script:
Code:
#!/bin/sh
echo Started At
date
# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions
blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar
mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructPar
decomposePar
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar
sample
sample -dict sampleDictSTL
ptot
echo Finished At
date
decomposeParDict file:

Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
//assume 4 cores
numberOfSubdomains 4;

method          simple;


simpleCoeffs
{
    n               (4 1 1);
    delta           0.001;
}




// ************************************************************************* //
and the errors:

Code:
--> FOAM FATAL ERROR: 
No times selected

    From function reconstructPar
    in file reconstructPar.C at line 139.

FOAM exiting



--> FOAM FATAL ERROR: 
Case is already decomposed with 4 domains, use the -force option or manually
remove processor directories before decomposing. e.g.,
    rm -rf /media/data/sduct1mil-parallel/processor*


    From function decomposePar
    in file decomposePar.C at line 253.

FOAM exiting

[0] [1] 
[1] 
[1] --> FOAM FATAL IO ERROR: 
[1] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor1/0/p::boundaryField"
[1] 
[1] file: /media/data/sduct1mil-parallel/processor1/0/p::boundaryField from line 26 to line 57.
[1] 
[1]     From function dictionary::subDict(const word& keyword) const
[1]     in file db/dictionary/dictionary.C at line 461.
[1] 
FOAM parallel run exiting
[1] 
[2] 
[2] 
[2] --> FOAM FATAL IO ERROR: 
[2] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor2/0/p::boundaryField"
[2] 
[2] file: /media/data/sduct1mil-parallel/processor2/0/p::boundaryField from line 26 to line 57.
[2] 
[2]     From function dictionary::subDict(const word& keyword) const
[2]     in file db/dictionary/dictionary.C at line 461.
[2] 
FOAM parallel run exiting
[2] 
[3] 
[3] 
[3] --> FOAM FATAL IO ERROR: 
[3] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor3/0/p::boundaryField"
[3] 
[3] file: /media/data/sduct1mil-parallel/processor3/0/p::boundaryField from line 26 to line 52.
[3] 
[3]     From function dictionary::subDict(const word& keyword) const
[3]     in file db/dictionary/dictionary.C at line 461.
[3] 
FOAM parallel run exiting
[3] 

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] 
[0] --> FOAM FATAL IO ERROR: 
[0] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor0/0/p::boundaryField"
[0] 
[0] file: /media/data/sduct1mil-parallel/processor0/0/p::boundaryField from line 26 to line 52.
[0] 
[0]     From function dictionary::subDict(const word& keyword) const
[0]     in file db/dictionary/dictionary.C at line 461.
[0] 
FOAM parallel run exiting
[0] 
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 23606 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:23602] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:23602] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages


--> FOAM FATAL ERROR: 
No times selected

    From function reconstructPar
    in file reconstructPar.C at line 139.

FOAM exiting
I attached the log as well.
Attached Files
File Type: zip testpar.txt.zip (13.3 KB, 1 views)

Old   July 3, 2013, 09:05
Default
  #2
Senior Member
 
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16
Aurelien Thinat is on a distinguished road
Hello,

You should try the command "reconstructParMesh" instead of "reconstructPar".

Regards,

Aurelien
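A minimal sketch of that suggestion, run from the case directory after the parallel snappyHexMesh run. The guard on WM_PROJECT_DIR (the variable an OpenFOAM install sets when sourced) is an illustrative addition so the sketch only prints the command when OpenFOAM is not available:

```shell
#!/bin/sh
# Rebuild the merged mesh from the processor* directories after a parallel
# snappyHexMesh run; the reassembled mesh ends up in constant/polyMesh.
if [ -n "$WM_PROJECT_DIR" ]; then
    reconstructParMesh -constant
else
    echo "would run: reconstructParMesh -constant"
fi
```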

Old   July 3, 2013, 10:21
Default
  #3
Senior Member
 
Mihai Pruna
Join Date: Apr 2010
Location: Boston
Posts: 195
Rep Power: 16
mihaipruna is on a distinguished road
Quote:
Originally Posted by Aurelien Thinat View Post
Hello,

You should try the command "reconstructParMesh" instead of "reconstructPar".

Regards,

Aurelien
Hi Aurelien, I actually tried that, but it only works if I give it the constant folder as a parameter, and it does not recreate the files in time 0.
Running it with -time 0 as a parameter does not work.

Old   July 3, 2013, 10:31
Default
  #4
Senior Member
 
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16
Aurelien Thinat is on a distinguished road
You can't recreate the folder 0 from the parallel output of snappyHexMesh (or at least I'm not aware of such a capability of OpenFOAM). You have to build it by hand before the second call to decomposePar.


blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar

(you may need to copy the capri.eMesh file into the processor* folders)

mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructParMesh -constant
(Not sure about the -constant option; this command allows you to have the whole mesh in the folder ./constant/polyMesh)

Here you check that your folder ./0 is OK

decomposePar
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar -latestTime (this option is optional)
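The steps above can be collected into one runnable sketch. This is hedged, not a definitive recipe: the STL name and -np 4 come from the original post, decomposePar -force for the second decomposition anticipates the "already decomposed" error from post #1, and the run() wrapper is an illustrative addition that prints each command instead of executing it when OpenFOAM is not sourced (WM_PROJECT_DIR unset):

```shell
#!/bin/sh
# run() executes the command when OpenFOAM is sourced (WM_PROJECT_DIR set);
# otherwise it prints what would run, so the sketch is safe to dry-run.
run() {
    if [ -n "$WM_PROJECT_DIR" ]; then
        "$@"
    else
        echo "would run: $*"
    fi
}

run blockMesh
run surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
run decomposePar
run mpirun -np 4 snappyHexMesh -overwrite -parallel
run reconstructParMesh -constant    # merged mesh lands in constant/polyMesh
# check that the fields in 0/ match the new boundary patches before re-decomposing
run decomposePar -force
run mpirun -np 4 rhoSimplecFoam -parallel
run reconstructPar -latestTime
```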

Old   July 10, 2013, 04:40
Default
  #5
Senior Member
 
Artur's Avatar
 
Artur
Join Date: May 2013
Location: Southampton, UK
Posts: 372
Rep Power: 20
Artur will become famous soon enough
Not sure if the previous answers solved your problem, but I had the same error when trying to decompose a case that already had processor0, processor1, etc. folders in it. Removing them fixed it for me.
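A minimal sketch of that fix, run from the case directory (the WM_PROJECT_DIR guard is an illustrative addition so the decomposePar call is skipped when OpenFOAM is not sourced):

```shell
#!/bin/sh
# Remove the stale processor* directories left over from the meshing stage,
# then re-run decomposePar on the clean case.
rm -rf processor*
if [ -n "$WM_PROJECT_DIR" ]; then
    decomposePar
else
    echo "skipped decomposePar (OpenFOAM environment not sourced)"
fi
```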

Old   July 10, 2013, 05:25
Default
  #6
Senior Member
 
Join Date: Aug 2010
Location: Groningen, The Netherlands
Posts: 216
Rep Power: 19
colinB is on a distinguished road
If I got your problem right, the processor folders are causing the error messages, so you can use the -force flag to avoid deleting them separately; they will be overwritten automatically:

decomposePar -force

For further hints on what flags are available, type:

decomposePar -help

regards

Old   July 16, 2015, 05:55
Default
  #7
Senior Member
 
louisgag's Avatar
 
Louis Gagnon
Join Date: Mar 2009
Location: Stuttgart, Germany
Posts: 338
Rep Power: 18
louisgag is on a distinguished road
Take care that the -force option will delete all your processor* directories, even if the times to decompose do not overlap those already decomposed.
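One way to act on that warning, sketched here as an assumption rather than an established convention: list what is sitting in each processor folder before letting decomposePar -force wipe it, so time directories worth keeping can be backed up first.

```shell
#!/bin/sh
# Print the contents (time directories, constant/, ...) of each processor*
# folder in the current case before running decomposePar -force.
for d in processor*; do
    [ -d "$d" ] || continue
    printf '%s: %s\n' "$d" "$(ls "$d" | tr '\n' ' ')"
done
```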


