
running mapFields in parallel

March 1, 2019, 14:28   #1
running mapFields in parallel
mkhm (K, Member, Join Date: Jul 2017, Posts: 97)
Dear foamers,

I want to use the mapFields pre-processing utility of OpenFOAM to map fields from a coarse mesh onto a much finer mesh. The problem is that this takes far too long: more than 10 hours in serial, which I don't think is normal. So I tried to speed things up by running mapFields in parallel, but my attempts were not successful. My command line is:

mpirun -np$NSLOTS mapFields case_1 -case case_2 -sourceTime latestTime -parallel> log


and I tried to get inspiration from this thread:
mapFields taking too long


I tried different variants of this command, such as:

mpirun -np$NSLOTS mapFields case_1 -case case_2-sourceTime latestTime -parallel> log

mpirun -np$NSLOTS mapFields case_1 -case case_2 -mapMethod direct -sourceTime latestTime -parallel> log

mpirun -np$NSLOTS mapFields case_1 -case case_2 -mapMethod direct -sourceTime latestTime -fields '(U T p)' -consistent -noLagrangian -parallel> log

mpirun -np$NSLOTS mapFields case_1 -case case_2 -mapMethod direct -sourceTime latestTime -fields '(U T p)' -consistent -parallel> log

I get errors of the form:
Invalid option: -parallel, -noLagrangian, -fields

Could someone help me?
I guess that, since the posters in the thread linked above were not using the same version as me, I get different error messages, such as Invalid option: -fields. However, what I want to do is very basic: just run mapFields in parallel to speed things up.

I would be glad for your help.
Thanks in advance,
Mary


March 8, 2019, 02:27   #2
yambanshee (Zander Meiring, Senior Member, Join Date: Jul 2018, Posts: 125)
I've had success with mapFields in parallel using the following command (extracted from a submit file for an HPC), running on OF5:


Code:
mpirun -np $nproc -machinefile $PBS_NODEFILE mapFieldsPar -consistent -parallel -case mid-ke mid-ke-wallfunc > map.out
The main difference I note between our scripts is the use of mapFields vs mapFieldsPar. I have moved over to using mapFieldsPar even when doing single-core mapping, due to better stability.
Furthermore, you seem to be missing a space between -np and $NSLOTS.
Lastly, if you think there might be a version difference, please include which version you are using.
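For reference, a corrected version of your first command along those lines might look like the following. This is only a sketch: I'm assuming $NSLOTS is set by your queueing system, and that case_1 is the source case and case_2 the target, as in your post.

Code:
mpirun -np $NSLOTS mapFieldsPar case_1 -case case_2 -consistent -sourceTime latestTime -parallel > log 2>&1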

April 15, 2019, 04:56   #3
mkhm (K, Member, Join Date: Jul 2017, Posts: 97)
Quote: Originally Posted by yambanshee (post #2)



Thanks for your answer. I am using versions 4.x, 4.1 and 2.3.0 of OpenFOAM, and I cannot find mapFieldsPar in any of them. Which version of OpenFOAM are you using?


Best regards,
Mary

November 27, 2019, 20:59   #4
Geon-Hong (Geon-Hong Kim, Member, Join Date: Feb 2010, Location: Ulsan, Republic of Korea, Posts: 36)
Quote: Originally Posted by mkhm (post #3)

You can find it in OpenFOAM 5 or later versions.
You can also find it in the OpenFOAM+ (ESI) releases, e.g. v1906.
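If you are unsure whether your installation ships it, a quick check (assuming a sourced OpenFOAM environment, where $FOAM_APPBIN points at the compiled applications) is:

Code:
ls $FOAM_APPBIN | grep -i mapFields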

April 3, 2020, 11:50   #5
louisgag (Louis Gagnon, Senior Member, Join Date: Mar 2009, Location: Stuttgart, Germany, Posts: 338)
I run a case in simpleFoam and map it into a pimpleFoam case.
I have an AMI interface, but I don't use it in simpleFoam.
Both the simpleFoam and the pimpleFoam cases have periodic (cyclic) boundaries.
They both run in parallel.


I'm also getting a terribly slow mapping.
Using both this thread and the mapFields taking too long thread as inspiration did not really make things faster.
There is no way I can get mapFieldsPar to run: I always get a segmentation fault.
I tried both AABB and LOD.


Only mapFields works, and it is always extremely slow, even when using mapNearest and consistent (mapFields does not allow the direct method).


So only something like this (with any mapMethod) works:


Code:
runApplication mapFields  -mapMethod mapNearest -parallelSource -parallelTarget -sourceTime latestTime -consistent  ./tmpCase

June 3, 2021, 17:07   #6
mapFieldsPar issue
AlxB (Member, Join Date: Apr 2021, Posts: 41)
Hello,
I have an issue using mapFieldsPar.

Here is the command line I use in an Allrun script:
Code:
runParallel mapFieldsPar simple -case pimple -sourceTime latestTime -consistent -decomposeParDict
(PS: I've also tried without the -decomposeParDict option.)

'pimple' and 'simple' are the two subfolders of the case directory; the Allrun script is located in the main folder.

Everything runs smoothly except the mapFieldsPar operation.
Here is the Allrun log:
Code:
Running decomposePar on ../run/AA01p/simple
Running renumberMesh (16 processes) on ../run/AA01p/simple
Restore 0/ from 0.orig/ for processor directories
Running checkMesh (16 processes) on ../run/AA01p/simple
Running patchSummary (16 processes) on ../run/AA01p/simple
Running decomposePar on ../run/AA01p/pimple
Running renumberMesh (16 processes) ../run/AA01p/pimple
Restore 0/ from 0.orig/ for processor directories
Running checkMesh (16 processes) on ../run/AA01p/pimple
Running patchSummary (16 processes) on ../run/AA01p/pimple
Running potentialFoam (16 processes) on ../run/AA01p/simple
Running setFields (16 processes) on ../run/AA01p/simple
Running simpleFoam (16 processes) on ../run/AA01p/simple
Error getting 'numberOfSubdomains' from 'system/decomposeParDict'
Running mapFieldsPar (1 processes) on ../run/AA01p
Running reconstructParMesh on ../run/AA01p/simple
Running reconstructPar on ../run/AA01p/simple
Running pimpleFoam (16 processes) on ../run/AA01p/pimple
Running reconstructParMesh on ../run/AA01p/pimple
Running reconstructPar on ./run/AA01p/pimple
and mapFieldsPar says:
Code:
--> FOAM FATAL ERROR: (openfoam-2012)
attempt to run parallel on 1 processor

    From static bool Foam::UPstream::init(int&, char**&, bool)
    in file UPstream.C at line 289.

FOAM aborting

#0  Foam::error::printStack(Foam::Ostream&) at ??:?
#1  Foam::error::exitOrAbort(int, bool) at ??:?
#2  Foam::UPstream::init(int&, char**&, bool) at ??:?
#3  Foam::argList::argList(int&, char**&, bool, bool, bool) at ??:?
#4  ? at ??:?
#5  __libc_start_main in /lib/x86_64-linux-gnu/libc.so.6
#6  ? at ??:?
[oli-t7610:31119] *** Process received signal ***
[oli-t7610:31119] Signal: Aborted (6)
[oli-t7610:31119] Signal code:  (-6)
[oli-t7610:31119] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x153c0)[0x7f5505cf83c0]
[oli-t7610:31119] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xcb)[0x7f5505b3718b]
[oli-t7610:31119] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x12b)[0x7f5505b16859]
[oli-t7610:31119] [ 3] /opt/OpenFOAM/OpenFOAM-v2012/platforms/linux64Gcc63DPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam5error11exitOrAbortEib+0x1f0)[0x7f5506591360]
[oli-t7610:31119] [ 4] /opt/OpenFOAM/OpenFOAM-v2012/platforms/linux64Gcc63DPInt32Opt/lib/openmpi-4.0.3/libPstream.so(_ZN4Foam8UPstream4initERiRPPcb+0xf24)[0x7f55058e5ab4]
[oli-t7610:31119] [ 5] /opt/OpenFOAM/OpenFOAM-v2012/platforms/linux64Gcc63DPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam7argListC1ERiRPPcbbb+0x6cd)[0x7f55065ba6ad]
[oli-t7610:31119] [ 6] mapFieldsPar[0x429c98]
[oli-t7610:31119] [ 7] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3)[0x7f5505b180b3]
[oli-t7610:31119] [ 8] mapFieldsPar[0x42bdb7]
[oli-t7610:31119] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node oli-t7610 exited on signal 6 (Aborted).
--------------------------------------------------------------------------
Then I moved the mapFieldsPar operation under the 'pimple' folder and tried to run the following commands, but got the following outputs:
Code:
runParallel mapFieldsPar ../simple -case -sourceTime latestTime –consistent
returns: Expected 1 arguments but found 2

runParallel mapFieldsPar -consistent -case ../simple -sourceTime latestTime 
returns: Expected 1 arguments but found 0

runParallel mapFieldsPar -parallelSource -case ../simple -parallelTarget -sourceTime latestTime –consistent
returns: Expected 1 arguments but found 0
             Invalid option: -parallelSource
             Invalid option: -parallelTarget

runParallel mapFieldsPar -case ../simple -consistent -sourceTime latestTime
returns: Expected 1 arguments but found 0
Please, could I have some help understanding where the problem might come from?

thank you

Last edited by AlxB; June 4, 2021 at 04:16.

June 7, 2021, 04:38   #7
louisgag (Louis Gagnon, Senior Member, Join Date: Mar 2009, Location: Stuttgart, Germany, Posts: 338)
The error seems quite clear: you have to define the number of processors (numberOfSubdomains) in the decomposeParDict.
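You can also query what the run functions would actually read, e.g. with foamDictionary (a sketch; run it from inside the case folder):

Code:
foamDictionary -entry numberOfSubdomains -value system/decomposeParDict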

June 10, 2021, 04:19   #8
AlxB (Member, Join Date: Apr 2021, Posts: 41)
Thanks Louis,
here is my decomposeParDict:

Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 16;

method          hierarchical;

simpleCoeffs
{
    n               (4 2 2);
    delta           0.001;
}

hierarchicalCoeffs
{
    n               (4 2 2);
    delta           0.001;
    order           xyz;
}

manualCoeffs
{
    dataFile        "cellDecomposition";
}
Is there maybe something wrong in it?

September 14, 2021, 12:59   #9
AlxB (Member, Join Date: Apr 2021, Posts: 41)
Hello,

Is anyone able to run mapFields with runParallel in an Allrun file?

Code:
runParallel mapFields
So far it has not been able to pick up the correct number of processors as defined in the decomposeParDict:

Code:
"Error getting 'numberOfSubdomains' from system/decomposeParDict'
Running mapFields (1 processes) on ..
Is there a way to do it differently?


thank you

September 14, 2021, 13:48   #10
Geon-Hong (Geon-Hong Kim, Member, Join Date: Feb 2010, Location: Ulsan, Republic of Korea, Posts: 36)
Quote: Originally Posted by AlxB (post #9)
Why don't you execute the mpirun command directly instead of the runParallel script function?
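Something along these lines, as a sketch only (I'm assuming 16 subdomains, as in your decomposeParDict, and that you run it from inside the target case so that the source case is reachable as ../simple):

Code:
cd pimple
mpirun -np 16 mapFieldsPar ../simple -consistent -sourceTime latestTime -parallel > log.mapFieldsPar 2>&1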

September 16, 2021, 14:12   #11
AlxB (Member, Join Date: Apr 2021, Posts: 41)
Hello,
Thank you for your answer.

I eventually got some success with the following command:

Code:
runParallel mapFieldsPar ../<sourceCase> -consistent -parallel -sourceTime latestTime


(I run this from inside the target case folder, which presumably is why runParallel can now read numberOfSubdomains from that case's system/decomposeParDict.)

Now I have a new problem:

I changed the boundary conditions for this new case: an increased rotational velocity for an object (the $spinSpeed of a rotatingWallVelocity condition) and an increased velocity at the domain inlet (a uniform $flowVelocity vector).

For this new case I start with a fresh 0.orig folder, immediately copied as a 0 folder, and then run mapFieldsPar to initialize the fields with the results from the previous simulation.

The updated velocity values are in the 0.orig and 0 folders.

However, when the run starts I can check that:
- the rotating wall velocity has been correctly updated;
- the inlet velocity has NOT been correctly updated.
Is there a trick to make this work?

Should I add another command to impose the inlet velocity values (like setFields or something else)?

Below is my U file:
Code:
#include        "$FOAM_CASE/0.orig/include/initialConditions"

dimensions      [0 1 -1 0 0 0 0];

internalField   uniform $flowVelocity;

boundaryField
{
    #includeEtc "caseDicts/setConstraintTypes"

    inlet
    {
        type            fixedValue;
        value           $internalField;
    }

    object
    {
        type            rotatingWallVelocity;
        origin          (0 0 0);
        axis            (0 1 0);
        omega           $spinSpeed;
    }
}
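One workaround I am considering (untested, and the value below is only a placeholder for my actual $flowVelocity) is to re-impose the inlet value on the decomposed 0 fields after mapping, looping foamDictionary over the processor directories:

Code:
# run from the target case after mapFieldsPar; "uniform (5 0 0)" is a placeholder
for proc in processor*
do
    foamDictionary "$proc/0/U" -entry boundaryField/inlet/value -set "uniform (5 0 0)"
done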
Thank you for your help


Tags
mapfields, openfoam, parallel calculation, pre-processing

