CFD Online Forums > OpenFOAM > OpenFOAM Running, Solving & CFD
interDyMFoam takes forever in parallel

October 14, 2013, 09:41 | #1
musaddeque hossein (musahossein)

Dear all:

I am running a parallel case for seismic data using interDyMFoam. Seismic data can be erratic, unlike smooth sinusoidal data: it may show several displacements in one direction and then suddenly switch to displacements in the opposite direction. I am running a case with approximately 250 data points, which is not much. My mesh is 600x100. I have seen people on this forum suggest that one processor is enough for 50,000 cells. In my case I have 6 processors, yet the run is taking over 12 hours. Initially I had 12 processors and it was also taking more than 12 hours, so I reduced the number of processors to 6. Looking at the log file, I see that each step takes a few iterations to solve. Can anyone suggest ways to speed up the process? I have reproduced part of the log file below; I am not quite sure I understand each and every line, but perhaps hidden in those lines are flags that are trying to tell me why it is taking so long. Any advice will be greatly appreciated. Thanks!
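(For reference, the domain decomposition for a run like this is configured in system/decomposeParDict. Below is a minimal sketch for 6 subdomains; the simple method and the slab direction are assumptions for illustration, not taken from this case.)

// system/decomposeParDict -- minimal sketch
numberOfSubdomains 6;

method          simple;     // plain geometric split; scotch is a common alternative

simpleCoeffs
{
    n           (1 1 6);    // one slab per processor; align with the 600-cell direction
    delta       0.001;
}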

Snippet of the log file from when the analysis started:
*-------------------------------------------------------------------------------------------------------------------*
"CFS-Server.26896"
"CFS-Server.26897"
"CFS-Server.26898"
"CFS-Server.26899"
"CFS-Server.26900"
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Selecting dynamicFvMesh solidBodyMotionFvMesh
Selecting solid-body motion function tabulated6DoFMotion
Applying solid body motion to entire mesh
Reading field p_rgh

Reading field U

Reading/calculating face flux field phi

Reading transportProperties

Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting turbulence model type RASModel
Selecting RAS turbulence model laminar

Reading g
Calculating field g.h

No finite volume options present


PIMPLE: Operating solver in PISO mode

time step continuity errors : sum local = 0, global = 0, cumulative = 0
GAMGPCG: Solving for pcorr, Initial residual = 1, Final residual = 2.08439e-10, No Iterations 1
time step continuity errors : sum local = 2.56957e-07, global = 6.54836e-13, cumulative = 6.54836e-13
Courant Number mean: 1.21013e-05 max: 4.18906e-05

Starting time loop

Reading surface description:
leftwalls
rightwalls

Interface Courant Number mean: 0 max: 0
Courant Number mean: 1.18641e-05 max: 4.10692e-05
deltaT = 0.00116279
Time = 0.00116279

solidBodyMotionFunctions::tabulated6DoFMotion::transformation(): Time = 0.00116279 transformation: ((0 0.0228556 0) (1 (0 0 0)))
Execution time for mesh.update() = 0.07 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 0 Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 0 Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 0 Max(alpha1) = 1.00001
GAMG: Solving for p_rgh, Initial residual = 1, Final residual = 0.00989367, No Iterations 14
time step continuity errors : sum local = 0.000466452, global = -6.67992e-16, cumulative = 6.54168e-13
GAMGPCG: Solving for p_rgh, Initial residual = 0.0197322, Final residual = 1.38145e-09, No Iterations 12
time step continuity errors : sum local = 1.4508e-10, global = 5.7683e-16, cumulative = 6.54745e-13
ExecutionTime = 1.04 s ClockTime = 2 s

Interface Courant Number mean: 0 max: 0
Courant Number mean: 5.34521 max: 44.9452
deltaT = 1.29336e-05
Time = 0.00117572

solidBodyMotionFunctions::tabulated6DoFMotion::transformation(): Time = 0.00117572 transformation: ((0 0.0231166 0) (1 (0 0 0)))
Execution time for mesh.update() = 0.09 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 0 Max(alpha1) = 1.00001
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 0 Max(alpha1) = 1.00001
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 0 Max(alpha1) = 1.00001
GAMG: Solving for p_rgh, Initial residual = 0.929805, Final residual = 0.0088692, No Iterations 9
time step continuity errors : sum local = 1.62837e-06, global = -5.29716e-18, cumulative = 6.5474e-13
GAMGPCG: Solving for p_rgh, Initial residual = 0.0549742, Final residual = 4.49069e-10, No Iterations 9
time step continuity errors : sum local = 4.12279e-13, global = -5.20441e-18, cumulative = 6.54735e-13
ExecutionTime = 1.71 s ClockTime = 2 s

Interface Courant Number mean: 0.000239332 max: 0.550102
Courant Number mean: 0.0611436 max: 0.550102
deltaT = 1.17536e-05
Time = 0.00118748

October 14, 2013, 10:29 | #2
Pablo Higuera (Phicau)

Hi,

I may be pointing out the obvious, but your case probably takes a lot of time because your deltaT is very small.

My best guess is that your earthquake signal induces very large velocities in the first time step, yielding "Courant Number mean: 5.34521 max: 44.9452". You should try a smaller initial deltaT to minimize this startup effect.
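(In controlDict terms, that advice amounts to something like the sketch below; the numbers are illustrative only.)

// system/controlDict -- illustrative startup settings
deltaT          1e-05;  // small initial step so the first mesh motion does not spike Co
adjustTimeStep  yes;    // the solver can then grow the step back as the transient decays
maxCo           0.5;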

Furthermore, you may be experiencing so-called spurious velocities afterwards, since your maximum Courant number is located at the interface between the two fluids.

On top of that, I would try to use 1 processor only. In my scaling tests with interFoam, my magic number is 100,000 to 150,000 cells per processor.

These are really just my best guesses, as you provide very few details about your mesh and BCs.

Best,

Pablo

October 14, 2013, 10:54 | #3
haakon

Depending on your linear equation solver settings, OpenFOAM might scale well down to as little as 5,000 - 10,000 cells per process, but this requires the right kind of case and a lot of fine-tuning.
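(For example, the pressure solver settings in system/fvSolution have a large effect on parallel scaling. A typical GAMG setup for p_rgh might look like the sketch below; the tolerances are illustrative, not tuned values.)

// system/fvSolution -- illustrative sketch
solvers
{
    p_rgh
    {
        solver          GAMG;
        smoother        DIC;          // DICGaussSeidel is another common choice
        tolerance       1e-07;
        relTol          0.05;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 20;     // a small coarsest level tends to help parallel runs
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }
}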

Anyway, what Phicau pointed out regarding the Courant number is probably what you need to concentrate on. Decreasing the time step is often the only solution, but it might also be wise to investigate whether you can make changes to the grid to reduce the Courant number. Perhaps a particular region with high velocities is driving the number up? A small increase in the cell size there might help (but of course only as long as you still capture the necessary physics).

Finally: the Courant number should not vary with the number of processors in use; if it does, something is seriously wrong.

October 14, 2013, 12:11 | #4
musaddeque hossein (musahossein)

Gentlemen:

Thank you for your response. I am looking at tank sloshing. The tank dimensions are 1 m x 1 m x 0.1 m, and the water depth is 0.6 m. This is a two-phase (water/air) problem. The CofG is located at the water-air interface and centered along the length of the tank, so the left and right faces of the tank are each 0.5 m away. Below is the mesh and geometry information from the blockMeshDict file:

vertices
(

// SFINTAIR-DISPLACEMENT.V2
(-0.05 -0.50 -0.600) // Vertex back lower left corner = 0
(-0.05 0.50 -0.600) // Vertex back lower right corner= 1
(-0.05 0.50 0.400) // Vertex back upper right corner= 2
(-0.05 -0.50 0.400) // Vertex back upper left corner = 3

(0.05 -0.50 -0.600) // Vertex front lower left corner = 4
(0.05 0.50 -0.600) // Vertex front lower right corner= 5
(0.05 0.50 0.400) // Vertex front upper right corner= 6
(0.05 -0.50 0.400) // Vertex front upper left corner = 7
);

blocks
(
// block0
hex (0 1 2 3 4 5 6 7)
(100 600 1)
simpleGrading (1 1 1)
);

//patches
boundary
(
lowerWall
{
type patch;
faces
(
(0 1 5 4)
);
}
rightWall
{
type patch;
faces
(
(1 2 6 5)
);
}
atmosphere
{
type patch;
faces
(
(2 3 7 6)
);
}
leftWall
{
type patch;
faces
(
(0 4 7 3)
);
}
frontAndBack
{
type empty;
faces
(
(4 5 6 7)
(0 3 2 1)
);
}
);

In the controlDict file, the parameters are as follows:
application interDyMFoam;
startFrom startTime;
startTime 0;
stopAt endTime;
endTime 48;
deltaT 0.001;
writeControl adjustableRunTime;
writeInterval 0.05;
purgeWrite 0;
writeFormat ascii;
writePrecision 6;
writeCompression compressed;
timeFormat general;
timePrecision 6;
runTimeModifiable yes;
adjustTimeStep yes;
maxCo 0.5;
maxAlphaCo 0.5;
maxDeltaT 1;

I have a question about the deltaT. Based on the mesh, my deltaX = 1 m/100 = 0.01 m, so if I want a Courant number of 0.5, then deltaT = 0.5 x 0.01/1 = 0.005 s. I have no idea about the velocity U; I am assuming U = 1 m/s, which is roughly how I arrived at the deltaT I am using. How would one obtain U to get a better estimate for deltaT?
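(One rough way to estimate U, assuming the sloshing velocity scale is set by the shallow-water wave speed; this is an assumption about the physics, not something taken from the case files:)

\[
U \approx \sqrt{g h} = \sqrt{9.81 \times 0.6} \approx 2.4~\text{m/s},
\qquad
\Delta t = \frac{\mathrm{Co}\,\Delta x}{U} = \frac{0.5 \times 0.01}{2.4} \approx 2.1 \times 10^{-3}~\text{s}.
\]

With the finer spacing in the other direction (1 m / 600, about 0.0017 m) the same formula gives roughly 3.5e-4 s. Since adjustTimeStep is on, the solver will find such a step on its own; the prescribed deltaT mainly needs to be small enough to survive the first mesh motion.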

Thanks

October 14, 2013, 12:20 | #5
Pablo Higuera (Phicau)

Beware: your mesh has cells with a 1:6 aspect ratio. Do you really need such resolution in the Z direction relative to Y? In my experience with free-surface flows you should aim for something closer to 1:1; I personally like 1:2, and never go beyond 1:5.
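(Concretely, against the blockMeshDict posted above: keeping 100 cells across the width but using about 200 over the height would give roughly 1:2 cells. A sketch only; the counts are illustrative.)

// blockMeshDict sketch: 0.01 m x 0.005 m cells, about a 1:2 aspect ratio
blocks
(
    hex (0 1 2 3 4 5 6 7)
    (100 200 1)          // was (100 600 1); 1 m / 200 = 0.005 m in the fine direction
    simpleGrading (1 1 1)
);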

Even if checkMesh reports "Mesh OK", you have to be very careful with it, as your results depend entirely on it.

Best,

Pablo

October 14, 2013, 13:09 | #6
musaddeque hossein (musahossein)

You bring up a very interesting point; I don't know any better. However, when I run the same case with a sinusoidal input of 4000 data points, the run completes (with 12 processors) in about 6 hours. So I did not change the mesh size, assuming it was not a mesh problem.

What is your thought on deltaT?

October 16, 2013, 22:02 | #7
musaddeque hossein (musahossein)

Quote: Originally Posted by Phicau
Beware: your mesh has cells with a 1:6 aspect ratio. Do you really need such resolution in the Z direction relative to Y? In my experience with free-surface flows you should aim for something closer to 1:1; I personally like 1:2, and never go beyond 1:5.

Even if checkMesh reports "Mesh OK", you have to be very careful with it, as your results depend entirely on it.

Best,

Pablo
I changed the mesh ratio to 1:5 and reduced deltaT to 0.01. I have also reduced the processor count from 6 to 2. Still, OpenFOAM has been running for over 20 hours. The question, though, is that this is earthquake data, where the displacements are erratic (4 cm in one direction, then 2 cm in the opposite direction) and discrete; there is no analytical function to describe it. Could it be that OpenFOAM is having a hard time interpolating between the data points? The log file does not show any erratic behavior, or maybe the log file is complaining all right but I don't know where to look? Any help would be greatly appreciated. I am including a snippet of the log file below:

Interface Courant Number mean: 0.0124589 max: 0.372336
Courant Number mean: 0.0748294 max: 0.49897
deltaT = 4.49679e-05
Time = 0.102739

solidBodyMotionFunctions::tabulated6DoFMotion::transformation(): Time = 0.102739 transformation: ((0 0.232156 0) (1 (0 0 0)))
Execution time for mesh.update() = 0.05 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
GAMG: Solving for p_rgh, Initial residual = 0.10069, Final residual = 0.00058348, No Iterations 2
time step continuity errors : sum local = 5.95014e-07, global = 8.12922e-17, cumulative = -5.6182e-15
GAMGPCG: Solving for p_rgh, Initial residual = 0.0134842, Final residual = 1.1313e-09, No Iterations 11
time step continuity errors : sum local = 1.15361e-12, global = 8.15249e-17, cumulative = -5.53667e-15
ExecutionTime = 1788.55 s ClockTime = 1795 s

Interface Courant Number mean: 0.0124613 max: 0.372751
Courant Number mean: 0.0749913 max: 0.499211
deltaT = 4.50107e-05
Time = 0.102784

solidBodyMotionFunctions::tabulated6DoFMotion::transformation(): Time = 0.102784 transformation: ((0 0.232022 0) (1 (0 0 0)))
Execution time for mesh.update() = 0.05 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
GAMG: Solving for p_rgh, Initial residual = 0.0994884, Final residual = 0.000577669, No Iterations 2
time step continuity errors : sum local = 5.91006e-07, global = -1.21072e-16, cumulative = -5.65775e-15
GAMGPCG: Solving for p_rgh, Initial residual = 0.0133797, Final residual = 1.82521e-09, No Iterations 10
time step continuity errors : sum local = 1.92295e-12, global = -1.20349e-16, cumulative = -5.7781e-15
ExecutionTime = 1788.96 s ClockTime = 1796 s

Interface Courant Number mean: 0.0124672 max: 0.37248
Courant Number mean: 0.0750803 max: 0.499105
deltaT = 4.50536e-05
Time = 0.102829

solidBodyMotionFunctions::tabulated6DoFMotion::transformation(): Time = 0.102829 transformation: ((0 0.231887 0) (1 (0 0 0)))
Execution time for mesh.update() = 0.05 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
GAMG: Solving for p_rgh, Initial residual = 0.0977454, Final residual = 0.000615333, No Iterations 2
time step continuity errors : sum local = 6.41009e-07, global = 5.24555e-17, cumulative = -5.72564e-15
GAMGPCG: Solving for p_rgh, Initial residual = 0.0131217, Final residual = 1.93769e-09, No Iterations 10
time step continuity errors : sum local = 2.04804e-12, global = 5.23548e-17, cumulative = -5.67328e-15
ExecutionTime = 1789.37 s ClockTime = 1796 s

Interface Courant Number mean: 0.0125151 max: 0.372502
Courant Number mean: 0.0751691 max: 0.498562
deltaT = 4.51831e-05
Time = 0.102874

solidBodyMotionFunctions::tabulated6DoFMotion::transformation(): Time = 0.102874 transformation: ((0 0.231751 0) (1 (0 0 0)))
Execution time for mesh.update() = 0.05 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
MULES: Solving for alpha1
Phase-1 volume fraction = 0.6 Min(alpha1) = 6.79989e-32 Max(alpha1) = 1.00004
GAMG: Solving for p_rgh, Initial residual = 0.0995661, Final residual = 0.000583261, No Iterations 2
time step continuity errors : sum local = 6.12205e-07, global = 8.57513e-17, cumulative = -5.58753e-15
GAMGPCG: Solving for p_rgh, Initial residual = 0.0133902, Final residual = 1.45622e-09, No Iterations 11
time step continuity errors : sum local = 1.52179e-12, global = 8.59522e-17, cumulative = -5.50158e-15
ExecutionTime = 1789.79 s ClockTime = 1796 s

Interface Courant Number mean: 0.0126211 max: 0.37309
Courant Number mean: 0.0754036 max: 0.498553
deltaT = 4.53134e-05
Time = 0.102919
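(As an aside on the interpolation question: tabulated6DoFMotion reads a time table and interpolates between its entries, so irregular data by itself should not slow the solver; it is the size of the displacement per time step that drives the Courant number up. A minimal sketch of the usual setup follows; the file name, CofG and values here are illustrative assumptions, not taken from this case.)

// constant/dynamicMeshDict -- sketch
dynamicFvMesh   solidBodyMotionFvMesh;

solidBodyMotionFvMeshCoeffs
{
    solidBodyMotionFunction tabulated6DoFMotion;
    tabulated6DoFMotionCoeffs
    {
        CofG             (0 0 0);
        timeDataFileName "$FOAM_CASE/constant/6DoF.dat";
    }
}

// constant/6DoF.dat -- one entry per sample: ( time ((x y z) (rotX rotY rotZ)) )
(
    (0.000 ((0 0.0000 0) (0 0 0)))
    (0.001 ((0 0.0229 0) (0 0 0)))   // translations in meters; rotations typically in degrees
    (0.002 ((0 0.0410 0) (0 0 0)))
)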

October 18, 2013, 11:20 | #8
musaddeque hossein (musahossein)

Can anyone tell me what the function of maxAlphaCo is in the OpenFOAM sloshingTank2D case? I understand that maxAlphaCo applies to one of the phases (say the water in the tank, for example), but what does maxCo apply to? The setFieldsDict is appended below.

Thanks

*--------------------------------------------------------------------------------------------------------------------------*
FoamFile
{
version 2.0;
format ascii;
class dictionary;
location "system";
object setFieldsDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

defaultFieldValues
// define the entire analysis space as phase zero or gas phase
(
volScalarFieldValue alpha1 0
);

// Now define the region that contains the liquid: a box extending from 100 meters below
// the tank up to z = 0, and 100 meters in each of the x and y directions, i.e. far beyond
// the tank itself, to ensure the whole lower part of the domain is filled. Phase value is 1.

regions
(
boxToCell
{
box ( -100 -100 -100 ) ( 100 100 0 );
fieldValues
(
volScalarFieldValue alpha1 1
);
}
);


// ************************************************** *********************** //
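(A note on where these keywords live: maxCo and maxAlphaCo are controlDict entries, not setFieldsDict entries; setFieldsDict only initializes the alpha1 field. As a sketch of the distinction, using the values from the controlDict posted earlier in this thread:)

// system/controlDict -- where the Courant limits actually live
adjustTimeStep  yes;
maxCo           0.5;   // limits deltaT via the Courant number over the whole domain
maxAlphaCo      0.5;   // same limit, but evaluated only in interface cells (0 < alpha1 < 1)
maxDeltaT       1;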

October 24, 2013, 12:50 | #9
musaddeque hossein (musahossein)

Quote: Originally Posted by Phicau
Beware: your mesh has cells with a 1:6 aspect ratio. Do you really need such resolution in the Z direction relative to Y? In my experience with free-surface flows you should aim for something closer to 1:1; I personally like 1:2, and never go beyond 1:5.

Even if checkMesh reports "Mesh OK", you have to be very careful with it, as your results depend entirely on it.

Best,

Pablo
I figured out what the problem was: it was a units problem. I was inputting displacements that were in centimeters as if they were in meters. As the tank is only 1 m long, the input displacements were therefore huge, making OpenFOAM work overtime and producing results that did not make sense. I guess this experience does support OpenFOAM's robustness in solving such problems, though.


Tags: interdymfoam, parallel processing, takes for ever, takes long time

