CFD Online (www.cfd-online.com)
Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Computational time

March 12, 2009, 05:23   #1
Vivien (Member, Join Date: Mar 2009, Posts: 52)
Hi, All,

I am running a transient case with both icoFoam and turbFoam, on a mesh composed of 300,000 elements.

I use parallel computing with 15 processors (64-bit), but it turns out to be very slow. An adaptive time step is used. Below is part of the output from the two solvers.

ICOFOAM:
Time = 0.090534016

Courant Number mean: 0.0042189121 max: 0.60001494
deltaT = 5.051708e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.5933376e-05, Final residual = 9.3428658e-09, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1.7857106e-05, Final residual = 1.1072651e-09, No Iterations 2
DILUPBiCG: Solving for Uz, Initial residual = 1.6937772e-05, Final residual = 4.5817176e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.6714912e-05, Final residual = 9.753555e-09, No Iterations 937
DICPCG: Solving for p, Initial residual = 0.00022323789, Final residual = 9.9495147e-09, No Iterations 870
time step continuity errors : sum local = 1.1644498e-12, global = -3.4127869e-14, cumulative = 5.5155785e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139686.95 s ClockTime = 140261 s

Time = 0.090539068

Courant Number mean: 0.0042189072 max: 0.60001581
deltaT = 5.0515749e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.6457677e-05, Final residual = 4.6470092e-09, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 1.8183019e-05, Final residual = 1.1097131e-09, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 1.7496236e-05, Final residual = 8.7850725e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.916581e-05, Final residual = 9.5537303e-09, No Iterations 934
DICPCG: Solving for p, Initial residual = 0.00022626984, Final residual = 9.5267804e-09, No Iterations 914
time step continuity errors : sum local = 1.1548003e-12, global = -1.7614837e-14, cumulative = 5.3394302e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139696.76 s ClockTime = 140271 s

Time = 0.090544119

Courant Number mean: 0.0042188909 max: 0.60001492
deltaT = 5.0514493e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.5920686e-05, Final residual = 9.2325137e-09, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1.7845416e-05, Final residual = 9.8299531e-10, No Iterations 2
DILUPBiCG: Solving for Uz, Initial residual = 1.6922093e-05, Final residual = 4.5258186e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.8415655e-05, Final residual = 9.4206439e-09, No Iterations 938
DICPCG: Solving for p, Initial residual = 0.00023378581, Final residual = 9.4035601e-09, No Iterations 864
time step continuity errors : sum local = 1.1591606e-12, global = 2.6889353e-15, cumulative = 5.3663195e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139706.31 s ClockTime = 140281 s

Time = 0.090549171

Courant Number mean: 0.0042188863 max: 0.60001584
deltaT = 5.051316e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.6469925e-05, Final residual = 4.7423517e-09, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 1.8189916e-05, Final residual = 1.2159212e-09, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 1.7507658e-05, Final residual = 8.9428823e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.9920706e-05, Final residual = 9.8779321e-09, No Iterations 933
DICPCG: Solving for p, Initial residual = 0.00023749663, Final residual = 9.958994e-09, No Iterations 914
time step continuity errors : sum local = 1.1595706e-12, global = 1.3994702e-14, cumulative = 5.5062665e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139716.1 s ClockTime = 140291 s


TURBFOAM:


Time = 0.011246492

Courant Number mean: 0.0021497253 max: 0.29183874
deltaT = 4.1666667e-06
DILUPBiCG: Solving for Ux, Initial residual = 2.2243458e-05, Final residual = 6.4891902e-08, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 2.7101833e-05, Final residual = 1.6199912e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 2.3161504e-05, Final residual = 1.6636072e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 6.4919684e-05, Final residual = 9.5536623e-07, No Iterations 870
DICPCG: Solving for p, Initial residual = 6.7920792e-05, Final residual = 9.9707514e-07, No Iterations 13
DICPCG: Solving for p, Initial residual = 8.065135e-06, Final residual = 7.5610997e-07, No Iterations 3
time step continuity errors : sum local = 4.3769708e-12, global = -4.630939e-14, cumulative = -3.7206299e-11
DICPCG: Solving for p, Initial residual = 4.9009229e-06, Final residual = 9.478687e-07, No Iterations 65
DICPCG: Solving for p, Initial residual = 7.1595084e-06, Final residual = 8.0333983e-07, No Iterations 2
DICPCG: Solving for p, Initial residual = 1.1750355e-06, Final residual = 5.0730276e-07, No Iterations 1
time step continuity errors : sum local = 2.93423e-12, global = -1.9983413e-13, cumulative = -3.7406133e-11
DILUPBiCG: Solving for epsilon, Initial residual = 1.0007713e-05, Final residual = 4.9980981e-08, No Iterations 1
DILUPBiCG: Solving for k, Initial residual = 3.374637e-05, Final residual = 2.9900197e-07, No Iterations 1
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 17468.16 s ClockTime = 157562 s

Time = 0.011250659

Courant Number mean: 0.0021497677 max: 0.29185204
deltaT = 4.1666667e-06
DILUPBiCG: Solving for Ux, Initial residual = 2.2222925e-05, Final residual = 6.6488393e-08, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 2.7102149e-05, Final residual = 1.639788e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 2.3168021e-05, Final residual = 1.8278849e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 5.2848402e-05, Final residual = 9.6086881e-07, No Iterations 867
DICPCG: Solving for p, Initial residual = 6.600024e-05, Final residual = 9.7140246e-07, No Iterations 13
DICPCG: Solving for p, Initial residual = 7.8676029e-06, Final residual = 7.3652595e-07, No Iterations 3
time step continuity errors : sum local = 4.270429e-12, global = 4.5538476e-14, cumulative = -3.7360594e-11
DICPCG: Solving for p, Initial residual = 4.7932553e-06, Final residual = 9.1380874e-07, No Iterations 24
DICPCG: Solving for p, Initial residual = 4.0275424e-06, Final residual = 7.9633809e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 9.6547429e-07, Final residual = 9.6547429e-07, No Iterations 0
time step continuity errors : sum local = 5.3984369e-12, global = 5.6480473e-13, cumulative = -3.679579e-11
DILUPBiCG: Solving for epsilon, Initial residual = 9.9681975e-06, Final residual = 9.9681975e-06, No Iterations 0
DILUPBiCG: Solving for k, Initial residual = 3.3737499e-05, Final residual = 2.9891437e-07, No Iterations 1
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 17474.9 s ClockTime = 157626 s

Time = 0.011254826

Courant Number mean: 0.0021498106 max: 0.29186488
deltaT = 4.1666667e-06
DILUPBiCG: Solving for Ux, Initial residual = 2.2233614e-05, Final residual = 6.5251044e-08, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 2.7089605e-05, Final residual = 1.6233944e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 2.3153049e-05, Final residual = 1.6663021e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 6.453333e-05, Final residual = 9.5711004e-07, No Iterations 870
DICPCG: Solving for p, Initial residual = 6.8593867e-05, Final residual = 9.8837497e-07, No Iterations 16
DICPCG: Solving for p, Initial residual = 8.3200747e-06, Final residual = 7.3665399e-07, No Iterations 3
time step continuity errors : sum local = 4.2746891e-12, global = -1.250224e-13, cumulative = -3.6920812e-11
DICPCG: Solving for p, Initial residual = 4.9091131e-06, Final residual = 9.0890313e-07, No Iterations 65
DICPCG: Solving for p, Initial residual = 6.9693982e-06, Final residual = 7.8703718e-07, No Iterations 2
DICPCG: Solving for p, Initial residual = 1.1489768e-06, Final residual = 4.9787673e-07, No Iterations 1
time step continuity errors : sum local = 2.8983148e-12, global = -1.6075718e-13, cumulative = -3.7081569e-11
DILUPBiCG: Solving for epsilon, Initial residual = 1.0005115e-05, Final residual = 5.0014711e-08, No Iterations 1
DILUPBiCG: Solving for k, Initial residual = 3.3738596e-05, Final residual = 2.9894466e-07, No Iterations 1
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 17482.23 s ClockTime = 157694 s

Is there any way to improve the computational performance?

Thanks in advance!

Vivien

March 12, 2009, 07:15   #2
Greg Collecutt (New Member, Join Date: Mar 2009, Location: Brisbane, Queensland, Australia, Posts: 21)
Hi Vivien,

On occasions I have forgotten to put the -parallel option on the command line and have run 8 or more simultaneous copies of the single-process version. How I got my PhD I don't know! I doubt you have made this mistake.
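For the record, the usual sequence looks something like this (the processor count and solver name here are just taken from Vivien's description; adjust to your own case):

```
decomposePar                       # split the case per system/decomposeParDict
mpirun -np 15 icoFoam -parallel    # forget -parallel and you get 15 serial copies
reconstructPar                     # merge the processor* results afterwards
```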

I also see that the DICPCG solver is working pretty hard to solve the pressure field. Normally I would expect it to take fewer than 10 iterations to converge when running at a Courant number of 0.3. Sometimes I find the GAMG solver better for stubborn fields, though I have noted that it does not parallelise very well.
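If you want to try it, the p entry in system/fvSolution would look something like this (a sketch only: the tolerances and smoother below are placeholder settings I have not tuned for your case, and the exact syntax depends on your OpenFOAM version):

```
solvers
{
    p GAMG
    {
        tolerance        1e-08;
        relTol           0.01;
        smoother         GaussSeidel;   // placeholder choice
        nPreSweeps       0;
        nPostSweeps      2;
        cacheAgglomeration on;
        agglomerator     faceAreaPair;
        nCellsInCoarsestLevel 100;
        mergeLevels      1;
    };
}
```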

How well does it run when you use a constant inlet condition (rather than the time-varying one I think you are using)?

Greg.

March 12, 2009, 09:30   #3
Vivien (Member, Join Date: Mar 2009, Posts: 52)
Hi, Greg,

I did put the "-parallel" option...

May I know how fine your mesh is and how long your simulation runs? Which solver did you use?

Many thanks!!

Vivien

March 13, 2009, 05:25   #4
Rishi (Senior Member, Join Date: Mar 2009, Posts: 149)
Hi Vivien,

I read somewhere on the forum that for efficient parallel execution each processor should get at least 100k cells. In your case, 300k cells / 15 CPUs = 20k cells per CPU. So there is a lot of communication involved, and the network speed is possibly the bottleneck. Variants of Conjugate Gradient (CG) solvers need global data at each iteration step.
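You can actually see this in your turbFoam log: ExecutionTime (~17,500 s of CPU time) is only about 11% of ClockTime (~157,700 s of wall time), so the process spends most of its time waiting. A quick way to compute that ratio from a log line (the awk field positions assume the exact log format you posted):

```shell
# CPU-time / wall-time utilisation from an OpenFOAM log line.
# Line format: "ExecutionTime = <cpu> s ClockTime = <wall> s" -> fields $3 and $7
echo "ExecutionTime = 17482.23 s ClockTime = 157694 s" \
  | awk '{printf "utilisation = %.2f\n", $3/$7}'
```

A value near 1 means the solver is compute-bound; a value this low usually points at communication or load imbalance.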

My suggestions:
1. Run the case on 1 CPU and then on 2 or 4 CPUs on the same node (depending on whether you have dual- or quad-core) and check the speedup, if any.

2. Increase the domain size so that you have ~100k cells/CPU.

3. Could you post your decomposePar log file? It gives information about the communication required.
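For point 1, something like the following would do (illustrative only: you still have to edit numberOfSubdomains in system/decomposeParDict by hand before each pass, and the solver name is just an example):

```
for n in 1 2 4; do
    # set numberOfSubdomains to $n in system/decomposeParDict first
    rm -rf processor*                                  # clear the old decomposition
    decomposePar
    mpirun -np $n icoFoam -parallel > log.np$n 2>&1
    grep ClockTime log.np$n | tail -1                  # wall time for this run
done
```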

Rishi

March 13, 2009, 07:48   #5
Greg Collecutt (New Member, Join Date: Mar 2009, Location: Brisbane, Queensland, Australia, Posts: 21)
Rishi, Vivien,

I am using a custom solver, a variant of reactingFoam, which includes a fair amount of Lagrangian particle tracking and computation. In my case I have 300K cells + 200K particles and obtain near-linear speedup all the way to 64 CPUs (i.e. fewer than 5000 cells per process). My mesh is long and thin, so I use simple decomposition along the long axis; each processor mesh then has at most two processor boundaries.
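In system/decomposeParDict terms that looks roughly like this (the subdomain count and delta below are illustrative, not my actual settings; the split assumes the long axis is x):

```
numberOfSubdomains 16;

method          simple;

simpleCoeffs
{
    n           (16 1 1);   // all splits along the long (x) axis
    delta       0.001;
}
```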

Greg.

March 16, 2009, 04:32   #6
Ivan Flaminio Cozza (Senior Member, Join Date: Mar 2009, Location: Torino, Piemonte, Italia, Posts: 210)
Quote (Originally Posted by sunnysun):
Hi, Greg,

I did put the "-parallel" option...

May I know how fine your mesh is and how long your simulation runs? Which solver did you use?

Many thanks!!

Vivien
Hi guys!
In my opinion the problem is in the p solver. Try using GAMG as the linear solver: in my incompressible simulations it brought the p iterations down from ~900 to ~20...
Ivan
