
Error when running in parallel, but success on a single core

December 2, 2014, 03:40 | #1
Error when running in parallel, but success on a single core

Dongyue Li (sharonyue), Senior Member
Join Date: Jun 2012 | Location: Beijing, China | Posts: 849
Hi guys,

I have discussed this issue with many people, and several of them have run into it once or twice in their own cases. To put it simply: a case runs fine on a single core, but blows up when run in parallel.

I have run into this trouble again recently. Two years ago, during some OpenFOAM training, I accidentally hit the same problem but ignored it. Now that I want to speed up my simulation, I cannot stay on a single core.
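
For reference, I launch the parallel run in the standard way; nothing unusual there (4 subdomains here is just an example, set in system/decomposeParDict, and mySolver is a placeholder for my tailor-made solver):

Code:
decomposePar
mpirun -np 4 mySolver -parallel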

Here are the log files:

single core:
Code:
PIMPLE: Operating solver in PISO mode


Starting time loop

Courant Number mean: 0.0394037 max: 0.49847
Max Ur Courant Number = 0.0929619
deltaT = 0.000726744
Time = 5.00073

MULES: Solving for alpha.air
MULES: Solving for alpha.air
alpha.air volume fraction = 0.00199999  Min(alpha1) = 5.25086e-05  Max(alpha1) = 0.0298173
DILUPBiCG:  Solving for TY, Initial residual = 0.000198921, Final residual = 2.11904e-17, No Iterations 8
GAMG:  Solving for p, Initial residual = 0.00891682, Final residual = 3.51213e-05, No Iterations 3
GAMG:  Solving for p, Initial residual = 0.00150613, Final residual = 1.24395e-05, No Iterations 3
GAMG:  Solving for p, Initial residual = 0.000241716, Final residual = 7.33671e-08, No Iterations 5
smoothSolver:  Solving for epsilonm, Initial residual = 2.29257e-05, Final residual = 6.22539e-08, No Iterations 4
smoothSolver:  Solving for km, Initial residual = 3.06159e-05, Final residual = 7.12566e-08, No Iterations 6
ExecutionTime = 2.49 s

Courant Number mean: 0.0395294 max: 0.499942
Max Ur Courant Number = 0.095134
deltaT = 0.000726744
Time = 5.00145

MULES: Solving for alpha.air
MULES: Solving for alpha.air
alpha.air volume fraction = 0.00199999  Min(alpha1) = 5.2494e-05  Max(alpha1) = 0.0298275
DILUPBiCG:  Solving for TY, Initial residual = 0.000200717, Final residual = 1.09388e-17, No Iterations 8
GAMG:  Solving for p, Initial residual = 0.00903888, Final residual = 3.94824e-05, No Iterations 3
GAMG:  Solving for p, Initial residual = 0.000881921, Final residual = 6.50462e-06, No Iterations 3
GAMG:  Solving for p, Initial residual = 0.000136874, Final residual = 6.05967e-08, No Iterations 5
smoothSolver:  Solving for epsilonm, Initial residual = 1.21255e-05, Final residual = 5.58432e-08, No Iterations 4
smoothSolver:  Solving for km, Initial residual = 2.34679e-05, Final residual = 7.04817e-08, No Iterations 6
ExecutionTime = 4.27 s
parallel:
Code:
PIMPLE: Operating solver in PISO mode


Starting time loop

Courant Number mean: 0.0394037 max: 0.49847
Max Ur Courant Number = 0.0929619
deltaT = 0.000726744
Time = 5.00073

MULES: Solving for alpha.air
MULES: Solving for alpha.air
alpha.air volume fraction = 0.00199999  Min(alpha1) = 5.25086e-05  Max(alpha1) = 0.0298173
DILUPBiCG:  Solving for TY, Initial residual = 0.000198921, Final residual = 5.953e-17, No Iterations 9
GAMG:  Solving for p, Initial residual = 0.00918711, Final residual = 3.26779e-05, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.001776, Final residual = 7.50224e-06, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.000296505, Final residual = 3.69827e-08, No Iterations 7
smoothSolver:  Solving for epsilonm, Initial residual = 2.41189e-05, Final residual = 7.0696e-08, No Iterations 4
smoothSolver:  Solving for km, Initial residual = 3.18021e-05, Final residual = 7.4419e-08, No Iterations 6
ExecutionTime = 1.19 s

Courant Number mean: 0.0395316 max: 0.499996
Max Ur Courant Number = 0.0951141
deltaT = 0.000726744
Time = 5.00145

MULES: Solving for alpha.air
MULES: Solving for alpha.air
alpha.air volume fraction = 0.00199999  Min(alpha1) = 5.24938e-05  Max(alpha1) = 0.0298275
DILUPBiCG:  Solving for TY, Initial residual = 0.00020167, Final residual = 6.63252e-14, No Iterations 4001
GAMG:  Solving for p, Initial residual = 0.00912649, Final residual = 7.89893e-05, No Iterations 3
GAMG:  Solving for p, Initial residual = 0.00105189, Final residual = 4.65768e-06, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.000162372, Final residual = 4.39752e-08, No Iterations 6
^Cmpirun: killing job...
It did not actually blow up after this time step, but I am sure it will: 4001 iterations for a single linear solve is far beyond anything reasonable.

Additional information:
model: two-fluid model
solver: tailor-made solver
fvSolution entry for TY:
Code:
    TY
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-17;
        relTol          0;
    }
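A side note on that tolerance: 1e-17 sits right at the edge of double-precision round-off, and in parallel the DILU preconditioner is assembled per processor, so the reachable final residual is not identical to the serial run. A more conventional entry, with values that are only illustrative and not from this case, would look something like:

Code:
    TY
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-9;   // reachable in double precision, unlike 1e-17
        relTol          0;
        maxIter         100;    // optional: fail fast instead of grinding through 4001 iterations
    }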
Any comments are appreciated. Thanks!

December 12, 2014, 14:24 | #2

Dongyue Li (sharonyue), Senior Member
Join Date: Jun 2012 | Location: Beijing, China | Posts: 849
I found some threads on this:

http://www.cfd-online.com/Forums/ope...tml#post246561

http://www.cfd-online.com/Forums/ope...terations.html

It looks like DILU-preconditioned PBiCG sometimes misbehaves in parallel. I think I should use GAMG instead.
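
If anyone wants to try the same switch, an entry along these lines is a starting point (the settings are typical tutorial values, not tuned for this case):

Code:
    TY
    {
        solver                 GAMG;
        smoother               GaussSeidel;
        tolerance              1e-9;
        relTol                 0;
        cacheAgglomeration     on;
        nCellsInCoarsestLevel  10;
        agglomerator           faceAreaPair;
        mergeLevels            1;
    }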
