Cumulative continuity error large in parallel simulations

October 27, 2010, 10:42   #1
Cumulative continuity error large in parallel simulations
Heng Xiao (Member; Zurich, Switzerland; joined Mar 2009)
Dear all,

It is not the first time people have reported getting different results from serial and parallel simulations with OpenFOAM. One immediate explanation is that preconditioning in parallel has some inherent non-determinism, since the result depends on the order of operations:
http://www.cfd-online.com/Forums/ope...llel-runs.html
Also, one has to admit that preconditioning and iterative linear solvers are not an easy task in parallel.

However, the problem I have now cannot be explained by anything I found in the forum, or by anything I can think of:

My cumulative continuity error grows large in the parallel simulations, even though the pressure equation converges fine! The serial simulations are fine.

Here is a summary of my solver and case:
Solver: pisoFoam (RANS with the LaunderSharmaKE model). I added a random forcing in certain regions as a source term in the momentum equation; without the forcing, everything is OK. (A minimal sketch of how the source enters the momentum predictor is shown right below.)
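
For reference, this is roughly how the forcing enters UEqn in my modified pisoFoam; the field name "forcing" and the way the forced region is masked are just placeholders here, not my exact code:

Code:
    // UEqn.H of pisoFoam, with an explicit volume source on the right-hand side.
    // "forcing" is a volVectorField that is non-zero only inside the forced region
    // (its values are re-randomized every time step).
    fvVectorMatrix UEqn
    (
        fvm::ddt(U)
      + fvm::div(phi, U)
      + turbulence->divDevReff(U)
     ==
        forcing
    );

    if (momentumPredictor)
    {
        solve(UEqn == -fvc::grad(p));
    }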

Case: periodic channel flow with a small hump in the middle (even with the plain Re_tau = 395 channel flow case the problem described below still exists, but it is less severe). The mesh is OK, with a maximum non-orthogonality of 30 degrees.

BC: walls on the top and bottom; cyclic BCs in the streamwise and spanwise directions.

fvSolution/fvSchemes: both GAMG and PCG/PBiCG have been tried. Pressure converges to a tolerance of 1e-8 (relTol = 0) for the final solve (pFinal), otherwise to 1e-6 (relTol 0.04). Velocity, k and epsilon converge to 1e-8 or smaller. (An excerpt of the dictionaries is given after this summary.)
nCorrectors = 2;
nNonOrthogonalCorrectors 1;
Time stepping: Euler
Spatial: limitedLinear for div. Linear for Laplacian.
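
For completeness, the relevant parts of the dictionaries look roughly like this; the smoother/agglomeration entries are the usual defaults, only the tolerances, correctors and schemes are exactly what I described above:

Code:
    // fvSolution (excerpt)
    solvers
    {
        p
        {
            solver          GAMG;
            tolerance       1e-06;
            relTol          0.04;
            smoother        GaussSeidel;
            nCellsInCoarsestLevel 10;
            agglomerator    faceAreaPair;
            cacheAgglomeration on;
            mergeLevels     1;
        }
        pFinal   // used for the last pressure solve only
        {
            solver          GAMG;
            tolerance       1e-08;
            relTol          0;
            smoother        GaussSeidel;
            nCellsInCoarsestLevel 10;
            agglomerator    faceAreaPair;
            cacheAgglomeration on;
            mergeLevels     1;
        }
        // U, k and epsilon use GAMG as well (as in the log below), with
        // tolerance 1e-08 and relTol 0; PCG/PBiCG was also tried.
    }

    PISO
    {
        nCorrectors              2;
        nNonOrthogonalCorrectors 1;
    }

    // fvSchemes (excerpt)
    ddtSchemes       { default Euler; }
    divSchemes       { default          none;
                       div(phi,U)       Gauss limitedLinearV 1;
                       div(phi,k)       Gauss limitedLinear 1;
                       div(phi,epsilon) Gauss limitedLinear 1; }
    laplacianSchemes { default Gauss linear corrected; }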

Here is the output for the last step:

Time = 8000

Courant Number mean: 0.00746705 max: 0.0869933
GAMG: Solving for Ux, Initial residual = 0.00035758, Final residual = 9.2098e-09, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.00168457, Final residual = 4.57093e-12, No Iterations 2
GAMG: Solving for Uz, Initial residual = 0.00175593, Final residual = 6.0287e-12, No Iterations 2
GAMG: Solving for p, Initial residual = 0.00536092, Final residual = 0.000108022, No Iterations 2
GAMG: Solving for p, Initial residual = 0.000290028, Final residual = 7.85811e-06, No Iterations 3
RAS time step continuity errors : sum local = 6.63938e-07, global = -6.63775e-07, cumulative = -0.0306293
GAMG: Solving for p, Initial residual = 6.99397e-05, Final residual = 2.91147e-06, No Iterations 2
GAMG: Solving for p, Initial residual = 1.1464e-05, Final residual = 8.45853e-09, No Iterations 10
time step continuity errors : sum local = 6.63775e-07, global = -6.63775e-07, cumulative = -0.03063
GAMG: Solving for epsilon, Initial residual = 0.00102844, Final residual = 2.50781e-12, No Iterations 2
GAMG: Solving for k, Initial residual = 0.00102089, Final residual = 2.54785e-12, No Iterations 2
RAS uncorrected Ubar = 0.020188 gradP = 1.27122e-05
ExecutionTime = 22810.4 s ClockTime = 24581 s

The time histories of the cumulative continuity error and of the (instantaneous) global continuity error are plotted in the attached figure. Note that the latter is scaled by 1e4 so that it appears on the same plot.
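
To be clear about what these numbers mean: the continuity-error lines in the log come from the standard continuityErrs.H include, which (paraphrasing) does roughly the following, so "cumulative" is simply a running sum of the signed per-step global error, and any same-sign bias grows linearly in time:

Code:
    // continuityErrs.H (standard OpenFOAM include, paraphrased)
    volScalarField contErr = fvc::div(phi);

    // volume-weighted average of |div(phi)| over the step -> "sum local"
    scalar sumLocalContErr = runTime.deltaT().value()
        *mag(contErr)().weightedAverage(mesh.V()).value();

    // signed volume-weighted average -> "global"
    scalar globalContErr = runTime.deltaT().value()
        *contErr.weightedAverage(mesh.V()).value();

    // running sum of the signed global error -> "cumulative"
    cumulativeContErr += globalContErr;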

The serial runs are normal: the cumulative error is of the same order as the instantaneous continuity error, or smaller, even with a large forcing in the momentum equation.

I suspect the cyclic boundary condition is the reason. The flux through those boundaries is not exactly zero but of the order of 1e-5, as I checked with patchIntegrate (the command I used is shown below). But why does the per-step continuity error in the parallel cases always have the same sign, so that the errors accumulate instead of cancelling? And why does this appear only in parallel? I don't understand.
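
This is how I checked the fluxes; the patch name is just an example, and phi is the face flux field written by the solver:

Code:
    # integrate the face flux phi over the cyclic patch, for each saved time
    patchIntegrate phi cyclicSides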

Any insights or suggestions are appreciated!


Best
Heng
Attached: mass-loss.png (time history of cumulative and instantaneous global continuity errors)

February 23, 2011, 09:04   #2
same situation
Chiku (iznal, New Member; joined Sep 2010)
Hi,

I have the same problem as yours. Have you tried simulating the whole system instead? Does the problem really come from the cyclicGgi faces?

Thanks

February 23, 2011, 12:22   #3
Heng Xiao (Member; Zurich, Switzerland)
The problem is due to the combination of cyclic faces, parallel computing, and the forcing term in my momentum equation. When I run in serial, everything is fine; when I remove the forcing term, everything is also fine. It is the combination of all three (cyclic + parallel + forcing) that leads to this problem.

Heng

March 7, 2011, 05:13   #4
one more question
Chiku (iznal, New Member)
Quote:
Originally Posted by xiao
The problem is due to the combination of cyclic faces, parallel computing, and the forcing term in my momentum equation. When I run in serial, everything is fine; when I remove the forcing term, everything is also fine. It is the combination of all three (cyclic + parallel + forcing) that leads to this problem.

Heng
Thanks for the quick reply. I am still confused. Does OpenFOAM offer any way to deal with this problem? I am using OpenFOAM 1.5 right now; have you tried the newer versions? Could their solvers be better than the ones in OpenFOAM 1.5?
