|
Cumulative continuity error large in parallel simulations |
|
October 27, 2010, 10:42 |
Cumulative continuity error large in parallel simulations
|
#1 |
Member
Heng Xiao
Join Date: Mar 2009
Location: Zurich, Switzerland
Posts: 58
Rep Power: 17 |
Dear all,
It is not the first time people have reported getting different results from serial and parallel simulations with OpenFOAM. One immediate explanation is that the preconditioning in parallel does have some randomness, depending on the order of operations: http://www.cfd-online.com/Forums/ope...llel-runs.html Also, one has to admit that preconditioning and iterative linear solvers are no easy task in parallel. However, the problem I have now cannot be explained by anything I found in the forum or anything I can imagine: my cumulative continuity error is large in the parallel simulations, even though the pressure equation converges fine. Serial simulations are fine.

Here is a summary of my solver and case:

Solver: pisoFoam (RANS + LaunderSharma k-epsilon model). I added a random forcing in certain regions as a source term in the momentum equation; without the forcing, everything was OK (see the sketch just after this post for the kind of source term I mean).
Case: periodic channel flow with a small hump in the middle (even with the simple Re395 channel flow case the problem still exists, but it is less severe). The mesh is OK, with a maximum non-orthogonality of 30 degrees.
BC: walls on the top and bottom; cyclic BCs in the streamwise and spanwise directions.
fvSolution/fvSchemes: both GAMG and PCG/PBiCG have been tried. Pressure converges to a tolerance of 1E-8 (relTol 0) for the final solve (pFinal), otherwise to 1E-6 (relTol 0.04). Velocity, k and epsilon converge to 1E-8 or smaller. nCorrectors 2; nNonOrthogonalCorrectors 1.
Time stepping: Euler.
Spatial schemes: limitedLinear for divergence, linear for the Laplacian.

Here is the output for the last step:

Time = 8000
Courant Number mean: 0.00746705 max: 0.0869933
GAMG: Solving for Ux, Initial residual = 0.00035758, Final residual = 9.2098e-09, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.00168457, Final residual = 4.57093e-12, No Iterations 2
GAMG: Solving for Uz, Initial residual = 0.00175593, Final residual = 6.0287e-12, No Iterations 2
GAMG: Solving for p, Initial residual = 0.00536092, Final residual = 0.000108022, No Iterations 2
GAMG: Solving for p, Initial residual = 0.000290028, Final residual = 7.85811e-06, No Iterations 3
RAS time step continuity errors : sum local = 6.63938e-07, global = -6.63775e-07, cumulative = -0.0306293
GAMG: Solving for p, Initial residual = 6.99397e-05, Final residual = 2.91147e-06, No Iterations 2
GAMG: Solving for p, Initial residual = 1.1464e-05, Final residual = 8.45853e-09, No Iterations 10
time step continuity errors : sum local = 6.63775e-07, global = -6.63775e-07, cumulative = -0.03063
GAMG: Solving for epsilon, Initial residual = 0.00102844, Final residual = 2.50781e-12, No Iterations 2
GAMG: Solving for k, Initial residual = 0.00102089, Final residual = 2.54785e-12, No Iterations 2
RAS uncorrected Ubar = 0.020188 gradP = 1.27122e-05
ExecutionTime = 22810.4 s  ClockTime = 24581 s

The time histories of the cumulative continuity error and of the (instantaneous) global continuity error are plotted and attached; note that the latter is scaled by 1E4 so that both appear on the same plot. The serial runs are normal, with the cumulative error of the same order as the instantaneous continuity error or smaller, even with a large forcing in the momentum equation.

I think the cyclic boundary conditions may be the reason. The flux through those boundaries is not exactly zero, but of the order of 1E-5, as I checked with patchIntegrate. But why do the continuity errors in the parallel cases all have the same sign, so that they accumulate instead of cancelling each other out? And why does this only appear in parallel? I don't understand.
Any insights or suggestions are appreciated!

Best,
Heng
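For readers who want to see how such a forcing can enter the solver, here is a minimal sketch of a random body force added to the pisoFoam momentum predictor. This is not the code used in this thread: the field name bodyForce, the amplitude forcingAmplitude, the seed, and the box used as the forcing region are all assumed purely for illustration.

Code:
// Minimal sketch (not the poster's code): an explicit random body force,
// active only in an assumed region, added to the pisoFoam momentum predictor.
// Assumes the standard pisoFoam objects: mesh, runTime, U, phi, p, turbulence.

// Per-cell acceleration field holding the forcing (zero outside the region)
volVectorField bodyForce
(
    IOobject
    (
        "bodyForce",
        runTime.timeName(),
        mesh,
        IOobject::NO_READ,
        IOobject::AUTO_WRITE
    ),
    mesh,
    dimensionedVector("bodyForce", dimAcceleration, vector::zero)
);

Random rndGen(label(0));              // per-process random number generator
const scalar forcingAmplitude = 0.1;  // assumed amplitude [m/s^2]

forAll(bodyForce, cellI)
{
    const vector& x = mesh.C()[cellI];

    // hypothetical forcing region: a box in x (the bounds are assumptions)
    if (x.x() > 1.0 && x.x() < 2.0)
    {
        // uniformly distributed random vector in [-1,1]^3, scaled
        bodyForce[cellI] =
            forcingAmplitude*(2.0*rndGen.vector01() - vector::one);
    }
}

// Momentum predictor with the explicit source alongside the pressure gradient
fvVectorMatrix UEqn
(
    fvm::ddt(U)
  + fvm::div(phi, U)
  + turbulence->divDevReff(U)
);

solve(UEqn == -fvc::grad(p) + bodyForce);

One thing worth keeping in mind with a sketch like this: a per-process random generator produces a different sequence on each processor, so the realised forcing field depends on the domain decomposition, and serial and parallel runs are forced differently even with the same seed.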
|
February 23, 2011, 09:04 |
same situation
|
#2 |
New Member
Chiku
Join Date: Sep 2010
Posts: 12
Rep Power: 16 |
Hi,
I have the same problem as yours. Have you tried simulating the whole system? Does this problem really come from the cyclicGgi faces? Thanks.
|
February 23, 2011, 12:22 |
|
#3 |
Member
Heng Xiao
Join Date: Mar 2009
Location: Zurich, Switzerland
Posts: 58
Rep Power: 17 |
The problem is due to the cyclic faces, parallel computing, plus the forcing term in my momentum equation. When I do a serial computation, everything is fine; when I don't have the forcing term, everything is fine. It is the combination of all of these (cyclic + parallel computing + forcing) that leads to the problem.

Heng
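As a side note on the flux check mentioned in the first post, here is a minimal sketch (again, not code from this thread) of a parallel-safe way to monitor the net flux through a named patch from inside the solver. gSum performs a global reduction across processors, so serial and decomposed runs report the same number; the patch name "sides" and the idea of calling this after the pressure correction are assumptions.

Code:
// Sketch: report the net volumetric flux through a named patch, reduced over
// all processors so that serial and parallel runs agree.
// Assumes the usual incompressible-solver objects: mesh (fvMesh) and
// phi (surfaceScalarField, in m3/s).
void reportPatchFlux
(
    const fvMesh& mesh,
    const surfaceScalarField& phi,
    const word& patchName
)
{
    const label patchI = mesh.boundaryMesh().findPatchID(patchName);

    if (patchI >= 0)
    {
        // gSum is a global (cross-processor) sum, unlike a plain sum()
        const scalar netFlux = gSum(phi.boundaryField()[patchI]);

        Info<< "Net flux through patch " << patchName
            << " = " << netFlux << " [m3/s]" << endl;
    }
}

A call such as reportPatchFlux(mesh, phi, "sides") inside the time loop, after the PISO corrector, would then log the imbalance every time step; over a cyclic patch the two halves should essentially cancel, so a clearly non-zero value indicates the kind of imbalance discussed above.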
|
March 7, 2011, 05:13 |
one more question
|
#4
New Member
Chiku
Join Date: Sep 2010
Posts: 12
Rep Power: 16 |