|
Speedup with GAMG for simpleFoam forward step
|
June 11, 2007, 19:33 |
#1 |
Guest
Posts: n/a
|
Hey all,
I'm currently working on the forward step tutorial case of simpleFoam. For a comparison of the calculation time with CFX, I set the pressure solver to GAMG so that both codes use approximately the same solver. But instead of a speedup, the calculations take more time than with the default PCG/PBiCG solvers. I used the following settings:

    p GAMG
    {
        tolerance               1e-08;
        relTol                  0;
        smoother                GaussSeidel;
        cacheAgglomeration      true;
        nCellsInCoarsestLevel   10;
        agglomerator            faceAreaPair;
        mergeLevels             1;
    };

I tried varying nCellsInCoarsestLevel, and 10 was the best setting for this case (I tried meshes of several densities). Do you have any suggestions why the time is worse, or how I could improve it? By the way, does it make sense to use a preconditioner? In the tutorial case they don't use one.
Cheers, Florian
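For context on the preconditioner question: in OpenFOAM the Krylov solvers (PCG/PBiCG) take a preconditioner entry, while GAMG takes a smoother instead, so the two dictionaries are not directly interchangeable. A minimal sketch of what the default baseline entries typically looked like in the 1.4-era fvSolution syntax is shown below; the exact tolerances and the tutorial's actual choices may differ, so treat this as an assumed reference point, not the tutorial file itself:

    solvers
    {
        // symmetric pressure matrix: PCG with a DIC preconditioner (assumed baseline)
        p PCG
        {
            preconditioner   DIC;
            tolerance        1e-06;
            relTol           0;
        };

        // asymmetric velocity matrix: PBiCG with a DILU preconditioner (assumed baseline)
        U PBiCG
        {
            preconditioner   DILU;
            tolerance        1e-05;
            relTol           0;
        };
    }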
|
June 12, 2007, 03:25 |
#2 |
Senior Member
Srinath Madhavan (a.k.a pUl|)
Join Date: Mar 2009
Location: Edmonton, AB, Canada
Posts: 703
Rep Power: 21 |
Why do you need tolerance 1e-08? Won't 1e-06 do? How bad is your mesh? Do you use non-conformal faces (i.e. hanging nodes)? AMG will give very good results provided the discretization is decent and the case is fairly large (1 million plus cells is good). You can try to reduce the iteration count by using the DICGaussSeidel smoother; however, it will increase your computational expense a bit.
Personally, I have found great improvement with the multigrid solver. It beats the pants out of ICCG for my vortex shedding case; up to three times as fast!
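Trying the suggested smoother only means changing the smoother entry in the GAMG dictionary from the first post; a minimal sketch, keeping the other settings as posted:

    p GAMG
    {
        tolerance               1e-08;
        relTol                  0;
        smoother                DICGaussSeidel;  // DIC smoothing followed by Gauss-Seidel sweeps
        cacheAgglomeration      true;
        nCellsInCoarsestLevel   10;
        agglomerator            faceAreaPair;
        mergeLevels             1;
    };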
|
June 12, 2007, 04:31 |
#3 |
Guest
Posts: n/a
|
The setting is just for a time measurement, so the tolerance doesn't matter as long as I use the same value everywhere. Well, I didn't change the forward step tutorial case itself; I just increased the number of elements in all directions, doubling it every time. So there should be no hanging nodes; it's a fairly simple mesh. What do you mean by decent discretization, the discretization methods?
But I'll try your recommendations and increase the mesh density! Thanks a lot! Florian
|
June 12, 2007, 04:35 |
#4 |
Senior Member
Srinath Madhavan (a.k.a pUl|)
Join Date: Mar 2009
Location: Edmonton, AB, Canada
Posts: 703
Rep Power: 21 |
Apologies, I should have elaborated. By decent discretization, I refer to purely orthogonal grids with zero skewness. Check out this thread[1] for more details.
[1] http://www.cfd-online.com/OpenFOAM_D...es/1/4094.html |
|
June 12, 2007, 06:22 |
#5 |
Senior Member
Srinath Madhavan (a.k.a pUl|)
Join Date: Mar 2009
Location: Edmonton, AB, Canada
Posts: 703
Rep Power: 21 |
Here[1] is a recent OpenFOAM presentation on the multi-grid solver.
Check out slide 29, 'Computational examples', to see what I meant earlier.
[1] http://powerlab.fsb.hr/ped/kturbo/Op...nadaPrecon.pdf
|
June 24, 2007, 11:09 |
#6 |
Guest
Posts: n/a
|
Thanks a lot for the link to the presentation!
I've now run several simulations with up to 2 million elements. But I realized that when increasing the number of elements I also have to change nCellsInCoarsestLevel, because otherwise the time for GAMG is even higher than with other solvers like PCG. Is there an approximate formula that tells me the best setting of nCellsInCoarsestLevel for a given number of cells? Something like nCellsInCoarsestLevel ~ C * nCells, with C being a constant (e.g. 0.1)?
Regards
|
June 24, 2007, 12:26 |
#7 |
Senior Member
Srinath Madhavan (a.k.a pUl|)
Join Date: Mar 2009
Location: Edmonton, AB, Canada
Posts: 703
Rep Power: 21 |
Check the older posts in the forum[1,2]. If I recall correctly, the recommendation was anything from a dozen to a couple hundred cells in serial mode and around 20-30 cells in parallel mode. This is of course the recommendation for OpenFOAM 1.3 (i.e. the AMG solver); I am not sure whether it translates directly to the GAMG solver in OpenFOAM 1.4. On second thought, however, I think you are doing the right thing, i.e. experimenting with different values.
[1] http://www.cfd-online.com/OpenFOAM_D...tml?1162930507
[2] http://www.cfd-online.com/OpenFOAM_D...tml?1172129883
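Following that rule of thumb, only the coarsest-level size in the p GAMG dictionary from the first post would change; a sketch of the single entry, with the quoted ranges kept as comments rather than hard recommendations:

    nCellsInCoarsestLevel   100;    // serial runs: roughly a dozen up to a couple hundred cells
    // nCellsInCoarsestLevel 20;    // parallel runs: around 20-30 cells (old AMG advice; may not carry over to GAMG)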
|
June 24, 2007, 12:30 |
#8 |
Senior Member
Srinath Madhavan (a.k.a pUl|)
Join Date: Mar 2009
Location: Edmonton, AB, Canada
Posts: 703
Rep Power: 21 |
Oh, and by the way, if you use a uniformly spaced purely orthogonal grid, I've noticed that changing
mergeLevels 1; to mergeLevels 2; or even mergeLevels 3; speeds up the GAMG solver. |
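That change again affects a single entry in the p GAMG dictionary shown earlier; a sketch, assuming the rest of the settings stay as originally posted:

    mergeLevels   2;   // agglomerate two levels per step; 1 is the conservative default, 3 may help on very regular meshes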
|
June 24, 2007, 12:58 |
#9 |
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
How do these changes affect the final result, if they do at all? What about the stability of the solution?
Regards, A.
__________________
Alberto Passalacqua
GeekoCFD - A free distribution based on openSUSE 64 bit with CFD tools, including OpenFOAM. Available in both physical and virtual formats (current status: http://albertopassalacqua.com/?p=1541)
OpenQBMM - An open-source implementation of quadrature-based moment methods.
To obtain more accurate answers, please specify the version of OpenFOAM you are using.
|
June 24, 2007, 22:44 |
#10 |
Senior Member
Srinath Madhavan (a.k.a pUl|)
Join Date: Mar 2009
Location: Edmonton, AB, Canada
Posts: 703
Rep Power: 21 |
To my knowledge, multigrid is merely a solution approach; deep down it also uses conventional solvers such as the conjugate gradient method. So I don't quite see why the final result would be affected, unless you change tolerance and/or relTol to other values. As for the stability of the numerical solution, it is primarily affected by the choice of numerical schemes used for the spatial/temporal discretization.
Someone please correct me if I'm wrong! |
|