December 2, 2013, 05:54
Help to make LES simulation faster
#1
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
Hi everybody,
I'm trying to run a wind tunnel simulation with an LES turbulence model and pisoFoam. The domain is made of 17M cells: a channel with roughness blocks in the first 3/4 of its length and a building model in the last quarter. I'm currently solving with a time step of 0.0001 s (10,000 Hz) to keep maxCo below one; right now maxCo is ~0.88. The problem is that meanCo is only about 0.007, and the simulation is running very slowly: with 252 CPUs it takes roughly 2.9 s per iteration, i.e. one day of computation for 1.25 s of simulated time. Since I need to simulate at least 2-3 minutes and I can't use more CPUs, what can I do?
I was thinking of increasing the time step, since meanCo is way smaller than 1, but this would push maxCo above 1 at some points. If those points are in the roughness region (where I don't really care if the solution is wrong in some small portions), do you think the solution will still converge? How can I know "where" the Courant number is bigger than 1? Do you have other ideas besides increasing the time step? Maybe use pimpleFoam?
Thanks a lot. Regards, Luca
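[Editor's note on the "where is Co > 1" question: OpenFOAM can write the Courant number as a volume field that you can threshold in ParaView. A minimal controlDict sketch, assuming a reasonably recent version; the entry names changed between releases (older ones use outputControl and functionObjectLibs), so check your version's documentation:]
Code:
functions
{
    CourantNo1
    {
        type            CourantNo;                      // writes a volScalarField "Co"
        libs            ("libfieldFunctionObjects.so");
        writeControl    writeTime;                      // dump together with the other fields
    }
}
[Thresholding the resulting Co field at 1.0 then shows exactly which cells limit the time step.]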
December 2, 2013, 06:34
#2
Senior Member
Bernhard
Join Date: Sep 2009
Location: Delft
Posts: 790
Rep Power: 22
It is a bit difficult to give any advice. Could you maybe share your fvSchemes and a snippet of the log file?
December 2, 2013, 08:22
#3
Senior Member
Join Date: Dec 2011
Posts: 111
Rep Power: 20
A small time step and high spatial resolution are part of the nature of LES; they are unavoidable. You can of course adjust the mesh in problematic areas, and perhaps end up with a slightly larger time step, but in the end you cannot overcome the fact that LES is computationally intensive by nature. If you can only afford a RANS simulation, stick to that. Perhaps you can use a RANS model to generate a physically sane initial condition and save some time?
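[Editor's note: a minimal sketch of that RANS-initialisation workflow, assuming the steady RANS case lives in ../ransCase with the same geometry; the directory names are placeholders, and mapFields options vary slightly between versions (check mapFields -help):]
Code:
# run a cheap steady RANS (e.g. simpleFoam) in ../ransCase first,
# then map its latest fields onto the LES mesh and start the LES
cd lesCase
mapFields ../ransCase -sourceTime latestTime
pisoFoam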
Last edited by haakon; December 2, 2013 at 08:23. Reason: Typo
December 2, 2013, 08:55
#4
Senior Member
Lieven
Join Date: Dec 2011
Location: Leuven, Belgium
Posts: 299
Rep Power: 23
I'm with haakon on this one. LES is by definition expensive, so don't expect to be able to do it cheaply; if you find a way, it will probably make you a very rich man ;-). I would even recommend reducing your time step further so that maxCo is also significantly smaller than 1.0, for time-integration accuracy. I would not switch to pimpleFoam, since under-relaxation is quite unphysical in an LES context.
As I see it, there are two things you can do:
1. Adjust your mesh. If you can't afford a simulation with 17M cells, just don't do it. A converged solution on a 4M-cell mesh is probably better than a halfway-converged solution on 17M cells.
2. Use a cheaper LES turbulence model if you can; e.g. dynamic models are more expensive to compute than the classical Smagorinsky model (see the selection sketch after this post).
Using a RANS model to compute an initial field could help speed up convergence, but this is certainly not guaranteed.
Cheers, L
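[Editor's note: for reference, in the 2013-era OpenFOAM versions used here the SGS model is selected in constant/LESProperties; a minimal sketch of the classical, non-dynamic choice (in newer versions this moved to constant/turbulenceProperties and the entry names differ):]
Code:
LESModel        Smagorinsky;    // classical model; dynamic variants cost more per step
delta           cubeRootVol;
printCoeffs     on;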
December 2, 2013, 09:34
#5
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
Hi Bernhard. Thank you for reading.
Here is my fvSchemes:
Code:
ddtSchemes
{
    default         backward;
}

gradSchemes
{
    default         Gauss linear;
    grad(p)         Gauss linear;
    grad(U)         Gauss linear;
}

divSchemes
{
    default         none;
    div(phi,U)      Gauss linear;
    div(phi,k)      Gauss limitedLinear 1;
    div(phi,B)      Gauss limitedLinear 1;
    div(phi,nuTilda) Gauss limitedLinear 1;
    div(B)          Gauss linear;
    div((nuEff*dev(T(grad(U))))) Gauss linear;
}

laplacianSchemes
{
    default         none;
    laplacian(nuEff,U) Gauss linear corrected;
    laplacian((1|A(U)),p) Gauss linear corrected;
    laplacian(DkEff,k) Gauss linear corrected;
    laplacian(DBEff,B) Gauss linear corrected;
    laplacian(DnuTildaEff,nuTilda) Gauss linear corrected;
}

interpolationSchemes
{
    default         linear;
    interpolate(U)  linear;
}

snGradSchemes
{
    default         corrected;
}

fluxRequired
{
    default         no;
    p               ;
}
Code:
Time = 1.2372

Courant Number mean: 0.00771093 max: 0.898045
DILUPBiCG:  Solving for Ux, Initial residual = 4.38996e-05, Final residual = 8.81599e-09, No Iterations 1
DILUPBiCG:  Solving for Uy, Initial residual = 0.0012287, Final residual = 2.04545e-07, No Iterations 1
DILUPBiCG:  Solving for Uz, Initial residual = 0.000869898, Final residual = 1.5147e-07, No Iterations 1
DICPCG:  Solving for p, Initial residual = 0.00549978, Final residual = 0.000262121, No Iterations 3
time step continuity errors : sum local = 2.52331e-10, global = -5.10701e-14, cumulative = 2.4203e-09
DICPCG:  Solving for p, Initial residual = 0.000364781, Final residual = 9.84024e-07, No Iterations 97
time step continuity errors : sum local = 9.47303e-13, global = -4.74805e-14, cumulative = 2.42025e-09
ExecutionTime = 84873 s  ClockTime = 85168 s

forceCoeffs output:
    Cm    = 2.37554
    Cd    = 39.8978
    Cl    = 64.6208
    Cl(f) = 34.6859
    Cl(r) = 29.9349

Time = 1.23725

Courant Number mean: 0.00771094 max: 0.89373
DILUPBiCG:  Solving for Ux, Initial residual = 4.39007e-05, Final residual = 8.81516e-09, No Iterations 1
DILUPBiCG:  Solving for Uy, Initial residual = 0.00122869, Final residual = 2.04562e-07, No Iterations 1
DILUPBiCG:  Solving for Uz, Initial residual = 0.0008699, Final residual = 1.51441e-07, No Iterations 1
DICPCG:  Solving for p, Initial residual = 0.0055296, Final residual = 0.000264223, No Iterations 3
time step continuity errors : sum local = 2.54366e-10, global = -5.4272e-14, cumulative = 2.4202e-09
DICPCG:  Solving for p, Initial residual = 0.000368202, Final residual = 9.85297e-07, No Iterations 115
time step continuity errors : sum local = 9.48546e-13, global = -5.09298e-14, cumulative = 2.42015e-09
ExecutionTime = 84875.5 s  ClockTime = 85170 s

forceCoeffs output:
    Cm    = 2.38226
    Cd    = 39.8857
    Cl    = 64.572
    Cl(f) = 34.6683
    Cl(r) = 29.9038

Time = 1.2373

Courant Number mean: 0.00771095 max: 0.889406
DILUPBiCG:  Solving for Ux, Initial residual = 4.39019e-05, Final residual = 8.81608e-09, No Iterations 1
DILUPBiCG:  Solving for Uy, Initial residual = 0.00122868, Final residual = 2.04589e-07, No Iterations 1
DILUPBiCG:  Solving for Uz, Initial residual = 0.000869913, Final residual = 1.51427e-07, No Iterations 1
DICPCG:  Solving for p, Initial residual = 0.00552333, Final residual = 0.000261467, No Iterations 3
time step continuity errors : sum local = 2.51721e-10, global = -5.30145e-14, cumulative = 2.4201e-09
DICPCG:  Solving for p, Initial residual = 0.00036565, Final residual = 9.99106e-07, No Iterations 94
time step continuity errors : sum local = 9.61889e-13, global = -5.46456e-14, cumulative = 2.42004e-09
ExecutionTime = 84877.9 s  ClockTime = 85172 s

forceCoeffs output:
    Cm    = 2.36583
    Cd    = 39.8604
    Cl    = 64.5751
    Cl(f) = 34.6534
    Cl(r) = 29.9217
Thank you very much
December 2, 2013, 10:01
#6
Senior Member
Bernhard
Join Date: Sep 2009
Location: Delft
Posts: 790
Rep Power: 22
Where I said fvSchemes I meant fvSolution, excuse me. Did you ever try to solve the pressure equation with a GAMG method? On 100-200 CPUs it is generally said to be more efficient at solving the pressure equation than PCG. You are using more processors, but it might be worth experimenting a bit with these settings.
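[Editor's note: a minimal sketch of such a GAMG entry for the p solver in fvSolution; the agglomeration settings below are common defaults, not tuned values, and are worth experimenting with:]
Code:
p
{
    solver                  GAMG;
    smoother                GaussSeidel;
    tolerance               1e-06;
    relTol                  0.05;
    nCellsInCoarsestLevel   100;            // stop coarsening around this many cells
    agglomerator            faceAreaPair;
    mergeLevels             1;
    cacheAgglomeration      true;
}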
December 2, 2013, 10:15
#7
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
I know LES is computationally expensive, and I don't expect to make it quick and easy. I was just asking if there is a way to make it a little faster: with some improvements I might manage to solve the case two or four times faster, which would be a big gain even if the computation remains very expensive.
I will try to initialise the case with a RANS solution or with a coarser mesh, and I will also switch to GAMG and see if I find any improvement. As for the LES model, I already use Smagorinsky, so I can't use a cheaper one. Lastly, many papers suggest that RANS is not very good in my field of study (wind flow around a low-rise building), so I can't switch to that.
PS: this is my fvSolution:
Code:
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-06;
        relTol          0.05;
    }

    pFinal
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-06;
        relTol          0;
    }

    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-05;
        relTol          0;
    }

    k
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-05;
        relTol          0;
    }

    nuTilda
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-05;
        relTol          0;
    }
}

PISO
{
    nCorrectors     2;
    nNonOrthogonalCorrectors 0;
    pRefCell        0;
    pRefValue       0;
}
December 2, 2013, 11:40
#8
Senior Member
Lieven
Join Date: Dec 2011
Location: Leuven, Belgium
Posts: 299
Rep Power: 23
Hi Luca,
Two remarks about the fvSolution:
1. You should set the relTol of p to 0 as well.
2. Are you sure nNonOrthogonalCorrectors = 0 is enough for your mesh? That is fine for a fully orthogonal mesh, but if yours isn't, 0 is most likely too little...
Cheers, L
December 3, 2013, 03:07
#9
Senior Member
Bernhard
Join Date: Sep 2009
Location: Delft
Posts: 790
Rep Power: 22
Note that the relTol of 0.05 only applies to the intermediate pressure solves; the pFinal entry already has relTol 0, so the final corrector is solved down to the absolute tolerance.
December 3, 2013, 04:41
#10
Senior Member
Lieven
Join Date: Dec 2011
Location: Leuven, Belgium
Posts: 299
Rep Power: 23
Oh sorry, you're fully correct, Bernhard!
I wasn't paying attention to the pFinal entry :-D (the solver I'm using doesn't have it). Cheers, Lieven
December 3, 2013, 06:03
#11
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
Thanks everybody for your kind help.
I'm now running some tests on a smaller 2M-cell case to benchmark the solutions you proposed. I know that LES is expensive; I'm not asking to run it on my laptop in a few hours. But in my lab we have already run 15-20M cell cases that solved about 10-15 seconds of physical time per day, so in a week we had our 1-2 minute simulation. I could accept running this simulation in 2 weeks, but not in 8. The problem with this case is that I have some very small cells near the model, where the Co number is 100 times bigger than everywhere else. To keep Co < 1 there, I have to solve with a time step 100 times smaller, and this slows the whole simulation down a lot. Making a coarser mesh is of course an option, but it's not good for our purpose, so I was looking for a better solution if one exists.
Thank you very much. I will post the results of the benchmark as soon as I have them.
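[Editor's note: one hedged option for exactly this few-small-cells problem is pimpleFoam's adjustable time step, driven by a maxCo target, so the solver always runs at the largest stable dt instead of a fixed worst-case one; running pimpleFoam with nOuterCorrectors 1 and no relaxation reduces it to plain PISO, which sidesteps the under-relaxation concern raised above. A minimal controlDict sketch, with placeholder values:]
Code:
adjustTimeStep  yes;
maxCo           0.8;      // target cap on the maximum Courant number
maxDeltaT       1e-3;     // never exceed this time step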
December 4, 2013, 05:27
#12
Senior Member
Philipp
Join Date: Jun 2011
Location: Germany
Posts: 1,297
Rep Power: 27
Hi, you mentioned that you use LES because RANS doesn't work. Did you try the SAS (Scale-Adaptive Simulation) model?
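[Editor's note: OpenFOAM of that era ships kOmegaSSTSAS as an incompressible LES-type model, assuming it is available in your build; a minimal LESProperties selection sketch:]
Code:
LESModel        kOmegaSSTSAS;
delta           cubeRootVol;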
__________________
The skeleton ran out of shampoo in the shower. |
December 9, 2019, 07:33
#13
New Member
Praharsha Reddy
Join Date: Dec 2019
Posts: 14
Rep Power: 7
How can we convert an LES simulation to a RANS simulation?
December 10, 2019, 03:38
#14
Senior Member
Ruiyan Chen
Join Date: Jul 2016
Location: Hangzhou, China
Posts: 162
Rep Power: 10
If other people have used the same number of cells as you and their cases run way faster than yours, I don't think that means much; it really depends on the details of the mesh. You mentioned very small cells near the model (I suppose you mean the walls of a building or something similar): do you really need that refined a mesh there? What is the yPlus value? Obviously in LES you want a very refined mesh, especially near the wall, but if the wall regions are not that important, maybe a wall model is enough. Typically, walls are crucial for internal flows but may not be that important for external flows. My experience is that the Spalding wall model (nutUSpaldingWallFunction) works really well.
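[Editor's note: a minimal sketch of that wall treatment in 0/nut; the patch name "building" and the initial value are placeholders:]
Code:
building
{
    type            nutUSpaldingWallFunction;
    value           uniform 0;   // placeholder; the wall function overwrites it
}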
I always initiate my LES from an unsteady RANS simulation. For the unsteady RANS I use very large time steps, and because of this I sometimes have to switch the time scheme to first order (otherwise it easily diverges). I'm not sure that's the correct way of doing things, though. Note that sometimes this makes your case run faster and sometimes it does not; again, it depends on the actual problem.
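[Editor's note: for reference, the first-order time scheme mentioned here is selected in fvSchemes like this:]
Code:
ddtSchemes
{
    default         Euler;   // first-order implicit; robust but more dissipative than backward
}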
December 10, 2019, 05:13
#15
Senior Member
Santiago Lopez Castano
Join Date: Nov 2012
Posts: 354
Rep Power: 16
Last edited by Santiago; December 10, 2019 at 06:11. Reason: typo