|
January 19, 2012, 13:46 |
pisoFoam pressure issue
|
#1 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
Hello everyone! I have been trying to simulate a scalar using pisoFoam, which is an incompressible transient solver. I modified the solver to solve for the scalar equation and it runs fine.
My problem is that the pressure solve takes 500 to 600 iterations, which slows the run-time tremendously. I am using a delta t of 1e-6, which keeps the run stable at a Courant number of about 2.5. My mesh is about 1.3 million cells and I am running in parallel on 128 processors, but even so each time step takes 10 seconds. For the initial conditions I set the pressure internal field to 0, just like in the tutorials; I changed this to atmospheric pressure and nothing really changed. I am open to any suggestions and I appreciate your help in advance. |
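For context, a passive-scalar transport equation added to an incompressible solver like pisoFoam usually takes roughly the form sketched below; the field name Zmix, the diffusivity DT and the placement after the PISO loop are assumptions here, since the actual modification was not posted.
Code:
// Sketch of a passive-scalar equation solved once per time step after the
// PISO loop in pisoFoam.C; Zmix and DT are assumed to be created in
// createFields.H (illustrative names, not the poster's actual code).
fvScalarMatrix ZmixEqn
(
    fvm::ddt(Zmix)
  + fvm::div(phi, Zmix)
  - fvm::laplacian(DT, Zmix)
);
ZmixEqn.solve();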
|
January 20, 2012, 03:34 |
|
#2 |
Senior Member
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Rep Power: 30 |
A Courant number of 2.5 is too large for a PISO solver - it should be smaller than 1. And you forgot to mention something important: which linear solver are you using?
__________________
*On twitter @akidTwit *Spend as much time formulating your questions as you expect people to spend on their answer. |
|
January 20, 2012, 12:08 |
|
#3 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
For pressure I am using PCG and for velocity PBiCG. I am also using the backward implicit scheme for the time derivative. I had originally set the Courant number to 2.5 because it gave me the largest attainable time step (1e-6), since I was trying to get this to run a bit faster. So I guess I have to bring the Courant number back down to 1 and take the 1e-7 hit?
Thanks for your reply, by the way. |
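For readers following along, the setup described here would typically correspond to fvSolution/fvSchemes fragments like the ones below; the tolerances and preconditioners are assumptions, since the actual files are only posted later in the thread.
Code:
// fvSolution fragment (sketch)
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-06;
        relTol          0.05;
    }
    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-06;
        relTol          0;
    }
}

// fvSchemes fragment (sketch)
ddtSchemes
{
    default         backward;
}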
|
January 20, 2012, 13:02 |
|
#4 |
Senior Member
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Rep Power: 30 |
Either you keep the Courant number below 1, or you switch to pimpleFoam. Keep in mind that even though your time step will get smaller, you'll also end up doing less work per time step. You can probably also speed things up by using GAMG to solve the pressure correction.
__________________
*On twitter @akidTwit *Spend as much time formulating your questions as you expect people to spend on their answer. |
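A GAMG entry for the pressure equation along the lines suggested above often looks like the following; this is a generic sketch with commonly used settings, not a recommendation tuned to this particular case.
Code:
// fvSolution fragment (sketch): GAMG for the pressure correction
p
{
    solver                 GAMG;
    smoother               GaussSeidel;
    agglomerator           faceAreaPair;
    nCellsInCoarsestLevel  100;
    cacheAgglomeration     on;
    mergeLevels            1;
    tolerance              1e-06;
    relTol                 0.05;
}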
|
January 23, 2012, 03:00 |
|
#5 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
Thanks for your suggestion! In pisoFoam I decreased my delta t to get a Co of about 0.6 and switched the linear solver for p to GAMG. This shaved about 2 seconds off each time step. I also tried pimpleFoam with a max Co of 2 and GAMG, and I get the same time per step as pisoFoam, but my delta t is an order of magnitude larger, which would make the overall run faster. I guess pimpleFoam at 8 seconds per step is the fastest I can do.
My second question: if I refine my mesh, do you think the time performance could improve? It seems counter-intuitive, since even though I am adding more points, which might make each calculation easier, there are also more calculations to be done. My third question: since this is a transient LES problem, I've read in various threads that I should not start from a zero initial condition but from a steady-state one computed with simplesrfFoam. Is this true? Sorry for all the questions, but you've been a lot of help. Thanks once again. |
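Running pimpleFoam at a prescribed maximum Courant number is normally done with adjustable time stepping in controlDict; the fragment below is a generic sketch using the maxCo value mentioned above (the maxDeltaT cap is an illustrative assumption), not the poster's actual file.
Code:
// controlDict fragment (sketch)
adjustTimeStep  yes;
maxCo           2;
maxDeltaT       1e-05;  // illustrative upper bound on the time step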
|
January 23, 2012, 09:46 |
|
#6 |
Senior Member
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Rep Power: 30 |
2nd question - no, I can't imagine mesh refinement doing anything but increasing the computational time.
3rd question - I don't know anything about LES. Hopefully someone else will give an answer.
__________________
*On twitter @akidTwit *Spend as much time formulating your questions as you expect people to spend on their answer. |
|
January 24, 2012, 06:39 |
|
#7 |
Senior Member
Vesselin Krastev
Join Date: Jan 2010
Location: University of Tor Vergata, Rome
Posts: 368
Rep Power: 20 |
With LES turbulence modelling you have to respect the maxCo < 1 condition (sometimes even lower), otherwise you will not adequately resolve the turbulence time scales. In simple words: even if you manage a stable simulation with Co > 1, which is not at all obvious, you will miss a lot of the high-frequency turbulent structures in your flow, and with them much of the point of running an LES. In general, I don't know any efficient way to speed up an LES other than massive parallelization, since for reasons of turbulence spatial resolution (yes, you have to account for both temporal and spatial resolution) LES grids are generally quite "heavy". In that respect, my experience tells me that the PCG (or PBiCG) solver for pressure scales better in parallel at high numbers of threads/cores than GAMG, which, however, is generally faster in serial or on a low number of threads/cores.
Regards V. |
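For reference, the Courant number under discussion is evaluated per cell as Co = |U| * deltaT / deltaX (local velocity magnitude times time step over cell size), so on a fixed mesh the only way to bring it below 1 is to shrink the time step, and any local mesh refinement reduces deltaX and pushes Co back up for the same deltaT.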
|
January 24, 2012, 13:59 |
|
#8 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
Thank you Anton for your help!
Vesselin: First of all, thanks for your very helpful input. I'm fairly new to OpenFOAM and RANS/LES simulations, but I'm learning a lot. So I guess obsessing over a faster run-time is not the best approach if I want to do a good LES. What I take from your response is that I need to go back to PBiCG or PCG, because I noticed that even though my run-time decreased with pimpleFoam and GAMG, my scalar solution completely diverged. I will go back to pisoFoam with the PCG linear solver and try to figure out why my last pressure correction takes about 500 iterations to solve when everything else takes about 2 or 3. Do you think the pressure solver in pisoFoam was built more for RANS simulations than for LES? Thanks once again V. |
|
January 24, 2012, 14:37 |
|
#9 |
Senior Member
Vesselin Krastev
Join Date: Jan 2010
Location: University of Tor Vergata, Rome
Posts: 368
Rep Power: 20 |
Can you post:
1) the checkMesh log file
2) your fvSolution file
3) the type of physical connection between the computing nodes?
Regards V. |
|
January 25, 2012, 04:18 |
|
#10 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
Here are the files. Regarding your last question: I use a supercomputer that is available to various universities for parallel computing, so I don't know exactly what physical connections they use; I assume they are pretty fast though.
|
|
January 25, 2012, 06:13 |
|
#11 |
Senior Member
Vesselin Krastev
Join Date: Jan 2010
Location: University of Tor Vergata, Rome
Posts: 368
Rep Power: 20 |
Ok, let's start with the mesh: if you were simulating with a RANS approach I would say that the checkMesh output is quite ok (I usually don't like to have pyramids in my domain, but that's only my opinion), but I cannot say the same for LES. Theoretically an LES can run on any kind of mesh, but my experience is that it is quite hard to obtain a good-quality LES on anything other than a purely hexahedral mesh (again because of the higher spatial/temporal resolution and accuracy required by LES modelling). If it is feasible, remeshing your domain with hexahedra would be a good starting point on the road to a good-quality, converging LES.
About the fvSolution: it seems to confirm my previous statement, because you have a fairly loose relative tolerance on the pressure solver (relTol set to 0.05 means that at each pressure iteration the linear solver only pushes the relative residual down to 0.05, which is a relatively high value). So, if the number of iterations is that high, it is more likely that the solution is diverging for some reason (too big a deltaT plus a poor mesh, for instance) than that there is a problem with the linear solver itself. Finally, sorry for not asking for this in my previous post, but can you also post your fvSchemes and (if you can) some iterations from a simulation log file?
Regards V.
PS - In general, keep in mind that with LES you'll need much more patience than with RANS approaches, but that's the price to pay for simulating "more turbulence stuff" rather than simply modelling it...
PPS - I agree with you about the HPC facility: generally speaking they have robust and fast interconnects between the computing nodes, so let's assume this is true for your case as well. |
|
January 25, 2012, 18:19 |
|
#12 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
I have noticed from other LES cases how dense LES meshes are, especially near inlets, so I will start refining my mesh. Another option I have been thinking about is implementing a different pressure solution procedure: modifying the pisoFoam source code so that, instead of the implicit PISO method, it uses an explicit Runge-Kutta scheme, which is inherently faster.
Here are the extra files and part of the log file. Thanks once again for the help. |
|
January 30, 2012, 07:26 |
|
#13 |
Senior Member
Vesselin Krastev
Join Date: Jan 2010
Location: University of Tor Vergata, Rome
Posts: 368
Rep Power: 20 |
A few suggestions (sketched in dictionary form below):
1) Lower all your absolute tolerances (the tolerance entry in the fvSolution file), except the pFinal one, to, say, 1e-12. From your log file I see that you are no longer solving for Zmix, because its absolute residual has fallen below the cut-off value; in general it is not good to stop solving for something whose solution still has to evolve.
2) If Zmix is a concentration, which has to be bounded between 0 and 1, use a strictly bounded convection scheme for div(phi,Zmix), for instance Gauss limitedVanLeer 0 1 (this bounds the Zmix value strictly between 0 and 1 during the cell-to-face interpolation).
3) For div(phi,U) use a "V" scheme (e.g. Gauss limitedLinearV 1).
4) At least in the initial part of your simulation, use a slightly less restrictive pFinal tolerance (e.g. 1e-05 instead of 1e-06): the huge number of iterations in your case comes from the second (and last) PISO pressure correction, where the solver tries to push the residuals below the pFinal tolerance. Also, if you keep using meshes with significant non-orthogonality, add 1 or 2 non-orthogonality correctors: this will increase the number of pressure loops, but each of them should not be very expensive because they follow the relative tolerance (0.05 in your case), and the final loop will then start from a lower initial residual, which should be beneficial in terms of its iteration count.
That's it. I'm absolutely not a big LES expert, but I hope these little pieces of advice will improve your simulations.
Regards V. |
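Putting those suggestions into dictionary form, a sketch might look like the following; the schemes, tolerances and corrector counts come from the advice above, while the solver and preconditioner choices are generic assumptions rather than the poster's actual files.
Code:
// fvSchemes fragment (sketch; suggestions 2 and 3)
divSchemes
{
    div(phi,U)      Gauss limitedLinearV 1;
    div(phi,Zmix)   Gauss limitedVanLeer 0 1;
}

// fvSolution fragment (sketch; suggestions 1 and 4)
solvers
{
    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-12;   // suggestion 1: very low absolute tolerance
        relTol          0;
    }
    Zmix
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-12;   // keep solving Zmix instead of hitting the cut-off
        relTol          0;
    }
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-12;
        relTol          0.05;    // relative tolerance followed by the intermediate loops
    }
    pFinal
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-05;   // suggestion 4: slightly relaxed pFinal tolerance
        relTol          0;
    }
}

PISO
{
    nCorrectors               2;
    nNonOrthogonalCorrectors  1;  // suggestion 4: add non-orthogonality correctors
}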
||
January 30, 2012, 23:54 |
|
#14 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
Thank you so much for these helpful hints. I will apply them and let you know how it goes. Thanks once again for your help V.
|
|
February 9, 2012, 17:56 |
|
#15 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
Hello once again V, and sorry to bother you again.
I applied your suggestions, but I am still getting the high number of iterations. My next step, without redoing the mesh, was to modify the solver to solve for pressure explicitly through a second-order Runge-Kutta scheme. I tested it on a simple geometry, a channel, and it worked great. Now I am trying to apply it to my more complicated geometry, and the pressure increases very rapidly until it blows up and the solution completely diverges very early on. My geometry includes several inlets and one outlet. I am using the same boundary conditions for p and U:
p: walls - zeroGradient; inlets - zeroGradient; outlet - fixedValue 0
U: walls - fixedValue 0; inlets - fixedValue 11; outlet - fixedValue 0
At first I thought it was the boundary conditions, but I have played around with them and nothing changed. I have run out of ideas other than redoing the mesh. |
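For reference, boundary-condition files matching that description might look roughly like the sketch below; the patch names, the inlet flow direction and the values written as uniform vectors are assumptions, since the actual 0/ files were not posted.
Code:
// 0/p boundaryField (sketch)
boundaryField
{
    walls   { type zeroGradient; }
    inlets  { type zeroGradient; }
    outlet  { type fixedValue; value uniform 0; }
}

// 0/U boundaryField (sketch; the inlet direction is assumed to be x)
boundaryField
{
    walls   { type fixedValue; value uniform (0 0 0); }
    inlets  { type fixedValue; value uniform (11 0 0); }
    outlet  { type fixedValue; value uniform (0 0 0); }
}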
|
February 9, 2012, 18:21 |
|
#16 |
Senior Member
Vesselin Krastev
Join Date: Jan 2010
Location: University of Tor Vergata, Rome
Posts: 368
Rep Power: 20 |
Have you tried changing the boundary condition for U at the outlet and checking whether the simulation still diverges?
Regards V. |
||
February 9, 2012, 19:03 |
|
#17 |
New Member
James
Join Date: Jan 2012
Posts: 11
Rep Power: 14 |
It is diverging as well. The pressure gets up to 10^9
|
|
February 10, 2012, 05:40 |
|
#18 |
Senior Member
Vesselin Krastev
Join Date: Jan 2010
Location: University of Tor Vergata, Rome
Posts: 368
Rep Power: 20 |
Ok, if changing the BC for U at the outlet doesn't make any change, then probably it is time to rebuild the mesh...
V. |
|
September 7, 2012, 15:11 |
|
#19 |
Member
Join Date: Jun 2011
Posts: 80
Rep Power: 15 |
Hi all!!
I am writing because I also have a problem with the PISO algorithm and I have been stuck for a long time... The point is that I am trying to simulate the flow past a cube (Re = 3000, 30000, 300000) using URANS (RNG k-epsilon, for example). My problem is that I do not get any unsteady behaviour in the wake: I placed several probes in the wake, reconstructed the solution, and the case settles into a steady state. I use the fvSchemes and fvSolution files included in the motorbike tutorial, but nothing transient appears. As for my mesh, I use a fully structured mesh with a refinement zone around the cube. I tried this case in 2D and I got the transient behaviour properly. I have also tested a similar case, the flow past a square cylinder, with the same computational domain size and numerical schemes and, voila, I got it! Finally, if I use the simpleFoam algorithm, even though it is said to be a steady-state algorithm, I do get the transient! What is happening? I hope you can help me... Thank you so much!! Best,
Last edited by maalan; September 7, 2012 at 15:35. |
|