|
LES fails within first time steps (pimpleFoam) |
|
January 4, 2021, 06:21 |
LES fails within first time steps (pimpleFoam)
|
#1 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
Hi everyone,
I am trying to run an LES of the flow around an airfoil. As I am quite new to OpenFOAM, I used a tutorial as a basis and tried to modify it for my case. In some later calculations I also want to use synthetic inflow turbulence with the method proposed by Klein et al., so I chose the verificationAndValidation/turbulentInflow tutorial. My first calculation, however, should be without inflow turbulence.
As I want to compare my results to results from another code, the following points should also be fulfilled in my simulation:
- classical Smagorinsky model
- Re = 100,000
- second-order accuracy in time and space
As this will cost a lot of CPUh, I will use our supercomputers. Since my resources there are limited, my idea was to first test the solver settings and so on at a lower Re of 10,000 on my own computer, to make sure the setup works at all. So I decreased my velocity by a factor of 10 (and increased my time step, to start with the same CFL number). Don't be surprised about my values for viscosity etc.: the calculation I use for comparison is dimensionless.
... but my OpenFOAM LES does not run. It always fails within the first four time steps; my residuals begin to rise dramatically. I have tried a lot, varying the solver settings, using other schemes and so on, but nothing really works. Furthermore, it is very slow. (OK, I only have 6 processors, but that slow?!)
I don't think it's my grid - this should be fine (I will attach checkMesh.log). There are some high-aspect-ratio cells near the airfoil, but these worked fine in other simulations, and y+ is below 1.
I really hope someone can help me, give me a hint where to start, or point out the stupid mistake that I can't find. I have attached my fvSolution, fvSchemes, controlDict and 0.orig folder, plus a little picture of my grid and my last pimpleFoam.log. As this is my first post here, please let me know if I forgot something.
Regards, Helen
checkMeshlog.txt pimplefoamlog.txt Grid.png
Last edited by Helen Alina; January 4, 2021 at 08:08. |
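To spell out the scaling behind this (nothing new, just the definitions, with the chord c and the kinematic viscosity nu held fixed):
Re = U·c/nu and Co = U·Δt/Δx,
so dividing U by 10 divides Re by 10, and multiplying Δt by 10 leaves the Courant number unchanged.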
|
January 4, 2021, 07:01 |
|
#2 |
Senior Member
Join Date: Apr 2020
Location: UK
Posts: 745
Rep Power: 14 |
Hi Helen.
From a quick scan of your pimpleFoam log, it looks like the solver is struggling to get a good pressure solution: the linear solver is maxing out at 1000 iterations on the second pressure loop of each PIMPLE iteration, and the final p residual is rising each time ... until on the 4th time step the continuity error gets large and the solution blows up.
The other thing the log file shows is that there is an amazing difference between the mean and max Courant numbers: 6 orders of magnitude! Do you really mean for your grid to vary in cell size by 6 orders of magnitude, or is there something wrong with the mesh? From the checkMesh output, it looks like your max aspect ratio is 7900, which is extremely high. This is probably what is causing the pressure solver problems. You could try increasing maxIter on the pressure solver from 1000 to 2000, but I suspect that the real answer is to fix the grid by improving the aspect ratio. I generally try to keep it below 100, and ideally 10 or less.
You also mention that you want to run this as an LES - how will you deal with the wall BC when you do that? Is LES/Smagorinsky the right tool? If so, what will your near-wall grid resolution be? Much coarser than now, presumably - so should you not start off by trying to simulate that? At the moment, with y+ = 1 and no SGS model, you are effectively running a coarse DNS ... and that will be VERY expensive, even at low Re (and I am not sure that your reduced Re of 10,000 is particularly low!). That, I guess, is the reason for the 10^6 variation in cell size across your grid.
Finally, on the long run times: the main cause is the time spent in the pressure solver. You can speed that up by making it easier for the pressure solver to solve ... but that could increase the cell count considerably. One quick win, though: turn off the field averaging during this initial "spin-up" period. It will save a little time (and a lot of disk space) - there is no point averaging fields until the turbulence has settled down.
So my overall suggestion is to have a think about what you want to achieve at this stage of your project, and then maybe consider changing approach slightly. Good luck! |
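To be concrete, the iteration cap lives in fvSolution - something like the sketch below, although I'm guessing at your solver choice and tolerances since I've only seen the log:
Code:
p
{
    solver          GAMG;       // or whichever solver you currently use
    smoother        GaussSeidel;
    tolerance       1e-06;
    relTol          0.01;
    maxIter         2000;       // raised from the default cap of 1000
}

pFinal
{
    $p;
    relTol          0;
}
And for the averaging, the quickest route is to disable the fieldAverage function object in controlDict during spin-up - again a sketch, assuming the usual tutorial-style setup (the entry name fieldAverage1 is just an example):
Code:
functions
{
    fieldAverage1
    {
        type        fieldAverage;
        libs        ("libfieldFunctionObjects.so");
        enabled     false;      // switch back on once the turbulence has settled
        fields
        (
            U { mean on; prime2Mean on; base time; }
            p { mean on; prime2Mean on; base time; }
        );
    }
}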
|
January 4, 2021, 07:54 |
|
#3 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
Hi Tobermory,
thanks for your reply! Unfortunately, I have to use this mesh: as I mentioned, I have to compare the OpenFOAM results to results from another code, so my supervisor wants me to use the same mesh. My y+ is smaller than one, because I am trying to run a wall-resolved LES - and yes, this will unfortunately be very expensive. The other simulation took 45 days on 96 processors. I do not really understand what you mean by "no SGS model"; I (am trying to) use the Smagorinsky model. As you suggested, I will try increasing maxIter, lowering my Reynolds number for the test simulation to 1000, and disabling the field averaging. Regards, Helen |
|
January 4, 2021, 08:48 |
|
#4 |
Senior Member
Join Date: Apr 2020
Location: UK
Posts: 745
Rep Power: 14 |
Dear Helen - it sounds like you have a tough assignment! Understood re: the wall-resolved LES, and forgive my comment about "no SGS" - you do indeed have it switched on, as per the log file. I clearly read the file too quickly.
The other thing to consider is the gradient scheme: you are using the least-squares method, which is generally pretty good. However, it can cause problems on high-aspect-ratio cells - see the discussion in Appendix C of https://arxiv.org/pdf/1606.05556.pdf. Try changing the scheme to Gauss linear, and see if that helps the pressure solver. |
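For reference, the change is just in gradSchemes in fvSchemes - something like this, assuming a default-only setup (your actual file may list individual fields too):
Code:
gradSchemes
{
    // default      leastSquares;   // can misbehave on very high-AR cells
    default         Gauss linear;
}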
|
January 5, 2021, 06:36 |
|
#5 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
Hi Tobermory,
I looked at the paper and replaced leastSquares with Gauss linear. It worked! Thanks a lot for the tip! However, I still need an extremely large number of iterations for the pressure. For example, if I set maxIter to 2000 for pFinal, OpenFOAM still uses all of them. Do you have another idea how I can solve this? I have also played around with the tolerances for p and the number of loops, but haven't gotten much further. I have attached the log file of my last attempt; there you can see the problem with the pressure iterations quite clearly. Regards, Helen ResidualsP.pdf log.pimplefoam.txt |
|
January 5, 2021, 09:14 |
|
#7 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
Thanks for the tip! I'm going to try it with PBiCGStab.
|
|
January 6, 2021, 05:59 |
|
#8 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
The tip with PBiCGStab and GAMG as preconditioner worked! For the most part I now need significantly fewer pressure iterations. Many thanks!
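In case it helps anyone finding this thread later, my pressure solver entry now looks roughly like this (reproduced from memory, so treat the exact tolerances as placeholders):
Code:
p
{
    solver          PBiCGStab;
    preconditioner
    {
        preconditioner  GAMG;       // multigrid preconditioning the Krylov solver
        smoother        GaussSeidel;
        tolerance       1e-05;
        relTol          0;
    }
    tolerance       1e-06;
    relTol          0.01;
}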
Unfortunately, despite the fewer pressure iterations, the simulation has become much slower. Of course 6 processors are unsuitable for such a problem, but I want to test what starts reliably before I spend my resources on the supercomputer. (Hopefully from next week I will have a 24-processor workstation for my tests.)
What is even more unpleasant: my calculation crashed in the 5th time step - my Courant number has somehow exploded. I currently have 3 outer, 2 inner and 1 non-orthogonal corrector loop, but when I look at my log file it is hard for me to see whether I need more or fewer. Is a non-orthogonal corrector loop useful at all in my case? Or are my (relative) tolerances not well chosen?
I am also unsure whether my schemes are suitable. The only requirement I have is that the simulation should be second-order accurate in time and space. (I have limitedLinear set to 1 for now, as I hoped that this would make it more stable; my plan would actually be to use 0.1.)
Since I have little experience with OpenFOAM, I would be very happy about some tips. log.pimplefoam.txt fvSolution.txt fvSchemes.txt |
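For reference, the loop settings I described correspond to a PIMPLE block like this (reconstructed from my description above, so treat it as approximate - the full file is in the attached fvSolution.txt):
Code:
PIMPLE
{
    nOuterCorrectors         3;   // outer momentum-pressure loops per time step
    nCorrectors              2;   // inner pressure corrections per outer loop
    nNonOrthogonalCorrectors 1;   // extra corrections for mesh non-orthogonality
}
and the convection scheme in question, in fvSchemes:
Code:
divSchemes
{
    div(phi,U)      Gauss limitedLinear 1;   // coefficient 1 = strongest limiting (most stable);
                                             // 0.1 would be closer to pure second-order linear
}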
|
January 6, 2021, 09:41 |
|
#9 |
Senior Member
|
Most happy to read that the good old trick of combining multigrid and Krylov for the pressure equation continues to work!
My intuition is that mesh quality is now the bottleneck in improving the simulation further. My intuition, however, has been wrong many times before. My humble suggestion is to discuss creating a sequence of meshes with your supervisor; the magnificent input from Tobermory provides an excellent motivation for that discussion. OpenFOAM comes with a mature set of mesh generation tools - have a look at e.g. snappyHexMesh or cfMesh. Possibly the effort of regenerating the meshes comes with the benefit of being able to study differently shaped airfoils. I am keen to see how your project evolves. Please keep us posted. |
|
January 6, 2021, 22:53 |
|
#10 |
Senior Member
Ruiyan Chen
Join Date: Jul 2016
Location: Hangzhou, China
Posts: 162
Rep Power: 10 |
Your checkMesh output seems to be telling you that the mesh check failed.
|
|
January 7, 2021, 04:56 |
|
#11 |
Senior Member
Join Date: Apr 2020
Location: UK
Posts: 745
Rep Power: 14 |
Well, strictly speaking the checkMesh log is saying that the mesh failed to meet the checkMesh criterion on aspect ratio ... that doesn't mean the mesh is a failure and that it's impossible to get the code to run stably, but it does mean it will be hard to do so.
Great suggestion on the solver, Domenico (I must learn more about the different options there), but I fear that ultimately your conclusion is right - Helen, you may struggle to get any solver to behave with such high-aspect-ratio cells. I did see an interesting paper where the author used a pair of overset meshes to get around the boundary-layer resolution problem at high Re: one mesh near the wall, extending from the wall to y+ ~ 100 with refined streamwise resolution to improve the AR, and the other extending from near the wall to the outer boundaries. I'll see if I can dig out the paper for you, although I am not sure whether OF can support overset meshes, so it might be a blind alley ... Edit: here is the paper: https://turbmodels.larc.nasa.gov/Oth...-Uzun-hump.pdf |
|
January 7, 2021, 06:13 |
|
#12 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
Hi Tobermory,
the approach with overset meshes sounds super interesting! I guess I could just use the overPimpleDyMFoam solver. Thank you very much for your effort in finding the paper! I will read it carefully this afternoon.
I also had an idea regarding the grid: I would first use a grid in which I merge some of the near-wall cells in the wall-normal direction. My y+ value will then be much larger than desired, but as soon as the simulation runs stably, I would interpolate the results onto my actual grid and continue the simulation from those values. Since I have never used the mapFields utility before, I am not sure whether this works; it was just a spontaneous idea. Before I start working on the grid, however, I will try out a few more settings regarding solvers, schemes etc., to hopefully get around it. |
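If I try the mapping, I assume the call would be something like the following (the source path is hypothetical, and -consistent should only be used if both cases share the same domain and boundary patches):
Code:
# run from the fine (target) case directory
mapFields ../coarseWallCase -consistent -sourceTime latestTime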
|
January 7, 2021, 06:22 |
|
#13 |
Senior Member
Join Date: Apr 2020
Location: UK
Posts: 745
Rep Power: 14 |
Yes, good idea - just make sure to take notes on what you find, since otherwise, in my experience, the knowledge soon evaporates! Try also googling for OF tips on high-AR meshes, and let us know if you find any other magic tricks!
One other thing to keep in mind: in an ideal world, for LES or DNS, or any method where you are trying to resolve the turbulence on the grid, you want your cells to have an AR as close to 1 as possible, since otherwise you are (implicit-grid) filtering the different velocity components with different length scales ... which messes with the isotropy of the smallest resolved scales. I know this is a real challenge for high-Re aero applications, but it's something to keep in mind. The overset-mesh approach could help with this, but I notice that the paper only used a 2:1 streamwise ratio (coarse:fine) between their meshes, which doesn't help you much ... you might want to try a 5:1 or 10:1 ratio. Good luck again, and as Domenico says - keep us posted! |
|
January 7, 2021, 07:29 |
|
#14 |
Senior Member
|
Hmmm ....
The geometry is purely 2D, correct? And it is a NACA airfoil, for which an analytical expression for the shape exists, correct? If so, which NACA is it? Could one easily generate a script (MATLAB, Python or any of the alternatives) that spits out an STL file for the geometry, or download such a script from somewhere? What is the current cell count? What requirement on y+ do you impose? What is the typical range of Reynolds numbers you would be interested in? Best wishes. |
|
January 7, 2021, 07:41 |
|
#15 |
New Member
Helen Alina Pabst
Join Date: Dec 2020
Location: Germany
Posts: 8
Rep Power: 5 |
Hi Domenico,
my geometry is 3D, but with just a few cells in the z direction (for the turbulence). Unfortunately, it is not a NACA airfoil - it is an airfoil from a wind turbine manufacturer, but I have the geometry. I have 8 million cells. My y+ has to be smaller than 1, since I am trying to do a wall-resolved LES. My final Reynolds number has to be 1e5. Regards, Helen |
|
Tags |
converge case setup, failed, les simulation, pimplefoam, scheme |
|
|