
rhoCentralFoam: Long Simulation times on relatively coarse mesh

May 1, 2020, 12:51   #1
New Member

Imthiaz Syed (siasyeda)
Join Date: Nov 2019
Posts: 6
Hello everyone,


I am trying to validate OpenFOAM's rhoCentralFoam solver against the NASA flat-plate boundary-layer case at Mach 2.0: https://turbmodels.larc.nasa.gov/ZPG...ateSS_val.html


I am using a relatively coarse 137x97 mesh (refined at the wall to y+ of roughly 1) with the SST turbulence model. This simple simulation takes around 5-6 hours to converge on 40 cores, which seems far too slow. For comparison, a steady-state SU2 run of the same setup takes around 30 minutes, so I am sure I am doing something wrong here. The case and output are here: https://drive.google.com/file/d/1X6KgMVktMKPv1h3t0HIo_VpZzDJjbcwR/view?fbclid=IwAR0eZjAEZio7FW9gyTKOCox-6VpJh1QisrG6TJkFZHpZLLJVsfFwooH7h_c



You can look at the results with paraFoam first if you like, and try running the case yourself to see how slow the convergence is. I am unsure whether this is due to the boundary conditions I have used or to something I have defined in my fvSchemes or controlDict, so if anyone can comment on that I would very much appreciate it.
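
For anyone who would rather comment without downloading the zip: the usual rhoCentralFoam fvSchemes layout is sketched below. It is based on the stock forwardStep tutorial rather than copied from my case, so treat the individual entries as illustrative only.

Code:
// system/fvSchemes -- sketch based on the rhoCentralFoam forwardStep tutorial
// (FoamFile header omitted; the entries in the attached case may differ)
fluxScheme      Kurganov;          // central-upwind flux of the Kurganov-Tadmor family

ddtSchemes
{
    default         Euler;
}

gradSchemes
{
    default         Gauss linear;
}

divSchemes
{
    default         none;
    div(tauMC)      Gauss linear;
    // only needed once a turbulence model such as kOmegaSST is active:
    div(phi,k)      Gauss limitedLinear 1;
    div(phi,omega)  Gauss limitedLinear 1;
}

laplacianSchemes
{
    default         Gauss linear corrected;
}

interpolationSchemes
{
    default          linear;
    reconstruct(rho) vanLeer;
    reconstruct(U)   vanLeerV;
    reconstruct(T)   vanLeer;
}

snGradSchemes
{
    default         corrected;
}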


I got a recommendation to run this simulation first with local time stepping to get a steady-state result and then switch to the unsteady solver, to speed things up overall. However, with LTS it takes over 3 hours just to reach a decent steady-state solution, which is also quite concerning.
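
For readers who have not used it: as far as I understand, local time stepping is selected through the ddt scheme, with the pseudo-time step bounded from controlDict, roughly as in the sketch below. This is only a sketch; the exact control names can vary between OpenFOAM versions.

Code:
// system/fvSchemes (LTS variant) -- sketch only
ddtSchemes
{
    default         localEuler;   // per-cell pseudo-time step instead of a global dt
}

// system/controlDict additions -- sketch only
maxCo           0.5;   // Courant limit used to build the local time-step field
maxDeltaT       1;     // upper bound on the local time step, in seconds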


I am at my wits' end, and I am fairly new to OpenFOAM, so any and all help or advice is much appreciated.



Thanks!


P.S. I have seen other threads where users comment on how slow rhoCentralFoam is, but I have also read research papers that describe rhoCentralFoam as a relatively fast solver, so I am not sure what to make of that.

May 1, 2020, 16:16   #2
Senior Member

clapointe
Join Date: Aug 2015
Posts: 494
The first issue I see is that 40 processors is way (way) too many for a case this small (~13k cells). To illustrate, I took the hyStrath Mach 2 flat plate example (https://github.com/vincentcasseau/hy...arFlatPlateLTS) and reduced it to approximately the same size (~14k cells). It ran on a single processor in about 0.7 hours. (Note that I've done this only to show how long a similar case might take with a similar solver; the mesh and results will likely be different from what you will get with rhoCentralFoam.)

Caelan

May 1, 2020, 17:52   #3
New Member

Imthiaz Syed (siasyeda)
Join Date: Nov 2019
Posts: 6
Yes, I agree with you, which is why I am really confused by this. One difference between the case you describe and what I am running is that yours is laminar, while mine is turbulent with the SST model. Any ideas on why such a simple case is so expensive to run?

May 1, 2020, 18:00   #4
Senior Member

clapointe
Join Date: Aug 2015
Posts: 494
Have you run yours in serial? Like I said, mine was run in serial. I've not run rhoCentralFoam much with a turbulence model, so perhaps someone who has could step in.

Caelan

May 2, 2020, 05:39   #5
Member

Thomas Sprich (Swift)
Join Date: Mar 2015
Posts: 76
Hi,

Please test as Caelan has suggested and run your case on a single core (in serial). As a rule of thumb you want roughly 50k-100k cells per core; with fewer cells per core than that, adding cores gives little or no improvement in solve time, because inter-processor communication becomes the bottleneck.

See this for a discussion on running in parallel.
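
To put numbers on it for your roughly 13k-cell mesh (illustrative only, not taken from your case), decomposeParDict would not need to be more ambitious than something like this:

Code:
// system/decomposeParDict -- illustrative sketch, FoamFile header omitted
numberOfSubdomains  2;       // ~13k cells on 2 cores is already well below 50k cells per core

method              scotch;  // scotch balances the subdomains without manual direction splitting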


Let's rule this out as a cause of your slow solve time; that will put us in a better position to find the real issue.

Regards,
Thomas

May 3, 2020, 23:14   #6
New Member

Imthiaz Syed (siasyeda)
Join Date: Nov 2019
Posts: 6
Hi everyone,


I did some more reading over the past day and thought the problem might be with some of my boundary conditions. Sure enough, after changing them the simulation runs quicker, but it still takes around 5-6 hours. Regarding @clapointe's and @Swift's suggestions, I reran the simulation on a single core and, as you both suspected, there was little to no change in performance: it is only slightly faster with the extra cores, and certainly not proportionally faster for 39 extra cores. Here is the updated zip, with a lower-resolution mesh. This one is really coarse, but it still takes around 1.5-2 hours to converge: https://drive.google.com/open?id=1sU...MfW15ujUUjDOXi


I am still struggling to understand why these simulations take so long to converge. Or is this convergence rate about normal for cases at these Mach numbers with low-Re treatment at the wall?



Thanks!

May 4, 2020, 12:55   #7
Member

Thomas Sprich (Swift)
Join Date: Mar 2015
Posts: 76
Hi,


I'm not an expert in rhoCentralFoam, so bear this in mind when considering my response.


The first thing that I would do is use a more regular mesh to avoid the very high cell aspect ratios that you currently have. See the checkMesh log below:


Code:
Checking geometry...
    Overall domain bounding box (-0.33333 0 -0.5) (2 1 0.5)
    Mesh has 2 geometric (non-empty/wedge) directions (1 1 0)
    Mesh has 2 solution (non-empty) directions (1 1 0)
    All edges aligned with or perpendicular to non-empty directions.
    Boundary openness (-4.8980485e-18 9.796097e-18 -5.7062265e-16) OK.
 ***High aspect ratio cells found, Max aspect ratio: 35828.266, number of cells 189
  <<Writing 189 cells with high aspect ratio to set highAspectRatioCells
    Minimum face area = 5.5396231e-08. Maximum face area = 0.37220328.  Face area magnitudes OK.
    Min volume = 5.5396231e-08. Max volume = 0.069742525.  Total volume = 2.33333.  Cell volumes OK.
    Mesh non-orthogonality Max: 0 average: 0
    Non-orthogonality check OK.
    Face pyramids OK.
    Max skewness = 8.3811437e-15 OK.
    Coupled point location match (average 0) OK.

Failed 1 mesh checks.
I know you have chosen this refinement to capture the features, but at least eliminate this as a possible cause.



The next thing to check is your boundary conditions for k. You have specified fixed values at both the inlet and the outlet. I would make the outlet zeroGradient (which you seem to have had before), and on the obstacle I would initialise the value with $internalField.
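
To make that concrete, the sort of layout I mean for 0/k is sketched below; the patch names and numbers are placeholders, not values from your case:

Code:
// 0/k -- sketch only; patch names and values are placeholders
dimensions      [0 2 -2 0 0 0 0];

internalField   uniform 1e-3;

boundaryField
{
    inlet
    {
        type            fixedValue;
        value           uniform 1e-3;
    }
    outlet
    {
        type            zeroGradient;
    }
    plate                               // the wall/obstacle patch
    {
        type            fixedValue;
        value           $internalField;
    }
    top
    {
        type            zeroGradient;
    }
    frontAndBack
    {
        type            empty;          // 2D case
    }
}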





The next comments are long shots, but they stood out for me as things I would check.

I have only ever had nuTilda as zeroGradient in my simulations, so check whether what you have is correct. Likewise for nut: I have only used wall functions, never calculated.
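
For nut at the wall that would look something like the entry below (again a sketch; the patch name is a placeholder, and with y+ around 1 the low-Re variant is the one I would try first):

Code:
// 0/nut, wall patch entry only -- sketch; patch name is a placeholder
plate
{
    type            nutLowReWallFunction;   // instead of 'calculated'
    value           uniform 0;
}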


I hope you find something useful in that.


Thomas
