CFL Number (STAR-CCM+)

Post #1: March 10, 2022, 14:45
ashf9 (Ash), New Member
Hi,

Does anyone know what the CFL number does, and whether it is applicable to steady-flow simulations? When I run my simulations I sometimes get an error saying "AMG solve rejected. CFL 216.14 -> 108.07". Would someone be able to explain why this happens, and what a reasonable CFL number would be for an aerofoil under stall?

Many Thanks,

Post #2: March 10, 2022, 15:31
fluid23 (Matt), Senior Member
The steady solver uses a pseudo-time-marching approach, so there is still a CFL number. From the output you shared, it looks like you have the default 'automatic control' option enabled: as the solver advances, it adjusts the CFL number to try to speed up convergence. It couldn't resolve the iteration at the CFL it had chosen, so it reduced its guess and tried again.

The default limits for the automatic control are a little ridiculous in my opinion. I usually impose a new (lower) limit when I start seeing repeated AMG rejections.
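
To make that behaviour concrete, here is a rough sketch of what an automatic CFL controller in a pseudo-time steady solver typically does. This is not the actual STAR-CCM+ implementation; the function names and the growth/cutback factors are illustrative assumptions only.

Code:
# Conceptual sketch of automatic CFL control in a pseudo-time steady solver.
# NOT the STAR-CCM+ implementation; run_iteration, the growth/cutback factors
# and the limits are illustrative assumptions.

def advance_with_auto_cfl(run_iteration, n_iters, cfl=5.0,
                          cfl_max=500.0, growth=1.5, cutback=0.5):
    """run_iteration(cfl) -> True if the AMG linear solve was accepted."""
    for _ in range(n_iters):
        accepted = run_iteration(cfl)
        while not accepted:
            # e.g. "AMG solve rejected. CFL 216.14 -> 108.07": cut CFL, retry.
            cfl *= cutback
            print(f"AMG solve rejected, retrying at CFL {cfl:.2f}")
            accepted = run_iteration(cfl)
        # Iteration accepted: ramp the CFL back up toward the user-set limit.
        cfl = min(cfl * growth, cfl_max)
    return cfl

Capping the limit (the cfl_max argument in this sketch, or the upper limit in the automatic control settings) simply stops the solver from pushing back into the regime where the AMG solve keeps getting rejected.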


Post #3: March 11, 2022, 05:14
ashf9 (Ash), New Member
Ah I see, thank you. I've set the limit to 500, but the problem is that while it converges nicely for a 2D case, when I impose it on a 3D simulation with two slip walls it doesn't seem to be converging. Would you say I need to go lower?

Post #4: March 11, 2022, 10:25
fluid23 (Matt), Senior Member
The value of the CFL number shouldn't make a difference to the final result, as long as it isn't so high that the solver goes unstable or so low that convergence stalls. Poor convergence can be due to a number of things, most commonly poorly posed boundary conditions and/or a poor-quality mesh. I would start there when trying to track down convergence issues.

May I ask how you are judging convergence? Many people place too much emphasis on residual convergence. The three-orders-of-magnitude rule you often see quoted is a good place to start, but if you have a really good initialization you may not get three orders of magnitude of reduction. I have run into this when using the grid-sequencing expert initialization option.

In such cases it is helpful to look at the actual magnitude of the residuals, not just the normalized values, and at where in the domain they are occurring. You can set up your own residual monitors by turning on temporary storage and creating some plots. I have gone as far as creating separate residual plots for separate regions to help me understand what was happening in the domain.
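
As a rough illustration (this is not STAR-CCM+ functionality; it assumes you have exported each region's residual monitor history to a CSV file, and the file, region, and column names below are made up), here is one way to check how far each residual has actually dropped:

Code:
# Sketch: per-region residual drop in orders of magnitude, computed from
# monitor histories exported to CSV. File names, region names and the
# "Continuity" column are assumptions for illustration only.
import csv
import math

def residual_drop(csv_path, column):
    """Orders of magnitude between the first and the latest residual value."""
    with open(csv_path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return math.log10(values[0] / values[-1])

for region in ("inlet_block", "wing_block", "wake_block"):  # hypothetical regions
    drop = residual_drop(f"continuity_{region}.csv", "Continuity")
    print(f"{region}: continuity residual down {drop:.1f} orders of magnitude")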

All I am saying is, don't get hung up on residual convergence. Convergence of the figures of merit (drag coefficient, for example) is much more important. I have seen flow fields where the residual plots looked converged but the quantities I actually cared about were still asymptotically approaching some value.
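
For example (again only a sketch: it assumes the drag-coefficient monitor has been exported to cd_monitor.csv with a 'Cd' column, both of which are made-up names), a figure of merit can be declared settled once it stops drifting over a window of iterations:

Code:
# Sketch: judge convergence from a figure of merit (drag coefficient here)
# rather than from residuals alone. The file name, column name, window
# length and tolerance are all assumptions used to illustrate the idea.
import csv

def is_settled(csv_path, column="Cd", window=200, rel_tol=1e-3):
    """True if the monitored value varies by less than rel_tol (relative to
    its mean) over the last `window` samples, i.e. it has stopped drifting."""
    with open(csv_path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    tail = values[-window:]
    mean = sum(tail) / len(tail)
    return (max(tail) - min(tail)) <= rel_tol * abs(mean)

print("Cd settled:", is_settled("cd_monitor.csv"))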
