www.cfd-online.com
Home > Forums > Software User Forums > ANSYS > FLUENT

LES Diverges With Different .cas File - Does .cas File Change?

Old   August 19, 2021, 11:04
Default LES Diverges With Different .cas File - Does .cas File Change?
  #1
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Hi,

This seems like a very strange problem, so I want to know if I'm doing something wrong. I'm running some LES at the moment (simple compressible wall-bounded flow, static mesh) and was able to get converging runs without much trouble. I am on Fluent 2021R1.

My problems seem to come from my workflow, so I'll describe it first. Note that I don't use the TUI much, as I don't have extensive experience in Fluent.

To keep my workflow neat and modular, I first generate 'setup' .cas files with very small meshes. Because these files are small, they load very quickly on my local machine or in interactive HPC sessions, which makes changes easy and also lets me swap meshes very quickly. The cases I actually run contain large mesh files, as is expected for LES. So I typically generate the following .cas files and corresponding .dat files:

1) Steady.cas - completed -> Steady_done.cas
2) LES1.cas - completed -> LES1_done.cas
3) LES2.cas - completed -> LES2_done.cas

First, I open 'Steady.cas', load in my large mesh file (replace-mesh), then generate the initial steady-state solution. Next, I open 'LES1.cas', load in the same large mesh file, read in 'Steady_done.dat', and run the calculation. LES1 is used to overcome any potential startup transients and reach a statistically stationary (average-able) state. Everything works up to this point.
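In journal form, the first two stages look roughly like this (the mesh file name is a placeholder, and the TUI paths are from memory, so they may differ slightly in 2021R1):

Code:
; stage 1: steady precursor
/file/read-case Steady.cas
/mesh/replace big_mesh.msh.h5        ; placeholder name for the large LES mesh
/solve/iterate 2000
/file/write-case-data Steady_done.cas

; stage 2: LES startup
/file/read-case LES1.cas
/mesh/replace big_mesh.msh.h5
/file/read-data Steady_done.dat
/solve/set/time-step 1e-5
/solve/dual-time-iterate 10000 150
/file/write-case-data LES1_done.cas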

'LES2.cas' is the setup file meant to run the 'real' part of the simulation, with data sampling, regular solution exports, custom field functions etc. Just as before, my plan is to load 'LES2.cas', load the large mesh file in, load in 'LES1_done.dat' to resume the simulation, then carry on. Unfortunately, the simulation will diverge if I do this.

Instead, the only workaround seems to be to modify 'LES1_done.cas' manually until I've replicated the setup of LES2. Note that there is no difference between LES1 and LES2 in terms of the boundary conditions, LES model, solution method etc. The only differences are the presence of custom field functions (that do not feed back into the simulation), data sampling, automatic solution exports, and custom lines/planes for solution exports.

To troubleshoot, I've even re-used 'LES1.cas' in place of 'LES2.cas'. So I load 'LES1.cas', then load 'LES1_done.dat'. This diverges too. The only way to resume the simulation is to use a derivative of 'LES1_done.cas'.

The only conclusion I can draw from this is that 'LES1_done.cas' is somehow different from 'LES1.cas', apart from its size due to the mesh. Can anyone provide insight into this strange issue? The only possible change I can think of is my use of 'extrapolate variables' in 'LES1.cas' and 'LES2.cas' - maybe there's some information stored in the .cas file? My impression from past experience has been that the .cas file merely contains the settings of your case (BCs, methods etc.) and the mesh.

Any help/solution to my problem would be appreciated. Thanks.

Old   August 19, 2021, 16:01
Default
  #2
Senior Member
 
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
When you load the .dat, does it recognize solutions at time levels n-1 (and maybe even n-2 if you are doing 2nd order)?


Have you tried iterating the current time-step without advancing in time to reconstruct all the temporary variables?

Old   August 19, 2021, 21:48
Default
  #3
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Quote:
Originally Posted by LuckyTran View Post
When you load the .dat, does it recognize solutions at time levels n-1 (and maybe even n-2 if you are doing 2nd order)?
Yes, I am doing second order. I'm sorry, but I don't really understand your question. I thought the data contained in a .dat file (say 'LES1_done.dat') is only for the last completed time step?

Quote:
Originally Posted by LuckyTran View Post
Have you tried iterating the current time-step without advancing in time to reconstruct all the temporary variables?
Oh? I've never known about such a thing. It looks like specifying something like /solve/dti 0 100 should do it? I.e. 0 time steps, but 100 iterations allowed.

Old   August 19, 2021, 22:47
Default
  #4
Senior Member
 
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
Just do /solve/iterate


It's not enough to store just the solution at the current time-step; what needs to be stored is determined by your temporal discretization. You need the previous time-step for 1st order and the previous 2 time-steps for 2nd order. If you write your own CFD code and use a fancier temporal discretization, you may need even more time-steps.
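For example, with the standard second-order implicit (BDF2-type) scheme, the new solution depends on the two previous time levels:

Code:
(3*phi^{n+1} - 4*phi^{n} + phi^{n-1}) / (2*dt) = R(phi^{n+1})

so both phi^n and phi^{n-1} have to be available for a consistent restart.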

Old   August 20, 2021, 00:21
Default
  #5
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Quote:
Originally Posted by LuckyTran View Post
Just do /solve/iterate
Noted. I gave this a shot and it didn't work. Here's the script I used:

/solve/iterate

/solve set time-step 0.00001
/solve/dti 10000 150



I'll show part of the job output here:

> Number of iterations [1]
iter continuity x-velocity y-velocity z-velocity energy time/iter
!10085 solution is converged
10085 9.9018e-07 4.7341e-14 1.7326e-14 3.7117e-14 2.8626e-13 0:00:00 1
10086 1.7508e+03 1.1955e-07 6.8739e-08 9.1631e-08 5.5154e-03 0:00:00 0

>
>
Updating solution at time levels N and N-1.
done.

iter continuity x-velocity y-velocity z-velocity energy time/iter
10086 1.7508e+03 1.1955e-07 6.8739e-08 9.1631e-08 5.5154e-03 0:02:48 150
Stabilizing temperature to enhance linear solver robustness.
10087 2.1357e+03 1.1038e-05 2.7699e-05 1.2161e-05 7.0038e-03 0:02:33 149
Stabilizing temperature to enhance linear solver robustness.



Quote:
Originally Posted by LuckyTran View Post
It's not enough to store just the solution at the current time-step; what needs to be stored is determined by your temporal discretization. You need the previous time-step for 1st order and the previous 2 time-steps for 2nd order. If you write your own CFD code and use a fancier temporal discretization, you may need even more time-steps.
I see. Yes, that makes sense. Does this mean that Fluent does not automatically save 2 time-steps' worth of data per .dat file, even though I've always been using 2nd-order time discretization?

Does this mean that I should save multiple time steps in succession? Would I just use 2 read-data operations to load in both of these successive time steps? Or is there a special function for this?

Thanks.

P.S. Also, while this is very valuable information, I'm not sure it explains the strange behavior where the simulation proceeds normally if I use 'LES1_done.cas'. I did not need to load any special data. I guess it's possible that 'LES1_done.dat' contains two time steps' worth of data, and only 'LES1_done.cas' knows how to read it? I'm probably waayyyy off on this...

Old   August 20, 2021, 04:54
Default
  #6
Senior Member
 
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
You need to do /solve/iterate 100 or so. Please do more than 1 iteration.

Fluent saves all the time-steps it needs into the .dat. That's why I asked if you saw it load them when you read the .dat into your new .cas. It needs to. The question is, does it? Read your output!

However, the fact that your residuals spiked after 1 iteration without advancing in time means there is already an inconsistency in the data being loaded. If you load converged data into a fresh case, the continuity residual should be stuck at 1 and the others should maintain their asymptotic behavior. But all of yours spike.

Old   August 20, 2021, 06:09
Default
  #7
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Quote:
Originally Posted by LuckyTran View Post
You need to do /solve/iterate 100 or so. Please do more than 1 iteration.

Fluent saves all the time-steps it needs into the .dat. That's why I asked if you saw it load them when you read the .dat into your new .cas. It needs to. The question is, does it? Read your output!

However, the fact that your residuals spiked after 1 iteration without advancing in time means there is already an inconsistency in the data being loaded. If you load converged data into a fresh case, the continuity residual should be stuck at 1 and the others should maintain their asymptotic behavior. But all of yours spike.
I have been reading my output to diagnose this problem, and I have not seen anything different from what I've seen before. This is typical of what appears when the script reads in old data (minus details of addresses/directories):

>
Reading from [address of HPC system]:"/scratch/.../LES1_done.dat.h5" in NODE0 mode ...


Reading results.
Warning: The time step size (1) in the session did not match the time step size in the data file (1e-05), and has been overwritten by the value from the data file.

Parallel variables...
Done.


Does this seem incorrect to you? The output before this shows my mesh being read, and the output after it shows the start of solution advancement. The data loaded is definitely converged.

I'll let you know later once I've tried /solve/iterate 100 instead.

But thanks for letting me know that .dat files contain all time steps. I guess this explains why they gradually grow in size as more time steps are done?

Old   August 20, 2021, 09:37
Default
  #8
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
I've now tried the /solve/iterate 100 method and found something even more interesting.

With this method, the solution will converge or diverge depending on the number of CPU cores used. If I give the job 840 cores, it converges. If I give it something larger, say 1680 or 1736 cores, it diverges. But if I run the job on 28 cores, it also diverges (or rather, the residuals trend towards divergence; I didn't wait around for that to happen, this was just for the lols).

I even got the job to converge without doing /solve/iterate 100 earlier today, but I forgot how many cores I used that time, and I haven't been able to replicate it since.

To me this now appears to be another problem where I seem to have hit some sort of limit on the ratio of number of cells to number of CPU cores. I'm not sure what it is, but in case anyone is curious, the case I have been working on has ~27,000,000 polyhedral cells.

Old   August 20, 2021, 14:18
Default
  #9
Senior Member
 
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
Quote:
Originally Posted by artkingjw View Post
But thanks for letting me know that .dat files contain all time steps. I guess this explains why they gradually grow in size as more time steps are done?
Fluent stores all the time-steps that it needs (all 1 or 2 of them); it doesn't store every time-step ever computed. The .dat only grows after the first few time steps, as it adds fields for the previous time-step and maybe fields for statistics and such. It doesn't continuously grow in size forever unless you have some other goodies happening.

Quote:
Originally Posted by artkingjw View Post

Reading results.
Warning: The time step size (1) in the session did not match the time step size in the data file (1e-05), and has been overwritten by the value from the data file.
Why are your time-step sizes not the same? That's a big problem. You should look into it.

It's normal for different number of cores to affect what the residuals look like as you iterate.

I have no idea what you mean when you say converge. Are you saying solve/iterate 100 converges or diverges or are you talking about dual-time-iterate? If you can't converge without time-stepping, then it's meaningless to go to the next time-step.

Old   August 20, 2021, 17:24
Default
  #10
Senior Member
 
Lorenzo Galieti
Join Date: Mar 2018
Posts: 375
Are you reading the .dat file on top of the .cas file? (I already know you and your complicated approaches )

Old   August 21, 2021, 00:23
Default
  #11
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Quote:
Originally Posted by LuckyTran View Post
Fluent stores all the time-steps that it needs (all 1 or 2 of them); it doesn't store every time-step ever computed. The .dat only grows after the first few time steps, as it adds fields for the previous time-step and maybe fields for statistics and such. It doesn't continuously grow in size forever unless you have some other goodies happening.
Yes gotcha that makes sense. I should have thought about that more carefully.

Quote:
Originally Posted by LuckyTran View Post
Why are your time-step sizes not the same? That's a big problem. You should look into it.
Oh, I thought that wasn't a problem? The time step in the setup file is just set to 1 second (much too large), and I reset the time step in my running script. It just so happens that the time step in 'LES1_done.dat' is 1e-5, and I will also run this next stage at 1e-5. So I really don't see the problem here. Sorry, I should have mentioned this earlier.

Quote:
Originally Posted by LuckyTran View Post
It's normal for different number of cores to affect what the residuals look like as you iterate.
I see. I wasn't aware that it could make such a drastic change. The residuals start blowing up immediately when I use anything other than 840 cores.

Quote:
Originally Posted by LuckyTran View Post
I have no idea what you mean when you say converge. Are you saying solve/iterate 100 converges or diverges or are you talking about dual-time-iterate? If you can't converge without time-stepping, then it's meaningless to go to the next time-step.
Yes, solve/iterate 100 diverges.

Also, did you mean that I can't converge *before* time-stepping? I.e. if solve/iterate 100 diverges, then there's no point moving to dual-time-iterate? In either case it diverges after a few iterations, whether I start the run with solve/iterate 100 or go straight into dual-time-iterate.

Old   August 21, 2021, 00:30
Default
  #12
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Quote:
Originally Posted by LoGaL View Post
Are you reading the .dat file on top of the .cas file? (I already know you and your complicated approaches )
Hi Lorenzo.

Yes I read the .cas file first, then read the mesh, then read the .dat file.

Old   August 22, 2021, 12:03
Default
  #13
Senior Member
 
Lorenzo Galieti
Join Date: Mar 2018
Posts: 375
I am not sure, but it could be that the .dat file contains only one time-step while you are using second-order discretization... so once you load it and Fluent finds only one time-step, it freaks out.

If you are using a journal file, I guess it could be worth using only one case (the post-processing one) and turning on all the post-processing goodies once you have iterated enough. There are commands to deactivate the automatic plot exports and to change the frequency with which you export the solution. You can also clear the statistics, again from the journal.

So you would start, deactivate all the post-processing stuff, set the solution write frequency to a large number, iterate, then clear the statistics and activate everything.
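In journal form the idea would be something like this sketch - I don't remember the exact TUI paths, so check them against your version's TUI tree (file names and arguments are placeholders):

Code:
; sketch only - verify each TUI path in your Fluent version
/file/read-case LES_single.cas            ; hypothetical single setup case
/solve/set/data-sampling no               ; no statistics during startup
/solve/set/time-step 1e-5
/solve/dual-time-iterate 10000 150        ; run out the startup transient

; now switch on sampling/exports and continue in the same session
/solve/set/data-sampling yes 1 yes yes    ; arguments vary by version
/solve/initialize/init-flow-statistics    ; clear any accumulated statistics
/solve/dual-time-iterate 50000 150        ; the 'real' sampled run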

Last edited by LoGaL; August 22, 2021 at 15:21.

Old   August 23, 2021, 08:23
Default
  #14
Member
 
Arthur
Join Date: Apr 2015
Posts: 34
Yea it sounds like that's more or less what I'll have to do. Strange quirks of Fluent.
