LES Diverges With Different .cas File - Does .cas File Change?
#1
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
Hi,
This seems like a very strange problem to have, so I want to know if I'm doing something wrong. I'm running some LES at the moment (simple compressible wall-bounded flow, static mesh) and was able to get converging runs without much trouble. I am on Fluent 2021R1.

My problems seem to come from my workflow, so I'll describe it first. Note that I don't use the TUI a lot, as I don't have extensive experience in Fluent. To keep my workflow neat and modular, I first generate 'setup' .cas files with very small meshes. These files are small and load very quickly on my local machine or in interactive HPC sessions, which makes changes easy and also lets me swap meshes very quickly. The cases that I actually run contain large mesh files, as is expected for LES. So I typically generate the following .cas files and corresponding .dat files:

1) Steady.cas - completed -> Steady_done.cas
2) LES1.cas - completed -> LES1_done.cas
3) LES2.cas - completed -> LES2_done.cas

First, I use 'Steady.cas', load in my large mesh file (replace-mesh), then generate the initial steady-state solution. Next, I load 'LES1.cas', load in the same large mesh file, load in 'Steady_done.dat', and run the calculation. LES1 is used to overcome any potential startup transients and reach a statistically averageable state. Everything works up to this point.

'LES2.cas' is the setup file meant to run the 'real' part of the simulation, with data sampling, regular solution exports, custom field functions etc. Just as before, my plan is to load 'LES2.cas', load the large mesh file in, load in 'LES1_done.dat' to resume the simulation, then carry on. Unfortunately, the simulation diverges if I do this. Instead, the only workaround seems to be to modify 'LES1_done.cas' manually until I've replicated the setup of LES2. Note that there is no difference between LES1 and LES2 in terms of the boundary conditions, LES model, solution method etc. The only differences are the presence of custom field functions (that do not 'feed back' into the simulation), data sampling, automatic solution exports, and custom lines/planes for solution exports.

To troubleshoot, I've even re-used 'LES1.cas' in place of 'LES2.cas'. So I load 'LES1.cas', then load 'LES1_done.dat'. This diverges too. The only way to resume the simulation is to use a derivative of 'LES1_done.cas'. The only conclusion I can draw from this is that 'LES1_done.cas' is somehow different from 'LES1.cas', apart from its size due to the mesh.

Can anyone provide insight into this strange issue? The only possible change I can think of is my use of 'extrapolate variables' in 'LES1.cas' and 'LES2.cas' - maybe there's some information stored in the .cas file? My impression from past experience has been that the .cas file merely contains the settings of your case (BCs, methods etc.) and the mesh. Any help/solution to my problem would be appreciated. Thanks.
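For clarity, each stage boils down to a short journal along these lines. This is a sketch from memory, not my exact file; the command names follow what I mention above (replace-mesh, read-data), but the exact TUI paths may differ between Fluent versions, so treat them as approximate:

```
; stage LES1, resuming from the steady solution (sketch, not verbatim)
/file/read-case LES1.cas.h5          ; small 'setup' case
/mesh/replace big_mesh.msh.h5        ; swap in the large LES mesh
/file/read-data Steady_done.dat.h5   ; initial field from the steady run
/solve/dual-time-iterate 10000 150   ; advance in time
/file/write-case-data LES1_done.cas.h5
```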
#2
Senior Member
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
Rep Power: 66
When you load the .dat, does it recognize solutions at time levels n-1 (and maybe even n-2 if you are doing 2nd order)?
Have you tried iterating the current time-step without advancing in time to reconstruct all the temporary variables?
#3
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
Quote:
Oh? I've never known about that. It looks like specifying something like /solve/dti 0 100 should do it? I.e. 0 time steps, but 100 iterations allowed.
#4
Senior Member
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
Rep Power: 66
Just do /solve/iterate
It's not enough to store just the solution at the current time-step. What is needed is determined by your temporal discretization: you need the previous time-step for 1st order and the previous 2 time-steps for 2nd order. If you write your own CFD code and use a fancier temporal discretization, you can need even more time-steps.
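To illustrate with a toy example (this is a generic second-order backward-difference update on the test problem du/dt = -u, invented for illustration, not Fluent's actual scheme): a 2nd-order step needs both u^n and u^(n-1), so a restart that keeps only one time level cannot take the same next step as the original run.

```python
# Toy model of why a 2nd-order restart needs two stored time levels.
# Test problem: du/dt = -u (made up for illustration; not Fluent's solver).

def bdf2_step(u_n, u_nm1, dt):
    # BDF2: (3*u_next - 4*u_n + u_nm1) / (2*dt) = -u_next
    # Solved for u_next:
    return (4.0 * u_n - u_nm1) / (3.0 + 2.0 * dt)

def backward_euler_step(u_n, dt):
    # 1st-order fallback when only one level survives the restart:
    # (u_next - u_n) / dt = -u_next
    return u_n / (1.0 + dt)

dt = 0.1
u_nm1, u_n = 1.0, 0.9048  # the two stored time levels

u_bdf2 = bdf2_step(u_n, u_nm1, dt)          # what the original run would do
u_fallback = backward_euler_step(u_n, dt)   # what a one-level restart can do

print(f"BDF2 next step:    {u_bdf2:.6f}")
print(f"One-level restart: {u_fallback:.6f}")
# The two values differ, so the restarted trajectory immediately
# departs from the original one.
```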
#5
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
Noted. I gave this a shot and it didn't work. Here's the script I used:
Code:
/solve/iterate
/solve set time-step 0.00001
/solve/dti 10000 150

I'll show part of the job output here:

Code:
> Number of iterations [1]
  iter  continuity  x-velocity  y-velocity  z-velocity  energy      time/iter
!10085 solution is converged
 10085  9.9018e-07  4.7341e-14  1.7326e-14  3.7117e-14  2.8626e-13  0:00:00    1
 10086  1.7508e+03  1.1955e-07  6.8739e-08  9.1631e-08  5.5154e-03  0:00:00    0
>
> Updating solution at time levels N and N-1.
 done.
  iter  continuity  x-velocity  y-velocity  z-velocity  energy      time/iter
 10086  1.7508e+03  1.1955e-07  6.8739e-08  9.1631e-08  5.5154e-03  0:02:48  150
Stabilizing temperature to enhance linear solver robustness.
 10087  2.1357e+03  1.1038e-05  2.7699e-05  1.2161e-05  7.0038e-03  0:02:33  149
Stabilizing temperature to enhance linear solver robustness.

Quote:
Does this mean that I should save multiple time steps in succession? Would I just use 2 read-data operations to load in both of these successive time steps? Or is there a special function for this? Thanks.

P.S. While this is very valuable information, I'm not sure it explains the strange behavior where the simulation proceeds normally if I use 'LES1_done.cas'. There I did not need to load any special data. I guess it's possible that 'LES1_done.dat' contains two time steps' worth of data, and only 'LES1_done.cas' knows how to read it? I'm probably waayyyy off on this...
#6
Senior Member
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
Rep Power: 66
You need to do /solve/iterate 100 or so. Please do more than 1 iteration.
Fluent saves all the time levels into the .dat. That's why I asked whether you saw it load them when you read the .dat into your new .cas. It needs to. The question is, does it? Read your output! However, the fact that your residuals spiked after 1 iteration without advancing in time means there is already an inconsistency in the data being loaded. If you load converged data into a fresh case, then the continuity residual should be stuck at 1 and the others should maintain their asymptotic behavior. But all of yours spike.
#7
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
Quote:
Code:
> Reading from [address of HPC system]:"/scratch/.../LES1_done.dat.h5" in NODE0 mode ...
  Reading results.

Warning: The time step size (1) in the session did not match the time step
size in the data file (1e-05), and has been overwritten by the value from
the data file.

Parallel variables...
Done.

Does this seem incorrect to you? The output before this indicates the reading of my mesh, and the output after this indicates the start of solution advancement. The data loaded is definitely converged. I'll let you know later once I've tried /solve/iterate 100 instead. But thanks for letting me know that .dat files contain all the time levels. I guess this explains why they gradually grow in size as more time steps are done?
#8
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
I've now tried the /solve/iterate 100 method and found something even more interesting.
With this method, the solution will converge or diverge depending on the number of CPU cores used. If I give the job 840 cores, it converges. If I give it something larger, say 1680 or 1736 cores, it diverges. But if I run the job on 28 cores, it also diverges (or rather, the residuals trend towards divergence; I didn't wait around for that to happen, this was just for the lols). I even got the job to converge without doing /solve/iterate 100 earlier today, but I forgot how many cores I used that time, and I haven't been able to replicate it since. To me this now looks like another problem, where I seem to have hit some sort of limit on the ratio of number of cells to number of CPU cores. I'm not sure what it is, but if anyone is curious, my number of polyhedral cells is ~27,000,000 for the case I have been working on so far.
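For reference, here is the rough cells-per-core arithmetic for the core counts above. These are just the numbers from this post; whether any particular value is an actual threshold is exactly the open question:

```python
# Rough cells-per-core for a ~27M-cell polyhedral mesh at the core
# counts tried above. No claim that any value is a hard Fluent limit;
# this is just the arithmetic.
cells = 27_000_000
for cores in (28, 840, 1680, 1736):
    print(f"{cores:5d} cores -> about {cells // cores:>9,} cells per core")
```

So the converging 840-core run sits around 32k cells per core, while the diverging 1680/1736-core runs sit around 16k.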
#9
Senior Member
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,690
Rep Power: 66
Quote:
Quote:
It's normal for a different number of cores to affect what the residuals look like as you iterate. But I have no idea what you mean when you say converge. Are you saying solve/iterate 100 converges or diverges, or are you talking about dual-time-iterate? If you can't converge without time-stepping, then it's meaningless to go to the next time-step.
#10
Senior Member
Lorenzo Galieti
Join Date: Mar 2018
Posts: 375
Rep Power: 12
Are you reading the .dat file on top of the .cas file? (I already know you and your complicated approaches.)
#11
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
Quote:
Quote:
Quote:
Quote:
Also, did you mean that if I can't converge *before* time-stepping, i.e. if solve/iterate 100 diverges, then there's no point moving to dual-time-iterate? In either case it diverges after a few iterations. This is true whether I start the run with solve/iterate 100 or go straight into dual-time-iterate.
#12
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
#13
Senior Member
Lorenzo Galieti
Join Date: Mar 2018
Posts: 375
Rep Power: 12
I am not sure, but it could be that the .dat file contains only one time step while you are using second-order discretization, so once it's loaded and Fluent finds only one time step, it freaks out.
I guess that if you are using a journal file, it could be worth using only one case (the post-processing one) and turning on all the post-processing goodies once you have iterated enough. There are commands to deactivate the automatic plot exports and to change the frequency with which you export the solution. You can also clear the statistics, again from the journal. So you would start, deactivate all the post-processing stuff, set the solution write frequency to a large number, iterate, then clear the statistics and activate everything.

Last edited by LoGaL; August 22, 2021 at 15:21.
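As a sketch, the journal would look something like the one below. I don't remember the exact TUI paths for the export/statistics commands off the top of my head, so treat every command name here as approximate and check it against your version's TUI tree:

```
; single-case restart with post-processing initially off (approximate)
/file/read-case LES2.cas.h5
/mesh/replace big_mesh.msh.h5
/file/read-data LES1_done.dat.h5
; 1) deactivate automatic exports/plots here and set the autosave
;    frequency to a very large number (these commands live under the
;    /file and /solve menus; exact paths depend on the Fluent version)
/solve/dual-time-iterate 2000 150     ; iterate past the restart transient
; 2) clear the flow statistics, re-enable data sampling, exports and
;    autosave, then continue the production run
/solve/dual-time-iterate 10000 150
```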
#14
Member
Arthur
Join Date: Apr 2015
Posts: 34
Rep Power: 11
Yeah, it sounds like that's more or less what I'll have to do. Strange quirks of Fluent.