|
chtMultiRegionSimpleFoam won't restart. P or T divergence. Continuity? |
|
March 5, 2021, 00:11 |
chtMultiRegionSimpleFoam won't restart. P or T divergence. Continuity?
|
#1 |
New Member
Mark Yobb
Join Date: Feb 2021
Location: Calgary, AB
Posts: 4
Rep Power: 5 |
Hello all.
I am have trouble getting reliable results from a chtMultiRegionSimpleFoam model. Additionally I am struggling with solution stability... I'm hoping someone can point me in the right direction. The mesh was generated parametrically in blockMesh and checkMesh says everything is OK. The geometry is a length of a 1/8 (45 degree) quadrant of annular pipes with hot air flowing upwards between them. Axially orientated aluminum fins span radially between the OD of the inner pipe and the ID of the outer pipe. I am using laminar flow as the Re is approximately 600. The up flowing air is quite hot. It starts at 1273 K and loses heat to the inner and outer perimeters of the annulus as well as to the fins that span between. I have used a line of symmetry through the center of the radial fins. The mesh is more course than I ultimately want for the final solution. The intention is to use mapFields to map the fields onto a finer meshed version of the model once I get a stable solution so I can refine the results. I have tried a number of different strategies to get some meaningful results. The big issue is the solver tends to blow up based on T or p_rgh... I found that if I used under relaxation factors of around 0.4 I could get the solver to run. A number of other settings I have tried would blow up shortly after the solver started. I finally got the settings that seemed to run though the residuals for p_rgh were always a bit high (on the order of 0.001 to 0.0001) and oscillatory. Things settled down and I was able to run the solver for a couple of days. I would take a look at the velocity profile changes as time steps progressed to get a handle on whether the problem was converging or not. (As well as I was watching the residuals with foamMonitor. (This is the 8th major version of the model that I am running.) As this model was stable (or so it seemed) I spent some time looking at the development of the p_rgh, rho, and T contours in paraview etc. A few things seemed odd: 1. Non zero at wall velocity profiles were indicated in paraview. Numerous posts I have found indicated that this was an issue with paraview (though possibly and old issue?) so after some contemplation and further digging I chalked this up to something I would check in detail once I refined the mesh. In prior runs when I refined the mesh paraview would start indicating proper no slip velocities. 2. I ran the model for 194 seconds at 0.0002 time steps. This took a couple days. The maximum Courant numbers is 0.782 in the last time step. I noticed that the time step directories progressed with additional digits below the 0.000000001 digit. Not sure how this makes sense. Anyway is seems like a peripheral issue. The final time step is 193.440000004. The preceding written time step is 193.400000004173137...? 3. I was intending to use a reference Pressure (pRefCell) so the solver didn't have to carry the additional 5 digits (absolute pressure nominally 1 bar.) This did not work and I read somewhere that this is not implemented for the Simple solver so I went back to put 100000 pa in the field files as a boundary condition. I am not including all of the errors from all of the runs because this would be way too much info for one post. I have seemingly resolved all of the other issue it is just this final stability issue that has be stymied. Once I stopped the run (with nextWrite in controlDic) I looked through the numbers to determine where I was at. I postProcessed out the mass fluxes and heat fluxes. 
I was doing a cross check to see whether the change in enthalpy of the air passing through the channel was equal to the integrated heat flux on the surface of the channel when, to my surprise, I discovered that the discharge mass flux is only about 60 percent of the inlet mass flux. I have read that heat flux is not necessarily conserved in OpenFOAM, which I was prepared to accept, but conservation of mass has to be maintained at convergence. This seems like far too much continuity error considering how long things have run. I thought I might run things a bit longer to see if the discharge mass rate would converge toward the inlet mass rate, but after running for two days there should be nowhere near this much error; the velocity and temperature contours were not moving with each successive time step.

The problem, though, is that I cannot get the model to run again, despite the fact that it ran along fine for a couple of days. I tried changing my p_rgh solver to PCG but that just changed how it blew up: instead of blowing up on p_rgh it blew up on T. I have found this solver to be very sensitive to blowing up on T / p_rgh. On prior runs I also tried PIMPLE (chtMultiRegionFoam) to skirt this issue; PIMPLE blew up in a similar way. I have changed my solvers, my gradSchemes, my snGradSchemes, my divSchemes, etc., and I have scanned through every piece of info I can find.

One thing that worked in prior runs was turning off gravity. The issue was that when I turned gravity back on (after the velocity profile had developed), even to only 0.1 m/s2, things just blew up. For this reason the prior run with the current settings was run at 5 m/s2 to see if I could get things resolving. It worked, but it would not continue running after I stopped it either. To be clear, when I write "blew up" I mean 1000 iterations and p_rgh going to something like 3.239e74, or T going negative, and the solver throwing errors and stopping.

I don't want to abandon two days of run time and start again (this run had gravity at 9.81 and things were starting to look good); things could just end up in the same spot, and I had trouble getting the solver to run for any amount of time in the first place.

I have read about how CHT models suffer from the difference in convergence rates between the solid and fluid regions, and I found a paper where someone added an outer loop that advanced the solid heat transfer so things approached steady state faster. This seems possibly unrelated, though, because I was already able to run for two days (and a bit). I am not sure I understand this issue as described in the forums either: to me the conduction rate of the solid region is extremely high; the issue is the heat capacity of the steel, which holds heat over numerous time steps. I have not seen the issue discussed in this light, so maybe I am not understanding the solver issue?

My priority is getting things running and understanding where I am in terms of convergence; continuity seems like it should be the least of the issues. I am attaching everything that the file size limit will allow (fvSolution, fvSchemes, thermophysicalProperties, etc.) as a zip.

Can anyone provide some hints on how I might stabilize this solver?

Kind regards,
Mark |
|
March 5, 2021, 08:37 |
|
#2 |
Senior Member
Arjun
Join Date: Mar 2009
Location: Nurenberg, Germany
Posts: 1,290
Rep Power: 34 |
Start the calculation with very, very small under-relaxation factors, then increase them after 1 or 2 iterations.
The issue is that the fluxes are perhaps not well constructed when the restart files are read. After 1 or 2 iterations the fluxes will be properly constructed, and then you can resume the normal calculation. |
|
March 8, 2021, 05:43 |
|
#3 |
New Member
Mark Yobb
Join Date: Feb 2021
Location: Calgary, AB
Posts: 4
Rep Power: 5 |
Arjun,
Thanks so much for a quick reply. You got me going again! Thank-you. I have spent the last number of days using your advise. I am not sure I have fully resolved my issue... I am able to get things moving again when I use really really lower under relaxation. To get things going I used: relaxationFactors { fields { rho 0.0003; p_rgh 0.003; } equations { U 0.0004; h 0.0004; } } Once things got going I could make the factors higher... typical as: fields { rho 0.008; p_rgh 0.08; } equations { U 0.008; h 0.008; } I note a few things that give me pause: 1. I could not make the relaxation factors much large or the solver would diverge as before... 2. It seemed that p_rgh had to have an order of magnitude larger relaxation factor to make things stable. 3. Despite the very low relaxation factors the solver still progressed. 4. When I set the relaxation factors very low the disparity in the mass fluxes in and out soon went away. I am wondering if the low relaxation factor requirement is associated with the slow progress of the heat build up in the solid (metallic) portion of the domain? Could this also be the case with the continuity errors? Could this be associated with the large density changes in this simulation? Density changes from approximately 0.2 to 0.8 as the flow progress through the channel. Any insight would be greatly appreciated! Kind regards, Mark |
|
March 11, 2021, 04:25 |
|
#4 |
Senior Member
Arjun
Join Date: Mar 2009
Location: Nurenberg, Germany
Posts: 1,290
Rep Power: 34 |
Actually I will not be able to be of much help, but I know that this issue of people not being able to restart a calculation in OpenFOAM comes up regularly in the forums.
So far my understanding is that the fluxes are not stored in the restart, so they need to be reconstructed, and this can cause trouble (the flux dissipation needs to be reconstructed too). For this reason, in our solver Wildkatze we store everything that is needed to reconstruct the fluxes exactly as they were before the restart was saved, so in our case restarts are smooth. Maybe some other variable that is also needed is missing from the restart files? |
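(A quick sanity check, just as a sketch; the time and region directory names below are placeholders. Compare what was actually written at the restart time with the initial fields and see whether phi and the other fields are all there:)

Code:
    # compare the fields written at the last time with the initial ones
    # <latestTime> and <fluidRegion> are placeholders
    ls <latestTime>/<fluidRegion>/
    ls 0/<fluidRegion>/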
|
Tags |
chtmultiregionsimplefoam, continuity, convergence, stability problem
|
|