|
Fatal overflow in linear solver occurs when executing the solution in parallel |
|
March 6, 2020, 14:35 |
|
#21 |
Senior Member
Join Date: Nov 2015
Posts: 246
Rep Power: 12 |
Yes, I get this warning message. But the point with coordinates (0.004, 0.252, 0.0) is located in a part that is filled with air. And when I define the Pressure Level Information manually, I perform an equivalent action, I suppose.
Code:
+--------------------------------------------------------------------+
|                  Reference Pressure Information                    |
+--------------------------------------------------------------------+

 Domain Group: Default Domain

   Pressure has not been set at any boundary conditions.
   The pressure will be set to 0.00000E+00 at the following location:
     Domain      : Default Domain
     Node        : 1 (equation 1)
     Coordinates : ( 4.00000E-03, 2.52000E-01, 0.00000E+00).

+--------------------------------------------------------------------+
|                        ****** Notice ******                        |
|  This is a multiphase simulation in a closed system.               |
|  A global correction will be applied to the volume fractions to    |
|  accelerate mass conservation.                                     |
+--------------------------------------------------------------------+

 Domain Group: Default Domain

   Buoyancy has been activated. The absolute pressure will include
   hydrostatic pressure contribution, using the following reference
   coordinates: ( 4.00000E-03, 2.52000E-01, 0.00000E+00).

Code:
PRESSURE LEVEL INFORMATION:
  Cartesian Coordinates = 0.128 [m], 0.192 [m], 0 [m]
  Option = Cartesian Coordinates
  Pressure Level = 0 [atm]
END |
|
March 10, 2020, 10:16 |
|
#22 |
Senior Member
Join Date: Nov 2015
Posts: 246
Rep Power: 12 |
Small update – I have recalculated the test cases using the Coefficient Loop control and my solution looks stable. With automatic timestepping, most of the timesteps converge in fewer than 10 loops.
Based on the pressure results, I judge that an RMS residual level of 1e-4 is adequate for this simulation. |
|
March 12, 2020, 05:45 |
|
#23 |
Senior Member
Join Date: Nov 2015
Posts: 246
Rep Power: 12 |
At this point I have checked all numerical and physics settings.
Here is a summary of my tests.
-) An RMS residual level of 1e-4 is adequate.
-) Turbulence and surface tension can be neglected.
-) The homogeneous multiphase model is preferable unless whitepapers recommend using the inhomogeneous one.
-) Manually placing the point for Pressure Level Information is mandatory. Automatic pressure level point selection produces unrealistic results.
-) Running the solution in parallel may or may not lead to convergence problems. Use parallel solution with caution.
Note that my conclusions apply only to my simulation settings and only in the context of my problem goals. They do not fit all sloshing problems. I thank Opaque, Gert-Jan and ghorrocks for their help. I learned a lot of new CFX tricks from this thread from you all. Thank you! |
|
March 12, 2020, 06:02 |
|
#24 |
Super Moderator
Glenn Horrocks
Join Date: Mar 2009
Location: Sydney, Australia
Posts: 17,871
Rep Power: 144 |
Thanks for the summary.
It is quite rare for parallel simulations to be different to serial simulations. So your case is one of the rare exceptions.
__________________
Note: I do not answer CFD questions by PM. CFD questions should be posted on the forum. |
|
March 12, 2020, 07:01 |
|
#25 |
Senior Member
Gert-Jan
Join Date: Oct 2012
Location: Europe
Posts: 1,928
Rep Power: 28 |
I just found out that the problem of free surfaces aligning with partition boundaries is recognized by CFX and is described in paragraph 7.18.5.9 of the Manual.
|
|
April 25, 2023, 23:43 |
|
#26 | |
Member
Ashkan Kashani
Join Date: Apr 2016
Posts: 46
Rep Power: 10 |
Hello. I would appreciate your comments on the following.
Problem description: The flow underneath a floating stationary rectangular body. For more modelling details: See the attached CCL file. What's wrong? I'm facing the same divergence problem when doing parallel runs. I'm running the simulation on 128 cores. Everything starts off smoothly. But at some point during the transient solution, the linear solver starts to fail (as signified by "F", see Figure 1), which persists until the CFX solution crashes eventually with the following message: +--------------------------------------------------------------------+ | ERROR #004100018 has occurred in subroutine FINMES. | | Message: | | Fatal overflow in linear solver. | +--------------------------------------------------------------------+ As discussed above, I also suspect that the partitioning is to blame. My suspicion is supported by the fact that the free surface happens to coincide with some interfaces of the adjacent partitions, see Figure 2. My questions: 1- In the case partitioning topology is involved, how to ensure other partitioning methods do a better job? Quote:
3- Any other recommendations to get it to converge more easily? Last edited by Ashkan Kashani; April 26, 2023 at 00:44. |
||
April 26, 2023, 00:00 |
|
#27 |
Super Moderator
Glenn Horrocks
Join Date: Mar 2009
Location: Sydney, Australia
Posts: 17,871
Rep Power: 144 |
Making the element aspect ratio closer to 1 always improves the numerical stability. But whether your simulation has a problem with numerical stability depends on what you are modelling, how you have set the simulation up, mesh quality and many other factors. So some simulations will be very sensitive to this, and some will not.
__________________
Note: I do not answer CFD questions by PM. CFD questions should be posted on the forum. |
|
April 26, 2023, 00:48 |
|
#28 |
Member
Ashkan Kashani
Join Date: Apr 2016
Posts: 46
Rep Power: 10 |
I would also appreciate your comment on the problem with partitioning.
|
|
April 26, 2023, 01:17 |
|
#29 |
Super Moderator
Glenn Horrocks
Join Date: Mar 2009
Location: Sydney, Australia
Posts: 17,871
Rep Power: 144 |
You edited your question and changed it after I had answered it! Please don't do that in future. If you have another question or need further clarification, please add a new post to the thread.
Yes, the partitioning might be affecting stability. In that case, just change the partitioning algorithm. There are many different partitioning algorithms; look in the documentation for the available options. For the FINMES error, see the FAQ: https://www.cfd-online.com/Wiki/Ansy...do_about_it.3F
__________________
Note: I do not answer CFD questions by PM. CFD questions should be posted on the forum. |
|
April 26, 2023, 05:16 |
|
#30 |
Senior Member
Gert-Jan
Join Date: Oct 2012
Location: Europe
Posts: 1,928
Rep Power: 28 |
Not sure if it helps, but here I suggest trying to partition in the x-direction, perpendicular to the free surface. In the Solver Manager (Define Run > Partitioner tab) you can select various methods, of which a main axis direction is one of the options.
I would not start with 128 partitions, because that might lead to many thin slices; start with fewer. Just give it a try. |
|
April 28, 2023, 02:37 |
|
#31 |
Member
Ashkan Kashani
Join Date: Apr 2016
Posts: 46
Rep Power: 10 |
Thank you ghorrocks and Gert-Jan. I've got two more questions regarding your responses.
1- Since in my pseudo-2D simulation all the elements share the same size in the direction of mesh extrusion, the resulting aspect ratios span a broad range and may far exceed 1, no matter what value is set for the thickness (which equals the extrusion length between the symmetry planes). Given that, how can I keep the aspect ratio close to 1 in order to improve numerical stability?

2- I would like to try partitions aligned in one specific direction (normal to the free surface in my setup). However, I can't find the User Specified Direction partitioning method among the options given for the command-line option -part-mode in the documentation. The only options are 'metis-kway' (MeTiS k-way), 'metis-rec' (MeTiS Recursive Bisection), 'simple' (Simple Assignment), 'drcb' (Directional Recursive Coordinate Bisection), 'orcb' (Optimized Recursive Coordinate Bisection) and 'rcb' (Recursive Coordinate Bisection). How do I set this up? I would appreciate your help. |
|
April 28, 2023, 03:21 |
|
#32 |
Senior Member
Gert-Jan
Join Date: Oct 2012
Location: Europe
Posts: 1,928
Rep Power: 28 |
Regarding point 1, use the same mesh size everywhere and make your extrusion depth the same as your mesh size. There is no workaround here. But you can use larger mesh sizes in the air and water, far away from the free surface. However, when using the user-specified direction partitioning method, I don't know what will happen if you have fewer elements along your sky and sea floor than you have partitions. Better to perform a few tests here.
Regarding point 2, there are multiple options:
1) You can create an Execution Control in CFX-Pre (Insert > Solver > Execution Control). There you have the same options as in the Solver Manager, so you can select the partitioning method you like. This partitioning information is written to the definition file, so it is already available when running from the command line. No need to add additional settings.
2) If you start from a results file on the command line and do not specify anything, the solver will run the case with the same settings as the results file was run with. The partitioning settings were written to the results file, so everything is already available.
3) You can extract the execution control settings from a clean definition file by typing:
Code:
cfx5cmds -read -def <file.def> -ccl <settings_def.ccl>
Do the same for a successful results file:
Code:
cfx5cmds -read -def <file.res> -ccl <settings_res.ccl>
Then, using a text editor, copy and paste the execution control section from the results CCL into the definition CCL. Then overwrite the old settings in the definition file with the new settings by:
Code:
cfx5cmds -write -def <file.def> -ccl <settings_def.ccl>
and off you go. |
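[Editor's note] For reference, the execution-control section that ends up in the definition file looks roughly like the sketch below. This is a sketch from memory, not copied from a working case: the exact parameter names (in particular the option name and direction parameter under PARTITIONING TYPE) should be checked against the CCL that your own CFX-Pre or Solver Manager writes out.

```text
# Hypothetical CCL sketch - verify parameter names against your own setup
SIMULATION CONTROL:
  EXECUTION CONTROL:
    PARTITIONER STEP CONTROL:
      Multidomain Option = Independent Partitioning
      Runtime Priority = Standard
      PARTITIONING TYPE:
        Option = User Specified Direction
        Partitioning Direction = 1, 0, 0   # x-direction, as suggested above
      END
    END
  END
END
```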
|
May 28, 2023, 14:05 |
|
#33 | |
Member
Ashkan Kashani
Join Date: Apr 2016
Posts: 46
Rep Power: 10 |
Quote:
I tried different partitioning methods, but they only DELAYED the solver failure rather than getting rid of it entirely. Eventually I realized that the instability stems from a large extrusion length, leading to very high aspect ratio elements that are apparently implicated in the solver failure. Fixing this greatly stabilized the solution. |
||
May 28, 2023, 19:48 |
|
#34 |
Super Moderator
Glenn Horrocks
Join Date: Mar 2009
Location: Sydney, Australia
Posts: 17,871
Rep Power: 144 |
Improving mesh quality always helps, and sometimes in ways you would not expect. Good to hear you got it working.
__________________
Note: I do not answer CFD questions by PM. CFD questions should be posted on the forum. |
|
June 23, 2023, 13:12 |
|
#35 |
Member
Ashkan Kashani
Join Date: Apr 2016
Posts: 46
Rep Power: 10 |
Hello again
I've got another relevant inquiry, so I'm posting it here. I appreciate your comments. I have observed cases where the linear solver keeps failing ('F' is returned in the output file), yet the transient solution seems to go on unaffected, i.e. the RMS values remain low and the monitor points (such as lift) do not show anything odd evolving.
1- Why does the linear solver failure not mess up the RMS values (which are still well below the tolerance)? Are those two unrelated matters?
2- Under such circumstances, are the results still reliable despite the recurrent failure of the linear solver? |
|
June 24, 2023, 00:20 |
|
#36 |
Super Moderator
Glenn Horrocks
Join Date: Mar 2009
Location: Sydney, Australia
Posts: 17,871
Rep Power: 144 |
You need to understand the structure of the solver to answer those questions.
When you do iterations in CFX you are seeing the outer loop of the solver. Each of these iterations has the coefficients updated to account for the non-linear nature of the Navier-Stokes equations. But at each iteration the non-linear parts of the Navier-Stokes equations are linearised, which leaves you with a set of linear equations to solve. CFX uses a multigrid solver to solve these linear equations.

If you have studied numerical methods you will know that there are many linear equation solvers out there, from direct methods such as matrix inversion (which solve the equations exactly in one go, but whose cost is impractical for all but small problems) to iterative solvers (which iterate towards the solution and can use far fewer calculations than direct solvers, but do not give an exact answer, so they require iterations until they are close enough).

So the "F" for the linear solver shows the multigrid solver has not iterated to its specified accuracy. As the outer iterations progress, the inner linear solver is really just giving you better linearisation coefficients. The inner solution just needs to be good enough that the linearisation is better than at the last outer iteration. This can be achieved with quite poor inner convergence. That is why the convergence tolerance on the inner equations is only 0.1 or 0.01; that is all that is required. Often even coarser than that is adequate, which is why your simulation still converges despite the linear solver failures.

Of course, if the linear solver does a really bad job and the linearisation gets worse, then the outer loop will diverge and the run will crash. So you cannot push this too far. But if the linear solver works moderately well, that is often still enough for the outer loop to converge.
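[Editor's note] The outer/inner structure described above can be sketched in plain Python (a toy problem of my own choosing, not CFX internals): a Picard outer loop repeatedly re-linearises a small nonlinear diffusion problem, while the inner Jacobi solver is deliberately stopped after only one order of magnitude of residual reduction, mirroring the loose 0.1 inner tolerance mentioned above. The loose inner solve is still enough for the outer loop to converge.

```python
# Toy outer/inner iteration (illustrative only, not CFX code).
# Nonlinear 1D diffusion: -d/dx( k(u) du/dx ) = 1 on (0,1),
# u(0) = u(1) = 0, with solution-dependent conductivity k(u) = 1 + u^2.

N = 11            # grid points
H = 1.0 / (N - 1)

def assemble(u):
    """Picard linearisation: freeze k(u) at the current iterate,
    giving a linear tridiagonal system A(u_old) u_new = rhs."""
    a, b, c, rhs = [0.0] * N, [1.0] * N, [0.0] * N, [0.0] * N
    for i in range(1, N - 1):
        kw = 1.0 + (0.5 * (u[i - 1] + u[i])) ** 2   # west face coefficient
        ke = 1.0 + (0.5 * (u[i] + u[i + 1])) ** 2   # east face coefficient
        a[i], b[i], c[i], rhs[i] = -kw, kw + ke, -ke, H * H
    return a, b, c, rhs

def residual(a, b, c, rhs, u):
    """Max-norm residual of the tridiagonal system."""
    return max(abs(rhs[i]
                   - (a[i] * u[i - 1] if i > 0 else 0.0)
                   - b[i] * u[i]
                   - (c[i] * u[i + 1] if i < N - 1 else 0.0))
               for i in range(N))

def jacobi(a, b, c, rhs, u, reduction=0.1, max_sweeps=500):
    """Inexact inner solve: stop once the linear residual has dropped
    by `reduction` (the loose inner tolerance described above)."""
    r0 = residual(a, b, c, rhs, u)
    for _ in range(max_sweeps):
        u = ([u[0]]
             + [(rhs[i] - a[i] * u[i - 1] - c[i] * u[i + 1]) / b[i]
                for i in range(1, N - 1)]
             + [u[-1]])
        if residual(a, b, c, rhs, u) <= reduction * r0 + 1e-14:
            break
    return u

u = [0.0] * N
for outer in range(100):                     # outer "coefficient" loop
    a, b, c, rhs = assemble(u)               # re-linearise around current u
    if residual(a, b, c, rhs, u) < 1e-10:    # non-linear residual converged
        break
    u = jacobi(a, b, c, rhs, u)              # loose inner solve is enough
```

Despite each inner solve achieving only a factor-of-10 residual reduction, the outer loop drives the non-linear residual down to 1e-10, because each outer pass only needs a linearisation that is better than the previous one.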
__________________
Note: I do not answer CFD questions by PM. CFD questions should be posted on the forum. |
|
July 7, 2023, 12:34 |
|
#37 |
Member
Ashkan Kashani
Join Date: Apr 2016
Posts: 46
Rep Power: 10 |
Thanks ghorrocks. As always, thoroughly answered.
|
|
|
|