
Mesh deformation memory requirements

March 20, 2018, 09:36 | #1
Mesh deformation memory requirements
aa.g (New Member, Join Date: Feb 2018, Posts: 4)
Dear all,

I am performing a static FSI calculation where I couple SU2 with an external structural solver using the CFluidDriver class through the python wrapper. Once the structural deformations are applied to the fluid surface mesh, the volume mesh is deformed using the StaticMeshUpdate method (I have found element stiffness based on inverse wall distance to be more robust for this).
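For context, the coupling step looks roughly like this (a minimal sketch of my setup; CFluidDriver and StaticMeshUpdate are the actual wrapper names, while GetNumberVertices, SetVertexCoordX/Y/Z, the config file name, and the helper get_structural_position are written from memory or hypothetical -- check the methods actually exposed by the pysu2 module of your build):

Code:
# Minimal sketch of the FSI coupling step (v6-era python wrapper assumed).
from mpi4py import MPI
import pysu2

comm = MPI.COMM_WORLD
driver = pysu2.CFluidDriver("fluid.cfg", 1, 3, comm)  # "fluid.cfg" is hypothetical

marker = 0  # index of the FSI boundary marker on this rank (placeholder)
for i_vertex in range(driver.GetNumberVertices(marker)):
    # New surface coordinates from the external structural solver (hypothetical helper)
    x, y, z = get_structural_position(i_vertex)
    driver.SetVertexCoordX(marker, i_vertex, x)
    driver.SetVertexCoordY(marker, i_vertex, y)
    driver.SetVertexCoordZ(marker, i_vertex, z)

driver.StaticMeshUpdate()  # deform the volume mesh to match the new surface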

My progress bottlenecks here due to the seemingly excessive memory requirements of the mesh deformation routine (it exhausts my 32 GB of RAM on a relatively small 3M-cell mesh). Having tried several workarounds, I have come to the following observations:
  • The process crashes ("Killed" when running in serial, or a more informative "Exit status 9" when running in parallel) after the calculation of the element volumes/wall distances, perhaps during the assembly of the stiffness matrix? System Load Viewer spikes to >99.8% RAM usage before the crash.
  • For smaller deformations, the deformation sometimes completes successfully.
  • Increasing the number of deformation increments (DEFORM_NONLINEAR_ITER) also seems to affect the issue, which is most likely related to the previous observation.
I hope this gives a reasonable overview of the context. The problem appears with both 5.0.0 and 6.0.0. Having struggled with this for a while now, I would like to know the following:
  • Are the reported memory requirements reasonable for the described case (over 32 GB of RAM for a ~3M-cell volume mesh deformation)? If so, my machine is simply not sufficient... although 32 GB seems excessive for the calculation at hand.
  • Which settings in the Grid Deformation Parameters section of the .cfg could allow me to reduce these memory requirements? Perhaps my choice of solver (FGMRES), preconditioner (ILU), or the remaining parameters is not well-suited, although I suspect that is irrelevant if the problem is the size of the stiffness matrix.
  • Bonus: Why does the amplitude of the deformation (seem to) affect the memory requirements of the volume mesh movement routine? Is there perhaps some kind of "radius", dependent on the boundary deformation, outside of which elements are not included in the calculation?
If anyone has experience with a similar process involving volume mesh deformation, any comment would be very welcome.

March 25, 2018, 23:10 | #2
hlk (Heather Kline; Senior Member, Join Date: Jun 2013, Posts: 309)
Thanks for your question.
In my experience, memory can be an issue for deformation.
I would suggest first setting DEFORM_CONSOLE_OUTPUT= YES in order to confirm that it is crashing at some point during the linear solve.
If that is the problem, you can try:
- using DEFORM_LINEAR_SOLVER= RESTARTED_FGMRES along with LINEAR_SOLVER_RESTART_FREQUENCY= n (it defaults to 10),
- raising the tolerance on the deformation,
- decreasing the number of linear iterations and/or increasing the number of nonlinear iterations,
- setting VISUALIZE_DEFORMATION= YES to get more information on whether there are other issues.

FGMRES can use a lot of memory because it stores information from its iterations; R-FGMRES gets around this by restarting the solution periodically, using the last iterate as the new initial point.
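For a rough sense of scale (my own back-of-the-envelope numbers, not from the SU2 docs): flexible GMRES keeps two Krylov bases, so its memory grows like 2 x m x N x 8 bytes for m iterations on N unknowns. With N ≈ 9e6 (about 3M grid points x 3 displacement components) and m = 200 un-restarted iterations, that is already about 29 GB on top of the stiffness matrix, which is consistent with exhausting 32 GB.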

You might try limiting the number of iterations first, which will also help with troubleshooting other problems that might become apparent when plotting the output geometry.
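Put together, the relevant entries of the Grid Deformation Parameters section might look like the sketch below (option names as I recall them from the v6 config_template.cfg and values purely illustrative -- verify both against your build, as some names changed between versions):

Code:
% ------------------- GRID DEFORMATION PARAMETERS -------------------%
DEFORM_CONSOLE_OUTPUT= YES              % print linear-solver convergence
DEFORM_LINEAR_SOLVER= RESTARTED_FGMRES  % caps the stored Krylov basis
LINEAR_SOLVER_RESTART_FREQUENCY= 10     % basis length between restarts
DEFORM_LINEAR_SOLVER_PREC= ILU
DEFORM_LINEAR_ITER= 200                 % fewer linear iterations per increment
DEFORM_NONLINEAR_ITER= 5                % more, smaller deformation increments
DEFORM_LINEAR_SOLVER_ERROR= 1E-8        % looser tolerance (name may vary by version)
DEFORM_STIFFNESS_TYPE= WALL_DISTANCE    % inverse wall distance, as in post #1
VISUALIZE_DEFORMATION= YES              % write the deformed mesh for inspection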

Quote: Originally Posted by aa.g (post #1)

April 6, 2018, 11:37 | #3
aa.g (New Member, Join Date: Feb 2018, Posts: 4)
Dear hlk,

Thank you very much for your detailed and helpful answer. Let me get back to you with a quantitative estimate of the improvements in my specific case.

In the meantime, I would like to point out another minor issue that I came across when testing my process on a larger model: because the index iVertex is declared as an unsigned short in several places across the CDriver classes, one runs into overflow errors when more than 65535 surface nodes are "owned" by a single process. Was this intended by the developers? At the cost of a few bytes, the problem is solved simply by refactoring this variable to unsigned, or even unsigned long.
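To illustrate the wraparound (a minimal Python illustration of unsigned 16-bit arithmetic, not SU2 code):

Code:
# A 16-bit unsigned index holds at most 65535, then wraps back to zero.
import ctypes
print(ctypes.c_ushort(65535).value)  # 65535 -- largest representable vertex index
print(ctypes.c_ushort(65536).value)  # 0 -- vertex 65536 aliases vertex 0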

Quote: Originally Posted by hlk (post #2)

April 9, 2018, 20:45 | #4
hlk (Heather Kline; Senior Member, Join Date: Jun 2013, Posts: 309)
For reporting bugs, we suggest using the GitHub issue tracker:
https://github.com/su2code/SU2/issues

You may see a couple of 'issues' that are actually questions more appropriate for the forum - occasionally people get confused about which one to use, or don't know which category their question falls under. The forum is generally used for questions about how to use SU2 (like your original question) that can be answered by anyone with experience with SU2 or CFD, while the issue tracker is meant to be for reporting bugs (like an unsigned short being used where it ought to be unsigned long) that require attention from code developers.

If you would like to fix the problem yourself, please see the developer docs available here:
https://su2code.github.io/docs/home/

March 24, 2020, 13:47 | #5
Follow up
jomunkas (Peter; New Member, Join Date: May 2019, Posts: 13)
Hi,

I have encountered a similar issue, as I am still using v6. The suggestions from hlk are clear, thank you. Just one more question:
Is there any setting that also lets the solver use hard-disk memory?

Quote: Originally Posted by aa.g (post #1)

March 24, 2020, 16:34 | #6
pcg (Pedro Gomes; Senior Member, Join Date: Dec 2017, Posts: 466)
Hi Peter, no, SU2 does not run out-of-core. Adding to Heather's suggestions, you can also try the CONJUGATE_GRADIENT linear solver.
Do note that a number of issues with 3D mesh deformation have been addressed for v7...
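For reference, assuming DEFORM_LINEAR_SOLVER accepts this value in your version, the switch is a single line; CG stores only a few work vectors, so its memory footprint stays fixed regardless of the iteration count (and the mesh stiffness matrix is symmetric, so CG applies):

Code:
DEFORM_LINEAR_SOLVER= CONJUGATE_GRADIENT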


Tags
deformation, fsi, memory, python, su2_def

