Decomposition Methods: which ones are best for large scale simulations?
January 7, 2010, 13:03
#1
New Member
Join Date: Nov 2009
Posts: 17
Rep Power: 17
Which decomposition method (metis, scotch, simple, hierarchical) do you guys favor for large-scale simulations?
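For anyone landing here: the method is selected in system/decomposeParDict. A minimal fragment, assuming an 8-way scotch decomposition (the subdomain count is an illustrative choice, not taken from this thread):

```
// system/decomposeParDict (fragment)
numberOfSubdomains  8;       // total number of processor domains
method              scotch;  // alternatives: metis, simple, hierarchical
```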
January 8, 2010, 03:22
#2
Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,714
Rep Power: 40
Simply try out metis/scotch and see if you are happy with the results. You can view the separate processor* domains in ParaView (e.g., via paraFoam) to see if they match up with your expectations. BTW: I believe that future versions of OpenFOAM might switch to scotch instead of metis due to licensing aspects. I found that the scotch decomposition was fairly similar to metis anyhow.
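One quick numeric check, beyond eyeballing the processor* domains in ParaView, is the cell-count balance of the decomposition. A minimal sketch in Python, using made-up per-processor cell counts (decomposePar reports these when it runs; the numbers here are invented for illustration):

```python
# Hypothetical cell counts per processor domain, as reported by decomposePar.
cells = [251_200, 249_800, 250_900, 248_100]

mean = sum(cells) / len(cells)
imbalance = max(cells) / mean  # 1.0 would be a perfectly balanced split

print(f"mean cells/processor: {mean:.0f}")
print(f"max/mean imbalance:   {imbalance:.4f}")
```

A ratio close to 1.0 (here about 1.005) means the most-loaded rank does barely more work than the average, which is what you want from metis/scotch.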
January 8, 2010, 05:35
#3
Senior Member
BastiL
Join Date: Mar 2009
Posts: 530
Rep Power: 20
What I miss is a parallel partitioning method that would free us from needing machines with very large memory. Regards
January 8, 2010, 05:38
#4
New Member
Join Date: Nov 2009
Posts: 17
Rep Power: 17
Hi,
I'm also using scotch or metis, since they don't require any manual input. I checked the subdomains in ParaView and they look fine. The problem is that I get terrible speed-up (scaling) when I increase the number of processors: nearly all of the computational effort goes into solving the pressure Poisson equation in the PISO loop, which does not scale well with more processors. Are you satisfied with the parallelisation of OpenFOAM? I'm asking because, with this performance, I cannot simulate anything big. Maybe I'm doing something wrong.
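A way to put numbers on a scaling complaint like this: strong-scaling speedup is S(p) = T(1)/T(p) and parallel efficiency E(p) = S(p)/p. A minimal sketch with invented timings (not measurements from this case):

```python
# Hypothetical wall-clock times (s) for the same case on 1, 2, 4, 8 ranks.
times = {1: 1000.0, 2: 520.0, 4: 290.0, 8: 190.0}

t1 = times[1]
for p in sorted(times):
    speedup = t1 / times[p]     # S(p) = T(1) / T(p)
    efficiency = speedup / p    # E(p) = S(p) / p
    print(f"p={p}: speedup={speedup:.2f}, efficiency={efficiency:.2f}")
```

With these made-up numbers, efficiency drops from 1.0 on one rank to about 0.66 on eight, which is the kind of curve a communication-bound pressure solve tends to produce.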
January 8, 2010, 05:58
#5
Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,714
Rep Power: 40
Last edited by olesen; January 8, 2010 at 06:07. Reason: added reference to faq
October 29, 2015, 07:13
#6
Senior Member
Join Date: Mar 2015
Posts: 250
Rep Power: 12
Hi Mark,
can you please tell me how to load the separate processor domains into ParaView?
Best regards, Kate
October 29, 2015, 10:00
#7
Member
Ron Burnett
Join Date: Feb 2013
Posts: 42
Rep Power: 13
Nothing special is needed, Kate. From within each processor directory, do the VTK conversion (e.g. with foamToVTK), then open ParaView and load the result as you normally would. Does this help?
October 29, 2015, 11:41
#8
Senior Member
Join Date: Mar 2015
Posts: 250
Rep Power: 12
Sorry, somehow I overlooked the constant directory inside the processor directories.
Have a nice day, Kate
August 15, 2017, 21:39
#9
New Member
Koushik C
Join Date: Aug 2017
Posts: 1
Rep Power: 0
When it comes to parallel programming, there is a sweet spot between the number of processors you use and the size of your domain. Say your domain has 10,000 nodes/points; then you have to find the optimum number of processors such that communication stays minimal. Intel VTune Amplifier can be helpful in this analysis. Hope this helps. Regards, Koushik
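That sweet spot can be sketched with a surface-to-volume argument: each rank does work proportional to the cells it owns, but exchanges halo data proportional to the faces on its partition boundaries, so the communication fraction grows as the per-rank block shrinks. A rough model, assuming an N³ mesh split into p equal cubic blocks (all numbers illustrative, not from this thread):

```python
# Rough model: an N^3 mesh split into p cubic blocks, each n cells per side.
# Work per rank scales with its n^3 cells; halo exchange with its 6 n^2 faces.
def comm_to_work_ratio(N: int, p: int) -> float:
    n = N / p ** (1 / 3)      # cells per side of one block
    return 6 * n**2 / n**3    # = 6 / n: grows as the blocks get smaller

for p in (8, 64, 512):
    print(f"p={p:4d}: comm/work ~ {comm_to_work_ratio(120, p):.3f}")
```

For a 120³ mesh the ratio quadruples between 8 and 512 ranks, which is one way to see why adding processors eventually stops paying off.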
December 14, 2023, 03:36
#10
Member
MWRS
Join Date: Apr 2018
Location: Islamabad
Posts: 99
Rep Power: 8
Hi everyone.
Just wanted to share that I was having trouble with parallel processing, and found that cell aspect ratio at the partition interfaces is crucial.
Tags
domain decomposition, parallel computing