PETSC gives a slowdown instead of speedup (GAMG) |
February 23, 2021, 00:39 | #1
PETSC gives a slowdown instead of speedup (GAMG)
Member
Junting Chen
Join Date: Feb 2016
Location: Ontario, Canada
Posts: 38
Rep Power: 10
Hello all, I am a beginner with the PETSc library. Basically I am using the BoomerAMG configuration listed in this paper: https://prace-ri.eu/wp-content/uploa...-Framework.pdf

Comparing performance with the SIMPLE algorithm, the PETSc pressure solver takes about four times as long as the GAMG pressure solver. The simulation is relatively simple: 2 million cells, flow past a bluff body, running on 22 CPUs. I don't think this is right... I have seen quite a lot of people saying PETSc is the better option (similar or faster speed, better scaling). I guess the PETSc solver was not properly configured in my case. I haven't spent much time understanding the math behind PETSc; I just want to know whether the PETSc solver's performance and stability depend heavily on the input parameters.

Thanks!
Junting
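For reference, the BoomerAMG-preconditioned CG setup described in the paper can be driven directly through the PETSc API. The C sketch below is a minimal, hypothetical illustration: it assumes the pressure matrix A and the vectors b and x already exist as assembled PETSc objects (the OpenFOAM plugin performs that copy), and the BoomerAMG coarsening option shown is only an illustrative choice, not the exact setting from the paper.

```c
#include <petscksp.h>

/* Minimal sketch: solve A x = b with CG preconditioned by hypre BoomerAMG.
   Assumes A, b, x are already assembled PETSc Mat/Vec objects. */
PetscErrorCode solve_pressure(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG); CHKERRQ(ierr);          /* conjugate gradients   */
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCHYPRE); CHKERRQ(ierr);          /* hypre preconditioners */
  ierr = PCHYPRESetType(pc, "boomeramg"); CHKERRQ(ierr); /* algebraic multigrid   */

  /* Illustrative BoomerAMG tuning; the actual values should be taken
     from the option list in the paper. */
  ierr = PetscOptionsSetValue(NULL, "-pc_hypre_boomeramg_coarsen_type", "HMIS"); CHKERRQ(ierr);

  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);          /* pick up runtime options */
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  return 0;
}
```

The same settings can equally be passed as command-line options (-ksp_type cg -pc_type hypre -pc_hypre_type boomeramg), which is how the paper lists them.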
February 24, 2021, 06:18 | #2
Senior Member
Hrvoje Jasak
Join Date: Mar 2009
Location: London, England
Posts: 1,907
Rep Power: 33
I am not surprised at all. The linear solvers in FOAM are hand-written to be optimal for the storage structure of the matrix, the solver algorithms are simple and well understood, and the parallel communication is optimised to the extreme. It would be VERY optimistic to expect a general-purpose linear algebra library to match this, especially with the extra cost of copying the matrix from one format to another.
Hrv
__________________
Hrvoje Jasak Providing commercial FOAM/OpenFOAM and CFD Consulting: http://wikki.co.uk |
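To make the copying cost mentioned above concrete, here is a rough, hypothetical C sketch of moving an OpenFOAM-style LDU matrix into a PETSc AIJ matrix entry by entry. The names nCells, nFaces, owner, neighbour, diag, lower and upper are illustrative placeholders for the LDU addressing, not the actual plugin code; the sketch is serial, and a parallel version would additionally need global indices and proper preallocation.

```c
#include <petscmat.h>

/* Hypothetical sketch of the per-solve copy overhead: inserting an
   LDU-style matrix (diagonal plus face coefficients with owner/neighbour
   addressing) into a PETSc AIJ matrix one entry at a time. */
PetscErrorCode ldu_to_aij(PetscInt nCells, PetscInt nFaces,
                          const PetscInt *owner, const PetscInt *neighbour,
                          const PetscScalar *diag,
                          const PetscScalar *lower, const PetscScalar *upper,
                          Mat *Aout)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscInt       i, f;

  ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
  ierr = MatSetSizes(A, nCells, nCells, PETSC_DETERMINE, PETSC_DETERMINE); CHKERRQ(ierr);
  ierr = MatSetType(A, MATAIJ); CHKERRQ(ierr);
  ierr = MatSetUp(A); CHKERRQ(ierr);

  for (i = 0; i < nCells; i++) {   /* diagonal coefficients */
    ierr = MatSetValue(A, i, i, diag[i], INSERT_VALUES); CHKERRQ(ierr);
  }
  for (f = 0; f < nFaces; f++) {   /* off-diagonal (face) coefficients */
    ierr = MatSetValue(A, owner[f], neighbour[f], upper[f], INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A, neighbour[f], owner[f], lower[f], INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  *Aout = A;
  return 0;
}
```

This insertion and assembly work has to be repeated whenever the matrix coefficients change, which is the overhead a native FOAM solver never pays.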
February 26, 2021, 00:23 | #3
Member
Junting Chen
Join Date: Feb 2016
Location: Ontario, Canada
Posts: 38
Rep Power: 10
Thanks for clearing up my doubt!
February 28, 2021, 15:07 | #4
Senior Member
I do share the optimism, for various reasons.

The first reason is that, once inside the PETSc environment, one can easily perform event and stage logging using PetscLogEventBegin/End to separate the problem set-up (including the matrix format conversion) from the linear system solve. See e.g. https://www.mcs.anl.gov/petsc/petsc-...ventBegin.html

The second reason is Section 6 (Conclusions) of the white paper found here: https://prace-ri.eu/wp-content/uploa...-Framework.pdf

Thank you for your wonderful work!

Domenico Lahaye.
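As an illustration of the event logging described above, here is a minimal, hypothetical C sketch that wraps the matrix copy and the linear solve in separate PETSc events so their timings are reported independently. The event names and the surrounding function are made up for the example; only PetscLogEventRegister/Begin/End and KSPSolve are actual PETSc calls.

```c
#include <petscksp.h>

/* Register two events (in real code, register once during initialisation)
   and time the OpenFOAM-to-PETSc copy separately from the solve. */
PetscLogEvent EV_COPY, EV_SOLVE;

PetscErrorCode timed_pressure_step(Mat A, Vec b, Vec x, KSP ksp)
{
  PetscErrorCode ierr;

  ierr = PetscLogEventRegister("FoamToPetscCopy", MAT_CLASSID, &EV_COPY); CHKERRQ(ierr);
  ierr = PetscLogEventRegister("PressureSolve",   KSP_CLASSID, &EV_SOLVE); CHKERRQ(ierr);

  ierr = PetscLogEventBegin(EV_COPY, 0, 0, 0, 0); CHKERRQ(ierr);
  /* ... copy the OpenFOAM LDU matrix coefficients into A here ... */
  ierr = PetscLogEventEnd(EV_COPY, 0, 0, 0, 0); CHKERRQ(ierr);

  ierr = PetscLogEventBegin(EV_SOLVE, 0, 0, 0, 0); CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
  ierr = PetscLogEventEnd(EV_SOLVE, 0, 0, 0, 0); CHKERRQ(ierr);
  return 0;
}
```

Running the case with the -log_view option then prints one row per event, which makes it easy to see whether the time is going into the format conversion or into the solve itself.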
Tags: petsc