
GAMG with cyclic boundary in parallel = crash

April 4, 2016, 09:05   #1
GAMG with cyclic boundary in parallel = crash
Ruben Di Battista (rdbisme), Senior Member
Join Date: May 2013 | Location: Paris | Posts: 137
Hello,

I'm trying to simulate a two-phase flow in a pipe using twoPhaseEulerFoam. I noticed that when I run the case in parallel with GAMG as the solver for p_rgh, the computation crashes at the very first p_rgh solve.

It seems related to this bug report (http://www.openfoam.org/mantisbt/view.php?id=1247), but I'm on OpenFOAM 3.0.1.

Has any of you run into this? Are there additional things to set up in decomposeParDict?

I tried setting preservePatches for the cyclic patches, but no luck so far.
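For reference, this is roughly what I tried in system/decomposeParDict (the patch names and the number of subdomains below are just placeholders, not the exact values from my case):

Code:
numberOfSubdomains  4;

method              scotch;

// ask decomposePar to keep the listed cyclic patches together on one processor
preservePatches     (cyclicUpstream cyclicDownstream);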

The serial run, on the other hand, proceeds fine and gives reasonable results.

Code:
Courant Number mean: 0.278509 max: 0.299811
Max Ur Courant Number = 0.170775
deltaT = 0.000320513
Time = 0.0131983

PIMPLE: iteration 1
MULES: Solving for alpha.SLN2
MULES: Solving for alpha.SLN2
alpha.SLN2 volume fraction = 0.13  Min(alpha.SLN2) = 0.0625242  Max(alpha.SLN2) = 0.236231
Constructing momentum equations
Pressure gradient source: uncorrected Ubar = 1.3, pressure gradient = 217.899
Pressure gradient source: uncorrected Ubar = 1.3, pressure gradient = 1280.75
min T.SLN2 63.1
min T.LN2 63.2
GAMG:  Solving for p_rgh, Initial residual = 0.282455, Final residual = 5.98708e+40, No Iterations 500
Pressure gradient source: uncorrected Ubar = -5.01445e+35, pressure gradient = 2.45967e+41
Pressure gradient source: uncorrected Ubar = -2.08706e+35, pressure gradient = 7.42086e+41
PIMPLE: iteration 2
MULES: Solving for alpha.SLN2
MULES: Solving for alpha.SLN2
smoothSolver:  Solving for alpha.SLN2, Initial residual = 2.96995e-17, Final residual = 6.90969e-18, No Iterations 1
alpha.SLN2 volume fraction = 1.02942e+45  Min(alpha.SLN2) = -7.5363e+59  Max(alpha.SLN2) = 4.85275e+59
Constructing momentum equations
Pressure gradient source: uncorrected Ubar = 2.34146e+22, pressure gradient = -6.61316e+84
Pressure gradient source: uncorrected Ubar = 4.48319e+21, pressure gradient = -6.54619e+84

April 4, 2016, 12:25   #2
Mahdi Hosseinali (anishtain4), Senior Member
Join Date: Apr 2009 | Location: NB, Canada | Posts: 273
I'm having a similar issue. I'm running the channel395 case with the dynamicLagrangian model; it runs well on 4 nodes and crashes on 8 nodes, although it runs for a considerable number of time steps before crashing. It only happened recently, so I didn't know where it was coming from, but now that I've checked, I have similar settings for the pressure solver.

I suspect it could be a result of the domain decomposition, since the behaviour is sensitive to the number of nodes. I used the scotch method. Is this the same method you are using too?
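In case it helps, the relevant part of my decomposeParDict looks roughly like this (the numbers are only illustrative); temporarily switching to hierarchical might tell whether the decomposition method itself is what triggers it:

Code:
numberOfSubdomains  8;

method              scotch;

// alternative to test whether the decomposition method matters:
// method              hierarchical;
// hierarchicalCoeffs
// {
//     n       (2 2 2);    // example split; must multiply up to numberOfSubdomains
//     delta   0.001;
//     order   xyz;
// }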

PS: running on OF3.0.1 too.

April 5, 2016, 06:19   #3
Ruben Di Battista (rdbisme), Senior Member
Join Date: May 2013 | Location: Paris | Posts: 137
Quote:
Originally Posted by anishtain4
I'm having a similar issue. I'm running the channel395 case with the dynamicLagrangian model; it runs well on 4 nodes and crashes on 8 nodes, although it runs for a considerable number of time steps before crashing. It only happened recently, so I didn't know where it was coming from, but now that I've checked, I have similar settings for the pressure solver.

I suspect it could be a result of the domain decomposition, since the behaviour is sensitive to the number of nodes. I used the scotch method. Is this the same method you are using too?

PS: running on OF3.0.1 too.
Since your case crashes only after a while, it looks more like a problem with your settings. Are the results with 4 nodes reasonable? Try lowering the Courant number or the under-relaxation factors.
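Something along these lines (the values are only illustrative starting points, not taken from your case):

Code:
// system/controlDict: let the solver limit the time step via the Courant number
adjustTimeStep  yes;
maxCo           0.2;    // illustrative value, reduce further if it still diverges

// system/fvSolution: under-relax pressure and momentum
relaxationFactors
{
    fields
    {
        p_rgh   0.3;    // illustrative value
    }
    equations
    {
        "U.*"   0.7;    // illustrative value
    }
}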

For the moment I worked around the problem by changing the solver for pressure:

Code:
    p_rgh
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-6;
        relTol          0;
        minIter         1;
    }
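For comparison, a typical GAMG entry for p_rgh (not the exact one from my failing case, just the usual shape of such an entry) would look like this:

Code:
    p_rgh
    {
        solver                  GAMG;
        smoother                GaussSeidel;
        tolerance               1e-6;
        relTol                  0;
        cacheAgglomeration      on;
        agglomerator            faceAreaPair;
        nCellsInCoarsestLevel   10;
        mergeLevels             1;
    }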

April 5, 2016, 11:34   #4
Mahdi Hosseinali (anishtain4), Senior Member
Join Date: Apr 2009 | Location: NB, Canada | Posts: 273
The serial run and the runs on 2 and 4 nodes converge to a reasonable solution, but with 6 and 8 nodes one of the variables diverges in a single step, so I assume it is the domain decomposition that triggers the problem.

Thanks for sharing the solver settings. I'll try them and see how it goes.


Tags
cyclic boundaries, gamg scaling multigrid, twophaseeulerfoam

