Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Simulations are not converging for many elements (more than 1 000 000)


Old   March 27, 2017, 07:55
Simulations are not converging for many elements (more than 1 000 000)
  #1
New Member
 
Join Date: May 2013
Posts: 21
Hi OpenFOAM people!

I am simulating blood flow in a 3D artery with OpenFOAM, and unfortunately my simulations are not converging for meshes with many elements (more than 1 000 000, as you can see in the "boundary" file below). Many elements are necessary for my work. I know the problem is related to the number of elements, because the simulation converges when the mesh has far fewer elements (5 to 10 times fewer).

The patch definitions in the "boundary" file are:

(
wall-interior
{
type wall;
nFaces 59205;
startFace 1248069;
}
inlet_lca
{
type patch;
nFaces 2625;
startFace 1307274;
}
outlet_lda
{
type patch;
nFaces 2130;
startFace 1309899;
}
outlet_lcx
{
type patch;
nFaces 2214;
startFace 1312029;
}
)

In the "controlDict" file:

startFrom latestTime;
startTime 0;
stopAt endTime;
endTime 3.0;
deltaT 0.001;
writeControl timeStep;
writeInterval 10;
purgeWrite 0;
writeFormat ascii;
writePrecision 6;
writeCompression off;
timeFormat general;
timePrecision 6;
runTimeModifiable true;

In the "fvSchemes" file:

ddtSchemes
{
default Euler;
}
gradSchemes
{
default Gauss linear;
grad(p) leastSquares;
}
divSchemes
{
default none;
div(phi,U) Gauss linear;
}
laplacianSchemes
{
default none;
laplacian(nu,U) Gauss linear corrected;
laplacian((1|A(U)),p) Gauss linear corrected;
}
interpolationSchemes
{
default linear;
interpolate(U) linear;
}
snGradSchemes
{
default corrected;
}
fluxRequired
{
default no;
p;
}


In the "fvSolution" file:

solvers
{
p
{
solver PCG;
preconditioner DIC;
tolerance 1e-06;
relTol 0;
}
U
{
solver PBiCG;
preconditioner DILU;
tolerance 1e-05;
relTol 0;
}
}
PISO
{
nCorrectors 4;
nNonOrthogonalCorrectors 2;
}

Does anyone have a suggestion as to which parameters I should modify so that the simulations converge with so many elements (and also run faster)?

I would be very grateful for your answers and suggestions.

Old   March 27, 2017, 11:31
  #2
New Member
 
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
How did you generate the mesh? Run checkMesh with the options -allGeometry and -allTopology.
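For reference, from the case directory that would be (the log file name is just a suggestion):

```
checkMesh -allGeometry -allTopology | tee log.checkMesh
```

Pay particular attention to the reported maximum non-orthogonality, skewness and aspect ratio; failures there often explain why a fine mesh diverges while a coarser one converges.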

Old   March 27, 2017, 13:53
  #3
Senior Member
 
Uwe Pilz
Join Date: Feb 2017
Location: Leipzig, Germany
Posts: 744
> And many elements are necessary for mt work.

Really? I recommend starting in 2D, maybe axisymmetric. Some of the problems may already arise in that case and can be solved without much effort.

You may get convergence problems if you have very small and very large cells in the same geometry. An artery is not a geometrically complicated structure and may be simplified if you look at different aspects of your simulation with different geometric mappings. This can reduce complexity and cell count too.

What do you want to find out with your simulation?

Old   April 3, 2017, 11:19
First problem solved! :) Now... another problem...
  #4
New Member
 
Join Date: May 2013
Posts: 21
Thank you for your answer.

2D simulations were already done in previous work. However, I now need 3D simulations for further study.

I already solved this problem!
The previous simulations converge much faster using the multigrid solver GAMG for the pressure equation, and I also used a bounded scheme for div(phi,U) to avoid divergence. The 3D blood flow simulations, using a simple model for the blood rheology and the conservation equations implemented in OpenFOAM, now run in only one day for more than 1 000 000 elements.
The results are also accurate, since they coincide with those obtained with the commercial code ANSYS.
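For anyone finding this thread later, the two changes look schematically like this (the solver settings shown are illustrative values, not my exact dictionaries; the SFCD scheme is the bounded scheme I used):

```
// fvSolution: multigrid solver for the pressure equation
p
{
    solver          GAMG;
    smoother        GaussSeidel;
    agglomerator    faceAreaPair;
    nCellsInCoarsestLevel 100;
    mergeLevels     1;
    tolerance       1e-06;
    relTol          0;
}

// fvSchemes: bounded (NVD) convection scheme instead of pure linear
divSchemes
{
    default         none;
    div(phi,U)      Gauss SFCD;
}
```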

However... I also implemented a more complex blood rheology, to define the blood's physical properties more accurately. These properties are defined through a constitutive equation on which the conservation equations depend. So, since OpenFOAM is open source, I modified the conservation equations and implemented the constitutive equation (defining the blood rheology more correctly). This code was already validated in 2D on the artery (with far fewer elements!).

So my question is:

I want to simulate the previous case, in 3D with more than 1 000 000 elements, faster. The simulations are too slow, and there is a risk of divergence, since this code is more complex than the first one.

Do you have any suggestion (a method or parameter to modify) so that the simulations run much faster for this case, without the risk of diverging?

*************************************************
"boundary" file:

(
wall-interior
{
type wall;
nFaces 59205;
startFace 1248069;
}
inlet_lca
{
type patch;
nFaces 2625;
startFace 1307274;
}
outlet_lda
{
type patch;
nFaces 2130;
startFace 1309899;
}
outlet_lcx
{
type patch;
nFaces 2214;
startFace 1312029;
}
)

**************************************************
The "controlDict" file that I am using is:

application elasticshearthinFluidFoam;
startFrom latestTime;
startTime 0;
stopAt endTime;
endTime 3.0;
deltaT 0.0005;
writeControl adjustableRunTime;
writeInterval 0.01;
purgeWrite 0;
writeFormat ascii;
writePrecision 6;
writeCompression uncompressed;
timeFormat general;
timePrecision 6;
graphFormat raw;
runTimeModifiable yes;
adjustTimeStep on;
maxCo 0.8;
maxDeltaT 0.001;
*************************************************
The "fvSchemes" file is:

ddtSchemes
{
default Euler;
}
gradSchemes
{
default Gauss linear;
grad(p) Gauss linear;
grad(U) Gauss linear;
}
divSchemes
{
default none;
div(phi,U) Gauss SFCD;
div(phi,detAfirst) Gauss Minmod;
div(phi,detAsecond) Gauss Minmod;
div(phi,detAthird) Gauss Minmod;
div(phi,detAfourth) Gauss Minmod;
div(phi,taufirst) Gauss Minmod;
div(phi,tausecond) Gauss Minmod;
div(phi,tauthird) Gauss Minmod;
div(phi,taufourth) Gauss Minmod;
div(tau) Gauss linear;
}
laplacianSchemes
{
default none;
laplacian(etaPEff,U) Gauss linear corrected;
laplacian(etaPEff+etaS,U) Gauss linear corrected;
laplacian((1|A(U)),p) Gauss linear corrected;
}
interpolationSchemes
{
default linear;
interpolate(HbyA) linear;
}
snGradSchemes
{
default corrected;
}
fluxRequired
{
default no;
p;
}
*************************************************
The "fvSolution" file is:

solvers
{
p
{
solver GAMG;
preconditioner
{
// preconditioner Cholesky;
preconditioner FDIC;
cycle W-cycle;
policy AAMG;
nPreSweeps 0;
nPostSweeps 2;
groupSize 4;
minCoarseEqns 20;
nMaxLevels 100;
scale off;
smoother ILU;
}
tolerance 1e-05;
relTol 0;
minIter 0;
maxIter 800;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 100;
mergeLevels 1;
smoother GaussSeidel;
}

U
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

detAfirst
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

detAsecond
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

detAthird
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

detAfourth
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

taufirst
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

tausecond
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

tauthird
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
}

taufourth
{
solver BICCG;
preconditioner
{
preconditioner DILU;
}
tolerance 1e-6;
relTol 0;
minIter 0;
maxIter 1000;
agglomerator faceAreaPair;
nCellsInCoarsestLevel 4;
mergeLevels 1;
smoother DILUGaussSeidel;
};

}
PISO
{
nCorrectors 1;
nNonOrthogonalCorrectors 2;
pRefCell 200;
pRefValue 10000;
// nCorrectors 2;
// nNonOrthogonalCorrectors 0;
// pRefCell 0;
// pRefValue 0;
}

relaxationFactors
{
p 0.3;
U 0.5;
detAfirst 0.3;
detAsecond 0.3;
detAthird 0.3;
detAfourth 0.3;
taufirst 0.3;
tausecond 0.3;
tauthird 0.3;
taufourth 0.3;
}
*************************************************


I will be grateful! Thank you very much!

Old   April 20, 2017, 06:44
  #5
New Member
 
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Sorry for the late answer. I am glad that you could resolve your issues.
I do not know what you can do to speed up the simulation just by the setup. An obvious answer is to run it in parallel.
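For example, a parallel run could look like this (a sketch only; the subdomain count is illustrative, and the decomposition method depends on what your installation provides):

```
// system/decomposeParDict
numberOfSubdomains 4;
method          scotch;   // or simple/hierarchical/metis, per installation
```

Then decompose, run the solver in parallel, and reconstruct the results:

```
decomposePar
mpirun -np 4 elasticshearthinFluidFoam -parallel
reconstructPar
```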

Old   April 20, 2017, 08:16
  #6
New Member
 
Join Date: May 2013
Posts: 21
Thank you very much for your answer.

I already solved the problem.


Tags
3d simulations, many elements, not convergence, openfoam

