Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Extremely slow simulation with interDyMFoam

Old   April 18, 2013, 06:16
Default Extremely slow simulation with interDyMFoam
  #1
Member
 
Anon
Join Date: Oct 2012
Posts: 33
Hi,

I am running a simulation with interDyMFoam (air/water, AMI with a rotating and a stationary mesh region) which runs extremely slowly, and I am looking for ways to improve the performance. The main reason seems to be the very small time steps (on the order of 1e-8 s) needed to keep the Courant number below 1, but I think the cost of each individual time step could also be reduced.

I am looking for comments on the residual targets/tolerances set in fvSolution, and maybe on my fvSchemes file as well.

I recently increased the tolerance of the pressure solvers from 1e-8 to 1e-6 to speed up the simulation; could this be relaxed further?

I am running decomposed on 128 cores, by the way.

Any tips on how to improve the performance will be greatly appreciated.
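For reference, the Courant-limited adjustable time step is controlled from system/controlDict; the relevant block looks like this (the limit values here are illustrative, not necessarily the exact case settings):

```
// system/controlDict (excerpt) -- illustrative limits, not the exact case setup
adjustTimeStep  yes;
maxCo           0.6;   // flow Courant number limit
maxAlphaCo      0.5;   // interface Courant number limit
maxDeltaT       1;     // absolute cap on the time step
```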

Regards,

Jone

Some output:
Code:
Interface Courant Number mean: 6.455324963e-06 max: 0.5005681575
Courant Number mean: 0.0002116964997 max: 0.5882487797
deltaT = 5.569042488e-08
Time = 0.006919137503

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 0.006919137503 transformation: ((0 0 0) (0.9738608726 (0 0.2271453297 0)))
AMI: Creating addressing and weights between 4453 source faces and 4453 target faces
AMI: Patch source weights min/max/average = 0.9985002144, 1.001921397, 1.000047858
AMI: Patch target weights min/max/average = 0.9990857617, 1.001532226, 1.000018088
Execution time for mesh.update() = 0.07 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.00295762538  Min(alpha1) = -1.735479458e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957636622  Min(alpha1) = -1.731270876e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957647861  Min(alpha1) = -1.732182027e-18  Max(alpha1) = 1
GAMG:  Solving for p_rgh, Initial residual = 0.02411778836, Final residual = 8.942364364e-07, No Iterations 40
time step continuity errors : sum local = 2.893569178e-13, global = 4.456094558e-14, cumulative = 9.7491866e-11
GAMG:  Solving for p_rgh, Initial residual = 0.01070900624, Final residual = 8.263136668e-07, No Iterations 17
time step continuity errors : sum local = 2.632099118e-13, global = -7.978696644e-15, cumulative = 9.74838873e-11
ExecutionTime = 61236.58 s  ClockTime = 61388 s

Interface Courant Number mean: 6.448986145e-06 max: 0.5055292675
Courant Number mean: 0.0002114044481 max: 0.5878898938
deltaT = 5.504594754e-08
Time = 0.006919192549

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 0.006919192549 transformation: ((0 0 0) (0.9738604585 (0 0.2271471051 0)))
AMI: Creating addressing and weights between 4453 source faces and 4453 target faces
AMI: Patch source weights min/max/average = 0.9985011005, 1.001921476, 1.00004786
AMI: Patch target weights min/max/average = 0.9990854226, 1.001532248, 1.000018089
Execution time for mesh.update() = 0.07 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957658974  Min(alpha1) = -1.733118704e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957670084  Min(alpha1) = -1.733538446e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957681198  Min(alpha1) = -1.734437289e-18  Max(alpha1) = 1
GAMG:  Solving for p_rgh, Initial residual = 0.02904518995, Final residual = 8.139273874e-07, No Iterations 38
time step continuity errors : sum local = 2.577957076e-13, global = -5.558341087e-14, cumulative = 9.742830389e-11
GAMG:  Solving for p_rgh, Initial residual = 0.01185015756, Final residual = 8.340824853e-07, No Iterations 24
time step continuity errors : sum local = 2.717511422e-13, global = 4.323848088e-14, cumulative = 9.747154237e-11
ExecutionTime = 61237.22 s  ClockTime = 61389 s
My fvSolution:
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSolution;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

solvers
{
    pcorr
    {
        solver          GAMG;
        tolerance       1e-06; //1e-08
        relTol          0;
        smoother        DIC;
        nPreSweeps      0;
        nPostSweeps     2;
        nFinestSweeps   2;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    p_rgh
    {
        solver          GAMG;
        tolerance       1e-06; //1e-08
        relTol          0;
        smoother        DIC;
        nPreSweeps      0;
        nPostSweeps     2;
        nFinestSweeps   2;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    p_rghFinal
    {
        solver          GAMG;
        tolerance       1e-06; //1e-08
        relTol          0;
        smoother        DIC;
        nPreSweeps      0;
        nPostSweeps     2;
        nFinestSweeps   2;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    U
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        tolerance       1e-06;
        relTol          0;
        nSweeps         1;
    }

    UFinal
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        tolerance       1e-06;
        relTol          0;
        nSweeps         1;
    }

}

PIMPLE
{
    momentumPredictor no; //yes
    nCorrectors     2;
    nNonOrthogonalCorrectors 0;
    nAlphaCorr      1;
    nAlphaSubCycles 3;
    cAlpha          1.5;
    correctPhi      no;

   /* pRefPoint       (0.0013 0.0017 0.0017);
    pRefValue       1e5; */
}

relaxationFactors
{
    fields
    {
    }
    equations
    {
        "U.*"           1;
    }
}


// ************************************************************************* //
fvSchemes:
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSchemes;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

ddtSchemes
{
    default         Euler;
}

gradSchemes
{
    default         Gauss linear;
}

divSchemes
{
    div(rho*phi,U)  Gauss limitedLinearV 1;
    div(phi,alpha)  Gauss vanLeer01;
    div(phirb,alpha) Gauss interfaceCompression;
    //Following added because of crash on Vilje
    div((muEff*dev(T(grad(U))))) Gauss linear;
    div((nuEff*dev(T(grad(U))))) Gauss linear;
}

laplacianSchemes
{
    default         Gauss linear limited 1.0;
}

interpolationSchemes
{
    default         linear;
}

snGradSchemes
{
    default         limited 1.0;
}

fluxRequired
{
    default         no;
    p_rgh;
    pcorr;
    alpha;
}


// ************************************************************************* //

Old   April 22, 2013, 12:56
Default
  #2
Senior Member
 
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Increasing the tolerance is definitely not what I would recommend: you are trading accuracy for a small speedup. Definitely a no-no.

Any chance you can use a coarser mesh? The Courant number is set by the cell size and the flow velocity, so increasing the cell size lowers the Courant number, allowing a larger time step. Also, check your mesh; maybe it is just a cell or two that are very small and are slowing down the whole simulation.

I don't know if this strategy is doable for your case, though.

Another thing you might want to try is to initialize the case with the mesh held steady, maybe even using LTSInterFoam, even without turbulent quantities, just so you do not start your simulation with large local velocities caused by the mesh motion (I'm just guessing: something like a propeller rotating at 300 rpm with an inflow velocity of 10 m/s gives a big relative speed between flow and mesh motion, because the flow hasn't picked it up yet...)
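The underlying relation is Co = U*dt/dx, so the admissible time step scales linearly with the smallest cell size. A quick back-of-the-envelope check (plain Python; the numbers are illustrative, loosely based on the ~40 m/s inlet speed mentioned later in this thread):

```python
def max_deltaT(U, dx, max_Co=1.0):
    """Largest time step keeping the cell Courant number Co = U*dt/dx below max_Co."""
    return max_Co * dx / U

# A single 1-micron cell seeing a 40 m/s flow already forces dt <= 2.5e-8 s,
# which is the order of magnitude of the deltaT in the log above.
print(max_deltaT(U=40.0, dx=1e-6))  # 2.5e-08
```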
__________________
http://www.leadingedge.it/
Naval architecture and CFD consultancy

Old   April 22, 2013, 13:45
Default
  #3
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,903
Hi Jone,

From my experience with moving meshes and VOF methods, you really have to consider the following:

Code:
momentumPredictor yes;
nCorrectors 3; // At least. I sometimes use as many as 5
nNonOrthogonalCorrectors <larger>?;
With respect to the non-orthogonal correctors: you are using a rotating mesh, which is bound to produce at least occasionally high non-orthogonality.

In addition, I have read some threads about poor performance of the AMI on a large number of processors. You might reconsider the need for 128 processors, now that one time step only takes 0.64 s.
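In terms of the fvSolution posted above, that suggestion amounts to something like the following (the non-orthogonal corrector count is illustrative and case-dependent):

```
PIMPLE
{
    momentumPredictor yes;       // was: no
    nCorrectors              3;  // at least 3; up to 5 for difficult cases
    nNonOrthogonalCorrectors 2;  // illustrative: increase with mesh non-orthogonality
    nAlphaCorr               1;
    nAlphaSubCycles          3;
    cAlpha                   1.5;
    correctPhi               no;
}
```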

Kind regards

Niels

Old   April 22, 2013, 15:21
Default
  #4
Senior Member
 
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Quote:
Originally Posted by ngj View Post
(quoted post #3 in full; see above)
Silly me, I hadn't noticed that. How big is your mesh?

Old   April 22, 2013, 17:55
Default
  #5
Member
 
Anon
Join Date: Oct 2012
Posts: 33
Vieri and Niels, thank you very much for your answers! I will look into them tomorrow morning.

My mesh is around 1 million cells, so not that big. In CFX a 4-million-cell case takes less than 1/10 of the time, so I really don't get it.

I agree that the mesh might be the reason; I actually think there are a couple of very small cells limiting the time step. Maybe I can find a way to limit the minimum cell size in Ansys Meshing to compare the speed.

Initialization is a very good idea that I will look into.

Have a good evening!


Regards,

Jone

Old   April 22, 2013, 18:02
Default
  #6
Senior Member
 
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
128 cores for a 1-million-cell mesh is really too much. I think you are getting too much MPI communication overhead.

Try a more reasonable number, 20-40 cores tops.

Old   April 23, 2013, 05:50
Default
  #7
Member
 
Anon
Join Date: Oct 2012
Posts: 33
First of all: I also tried a coarser mesh. It looked OK in checkMesh, but it still crashed with the following error:

Code:
#0  Foam::error::printStack(Foam::Ostream&) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#1  Foam::sigFpe::sigHandler(int) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#2   in "/lib/x86_64-linux-gnu/libc.so.6"
#3  void Foam::MULES::limiter<Foam::geometricOneField, Foam::zeroField, Foam::zeroField>(Foam::Field<double>&, Foam::geometricOneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::zeroField const&, Foam::zeroField const&, double, double, int) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#4  void Foam::MULES::limit<Foam::geometricOneField, Foam::zeroField, Foam::zeroField>(Foam::geometricOneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, Foam::zeroField const&, Foam::zeroField const&, double, double, int, bool) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#5  Foam::MULES::explicitSolve(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, double, double) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#6  
 in "/opt/openfoam211/platforms/linux64GccDPOpt/bin/interDyMFoam"
#7  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
#8  
 in "/opt/openfoam211/platforms/linux64GccDPOpt/bin/interDyMFoam"
The last time steps before the crash:
Code:
Interface Courant Number mean: 1.597370001e-13 max: 3.367889415e-08
Courant Number mean: 8.492343262e-05 max: 47.93150517
deltaT = 1.542054914e-14
Time = 5.124667456e-05

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 5.124667456e-05 transformation: ((0 0 0) (0.9999985598 (0 0.001697167211 0)))
AMI: Creating addressing and weights between 1860 source faces and 1860 target faces
AMI: Patch source weights min/max/average = 0.9999999855, 1.000273407, 1.00002072
AMI: Patch target weights min/max/average = 0.9999986967, 1.000284245, 1.000020412
Execution time for mesh.update() = 0.25 s
--> FOAM Warning : 
    From function Time::operator++()
    in file db/Time/Time.C at line 1010
    Increased the timePrecision from 10 to 11 to distinguish between timeNames at time 5.124667455e-05
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101492e-05  Min(alpha1) = -2.94026316e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101492e-05  Min(alpha1) = -2.940263151e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = -2.940263144e-21  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.03289536739, Final residual = 5.931840904e-09, No Iterations 22
smoothSolver:  Solving for Uy, Initial residual = 0.02652586093, Final residual = 6.490399401e-09, No Iterations 21
smoothSolver:  Solving for Uz, Initial residual = 0.06385161287, Final residual = 9.329710362e-09, No Iterations 22
GAMG:  Solving for p_rgh, Initial residual = 0.8168173646, Final residual = 5.329552555e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5691559964, Final residual = 4.333370438e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6077577617, Final residual = 5.096296929e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5588035124, Final residual = 4.970930311e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6553288826, Final residual = 6.082410996e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6012161847, Final residual = 5.743818456e-09, No Iterations 20
time step continuity errors : sum local = 2.767907786e-13, global = 6.720125451e-14, cumulative = -4.563931424e-06
GAMG:  Solving for p_rgh, Initial residual = 0.7547558975, Final residual = 8.26595506e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4767521403, Final residual = 4.33189641e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.500681991, Final residual = 5.307664532e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4631253002, Final residual = 4.796368045e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5959068421, Final residual = 6.221265876e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5300655458, Final residual = 5.512629022e-09, No Iterations 20
time step continuity errors : sum local = 6.317765168e-13, global = 1.567796998e-13, cumulative = -4.563931267e-06
GAMG:  Solving for p_rgh, Initial residual = 0.6942039022, Final residual = 7.554443004e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5165126574, Final residual = 5.254193207e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6475607497, Final residual = 6.787541225e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5574720976, Final residual = 5.810655172e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.7423454876, Final residual = 7.734423209e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6165483516, Final residual = 6.41331222e-09, No Iterations 20
time step continuity errors : sum local = 1.508767972e-12, global = 3.749318923e-13, cumulative = -4.563930892e-06
GAMG:  Solving for p_rgh, Initial residual = 0.8276795512, Final residual = 8.954560476e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.558609031, Final residual = 5.530478839e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6826850609, Final residual = 7.158367366e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5769959496, Final residual = 5.989697425e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.7749635251, Final residual = 8.06470252e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6336308451, Final residual = 6.581617802e-09, No Iterations 20
time step continuity errors : sum local = 3.559817726e-12, global = 8.850757951e-13, cumulative = -4.563930007e-06
GAMG:  Solving for p_rgh, Initial residual = 0.8547424795, Final residual = 9.199343842e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5841017513, Final residual = 5.844350229e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.7306124828, Final residual = 7.63158262e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6069032171, Final residual = 6.294821239e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.8194731067, Final residual = 8.514487236e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6594746017, Final residual = 6.843822577e-09, No Iterations 20
time step continuity errors : sum local = 8.411599725e-12, global = 2.092559032e-12, cumulative = -4.563927914e-06
ExecutionTime = 501.96 s  ClockTime = 502 s

Interface Courant Number mean: 1.770494893e-13 max: 4.980492024e-08
Courant Number mean: 0.0002330663221 max: 139.7058538
deltaT = 1.103786901e-15
Time = 5.1246674565e-05

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 5.124667456e-05 transformation: ((0 0 0) (0.9999985598 (0 0.001697167211 0)))
AMI: Creating addressing and weights between 1860 source faces and 1860 target faces
AMI: Patch source weights min/max/average = 0.9999999855, 1.000273407, 1.00002072
AMI: Patch target weights min/max/average = 0.9999986967, 1.000284245, 1.000020412
Execution time for mesh.update() = 0.23 s
--> FOAM Warning : 
    From function Time::operator++()
    in file db/Time/Time.C at line 1010
    Increased the timePrecision from 11 to 12 to distinguish between timeNames at time 5.124667456e-05
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = -2.940263142e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = -2.94026314e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = -2.940263136e-21  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.06477118372, Final residual = 8.642332518e-09, No Iterations 26
smoothSolver:  Solving for Uy, Initial residual = 0.02933841898, Final residual = 6.713449257e-09, No Iterations 25
smoothSolver:  Solving for Uz, Initial residual = 0.1070822693, Final residual = 8.780816192e-09, No Iterations 27
GAMG:  Solving for p_rgh, Initial residual = 0.8893907442, Final residual = 6.03093516e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.2999964081, Final residual = 4.400563944e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2001912473, Final residual = 4.323988997e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.1991798232, Final residual = 4.856585103e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2166203943, Final residual = 5.449721145e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2325097647, Final residual = 5.987364552e-09, No Iterations 19
time step continuity errors : sum local = 2.31991253e-13, global = 4.653817902e-14, cumulative = -4.563927868e-06
GAMG:  Solving for p_rgh, Initial residual = 0.2530189377, Final residual = 6.73790163e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2376544809, Final residual = 6.337231792e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2508918284, Final residual = 7.096666627e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2671706917, Final residual = 7.384764598e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.3142132031, Final residual = 8.694850032e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.3269917566, Final residual = 8.999909757e-09, No Iterations 19
time step continuity errors : sum local = 4.947432806e-13, global = 1.061610488e-13, cumulative = -4.563927762e-06
GAMG:  Solving for p_rgh, Initial residual = 0.3896779346, Final residual = 4.01177967e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.3602979394, Final residual = 9.679480202e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.4086001377, Final residual = 4.23695469e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4096242098, Final residual = 4.220163954e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4891555194, Final residual = 5.018675493e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4798885981, Final residual = 4.903916905e-09, No Iterations 20
time step continuity errors : sum local = 4.304573434e-13, global = 9.361216275e-14, cumulative = -4.563927668e-06
GAMG:  Solving for p_rgh, Initial residual = 0.5731626992, Final residual = 5.785088066e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4692696901, Final residual = 4.65822784e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5222446816, Final residual = 5.323751511e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5051875075, Final residual = 5.107623445e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6048712058, Final residual = 6.10837163e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5734690483, Final residual = 5.770318327e-09, No Iterations 20
time step continuity errors : sum local = 9.866099684e-13, global = 2.125564627e-13, cumulative = -4.563927455e-06
GAMG:  Solving for p_rgh, Initial residual = 0.6854054465, Final residual = 6.829505031e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.537832268, Final residual = 5.286631514e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5994082542, Final residual = 6.028535219e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5680692402, Final residual = 5.673560985e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6801833379, Final residual = 6.789791194e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6322346784, Final residual = 6.292624607e-09, No Iterations 20
time step continuity errors : sum local = 2.267217457e-12, global = 4.844195478e-13, cumulative = -4.563926971e-06
ExecutionTime = 540.1 s  ClockTime = 540 s

Interface Courant Number mean: 1.178773889e-13 max: 3.493308291e-08
Courant Number mean: 8.800536524e-05 max: 51.12002034
deltaT = 2.159206695e-16
Time = 5.12466745648e-05

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 5.124667456e-05 transformation: ((0 0 0) (0.9999985598 (0 0.001697167211 0)))
AMI: Creating addressing and weights between 1860 source faces and 1860 target faces
AMI: Patch source weights min/max/average = 0.9999999855, 1.000273407, 1.00002072
AMI: Patch target weights min/max/average = 0.9999986967, 1.000284245, 1.000020412
Execution time for mesh.update() = 0.25 s
I played around with the parameters in fvSolution for this one, with no success.

About the number of cores you are right, it is too many. But anyway, it has been running much faster with 128 than with 8 cores. I have heard that there should be around 50k cells per core, so 20 could be a good number of cores.
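That rule of thumb in numbers (50k cells per core is forum folklore, not a hard rule):

```python
def suggested_cores(n_cells, cells_per_core=50_000):
    """Rough domain-decomposition rule of thumb: ~50k cells per core."""
    return max(1, round(n_cells / cells_per_core))

print(suggested_cores(1_000_000))  # 20
```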

I am trying to initialize with potentialFoam by adding these lines to fvSolution:
Code:
potentialFlow
{
    nNonOrthogonalCorrectors 10;
}
How can I use LTSInterFoam to initialize? The quantities are quite large, as you say: an inlet speed of almost 40 m/s and fast rotation.

Old   April 23, 2013, 08:55
Default
  #8
Senior Member
 
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Your simulation is beyond recovery well before that point: with the automatic time stepping you are already at deltaT = 1e-14 s and a maximum Courant number of around 48.

I don't know what you changed, or whether it is the mesh, but something went horribly wrong earlier.

I'd suggest disabling the automatic time step, at least for the tests. That way, if there is an issue, the simulation crashes instantly instead of dragging on and wasting time. Consider it euthanasia.

To initialize the solution, switch the time scheme (in fvSchemes) to the local-time-stepping one and just run LTSInterFoam instead of interDyMFoam. It will tell you if something is amiss. If you want to be really cool, add an MRF zone at your rotating region.
Then you can bring the developed field into your simulation.
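A minimal sketch of that scheme change, following the LTSInterFoam tutorials shipped with OpenFOAM 2.1 (the exact syntax for other versions may differ):

```
// system/fvSchemes (excerpt) -- local (pseudo-transient) time stepping for LTSInterFoam
ddtSchemes
{
    default         localEuler rDeltaT;
}
```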

Old   April 23, 2013, 09:01
Default
  #9
Member
 
Anon
Join Date: Oct 2012
Posts: 33
Quote:
Originally Posted by sail View Post
(quoted post #8 in full; see above)
Thank you Vieri!

So should I run the full simulation with LTSInterFoam, or just the first time step(s)? How do I apply the result so that it initializes interDyMFoam?

As for this particular crashed simulation, it must be something wrong with the mesh. I made a coarser mesh just for testing, but I guess it resolves the flow very poorly.

Old   April 23, 2013, 11:14
Default
  #10
Senior Member
 
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
It is very problem dependent, but I'd say run not necessarily until convergence, but not far from it either. 1000-3000 iterations, maybe?

Use mapFields to move the data, or just edit the BCs manually if needed.
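The mapFields step, run from inside the target interDyMFoam case directory, would be along these lines (the source case path is hypothetical):

```
# Map the latest LTS result onto the interDyMFoam case.
# Drop -consistent if the meshes or boundary conditions differ between the cases.
mapFields ../ltsInitCase -sourceTime latestTime -consistent
```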