July 25, 2019, 04:40 |
interFoam parallel running exit code 145
#1 |
New Member
kindi jan
Join Date: Jul 2019
Posts: 2
Rep Power: 0
Hello,
I have an issue running a simulation with interFoam solver (openfoam version 5.0.0). My simulation is RANS k-w SST with VOF model of a free surface flow in channel managing a structure in it. and i am figuring out the velocity field around the structure and the head water. After parallelizing my case, i run the solver, it takes about 1 or 2 days running and it gives me this error (i just copied the last part of the log file): PIMPLE: iteration 1 smoothSolver: Solving for alpha.water, Initial residual = 4.097e-09, Final residual = 4.097e-09, No Iterations 0 Phase-1 volume fraction = 0.7029 Min(alpha.water) = -0.0002897 Max(alpha.water) = 1.007 MULES: Correcting alpha.water Phase-1 volume fraction = 0.7029 Min(alpha.water) = -0.0002897 Max(alpha.water) = 1.007 GAMG: Solving for p_rgh, Initial residual = 0.2751, Final residual = 0.002164, No Iterations 13 time step continuity errors : sum local = 1.977e-10, global = 2.921e-12, cumulative = 4.478e-07 GAMG: Solving for p_rgh, Initial residual = 0.08213, Final residual = 7.147e-06, No Iterations 50 time step continuity errors : sum local = 1.634e-12, global = -3.544e-14, cumulative = 4.478e-07 smoothSolver: Solving for omega, Initial residual = 9.928e-07, Final residual = 9.928e-07, No Iterations 0 smoothSolver: Solving for k, Initial residual = 1.566e-06, Final residual = 3.268e-08, No Iterations 1 bounding k, min: 0 max: 1.439e+77 average: 4.407e+73 ExecutionTime = 7889 s ClockTime = 8363 s surfaceFieldValue inletFlux write: sum(inlet) of rhoPhi = -499.5 surfaceFieldValue outletFlux write: sum(outlet) of rhoPhi = 0 surfaceFieldValue atmosphereFlux write: sum(atmosphere) of rhoPhi = -7.511e+93 Courant Number mean: 1.834e-07 max: 8.948 Interface Courant Number mean: 1.22e-08 max: 2.914 deltaT = 1.279e-107 Time = 0.001915785357951 PIMPLE: iteration 1 smoothSolver: Solving for alpha.water, Initial residual = 4.094e-09, Final residual = 4.094e-09, No Iterations 0 Phase-1 volume fraction = 0.7029 Min(alpha.water) = -0.0002897 Max(alpha.water) = 1.007 MULES: Correcting alpha.water Phase-1 volume fraction = 0.7029 Min(alpha.water) = -0.0002897 Max(alpha.water) = 1.007 ------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code.. Per user-direction, the job has been aborted. ------------------------------------------------------- -------------------------------------------------------------------------- mpirun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[12165,1],3] Exit code: 145 -------------------------------------------------------------------------- i tried to change the wall's conditions to wallfunctions and not y+=1 conditions but it does the same thing. Can you help me figure out the problem? i didn't find the answer elsewhere. Thank you. |
July 25, 2019, 05:02 |
#2 |
Member
Rodrigo
Join Date: Mar 2010
Posts: 98
Rep Power: 16
Something went wrong with the calculation already before the error appeared. I suggest you investigate further the reason for such large values of k:
Code:
bounding k, min: 0 max: 1.439e+77 average: 4.407e+73
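When k blows up to values like this, two usual suspects are the initial and boundary values of k and omega and the divergence schemes used for the turbulence equations. As a sketch only (these are generic interFoam-style settings, not taken from this case), a divSchemes block in system/fvSchemes with upwinded turbulence terms often keeps k and omega from running away:
Code:
divSchemes
{
    div(rhoPhi,U)       Gauss linearUpwind grad(U);
    div(phi,alpha)      Gauss vanLeer;
    div(phirb,alpha)    Gauss linear;
    div(phi,k)          Gauss upwind;      // bounded, first order
    div(phi,omega)      Gauss upwind;      // bounded, first order
    div(((rho*nuEff)*dev2(T(grad(U))))) Gauss linear;
}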
July 25, 2019, 05:04 |
#3 |
Member
Rodrigo
Join Date: Mar 2010
Posts: 98
Rep Power: 16
The same goes for the fluxes. I guess the solution became unstable all at once.
Code:
surfaceFieldValue atmosphereFlux write: sum(atmosphere) of rhoPhi = -7.511e+93
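A flux of that magnitude through the atmosphere patch usually means the instability has reached that boundary, so the boundary conditions there are worth checking. For interFoam, the atmosphere patch is commonly set up as in the damBreak-style sketch below; the patch name is taken from the log, everything else is a generic example rather than this case's settings:
Code:
// 0/p_rgh
atmosphere
{
    type            totalPressure;
    p0              uniform 0;
    value           uniform 0;
}

// 0/U
atmosphere
{
    type            pressureInletOutletVelocity;
    value           uniform (0 0 0);
}

// 0/alpha.water
atmosphere
{
    type            inletOutlet;
    inletValue      uniform 0;
    value           uniform 0;
}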
July 25, 2019, 05:05 |
#4 |
Member
Rodrigo
Join Date: Mar 2010
Posts: 98
Rep Power: 16
...indeed, this is the proof of it:
Code:
deltaT = 1.279e-107
July 26, 2019, 04:44 |
#5 |
New Member
kindi jan
Join Date: Jul 2019
Posts: 2
Rep Power: 0
Thank you for your answer.
k is calculated with the equation k = 1.5*(U*I)² for the internal field and is set to zero at the wall boundaries for the y+ = 1 condition. I also tried a k wall function and it gives me the same error.
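As an illustration only: with that formula and, say, a mean velocity U = 1 m/s and a turbulence intensity I = 5 % (example numbers, not taken from this case), k = 1.5*(1*0.05)² ≈ 3.75e-3 m²/s². A 0/k file could then look roughly like the sketch below (the patch name "walls" is assumed); for a y+ ≈ 1 mesh a very small fixed value is often used at the walls rather than exactly zero:
Code:
dimensions      [0 2 -2 0 0 0 0];

internalField   uniform 3.75e-3;

boundaryField
{
    walls
    {
        type            fixedValue;     // low-Re wall treatment, y+ ~ 1
        value           uniform 1e-10;  // small value rather than exactly 0
    }

    // alternative when using wall functions:
    // walls
    // {
    //     type            kqRWallFunction;
    //     value           uniform 3.75e-3;
    // }
}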
July 27, 2019, 08:48 |
#6 | |
Member
Rodrigo
Join Date: Mar 2010
Posts: 98
Rep Power: 16
Quote:
PS: Regardless of that, this information may not be sufficient to diagnose your problem. I suggest you upload your case, so that other users can help you fix the settings more efficiently.
February 25, 2023, 08:58 |
|
#7 |
New Member
Carlos
Join Date: Feb 2023
Location: Madrid
Posts: 1
Rep Power: 0
Hi! I have a similar error, but I see a few differences.
I'm trying to run my own case, but to be honest I'm actually learning. Here the error: PIMPLE: iteration 1 forces forces: rho: rho Not including porosity effects Restraint translationDamper: force (0 0 8643.63) Restraint rotationDamper: moment (-0 -15031.6 -0) 6-DoF rigid body motion Centre of rotation: (3.5854 0 -0.10963) Centre of mass: (3.5854 0 -0.10963) Orientation: (0.981731 0 0.190275 0 1 0 -0.190275 0 0.981731) Linear velocity: (-0 -0 -1.00618) Angular velocity: (0 1.3262 0) GAMG: Solving for pcorr, Initial residual = 1, Final residual = 0.0791026, No Iterations 3 time step continuity errors : sum local = 2.00954e-07, global = 2.21845e-08, cumulative = 2.86727e-07 smoothSolver: Solving for alpha.water, Initial residual = 0.000108711, Final residual = 5.08216e-11, No Iterations 9 Phase-1 volume fraction = 0.767756 Min(alpha.water) = -6.81899e-05 Max(alpha.water) = 1.00424 Applying the previous iteration compression flux MULES: Correcting alpha.water MULES: Correcting alpha.water MULES: Correcting alpha.water MULES: Correcting alpha.water Phase-1 volume fraction = 0.767756 Min(alpha.water) = -0.00788331 Max(alpha.water) = 1.01183 -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- mpirun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[45478,1],3] Exit code: 145 Also, I have to mention that previously I had another problem with my case. I tried to fix it by reducing the refinement boxes in the diferent topoSetDict.# and then it let me do: mpirun -np 4 interFoam -parallel after this comand, it begin to run and then the exit code 145 ocurred I will apreciate info about this problem or just the right place guide or documentation to solve this, becasue as I said previously I'm to starting using OpenFoam and I don't know where to search. Thank you so much for your help. |
Tags |
exit code, interfoam, parallel error |
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
OF 2.0.1 parallel running problems | moser_r | OpenFOAM Running, Solving & CFD | 9 | July 27, 2022 04:15 |
[snappyHexMesh] Error while running SnappyHex in parallel | mg.mithun | OpenFOAM Meshing & Mesh Conversion | 1 | February 10, 2016 14:13 |
Parallel running of 3D multiphase turbulence model (unknown problem!!) | MOHAMMAD67 | OpenFOAM Running, Solving & CFD | 7 | November 23, 2015 11:53 |
Issues running custom code in parallel | BigBlueDart | OpenFOAM Programming & Development | 4 | October 23, 2013 07:17 |
Something weird encountered when running OpenFOAM in parallel on multiple nodes | xpqiu | OpenFOAM Running, Solving & CFD | 2 | May 2, 2013 05:59 |