May 27, 2022, 03:24
problem with parallel simulation
#1
New Member
Anis Hanani
Join Date: May 2022
Posts: 6
Rep Power: 4
Hello everyone!
I have a problem running a parallel simulation. In the first steps of my case, I run the following utilities:

surfaceFeatureExtract
blockMesh
snappyHexMesh
setFields
decomposePar

All of these went well. But when I type:

foamJob -parallel interFoam

the simulation encounters a problem and gives this message:

Time = 2.5
PIMPLE: iteration 1
smoothSolver: Solving for alpha.water, Initial residual = 0.209678, Final residual = 9.91126e-09, No Iterations 840
Phase-1 volume fraction = -1.55948e+08 Min(alpha.water) = -2.52115e+15 Max(alpha.water) = 1.68999e+15
MULES: Correcting alpha.water
Phase-1 volume fraction = -1.55925e+08 Min(alpha.water) = -1.26256e+26 Max(alpha.water) = 2.36982e+26
GAMG: Solving for p_rgh, Initial residual = 0.000183673, Final residual = 0.00784301, No Iterations 50
time step continuity errors : sum local = 2.2925e+12, global = 3.35574e+08, cumulative = 3.35506e+08
GAMG: Solving for p_rgh, Initial residual = 1.17896e-06, Final residual = 4.15992e-09, No Iterations 3
time step continuity errors : sum local = 6.19977e+16, global = -7.42357e+15, cumulative = -7.42357e+15
-------------------------------------------------------
Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was:
Process name: [[33237,1],3]
Exit code: 144

The simulation was supposed to run for 60 seconds, but it stopped at 2.5 seconds. Does anyone know what I'm doing wrong? For reference, the exact sequence I run is written out as a small script below.
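This is the same sequence written as a small run script. It is only a sketch: the redirected log-file names are my own convention, not part of the case.

#!/bin/sh
# Build the mesh, initialise alpha.water, decompose, then run in parallel.
surfaceFeatureExtract > log.surfaceFeatureExtract 2>&1
blockMesh             > log.blockMesh 2>&1
snappyHexMesh         > log.snappyHexMesh 2>&1
setFields             > log.setFields 2>&1
decomposePar          > log.decomposePar 2>&1
# foamJob picks up the number of subdomains from system/decomposeParDict
# and redirects the solver output to a file named "log".
foamJob -parallel interFoam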
May 29, 2022, 09:34
#2
Member
Lorenzo
Join Date: Apr 2020
Location: Italy
Posts: 46
Rep Power: 6
Hi Anis,
Look at the values of alpha.water before the first MULES correction: they are clearly not correct, and the same is true of p_rgh.
I suggest you check the boundary conditions inside the 0 folder, then run the case in serial and see what happens in the very first iterations; a generic example of the alpha.water file is sketched below.

Regards,
Lorenzo
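For example, a 0/alpha.water file for an interFoam case typically looks like this. The patch names (inlet, outlet, atmosphere, walls) are only placeholders, yours will differ; the point is that every patch needs a sensible condition and the field should stay between 0 and 1:

FoamFile
{
    version     2.0;
    format      ascii;
    class       volScalarField;
    object      alpha.water;
}

dimensions      [0 0 0 0 0 0 0];

internalField   uniform 0;      // the water region usually comes from setFields

boundaryField
{
    inlet                       // placeholder patch name
    {
        type            fixedValue;
        value           uniform 1;
    }
    outlet                      // placeholder patch name
    {
        type            zeroGradient;
    }
    atmosphere                  // placeholder patch name
    {
        type            inletOutlet;
        inletValue      uniform 0;
        value           uniform 0;
    }
    walls                       // placeholder patch name
    {
        type            zeroGradient;
    }
}

For the serial test, just launch interFoam in the undecomposed case directory and watch Min(alpha.water) and Max(alpha.water) over the first few time steps: if they already leave the [0, 1] range there, the problem is in the case setup, not in the decomposition.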
Tags
openfoam, parallel
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
Run Mode:Platform MPI Local Parallel core problem | mztcu | CFX | 0 | October 13, 2016 04:14 |
Problem with an old Simulation | FrankW | CFX | 3 | February 8, 2016 05:28 |
running multiple parallel cases in openfoam slows the simulation | kkpal | OpenFOAM Running, Solving & CFD | 2 | August 21, 2015 12:08 |
damBreak case parallel run problem | behzad-cfd | OpenFOAM Running, Solving & CFD | 5 | August 2, 2015 18:18 |
A parallel simulation problem in DPM | satum | FLUENT | 0 | October 21, 2008 06:38 |