Foam fatal error with rhoPimpleDyMFoam in parallel |
July 7, 2017, 04:42
Foam fatal error with rhoPimpleDyMFoam in parallel
#1
Member
Join Date: May 2017
Posts: 38
Rep Power: 9
Hello,
I am working on a compressible, subsonic, external-aerodynamics case with rhoPimpleDyMFoam. When I run the simulation in parallel, it fails after a few iterations with this fatal error:
Code:
[14]
[14] --> FOAM FATAL ERROR:
[14] Maximum number of iterations exceeded
[14]
[14]     From function Foam::scalar Foam::species::thermo<Thermo, Type>::T(Foam::scalar, Foam::scalar, Foam::scalar, Foam::scalar (Foam::species::thermo<Thermo, Type>::*)(Foam::scalar, Foam::scalar)const, Foam::scalar (Foam::species::thermo<Thermo, Type>::*)(Foam::scalar, Foam::scalar)const, Foam::scalar (Foam::species::thermo<Thermo, Type>::*)(Foam::scalar)const) const [with Thermo = Foam::hConstThermo<Foam::perfectGas<Foam::specie> >; Type = Foam::sensibleInternalEnergy; Foam::scalar = double; Foam::species::thermo<Thermo, Type> = Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy>]
[14]     in file /Software/OpenFOAM/OpenFOAM-v1606+/src/thermophysicalModels/specie/lnInclude/thermoI.H at line 66.
[14]
FOAM parallel run aborting
[14]
[14] #0  Foam::error::printStack(Foam::Ostream&)
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          cfdnode09 (PID 31528)
  MPI_COMM_WORLD rank: 14

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
 in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[14] #1  Foam::error::abort() in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[14] #2  Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy>::TEs(double, double, double) const in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfluidThermophysicalModels.so"
[14] #3  Foam::hePsiThermo<Foam::psiThermo, Foam::pureMixture<Foam::constTransport<Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy> > > >::calculate() in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfluidThermophysicalModels.so"
[14] #4  Foam::hePsiThermo<Foam::psiThermo, Foam::pureMixture<Foam::constTransport<Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy> > > >::correct() in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfluidThermophysicalModels.so"
[14] #5  ? in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/rhoPimpleDyMFoam"
[14] #6  __libc_start_main in "/lib64/libc.so.6"
[14] #7  ? in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/rhoPimpleDyMFoam"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 14 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
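For context, this error comes from the iteration in thermoI.H that recovers the temperature from the internal energy; when it fails to converge it usually means the energy or pressure field has become unphysical somewhere in the domain, which is more often a symptom of a too-large time step, a bad mesh region, or a diverging solution than of the thermophysical setup itself. One common precaution (not a guaranteed fix) is to let the solver adapt the time step to a maximum Courant number. Below is a minimal controlDict sketch with the standard adjustable-time-step entries; the numerical values are only placeholders, not taken from this case.
Code:
// Sketch of time controls for rhoPimpleDyMFoam (values are placeholders)
application     rhoPimpleDyMFoam;

startFrom       latestTime;
stopAt          endTime;
endTime         0.5;            // case-dependent

deltaT          1e-6;           // small initial time step
writeControl    adjustableRunTime;
writeInterval   0.01;

adjustTimeStep  yes;            // adapt deltaT to the Courant limit
maxCo           0.5;            // keep the Courant number low while the flow develops
maxDeltaT       1e-4;           // upper bound on the time step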
This is my fvSolution file:
Code:
solvers
{
    "rho.*"
    {
        solver          diagonal;
    }

    "p.*"
    {
        solver          GAMG;
        preconditioner  DILU;
        tolerance       1e-7;
        relTol          0;
        smoother        GaussSeidel;
    }

    "(U|e).*"
    {
        $p;
        tolerance       1e-6;
        smoother        DILU;
    }

    "(k|epsilon).*"
    {
        $p;
        tolerance       1e-6;
    }

    cellDisplacement
    {
        solver          GAMG;
        tolerance       1e-5;
        relTol          0;
        smoother        GaussSeidel;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }
}

PIMPLE
{
    nOuterCorrectors         1;
    nCorrectors              2;
    nNonOrthogonalCorrectors 0;
}

relaxationFactors
{
    fields
    {
        p               0.3;
    }
    equations
    {
        "(U|k|epsilon)"      0.7;
        "(U|k|epsilon)Final" 1.0;
    }
}
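One thing worth noting in this fvSolution (a suggestion only, not a verified fix): with nOuterCorrectors 1 the PIMPLE loop runs in PISO mode, every iteration counts as the "Final" one, so the 0.7/0.3 under-relaxation factors above are essentially never applied. For a compressible moving-mesh case, people often run several outer correctors with relaxation on the intermediate iterations. A sketch of what that could look like, with illustrative values:
Code:
// Sketch of a more PIMPLE-like setup (values are illustrative, not tested on this case)
PIMPLE
{
    nOuterCorrectors         3;   // outer pressure-momentum-energy loops per time step
    nCorrectors              2;
    nNonOrthogonalCorrectors 1;   // helps if the mesh is not perfectly orthogonal
}

relaxationFactors
{
    fields
    {
        p               0.3;
        pFinal          1.0;      // no relaxation on the final outer iteration
    }
    equations
    {
        "(U|e|k|epsilon)"      0.7;
        "(U|e|k|epsilon)Final" 1.0;
    }
}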
For U:
Code:
dimensions      [0 1 -1 0 0 0 0];

internalField   uniform (0 238 0);

boundaryField
{
    inlet
    {
        type            freestream;
        pInf            100000;
        TInf            300;
        UInf            (0 238 0);
        gamma           1.4;
        freestreamValue uniform (0 238 0);
    }
    outlet
    {
        type            inletOutlet;
        inletValue      uniform (0 238 0);
        value           uniform (0 238 0);
    }
    capsule
    {
        type            zeroGradient;
    }
    far_field
    {
        type            noSlip;
    }
}
For p:
Code:
dimensions      [1 -1 -2 0 0 0 0];

internalField   uniform 100000;

boundaryField
{
    inlet
    {
        type            zeroGradient;
    }
    outlet
    {
        type            inletOutlet;
        inletValue      uniform 100000;
        value           uniform 100000;
    }
    capsule
    {
        type            zeroGradient;
    }
    far_field
    {
        type            zeroGradient;
    }
}
For T:
Code:
dimensions      [0 0 0 1 0 0 0];

internalField   uniform 300;

boundaryField
{
    inlet
    {
        type            inletOutlet;
        inletValue      uniform 300;
        value           uniform 300;
    }
    outlet
    {
        type            inletOutlet;
        inletValue      uniform 300;
        value           uniform 300;
    }
    capsule
    {
        type            zeroGradient;
    }
    far_field
    {
        type            zeroGradient;
    }
}
For nut:
Code:
dimensions      [0 2 -1 0 0 0 0];

internalField   uniform 0;

boundaryField
{
    inlet
    {
        type            calculated;
        value           uniform 0;
    }
    outlet
    {
        type            calculated;
        value           uniform 0;
    }
    capsule
    {
        type            nutkWallFunction;
        Cmu             0.09;
        kappa           0.41;
        E               9.8;
        value           uniform 0;
    }
    far_field
    {
        type            zeroGradient;
    }
}
For alphat:
Code:
dimensions      [1 -1 -1 0 0 0 0];

internalField   uniform 0;

boundaryField
{
    inlet
    {
        type            calculated;
        value           uniform 0;
    }
    outlet
    {
        type            calculated;
        value           uniform 0;
    }
    capsule
    {
        type            compressible::alphatWallFunction;
        Prt             0.85;
        value           uniform 0;
    }
    far_field
    {
        type            zeroGradient;
    }
}

Thanks in advance.
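The fvSchemes file is not shown above, but divergence schemes are another common factor when the temperature/energy iteration blows up. Purely as an illustration of the kind of bounded/limited schemes often used for compressible cases, a fragment could look like the following; the entry names assume the standard rhoPimpleDyMFoam equation forms and are not taken from this case.
Code:
// Illustrative fvSchemes fragment (assumed names, not the schemes actually used in this case)
ddtSchemes
{
    default         Euler;
}

gradSchemes
{
    default         cellLimited Gauss linear 1;
}

divSchemes
{
    default         none;
    div(phi,U)      Gauss limitedLinearV 1;
    div(phi,e)      Gauss limitedLinear 1;
    div(phi,K)      Gauss limitedLinear 1;
    div(phiv,p)     Gauss limitedLinear 1;
    div(phi,k)      Gauss limitedLinear 1;
    div(phi,epsilon) Gauss limitedLinear 1;
    div(((rho*nuEff)*dev2(T(grad(U))))) Gauss linear;
}

laplacianSchemes
{
    default         Gauss linear limited 0.333;
}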
July 7, 2017, 06:15
#2
Senior Member
Oskar
Join Date: Nov 2015
Location: Poland
Posts: 184
Rep Power: 11
I found a thread with a similar problem. I cannot analyse it in more depth for lack of free time, but I hope you will find something useful here:
http://archive.is/58Isy

I wish you success!
sheaker
July 7, 2017, 07:01
#3
Member
Join Date: May 2017
Posts: 38
Rep Power: 9
Thanks for your reply, but I didn't find the solution in that thread.
Tags |
rhopimpledymfoam |