FV patch problems when running pisoFoam in parallel
July 14, 2010, 03:31
FV patch problems when running pisoFoam in parallel
#1
New Member
Chris Butler
Join Date: Jun 2010
Posts: 21
Rep Power: 16
Hi all,
I will begin this thread by stating that my experience in C++ is very limited. Despite this I have managed to modify Hrv's parabolic boundary condition (http://www.cfd-online.com/Forums/ope...tml#post190604) to perform a three dimensional version of the boundary condition. I run into trouble when I try and run the boundary condition in parallel: Code:
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the "fork()"
system call to create a child process. Open MPI is currently operating in a
condition that could result in memory corruption or other system errors;
your MPI job may hang, crash, or produce silent data corruption. The use of
fork() (or system() or other calls that create child processes) is strongly
discouraged.

The process that invoked fork was:

  Local host:          system011 (PID 17805)
  MPI_COMM_WORLD rank: 1

If you are *absolutely sure* that your application will successfully and
correctly survive a call to fork(), you may disable this warning by setting
the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[1] #0  Foam::error::printStack(Foam::Ostream&) in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #1  Foam::sigFpe::sigFpeHandler(int) in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #2  __restore_rt at sigaction.c:0
[1] #3  Foam::rectilinearVelocityFvPatchVectorField::updateCoeffs() in "/provider/project/user/OpenFOAM/1.6/lib/linux64GccDPOpt/librectilinearBC.so"
[1] #4  Foam::rectilinearVelocityFvPatchVectorField::rectilinearVelocityFvPatchVectorField(Foam::fvPatch const&, Foam::DimensionedField<Foam::Vector<double>, Foam::volMesh> const&, Foam::dictionary const&) in "/provider/project/user/OpenFOAM/1.6/lib/linux64GccDPOpt/librectilinearBC.so"
[1] #5  Foam::fvPatchField<Foam::Vector<double> >::adddictionaryConstructorToTable<Foam::rectilinearVelocityFvPatchVectorField>::New(Foam::fvPatch const&, Foam::DimensionedField<Foam::Vector<double>, Foam::volMesh> const&, Foam::dictionary const&) in "/provider/project/user/OpenFOAM/1.6/lib/linux64GccDPOpt/librectilinearBC.so"
[1] #6  Foam::fvPatchField<Foam::Vector<double> >::New(Foam::fvPatch const&, Foam::DimensionedField<Foam::Vector<double>, Foam::volMesh> const&, Foam::dictionary const&) in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/applications/bin/linux64GccDPOpt/pisoFoam"
[1] #7  Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh>::GeometricBoundaryField::GeometricBoundaryField(Foam::fvBoundaryMesh const&, Foam::DimensionedField<Foam::Vector<double>, Foam::volMesh> const&, Foam::dictionary const&) in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/applications/bin/linux64GccDPOpt/pisoFoam"
[1] #8  Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh>::readField(Foam::Istream&) in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/applications/bin/linux64GccDPOpt/pisoFoam"
[1] #9  Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh>::GeometricField(Foam::IOobject const&, Foam::fvMesh const&) in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/applications/bin/linux64GccDPOpt/pisoFoam"
[1] #10 main in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/applications/bin/linux64GccDPOpt/pisoFoam"
[1] #11 __libc_start_main in "/lib64/libc.so.6"
[1] #12 Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/usr/local/openfoam/1.6-gcc/OpenFOAM-1.6/applications/bin/linux64GccDPOpt/pisoFoam"
[system011:17805] *** Process received signal ***
[system011:17805] Signal: Floating point exception (8)
[system011:17805] Signal code:  (-6)
[system011:17805] Failing at address: 0x2100000458d
[system011:17805] [ 0] /lib64/libc.so.6 [0x38464302d0]
[system011:17805] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3846430265]
[system011:17805] [ 2] /lib64/libc.so.6 [0x38464302d0]
[system011:17805] [ 3] /provider/project/user/OpenFOAM/1.6/lib/linux64GccDPOpt/librectilinearBC.so(_ZN4Foam37rectilinearVelocityFvPatchVectorField12updateCoeffsEv+0x63e) [0x2aaab96eeaae]
[system011:17805] [ 4] /provider/project/user/OpenFOAM/1.6/lib/linux64GccDPOpt/librectilinearBC.so(_ZN4Foam37rectilinearVelocityFvPatchVectorFieldC1ERKNS_7fvPatchERKNS_16DimensionedFieldINS_6VectorIdEENS_7volMeshEEERKNS_10dictionaryE+0x319) [0x2aaab96edf19]
[system011:17805] [ 5] /provider/project/user/OpenFOAM/1.6/lib/linux64GccDPOpt/librectilinearBC.so(_ZN4Foam12fvPatchFieldINS_6VectorIdEEE31adddictionaryConstructorToTableINS_37rectilinearVelocityFvPatchVectorFieldEE3NewERKNS_7fvPatchERKNS_16DimensionedFieldIS2_NS_7volMeshEEERKNS_10dictionaryE+0x47) [0x2aaab96f4f77]
[system011:17805] [ 6] pisoFoam(_ZN4Foam12fvPatchFieldINS_6VectorIdEEE3NewERKNS_7fvPatchERKNS_16DimensionedFieldIS2_NS_7volMeshEEERKNS_10dictionaryE+0x234) [0x422ea4]
[system011:17805] [ 7] pisoFoam(_ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEE22GeometricBoundaryFieldC1ERKNS_14fvBoundaryMeshERKNS_16DimensionedFieldIS2_S4_EERKNS_10dictionaryE+0xd3) [0x42cbc3]
[system011:17805] [ 8] pisoFoam(_ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEE9readFieldERNS_7IstreamE+0xbe) [0x44158e]
[system011:17805] [ 9] pisoFoam(_ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEEC1ERKNS_8IOobjectERKNS_6fvMeshE+0xf3) [0x441be3]
[system011:17805] [10] pisoFoam [0x414a16]
[system011:17805] [11] /lib64/libc.so.6(__libc_start_main+0xf4) [0x384641d994]
[system011:17805] [12] pisoFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xd1) [0x413a09]
[system011:17805] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 17805 on node system011 exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------
This error happens on multiple systems. The boundary condition is (currently) time independent, so if I step forward once in serial I can successfully use the boundary condition by changing its type to fixedValue (nonuniform).
The error occurs under both 1.6 and 1.7. I have used simple decomposition to ensure that the boundary is not split across processors; this does not help. Any help would be much appreciated. Attached is a case directory with the boundary condition source included.

Cheers,
CB
July 19, 2010, 02:12
Error found
#2
New Member
Chris Butler
Join Date: Jun 2010
Posts: 21
Rep Power: 16
Just wanted to say I found the error: the parallel reduce within the bounding-box function was not enabled. With the bounding box incorrect, my boundary condition function produced singular values.
CB
March 21, 2011, 06:06
#3
Member
Jitao Liu
Join Date: Mar 2009
Location: Jinan , China
Posts: 64
Rep Power: 17
I am suffering from a similar problem to the one mentioned above. Please give me some suggestions. Thanks in advance. Here is the log:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM Extend Project: Open source CFD        |
|  \\    /   O peration     | Version:  1.6-ext                               |
|   \\  /    A nd           | Web:      www.extend-project.de                 |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 1.6-ext-1f09367282cf
Exec   : conjugateRHCMFoam -parallel
Date   : Mar 21 2011
Time   : 17:59:50
Host   : linux-szab
PID    : 7621
Case   : /home/of16ext/OpenFOAM/of16ext-1.6-ext/run/mycase/conjugateRHCMFoam/RHCM-electric/ERHCM-3D/ERHCM-3D-Parallel/polymer
nProcs : 4
Slaves : 3 ( linux-szab.7622 linux-szab.7623 linux-szab.7624 )
Pstream initialized with:
    floatTransfer   : 0
    nProcsSimpleSum : 0
    commsType       : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
// using new solver syntax:
pcorr { solver PCG; tolerance 1e-10; relTol 0; preconditioner DIC; }
// using new solver syntax:
pd { solver PCG; tolerance 1e-07; relTol 0.05; preconditioner DIC; }
// using new solver syntax:
pdFinal { solver PCG; tolerance 1e-07; relTol 0; preconditioner DIC; }
// using new solver syntax:
T+T { solver smoothSolver; smoother ILU; nSweeps 3; minIter 0; maxIter 1000; tolerance 1e-06; relTol 0; }
// using new solver syntax:
U { solver PBiCG; tolerance 1e-06; relTol 0; preconditioner DILU; }
// using new solver syntax:
T { solver PCG; preconditioner DIC; tolerance 1e-06; relTol 0; }
Reading g
Reading transportProperties
Reading field pd
Reading field alpha1
Reading field U
Reading field T
Reading/calculating face flux field phi
Reading transportProperties
Selecting incompressible transport model CrossArrhenius
Selecting incompressible transport model Newtonian
Calculating field g.h
Selecting turbulence model type laminar
Reading field Tsolid
Reading solid conductivity kT
time step continuity errors : sum local = 0.000183958, global = -0.000183958, cumulative = -0.000183958
DICPCG:  Solving for pcorr, Initial residual = 1, Final residual = 8.8536e-11, No Iterations 138
time step continuity errors : sum local = 1.62871e-14, global = 1.59112e-15, cumulative = -0.000183958
Courant Number mean: 7.7292e-05 max: 2.27224 velocity magnitude: 1.285
Starting time loop
Courant Number mean: 1.69873e-05 max: 0.499393 velocity magnitude: 1.285
deltaT = 0.00021978
Time = 0.00021978
Courant Number mean: 1.69873e-05 max: 0.499393 velocity magnitude: 1.285
MULES: Solving for alpha1
Liquid phase volume fraction = 2.02152e-05  Min(alpha1) = 0  Max(alpha1) = 1
MULES: Solving for alpha1
Liquid phase volume fraction = 3.90937e-05  Min(alpha1) = 0  Max(alpha1) = 1
DICPCG:  Solving for pd, Initial residual = 1, Final residual = 0.0188053, No Iterations 2
DICPCG:  Solving for pd, Initial residual = 1.3167e-05, Final residual = 1.02047e-07, No Iterations 2
DICPCG:  Solving for pd, Initial residual = 1.61937e-05, Final residual = 1.91329e-08, No Iterations 3
time step continuity errors : sum local = 4.82666e-08, global = -4.52055e-08, cumulative = -0.000184003
Solving the coupled energy equation
[linux-szab:07624] *** Process received signal ***
[linux-szab:07624] Signal: Floating point exception (8)
[linux-szab:07624] Signal code:  (-6)
[linux-szab:07624] Failing at address: 0x3e800001dc8
[linux-szab:07624] [ 0] /lib64/libc.so.6 [0x7fa07e3ef6e0]
[linux-szab:07624] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x7fa07e3ef645]
[linux-szab:07624] [ 2] /lib64/libc.so.6 [0x7fa07e3ef6e0]
[linux-szab:07624] [ 3] /home/of16ext/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libcoupledLduMatrix.so(_ZNK4Foam21coupledCholeskyPrecon12preconditionERNS_10FieldFieldINS_5FieldEdEERKS3_h+0x24c) [0x7fa081a3365c]
[linux-szab:07624] [ 4] /home/of16ext/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libcoupledLduMatrix.so(_ZNK4Foam18coupledIluSmoother6smoothERNS_10FieldFieldINS_5FieldEdEERKS3_hi+0x2e7) [0x7fa081a36167]
[linux-szab:07624] [ 5] /home/of16ext/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libcoupledLduMatrix.so(_ZNK4Foam19coupledSmoothSolver5solveERNS_10FieldFieldINS_5FieldEdEERKS3_h+0x6e0) [0x7fa081a3cc90]
[linux-szab:07624] [ 6] /home/of16ext/OpenFOAM/OpenFOAM-1.6-ext/lib/linux64GccDPOpt/libcoupledLduMatrix.so(_ZN4Foam15coupledFvMatrixIdE5solveERKNS_10dictionaryE+0x60c) [0x7fa081a453dc]
[linux-szab:07624] [ 7] conjugateRHCMFoam [0x43d82d]
[linux-szab:07624] [ 8] conjugateRHCMFoam [0x427f41]
[linux-szab:07624] [ 9] /lib64/libc.so.6(__libc_start_main+0xe6) [0x7fa07e3db586]
[linux-szab:07624] [10] conjugateRHCMFoam [0x4209e9]
[linux-szab:07624] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 7624 on node linux-szab exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------