|
November 24, 2008, 21:14 |
|
#1 |
New Member
Sara Schairer
Join Date: Mar 2009
Posts: 26
Rep Power: 17 |
Hi
I run a calculation with the solver interFoam in parallel on 8 cores. I decompose my case directory and start the calculation. Now, the strange thing is this: when I run the calculation on a single core I get an error message as well, but only at about t = 0.09 s. When I run exactly the same case in parallel, the error appears already after about 0.01 s. Can anybody explain why the behaviour is different when the computation runs in parallel? I have also tried almost every possible combination of BCs for pressure and velocity, yet at some point the calculation always breaks down. Is there anything besides the BCs that could be responsible for the error? Sorry for the long post. Cheers, Sara
This is what the error message looks like (ranks 0, 2 and 6 print the same trace interleaved; rank 0's trace shown here):
[0] #0 Foam::error::printStack(Foam::Ostream&) in "/opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #1 Foam::sigFpe::sigFpeHandler(int) in "/opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #2 ?????? in "/lib64/libc.so.6"
[0] #3 void Foam::MULES::limiter<Foam::oneField, Foam::zeroField, Foam::zeroField>(Foam::Field<double>&, Foam::oneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::zeroField const&, Foam::zeroField const&, double, double, int) in "/opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libfiniteVolume.so"
[0] #4 void Foam::MULES::explicitSolve<Foam::oneField, Foam::zeroField, Foam::zeroField>(Foam::oneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, Foam::zeroField const&, Foam::zeroField const&, double, double) in "/opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libfiniteVolume.so"
[0] #5 Foam::MULES::explicitSolve(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, double, double) in "/opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libfiniteVolume.so"
[0] #6 main in "/opt/OpenFOAM/OpenFOAM-1.5/applications/bin/linux64GccDPOpt/interFoam"
[0] #7 __libc_start_main in "/lib64/libc.so.6"
[0] #8 Foam::regIOobject::readIfModified() in "/opt/OpenFOAM/OpenFOAM-1.5/applications/bin/linux64GccDPOpt/interFoam"
[xe048:03366] *** Process received signal ***
[xe048:03366] Signal: Floating point exception (8)
[xe048:03366] Signal code: (-6)
[xe048:03366] Failing at address: 0x178a00000d26
[xe048:03366] [ 0] /lib64/libc.so.6 [0x2aaaace3b0b0]
[xe048:03366] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x2aaaace3b055]
[xe048:03366] [ 2] /lib64/libc.so.6 [0x2aaaace3b0b0]
[xe048:03366] [ 3] /opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libfiniteVolume.so(_ZN4Foam5MULES7limiterINS_8oneFieldENS_9zeroFieldES3_EEvRNS_5FieldIdEERKT_RKNS_14GeometricFieldIdNS_12fvPatchFieldENS_7volMeshEEERKNSA_IdNS_13fvsPatchFieldENS_11surfaceMeshEEESK_RKT0_RKT1_ddi+0xe6a) [0x2aaaab67d6fa]
[xe048:03366] [ 4] /opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libfiniteVolume.so(_ZN4Foam5MULES13explicitSolveINS_8oneFieldENS_9zeroFieldES3_EEvRKT_RNS_14GeometricFieldIdNS_12fvPatchFieldENS_7volMeshEEERKNS7_IdNS_13fvsPatchFieldENS_11surfaceMeshEEERSE_RKT0_RKT1_dd+0x274) [0x2aaaab6804e4]
[xe048:03366] [ 5] /opt/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libfiniteVolume.so(_ZN4Foam5MULES13explicitSolveERNS_14GeometricFieldIdNS_12fvPatchFieldENS_7volMeshEEERKNS1_IdNS_13fvsPatchFieldENS_11surfaceMeshEEERS8_dd+0x24) [0x2aaaab669f34]
[xe048:03366] [ 6] interFoam [0x4201af]
[xe048:03366] [ 7] /lib64/libc.so.6(__libc_start_main+0xf4) [0x2aaaace288a4]
[xe048:03366] [ 8] interFoam(_ZN4Foam11regIOobject14readIfModifiedEv+0x1c9) [0x41cda9]
[xe048:03366] *** End of error message ***
mpirun noticed that job rank 0 with PID 3360 on node xe048 exited on signal 15 (Terminated).
==============================================================================
Job ID: 47913.xe
User ID: schairer
Group ID: curtince
Job Name: openfoam_test
Session ID: 3248
Resource List: neednodes=1:ppn=8,nodes=1:ppn=8,walltime=01:00:00
Resources Used: cput=00:00:00,mem=10148kb,vmem=237236kb,walltime=00:01:12
Queue Name: normal
Account String: |
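For reference, the decompose-and-run workflow described above amounts to the following (a sketch; the subdomain count must match `numberOfSubdomains` in system/decomposeParDict, and the MPI launcher depends on the local installation):

```shell
# Split the case into 8 processor* directories as configured
# in system/decomposeParDict
decomposePar

# Run interFoam across the decomposed subdomains
# (OpenFOAM 1.5 ships with Open MPI's mpirun)
mpirun -np 8 interFoam -parallel > log.interFoam 2>&1

# Reassemble the decomposed results for post-processing
reconstructPar
```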
|
November 25, 2008, 03:29 |
|
#2 |
Senior Member
Gijsbert Wierink
Join Date: Mar 2009
Posts: 383
Rep Power: 18 |
Hi Sara,
In your error message it says there was a floating point exception. When that happens in my simulations, it is usually because the Courant number has exploded to a very high value, at which point the simulation terminates. An easy way to fix this is to make the time step smaller (this usually helps, but not always). There are also ways to run with a Courant-number-controlled time step, so that the time step decreases as the Co number increases. The fact that your simulation crashes roughly 8x as fast on 8 cores seems to be down to parallel speed-up :-). Try a much smaller time step and/or a Courant-limited time step. Regards, Gijs
__________________
Regards, Gijs |
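The Courant-number-controlled time step mentioned above can be switched on in the case's system/controlDict; a minimal sketch based on the settings used in the stock interFoam tutorials (keyword support may differ between OpenFOAM versions):

```
// system/controlDict (fragment): let interFoam adapt deltaT
// so the Courant number stays below maxCo
adjustTimeStep  yes;     // recompute deltaT every time step
maxCo           0.5;     // upper bound on the Courant number
maxDeltaT       1;       // never let deltaT grow beyond this
```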
|
April 19, 2011, 05:33 |
|
#3 |
New Member
Carl
Join Date: Jan 2011
Location: Bremen /Gothenburg
Posts: 11
Rep Power: 15 |
Hej Gijsbert, hej forum,
I have the same error message, but the time stepping is not the problem in my case. Do you know of any other error sources that might lead to this message? Regards, Cjm |
|
April 19, 2011, 06:05 |
|
#4 |
Senior Member
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Rep Power: 30 |
Division by Zero.
|
|
|
|