Running in parallel crashed

September 9, 2010, 10:50   #1
jingjing (zhajingjing)
Member
Join Date: Mar 2009
Location: Shanghai, China
Posts: 30
Hi,
I am running a 3D case of a wave passing over a fixed deck. It always crashes at some point when run in parallel on four CPU cores of one computer. The case runs fine in serial, but with such a large number of cells I would like to run it in parallel. Parallel runs worked fine for my previous 2D wave case, so I don't understand why it fails now.
Can someone help me?

Thanks
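For reference, this is roughly how I launch the run (a sketch; the actual case path and log name differ):

Code:
# decompose the case, then run the solver in parallel on 4 cores
decomposePar
mpirun -np 4 interDyMFoam -parallel > log 2>&1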


Code:
\*---------------------------------------------------------------------------*/
...
Courant Number mean: 0.01321465 max: 0.4910337
deltaT = 0.01
Time = 6.02

linear Wave piston position is 0.04913088
DICPCG: Solving for cellDisplacementx, Initial residual = 0.03646105, Final residual = 9.419756e-08, No Iterations 265
DICPCG: Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
DICPCG: Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
Execution time for mesh.update() = 20.98 s
time step continuity errors : sum local = 4.881283e-10, global = 2.962087e-11, cumulative = -1.050635e-05
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.605745e-08, No Iterations 276
DICPCG: Solving for pcorr, Initial residual = 3.071048e-08, Final residual = 3.071048e-08, No Iterations 0
time step continuity errors : sum local = 4.690172e-17, global = 1.462225e-18, cumulative = -1.050635e-05
MULES: Solving for gamma
Liquid phase volume fraction = 0.5008951 Min(gamma) = 0 Max(gamma) = 1
DILUPBiCG: Solving for Ux, Initial residual = 0.009444449, Final residual = 3.807359e-11, No Iterations 4
DILUPBiCG: Solving for Uy, Initial residual = 0.08637048, Final residual = 3.172662e-11, No Iterations 5
DILUPBiCG: Solving for Uz, Initial residual = 0.05539011, Final residual = 1.323595e-11, No Iterations 5
DICPCG: Solving for pd, Initial residual = 0.02780171, Final residual = 9.739116e-11, No Iterations 168
DICPCG: Solving for pd, Initial residual = 9.713962e-11, Final residual = 9.713962e-11, No Iterations 0
time step continuity errors : sum local = 4.956621e-10, global = -2.379498e-11, cumulative = -1.050638e-05
DICPCG: Solving for pd, Initial residual = 0.001852054, Final residual = 8.941101e-11, No Iterations 160
DICPCG: Solving for pd, Initial residual = 8.926664e-11, Final residual = 8.926664e-11, No Iterations 0
time step continuity errors : sum local = 4.561905e-10, global = 2.771491e-11, cumulative = -1.050635e-05
ExecutionTime = 2473.54 s ClockTime = 2603 s

Courant Number mean: 0.01306616 max: 0.8920261
deltaT = 0.01
Time = 6.03

linear Wave piston position is 0.05089371
DICPCG: Solving for cellDisplacementx, Initial residual = 0.03435676, Final residual = 9.554684e-08, No Iterations 264
DICPCG: Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
DICPCG: Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
Execution time for mesh.update() = 21.08 s
time step continuity errors : sum local = 4.562227e-10, global = 2.771687e-11, cumulative = -1.050632e-05
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.47173e-08, No Iterations 276
DICPCG: Solving for pcorr, Initial residual = 2.952091e-08, Final residual = 2.952091e-08, No Iterations 0
time step continuity errors : sum local = 4.322669e-17, global = -1.95121e-18, cumulative = -1.050632e-05
MULES: Solving for gamma
Liquid phase volume fraction = 0.5009314 Min(gamma) = 0 Max(gamma) = 1
DILUPBiCG: Solving for Ux, Initial residual = 0.009435191, Final residual = 2.906598e-11, No Iterations 4
DILUPBiCG: Solving for Uy, Initial residual = 0.08689201, Final residual = 5.580103e-12, No Iterations 5
DILUPBiCG: Solving for Uz, Initial residual = 0.0589454, Final residual = 2.203328e-12, No Iterations 5
DICPCG: Solving for pd, Initial residual = 0.02735342, Final residual = 9.269468e-11, No Iterations 168
DICPCG: Solving for pd, Initial residual = 9.255112e-11, Final residual = 9.255112e-11, No Iterations 0
time step continuity errors : sum local = 4.730309e-10, global = -2.418304e-11, cumulative = -1.050635e-05
DICPCG: Solving for pd, Initial residual = 0.001864015, Final residual = 9.437408e-11, No Iterations 158
DICPCG: Solving for pd, Initial residual = 9.42454e-11, Final residual = 9.42454e-11, No Iterations 0
time step continuity errors : sum local = 4.823429e-10, global = 3.309898e-11, cumulative = -1.050631e-05
ExecutionTime = 2545.61 s ClockTime = 2676 s

Courant Number mean: 0.01290612 max: 2.573966
deltaT = 0.003333333
Time = 6.033333

linear Wave piston position is 0.05147078
DICPCG: Solving for cellDisplacementx, Initial residual = 0.01111891, Final residual = 9.792253e-08, No Iterations 252
DICPCG: Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
DICPCG: Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
Execution time for mesh.update() = 19.96 s
time step continuity errors : sum local = 1.607847e-10, global = 1.103325e-11, cumulative = -1.05063e-05
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.673263e-08, No Iterations 277
DICPCG: Solving for pcorr, Initial residual = 3.079098e-08, Final residual = 3.079098e-08, No Iterations 0
time step continuity errors : sum local = 1.555729e-17, global = -3.750854e-19, cumulative = -1.05063e-05
MULES: Solving for gamma
Liquid phase volume fraction = 0.5009433 Min(gamma) = 0 Max(gamma) = 1
DILUPBiCG: Solving for Ux, Initial residual = 0.00230477, Final residual = 9.883591e-11, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 0.01309769, Final residual = 245330.1, No Iterations 1001
DILUPBiCG: Solving for Uz, Initial residual = 0.01205578, Final residual = 3.080377e-12, No Iterations 4
DICPCG: Solving for pd, Initial residual = 0.9999977, Final residual = 9.253089e-11, No Iterations 239
DICPCG: Solving for pd, Initial residual = 4.151806e-11, Final residual = 4.151806e-11, No Iterations 0
time step continuity errors : sum local = 2.035476e-05, global = -1.173797e-06, cumulative = -1.16801e-05
DICPCG: Solving for pd, Initial residual = 0.7254799, Final residual = 9.786953e-11, No Iterations 235
DICPCG: Solving for pd, Initial residual = 6.724763e-10, Final residual = 9.996866e-11, No Iterations 23
time step continuity errors : sum local = 4.238734e-06, global = 2.943024e-07, cumulative = -1.13858e-05
ExecutionTime = 2778.01 s ClockTime = 2912 s

Courant Number mean: 23435.28 max: 5.125833e+09
[1] [0] #0 #0 Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&) in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/ in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #1 Foam::sigFpe::sigFpeHandler(int)ibOpenFOAM.so"
[1] #1 Foam::sigFpe::sigFpeHandler(int) in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #2 in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #2 __restore_rt__restore_rt at sigaction.c:0
[0] #3 Foam::Time::adjustDeltaT() at sigaction.c:0
[1] #3 Foam::Time::adjustDeltaT() in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #4 in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #4 mainmain in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/applications/bin/linux64GccDPOpt/interDyMFoam"
[0] #5 __libc_start_main in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/applications/bin/linux64GccDPOpt/interDyMFoam"
[1] #5 __libc_start_main in "/lib64/libc.so.6"
[0] #6 in "/lib64/libc.so.6"
[1] #6 Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/jingjing/OpenFOAM/OpenFOAM-1.5/applications/bin/linux64GccDPOpt/interDyMFoam"
[jingjing:29413] *** Process received signal ***
[jingjing:29413] Signal: Floating point exception (8)
[jingjing:29413] Signal code: (-6)
[jingjing:29413] Failing at address: 0x1f4000072e5
[jingjing:29413] [ 0] /lib64/libc.so.6 [0x382fc322a0]
[jingjing:29413] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x382fc32215]
[jingjing:29413] [ 2] /lib64/libc.so.6 [0x382fc322a0]
[jingjing:29413] [ 3] /home/jingjing/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam4Time12adjustDeltaTEv+0x58) [0x7f83d24d3698]
[jingjing:29413] [ 4] interDyMFoam [0x420e57]
[jingjing:29413] [ 5] /lib64/libc.so.6(__libc_start_main+0xfa) [0x382fc1e32a]
[jingjing:29413] [ 6] interDyMFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xe9) [0x41de99]
[jingjing:29413] *** End of error message ***
Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const
mpirun noticed that job rank 0 with PID 29413 on node jingjing exited on signal 8 (Floating point exception).
1 additional process aborted (not shown)


September 14, 2010, 05:34   #2
su junwei (su_junwei)
Senior Member
Join Date: Mar 2009
Location: Xi'an, China
Posts: 151
It seems that you are using an adjustable time step. Perhaps after t = 6.02 the flow field changes rapidly and the time-step adjustment cannot react quickly enough. Try setting a smaller, fixed time step, restart the simulation, and see what happens.

Regards, Junwei
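In system/controlDict that would mean something like this (a sketch; the deltaT value is illustrative, just pick one well below the 0.01 that crashed):

Code:
// system/controlDict -- fixed, smaller time step (illustrative values)
adjustTimeStep  no;       // disable automatic time-step adjustment
deltaT          0.002;    // fixed step, smaller than the previous 0.01
maxCo           0.5;      // only used when adjustTimeStep is yes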

September 14, 2010, 06:01   #3
jingjing (zhajingjing)
Member
Join Date: Mar 2009
Location: Shanghai, China
Posts: 30
I have already tried a fixed time step, but that does not seem to be the key point of the problem.

You see:

Code:
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.5 |
| \\ / A nd | Web: http://www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Exec : /export/home/zjj/Deck/deck1/interDyMFoam -parallel
Date : Sep 14 2010
Time : 16:56:32
Host : node8
PID : 8093
Case : /export/home/zjj/Deck/deck1
nProcs : 4
Slaves :
3
(
node8.8094
node8.8095
node8.8096
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 22.698

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: displacementLaplacian
water depth 1.4
wave type linearWave
wave omega 5.5478
wave height 0.12
wave number 3.138377
w=1.99404
piston stroke 0.03008966
Selecting motion diffusion: inverseDistance

Reading environmentalProperties
Reading field pd

Reading field gamma

Reading field U

Reading/calculating face flux field phi

Reading transportProperties

Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting RAS turbulence model laminar
time step continuity errors : sum local = 1.296917e-05, global = 1.236028e-05, cumulative = 1.236028e-05
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.714769e-08, No Iterations 283
time step continuity errors : sum local = 1.259925e-12, global = -6.114055e-15, cumulative = 1.236028e-05
Courant Number mean: 0.009158565 max: 0.3974658

Starting time loop

Courant Number mean: 0.009158565 max: 0.3974658
Time = 22.706

linear Wave piston position is 0.001385484
DICPCG: Solving for cellDisplacementx, Initial residual = 0.2668942, Final residual = 9.819123e-07, No Iterations 258
DICPCG: Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
DICPCG: Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
Execution time for mesh.update() = 5.17 s
time step continuity errors : sum local = 1.282437e-05, global = -1.231289e-05, cumulative = 4.739272e-08
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.145362e-08, No Iterations 283
time step continuity errors : sum local = 1.172835e-12, global = 5.784445e-15, cumulative = 4.739273e-08
MULES: Solving for gamma
Liquid phase volume fraction = 0.499892 Min(gamma) = 0 Max(gamma) = 1
DILUPBiCG: Solving for Ux, Initial residual = 0.005605051, Final residual = 3.458147e-08, No Iterations 2
[0] #0 [1] #0 Foam::error::printStack(Foam::Ostream&)[2] #0 Foam::error::printStack(Foam::Ostream&)[3] #0 Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&) in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #1 Foam::sigFpe::sigFpeHandler(int) in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
in [0] #1 "/export/home/zjj/OpenFFoam::sigFpe::sigFpeHandler(int)OAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #1 Foam::sigFpe::sigFpeHandler(int) in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[3] #1 Foam::sigFpe::sigFpeHandler(int) in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #2 in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #2 in "/export/home/zjj/Ope in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[3] #2 nFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #2 __restore_rt__restore_rt__restore_rt__restore_rt at sigaction.c:0
[0] #3 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at sigaction.c:0
[1] #3 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at sigaction.c:0
[3] #3 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at sigaction.c:0
[2] #3 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[1] #4 in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #4 in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/ in "/export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenlinux64GccDPOpt/libOpenFOAM.so"FOAM.so"
[3] #4
[2] #4 Foam::fvMatrix<Foam::Vector<double> >::solve(Foam::Istream&)Foam::fvMatrix<Foam::Vector<double> >::solve(Foam::Istream&)Foam::fvMatrix<Foam::Vector<double> >::solve(Foam::Istream&)Foam::fvMatrix<Foam::Vector<double> >::solve(Foam::Istream&) in "/export/home/zjj/Deck/deck1/interDyMFoam"
[0] #5 in "/export/home/zjj/Deck/deck1/interDyMFoam"
[1] #5 in "/export/home/zjj/Deck/deck1/interDyMFoam"
[2] #5 in "/export/home/zjj/Deck/deck1/interDyMFoam"
[3] #5 mainmainmainmain in in "/export/home/zjj/Deck/deck1/interDyMFoam"
[1] #6 "/export/home/zjj/Deck/deck1/interDyMFoam"
[0] #6 __libc_start_main__libc_start_main in "/export/home/zjj/Deck/deck1/interDyMFoam"
[2] #6 __libc_start_main in "/export/home/zjj/Deck/deck1/interDyMFoam"
[3] #6 __libc_start_main in "/lib64/libc.so.6"
[0] #7 Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/lib64/libc.so.6"
[1] #7 Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/lib64/libc.so.6"
[2] #7 Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/lib64/libc.so.6"
[3] #7 Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/export/home/zjj/Deck/deck1/interDyMFoam"
[node8:08094] *** Process received signal ***
[node8:08094] Signal: Floating point exception (8)
[node8:08094] Signal code: (-6)
[node8:08094] Failing at address: 0x20100001f9e
in "/export/home/zjj/Deck/deck1/interDyMFoam"
[node8:08094] [ 0] /lib64/libc.so.6 [0x3d9d830280]
[node8:08094] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3d9d830215]
[node8:08094] [ 2] /lib64/libc.so.6 [0x3d9d830280]
[node8:08094] [ 3] /export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so(_ZNK4Foam5PBiCG5solveERNS_5FieldIdEERKS2_h+0x11db) [0x2ad890ba992b]
[node8:08094] [ 4] /export/home/zjj/Deck/deck1/interDyMFoam [0x45d64c]
[node8:08094] [ 5] /export/home/zjj/Deck/deck1/interDyMFoam [0x426875]
[node8:08094] [ 6] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3d9d81d974]
[node8:08094] [ 7] /export/home/zjj/Deck/deck1/interDyMFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xe9) [0x41de99]
[node8:08094] *** End of error message ***
[node8:08093] *** Process received signal ***
[node8:08093] Signal: Floating point exception (8)
[node8:08093] Signal code: (-6)
[node8:08093] Failing at address: 0x20100001f9d
[node8:08093] [ 0] /lib64/libc.so.6 [0x3d9d830280]
[node8:08093] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3d9d830215]
[node8:08093] [ 2] /lib64/libc.so.6 [0x3d9d830280]
[node8:08093] [ 3] /export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so(_ZNK4Foam5PBiCG5solveERNS_5FieldIdEERKS2_h+0x11db) [0x2b1815cf492b]
[node8:08093] [ 4] /export/home/zjj/Deck/deck1/interDyMFoam [0x45d64c]
[node8:08093] [ 5] /export/home/zjj/Deck/deck1/interDyMFoam [0x426875]
[node8:08093] [ 6] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3d9d81d974]
[node8:08093] [ 7] /export/home/zjj/Deck/deck1/interDyMFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xe9) [0x41de99]
[node8:08093] *** End of error message ***
in "/export/home/zjj/Deck/deck1/interDyMFoam"
[node8:08095] *** Process received signal ***
[node8:08095] Signal: Floating point exception (8)
[node8:08095] Signal code: (-6)
[node8:08095] Failing at address: 0x20100001f9f
[node8:08095] [ 0] /lib64/libc.so.6 [0x3d9d830280]
[node8:08095] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3d9d830215]
[node8:08095] [ 2] /lib64/libc.so.6 [0x3d9d830280]
[node8:08095] [ 3] /export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so(_ZNK4Foam5PBiCG5solveERNS_5FieldIdEERKS2_h+0x11db) [0x2b80f81bd92b]
[node8:08095] [ 4] /export/home/zjj/Deck/deck1/interDyMFoam [0x45d64c]
[node8:08095] [ 5] /export/home/zjj/Deck/deck1/interDyMFoam [0x426875]
[node8:08095] [ 6] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3d9d81d974]
[node8:08095] [ 7] /export/home/zjj/Deck/deck1/interDyMFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xe9) [0x41de99]
[node8:08095] *** End of error message ***
in "/export/home/zjj/Deck/deck1/interDyMFoam"
[node8:08096] *** Process received signal ***
[node8:08096] Signal: Floating point exception (8)
[node8:08096] Signal code: (-6)
[node8:08096] Failing at address: 0x20100001fa0
[node8:08096] [ 0] /lib64/libc.so.6 [0x3d9d830280]
[node8:08096] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3d9d830215]
[node8:08096] [ 2] /lib64/libc.so.6 [0x3d9d830280]
[node8:08096] [ 3] /export/home/zjj/OpenFOAM/OpenFOAM-1.5/lib/linux64GccDPOpt/libOpenFOAM.so(_ZNK4Foam5PBiCG5solveERNS_5FieldIdEERKS2_h+0x11db) [0x2b4a13bfa92b]
[node8:08096] [ 4] /export/home/zjj/Deck/deck1/interDyMFoam [0x45d64c]
[node8:08096] [ 5] /export/home/zjj/Deck/deck1/interDyMFoam [0x426875]
[node8:08096] [ 6] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3d9d81d974]
[node8:08096] [ 7] /export/home/zjj/Deck/deck1/interDyMFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xe9) [0x41de99]
[node8:08096] *** End of error message ***
mpirun noticed that job rank 0 with PID 8093 on node node8-ib exited on signal 8 (Floating point exception).
3 additional processes aborted (not shown)


I notice that it quite often crashes while solving Uy, so the U solver (PBiCG) usually shows up in the backtrace, although sometimes it does not.

I have tried the "simple" and "metis" decomposition methods (not "manual", which I don't know how to use), running on both my PC and a cluster, but the problem persists.

The case is a 3D wave tank: a wave is generated at one end by a moving wall and passes over a deck. The mesh was produced with Gambit.

I have tried to provide as much detail as possible. Could someone help me?
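For reference, by the "simple" method I mean a system/decomposeParDict roughly like this (a sketch; the coefficients are illustrative):

Code:
// system/decomposeParDict (illustrative)
numberOfSubdomains 4;

method          simple;    // also tried: metis

simpleCoeffs
{
    n           (2 2 1);   // split in x/y/z; must multiply to numberOfSubdomains
    delta       0.001;
}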

September 15, 2010, 04:55   #4
Snehal Janwe (snehal)
New Member
Join Date: May 2010
Location: Stuttgart
Posts: 10
Hi,

Try increasing the number of pressure correctors and of non-orthogonal correctors. If that doesn't help, try reducing the time step further while also applying both correctors. That should solve the problem.
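In system/fvSolution that would look something like this (a sketch; the counts are illustrative starting points):

Code:
// system/fvSolution -- PISO sub-dictionary (illustrative counts)
PISO
{
    nCorrectors              3;   // pressure correctors
    nNonOrthogonalCorrectors 2;   // helps on non-orthogonal meshes
}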

September 15, 2010, 08:12   #5
jingjing (zhajingjing)
Member
Join Date: Mar 2009
Location: Shanghai, China
Posts: 30
Hi snehal,
I will try your advice.

BTW, I ran the same case in OpenFOAM-1.5-dev (previously OpenFOAM-1.5) and got the following output. Again, it runs well in serial but has the problem in parallel.

Code:
[duanmu@localhost Deck]$ mpirun -np 2 interDyMFoam -parallel 
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.5-dev                               |
|   \\  /    A nd           | Revision: exported                              |
|    \\/     M anipulation  | Web:      http://www.OpenFOAM.org               |
\*---------------------------------------------------------------------------*/
Exec   : interDyMFoam -parallel
Date   : Sep 15 2010
Time   : 07:20:57
Host   : localhost.localdomain
PID    : 22287
Case   : /home/duanmu/zjj/Deck
nProcs : 2
Slaves : 
1
(
localhost.localdomain.22288
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : nonBlocking

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: displacementLaplacian
--> FOAM Warning : 
    From function dlLibraryTable::open(const fileName& functionLibName)
    in file db/dlLibraryTable/dlLibraryTable.C at line 86
    could not load /home/duanmu/OpenFOAM/OpenFOAM-1.6/lib/linuxGccDPOpt/libfvMotionSolvers.so: undefined symbol: _ZN4Foam15pointPatchFieldIdE5debugE
water depth 1.4
wave type linearWave
wave omega 5.5478
wave height 0.14
wave number 3.138377
w=1.99404
piston stroke 0.0351046
Selecting motion diffusion: inverseDistance

Reading environmentalProperties
Reading field pd

Reading field gamma

Reading field U

Reading/calculating face flux field phi

Reading transportProperties

Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting RAS turbulence model laminar
time step continuity errors : sum local = 0, global = 0, cumulative = 0
PCG:  Solving for pcorr, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0, global = 0, cumulative = 0
Courant Number mean: 0 max: 0 velocity magnitude: 0

Starting time loop

Courant Number mean: 0 max: 0 velocity magnitude: 0
deltaT = 0.008333333
Time = 0.008333333

linear Wave piston position is 3.750903e-05
PCG:  Solving for cellDisplacementx, Initial residual = 1, Final residual = 9.230985e-07, No Iterations 260
PCG:  Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
PCG:  Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
Execution time for mesh.update() = 21 s
time step continuity errors : sum local = 0, global = 0, cumulative = 0
PCG:  Solving for pcorr, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0, global = 0, cumulative = 0
MULES: Solving for gamma
Liquid phase volume fraction = 0.4998556  Min(gamma) = 0  Max(gamma) = 1
PBiCG:  Solving for Ux, Initial residual = 1, Final residual = 7.220864e-10, No Iterations 3
PBiCG:  Solving for Uy, Initial residual = 1, Final residual = 4.717869e-09, No Iterations 2
PBiCG:  Solving for Uz, Initial residual = 1, Final residual = 1.299526e-09, No Iterations 3
PCG:  Solving for pd, Initial residual = 1, Final residual = 9.549982e-08, No Iterations 121
time step continuity errors : sum local = 2.43082e-06, global = 5.194373e-09, cumulative = 5.194373e-09
PCG:  Solving for pd, Initial residual = 0.000362808, Final residual = 9.54551e-08, No Iterations 57
time step continuity errors : sum local = 5.057481e-06, global = 7.846905e-07, cumulative = 7.898849e-07
ExecutionTime = 50.74 s  ClockTime = 57 s

Courant Number mean: 0.0001599159 max: 0.01164876 velocity magnitude: 0.01410846
deltaT = 0.008333333
Time = 0.01666667

linear Wave piston position is 0.000149956
PCG:  Solving for cellDisplacementx, Initial residual = 0.7481973, Final residual = 9.81093e-07, No Iterations 266
PCG:  Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
PCG:  Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
[1] [0] 
[0] 

[0] face 193 area does not match neighbour by 199.322% -- possible face ordering problem.
patch: procBoundary0to1 my area:0.349388 neighbour area: 0.00059294 matching tolerance: 0.001
Mesh face: 890930 vertices: 4((19.519 1.34 2.235) (7.9 1.35 1.49) (7.90001 1.35 1.43071) (7.90001 1.34 1.43071))
Rerun with processor debug flag set for more information.[1] 

[1] [0] 
face 193 area does not match neighbour by 199.322% -- possible face ordering problem.
patch: procBoundary1to0 my area:0.00059294 neighbour area: 0.349388 matching tolerance: 0.001
Mesh face: 886645 vertices: 4((7.9 1.34 1.49) (7.90001 1.34 1.43071) (7.90001 1.35 1.43071) (7.9 1.35 1.49))
Rerun with processor debug flag set for more information.[0] 
    From function [1] processorPolyPatch::calcGeometry()

[0] [1]     From function     in file meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.CprocessorPolyPatch::calcGeometry()
[1]     in file  at line 204meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.C.
 at line 204[0] 
FOAM parallel run exiting
.
[0] [1] 

FOAM parallel run exiting
[1] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 22288 on
node localhost.localdomain exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[localhost.localdomain:22286] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[localhost.localdomain:22286] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Then I changed the following settings in etc/controlDict:
processor 1;
processorLduInterface 1;
processorLduInterfaceField 1;
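(These switches sit inside the DebugSwitches sub-dictionary of the global etc/controlDict; a sketch of the layout I mean:)

Code:
DebugSwitches
{
    processor                  1;
    processorLduInterface      1;
    processorLduInterfaceField 1;
}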


No additional information appeared:
Code:
Courant Number mean: 0.0001599159 max: 0.01164876 velocity magnitude: 0.01410846
deltaT = 0.008333333
Time = 0.01666667

linear Wave piston position is 0.000149956
PCG:  Solving for cellDisplacementx, Initial residual = 0.7481973, Final residual = 9.81093e-07, No Iterations 266
PCG:  Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 0
PCG:  Solving for cellDisplacementz, Initial residual = 0, Final residual = 0, No Iterations 0
[0] 
[0] 
[0] face 193 area does not match neighbour by 199.322% -- possible face ordering problem.
patch: procBoundary0to1 my area:0.349388 neighbour area: 0.00059294 matching tolerance: 0.001
Mesh face: 890930 vertices: 4((19.519 1.34 2.235) (7.9 1.35 1.49) (7.90001 1.35 1.43071) (7.90001 1.34 1.43071))
Rerun with processor debug flag set for more information.
[0] 
[0]     From function processorPolyPatch::calcGeometry()
[0]     in file meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.C[1] 
[1] 
[1] face 193 area does not match neighbour by 199.322% -- possible face ordering problem.
patch: procBoundary1to0 my area:0.00059294 neighbour area: 0.349388 matching tolerance: 0.001
Mesh face: 886645 vertices: 4((7.9 1.34 1.49) (7.90001 1.34 1.43071) (7.90001 1.35 1.43071) (7.9 1.35 1.49))
Rerun with processor debug flag set for more information.
[1] 
[1]     From function processorPolyPatch::calcGeometry()
[1]     in file meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.C at line 204.
[1] 
FOAM parallel run exiting
[1] 
 at line 204.--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

[0] 
FOAM parallel run exiting
[0] 
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 22771 on
node localhost.localdomain exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[localhost.localdomain:22769] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[localhost.localdomain:22769] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
