Parallel running of 3D multiphase turbulence model (unknown problem!!) |
October 28, 2011, 14:18
Parallel running of 3D multiphase turbulence model (unknown problem!!)
#1
Member
Dear friends, hi,
I faced a problem during execution of a 3D multiphase turbulence model, as follows:

Code:
geeko@linux:/opt/OpenFOAM/OpenFOAM-2.0.x/run/CDMA> mpirun -np 2 interFoam -parallel > log &
[1] 4220

Both processes crash with a floating point exception. The two ranks print the same (interleaved) backtrace; cleaned up, the frames are:

Code:
#0  Foam::error::printStack(Foam::Ostream&) in "/opt/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#2  in "/lib64/libc.so.6"
#3  void Foam::MULES::limiter<Foam::geometricOneField, Foam::zeroField, Foam::zeroField>(Foam::Field<double>&, Foam::geometricOneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::zeroField const&, Foam::zeroField const&, double, double, int) in "/opt/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#4  void Foam::MULES::explicitSolve<Foam::geometricOneField, Foam::zeroField, Foam::zeroField>(Foam::geometricOneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, Foam::zeroField const&, Foam::zeroField const&, double, double) in "/opt/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#5  Foam::MULES::explicitSolve(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, double, double) in "/opt/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#6  in "/opt/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/bin/interFoam"
#7  __libc_start_main in "/lib64/libc.so.6"
#8  at /usr/src/packages/BUILD/glibc-2.11.3/csu/../sysdeps/x86_64/elf/start.S:116

Each process then dumps the same trace again as raw (mangled) symbols, ending with:

Code:
[linux:04224] *** Process received signal ***
[linux:04224] Signal: Floating point exception (8)
[linux:04224] Signal code:  (-6)
[linux:04224] Failing at address: 0x3e800001080
[linux:04224] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 4224 on node linux exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------
[1]+  Exit 136                mpirun -np 2 interFoam -parallel > log

Also, at the end of the log file the error is:

Code:
MULES: Solving for alpha1
Liquid phase volume fraction = 0.0214814  Min(alpha1) = -2.21538e-20  Max(alpha1) = 1
MULES: Solving for alpha1
Liquid phase volume fraction = 0.0214816  Min(alpha1) = -1.56645e-20  Max(alpha1) = 1
DICPCG:  Solving for p_rgh, Initial residual = 0.264009, Final residual = 0.0106049, No Iterations 1
time step continuity errors : sum local = 2.49113e-06, global = -7.16946e-11, cumulative = -2.50461e-05
DICPCG:  Solving for p_rgh, Initial residual = 0.859341, Final residual = 0.0411729, No Iterations 105
time step continuity errors : sum local = 4.15288e+07, global = 3.019e+06, cumulative = 3.019e+06
DICPCG:  Solving for p_rgh, Initial residual = 0.988467, Final residual = 9.83101e-08, No Iterations 277
time step continuity errors : sum local = 32309.7, global = -382.232, cumulative = 3.01862e+06
DILUPBiCG:  Solving for epsilon, Initial residual = 1, Final residual = 6.48318e-09, No Iterations 69
DILUPBiCG:  Solving for k, Initial residual = 1, Final residual = 7.52821e-09, No Iterations 42
ExecutionTime = 2777.82 s  ClockTime = 3719 s
Courant Number mean: 5.98015e+11 max: 3.74606e+15
Interface Courant Number mean: 3.67632e+09 max: 2.2837e+13
deltaT = 1.98464e-21
--> FOAM Warning :
    From function Time::operator++()
    in file db/Time/Time.C at line 937
    Increased the timePrecision from 6 to 7 to distinguish between timeNames at time 0.771539
Time = 0.7715394
--> FOAM Warning :
    From function Time::operator++()
    in file db/Time/Time.C at line 937
    Increased the timePrecision from 7 to 8 to distinguish between timeNames at time 0.771539

Does anyone know what the problem is?
October 28, 2011, 14:52
#2
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Greetings Mohammad,
Quoting from my signature link:
Quote:
Best regards, Bruno
__________________
October 28, 2011, 15:25
#3
Senior Member
Kent Wardle
Join Date: Mar 2009
Location: Illinois, USA
Posts: 219
Rep Power: 21
Mohammad,
As Bruno suggests, take a look at the output. You have a floating point exception, which means something blew up. Your log shows, as is typical, that the velocity went crazy and drove your Courant number up to absurdly high values and your time step down to absurdly low ones. This is not uncommon in interFoam when something is wrong with the input. A few things to try in figuring it out:

First, give it a try with turbulence turned off. Does it run?

How about a different turbulence model? I use LES, since interFoam is already transient and the mesh and timestep are typically small anyway. When I have used kEpsilon I have always had lots of trouble with stability such as this. Which model is appropriate will depend a little on the problem you are looking at.

It looks like alpha1 is being conserved fine (min ~ 0, max = 1). Take a close look at your BCs for U, p (pd, p_rgh, whatever), k, and epsilon and make sure everything is consistent. If you want help, post them here.

Based on your run time (> 0.77 s), it looks like it must have gone OK for a while and then suddenly crashed? That is also not out of the ordinary in my experience. One thing I have recently started doing is adding a timeDump.H file to my solver near the end of the runTime.run() loop (see the sketch after this post). It reads an extra parameter from system/controlDict called minDeltaT (there is already a maxDeltaT there) and executes runTime.writeAndEnd() if the time step drops below that value. A sensible value depends on your case and what is typical; for me it is 1e-8. This will at least give you the field output for the last iteration and let you see what is happening -- maybe things are going crazy near a certain boundary or group of cells. It will also save time by killing the run once it becomes unreasonable, instead of letting it drive the time step all the way down to a non-conservation death. Perhaps this is useful to you or others.

One final thing: presumably you are using fvSchemes from one of the interFoam tutorials. You might try changing the divScheme on U to upwind just to see if it helps. You probably don't want this for accuracy, but it can help in diagnosis. Also, for interFoam cases, particular laplacianSchemes settings have been suggested to me. I am not an expert on the numerical subtleties, but these have worked for me.

Diagnosing these problems isn't always easy, but it does seem like it is usually an inconsistency in the input. Although that will usually manifest itself quickly, unlike your case, so....
Good luck,
Kent
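For anyone who wants to try the timeDump.H idea, here is a minimal sketch of what such a file could contain. To be clear, this is a reconstruction from the description above, not Kent's actual file; only the minDeltaT keyword and the writeAndEnd() behaviour come from his post.

Code:
// timeDump.H -- minimal sketch reconstructed from the description above.
// Reads an optional minDeltaT entry from system/controlDict; if the current
// time step has dropped below it, writes all fields and ends the run cleanly.
{
    const scalar minDeltaT
    (
        runTime.controlDict().lookupOrDefault<scalar>("minDeltaT", SMALL)
    );

    if (runTime.deltaTValue() < minDeltaT)
    {
        Info<< "deltaT = " << runTime.deltaTValue()
            << " dropped below minDeltaT = " << minDeltaT << nl
            << "Writing fields and ending the run." << endl;

        runTime.writeAndEnd();
    }
}

Likewise, a hypothetical fvSchemes fragment for the upwind test; the div keyword follows the stock interFoam tutorials, and the laplacianSchemes entry is just a commonly used choice, not necessarily the settings referred to above:

Code:
divSchemes
{
    // for diagnosis only -- upwind is very dissipative, revert for production
    div(rho*phi,U)  Gauss upwind;
}

laplacianSchemes
{
    default         Gauss linear corrected;
}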
November 1, 2011, 14:56
#4
Member
Dear Kent,
I ran my case without turbulence and it worked correctly. Then, as you suggested, I used an LES model, but I hit the same error as with kEpsilon. I have attached a picture of my model along with the 0 folder and the boundary file.
November 2, 2011, 12:07
#5
Senior Member
Kent Wardle
Join Date: Mar 2009
Location: Illinois, USA
Posts: 219
Rep Power: 21
In case it may be useful to others (or someone might disagree and have a better solution, which would be helpful to me ;-) ), this was my response to Mohammad, sent via e-mail; he had sent me all of his relevant files.

=====
Mohammad,
Indeed, if things ran fine with laminar, then it is a problem with the turbulence. In general I think things look OK; I just have a couple of suggestions of things to try.

First, I would not use a value of cAlpha greater than 1. While you can technically do so, it tends to produce unphysical instabilities at the interface. This may not be your particular problem, but it should help in the long term.

Also, if you are using LES in this case, you will need to update the BCs on nuSgs to be similar to what you have on k: inletOutlet for your outlet and cover surfaces. You could also try nuSgsWallFunction for the wall type if you want to use wall functions. Did you try a different RAS model, and did that help?

One final suggestion: try adding under-relaxation (and perhaps turning on momentumPredictor) in fvSolution. Add under-relaxation by adding the following to fvSolution:

Code:
relaxationFactors
{
    p_rgh           0.6;
    U               0.8;
    alpha1          1;
    k               0.6;
    epsilon         0.6;
}

The actual values are just suggestions; you may start with everything at 1 and just decrease k/epsilon (assuming you are trying the kEpsilon model). It is possible this might help with your stability problems as well.
Good luck,
Kent
---
One additional thing. Rather than using zeroGradient on your alpha1 boundaries, you might want to use inletOutlet for the cover and outlet (a sketch follows below). If your problem is due to reversed flow at the outlets (have you checked the last iteration's results to see?), this might help. Presumably you are mapping in some non-full inlet condition, so zeroGradient is OK there.
===============
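To make that last suggestion concrete, an inletOutlet entry in 0/alpha1 could look like the following; the patch name is hypothetical and the inletValue of 0 assumes pure gas comes back in through the outlet:

Code:
outlet
{
    type            inletOutlet;
    inletValue      uniform 0;   // alpha1 carried in by any reversed flow
    value           uniform 0;
}

inletOutlet behaves like zeroGradient wherever the flow leaves the domain and switches to the fixed inletValue wherever it re-enters, which is exactly what guards against the reversed-flow trouble mentioned above.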
November 11, 2011, 01:12
#6
Member
Dear Kent,
Hi. I was able to run my model to 1.5 without any error: I changed the time step to 5e-5 and turned adjustTimeStep off. But it took a long time. What do you suggest to reduce the run time? More basically, I want to know whether decreasing the time step is an efficient way of running turbulent models, and whether this time step is reasonable.
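For reference, the fixed-time-step setup described here corresponds to controlDict entries along these lines; the commented alternative is a common way to cut run time by letting interFoam adjust deltaT again under Courant-number caps (those values are assumptions, not from this thread):

Code:
// system/controlDict (fragment)
adjustTimeStep  no;
deltaT          5e-05;

// common alternative: re-enable adaptive stepping but cap it
//adjustTimeStep  yes;
//maxCo           0.5;
//maxAlphaCo      0.5;
//maxDeltaT       1e-03;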
November 22, 2015, 14:23
#7
Senior Member
Hasan K.J.
Join Date: Dec 2011
Location: Bristol, United Kingdom
Posts: 200
Rep Power: 15
Quote:
Can someone tell me where I have to add this timeDump.H file? I am using pimpleFoam and OF 2.2.0 for all my simulations. Can you please explain in detail, as I have no clue about the code inside the solvers?
Regards,
Hasan K.J
__________________
"Real knowledge is to know the extent of one's ignorance." - Confucius |
||
November 23, 2015, 11:53
#8
Senior Member
Kent Wardle
Join Date: Mar 2009
Location: Illinois, USA
Posts: 219
Rep Power: 21
Just put the file in your solver directory and insert the include statement in the main .C file for the solver (it has the same name as the solver, pimpleFoam.C in your case) at the end of the main loop, like this:
Code:
while (runTime.run())
{
    <solver stuff>

    runTime.write();

    #include "timeDump.H"

    <Info stuff here>
}
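Note that the solver must be recompiled before the include takes effect. Assuming you have copied pimpleFoam into your user solvers directory under a new name (the path below is hypothetical), the usual steps are:

Code:
# user copy of pimpleFoam with timeDump.H added; after renaming the solver,
# Make/files must also point at the new executable name
cd $WM_PROJECT_USER_DIR/applications/solvers/myPimpleFoam
wmake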