Problem running chtMultiRegionFoam in parallel
June 17, 2013, 06:34
Problem running chtMultiRegionFoam in parallel
#1
New Member
Giancarlo
Join Date: Apr 2013
Location: Milan
Posts: 21
Rep Power: 13
Hi FOAMers,
I have developed a new solver that is able to handle multi-region meshes. It has the same architecture as chtMultiRegionFoam, but it also solves the material and energy balances for reactive systems. When I launch a simulation in parallel I get a strange problem: the first time step runs without any trouble, but after that the solver crashes inside compressibleMultiRegionCourantNo.H. This is very strange: why should there be a problem if it just repeats the same operations as in the first time step? Can anyone help me?

Thanks, best regards,
Giancarlo
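For context, in the stock chtMultiRegionFoam of OpenFOAM 2.1.x that header is just a loop that re-evaluates the Courant number over every fluid region at the start of each time step. Roughly (quoting the stock sources from memory, so details may differ from my modified solver):

Code:
// compressibleMultiRegionCourantNo.H (stock 2.1.x, approximately):
// on every time step it re-reads rhoFluid[i] and phiFluid[i] from the
// PtrLists that were filled once in createFluidFields.H.
scalar CoNum = -GREAT;

forAll(fluidRegions, regionI)
{
    CoNum = max
    (
        compressibleCourantNo
        (
            fluidRegions[regionI],  // fvMesh of this fluid region
            runTime,
            rhoFluid[regionI],      // density field of this region
            phiFluid[regionI]       // mass flux of this region
        ),
        CoNum
    );
}
Nothing in the header persists between time steps, so a crash there at step two usually means one of the stored fields it re-reads no longer exists.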
June 18, 2013, 09:08

#2
New Member
Giancarlo
Join Date: Apr 2013
Location: Milan
Posts: 21
Rep Power: 13
I am posting the error below. Can anyone help me understand the nature of this error?

Thanks, Giancarlo

Code:
[1] #0  Foam::error::printStack(Foam::Ostream&) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1  Foam::sigSegv::sigHandler(int) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2  __restore_rt at sigaction.c:0
[1] #3  main in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[1] #4  __libc_start_main in "/lib64/libc.so.6"
[1] #5  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[2] #0  Foam::error::printStack(Foam::Ostream&) in "/home/cfduser1/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1  Foam::sigSegv::sigHandler(int) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2  __restore_rt at sigaction.c:0
[2] #3  main in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[2] #4  __libc_start_main in "/lib64/libc.so.6"
[2] #5  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[compute:18709] *** Process received signal ***
[compute:18709] Signal: Segmentation fault (11)
[compute:18709] Signal code:  (-6)
[compute:18709] Failing at address: 0x623b00004915
[compute:18709] [ 0] /lib64/libc.so.6 [0x367c230280]
[compute:18709] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x367c230215]
[compute:18709] [ 2] /lib64/libc.so.6 [0x367c230280]
[compute:18709] [ 3] multiFinal_test_parallel [0x456912]
[compute:18709] [ 4] /lib64/libc.so.6(__libc_start_main+0xf4) [0x367c21d974]
[compute:18709] [ 5] multiFinal_test_parallel(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0x151) [0x4204a9]
[compute:18709] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 18709 on node compute-3-11.local exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
[compute.local:18707] 2 more processes have sent help message help-mpi-btl-base.txt / btl:no-nics
[compute.local:18707] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[1]+  Exit 139    mpirun -np 3 multiFinal_test_parallel -parallel > log
August 5, 2013, 10:56

#3
New Member
M Bay
Join Date: Jun 2013
Location: Germany
Posts: 10
Rep Power: 13
Hello Giancarlo,
I'm having exactly the same problem. I noticed in my simulation that the temperature values in the air region are too high, and at the second time step I get the same error that you posted. It seems that OpenFOAM has trouble when it tries to calculate h in the fluid region. You could try to run the case serially, without decomposing it. If you get your case working, please let me know how you did it.

Regards
August 25, 2013, 08:50

#4
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Greetings to all!
mbay101's problem is being addressed here: http://www.cfd-online.com/Forums/ope...egionfoam.html

@Giancarlo: The key issue is a bad memory access. From your description, it looks like at least one field/array had already been destroyed by the time the first iteration finished.

Best regards,
Bruno
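The classic way that happens in a chtMultiRegionFoam-style solver is storing a pointer to a stack-allocated field in one of the per-region PtrLists. A minimal sketch (the names rhoFluid/UFluid/phiFluid/fluidRegions follow the stock createFluidFields.H; the broken variant is of course hypothetical):

Code:
// HYPOTHETICAL broken variant: phi is a local object, so the PtrList is
// left holding a dangling pointer once the scope closes. The first time
// step may still find the memory intact; the second one (e.g.
// compressibleMultiRegionCourantNo.H) reads freed memory -> SIGSEGV.
{
    surfaceScalarField phi
    (
        IOobject
        (
            "phi",
            runTime.timeName(),
            fluidRegions[i],
            IOobject::READ_IF_PRESENT,
            IOobject::AUTO_WRITE
        ),
        linearInterpolate(rhoFluid[i]*UFluid[i]) & fluidRegions[i].Sf()
    );
    phiFluid.set(i, &phi);   // dangling after the closing brace
}

// Correct (what the stock solver does): the PtrList takes ownership of
// a heap-allocated field that lives for the whole run.
phiFluid.set
(
    i,
    new surfaceScalarField
    (
        IOobject
        (
            "phi",
            runTime.timeName(),
            fluidRegions[i],
            IOobject::READ_IF_PRESENT,
            IOobject::AUTO_WRITE
        ),
        linearInterpolate(rhoFluid[i]*UFluid[i]) & fluidRegions[i].Sf()
    )
);
So it is worth checking every set(...) call and every reference your new region loops keep between time steps.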
December 21, 2016, 04:26

#5
Senior Member
Manu Chakkingal
Join Date: Feb 2016
Location: Delft, Netherlands
Posts: 129
Rep Power: 10
Hello,
I have a new solver with the structure of chtMultiRegionFoam that behaves like buoyantBoussinesqPimpleFoam for the fluid region. I ran the simulations in serial and the solution converges, but when I run it in parallel it crashes at the first time step (using OF 2.4.0). I tried to reduce the Courant number to attain initial stability, but then the solver crashes after the second time step instead. Code:
Test1
deltaT = 4.5530327e-107
Test2
--> FOAM Warning :
    From function Time::operator++()
    in file db/Time/Time.C at line 1061
    Increased the timePrecision from 62 to 63 to distinguish between timeNames at time 2.0707573e-07
Time = 2.07075734119138104400662664383858668770699296146631240844726562e-07

Solving for fluid region air
DILUPBiCG:  Solving for T, Initial residual = 0.010403611, Final residual = 2.7324966e-12, No Iterations 1
max(T) [0 0 0 1 0 0 0] 300.02011
DICPCG:  Solving for p_rgh, Initial residual = 1, Final residual = 0.0099236724, No Iterations 251
time step continuity errors : sum local = 5.0400286e-07, global = 1.0693134e-19
--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 17152 on node n11-42 exited on signal 8 (Floating point exception).
Problem solved: it was a temperature anomaly at the pressure reference cell.
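For anyone who hits the same floating point exception: the reference-cell handling involved follows the standard buoyantBoussinesqPimpleFoam pattern, roughly like this sketch (stock names p, p_rgh, rhok, gh, pRefCell; a per-region solver differs only in bookkeeping):

Code:
// createFields.H style: in a closed domain with no fixed-value pressure
// boundary, p_rgh is only defined up to a constant, so one cell is
// pinned to a reference value read from fvSolution.
label pRefCell = 0;
scalar pRefValue = 0.0;
setRefCell
(
    p_rgh,
    mesh.solutionDict().subDict("PIMPLE"),
    pRefCell,
    pRefValue
);

// pEqn.H style: after the pressure solve, shift the level so that the
// reference cell holds pRefValue. If the solution in that one cell is
// bad (e.g. a temperature/buoyancy anomaly there), the shift spreads it
// to the whole field and can end in a floating point exception.
if (p_rgh.needReference())
{
    p += dimensionedScalar
    (
        "p",
        p.dimensions(),
        pRefValue - getRefCellValue(p, pRefCell)
    );
    p_rgh = p - rhok*gh;
}
In my case the fix was resolving the temperature anomaly at the reference cell.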
__________________
Regards,
Manu

Last edited by manuc; December 30, 2016 at 03:06.