Segmentation fault when trying to run in parallel |
November 22, 2021, 03:01
Segmentation fault when trying to run in parallel
#1
New Member
swapnil
Join Date: Feb 2015
Location: India
Posts: 5
Rep Power: 11
Hi,
I am trying to read velocity data from files using codedFixedValue, applying a new velocity field every iteration (I have ~100 velocity input files). This works in serial (single core), but in parallel it crashes with:

mpirun noticed that process rank 0 with PID 0 on node user exited on signal 11 (Segmentation fault)

I also tried decomposing onto 40 cores and running, but I get the same error. I am using this code:
Code:
velocity-inlet-5
{
    type            codedFixedValue;
    value           uniform (0 0 0);
    name            MyProfile;

    code
    #{
        const fvPatch& boundaryPatch = patch();
        const vectorField& Cf = boundaryPatch.Cf();
        vectorField& field = *this;

        // To access time use this->db().time().value();
        scalar t = this->db().time().value();
        int index = this->db().time().timeIndex();
        Info<< "The time index is " << index << " and current time is " << t;

        // Cycle through the velocity snapshot files
        index = index % 4;

        #include "../../constant/Inlet_data/Velocity_Call"

        /*
        forAll(Cf, faceI)
        {
            //field[faceI] = vector(sin(t)*U_0*(1 - pow(Cf[faceI].y() - p_ctr, 2)/(p_r*p_r)), 0, 0);
            field[faceI] = field[faceI];
        }
        */
    #};
}
The Velocity_Call file selects which snapshot to include:
Code:
if (index == 1)
    #include "U0"
else if (index == 2)
    #include "U1"
else if (index == 3)
    #include "U2"
Each Un file assigns one velocity snapshot to the patch, for example:
Code:
{
    field =
    {
        {0.42623317, 0, 0},
        {0.42623317, 0, 0},
        {1.7901793, 0, 0},
        {1.1082062, 0, 0},
        {1.7901793, 0, 0},
        {1.562855, 0, 0},
        {1.1082062, 0, 0},
        {1.562855, 0, 0}
    };
}
The error message is:
Code:
mpirun -np 2 icoFoam -parallel

/*---------------------------------------------------------------------------*\
  =========                 |
  \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox
   \\    /   O peration     | Website:  https://openfoam.org
    \\  /    A nd           | Version:  8
     \\/     M anipulation  |
\*---------------------------------------------------------------------------*/
Build  : 8-1c9b5879390b
Exec   : icoFoam -parallel
Date   : Nov 22 2021
Time   : 12:02:19
Host   : "user-h110m-h"
PID    : 14501
I/O    : uncollated
Case   : /home/user/OpenFOAM/Programming/Joel_Guerrero/OF8/101programming/codeStream_BC/2Delbow_UparabolicInlet_timeDep1/test
nProcs : 2
Slaves : 1("user-h110m-h.14502")
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

Create time
Create mesh for time = 0

Reading transportProperties
Reading field p
Reading field U

Using dynamicCode for fixedValueFvPatchField MySnapProfile at line 31 in "/home/user/OpenFOAM/Programming/Joel_Guerrero/OF8/101programming/codeStream_BC/2Delbow_UparabolicInlet_timeDep1/test/processor0/0/U/boundaryField/velocity-inlet-5"

Reading/calculating face flux field phi

Starting time loop

fieldMinMax minmaxdomain write:
    min(p) = 0 in cell 0 at location (48.666667 63.333333 0) on processor 0
    max(p) = 0 in cell 0 at location (48.666667 63.333333 0) on processor 0
    min(U) = (0 0 0) in cell 0 at location (48.666667 63.333333 0) on processor 0
    max(U) = (0 0 0) in cell 0 at location (48.666667 63.333333 0) on processor 0

Time = 0.05

Courant Number mean: 0.00017826052 max: 0.17320508
[0] #0  Foam::error::printStack(Foam::Ostream&)The time index is 1 and current time is 0.05The field is (0.42623317 0 0) (0.42623317 0 0) (1.7901793 0 0) (1.1082062 0 0) (1.7901793 0 0) (1.562855 0 0) (1.1082062 0 0) (1.562855 0 0) at ??:?
[0] #1  Foam::sigSegv::sigHandler(int) at ??:?
[0] #2  ? in "/lib/x86_64-linux-gnu/libc.so.6"
[0] #3  Foam::tmp<Foam::Field<Foam::Vector<double> > > Foam::operator*<Foam::Vector<double> >(Foam::UList<double> const&, Foam::tmp<Foam::Field<Foam::Vector<double> > > const&) at ??:?
[0] #4  Foam::fv::gaussConvectionScheme<Foam::Vector<double> >::fvmDiv(Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&) const at ??:?
[0] #5  ? in "/opt/openfoam8/platforms/linux64GccDPInt32Opt/bin/icoFoam"
[0] #6  ? in "/opt/openfoam8/platforms/linux64GccDPInt32Opt/bin/icoFoam"
[0] #7  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[0] #8  ? in "/opt/openfoam8/platforms/linux64GccDPInt32Opt/bin/icoFoam"
[user-h110m-h:14501] *** Process received signal ***
[user-h110m-h:14501] Signal: Segmentation fault (11)
[user-h110m-h:14501] Signal code:  (-6)
[user-h110m-h:14501] Failing at address: 0x3e8000038a5
[user-h110m-h:14501] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x3f040)[0x7f0ca2546040]
[user-h110m-h:14501] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xc7)[0x7f0ca2545fb7]
[user-h110m-h:14501] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x3f040)[0x7f0ca2546040]
[user-h110m-h:14501] [ 3] /opt/openfoam8/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4FoammlINS_6VectorIdEEEENS_3tmpINS_5FieldIT_EEEERKNS_5UListIdEERKS7_+0x94)[0x7f0ca4f189c4]
[user-h110m-h:14501] [ 4] /opt/openfoam8/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZNK4Foam2fv21gaussConvectionSchemeINS_6VectorIdEEE6fvmDivERKNS_14GeometricFieldIdNS_13fvsPatchFieldENS_11surfaceMeshEEERKNS5_IS3_NS_12fvPatchFieldENS_7volMeshEEE+0x2c4)[0x7f0ca56ea494]
[user-h110m-h:14501] [ 5] icoFoam(+0x43add)[0x555d0ce46add]
[user-h110m-h:14501] [ 6] icoFoam(+0x23873)[0x555d0ce26873]
[user-h110m-h:14501] [ 7] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xe7)[0x7f0ca2528bf7]
[user-h110m-h:14501] [ 8] icoFoam(+0x25ada)[0x555d0ce28ada]
[user-h110m-h:14501] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node user exited on signal 11 (Segmentation fault).
Can anyone please help me with this error? I need to run this in order to do LES with a different velocity snapshot every iteration.
November 22, 2021, 06:22
#2
Senior Member
Mark Olesen
Join Date: Mar 2009
Location: https://olesenm.github.io/
Posts: 1,714
Rep Power: 40
No comments about style or anything, but please take a closer look at what your "U0, U1, U2" files look like. They assign exactly eight values to the field!
Think about what would happen with your simulation (even in serial) if you ran with more or fewer faces on that boundary patch: you could expect it to fail. Now check your assumptions for the parallel case. Using printf-style debugging, add this bit of code:
Code:
code
#{
    Pout<< "Patch has " << patch().size() << " faces" << endl;
    ...
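As a side note, a guard along these lines (just a sketch; the 8 is a placeholder for however many entries your data files actually contain) would turn a size mismatch into a readable error instead of a segmentation fault:
Code:
code
#{
    vectorField& field = *this;

    // In parallel each processor owns only its share of the patch
    // faces, so the local size can differ from the number of values
    // stored in the U0/U1/U2 files.
    const label nData = 8;  // placeholder: entries per data file

    if (field.size() != nData)
    {
        FatalErrorInFunction
            << "Patch " << patch().name() << " has " << field.size()
            << " local faces, but the data file provides " << nData
            << " values." << exit(FatalError);
    }
    ...
#};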
November 22, 2021, 07:41
#3
New Member
swapnil
Join Date: Feb 2015
Location: India
Posts: 5
Rep Power: 11
Thank you for replying, sir.
I am aware that this code will fail if the mesh is changed; I am not good enough at OpenFOAM programming yet to automate that. I am using 8 values for the velocity because I know that is the face count on the patch; I am just trying to test it in serial as well as in parallel. In the actual case the inlet has a few hundred faces. I put Pout<< "Patch has " << patch().size() << " faces" << endl; into the code: in serial it is fine, but in parallel one processor reports 8 faces on the patch and the other reports 0 faces! So this could be the reason for the failure.
Code:
Time = 0.05

Courant Number mean: 0.00017826052 max: 0.17320508
[1] Patch has 8 faces
The time index is 1 and current time is 0.05[0] Patch has 0 faces
[0] #0  Foam::error::printStack(Foam::Ostream&)The field is (0.42623317 0 0) (0.42623317 0 0) (1.7901793 0 0) (1.1082062 0 0) (1.7901793 0 0) (1.562855 0 0) (1.1082062 0 0) (1.562855 0 0) at ??:?
[0] #1  Foam::sigSegv::sigHandler(int) at ??:?
[0] #2  ? in "/lib/x86_64-linux-gnu/libc.so.6"
[0] #3  Foam::tmp<Foam::Field<Foam::Vector<double> > > Foam::operator*<Foam::Vector<double> >(Foam::UList<double> const&, Foam::tmp<Foam::Field<Foam::Vector<double> > > const&) at ??:?
[0] #4  Foam::fv::gaussConvectionScheme<Foam::Vector<double> >::fvmDiv(Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&) const at ??:?
[0] #5  ? in "/opt/openfoam8/platforms/linux64GccDPInt32Opt/bin/icoFoam"
[0] #6  ? in "/opt/openfoam8/platforms/linux64GccDPInt32Opt/bin/icoFoam"
[0] #7  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[0] #8  ?[user-h110m-h:22969] *** Process received signal *** in "/opt/openfoam8/platforms/linux64GccDPInt32Opt/bin/icoFoam"
[user-h110m-h:22969] Signal: Segmentation fault (11)
[user-h110m-h:22969] Signal code:  (-6)
[user-h110m-h:22969] Failing at address: 0x3e8000059b9
[user-h110m-h:22969] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x3f040)[0x7f06c0df4040]
[user-h110m-h:22969] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xc7)[0x7f06c0df3fb7]
[user-h110m-h:22969] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x3f040)[0x7f06c0df4040]
[user-h110m-h:22969] [ 3] /opt/openfoam8/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4FoammlINS_6VectorIdEEEENS_3tmpINS_5FieldIT_EEEERKNS_5UListIdEERKS7_+0x94)[0x7f06c37c69c4]
[user-h110m-h:22969] [ 4] /opt/openfoam8/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZNK4Foam2fv21gaussConvectionSchemeINS_6VectorIdEEE6fvmDivERKNS_14GeometricFieldIdNS_13fvsPatchFieldENS_11surfaceMeshEEERKNS5_IS3_NS_12fvPatchFieldENS_7volMeshEEE+0x2c4)[0x7f06c3f98494]
[user-h110m-h:22969] [ 5] icoFoam(+0x43add)[0x560e2db60add]
[user-h110m-h:22969] [ 6] icoFoam(+0x23873)[0x560e2db40873]
[user-h110m-h:22969] [ 7] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xe7)[0x7f06c0dd6bf7]
[user-h110m-h:22969] [ 8] icoFoam(+0x25ada)[0x560e2db42ada]
[user-h110m-h:22969] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node user exited on signal 11 (Segmentation fault).
Or maybe this method just will not work in parallel.
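If the serial face ordering is the problem, maybe I need to store the face centre together with each velocity and match by position instead of by index. A rough sketch of what I mean (untested; the two hard-coded entries are placeholders for centre/velocity pairs written out from the serial run):
Code:
code
#{
    const vectorField& Cf = patch().Cf();
    vectorField& field = *this;

    // Placeholder data: one stored face centre and one velocity per
    // entry, standing in for rewritten U0/U1/U2 files.
    List<vector> centres(2);
    List<vector> values(2);
    centres[0] = vector(0, 0, 0);    values[0] = vector(0.42623317, 0, 0);
    centres[1] = vector(0, 0.1, 0);  values[1] = vector(0.42623317, 0, 0);

    forAll(Cf, faceI)
    {
        // Match each local face to the nearest stored centre, so it
        // no longer matters which processor owns which faces.
        label nearest = 0;
        scalar minDist = GREAT;
        forAll(centres, i)
        {
            const scalar d = mag(Cf[faceI] - centres[i]);
            if (d < minDist)
            {
                minDist = d;
                nearest = i;
            }
        }
        field[faceI] = values[nearest];
    }
#};
Or perhaps the built-in timeVaryingMappedFixedValue boundary condition, which maps stored point values onto the patch and interpolates in time, would be a better fit than coding this myself.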
Tags |
les simulation, openmpi, parallel, segmentation fault