
SigFpe when running ANY application in parallel

April 23, 2015, 05:55 | #1
[SOLVED] SigFpe when running ANY application in parallel
Pj. (Luca)
Hi everybody.

I have a very simple case made of a box-shaped volume created as a single block in blockMesh. The mesh is made of cubic cells, without any non-orthogonality. Everything runs fine until I try to run the solver, or any other application, using mpirun: the polyMesh loading then fails with a sigFpe error.
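For reference, the cyclic patches of such a single-block blockMeshDict are declared as paired halves in the boundary section, roughly as in the sketch below. This is only an illustration: the face lists are omitted and the bottom/top and left/right pairings are assumed, since the actual blockMeshDict is not posted here.
Code:
// Sketch of the cyclic patch declarations (OpenFOAM 2.2 syntax; pairings assumed)
boundary
(
    cyclic_bottom
    {
        type            cyclic;
        neighbourPatch  cyclic_top;     // each cyclic half names its partner
        faces           ( /* bottom face of the block */ );
    }
    cyclic_top
    {
        type            cyclic;
        neighbourPatch  cyclic_bottom;
        faces           ( /* top face of the block */ );
    }
    cyclic_left
    {
        type            cyclic;
        neighbourPatch  cyclic_right;
        faces           ( /* left face of the block */ );
    }
    cyclic_right
    {
        type            cyclic;
        neighbourPatch  cyclic_left;
        faces           ( /* right face of the block */ );
    }
    // in, out and net are the remaining, non-cyclic patches of the case
);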

As a simple demonstration of the error, I run checkMesh (single core), then decomposePar, and then checkMesh again on the decomposed case.
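The command sequence, run from the case directory, is roughly the following (the parallel step is launched with mpirun; on a cluster it may be wrapped in a scheduler script):
Code:
# serial mesh check on the undecomposed case
checkMesh

# split the case into 10 subdomains according to system/decomposeParDict
decomposePar

# mesh check on the decomposed case, one MPI rank per subdomain
mpirun -np 10 checkMesh -parallel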

When I run checkMesh, this is the output:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.1-57f3c3617a2d
Exec   : checkMesh
Date   : Apr 23 2015
Time   : 10:06:40
Host   : "node166"
PID    : 30400
Case   : /gpfs/scratch/userexternal/lamerio0/Rete/M3
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create polyMesh for time = 0

Time = 0

Mesh stats
    points:           6031169
    faces:            17826816
    internal faces:   17562624
    cells:            5898240
    faces per cell:   6
    boundary patches: 7
    point zones:      0
    face zones:       0
    cell zones:       0

Overall number of cells of each type:
    hexahedra:     5898240
    prisms:        0
    wedges:        0
    pyramids:      0
    tet wedges:    0
    tetrahedra:    0
    polyhedra:     0

Checking topology...
    Boundary definition OK.
    Cell to face addressing OK.
    Point usage OK.
    Upper triangular ordering OK.
    Face vertices OK.
    Number of regions: 1 (OK).

Checking patch topology for multiply connected surfaces...
    Patch               Faces    Points   Surface topology
    cyclic_bottom       61440    62177    ok (non-closed singly connected)
    cyclic_top          61440    62177    ok (non-closed singly connected)
    cyclic_left         61440    62177    ok (non-closed singly connected)
    cyclic_right        61440    62177    ok (non-closed singly connected)
    in                  5040     5328     ok (non-closed singly connected)
    out                 9216     9409     ok (non-closed singly connected)
    net                 4176     4649     ok (non-closed singly connected)

Checking geometry...
    Overall domain bounding box (0 -0.04 -0.04) (0.8 0.04 0.04)
    Mesh (non-empty, non-wedge) directions (1 1 1)
    Mesh (non-empty) directions (1 1 1)
    Boundary openness (1.7218198e-16 3.407543e-16 -4.6831815e-17) OK.
    Max cell openness = 3.5101045e-16 OK.
    Max aspect ratio = 1.5 OK.
    Minimum face area = 6.9444444e-07. Maximum face area = 1.0416667e-06.  Face area magnitudes OK.
    Min volume = 8.6805556e-10. Max volume = 8.6805556e-10.  Total volume = 0.00512.  Cell volumes OK.
    Mesh non-orthogonality Max: 0 average: 0
    Non-orthogonality check OK.
    Face pyramids OK.
    Max skewness = 2.8799967e-06 OK.
    Coupled point location match (average 1.232301e-17) OK.

Mesh OK.

End
Sounds great!

Now I run decomposePar with this decomposeParDict:
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 10;
method scotch;

distributed false;

roots
(
);
It returns:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.1-57f3c3617a2d
Exec   : decomposePar
Date   : Apr 23 2015
Time   : 10:00:12
Host   : "node166"
PID    : 23719
Case   : /gpfs/scratch/userexternal/lamerio0/Rete/M3
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod scotch

Finished decomposition in 17.57 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
    Number of cells = 591664
    Number of faces shared with processor 1 = 12561
    Number of faces shared with processor 1 = 15
    Number of faces shared with processor 1 = 24
    Number of faces shared with processor 3 = 12095
    Number of faces shared with processor 3 = 13
    Number of faces shared with processor 3 = 4
    Number of faces shared with processor 3 = 8
    Number of processor patches = 7
    Number of processor faces = 24720
    Number of boundary faces = 24004

Processor 1
    Number of cells = 589712
    Number of faces shared with processor 0 = 12561
    Number of faces shared with processor 0 = 15
    Number of faces shared with processor 0 = 24
    Number of faces shared with processor 9 = 12199
    Number of faces shared with processor 9 = 2
    Number of faces shared with processor 9 = 2
    Number of faces shared with processor 9 = 10
    Number of faces shared with processor 9 = 3
    Number of processor patches = 8
    Number of processor faces = 24816
    Number of boundary faces = 25086

Processor 2
    Number of cells = 590427
    Number of faces shared with processor 4 = 11695
    Number of faces shared with processor 4 = 4
    Number of faces shared with processor 4 = 6
    Number of faces shared with processor 4 = 5
    Number of processor patches = 4
    Number of processor faces = 11710
    Number of boundary faces = 33342

Processor 3
    Number of cells = 591917
    Number of faces shared with processor 0 = 12095
    Number of faces shared with processor 0 = 4
    Number of faces shared with processor 0 = 13
    Number of faces shared with processor 0 = 8
    Number of faces shared with processor 4 = 11647
    Number of faces shared with processor 4 = 6
    Number of faces shared with processor 4 = 2
    Number of faces shared with processor 4 = 1
    Number of faces shared with processor 4 = 14
    Number of processor patches = 9
    Number of processor faces = 23790
    Number of boundary faces = 24428

Processor 4
    Number of cells = 589664
    Number of faces shared with processor 2 = 11695
    Number of faces shared with processor 2 = 4
    Number of faces shared with processor 2 = 5
    Number of faces shared with processor 2 = 6
    Number of faces shared with processor 3 = 11647
    Number of faces shared with processor 3 = 2
    Number of faces shared with processor 3 = 6
    Number of faces shared with processor 3 = 14
    Number of faces shared with processor 3 = 1
    Number of processor patches = 9
    Number of processor faces = 23380
    Number of boundary faces = 24842

Processor 5
    Number of cells = 590639
    Number of faces shared with processor 6 = 12086
    Number of faces shared with processor 6 = 1
    Number of faces shared with processor 6 = 7
    Number of faces shared with processor 6 = 1
    Number of faces shared with processor 6 = 13
    Number of processor patches = 5
    Number of processor faces = 12108
    Number of boundary faces = 33578

Processor 6
    Number of cells = 589995
    Number of faces shared with processor 5 = 12086
    Number of faces shared with processor 5 = 7
    Number of faces shared with processor 5 = 1
    Number of faces shared with processor 5 = 13
    Number of faces shared with processor 5 = 1
    Number of faces shared with processor 7 = 11834
    Number of faces shared with processor 7 = 10
    Number of faces shared with processor 7 = 1
    Number of faces shared with processor 7 = 19
    Number of processor patches = 9
    Number of processor faces = 23972
    Number of boundary faces = 24414

Processor 7
    Number of cells = 589221
    Number of faces shared with processor 6 = 11834
    Number of faces shared with processor 6 = 1
    Number of faces shared with processor 6 = 10
    Number of faces shared with processor 6 = 19
    Number of faces shared with processor 8 = 11479
    Number of faces shared with processor 8 = 12
    Number of faces shared with processor 8 = 5
    Number of faces shared with processor 8 = 8
    Number of processor patches = 8
    Number of processor faces = 23368
    Number of boundary faces = 25196

Processor 8
    Number of cells = 588220
    Number of faces shared with processor 7 = 11479
    Number of faces shared with processor 7 = 5
    Number of faces shared with processor 7 = 12
    Number of faces shared with processor 7 = 8
    Number of faces shared with processor 9 = 11772
    Number of faces shared with processor 9 = 20
    Number of faces shared with processor 9 = 10
    Number of faces shared with processor 9 = 2
    Number of processor patches = 8
    Number of processor faces = 23308
    Number of boundary faces = 23794

Processor 9
    Number of cells = 586781
    Number of faces shared with processor 1 = 12199
    Number of faces shared with processor 1 = 2
    Number of faces shared with processor 1 = 2
    Number of faces shared with processor 1 = 3
    Number of faces shared with processor 1 = 10
    Number of faces shared with processor 8 = 11772
    Number of faces shared with processor 8 = 20
    Number of faces shared with processor 8 = 2
    Number of faces shared with processor 8 = 10
    Number of processor patches = 9
    Number of processor faces = 24020
    Number of boundary faces = 25052

Number of processor faces = 107596
Max number of cells = 591917 (0.35485162% above average 589824)
Max number of processor patches = 9 (18.421053% above average 7.6)
Max number of faces between processors = 24816 (15.320272% above average 21519.2)

Time = 0

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer
Processor 4: field transfer
Processor 5: field transfer
Processor 6: field transfer
Processor 7: field transfer
Processor 8: field transfer
Processor 9: field transfer

End.
But if I run any application with mpirun on the decomposed case, I receive a sigFpe error right after the "Create polyMesh for time = 0" phase.
For example, running "mpirun -np 10 checkMesh -parallel" gives:

[error moved to the next post due to the character limit]

I tried switching the OpenFOAM version from 2.3.0 to 2.2.1, and I also changed cluster; nothing worked.

I also changed the number of processors from 10 to 9 and to 11, but that did not work either.

How can I solve the problem?

Last edited by Pj.; April 23, 2015 at 08:28.

April 23, 2015, 05:56 | #2
Pj. (Luca)
This is the sigFpe error returned by checkMesh when run in parallel:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.1-57f3c3617a2d
Exec   : checkMesh -parallel
Date   : Apr 23 2015
Time   : 10:05:51
Host   : "node166"
PID    : 29821
Case   : /gpfs/scratch/userexternal/lamerio0/Rete/M3
nProcs : 10
Slaves :
9
(
"node166.29822"
"node166.29823"
"node166.29824"
"node166.29825"
"node166.29826"
"node166.29827"
"node166.29828"
"node166.29829"
"node166.29830"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create polyMesh for time = 0


checkMesh:29827 terminated with signal 11 at PC=7fdfce7f54da SP=7fff71ac1150.  Backtrace:
[2] #0  [7] Foam::error::printStack(Foam::Ostream&)#0  [8] #0  [9] Foam::error::printStack(Foam::Ostream&)#0  [5] #0  [4] Foam::error::printStack(Foam::Ostream&)#0  Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&)[3] #0  Foam::error::printStack(Foam::Ostream&)[0] #0  Foam::error::printStack(Foam::Ostream&)[1] #0  Foam::error::printStack(Foam::Ostream&) at ??:?
[3] #1  Foam::sigFpe::sigHandler(int) at ??:?
 at ??:?
[9] #1   at ??:?
[7] #1  Foam::sigSegv::sigHandler(int)[0] #1  Foam::sigSegv::sigHandler(int) at ??:?
Foam::sigSegv::sigHandler(int)[4] #1   at ??:?
[2] #1  Foam::sigSegv::sigHandler(int)Foam::sigSegv::sigHandler(int) at ??:?
[1] #1  Foam::sigSegv::sigHandler(int) at ??:?
[5] # at ??:?
[8] #1  1  Foam::sigSegv::sigHandler(int)Foam::sigSegv::sigHandler(int) at ??:?
[3] #2   at ??:?
[7] #2   at ??:?
[9] #2   at ??:?
[2] #2   at ??:?
[5] #2   in "/lib64/libc.so.6"
[3] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) at ??:?
[1] #2   at ??:?
[0] #2   at ??:?
[8] #2   at ??:?
[4] #2   in "/lib64/libc.so.6"
[7] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/lib64/libc.so.6"
[9] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/lib64/libc.so.6"
[2] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/lib64/libc.so.6"
[5] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/lib64/libc.so.6"
[1] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) at ??:?
[3] #4  Foam::polyBoundaryMesh::updateMesh() in "/lib64/libc.so.6"
[4] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/lib64/libc.so.6"
[8] #3   in "/lib64/libc.so.6"
Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&)[0] #3  Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) at ??:?
[7] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[9] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[2] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[5] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[3] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[1] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[4] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[0] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[8] #4  Foam::polyBoundaryMesh::updateMesh() at ??:?
[7] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[9] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[2] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[3] #6   at ??:?
[1] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[5] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[4] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&)
 at ??:?
[7] #6   at ??:?
[9] #6   at ??:?
[8] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&) at ??:?
[0] #5  Foam::polyMesh::polyMesh(Foam::IOobject const&)[3]  at ??:?
[3] #7  __libc_start_main at ??:?
[5] #6   at ??:?
[1] #6   at ??:?
[2] #6

 at ??:?
[4] #6
 at ??:?
[8] #6   in "/lib64/libc.so.6"
[3] #8
 at ??:?
[0] #6
[7]  at ??:?
[7] #7  __libc_start_main[9]  at ??:?
[9] #7  __libc_start_main

[5]  at ??:?
[5] #7  __libc_start_main[1]  at ??:?
[1] #7  __libc_start_main
 in "/lib64/libc.so.6"
[7] #8
 in "/lib64/libc.so.6"
[9] #8  [4]  at ??:?
[4] #7  __libc_start_main[2]  at ??:?
[2] #7  __libc_start_main[3]  at ??:?
[node166:29824] *** Process received signal ***
[node166:29824] Signal: Floating point exception (8)
[node166:29824] Signal code:  (-6)
[node166:29824] Failing at address: 0x6ab300007480
 in "/lib64/libc.so.6"
[5] #8
[node166:29824] [ 0] /lib64/libc.so.6(+0x35640)[0x7fcaeb103640]
[node166:29824] [ 1] /lib64/libc.so.6(gsignal+0x39)[0x7fcaeb1035c9]
[node166:29824] [ 2] /lib64/libc.so.6(+0x35640)[0x7fcaeb103640]
[node166:29824] [ 3] /cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x204)[0x7fcaec2e0344]
[node166:29824] [ 4] /cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7fcaec2e4d11]
[node166:29824] [ 5] [8]  at ??:?
[8] #7  __libc_start_main/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7fcaec33575a]
[node166:29824] [ 6] checkMesh[0x409c78]
[node166:29824] [ 7] /lib64/libc.so.6(__libc_start_main+0xf5)[0x7fcaeb0efaf5]
[node166:29824] [ 8] checkMesh[0x40aab9]
[node166:29824] *** End of error message ***

 in "/lib64/libc.so.6"
[1] #8  [0]  at ??:?
[0] #7  __libc_start_main[7]  at ??:?

checkMesh:29828 terminated with signal 11 at PC=7f109fbf75c9 SP=7fff80095e78.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f109fbf75c9]
/lib64/libc.so.6(+0x35640)[0x7f109fbf7640]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x219)[0x7f10a0dd4359]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f10a0dd8d11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f10a0e2975a]
checkMesh[0x409c78]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f109fbe3af5]
checkMesh[0x40aab9]


[9]  at ??:?

checkMesh:29830 terminated with signal 11 at PC=7f178dac35c9 SP=7fff45671cf8.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f178dac35c9]
 in "/lib64/libc.so./lib64/libc.so.6(+0x35640)[0x7f178dac3640]
6"
[4] #8  /cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x219)[0x7f178eca0359]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f178eca4d11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f178ecf575a]
checkMesh[0x409c78]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f178daafaf5]
checkMesh[0x40aab9]
 in "/lib64/libc.so.6"
[8] #8   in "/lib64/libc.so.6"
[2] #8   in "/lib64/libc.so.6"
[0] #8  [5]  at ??:?

checkMesh:29826 terminated with signal 11 at PC=7f2772bdd5c9 SP=7fff33a875b8.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f2772bdd5c9]
/lib64/libc.so.6(+0x35640)[0x7f2772bdd640]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x219)[0x7f2773dba359]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f2773dbed11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f2773e0f75a]
checkMesh[0x409c78]
[1]  at ??:?
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f2772bc9af5]
checkMesh[0x40aab9]

checkMesh:29822 terminated with signal 11 at PC=7f727af225c9 SP=7fffcedd8f78.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f727af225c9]
/lib64/libc.so.6(+0x35640)[0x7f727af22640]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x20d)[0x7f727c0ff34d]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f727c103d11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f727c15475a]
checkMesh[0x409c78]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f727af0eaf5]
checkMesh[0x40aab9]




[8]  at ??:?

checkMesh:29829 terminated with signal 11 at PC=7f2a8b8e35c9 SP=7fffe927b2b8.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f2a8b8e35c9]
/lib64/libc.so.6(+0x35640)[0x7f2a8b8e3640]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x219)[0x7f2a8cac0359]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f2a8cac4d11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f2a8cb1575a]
checkMesh[0x409c78]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f2a8b8cfaf5]
checkMesh[0x40aab9]
[4]  at ??:?

checkMesh:29825 terminated with signal 11 at PC=7f77c1eaa5c9 SP=7fff6da6d078.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f77c1eaa5c9]
/lib64/libc.so.6(+0x35640)[0x7f77c1eaa640]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x33c)[0x7f77c308747c]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f77c308bd11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f77c30dc75a]
checkMesh[0x409c78]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f77c1e96af5]
checkMesh[0x40aab9]
[2]  at ??:?

checkMesh:29823 terminated with signal 11 at PC=7f17651a65c9 SP=7fff7d9894b8.  Backtrace:
/lib64/libc.so.6(gsignal+0x39)[0x7f17651a65c9]
/lib64/libc.so.6(+0x35640)[0x7f17651a6640]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x219)[0x7f1766383359]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1)[0x7f1766387d11]
/cineca/prod/applications/openfoam/2.2.1/openmpi--1.8.4--gnu--4.9.2/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xcfa)[0x7f17663d875a]
checkMesh[0x409c78]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f1765192af5]
checkMesh[0x40aab9]
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 29824 on node node166 exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------

April 23, 2015, 08:27 | #3
Pj. (Luca)
Solved using the fix from wyldckat found here http://www.cfd-online.com/Forums/ope...-openfoam.html

Quote:
Originally Posted by wyldckat
Hi guilha,

Mmm... I originally thought you were using OpenFOAM 2.1... but if it's OpenFOAM 2.0.1, then my guess is that you're having problems with decomposing cyclic patches. Actually, I vaguely remember that the issues with decomposing cyclic patches were only fully fixed in OpenFOAM 2.2.

Try using the "preservePatches" entry in "decomposeParDict". In the file "applications/utilities/parallelProcessing/decomposePar/decomposeParDict" you should find this example:
Code:
//- Keep owner and neighbour on same processor for faces in patches:
// (makes sense only for cyclic patches)
//preservePatches (cyclic_half0 cyclic_half1);
Uncomment the last line and use the names of your cyclic patches.

Best regards,
Bruno
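For this case, with the cyclic patch names reported by checkMesh above, the entry added to decomposeParDict looks something like the sketch below (adjust the list to the cyclic patch names of your own case):
Code:
// keep both halves of each cyclic pair on the same processor
preservePatches (cyclic_bottom cyclic_top cyclic_left cyclic_right);
After changing the dictionary, the case has to be decomposed again (e.g. decomposePar -force to replace the existing processor* directories) before re-running in parallel.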

April 23, 2015, 15:53 | #4
wyldckat (Bruno Santos), Retired Super Moderator
For future reference, this was reported here: http://www.openfoam.org/mantisbt/view.php?id=1668 - and they've answered back that this has already been fixed in 2.3.1 and 2.3.x.
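As a quick check of which version a given environment actually provides (a sketch, assuming a standard OpenFOAM installation whose environment has been sourced so that WM_PROJECT_VERSION is set):
Code:
# print the version of the currently loaded OpenFOAM environment
echo $WM_PROJECT_VERSION
Builds older than 2.3.1 would still need the preservePatches workaround described in the previous post.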


Tags
decomposepar, mpi error





