
[snappyHexMesh] problems using snappyHexMesh 2.1.0 on a supercomputer

September 9, 2014, 03:28   #1
sqing (Member, Dalian; joined Sep 2012; Posts: 77)
Hi foamers,

I'm using snappyHexMesh on a supercomputer, but something is going wrong that I can't solve. Can anyone give me some hints?
(OpenFOAM-2.1.0 on Red Hat 5.3)
Here is the log file for snappyHexMesh:
Code:
srun.sz: job 5712297 queued and waiting for resources
srun.sz: job 5712297 has been allocated resources
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.1.0-bd7367f93311
Exec   : /home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/bin/snappyHexMesh
Date   : Sep 09 2014
Time   : 09:11:57
Host   : "cn40404"
PID    : 26954
Case   : /home/export/base/envisiongrp/envision_lijun/CFDcalc/calc/panan/pananSnappyTest0/project/sector_0
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
Read mesh in = 5.8 s
Overall mesh bounding box  : (562162.0001 3202162 0) (566161.9999 3206162 5000)
Relative tolerance         : 1e-06
Absolute matching distance : 0.00754983432931
Reading refinement surfaces.
Read refinement surfaces in = 3.42 s
Reading refinement shells.
Read refinement shells in = 0 s
Setting refinement level of surface to be consistent with shells.
Checked shell refinement in = 0 s
Reading features.
Read features in = 0 s

Determining initial surface intersections
-----------------------------------------
Edge intersection testing:
    Number of edges             : 3030000
    Number of edges to retest   : 3030000
#0  Foam::error::printStack(Foam::Ostream&) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libOpenFOAM.so"
#1  Foam::sigFpe::sigHandler(int) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libOpenFOAM.so"
#2  __restore_rt at sigaction.c:0
#3  Foam::triangleFuncs::intersectAxesBundle(Foam::Vector<double> const&, Foam::Vector<double> const&, Foam::Vector<double> const&, int, Foam::Field<Foam::Vector<double> > const&, double, Foam::Vector<double>&) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#4  Foam::triangleFuncs::intersectBb(Foam::Vector<double> const&, Foam::Vector<double> const&, Foam::Vector<double> const&, Foam::treeBoundBox const&) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#5  Foam::treeDataTriSurface::overlaps(int, Foam::treeBoundBox const&) const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#6  Foam::indexedOctree<Foam::treeDataTriSurface>::divide(Foam::List<int> const&, Foam::treeBoundBox const&, Foam::List<Foam::List<int> >&) const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#7  Foam::indexedOctree<Foam::treeDataTriSurface>::divide(Foam::treeBoundBox const&, Foam::DynamicList<Foam::List<int>, 0u, 2u, 1u>&, int) const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#8  Foam::indexedOctree<Foam::treeDataTriSurface>::indexedOctree(Foam::treeDataTriSurface const&, Foam::treeBoundBox const&, int, double, double) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#9  Foam::triSurfaceMesh::tree() const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#10  Foam::triSurfaceMesh::findLineAny(Foam::Field<Foam::Vector<double> > const&, Foam::Field<Foam::Vector<double> > const&, Foam::List<Foam::PointIndexHit<Foam::Vector<double> > >&) const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libmeshTools.so"
#11  Foam::refinementSurfaces::findHigherIntersection(Foam::Field<Foam::Vector<double> > const&, Foam::Field<Foam::Vector<double> > const&, Foam::List<int> const&, Foam::List<int>&, Foam::List<int>&) const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libautoMesh.so"
#12  Foam::meshRefinement::updateIntersections(Foam::List<int> const&) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libautoMesh.so"
#13  Foam::meshRefinement::meshRefinement(Foam::fvMesh&, double, bool, Foam::refinementSurfaces const&, Foam::refinementFeatures const&, Foam::shellSurfaces const&) in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/lib/libautoMesh.so"
#14  main in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/bin/snappyHexMesh"
#15  __libc_start_main in "/lib64/libc.so.6"
#16  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/bin/snappyHexMesh"
srun.sz: error: cn40404: task 0: Floating point exception

Last edited by wyldckat; September 13, 2014 at 06:46.

September 13, 2014, 06:52   #2
Bruno Santos (wyldckat, Retired Super Moderator; Lisbon, Portugal; joined Mar 2009; Posts: 10,982)
Greetings Sunxing,

This is a possible source of major problems:
Code:
Overall mesh bounding box  : (562162.0001 3202162 0) (566161.9999 3206162 5000)
I strongly advise you to move your original geometry closer to the origin, as well as the base mesh. This is because meshing at 560 km from the origin can lead to very large numerical errors.
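If it helps, something along the following lines can be used to shift both the surface geometry and the base mesh towards the origin. This is only a sketch: the translation vector and the surface file name "terrain.stl" are guesses based on your log, so adjust them to your actual case.
Code:
# shift the STL surface (example translation vector and file name)
surfaceTransformPoints -translate "(-564000 -3204000 0)" \
    constant/triSurface/terrain.stl constant/triSurface/terrainShifted.stl

# shift the background (blockMesh) mesh by the same vector
transformPoints -translate "(-564000 -3204000 0)"
Remember to apply the same translation to any other geometry files and to location-dependent entries such as locationInMesh in snappyHexMeshDict.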

In addition, if possible, upgrading to OpenFOAM 2.1.1 or 2.1.x might fix some problems that existed in snappyHexMesh back in 2.1.0.

Best regards,
Bruno

September 13, 2014, 10:17   #3
sqing (Member, Dalian; joined Sep 2012; Posts: 77)
Quote:
Originally Posted by wyldckat
I strongly advise you to move your original geometry closer to the origin, as well as the base mesh. This is because meshing at 560 km from the origin can lead to very large numerical errors.
Hi Bruno,

Thanks for your reply. I forgot to mention that this error only happens when running in parallel. If I run it in serial, nothing goes wrong, but it is very time consuming.

regards,
sunxing

September 13, 2014, 10:50   #4
Bruno Santos (wyldckat, Retired Super Moderator; Lisbon, Portugal; joined Mar 2009; Posts: 10,982)
Hi Sunxing,

Parallel? But the output text you provided clearly states that only one processor was being used:
Quote:
Code:
Build  : 2.1.0-bd7367f93311
Exec   : /home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/bin/snappyHexMesh
Date   : Sep 09 2014
Time   : 09:11:57
Host   : "cn40404"
PID    : 26954
Case   : /home/export/base/envisiongrp/envision_lijun/CFDcalc/calc/panan/pananSnappyTest0/project/sector_0
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
It's the value to the right of "nProcs:".
In addition, on the line "Exec:" it does not show the "-parallel" option, therefore it's very unlikely this was executed in parallel.
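For reference, a parallel snappyHexMesh run is normally prepared and launched roughly like this (a sketch only; the processor count is just an example, and on your cluster the MPI job would be started through srun rather than mpirun):
Code:
# decompose the case according to system/decomposeParDict
decomposePar

# run snappyHexMesh on the decomposed case; note the -parallel flag
mpirun -np 50 snappyHexMesh -parallel > log.snappyHexMesh 2>&1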

Best regards,
Bruno

September 13, 2014, 11:17   #5
sqing (Member, Dalian; joined Sep 2012; Posts: 77)
Hi Bruno,

Sorry, I posted the wrong log file. It should be:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.1.0-bd7367f93311
Exec   : /home/export/base/envisiongrp/envision_lijun/OpenFOAM/OpenFOAM-2.1.0/platforms/linux64IccDPOpt/bin/snappyHexMesh -parallel
Date   : Sep 11 2014
Time   : 19:35:41
Host   : "cn40553"
PID    : 6592
Case   : /home/export/base/envisiongrp/envision_lijun/CFDcalc/calc/dongxinghe/DXH10_snappy/project/sector_00
nProcs : 50
Slaves : 
49
(
"cn40553.6593"
"cn40553.6594"
"cn40553.6595"
"cn40553.6596"
"cn40553.6597"
"cn40553.6598"
"cn40553.6599"
"cn40553.6600"
"cn40553.6601"
"cn40554.2019"
"cn40554.2020"
"cn40554.2021"
"cn40554.2022"
"cn40554.2023"
"cn40554.2024"
"cn40554.2025"
"cn40554.2026"
"cn40554.2027"
"cn40554.2028"
"cn40555.11904"
"cn40555.11905"
"cn40555.11906"
"cn40555.11907"
"cn40555.11908"
"cn40555.11909"
"cn40555.11910"
"cn40555.11911"
"cn40555.11912"
"cn40555.11913"
"cn40556.25572"
"cn40556.25573"
"cn40556.25574"
"cn40556.25575"
"cn40556.25576"
"cn40556.25577"
"cn40556.25578"
"cn40556.25579"
"cn40556.25580"
"cn40556.25581"
"cn40557.25994"
"cn40557.25995"
"cn40557.25996"
"cn40557.25997"
"cn40557.25998"
"cn40557.25999"
"cn40557.26000"
"cn40557.26001"
"cn40557.26002"
"cn40557.26003"
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Read mesh in = 2.46 s

Overall mesh bounding box  : (317267.9718 4392949.972 0) (318768.0282 4394450.028 6000)
Relative tolerance         : 1e-06
Absolute matching distance : 0.00636398752405

Reading refinement surfaces.
Read refinement surfaces in = 9.66 s

Reading refinement shells.
Read refinement shells in = 0 s

Setting refinement level of surface to be consistent with shells.
Checked shell refinement in = 0 s

Reading features.
Read features in = 0 s


Determining initial surface intersections
-----------------------------------------

Edge intersection testing:
    Number of edges             : 37800000
    Number of edges to retest   : 37800000
    Number of intersected edges : 252077
Calculated surface intersections in = 13.92 s

Initial mesh : cells:12500000  faces:37800000  points:12801051
Cells per refinement level:
    0	12500000

Adding patches for surface regions
----------------------------------

Patch	Type	Region
-----	----	------
terrain:

6	wall	terrain

Added patches in = 0.08 s

Selecting decompositionMethod hierarchical

Refinement phase
----------------

[37] Found point (318018 4393700 4000) in cell 204850 on processor 37

Surface refinement iteration 0
------------------------------

Marked for refinement due to surface intersection : 500000 cells.
Determined cells to refine in = 0.95 s
Selected for refinement : 500000 cells (out of 12500000)
Edge intersection testing:
    Number of edges             : 49062222
    Number of edges to retest   : 14518296
    Number of intersected edges : 1009563
Refined mesh in = 5.19 s
After refinement surface refinement iteration 0 : cells:16000000  faces:49062222  points:17064279
Cells per refinement level:
    0	12000000
    1	4000000
Balanced mesh in = 7.98 s
After balancing surface refinement iteration 0 : cells:16000000  faces:49062222  points:17064279
Cells per refinement level:
    0	12000000
    1	4000000

Surface refinement iteration 1
------------------------------

Marked for refinement due to surface intersection : 2000000 cells.
Determined cells to refine in = 0.5 s
Selected for refinement : 2093173 cells (out of 16000000)
Edge intersection testing:
    Number of edges             : 96061644
    Number of edges to retest   : 59514734
    Number of intersected edges : 4037188
Refined mesh in = 16.57 s
After refinement surface refinement iteration 1 : cells:30652211  faces:96061644  points:34761292
Cells per refinement level:
    0	11906827
    1	2745384
    2	16000000
Balanced mesh in = 43.47 s
After balancing surface refinement iteration 1 : cells:30652211  faces:96061644  points:34761292
Cells per refinement level:
    0	11906827
    1	2745384
    2	16000000

Surface refinement iteration 2
------------------------------

No cells marked for refinement since reached limit 30000000.
Determined cells to refine in = 1.79 s
Selected for refinement : 0 cells (out of 30652211)
Stopping refining since too few cells selected.


Removing mesh beyond surface intersections
------------------------------------------

Found point (318018 4393700 4000) in cell -1 in global region 1 out of 2 regions.
Keeping all cells in region 1 containing point (318018 4393700 4000)
Selected for keeping : 17201334 cells.
Edge intersection testing:
    Number of edges             : 55670113
    Number of edges to retest   : 4037054
    Number of intersected edges : 4037188

Shell refinement iteration 0
----------------------------

Marked for refinement due to refinement shells    : 0 cells.
Determined cells to refine in = 29.13 s
Selected for internal refinement : 0 cells (out of 17201334)
Stopping refining since too few cells selected.


Splitting mesh at surface intersections
---------------------------------------

Introducing baffles for 4037188 faces that are intersected by the surface.

Edge intersection testing:
    Number of edges             : 59707272
    Number of edges to retest   : 32082314
    Number of intersected edges : 8030964
Created baffles in = 14 s


After introducing baffles : cells:17201334  faces:59707272  points:22730384
Cells per refinement level:
    0	3605746
    1	1373002
    2	12222586

Introducing baffles to block off problem cells
----------------------------------------------

markFacesOnProblemCells : marked 8070117 additional internal faces to be converted into baffles.
Analyzed problem cells in = 3.52 s


Introducing baffles to delete problem cells.

Edge intersection testing:
    Number of edges             : 67777389
    Number of edges to retest   : 24222615
    Number of intersected edges : 8030967
Created baffles in = 11.38 s


After introducing baffles : cells:17201334  faces:67777389  points:22793116
Cells per refinement level:
    0	3605746
    1	1373002
    2	12222586

Remove unreachable sections of mesh
-----------------------------------

[14] 
[14] 
[14] --> FOAM FATAL IO ERROR: 
[14] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token
[14] 
[14] file: IOstream at line 0.
[14] 
[14]     From function IOstream::fatalCheck(const char*) const
[14]     in file db/IOstreams/IOstreams/IOstream.C at line 111.
[14] 
srun.sz: Job step aborted: Waiting up to 2 seconds for job step to finish.
FOAM parallel run exiting
[14] 
In: PMI_Abort(1, application called MPI_Abort(MPI_COMM_WORLD, 1) - process 14)
slurmd[cn40556]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:30 WITH SIGNAL 9 ***
slurmd[cn40555]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:33 WITH SIGNAL 9 ***
slurmd[cn40553]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:25 WITH SIGNAL 9 ***
slurmd[cn40557]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:34 WITH SIGNAL 9 ***
slurmd[cn40554]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:26 WITH SIGNAL 9 ***
slurmd[cn40556]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:30 WITH SIGNAL 9 ***
slurmd[cn40553]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:25 WITH SIGNAL 9 ***
slurmd[cn40554]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:26 WITH SIGNAL 9 ***
slurmd[cn40555]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:33 WITH SIGNAL 9 ***
slurmd[cn40557]: *** STEP 5720419.0 KILLED AT 2014-09-11T19:38:34 WITH SIGNAL 9 ***
srun.sz: error: cn40556: tasks 30,32,34-36,38-39: Killed
srun.sz: error: cn40554: tasks 10-19: Killed
srun.sz: error: cn40557: tasks 40-49: Killed
srun.sz: error: cn40553: tasks 0-9: Killed
srun.sz: error: cn40555: tasks 20-29: Killed
srun.sz: error: cn40556: tasks 31,33,37: Killed
Best regards,
sunxing

September 13, 2014, 14:27   #6
Bruno Santos (wyldckat, Retired Super Moderator; Lisbon, Portugal; joined Mar 2009; Posts: 10,982)
Hi Sunxing,

OK, I've done a bit of searching here on the forum, and this kind of error message is usually related to the MPI toolbox or to how OpenFOAM is connecting to it:
  • Either this occurs due to an incorrect build of the "libPstream.so" library in OpenFOAM.
    • If running solvers in parallel works fine, then this should not be the problem.
  • Or the MPI toolbox is unable to handle more than a certain number of processors.
    • If the MPI toolbox is to blame, then you can try reducing the number of processors being used, possibly down to 32, 16, 8 or even 4 (see the decomposeParDict sketch below).
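For example, a minimal "system/decomposeParDict" for testing with 8 processors could look something like this (a sketch only; the FoamFile header is omitted, and the hierarchical coefficients are just an illustration whose product must equal numberOfSubdomains; your log shows the hierarchical method is already being used):
Code:
numberOfSubdomains 8;

method          hierarchical;

hierarchicalCoeffs
{
    n               (2 2 2);
    delta           0.001;
    order           xyz;
}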
Can you provide the "snappyHexMeshDict" file you're using? Some of the parameters you're using might be at the root of the problem.

Best regards,
Bruno

September 14, 2014, 22:05   #7
sqing (Member, Dalian; joined Sep 2012; Posts: 77)
Hi Bruno,

Thanks very much for your patience and the thorough answer. Following your advice, I tested the same case with 8 processors and it failed again. However, another case with fewer cells succeeded with 128 processors.
Here are the snappyHexMeshDict and log file:

Best regards
sunxing
Attached Files
File Type: txt log.snappy8.txt (8.3 KB, 7 views)
File Type: txt snappyHexMeshDict.txt (12.3 KB, 18 views)

September 15, 2014, 16:24   #8
Bruno Santos (wyldckat, Retired Super Moderator; Lisbon, Portugal; joined Mar 2009; Posts: 10,982)
Hi Sunxing,

You forgot to attach the file "include/snappyPara", which apparently holds the settings for the few important variables I was looking for and which might be the source of the problem.

By the way, do you know which MPI toolbox is being used? The following commands should give you some feedback:
Code:
echo $FOAM_MPI
mpirun --version
I ask this because with some MPI toolboxes it's possible to increase the memory buffer used for sharing data between parallel processes.
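For what it's worth, OpenFOAM also attaches its own MPI transfer buffer at start-up, sized by the MPI_BUFFER_SIZE environment variable that the OpenFOAM environment scripts export (around 20000000 by default, if I recall correctly). Enlarging it before launching the job is one knob that can be tried; a sketch only, with an example value and a launch line that you would have to adapt to your srun-based submission:
Code:
# enlarge OpenFOAM's attached MPI buffer (example value)
export MPI_BUFFER_SIZE=200000000

# then launch the parallel job as usual, e.g.
mpirun -np 8 snappyHexMesh -parallel > log.snappyHexMesh 2>&1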

I did some more searching for the latest error message you got, namely:
Code:
In: PMI_Abort(1, Fatal error in MPI_Recv:
Message truncated, error stack:
MPIDI_CH3U_Receive_data_found(284): Message from rank 3 and tag 1 truncated; 105 bytes received but buffer size is 4
)
and this seems to occur for various reasons. The most likely suspect is a bug in the MPI toolbox; the other possibility is a bug in OpenFOAM 2.1.0.

Perhaps the system administrators of the supercomputer you're using have updated the MPI toolbox and not rebuilt OpenFOAM 2.1.0 with that new version of the MPI?

Best regards,
Bruno

---------------------

edit: I went looking for where the crash is likely occurring, and according to the last output message snappyHexMesh gave you, this seems to be the problem: https://github.com/OpenFOAM/OpenFOAM...nement.C#L1835
Quote:
Code:
Foam::autoPtr<Foam::mapPolyMesh> Foam::meshRefinement::splitMeshRegions
(
    const point& keepPoint
)
{
    // Determine connected regions. regionSplit is the labelList with the
    // region per cell.
    regionSplit cellRegion(mesh_);

    label regionI = -1;

    label cellI = mesh_.findCell(keepPoint);

    if (cellI != -1)
    {
        regionI = cellRegion[cellI];
    }

    reduce(regionI, maxOp<label>());
namely the last line quoted above. That is the method called for coordinating the "regionI" variable between all parallel processes, and for some reason it crashes. It's pretty much just 4 bytes being shared between all processors, which in this example equates to at least 4*8=32 bytes being sent to and from all processes. This makes me believe even more that this is a bug in the MPI toolbox being used.
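To give an idea of how little data is involved, that reduction is conceptually just a max-reduction of a single integer across all ranks. The following plain-MPI sketch only illustrates that idea; it is not OpenFOAM's actual Pstream implementation (which, as far as I can tell, builds the reduction from point-to-point sends and receives, which would be consistent with the MPI_Recv in the error message):
Code:
// Minimal plain-MPI illustration (not OpenFOAM code): max-reduce one integer
#include <mpi.h>
#include <iostream>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    // -1 on ranks that did not find the keep point, the region label otherwise
    int regionI = (rank == 3 ? 1 : -1);

    // combine the single integer across all ranks, keeping the maximum
    int globalRegionI = -1;
    MPI_Allreduce(&regionI, &globalRegionI, 1, MPI_INT, MPI_MAX, MPI_COMM_WORLD);

    if (rank == 0)
    {
        std::cout << "regionI after reduction: " << globalRegionI << std::endl;
    }

    MPI_Finalize();
    return 0;
}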

Last edited by wyldckat; September 15, 2014 at 16:40. Reason: see "edit:"

September 16, 2014, 23:06   #9
sqing (Member, Dalian; joined Sep 2012; Posts: 77)
Hi Bruno,

I don't know much about MPI. I tried the commands you suggested. If I source OF-2.1.0, then the feedback is:
Code:
[envision_lijun@sn02 sector_0]$    echo $FOAM_MPI
mpi
[envision_lijun@sn02 sector_0]$ mpirun --version
Missing: program name
Program --version either does not exist, is not 
executable, or is an erroneous argument to mpirun.
If I source OF-2.3.0, then the feedback is:
Code:
[envision_lijun@sn02 sector_0]$ echo $FOAM_MPI
openmpi-1.6.5
[envision_lijun@sn02 sector_0]$ mpirun --version
mpirun (Open MPI) 1.6.5
Maybe there are some errors in the MPI, but I don't know how to fix them, and I don't have permission to change anything about the MPI installation either. So maybe I should look for another method to generate the mesh. However, I would like to know: could these MPI errors also have a bad effect on my calculations?

Best regards,
sunxing
Attached Files
File Type: txt snappyPara.txt (821 Bytes, 9 views)

September 20, 2014, 10:30   #10
Bruno Santos (wyldckat, Retired Super Moderator; Lisbon, Portugal; joined Mar 2009; Posts: 10,982)
Hi Sunxing,

OK, you're limiting the maximum local cell count to 10 million cells and the global one to 30 million. This can lead to unexpected mesh refinements, which could explain why this error is occurring. In addition, I took another look at the output in one of your posts and there is this:
Quote:
Code:
Surface refinement iteration 2
------------------------------

No cells marked for refinement since reached limit 30000000.
Determined cells to refine in = 1.79 s
Selected for refinement : 0 cells (out of 30652211)
Stopping refining since too few cells selected.
As you can see, the global limit was reached, which means that not all of the desired cells were properly refined. What's more, this limit was hit while the whole original initial mesh was still being refined. In other words, if you look at the output a few lines further down, the total cell count drops to about 17 million cells once the cells that do not matter are removed. The problem is that no further refinement occurs from that point onward, which means you may have lost cells that were vital for proper mesh generation.

Try increasing the maximum limit to 60-90 million cells. The final mesh seems to end up with only roughly half of that.
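The entries I'm referring to live in the castellatedMeshControls section of snappyHexMeshDict; as a sketch, with purely illustrative values:
Code:
castellatedMeshControls
{
    // maximum number of cells per processor during refinement
    maxLocalCells   10000000;

    // overall cell limit during the castellation phase; try raising this
    maxGlobalCells  90000000;

    // ... the rest of your existing settings stay unchanged ...
}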


As to answer your recent post:
Quote:
Originally Posted by Sunxing
If I source OF-2.1.0, then the feedback is:
Code:
[envision_lijun@sn02 sector_0]$    echo $FOAM_MPI
mpi
[envision_lijun@sn02 sector_0]$ mpirun --version
Missing: program name
Program --version either does not exist, is not 
executable, or is an erroneous argument to mpirun.
Mmm... it only stated "mpi", and "mpirun --version" didn't work... you could try running:
Code:
mpiexec --version
which is usually the other name for "mpirun".

Problem is that when "FOAM_MPI=mpi", this means that the MPI located at "/opt/mpi" is the one meant to be used. More specifically, you can try running:
Code:
echo $MPI_ARCH_PATH
ls -l $MPI_ARCH_PATH
This will tell you the actual path to the MPI toolbox being used and then list the contents of the folder at that path.

Quote:
Originally Posted by Sunxing
If I source OF-2.3.0, then the feedback is:
Code:
[envision_lijun@sn02 sector_0]$ echo $FOAM_MPI
openmpi-1.6.5
[envision_lijun@sn02 sector_0]$ mpirun --version
mpirun (Open MPI) 1.6.5
Does your mesh generate well with this version? If it also does not, then it's unlikely to be an MPI problem.


Quote:
Originally Posted by Sunxing
Maybe there are some errors in the MPI, but I don't know how to fix them, and I don't have permission to change anything about the MPI installation either. So maybe I should look for another method to generate the mesh.
I strongly suggest that you contact the administrators of the machine you're using to diagnose the issue in more detail, namely whether the MPI toolbox has been changed or whether the OpenFOAM 2.1.0 installation was damaged in any way.

Quote:
Originally Posted by Sunxing
However, I would like to know: could these MPI errors also have a bad effect on my calculations?
Very short answer: I don't know.

Long answer: It depends. MPI-related errors can be due to a variety of reasons, including incorrect use or a situation that was not anticipated by a specific MPI version.
For example, what comes to mind is that some MPI libraries might not handle the case where an array of size 0 is sent through the MPI channels; it doesn't make much sense to send a zero-sized array via MPI, but it might work with some MPI toolboxes and not with others.
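Just to illustrate the kind of corner case I mean, here is a tiny stand-alone MPI sketch (plain MPI, nothing to do with OpenFOAM's code) that sends a zero-length message between two ranks; whether such corner cases are handled gracefully can differ between MPI implementations:
Code:
// Minimal sketch: send and receive a zero-length message between ranks 0 and 1
#include <mpi.h>
#include <iostream>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double dummy[1];   // valid buffer address, but the count below is 0

    if (rank == 0)
    {
        MPI_Send(dummy, 0, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
    }
    else if (rank == 1)
    {
        MPI_Status status;
        MPI_Recv(dummy, 0, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD, &status);

        int count = -1;
        MPI_Get_count(&status, MPI_DOUBLE, &count);
        std::cout << "received " << count << " elements" << std::endl;  // expect 0
    }

    MPI_Finalize();
    return 0;
}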

Therefore, without more details, I can only suppose that you'll run into problems with this MPI toolbox in OpenFOAM 2.1.0 only if the solver triggers a situation identical to the one you're triggering with snappyHexMesh. In other words, the solver can only work well in parallel if the mesh is good enough to be used in parallel.

Best regards,
Bruno






