March 29, 2015, 00:14 | #1
InterTrackFoam parallel issue
New Member
Jason
Join Date: Dec 2014
Location: Shanghai, China
Posts: 10
Rep Power: 11
Hey guys,
I've been working on this for several weeks, but nothing has come of it. I've already browsed the related post "InterTrackFoam any information". The problem is that I cannot split the domain while keeping the freeSurface patch whole on the master processor. I see that lots of interTrackFoam users have worked this out, so I hope you could help me a little. This problem is really a pain in the neck! Thanks
March 29, 2015, 01:46 | #2
New Member
Jason
Join Date: Dec 2014
Location: Shanghai, China
Posts: 10
Rep Power: 11
Quote:
nProcs : 8
Slaves : 7
(
w002.5303
w002.5304
w002.5305
w002.5306
w002.5307
w002.5308
w002.5309
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplace
Selecting motion diffusivity: uniform
Reading field p
Reading field U
Reading/calculating face flux field phi
Found free surface patch. ID: 3

[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor1/0/fluidIndicator at line 0.
[1]
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting

[the same FATAL IO ERROR follows from ranks 0 and 2-7, each one unable to open its own processorN/0/fluidIndicator]

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 6 with PID 5308 on
node w002 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[w002:05301] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[w002:05301] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
March 30, 2015, 03:40 | #3
New Member
Jason
Join Date: Dec 2014
Location: Shanghai, China
Posts: 10
Rep Power: 11
Dear all,
I've tackled the manual decomposition problem with funkySetFields. I have to say this utility is quite handy and powerful. You just write an expression that assigns every cell to a processor, run funkySetFields to dump the result into a data file, tweak the file a little, and you're done (see the sketch below the error log). Unfortunately, interTrackFoam still won't run in parallel. Here is the error message:
Code:
// ***********************************************************************//
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     3.0                                |
|   \\  /    A nd           | Web:         http://www.extend-project.de       |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build    : 3.0-7a7445ead09d
Exec     : interTrackFoam -parallel
Date     : Mar 30 2015
Time     : 14:25:58
Host     : w002
PID      : 26609
CtrlDict : /home/##/OpenFOAM/foam-extend-3.0/etc/controlDict
Case     : /home/##/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump
nProcs   : 2
Slaves   : 1 ( w002.26610 )

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplace
Selecting motion diffusivity: uniform
Reading field p
Reading field U
Reading/calculating face flux field phi
Found free surface patch. ID: 3

[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/##/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor1/0/fluidIndicator at line 0.
[1]
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 26610 on
node w002 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
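For anyone wanting to reproduce the manual decomposition described above, here is a minimal sketch of the idea, assuming swak4Foam's funkySetFields and the standard manual method of decomposePar. The field name procID, the expression, and the two-way split are illustrative assumptions, not the actual setup used in this thread.
Code:
# Illustrative only: assumes swak4Foam's funkySetFields is installed; the field
# name (procID), the expression and the two-way split are placeholders.

# 1. Write a cell field whose value is the target processor number per cell.
funkySetFields -time 0 -create -field procID \
    -expression "pos().x < 0 ? 0 : 1"

# 2. Select the manual method in system/decomposeParDict, e.g.:
#
#        numberOfSubdomains 2;
#        method             manual;
#        manualCoeffs
#        {
#            dataFile "cellDecomposition";
#        }

# 3. The "change it a little bit" step: 0/procID is written as a volScalarField,
#    while the manual method expects a plain labelList with one processor index
#    per cell. Strip the field header and boundaryField, keep the internalField
#    values as integers, and store the list under the name given by dataFile
#    (decomposePar reports the exact path it tries to read if it cannot find it).
decomposePar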
May 28, 2015, 06:55 | #5
New Member
Chiara
Join Date: Oct 2012
Posts: 11
Rep Power: 14
Hi Jason,
sorry for the late answer, but maybe this will help in the future. There is a utility in the surfaceTracking folder called setFluidIndicator; you just need to compile it and then run setFluidIndicator before decomposePar in your case folder. The fluidIndicator is set to 1 for the denser phase and 0 elsewhere. The solver treats the fluidIndicator differently in serial and parallel runs: in serial it is generated automatically if not present, while in parallel it must already exist in each processor*/0 folder.
Best,
Chiara
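A possible end-to-end sequence for this suggestion, written as a sketch: it assumes foam-extend-3.0's usual environment variables and the 8-way decomposition from the first post, and the case path is a placeholder.
Code:
# Locate and compile the setFluidIndicator utility shipped with the
# surfaceTracking part of the foam-extend source tree.
cd "$(find "$WM_PROJECT_DIR" -type d -name setFluidIndicator | head -n 1)"
wmake

# In the case directory: create 0/fluidIndicator BEFORE decomposing, so that
# decomposePar distributes it into every processor*/0 folder.
cd /path/to/wavehump            # placeholder case path
setFluidIndicator
decomposePar
mpirun -np 8 interTrackFoam -parallel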
March 9, 2016, 10:57 | #6
New Member
Hf
Join Date: Nov 2012
Posts: 29
Rep Power: 14
Hello there,
I'm trying to run interTrackFoam in parallel. Could you please explain in more detail how you did the manual decomposition with funkySetFields? Thank you.
Quote:
"I've tackled the manual decomposition problem with funkySetFields. I have to say this utility is quite handy and powerful. You just write an expression that assigns every cell to a processor, run funkySetFields to dump the result into a data file, tweak the file a little, and you're done."
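If setFluidIndicator is not at hand, the missing 0/fluidIndicator could in principle also be created with funkySetFields before decomposing. This is an unverified sketch, and the expression marking the denser phase is a made-up placeholder that has to be adapted to the actual geometry.
Code:
# Unverified sketch: write fluidIndicator (1 in the denser phase, 0 elsewhere)
# into the undecomposed case, then decompose so each processor*/0 gets its part.
# The y < 0 threshold is an assumption, not taken from the wavehump case, and
# the field's dimensions may need to be set explicitly (see funkySetFields -help).
funkySetFields -time 0 -create -field fluidIndicator \
    -expression "pos().y < 0 ? 1 : 0"
decomposePar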
Tags |
intertrackfoam, openfoam 1.6-ext, parallel |