
InterTrackFoam parallel issue

March 29, 2015, 00:14 | #1
Jason (zbli) | New Member | Join Date: Dec 2014 | Location: Shanghai, China | Posts: 10
Hi guys,

I've been working on this for several weeks with nothing to show for it.
I've already been through the related thread: InterTrackFoam any information
The problem is that I cannot decompose the domain in a way that keeps the freeSurface patch whole on the master processor.
I see that many interTrackFoam users have worked this out, so I hope you can help me a bit; this problem is a real pain in the neck!
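For reference, what I am trying to set up is a manual decomposition, roughly like this (a minimal sketch; the dataFile name "cellDecomposition" is just a placeholder):

Code:
// system/decomposeParDict -- sketch of a manual decomposition
numberOfSubdomains 8;

method manual;

manualCoeffs
{
    // labelList with one processor ID per cell; to keep the free
    // surface together, every cell next to the freeSurface patch
    // would have to map to processor 0 here
    dataFile "cellDecomposition";
}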

Thanks

March 29, 2015, 01:46 | #2
Jason (zbli) | New Member | Join Date: Dec 2014 | Location: Shanghai, China | Posts: 10
This is what I get when I run "mpirun -np 8 interTrackFoam -parallel":
Code:
nProcs : 8
Slaves :
7
(
w002.5303
w002.5304
w002.5305
w002.5306
w002.5307
w002.5308
w002.5309
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplace
Selecting motion diffusivity: uniform

Reading field p

Reading field U

Reading/calculating face flux field phi

Found free surface patch. ID: 3
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor1/0/fluidIndicator at line 0.
[1]
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting

[... the same "cannot open file .../processorN/0/fluidIndicator" error repeats for ranks 0 and 2-7 ...]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 6 with PID 5308 on
node w002 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[w002:05301] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[w002:05301] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

March 30, 2015, 03:40 | #3
Jason (zbli) | New Member | Join Date: Dec 2014 | Location: Shanghai, China | Posts: 10
Dear all,

I've tackled the manual decomposition problem with funkySetFields. I have to say this utility is quite handy and powerful: you write an expression that assigns each cell to a processor, run funkySetFields to dump the result into a field file, tweak that file a little into decomposePar's dataFile, and you're done.
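Roughly like this (a sketch, assuming swak4Foam is installed; the field name cellDist and the x-split expression are examples, not my exact setup):

Code:
# create a volScalarField holding the target processor ID of each cell;
# the expression here is just a simple two-processor split along x
funkySetFields -time 0 -create \
    -field cellDist \
    -expression "pos().x > 0.5 ? 1 : 0" \
    -dimension "[0 0 0 0 0 0 0]"
# the internalField of 0/cellDist is then reworked by hand into the
# labelList that decomposePar's "method manual" reads as its dataFile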

Unfortunately, interTrackFoam still doesn't run in parallel. Here is the error message:
Code:
// ***********************************************************************//
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:  3.0                                   |
|   \\  /    A nd           | Web:         http://www.extend-project.de       |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build    : 3.0-7a7445ead09d
Exec     : interTrackFoam -parallel
Date     : Mar 30 2015
Time     : 14:25:58
Host     : w002
PID      : 26609
CtrlDict : /home/##/OpenFOAM/foam-extend-3.0/etc/controlDict
Case     : /home/##/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump
nProcs   : 2
Slaves : 
1
(
w002.26610
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe   : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplace
Selecting motion diffusivity: uniform

Reading field p

Reading field U

Reading/calculating face flux field phi

Found free surface patch. ID: 3
[1] 
[1] 
[1] --> FOAM FATAL IO ERROR: 
[1] cannot open file
[1] 
[1] file: /home/##/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor1/0/fluidIndicator at line 0.
[1] 
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 61.
[1] 
FOAM parallel run exiting
[1] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 26610 on
node w002 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
So... still the 'fluidIndicator'. I barely even know him.

March 30, 2015, 05:17 | #4
Jason (zbli) | New Member | Join Date: Dec 2014 | Location: Shanghai, China | Posts: 10
[resolved]
I'm glad to report that I've finally solved this problem; the case is now running just fine.
This post helped me out: I set up an alpha field file as described there and renamed it 'fluidIndicator'.
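For anyone who hits the same wall, the renamed file ends up looking roughly like this (a sketch only; the uniform value 1 and the zeroGradient boundary type are assumptions for a case like wavehump, not a general recipe):

Code:
// 0/fluidIndicator -- sketch of the renamed alpha-style field
FoamFile
{
    version     2.0;
    format      ascii;
    class       volScalarField;
    object      fluidIndicator;
}

dimensions      [0 0 0 0 0 0 0];

internalField   uniform 1;          // 1 in the denser fluid, 0 elsewhere

boundaryField
{
    ".*"                            // assumed: same type on every patch
    {
        type    zeroGradient;
    }
}

With this file in 0/ before running decomposePar, each processor*/0 folder gets its own piece of the field, which is exactly what the solver was complaining about.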

May 28, 2015, 06:55 | #5
Chiara (Chia) | New Member | Join Date: Oct 2012 | Posts: 11
Hi Jason,

Sorry for the late answer, but maybe this will help someone in the future.
There is a utility in the surfaceTracking folder called setFluidIndicator; you just need to compile it and then run setFluidIndicator before decomposePar in your case folder (see the sketch below).

The fluidIndicator field is set to 1 for the denser phase and 0 elsewhere. The solver treats the fluidIndicator differently in serial and parallel runs: in serial, if it is not present it is generated automatically; in parallel, it must already be present in each processor*/0 folder.
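For example, roughly (a sketch; the exact location of the utility inside the foam-extend sources is an assumption, so adjust the path to your tree):

Code:
# compile the utility once (path assumed; it sits in the surfaceTracking
# folder of the foam-extend sources)
cd $WM_PROJECT_DIR/applications/utilities/surfaceTracking/setFluidIndicator
wmake

# in the case folder: create 0/fluidIndicator first, then decompose
cd /path/to/your/case
setFluidIndicator
decomposePar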

Best,

Chiara

March 9, 2016, 10:57 | #6
Hf (jasonchen) | New Member | Join Date: Nov 2012 | Posts: 29
Hello there,

I'm trying to run interTrackFoam in parallel. Could you please detail how to manually decompose using funkySetFields? Thank you.

"I've tackled the manual decompose problem with funkySetFields. I have to say this utility is quite in handy and powerful. You just need to put into an expression to distribute the processors where every point is assigned and hit the funkySetFields command to dump them into a dataFile. Change it a little bit and done. "


Tags: intertrackfoam, openfoam 1.6-ext, parallel

