
Problem with MPPICFoam tutorial

October 26, 2015, 13:18 | Problem with MPPICFoam tutorial | #1
Roger Hoffmann (rogerhoffmann), New Member, Join Date: Apr 2011, Posts: 3
Greetings,

I am having trouble completing the MPPICFoam tutorial "cyclone".
It runs in parallel, but it crashes when it reaches time 1 s, just as the particles begin to be injected.
If I run it in serial, it works.
I am trying to understand what is happening, but I could not find any similar question in the forum.
I am using Ubuntu 15.05 and OpenFOAM 2.4.0.
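
For reference, the parallel run is started in the usual OpenFOAM way; the commands below are only a sketch (the core count is a placeholder and must match numberOfSubdomains in system/decomposeParDict):

# from a copy of the tutorial case (tutorials/lagrangian/MPPICFoam/cyclone),
# after the mesh has been prepared as in the tutorial's Allrun script
decomposePar                                          # split the case per system/decomposeParDict
mpirun -np 4 MPPICFoam -parallel > log.MPPICFoam 2>&1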

The end of the log file:

Evolving kinematicCloud

Solving 3-D cloud kinematicCloud

--> Cloud: kinematicCloud injector: model1
Added 49 new parcels

GAMG: Solving for kinematicCloud:alpha, Initial residual = 1.60165e-08, Final residual = 1.60165e-08, No Iterations 0
Cloud: kinematicCloud
Current number of parcels = 343
Current mass in system = 0.0142167
Linear momentum = (-0.14173 -0.000459467 8.24334e-06)
|Linear momentum| = 0.141731
Linear kinetic energy = 0.721134
model1:
number of parcels added = 343
mass introduced = 0.0142167
Parcels absorbed into film = 0
New film detached parcels = 0
Parcel fate (number, mass) : patch walls
- escape = 0, 0
- stick = 0, 0
Parcel fate (number, mass) : patch inlet
- escape = 0, 0
- stick = 0, 0
Parcel fate (number, mass) : patch outlet
- escape = 0, 0
- stick = 0, 0
Min cell volume fraction = 0
Max cell volume fraction = 0.0122033
Min dense number of parcels = 3.57316

PIMPLE: iteration 1
smoothSolver: Solving for U.airx, Initial residual = 0.000767108, Final residual = 1.37258e-06, No Iterations 1
smoothSolver: Solving for U.airy, Initial residual = 0.000647376, Final residual = 1.21264e-06, No Iterations 1
smoothSolver: Solving for U.airz, Initial residual = 0.00114102, Final residual = 2.86035e-06, No Iterations 1
GAMG: Solving for p, Initial residual = 0.0227031, Final residual = 0.000133388, No Iterations 5
time step continuity errors : sum local = 2.64758e-08, global = -5.61738e-09, cumulative = 3.52815e-10
GAMG: Solving for p, Initial residual = 0.00255501, Final residual = 4.6614e-07, No Iterations 8
time step continuity errors : sum local = 9.8628e-11, global = 2.67723e-11, cumulative = 3.79587e-10
smoothSolver: Solving for k.air, Initial residual = 0.000374038, Final residual = 8.18823e-07, No Iterations 1
ExecutionTime = 1788.76 s ClockTime = 1799 s

Courant Number mean: 0.0497088 max: 0.172362
Time = 1.0018

Evolving kinematicCloud

Solving 3-D cloud kinematicCloud

--> Cloud: kinematicCloud injector: model1
Added 49 new parcels

[roger-desktop:2253] *** An error occurred in MPI_Recv
[roger-desktop:2253] *** on communicator MPI_COMM_WORLD
[roger-desktop:2253] *** MPI_ERR_TRUNCATE: message truncated
[roger-desktop:2253] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 2254 on
node roger-desktop exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[roger-desktop:02251] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[roger-desktop:02251] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
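
For comparison, the serial run that completes without problems is just the solver started in the same, undecomposed case directory (no -parallel flag):

# serial run, no domain decomposition; this one runs through
MPPICFoam > log.MPPICFoam.serial 2>&1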

October 26, 2015, 15:10 | More information | #2
Roger Hoffmann (rogerhoffmann), New Member, Join Date: Apr 2011, Posts: 3
The other MPPICFoam tutorials run fine in parallel.
Only this case fails, and unfortunately it is exactly the one I need.
Thanks for any help...
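
A quick way to compare the decomposition settings of this case against one of the tutorials that does run in parallel is a plain diff (the "column" path below is just an example; any working MPPIC tutorial would do):

# compare the failing case's decomposition dictionary with a working tutorial's
diff system/decomposeParDict $FOAM_TUTORIALS/lagrangian/MPPICFoam/column/system/decomposeParDict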

Last edited by rogerhoffmann; October 26, 2015 at 17:15.

January 7, 2022, 11:57 | Same problem | #3
Bill Lasher (lasherwc), Member, Join Date: Jun 2009, Posts: 36
Did you get this resolved? I had the same problem at about 2 seconds.
