|
August 10, 2009, 03:31 |
OpenFOAM v1.6 & OpenMPI & functionObjects
|
#1 |
Member
bruce
Join Date: May 2009
Location: Germany
Posts: 42
Rep Power: 17 |
Hello All,
I am moving my previous post from "OpenFOAM Running / Solving / CFD" since it seems to be a bug. Even though i notice this bug in my real time test case, i also notice this in tutorial case of OpenFOAM, for example CASE: tutorials/incompressible/simpleFoam/pitzDaily COMMAND: mpirun -np 4 simpleFoam -parallel I run a case parallel with "functionObjects" in controlDict in OpenFOAM Version 1.6. I use pre compiled OpenFOAM1.6 & openmpi versions. - case runs fine in single processor with "functionObjects" - case runs fine in parallel multiple processor without "functionObjects" The problem is case do not run with "functionObjects" in parallel !!! So i get an run time error. Here is the content of the functionObjects i use in controlDict file, HTML Code:
Code:
functions
{
    fieldMinMax
    {
        type                fieldMinMax;
        functionObjectLibs  ("libfieldFunctionObjects.so");
        log                 yes;
        outputControl       timeStep;
        outputInterval      1;
        mode                magnitude;
        fields
        (
            U
            p
        );
    }
}
Code:
Time = 1

smoothSolver:  Solving for Ux, Initial residual = 0.000858153, Final residual = 4.80409e-05, No Iterations 4
smoothSolver:  Solving for Uy, Initial residual = 0.00247583, Final residual = 0.000145901, No Iterations 4
smoothSolver:  Solving for Uz, Initial residual = 0.00376188, Final residual = 0.000214772, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.140115, Final residual = 0.0044083, No Iterations 2
time step continuity errors : sum local = 0.0024423, global = -1.95703e-05, cumulative = -1.95703e-05
smoothSolver:  Solving for omega, Initial residual = 0.000519947, Final residual = 2.3265e-05, No Iterations 3
smoothSolver:  Solving for k, Initial residual = 0.00221736, Final residual = 9.98441e-05, No Iterations 3
ExecutionTime = 46.25 s  ClockTime = 47 s

[cfd4:17702] *** An error occurred in MPI_Recv
[cfd4:17702] *** on communicator MPI_COMM_WORLD
[cfd4:17702] *** MPI_ERR_TRUNCATE: message truncated
[cfd4:17702] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 17702 on node cfd4
exiting without calling "finalize". This may have caused other processes
in the application to be terminated by signals sent by mpirun (as
reported here).
--------------------------------------------------------------------------

Thanks |
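For anyone trying to reproduce this: the tutorial case needs a decomposition dictionary before decomposePar and mpirun can be used. A minimal sketch of system/decomposeParDict for the 4-processor run above; the simple 2x2x1 split is just one reasonable choice, not what the tutorial ships with:

Code:
// system/decomposeParDict -- minimal sketch for a 4-way decomposition
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  4;

method              simple;

simpleCoeffs
{
    n               (2 2 1);   // 2x2x1 split; any method giving 4 subdomains works
    delta           0.001;
}
Then run decomposePar followed by mpirun -np 4 simpleFoam -parallel.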
|
August 10, 2009, 07:32 |
|
#2 |
Senior Member
Mattijs Janssens
Join Date: Mar 2009
Posts: 1,419
Rep Power: 26 |
I pushed a fix to 1.6.x.
Thanks for reporting.
Mattijs |
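For anyone on a checkout from before the fix, updating the 1.6.x sources and rebuilding should pick it up; a rough sketch, assuming an existing git working copy of OpenFOAM-1.6.x:

Code:
# assuming an existing OpenFOAM-1.6.x git working copy
cd $WM_PROJECT_DIR    # e.g. ~/OpenFOAM/OpenFOAM-1.6.x
git pull              # fetch the pushed fix
./Allwmake            # rebuild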
|
August 10, 2009, 08:46 |
|
#3 |
Member
bruce
Join Date: May 2009
Location: Germany
Posts: 42
Rep Power: 17 |
Thanks Mattijs,
It works! |
|
July 29, 2010, 18:02 |
|
#4 |
Member
Mathieu Olivier
Join Date: Mar 2009
Location: Quebec City, Canada
Posts: 77
Rep Power: 17 |
Hi,
It seems that the bug reported by Bruce is also present in OpenFOAM-1.5-dev (rev. 1758). In short, I get the same error message with a parallel case that uses functionObjects; the case runs fine in serial. Some more details:

- The simulation stops at the end of the second time step, even though all the residuals look good.
- The bug appeared when I switched the linear solvers for "p" and "cellDisplacement" from PCG to GAMG (see the sketch below).
- If I start the simulation with PCG, wait a few time steps, then switch to GAMG without stopping the simulation, it seems to run correctly... for now...
- There is no problem with the parallel simulation if I don't use a functionObject (which is, of course, not an interesting option).

Thanks,
Mathieu |
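Edit: for reference, the switch in question is just the solver entry for "p" (and similarly "cellDisplacement") in system/fvSolution; roughly like this, with illustrative tolerances rather than my actual settings:

Code:
// system/fvSolution -- PCG variant (runs fine in parallel)
p
{
    solver          PCG;
    preconditioner  DIC;
    tolerance       1e-06;
    relTol          0.01;
}

// GAMG variant (triggers the MPI_ERR_TRUNCATE abort)
p
{
    solver          GAMG;
    smoother        GaussSeidel;
    cacheAgglomeration    true;
    nCellsInCoarsestLevel 10;
    agglomerator    faceAreaPair;
    mergeLevels     1;
    tolerance       1e-06;
    relTol          0.01;
}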
|
July 30, 2010, 01:34 |
|
#5 |
Member
Mathieu Olivier
Join Date: Mar 2009
Location: Quebec City, Canada
Posts: 77
Rep Power: 17 |
Hi again,
Please disregard my last post: it seems the problem is in a home-made functionObject. Sorry about that.

Mathieu |
|
December 16, 2011, 13:19 |
|
#6 |
Senior Member
Daniel WEI (老魏)
Join Date: Mar 2009
Location: Beijing, China
Posts: 689
Blog Entries: 9
Rep Power: 21 |
Hi, I have run into the same error. Could you please recall what the reason for that error was?
I am using a home-made functionObject too. Thanks a lot.
__________________
~ Daniel WEI ------------- Boeing Research & Technology - China Beijing, China |
|
December 16, 2011, 15:14 |
|
#7 |
Senior Member
Daniel WEI (老魏)
Join Date: Mar 2009
Location: Beijing, China
Posts: 689
Blog Entries: 9
Rep Power: 21 |
OK, I think MPI_BUFFER_SIZE is not large enough for my case.
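For anyone searching later: OpenFOAM takes the size of the buffer it attaches via MPI_Buffer_attach from the MPI_BUFFER_SIZE environment variable (set in etc/settings.sh), so enlarging it before launching is a quick test; the value below is an arbitrary example:

Code:
# enlarge the buffer OpenFOAM passes to MPI_Buffer_attach
# (200000000 bytes is an arbitrary example, well above the stock default)
export MPI_BUFFER_SIZE=200000000
mpirun -np 4 mySolver -parallel   # "mySolver" is a placeholder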
__________________
~ Daniel WEI ------------- Boeing Research & Technology - China Beijing, China |
|
December 16, 2011, 15:37 |
|
#8 |
Senior Member
Daniel WEI (老魏)
Join Date: Mar 2009
Location: Beijing, China
Posts: 689
Blog Entries: 9
Rep Power: 21 |
When I decrease Nz, the number of grid points in the spanwise direction, everything works fine.
I am very confused.
__________________
~ Daniel WEI ------------- Boeing Research & Technology - China Beijing, China |
|
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
Superlinear speedup in OpenFOAM 13 | msrinath80 | OpenFOAM Running, Solving & CFD | 18 | March 3, 2015 06:36 |
OpenFOAM v1.6 & OpenMPI & functionObjects | bruce | OpenFOAM Running, Solving & CFD | 1 | August 7, 2009 14:15 |
Modified OpenFOAM Forum Structure and New Mailing-List | pete | Site News & Announcements | 0 | June 29, 2009 06:56 |
OpenFOAM Debian packaging current status problems and TODOs | oseen | OpenFOAM Installation | 9 | August 26, 2007 14:50 |
OpenFOAM 14 with OpenMPI 12 | fhy | OpenFOAM Installation | 0 | July 12, 2007 19:12 |