August 6, 2009, 04:58 | #1
OpenFOAM v1.6 & OpenMPI & functionObjects

Member
bruce
Join Date: May 2009
Location: Germany
Posts: 42
Rep Power: 17
Hello All,
I am running a case in parallel with functionObjects in controlDict, in OpenFOAM version 1.6. I use the pre-compiled OpenFOAM 1.6 and OpenMPI packages.

- The case runs fine on a single processor with functionObjects.
- The case runs fine in parallel on multiple processors without functionObjects.

The problem is that the case does not run in parallel with functionObjects enabled; I get a run-time error.

Here is the functionObjects entry I use in the controlDict file:

Code:
    functions
    {
        fieldMinMax
        {
            type                fieldMinMax;
            functionObjectLibs  ("libfieldFunctionObjects.so");
            log                 yes;
            outputControl       timeStep;
            outputInterval      1;
            mode                magnitude;
            fields
            (
                U
                p
            );
        }
    }

Here is the error output from the simpleFoam solver:

Code:
    Time = 1

    smoothSolver:  Solving for Ux, Initial residual = 0.000858153, Final residual = 4.80409e-05, No Iterations 4
    smoothSolver:  Solving for Uy, Initial residual = 0.00247583, Final residual = 0.000145901, No Iterations 4
    smoothSolver:  Solving for Uz, Initial residual = 0.00376188, Final residual = 0.000214772, No Iterations 4
    GAMG:  Solving for p, Initial residual = 0.140115, Final residual = 0.0044083, No Iterations 2
    time step continuity errors : sum local = 0.0024423, global = -1.95703e-05, cumulative = -1.95703e-05
    smoothSolver:  Solving for omega, Initial residual = 0.000519947, Final residual = 2.3265e-05, No Iterations 3
    smoothSolver:  Solving for k, Initial residual = 0.00221736, Final residual = 9.98441e-05, No Iterations 3
    ExecutionTime = 46.25 s  ClockTime = 47 s

    [cfd4:17702] *** An error occurred in MPI_Recv
    [cfd4:17702] *** on communicator MPI_COMM_WORLD
    [cfd4:17702] *** MPI_ERR_TRUNCATE: message truncated
    [cfd4:17702] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
    --------------------------------------------------------------------------
    mpirun has exited due to process rank 0 with PID 17702 on node cfd4
    exiting without calling "finalize". This may have caused other processes
    in the application to be terminated by signals sent by mpirun (as
    reported here).
    --------------------------------------------------------------------------

For information, MPI_BUFFER_SIZE=20000000.

So it seems OpenMPI crashes when the functions {} entry is active. I am glad to provide any information needed to debug the problem. Thanks
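For reference, the commands below are a minimal sketch of the standard OpenFOAM parallel workflow used to reproduce a run like this; the processor count (4) and the log file name are assumptions, since the thread does not state how many subdomains were used:

```shell
# Decompose the case into subdomains according to system/decomposeParDict
decomposePar

# Launch the parallel run; the process count (-np 4 here, an assumption)
# must match the numberOfSubdomains entry in decomposeParDict
mpirun -np 4 simpleFoam -parallel > log.simpleFoam 2>&1

# After the run, reassemble the decomposed fields for post-processing
reconstructPar
```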
August 7, 2009, 14:15 | #2

Senior Member
Mattijs Janssens
Join Date: Mar 2009
Posts: 1,419
Rep Power: 26
Can you report a bug in the OpenFOAM-bugs section?