September 14, 2011, 12:41 |
error using Grid Engine
#1 |
Senior Member
Andrea Ferrari
Join Date: Dec 2010
Posts: 319
Rep Power: 16
Hi foamers,
I'm trying to use Grid Engine to launch my simulation in parallel on a cluster. Up to now I used the classic "mpirun -np whatever whatever -parallel > log &" for parallel simulation and everything was fine, but i need to use Grid Engine. So i wrote a small script and i submitted my job using qsub. I get this error using interFoam: Code:
Starting openFoam with GridEngine...
(Follow execution details in file "qsub_start_openFoam.sh.o[JobID]".)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 11 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[11]
[11]
[11] --> FOAM FATAL IO ERROR:
[11] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&)
[11]
[11] file: IOstream at line 0.
[11]
[11] From function IOstream::fatalCheck(const char*) const
[11] in file db/IOstreams/IOstreams/IOstream.C at line 108.
[11]
FOAM parallel run exiting
[11]
[8]
[8]
[8] --> FOAM FATAL IO ERROR:
[8] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&)
[8]
[8] file: IOstream at line 0.
[8]
[8] From function IOstream::fatalCheck(const char*) const
[8] in file db/IOstreams/IOstreams/IOstream.C at line 108.
[8]
FOAM parallel run exiting
[8]
[10]
[10]
[10] --> FOAM FATAL IO ERROR:
[10] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&)
[10]
[10] file: IOstream at line 0.
[10]
[10] From function IOstream::fatalCheck(const char*) const
[10] in file db/IOstreams/IOstreams/IOstream.C at line 108.
[10]
FOAM parallel run exiting
[10]
[9]
[9]
[9] --> FOAM FATAL IO ERROR:
[9] wrong token type - expected int found on line 0 the word 'Y'
[9]
[9] file: IOstream at line 0.
[9]
[9] From function operator>>(Istream&, int&)
[9] in file primitives/ints/int/intIO.C at line 68.
[9]
FOAM parallel run exiting
[9]
--------------------------------------------------------------------------
mpirun has exited due to process rank 11 with PID 23274 on
node node12.cluster exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[node11.cluster:17619] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[node11.cluster:17619] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Simulation terminated.

This is my log file:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.7.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 1.7.1-03e7e056c215
Exec   : interFoam -parallel
Date   : Sep 14 2011
Time   : 17:17:08
Host   : node11.cluster
PID    : 17623
Case   : /home/aferrari/Simulation/OstacoliPerturbati/BigGeometry/Case2_616_side06/provaNuova2e-2alpha150/36proc
nProcs : 16
Slaves :
15
(
node11.cluster.17624
node11.cluster.17625
node11.cluster.17626
node11.cluster.17627
node11.cluster.17628
node11.cluster.17629
node11.cluster.17630
node12.cluster.23271
node12.cluster.23272
node12.cluster.23273
node12.cluster.23274
node12.cluster.23275
node12.cluster.23276
node12.cluster.23277
node12.cluster.23278
)

Pstream initialized with:
    floatTransfer    : 0
    nProcsSimpleSum  : 0
    commsType        : nonBlocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p_rgh

Reading field alpha1

Reading field U

Reading/calculating face flux field phi

Reading transportProperties

Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting turbulence model type laminar

Reading g
Calculating field g.h
    volScalarField gh("gh", g & mesh.C());
    surfaceScalarField ghf("ghf", g & mesh.Cf());

I really cannot figure out what the problem could be. If I launch the same case using mpirun directly, everything is fine. Any idea?

thanks
andrea
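One detail that may matter here: the log reports nProcs : 16, while the submission script (posted below) requests 36 slots with "#$ -pe orte 36" and the case directory is named 36proc. If mpirun starts a number of ranks that does not match the decomposition of the case, interFoam can fail while reading the decomposed fields with exactly this kind of IOstream error. The shell snippet below is a minimal pre-flight check sketching that comparison; it assumes a standard OpenFOAM case layout (system/decomposeParDict, processor* directories) and is not from the original post:
Code:
#!/bin/bash
# Pre-flight check: the decomposition must match the MPI rank count.
# Run from the case directory before submitting the job.

# numberOfSubdomains as set in system/decomposeParDict
nSub=$(awk '/numberOfSubdomains/ {gsub(";",""); print $2}' system/decomposeParDict)

# processor directories actually produced by decomposePar
nDirs=$(ls -d processor* 2>/dev/null | wc -l)

echo "numberOfSubdomains : $nSub"
echo "processor dirs     : $nDirs"
echo "SGE slots (NSLOTS) : ${NSLOTS:-unset (only defined inside a job)}"

# Both numbers must agree with the -pe slot request, or the run aborts.
if [ "$nSub" != "$nDirs" ]; then
    echo "Mismatch: re-run decomposePar (or fix decomposeParDict)."
fi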
September 15, 2011, 05:14 |
#2 |
Senior Member
Andrea Ferrari
Join Date: Dec 2010
Posts: 319
Rep Power: 16
This is the script I wrote:
Code:
#!/bin/bash
#$ -S /bin/bash
#$ -e $JOB_NAME.e$JOB_ID
#$ -o $JOB_NAME.o$JOB_ID
#$ -cwd
#$ -j y
#$ -pe orte 36

echo ""
echo "Starting openFoam with GridEngine..."
echo "(Follow execution details in file \"qsub_start_openFoam.sh.o[JobID]\".)"
echo ""

mpirun interFoam -parallel > log

echo ""
echo "Simulation terminated."
echo ""

Then I launch the simulation using "qsub script.sh". Does anyone know why it works with simpleFoam but not with interFoam? Any help is appreciated!

best
andrea
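One thing to compare against the log in the first post: this script requests 36 slots, while the log shows nProcs : 16. A variant that ties the rank count explicitly to the SGE allocation could look like the sketch below. It assumes the case was decomposed into as many subdomains as the -pe request (here 16, to match the log), and that Open MPI was built with SGE support so mpirun picks up the host list from the parallel environment:
Code:
#!/bin/bash
#$ -S /bin/bash
#$ -e $JOB_NAME.e$JOB_ID
#$ -o $JOB_NAME.o$JOB_ID
#$ -cwd
#$ -j y
# Slot count must equal numberOfSubdomains in system/decomposeParDict
#$ -pe orte 16

echo ""
echo "Starting openFoam with GridEngine..."
echo "(Follow execution details in file \"qsub_start_openFoam.sh.o[JobID]\".)"
echo ""

# -np $NSLOTS makes the MPI rank count follow the -pe request explicitly
mpirun -np $NSLOTS interFoam -parallel > log

echo ""
echo "Simulation terminated."
echo ""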