June 23, 2011, 07:11
Sgimpi
#1
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
I have compiled OpenFOAM 1.7.1 for OpenMPI and I need to use it with SGI MPI. How can I do it? I've changed $WM_MPLIB and $WM_PROJECT_DIR/wmake/rules/linuxIA64Icc/mpilibSGIMPI, where I've set:
Code:
PFLAGS = -DOMPI_SKIP_MPICXX
PINC   = -I/opt/sgi/mpt/mpt-2.03/include
PLIBS  = -L/opt/sgi/mpt/mpt-2.03/lib -lmpi
and then ran ./Allwmake at $WM_PROJECT_DIR. Is this right?
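Before running Allwmake it can save a failed build to confirm those MPT paths actually exist. This is a sketch of my own (not from the OpenFOAM sources); `MPT_ROOT` is a convenience variable introduced here, defaulting to the path quoted above:

```shell
# Sanity-check the SGI MPT tree referenced in mpilibSGIMPI before building.
# MPT_ROOT is a convenience variable introduced here; adjust to your install.
MPT_ROOT=${MPT_ROOT:-/opt/sgi/mpt/mpt-2.03}
status=ok
for d in "$MPT_ROOT/include" "$MPT_ROOT/lib"; do
    if [ -d "$d" ]; then echo "found:   $d"; else echo "missing: $d"; status=bad; fi
done
if [ -e "$MPT_ROOT/lib/libmpi.so" ]; then
    echo "found:   libmpi.so"
else
    echo "missing: libmpi.so"; status=bad
fi
echo "MPT check: $status"
```

If anything prints "missing", fix the paths in the rules file first.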
June 25, 2011, 08:47
#2
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Hi Pere,
Yeah, that should be pretty much it. I'm not sure if anything else is missing for building with SGI MPI. No, wait, have you checked "OpenFOAM-1.7.1/etc/settings.sh"? There might be some issues with Metis and Scotch if you have previously built on the same machine with OpenMPI, so you might need to either run Allwmake twice or play it safe and rebuild the whole thing with the new options. Best regards, Bruno
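A rough sketch of the "play it safe" route (my own, assuming the 1.7.1 layout used in this thread and that $WM_PROJECT_DIR points at the installation):

```shell
# Switch the MPI flavour, re-source the environment, then rebuild so that
# Pstream and the MPI-dependent libraries get recompiled against SGI MPT.
export WM_MPLIB=SGIMPI
if [ -n "${WM_PROJECT_DIR:-}" ] && [ -f "$WM_PROJECT_DIR/etc/bashrc" ]; then
    . "$WM_PROJECT_DIR/etc/bashrc"
    ( cd "$WM_PROJECT_DIR" && ./Allwmake ) || echo "Allwmake failed"
else
    echo "WM_PROJECT_DIR is not set; source OpenFOAM's etc/bashrc first"
fi
```

Running ./Allwmake a second time covers the case Bruno mentions, where Metis/Scotch still see the old MPI on the first pass.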
June 30, 2011, 07:32
#3
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
Hi Bruno,
Yes, I've checked settings.sh and I added:
Code:
SGIMPI)
    export MPI_ARCH_PATH=/prod/OPENFOAM1.7.1/OPENFOAM-1.7.1-SGI/OPENFOAM-1.7.1/mpt-2.03

    # Tell OpenMPI where to find its install directory
    export OPAL_PREFIX=$MPI_ARCH_PATH

    _foamAddPath $MPI_ARCH_PATH/bin
    _foamAddLib $MPI_ARCH_PATH/lib
    _foamAddMan $MPI_ARCH_PATH/man

    export FOAM_MPI_LIBBIN=/prod/OPENFOAM1.7.1/OPENFOAM-1.7.1-SGI/OPENFOAM-1.7.1/mpt-2.03/lib
but when I try to execute OpenFOAM this message appears:
Code:
Warning: Command line arguments for program should be given after the program name. Assuming that -parallel is a command line argument for the program.
Missing: program name
Program buoyantBoussinesqPimpleFoam either does not exist, is not executable, or is an erroneous argument to mpirun.
mpirun is not detected correctly... Does anyone know where the problem is?
June 30, 2011, 07:59
#4
Senior Member
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Rep Power: 30
Does buoyantBoussinesqPimpleFoam run in serial? Did you try giving mpirun the complete path (i.e. ~/OpenFOAM/USER-V.x/applications/bin/linux64GccDPOpt/buoyantBoussinesqPimpleFoam)?
June 30, 2011, 18:52
#5
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Hi Pere and Anton,
@Pere: Right now I don't have time to go into details, so I'll simply copy-paste the links I've got in one of my blog posts: Quote:
Best regards and good luck! Bruno
July 1, 2011, 04:32
#6
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
Yes I tried the full path and the same error appears...
July 2, 2011, 05:49
#7
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Hi Pere,
I'm sorry, but this is too little information; I'm unable to deduce whether it's an environment problem or whether something is missing when you run mpirun. So, here's a list of things I'll need to know:
Code:
mkdir -p $FOAM_RUN
run
cp -r $FOAM_TUTORIALS .
cd tutorials/heatTransfer/buoyantBoussinesqSimpleFoam/iglooWithFridges
blockMesh
snappyHexMesh -overwrite
decomposePar
`which mpirun` -np 6 /home/myusername/OpenFOAM/OpenFOAM-1.7.1/bin/foamExec buoyantBoussinesqSimpleFoam -parallel
Code:
echo `which mpirun`
Bruno
July 4, 2011, 05:40
#8
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
I've solved the first error, and now I have this problem:
Code:
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34: --> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db86f34: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db86f34: This dummy library cannot be used in parallel mode
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34: From function Pstream::init(int& argc, char**& argv)
MPI: pirineus: 0x9d400004db86f34: in file Pstream.C at line 39.
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34: FOAM exiting
MPI: pirineus: 0x9d400004db86f34:
MPI: could not run executable (case #4)
Do I have to modify Pstream.C?
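One quick thing to look at when this "dummy Pstream" error shows up (a diagnostic sketch of my own; the variable names are the 1.7.1 ones used in this thread): whether a `.../dummy` directory sits on `LD_LIBRARY_PATH` ahead of the MPI build of `libPstream.so`.

```shell
# Print LD_LIBRARY_PATH one entry per line, numbered, and flag "dummy" dirs;
# if one comes before the MPI lib dir, the dummy libPstream.so wins at run time.
echo "${LD_LIBRARY_PATH:-}" | tr ':' '\n' | grep -n 'dummy' \
    || echo "no dummy dir on LD_LIBRARY_PATH"
echo "FOAM_MPI_LIBBIN=${FOAM_MPI_LIBBIN:-<unset>}"
```

If a dummy entry is listed with a lower number than the MPT/MPI library directory, the dynamic linker will pick the dummy library first.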
July 4, 2011, 20:40
#9
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Changing the Pstream code is only necessary in very specific situations, namely for MPI libraries that use non-conventional ways of doing MPI business. This would usually only be the case for older MPI libraries and some closed-source ones, such as for Cray super-computers (I'm just guessing on this last one).
July 5, 2011, 04:11
#10
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
Thanks wyldckat. So what do you think I must do now?
July 5, 2011, 07:40
#11
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
I've changed Allwmake in Pstream and put "wmake libso mpi" instead of "wmake libso dummy", then ran ./Allwmake in ..../Pstream... but the error is the same again...
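After a rebuild like that, it helps to confirm that an MPI build of libPstream.so was actually produced and that a dummy copy is not still shadowing it (a sketch of my own, using the 1.7.1 environment variables from this thread):

```shell
# Check whether the MPI build of libPstream.so got produced, and whether a
# dummy copy is still present to shadow it at run time.
for f in "${FOAM_MPI_LIBBIN:-}/libPstream.so" "${FOAM_LIBBIN:-}/dummy/libPstream.so"; do
    if [ -e "$f" ]; then ls -l "$f"; else echo "not present: $f"; fi
done
```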
July 6, 2011, 20:36
#12
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Hi Pere,
I didn't have time to respond sooner... anyway, it's time for some divide and conquer:
Best regards and good luck! Bruno
July 8, 2011, 05:08
#13
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
I cannot build parallelTest; this error appears:
Code:
SOURCE=parallelTest.C ; g++ -m64 -Dlinux64 -DWM_DP -Wall -Wextra -Wno-unused-parameter -Wold-style-cast -Wnon-virtual-dtor -O3 -DNoRepository -ftemplate-depth-40 -IlnInclude -I. -I/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude -I/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OSspecific/POSIX/lnInclude -fPIC -c $SOURCE -o Make/linux64GccDPOpt/parallelTest.o
In file included from /prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/bits/localefwd.h:42:0,
                 from /prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/string:45,
                 from /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/string.H:51,
                 from /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/stringList.H:35,
                 from /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/argList.H:73,
                 from parallelTest.C:32:
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:52:23: error: ‘uselocale’ was not declared in this scope
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:52:45: error: invalid type in declaration before ‘;’ token
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h: In function ‘int std::__convert_from_v(__locale_struct* const&, char*, int, const char*, ...)’:
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:72:53: error: ‘__gnu_cxx::__uselocale’ cannot be used as a function
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:97:33: error: ‘__gnu_cxx::__uselocale’ cannot be used as a function
make: *** [Make/linux64GccDPOpt/parallelTest.o] Error 1
Is it possible that I need new headers or libraries? Or is the real problem that OpenFOAM needs Pstream/mpi instead of Pstream/dummy for running in parallel? Thanks in advance, wyldckat
July 8, 2011, 07:49
#14
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
OK, finally I can start parallelTest...
July 8, 2011, 08:01
#15
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
When I ran parallelTest:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.7.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 1.7.1-03e7e056c215
Exec   : parallelTest
Date   : Jul 08 2011
Time   : 12:47:19
Host   : pirineus
PID    : 340804
Case   : /tmp/ppuigdom/snappy24_2/snappy24
nProcs : 1
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Starting transfers

End
- mpirun -np 2 parallelTest:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.7.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 1.7.1-03e7e056c215
Exec   : parallelTest
Date   : Jul 08 2011
Time   : 12:50:25
Host   : pirineus
PID    : 341946
Case   : /tmp/ppuigdom/snappy24_2/snappy24
nProcs : 1
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

End
MPI: pirineus: 0x9d400004db8713b:
MPI: pirineus: 0x9d400004db8713b: Starting transfers
MPI: pirineus: 0x9d400004db8713b:
MPI: could not run executable (case #4)
- mpirun -np 2 `which foamExec` parallelTest:
Code:
/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec
/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/applications/bin/linux64GccDPOpt/parallelTest
MPI: could not run executable (case #3)
MPI: No details available, no log files found
- mpirun -np 2 parallelTest -parallel:
Code:
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141: --> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db87141: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db87141: This dummy library cannot be used in parallel mode
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141: From function Pstream::init(int& argc, char**& argv)
MPI: pirineus: 0x9d400004db87141: in file Pstream.C at line 39.
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141: FOAM exiting
MPI: pirineus: 0x9d400004db87141:
MPI: could not run executable (case #4)
- foamJob -p -s parallelTest:
Code:
Parallel processing using SGIMPI with 10 processors
Executing: mpirun -np 10 /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec parallelTest -parallel | tee log
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144: --> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db87144: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db87144: This dummy library cannot be used in parallel mode
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144: From function Pstream::init(int& argc, char**& argv)
MPI: pirineus: 0x9d400004db87144: in file Pstream.C at line 39.
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144: FOAM exiting
MPI: pirineus: 0x9d400004db87144:
MPI: could not run executable (case #4)
Is the problem the dummy library? How can I change it for the mpi library? Thanks
July 9, 2011, 05:57
#16
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Hi Pere,
Just the other day I had to suggest to another person the following instruction:
Code:
mv $FOAM_LIBBIN/dummy $FOAM_LIBBIN/dummy__
Then try again. By the way, you didn't mention whether the first two commands worked or not. I suppose they did, and that's why you went ahead to test parallelTest. Best regards, Bruno
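If you go that route, it is worth keeping the rename reversible. A sketch of the same idea (my own wrapping, guarded so it is a no-op when the directory is absent):

```shell
# Hide the dummy Pstream directory so it cannot be found at run time, then
# restore it once the test is done.
if [ -d "${FOAM_LIBBIN:-}/dummy" ]; then
    mv "$FOAM_LIBBIN/dummy" "$FOAM_LIBBIN/dummy__"    # hide
    echo "dummy hidden -- rerun the parallel test now"
else
    echo "no dummy directory under FOAM_LIBBIN"
fi
# afterwards: mv "$FOAM_LIBBIN/dummy__" "$FOAM_LIBBIN/dummy"   # restore
```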
July 11, 2011, 05:19
#17
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
mpirun -np 2 bash -c "ls -l"
Code:
total 12
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 0
drwxr-x--- 4 ppuigdom cesca 4096 16 jun 11:17 constant
-rw-r----- 1 ppuigdom cesca    0 11 jul 09:49 log
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:21 processor0
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor1
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor2
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor3
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor4
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor5
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor6
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor7
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor8
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor9
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 system
MPI: could not run executable (case #3)
MPI: No details available, no log files found
Matat

mpirun -np 2 bash -c "export"
Code:
declare -x AWK="gawk"
declare -x BASH_ENV="/home/ppuigdom/.bashrc"
declare -x BINARY_TYPE="linux2.6-glibc2.3-x86_64"
declare -x BINARY_TYPE_HPC=""
declare -x BSUB_BLOCK_EXEC_HOST=""
declare -x COLORTERM="1"
declare -x CPATH="/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/tbb/include:/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/tbb/include"
declare -x CPU="x86_64"
declare -x CSHEDIT="emacs"
declare -x CVS_RSH="ssh"
declare -x DADES="/dades/ppuigdom"
declare -x DISPLAY="localhost:18.0"
declare -x DYLD_LIBRARY_PATH="/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib"
declare -x EGO_BINDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/bin"
declare -x EGO_CONFDIR="/usr/share/lsf/conf/ego/CESCA/kernel"
declare -x EGO_ESRVDIR="/usr/share/lsf/conf/ego/CESCA/eservice"
declare -x EGO_LIBDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib"
declare -x
EGO_LOCAL_CONFDIR="/usr/share/lsf/conf/ego/CESCA/kernel" declare -x EGO_SERVERDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/etc" declare -x EGO_TOP="/usr/share/lsf" declare -x EGO_VERSION="1.2" declare -x ENV="/etc/bash.bashrc" declare -x FOAM_APP="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications" declare -x FOAM_APPBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/bin/linux64GccDPOpt" declare -x FOAM_INST_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI" declare -x FOAM_JOB_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/jobControl" declare -x FOAM_LIB="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib" declare -x FOAM_LIBBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt" declare -x FOAM_MPI_LIBBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/lib" declare -x FOAM_RUN="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/run" declare -x FOAM_SIGFPE="" declare -x FOAM_SITE_APPBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/bin/linux64GccDPOpt" declare -x FOAM_SITE_LIBBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/lib/linux64GccDPOpt" declare -x FOAM_SOLVERS="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/solvers" declare -x FOAM_SRC="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src" declare -x FOAM_TUTORIALS="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/tutorials" declare -x FOAM_USER_APPBIN="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/applications/bin/linux64GccDPOpt" declare -x FOAM_USER_LIBBIN="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/lib/linux64GccDPOpt" declare -x FOAM_UTILITIES="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/utilities" declare -x FPATH="/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/mkl/include" declare -x FROM_HEADER="" declare -x G_BROKEN_FILENAMES="1" declare -x G_FILENAME_ENCODING="@locale,UTF-8,ISO-8859-15,CP1252" declare -x HISTSIZE="1000" declare -x HOME="/home/ppuigdom" 
declare -x HOST="pirineus" declare -x HOSTNAME="pirineus" declare -x HOSTTYPE="X86_64" declare -x INCLUDE="/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include" declare -x INFODIR="/usr/local/info:/usr/share/info:/usr/info" declare -x INFOPATH="/usr/local/info:/usr/share/info:/usr/info" declare -x INPUTRC="/etc/inputrc" declare -x INTEL_LICENSE_FILE="/prod/intel/Compiler/11.1/073/licenses:/opt/intel/licenses:/home/ppuigdom/intel/licenses:/prod/intel/Compiler/11.1/073/licenses:/opt/intel/licenses:/home/ppuigdom/intel/licenses" declare -x IPPROOT="/prod/intel/Compiler/11.1/073/ipp/em64t" declare -x JAVA_BINDIR="/usr/lib64/jvm/jre/bin" declare -x JAVA_HOME="/usr/share/lsf/perf/../jre/linux-x86_64" declare -x JAVA_ROOT="/usr/lib64/jvm/jre" declare -x JRE_HOME="/usr/lib64/jvm/jre" declare -x KMP_AFFINITY="disable" declare -x LANG="ca_ES.UTF-8" declare -x LD_LIBRARY_PATH="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/platforms/linux64Gcc/paraview-3.8.0/lib/paraview-3.8:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/lib:/usr/lib64/gcc/x86_64-suse-linux/4.3:/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/lib/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/lib/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt/dummy:/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib:/usr/lib64/jvm/jre-1.6.0-sun/lib/amd64/server:/usr/lib64/jvm/jre-1.6.0-sun/lib/amd64:/usr/share/lsf/perf/1.2/linux-x86_64/lib:/usr/share/lsf/perf/ego/1.2/linux-x86_64/lib:/usr/share/lsf/perf/lsf/7.0/linux-x86_64/lib:/usr/share/lsf/1.2/linux2.6-glibc2.3-x86_64/lib:/prod/intel/Compiler/11.1/073/lib/intel64:/prod/intel/Compiler/11.1/073/ipp/em64t/sharedlib:/prod/intel/Compiler/11.1/073/mkl/lib/em64t:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.1
6.21/lib:/opt/sgi/sgimc/lib" declare -x LD_PRELOAD="libxpmem.so" declare -x LESS="-M -I" declare -x LESSCLOSE="lessclose.sh %s %s" declare -x LESSKEY="/etc/lesskey.bin" declare -x LESSOPEN="lessopen.sh %s" declare -x LESS_ADVANCED_PREPROCESSOR="no" declare -x LIB="/prod/intel/Compiler/11.1/073/ipp/em64t/lib:/prod/intel/Compiler/11.1/073/ipp/em64t/lib:" declare -x LIBRARY_PATH="/prod/intel/Compiler/11.1/073/lib/intel64:/prod/intel/Compiler/11.1/073/ipp/em64t/lib:/prod/intel/Compiler/11.1/073/mkl/lib/em64t:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib:/prod/intel/Compiler/11.1/073/lib/intel64:/prod/intel/Compiler/11.1/073/ipp/em64t/lib:/prod/intel/Compiler/11.1/073/mkl/lib/em64t:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib" declare -x LICENSE_FILE="/usr/share/lsf/conf/perf/CESCA/conf/license.dat" declare -x LM_LICENSE_FILE="27000@192.168.17.1" declare -x LOADEDMODULES="" declare -x LOGNAME="ppuigdom" declare -x LSB_ACCT_FILE="/tmp/ppuigdom/.1310369970.702642.acct" declare -x LSB_BATCH_JID="702642" declare -x LSB_CHKFILENAME="/home/ppuigdom/.lsbatch/1310369970.702642" declare -x LSB_CPUSET_DEDICATED="YES" declare -x LSB_DJOB_HB_INTERVAL="15" declare -x LSB_DJOB_HOSTFILE="/home/ppuigdom/.lsbatch/1310369970.702642.hostfile" declare -x LSB_DJOB_NUMPROC="1" declare -x LSB_DJOB_RU_INTERVAL="15" declare -x LSB_ECHKPNT_RSH_CMD="ssh -p2122" declare -x LSB_EEXEC_REAL_GID="" declare -x LSB_EEXEC_REAL_UID="" declare -x LSB_EXEC_CLUSTER="CESCA" declare -x LSB_EXIT_PRE_ABORT="99" declare -x LSB_HOSTS="pirineus" declare -x LSB_HOST_CPUSETS="1 pirineus /CESCA@702642 " declare -x LSB_INTERACTIVE="Y" declare -x LSB_JOBEXIT_STAT="0" declare -x LSB_JOBFILENAME="/home/ppuigdom/.lsbatch/1310369970.702642" declare -x LSB_JOBID="702642" declare -x LSB_JOBINDEX="0" declare -x LSB_JOBNAME="/bin/bash" declare -x LSB_JOBRES_CALLBACK="52029@pirineus" declare -x LSB_JOBRES_PID="756365" declare -x LSB_JOB_EXECUSER="ppuigdom" declare 
-x LSB_JOB_STARTER="/usr/local/bin/lsf_job_starter" declare -x LSB_MCPU_HOSTS="pirineus 1 " declare -x LSB_QUEUE="short" declare -x LSB_SHMODE="y" declare -x LSB_SUB_HOST="pirineus" declare -x LSB_TRAPSIGS="trap # 15 10 12 2 1" declare -x LSB_UNIXGROUP_INT="cesca" declare -x LSB_XFJOB="Y" declare -x LSFUSER="ppuigdom" declare -x LSF_BINDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/bin" declare -x LSF_EAUTH_AUX_PASS="yes" declare -x LSF_EAUTH_CLIENT="user" declare -x LSF_EAUTH_SERVER="mbatchd@CESCA" declare -x LSF_EGO_ENVDIR="/usr/share/lsf/conf/ego/CESCA/kernel" declare -x LSF_ENVDIR="/usr/share/lsf/conf" declare -x LSF_INVOKE_CMD="bsub.lsf" declare -x LSF_LIBDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib" declare -x LSF_LIM_API_NTRIES="1" declare -x LSF_LOGDIR="/usr/share/lsf/log" declare -x LSF_SERVERDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/etc" declare -x LSF_VERSION="23" declare -x LS_COLORS="no=00:fi=00:di=01;34:ln=00;36i=40;33: so=01;35:do=01;35:bd=40;33;01:cd=40;33;01r=41;33 ;01:ex=00;32:*.cmd=00;32:*.exe=01;32:*.com=01;32:* .bat=01;32:*.btm=01;32:*.dll=01;32:*.tar=00;31:*.t bz=00;31:*.tgz=00;31:*.rpm=00;31:*.deb=00;31:*.arj =00;31:*.taz=00;31:*.lzh=00;31:*.lzma=00;31:*.zip= 00;31:*.zoo=00;31:*.z=00;31:*.Z=00;31:*.gz=00;31:* .bz2=00;31:*.tb2=00;31:*.tz2=00;31:*.tbz2=00;31:*. 
avi=01;35:*.bmp=01;35:*.fli=01;35:*.gif=01;35:*.jp g=01;35:*.jpeg=01;35:*.mng=01;35:*.mov=01;35:*.mpg =01;35:*.pcx=01;35:*.pbm=01;35:*.pgm=01;35:*.png=0 1;35:*.ppm=01;35:*.tga=01;35:*.tif=01;35:*.xbm=01; 35:*.xpm=01;35:*.dl=01;35:*.gl=01;35:*.wmv=01;35:* .aiff=00;32:*.au=00;32:*.mid=00;32:*.mp3=00;32:*.o gg=00;32:*.voc=00;32:*.wav=00;32:" declare -x LS_EXEC_T="START" declare -x LS_JOBPID="756365" declare -x LS_OPTIONS="-N --color=tty -T 0" declare -x LS_SUBCWD="/tmp/ppuigdom/snappy24_2/snappy24" declare -x MACHTYPE="x86_64-suse-linux" declare -x MAIL="/var/mail/ppuigdom" declare -x MANPATH="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/man:/usr/share/lsf/7.0/man:/prod/intel/Compiler/11.1/073/man/en_US:/prod/intel/Compiler/11.1/073/mkl/man/en_US:/prod/intel/Compiler/11.1/073/mkl/../man/en_US:/prod/intel/Compiler/11.1/073/mkl/man/en_US:/usr/local/man:/usr/share/man:/prod/intel/vtune/man:" declare -x MAQUINA="pirineus" declare -x MGR_HOME="/opt/sgi/sgimc" declare -x MINICOM="-c on" declare -x MKLROOT="/prod/intel/Compiler/11.1/073/mkl" declare -x MODULEPATH="/opt/modules/tools:/opt/modules/modulefiles" declare -x MODULESHOME="/usr/share/modules" declare -x MODULE_VERSION="3.1.6" declare -x MODULE_VERSION_STACK="3.1.6" declare -x MORE="-sl" declare -x MPI_ARCH_PATH="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03" declare -x MPI_BUFFER_SIZE="20000000" declare -x MPI_DRANK="0" declare -x MPI_ENVIRONMENT="6a085854 38938 0 e007c336 9d400004db871d1" declare -x MPI_IDB_PATH="/prod/intel/Compiler/11.1/073/bin/intel64/idb" declare -x MPI_SHARED_NEIGHBORHOOD="host" declare -x 
NLSPATH="/prod/intel/Compiler/11.1/073/lib/intel64/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/ipp/em64t/lib/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/mkl/lib/em64t/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/idb/intel64/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/lib/intel64/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/ipp/em64t/lib/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/mkl/lib/em64t/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/idb/intel64/locale/%l_%t/%N" declare -x NNTPSERVER="news" declare -x OLDPWD declare -x OPAL_PREFIX="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03" declare -x OSTYPE="linux" declare -x OS_TYPE="linux-x86_64" declare -x PAGER="less" declare -x PAMENV="/prod/ESI-SW/visualenv66/env-Linux" declare -x PAMHOME="/prod/ESI-SW/visualenv66" declare -x PAMLANG="FR" declare -x PATH="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/platforms/linux64Gcc/paraview-3.8.0/bin:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/bin:/usr/bin/gcc:/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/applications/bin/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/bin/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/bin/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/wmake:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin:/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/bin:/prod/pgi/linux86-64/10.5/bin:/usr/share/lsf/gui/2.0/bin:/usr/share/lsf/perf/1.2/bin:/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/etc:/prod/intel/Compiler/11.1/073/bin/intel64:/usr/local/bin:/usr/bin:/bin:/usr/bin/X11:/usr/X11R6/bin:/usr/games:/usr/lib64/jvm/jre/bin:/usr/lib/mit/bin:/usr/lib/mit/sbin:/opt/sgi/sgimc/bin:/opt/sgi/sbin" declare -x PERF_CONFDIR="/usr/share/lsf/conf/perf/CESCA/conf" declare -x PERF_DATADIR="/usr/share/lsf/work/CESCA/perf/data" declare -x PERF_ENV="-DPERF_TOP=/usr/share/lsf/perf -DPERF_CONFDIR=/usr/share/lsf/conf/perf/CESCA/conf 
-DPERF_WORKDIR=/usr/share/lsf/work/CESCA/perf -DPERF_LOGDIR=/usr/share/lsf/log/perf -DPERF_DATADIR=/usr/share/lsf/work/CESCA/perf/data" declare -x PERF_LIB="/usr/share/lsf/perf/1.2/linux-x86_64/lib:/usr/share/lsf/perf/ego/1.2/linux-x86_64/lib:/usr/share/lsf/perf/lsf/7.0/linux-x86_64/lib" declare -x PERF_LIB_TYPE="linux-x86_64" declare -x PERF_LOGDIR="/usr/share/lsf/log/perf" declare -x PERF_TOP="/usr/share/lsf/perf" declare -x PERF_VERSION="1.2" declare -x PERF_WORKDIR="/usr/share/lsf/work/CESCA/perf" declare -x PGI="/prod/pgi" declare -x PROFILEREAD="true" declare -x PV_PLUGIN_PATH="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt/paraview-3.8" declare -x PWD="/tmp/ppuigdom/snappy24_2/snappy24" declare -x PYTHONSTARTUP="/etc/pythonstart" declare -x ParaView_DIR="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/platforms/linux64Gcc/paraview-3.8.0" declare -x ParaView_MAJOR="3.8" declare -x ParaView_VERSION="3.8.0" declare -x QT_SYSTEM_DIR="/usr/share/desktop-data" declare -x SBD_KRB5CCNAME_VAL="" declare -x SCRATCH="/cescascratch/ppuigdom" declare -x SHELL="/usr/local/bin/bash" declare -x SHLVL="7" declare -x SSH_CLIENT="192.94.163.141 57706 2122" declare -x SSH_CONNECTION="192.94.163.141 57706 84.88.8.106 2122" declare -x SSH_SENDS_LOCALE="yes" declare -x SSH_TTY="/dev/pts/28" declare -x TERM="xterm" declare -x TMOUT="10800" declare -x TMPDIR="/tmp/ppuigdom/702642.201107110949" declare -x USER="ppuigdom" declare -x WINDOWMANAGER="/usr/bin/gnome" declare -x WM_ARCH="linux64" declare -x WM_ARCH_OPTION="64" declare -x WM_CC="gcc" declare -x WM_CFLAGS="-m64 -fPIC" declare -x WM_COMPILER="Gcc" declare -x WM_COMPILER_LIB_ARCH="64" declare -x WM_COMPILE_OPTION="Opt" declare -x WM_CXX="g++" declare -x WM_CXXFLAGS="-m64 -fPIC" declare -x WM_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/wmake" declare -x WM_LDFLAGS="-m64" declare -x WM_LINK_LANGUAGE="c++" declare -x WM_MPLIB="SGIMPI" declare -x WM_OPTIONS="linux64GccDPOpt" declare -x 
WM_OSTYPE="POSIX" declare -x WM_PRECISION_OPTION="DP" declare -x WM_PROJECT="OpenFOAM" declare -x WM_PROJECT_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1" declare -x WM_PROJECT_INST_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI" declare -x WM_PROJECT_USER_DIR="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1" declare -x WM_PROJECT_VERSION="1.7.1" declare -x WM_THIRD_PARTY_DIR="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1" declare -x XAUTHLOCALHOSTNAME="pirineus" declare -x XCURSOR_THEME="DMZ" declare -x XDG_CONFIG_DIRS="/etc/xdg" declare -x XDG_DATA_DIRS="/usr/share:/etc/opt/kde3/share:/opt/kde3/share" declare -x XKEYSYMDB="/usr/share/X11/XKeysymDB" declare -x XLSF_UIDDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib/uid" declare -x XNLSPATH="/usr/share/X11/nls" declare -x enf="-n" declare -x enl="" declare -x ftp_proxy="http://192.168.255.254:8080" declare -x http_proxy="http://192.168.255.254:8080" declare -x https_proxy="http://192.168.255.254:8080" declare -x no_proxy="localhost, 127.0.0.1" MPI: could not run executable (case #3) MPI: No details available, no log files found Matat ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> |
July 11, 2011, 06:10
#18
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
Hi Pere,
Ah HA! Now we are getting somewhere! OK, right on the first command 'mpirun -np 2 bash -c "ls -l"', it doesn't run the second process. This seems to mean that:
There are three more commands that might be useful for testing:
There is another thing that you might want to check, although this is for OpenMPI, you'll have to search for the respective option: Quote:
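Since the first command already shows mpirun failing to start the second process, MPT itself can be smoke-tested independently of OpenFOAM. This is a sketch of my own; `MPI_DRANK` is the per-rank variable SGI MPT exports, visible in the environment dump above:

```shell
# Minimal MPT smoke tests, independent of OpenFOAM: if these already fail,
# the problem is in the MPI setup or the batch system, not in the solver.
if command -v mpirun >/dev/null 2>&1; then
    mpirun -np 2 hostname
    mpirun -np 2 /bin/sh -c 'echo "rank ${MPI_DRANK:-?}"'
else
    echo "mpirun is not on PATH"
fi
```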
Bruno
Last edited by wyldckat; July 11, 2011 at 07:27. Reason: missing spaces on the first two command lines :( |
July 11, 2011, 07:13
#19
Member
pere
Join Date: Mar 2011
Location: Barcelona
Posts: 46
Rep Power: 15
ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> `which mpirun` -np 2 `which foamExec`bash -c "ls -l"
Code:
MPI: pirineus: 0x9d400004db871fb: /bin/sh: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExecbash: El fitxer o directori no existeix
MPI: pirineus: 0x9d400004db871fb: /bin/sh: line 0: exec: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExecbash: cannot execute: El fitxer o direMPI: pirineus: 0x9d400004db871fb: ctori no existeix
MPI: could not run executable (case #4)
Matat

ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> `which mpirun` -np 2 `which foamExec``which bash` -c "ls -l"
Code:
MPI: pirineus: 0x9d400004db871fe: /bin/sh: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec/usr/local/bin/bash: No és un directori
MPI: pirineus: 0x9d400004db871fe: /bin/sh: line 0: exec: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec/usr/local/bin/bash: cannot execute: NMPI: pirineus: 0x9d400004db871fe: o és un directori
MPI: could not run executable (case #4)
Matat

ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> `which mpirun` -np 2 `which bash` -c "ls -l"
Code:
total 12
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 0
drwxr-x--- 4 ppuigdom cesca 4096 16 jun 11:17 constant
-rw-r----- 1 ppuigdom cesca    0 11 jul 09:49 log
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:21 processor0
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor1
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor2
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor3
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor4
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor5
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor6
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor7
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor8
drwxr-x--- 4 ppuigdom cesca   41  4 jul 10:22 processor9
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 system
MPI: could not run executable (case #3)
MPI: No details available, no log files found
Matat
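The outputs above pin the failure down: without a space between the two backquoted substitutions, the shell glues the two results into a single bogus path ("foamExecbash"). A tiny illustration of the mechanism, using ordinary commands in place of the OpenFOAM ones:

```shell
# Command substitutions concatenate like plain text: with no space between
# them the shell builds one nonexistent word instead of two.
a=$(command -v sh)
b=$(command -v ls)
echo "glued:    $a$b"      # one bogus path, like foamExecbash above
echo "separate: $a $b"     # two words, as intended
```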
July 11, 2011, 07:28
#20
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128
I'm so sorry, but the online text editor seems to have eaten the spaces between "` `". Either that or I didn't type properly...
Here they are again, with the first two commands fixed: