|
January 20, 2012, 11:49 |
Problems with MPI implementation
|
#1 |
Member
Francesco Capuano
Join Date: May 2010
Posts: 81
Rep Power: 16 |
Hi everybody,
I'm having some problems running OpenFOAM v2.1.0 in parallel. In particular, I need to update the Pstream library to switch from OpenMPI to Intel MPI, which is what is actually installed on the cluster I'm using. I have followed the instructions in http://openfoamwiki.net/index.php/HowTo_Pstream but they refer to old versions of OpenFOAM (v. 1.XXX) and I cannot figure out what the problem is. Has anyone successfully switched from OpenMPI to another MPI implementation? |
|
January 21, 2012, 14:57 |
|
#2 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Greetings Francesco,
The wiki page isn't really out of date; the instructions there still apply to the latest versions of OpenFOAM, with a few minor adjustments. Basically, there are three places that need attention (the OpenFOAM version number is just a reference):
Best regards, Bruno
__________________
|
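For readers following the same route, the general recipe in OpenFOAM 2.x looks roughly like the sketch below. This is not Bruno's exact list (which did not survive in this thread); the `INTELMPI` flavour name and the Intel MPI path are assumptions based on a standard source install, so check your own `etc/bashrc`:

```shell
# Rough sketch: switching an OpenFOAM 2.x source install from OpenMPI to
# Intel MPI typically touches the environment settings and the Pstream library.

# 1) Select the MPI flavour OpenFOAM should build and run against.
#    (Usually set in $WM_PROJECT_DIR/etc/bashrc; INTELMPI is a stock option.)
export WM_MPLIB=INTELMPI

# 2) Make sure the matching section of etc/config/settings.sh can locate the
#    MPI installation, e.g. via MPI_ROOT (this path is an example only).
export MPI_ROOT=/opt/intel/impi/4.0.3

# 3) Re-source the environment and recompile the Pstream (parallel) library:
#    source $WM_PROJECT_DIR/etc/bashrc
#    cd $WM_PROJECT_DIR/src/Pstream && ./Allwmake
echo "WM_MPLIB=$WM_MPLIB MPI_ROOT=$MPI_ROOT"
```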
|
January 21, 2012, 16:57 |
|
#3 |
Member
Francesco Capuano
Join Date: May 2010
Posts: 81
Rep Power: 16 |
Thanks Bruno for your quick and clear reply!
As soon as I get back to work I will try it and let you know! Regards, Francesco |
|
January 24, 2012, 05:06 |
|
#4 |
Member
Francesco Capuano
Join Date: May 2010
Posts: 81
Rep Power: 16 |
Dear Bruno,
I get the following message:

Warning in /u2/capuanf/OpenFOAM/OpenFOAM-2.0.1/etc/config/settings.csh: MPI_ROOT not a valid mpt installation directory. Please set MPI_ROOT to the mpt installation directory. (usually done by loading the mpt module) MPI_ROOT currently set to '/dummy' MPI_ROOT is not properly set!

How can I fix this? Thanks in advance!

EDIT: Problem solved! I just defined the MPI_ROOT variable properly in my .bashrc file and it worked!

Last edited by francesco_capuano; January 24, 2012 at 14:33. Reason: Problem solved |
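Since the warning above comes from settings.csh, the fix looks slightly different depending on the login shell. A minimal sketch, using the Intel MPI path quoted later in this thread purely as an illustration:

```shell
# Hedged example: point MPI_ROOT at the MPI installation directory.
# For bash users, append this line to ~/.bashrc (path is illustrative):
export MPI_ROOT=/cm/shared/apps/intel/ics/impi/4.0.3.008

# tcsh/csh users (the warning quoted above came from settings.csh) would
# instead add the equivalent line to ~/.cshrc:
#   setenv MPI_ROOT /cm/shared/apps/intel/ics/impi/4.0.3.008

echo "MPI_ROOT is now: $MPI_ROOT"
```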
|
September 19, 2012, 06:21 |
|
#5 |
Senior Member
Sandy Lee
Join Date: Mar 2009
Posts: 213
Rep Power: 18 |
Hi Francesco,
I have met the same problem. Could you give a detailed description of how to set the MPI_ROOT variable in .bashrc? Thanks |
|
September 19, 2012, 12:09 |
|
#6 | |
Member
Francesco Capuano
Join Date: May 2010
Posts: 81
Rep Power: 16 |
Quote:
You can define the MPI_ROOT variable by adding the line

export MPI_ROOT=mpipath

at the end of your .bashrc file, where mpipath is the MPI installation path on your computer/cluster. You can either find it yourself or ask your cluster administrator. Just to give you a clue, in my case the installation path is /cm/shared/apps/intel/ics/impi/4.0.3.008/. In case you have further questions, please don't hesitate to ask. Regards, Francesco |
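After editing .bashrc, a quick sanity check (sketched below, with the export line standing in for re-sourcing the file) confirms the variable is set and actually points at an existing directory:

```shell
# Verify the MPI_ROOT setting took effect. The path is the example from
# this thread; substitute your own installation path.
export MPI_ROOT=/cm/shared/apps/intel/ics/impi/4.0.3.008/   # stand-in for: source ~/.bashrc

echo "$MPI_ROOT"   # prints the installation path
# Warn if the path does not exist on this machine:
test -d "$MPI_ROOT" && echo "directory exists" || echo "check the path!"
```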
|
January 15, 2013, 21:46 |
MPI version and OpenFOAM 2.1
|
#7 |
Member
,...
Join Date: Apr 2011
Posts: 92
Rep Power: 14 |
Hi FOAMERS,
I am using OpenFOAM 2.1 and have LAM/MPI already installed (LAM 7.1.4/MPI 2 C++/ROMIO - Indiana University). When I try to run icoFoam on 3 processors I get this error:

------------
It seems that [at least] one of the processes that was started with mpirun did not invoke MPI_INIT before quitting (it is possible that more than one process did not invoke MPI_INIT -- mpirun was only notified of the first one, which was on node n0). mpirun can *only* be used with MPI programs (i.e., programs that invoke MPI_INIT and MPI_FINALIZE). You can use the "lamexec" program to run non-MPI programs over the lambooted nodes.
------------

Any idea why I get this error? I have already run some cases in parallel with OpenFOAM 1.7 and mpirun (Open MPI) 1.3.2. Can I use LAM/MPI instead of OpenMPI? If yes, is there a certain version of LAM/MPI that is compatible with OpenFOAM 2.1? |
|
January 16, 2013, 02:53 |
|
#8 | |
Senior Member
Nima Samkhaniani
Join Date: Sep 2009
Location: Tehran, Iran
Posts: 1,267
Blog Entries: 1
Rep Power: 25 |
I don't know what the error is, but there is a suggestion in the error message itself:
Quote:
__________________
My Personal Website (http://nimasamkhaniani.ir/) Telegram channel (https://t.me/cfd_foam) |
|
Tags |
mpi, pstream library |
|
|
Similar Threads | ||||
Thread | Thread Starter | Forum | Replies | Last Post |
MPI InterFoam | metro | OpenFOAM Running, Solving & CFD | 7 | March 27, 2013 04:24 |
Problem with MPI? | cwang5 | OpenFOAM Running, Solving & CFD | 3 | July 12, 2010 11:38 |
Code Saturne mass sources MPI problem | Pat84 | Main CFD Forum | 9 | April 21, 2010 09:02 |
non-linear k-epsilon model implementation problems | Saidi | Main CFD Forum | 2 | March 4, 2010 14:23 |
MPI PROBLEMS | gtg627e | OpenFOAM Running, Solving & CFD | 20 | October 5, 2007 05:02 |