|
June 16, 2012, 11:06 |
Running Paraview parallel
|
#1 |
Super Moderator
Tobias Holzmann
Join Date: Oct 2010
Location: Bad Wörishofen
Posts: 2,711
Blog Entries: 6
Rep Power: 52 |
Hi all,
I have been using ParaView for two years now, and I want to use my 6 cores for rendering. For that I have to compile ParaView following the instructions given here: http://paraview.org/Wiki/ParaView:Build_And_Install I installed all prerequisites, but when I configure my ParaView build with ccmake, switch on "use MPI", and set the MPI_LIBRARY path, I get these messages: Code:
Could not find the required MPI libraries
CMake Error: The following variables are used in this project, but they are set to NOTFOUND. Please set them or make sure they are set and tested correctly in the CMake files:
MPI_EXTRA_LIBRARY
    linked by target "IceTMPI" in directory /home/shorty/OpenFOAM/ParaView-3.14.1-Source/Utilities/IceT/src/communication
MPI_LIBRARY
    linked by target "VPIC" in directory /home/shorty/OpenFOAM/ParaView-3.14.1-Source/VTK/Utilities/VPIC
    linked by target "Cosmo" in directory /home/shorty/OpenFOAM/ParaView-3.14.1-Source/VTK/Utilities/Cosmo
    linked by target "Xdmf" in directory /home/shorty/OpenFOAM/ParaView-3.14.1-Source/Utilities/Xdmf2/libsrc
I built OpenMPI myself like this: Code:
./configure --prefix=/home/shorty/OpenFOAM/openMPI
make
make install
After that, the libraries are in the "lib/" folder of that prefix; for example, the file libmpi.so is there. I do not understand why I cannot fix this problem! Does someone know this problem, or am I doing something wrong? Thanks in advance, Tobi |
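[Editor's note] The NOTFOUND variables above usually mean CMake's FindMPI module could not locate the self-built OpenMPI. A minimal sketch of setting those cache variables explicitly when configuring a ParaView 3.x build; the exact library file names (e.g. libmpi_cxx.so for MPI_EXTRA_LIBRARY) are assumptions and depend on how OpenMPI was built:

```shell
# Sketch (untested): point CMake at a self-built OpenMPI install.
# Paths follow the prefix used in this thread; adjust to yours.
MPI_DIR=$HOME/OpenFOAM/openMPI

cmake ../ParaView-3.14.1-Source \
  -DPARAVIEW_USE_MPI=ON \
  -DMPI_INCLUDE_PATH="$MPI_DIR/include" \
  -DMPI_LIBRARY="$MPI_DIR/lib/libmpi.so" \
  -DMPI_EXTRA_LIBRARY="$MPI_DIR/lib/libmpi_cxx.so"   # assumed library name

# Alternatively, putting the OpenMPI wrappers on PATH before running
# ccmake usually lets FindMPI fill in these variables by itself:
export PATH="$MPI_DIR/bin:$PATH"
```

Either way, the point is that every MPI_* variable must point at files that actually exist under the same OpenMPI prefix.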
|
June 16, 2012, 18:38 |
|
#2 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Hi Tobi,
I haven't been doing any big builds of ParaView lately, but I've written a lot on this subject, most of which is accessible from here: Related issues to ParaView with OpenFOAM - Fixes and solutions. You'll find instructions on how to build ParaView 3.12.0 with everything on it for OpenFOAM here: ParaView 3.12.0 SuperBuild on OpenFOAM - well, not quite everything, but all of the default features that ship with the official builds at http://www.paraview.org. Additionally, following that SuperBuild blog post, it should be possible to use 3.14.1 instead of 3.12.0; for the modified source code for 3.14.1, visit this project: https://code.google.com/p/unofficial...ew-dev-install - but remember: you should still use the 3.12.0 folder name on disk, so that you don't need to change anything in OpenFOAM's settings files. Best regards, Bruno
|
|
June 16, 2012, 20:05 |
|
#3 | |
Super Moderator
Tobias Holzmann
Join Date: Oct 2010
Location: Bad Wörishofen
Posts: 2,711
Blog Entries: 6
Rep Power: 52 |
Quote:
you're the only one answering me - thanks for that, as always! Okay, I tried it with PV 3.12, but with the same result. I built OpenMPI again into $HOME/OpenFOAM/openMPI/ -> the include and lib folders are there, with the files in them. I set the path in ccmake to /home/shorty/OpenFOAM/openMPI, /home/shorty/OpenFOAM/openMPI/lib, /home/shorty/OpenFOAM/openMPI/lib/, /home/shorty/OpenFOAM/openMPI/lib/libmpi.so . . . but without success. While compiling OpenMPI I get these warnings (maybe that's the problem?) Code:
vt_mpiwrap.gen.c: In function 'MPI_Register_datarep':
vt_mpiwrap.gen.c:4381:5: warning: '__malloc_hook' is deprecated (declared at /usr/include/malloc.h:176) [-Wdeprecated-declarations]
vt_mpiwrap.gen.c:4381:5: warning: '__realloc_hook' is deprecated (declared at /usr/include/malloc.h:179) [-Wdeprecated-declarations]
vt_mpiwrap.gen.c:4381:5: warning: '__free_hook' is deprecated (declared at /usr/include/malloc.h:173) [-Wdeprecated-declarations]
vt_mpiwrap.gen.c:4391:5: warning: '__malloc_hook' is deprecated (declared at /usr/include/malloc.h:176) [-Wdeprecated-declarations]
vt_mpiwrap.gen.c:4391:5: warning: '__realloc_hook' is deprecated (declared at /usr/include/malloc.h:179) [-Wdeprecated-declarations]
vt_mpiwrap.gen.c:4391:5: warning: '__free_hook' is deprecated (declared at /usr/include/malloc.h:173) [-Wdeprecated-declarations]
Wow Bruno, that did the trick! THANKS TO YOU - AS ALWAYS! Tobi |
||
September 21, 2018, 03:48 |
|
#4 |
Senior Member
Lukas Fischer
Join Date: May 2018
Location: Germany, Munich
Posts: 117
Rep Power: 8 |
Have you found a solution?
To use ParaView in parallel (e.g. with 4 cores) I open a shell and type: mpirun -np 4 pvserver
Afterwards I start ParaView. I click on "Connect" and then on "Add Server", adjust the name, and go to "Configure", where I choose the Startup Type "Manual"; then I save it. Now I can choose the server in the list and connect to it. If you then go to View and open the Memory Inspector, you can see that you are connected to the pvserver, which uses 4 cores. This is well documented in the ParaView User's Guide, chapter fifteen: REMOTE AND PARALLEL VISUALIZATION. Last edited by lukasf; September 26, 2018 at 07:28. |
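[Editor's note] The steps above as a command-line sketch; "4" is just an example core count, and the server here runs on the same machine as the GUI:

```shell
# Terminal 1: start the ParaView server on 4 MPI ranks
# (requires a ParaView build with MPI support)
mpirun -np 4 pvserver
# pvserver then waits, printing something like:
#   Waiting for client...
#   Connection URL: cs://yourhostname:11111

# Terminal 2: start the ParaView GUI and connect to the server
#   File -> Connect -> Add Server
#     Server Type: Client/Server
#     Host:        localhost
#     Port:        11111
#   Configure -> Startup Type: Manual -> Save, then Connect.
paraview
```

Once connected, filters run on the pvserver ranks rather than in the GUI process, which is what the Memory Inspector check above confirms.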
|
December 5, 2018, 05:19 |
|
#5 |
New Member
Metikurke
Join Date: May 2017
Posts: 21
Rep Power: 9 |
Hi Lukas,
As far as I understand, you are talking about remote computers. Can you please let me know how to use the available cores on my local computer to run ParaView in parallel? I have a 7 million cell dataset, and the Q-criterion takes almost half a day to display. It would be great if you could give some insight. I have a 4-core workstation with 32 GB RAM and a 1 GB graphics card. Thank you. |
|
December 6, 2018, 03:46 |
|
#6 | |
Senior Member
Lukas Fischer
Join Date: May 2018
Location: Germany, Munich
Posts: 117
Rep Power: 8 |
Quote:
I am talking about a local computer with e.g. 4 processors. Try the instructions I gave and/or read the ParaView User's Guide section; it works. With respect to the Q-criterion, you should try not to display it in the whole CFD domain. Try a Contour filter with a specific value of the Q-criterion, or a Threshold filter with a SMALL range of values; this is much less computationally expensive. If you do not see anything, decrease/increase the value by a factor of 10 until you do. |
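[Editor's note] The Contour-filter suggestion can also be scripted in batch with pvpython, which ships with ParaView. A sketch only: the case file name and the field name "Q" are assumptions, so use whatever your Q-criterion array is actually called in your data:

```shell
# Sketch (untested): contour the Q-criterion field in batch with pvpython.
pvpython <<'EOF'
from paraview.simple import *

# Hypothetical OpenFOAM case file; adjust to your data.
case = OpenFOAMReader(FileName='case.foam')

contour = Contour(Input=case)
contour.ContourBy = ['POINTS', 'Q']   # assumed array name for the Q-criterion
contour.Isosurfaces = [1000.0]        # starting value; adjust by factors of 10

Show(contour)
Render()
SaveScreenshot('q_criterion.png')
EOF
```

Extracting a single isosurface this way renders a small polygonal surface instead of the full 7-million-cell volume, which is where the speedup comes from.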
||
March 25, 2019, 15:37 |
|
#7 | |
Member
Join Date: Feb 2019
Posts: 37
Rep Power: 7 |
Quote:
Hi Lukas, I'm trying to run ParaView in parallel mode on Ubuntu Desktop 18 with 4 cores. When I execute mpirun -np 4 pvserver I get the following:
Waiting for client...
Connection URL: cs://4PI:11111
Accepting connection(s): 4PI:11111
Waiting for client...
Connection URL: cs://4PI:11111
ERROR: In /home/buildslave/dashboards/buildbot/paraview-pvbinsdash-linux-shared-release_superbuild/build/superbuild/paraview/src/VTK/Common/System/vtkSocket.cxx, line 206
vtkServerSocket (0xc24de0): Socket error in call to bind. Address already in use.
I don't have the firewall on. Does anyone know how I can solve this problem? Thanks. Edit: solved! Last edited by jmenendez; March 25, 2019 at 18:50. |
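[Editor's note] "Socket error in call to bind. Address already in use" means some process (often a leftover pvserver from an earlier attempt) is still bound to port 11111. A quick way to check, using only bash's built-in /dev/tcp (no lsof or ss needed):

```shell
# Check whether anything is already listening on the pvserver port.
PORT=11111
if (exec 3<>"/dev/tcp/127.0.0.1/$PORT") 2>/dev/null; then
  echo "port $PORT is busy - stop the old server or pick another port"
else
  echo "port $PORT is free"
fi

# If it is busy, either stop the stale server:
#   pkill pvserver
# or start the new one on a different port:
#   mpirun -np 4 pvserver --server-port=11112
```

If you change the port, remember to change it in the GUI's server configuration as well.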
||
March 26, 2019, 02:27 |
|
#8 |
Senior Member
Lukas Fischer
Join Date: May 2018
Location: Germany, Munich
Posts: 117
Rep Power: 8 |
How did you solve it?
|
|
March 26, 2019, 08:28 |
|
#9 |
Member
Join Date: Feb 2019
Posts: 37
Rep Power: 7 |
Instead of using the system mpirun command, I installed ParaView from the official page and used the mpiexec executable from ParaView's /bin directory: mpiexec -np <cores> pvserver
Maybe it's because the precompiled Ubuntu version does not have MPI enabled; I do not know exactly. |
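[Editor's note] The official ParaView binaries bundle their own MPI runtime, and mixing them with a system mpirun can fail, so the fix above amounts to launching pvserver with the mpiexec from the same install. A sketch; the install path is only an example:

```shell
# Use the mpiexec that ships alongside pvserver, so the server and the
# MPI runtime come from the same ParaView build.
# (The directory below is a hypothetical install location.)
PV_BIN=$HOME/ParaView-MPI-Linux-64bit/bin

"$PV_BIN/mpiexec" -np 4 "$PV_BIN/pvserver"
```

Calling both binaries through the same $PV_BIN path is the easiest way to guarantee they match.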
|
April 7, 2020, 14:46 |
|
#10 |
Member
Marco Ghiani
Join Date: Jan 2011
Posts: 35
Rep Power: 15 |
Hi all,
I would like to ask: where is the server name written? |
|
January 25, 2023, 12:15 |
|
#11 |
Senior Member
Lukas Fischer
Join Date: May 2018
Location: Germany, Munich
Posts: 117
Rep Power: 8 |
I wanted to keep the files on the cluster and visualize the results on my local computer. Moreover, I wanted to run ParaView (PV) in parallel on the cluster.
Make sure to install the same PV version on the cluster and on the local machine. On Linux systems I did this:
Terminal 1 (on the cluster): change to the paraview/bin directory and execute, for 4 cores in parallel: ./mpiexec -np 4 ./pvserver --force-offscreen-rendering --server-port=11111 (here I needed to use the mpiexec command instead of the mpirun command, as in the solution suggested above).
Terminal 2 (on the local machine): ssh -L '11111:node131.cluster:11111' -N yourUserNameOnCluster@yourClusterIP Note: nothing happens in the terminal, but the command is working. Moreover, I do not know why I have to do this.
Terminal 3: connect to the server via the GUI on the local machine (Server Type: Client/Server; Host: localhost; Port: 11111). |
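[Editor's note] The three-terminal workflow above, collected as a sketch. The node name, user name, and cluster address are the placeholders from the post. The ssh -L step is needed because it forwards the compute node's port 11111 to localhost on your machine, which is why the GUI then connects to localhost:

```shell
# Terminal 1 - on the cluster: start pvserver on 4 ranks with ParaView's own mpiexec
cd /path/to/paraview/bin        # adjust to your ParaView install on the cluster
./mpiexec -np 4 ./pvserver --force-offscreen-rendering --server-port=11111

# Terminal 2 - on the local machine: tunnel local port 11111 to the compute node
# (-N: open the tunnel only, run no remote command; the terminal stays silent)
ssh -L '11111:node131.cluster:11111' -N yourUserNameOnCluster@yourClusterIP

# Terminal 3 - on the local machine: start the GUI and connect
#   Server Type: Client/Server, Host: localhost, Port: 11111
paraview
```

The same-version requirement matters because the ParaView client refuses connections from a pvserver of a different release.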
|