June 3, 2016, 08:31
Graphics card for Paraview
#1
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
I am currently finalizing a new workstation and wanted to double-check one specific topic: what is the best GPU for Paraview?
In my opinion, since Paraview only uses standard OpenGL instructions in single precision (?), "professional" graphics cards like Quadro and FirePro gain nothing from the driver optimizations that make them superior in some other professional applications. Especially not since I am running Linux. To be more precise: the Quadro M4000 8GB currently costs around 800€ and delivers a raw performance of 2572 GFLOPS (single) and 107 GFLOPS (double). The new GTX 1080, which also comes with 8GB of VRAM, delivers about three times as many GFLOPS in single precision, has more shading units, faster memory... all for a lower price. Even its double-precision performance is much higher. Am I missing something, or is there really no reason to use a Quadro/FirePro graphics card if Paraview is the only program used for visualization?
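To put the comparison above in price/performance terms: the M4000 numbers are taken from the post, while the GTX 1080 figures (about 8873 GFLOPS single precision, roughly 277 GFLOPS double, and a street price near 700€) are assumptions for illustration only, so treat this as a rough sketch rather than a benchmark:

```python
# Rough price/performance check for the two cards discussed above.
# M4000 figures are from the post; GTX 1080 figures are assumed estimates.
cards = {
    "Quadro M4000": {"price_eur": 800, "sp_gflops": 2572, "dp_gflops": 107},
    "GTX 1080":     {"price_eur": 700, "sp_gflops": 8873, "dp_gflops": 277},
}

for name, c in cards.items():
    sp_per_eur = c["sp_gflops"] / c["price_eur"]
    print(f"{name}: {sp_per_eur:.1f} single-precision GFLOPS per euro")

# Ratio of raw single-precision throughput, matching the "about three
# times as many GFLOPS" claim in the post:
ratio = cards["GTX 1080"]["sp_gflops"] / cards["Quadro M4000"]["sp_gflops"]
print(f"GTX 1080 / M4000 single-precision ratio: {ratio:.2f}")
```

With these assumed numbers the consumer card ends up around four times better in GFLOPS per euro, which is the core of the argument.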
June 10, 2016, 04:07
#2
Member
Mohammed Gowhar
Join Date: Feb 2014
Posts: 48
Rep Power: 12
One word: reliability. Rendering with Iray for Autodesk Inventor (or similar) drives the card very hard, especially over extended periods. Quadros are designed to withstand that kind of usage; consumer cards aren't.
It really depends on what you need to do with your card. The 1080 is a great card at a good price, but if you need reliability and intend to render for days on end, then a Quadro might well be the better bet in the long run.
June 10, 2016, 04:54
#3
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Thanks for sharing your opinion. But throwing in the term "reliability" sounds like the usual marketing Nvidia and Intel use to advertise their professional product lines. I can say with absolute certainty that this argument is invalid for CPUs, and I highly doubt it holds true for GPUs. Especially since almost every consumer partner graphics card has a better cooling system than the corresponding Quadro card.
Were there ever any studies to support the claim about the reliability of professional cards, or the lack thereof in consumer cards? I don't think folding@home or Bitcoin mining a few years ago would have been a thing if consumer cards could not handle the workload.
June 10, 2016, 05:13
#4
Senior Member
Joern Beilke
Join Date: Mar 2009
Location: Dresden
Posts: 533
Rep Power: 20
Are you sure that Paraview performance depends a lot on the GPU? It might be CPU-bound.
June 10, 2016, 05:36
#5
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
In my experience it is CPU-bound during tasks like loading the model or new time steps.
But interactively manipulating the model and waiting for the "non-decimated" geometry to be rendered uses the GPU. I tested this with different graphics cards; using a faster one definitely helped.
June 10, 2016, 11:50
#6
Senior Member
Erik
Join Date: Feb 2011
Location: Earth (Land portion)
Posts: 1,188
Rep Power: 23
If you are using a program which doesn't have its graphics tailored specifically for a "professional" graphics card, like a lot of CAD programs do, then I see no reason to use one.
June 13, 2016, 04:04
#7
Senior Member
Joern Beilke
Join Date: Mar 2009
Location: Dresden
Posts: 533
Rep Power: 20
@evcelica
There are good reasons to use a Quadro card. The smaller ones (e.g. Quadro 600) are usually enough for most CFD users. They are cheap, small, and don't use a lot of energy (around 45 W). Cooling and noise are the biggest challenges when you have a workstation under your desk.
June 13, 2016, 04:38
#8
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Cheap, small, and low energy consumption are not unique properties of low-end Quadro cards. The consumer cards they are based on have identical properties, but cost even less.
In addition, the Quadro cards available now are usually one generation behind the consumer cards. This means the latest "Maxwell" Quadro cards are still based on 28nm chips, while the latest "Pascal" consumer cards use 16nm chips with much higher energy efficiency.

Last edited by flotus1; June 13, 2016 at 05:40.
October 23, 2017, 13:28
#9
New Member
Pablo
Join Date: Oct 2017
Posts: 5
Rep Power: 9
Did anything come of the final choice of GPU? Was there a significant improvement in the final workstation?
I recently ran ParaView 5.4 on a PC with a GTX 1080 but did not observe any significant improvement compared to a GeForce GT 530... at least not in the streamline-generating filter. I was wondering how your final workstation turned out.
October 23, 2017, 13:48
#10
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Thanks to budget restrictions, I settled for a GTX 1060 6GB.
So far I have not run into any issues that could be pinned down to the choice of GPU. Paraview does a good job of enabling lower-end hardware, for example by "decimating" geometries while you interact with the model. I honestly don't know if PV can use the GPU for the actual computation of filters; I never needed that. My conclusion, based on a bit of research and the experience with the workstations in our lab: Quadro/FirePro: unnecessary. A decent amount of GPU VRAM: nice to have.
October 23, 2017, 13:56
#11
New Member
Pablo
Join Date: Oct 2017
Posts: 5
Rep Power: 9
Were there any steps you took to "better integrate" the GPU with ParaView? I am aware of, and have updated, OpenGL and the graphics drivers in general. I believe ParaView can see that the GPU is installed, but I cannot determine whether any restrictions were set up automatically that need changing.
Was there anything else you had to do before it ran nicely?
October 23, 2017, 15:42
#12
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
I installed the Nvidia driver as usual; that's it. No additional tinkering required. We are using openSUSE, btw.
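For anyone wanting to verify that the Nvidia driver is actually the active OpenGL renderer rather than a software fallback, one quick check on Linux is Mesa's `glxinfo` utility (assumed installed, e.g. from a package like `glx-utils` or `mesa-utils`):

```shell
# Print the vendor/renderer/version strings of the active OpenGL stack.
# If the Nvidia driver is loaded correctly, the renderer line should
# name the GPU (a GeForce/Quadro model) instead of llvmpipe/Mesa.
glxinfo | grep -E "OpenGL (vendor|renderer|version) string"
```

ParaView reports the same renderer information in its Help > About dialog, which is another way to confirm which GPU it is using.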
October 24, 2017, 13:22
#13
New Member
Pablo
Join Date: Oct 2017
Posts: 5
Rep Power: 9
Thanks for the insight, flotus1. Much appreciated.
December 5, 2018, 06:50
#14
New Member
samuel
Join Date: Sep 2018
Posts: 5
Rep Power: 8
Do you think a graphics card is mandatory with a powerful configuration like 2x Epyc?
Because it seems to me it's possible to use multiple cores with Paraview. Samuel
December 5, 2018, 17:01
#15
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
The processing in Paraview can be done in parallel, i.e. the calculations that PV does on the fields: computing the Q-criterion and the like.
But this does not decrease the requirements for the graphics adapter. Maybe more cores help if you are using software rendering, a topic I know virtually nothing about.
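To make that distinction concrete: the filter computations can be spread over MPI ranks with pvserver, independently of what renders the frames. A minimal sketch, assuming an MPI-enabled ParaView build; the rank count and port are arbitrary choices:

```shell
# Start a parallel ParaView server with 4 MPI ranks; filters such as
# contours or streamlines are then computed distributed across ranks.
mpiexec -np 4 pvserver --server-port=11111

# Then connect the ParaView GUI to localhost:11111 via File > Connect.
# A build linked against OSMesa can render entirely on the CPU
# (software rendering), with no GPU involved:
#   pvserver --force-offscreen-rendering
```

This is why many CPU cores speed up the data processing side, while interactive rendering performance still depends on whatever does the actual drawing, GPU or software rasterizer.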
January 27, 2019, 14:22
#16
New Member
Brandon Gleeson
Join Date: Apr 2018
Posts: 26
Rep Power: 8
Another aspect worth considering when choosing a GPU is getting a CUDA >=3.0 capable one, which allows the use of Nvidia's IndeX ParaView plugin. I'm currently using an AMD card, so I don't have any experience with it, but it looks interesting from what I've watched.
Is anybody using IndeX? This is from the readme under the ParaView 5.6 IndeX plugin directory: The NVIDIA IndeX for ParaView Plugin is compatible with: