Verification/Suggestion of Hardware Setup for CFD with OpenFOAM
September 21, 2020, 15:55
#1
New Member
Dominik
Join Date: Apr 2016
Location: Duisburg, Germany
Posts: 3
Rep Power: 10
Hi there,
i would like to build up a new home pc for CFD purposes. My budget is between 2.000 - 2.500 EUR. I already have built up a hardware setup consisting of the following hardware: Mainboard Asus PRIME X299-A, Intel X299 Processor Intel Core i9 10920X 12x 3.5GHz Power Supply Seasonic Core GC-650 Gold, 650 Watt Cooler be quiet! Dark Rock Pro 4 RAM 32GB DDR4-RAM PC-3600 (4x 8GB) overclocking DDR4-RAM, G.Skill Ripjaws V, 4x 8GB Graphics Nvidia GeForce GTX1660 Ti 6GB, Palit StormX OC 6144MB GDDR6-RAM Hard Disk 1 NVME M.2 SSD 500GB Kingston A2000 Hard Disk 2 SSD SSD 500GB Samsung 860 Evo MZ-76E500B Concerning the RAM, I have a question: Is there anything hardware specific at the RAM itself, so that it is suitable for Quad Channel Mode? I am asking because the webshop states that the RAM is "Dual Channel". But I thought that the quad channel mode depends solely on the memory controller of the mainboard and the number of modules (4 or 8)? I would be very happy about any hints whether the setup sounds reasonable for CFD calculations. Furthermore I would appreciate any suggestions if one could build a different system which would be more suitable still considering the budget. Thank you very much in advance for your replies. Cheers Dominik |

September 22, 2020, 15:31
#2
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Quote:
Theoretically, you could run into trouble with 4 DIMMs that came in two different packages, despite buying the same brand and model. Some memory manufacturers change the ICs in their memory modules without notice. But the chances of that happening when you buy two packs from one shop at the same time are pretty slim, and the chances of it leading to incompatibilities without extreme overclocking are also pretty small.

Quote:
91€ over budget, but absolutely worth it, considering you get 2 TB of NVMe storage, a really fast 16-core CPU with 64 GB of RAM, and the option to double performance later on by just dropping in a second CPU and more RAM.

September 22, 2020, 16:08
#3
Senior Member
Join Date: May 2012
Posts: 551
Rep Power: 16
You can shave off 150 Euros by changing the GPU to an RX 570 (or rather an RX 580, since they are only 10 Euros more today).
You can also opt for a 9800X and save another 400 Euros for almost the same performance in CFD. So for 550 Euros less you would have essentially the same performance. The Epyc suggestion will be a much better CFD option though, especially in the long run (I assume you have unlimited parallel licenses; if not, the Epyc might be worse).

September 24, 2020, 14:57
#4
New Member
Dominik
Join Date: Apr 2016
Location: Duisburg, Germany
Posts: 3
Rep Power: 10
Thank you both for the quick replies!

@flotus: The Epyc configuration seems to be a good choice, but I have two more questions:

I found that the mainboard you propose supports only 2666 MHz RAM, yet you recommended 3200 MHz RAM. Is there a reason you did not choose the "NT model" of that mainboard (https://geizhals.de/supermicro-h11ds...loc=at&hloc=de), or is it not worth the higher price?

Furthermore, I was wondering why the PCIe-to-M.2 converter is needed, since the mainboard already has an M.2 slot. Does it support higher read/write speeds?

Thank you for your reply!

Cheers
Dominik

September 24, 2020, 15:06
#5
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
I linked you revision 2 of this board, which does in fact support Epyc Rome CPUs and memory transfer rates up to 3200 MT/s out of the box. The info on this site is manually curated and not always 100% accurate, so be careful to get a board labeled "R2.0".

The only difference between the H11DSi and the H11DSi-NT: the latter has built-in 10-Gigabit Ethernet. Entirely up to you whether you need that.

That M.2-to-PCIe adapter card will help you get the most out of a fast NVMe SSD. The onboard M.2 slot on any Supermicro H11DSi is only connected via 2 PCIe 3.0 lanes, which limits sequential throughput on this kind of SSD; it needs 4 lanes for maximum throughput.
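[Editor's note: to put rough numbers on the lane limitation above. PCIe 3.0 runs at 8 GT/s per lane with 128b/130b line encoding, which gives a theoretical one-way ceiling per link width; real-world throughput is lower due to protocol overhead.]

```python
# Rough theoretical PCIe 3.0 bandwidth estimate (protocol overhead ignored).
GT_PER_S = 8e9        # PCIe 3.0: 8 gigatransfers per second per lane
ENCODING = 128 / 130  # 128b/130b line encoding

def pcie3_bandwidth_gb(lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    bits_per_s = GT_PER_S * ENCODING * lanes
    return bits_per_s / 8 / 1e9  # bits -> bytes -> GB

print(f"x2: {pcie3_bandwidth_gb(2):.2f} GB/s")  # onboard M.2 slot: prints 1.97 GB/s
print(f"x4: {pcie3_bandwidth_gb(4):.2f} GB/s")  # adapter card:     prints 3.94 GB/s
```

So the x2 onboard slot caps a fast NVMe drive at roughly half of what it can do in a x4 adapter card.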

September 24, 2020, 15:42
#6
New Member
Dominik
Join Date: Apr 2016
Location: Duisburg, Germany
Posts: 3
Rep Power: 10
Thanks again for your quick reply. Just another question: is the ECC RAM necessary? Or wouldn't it be a good choice to get faster non-ECC RAM like this one instead (for nearly the same price):
https://geizhals.de/g-skill-ripjaws-...loc=at&hloc=de

Cheers
Dominik

September 24, 2020, 16:24
#7
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
This is not about ECC; that's just a nice bonus. Epyc CPUs and boards are incompatible with UDIMMs: you need RDIMMs, a.k.a. registered memory. There are very few RDIMMs without ECC, but they are not cheaper and have just the same loose timings as registered ECC modules.

Additionally, there is no memory overclocking on this platform. Epyc CPUs have a hard cap at their rated memory transfer rate. It is technically possible to adjust memory timings with an unlocked BIOS on H11DSi motherboards, but let's leave such extreme measures out of the equation for now.

If you want even higher memory performance, spending more money on dual-rank 16 GB modules would be an easier route. The price/performance ratio of that measure is pretty bad, though.
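[Editor's note: since CFD throughput scales mostly with sustained memory bandwidth, a quick way to sanity-check a finished build is a STREAM-style copy test. Below is a minimal stdlib-only Python sketch; the compiled STREAM benchmark is the standard tool and will report somewhat higher numbers. The buffer must be much larger than the CPU caches for the result to be meaningful.]

```python
import time

def copy_bandwidth_gb(n_bytes: int = 256 * 1024 * 1024, repeats: int = 5) -> float:
    """Estimate sustained memory bandwidth in GB/s via a large buffer copy."""
    src = bytearray(n_bytes)
    dst = bytearray(n_bytes)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst[:] = src  # memcpy-style copy in C: one read + one write of the buffer
        best = min(best, time.perf_counter() - t0)
    # count both the read and the write traffic
    return 2 * n_bytes / best / 1e9

print(f"~{copy_bandwidth_gb():.1f} GB/s")
```

Comparing this number across candidate memory configurations (channel count, rank count) gives a rough idea of the real-world difference for bandwidth-bound solvers.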