October 26, 2021, 08:44
AMD Epyc hardware for ANSYS HPC
#1
New Member
Chefbouza
Join Date: Oct 2021
Posts: 10
Rep Power: 5
Dear Members,

I want to build a 128-core machine for ANSYS solvers (mainly Fluent, and a bit of Mechanical) with 3 HPC Pack licenses (which allow up to 128 cores on a single calculation; see the quick sketch at the end of this post). As I understand it, AMD Epyc Milan processors are currently much more attractive than the Intel Xeon Ice Lake ones (better performance/price ratio).

My company's IT supplier is Dell. Dell offers a 2-socket rack server with, for example, 2x Epyc 7763 processors, 16x 32 GB of DDR4 RAM (populating the 2x8 memory channels of the CPUs), and 2x 6.4 TB NVMe mixed-use SSDs. On the CFD Online forum I have never seen an Epyc 7763 configuration, and I don't know whether that is a matter of price or of performance.

One could also build 2 compute nodes with 2 CPUs each, for example 2x2 Epyc 7543 (128 cores in total, to use the 3 ANSYS HPC Pack licenses simultaneously). The drawback is that we would have to interconnect the 2 compute nodes, and we have no experience managing a cluster (with a single compute node with 2x 7763, there is nothing to manage).

Can you advise whether a 2x 7763 configuration is a good choice?

PS: My plan is to invest in next year's Epyc Genoa generation (DDR5, 12 memory channels, ...), but in the meantime Milan seems a good interim choice.

Thank you in advance for your help!
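For reference, a quick sketch of the commonly cited HPC Pack core scaling (each additional pack quadruples the enabled core count; treat the exact numbers as an assumption and verify them against your own license terms):
Code:
# Commonly cited ANSYS HPC Pack scaling (assumption -- verify against
# your license agreement): 1 pack enables 8 cores, and each additional
# pack quadruples the enabled core count.
def hpc_pack_cores(n_packs: int) -> int:
    return 2 * 4 ** n_packs

for n in range(1, 5):
    print(f"{n} pack(s) -> {hpc_pack_cores(n)} cores")
# 3 packs -> 128 cores, matching the configuration above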
October 26, 2021, 09:13
#2
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
The reason why 64-core Epyc CPUs are rarely recommended here is the usual one: with 8 cores per memory channel, scaling to all 128 cores of a dual-socket machine will be far from ideal (see the quick numbers at the end of this post).
Which means you would make better use of those rather expensive Ansys licenses with two nodes of two 32-core CPUs each. These 32-core CPUs don't even have to be from the top of the stack in order to be faster. If it has to be 128 cores in a single system, I guess the 7763 is the best money can buy.
For connecting two systems, you can either go with 10 Gigabit Ethernet and see how that goes, or with directly connected Infiniband. No expensive IB switch is required for only 2 nodes. Setting up such a mini-cluster isn't terribly complicated, but it does take some time.
I can't make that decision for you. You are basically trading convenience for performance.
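To put the memory-channel argument into rough numbers, here is a minimal sketch assuming DDR4-3200 and 8 memory channels per socket (about 25.6 GB/s theoretical per channel; sustained bandwidth will be noticeably lower, but the ratio between the two options holds):
Code:
# Rough per-core memory bandwidth for the two options discussed.
# Assumes DDR4-3200 (25.6 GB/s theoretical per channel) and 8 channels
# per socket; real sustained bandwidth is lower, but the ratio holds.
CHANNEL_BW_GBS = 3.2e9 * 8 / 1e9  # 3200 MT/s * 8 bytes = 25.6 GB/s

def per_core_bandwidth(sockets: int, cores_per_socket: int,
                       channels_per_socket: int = 8) -> float:
    total_bw = sockets * channels_per_socket * CHANNEL_BW_GBS
    return total_bw / (sockets * cores_per_socket)

print(f"1 node,  2x EPYC 7763: {per_core_bandwidth(2, 64):.1f} GB/s per core")  # ~3.2
print(f"2 nodes, 2x EPYC 7543: {per_core_bandwidth(4, 32):.1f} GB/s per core")  # ~6.4
Twice the theoretical bandwidth per core for the same 128 license tokens is the main reason the two-node option tends to come out ahead for Fluent.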
October 26, 2021, 09:32
#3
New Member
Chefbouza
Join Date: Oct 2021
Posts: 10
Rep Power: 5
Thank you very much for your answer!
Are there benchmarks comparing a single machine with a high core count against two machines connected via Infiniband?
Thank you!
October 26, 2021, 10:35
#4
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Ansys used to publish benchmarks with intra-node scaling, but unfortunately not anymore. Still, the fact that most of their benchmarks use the 32-core Epyc 75F3 tells us a lot.
All I can offer is the OpenFOAM scaling found here: General recommendations for CFD hardware [WIP]. And we have this intra-node benchmark for Fluent with 16-core Epyc CPUs: Xeon Gold Cascade Lake vs Epyc Rome - CFX & Fluent - Benchmarks (Windows Server 2019). You can already see how scaling is lower than ideal: only ~75% efficiency on 32 cores.
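For reading those benchmark numbers: parallel efficiency here simply means measured speedup divided by the ideal speedup. A minimal sketch with placeholder timings (illustration only, not actual benchmark results):
Code:
# Parallel efficiency = measured speedup / ideal speedup.
# The timings below are placeholders for illustration only.
def efficiency(t_ref: float, n_ref: int, t_par: float, n_par: int) -> float:
    speedup = t_ref / t_par
    ideal = n_par / n_ref
    return speedup / ideal

# Hypothetical: 320 s/iteration on 1 core, 13.3 s/iteration on 32 cores
print(f"{efficiency(320.0, 1, 13.3, 32):.0%}")  # ~75%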
October 26, 2021, 12:51
#5
New Member
Chefbouza
Join Date: Oct 2021
Posts: 10
Rep Power: 5
Thank you very much, Alex, for all this help!
Tags
amd epyc, ansys fluent, hardware, hpc |
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
General recommendations for CFD hardware [WIP] | flotus1 | Hardware | 19 | June 23, 2024 19:02 |
Operating System for AMD Epyc Workstation | jakethejake | Hardware | 14 | November 19, 2019 06:52 |
AMD Epyc Mini Cluster Hardware for StarCCM+ | clearsign | Hardware | 1 | April 24, 2019 17:28 |
Building Workstation using 2 x AMD EPYC 7301 | Ivanrips | Hardware | 16 | January 21, 2019 10:39 |
AMD Epyc CFD benchmarks with Ansys Fluent | flotus1 | Hardware | 55 | November 12, 2018 06:33 |