
Suggestions on the hardware configuration

#1 - kukka (lxlylzl.123, New Member) - June 22, 2020, 08:34
Hi all,


I am planning to purchase a new desktop for my lab for numerical simulations with Fluent v16.2 and v18.1 (research license, no restriction on the number of cores) and XFlow v2020x (limited to 32 cores), and most likely CCM+ in the near future.

I will be working on multiphase problems (Euler-Lagrange, Euler-Euler, and free-surface flows), conjugate heat transfer, and street-canyon problems. Mostly I would run LES on meshes of 8 to 12 million cells (or more). Our budget is around $5,000. The time step size may go below 1.0e-5 s in some simulations, with a total simulated (physical) time of 20-50 seconds.
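To put the computational load in numbers, here is a rough back-of-the-envelope sketch in Python, using the figures above (the seconds-per-step value is a made-up placeholder; real values depend entirely on the solver, mesh, and hardware):

Code:
# Rough sizing of the transient LES runs described above.
# dt and the simulated time come from the post; sec_per_step is a
# hypothetical placeholder, not a measured value.
dt = 1.0e-5            # time step size [s]
t_phys = 50.0          # physical time to simulate [s], upper end of 20-50 s
n_steps = t_phys / dt  # -> 5,000,000 time steps
sec_per_step = 1.0     # assumed wall-clock seconds per time step
days = n_steps * sec_per_step / 86400.0
print(f"{n_steps:,.0f} steps, ~{days:.0f} days at {sec_per_step} s/step")

Even at a (hypothetical) one second of wall-clock time per step, that is roughly two months per run, which is why per-core performance and scaling matter so much here.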

What I have learnt from this forum is that the AMD Epyc series is ahead of Intel in performance (partly due to scalability issues on the Intel side). Still, my first priority would be Intel, within the $5K range. Is it possible to get a configuration with decent scalability and overall performance? All suggestions are welcome.

#2 - flotus1 (Super Moderator) - June 22, 2020, 09:13
Let's assume those $5,000 would all go towards hardware, and not into the pockets of a big OEM. Then the fastest Intel system you could buy would look like this:

SSI-EEB case: $100
Power supply: $130
NVMe SSD, 2TB: $300
CPU coolers: $160
Graphics card: $250
Motherboard (ASUS WS C621E Sage): $550
CPUs (2x Xeon Silver 4216): $2,100
RAM (12x 16GB DDR4-2400 dual-rank): $950

That's about $4,550, leaving some budget for additional hard drives or whatever else you might need.
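For transparency, a quick tally of the rough 2020 prices listed above:

Code:
# Sum of the approximate component prices listed above (USD, 2020).
parts = {
    "SSI-EEB case":                     100,
    "Power supply":                     130,
    "NVMe SSD, 2TB":                    300,
    "CPU coolers":                      160,
    "Graphics card":                    250,
    "Motherboard (ASUS WS C621E Sage)": 550,
    "CPUs (2x Xeon Silver 4216)":      2100,
    "RAM (12x 16GB DDR4-2400)":         950,
}
print(f"total: ${sum(parts.values())}")  # total: $4540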
Stepping up within Intel's portfolio is next to impossible within this budget. The next higher-performing CPU that makes some sense is the Xeon Gold 6226R, which costs over $1,500 apiece and requires faster (i.e. more expensive) memory.

#3 - kukka (lxlylzl.123, New Member) - June 22, 2020, 14:44
Dear flotus1, thanks very much for your reply and suggestions regarding the system configuration.


The Xeon Gold 6226R seems to be a better choice, and for it I would have to increase my budget by another $1K (to a total of $6K). Could you please recommend the hardware required for 2x Xeon Gold 6226R processors?

Just out of curiosity: will the above configuration (32 cores in total) perform better, especially in terms of scalability, than a single AMD Epyc 7F72 (24 cores)? This AMD model seems promising. If so, could you provide a configuration for it as well? And is it possible to use a dual-socket motherboard with only one processor installed, leaving the second socket free for adding another Epyc 7F72 in the future?

#4 - flotus1 (Super Moderator) - June 22, 2020, 15:14
Quote:
Could you please recommend the hardware required for 2x Xeon Gold 6226R processors?
All you need to change are the CPUs and the memory: 12x 16GB DDR4-2933 dual-rank.
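For context, the 12-DIMM count matches the six memory channels per socket on these Xeons (one DIMM per channel across two sockets). A minimal sketch of the theoretical peak bandwidth; sustained numbers will be lower:

Code:
# Theoretical peak DRAM bandwidth for 2x Xeon Gold 6226R with one
# dual-rank DIMM per channel (12x 16GB DDR4-2933).
sockets = 2
channels = 6            # memory channels per Cascade Lake-SP socket
mts = 2933e6            # DDR4-2933: 2933 million transfers per second
bytes_per_transfer = 8  # 64-bit channel width
peak = sockets * channels * mts * bytes_per_transfer
print(f"peak bandwidth: {peak / 1e9:.0f} GB/s")  # ~282 GB/s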

Quote:
Just out of curiosity: will the above configuration (32 cores in total) perform better, especially in terms of scalability, than a single AMD Epyc 7F72 (24 cores)?
Yes, it will show better scaling than any single CPU, and also higher overall performance.

Quote:
is it possible to use a dual-socket motherboard with only one processor installed
Yes, you can do that. Just be careful which CPU socket you populate: most of the I/O and PCIe slots on dual-socket motherboards are connected to one specific socket.

It would be negligent of me not to ask two questions here:
1) Why Intel? Just playing it safe, or are there other reasons?
2) Why the 7F Epyc CPUs? They may outperform their non-F counterparts slightly, but at the cost of much worse price/performance. Even cheaper Epyc CPUs like the 7302 outperform Intel's high-frequency models on a per-core performance basis: Xeon Gold Cascade Lake vs Epyc Rome - CFX & Fluent - Benchmarks (Windows Server 2019)

#5 - kukka (lxlylzl.123, New Member) - June 23, 2020, 10:28
Alex, thanks a lot. This information was really helpful to me.


1. Why Intel? Just playing it safe, or are there other reasons?
Actually, I may move to GPU-based simulations in the future. I am not sure whether an AMD configuration would support Nvidia GPUs (CUDA cores), and to what extent. Secondly, no one in my circle (including myself) has used AMD CPUs.


2. Why the 7F Epyc CPUs? They may outperform their non-F counterparts slightly, but at the cost of much worse price/performance.
That's correct, Alex, and I agree with you on this.



Just a small thought: if one uses 2x Epyc 7302 (32 cores total) with 128 GB RAM (16x 8GB) on one hand, and 2x Xeon Gold 6226R (32 cores total) with 96 GB RAM (6x 16GB) on the other, which of the two would be faster in terms of simulation (wall-clock) run time? I could not locate any benchmark report comparing these two variants; if none exists, what is your view? Secondly, which of the two configurations would be more cost-effective?


Your views on the above would mean a lot to me in making a final decision.

#6 - flotus1 (Super Moderator) - June 23, 2020, 11:35
Your choice of CPU has no impact on CUDA support.
Funny side note: just look at what Nvidia did with their DGX systems, which sit at the absolute high end of what is currently possible with GPU acceleration: they used AMD Epyc CPUs, due to their higher overall PCIe bandwidth.
There are other factors to consider, though. If you are building a system yourself with two AMD Epyc CPUs, your only motherboard choice is the Supermicro H11DSi, which has only two PCIe x16 slots, both connected to CPU 1. So it is not ideal if you want to use multiple GPUs.
Then again, there are quite a few obstacles to overcome when using GPU acceleration in software like Fluent or CCM+. One of them is the extremely expensive GPUs required. To be frank: on a budget of $6,000, GPU acceleration won't be a viable option; a Quadro GV100 alone costs around 10,000€.

Quote:
If one uses 2x Epyc 7302 (32 cores total) with 128 GB RAM (16x 8GB) on one hand, and 2x Xeon Gold 6226R (32 cores total) with 96 GB RAM (6x 16GB) on the other, which of the two would be faster in terms of simulation (wall-clock) run time?
Epyc will definitely be faster. Even after fixing the memory population of those Xeons to e.g. 12x 8GB, my money would still be on the Epyc solution. The difference won't be huge, but current Xeons simply can't match the memory bandwidth and cache size of Epyc Rome, at least for CFD workloads.
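To put rough numbers on that bandwidth gap (theoretical peak values only; sustained bandwidth is lower, but the ratio is similar):

Code:
# Theoretical peak DRAM bandwidth, 2x Epyc 7302 vs 2x Xeon Gold 6226R.
def peak_gbs(sockets: int, channels: int, transfers_per_s: float) -> float:
    """Peak bandwidth in GB/s, assuming 64-bit (8-byte) channels."""
    return sockets * channels * transfers_per_s * 8 / 1e9

epyc = peak_gbs(2, 8, 3200e6)  # Rome: 8 channels of DDR4-3200 per socket
xeon = peak_gbs(2, 6, 2933e6)  # Cascade Lake: 6 channels of DDR4-2933
print(f"Epyc: {epyc:.0f} GB/s, Xeon: {xeon:.0f} GB/s, "
      f"ratio: {epyc / xeon:.2f}x")  # ~410 vs ~282 GB/s, ~1.45x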

#7 - Kukka (New Member) - June 24, 2020, 14:11
Alex, thanks very much for making things clear.

#8 - kukka (lxlylzl.123, New Member) - April 13, 2021, 15:10
Hi Alex.

Due to the pandemic, I could not get the required components, viz. the Epyc 7302 and related hardware. To this day it remains hard to find in my region; it has long been out of stock even in the well-known shops.

However, after considerable effort, I managed to source the following hardware:

1. 2x Epyc 7402 (48 cores total)
2. 2x 2U active heatsinks
3. ASRock Rack ROME2D16-2T motherboard
4. 16x 16GB DDR4 ECC registered 3200MHz (256GB total)
5. Tower chassis with 1300W PSU
6. Quadro P2200 graphics card
7. 1TB NVMe M.2 PCIe 4.0 SSD
8. 4x 8TB SATA enterprise 7200rpm HDDs

The total cost would be around $8,400, which is quite high, but I plan to buy it. I could have settled for a single processor with 128 GB RAM, but the dual-processor setup costs only another $2,400, which I can afford, and a more capable machine will also be helpful in the time to come.

I need your view on this. Do you think any changes are required?

1. Is the graphics card OK?

2. Is there no liquid cooling for Epyc? The ambient temperature here can reach around 48°C at most, and I hope the cooling I have chosen is sufficient.

3. I plan to combine three of the SATA drives into a single 24TB volume and keep one as a single 8TB drive. The 24TB volume would hold large XFlow files: XFlow, being LBM-based, writes output at user-defined frequencies and preserves all of it, so a large volume is required.

4. I hope the motherboard I have chosen is OK. If not, please suggest one.


Your views on the above would mean a lot to me.

#9 - flotus1 (Super Moderator) - April 13, 2021, 17:00
Quote:
1. Is the graphics card OK?
It's fine for display output, but it won't do anything for GPU acceleration. If you need to shave off some of the cost, this would be a good place to start. Graphics cards have become insanely expensive these days, but something like a GeForce GTX 1660 should still be cheaper.

Quote:
2. Is there no liquid cooling for Epyc? The ambient temperature here can reach around 48°C at most, and I hope the cooling I have chosen is sufficient.
There are plenty of AIO liquid coolers for the TR4/SP3 socket, but only a few of them, if any, were designed specifically for these huge CPUs. The rest don't cover the whole die area, which leads to suboptimal results. The challenge might be getting two of them mounted inside one case; that will require some planning.
Or you could do what I did: a full custom water-cooling loop, including the CPU VRMs. The CPUs themselves are relatively easy to cool thanks to their large surface area, but the VRM heatsinks on these boards are designed for the high airflow of server cases; in a quiet workstation case, the CPU VRMs are usually the first component to cause thermal throttling. Full disclosure: water cooling wasn't really necessary in my case, I just got bored during the first lockdown.
[Attached image: IMG_20200405_160041_small.jpg]
That being said, a large air cooler is usually enough for these CPUs; "2U active heatsinks" will be loud as hell. Your options strongly depend on the case you pick. 48°C ambient is a real challenge, though. Air conditioning seems like the easier solution compared to designing a cooling system to handle that, and it also helps the human in front of the desk.

Quote:
3. I plan to combine three of the SATA drives into a single 24TB volume and keep one as a single 8TB drive. The 24TB volume would hold large XFlow files: XFlow, being LBM-based, writes output at user-defined frequencies and preserves all of it, so a large volume is required.
You could do that, but be aware that you don't get any redundancy. If one drive fails, at least part of the data is gone.
I dabble in LES with LBM myself. I have a RAID6 of spinning drives for capacity, and a single 7.68TB NVMe SSD for fast storage of current projects. Keep in mind that hard drives cap out at around 200MB/s sequential; that's rather slow for filling 24TB. And if you RAID0 them, all of the data is gone with a single drive failure.
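To make "rather slow" concrete, a quick estimate, assuming that ~200MB/s figure holds across the whole volume:

Code:
# Time to fill a 24TB volume at typical HDD sequential write speed.
capacity_bytes = 24e12  # 24 TB
write_speed = 200e6     # ~200 MB/s, rough cap for a single 7200rpm drive
hours = capacity_bytes / write_speed / 3600
print(f"~{hours:.0f} hours of continuous sequential writing")  # ~33 hours

A RAID0 across three drives would write roughly three times faster, but with the failure-risk multiplication mentioned above.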

Quote:
4. I hope the motherboard I have chosen is OK. If not, please suggest one.
Should be fine.

#10 - wkernkamp (Will Kernkamp, Senior Member) - April 14, 2021, 02:27 - "raidz"
Quote:
Originally Posted by lxlylzl.123
3. I plan to combine three of the SATA drives into a single 24TB volume and keep one as a single 8TB drive. The 24TB volume would hold large XFlow files: XFlow, being LBM-based, writes output at user-defined frequencies and preserves all of it, so a large volume is required.

You may want to combine all four drives into a ZFS raidz configuration. With ambient temperatures around 48°C, you are exposed to drive failures.
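For reference, a single-parity raidz over all four drives gives the same 24TB of usable space as the planned 3-drive JBOD, while tolerating one drive failure. A rough sketch that ignores ZFS metadata and padding overhead:

Code:
# Usable capacity of 4x 8TB drives in a single-parity raidz (raidz1),
# ignoring ZFS metadata/padding overhead.
drives, size_tb, parity = 4, 8, 1
usable_tb = (drives - parity) * size_tb
print(f"usable: {usable_tb} TB, tolerates {parity} drive failure")  # 24 TB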

#11 - kukka (lxlylzl.123, New Member) - April 14, 2021, 07:08
Dear Alex,


Thanks very much for your valuable suggestions; they are very helpful to me.


1. There is a huge scarcity of graphics cards here, so I am left with no other option.


2. As you mentioned, I will start working on the cooling to eliminate these drawbacks. The maximum indoor temperature in my location ranges from 40 to 44°C during peak summer, and is lower otherwise.


3. The 7.68TB NVMe SSD you have is great stuff. Unfortunately, I am already over budget, and such storage is very expensive for me; it can be left as a future upgrade. However, I now plan to purchase a 2TB NVMe M.2 PCIe 4.0 SSD instead of the 1TB one, along with 2x 8TB HDDs (merged) plus 1x 8TB (single). Fluent files could be stored on the 2TB NVMe SSD and later transferred to the single HDD; small XFlow files could be handled the same way. Bigger XFlow files could be saved directly to the 2x 8TB = 16TB volume (I know the write speed will be significantly lower than on the SSD).

Alex, is it advisable to go for a single 16TB drive instead of 2x 8TB? And are 16TB 7200rpm drives noisy?

#12 - kukka (lxlylzl.123, New Member) - April 14, 2021, 07:10
Quote:
Originally Posted by wkernkamp
You may want to combine all four drives into a ZFS raidz configuration. With ambient temperatures around 48°C, you are exposed to drive failures.

Dear Will Kernkamp, thanks very much for the valuable suggestion. I will keep it in mind.

#13 - flotus1 (Super Moderator) - April 14, 2021, 08:24
Quote:
Alex, is it advisable to go for a single 16TB drive instead of 2x 8TB? And are 16TB 7200rpm drives noisy?
That depends on how much risk you are willing to take.
To get the full capacity out of two drives, you can use either JBOD or RAID0. The former usually loses only the data on the drive that fails; the latter loses all data on a single drive failure. Either way, compared to a single disk, the risk of data loss is approximately doubled.
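The "approximately doubled" follows from elementary probability; the failure rate below is purely hypothetical, for illustration only:

Code:
# Why spanning data across two drives roughly doubles the loss risk.
p = 0.02                  # hypothetical annual failure probability per drive
p_one = p                 # single disk: data lost if it fails
p_two = 1 - (1 - p) ** 2  # two drives: data lost (at least partly) if either fails
print(f"two drives: {p_two:.4f} vs one drive: {p_one:.4f} "
      f"(~{p_two / p_one:.2f}x)")  # 0.0396 vs 0.0200, ~1.98x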
As for noise: compared to the rest of the workstation chugging along in 40°C ambient temperature, you won't hear the hard drives.
I bought the SSD used, for around 800€ if I remember correctly. New SSDs with those specs are way outside of my comfort zone.

A more general remark: modern hardware seems to be very hard to source in your location, and at pretty steep prices. That is usually a good fit for the dual-socket LGA 2011-3 setups you can get rather cheap from AliExpress. They won't be as fast, but they are much cheaper. That is one way to "future-proof" a system, and in my opinion one of the best: buying very cheap allows for more frequent upgrades.

#14 - kukka (lxlylzl.123, New Member) - April 14, 2021, 13:24
Dear Alex,


Thanks for this valuable information.
