October 26, 2019, 16:00
#221 |
Senior Member
Join Date: Oct 2009
Posts: 140
Rep Power: 17
2 x AMD EPYC 7351, 16 x 8GB DDR4, Supermicro h11dsi-nt
OpenSUSE Leap 15.1, Kernel 4.12.14-lp151.28.20-default, OpenFOAM-7
Code:
# cores    Wall time (s)
------------------------
 1         686.41
 2         420.24
 4         171.41
 6         115.23
 8          90.14
12          66.58
16          54.64
20          48.03
24          43.04
28          42.56
32          35.76
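For anyone wanting to reproduce a sweep like this: a minimal sketch of scripting it, assuming a prepared benchmark case run with simpleFoam (the solver name and core counts are assumptions, adjust to the actual case):
Code:
#!/bin/bash
# Core-count sweep for an OpenFOAM benchmark case (sketch).
# Assumes the case is set up in the current directory and that
# decomposePar reads numberOfSubdomains from system/decomposeParDict.
for n in 1 2 4 6 8 12 16 20 24 28 32; do
    start=$SECONDS
    if [ "$n" -eq 1 ]; then
        simpleFoam > log.serial 2>&1
    else
        foamDictionary -entry numberOfSubdomains -set "$n" system/decomposeParDict
        decomposePar -force > /dev/null
        mpirun -np "$n" simpleFoam -parallel > "log.$n" 2>&1
    fi
    echo "$n cores: $((SECONDS - start)) s"
done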
November 4, 2019, 18:47
#222
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
https://www.anandtech.com/show/14694...epyc-2nd-gen/6
November 6, 2019, 01:07
#223
New Member
anonymous
Join Date: Oct 2019
Posts: 4
Rep Power: 7
flotus1,
It was run in the default "one NUMA domain per socket" mode. I haven't had the opportunity yet to experiment with the options in https://developer.amd.com/wp-content...56745_0.75.pdf. I can try the NPS4 setting if you're interested, but I may need some guidance on how to set it; I didn't see it in the BIOS initially, but I could have missed it.

mh-cfd,
The motherboard is a Supermicro H11DSi revision 2.0. It was purchased from https://www.interpromicro.com/ based on a tip from this thread: https://forums.servethehome.com/inde...yc-rome.25430/
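For what it's worth, once the option is changed you can verify what the firmware actually exposes from Linux; a minimal sketch using standard tools (nothing here is Supermicro-specific):
Code:
# Number of NUMA nodes the firmware exposes: a dual-socket Rome box
# should report 2 nodes with the default NPS1 and 8 nodes with NPS4.
numactl --hardware | head -n 1
# CPU-to-node mapping and per-node memory sizes:
lscpu | grep -i numa
numactl --hardware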
November 6, 2019, 13:58
#224
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Due to an appalling lack of Epyc Rome equipment on my part, I cannot help you find that BIOS option. But I would not be surprised if Supermicro just left it out: "screw that noise, more options would only confuse our customers".
It is partly out of curiosity, but I also think it should give you somewhat better performance with NUMA-aware software like OpenFOAM.
November 12, 2019, 08:33
#225
New Member
Join Date: Jul 2015
Posts: 10
Rep Power: 11
2 x AMD EPYC 7371, 16 x 16GB DDR4 Dual-Rank, Supermicro h11dsi
Windows 10 Pro Vers. 1903 Build 18362.418 - WSL Ubuntu 18.04 LTS
OpenFOAM-6 (precompiled package from openfoam.org)
Code:
# cores    Wall time (s)
------------------------
 1        1254.01
 2         447.25
 4         212.51
 6         139.17
 8         101.92
12          88.24
16          88.04
20          83.50
24          74.72
28          70.44
32          87.87
Any ideas? Or is it just Windows 10?
November 12, 2019, 12:05
#226
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
WSL = Windows Subsystem for Linux?
It might not be the best solution if you want near bare-metal performance. You might want to try a dockerized version of OpenFOAM, or a proper VM.
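If you want to try the container route: a minimal sketch, assuming the openfoam.org image on Docker Hub (the image name and tag are assumptions, check hub.docker.com before pulling):
Code:
# Pull an OpenFOAM 6 image and run it with the case directory
# bind-mounted from the host (image name is an assumption).
docker pull openfoam/openfoam6-paraview54
docker run -it --rm -v "$PWD":/home/openfoam -w /home/openfoam \
    openfoam/openfoam6-paraview54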
November 15, 2019, 19:03
#227
Senior Member
Join Date: May 2012
Posts: 552
Rep Power: 16
My experience is that WSL is almost as fast as native Linux for this benchmark. Frequent writes to disk should be avoided, though.
Here is some (old) information; WSL has seen improvements since this post: https://www.phoronix.com/scan.php?pa...900x-wsl&num=1
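For the disk-write part, the relevant knobs live in system/controlDict; a minimal sketch of turning writes down with foamDictionary (the interval value is just an example):
Code:
# Write fields rarely and keep only the two most recent time
# directories, so slow WSL file I/O stays out of the wall time.
foamDictionary -entry writeControl  -set timeStep system/controlDict
foamDictionary -entry writeInterval -set 500 system/controlDict
foamDictionary -entry purgeWrite    -set 2 system/controlDict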
November 18, 2019, 13:28
#228
New Member
Join Date: Jul 2015
Posts: 10
Rep Power: 11
Coming back with fresh results. Hardware build from above:
2 x AMD EPYC 7371, 16 x 16GB DDR4 Dual-Rank, Supermicro H11DSi
Software: OpenFOAM-6 (precompiled package from openfoam.org) in both cases
1) Ubuntu 18.04.3 LTS (natively installed)
Code:
# cores    Wall time (s)
------------------------
 1         838.71
 2         395.21
 4         193.89
 6         120.27
 8          88.43
12          64.74
16          48.43
20          44.14
24          38.32
28          37.14
32          32.70
Code:
# cores    Wall time (s)
------------------------
 1        1512.35
 2         815.88
 4         379.47
 6         292.36
 8         255.10
12         195.80
16         183.80
20         182.69
24         173.91
28         170.37
32         173.75
November 23, 2019, 11:46
#229
Member
Join Date: Jul 2011
Posts: 53
Rep Power: 15
Are you able to run Fluent benchmarks?
December 3, 2019, 10:13
Epyc Rome Benchmark
#230
New Member
Henrik
Join Date: Mar 2019
Posts: 8
Rep Power: 7
Hi!
Is that the only EPYC Rome benchmark available here? I am looking to get a new Linux workstation and am currently considering the Epyc Rome CPUs. By the way, it looks really promising from those results!
December 3, 2019, 14:58
#231
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Yes, these are the only Epyc Rome results we have so far.
Still, I don't think you can go wrong with them. Especially for a general-purpose workstation, they are a huge improvement over 1st gen due to the less complicated NUMA topology.
December 3, 2019, 16:08
#232
Member
Join Date: Jul 2011
Posts: 53
Rep Power: 15
I'm in contact with a compute service provider who may be willing to benchmark some Fluent cases for me on a dual 7302 setup. Will post back if I get it done.
December 4, 2019, 03:16
#233
New Member
Henrik
Join Date: Mar 2019
Posts: 8
Rep Power: 7
I agree flotus1 - maybe the doubled L3 cache plays a role too?
I wonder whether they will manage to increase the clock frequency in the future.
December 9, 2019, 12:10
Result with AMD 3960X
#234
New Member
Shui Pei
Join Date: Mar 2009
Posts: 27
Rep Power: 17
Here is my result. Newly configured workstation with Threadripper 3960X, 3.8 GHz 24C, 64 GB memory (4 channels)
Code:
# cores    Wall time (s)
------------------------
 1         550.49
 2         299.15
 4         161.65
 6         120.55
 8         101.56
12          99.13
16          93.74
20          93.71
24          93.65
December 9, 2019, 12:13
#235
New Member
Shui Pei
Join Date: Mar 2009
Posts: 27
Rep Power: 17
Hi, I am wondering why my 3960X is so much worse than the EPYC. Their designs should be close, and the 3960X does have 24 native cores.
Anything I may be missing?
December 9, 2019, 13:52
#236
New Member
Erik
Join Date: Jul 2019
Posts: 7
Rep Power: 7
Epyc has 8 memory channels even when using only one CPU. The older and newer Threadrippers run on 4 memory channels, except for the newest TRX platform, which can run on 8 channels. So the lack of memory channels causes worse results than on Epyc.
December 9, 2019, 13:57
#237
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49
Hence Epyc 2nd gen is the better choice compared to TR 3000, at least with CFD/FEA as the main applications. TR is better suited for the elusive "content creator" type.
Your TR 3960X results don't seem too far off. A TR 1950X on the first page of this thread is more than 50% slower running all cores. You could probably still optimize memory latency and timings - which memory are you using, at which frequency and latencies? And you could probably use NUMA mode for a slight performance increase in OpenFOAM.
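On the NUMA mode point: once the nodes are exposed, rank placement can be handled on the MPI side; a minimal sketch for Open MPI (the core count and solver are placeholders):
Code:
# Bind each MPI rank to one core and distribute ranks across NUMA
# nodes so every rank allocates memory on its local controller.
mpirun -np 24 --bind-to core --map-by numa simpleFoam -parallel
# Add --report-bindings to print the actual placement and
# double-check that the ranks landed where intended.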
December 10, 2019, 08:14
#238
New Member
Shui Pei
Join Date: Mar 2009
Posts: 27
Rep Power: 17
You can check the memory in the attachment. It is some ordinary DDR4 memory for daily usage, running under XMP at 3200 MHz.
May I know what the bandwidth and latency of a decent EPYC system look like? Thank you!
I tried to run three 8-core simulations simultaneously and all of them became slow. So the memory bottleneck limits the parallel performance of the Threadripper CPU, at least for OpenFOAM. Also, even though the motherboard has 8 DIMM slots, the maximum number of channels supported by 3rd-gen Threadripper is 4, so there won't be any improvement if I fill them up.
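Regarding the bandwidth question: the usual way to put a number on it is McCalpin's STREAM; a minimal sketch of building and running it (the array size just needs to dwarf the combined caches):
Code:
# Build and run the STREAM memory bandwidth benchmark; the Triad
# figure is the one to compare between Threadripper and Epyc.
wget https://www.cs.virginia.edu/stream/FTP/Code/stream.c
gcc -O3 -fopenmp -DSTREAM_ARRAY_SIZE=80000000 stream.c -o stream
OMP_NUM_THREADS=24 ./stream   # one thread per physical core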
December 10, 2019, 18:44
#240
Senior Member
Will Kernkamp
Join Date: Jun 2014
Posts: 372
Rep Power: 14
Geekbench 5 scores (Linux)
Code:
Processor           7402P   3960X
Single-Core Score    1035    1338
Multi-Core Score    22720   25199
Shui Pei, you see that in the Geekbench test the Threadripper does better than the comparable Epyc. However, that test is a little light on the memory access speed factor, which dominates the CFD results on this thread.