|
May 7, 2020, 23:40 |
Info needed: HP ML350 Gen9 and Dell T7910
|
#1 |
Member
EM
Join Date: Sep 2019
Posts: 59
Rep Power: 7 |
Hello,
I have been asking around for info on the title systems, but the response seems meager. I am looking into buying second hand one or the other of the above systems with 2x 16c core v3 and 256 gig @2133 ddr4. Both are out of production and there is the question if it were possible to install later versions of linux -- specifically Ubuntu 18.04 and CentOs 7.6. These two distros have tested (by myself) drivers for the radeon_vii cards which i use as opencl accelerators for doing dns. The basic question is: would ubuntu 18.04 and/or CentOS 7.6 install on either hp ml350 g9 or dell t7910? If yes, is it possible to know if the amd drivers for the radeon_vii will be working on them? btw, these two system were chosen because they have 4x x16 pci-3 slots with min 3 slots having x16 bandwidth. Thanks. -- Last edited by gnwt4a; May 7, 2020 at 23:41. Reason: typos |
|
May 8, 2020, 06:07 |
|
#2 |
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49 |
I don't have any experience with these two models in particular. But I have never had any trouble installing current Linux versions on somewhat recent OEM workstations.
I see a bigger problem with the power supply and cooling. Radeon VII cards have a TDP of 295W and require two 8-pin connectors each. I am not sure the power supplies in these workstations have the connectors and the power for 3 or even 4 of these cards. On top of that, the Radeon VII has an axial-fan cooler design that dumps all its heat into the case. The cooling of these workstations is not designed to handle that; instead, they rely on blower-style GPUs that exhaust heat directly through the back. |
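To put rough numbers on the power side (a back-of-the-envelope sketch in Python; the 295W GPU TDP is stated above, while the CPU TDP and base system draw are illustrative assumptions, not specs from this thread):

```python
# Rough worst-case power budget for a multi-GPU workstation build.
# GPU TDP (295 W for Radeon VII) is from the discussion above;
# CPU TDP and base system draw are illustrative assumptions.
GPU_TDP_W = 295
CPU_TDP_W = 145          # typical ballpark for a 16-core Xeon E5 v3 (assumed)
BASE_SYSTEM_W = 150      # board, RAM, drives, fans (rough guess)

def total_draw(n_gpus, n_cpus=2):
    """Worst-case draw if every component runs at full TDP simultaneously."""
    return n_gpus * GPU_TDP_W + n_cpus * CPU_TDP_W + BASE_SYSTEM_W

for n in (1, 2, 3, 4):
    print(f"{n} GPU(s): ~{total_draw(n)} W")
```

With these assumptions, 3 cards already land around 1325 W and 4 cards around 1620 W, which is why the stock PSU in an OEM workstation quickly becomes the limiting factor.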
|
May 8, 2020, 10:00 |
|
#3 |
Member
EM
Join Date: Sep 2019
Posts: 59
Rep Power: 7 |
Thanks. Helpful as usual.
Power and thermal issues can be managed by the user. In general, not all components run at full tilt at the same time: when the GPUs are busy, the load on the CPUs may not be large, and vice versa. By contrast, if the system gets bricked by software problems, that is very serious for end-of-life systems. Power usage and noise are also very difficult to deal with. With these uncertainties in mind, who would risk ~3K euro on such a system? -- |
|
May 8, 2020, 14:10 |
|
#4 |
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49 |
It is not just the total power consumption you have to keep an eye on by balancing power between CPUs and GPUs. A power supply has individual rails for each of these, with individual power limits, and it is hard to find exact specifications for these OEM parts. Let me give you a different example: an HP Z840 workstation with the largest PSU has a total of three 6-pin PCIe power connectors. In contrast to the usual specification for this type of connector, each can carry up to 150W, and can thus be split into two 6-pin connectors. If I wanted to connect a Radeon VII with its two 8-pin connectors, I would have to use two rails from the power supply. So without exceeding the specs and risking triggering OCP, one Radeon VII card is the maximum for this workstation, despite the PSU's 1450W rating @230V. You could probably undervolt the GPUs, or limit their power consumption, but I don't know how easy that is on Linux.
Long story short: with the intent of using several Radeon VII GPUs, I personally would not go with an OEM workstation. Instead, I would cobble something together from aftermarket parts, with a special focus on a large case with plenty of airflow and a decent power supply with all the 8-pin connectors these cards need. It would probably land below 3000$ with Xeon E5 v3 CPUs, excluding the GPUs. |
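The connector arithmetic in that Z840 example can be made explicit (a sketch using only the numbers stated above: three 150W rails, and a card needing two 8-pin plugs at 150W each):

```python
# How many Radeon VII cards fit on a PSU, counted by power rails.
# Z840 example from above: three 6-pin PCIe rails, each rated 150 W
# (above the usual 75 W 6-pin spec), each splittable into two plugs.
PSU_RAILS = 3
RAIL_WATTS = 150
CARD_CONNECTOR_WATTS = 2 * 150   # two 8-pin plugs at up to 150 W each

# Each card needs enough rails to cover its connector power budget.
rails_per_card = -(-CARD_CONNECTOR_WATTS // RAIL_WATTS)  # ceiling division -> 2
max_cards = PSU_RAILS // rails_per_card
print(max_cards)  # -> 1, matching the conclusion above
```

Despite the PSU's 1450W headline rating, the per-rail limits cap the build at a single card without adapters that exceed spec.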
|
May 11, 2020, 03:25 |
|
#5 |
Member
EM
Join Date: Sep 2019
Posts: 59
Rep Power: 7 |
The following prices are from ebay, in euros:
Asus Z10PE-D16 WS ~410, 2x E5-2698 v3 1500, 16x 16GB DDR4-2133 (PC4-17000, 2Rx4) RDIMM 900. That is ~2800 before you add a PSU and a case. So 3K for an OEM box sounds about right - unless you know better. |
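Summing those quoted ebay prices (straightforward arithmetic; the figures are the ones listed above, in euros):

```python
# Itemized used-parts prices quoted above, in euros.
parts = {
    "Asus Z10PE-D16 WS": 410,
    "2x Xeon E5-2698 v3": 1500,
    "16x 16GB DDR4-2133 RDIMM": 900,
}
subtotal = sum(parts.values())
print(subtotal)  # 2810, i.e. ~2800 before PSU and case
```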
|
May 11, 2020, 06:14 |
|
#6 |
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,427
Rep Power: 49 |
3000 dollars for an OEM workstation with these exact specs is about the usual market rate. I won't argue with that.
But I think we already established that you would run into some severe limitations with such a box, so it is not just about the spec sheet. You have not yet disclosed what specs you really need. Let's say you really benefit from the 16 cores per CPU of a Xeon E5-2698 v3. It can still be had for less than 500$ each on ebay. But comparing prices for these used Xeons, you will notice that the 2698 v3 has a horrible price/performance ratio: 12-core variants like the 2678 v3 start as low as 100$. The only reason to buy a system with Haswell Xeons in 2020 is price, and expensive CPUs defeat that purpose. The same goes for paying 3.50$/GB for used DDR4-2133 reg ECC; there are cheaper offers, even on ebay. And be careful which motherboard you choose: the PCIe slot spacing on the Asus Z10PE-D16 WS will only allow you to fit three dual-slot GPUs, while the D8 variant can fit four cards. |
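The price/performance point above can be illustrated with cost per core (a sketch; the prices are the rough ebay figures quoted in this post, and the core counts are the official specs for these CPUs):

```python
# Cost per core for two used Haswell Xeons at the ebay prices quoted above.
cpus = {
    "E5-2698 v3": {"cores": 16, "price_usd": 500},
    "E5-2678 v3": {"cores": 12, "price_usd": 100},
}
for name, spec in cpus.items():
    per_core = spec["price_usd"] / spec["cores"]
    print(f"{name}: {per_core:.2f} $/core")
```

The 12-core part comes out to roughly a quarter of the per-core price of the 16-core flagship, which is the "horrible price/performance ratio" argument in numbers.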
|
Tags |
dns, workstations |
|
|