Viability of Sun T5120 (UltraSPARC T2) for CFD

February 8, 2017, 18:18   #1
Leonux (New Member)
Hi everyone! I recently noticed that some old UltraSPARC servers can now be found online for under $200, so I decided to investigate whether one of these machines is a viable alternative for CFD calculations in terms of performance for the price. The Sun T5120 seemed like the obvious choice, both because these machines go for little money and because the T2 has already been preliminarily benchmarked on numerically intensive tasks by the RWTH Aachen HPC people:

https://doc.itc.rwth-aachen.de/displ...multiplication

The results shown there look somewhat promising, I think.

So I recently bought a Sun T5120 for around $200 and installed Debian 9 testing (sparc64 2016-11-25 ISO). The system has the following specs:

Code:
$ uname -a
Linux antares 4.5.0-2-sparc64-smp #1 SMP Debian 4.5.2-1 (2016-04-28) sparc64 GNU/Linux
This is a single-socket machine with an 8-core CPU (8 threads per core, 64 threads in total). Nonetheless, Debian detects the thread layout as 16 cores with 4 threads each, presumably because each T2 core contains two integer pipelines (4 threads per pipeline) that the kernel counts as separate cores:
Code:
$ lscpu
Architecture:          sparc64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Big Endian
CPU(s):                64
On-line CPU(s) list:   0-63
Thread(s) per core:    4
Core(s) per socket:    16
Socket(s):             1
Flags:                 sun4v
Lots of RAM for little money:
Code:
$ cat /proc/meminfo 
MemTotal:       33053792 kB
MemFree:        26013512 kB
MemAvailable:   32269728 kB
Buffers:          401384 kB
Cached:          5766488 kB
SwapCached:         2400 kB
Active:          4353928 kB
Inactive:        2068008 kB
Active(anon):     397096 kB
Inactive(anon):   155960 kB
Active(file):    3956832 kB
Inactive(file):  1912048 kB
Unevictable:           0 kB
Mlocked:               0 kB
SwapTotal:      16410384 kB
SwapFree:       16401288 kB
Dirty:              3840 kB
Writeback:             0 kB
AnonPages:        255800 kB
Mapped:           100000 kB
Shmem:            295176 kB
Slab:             532072 kB
SReclaimable:     479112 kB
SUnreclaim:        52960 kB
KernelStack:       10576 kB
PageTables:         6744 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:    32937280 kB
Committed_AS:    1271808 kB
VmallocTotal:   103075020800 kB
VmallocUsed:           0 kB
VmallocChunk:          0 kB
AnonHugePages:         0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       8192 kB
PCI devices:
Code:
$ lspci
02:00.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
03:01.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
03:02.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
03:08.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
03:09.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
04:00.0 PCI bridge: PLX Technology, Inc. PEX 8517 16-lane, 5-port PCI Express Switch (rev ac)
05:01.0 PCI bridge: PLX Technology, Inc. PEX 8517 16-lane, 5-port PCI Express Switch (rev ac)
05:02.0 PCI bridge: PLX Technology, Inc. PEX 8517 16-lane, 5-port PCI Express Switch (rev ac)
05:03.0 PCI bridge: PLX Technology, Inc. PEX 8517 16-lane, 5-port PCI Express Switch (rev aa)
06:00.0 PCI bridge: PLX Technology, Inc. PEX8112 x1 Lane PCI Express-to-PCI Bridge (rev aa)
07:00.0 USB controller: NEC Corporation OHCI USB Controller (rev 43)
07:00.1 USB controller: NEC Corporation OHCI USB Controller (rev 43)
07:00.2 USB controller: NEC Corporation uPD72010x USB 2.0 Controller (rev 04)
08:00.0 Ethernet controller: Intel Corporation 82571EB Gigabit Ethernet Controller (rev 06)
08:00.1 Ethernet controller: Intel Corporation 82571EB Gigabit Ethernet Controller (rev 06)
09:00.0 Ethernet controller: Intel Corporation 82571EB Gigabit Ethernet Controller (rev 06)
09:00.1 Ethernet controller: Intel Corporation 82571EB Gigabit Ethernet Controller (rev 06)
0a:00.0 SCSI storage controller: LSI Logic / Symbios Logic SAS1068E PCI-Express Fusion-MPT SAS (rev 04)
0b:00.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
0c:01.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
0c:02.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
0c:08.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
0c:09.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
0c:0a.0 PCI bridge: PLX Technology, Inc. PEX 8533 32-lane, 6-port PCI Express Switch (rev aa)
The compiler:
Code:
$ gcc --version
gcc (Debian 6.3.0-5) 6.3.0 20170124
Copyright (C) 2016 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
I built OpenFOAM 4.1 from source with the following compiler flags:
Code:
-O3 -m64 -mcpu=niagara2 -mvis2 -ftree-vectorize
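In case it helps anyone reproduce the build: a minimal sketch of where those flags would live in the build system, assuming a wmake rules directory was adapted for this architecture (the directory name linuxSparc64Gcc is my guess; stock OpenFOAM 4.1 does not ship SPARC rules):
Code:
# wmake/rules/linuxSparc64Gcc/c++Opt -- hypothetical path and contents
c++DBUG    =
c++OPT     = -O3 -m64 -mcpu=niagara2 -mvis2 -ftree-vectorize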
-- MY FIRST TEST --
I put together this test quickly using only duct tape, glue, and paper clips. If this for-fun project gets enough interest from some of y'all, I may come up with a better test case.

The test geometry is a pipe with diameter d = 10 mm and length L = 500 mm. The mesh was generated with gmsh and has 139944 cells. Here is part of the gmshToFoam command output:
Code:
Cells:
    total:139944
    hex  :0
    prism:0
    pyr  :0
    tet  :139944

CellZones:
Zone    Size
    0    139944

Skipping tag  at line 207727
Patch 0 gets name patch0
Patch 1 gets name patch1
Patch 2 gets name patch2
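For anyone wanting to reproduce the mesh, here is a hypothetical sketch of the geometry in gmsh's .geo language. It assumes the OpenCASCADE kernel; the surface IDs in the physical groups are kernel-dependent and should be checked in the gmsh GUI, and the characteristic length would need tuning to land near 140k tets. Leaving the physical groups unnamed appears to be what makes gmshToFoam fall back to the patch0/patch1/patch2 names seen above:
Code:
// pipe.geo -- hypothetical sketch, not the original file
SetFactory("OpenCASCADE");                  // assumes a gmsh build with the OCC kernel
Cylinder(1) = {0, 0, 0, 0.5, 0, 0, 0.005};  // axis along x: L = 0.5 m, r = 5 mm
Mesh.CharacteristicLengthMax = 0.002;       // tune to approach ~140k cells
Physical Surface(1) = {1};                  // surface IDs depend on the kernel
Physical Surface(2) = {2};
Physical Surface(3) = {3};
Physical Volume(4)  = {1};
The conversion would then be something like:
Code:
gmsh -3 pipe.geo -format msh2 -o pipe.msh   # msh2 is the flavour gmshToFoam reads
gmshToFoam pipe.msh
Note that gmshToFoam gives every patch the generic patch type, so the wall either needs its type changed in constant/polyMesh/boundary or can simply be given a no-slip fixedValue condition (see the boundary-condition sketch further down).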
Here is part of my controlDict file:
Code:
application     icoFoam;
startFrom       startTime;
startTime       0;
stopAt          endTime;
endTime         150;
deltaT          0.1;
writeControl    timeStep;
writeInterval   1;
purgeWrite      0;
writeFormat     ascii;
writePrecision  6;
writeCompression off;
timeFormat      general;
timePrecision   6;
runTimeModifiable true;
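(With deltaT 0.1 and endTime 150 this is 1500 time steps, every single one of which gets written to disk in ASCII, so some of the wall-clock time below is presumably I/O rather than solver work.)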
... and decomposeParDict:
Code:
numberOfSubdomains 64; 
method          simple; 
simpleCoeffs
{
    n               (64 1 1);
    delta           0.001;
}
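(The simple method with n (64 1 1) just slices the mesh into 64 slabs along x, one per thread, assuming the pipe axis runs along x.)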
... and fvSolution:
Code:
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-06;
        relTol          0.05;
    }

    pFinal
    {
        $p;
        relTol          0;
    }

    U
    {
        solver          smoothSolver;
        smoother        symGaussSeidel;
        tolerance       1e-05;
        relTol          0;
    }
}

PISO
{
    nCorrectors     2;
    nNonOrthogonalCorrectors 2;
    pRefCell        0;
    pRefValue       0;
}
Boundary conditions are what you would expect for a typical pipe-flow problem.
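For concreteness, here is a minimal sketch of what the 0/U and 0/p files would look like. I am assuming patch0 is the inlet, patch1 the outlet, and patch2 the wall (the actual assignment depends on the physical-group order in the .geo file), and an inlet velocity of 0.1 m/s purely for illustration; FoamFile headers are omitted:
Code:
// 0/U (sketch)
dimensions      [0 1 -1 0 0 0 0];
internalField   uniform (0 0 0);
boundaryField
{
    patch0 { type fixedValue;   value uniform (0.1 0 0); }  // inlet (assumed)
    patch1 { type zeroGradient; }                           // outlet
    patch2 { type fixedValue;   value uniform (0 0 0); }    // no-slip wall
}

// 0/p (sketch; kinematic pressure, as icoFoam expects)
dimensions      [0 2 -2 0 0 0 0];
internalField   uniform 0;
boundaryField
{
    patch0 { type zeroGradient; }
    patch1 { type fixedValue;   value uniform 0; }          // fixed outlet pressure
    patch2 { type zeroGradient; }
}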

-- RESULTS --

This is how I ran the case on the T5120:
Code:
mpirun -np 64 icoFoam -parallel
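For completeness, a sketch of the standard OpenFOAM parallel workflow around that command (numberOfSubdomains and the n vector must match the -np count, so they need adjusting for the 32- and 16-thread runs):
Code:
decomposePar                      # split the case per decomposeParDict
mpirun -np 64 icoFoam -parallel   # -np must equal numberOfSubdomains
reconstructPar                    # stitch the processor results back together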
Run times on the T5120:
Code:
threads   run time
64        403 s
32        473 s
16        743 s
For comparison, the run time on my laptop (Intel T7300) was 1006 seconds using 2 threads.
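So the fully loaded T5120 comes in at roughly 2.5x the speed of the two-core laptop on this case (403 s vs 1006 s), and even the 16-thread run beats it.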

Hopefully someone (perhaps a grad student with little funding) finds this information useful and can make a more educated decision when considering buying a now-cheap T5120 for CFD.

February 9, 2017, 06:36   #2
Hrvoje Jasak (hjasak, Senior Member, London, England)
Hi,

It actually looks much better than I thought. I would still buy low-core-count, high-clock Intel chips, but that does not take the budget constraint into account.

In any case, it is really cool to see FOAM on a Sun again, since a lot of the early-1990s work was done on Sun (and SGI) workstations. Oh, the good old times...

Good luck,

Hrv
__________________
Hrvoje Jasak
Providing commercial FOAM/OpenFOAM and CFD Consulting: http://wikki.co.uk

February 10, 2017, 13:42   #3
Paulo Vatavuk (vatavuk, Senior Member, Campinas, Brasil)
Hi Dr. Jasak,

Can you explain why a low core count is good? Isn't a Core i7 better than an i5?

Best Regards,
Paulo