Random hardware

Discussion in 'General Hardware' started by cyklondx, May 19, 2019.

  1. cyklondx

    cyklondx Limp Gawd

    Mar 19, 2018
I created this thread to show off some random computer hardware - retro and new, desktop and server.

    It will likely contain my opinions on the hardware, photos, screenshots, tests, problems I've encountered, and what I liked and didn't like.

    Note1: Most of the cases will be in racks, and they aren't for "show" - I only care about good/decent temps for 24x7x365 ops at 100% utilization.
    Note2: I mainly use them to play around, run BOINC projects and 3D graphics software (3dsmax, Vue, Poser, etc.), and sometimes games.
    Note3: 1U, 2U, 4U, 6U describes the size of the rack case. (use duckduckgo/google for examples)

    To start, let me walk you through the current hardware I'm playing with, from newest to oldest:


    Mobo = MSI x470 Pro Gaming Carbon
    CPU = Ryzen 1700x @ 3.9GHz (water loop #1)
    RAM = 16GB DDR4 3200MHz
    GPU = Fury-X @ PCI-E 3.0 x8
    Drives =
    2x NVMe Samsung 960 in RAID0 @ PCI-E 3.0 x8 ~ (not using m.2 slot, just pci-e card)
    4x Intel SSDSC2BA400G3 in RAID0 @ SATA (@mobo)
    PSU = Platimax 850W


    Mobo = MSI x370 Pro Gaming Carbon
    CPU = Ryzen 1700x @ 3.7GHz (water loop #1)
    RAM = 16GB DDR4 3200MHz
    GPU = Evga Nvidia 980Ti Hydro
    Drives =
    1x NVME Samsung 960 @ M.2 PCI-E 3.0 x4
    1x Mushkin MKNSSDEC240GB
    1x Kingston SV300S37A240G
    2x Seagate ST4000DM000-1F2168 (RAID1)
    1x WDC WD20EURS-73S48Y0
    PSU = Platimax 850W


    Mobo = Gigabyte x79 UD3
    CPU = Intel 4930k @ 4.2GHz (water loop #2) | *swap* 4820k @ 5GHz
    RAM = 32GB DDR3 1600MHz
    GPU = EVGA NV 780Ti SC, Gigabyte Radeon 290x (non-reference)
    Drives =
    2x Patriot SSD 60GB
    PSU = OCZ 700W 80+ Bronze


    4U (IBM xSeries 225)
    Mobo = IBM E7505 Master-LS2 MSI-9121
    CPU = 2x Intel Xeon @ 3.2GHz Gallatin
    RAM = 8GB DDR1 266MHz
    GPU = ATi Radeon x1950 Pro 512MB @ AGP 8x | Palit NV 7600GS 256MB | ATi FireGL 9500 128MB || (Swap)
    Drives =
    1x SCSI Ultra320 Seagate 300GB 15k
    5x SCSI Ultra320 MIX 146GB 15k (RAID 1)
    1x Generic IBM CD-ROM
    1x Generic IBM Floppy
    PSU = IBM 425W

    I also have plenty of hardware just laying about which could be used to build more PCs, but they have no use for me - and my rack is full anyway.

    In the following posts I will describe and post the latest shots of what I'm currently playing with.
  2. cyklondx

    Recent update: upgrading #4 with new CPUs, an audio card, and 5x 146GB SCSI drives (got them for free from work).
    #Note: Once I install an OS on this box I'll benchmark the changes, and obviously I'll enjoy the sound quality :^)

    Replacing the 2x Xeon 2.6GHz Prestonia with the 3.2GHz Gallatin CPUs.
    Left: Prestonia, Right: Gallatin

    As you can see, here is a comparison of how the look changed (they are both compatible, same socket). I personally love how the Prestonia looked.
    Note: Some Prestonia models, like the SL6VM, already looked like the Gallatin. They weren't as flashy as this one.

    I bought a pair of Gallatin 3.2GHz CPUs for $14 and the seller sent me 4, since he wasn't sure they all worked. One had a missing pin (pictured below with pins up), and one didn't want to POST.
    2 were functional *yey!


    Audio card
    Sound Blaster Audigy 2 Zs SB0350 PCI

    The 5x SCSI Ultra320 drives came from DELL servers and had Dell tray caddies, so I had to get new ones from eBay. (Sorry for the lack of pictures of the actual drives - they are already formatting, but here are the trays I removed.)
    Of the IBM trays that came in, one was broken.

    Broken IBM tray

    Here are the Dell trays I removed (got 4 of them - if anyone wants them let me know, I'll even send them for free.)

    Here's the SCSI controller BIOS, where we can see all of the drives.

    Formatting them is a pain - on average it takes 1h per drive, and the 300GB one took 1h 48m. I cannot create the virtual disk (RAID) with the IBM ISO until all drives are formatted in the BIOS :<

    While I wait for the drives to format, I'm drinking a new soda I bought from the Polish shop near me. *It's very good.

    Here's how it looks with all the drive bays filled (sweet, and heavy as fffu)

    I'll post more info once the formatting is done and I decide which OS I want to install (Windows 2000, XP SP3, or Server 2003).


    Last edited: May 19, 2019
  3. cyklondx

    Done with the formatting (around 5h for all drives); the installation went smoothly.

    At first I installed Server 2003. It worked fine, but there were problems with DX8 - not to mention old-browser problems with IE5: you cannot make any SSL connections. Thankfully 03 had Ethernet drivers. You cannot run Windows Update until you manually download the service packs on another computer and then install IE6 and IE7; only then can you start the actual update process.
    It took about 5 reboots of updates after that, only to find that I needed to install .NET 2.0 x86 (which I couldn't find). In the end I installed the .NET 2.0 service packs 1 and 2, which thankfully worked.
    After installing Catalyst drivers for the Radeon x1950, I proceeded to install the old 3DMarks. To my dismay, 3DMark 01 wouldn't run - nothing using DX8 would run for some odd reason.
    Next I took steps to make the system more like what I typically use: Daemon Tools 1.4 (an older version that would work with it)... well, RIP - the system didn't want to come back up.
    I tried Safe Mode and even debugging mode, and was able to remove Daemon Tools, but I couldn't get the system to boot anymore (I think SCSI problems).

    Right now I have WinXP with all updates; all 3DMarks work just fine, with very similar issues otherwise - and one big one... no drivers for the network interface.

    I'll attach some benchmarks etc.

    On this specific workstation, my to-do list would be acquiring:

    another set of smaller speakers
    some cooling for the drives (boy do they get hot! while running a small bench on the drives they reached 91°C within 30 min) - I'll likely have to cut holes in the drive cage and mount a fan in front
    better cooling on the Xeons; a single 120mm fan for those 2 is not enough ;3


    I've turned off the 2 fans that were previously supposed to cool them down - they were too loud. (And the heatsinks were not made for this board; they were ones I found laying around that happened to fit.)
    While taking a look at it - and it's almost time to finish up this retro PC - I noticed that the fan on the x1950 sits a bit too high, so I won't be able to close the case. I'll likely look for an AGP riser card and mount the card vertically instead. That would also help the airflow (it wouldn't be blowing hot air onto the mobo.)
  4. cyklondx

    May be offtopic, but I spoke with an Oracle rep from their server/cloud infrastructure (at the DataStax Accelerate conference). As we spoke I found out that they switched to AMD Epyc for their whole cloud infrastructure. It also seems HP is considering switching completely to AMD by 2020.

    Adobe uses Puppet and Cassandra 3.
    Their presentation was still hand-drawn and the squares were not straight, rofl.
    They use Amazon Linux 2017.

    Amazon are serial SSD/NVMe killers, as they churn out 6-8TB of data writes every day
    and only run TRIM at the end of the day.
    (They complained that after a couple of days they were already impacted by 10x slower IO.)

    //Just a note to those interested.
    Last edited: May 23, 2019
  5. cyklondx

    Some random speculation on the Zen2 Epyc CPU | just speculation.

    14nm Epyc 7601 (32 cores, 4 dies)
    Die = 852 mm^2 total (4x 213 mm^2)
    Transistors = 4,800 million per die, x4 = 19,200 million total
    Density = ~22,535 kT/mm^2 (19,200M / 852 mm^2)

    45W per chiplet @ 3.2GHz

    12nm TR 2990WX (32 cores, 4 dies) ~ assuming this data is correct.
    Die = 852 mm^2
    Transistors = 19,200 million
    Density = ~22,535 kT/mm^2

    62W per chiplet @ 4.2GHz -> +38% (W), +31% (clock)

    7nm+14nm Epyc X (64 cores, 8 chiplets) ~ transistor count and density are calculated without the IO die
    Die = 1040 mm^2 total (75 mm^2 x8 = 600 mm^2 compute + 440 mm^2 IO) | (compute is 42% smaller than the total)
    Transistors = ? (@ 39,968 kT/mm^2 over 600 mm^2) = ~23,980 million (+25% over 12nm)
    Transistor error range = ~25,599 million (+6.7%) | ~21,000 million (-12%)
    Density = ? (39,968 kT/mm^2)
    Density error range = ~42,666 kT/mm^2 (if each chiplet = 3.2B) | expected ~35,000 kT/mm^2 (+21% over expected)

    Total wattage = should be around ~300W | ~33W per component (8 chiplets + 1 IO die)
    Max approx = 350W | ~39W per component
    Low approx = 240W | ~27W per component

    I am unsure about the IO die, but it may take more power than a CPU chiplet (and likely does). If that's the case, we may be looking at +50W, making the max approx 400W.
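    The figures above are simple ratios, so here's a minimal Python sketch of the same arithmetic. The 4x 213 mm^2 die split for the 14nm part and the 8-chiplets-plus-IO-die power split are my assumptions, taken from the numbers in this post:

    ```python
    # Back-of-the-envelope check of the chiplet figures above.
    # Assumptions (mine): 4x 213 mm^2 dies for the 14nm part, and this
    # thread's 39,968 kT/mm^2 guess for the 7nm compute chiplets.

    def density_kt_mm2(transistors_million: float, area_mm2: float) -> float:
        """Transistor density in thousands of transistors per mm^2."""
        return transistors_million * 1000 / area_mm2

    # 14nm Epyc 7601: 4.8B transistors per die, 4 dies of 213 mm^2
    d14 = density_kt_mm2(4 * 4800, 4 * 213)
    print(f"14nm density: {d14:,.0f} kT/mm^2")        # ~22,535

    # 7nm estimate: assumed density over the 600 mm^2 of compute silicon
    t7 = 39968 * 600 / 1000                           # million transistors
    print(f"7nm estimate: {t7:,.0f}M transistors")    # ~23,981

    # Package power split over 8 chiplets + 1 IO die
    for total_w in (240, 300, 350):
        print(f"{total_w}W total -> ~{total_w / 9:.0f}W per component")
    ```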


    based on what we see on GPUs (taken within error):

    Die Size = 232 mm^2
    Transistors = 5,700 million
    Density = 24,568 kT/mm^2

    Die Size = 495 mm^2 (+113% vs above)
    Transistors = 12,500 million (+119%)
    Density = 25,252 kT/mm^2

    Die Size = 331 mm^2 (-33% vs above)
    Transistors = 13,230 million (+5.8%)
    Density = 39,969 kT/mm^2 (+58.3%)
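    The GPU densities and deltas can be recomputed directly from die size and transistor count. A small Python sketch (identifying the three dies as Polaris 10, Vega 10 and Vega 20 is my guess from the sizes and transistor counts, not something stated in this thread):

    ```python
    # Recompute the GPU density figures above. Guess (mine): the three
    # dies are Polaris 10 (14nm), Vega 10 (14nm) and Vega 20 (7nm).
    gpus = [
        ("232 mm^2", 232, 5700),    # transistor counts in millions
        ("495 mm^2", 495, 12500),
        ("331 mm^2", 331, 13230),
    ]

    prev = None
    for name, area, mt in gpus:
        density = mt * 1000 / area  # kT/mm^2
        delta = "" if prev is None else f" ({density / prev - 1:+.1%} vs previous)"
        print(f"{name}: {density:,.0f} kT/mm^2{delta}")
        prev = density
    ```

    The jump in the last step (~+58% density at 7nm over 14nm) is the scaling factor the Epyc speculation above leans on.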
    Last edited: May 30, 2019