SLI is Dead. Well, Try 1,280 GPUs!!!

Discussion in 'Video Cards' started by cybereality, Mar 18, 2019.

  1. Master_shake_

    Master_shake_ [H]ardForum Junkie

    Messages:
    8,682
    Joined:
    Apr 9, 2012
    A kidney, a lung, and a left testicle ought to be enough.
     
  2. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,391
    Joined:
    Mar 22, 2008
    They can take my whole body. I bet 1,280 RTX cards is enough to upload my brain...
     
    Last edited: Mar 19, 2019
    Master_shake_ likes this.
  3. Dodge245

    Dodge245 Limp Gawd

    Messages:
    181
    Joined:
    Oct 8, 2018
    Playing Space Invaders for eternity sounds like hell to me.

    On a serious note, you know why this exists: for game streaming companies like Shadow, and for Nvidia's own service.
     
  4. Nausicaa

    Nausicaa [H]Lite

    Messages:
    120
    Joined:
    Mar 9, 2015
    Today Google announced they are using AMD with Vulkan and Linux. Glad they chose something more open, rather than Nvidia, which doesn't allow anything but Teslas in data centers.
     
    Dayaks and cybereality like this.
  5. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,391
    Joined:
    Mar 22, 2008
    Big win for AMD. And for Linux and Vulkan as well.
     
  6. XoR_

    XoR_ Gawd

    Messages:
    809
    Joined:
    Jan 18, 2016
    Funny how well this describes our universe...
     
    RogueTadhg, Dodge245 and cybereality like this.
  7. pippenainteasy

    pippenainteasy Gawd

    Messages:
    571
    Joined:
    May 20, 2016
    It's 40 GPUs per server. So maybe you can play Crysis 3 at 16K.
     
  8. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,391
    Joined:
    Mar 22, 2008
    All you need is one of those new LG 8K OLEDs and Nvidia DSR to 16K.
     
  9. Well, that could net me about 80-90 million work units a day on Distributed.net's RC5-72 challenge...

    Using the low-end figure.
    It'd take me from where I'm at (#229 overall currently) and rocket me up to #60 in one day.
    It'd take about 2 weeks to hit #1.
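    A quick sanity check on that estimate (a rough sketch; the ~3 GKeys/s per-GPU RC5-72 rate and the 2^32-keys-per-work-unit convention are my assumptions, not figures from this thread):

    # Back-of-envelope check of the 80-90 million work units/day estimate.
    # Assumed: ~3 GKeys/s per GPU, one stats work unit = 2**32 keys.
    GPUS = 1280
    KEYS_PER_SEC_PER_GPU = 3e9
    KEYS_PER_WORK_UNIT = 2**32
    SECONDS_PER_DAY = 86_400

    keys_per_day = GPUS * KEYS_PER_SEC_PER_GPU * SECONDS_PER_DAY
    units_per_day = keys_per_day / KEYS_PER_WORK_UNIT
    print(f"~{units_per_day / 1e6:.0f} million work units per day")  # ~77 million

    With those assumptions you land in the high 70 millions per day, which is in the same ballpark as the quoted 80-90 million.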
     
    Last edited by a moderator: Mar 19, 2019
  10. Nausicaa

    Nausicaa [H]Lite

    Messages:
    120
    Joined:
    Mar 9, 2015
    Input lag and latency issues, though.
     
    XoR_ likes this.
  11. Dodge245

    Dodge245 Limp Gawd

    Messages:
    181
    Joined:
    Oct 8, 2018
    That's good to hear, and a bit surprising. Logic would suggest that a datacentre would want the best performance per watt, and I didn't think the Radeon cards did so well in terms of power usage. On the other hand, Google is very pro open source, so their choice does make sense, and I would also imagine Google may start to provide input on and expand Vulkan.
     
  12. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,391
    Joined:
    Mar 22, 2008
    They are using custom AMD chips, so it's possible the power usage is better than that of the consumer cards.
     
  13. Kajun614

    Kajun614 Limp Gawd

    Messages:
    147
    Joined:
    Jul 7, 2016
    I'll take two... checks pockets and only finds a quarter. Hmm.
     
  14. BoiseTech

    BoiseTech Limp Gawd

    Messages:
    316
    Joined:
    Mar 15, 2018
  15. XoR_

    XoR_ Gawd

    Messages:
    809
    Joined:
    Jan 18, 2016
    Not if single-frame rendering were used.
    Since each ray is calculated separately, they could assign each GPU a small portion of the rays of a single frame (e.g. an 8K frame), run everything in parallel, and achieve real-time path tracing with many samples per pixel while still keeping normal latency.
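    A minimal sketch of that idea (my own illustration; the 1,280-GPU count is from the article, but the even row-major split and the function name are hypothetical):

    # Split the primary rays of one 8K frame evenly across the GPUs so each
    # device traces only its own slice, then composite the slices back into
    # a single frame.
    WIDTH, HEIGHT = 7680, 4320          # 8K frame
    NUM_GPUS = 1280
    TOTAL_PIXELS = WIDTH * HEIGHT       # ~33.2 M primary rays per sample

    PIXELS_PER_GPU = TOTAL_PIXELS // NUM_GPUS   # 25,920 rays each

    def ray_range_for_gpu(gpu_id):
        """Half-open range of pixel indices (row-major) traced by this GPU."""
        start = gpu_id * PIXELS_PER_GPU
        end = TOTAL_PIXELS if gpu_id == NUM_GPUS - 1 else start + PIXELS_PER_GPU
        return start, end

    Because every GPU works on the same frame at the same time, the per-frame latency stays close to that of a single render instead of growing with the GPU count (unlike AFR-style SLI).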
     
    cybereality likes this.
  16. NoOther

    NoOther [H]ardness Supreme

    Messages:
    6,479
    Joined:
    May 14, 2008
    Not sure what you are trying to suggest here. I know tons of mixed datacenters. Also, Nvidia has a lot of resources to put toward open GPU development. I think you may be confusing different issues.
     
    Last edited: Mar 20, 2019
    Factum likes this.
  17. NoOther

    NoOther [H]ardness Supreme

    Messages:
    6,479
    Joined:
    May 14, 2008
    Factum and elite.mafia like this.
  18. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,391
    Joined:
    Mar 22, 2008
    Yeah, I realize this isn't SLI. It was a joke.
     
  19. Dodge245

    Dodge245 Limp Gawd

    Messages:
    181
    Joined:
    Oct 8, 2018
    whoosh!
     
  20. Nausicaa

    Nausicaa [H]Lite

    Messages:
    120
    Joined:
    Mar 9, 2015
    Nvidia makes proprietary hardware that solves problems (G-Sync was great but expensive; now they are supporting FreeSync too), while AMD beats it in raw power. The Radeon VII is very powerful; it's made for workstations but it's sold as a gaming GPU.
     
  21. Zepher

    Zepher [H]ipster Replacement

    Messages:
    16,763
    Joined:
    Sep 29, 2001
    I wonder if you get a discount on the RTX 6000 cards if you buy 1,280 of them; that's about $5 million at list price just for the cards.
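    For what it's worth, the arithmetic works out if you assume roughly $4,000 per card at list (an assumed price, not a figure from the thread):

    # 1,280 cards at an assumed ~$4,000 list price each
    CARDS = 1280
    ASSUMED_LIST_PRICE_USD = 4_000
    print(f"${CARDS * ASSUMED_LIST_PRICE_USD:,}")   # $5,120,000 -> about $5 million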
     
  22. MyNameIsAlex

    MyNameIsAlex Limp Gawd

    Messages:
    313
    Joined:
    Mar 10, 2019
    The most logical conclusion I can draw is that their electricity is so cheap that it matters less than the hardware depreciation.

    It would make sense to build a datacenter next to a coal plant or something, tbh.
     
  23. Spoom

    Spoom n00b

    Messages:
    53
    Joined:
    Mar 1, 2016
  24. raz-0

    raz-0 [H]ardness Supreme

    Messages:
    4,503
    Joined:
    Mar 9, 2003
    It's not just that. Doing some quick back-of-the-envelope math, it's more modular and cost-competitive with some serious HPC offerings. And with tensor cores it's probably more broadly useful, especially if you can't afford the full deployment, in which case the competition falls off greatly. The only thing that's hard to determine is the power and HVAC cost per rack.
     
  25. Factum

    Factum [H]ard|Gawd

    Messages:
    1,744
    Joined:
    Dec 24, 2014
    GPUs are getting more and more common in our datacenters.
    We are looking into 2U servers with 4 Teslas per server.
    (You cannot just slam 21 servers in a 42U rack... power and cooling need to be up to par; see the rough numbers sketched below.) But I am seeing a shift... GPUs are moving in, CPUs are losing importance... no wonder Intel is going into GPUs... do or die.

    The coming “battle” is going to be fun.
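    To illustrate the rack point with some rough numbers (all of them assumptions, not figures from this post):

    # Why a 42U rack doesn't simply get filled with 2U GPU servers:
    # power, not rack units, becomes the limit.
    SERVER_RU = 2
    GPUS_PER_SERVER = 4
    GPU_WATTS = 300                 # assumed Tesla-class board power
    BASE_SERVER_WATTS = 800         # assumed CPUs, RAM, fans, PSU losses
    RACK_RU = 42
    RACK_POWER_BUDGET_W = 15_000    # assumed per-rack power/cooling budget

    server_watts = GPUS_PER_SERVER * GPU_WATTS + BASE_SERVER_WATTS   # 2,000 W
    fit_by_space = RACK_RU // SERVER_RU                              # 21 servers
    fit_by_power = RACK_POWER_BUDGET_W // server_watts               # 7 servers
    print(f"space allows {fit_by_space} servers, power allows {fit_by_power}")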