SLI is Dead. Well, Try 1,280 GPUs!!!

Playing space invaders for eternity sounds like hell to me.

On a serious note, you know why this exists: for game streaming companies like Shadow, and for Nvidia's own service.
Today Google announced they were using AMD with Vulkan and Linux. Glad they chose something more open rather than Nvidia, which doesn't allow anything but Teslas in data centers.
 
Well, that could net me about 80-90 million work units a day on Distributed.net's RC5-72 challenge...

Using the low-end figure.
It'd take me from where I am (#229 overall currently) and rocket me up to #60 in one day.
It'd take about 2 weeks to hit #1.
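For what it's worth, here's a rough sanity check working backwards from that quoted 80-90 million units/day; the per-card rate below is just the implied number, not a measured distributed.net benchmark:

```python
# Back-of-envelope check on the RC5-72 throughput claim.
# The per-card rate is simply implied by the quoted total; it is an
# assumption for illustration, not a measured benchmark.

gpus = 1280
claimed_low, claimed_high = 80e6, 90e6   # work units per day, as quoted above

implied_per_gpu_low = claimed_low / gpus     # ~62,500 units/day per card
implied_per_gpu_high = claimed_high / gpus   # ~70,300 units/day per card

print(f"Implied per-GPU rate: {implied_per_gpu_low:,.0f} - {implied_per_gpu_high:,.0f} units/day")
```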
 
Today Google announced they were using AMD with Vulkan and Linux. Glad they chose something more open rather than Nvidia, which doesn't allow anything but Teslas in data centers.

That's good to hear, and a bit surprising. Logic would suggest that a datacentre would want the best performance per watt, and I didn't think the Radeon cards did so well in terms of power usage. On the other hand, Google is very pro open source, so their choice does make sense, and I would also imagine Google may start to provide input on and expand Vulkan.
 
They are using custom AMD chips, so it's possible power usage is better than on the consumer cards.
 
Input lag and latency issues, though.
Not if single-frame rendering were used.
Since each ray is calculated separately, they could assign each GPU a small portion of the rays of a single frame, e.g. an 8K one, run everything in parallel, and achieve real-time path tracing with many samples per pixel while still getting normal latency.
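As a rough illustration of that kind of split (the 8K frame size and 1,280-GPU count are the ones from the thread; the contiguous-block assignment and 64 samples per pixel are just assumptions for the sketch, not how any particular renderer does it):

```python
# Hypothetical sketch: splitting one 8K frame's primary rays across many GPUs.
# Frame size and GPU count come from the thread; the contiguous-block
# assignment and 64 samples per pixel are illustrative assumptions only.

WIDTH, HEIGHT = 7680, 4320        # 8K UHD frame
NUM_GPUS = 1280
SAMPLES_PER_PIXEL = 64            # assumed path-tracing sample count

total_pixels = WIDTH * HEIGHT                       # 33,177,600
pixels_per_gpu = total_pixels // NUM_GPUS           # 25,920 (divides evenly here)
rays_per_gpu = pixels_per_gpu * SAMPLES_PER_PIXEL   # primary rays per GPU per frame

def pixel_range_for_gpu(gpu_id: int) -> range:
    """Contiguous block of row-major pixel indices assigned to one GPU."""
    start = gpu_id * pixels_per_gpu
    return range(start, start + pixels_per_gpu)

print(f"{pixels_per_gpu:,} pixels and {rays_per_gpu:,} primary rays per GPU per frame")
print(f"GPU 0 owns pixels {pixel_range_for_gpu(0).start}..{pixel_range_for_gpu(0).stop - 1}")
```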
 
Today Google announced they were using AMD with Vulkan and Linux. Glad they chose something more open rather than Nvidia, which doesn't allow anything but Teslas in data centers.

Not sure what you are trying to suggest here. I know tons of mixed datacenters. Also, Nvidia puts a lot of resources into open GPU development. I think you may be confusing different issues.
 
That's good to hear, and a bit surprising. Logic would suggest that a datacentre would want the best performance per watt, and I didn't think the Radeon cards did so well in terms of power usage. On the other hand, Google is very pro open source, so their choice does make sense, and I would also imagine Google may start to provide input on and expand Vulkan.
Nvidia makes proprietary hardware that solves problems (G-Sync was great but expensive; now they are supporting FreeSync too), while AMD beats it in raw power. The Radeon VII is very powerful; it's made for workstations but it's sold as a gaming GPU.
 
I wonder if you get a discount on the RTX 6000 cards if you buy 1,280 of them; that's $5 million at list price just for the cards.
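Working backwards from that $5 million figure (the per-card price below is just what the quoted total implies, not an official list price):

```python
# Implied per-card price from the figures in the post above;
# not an official Quadro RTX 6000 list price.
num_cards = 1280
quoted_total = 5_000_000          # USD, as quoted above

implied_per_card = quoted_total / num_cards
print(f"~${implied_per_card:,.0f} per card")   # roughly $3,900
```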
 
That's good to hear, and a bit surprising. Logic would suggest that a datacentre would want the best performance per watt, and I didn't think the Radeon cards did so well in terms of power usage. On the other hand, Google is very pro open source, so their choice does make sense, and I would also imagine Google may start to provide input on and expand Vulkan.

The most logical conclusion I can draw is that their electricity is so cheap that it matters less than hardware depreciation.

It would make sense to build a datacentre next to a coal plant or some shit, tbh.
 
Playing space invaders for eternity sounds like hell to me.

On a serious note, you know why this exists: for game streaming companies like Shadow, and for Nvidia's own service.

It's not just that. Doing some quick back-of-the-envelope math, it's more modular and cost-competitive with some serious HPC offerings. And with tensor cores it's probably more broadly useful, especially if you can't afford the full deployment, in which case the competition falls off greatly. The only thing that's hard to determine is the power and HVAC cost per rack.
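To put rough numbers on that (assuming ~16 TFLOPS FP32 per card, the ballpark usually cited for a Quadro RTX 6000, and the ~$5M card cost estimated earlier in the thread; power and HVAC are left as the unknown the post says they are):

```python
# Rough aggregate throughput / cost sketch for a 1,280-GPU pod.
# Assumptions: ~16 TFLOPS FP32 per Quadro RTX 6000 (ballpark spec),
# ~$5M total card cost as quoted earlier in the thread.
num_gpus = 1280
fp32_tflops_per_gpu = 16.0        # assumed ballpark, FP32
card_cost_total = 5_000_000       # USD, quoted list-price estimate

aggregate_pflops = num_gpus * fp32_tflops_per_gpu / 1000
cost_per_tflop = card_cost_total / (num_gpus * fp32_tflops_per_gpu)

print(f"~{aggregate_pflops:.1f} PFLOPS FP32 aggregate")
print(f"~${cost_per_tflop:,.0f} per FP32 TFLOP (cards only, no chassis/power/HVAC)")
```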
 
GPUs are getting more and more common in our datacenters.
We are looking into 2U servers with 4 Teslas per server.
(You cannot just slam 21 servers into a 42U rack... power and cooling need to be up to par.) But I am seeing a shift... GPUs are moving in, CPUs are losing importance... no wonder Intel is going into GPUs... do or die.
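A rough illustration of why you can't just fill the rack; the per-GPU and per-server wattages and the rack power budget below are assumed ballpark numbers, not figures from any specific datacenter:

```python
# Ballpark rack power check for 2U servers with 4 GPUs each.
# All wattages and the rack budget are illustrative assumptions.
rack_units = 42
server_units = 2
gpus_per_server = 4

gpu_watts = 250             # assumed, e.g. a passively cooled Tesla-class card
server_base_watts = 500     # assumed CPUs, RAM, storage, fans per server
rack_budget_watts = 12_000  # assumed typical rack power/cooling budget

max_servers = rack_units // server_units             # 21 if you only count space
per_server = gpus_per_server * gpu_watts + server_base_watts
servers_by_power = rack_budget_watts // per_server   # what the budget actually allows

print(f"Space allows {max_servers} servers, "
      f"but at ~{per_server} W each the power budget allows only {servers_by_power}.")
```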

The coming “battle” is going to be fun.
 