NVIDIA Unveils Turing GPU for the Quadro RTX Lineup

FrgMstr · Just Plain Mean · Staff member
Joined: May 18, 1997 · Messages: 55,601
Jensen kicked off the Quadro RTX cards by showing the now-famous Stormtrooper ray-tracing demo running in real time on a single Turing GPU with its Tensor Cores. The Tensor Cores provide up to 500 trillion operations per second and fill in where real-time ray tracing comes up short: deep learning applied to your ray tracing, to put it simply. Turing is 6X faster at rasterization than Pascal. The new Quadro RTX cards price out at $2,300 for the Quadro RTX 5000, $6,300 for the Quadro RTX 6000, and $10,000 for the Quadro RTX 8000.

Stormtrooper video.
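The "deep learning on your ray tracing" point refers to denoising: at real-time frame budgets a GPU can only trace a few rays per pixel, which produces a noisy image, and a trained network then cleans it up. A toy sketch of why so few rays look noisy (plain sample averaging standing in for the neural denoiser, and the uniform noise model is an assumption for illustration):

```python
import random

def noisy_pixel_estimate(true_value, n_rays, seed=0):
    """Monte Carlo estimate of a pixel's brightness: average n_rays
    noisy samples, each off by up to +/-0.5 from the true value."""
    rng = random.Random(seed)
    samples = [true_value + rng.uniform(-0.5, 0.5) for _ in range(n_rays)]
    return sum(samples) / n_rays

true_value = 0.7
few = noisy_pixel_estimate(true_value, 2)      # real-time budget: very noisy
many = noisy_pixel_estimate(true_value, 2048)  # offline budget: converged
print(abs(few - true_value), abs(many - true_value))
```

The error shrinks roughly as 1/sqrt(N) samples, which is why offline renderers trace thousands of rays per pixel while a real-time renderer must reconstruct from a handful.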

Ray-traced graphics offer incredible realism. Interactive graphics driven by GPUs offer speed and responsiveness. The two now come together in the greatest leap since the invention of the CUDA GPU in 2006, NVIDIA CEO Jensen Huang announced Monday.

Speaking at the SIGGRAPH professional graphics conference in Vancouver, Huang unveiled Turing, NVIDIA’s eighth-generation GPU architecture, bringing ray tracing to real-time graphics. He also introduced the first Turing-based GPUs — the NVIDIA Quadro RTX 8000, Quadro RTX 6000 and Quadro RTX 5000. And he detailed the Quadro RTX Server, a reference architecture for the $250 billion visual effects industry.
 
Quadro RTX Professional GPU highlights:

  • New RT Cores to enable real-time ray tracing of objects and environments with physically accurate shadows, reflections, refractions and global illumination.
  • Turing Tensor Cores to accelerate deep neural network training and inference, which are critical to powering AI-enhanced rendering, products and services.
  • New Turing Streaming Multiprocessor architecture, featuring up to 4,608 CUDA cores, delivers up to 16 trillion floating point operations in parallel with 16 trillion integer operations per second to accelerate complex simulation of real-world physics.
  • Advanced programmable shading technologies to improve the performance of complex visual effects and graphics-intensive experiences.
  • First implementation of ultra-fast Samsung 16Gb GDDR6 memory to support more complex designs, massive architectural datasets, 8K movie content and more.
  • NVIDIA NVLink to combine two GPUs with a high-speed link, scaling memory capacity up to 96GB and driving higher performance with up to 100GB/s of data transfer.
  • Hardware support for USB Type-C and VirtualLink, a new open industry standard being developed to meet the power, display and bandwidth demands of next-generation VR headsets through a single USB-C connector.
  • New and enhanced technologies to improve performance of VR applications, including Variable Rate Shading, Multi-View Rendering and VRWorks Audio.
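For context on what the RT Cores offload: the core primitive of ray tracing is an intersection test between a ray and scene geometry. In hardware that means ray/triangle tests and BVH traversal; a ray/sphere test is the simplest stand-in and is sketched below (an illustrative toy, not NVIDIA's implementation):

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None.
    direction is assumed normalized. Solves |o + t*d - c|^2 = r^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c  # quadratic coefficient a == 1 for a unit direction
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray from the origin along +z toward a sphere at z=5 with radius 1:
print(ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A renderer runs tests like this for every ray against the scene's acceleration structure, millions of times per frame, which is exactly the workload the dedicated RT Cores are built to accelerate.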


| GPU | Memory | Memory with NVLink | Ray Tracing | CUDA Cores | Tensor Cores |
|---|---|---|---|---|---|
| Quadro RTX 8000 | 48GB | 96GB | 10 GigaRays/sec | 4,608 | 576 |
| Quadro RTX 6000 | 24GB | 48GB | 10 GigaRays/sec | 4,608 | 576 |
| Quadro RTX 5000 | 16GB | 32GB | 6 GigaRays/sec | 3,072 | 384 |
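To put the GigaRays/sec figures in perspective, here is a rough rays-per-pixel budget. The 1080p/60fps target is an assumption for the sake of arithmetic, and real renderers spend multiple rays per pixel on shadows, reflections, and bounces:

```python
def rays_per_pixel(gigarays_per_sec, width=1920, height=1080, fps=60):
    """Rays available per pixel per frame at a given ray throughput."""
    pixels_per_sec = width * height * fps  # pixels shaded each second
    return gigarays_per_sec * 1e9 / pixels_per_sec

print(round(rays_per_pixel(10), 1))  # RTX 8000/6000: roughly 80 rays/pixel
print(round(rays_per_pixel(6), 1))   # RTX 5000: roughly 48 rays/pixel
```

That is far below the thousands of rays per pixel offline film renderers use, which is why the denoising side of the announcement matters as much as the raw ray throughput.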
 
Starting at only $2,300, I'm sure they will be selling like hotcakes!
 
Ray tracing has never looked very realistic to me. Everything is too perfect.

It has its place, but real-time ray-traced games are pretty much NOT going to be realistic looking.
 
Ray tracing has never looked very realistic to me. Everything is too perfect.

It has its place, but real-time ray-traced games are pretty much NOT going to be realistic looking.

That's why you then have to apply a pass of "Dirt-Tracing (R)" That dirt layer is what makes things realistic.
 
One vendor lock-in demon for another. How about Vulkan instead?

Just wanna emphasize that what Nvidia is doing with ray tracing is not impressive. You can download a hybrid ray-tracing demo right now, made in 2012, that uses DX11. So why does Nvidia require an RTX graphics card, most likely with their own API?

http://rigidgems.sakura.ne.jp/



They updated the demo recently.

 
And let me guess, it uses a proprietary Nvidia standard that's going to fragment the market further and lock in consumers to their brand.

I know, this is Quadro, so not consumer....

...yet
 
And let me guess, it uses a proprietary Nvidia standard that's going to fragment the market further and lock in consumers to their brand.

I know, this is Quadro, so not consumer....

...yet

Doesn't matter if they do. It's not like the industry isn't fragmented as all hell anyway. At least there aren't 50 different brands of GPU like there were back in the 90s and early 2000s. Obviously developers and artists see value in NV's way of doing things or they wouldn't be able to get away with doing things the way they do. Also, until they get some real competition from someone, they really are the only game in town. I'm not saying that's good for the industry, but they'll play ball if someone gives them a reason to. So far... Nobody is... It's not their fault that they act like a big business because they're able. Someone has to rise up and push them to compete and be more open. Literally nobody is doing it, or is even up to the task. AMD is the closest, but they're too busy with CPUs right now.

Also, like you say, this is Quadro, so it's content development. Once out of that stage, things can change. Sure they'll make it much easier to integrate with their own consumer parts, APIs, etc. Why wouldn't they? But, they still want people to buy their stuff, and buy the software that runs on it, so they have to at least play "kinda" nicely. (largely fake or not)
 
Why are these multi-thousand-dollar commercial GPUs using GDDR6 instead of HBM2/3?
 