NVIDIA Shows That Their GeForce RTX GPUs Are Much Faster & More Powerful Than Next-Gen Consoles

I'm still not sold on the RTX line because of the dedicated ray-tracing-core mumbo jumbo. Isn't it still just a segregated area of tensor cores dedicated to nothing but running ray tracing? It just feels like slapping on an extra premium tier to justify even higher pricing.
 
Also, pretending that RTX performance per dollar won't rise considerably with the release of Ampere is just telling us you're not thinking at all. Because obviously NVIDIA has nothing in the pipeline to shake down AMD, which is already two years late to RTX, and obviously AMD is going to just bowl over everything. :rolleyes:

Who were you directing your rant at here?
 
You. By using the 2070 nomenclature to compare against, you're implying that it will also have comparable RTX performance. At that price, it's unlikely.

A better comparison would be "Souped-up RX 5700"
:D

None of the performance in this APU comes for free, so the likelihood that this thing will have anywhere near the RTX 2070's ray-tracing throughput is just laughable.
 
But nonetheless they are a huge step up from what's available now
Not as huge a leap as we've seen between past generations (quick math after this post). The GPU is maybe twice as fast as an Xbox One X, while the CPU is maybe 4-6x faster? The RAM alone is pathetic at 16GB, just double last generation's.
and we all know they probably will fall into the $500 range which is really good for that level of performance.
Sure, but will people buy it? Traditionally, no console priced at $500+ has done well; look at the PS3 and the Xbox One. To this day nobody knows how many Xbox Ones were sold, since Microsoft stopped reporting the pathetic sales figures. If the price is too high, then buying a PC wouldn't cost much more. I can build the equivalent of the Xbox Series X or PS5 for $700, assuming prices stop bouncing around during the holidays.
Current gen cards are a rip off but at least AMD is a bit closer to reality now with their pricing.
My issue with AMD is that none of their cards offer ray tracing, which is a feature both the Xbox Series X and PS5 will have. That means buying an AMD-based graphics card may be a waste of money, unless AMD reveals they'll do ray tracing the way Crytek's Neon Noir demo does.
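Quick math on the generational jump (the next-gen figures are still rumors; the last-gen specs are published):

```python
# Rough generational multipliers behind the "not a huge leap" claim.
# Next-gen GPU/RAM figures are rumored; last-gen specs are published.
one_x_gpu_tflops = 6.0        # Xbox One X (2017)
nextgen_gpu_tflops = 12.0     # rumored next-gen figure
base_one_ram_gb = 8           # Xbox One (2013)
nextgen_ram_gb = 16           # rumored

print(f"GPU: {nextgen_gpu_tflops / one_x_gpu_tflops:.0f}x the One X")     # 2x
print(f"RAM: {nextgen_ram_gb / base_one_ram_gb:.0f}x the 2013 consoles")  # 2x
# For contrast, PS3 -> PS4 went 512MB -> 8GB of RAM: a 16x jump.
```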
I honestly was going to wait till Ampere released, since I've already waited a year, but the itch got so bad that I found a guy on Craigslist today selling a brand-new ROG Strix 2080 Ti, and I convinced him to let it go for $800. I'd say I got a fucking great deal considering it's brand new.
That dude must have really needed the money if he sold it to you that cheap. I think you just won Christmas shopping.

That's more CUs and more FP32 performance than the 5700 XT, which is > a 2070, so how are you predicting 2060 performance?
I have a number of reasons.

#1 The 5700 XT has 9.7 TFLOPs while the Vega 64 has over 12 TFLOPs, but which one is the faster card? The RTX 2080 has fewer TFLOPs than the Vega 64 and still beats both of them in gaming. TFLOPs are not the be-all and end-all of GPU performance.
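For reference, those paper numbers fall straight out of shader count and clock (peak FP32 = 2 ops per FMA x shaders x boost clock); a quick sketch with published boost clocks, so treat the outputs as approximate:

```python
# Peak FP32 TFLOPs = 2 (FMA ops/cycle) x shader count x boost clock (GHz) / 1000.
# Published boost clocks; sustained clocks vary, so these are paper specs only.
cards = {
    "RX 5700 XT": (2560, 1.905),  # (shaders, boost clock in GHz)
    "Vega 64":    (4096, 1.546),
    "RTX 2080":   (2944, 1.710),
}

for name, (shaders, clock_ghz) in cards.items():
    tflops = 2 * shaders * clock_ghz / 1000
    print(f"{name}: {tflops:.1f} TFLOPs")
# ~9.8, ~12.7, and ~10.1 respectively -- yet in games the 2080 beats both,
# so paper TFLOPs don't rank cards across different architectures.
```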

#2 What about the memory bandwidth? If the rumors are correct, then these systems have 16GB of GDDR6 memory, shared between the Zen 2 cores and the monstrous 12-TFLOP GPU. If the Zen 2 cores don't have their large L3 cache, then the latency of the GDDR6 memory is going to cause a massive hit in CPU performance, which will further decrease GPU performance. This is also one of the reasons the PS4 and XB1 performed so poorly, despite the console optimization magic people believe exists.
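The bandwidth side is easy to put numbers on: GDDR6 bandwidth is just bus width times per-pin data rate. A sketch assuming a 320-bit bus at 14 Gbps, both pure assumptions since nothing is confirmed:

```python
# GDDR6 bandwidth = bus width (bits) x data rate (Gbps/pin) / 8 (bits per byte).
# The 320-bit / 14 Gbps figures are assumptions for illustration only.
bus_width_bits = 320
data_rate_gbps = 14

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gbs:.0f} GB/s, shared between CPU and GPU")  # 560 GB/s

# If the Zen 2 cores consume, say, 40 GB/s of that, the GPU keeps ~520 GB/s,
# and every CPU access adds contention on top of GDDR6's already-high latency.
```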

#3 Then there's ray-tracing performance, which again nobody knows anything about, including how AMD or these consoles are going to accomplish it. For all we know, RDNA 2.0 is going to be like RDNA 1.0 with slightly better power consumption and performance, but with nothing added specifically for ray tracing. AMD's, and consequently Microsoft's and Sony's, version of ray tracing may be like how Crytek handles it, in which case Nvidia's RTX may have an edge in ray-tracing performance because they have dedicated hardware for it. How exactly is AMD going into 2020 with no ray-tracing hardware? They're about to release the RX 5600, which again doesn't have dedicated ray-tracing hardware. I'm starting to think AMD doesn't have dedicated ray-tracing hardware, or if they do, they'll release it as a $700 graphics card. And if that were the case, the PS5 and Xbox Series X wouldn't have dedicated ray-tracing hardware either.
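For context on what "dedicated ray-tracing hardware" actually accelerates: the hot loop is ray/box and ray/triangle intersection during BVH traversal. A toy slab test below; purely illustrative, and it says nothing about how RDNA 2.0 (or Turing, for that matter) actually implements it:

```python
# Toy ray/AABB "slab" test -- the operation RT hardware runs in fixed
# function millions of times per frame while walking a BVH. Illustrative
# only; no vendor's actual implementation looks like this Python.
def ray_hits_box(origin, inv_dir, box_min, box_max):
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# A ray from the origin along (1,1,1) hits the box spanning (1,1,1)-(2,2,2):
print(ray_hits_box((0, 0, 0), (1.0, 1.0, 1.0), (1, 1, 1), (2, 2, 2)))  # True
```

Running tests like that on general shader ALUs is exactly the cost a software approach like Crytek's pays.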
 

2070 RT performance is pathetic. Furthermore, Ampere has nothing to do with the consoles; the 2070 was just used as a reference point for potential performance. So fast-forward to holiday 2020: you get a $500 console with hardware RT. Let's assume it lands somewhere in the neighborhood of Turing's anemic RT performance and that Ampere doubles that; it still wouldn't be great at the midrange, or cheap. So if NVIDIA were to put out, say, a 3070 in a Max-Q laptop with 2x the RT performance of a Max-Q 2080, it would still cost $1500-2000, so comparing a console to a PC would still look stupid.
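Back-of-envelope on that comparison (every number below comes from the assumptions in the post above; none are real benchmarks):

```python
# Back-of-envelope RT perf-per-dollar using the assumptions above:
# console RT ~= Turing baseline (1.0x), Ampere laptop = 2x that baseline.
console = {"price": 500, "rt_perf": 1.0}          # $500 console, Turing-ish RT
ampere_laptop = {"price": 1750, "rt_perf": 2.0}   # midpoint of $1500-2000, 2x RT

for name, system in [("Console", console), ("Ampere Max-Q laptop", ampere_laptop)]:
    per_1000 = system["rt_perf"] / system["price"] * 1000
    print(f"{name}: {per_1000:.1f} RT perf per $1000")
# Console: 2.0 vs laptop: ~1.1 -- even at double Turing's RT throughput,
# the laptop loses on perf per dollar.
```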
 

TFLOPs are an effective measure of performance within the same architecture (see the Vega 64 and Radeon VII). There is no reason to assume a 12-teraflop Navi-based card would be slower than a 10-teraflop Navi-based card. A ten-teraflop Navi card is at parity with the 2070/2070 Super. Saying it will be equivalent to a 2060 is just not based in reality. It will be at minimum equivalent to a 2070/2070 Super.
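Putting numbers on that (assuming roughly linear scaling with TFLOPs inside one uarch, which is the whole argument here; the 12-TFLOP console figure is still a rumor):

```python
# If performance scales ~linearly with TFLOPs within one architecture (the
# claim above), compare a rumored 12-TFLOP Navi part to the 9.75-TFLOP 5700 XT:
rx_5700_xt_tflops = 9.75
console_gpu_tflops = 12.0   # rumored, not confirmed

scale = console_gpu_tflops / rx_5700_xt_tflops
print(f"~{scale:.0%} of a 5700 XT")  # ~123%
# If a 5700 XT already trades blows with a 2070 Super, another ~23% puts
# this well above 2060 territory -- under these assumptions.
```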

The CPU is still a massive jump from the Jaguar-based units in the One X and PS4, even with cache changes.

Numerous sources have stated there is hardware ray tracing; assuming they're just going to pull a fast one and be like "surprise, software ray tracing" is silly.
 
TFLOPs are an effective measure of performance within the same architecture (see the Vega 64 and Radeon VII). There is no reason to assume a 12-teraflop Navi-based card would be slower than a 10-teraflop Navi-based card. A ten-teraflop Navi card is at parity with the 2070/2070 Super. Saying it will be equivalent to a 2060 is just not based in reality. It will be at minimum equivalent to a 2070/2070 Super.
A Vega 64 is slower than an RX 5700 XT but has more TFLOPs. Since the consoles are rumored to be based on Navi 2.0, TFLOPs may be different yet again. Also, console manufacturers are known to fudge numbers for marketing, because who's going to prove them wrong? Randy Pitchford doesn't seem impressed with these consoles, and there may be a reason.
The CPU is still a massive jump from the Jaguar-based units in the One X and PS4, even with cache changes.
For sure, just look at the CPU performance of a PS4 Pro in Linux. Its multicore performance is slower than a single modern Ryzen core.

Numerous sources have stated there is hardware ray tracing; assuming they're just going to pull a fast one and be like "surprise, software ray tracing" is silly.
You'd think we would have heard something from AMD by now? That's why I think they're playing with words when they say there is hardware ray tracing. Why sell consumers these useless RX 5700s when the consoles AMD is supplying hardware for are going to have hardware ray tracing? Either someone is exaggerating, or AMD is going to charge a fantastic amount for ray tracing on PC.
Yeah, and the RTX 2080 will also cost more than a PS5/Xbox Series X. Nvidia needs to stop increasing prices and get to releasing the RTX 3000 series at much lower prices. According to the Steam Hardware Survey, the GTX 1060/1050/1050 Ti/1070 are the most popular GPUs. The 1650s, 1660s, and RTX cards hardly exist.
 
DukenukemX, read the patents for AMD's RT.
They use a much more scalable shading unit which can handle RT or non-RT work with much less wasted silicon than the NV approach.
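For anyone who hasn't read them: the patent filings describe a hybrid scheme with fixed-function intersection testers sitting alongside the texture processors, while the shader program steers traversal. A very loose sketch of that division of labor, assumption-laden and nothing like real driver or ISA behavior:

```python
# Loose sketch of the split AMD's RT patent filings describe: fixed-function
# units do the box/triangle intersection tests, while ordinary shader code
# controls BVH traversal -- so the same SIMDs serve RT and non-RT work.
# Purely illustrative; not actual hardware behavior.
def trace_ray(ray, bvh_root, intersect_fixed_function):
    stack = [bvh_root]   # traversal stack lives in shader-visible memory
    closest = None
    while stack:
        node = stack.pop()
        # Fixed-function step: test the ray against this node's children.
        for child, t in intersect_fixed_function(ray, node):
            if child.is_leaf:
                if closest is None or t < closest[1]:
                    closest = (child, t)   # keep the nearest hit so far
            else:
                stack.append(child)        # shader decides what to visit next
    return closest
```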

The V64 is faster in compute than a Ti or a 5700.
Old GCN is best at that; it's what it was aimed at. Desktop GPUs are just an adaptation to increase TAM.
 
DukenukemX, read the patents for AMD's RT.
They use a much more scalable shading unit which can handle RT or non-RT work with much less wasted silicon than the NV approach.

You can certainly claim that they're different, as well they should be, but there is no basis for claiming that one is more this or that.

The V64 is faster in compute than a Ti or a 5700.
Old GCN is best at that; it's what it was aimed at. Desktop GPUs are just an adaptation to increase TAM.

Nvidia builds strong compute GPUs too -- they just charge for them and get paid accordingly.

They also build GPUs that are both more efficient and faster for consumer use, including versus Navi.


Can we get a little more logic and a little less bias?
 
Read the patent. Someone on here has already done so and reached a very similar conclusion when discussing it with me. They know more about it than you and I combined.
That said, yes, we have to see the end result first, probably in the new consoles.
My point wasn't dick-measuring GPUs, but that the uarch is different between Vega, Paxwell, Navi, etc.
 
They kind of have to be different. However, the discussions that have been had suggest that AMD's implementation of hardware RT is more similar than different. Even without any knowledge of each other's design decisions, both companies are working toward the same very well-understood result, and they have to worry about convergent engineering more than anything else, to make sure that what they wind up producing is different enough to prevent a legal circus.
 
Consoles used to achieve performance by having the games themselves bootstrap the hardware without any OS, using clever assembly tricks, tuned specifically to the particular game, to eke performance out of the hardware.

That level of customization is long dead (well, outside of the Nintendo portables before the Switch).

Now what makes consoles still relevant, outside of exclusive games, is the fact that you can't customize the systems or the games much... which makes cheating harder and multiplayer games fairer. And multiplayer games are a huge, huge business currently.

It's not about being faster than PCs or even laptops.
 
I never click WCC links...

Since the discussion strayed into ray-tracing territory... where is the guy who always talks about "Big Navi" and how it will blow away the 2080 Ti? (The circus can always use more jokers.)
 