RTX 3xxx performance speculation

Seems to me that people here are underestimating AMD. Under Raja they obviously seriously underperformed, but if you look at the consoles, with what Sony has presented and what MS released at Hot Chips, it is becoming clear that Big Navi will be around 40% faster than the 2080 Ti. At a competitive price AMD can totally disrupt the market. That is why Nvidia is holding back the 3080 Ti and Titan variants.

The new consoles look good, but that doesn't tell us how big a chip AMD is going to build, or how much memory bandwidth is feeding it. So we really don't know the final disposition. Nor do we know quite where Nvidia's parts will land.

Holding back a 3080 Ti? They are shipping the 3090. Not hard to figure out that 3090 > 3080 Ti.

As for Titan: Titan is NOT a gaming card anymore. The Titan RTX came months after the RTX 2080 Ti. It was not held back; it was irrelevant for gaming, but probably waiting on the accumulation of a batch of higher-binned parts.
 
I’ll pay $600 max for 2080 Ti performance at ~200W or less. If that isn’t an option, then I guess I’ll wait a bit longer with my GTX 1080. Cyberpunk is probably the only new game I’m interested in, but I don’t have to play it on launch day if performance is too low.
 
Thinking that AMD will suddenly become competitive is an overestimation and would be an exception to the norm; it would only happen if AMD were to suddenly choose to be competitive.

I think AMD will have something very competitive in the second-tier spot. We know the Xbox Series X's Navi 2 should perform around a 2080, with 12 TFLOPs from 52 CUs. Bump that up to 72 CUs (or 80, as some reports point to) and higher clocks for Big Navi, and you've got ~15-20% better performance than a 2080 Ti.

AMD won't compete for the top spot, but will likely have something that easily competes with the 3080. And if they can compete on price, this could disrupt things. If you're looking for a card in the second tier, e.g. 3080 or 6800 Pro, you should wait; if you're willing to spend top dollar for the top tier, it's probably worth pulling the trigger on Sept 1.
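A back-of-envelope sketch of that CU-and-clock scaling argument, treating FP32 throughput as CUs × 64 shaders × 2 FLOPs × clock. The 80 CU count and 2.0 GHz clock here are rumored/assumed figures, not confirmed specs:

```python
# Back-of-envelope FP32 throughput scaling (all figures are assumptions,
# not confirmed specs).
def fp32_tflops(cus, clock_ghz):
    # 64 shaders per CU x 2 FLOPs per shader per clock (fused multiply-add)
    return cus * 64 * 2 * clock_ghz / 1000

xsx = fp32_tflops(52, 1.825)       # Xbox Series X GPU
big_navi = fp32_tflops(80, 2.0)    # rumored Big Navi: 80 CUs at ~2.0 GHz
print(f"XSX: {xsx:.1f} TFLOPs, Big Navi: {big_navi:.1f} TFLOPs "
      f"({big_navi / xsx - 1:+.0%} on paper)")
```

Paper TFLOPs overshoot real-world gains, of course; game performance scales sublinearly with CU count, which is why the ~15-20% over a 2080 Ti estimate is much more conservative than the raw ratio.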
 
As for Titan: Titan is NOT a gaming card anymore. The Titan RTX came months after the RTX 2080 Ti. It was not held back; it was irrelevant for gaming, but probably waiting on the accumulation of a batch of higher-binned parts.

You keep telling yourself that, but you can still download a Game Ready driver for it. You can't do that for Tesla, Quadro, NVS, etc., which are NOT gaming cards. The Titan RTX? Download your 452.06 Game Ready Driver...

Whether or not it is worth the price as a gaming card is another story.
 
Thinking that AMD will suddenly become competitive is an overestimation and would be an exception to the norm; it would only happen if AMD were to suddenly choose to be competitive.

Has anything Lisa Su has done suggested anything other than that they want to be extremely competitive? And unlike with Nvidia, we have seen some tangible use of RDNA2.
 
We know ... should ...
That's the thing: we don't really know, and unlike with Nvidia, we can't really predict with AMD. At best, we can probably say that AMD has the potential to compete at #2. How real that is comes down to how high they're willing to spec their top-end consumer part, i.e. die size, memory bus width, TDP target and so on, as well as how well their currently unproven technology translates to desktop gaming.

We've been discussing the specs relative to Ampere, but that second part is a significant unknown. Even having AMD graphics underpinning the last two Xbox generations has not led to AMD parts having performance-per-'spec' parity on the desktop, let alone a notable advantage.
 
So the RTX 3090 is going to be 14% faster at 2160p than at 1440p when compared to a 1080 Ti? Totally makes sense. Count me in for two! :rolleyes:

Why do people make this sort of bunk?

What's so hard to believe about that? The extra bandwidth might give it further separation at 4K. Contrast that with the more bandwidth-deprived 3060, which does a bit better at 1080p than at 4K.
 
I think AMD will have something very competitive in the second-tier spot. We know the Xbox Series X's Navi 2 should perform around a 2080, with 12 TFLOPs from 52 CUs. Bump that up to 72 CUs (or 80, as some reports point to) and higher clocks for Big Navi, and you've got ~15-20% better performance than a 2080 Ti.

AMD won't compete for the top spot, but will likely have something that easily competes with the 3080. And if they can compete on price, this could disrupt things. If you're looking for a card in the second tier, e.g. 3080 or 6800 Pro, you should wait; if you're willing to spend top dollar for the top tier, it's probably worth pulling the trigger on Sept 1.

The thing to remember about the Xbox is that the power profile of the entire console has not changed, which suggests a lot of frequency headroom for the PC add-in cards as well.
 
Has anything Lisa Su has done suggested anything other than that they want
Lisa Su is a CEO. She may be able to 'inspire' and 'drive' development, but what she says publicly has just about zero bearing on what AMD actually produces. She will say what a CEO should say, supposing she wishes to keep her job.
 
Has anything Lisa Su has done suggested anything other than that they want to be extremely competitive? And unlike with Nvidia, we have seen some tangible use of RDNA2.

I am sure AMD will be competitive.

The difference most expect is only at the top end, where the expectation is that AMD won't go for the obscenely large and expensive GPU chip, but Nvidia will.

This really only matters to the people who buy GPUs that cost >$1000. AMD should have something in all the segments below the absolute top end.

It's possible that AMD surprises beyond that and takes the overall crown, but that isn't the expectation.
 
I am sure AMD will be competitive.
They have the potential to be competitive in raster performance, but so far their showing on consoles has them hitting at best the level of performance available with RTX 1.0 on Turing.
 
They have the potential to be competitive in raster performance, but so far their showing on consoles has them hitting at best the level of performance available with RTX 1.0 on Turing.

I think Microsoft indicated that the ray tracing is scalable, likely at the expense of some raster performance, and that at the low end it would be similar to Turing. So it might be game-configurable, and Big Navi could be competitive with RTX 2.0.
 
Rumor: GeForce RTX 3090 Pricing to Arrive Around the $2,000 Mark
If this is true, is this card DOA even if it's a 50% bump in performance? Or are there enough people willing to drop that kind of cash? I think if it's only 30%, it's DOA. That is, of course, if this rumor is true; knowing Nvidia, I don't doubt it.

The meaninglessness of rumors, demonstrated. The source of that $2000 rumor is a post on the Chinese forum Chiphell. But a different poster on Chiphell is claiming the price will be $1399:
https://www.chiphell.com/forum.php?mod=viewthread&tid=2252850&extra=page=1&filter=typeid&typeid=223&page=1
Google Translate:
"Starting at $1399, the fe version is $100 more expensive, 24G gd6x confirm"

So which foreign-language forum poster counts as a source? If we start making up numbers here, will people on Chiphell start quoting us as a rumor source?
 
I think Microsoft indicated that the ray tracing is scalable, likely at the expense of some raster performance, and that at the low end it would be similar to Turing. So it might be game-configurable, and Big Navi could be competitive with RTX 2.0.
I think it will depend upon the use case.

I do expect a shaderbeast from AMD in some segment. Beyond that - it is a massive wildcard how well they can compete on larger RT workloads (beyond SuperPuddles) and/or on the high end.
 
Even having AMD graphics underpinning the last two Xbox generations has not led to AMD parts having performance-per-'spec' parity on the desktop, let alone a notable advantage.

I don't know... I think you can safely extrapolate ballpark performance of Big Navi from the stated and reported specs of the upcoming consoles. The clock-speed-gimped 7850-class chip in the PS4 performed very much in line, in memory bandwidth and rasterization, with an underclocked desktop 7850; the 7770-like chip in the Xbox One performed very close to a desktop 7770. Same for the refreshed consoles, which perform in line with their Polaris equivalents.
 
I don't know... I think you can safely extrapolate ballpark performance of Big Navi from the stated and reported specs of the upcoming consoles. The clock-speed-gimped 7850-class chip in the PS4 performed very much in line, in memory bandwidth and rasterization, with an underclocked desktop 7850; the 7770-like chip in the Xbox One performed very close to a desktop 7770. Same for the refreshed consoles, which perform in line with their Polaris equivalents.
You could be right; my skepticism is based on AMD's record of translating their console architectures to the desktop, not just in terms of hardware implementation, but also software.

Software is AMD's real GPU weakness.
 
If they sell out every card they can produce at $1200, then what sense does it make to lower the price? They stopped production of the higher-end 2080-class cards and you saw the supply chain dry up real quick.
Steam survey puts the 2080 Ti at less than 1% market share. So where are all these hordes of cards people are snapping up without a care in the world? The 1060 is at ~12%, so clearly Nvidia isn't selling out every card they can produce. They're going with the Apple strategy of selling fewer units at a higher profit margin.
 
Steam survey puts the 2080 Ti at less than 1% market share. So where are all these hordes of cards people are snapping up without a care in the world? The 1060 is at ~12%, so clearly Nvidia isn't selling out every card they can produce.
Those are not conflicting data points. A later-generation halo product versus the last-gen midrange will of course have wildly different market numbers.
 
Steam survey puts the 2080 Ti at less than 1% market share. So where are all these hordes of cards people are snapping up without a care in the world? The 1060 is at ~12%, so clearly Nvidia isn't selling out every card they can produce. They're going with the Apple strategy of selling fewer units at a higher profit margin.

That's <1% of all GPUs, sold since the dawn of time, that are still in use by Steam gamers today.

If you looked at the percentage of GPUs sold only since the 2080 Ti came out, it would obviously be significantly higher than that, and of course $1000+ GPUs won't sell as well as $200-$300 GPUs.

But Nvidia sells both the $200-$300 GPUs and the $1000+ GPUs.

Having $1000+ GPUs in the lineup doesn't detract from ~$300 GPU sales. In fact, there is an argument that it enhances them (halo product).
 
Steam survey puts the 2080 Ti at less than 1% market share. So where are all these hordes of cards people are snapping up without a care in the world? The 1060 is at ~12%, so clearly Nvidia isn't selling out every card they can produce. They're going with the Apple strategy of selling fewer units at a higher profit margin.
The Steam survey doesn't reflect the current availability of cards in the 2080+ range. That market has nearly dried up, and prices have skyrocketed as a result. People bought every new 2080 Ti that became available; sellers are now trying to charge Ti prices for the remaining 2080 and 2080 Super cards because someone will be impatient.
 
Steam survey puts the 2080 Ti at less than 1% market share. So where are all these hordes of cards people are snapping up without a care in the world? The 1060 is at ~12%, so clearly Nvidia isn't selling out every card they can produce. They're going with the Apple strategy of selling fewer units at a higher profit margin.

Well they certainly don't have a problem selling the cards that they've chosen to produce. How many more 2080 Ti's do you think should Nvidia produce and at what price? And why?
 
Those are not conflicting data points. A later-generation halo product versus the last-gen midrange will of course have wildly different market numbers.
Both cards are end-of-life, so what gen they are doesn't affect sales; both are maxed out for what they are. The point was that people keep saying "oh, they're selling as many as they can make", which is untrue.
 
I don't care how many they produce or how much they cost. People spouting BS as facts to defend a company that's doing just fine is something I could live without.

Ok, I'm not sure what fact you're disputing though. 2080 Ti's really do seem to be sold out everywhere.

Third-party sellers are trying to sell used 2080 Ti's on Amazon for $2000. Batshit insane.
 
You could be right; my skepticism is based on AMD's record of translating their console architectures to the desktop, not just in terms of hardware implementation, but also software.

Software is AMD's real GPU weakness.

Interestingly, Anandtech published a play-by-play of Xbox Series X Architecture at HotChips yesterday...

https://www.anandtech.com/show/1599...ft-xbox-series-x-system-architecture-600pm-pt

A few new pieces of info:
  • RDNA 2 CUs deliver 25% better performance/clock than RDNA 1 (or Polaris?). This indicates the rest of AMD's purported "50% faster than RDNA 1" will be attained through clocks and memory bandwidth
  • DX ray tracing is "economical". I took this to mean there will be major performance hits/trade-offs to using RT on RDNA 2. RT via DXR is not nearly as efficient as Nvidia's dedicated RT cores, but we knew this
  • Unfortunately, they are holding TDP close to their chest. Revealing TDP would tell us a lot about the power/performance envelope; e.g. if TDP on the console is 120W and they are getting 12 TFLOPs, Big Navi is going to be a beast. I'm guessing the Xbox Series X TDP will be higher than 120W, e.g. 160W, which is one of the reasons they're not saying.
It sounds more and more like Big Navi will be a hell of a competitor to Ampere in traditional rasterization (and will at least be able to put "ray tracing-enabled" on the box), albeit maybe not at the top tier.
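One nuance on the first bullet: per-clock and clock/bandwidth gains compound multiplicatively rather than adding, so a quick sanity check on the implied split (the 25% and 50% figures are the poster's reading of the presentation, not confirmed numbers):

```python
# If RDNA 2 gains +25% per clock, how much more must come from clocks
# and bandwidth to reach a purported +50% total over RDNA 1?
ipc_gain = 1.25
target_total = 1.50
remaining = target_total / ipc_gain   # gains compound multiplicatively
print(f"Needed from clocks/bandwidth: {remaining - 1:.0%}")  # 20%, not 25%
```

So "the other 25%" is really only ~20% once compounding is accounted for.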
 
RDNA 2 CUs deliver 25% better performance/clock than RDNA 1 (or Polaris?). This indicates the rest of AMD's purported "50% faster than RDNA 1" will be attained through clocks and memory bandwidth

No, it’s 25% faster perf/clock than GCN, i.e. the previous Xbox.
 
You keep telling yourself that, but you can still download a Game Ready driver for it. You can't do that for Tesla, Quadro, NVS, etc., which are NOT gaming cards. The Titan RTX? Download your 452.06 Game Ready Driver...

Whether or not it is worth the price as a gaming card is another story.

Titan cards are now prosumer. That's not Snowdog telling you that; that's what Nvidia has said. They have moved the Titan line out of the consumer market since the Titan V. These are cards for people who can't afford the Quadro cards or might not need the software support.

But Nvidia aren't stupid; they know that there are people with a ton of money who will buy these cards for the e-peen or whatever. So why waste a revenue stream? That's why there are Game Ready drivers for the Titan cards.
 
Interestingly, Anandtech published a play-by-play of Xbox Series X Architecture at HotChips yesterday...

https://www.anandtech.com/show/1599...ft-xbox-series-x-system-architecture-600pm-pt

A few new pieces of info:
  • RDNA 2 CUs deliver 25% better performance/clock than RDNA 1 (or Polaris?). This indicates the rest of AMD's purported "50% faster than RDNA 1" will be attained through clocks and memory bandwidth
  • DX ray tracing is "economical". I took this to mean there will be major performance hits/trade-offs to using RT on RDNA 2. RT via DXR is not nearly as efficient as Nvidia's dedicated RT cores, but we knew this
  • Unfortunately, they are holding TDP close to their chest. Revealing TDP would tell us a lot about the power/performance envelope; e.g. if TDP on the console is 120W and they are getting 12 TFLOPs, Big Navi is going to be a beast. I'm guessing the Xbox Series X TDP will be higher than 120W, e.g. 160W, which is one of the reasons they're not saying.
It sounds more and more like Big Navi will be a hell of a competitor to Ampere in traditional rasterization (and will at least be able to put "ray tracing-enabled" on the box), albeit maybe not at the top tier.

I read that presentation as Microsoft saying "We will have raytracing...but don't do too much raytracing...as the hardware is not powerful enough".
 
I read that presentation as Microsoft saying "We will have raytracing...but don't do too much raytracing...as the hardware is not powerful enough".
I interpreted it the same way: we will have some puddles, and maybe one small area directly in front of you globally illuminated, but don't expect a ton if you want 60fps; we may give you an option for more at 30.
 
No, it’s 25% faster perf/clock than GCN, i.e. the previous Xbox.

Yeah, the claims seem to be that the XBSX has a GPU equivalent to the RTX 2080.

With the 5700 XT already being equivalent to the RTX 2070 at 40 CUs with a 256-bit memory bus, the XBSX with 52 CUs and a 320-bit bus should easily be equivalent to an RTX 2080 without any significant change in technology.

IOW, it doesn't look like RDNA2 has a significant performance advantage over RDNA1.
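That "no significant change needed" reading follows from simple spec ratios (clock differences are ignored here for simplicity):

```python
# XBSX GPU vs. desktop 5700 XT (RDNA 1): ratio of headline specs
cu_ratio = 52 / 40      # compute units
bus_ratio = 320 / 256   # memory bus width
print(f"CUs: {cu_ratio - 1:+.0%}, memory bus: {bus_ratio - 1:+.0%}")
# A 2080 is only roughly 20-30% ahead of a 2070 in games, so RDNA 1
# scaled by these ratios already lands in 2080 territory with no
# per-CU improvement at all.
```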
 
I read that presentation as Microsoft saying "We will have raytracing...but don't do too much raytracing...as the hardware is not powerful enough".
Which I find both disappointing... and pretty much what's to be expected from AMD.

Mostly disappointed because it means that developers can't target ray tracing wholesale.
 
Yeah, the claims seem to be that the XBSX has a GPU equivalent to the RTX 2080.

With the 5700 XT already being equivalent to the RTX 2070 at 40 CUs with a 256-bit memory bus, the XBSX with 52 CUs and a 320-bit bus should easily be equivalent to an RTX 2080 without any significant change in technology.

IOW, it doesn't look like RDNA2 has a significant performance advantage over RDNA1.


Supposed to be more like a 2080 Super. Also, I'm pretty sure it's underclocked to maintain a certain TDP. I think it'll be a good improvement, but not what people are looking for ATM. We'll see, though.
 
Supposed to be more like a 2080 Super. Also, I'm pretty sure it's underclocked to maintain a certain TDP. I think it'll be a good improvement, but not what people are looking for ATM. We'll see, though.

On paper, by teraflops alone, it was above the 2080 Super. Then again, the 5700 XT also beats the 2070 Super in paper teraflops while performing more like a regular 2070 in games.

In games it was delivering more in line with a regular RTX 2080:
https://www.vg247.com/2020/03/16/xbox-series-x-gears-5-ray-tracing/

So I'm still not seeing any real leap in raster performance for RDNA2. Big Navi (RDNA2) will be interesting because it's BIG, and because it gets the new features.

Also, I don't think it is underclocked much at all. It's expected to draw 300 watts. The PSU rating Digital Foundry saw on the prototype was 2.2A × 200V = 440W...
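For reference, the PSU arithmetic and the headroom it implies (the 2.2 A / 200 V rating is from the Digital Foundry report as quoted, and the ~300 W draw is the estimate above):

```python
# Max input power implied by the prototype PSU rating
amps, volts = 2.2, 200
psu_ceiling_w = amps * volts
draw_estimate_w = 300
print(f"PSU ceiling: {psu_ceiling_w:.0f} W, "
      f"headroom over draw estimate: {psu_ceiling_w - draw_estimate_w:.0f} W")
```

A wide gap between PSU rating and expected draw is normal (PSUs are rated for worst-case transients), so this alone doesn't settle whether the GPU clocks are conservative.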
 
On paper, by teraflops alone, it was above the 2080 Super. Then again, the 5700 XT also beats the 2070 Super in paper teraflops while performing more like a regular 2070 in games.

In games it was delivering more in line with a regular RTX 2080:
https://www.vg247.com/2020/03/16/xbox-series-x-gears-5-ray-tracing/

So I'm still not seeing any real leap in raster performance for RDNA2. Big Navi (RDNA2) will be interesting because it's BIG, and because it gets the new features.

Also, I don't think it is underclocked much at all. It's expected to draw 300 watts. The PSU rating Digital Foundry saw on the prototype was 2.2A × 200V = 440W...

I really hope Big Navi is a raster beast that matches the 3090. I couldn’t care less about RT, as I never used it with my 2080 Ti, and since the Xbox won’t be using it much, the PS5 won’t either, so I don’t expect too much emphasis on RT in the next 3-5 years outside Nvidia-sponsored titles. If AMD gets 3090-level raster performance and undercuts Nvidia’s prices by even a few hundred, they’ll gain a lot of high-end customers and still turn a nice profit.
 
I really hope Big Navi is a raster beast that matches the 3090. I couldn’t care less about RT, as I never used it with my 2080 Ti, and since the Xbox won’t be using it much, the PS5 won’t either, so I don’t expect too much emphasis on RT in the next 3-5 years outside Nvidia-sponsored titles. If AMD gets 3090-level raster performance and undercuts Nvidia’s prices by even a few hundred, they’ll gain a lot of high-end customers and still turn a nice profit.
That is what I was thinking. RT would be nice, but there aren't many titles, and there's the big hit to performance. I would rather have pure normal performance.
 
I really hope Big Navi is a raster beast that matches the 3090. I couldn’t care less about RT, as I never used it with my 2080 Ti, and since the Xbox won’t be using it much, the PS5 won’t either, so I don’t expect too much emphasis on RT in the next 3-5 years outside Nvidia-sponsored titles. If AMD gets 3090-level raster performance and undercuts Nvidia’s prices by even a few hundred, they’ll gain a lot of high-end customers and still turn a nice profit.

Since RT is a fact of life in anything above entry-level GPUs going forward, the time for pretending RT is irrelevant is in the past. It's in AMD's, Nvidia's, and Intel's future discrete parts, and in both consoles.

Naturally, you may weight any feature lower than others. Heck, I will weight the games I actually want to play most heavily, not specific features. We are all free to make whatever rationalizations we want with our purchases.

It all depends on the whole package of features and performance/price, which is way too nuanced to pick out from rumors. We are going to need to see reviews of all the products, and pricing, before the dust really settles.

Rumors seem to be pointing to AMD competing more with the 3080 than the 3090, though. That gives Nvidia more sway to capture luxury pricing on the 3090, while keeping prices tighter with AMD on the 3080.
 
I read that presentation as Microsoft saying "We will have raytracing...but don't do too much raytracing...as the hardware is not powerful enough".
First-generation tech is never very powerful. Nvidia is already two years into their RT tech, so their next gen should be significantly improved. Unfortunately, this will only be AMD's first gen at RT.

We shall soon find out.
 
Since RT is a fact of life in anything above entry-level GPUs going forward, the time for pretending RT is irrelevant is in the past. It's in AMD's, Nvidia's, and Intel's future discrete parts, and in both consoles.

Naturally, you may weight any feature lower than others. Heck, I will weight the games I actually want to play most heavily, not specific features. We are all free to make whatever rationalizations we want with our purchases.

It all depends on the whole package of features and performance/price, which is way too nuanced to pick out from rumors. We are going to need to see reviews of all the products, and pricing, before the dust really settles.

Rumors seem to be pointing to AMD competing more with the 3080 than the 3090, though. That gives Nvidia more sway to capture luxury pricing on the 3090, while keeping prices tighter with AMD on the 3080.

Just because consoles and Intel/AMD support RT doesn’t mean anything. All it says is that they can market a “me too” checkbox with their products.

Based on what the Xbox architect said, it’s very likely RT will stay a minimally used feature for a long time. Not to mention, the games with the most replay value on the market are MP FPS/MMO/RTS games, not the SP games where RT tends to be more heavily used. You can tell by consumer disinterest toward RT that it will remain a last-priority item when building next-generation games.

The same is true for DLSS: if it remains an SP-only game feature, it won’t make a big difference aside from SP game benchmarks and PR. Most PC gamers play online now and prioritize FPS and resolution, and neither DLSS nor RT helps there.

Steam: https://store.steampowered.com/stats/Steam-Game-and-Player-Statistics

Twitch metrics: https://www.twitchmetrics.net/games/viewership (all MP games dominating).

SP games can usually only be played once before they’re forgotten. Console and PC players have shifted online, where performance is the key metric, not performance-killing RT effects. This will just be an Nvidia marketing tool, not something heavily utilized.

With regards to Big Navi, we’ll see. If the APU theoretically achieves the same performance as a 2080, then a full-fledged GPU at 300W that prioritizes raster performance and doesn’t waste die space on tensor cores might just reach or even beat the 3090’s raw performance. If it does, I know what I’ll be buying.
 