RTX 3xxx performance speculation

But in the transition to 7nm so far, neither AMD nor Intel has made big gains, so I wouldn't expect big gains from Nvidia either.

Intel's transition to 10nm has been rocky as hell, but they have shown architectural gains across the board. AMD hasn't gained nearly as much since their initial Zen release, which is to be expected.

As for Nvidia: consider that they tend to execute very well each generation, and they already have the highest-performing architecture on the market. Essentially, they'd have to pull a Bulldozer or a NetBurst at this point. Further, we're talking about giving them a larger transistor budget with better efficiency. Knocking stuff like this out of the park is what they do.
 
Because there are three big players in PC silicon: AMD, Intel, and Nvidia.

Transitioning to 14nm/16nm, they made impressive performance gains.

But in the transition to 7nm so far, neither AMD nor Intel has made big gains, so I wouldn't expect big gains from Nvidia either.

AMD's gains seem to be mostly from architecture (Navi, Zen 2), not from clock speed boosts.

So the evidence is that the 28nm->14nm transition was MUCH better than the 14nm->7nm transition is shaping up to be.

On top of that, you have to factor in that transistor economics are worse this time.

In the 28nm->14nm/16nm transition, going from the 980 Ti to the 1080 Ti increased transistor count by 50%. That isn't going to happen this time. I would expect 10%-20% more transistors this time. Count ourselves lucky if it's 20%.

This is NOT going to be like 1080Ti vs 980Ti. That was the last gasp of big transistor count gains.
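Rough numbers, in case anyone wants to sanity-check that claim (the GM200/GP102/TU102 transistor counts are the commonly quoted figures; the Ampere numbers are nothing more than the 10%-20% guess above):

Code:
# Back-of-the-envelope transistor budgets; the Ampere figures are pure speculation.
gm200 = 8.0e9    # GTX 980 Ti (Maxwell, 28nm), ~8 billion transistors
gp102 = 12.0e9   # GTX 1080 Ti (Pascal, 16nm), ~12 billion transistors
tu102 = 18.6e9   # RTX 2080 Ti (Turing, 12nm), ~18.6 billion transistors

print(f"28nm -> 16nm jump: {gp102 / gm200 - 1:.0%}")   # ~50%

# If the next jump is only 10-20% on top of TU102:
for gain in (0.10, 0.20):
    print(f"+{gain:.0%} over TU102 -> {tu102 * (1 + gain) / 1e9:.1f}B transistors")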
You can't compare CPU architectures to GPU architectures, was the poster's point. Still, you can't compare processes from different fabs, either. Everybody was on TSMC this generation of video cards, but NVIDIA is going with Samsung on Ampere. From what we know from public information, transistor density ranks: GloFo < TSMC < Samsung < Intel.
 
Read my lips. 2x performance on raster, 10x performance on ray tracing, price drop on the high end. 16GB on the 3080 Ti, 12GB on the 3070/3080, 8GB on the 3060, 6GB on the 3050 Ti, 4GB on the 3050. All capable of ray tracing. The 3060 beating the 2080 Ti on raster and far better at ray tracing. The 3050 better at ray tracing than the 2080 Ti. All of them under $1000. Titan Ampere with 32GB at $2000.
AMD, with RDNA2 on TSMC 7nm+ (somewhat below the Samsung 7nm specs Nvidia will use for Ampere), will only be beating Nvidia's Turing 2080 Ti on 12nm (in fact 16nm+), and only with a huge Navi 23 high-end GPU using expensive HBM2. My bet is that AMD will never be as far behind Nvidia in performance as in 2020, but they may eventually catch up over the next few years because they are making huge money with CPUs.
AMD already can't compete with Turing, even using a greatly improved 7nm process on fairly big chips, against Nvidia on an enhanced 16nm process (called 12nm by TSMC but really 16nm+).
In that case, put the RX 5700 XT against the RTX 2070/2060 Super (which are very close and use the same chip): the RX 5700 XT is based on a 251 mm² chip on 7nm, while the RTX 2070 is on a 445 mm² chip, but around 40% of that chip is used for ray tracing and AI, so only about 267 mm² is used for raster, and all of it is on 16nm+ aka 12nm. Both GPUs have roughly the same transistor count (10.8 billion for Nvidia vs 10.1 billion for AMD), but Nvidia uses only around 6.5 billion for raster.
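To spell out that arithmetic (taking the ~40% RT/tensor share above at face value; it's a rough estimate, not an official breakdown):

Code:
# Quick check of the figures above; the 40% RT/AI share is an estimate.
tu106_area = 445.0         # mm2, RTX 2070 die
tu106_transistors = 10.8   # billions
raster_share = 1.0 - 0.40  # everything not spent on RT/tensor hardware

print(f"TU106 area left for raster: {tu106_area * raster_share:.0f} mm2")        # ~267 mm2
print(f"TU106 transistors for raster: {tu106_transistors * raster_share:.2f}B")  # ~6.5B
# Versus the whole 251 mm2 / 10.1B-transistor Navi 10 die on 7nm.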
For the same use (raster gaming), Nvidia has a slightly better TDP and runs at roughly the same speed, so Nvidia has a huge edge in designing its GPUs: it needs fewer transistors and consumes less power while using an older process at the foundry. And they managed to fit in a completely new set of technologies they call ray tracing and AI. Those GPUs beat AMD's hands down on features while using older technology, and those designs are three years old now. So just think about the time they have had to prepare their next step, on a Samsung process even better than AMD's TSMC 7nm. It will be tremendous. And expect Intel to contemplate the irony of only being able to fight for the ~20% of the market left to AMD fans, and with faulty 10nm tech, while even AMD will be on 7nm EUV. I'm already in pain for Intel, and AMD has all my sympathy in trying to catch up, as long as they make cheaper GPUs than they sold this year. The 5700 and 5500 series are overpriced and shouldn't have sold the way they did; they just rode a "new thing" and "being different" effect, and that won't last. I already see massive disappointment with the 5500 line because of the price.
 
You can't compare CPU architectures to GPU architectures, was the poster's point. Still, you can't compare processes from different fabs, either. Everybody was on TSMC this generation of video cards, but NVIDIA is going with Samsung on Ampere. From what we know from public information, transistor density ranks: GloFo < TSMC < Samsung < Intel.
Not clear yet, and it depends on the technology. For transistor density with reliable results, it's rather Samsung = TSMC > Intel >> GloFo. Actually, Samsung's 7nm is roughly equal to TSMC's 6nm and better than TSMC's 7nm+, and the two foundries are at about the same level of advancement. In fact, AMD will not use the top TSMC technology for Zen 3 and the RDNA 2 graphics cards, just as it used plain 7nm for Vega 20. They should have used 6nm to stay in touch with the best, like Huawei and Apple, which are bigger clients than AMD. Huawei is starting to use 6nm, and Apple is on the way to producing 6nm chips for the 2020 iPhones and iPads. AMD will use 5nm for Zen 4, so they will jump over one TSMC node.

The only lead Intel has today is in the false modesty of its naming. That dates back to before TSMC and Samsung came close and caught up with Intel's technology, when Intel named its nodes as if it didn't care to show how far ahead of the competition it was. Their 10nm is now close to TSMC's 7nm Pro and Samsung's 8nm, but those are already old technologies at TSMC and Samsung, while Intel's still isn't working as expected. Intel may catch up at the end of 2021 with their 7nm, announced as equal to TSMC/Samsung 5nm, but both of those are already in pre-production while we haven't seen anything from Intel yet, and as has been the case for five years now, Intel could be very far from ready and unable to produce the expected chips.
 
Read my lips. 2x performance on raster, 10x performance on ray tracing, price drop on the high end. 16GB on the 3080 Ti, 12GB on the 3070/3080, 8GB on the 3060, 6GB on the 3050 Ti, 4GB on the 3050. All capable of ray tracing. The 3060 beating the 2080 Ti on raster and far better at ray tracing. The 3050 better at ray tracing than the 2080 Ti. All of them under $1000. Titan Ampere with 32GB at $2000.
AMD, with RDNA2 on TSMC 7nm+ (somewhat below the Samsung 7nm specs Nvidia will use for Ampere), will only be beating Nvidia's Turing 2080 Ti on 12nm (in fact 16nm+), and only with a huge Navi 23 high-end GPU using expensive HBM2. My bet is that AMD will never be as far behind Nvidia in performance as in 2020, but they may eventually catch up over the next few years because they are making huge money with CPUs.
AMD already can't compete with Turing, even using a greatly improved 7nm process on fairly big chips, against Nvidia on an enhanced 16nm process (called 12nm by TSMC but really 16nm+).
In that case, put the RX 5700 XT against the RTX 2070/2060 Super (which are very close and use the same chip): the RX 5700 XT is based on a 251 mm² chip on 7nm, while the RTX 2070 is on a 445 mm² chip, but around 40% of that chip is used for ray tracing and AI, so only about 267 mm² is used for raster, and all of it is on 16nm+ aka 12nm. Both GPUs have roughly the same transistor count (10.8 billion for Nvidia vs 10.1 billion for AMD), but Nvidia uses only around 6.5 billion for raster.
For the same use (raster gaming), Nvidia has a slightly better TDP and runs at roughly the same speed, so Nvidia has a huge edge in designing its GPUs: it needs fewer transistors and consumes less power while using an older process at the foundry. And they managed to fit in a completely new set of technologies they call ray tracing and AI. Those GPUs beat AMD's hands down on features while using older technology, and those designs are three years old now. So just think about the time they have had to prepare their next step, on a Samsung process even better than AMD's TSMC 7nm. It will be tremendous. And expect Intel to contemplate the irony of only being able to fight for the ~20% of the market left to AMD fans, and with faulty 10nm tech, while even AMD will be on 7nm EUV. I'm already in pain for Intel, and AMD has all my sympathy in trying to catch up, as long as they make cheaper GPUs than they sold this year. The 5700 and 5500 series are overpriced and shouldn't have sold the way they did; they just rode a "new thing" and "being different" effect, and that won't last. I already see massive disappointment with the 5500 line because of the price.

Now that is pure comedy.
 
You may see the same difference between Turing (16nm+, called 12nm) and Ampere (7nm EUV) as you did between Kepler at 28nm and Turing at 16nm+ (called 12nm). It's going to be huge.
Watch:
 
You may see the same difference between Turing (16nm+, called 12nm) and Ampere (7nm EUV) as you did between Kepler at 28nm and Turing at 16nm+ (called 12nm). It's going to be huge.
Watch:


The days of massive transistor count increases (and the resulting performance gains) are behind us. In the current world, cost per transistor has flat-lined, which means if you want to add 30% more transistors when you shrink the process, it will cost 30% more...

In the good old days, these extra transistors were basically FREE, so we got massive perf increases on process shrinks. Today they will be much more modest.
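A toy model of what "flat cost per transistor" means for die cost (all numbers below are illustrative placeholders, not real foundry pricing):

Code:
# Toy cost model: with flat cost per transistor, die cost scales
# linearly with transistor count. All numbers are illustrative.
def die_cost(transistors_billion, cost_per_billion=1.0):
    return transistors_billion * cost_per_billion

base = die_cost(12.0)            # a 12B-transistor die at today's cost per transistor
bigger = die_cost(12.0 * 1.30)   # add 30% more transistors, same cost per transistor
print(f"flat cost/transistor: {bigger / base - 1:+.0%}")   # +30%

# In the good old days a shrink roughly halved cost per transistor,
# so 30% more transistors could still mean a cheaper die:
old_school = die_cost(12.0 * 1.30, cost_per_billion=0.5)
print(f"old-school shrink:    {old_school / base - 1:+.0%}")  # about -35%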
 
The days of massive transistor count increases (and the resulting performance gains) are behind us. In the current world, cost per transistor has flat-lined, which means if you want to add 30% more transistors when you shrink the process, it will cost 30% more...

In the good old days, these extra transistors were basically FREE, so we got massive perf increases on process shrinks. Today they will be much more modest.
Indeed. This is what drives me crazy about people saying the RTX cards cost so much because NVIDIA is greedy, and that AMD is pricing similarly because NVIDIA did it. The cost increase was practically 1:1 compared to the transistor increase. Qualcomm called it back in 2015 that 28nm would become the most cost-optimal node.

[attached image: upload_2019-12-16_9-29-10.png]
 
Definitive proof that the future is in chiplet designs in order to keep yields up. Hope they can make that shift in GPUs eventually.

I still believe the 2080 Ti is priced a bit high, but that's the name of the game for the enthusiast product. Either way, we won't be seeing affordable/budget 750 mm² dies in my lifetime, so I agree. The 8800 GTX was up there in size, and I believe it was selling for as much as $800 on Newegg during its first year, and that was over a decade ago, ignoring inflation.

I suppose a 750 mm² 3080 Ti WOULD be a decent leap in performance, but man... that thing would likely push a $2000 end-user price.
 
Titan Ampere with 32GB at $2000.

The next console cycle is going to see a serious leap in graphics memory on the chip: either extremely fast embedded memory all in a single package, with CPU/GPU/RAM/GDDR all soldered together, or something to the tune of 128GB of memory for the GPU in a standalone package. We will be looking at the years 2026-2027 for a full system overhaul, which would put us squarely into the 8K120 league required to significantly bump up graphics fidelity. Throw in ray tracing, and as-yet-uninvented GPU tech, and it will take a memory beast to keep those levels flowing and the GPUs fed.
 
The next console cycle is going to see a serious leap in graphics memory on the chip: either extremely fast embedded memory all in a single package, with CPU/GPU/RAM/GDDR all soldered together, or something to the tune of 128GB of memory for the GPU in a standalone package. We will be looking at the years 2026-2027 for a full system overhaul, which would put us squarely into the 8K120 league required to significantly bump up graphics fidelity. Throw in ray tracing, and as-yet-uninvented GPU tech, and it will take a memory beast to keep those levels flowing and the GPUs fed.

 
Definitive proof that the future is in chiplet designs in order to keep yields up. Hope they can make that shift in GPUs eventually.

I still believe the 2080 Ti is priced a bit high, but that's the name of the game for the enthusiast product. Either way, we won't be seeing affordable/budget 750 mm² dies in my lifetime, so I agree. The 8800 GTX was up there in size, and I believe it was selling for as much as $800 on Newegg during its first year, and that was over a decade ago, ignoring inflation.

I suppose a 750 mm² 3080 Ti WOULD be a decent leap in performance, but man... that thing would likely push a $2000 end-user price.
The 8800 GTX was 484 mm² at a time when die size was around 300 mm² for "big" chips, and I bought a pair of them when they launched in 2006 for $579 apiece on Newegg. For comparison, the 7800 GTX was $599 at release.
 
I'd advise a more panoramic, contextual perspective on things:

1) New 2020 consoles will likely have ~16GB of memory with ~12GB addressable by the GPU, so effectively ~12GB of VRAM.
2) Raytracing will be available in both AMD/NV.
3) 7nm is a given at this point.
4) CPUs are moving to chiplets, GPUs are the next logical candidate, but 2020 seems too soon.

So what I'd expect for RTX 3xxx is:

1) 8GB VRAM minimum for x60 and higher up models
2) Some progress (the +1 tier per generation type, like old times) on RT performance (so 3060 ~ 2070 in RT).
3) Better power efficiency = more performance/same power, or same performance/lower power, depending on GPU tier.
4) No chiplet GPU in 2020. Maybe for the scientific market in 2021, consumer by 2022/2023.

At this point, buying a non-raytracing GPU is shooting yourself in the foot, but you could have one more year of no RT depending on the games published in 2020. However, after 2020, with consoles using RT, it'll be a given for the great majority of games, even if only lighter-RT rather than fully RT games (that's a ways off anyway). Anything less than 8GB of VRAM is certainly a no-go, since new consoles will up the addressable video memory by quite a bunch.

A 7nm 8GB RTX 3060 for $300 wouldn't be a bad option at all. I wouldn't pay a single cent over $300 (current 2060 is way overpriced if you ask me), and if Nvidia is smart they'll realize RT will be needed everywhere not long from now, so maybe the current 2060 raytracing performance goes down a tier?

A 7nm 6GB RTX 3050 for $200 wouldn't be a bad option at all.

As for 3070/3080, I don't much care about those tiers, I'm a 2560x1080 ultrawide gamer, so I don't need much more than mid-tier to max out games.

As usual, there's no bad product or performance, just bad pricing. I'm sure some would say that my price expectations are too optimistic, and I see that. Nvidia keeps saying how "expensive" it is to include RT hardware. The second AMD includes RT hardware themselves, I'm pretty sure it'll magically stop being so expensive. Not that AMD will drastically lower prices, but basic feature competition will force those prices down. A $50 decrease on the 3060 vs the 2060 would not be a crazy reduction, while a 3050 would actually increase the price generationally. Welcome a new 3040 at the previous 1650 tier? Wouldn't put it past Nvidia; they certainly have enough models, something like one every $30, and it'd give them some breathing room for product segmentation: 1650 upgrades to 3040, 2060 upgrades to 3050, 2070 upgrades to 3060, etc.
 
At this point, buying a non-raytracing GPU is shooting yourself in the foot

'Cause there are so many games that implement it to the max...

Right now ray tracing is a cool little tech demo, but other than that it's a gimmick. The only game with decent ray tracing performance is a little game called Quake, from back when the last president almost got impeached for putting his penis inside the mouth of a chubby secretary until he was satisfied.

I'm all for ray-traced games... when they have reached par for graphics performance. Until then, we barely have the hardware to run 4K titles as it is. Until we get full-blown 4K ray tracing at 60 fps with all forms of ray tracing being used, and not just some reflections here and there, it's literally just a gimmick.


As usual, there's no bad product or performance, just bad pricing. I'm sure some would say that my price expectations are too optimistic, and I see that. Nvidia keeps saying how "expensive" it is to include RT hardware. The second AMD includes RT hardware themselves, I'm pretty sure it'll magically stop being so expensive. Not that AMD will drastically lower prices, but basic feature competition will force those prices down. A $50 decrease on the 3060 vs the 2060 would not be a crazy reduction, while a 3050 would actually increase the price generationally. Welcome a new 3040 at the previous 1650 tier? Wouldn't put it past Nvidia; they certainly have enough models, something like one every $30, and it'd give them some breathing room for product segmentation: 1650 upgrades to 3040, 2060 upgrades to 3050, 2070 upgrades to 3060, etc.

You haven't been paying attention. AMD has brought their pricing into alignment with Nvidia; they have no interest in cutting us a deal now. The 5700 XT was an RX 580 replacement that they decided to bump up in price by 100-150 dollars because that's where Nvidia set the market. Look at the 5500 XT and let me know if AMD is doing us any favors. Take another look at the 32-core Threadripper and you'll see that AMD would gouge us if they could get away with it, monetarily speaking. Their products have just been such shit for so many years that they had no other choice but to fire-sale them to gain market share. Those days are long gone.
 
I'm all for ray-traced games... when they have reached par for graphics performance. Until then, we barely have the hardware to run 4K titles as it is. Until we get full-blown 4K ray tracing at 60 fps with all forms of ray tracing being used, and not just some reflections here and there, it's literally just a gimmick.

I really don't get the "if it doesn't run at 4K60, it doesn't count!" angle. I have multiple high-refresh VRR displays, but I also have 1080p60 displays, and I game on all of them. Ray tracing incurs a performance hit. It must. We'll have reached another computing paradigm when it doesn't.

But that doesn't make it a gimmick, when it both adds to the experience of new AAA titles (yes, they exist!) and is playable with it enabled. And I get that it doesn't mean much to you; it doesn't mean much to me personally either, but again, that's simply down to my (and your) game tastes, not an attribute of the technology itself.

Also, a word on Quake II RTX: while yes, this is an old engine, it's also open-source and well understood, which made it a great testbed for exploring more complete implementations of ray tracing without the baggage of modern AAA games and the engines they use. It's not meant to define ray tracing so much as a means of explaining it.

You haven't been paying attention. AMD has brought their pricing into alignment with Nvidia; they have no interest in cutting us a deal now. The 5700 XT was an RX 580 replacement that they decided to bump up in price by 100-150 dollars because that's where Nvidia set the market. Look at the 5500 XT and let me know if AMD is doing us any favors. Take another look at the 32-core Threadripper and you'll see that AMD would gouge us if they could get away with it, monetarily speaking. Their products have just been such shit for so many years that they had no other choice but to fire-sale them to gain market share. Those days are long gone.

We'll honestly be lucky if AMD decides to calm their pricing tits in situations where they compete only with themselves as much as Intel and Nvidia have done over the last decade. While Intel stumbled something fierce on their 10nm rollout, had they met their initial release cadence (in line with previous releases), we'd have had US$300 octocore CPUs and Zen would have released at firesale prices too.

I find it hilarious when folks complain about pricing without accounting for the various factors involved.
 
A 7nm 8GB RTX 3060 for $300 wouldn't be a bad option at all. I wouldn't pay a single cent over $300 (current 2060 is way overpriced if you ask me), and if Nvidia is smart they'll realize RT will be needed everywhere not long from now, so maybe the current 2060 raytracing performance goes down a tier?

I don't think that's gonna happen. $400 maybe. And AMD won't sell 2070-level RT for less than $400 either. Unfortunately.
 
'Cause there are so many games that implement it to the max...

Right now ray tracing is a cool little tech demo, but other than that it's a gimmick. The only game with decent ray tracing performance is a little game called Quake, from back when the last president almost got impeached for putting his penis inside the mouth of a chubby secretary until he was satisfied.

I'm all for ray-traced games... when they have reached par for graphics performance. Until then, we barely have the hardware to run 4K titles as it is. Until we get full-blown 4K ray tracing at 60 fps with all forms of ray tracing being used, and not just some reflections here and there, it's literally just a gimmick.




You haven't been paying attention. AMD has brought their pricing into alignment with Nvidia; they have no interest in cutting us a deal now. The 5700 XT was an RX 580 replacement that they decided to bump up in price by 100-150 dollars because that's where Nvidia set the market. Look at the 5500 XT and let me know if AMD is doing us any favors. Take another look at the 32-core Threadripper and you'll see that AMD would gouge us if they could get away with it, monetarily speaking. Their products have just been such shit for so many years that they had no other choice but to fire-sale them to gain market share. Those days are long gone.

RT is a lot more than a gimmick at the moment, and very enjoyable at 1080p and 1440p. Get yourself an RTX card and give it a try. Some fun titles out there.
 
I don't think that's gonna happen. $400 maybe. And AMD won't sell 2070-level RT for less than $400 either. Unfortunately.

Yeah, with the 2060 Super at $400, I don't see them selling something better than that for $300 next year. Maybe a $350 MSRP 3060, with $400 MSRP FE.
 
'Cause there are so many games that implement it to the max...

Right now ray tracing is a cool little tech demo, but other than that it's a gimmick. The only game with decent ray tracing performance is a little game called Quake, from back when the last president almost got impeached for putting his penis inside the mouth of a chubby secretary until he was satisfied.

I'm all for ray-traced games... when they have reached par for graphics performance. Until then, we barely have the hardware to run 4K titles as it is. Until we get full-blown 4K ray tracing at 60 fps with all forms of ray tracing being used, and not just some reflections here and there, it's literally just a gimmick.




You haven't been paying attention. AMD has brought their pricing into alignment with Nvidia; they have no interest in cutting us a deal now. The 5700 XT was an RX 580 replacement that they decided to bump up in price by 100-150 dollars because that's where Nvidia set the market. Look at the 5500 XT and let me know if AMD is doing us any favors. Take another look at the 32-core Threadripper and you'll see that AMD would gouge us if they could get away with it, monetarily speaking. Their products have just been such shit for so many years that they had no other choice but to fire-sale them to gain market share. Those days are long gone.
I guess you skipped the posts explaining that cost per transistor is not going down, and how the price scaled linearly with the number of transistors on the die.
 
I guess you skipped the posts explaining that cost per transistor is not going down, and how the price scaled linearly with the number of transistors on the die.

This is also a good point. I guess a 3060 at $300 is wishful thinking; no matter how much AMD competes, these prices will not (cannot?) go down for now. I still see the possibility of an RTX 3050 for $200-250 that gets to similar performance to the 2060. It's not like Nvidia is going to keep RT in their more expensive GPUs forever; they can't, now that consoles will include RT starting next year, so they need a base RT-enabled model that has actual RT hardware, unlike the 16 series. Precisely because hardware is no longer necessarily cheaper to manufacture - at least until chiplets are available on GPUs, I foresee 3050 $250, 3060 $350, 3070 $450 and up. This would make generational sense, as afterwards it'd open up the space for a 4040, 4050, 4060, 4070 and up (where the 4040 should approach the 3050's performance). By then, we're talking 2022/2023, so this is certainly within performance expectations (plus competition from AMD, however successful it may end up being).

Then again, none of this matters if the 3050 doesn't perform like the 2060 or if the 3060 doesn't perform like the 2070. If all we get are minor performance improvements like the Super range, a whole lot of people are NOT going to spend hundreds of dollars for a little gain. I don't think this is likely, because it'll be a new process node AND a new generation number (3xxx), unlike 2019, the "Super" minimal-upgrade year.

We do know that the $200-250 range is the biggest seller for GPU makers by a long stretch, and with new consoles doing RT and having way more VRAM, the $200 level is going to have to step it up considerably to just run next year's games decently at all - and by that I mean 1080p mid-high quality. A new 3xxx generation where there's no $200 raytracing-hardware card makes no sense to me. I don't need RT today, but if new console games are coming by next fall, no way I'd buy a new $200-300 GPU that won't support it in hardware. Maybe I'm in the minority.
 
Precisely because hardware is no longer necessarily cheaper to manufacture - at least until chiplets are available on GPUs, I foresee 3050 $250, 3060 $350, 3070 $450 and up.

If they can drop the die size, they can really drop unit costs for the same level of performance -- and they can scale performance higher at the same time. It's a win / win, really.

Further, there's nothing stopping them from using an interposer instead of a PCB for groups of like chiplets; this isn't really necessary for CPUs a la Zen 2, but since the bandwidth requirements for real-time graphics are on an entirely different scale, an interposer solution might make the whole thing more economical as performance needs increase.
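A toy yield model shows why smaller dies (or chiplets) change the economics so much; the defect density and die sizes below are placeholder numbers chosen just to show the shape of the curve, not real foundry data:

Code:
import math

# Simple Poisson yield model: yield = exp(-D0 * A)
# D0 = defect density in defects/mm2 (placeholder), A = die area in mm2.
def poisson_yield(area_mm2, d0=0.002):
    return math.exp(-d0 * area_mm2)

for area in (150, 250, 450, 750):  # small chiplet up to a 2080 Ti-class monolith
    print(f"{area:>4} mm2 die -> ~{poisson_yield(area):.0%} yield")

# With chiplets you test each die and only package the good ones, so the
# scrap rate is that of a small die, not of one huge monolith:
print(f"scrap, 190 mm2 chiplet:  ~{1 - poisson_yield(190):.0%}")
print(f"scrap, 750 mm2 monolith: ~{1 - poisson_yield(750):.0%}")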

Then again, none of this matters if the 3050 doesn't perform like the 2060 or if the 3060 doesn't perform like the 2070. If all we get are minor performance improvements like the Super range, a whole lot of people are NOT going to spend hundreds of dollars for a little gain. I don't think this is likely, because it'll be a new process node AND a new generation number (3xxx), unlike 2019, the "Super" minimal-upgrade year.

We really should expect a significant uplift all around. The 'Super' lineup represents the best that can be done on TSMC's "12nm" node; by adding RTX with Turing on essentially the same node as Pascal, Nvidia really did push fabrication tech to the max to produce the 2000 series. So I agree: with the 3000 series, there should at least be a significant increase in performance per clock due to more execution units as well as architectural enhancements, and hopefully RT performance will increase five- to ten-fold, as well as be present on all parts down the stack, save the hypothetical 'GTX 3030' parts that are usually cut down in other ways too, like limited transcoding blocks.

The typical 'GTX3050(Ti)' should definitely rock 1080p120 and 1440p60 on AAA-titles at medium overall settings and perhaps low RT.

A new 3xxx generation where there's no $200 raytracing-hardware card makes no sense to me. I don't need RT today, but if new console games are coming by next fall, no way I'd buy a new $200-300 GPU that won't support it in hardware. Maybe I'm in the minority.

I get the omission from lower-end parts today; both the 1660 and RX5700 make sense without RT and wouldn't make much sense with it. Even the 2060 is a tad questionable, if only because most versions of that GPU aren't going to be kitted up to an appropriate level of performance to make that tradeoff, but at the same time, that inclusion was likely calculated to bolster market penetration of the technology.

And I'll agree again going forward, barring some *30 OEM queen (or APUs), I'm really not interested in GPUs that don't support RT. I expect even indie-grade games to start dabbling with hardware RT acceleration as a cheap boost to visuals once the hardware install base reaches critical mass.
 
I expect even indie-grade games to start dabbling with hardware RT acceleration as a cheap boost to visuals once the hardware install base reaches critical mass.

100%. RT does make several other effects easier to implement, which Nvidia has drummed up to the moon and beyond. Soon they won't have to, as commonplace RT hardware is now a given in 2020 thanks to consoles. Once that jump is made, I expect most reflections and some other lighting techniques to just use RT from 2020 on, with some fallback to regular raster for a few years, say until 2025 or so. Kind of like how it took a few years for pixel shaders to become commonplace, but once all price-tier GPUs shipped with them, it was a done deal - every game switched to them. RT will follow the same path.
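For a sense of why reflections in particular get simpler once ray casts are cheap, here's a minimal sketch (pure illustration, not any engine's actual code; trace() is a stand-in for whatever ray-cast API an engine would expose):

Code:
# Minimal ray-traced reflection bounce, illustration only.
def reflect(direction, normal):
    # r = d - 2 * (d . n) * n, assuming a unit-length normal
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return tuple(d - 2.0 * d_dot_n * n for d, n in zip(direction, normal))

def shade_reflective(hit_point, view_dir, normal, trace):
    # One extra ray per shaded point gives a plausible mirror reflection,
    # with no screen-space tricks or pre-baked cube maps.
    return trace(hit_point, reflect(view_dir, normal))

# Dummy usage: pretend everything the reflected ray hits is sky-colored.
sky = lambda origin, direction: (0.2, 0.4, 0.8)
print(shade_reflective((0.0, 0.0, 0.0), (0.0, -0.707, 0.707), (0.0, 1.0, 0.0), sky))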
 
In the PC world there will be a glut of non-RT cards for many years. I do not see much incentive to take advantage of RT on PCs for a while, except for limited titles. I will probably end up buying a console late next year for RT reasons, and maybe VR as well. Another way I look at it: DX12- and Vulkan-capable cards have been out for a long while, and yet years later a large percentage of games still have no or only mediocre support for those APIs. RT for the PC has been out for over a year now; look at the number of titles available that support RT and how many of those actually make a significant "oh, I have to get this" difference. I'm not being negative about RT, just looking at it more realistically. As for necessary price hikes, I would disagree: features and capabilities have been added to GPUs for many years with decreasing prices. The new process nodes make it harder, but not at the price hikes we have seen. AMD's Ryzen CPUs are not much more expensive than their 12nm versions while at the same time being way more capable. Nvidia's large-die GPUs will probably hit an expensive wall going down to 7nm this coming round; hopefully they will be worth it this time.
 
100%. RT does make several other effects easier to implement, which Nvidia has drummed up to the moon and beyond. Soon they won't have to, as commonplace RT hardware is now a given in 2020 thanks to consoles. Once that jump is made, I expect most reflections and some other lighting techniques to just use RT from 2020 on, with some fallback to regular raster for a few years, say until 2025 or so. Kind of like how it took a few years for pixel shaders to become commonplace, but once all price-tier GPUs shipped with them, it was a done deal - every game switched to them. RT will follow the same path.

The bulk of the market is cards under $250. They really can't depend on RT until they sell a significant number of cards in that price range that have usable RT.

I expect it will be many years before we see usable RT at that price point, and then many more to have significant market share.

Though I think going forward all high-end cards must have RT. Big Navi will certainly have RT, and all of Nvidia's higher-end cards already do and always will going forward.
 
AMD's Ryzen CPUs are not much more expensive than their 12nm versions while at the same time being way more capable.

The dies are smaller ;)

Nvidia's large-die GPUs will probably hit an expensive wall going down to 7nm this coming round; hopefully they will be worth it this time.

Not likely -- we may not see prices decrease, particularly since AMD continues to fail to provide competition, but performance will certainly improve.
 
The bulk of the market is cards under $250. They really can't depend on RT until they sell a significant number of cards in that price range that have usable RT.

I expect it will be many years before we see usable RT at that price point, and then many more to have significant market share.

A couple of things in the wind might change this outlook, which I do find correct today:
  • Adding RT to consoles means, at the very least, developers are looking to make use of it in that limited performance envelope
  • Successive RT implementations have been more reserved for where RT can make a real difference in final image quality, and accordingly the performance hit for enabling it has also dropped
Basically, the trend is broader industry usage with increasingly finely-tuned implementations. We very well could see 5500 XT- / 1650-class replacements running limited RT in indie-grade titles, as an example.
 
The bulk of the market is cards under $250. They really can't depend on RT until they sell a significant number of cards in that price range that have usable RT.

I expect it will be many years before we see usable RT at that price point, and then many more to have significant market share.

That's just it - how do we define what 'usable RT' means? Nvidia has strongly hinted during the past year that it means 1080p60. If there's an RTX 3050 this year that gives us close to 2060 performance for $250, then we'll have kind of achieved that point in 2020. For sure 1080p60 will be attainable in 2021, even if only with the middle-level RT that we see in current games. 2022, I'd guess, should be enough time to enable all RT features (not fully RT-rendered, but all the RT techniques that we see today) at 1080p60 for $200. That's my guess at least.
 
That's just it - how do we define what 'usable RT' means? Nvidia has strongly hinted during the past year that it means 1080p60. If there's an RTX 3050 this year that gives us close to 2060 performance for $250, then we'll have kind of achieved that point in 2020. For sure 1080p60 will be attainable in 2021, even if only with the middle-level RT that we see in current games. 2022, I'd guess, should be enough time to enable all RT features (not fully RT-rendered, but all the RT techniques that we see today) at 1080p60 for $200. That's my guess at least.

Does the 2060 do 1080p60 with RT?
 
The next-gen consoles arrive in late 2020, so when do folks think the glut of new RT games for the PC will come about? I would think that in 2021, with the new consoles, games with RT features will start to appear; Xbox games with RT should make it to the PC if there is no weird hardware requirement unique to the Xbox. Now, if performance is severely hindered, I doubt many games on the consoles would use RT.

If only high-end cards have it next year, why would publishers go out of their way to support RT for the very small fraction of users who would have the hardware, and at what level of hardware support? 2060-class RT? Frankly, anything less than a 2080 Super does not seem that usable. For many, 1080p 60 FPS is not really an IQ-worthy resolution; if one is concerned about IQ, that falls upward to 1440p and more like 4K. RT would probably be utterly meaningless for those gamers at 1080p looking for 60 fps. 4K TVs are so prevalent that RT needs to support that resolution just like the consoles will, or at least upscale decently with good, playable frame rates. If one were really concerned about RT and supporting it, it looks like the best bet would be the next generation of consoles, which will support 4K from day one.
 
The dies are smaller ;)



Not likely -- we may not see prices decrease, particularly since AMD continues to fail to provide competition, but performance will certainly improve.
Will be very interesting nonetheless; hopefully some good solutions from at least one of them, and better if both kick some ass next year. AMD looks promising; Nvidia is a rather big unknown to me. If they pull something similar to Pascal, the party is on.
 
I expect to see a massive jump in ray tracing capability and performance, but the cards' overall performance gain over the current generation will not be as impressive. As long as AMD keeps coming up short with their offerings, Nvidia has little to worry about on the performance side. For now, anyway.
 
The next-gen consoles arrive in late 2020, so when do folks think the glut of new RT games for the PC will come about? I would think that in 2021, with the new consoles, games with RT features will start to appear; Xbox games with RT should make it to the PC if there is no weird hardware requirement unique to the Xbox. Now, if performance is severely hindered, I doubt many games on the consoles would use RT.

If only high-end cards have it next year, why would publishers go out of their way to support RT for the very small fraction of users who would have the hardware, and at what level of hardware support? 2060-class RT? Frankly, anything less than a 2080 Super does not seem that usable. For many, 1080p 60 FPS is not really an IQ-worthy resolution; if one is concerned about IQ, that falls upward to 1440p and more like 4K. RT would probably be utterly meaningless for those gamers at 1080p looking for 60 fps. 4K TVs are so prevalent that RT needs to support that resolution just like the consoles will, or at least upscale decently with good, playable frame rates. If one were really concerned about RT and supporting it, it looks like the best bet would be the next generation of consoles, which will support 4K from day one.

Remember, the next generation of RT cards will probably see notable jumps in performance. I assume that because this was the first generation, the improvements in ray tracing performance will be larger. Maybe not as big as the jumps in the early 2000s, but probably bigger than what we are used to. Seeing the RTX 3070 (or whatever it may be called) get a 50% jump in ray tracing performance wouldn't shock me. Essentially, the frame rate hit from ray tracing will be notably lower with newer generations of GPUs compared to the current RTX 2xxx series. But maybe I am wrong.
 
Consoles might get the option of 1080p with RT or 4K without.
The other option would be effects that scale with resolution.
 
The 2060 non-Super is quite a bit less powerful.

It doesn't look like the 2060 will do 60 FPS in Metro Exodus, even with ray tracing off:
https://static.techspot.com/articles-info/1793/images/2019-02-15-image.png

Here are some benches. The 2060 is dismal:
https://www.eurogamer.net/articles/...e-rtx-2060-super-rtx-2070-super-review?page=2

Well, especially with a 2060, you're not going to run games on ultra. Not directed at you - but it's like people have forgotten how to adjust settings for their rig.

I have a 2080 Ti at 3440x1440 (which is probably similar in FPS to a 2060 at 1080p). In some games, like Control, I found it much more visually appealing with all RTX effects on and all "normal" settings set to medium, rather than ultra with no RTX.

You also have to watch RTX benchmarks. They are usually done on a game's launch day and never revisited after the (usually large) performance tweaks are made to drivers/the game.
 