RTX 3xxx performance speculation

That was a really long time ago. Around 2001-2002 with the GeForce 3/4.

I know folks don’t like to admit it but fact is the 8800 GTX was a 485 mm^2 chip for $600 back in 2006. In 2019 you’re getting a 545 mm^2 RTX 2080 for $700.
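
For anyone who wants the per-mm^2 math, here's a quick back-of-the-envelope; the ~1.27 inflation factor is a rough 2006-to-2019 CPI adjustment I'm adding, everything else comes from the figures above:

```python
# Rough price-per-mm^2 comparison using the die sizes and launch prices above.
# The 1.27 factor is an approximate 2006 -> 2019 CPI adjustment.
cards = [
    ("8800 GTX (2006)", 485, 600, 1.27),
    ("RTX 2080 (2019)", 545, 700, 1.00),
]
for name, die_mm2, price, inflation in cards:
    nominal = price / die_mm2
    adjusted = price * inflation / die_mm2
    print(f"{name}: ${nominal:.2f}/mm^2 nominal, ${adjusted:.2f}/mm^2 in 2019 dollars")
```

Adjusted for inflation, the G80 actually cost a bit more per mm^2 of die than the TU104 does, which is roughly the point being made.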

We all want to pay less but it’s not true that GPUs have gotten expensive recently. They’ve been expensive for a long ass time.
I guess it’s partially an opinion. RTX sales figures tell a different story. When the next generations of cards and pricing emerge, I think we will know better what people thought of RTX pricing.

I'm still on a 1080 Ti, and money is not a factor.
 
To elaborate on my 5-20% guesstimate: unless they figure out chiplets, which I am highly skeptical of for GPUs, I wouldn’t expect much of an improvement in either performance or cost. 7nm costs about the same per transistor, and the 2080 Ti is already massive. Even if they wanted to go for a higher transistor count on 7nm, they are hitting TDP limits and cost would increase.

While I agree that the easy gains are likely over, they aren't going to give some kind of meaningless performance boost like 5-10%.

20%+ seems reasonable. While it might take a bit more transistors, a bit more clock speed, and maybe some efficiency tweaks...
 
Since we have a new variable to account for, RT performance, I'll state it like this:

Raster performance per bracket at ~120% with the RT hit for current and upcoming AAA titles at <10%.

The first part is what basically everyone here has written: the basic generational jump in raster performance. The second part adds the idea that enabling the sort of hybrid raytracing games are currently implementing will incur only a minimal performance penalty compared to Nvidia's Turing architecture.

I doubt that. Looking at frame breakdowns, most RT effects spend at least half their time in the shader cores doing shading and denoising for that effect.

That means that to drastically improve RT performance, you need to drastically improve RT intersection testing (the RT cores) and, critically, also increase shader performance by a similar amount.

If you just doubled the RT core performance, it wouldn't cut the RT effect penalty in half.
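
Here's a toy Amdahl's-law style sketch of that point. The 50/50 split between intersection work and shading/denoising comes from the frame breakdowns mentioned above; the 10 ms base cost and the perfect scaling are made-up assumptions:

```python
# Toy model of the RT frame-time penalty: part of it runs on RT cores
# (intersection testing), the rest on shader cores (shading + denoising).
def rt_penalty(base_ms, rt_core_share, rt_core_speedup, shader_speedup=1.0):
    intersect = base_ms * rt_core_share / rt_core_speedup
    shade_denoise = base_ms * (1 - rt_core_share) / shader_speedup
    return intersect + shade_denoise

base = 10.0  # hypothetical ms spent on an RT effect per frame
print(rt_penalty(base, rt_core_share=0.5, rt_core_speedup=2.0))                      # 7.5 ms
print(rt_penalty(base, rt_core_share=0.5, rt_core_speedup=2.0, shader_speedup=2.0))  # 5.0 ms
```

Doubling only the RT cores trims the 10 ms penalty to 7.5 ms; you only get the full halving when the shader side scales too.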
 
If you just doubled the RT core performance, it wouldn't cut the RT effect penalty in half.

I don't disagree -- but I am looking more to end results than specific subsystem improvements as I don't feel qualified to make predictions with that level of granularity ;)
 
While I agree that the easy gains are likely over, they aren't going to give some kind of meaningless performance boost like 5-10%.

20%+ seems reasonable. While it might take a bit more transistors, a bit more clock speed, and maybe some efficiency tweaks...

Yeah, even a 10% bump in clock and 15% bump in shaders (the least I think we've seen from nVidia) would be a 27% increase.
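
For anyone checking the arithmetic, the two bumps multiply (assuming performance scales linearly with both, which real games won't quite manage):

```python
# Compounding the clock and shader-count guesses above.
clock_gain = 1.10
shader_gain = 1.15
print(f"combined: {clock_gain * shader_gain:.3f}x")  # 1.265x, i.e. roughly a 26-27% uplift
```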

Maybe I should look closer at the Vega -> Navi change for a guesstimate... but I think the arch change was too drastic to get a grasp on how much of the frequency gain was from the change to 7nm.

I've been a little lazy on my math with video cards lately. I focused on 3900x vs 9900kf since that was my most recent decision. High end video cards have been pretty boring haha.
 
Maybe I should look closer at the Vega -> Navi change for a guesstimate... but I think the arch change was too drastic to get a grasp on how much of the frequency gain was from the change to 7nm.

Can't really ever use AMD as a predictor -- they're too good at cocking up their own releases, unfortunately.

Past Nvidia shrinks have been extremely productive, typically because they actually have a handle on their technology (no Fine Wine®), and given that they're also putting out their second generation of consumer-side ray tracing, it's within reason to see a higher bump than from Pascal to Turing.
 
3080 Ti needs to be at least 30% faster than the 2080 Ti or it won't be worth it.

Depends on how much performance they can squeeze out of a mature process.

Prices better not inflate again though! LOL.

2080 Tis are now priced at former Titan levels. And Titan prices basically doubled.

Problem is that people are shelling out. So the xx80Ti and up are now way way out of price range of most people.

AMD needs to bring out a competitor for the 2080 Ti in this small window before the 30xx drops. But will they price it just as exorbitantly or will prices return to a somewhat sane level?

4K is playable on a 2080 Ti but it's not even close to the refresh rates of 120 Hz-240 Hz monitors.

I'd settle for 120hz on my ultrawide 3440x1440 please!

Also, can we have a better ray tracing "2.0" without severe frame penalties? If they can reduce the penalty/overhead to around 10%, RTX would be a much more viable technology.
 
I focused on 3900x vs 9900kf since that was my most recent decision. High end video cards have been pretty boring haha.

Are they ever. RTX, while enjoyable, only has a few titles and a massive performance hit, and AMD isn’t anywhere in the high-end GPU segment. Maybe Intel will shake things up; I really doubt AMD can.

At least the processor market is interesting atm; it's been a long time since I could say that.
 
3080 Ti needs to be at least 30% faster than the 2080 Ti or it won't be worth it.

Depends on how much performance they can squeeze out of a mature process.

Prices better not inflate again though! LOL.

2080 Tis are now priced at former Titan levels. And Titan prices basically doubled.

Problem is that people are shelling out. So the xx80Ti and up are now way way out of price range of most people.

AMD needs to bring out a competitor for the 2080 Ti in this small window before the 30xx drops. But will they price it just as exorbitantly or will prices return to a somewhat sane level?

4K is playable on a 2080 Ti but it's not even close to the refresh rates of 120 Hz-240 Hz monitors.

I'd settle for 120hz on my ultrawide 3440x1440 please!

Also, can we have a better ray tracing "2.0" without severe frame penalties? If they can reduce the penalty/overhead to around 10%, RTX would be a much more viable technology.

Raytracing will always be demanding. There is a good reason why it hasn't been done realtime before and even now it's done with very few rays (which allows for it to be used realtime in the first place) and it's the denoising algorithms that make it look something other than a noisy mess. By the time the hardware catches up developers will just start casting more rays for increased fidelity or implement more path traced solutions.

Nvidia seems to put a lot of effort into improving their upscaling features. DLSS and upscaling with sharpening are going to be important going forward when playing on 4K screens. Game engines are also implementing reconstruction tech to aid performance in motion. I feel that going above 1440p runs into diminishing returns for the increased fidelity, so I don't mind running at sub-native resolution if it gives me a big performance boost and the differences in image quality have to be combed from screenshots. Running at say 0.75x of 4K is still a lot of pixels, so it's not the same thing as consoles running at sub-1080p resolutions.
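
For reference, the pixel math behind that "0.75x of 4K" point; a 0.75x scale applies to both axes, so the pixel count drops to about 56% of native, which still lands well above 1440p:

```python
# Pixel counts at native 4K, at a 0.75x resolution scale, and at 1080p.
resolutions = [
    ("4K native", 3840, 2160),
    ("0.75x of 4K", int(3840 * 0.75), int(2160 * 0.75)),
    ("1080p", 1920, 1080),
]
for name, w, h in resolutions:
    print(f"{name}: {w}x{h} = {w * h / 1e6:.2f} MP")
```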

Nvidia's pricing was banking on a continued mining craze. It died at about the same time the 20xx series was released, and now prices are getting pushed down. I hope this means they need to re-evaluate the cost of 3080 Ti cards at least, maybe dropping down to an average level around 1000 euros rather than 1200.

I'm pretty happy with the performance of my 2080 Ti at 5120x1440 when raytracing is not in use. I am sold on raytracing being a big improvement in visuals, so that's the area where I want to see the most improvement. RT performance improvements will be a more important factor in whether I sell my card and upgrade to a 3080 Ti or stick with it until the inevitable "3080 Ti Super". The only reason we didn't get one this time around is that they are pretty much tapped out on the current node, and even the Titan RTX was barely any faster.
 
Raytracing will always be demanding. There is a good reason why it hasn't been done realtime before and even now it's done with very few rays (which allows for it to be used realtime in the first place) and it's the denoising algorithms that make it look something other than a noisy mess. By the time the hardware catches up developers will just start casting more rays for increased fidelity or implement more path traced solutions.

Nvidia seems to put a lot of effort into improving their upscaling features. DLSS and upscaling with sharpening are going to be important going forward when playing on 4K screens. Game engines are also implementing reconstruction tech to aid performance in motion. I feel that going above 1440p runs into diminishing returns for the increased fidelity, so I don't mind running at sub-native resolution if it gives me a big performance boost and the differences in image quality have to be combed from screenshots. Running at say 0.75x of 4K is still a lot of pixels, so it's not the same thing as consoles running at sub-1080p resolutions.

Nvidia's pricing was banking on a continued mining craze. It died at about the same time the 20xx series was released, and now prices are getting pushed down. I hope this means they need to re-evaluate the cost of 3080 Ti cards at least, maybe dropping down to an average level around 1000 euros rather than 1200.

I'm pretty happy with the performance of my 2080 Ti at 5120x1440 when raytracing is not in use. I am sold on raytracing being a big improvement in visuals, so that's the area where I want to see the most improvement. RT performance improvements will be a more important factor in whether I sell my card and upgrade to a 3080 Ti or stick with it until the inevitable "3080 Ti Super". The only reason we didn't get one this time around is that they are pretty much tapped out on the current node, and even the Titan RTX was barely any faster.

I think the pricing is to maintain the same margins considering the dies are ~60% larger. A 1080 Ti/Titan Xp was 471 mm^2 and a 2080 Ti is 754 mm^2. The 2080's die is itself larger than the 1080 Ti/Titan Xp's, iirc.
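
A crude yield sketch of why a ~60% larger die costs well more than 60% extra to make: fewer die candidates per wafer and lower yield. Only the die areas come from the post; the wafer cost and defect density below are placeholder assumptions:

```python
import math

# Cost per good die under a simple Poisson yield model.
WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2   # 300 mm wafer, ignoring edge loss
WAFER_COST = 6000.0                          # hypothetical $ per wafer
DEFECTS_PER_MM2 = 0.001                      # hypothetical defect density

def cost_per_good_die(die_mm2):
    candidates = WAFER_AREA_MM2 / die_mm2               # crude, no edge effects
    yield_rate = math.exp(-DEFECTS_PER_MM2 * die_mm2)   # Poisson yield
    return WAFER_COST / (candidates * yield_rate)

for name, area in [("GP102, 471 mm^2", 471), ("TU102, 754 mm^2", 754)]:
    print(f"{name}: ~${cost_per_good_die(area):.0f} per good die")
```

Even with made-up inputs, the cost ratio comes out noticeably higher than the 1.6x area ratio.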
 
I think the pricing is to maintain the same margins considering the dies are ~60% larger. A 1080 Ti/Titan Xp was 471 mm^2 and a 2080 Ti is 754 mm^2. The 2080's die is itself larger than the 1080 Ti/Titan Xp's, iirc.

Yeah. Some people would rather ignore the massive increase in die size (and thus cost) and pretend it's just big Evil NVidia raising prices for no reason at all.
 
The 980 Ti was closer to the 1070 than the 1080. That was part of the disappointment of the RTX series: the 2080 provided the same performance as the 1080 Ti at a higher price. That didn't happen during the Pascal gen.
Maxwell to Pascal was a massive die shrink and the performance jump of 60-80% reflected that. There was no die shrink from Pascal to Turing. That's why the performance jump this gen was only 20-30%, which was typical of what we saw during the long 28nm era. We're going from 16nm to 7nm with Ampere, so I would expect another big jump coming.
 
Maxwell to Pascal was a massive die shrink and the performance jump of 60-80% reflected that. There was no die shrink from Pascal to Turing. That's why the performance jump this gen was only 20-30%, which was typical of what we saw during the long 28nm era. We're going from 16nm to 7nm with Ampere, so I would expect another big jump coming.

I wouldn't expect that. Like Dayaks indicated, price per transistor is now stagnant.

For decades this was THE driver of price/performance improvements in the Semiconductor industry.

You would see double the transistor/$ in full node shrinks.

Today getting 10% more transistors/$ would be a good outcome for a node shrink.
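
A small illustration of why that matters so much once it compounds over a few shrinks (the starting value is arbitrary):

```python
# Transistors per dollar after several node shrinks: the old pattern of
# doubling every shrink versus roughly +10% per shrink today.
shrinks = 4
doubling = [2.0 ** n for n in range(shrinks + 1)]
ten_percent = [1.10 ** n for n in range(shrinks + 1)]
print("2x per shrink:  ", [f"{x:.2f}" for x in doubling])     # 1, 2, 4, 8, 16
print("+10% per shrink:", [f"{x:.2f}" for x in ten_percent])  # 1, 1.10, ..., 1.46
```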
 
Yeah. Some people would rather ignore the massive increase in die size (and thus cost) and pretend it's just big Evil NVidia raising prices for no reason at all.

It’s the reality of the consumer: we (the general “we”) don’t care about the ins and outs of the manufacturing process, die size, etc. What we see is the brand and model cost compared to prior generations.

Nvidia’s biggest misstep was calling it a 2080 Ti instead of a Titan. If they had done that, no one would have blinked at the price, just at the small performance increase.
 
I guess it’s partially an opinion. RTX sales figures tell a different story. When the next generations of cards and pricing emerge, I think we will know better what people thought of RTX pricing.

I'm still on a 1080 Ti, and money is not a factor.

Turing simply didn’t make the same splash that Pascal did. For a lot of 1080/1080 Ti owners, myself included, there just wasn’t anything compelling unless you’re trying to push 4K.

It’s especially bad for 1080 Ti owners. They literally have only the 2080 Ti available as a meaningful upgrade.
 
I found the 2080 Ti a nice jump from my 1080 SLI setup. But mainly it was the diminishing returns and fading support for mGPU and SLI in new games these days.

I fully anticipated a 3080Ti within a year, but wanted to move to a single card solution and experience ray tracing as well. I think it looks great and am looking forward to the improvement in technology.

Did the price suck? Yeah, but let's not forget, unless you have a really old mobo/cpu setup, the video card is the single most important component in your PC for performance if you are a gamer. So personally, I kind of "get it" when it comes to the pricing.

If the performance jump is 30% or more, I'd sell my 2080 Ti and get a 3080 Ti for sure.
 
I would expect the RTX performance to at least double, if not triple. Seriously, I'm expecting Nvidia to double down on all things RTX.

Raster more like 10-20%.

And the Titan will just about handle 4K RTX in some games.
 
I found the 2080 Ti a nice jump from my 1080 SLI setup. But mainly it was the diminishing returns and fading support for mGPU and SLI in new games these days.

I fully anticipated a 3080Ti within a year, but wanted to move to a single card solution and experience ray tracing as well. I think it looks great and am looking forward to the improvement in technology.

Did the price suck? Yeah, but let's not forget, unless you have a really old mobo/cpu setup, the video card is the single most important component in your PC for performance if you are a gamer. So personally, I kind of "get it" when it comes to the pricing.

If the performance jump is 30% or more, I'd sell my 2080 Ti and get a 3080 Ti for sure.
I expect that as a bare minimum. The die shrink, new architecture, next-gen RTX, and now MS has just announced new DX12 features with raytracing tier 1.1.
 
It’s the reality of the consumer: we (the general “we”) don’t care about the ins and outs of the manufacturing process, die size, etc. What we see is the brand and model cost compared to prior generations.

Nvidia’s biggest misstep was calling it a 2080 Ti instead of a Titan. If they had done that, no one would have blinked at the price, just at the small performance increase.
"They" generally don't care, I agree. But I think this is the fault of people not fully grasping the concept of value. Consumer-perceived value isn't the only part of the equation. I was taken aback by the cost of these cards at first as much as anyone, but as soon as the white papers were made available I did research and fully understood what I was buying. This is also still a luxury item that not everyone should purchase and no one should feel entitled to it.

Don't get me wrong. If NVIDIA is able to release their top-tier card in the next generation for $650-$700 then I'm all for it. But no one should be shocked when it doesn't happen or blame NVIDIA when AMD and/or Intel release their next generation at the same price levels considering the increasing difficulty and cost of advancing current technology.
 
I would expect the RTX performance to at least double, if not triple. Seriously, I'm expecting Nvidia to double down on all things RTX.

Raster more like 10-20%.

And the Titan will just about handle 4K RTX in some games.

Either a true DLSS solution or at least twice the RTX performance if you want to get playable frame rates. And that's just in current RTX games. You'll have to triple that for upcoming games.
 
Either a true DLSS solution or at least twice the RTX performance if you want to get playable frame rates. And that's just in current RTX games. You'll have to triple that for upcoming games.

Eh fuck this RTX nonsense just give me better fps/performance please!
 
Just got the RTX 2080 Ti coming from the 1080 Ti and I just feel a bit ripped off :p RTX is an expensive gimmick...! It felt good buying the 1080 Ti on launch day; hopefully this card will retain most of its value until the next Ti gen. Don't really feel good buying it TBH. 100% I probably won't even bother to use RT.
 
Nah, 980 Ti to 1080 Ti was like 30 fps or something. 1080 Ti to 2080 Ti was 20 fps for 50% more price.
 
Just got the RTX 2080 Ti coming from the 1080 Ti and I just feel a bit ripped off

So you neglected to read or heed the reviews?

RTX is an expensive gimmick

Well, perhaps if you're not using it, but then why did you upgrade?

100% I probably won't even bother to use RT

...until a game comes out that you want to play with RT on, that is. Software drives hardware sales (except in your case above).

Nah, 980 Ti to 1080 Ti was like 30 fps or something. 1080 Ti to 2080 Ti was 20 fps for 50% more price.

FPS in what at what? With what frametime consistency?

FPS by itself is meaningless.
 
lol, maybe some addiction here. Anyways, if there isn't a significant performance bump, or a price cut with a modest performance increase, I will just skip the next Nvidia generation as well. Besides, if Nvidia is anemic, AMD will probably really embarrass them, which could be nice. I'd rather Nvidia prove they can do another Pascal-level jump.
 
AMD has been on 7nm for how long, and still just running middling mid-range products?

As much as I'd also like to see competition, history is unfortunately not on AMDs side.
I think it depends on how well RDNA scales, and that is just the rasterization side of things; if Nvidia can really get RT performance way up there with much less of a hit for the IQ increase, then AMD will still be behind. Given that AMD was late with RDNA, maybe they should just skip to RDNA 2 is another thought. Big RDNA may have a rather short lifespan unless it's priced super cheap.
 
Yeah. Some people would rather ignore the massive increase in die size (and thus cost) and pretend it's just big Evil NVidia raising prices for no reason at all.

I've said this ad nauseam, but it isn't the consumer's problem that Nvidia can't design a GPU that fits within the budget of expected GPU pricing. By passing that price increase directly on to the consumer at an alarming 75% markup over the previous generation (1080 Ti to 2080 Ti), all it did was breed ill will for many (myself included), especially when the next best option was $799 and didn't perform significantly faster than the 1080 Ti, which was $100 less. And 15 months after release we still only have a handful of games with RT. By the time it is mainstream, the 2080 Ti will be irrelevant. If I can get a 3080Ti in the ball park of $899? I might change my mind, but I'm not paying $1000+
 

I have no idea why people like to use these tech labels that AMD comes up with. They have no meaning and, historically, end up derided for not living up to AMD's promises, let alone all of the predictions levied upon them by The Faithful.

RDNA is just a less-unassed architecture that might support functional RT, but it is otherwise inferior to Turing at the outset, is late, and is about to have to compete with Ampere, assuming AMD is capable of launching a competitive 'big' Navi / RDNA GPU in the first place.
 
all it did was breed ill will for many (myself included)

I don't believe you, because you also post this:

If I can get a 3080Ti in the ball park of $899? I might change my mind


Basically, your ill will is toward Turing, which is entirely plausible if you didn't find utility and value in RT. I didn't either, for my gaming, so I do understand. But since we're speaking as enthusiasts here and attempting to speak to the 'progression' of GPUs, we should recognize that Nvidia built Turing with RTX cores on an older node, that the result was a larger, higher-TDP part, and that the increased costs are absolutely logical.

Consumers vote with their wallets, and as with you and me, Turing didn't get the vote from many.

However, the point is that with a die shrink and a generation to refine RT in Ampere, Nvidia has the potential to release a product that significantly addresses the performance per dollar issues that many of us held against Turing.
 
I don't believe you, because you also post this:




Basically, your ill will is toward Turing, which is entirely plausible if you didn't find utility and value in RT. I didn't either, for my gaming, so I do understand. But since we're speaking as enthusiasts here and attempting to speak to the 'progression' of GPUs, we should recognize that Nvidia built Turing with RTX cores on an older node, that the result was a larger, higher-TDP part, and that the increased costs are absolutely logical.

Consumers vote with their wallets, and as with you and me, Turing didn't get the vote from many.

However, the point is that with a die shrink and a generation to refine RT in Ampere, Nvidia has the potential to release a product that significantly addresses the performance per dollar issues that many of us held against Turing.

As you and I have discussed previously, I'm definitely a bang-for-the-buck whore at times. The 2080 Ti flies right in the face of my sensibilities at the launch price of $1200. That's why I'm on a vanilla 5700 (XT BIOS), which is damn near the top of bang for the buck (minus RT), especially considering it was sub-$300 and I sold the game bundle. I'm basically in this card for $250 for a brand new card.

I wouldn't say it's ill will toward Turing...I'd say ill will toward Nvidia's handling of Turing with the prices and the lack of usable features for the price. Essentially, you got a 25-30% increase in raster performance for an unprecedented $500 more than the previous generation. If the prices come back in line (or closer to expected) and the utility of RT starts to show itself more, then I could easily be persuaded to switch.
 
I'll likely be looking at Ampere myself, if / when RT becomes something that I'm truly interested in. That's likely Cyberpunk 2077.

But for now, the 1080 Ti trucks along, and I'll be damned if my ancient GTX 970 isn't still acquitting itself nicely...
 
If you are talking about a 10x performance increase in RT, then you need 10x more RT cores and 10x more shader cores, which means 10x more transistors. Which is totally impossible.

The max plausible transistor increase for RTX 3000 might be 30%, not 1000%. You aren't even going to get a 50% increase in RT performance out of that transistor budget, let alone 10x (1000%).
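
Putting rough numbers on that, reusing the ~50/50 intersection-vs-shading split mentioned earlier in the thread (all of this is illustrative and assumes perfect linear scaling):

```python
# Give a plausible 1.3x transistor budget evenly to RT cores and shader
# cores and see what happens to the RT effect cost.
budget = 1.30
rt_core_scale = shader_scale = budget
new_rt_time = 0.5 / rt_core_scale + 0.5 / shader_scale   # fraction of today's cost
print(f"RT effect time: {new_rt_time:.2f}x of today -> ~{1 / new_rt_time:.2f}x faster")
# ~0.77x the time, i.e. about 1.3x faster -- nowhere near 10x (1000%).
```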

Though there is a thread speculating on RTX 3000, that is a better place to discuss this.

Quoting you from the other thread into this one....

But wouldn't you think that Nvidia will become more efficient at RT in the 18 months between releases? For example, maybe each Ampere RT core is 30-50% faster at RT than the Turing ones.

I'm not saying 10x the performance difference, but if the end product is twice the performance, that might start to get it into the realm of playable, especially on mid-range hardware with something other than RT on low settings.
 
Quoting you from the other thread into this one....

But wouldn't you think that Nvidia will become more efficient at RT in the 18 months between releases? For example, maybe each Ampere RT core is 30-50% faster at RT than the Turing ones.

I'm not saying 10x the performance difference, but if the end product is twice the performance, that might start to get it into the realm of playable, especially on mid-range hardware.

No, I wouldn't.

Intersection testing is well-known, straightforward math that has been optimized for decades in software; it's not the complex interplay of textures and vertices that presents countless novel opportunities for optimization.

Intersection testing should have been close to optimal from the beginning. You might squeeze out a bit more, but nothing hugely significant.

IMO the best way to spend any increase in transistor budget is as evenly as possible across the board, because if you overspend on RT cores, you underspend on shader cores, and that impacts everything, both RT and traditional games. If you short-change shader cores, you get the same issue that pissed people off about RTX 2000: a small jump in traditional game performance AGAIN.
 