RTX 3xxx performance speculation

Nebell

Edit: keep politics out of this thread please.

I currently own a 1080Ti, and the 2080Ti is about 25-30% faster.
I think Nvidia got away with a lackluster improvement by adding new tech (ray tracing, DLSS, etc.).
But next year I believe there won't be any significant new tech, so my guess is that the 3080Ti will be at least 40-50% faster than the 2080Ti.
I'm almost definitely upgrading to 3080Ti.

What do you guys predict?
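To put rough numbers on that guess (the ~27% figure for the 2080Ti over the 1080Ti and the 40-50% range are my own assumptions from above), compounding the two jumps looks something like this:

# Rough compounding of generational uplifts; illustrative numbers only.
turing_over_pascal = 1.27            # 2080Ti vs 1080Ti, ~25-30% faster
next_over_turing_low = 1.40          # low end of my 40-50% guess
next_over_turing_high = 1.50         # high end of my 40-50% guess

low = turing_over_pascal * next_over_turing_low      # ~1.78x over a 1080Ti
high = turing_over_pascal * next_over_turing_high    # ~1.91x over a 1080Ti
print(f"3080Ti vs 1080Ti: {low:.2f}x to {high:.2f}x")

So for a 1080Ti owner like me, that guess would mean roughly an 80-90% jump, which is what makes the upgrade tempting.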
 
I will say it'll be 20%-ish for the same price. Maybe finally bump up to 16GB for the Ti.
 
I think it's like: the 980Ti was close to the 1080, the 1080Ti close to the 2080, and the 2080Ti will be close to the 3080. So the 3080Ti will probably provide around the same bump it has for the last three Ti's, plus or minus. Nvidia has had pretty good upgrades from Ti to Ti, but the price made the 2080Ti not really worth it for me; that is probably the biggest difference over the last three Ti generations. I think the 3080Ti will cost around the same as the 2080Ti... I hope they don't increase the price more, at least.
 
I think it's like: the 980Ti was close to the 1080, the 1080Ti close to the 2080, and the 2080Ti will be close to the 3080. So the 3080Ti will probably provide around the same bump it has for the last three Ti's, plus or minus. Nvidia has had pretty good upgrades from Ti to Ti, but the price made the 2080Ti not really worth it for me; that is probably the biggest difference over the last three Ti generations. I think the 3080Ti will cost around the same as the 2080Ti... I hope they don't increase the price more, at least.

The 980Ti was closer to the 1070 than the 1080. That was part of the disappointment of the RTX series: the 2080 provided the same performance as the 1080Ti at a higher price. That didn't happen during the Pascal gen.
 
To elaborate on my 5-20% guesstimate: unless they figure out chiplets, which I am highly skeptical of for GPUs, I wouldn't expect much of an improvement in either performance or cost. 7nm costs about the same per transistor, and the 2080Ti is already massive. Even if they wanted to go to a higher transistor count on 7nm, they are hitting TDP limits and cost would increase.
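As a toy illustration of the cost-per-transistor point (the numbers below are placeholders, not actual foundry pricing):

# If density roughly doubles but cost per unit of die area also roughly
# doubles, the cost per transistor barely moves. Placeholder ratios only.
density_gain = 2.0      # assumed transistor-density ratio, 12nm -> 7nm
area_cost_gain = 2.0    # assumed increase in cost per mm^2 of silicon
cost_per_transistor_ratio = area_cost_gain / density_gain
print(cost_per_transistor_ratio)   # 1.0 -> no per-transistor savings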
 
The 980Ti was closer to the 1070 than the 1080. That was part of the disappointment of the RTX series: the 2080 provided the same performance as the 1080Ti at a higher price. That didn't happen during the Pascal gen.
I actually had a 1080 in my PC at one time, and it was not much better than the OC'd 980Ti I ran then. But that card had lost the silicon lottery hard, though; it was one of the fatter Zotac cards vs my Gainward Golden Sample 980Ti. It was faster, yes, but not by much. That's why I said close to the 1080, since in my mind it really was.
 
I think regular performance will be the usual increase, raytracing performance will see a healthy boost, and pricing will be entirely dependent on where AMD lands in the performance stack.

If they don't do a better job in the mid-range segment with RT, they might as well can it. The average person isn't going to pay $1200+ just to get playable framerates at common resolutions with the higher-quality RT on.
 
The die shrink alone should give it higher clocks and lower temps. Architecture-wise I don't see huge advancements, just refinements: more CUDA/RT/Tensor cores. 2nd-generation RTX is intriguing. Maybe we'll finally get playable RTX games.
 
I would expect a return to normal generational performance bumps. 20xx disappointed on that front because they spent a huge chunk of the die area on aspirational new features instead of general performance uplift.
 
I would expect a return to normal generational performance bumps. 20xx disappointed on that front because they spent a huge chunk of the die area on aspirational new features instead of general performance uplift.

Unfortunately, it looks like Nvidia has a new idea of what normal pricing is in relation to past generational performance. Let's hope that doesn't continue, or else we'll see a $1500+ 3080Ti and a $1000 3080.
 
Yeah, the 2080Ti is basically what people call "whale" territory :p Literally the worst-value card from Nvidia in 5+ years, I think. I'm only really up to date from the 980Ti generation onward, as that was around when I had more money available; from there, the new Ti is the only one I skipped. I could get the new Ti, but honestly it's not worth it for 1440p anyway. I'm excited to see what Intel can pull off next year, though. I hope it's great, but you never know; they surely have the resources and ability to kill Nvidia, I think. And AMD saying they're making an "Nvidia killer" right now doesn't actually sound too far-fetched given their massive improvements on CPUs. Whether it works out to be another disappointment or not, neither I nor anyone else knows, but I'm hopeful. AMD has done great on graphics before, more so than on CPUs.
 
I would put it around this:

At best:
2080Ti = 3080

At worst:
3080Ti = 115% of the 2080Ti
 
Agreed, I think RT performance will get a major bump. Do we really need more raster performance?
Of course we need more raster. Right now it's high resolution or high refresh rate, not both. This RT sideshow is ten years from relevance.
 
Personally, I enjoy RT and have 0 regrets buying a 2080Ti. A price to pay for a hobby I passionately enjoy.

If I had to guess, I'd say the 3xxx series will be more traditional: the 3000s first, followed by a Ti 6 to 12 months after the initial release.

Performance will probably be better overall, though I still have doubts about 4K RT at high refresh rates. Even at 2K it's demanding now, and 4K is almost impossible without DLSS. I'd expect more refinement of DLSS and more RT cores, as I expect Nvidia to continue pushing this technology until it finally matures.

That's my guess!!!
 
I think it may all depend on when AMD releases a high-end card. Nvidia is simply sitting on their behind like Intel was, but we will see.
 
I predict disappointment abound. I’ll go with 5-20%.
I agree, though I was going to be more pessimistic, in the 5-15% range. The ray tracing engine will likely be 50% better. All of that for another 25% price hike above what we paid this generation.

Sad thing is, I will probably buy one...
 
Since we have a new variable to account for, RT performance, I'll state it like this:

Raster performance per bracket at ~120% with the RT hit for current and upcoming AAA titles at <10%.

The first part is what basically everyone here has written, the basic generational jump in raster performance, while the second part adds the idea that enabling the sort of hybrid raytracing that games are currently implementing will incur a minimal performance penalty compared to Nvidia's Turing architecture.
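A quick back-of-the-envelope version of that, with placeholder framerates (the ~40% RT hit for the current generation is just an assumption for illustration):

# Hypothetical numbers showing how the two factors combine.
current_raster_fps = 100                        # made-up baseline framerate
current_rt_hit = 0.40                           # assumed RT penalty today
next_raster_fps = current_raster_fps * 1.20     # ~120% raster per bracket
next_rt_hit = 0.10                              # <10% RT penalty predicted above

print(current_raster_fps * (1 - current_rt_hit))   # 60 fps with RT today
print(next_raster_fps * (1 - next_rt_hit))         # 108 fps with RT next gen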
 
I don't know how fast they'll be, but I know I'll be getting a 3080Ti. Now that I'm no longer CPU bound, I'm ready to spend my pennies on GPUs again. As of now I'm content with my 1080Ti @ 1440p, and the 2080Ti is too late in its life cycle to interest me. It didn't interest me a whole lot when it was new either.
 
Lower cost, lower power requirements, and more VRAM are what I'm hoping Nvidia brings to the table. Still not interested in ray tracing, and I'm doubtful the performance gains on the 30XX series will be substantial.
 
Pretty sure the days of gen-over-gen improvements of 40-50%+ are way over. I'd guess 15%-ish.


Says the man who forgot about Pascal. It offered an 80% performance increase over Maxwell.

Although 14nm to 7nm is not as big a leap as 28nm to 14nm was, it's still going to deliver a 50-60% density increase and power reduction. See here:

https://www.anandtech.com/show/1349...ction-of-chips-using-its-7nm-euv-process-tech

Nvidia was already able to coax a 20% efficiency increase out of the same process with Turing, so "up to 70%" is the range I would put things at (assuming die sizes are maintained).

I would expect somewhat smaller dies, so maybe 40-50% better performance? It would explain why they've waited almost two years to move on to a more advanced process node (harder to build large dies on a cutting-edge node).
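Roughly how I'd combine those factors for the smaller-die case (the die-area figure and the assumption that performance tracks transistor budget times per-transistor efficiency are my own):

# Back-of-the-envelope for the smaller-die scenario; assumptions labeled.
density_gain = 1.55     # assumed ~55% more transistors per mm^2 on 7nm
die_shrink = 0.80       # assume dies ~20% smaller than Turing's
arch_gain = 1.20        # ~20% per-transistor efficiency, like Turing vs Pascal

transistor_budget = density_gain * die_shrink    # ~1.24x the transistors
perf_estimate = transistor_budget * arch_gain    # ~1.49x, i.e. ~40-50% faster
print(f"{perf_estimate:.2f}x")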
 
Pretty sure the days of gen-over-gen improvements of 40-50%+ are way over. I'd guess 15%-ish.
Well, the die shrink alone could potentially give about a 20% performance increase at the same power envelope, so I'll take that as the minimum.
 
Pretty sure the days of gen-over-gen improvements of 40-50%+ are way over. I'd guess 15%-ish.
Get off my lawn! I agree, and I remember when saving up and spending *ONLY* $300 could net you a 50% increase in performance from a top-of-the-line card.
 
Human memory is also remarkably terrible. ;)

When was this? The Voodoo 2 on 500nm and a whopping 2 million transistors at 50MHz?
Would you like me to spend my energy showing you a half-decade where almost all top-tier cards were less than $450?
And yes, admittedly it was a very long time ago, when technology was moving along swiftly.
 
Would you like me to spend my energy showing you a half-decade where almost all top-tier cards were less than $450?
And yes, admittedly it was a very long time ago, when technology was moving along swiftly.

That was a really long time ago. Around 2004 with the GeForce 3/4.

I know folks don't like to admit it, but the fact is the 8800 GTX was a 485mm^2 chip for $600 back in 2006. In 2019 you're getting a 545mm^2 RTX 2080 for $700.

We all want to pay less, but it's not true that GPUs have only gotten expensive recently. They've been expensive for a long-ass time.
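For what it's worth, a rough inflation adjustment (using an approximate ~1.27x cumulative CPI factor for 2006 to 2019) makes the per-mm^2 comparison look like this:

# Rough price-per-area comparison; the CPI factor is approximate.
gtx_8800_price, gtx_8800_die = 600, 485     # 2006 dollars, mm^2
rtx_2080_price, rtx_2080_die = 700, 545     # 2019 dollars, mm^2
cpi_factor = 1.27                           # approx. cumulative inflation 2006->2019

adj_8800_price = gtx_8800_price * cpi_factor        # ~$762 in 2019 dollars
print(adj_8800_price / gtx_8800_die)                # ~$1.57 per mm^2, 8800 GTX
print(rtx_2080_price / rtx_2080_die)                # ~$1.28 per mm^2, RTX 2080

On that basis, the 2080 actually costs less per mm^2 in real terms than the 8800 GTX did.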
 