NVIDIA RTX 3080 Ti and GA102 "Ampere" Specs, Other Juicy Bits Revealed

erek — [H]F Junkie (joined Dec 19, 2005; 10,929 messages)
I'm HIGHLY skeptical of the source. Reminds me of "Gamer Meld" on YouTube... I don't trust it at all. Surprised TPU published this baseless report.

"As for performance, the "GA102" based prototype is allegedly clocking 40 percent higher performance than the RTX 2080 Ti at 4K UHD resolution in poorly optimized games, 50% higher performance on optimized games, and up to 70 percent performance in the "best case scenario" (a game that's been optimized for the "Ampere" architecture). We know from older leaks that by increasing the number of streaming multiprocessors, NVIDIA is doubling the CUDA core : RT core ratio compared to Turing, resulting in more RT cores per tier; and increased ray-tracing performance.

Each "Ampere" RT core is able to process 4x more intersections per unit clock-speed than "Turing." The tensor core count is also reportedly going to see an increase. The focus on ray-tracing and AI performance increase could give game developers the freedom to cram in more RTX effects per title, letting users disable what they want on older "Turing" cards. Performance limitations on "Turing" made developers choose from the RTX feature-set on what to implement. With "Ampere," NVIDIA could introduce DLSS 3.0, an updated image quality and performance enhancement. NVIDIA could resurrect a hybrid memory technology similar to AMD's HBCC, called NVCache, which spreads video memory across the video memory, the system memory, and flash-based storage."


https://www.techpowerup.com/266959/...-ga102-ampere-specs-other-juicy-bits-revealed
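Setting the sourcing aside for a second, the NVCache bit at least describes something pretty mundane: a tiered memory hierarchy (VRAM first, then system RAM, then SSD). A toy sketch of that general idea, purely illustrative and nothing to do with whatever NVIDIA actually ships:

```python
# Toy sketch of a tiered "NVCache"-style lookup (purely illustrative -- not NVIDIA's
# actual design): check the fastest tier first, fall back through slower tiers,
# and promote whatever was just used back up into VRAM.

class TieredMemory:
    def __init__(self, vram_slots, sysram_slots):
        self.vram = {}        # fastest tier: on-card GDDR
        self.sysram = {}      # middle tier: system RAM over PCIe
        self.storage = {}     # slowest tier: flash/SSD (stand-in for file reads)
        self.vram_slots = vram_slots
        self.sysram_slots = sysram_slots

    def load_asset(self, key, data):
        # New assets start in the slowest tier until something actually touches them.
        self.storage[key] = data

    def read(self, key):
        if key in self.vram:
            return self.vram[key]             # VRAM hit: no promotion needed
        value = self.sysram.pop(key, None)
        if value is None:
            value = self.storage[key]         # raises KeyError if truly missing
        self._promote(key, value)
        return value

    def _promote(self, key, value):
        # Make room in VRAM, cascading an eviction down to system RAM if necessary.
        if len(self.vram) >= self.vram_slots:
            old_key, old_val = self.vram.popitem()   # naive eviction (a real cache would use LRU or similar)
            if len(self.sysram) >= self.sysram_slots:
                self.sysram.popitem()
            self.sysram[old_key] = old_val
        self.vram[key] = value
```

Whether NVCache looks anything like that under the hood is anyone's guess; the rumor only says the three tiers exist.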
 
A 40-50% performance increase at 4K? I'm calling BS on this one. Wishful thinking, at best. Unless Ampere ends up being another Pascal, this rumor will turn out to be WAAAAY off base. At best we can hope for a 25-30% average increase.

As for the RT performance increase, we'll see. I feel like 4x is expecting a lot gen-to-gen, and I just don't see it happening.

Overhauled control panel would be nice, though I hope they keep NCP and GFE separate. GFE not requiring a log-in anymore would be very much appreciated.

DLSS 3.0 could be promising, but the blurb about forcing it on by default and pushing this with benchmark sites is rather unwanted.
 
The greatest improvement Nvidia can make to their cards is to actually substantially increase RT performance. Many games that make good and heavy use of RT suffer a massive performance hit.

I wouldn’t be surprised if we have just a 10-20% improvement in rasterization performance while RT performance increases by 100% to mitigate the performance hit current RTX cards are faced with (I’m doubting 4x increase, but 2x seems likely). It’s not like AMD is really pushing Nvidia to up their rasterization performance so why not focus on the feature they’ve been heavily pushing for nearly 2 years now?
 
A 40-50% performance increase at 4K? I'm calling BS on this one. Wishful thinking, at best. Unless Ampere ends up being another Pascal, this rumor will turn out to be WAAAAY off base. At best we can hope for a 25-30% average increase.

As for the RT performance increase, we'll see. I feel like 4x is expecting a lot gen-to-gen, and I just don't see it happening.

Overhauled control panel would be nice, though I hope they keep NCP and GFE separate. GFE not requiring a log-in anymore would be very much appreciated.

DLSS 3.0 could be promising, but the blurb about forcing it on by default and pushing this with benchmark sites is rather unwanted.

I'm going to say I hope they keep Nvidia Control Panel as is, maybe minor GUI updates. Don't want the sign in or auto setting features.
 
A 40-50% performance increase at 4K? I'm calling BS on this one. Wishful thinking, at best. Unless Ampere ends up being another Pascal, this rumor will turn out to be WAAAAY off base. At best we can hope for a 25-30% average increase.

I'll take that bet. Turing got 40%+ over Pascal at 4K, so an advanced 7nm architecture with an IPC increase should easily hit 40% at 4K. In fact, if it's anything less, it would be a colossal failure.
 
I'm going to say I hope they keep Nvidia Control Panel as is, maybe minor GUI updates. Don't want the sign in or auto setting features.

NCP needs a fairly major overhaul. Not just the GUI, but it's rather slow as well. They basically need a bottom-to-top redesign at this point. It doesn't need to be super fancy or "modern" (please, don't make it "modern"), but it needs to be snappy, it needs to present information better, and it needs monitoring features. A built-in OC tool would be really nice but not required, and it would be nice if they allowed BIOS flashing from NCP instead of relying on 3rd-party programs. Really, I'd love to see Nvidia get NCP up to the level of AMD's control panel feature-wise, or even close to it.
 
I'm going to say I hope they keep Nvidia Control Panel as is, maybe minor GUI updates. Don't want the sign in or auto setting features.
That and also some nice new GeForce wallpapers. At least three 8K premium wallpapers if I'm slamming 1300 smackers down on the nightstand for the Ti chariot to the stars.
 
Why is this so hard to believe?

The 980 Ti was 40% faster than the 780 Ti at launch day (and it didn't even have a die shrink)

The RTX 2080 Ti was 35% faster on launch day, and is now 45% faster (with optimizations in new games.)

(TPU relative performance chart at 3840×2160)


40% on launch day is completely believable: even with a significant reduction in die size, they will have room to add a few more shaders!

It won't be as amazing as Pascal was, but it will be cheaper than Turing.
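Quick napkin math on what the chart above plus the rumored number would mean for 1080 Ti owners (the Ampere figure is pure rumor, obviously):

```python
# Napkin math: chain the ~45% 2080 Ti-over-1080 Ti figure (from the 4K chart above)
# with the *rumored* 40% Ampere-over-2080 Ti uplift. The second number is speculation.
turing_over_pascal = 1.45   # 2080 Ti vs 1080 Ti at 4K today
ampere_over_turing = 1.40   # rumored GA102 vs 2080 Ti
combined = turing_over_pascal * ampere_over_turing
print(f"GA102 vs 1080 Ti at 4K: roughly +{(combined - 1) * 100:.0f}%")   # ~ +103%
```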
 
The greatest improvement Nvidia can make to their cards is to actually substantially increase RT performance. Many games that make good and heavy use of RT suffer a massive performance hit.

I wouldn’t be surprised if we have just a 10-20% improvement in rasterization performance while RT performance increases by 100% to mitigate the performance hit current RTX cards are faced with (I’m doubting 4x increase, but 2x seems likely). It’s not like AMD is really pushing Nvidia to up their rasterization performance so why not focus on the feature they’ve been heavily pushing for nearly 2 years now?

I'm willing to bet money it won't be only a 10-20% increase, ESPECIALLY if they increase RT performance by a huge margin.
It would basically kill the 2080 Ti's price, which would tank to a third of what it's worth now. RT is not THAT popular for Nvidia to bet on it with a low raw performance boost while focusing on RT performance instead.
There's no way in hell I'm buying a €1300 3080 Ti over a €500 2080 Ti if it's only 20% faster at 4K.
 

While Ampere sounds great in Moore's Law is Dead's video, the talk of Navi 2x really piqued my interest. I hope AMD puts the screws to nVidia this time around -- maybe, just maybe, I'll buy an AMD GPU after a decade of nVidia only.

I think he's spot on in the video. Ampere GA102 is going to slaughter the RTX 2080 Ti with at minimum a 40-70% general performance uplift and an average of at least 50%. RT performance will simply embarrass Turing, and it will lead to A LOT of salty Turing 2080 Ti owners who paid $1200+ for a card that gets outclassed by a $350 3060. I'm glad I only paid $800 for my 2080 Ti ($320 in reality after selling my Titan X Pascal).

AMD's partnership with TSMC keeps paying dividends, and I think nVidia's cockiness, much like Intel's, is going to hurt them significantly if Navi 2x comes to market top-down a quarter or two early, especially if the performance is on par with Ampere.

Exciting times ahead, I look forward to seeing what both companies show us and ultimately pricing will be a big decider, especially with this economic slump (however artificial it is).
 
...it will lead to A LOT of salty Turing 2080 Ti owners who paid $1200+ for a card that gets outclassed by a $350 3060.

The only salty 2080 Ti owners should be the ones who bought in the last few months, and really they should be more upset at themselves for not waiting. Anyone else who is actually mad about their two-year-old card being outclassed by new tech should stick to sub-$300 cards and enjoy the performance that follows.
 
While Ampere sounds great in Moore's Law is Dead's video, the talk of Navi 2x really piqued my interest. I hope AMD puts the screws to nVidia this time around -- maybe, just maybe, I'll buy an AMD GPU after a decade of nVidia only.

I think he's spot on in the video. Ampere GA102 is going to slaughter the RTX 2080 Ti with at minimum a 40-70% general performance uplift and an average of at least 50%. RT performance will simply embarrass Turing, and it will lead to A LOT of salty Turing 2080 Ti owners who paid $1200+ for a card that gets outclassed by a $350 3060. I'm glad I only paid $800 for my 2080 Ti ($320 in reality after selling my Titan X Pascal).

AMD's partnership with TSMC keeps paying dividends, and I think nVidia's cockiness, much like Intel's, is going to hurt them significantly if Navi 2x comes to market top-down a quarter or two early, especially if the performance is on par with Ampere.

Exciting times ahead, I look forward to seeing what both companies show us and ultimately pricing will be a big decider, especially with this economic slump (however artificial it is).

There are a number of people on this very forum who think that anything close to a 40% increase is a pipe dream, that even a 2x performance increase in ray tracing is wishful thinking, and that ray-traced performance will only move up one tier, i.e. the 3060 will match the 2070, etc.

But I agree with you: Ampere (or whatever the next-gen gaming GPU lineup ends up being called) is going to deliver at least a 40% performance uplift over Turing. And ray-tracing performance is going to be far better!!! Double the RT performance is the minimum if they want to make it so that even mainstream cards can turn RT on.

Price? I don't think the price is going to be as bad as everyone fears. All the 7nm and 10nm processes are very mature now, and we have three-way competition for the first time ever, with the consoles, AMD's RDNA2, and Nvidia's Ampere all arriving in the last two quarters of the year. Those reasons, combined with the economic slump, might lead to decent pricing. Or maybe that is wishful thinking!!
 
There are a number of people on this very forum who think that anything close to a 40% increase is a pipe dream, that even a 2x performance increase in ray tracing is wishful thinking, and that ray-traced performance will only move up one tier, i.e. the 3060 will match the 2070, etc.

But I agree with you: Ampere (or whatever the next-gen gaming GPU lineup ends up being called) is going to deliver at least a 40% performance uplift over Turing. And ray-tracing performance is going to be far better!!! Double the RT performance is the minimum if they want to make it so that even mainstream cards can turn RT on.

Price? I don't think the price is going to be as bad as everyone fears. All the 7nm and 10nm processes are very mature now, and we have three-way competition for the first time ever, with the consoles, AMD's RDNA2, and Nvidia's Ampere all arriving in the last two quarters of the year. Those reasons, combined with the economic slump, might lead to decent pricing. Or maybe that is wishful thinking!!

I hope this is true, I am ready to finally upgrade from the 1080Ti.
 
Not sure what the existence of GA103 means for the price of GA102 (3080 Ti).

Is this like the Xbox Series X & Xbox Series S?
 
20, 30, 50, even 60 or 70 percent increase is fine with me as long as nVidia doesn't keep their $1200 flagship mainstream and $2500 halo price points for consumer cards.

I'll jump on a 3080 Ti on release day if it's in the traditional $700-800 bracket. Not holding my breath, though, since nVidia took advantage of the temporary crypto-mining craze, which shot the price of the 1080 Ti to well over $1000, and carried that over as the permanent price point of the 2080 Ti; they have essentially conditioned most of their customers into these overinflated MSRPs (more like MAPs... minimum advertised pricing contracts).

Does anyone else feel that DLSS is a bandaid technology to try and showcase/market more performance than their products are actually capable of at a given resolution (essentially 900p upscaled to 1080p, 1080p upscaled to 1440p, or 1440p upscaled to 2160p, etc)? It's like the polar opposite of DSR.
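To put rough numbers on the "upscaled" part: at those internal render resolutions you only shade a fraction of the output pixels. The exact DLSS internal resolutions vary by quality mode and the DLSS pass itself adds some cost on top; these are just the straight 900p/1080p/1440p figures:

```python
# Fraction of the output pixels actually shaded at the internal render resolution
# for the pairs mentioned above (plain pixel-count ratio, nothing DLSS-specific).
pairs = [
    ((1600, 900),  (1920, 1080)),   # "900p -> 1080p"
    ((1920, 1080), (2560, 1440)),   # "1080p -> 1440p"
    ((2560, 1440), (3840, 2160)),   # "1440p -> 2160p"
]
for (iw, ih), (ow, oh) in pairs:
    frac = (iw * ih) / (ow * oh)
    print(f"{ih}p -> {oh}p: shades about {frac:.0%} of the output pixels")
```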
 
I'm willing to bet money it won't have only 10-20% increase. ESPECIALLY if they increase RT performance by a huge margin.
It means they would basically kill 2080Ti price which would tank to 1/3rd of what it's worth now. RT is not THAT popular to bet on it with low raw performance boost and instead focus on RT performance.
There's no way in hell I'm buying €1300 3080Ti over €500 2080Ti if it's only 20% faster in 4k.
Nvidia will stop production of 2080 Ti chips, if they haven't already, long before the 3080 Ti. They don't care about second-hand prices, and they'll have cleared their 2080 Ti inventory before revealing the performance difference.

Even at 20% faster raw performance at 4K, couple that with an 80-100% increase in RT performance and I'd almost be willing to bet the 3080 Ti will sell just as well as the 2080 Ti did. Ray tracing is the new buzzword now and will be especially sought after once the next-generation consoles start utilizing it.

20, 30, 50, even 60 or 70 percent increase is fine with me as long as nVidia doesn't keep their $1200 flagship mainstream and $2500 halo price points for consumer cards.

I'll jump on a 3080 Ti on release day if it's in the traditional $700-800 bracket. Not holding my breath, though, since nVidia took advantage of the temporary crypto-mining craze, which shot the price of the 1080 Ti to well over $1000, and carried that over as the permanent price point of the 2080 Ti; they have essentially conditioned most of their customers into these overinflated MSRPs (more like MAPs... minimum advertised pricing contracts).

Smart idea, wouldn’t want you to suffocate. :)
 
20, 30, 50, even 60 or 70 percent increase is fine with me as long as nVidia doesn't keep their $1200 flagship mainstream and $2500 halo price points for consumer cards.

I'll jump on a 3080 Ti on release day if it's in the traditional $700-800 bracket. Not holding my breath, though, since nVidia took advantage of the temporary crypto-mining craze, which shot the price of the 1080 Ti to well over $1000, and carried that over as the permanent price point of the 2080 Ti; they have essentially conditioned most of their customers into these overinflated MSRPs (more like MAPs... minimum advertised pricing contracts).

Does anyone else feel that DLSS is a bandaid technology to try and showcase/market more performance than their products are actually capable of at a given resolution (essentially 900p upscaled to 1080p, 1080p upscaled to 1440p, or 1440p upscaled to 2160p, etc)? It's like the polar opposite of DSR.
I wouldn't count on it. I expect the price to jump to $1500 before they drop it under $1000. Why should they drop it if they're happy with their sales numbers at $1200? They probably sold as many or even more 2080 Tis than 1080 Tis, I bet.
 
Turing 2080 Ti owners who paid $1200+ for a card that gets outclassed by a $350 3060.
So the same way a 2060 outclasses a 1080ti?

The 3060 will have about 66-70% of the performance of a 2080 Ti for $350, which would still make it a great deal.
But it is unlikely that the 3060 will cost $350, given that low-end 2060s cost $350.

NVidia would be crazy to give more performance at that price point, unless forced by competition.
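Taking the numbers in this thread at face value (all rumor or ballpark), the perf-per-dollar gap would look roughly like this:

```python
# Perf-per-dollar with the figures from this post: the 3060 performance is rumor,
# the prices are rough ballpark figures, none of this is confirmed.
cards = {
    "2080 Ti":        {"perf": 1.00, "price": 1200},
    "3060 (rumored)": {"perf": 0.68, "price": 350},   # midpoint of "about 66-70%"
}
for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 1000:.2f} perf per $1000")
```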
 
I'd bet it will cost twice that.

I'd almost wager real money that it'll fall between $1100 and $1300 at launch, with light gouging by retailers pushing it north of $1350 for the first few weeks to a couple of months.
 
Well, it is, kinda. In order to make a car go 150 mph, given an average automobile's drag (sounds like he knows what he's talking about), you need X horsepower. But in order to make it go 200 or 250 mph you need to significantly raise horsepower and change that drag; it doesn't scale linearly. At some point, processing MOAR PIXELZ means you can't just add more CUDA Cores or Dibblywinks or Tranxacfitiers... you gotta rethink how to more efficiently use the technology you already have, so this is a way for them to get there. It's a "think differently" situation... it's not really a bandaid, but it's still fixing a problem by not following the standard NAIL/HAMMER path.
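If you want to put numbers on the car analogy: drag-limited power goes roughly with the cube of speed (ignoring rolling resistance, drivetrain losses, and the like), so:

```python
# Drag-limited power scales roughly with the cube of speed, so top-speed gains
# get expensive fast (simplified: ignores rolling resistance, drivetrain losses, gearing).
def power_ratio(v_new, v_old):
    return (v_new / v_old) ** 3

print(f"150 -> 200 mph: ~{power_ratio(200, 150):.1f}x the power")   # ~2.4x
print(f"150 -> 250 mph: ~{power_ratio(250, 150):.1f}x the power")   # ~4.6x
```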

Reality is many/most people can't tell, when they're in the middle of their game, if the resolution is scaling dynamically... consoles have proven that to some extent... if it's 4K or near-4K you can only notice it when you're comparing against full res in real time, or freeze-framing, or spending your time not playing the game and instead looking at screen edges to find something to bitch about in the aliasing of some leaf or motion blur; in other words, we are increasingly picking at nits. Most people when they're in the game aren't pausing to go "WTF, that guy's patch on his arm 300m away momentarily blurred! THESE DEVS SUCK, I'M GIVING UP GAMING AND GOING OUTSIDE TO JOG"... I mean, some of you lunatics do that, but most of us who have gamed for a while go "yeah, I can deal with this level of graphics"... my bigger bitch has to do with people using "ray tracing" like it's the answer to all of gaming's problems. It helps make games' shadows and lighting look more realistic, but 10 years ago the only thing anyone cared about was MORE POLYS! WE NEED MORE POLYS! Um, so we wound up with games that ran at 720p/30fps but CIRCLES WERE NEARLY PERFECTLY ROUND-ISH ALMOST, etc, etc.

We're kinda dumbasses, gamers I mean... personally I want a 3080 Ti at any cost; my 1080 Ti is doing a fine job but it's time to move on... I don't care about ray tracing in Quake 2 (ugh) or Minecraft (not a game)... but I'm thinking this card will be able to deliver more reason to enable RT lighting in games I might actually want to play...
 
I'm still curious about HDMI 2.1 functionality. That's going to probably lead to a chain reaction of purchases over the next 12 months in my house. If it doesn't, then that cycle will probably be closer to 24 months.
 
Does anyone else feel that DLSS is a bandaid technology to try and showcase/market more performance than their products are actually capable of at a given resolution (essentially 900p upscaled to 1080p, 1080p upscaled to 1440p, or 1440p upscaled to 2160p, etc)? It's like the polar opposite of DSR.

Not at all; the benefits of DLSS are significant for all future cards, given the substantial performance gains you get with it (with and without RT on) and the completely negligible (and sometimes even beneficial) impact on image quality now with DLSS 2.0. And who doesn't want better performance at every resolution now that 120+ Hz monitors are the norm? I might have agreed with you when DLSS 1.x was current and the hit to image quality was noticeable, but 2.0 is game-changing given that you can now play at 4K60 (or, more appropriately, at 1440p 120+ Hz) on a 2060 with otherwise nearly maxed IQ in the couple of titles that support it.

Well, it is, kinda. In order to make a car go 150 mph, given an average automobile's drag (sounds like he knows what he's talking about), you need X horsepower. But in order to make it go 200 or 250 mph you need to significantly raise horsepower and change that drag; it doesn't scale linearly. At some point, processing MOAR PIXELZ means you can't just add more CUDA Cores or Dibblywinks or Tranxacfitiers... you gotta rethink how to more efficiently use the technology you already have, so this is a way for them to get there. It's a "think differently" situation... it's not really a bandaid, but it's still fixing a problem by not following the standard NAIL/HAMMER path.

Reality is many/most people can't tell, when they're in the middle of their game, if the resolution is scaling dynamically... consoles have proven that to some extent... if it's 4K or near-4K you can only notice it when you're comparing against full res in real time, or freeze-framing, or spending your time not playing the game and instead looking at screen edges to find something to bitch about in the aliasing of some leaf or motion blur; in other words, we are increasingly picking at nits.

Indeed, I'm all for DRS as well if implemented properly, but DLSS 2.0 seems to be considerably better than any other current DRS method I've seen so far in both image quality and performance gains.

I can confirm that the 3080 Ti will perform better than the 2080 Ti. You can quote me on that.

You didn't say better at what. Now I'm gonna have to buy both cards and smash them together until the PCB breaks on one and if the 3080 Ti's PCB doesn't win, I'm gonna come back and debunk your quote with the 2080 Ti's superior PCB performance.
 
The Moore's Law is Dead video seems legit. He talks about how hard it is to get information, and how he has to be vague in some aspects to protect sources.



He is saying Ampere is the next Pascal, i.e. the same kind of big leap over Turing that the 1080 Ti was over the 980 Ti. We should all be excited about that, but it is also, of course, unconfirmed.
 
Yeah Gamer Meld is something that pops up on my feed sometimes. Watched 2 or 3 and was like no.

As for Ampere, I can see a 50% boost with RT-enabled stuff, but not normal stuff. I'm expecting 30% with the highest product line, and 15-25% mid-tier, thereabouts.
 
I'm HIGHLY skeptical of the source. Reminds me of "Gamer Meld" on YouTube... I don't trust it at all. Surprised TPU published this baseless report.

I agree; I blocked the MLID YT channel, as they basically post extremely speculative clickbait.

But I think TPU did us a favor by summarizing it, so I can critique the content without sitting through the video.

Most of it looks like speculative BS again. It's possible he got a leak about a manufacturing sample that has 5376 CUDA cores, which is not a stretch, since that is only ~17% more cores than the top Turing die; throw in some mild IPC/clock gains and you might see 30% more performance. Or even that might be BS and it will have more cores...
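For anyone who wants the napkin math behind that guess (the clock/IPC figure below is just a placeholder I picked, not part of any leak):

```python
# 5376 leaked CUDA cores vs the full TU102's 4608. The clock/IPC gain below is a
# made-up placeholder for illustration, not a leaked figure.
leaked_cores, tu102_cores = 5376, 4608
core_gain = leaked_cores / tu102_cores        # ~1.17, i.e. ~17% more cores
clock_ipc_gain = 1.10                         # assume a mild ~10% combined bump
print(f"Cores alone:         +{(core_gain - 1) * 100:.0f}%")
print(f"With mild clock/IPC: +{(core_gain * clock_ipc_gain - 1) * 100:.0f}%")   # ~ +28%
```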

That and the physical layout of the card (3 fans) are things that might leak out from a card producer somewhere in the supply chain in China. Or they could again be complete BS.

But the stuff about what NVidia is going to do at the business-strategy level would need leaks from higher-level executives within NVidia.

Some of the claims are obviously just made up nonsense as well. Little "gems" like this:

"They won't mandate any low RT settings for old cards, they going to crank RT up and make you upgrade" :rolleyes:

It should be obvious to everyone that this is 100% in the hands of developers. There is no incentive for developers to block lower-performing cards, and there will be no universal agreement among developers on how much (if any) RT they include in games. So this one is just pure made-up BS, and not even well-thought-out BS.

It's too bad that the people who get actual leaks don't just post the actual leaks instead of embellishing them to the point that they become ridiculous. Who knows, there may be a kernel of truth among the embellishment here, but even if there is, the final output is basically useless, floating in a sea of BS.
 
I am far more interested in what they are going to launch in the 75W range this time around. I am looking to build an emulation cabinet using an old Dell Optiplex, as I am retiring a few of those in the coming months. I mean, even a 1650 is going to be massive overkill, but I would still like to see what is available.
 
Man if I can get better than 2080Ti performance for $350 when these release I do not exaggerate when I say satellites will be able to see my erection from space.

I hope your PC has a condom mod. And that’s not the right liquid for cooling.

Seriously though, I’m just going to wait for benchmarks and then decide to upgrade my 2080 since I game at 4K.
 