GeForce RTX 2080 SUPER Launch Review Roundup

TechSpot lays out the value proposition pretty clearly: either a new 2070S at $500 plus tax, a used 2080 (once the market price adjusts), or a used 1080 Ti for under $500.
Buying a new 2080S at $700+ makes no sense, though.
 
It's faster and cheaper than a regular RTX 2080, so there's that. The 2080 Ti is still around $1100 on the low end, so this is a pretty nice product at $400 less.
 
Looks like a couple of games (Shadow of the Tomb Raider and Strange Brigade especially) do well with this, but few of the games tested show more than a 5 fps increase, and many show only a 2-3 fps gain over the 2080 without RT. With RT, the improvements are better, but performance is still pretty low overall. While the percentage increases look decent, the actual FPS increase is so small I'm not sure it would translate to any better gameplay experience. If you thought the 2080 was overpriced, I'm not sure the 2080 Super gives enough of a performance boost to change that opinion.
 
Looks like a couple of games (Shadow of the Tomb Raider and Strange Brigade especially) do well with this, but few of the games tested show more than a 5 fps increase, and many show only a 2-3 fps gain over the 2080 without RT...
It's $100 cheaper than the original 2080 FE, so there's that. Better performance at a cheaper price is good news for those who have been sitting on the fence. But just like the Titan X to Titan Xp, there's no reason for people who already own a 2080 to get a 2080 SUPER.
 
Unimpressed. Glad they aren't tacking on the Founders Edition markup, but this card still almost looks like a panic product, because they made the 2070 Super perform too close to the 2080.
 
I had the impression there would be a more appreciable performance improvement than that. For the most part, it looks negligible over the RTX 2080. I think those 2080 Super gains could easily be matched by a basic overclock on a normal RTX 2080, without even modifying the GPU's voltage.

Did Nvidia maybe gimp the planned Super performance gains after seeing AMD's RX 5000 series' performance?
 
If I were to move to new card out of this gen, this 2080S would be it... I'm just having a really hard time being excited about it though.

Just not sure if it's worth the upgrade from my current 1080 for roughly 30-35% more performance at 1440p (can someone confirm that's a close estimation or am I off here?)

Really wanting something new before CP/Doom Eternal comes out, but I'm on the fence about this whole thing :cautious:
 
Just not sure if it's worth the upgrade from my current 1080 for roughly 30-35% more performance at 1440p (can someone confirm that's a close estimation or am I off here?)...

I'm wondering the same - also on a GTX 1080 at 1440p. It seems like this would be the way to go, but for the price...I dunno.
 
I'm wondering the same - also on a GTX 1080 at 1440p. It seems like this would be the way to go, but for the price...I dunno.

Glad to hear I'm not the only one struggling along the upgrade path with the 1080 :( Nothing out there screams "this is the obvious upgrade choice".
 
Glad to hear I'm not the only one struggling along the upgrade path with the 1080 :( Nothing out there screams "this is the obvious upgrade choice".

The big problem for me is that most reviews haven't included the 1080 for a long time...even right after the 1080 Ti came out they pretty much stopped including it since that was sort of the "replacement" for the 1080. But that leaves me always wondering exactly how much faster a newer card is.

EDIT: I noticed that Anand's review actually has the 1080...nice!
 
I still say that until RTX gets better, the best bang for the buck is a used 1080 Ti.

Most people still think that due to inertia, and are keeping 1080Ti resale high.

But based on my vast military experience and analysis, most performance per dollar right now is a used 2080 - they've been dipping down close to $500 lately if you keep your eyes peeled. I was ready to buy another 1080Ti when I realized that - now I'm the proud owner of a Zotac 2080 AMP triple-fan that OCs off the charts, for less than the used 1080Tis I was considering.

TLDR: used $500 1080Ti vs used $500-550 2080, grab the 2080.
 
Unless there's going to be a price drop to go along with it, the 2080S looks like an uneventful refresh: 5% performance gains at the same price.

Hardware Unboxed points out that the 2060S costs 18% more than a 2060 while giving only an 8% performance increase, costing 9% more per FPS.

Comparing the 2080S to the 1080Ti, HU points out that we're getting a 9% performance increase at the same price-point as the 1080Ti 2.5 years later. Is that the worst performance-over-time scaling we've ever had in gaming GPU history?
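HU's cost-per-frame claim is easy to sanity-check. A quick Python sketch, taking the 18% price and 8% performance deltas quoted above at face value (these figures are HU's, not independently re-measured):

```python
def cost_per_fps_change(price_ratio, perf_ratio):
    """Percent change in cost per frame when price and performance
    both scale by the given ratios."""
    return (price_ratio / perf_ratio - 1) * 100

# 2060 SUPER vs 2060: 18% more money, 8% more performance
print(f"{cost_per_fps_change(1.18, 1.08):+.1f}% cost per FPS")  # about +9.3%
```

So the ~9% worse cost per frame checks out given those inputs.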




The RTX 2000 series and the Super refreshes look like an exercise in pretending to offer something while not actually offering anything.
 
Just not sure if it's worth the upgrade from my current 1080 for roughly 30-35% more performance at 1440p (can someone confirm that's a close estimation or am I off here?)...
According to TPU, 35% sounds about right.

[attached: TPU relative performance chart]
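For anyone squinting at that chart: TPU's relative-performance numbers are just ratios, so the uplift falls out directly. The ~74% figure below is an assumed placeholder for the GTX 1080's score relative to a 2080S at 100%, not read off the actual chart:

```python
# Illustrative relative-performance scores (2080 SUPER normalized to 100%)
gtx1080_rel = 74.0    # assumed, for illustration only
rtx2080s_rel = 100.0

uplift = (rtx2080s_rel / gtx1080_rel - 1) * 100
print(f"~{uplift:.0f}% faster")  # ~35% with these assumed numbers
```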
 
But based on my vast military experience and analysis, most performance per dollar right now is a used 2080 - they've been dipping down close to $500 lately if you keep your eyes peeled...

TLDR: used $500 1080Ti vs used $500-550 2080, grab the 2080.

What you're missing is that "if you keep your eyes peeled" you can find a 1080Ti for $430-450. Everyone can find a good deal on a card if they look hard enough and wait long enough. I definitely wouldn't spend over $500 for a 1080Ti though...even a hybrid cooled one. And personally, I wouldn't spend over $500-525 on a 2080 either...
 
I had the impression there would be a more appreciable performance improvement than that. For the most part, it looks negligible over the RTX 2080. I think those 2080 Super gains can be easily attained by basic overclocking a normal RTX 2080 without modifying the GPU's voltage.

Did Nvidia maybe gimp the planned Super performance gains after seeing AMD's RX 5000 series' performance?

Do you honestly believe that it would even be physically possible for Nvidia to alter the card, produce brand new units, and get them out to reviewers within a week? And even if that weren't literally impossible what incentive would there be for Nvidia to spend the obscene amount of money required to do something like that?
 
Unless there's going to be a price-drop to go along with it, the 2080S looks like it's an uneventful refresh. 5% performance gains, same price...


The RTX 2000 series and Super refreshes look like they're an exercise in pretending to be offering something while not actually offering something.
And keep in mind, that 5% performance gain translates into only 2-3 fps increase in many scenarios. The big issue here is going to be pricing, which it appears may be a bit lower. Still an underwhelming product refresh.
 
If I owned a 1080Ti, I'd simply wait for whatever the next major refresh is. I don't think this card was even meant for that audience. This is more for someone who had a 1070 or something similar and doesn't want to drop a grand or more on a 2080Ti. It's not going to be all that amazing for RTX, but it's potent enough to do 4K/60 in the majority of titles. I also feel like the overlooked point is that this is like a 2080, but cheaper. It's better and cheaper. Not by a ton, but it's a no-brainer which one to buy now. With some luck, maybe it'll drive normal 2080 prices down a little, too.
 
Do you honestly believe that it would even be physically possible for Nvidia to alter the card, produce brand new units, and get them out to reviewers within a week? And even if that weren't literally impossible what incentive would there be for Nvidia to spend the obscene amount of money required to do something like that?

It certainly would be physically possible to do between the time the RX 5000 series' performance was announced and the release of the RTX S series. Nvidia could just driver-gimp the performance, or disable some additional cores during production. They don't have to redesign the card or spend obscene amounts of money to do it.

Also, the RX series was announced May 26, and its performance was revealed June 10.
 
It certainly would be physically possible to do between the time the RX 5000 series' performance was announced and the release of the RTX S series. Nvidia could just driver-gimp the performance, or disable some additional cores during production. They don't have to redesign the card or spend obscene amounts of money to do it.

Also, the RX series was announced May 26, and its performance was revealed June 10.

No, it wouldn't. "Disable some cores" is still a physical change to the product and would require reworking or scrapping EVERYTHING produced up to that point. Conspiracy theories like this are really bloody stupid. It's just a boring card, and it was always going to be a boring card, simple as that.
 
No, it wouldn't. "Disable some cores" is still a physical change to the product and would require reworking or scrapping EVERYTHING produced up to that point. Conspiracy theories like this are really bloody stupid. It's just a boring card, and it was always going to be a boring card, simple as that.

Disabling cores is a physical change (edit: wait... is it? The cores aren't removed. It could be a firmware change). However, you've cherry-picked parts of the post to attack a straw-man argument. Driver-gimping isn't a physical change. Also, when did Nvidia begin producing RTX S boards? And why are you depicting a question that was asked as though it were an assertion?

Being of the belief that the card was always going to be what it is doesn't justify your arguing that something which literally could be done couldn't be done.
 
Disabling cores is a physical change, yes. However, you've cherry picked parts of the post to attack a straw man argument. Driver-gimping isn't a physical change. Also, when did Nvidia begin producing RTX S boards? Why are you depicting a question that was asked as though it was an assertion?

Being of the belief that the card was always going to be what it is doesn't justify your arguing that something which literally could be done couldn't be done.

I am not even going to address driver-gimping, because it is ludicrous. It's just an extension of the stupid and constantly disproven "Nvidia gimps cards" bullshit and not even worth talking about. Work on these cards would have had to begin months ago. Even though there don't seem to be a ton of hardware changes to the core for these cards, any changes that were made would still need time to be designed, prototyped, tested, approved, and then produced. Production of the old chips would need to be stopped so work on these could start. As was proved during the mining craze, quick reactions do not happen.
 
I am not even going to address driver-gimping, because it is ludicrous. It's just an extension of the stupid and constantly disproven "Nvidia gimps cards" bullshit and not even worth talking about.

Your belief that Nvidia's moral character is above driver-gimping has nothing to do with the fact that it's entirely possible, that it's not a physical change, and that it doesn't cost an obscene amount of money. By the way, what's the difference between gimping performance in drivers and gimping performance by disabling cores (which we all know is done)?

Work on the cards would have had to begin months ago. Even though there don't seem to be a ton of hardware changes done to the core for these cards any changes that were done would still need time to be designed, prototyped, tested, approved, and then produced. Production on old chips would need to be stopped so work on these could start. As was proved during the mining craze, quick reactions do not happen.

The only relevant detail would be when the cards went into production. As far as I'm aware, disabling cores doesn't require a redesign of anything, and could maybe be a last-minute change that doesn't impact the card's operation at all. It's not like the cores are physically removed.
 
Your belief that Nvidia's moral character is above driver-gimping has nothing to do with the fact that it's entirely possible, that it's not a physical change, and that it doesn't cost an obscene amount of money. By the way, what's the difference between gimping performance in drivers and gimping performance by disabling cores (which we all know is done)?



The only relevant detail would be when the cards went into production. As far as I'm aware, disabling cores doesn't require a redesign of anything. It's not like the cores are physically removed.

Theoretically possible, sure. Likely? No, probably not. With as much as people still dig around in drivers these days an artificial limit like that would likely be found and disabled. Even though Nvidia doesn't seem to care too much about public perception, that might be one backlash even they would want to avoid.

If we go with the idea of them having changed it after the performance reveal during E3, I'd have to imagine production had already started. All three Super cards likely began production at roughly the same time. Unless this is just a paper launch, they would need to start production at a point that would ensure there was stock for their own FE release as well as all the AIBs they need to supply around the world. I don't know what yields are like on Turing chips, but given the sheer size of them they're not exactly getting a crap load per wafer.
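On the wafer point, the classic dies-per-wafer approximation gives a rough ceiling. A sketch assuming a 300 mm wafer and TU104's roughly 545 mm² die (the die size is an approximation, and this ignores defect yield entirely):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Standard dies-per-wafer approximation: wafer area over die area,
    minus an edge-loss correction term. Ignores defects and scribe lines."""
    r = wafer_diameter_mm / 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.pi * r**2 / die_area_mm2 - edge_loss

print(round(dies_per_wafer(300, 545)))  # ~101 gross dies, before yield loss
```

So even before defects, a big Turing chip gets roughly a hundred candidates per wafer, versus several hundred for a mid-size die.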
 
Loved the GN review. Save yourself some time and watch either the first or the last minute.

Nothing special, since it's a one-horse race.
 
Theoretically possible, sure. Likely? No, probably not. With as much as people still dig around in drivers these days, an artificial limit like that would likely be found and disabled...

Apparently, Nvidia disables CUDA cores by zapping them with a laser. So, it involves no physical change to the card's design.

https://forums.tomshardware.com/thr...nvidia-go-about-disabling-cuda-cores.3004260/
 
Glad to hear I'm not the only one struggling along the upgrade path with the 1080 :( Nothing out there screams "this is the obvious upgrade choice".

I think there are quite a few of us. I'm still gaming on my "old" 7700K/GTX 1080, and there are still no decent upgrades for me at a reasonable price (unless I were doing some sort of content creation). In 99% of gaming benchmarks the 7700K is damn near as fast as a 9900K, so a new CPU won't really get me much. But then GPU upgrades are pretty mediocre right now too, which makes the thought of upgrading somewhat painful. Hell, a friend of mine is still on a 4790K with a GTX 1080 and is dying to upgrade, but he just can't find a reason to for the cost it would incur.
 
Hmm, so it looks like the best deal right now is one of the 2070 cards on sale with rebates, around $420-430: slightly faster than a 2060 Super and roughly the same price, or $20 more. At least for 1080p/1440p 144Hz/FreeSync gaming. We might see the 2080 cards start going on sale in a week or two to move out the old inventory.
 
We might see the 2080 cards start going on sale in a week or two to move out the old inventory.

That was my first thought. Assuming there's a decent supply of those (I have no idea), you'd have to imagine they'll go on sale soon. There's absolutely no reason to buy one the instant the 2080 Super is available.
 
Comparing the 2080S to the 1080Ti, HU points out that we're getting a 9% performance increase at the same price-point as the 1080Ti 2.5 years later. Is that the worst performance-over-time scaling we've ever had in gaming GPU history?

Man, this card is taking a beating from the reviewers. Well, most reviewers.
TweakTown, of course, is drooling over this card: "FASTEST GDDR6 EVER!!"

God, those guys are pathetic shills.
 