Nvidia Ampere Purportedly 50% Faster Than Turing At Half The Power Consumption

It kind of is when the xx80 card is released 9 months before the xx80 Ti version. If the two had been released together, then I'd agree.

Uh, no. The 1080 was a replacement for the 980. The 1080 ti was the ti class card. Doesn't matter if they release at the same time or 9 months apart. You don't get to arbitrarily decide where things go just because it doesn't fit your narrative.
 
Constant upgrades? No one has needed "constant upgrades" for the last decade. As for power, even in places with high power costs, the difference is a few bucks a month. Preferring one over the other is fine, but try to keep the BS to a minimum.

That depends on which country you call home. Where power is pay-as-you-use, big systems do not cost a few bucks a month; they cost a fair few. As for constant upgrades: if you want to keep up with 4K at 120-144 Hz or better, or with 8K, you need to make the jump from 1080 Ti to 2080 Ti to Super to 3080. If not, it will cost less, but I haven't seen people content with "just enough" in over a decade. Consider this: a Ryzen 5 3500X and a 1660 Ti are enough for a great many things, maxed out at 1080p and probably comfortable at 1440p, but when has anyone said that is enough?
 
Many people have said it's enough. Look around even this forum. Very few people, on a hardware enthusiast forum, are running top end hardware and upgrade every time something new comes out. And in the wider PC gaming space, many people keep systems for years. There are very few people engaging in constant upgrades because there is no need to in order to play PC games and there hasn't been a need to for a long time.
 
I do enjoy the CES season of silliness. I'll believe such numbers once real-world user and reviewer numbers get released. 50% faster at half the power is, well, grade-A hype-machine silliness, unless they're dropping RTX from the chips.
 
I don't feel it's any more arbitrary than selecting Ti cards as the reference. Why not pick the non-Ti cards? Why not pick the Titans?

The simple fact is that Pascal did not offer the 50% boost everyone talks about at the time of release. At the time of release, the top-dog Pascal card was the 1080 - a great card that offered a major performance boost over the previous top dog. 9 months after that, Nvidia released the even beefier 1080 Ti. If we were following the same script back then that we are following today, we'd all have been complaining about the lackluster 1080 performance because it was a "paltry" 35% jump.

I agree that if we strictly compare Ti vs Ti vs Ti, we do see that there was a 50% boost for a single generation - no disagreement there. I just think it's a bit revisionist because it overlooks the long time between Ti cards along with intermediate releases of the new architecture which Nvidia hasn't done with Turing. And if we agree that the 1080 Ti vs 980 Ti is the correct comparison, when we go back one generation from there to 980 Ti vs 780 Ti, we see a performance boost of 25-35% when used at playable settings (the gap is much larger at unplayable framerates where the 980 Ti did, for example, 22fps instead of 10). So, although the 50% generational boost did happen, it happened one time - it wasn't normal and it isn't normal.
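The arithmetic behind that caveat is worth making explicit: relative gaps balloon at low framerates. A quick sketch in Python (the 22 vs. 10 fps pair is from the paragraph above; the 81 vs. 60 fps playable-settings pair is a made-up illustration):

```python
# Percentage speedup of one framerate over another.
def percent_faster(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1) * 100

# At unplayable settings (980 Ti: 22 fps vs. 780 Ti: 10 fps) the gap looks huge...
unplayable_gap = percent_faster(22, 10)  # 120% faster, but neither card is playable
# ...while at playable settings (hypothetical 81 vs. 60 fps) it's the quoted 25-35%.
playable_gap = percent_faster(81, 60)    # 35% faster
print(f"unplayable: {unplayable_gap:.0f}%, playable: {playable_gap:.0f}%")
```

Same two cards, very different headline numbers, depending on which settings you benchmark at.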

To bring this back towards the original topic, the fact that the rumor is headline grabbing is because a 50% performance boost is quite substantial and still somewhat unusual. If that were the norm, then the 50% rumor this time wouldn't even raise an eyebrow. While I hope the rumor is true, I'm not holding my breath.

I think the interesting thing here will be whether Nvidia launches Ampere with pre-crypto, peak-crypto, or post-crypto pricing. The big complaint about the 2080 Ti isn't the performance but rather the 100% price increase required to get the 35% bump in performance. The previous generations tended to have price boosts limited to 10% or less.
 
How is comparing a Ti from one generation to a Ti from the following "arbitrary"? The Ti class card from Pascal was around 50% better than the Ti class card from Maxwell. Period. End of story. You don't need to make a big deal out of someone answering someone else's simple question. Cards coming out before the Ti are irrelevant because they're not the same product class. When looking at generational improvements you compare class to class. 970 vs 1070. 980 vs 1080, etc. That's how it works. All you are doing is trying to muddy the waters in order to, what, prevent people from saying Nvidia had a good generational improvement? The fuck's the point of that?
 
Your wall of text is rendered moot by this chart:

[attached chart: upload_2020-1-5_20-42-1.png]

From this link.
https://www.dsogaming.com/news/nvid...has-dropped-from-60-pre-2010-to-30-post-2010/

50% isn’t even nearly the best generational leap Nvidia ever had.
 
It would make more sense to look at the semantics from the company's point of view. From that perspective, any reference is likely to architecture-vs-architecture improvements at the chip level, not consumer SKU-to-SKU improvements (which would almost certainly not even be set this far out). The comparison would be Ampere vs. Turing vs. Pascal vs. Maxwell vs. Kepler (etc.), not 2080 Ti vs. 1080 Ti vs. 980 Ti or whatever form you want.

From that perspective it's likely a comment on GA104 (?) vs. TU104. Nvidia compared TU104 (2080) against GP104 (1080) at launch, much like how GP104 (1080) was compared against GM204 (980), and GM204 (980) against GK104 (680 - notice, not the 770) in Nvidia's launch media.

From the design target perspective that is also what they would've planned against when laying out design targets. SKU configuration and planning is considerably more fluid and flexible even much later in the pipeline.
 
Huh?

50% isn’t even nearly the best generational leap Nvidia ever had.

But Pascal was the biggest leap since 2008, nearly 12 years ago. Pascal getting 50%+ is an anomaly in the recent decade where transistor costs started flat-lining.

So if you are setting expectations, look at the most recent decade, not the previous ones when you could double the transistor count for almost the same cost. Today, if you want to double the transistors, you will pay nearly double the cost.

Average the generational leaps of the last decade, and that will be closer to what you will get. (I used 30% for Turing; feel free to correct.)

(53+14.9+28.8+20.4+29.9+56.3+30)/7 = 33.3%.
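As a sanity check on that average (the leap figures are the ones listed above, with 30% as the stated guess for Turing), a couple of lines of Python:

```python
# Per-generation performance leaps (%); the last figure is the 30% Turing estimate.
leaps = [53, 14.9, 28.8, 20.4, 29.9, 56.3, 30]
average = sum(leaps) / len(leaps)
print(f"{average:.1f}%")  # prints 33.3%
```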
 
I'll wait until actual, real-world gaming reviews arrive on the scene before believing anything an investor in the industry says. I do hope the 3080 Ti is a monster of a card, but as of this moment I don't see anything indicating it'll be 50% faster than the previous gen.
 
Dubious source, but anything is possible. These days we are happy with 30% over the prior generation.
 