AdoredTV: NVIDIA’s Performance Improvement Has Dropped from 70% to 30%

Could just be that it's getting harder to squeeze performance out of each process, especially when they don't shrink them as often as they used to, you know.

During the golden age of video cards (1998 to 2008), companies relied on FOUR things to improve performance, in descending order of importance:

1. Die shrinks (allowing you to add more functional units, OR lower power, allowing higher clocks at the same power level)
2. Increasing power consumption (higher clocks beyond what the die shrink bought you). We went from the 16w GeForce 256 to the 75w 7800 GTX to the 150w 8800 GTX to the 220w GTX 280 in the span of ten years.
3. Increasing die size from one generation to the next (more functional units), as the industry got more experienced.
4. Architectural improvements. It's actually pretty rare to see a part like Maxwell, where the architectural improvements are AS important as the added functional units. Most of the time this is ~5-10% of the total performance. Sometimes it was hard to validate the efficiency of new features because no games used them for several years.


High-end graphics cards have not always been huge 600 mm² monsters; they started out rather small. The GeForce 256 was a paltry 111 mm², but die sizes grew successively from there. This, combined with a die shrink AND a power increase, allowed some next-generation cards to more than double performance:

GeForce 3: 128 mm²
FX 5800 Ultra: 200 mm²
7800 GTX: 333 mm²
8800 GTX: 484 mm²
GTX 280: 575 mm²

And Nvidia has been stuck in the die size AND power consumption (250w) corner since then, dependent on die shrinks and architectural improvements ALONE. And those die shrinks have slowed down as well.
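To make the compounding explicit, here is a purely illustrative sketch; every per-generation factor below is an assumption chosen for the example, not a measured number:

```python
# Illustrative only: hypothetical per-generation gains from each lever.
# In the 1998-2008 window all four were available at once, which is how
# a single generation could more than double performance.
shrink_units = 1.6   # assumed: more functional units from a die shrink
power_clock  = 1.25  # assumed: higher clocks from a bigger power budget
die_growth   = 1.2   # assumed: more units from a physically larger die
arch_gain    = 1.07  # assumed: a typical ~5-10% architectural improvement

total = shrink_units * power_clock * die_growth * arch_gain
print(f"compound speedup: {total:.2f}x")  # ~2.57x
```

Lock die size and power at their ceilings (factors of 1.0) and the same math leaves only the shrink and architecture terms, roughly 1.7x at best.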
 
Several of us have stated that the GTX 680 (GK104) was NVIDIA's midrange SKU fighting with AMD's top-end GPU.

Most people tended to get all worked up over this, saying that the "GTX 680" name was more important than the SKU (GK104) and that it wasn't a midrange SKU.

I wonder, now that AMD's PR-thingy Adored makes the same claim... will it be accepted now? ^^
 
980 Ti to 1080 Ti was a 70% improvement in less than 2 years. By electronics standards, that's enormous.

The size of the chips is the problem now; they're too complex. We're absolutely at the limits of what we can do.

If I were to guess, I'd suggest that in the next couple of generations Nvidia are going to do a version of NVLink that enables cards to operate in parallel, or more accurately to share workloads properly between discrete processors, so that you can scale out rather than just trying to fit it all on one chip.

There's also a need for a leap in graphics; consoles still hold it all back. Higher resolutions and refresh rates are all we're really gonna get at the moment.
 
I generally upgrade when these are true:
1. My current card is not fast enough
2. New cards that are not astronomically priced are at least double the speed of my current card
OR when my current card dies.
This is true for me too. The only exception was when I found my R9 280 to be way too loud for my liking and there was a sale on a GTX 1060...

One thing to consider is that back in the day, the latest and fastest graphics card was usually insufficient to run the latest games at maximum settings.
The GeForce 3, for example, was insufficient to run Doom 3 at max settings. To really experience a game right, you had to wait for the next generation of graphics cards.

How many games released in the past six months have had "GTX 1080 Ti" as the minimum "recommended" card for "High" settings at 1440p (with a pair in SLI required for "Ultra")?
(And then reviewers find that those recommendations are in reality a bit on the low side compared to what's required to actually enjoy the game at that setting...)
 
This is so stupid. How is 70% improvement sustainable generation after generation? The pace of speed improvements has slowed for every chipmaker, GPU or CPU. Look at the mobile chipmaker space: no big improvements generation after generation.
 
The last 6 minutes of part 2 are just really sad, because it's all true. And the cause is not just AMD not competing; it's also because people let Nvidia do this to us.

LOL, that's funny. What the hell does "we let them do this to us" even mean? Are we going to raise some Reddit pitchforks, storm nVidia headquarters, and demand we get 70% year-over-year improvement?
 
This is so stupid. How is 70% improvement sustainable generation after generation? The pace of speed improvements has slowed for every chipmaker, GPU or CPU. Look at the mobile chipmaker space: no big improvements generation after generation.

Yeah, I'm not sure the author of that article knows what "diminishing marginal returns" means.
 
Most of you guys are missing the point. In the early days you would get the topmost silicon, the fastest GPU. These days you pay more for a midrange GPU than you did back then for the top one.

Now it is GTX, then GTX Ti, then the fastest of the family, the Titan. You pay more for midrange, you pay a stupid amount for "high end", and the performance difference is not earth-shattering.
Then there's the part where people just pay because it says Nvidia, regardless of competition (it being there or not).

The conclusion I like even better: in part 2 he outlines all the crap people say about AMD. They want AMD to compete so Nvidia will lower prices (though if you understood the first part, you're already overpaying by a good deal). Go Nvidia GO!
 
IMO, Nvidia is likely just dragging things out to maximize their profits. Just like Intel, and just about everyone else.
 
Is it also your opinion that transistors can keep getting smaller and smaller indefinitely?
 
Is it also your opinion that transistors can keep getting smaller and smaller indefinitely?
There is no doubt they can get a lot smaller, but it's a question of whether the materials/processes exist to do it and whether a profit can be made.
Indefinitely is a bit silly though; you can't expect that.
 
It's really not due to lack of competition; at least, that is not a major driver.

There are three major ways to increase throughput (GPU-centric, but it applies to CPUs to a large degree as well):
1) Faster clock rate
2) More execution units
3) Find ways of avoiding work in the first place

Early on we were able to hit all three pretty easily.
Now we're running into power and thermal issues limiting the first two. We can squeeze a bit more in with a node size reduction, but that too is getting much harder.
The third is completely non-trivial. It is very hard to find new ways of avoiding work, and sadly, that too requires work - which requires power, makes heat, etc.
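For the first two levers, peak arithmetic throughput is roughly just execution units x clock (x 2 for a fused multiply-add). A back-of-envelope sketch using commonly cited core counts and boost clocks, so treat the numbers as approximate:

```python
# Rough peak FP32 throughput from the first two levers alone:
# peak = shader count * clock * 2 (one fused multiply-add per cycle).
def peak_tflops(cuda_cores: int, boost_mhz: int) -> float:
    return cuda_cores * boost_mhz * 2 / 1e6  # cores * MHz * 2 -> TFLOPS

maxwell = peak_tflops(2048, 1216)  # GTX 980, commonly cited boost clock
pascal  = peak_tflops(2560, 1733)  # GTX 1080, commonly cited boost clock
print(f"GTX 980: {maxwell:.1f} TFLOPS, GTX 1080: {pascal:.1f} TFLOPS")
```

The third lever never shows up in a formula like this, which is part of why gains there are so hard to come by.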

I have zero doubt if NV could make a part which was twice as fast, they would do so. Same with Intel.
Due to a lack of demand, if anything. If you have a DX11-capable graphics card then you're able to play any game made today. At the lowest settings, but the game is playable. Whereas when Quake 3 Arena was released, there was a surge of demand for graphics cards, since very few could play it. The majority of PCs sold had at best an ATI Rage 3D chip that was barely able to play Quake 3.

But since the 360 and PS3 were released, we're stuck with games that never stray too far from what those consoles can do. Thus, people can use 5+ year old hardware to play those games. We need PC-exclusive games that aren't afraid to push hardware to its limits. Not games which default to an Ultra graphics setting that ultimately makes very little visual difference, but something like Ray Tracing.
 
, but something like Ray Tracing.

You know why Ultra doesn't look any better?

It's because the next step toward more realism is going to take an ungodly increase in graphics power. And also, most people lack the visual acuity to see the difference anyway. It's the same reason 4K TV adoption is dog-slow, even though prices are down to the point that they were when 1080p sets started flying off the shelves.

And Ray Tracing is not the answer. It has just as many downsides if you want your models to move around in a scene.
 
Most of you guys are missing the point. In the early days you would get the topmost silicon, the fastest GPU. These days you pay more for a midrange GPU than you did back then for the top one.

Now it is GTX, then GTX Ti, then the fastest of the family, the Titan. You pay more for midrange, you pay a stupid amount for "high end", and the performance difference is not earth-shattering.
Then there's the part where people just pay because it says Nvidia, regardless of competition (it being there or not).

The conclusion I like even better: in part 2 he outlines all the crap people say about AMD. They want AMD to compete so Nvidia will lower prices (though if you understood the first part, you're already overpaying by a good deal). Go Nvidia GO!

Some people buy Nvidia simply because it is Nvidia. Some people buy AMD simply because it is AMD. Smart people buy the best card for their budget and specific uses regardless of brand. Prices on everything have increased dramatically since "the early days". That is how the world, and every industry, goes. Nvidia and AMD have both colluded to raise prices to their current position. Both companies also spend a hell of a lot more on each generation of GPU than in the past, and that cost only goes up over time.
 
Some people buy Nvidia simply because it is Nvidia. Some people buy AMD simply because it is AMD. Smart people buy the best card for their budget and specific uses regardless of brand. Prices on everything have increased dramatically since "the early days". That is how the world, and every industry, goes. Nvidia and AMD have both colluded to raise prices to their current position. Both companies also spend a hell of a lot more on each generation of GPU than in the past, and that cost only goes up over time.

AMD and Nvidia are in collusion? Sure, just like alligators and baby ducks are colluding.
 
AMD and Nvidia are in collusion? Sure, just like alligators and baby ducks are colluding.

Dude, both companies engaged in price fixing in the past. They might not be doing it right now, but their previous collusion is what led to the current prices.
 
Dude, both companies engaged in price fixing in the past. They might not be doing it right now, but their previous collusion is what led to the current prices.
That's not what collusion means.

Collusion is an agreement between two or more parties, sometimes illegal and therefore secretive, to limit open competition by deceiving, misleading, or defrauding others of their legal rights, or to obtain an objective forbidden by law typically by defrauding or gaining an unfair market advantage. It is an agreement among firms or individuals to divide a market, set prices, limit production or limit opportunities.


Nvidia and AMD won't have colluded with each other to fix prices. Their competition has always been too full of animosity for that. Besides which, AMD as a company were so thoroughly dicked over by Intel over the years, after Intel welched on their agreement to share technology in the 90s, that the idea of AMD opening itself up to that sort of thing again is just absurd.

They may have price-fixed independently of each other in the past (I haven't done the research and won't comment), but they specifically will not have "colluded".
 
This is so stupid. How is 70% improvement sustainable generation after generation? The pace of speed improvements has slowed for every chipmaker, GPU or CPU. Look at the mobile chipmaker space: no big improvements generation after generation.

I agree. And I can't be bothered to watch the fanbois' favorite idiot brother after having to suffer through the 480 review and the hundreds of forum posts on WCCF and a couple of other sites that reference him. The guy had about 1K views on his videos until he got onto an anti-Nvidia/Intel roll.

People complain about this crap like the companies have some personal vested interest in getting them the best stuff at the cheapest prices. The companies are there to maximize profitability. They are not charities. They are not your rich aunt. They do have an active interest in making products so that their shareholders make money, while at the same time putting in enough R&D to stay relevant and produce products that keep them in business.

This is how much stuff used to cost "back in the day" (as in snaps from a 1991 PC Magazine). We used to have to buy math co-processors just to play Falcon 3.0 with the "best" modeling.

And I used to own an NEC MultiSync 5D. So those saying that things are expensive now can go suck it.

If I sound grumpy, it's because my company moved IT into some idiotic open half-wall office layout. Whoever you are that decided this environment is some kind of collaborative paradise: you are an idiot of the nth degree and have read too many BS articles about how great Apple's new open campus is supposed to be. /rant

https://preview.************/noRXmv/20170911_132735.jpg
https://preview.************/kbL8Rv/20170911_132741.jpg
https://preview.************/gAYOta/20170911_132754.jpg
 
That's not what collusion means.



Nvidia and AMD won't have colluded with each other to fix prices. Their competition has always been too full of animosity for that. Besides which, AMD as a company were so thoroughly dicked over by Intel over the years, after Intel welched on their agreement to share technology in the 90s, that the idea of AMD opening itself up to that sort of thing again is just absurd.

They may have price-fixed independently of each other in the past (I haven't done the research and won't comment), but they specifically will not have "colluded".

Nvidia and AMD were sued for conspiring to "fix, raise, maintain and stabilize prices of graphics processing chips and cards" between 2002 and 2007, a case which they eventually settled for $1.7 million.
 
The histories of these companies, especially their recent histories, prove that they wouldn't.

After the hugely overclockable Sandy Bridge saw people not upgrading their CPUs for many years (I'm still running an overclocked 2600K with some breathing room in graphically intensive games), Intel greatly slowed down its generational performance gains to prevent a recurrence of the same thing.

Well, it also has to do with the fact that there is no reason to upgrade for 5% performance. You'd have to wait at least 4 years to see a notable difference. Had the successor to the 2500/2600 had 6 cores, I am sure a lot more people would have upgraded.
 
The last 6 minutes of part 2 are just really sad, because it's all true. And the cause is not just AMD not competing; it's also because people let Nvidia do this to us.
lol, what?

We let nvidia do what to us? The fact is that for the past few years, nvidia is the company that has been producing the top-end gaming cards with little to no competition from anyone. Are you saying that people looking to buy the fastest performing hardware shouldn't, and should just buy mid-level products from AMD for the sake of doing so? Fuck that. I'm not going to pretend that trying to compete against nvidia's Ti and Titan cards wouldn't require billions in investment and years spent; however, the situation today would likely be far different if we had a competing card from AMD that could actually meet or beat a 1080 Ti in performance for roughly the same cost. Hell, what if 3dfx hadn't fallen apart? What if Matrox still actually tried to produce 3D accelerators? I mean, there was S3, but I'm not going to pretend they were ever going to stand a chance. Intel doesn't give a shit beyond their integrated graphics and seemingly has no concern for the discrete GPU market.

So in all this time, we've had how many competitors since 3dfx went bankrupt and Matrox quit trying? Zero. Well, one if you count Bitboys (lul). That's not "us" letting nvidia do anything; that's the market playing out on its own.
 
I'm completely happy with Nvidia's pricing strategy.





As a shareholder.
 
Most of you guys are missing the point. In the early days you would get the topmost silicon, the fastest GPU. These days you pay more for a midrange GPU than you did back then for the top one.

Now it is GTX, then GTX Ti, then the fastest of the family, the Titan. You pay more for midrange, you pay a stupid amount for "high end", and the performance difference is not earth-shattering.
Then there's the part where people just pay because it says Nvidia, regardless of competition (it being there or not).

The conclusion I like even better: in part 2 he outlines all the crap people say about AMD. They want AMD to compete so Nvidia will lower prices (though if you understood the first part, you're already overpaying by a good deal). Go Nvidia GO!

Firstly, do you think that the top end's price will never increase? Inflation doesn't exist? My $1.99 Big Mac is still around?

Secondly, if you are so concerned with a certain price point, why not compare all the releases at that price point and determine if there's any performance gain to be made? Just realize that any accusations made against Corporation A will equally slap Corporation B on the cheek.

AMD and Nvidia have competed with each other at similar performance brackets for the same price brackets. An Nvidia fanboy could take this video, swap every Nvidia for AMD, and make the same amount of sense.
 
So AMD finally learned how to gouge customers, thanks to Nvidia?

It's a Christmas miracle.
 
Firstly, do you think that the top end's price will never increase? Inflation doesn't exist? My $1.99 Big Mac is still around?

Secondly, if you are so concerned with a certain price point, why not compare all the releases at that price point and determine if there's any performance gain to be made? Just realize that any accusations made against Corporation A will equally slap Corporation B on the cheek.

You missed the part of the video where it described what the top-end price was at the beginning. You also missed the part where they show Nvidia's profit. Maybe put those two together.

I don't feel addressed by your second point, since I don't participate in every buying cycle.
 
He is funny... in a sad way... he blames NVIDIA for having a midrange SKU that could compete with AMD's high-end SKU... and pricing it accordingly.
Not once does he point the finger at AMD for failing so hard that generation.
Only about VEGA does he have a touch of reality... but that is quickly ruined by his "Baa-waaa... NVIDIA earns money and makes sales... not fair for AMD!"

Like I said... the only benefit of his video is that it will put a stop to people thinking the GK104 was a high-end SKU... despite the "GTX x8x" moniker... I know I have run into a fair number of people unable to grasp this simple concept.
 
You missed the part of the video where it described what the top-end price was at the beginning. You also missed the part where they show Nvidia's profit. Maybe put those two together.

I don't feel addressed by your second point, since I don't participate in every buying cycle.

I didn't miss that point. All I am pointing out is that inflation exists. It's financially unrealistic to expect the top end to remain the same price as the top end of 20 years ago, just like I can't expect my Big Mac to still cost the same as it did 20 years ago while not wanting my salary to stay the same as 20 years ago. I then ask you to look at all the SKUs at the ~$250 price point, because the GeForce 256 was $280 or so when launched. You were implying that generational differences were small, so if the top-end differences were small, the cheaper SKUs should be even weaker. Is the RX 580 or the 1060 faster than the GeForce 256?

Whether you buy top end or not, you get improvements over previous generations, whether you buy AMD or Nvidia. And since you don't participate in every buying cycle, your improvement jumps between purchases are even greater.

As for competition between AMD and Nvidia, it's not all about price. There's also the need for competition in technology and innovation. Just because one corporation releases some new tech or innovation doesn't mean that it's the best or greatest idea. Competition gives us the choice to champion a superior tech when it presents itself.
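For what it's worth, the inflation argument is easy to sanity-check with rough numbers; the ~2.2% average rate below is an assumption for illustration, not an official CPI figure:

```python
# Hypothetical sketch: a $280 launch price (GeForce 256, 1999) restated
# in 2017 dollars, assuming ~2.2% average annual inflation.
launch_price = 280.00
years = 2017 - 1999
inflation = 0.022  # assumed average rate, not an official CPI figure

adjusted = launch_price * (1 + inflation) ** years
print(f"${launch_price:.0f} in 1999 is roughly ${adjusted:.0f} in 2017")  # ~$414
```

In other words, even adjusted for inflation, the 1999 flagship price buys a midrange-to-upper-midrange card today, which is why the ~$250 price point is the fairer comparison.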
 
Give me a break, please. Inflation and competition? Where did you come up with these when you're looking at the profits Nvidia makes?

Nvidia hates innovation; they produce proprietary technology for the sole purpose of generating more money. G-Sync, CUDA, GameWorks (more to come). You used to pay $500 for "superior tech" and now it is over $1000. Inflation, alright.
https://semiaccurate.com/2015/10/12/nvidia-patent-licensing-devastated-itc-ruling/
https://www.extremetech.com/computi...ng-for-infringing-on-its-invention-of-the-gpu
https://semiaccurate.com/2015/01/20/two-new-twists-nvidia-patent-trolling-kepler-license-scheme/

Competition gave you the Maxwell card in different flavours for the last couple of years. How is that for innovation?

You have badly lost touch with reality.

Stating that "NVIDIA hates innovation"...

...then naming technologies innovated by NVIDIA:
CUDA, G-Sync, GameWorks.


Not sure if you are trolling?
 
Nvidia hates innovation; they produce proprietary technology
Let me get this straight: they both hate innovation and they make proprietary technology.

Which one is it? Because the two statements are mutually incompatible.
they produce proprietary technology for the sole purpose of generating more money
Who doesn't? Does AMD not develop its own technology with money in mind, or do you believe they are the Samaritan of technology, giving us things free of charge, no strings attached?
 
Semantics do not work for me. When it only benefits one side of the market, that is not innovation; it is marketing meant to keep people stuck in the same buying pattern. Which is what led to the videos from AdoredTV.
 
Semantics do not work for me. When it only benefits one side of the market, that is not innovation; it is marketing meant to keep people stuck in the same buying pattern. Which is what led to the videos from AdoredTV.

So you WERE trolling, and basing a personal attack on said trolling... gotcha.
 
I blame AMD for not competing.

I did the techie version earlier but in some ways at least it is this simple.

It's that awful MBA speak (really business studies 101) that riles the masses here, but all public companies have to grow; it's the stupid thing about them (private is wonderful if you don't need debt or to cash out). The only way to do that is to get a bigger market or increase your margins, which can be any combination of reducing costs (R&D, manufacturing, supply chain, et al.) or increasing prices. That's it. All those trillions of dollars boil down to companies competing at those two things.

The addressable market isn't going to change for gaming. It won't spend more, and there isn't enormous growth in numbers. Gamers also in many cases haven't had a reason to upgrade for a long time; we've lacked a graphics killer 'must have'. Fortunately for Nvidia, though, there is significant opportunity in other complementary areas. That's kept the R&D spend up, but also meant that they can keep pushing the halo still higher. Now we're finally reaching the point where they don't have any urgency to productise the consumer cards, even though they could have had them out. It's a knockout punch they don't need to throw.
 
The addressable market isn't going to change for gaming. It won't spend more, and there isn't enormous growth in numbers. Gamers also in many cases haven't had a reason to upgrade for a long time; we've lacked a graphics killer 'must have'.

There is a cost to developing higher-end graphics features, and most developers aren't willing to bite at the cost. Games development is expensive enough that we struggle to even make the game. When Nvidia is willing to come in and basically do stuff for us at their cost (i.e. Nvidia developer support, GameWorks, etc.), we basically bite their hand off for the chance to get free features.

The other bit that often stops us shooting for the stars is when we look at market surveys (e.g. the Steam hardware survey) and see where 90% of the market is; that again limits us when deciding how high a ceiling we will reach for with graphics. I think we can safely say we've reached a plateau where graphics are good enough for most people, and it's much harder to achieve a wow factor.

Besides, lately nobody has been complaining about graphics. Everyone has been complaining about gameplay and content, so again graphics takes a back seat.
 
Yes, the rate of performance increase declined post-Fermi due to a change in design philosophy at NVIDIA. They went from power at all costs to power efficiency. It's not at all shocking that the performance deltas between generations declined as they did. We were also stuck on the same 28nm fab process for 5 years. Now that they have the efficiency down and the node has finally shrunk, we are seeing good gains again, with performance improvements across the product stack averaging around 70% with Pascal over Maxwell.
Most of you guys are missing the point. In the early days you would get the topmost silicon, the fastest GPU. These days you pay more for a midrange GPU than you did back then for the top one.

Now it is GTX, then GTX Ti, then the fastest of the family, the Titan. You pay more for midrange, you pay a stupid amount for "high end", and the performance difference is not earth-shattering.
Then there's the part where people just pay because it says Nvidia, regardless of competition (it being there or not).

The conclusion I like even better: in part 2 he outlines all the crap people say about AMD. They want AMD to compete so Nvidia will lower prices (though if you understood the first part, you're already overpaying by a good deal). Go Nvidia GO!
Oh, really? I compiled the data on midrange NVIDIA cards going back to GeForce2.

[attached chart: midrange NVIDIA launch prices by generation]


I couldn't find comparative performance data for the 7600 GT and prior, which is why those cells are empty, but the slope of the trend line shows an average increase of $3.79 per generation. Looks like there are an awful lot of $199 occurrences in there, though :whistle:.
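For anyone who wants to reproduce that trend line, it's a simple least-squares fit. A sketch with placeholder prices, since the actual data set lives in the attached chart; the numbers below are made up for illustration:

```python
import numpy as np

# Placeholder launch MSRPs, one per midrange generation -- NOT the
# compiled data, which is in the attached chart above.
generations = np.arange(8)                          # GeForce2 ... newest
prices = [199, 199, 199, 229, 199, 249, 199, 249]   # hypothetical dollars

slope, intercept = np.polyfit(generations, prices, 1)
print(f"average increase per generation: ${slope:.2f}")
```

With mostly-$199 data points the fitted slope stays tiny, consistent with the $3.79-per-generation figure above.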
 