Midrange GPUs Have Gotten High-End GPU Price Tags

I remember this was a hot topic when the 680 & 670 came out and were missing much of the non-gaming compute processing power that the 580 & 570 had. People felt it was a re-badged mid-range chip because of that (on top of the chip code, unexpectedly low TDP, and narrower bus width).

Then people saw the bottom of many (not all) 670s and noticed they had short (historically mid-range length) PCBs with longer fan shrouds to make them look like "high end" cards. Certainly plenty of conspiracy fodder.





The biggest assumption here is that Nvidia's chip nomenclature hasn't changed. If it hasn't changed and these really are mid-range GPUs, then this is just a result of AMD's inability to provide any real competition.

If Nvidia can beat AMD with their mid-range offerings, then why not? Blame AMD.

It becomes even more fun when you add the Gx106 SKUs to the picture ;)
 
To the OP - I'm glad I found your post today, you are certainly not alone in thinking we're now getting less cards for more money. Many consumers these days are blinded by the "10-20% faster performance than last year!" kind of sales pitch and simply ignore the technological leap that may or may not be happening. I remember the old days 20 years ago when we were getting 50%+ increases every year or two. It was insanity. I get that these days there are diminishing returns in how refined our GPUs have become, but consider this: the jump we've seen this generation is high, but not as high as it could have been. We have a new architecture AND a process die shrink. This shrink is not from 20nm to 14/16nm, it's from friggin 28nm to 14/16nm. That's two nodes in one go. Even then, most of what Nvidia seems to have done is enjoy the process shrink, adapt the architecture to the node without really bothering to improve it much, and crank up the clocks as much as they could get away with. In that sense, I see the OP's "x60-range card on steroids" argument. The second I saw the 1080 and 1070 I felt a bit disappointed, like we should have had even more performance, especially with the price hike.

Then look at AMD - and I hate that I'm going to sound like a fanboy, but consider that I've been an Nvidia customer for my past nine cards (GeForce 2, 4, FX, 6600, 8600, 9600, 260, 470 and 770) - they look to be providing value equivalent to a 970/980 at $200. While their power consumption seems worryingly high considering the process-shrink advantages, the performance/value is there in a way that I don't see in the 1070/1080.

By no means am I suggesting that these new cards are bad, but I have a feeling that if you've been doing this for the past 20 years and were already enjoying and learning about GPUs in the 3DFX Voodoo era and before, there's an unavoidable stench these days that keeps telling me we are not getting anywhere close to what we could be getting if the market were more competitive than it is. Look at Intel before AMD's good years, during them, and after. Intel was amazing, then it was utter crap, then it got amazing again after Core, and now it's meh once again until, hopefully, AMD competes properly with Zen this year. The same scenario is playing out with AMD/Nvidia lately and most people can't see it, because GPUs still get a good amount of performance increase every year, being such a young tech compared to CPUs. From my historical perspective, Nvidia is the Intel of GPUs and it's becoming very clear that they've grown complacent, but as long as AMD can't compete properly on performance/price/power consumption, we're stuck. At least the RX 480 gives us a chance of getting similar performance from Nvidia at $199 at a minimum, and that's purely thanks to competition from AMD - otherwise, I'd expect the 1060 to barely surpass the 970; but now there's a chance the 1060 will get to 980 levels of performance.

Being stuck is not good. Being stuck makes you spend greater amounts of money on hardware that should cost less. Being stuck makes you get what should be mid-range cards at high-end card prices. I don't think the OP's bus-width metric is relevant, but the bird's-eye view of his suggestion seems perfectly accurate to me, and I didn't have to crunch numbers to reach that conclusion. I just had to be around and experience CPU/GPU tech advances for the past 20 years.
 
To the OP - I'm glad I found your post today, you are certainly not alone in thinking we're now getting less cards for more money. ...
Maybe you and the OP can start a GPU company and crank out an Nvidia-beating GPU and sell it for less money, since apparently making top-end GPUs isn't all that hard.
 
Maybe you and the OP can start a GPU company and crank out an Nvidia-beating GPU and sell it for less money, since apparently making top-end GPUs isn't all that hard.

Maybe. Or maybe you could mature enough to accept that people have opinions different from yours, and that it's not an attack on your position on any particular topic.
 
Why is it that Nvidia is apparently "kept semi-honest" by competition from AMD, but not the other way round? When Nvidia slashes prices of various cards after AMD releases a competing card, certain people praise AMD to the skies, but when Nvidia launches a card that forces price cuts from AMD, the same people praise AMD for cutting prices.

What do you think would happen if Nvidia suddenly disappeared? I genuinely think a lot of the grief that AMD gets isn't due to people's views of the company; it's from the persistent agenda certain posters have across various forums that AMD is somehow a better company. One that releases products out of the goodness of their hearts and is just striving to combat evil Nvidia and protect us all.

The reality is that both companies care about us to the extent of how many dollars they can get from us. Nothing evil or good about it; they both just want our money. Which is absolutely fine. One company just happens to be better at it currently; that doesn't mean they are magically less ethical.
 
Why is it that Nvidia is apparently "kept semi-honest" by competition from AMD, but not the other way round? ...
We're all entitled to our own opinions... I started disliking Nvidia the moment they locked out PhysX cards anytime an AMD card was detected (and I never even owned an AMD card up to that point). Up to my current card I had ONLY purchased Nvidia cards. All the trolls go on and on about how terrible AMD drivers are, but none of my Nvidia cards got anywhere near the performance boost and driver support my current card has gotten over the time I have been using it. I do agree they're both just trying to make the most money in the end, but my time using an AMD card has just been better so far. I even like the RMA experience better than what I got with EVGA, and I never saw that coming.
 
To the OP - I'm glad I found your post today, you are certainly not alone in thinking we're now getting less cards for more money. ...

I think there's truth in your premise, but IMO your argument that we're getting cheated out of gains we would have gotten in the past just doesn't support it. You can't mention diminishing returns and then turn around and ignore that fact for the rest of your argument by bringing GPUs from 20 years ago into it. Sorry, but the performance improvements we were seeing 20 years ago are not relevant to the market OR the technology today. At that time everything was new, chip design philosophies were changing constantly, new things were being integrated into the chips, the software was becoming more sophisticated, and processes were shrinking at a rate that we're never going to see again without some kind of revolutionary breakthrough (e.g. quantum computing, substrates other than silicon).

Everything now is more mature; the architectures of these GPUs have basically converged on the unified shader model, to the point where the only way to get really big gains is to throw more shaders around or clock higher. We got about 1.5 of those 2 things in GP104 and you're still disappointed? I can't argue with your right to be disappointed, but asking NV to make dramatic architectural changes on top of a node change is asking for a big risk (ask AMD about R600)... one that might not even bring the gains you seem to be expecting, and certainly one they just don't need to take. And for all that, we STILL got a 50-60+% improvement over GM204 with less power consumption. I'm not going to make any arguments about the value AMD is providing with Polaris at $200, because we haven't actually seen those results to know if it means anything.

IMO the thing about Intel is a bit of a non sequitur. Maybe they were resting on their laurels during the P4 days, but I don't really know that what we're seeing now is solely the result of AMD's lack of competition. If anything, I think we also have to blame the competition from ARM for the stagnation of desktop performance, as Intel has focused heavily on reducing power consumption and dedicated relatively high proportions of their silicon to IGP performance to make sure that mobile SoCs won't encroach too hard on Intel's notebook market share. At the same time, the demand for CPU performance has stayed relatively flat compared to the demand for GPU performance, and I don't think that has anything to do with GPUs being "young tech", but with the fact that people love eye candy and mid-range CPUs are perfectly adequate for most purposes.

No matter how much competition we see in the market, even if we had 5 GPU manufacturers, they don't own fabs and they don't control when they get to switch nodes. It's not like NV would have stuck to 28nm if they had the opportunity to go to 16nm, just because AMD wasn't competitive or in a position to do the same. We're stuck no matter what. In fact, with even more players we might even be *worse* off in certain ways. While we're throwing it back to the 3DFX days, who can forget how every single GPU manufacturer had their own proprietary API? Imagine a world where there were 5 different versions of "gameworks" and 5 different groups of people bitching about how every version other than theirs sucks. That might be my personal version of hell.
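As a rough sanity check on the "more shaders or clock higher" point above, here's a minimal back-of-the-envelope sketch in Python. The shader counts and boost clocks are assumed reference-card figures, and "performance scales with shaders × clock" is a deliberately naive model, so treat the output as illustration only:

```python
# Back-of-the-envelope scaling: performance ~ shader count x clock speed.
# The reference-card figures below are assumptions for illustration, not official data.
gtx_980  = {"shaders": 2048, "boost_mhz": 1216}   # GM204 (GTX 980)
gtx_1080 = {"shaders": 2560, "boost_mhz": 1733}   # GP104 (GTX 1080)

shader_gain = gtx_1080["shaders"] / gtx_980["shaders"]      # ~1.25x more shaders
clock_gain  = gtx_1080["boost_mhz"] / gtx_980["boost_mhz"]  # ~1.43x higher clocks
naive_total = shader_gain * clock_gain                      # ~1.78x combined

print(f"shaders: +{(shader_gain - 1) * 100:.0f}%")
print(f"clocks:  +{(clock_gain - 1) * 100:.0f}%")
print(f"naive combined scaling: +{(naive_total - 1) * 100:.0f}%")
# Real-world gains land lower (memory bandwidth, imperfect scaling),
# which is roughly consistent with the 50-60+% improvement cited above.
```

Under those assumptions the naive model gives roughly +78%, so the measured 50-60+% gain looks like "most of a clock bump plus part of a shader bump" rather than a held-back architecture.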
 
I remember the "goold old days" of computer hardware. I had a 486 33mhz (with the math co-processor!) that costed nearly $3,000 at the time. I could put dual GTX 1080's in my couple of month old Z170 system and still not be close to that total. Yup, the good old days. /rosecoloredglases
 
I remember the "goold old days" of computer hardware. I had a 486 33mhz (with the math co-processor!) that costed nearly $3,000 at the time. I could put dual GTX 1080's in my couple of month old Z170 system and still not be close to that total. Yup, the good old days. /rosecoloredglases

The old VESA video cards certainly would have the GTX 1080 beaten on card length!
 
From the conclusion of the HardOCP GTX 1070 Founders Edition Review:

The most important aspect to take away from the launch of the NVIDIA GeForce GTX 1070 Founders Edition is the fact that NVIDIA has taken what used to be $649 video card performance (GTX 980 Ti/R9 Fury X) and brought it down to the $379-$449 price point. That is at least a $200 reduction, for performance that just a month ago was the fastest you could buy.

Bringing that level of performance down to a more affordable price for everyone allows more people to jump on the high-performance bandwagon. For the new games coming out this year, that gameplay experience improvement will be welcomed.
 
These are true statements - current advances are what they are because of diminishing returns, and I did fall into my own trap of ignoring that reality and wishing for the past. I'm mostly nostalgic for the breakthrough advances we used to have; the past decade has been so smoothly evolutionary. I still can't shake off the feeling that we should be more advanced at this point. Maybe next year, when HDR monitors come out, that will satisfy me for a good while. It's crazy that we're still paying a premium for 60Hz screens... I don't care much for 4K; I'd rather have a better 1080p signal, as the human eye cares far more about contrast than extra definition. A 1080p 120Hz HDR screen is what I dream of, and whatever GPU is enough to drive those 1080p pixels is all I need. The 1070 just seems... too much; for me, of course - clearly there are plenty of people out there who can use its performance.
 
These are true statements - current advances are what they are because of diminishing returns, and I did fall into my own trap of ignoring that reality and wishing for the past. I'm mostly nostalgic for the breakthrough advances we used to have. ...
What breakthroughs are you referring to? I've been at this a long time, and it seems like we're getting the same advancements in performance year after year... well, actually new card to new card (going back to the Riva TNT2).
 
Nvidia is comparing the 1070 to the 970 - performance-range card to performance-range card.
The New GeForce GTX 1070 Graphics Card

For the performance increase, I can't imagine anyone really being disappointed with the $50 hike.

The 1080 is being compared to the 980 - go figure. In the end it's actually a small price hike (except for the F__k Edition cards). This time around I think the gap between the 1070's and 1080's performance makes the 1080 an Enthusiast card.

Anyways, the 970/980 were not midrange cards - they were performance cards in my books. Same with the 1070, with the 1080 more of an Enthusiast card, which happens to be the fastest card on the planet, beating last year's Enthusiast cards.

Low end - Mainstream - Midrange - Performance - Enthusiast. Well, at least that is how I see the ranges of cards. AMD wants VR-capable cards at the Mainstream level, whatever that means. What will the price of the 1080 Ti be? I think that depends on whether AMD is competitive or not. As for the next Titan?
 
It's not unexpected that high-end cards would start to cost more. It's amazing they have stayed close to the same price for so long. If you look at Nvidia's entire history as a whole, as opposed to the last 5 years, then this whole argument starts to lose merit.
The only people really crying about it are the people who upgrade their cards every generation just to upgrade. So pay the piper or don't. Your 980 Tis are still good. If you want to move to 4K at more comfortable playability or something, pay the piper.
A lot of people are still gaming at 1080p or 1440p and will be for years. A 980 Ti, a Titan, or even a GTX 1070 will be good enough for a good while for these people.

Even if they have moved the tiers and added a new very-top-end card to the mix with the Titans, the 1080 beats the last $1,000 card. A midrange card, as you say, beats the last very-top-end card and costs 300 bucks less. So who can complain about that? Seriously? Yeah, they will release a 1080 Ti and it will be much better than the 1080. That's a given. And it will probably cost a lot. That's a given. But who will actually need this card? If people are still gaming at 1080p and 1440p, not too many.
Fact is, 1080p and 1440p gamers have more than enough power at their disposal now, even in the "midrange", to slam almost all of their games at acceptable levels.
I will be buying the 1070 Ti when it comes, if I don't break down for the regular 1070. And even if I did, it's going to blow away my GTX 660, which I have not had any problems playing games on - not even 2016 games - with perfectly acceptable graphics quality.

Here's a bigger picture than just the last 3 generations of cards. This is how much I plopped down for a GTX 280 10 years ago:
EVGA 01G-P3-1284-AR GeForce GTX 280 SSC Edition 1GB 512-bit GDDR3 PCI Express 2.0 x16 HDCP Ready SLI Supported Video Card - $669.99

Nvidia gouged early adopters, AMD released their whatever, and Nvidia dropped the price closer to $550, and some people got rebates. I got a rebate; EVGA was phenomenal in handling this. It was like 100 bucks. But the card still cost closer to $600 in the end.

That card ended up not even beating my 8800 GTX in some games. It was the worst upgrade I ever made, and it is the only video card I have bought in 20 years that actually died - and it died pretty much right after the warranty expired. Since then I have not bought the $600 card. I am fine with turning some graphics down when the difference is very minimal, especially when you've been tweaking games all your life. A lot of times you can get them looking almost as good, or even as good subjectively, as max settings. And with the new "midrange" 1070 card you won't even really have to do that at 1080p or 1440p.

So despite this logical fallacy that the new high end is the old mid-range, the new mid-range beats or matches the old very high end at a much cheaper cost, and you get 2 or more extra gigs of RAM and newer technologies.

The newest high end has also been catapulted forward, due in part to display technologies. There was no 4K back in the 8800 Ultra days, and 4K is not the norm yet. Some people probably don't even care about 4K. I know I don't.

So the newest highest end is for people who are willing to pay big bucks for 65" OLEDs and 4K TVs, where those cards are actually required. Those people want the cutting edge, and they can afford it or are going to buy it anyway. The rest of the millions of people who play Counter-Strike or Dota or League of Legends or CoD or BF4 more than anything else - games which already run fine on older hardware and which make up most of the PC gaming market - don't need it.

And so long as Nvidia has no real threat from AMD this is the way it is, and always has been.

The 1070 AIB cards will probably be in the mid-$400 range. I expect it. There will still be the vanilla cards in the $379 range as well. And for the performance, and having 8 gigs of RAM, I am pretty content with that. It's about 100 bucks more or less than what I paid for my x60 card almost 4 years ago, the performance leaves my 660 not even on the charts, and it has 6 GB more RAM. It will smash most anything I will play at high enough settings at 1440p that I won't miss a few post effects or slightly reduced texture resolution for $300+ more.

What you are doing is concentrating on the SKU or the die, and/or on what they did before - which you were never guaranteed, no matter how many years in a row that's basically what you got, and which isn't even absolute, as some newer generations didn't really beat older ones much at all. As in the case of my GTX 280. But the fact is that you are getting amazing performance in the mid-range with the last two generations of cards, and people have gotten plenty of use out of them. My bro has a 970 and he still says he can play all his games just fine, even using downsampling for better image quality.
 
It's not unexpected that high-end cards would start to cost more. It's amazing they have stayed close to the same price for so long. If you look at Nvidia's entire history as a whole, as opposed to the last 5 years, then this whole argument starts to lose merit.

If you really want to see the whole argument lose merit, adjust prices over the last 20 years for inflation and see how much we're actually paying these days relative to old cards. The 9700 Pro launched at $400, which is about $530 in 2016 money, and things like the GTX 260 were noticeably more expensive (~$450 in 2016 money) than the 1070.
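For anyone who wants to redo that inflation math themselves, here is a minimal sketch in Python. The CPI values and the ~$400 launch prices are rough assumptions for illustration, not official figures:

```python
# Rough inflation adjustment: price_2016 = launch_price * (CPI_2016 / CPI_launch_year).
# CPI values below are approximate annual averages (assumptions, for illustration only).
cpi = {2002: 179.9, 2008: 215.3, 2016: 240.0}

def to_2016_dollars(price: float, launch_year: int) -> float:
    """Scale a launch price into approximate 2016 dollars."""
    return price * cpi[2016] / cpi[launch_year]

print(f"9700 Pro ($400 in 2002): ~${to_2016_dollars(400, 2002):.0f} in 2016 money")
print(f"GTX 260  ($400 in 2008): ~${to_2016_dollars(400, 2008):.0f} in 2016 money")
```

With those assumed multipliers you get roughly $530 and $450 respectively, which is where the numbers in the post above come from.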
 
By comparison, the performance increase of midrange video cards has been quite small.

Yeah, the x60 line seems to be dragging. Maybe the 1060 and AMD's card will kick it up a notch soon.
 
What makes it a 1060 equivalent? Check the die size.
The code name, I mean. They could have made it a 104 chip and it would have been in line with the 1070... also, the die size is bigger because it has more units related to RTX and tensor cores, which the 1060 lacks.

If you are going to do a die-size comparison every time, then nothing is equivalent to anything... each generation the die size increases as more cores are added, especially when the die shrink is not as noticeable as it used to be. How would you keep track?

Of course, this is unofficial...
One of the first posts in this thread:
Well, for them it is the actual representation of their internal code names, if you check them, so why would you expect it to mean anything else?
 
If you are going to do a die-size comparison every time, then nothing is equivalent to anything... each generation the die size increases as more cores are added, especially when the die shrink is not as noticeable as it used to be. How would you keep track?

Exactly, you keep track of price and performance. Code names and die sizes are irrelevant from a consumer value standpoint.
 
You bumped this thread just to keep bitching over something that's inevitable? I thought I made this clear two years ago.

Die sizes are going up, and cost-per-transistor is not falling. Someone has to pay for all these new techs, and it's fallen on the high-end.


Each time we make a quantum leap in graphics processing, it's an expensive leap. The prices for those individual leaps just keep going up, BECAUSE EACH LEAP (GeForce 256 -> GeForce 3 -> 8800 GTX -> RTX 2080) IS TEN TIMES MORE DIFFICULT THAN THE LAST.

But Nvidia has a very comfortable OEM business they like to keep rolling, so RTX will eventually make its way into the low end. Just not until two years from now, when TSMC 7nm is actually stable for large chips.

The reason pricing has changed is that Moore's Law has changed. Accept these changes, or go find a new hobby.
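To make the die-size/cost point above concrete, here is a minimal sketch of the usual dies-per-wafer and yield reasoning in Python. The wafer cost, defect density, and die areas are placeholder assumptions purely for illustration, not actual TSMC or Nvidia figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation for gross dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2: float, defects_per_cm2: float = 0.1) -> float:
    """Simple Poisson yield model: bigger dies catch more defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

def cost_per_good_die(die_area_mm2: float, wafer_cost_usd: float) -> float:
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2)
    return wafer_cost_usd / good_dies

# Placeholder comparison: a ~300 mm^2 die vs a ~550 mm^2 die on an assumed $8,000 wafer.
for area in (300, 550):
    print(f"{area} mm^2 die: ~${cost_per_good_die(area, 8000):.0f} per good die")
```

Under these made-up numbers the ~1.8x larger die costs roughly 2.5x more per good chip, which is the basic reason a generation of bigger dies without a cost-per-transistor drop shows up directly in the price of the top-end parts.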
 
Only for the uninformed consumer...

Knowing die sizes or code names or even manufacturing cost doesn't change the value equation since those things don't actually do anything for the consumer. The things that matter are performance, features, power consumption etc.

Informed consumers can bitch and moan about what they think a company should sell a certain die for but that's just that - bitchin and moanin.
 
Natural for GPUs to increase in complexity and cost with each new gen. With the inclusion of tensor and RT cores, should it have remained the same? The chip naming convention is now even more meaningless. TU104 vs TU106: should the latter have been a cut-down TU104? Or a full TU106? The expected perf of a 2070 could be closer to a 2080 than a 1070 was to a 1080, so a 106 chip in this instance may be giving a better perf return than a cut-down 104.
 
Knowing die sizes or code names or even manufacturing cost doesn't change the value equation since those things don't actually do anything for the consumer. The things that matter are performance, features, power consumption etc.

Informed consumers can bitch and moan about what they think a company should sell a certain die for but that's just that - bitchin and moanin.

This is like when carmakers started using turbo 4-cylinders instead of normal 6-cylinder engines. Many declared no "real" luxury car would use a 4-cylinder, because people know 6 cylinders is the minimum. Automakers were just trying to charge more for fewer cylinders! :rolleyes:
 
Knowing die sizes or code names or even manufacturing cost doesn't change the value equation since those things don't actually do anything for the consumer. The things that matter are performance, features, power consumption etc.

Informed consumers can bitch and moan about what they think a company should sell a certain die for but that's just that - bitchin and moanin.

You are correct there. It's fun to understand die sizes, costs to produce, supply constraints, what Jensen's favourite toothpaste is, etc., but ultimately it boils down to:
  • Performance - middling relative to prior generation jumps, at least that's what all the rumors point to
  • Features - disputable as to how many games will use them and how, and they seem to carry an enormous performance hit
  • Power Consumption - worse than prior gen
  • Price - W. T. F.
 
Are people really going to get Turing without a G-Sync display? Are people worried about dipping to 50 FPS in a really demanding game?
 