Midrange GPUs Have Gotten High-End GPU Price Tags

This coming from the person that also provided the idiotic car analogy.

And so I guess if you have the money then fuck common sense and logical pricing as you can afford it...

The idiotic car analogy was simply to point out how idiotic it is to compare GPUs based solely on ROPs, VRAM bus width, and perceived "it says x70 so it's midrange" ideas.

Also, all these prices seem perfectly logical to me. I sold my last 970 and downgraded to a 960 until these cards launched. The cards don't have to be cheap for the pricing to be logical.
 
The idiotic car analogy was simply to point out how idiotic it is to compare GPUs based solely on ROPs, VRAM bus width, and perceived "it says x70 so it's midrange" ideas.

Also, all this pricing seems perfectly logical to me. I sold my last 970 and downgraded to a 960 until these cards launched.
Well, from the standpoint of it being their x104 chip, the analogy makes little sense. It's like BMW charging more for the 2017 5 Series than they did for the 2016 7 Series, if a car analogy is to be made.

Lol, $700 seems logical to you for a fucking x104-chip GPU?
 
Well, from the standpoint of it being their x104 chip, the analogy makes little sense. It's like BMW charging more for the 2017 5 Series than they did for the 2016 7 Series, if a car analogy is to be made.

Lol, $700 seems logical to you for a fucking x104-chip GPU?

What makes you think it isn't?
 
And here come the same bullshit excuses we had when the 680 came out. We always used to get a bigger-chip card, so by now no one could possibly be so damn stupid as to think otherwise.

People don't give a shit if the 680 cards came with a used syringe; they are just looking at the performance per dollar they can buy now and in the immediate future. $380+ for a high-end GPU and $600+ for the top-of-the-market GPU look like good values now and when compared historically.
 
And you have your answer. The 1080 will not be their top Pascal card and even you know that.

Also, the 1080Ti will get replaced with something faster eventually. Who would ever spend money on PC components that get replaced?!?

GPUs aren't ovens expected to last for 15 years. Get over it.
 
And you have your answer. The 1080 will not be their top Pascal card and even you know that.

Actually, I don't.

There is a huge degree of certainty that there will be one, yes, given what they did with Kepler and Maxwell, but I wasn't entirely certain the 980 Ti was going to be a thing until it was actually released; the 680 never got a 680 Ti.

A huge degree of certainty is still not 100%, and it won't be until they announce it.

Whether one finds $700 acceptable or logical is a matter of personal opinion, but my opinion is this: if you look beyond the part number, going from the 580 to the 780 Ti very nearly doubled performance (according to TPU graphs) while going from a $500 part to a $700 part (the 780 Ti had a $700 MSRP), for similarly sized chips.

Now compare the similarly sized 980 and 1080 (the latter a $600 MSRP; only the FE is currently priced at $700, and there will undoubtedly be aftermarket cards closer to that $600). It, too, is almost double the performance, so in that comparison $600 isn't too stupid.
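For what it's worth, the arithmetic behind that comparison can be sketched. The ~2x performance ratios are this post's claim from TPU graphs; the $500/$700 prices are from the post, and the $549/$599 MSRPs for the 980 and 1080 are launch list prices as I recall them:

```python
# Change in performance-per-dollar across an upgrade: the post's claim is
# that both jumps roughly doubled performance for similarly sized dies.

def perf_per_dollar_change(perf_ratio, old_price, new_price):
    """Ratio of new perf-per-dollar to old; 1.0 means unchanged."""
    return perf_ratio * old_price / new_price

# 580 ($500) -> 780 Ti ($700), ~2x performance (post's claim)
print(round(perf_per_dollar_change(2.0, 500, 700), 2))  # 1.43
# 980 ($549 launch MSRP) -> 1080 ($599 MSRP), ~2x performance (post's claim)
print(round(perf_per_dollar_change(2.0, 549, 599), 2))  # 1.83
```

By the post's own numbers, both jumps improved performance-per-dollar, the 980-to-1080 jump more so.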
 
removed childish rant

 
Last edited by a moderator:
OP, out of curiosity, do you have any links showing Nvidia using "high-end" in their marketing for the GTX 1080? I'm not saying that they don't; I just didn't find any in a very brief search.
 
I don't really agree with this big-die/small-die issue, looking at the numbers since Nvidia went to that release model:

GTX 680 -> GTX 780: 427 days
GTX 780 -> GTX 980: 483 days

A 56-day difference, or slightly less than two months more wait.

TPU (TechPowerUp) 2560x1600 average summary:

780/680 = 26.6% (0.06% improvement per day)
980/780 = 29.9% (0.06% improvement per day)

GTX 980 -> 980ti: 257 days
980ti -> 1080: 360 days

A 103-day difference, or slightly more than three months longer wait.

TPU 4K avg results:

980ti/980 = 28.2% better (0.11% improvement per day waited)
1080/980ti = 37.0% better (0.10% improvement per day waited)

Waiting for the new uarch, if anything, is more beneficial than waiting for the big die.

Granted, this isn't an indicator of future trends, since future release dates aren't determinable at this point.

However, hypothetically, if this pattern does hold, would you rather always be on the new uarch or the big die?
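The per-day figures above are just percent gain divided by days waited; a quick sketch reproduces them from the TPU averages quoted in the post:

```python
# Improvement-per-day-waited, as used in the post: TPU percent gain
# between two launches divided by the days between those launches.

def improvement_per_day(pct_gain, days_waited):
    return round(pct_gain / days_waited, 2)

print(improvement_per_day(26.6, 427))  # 780 over 680
print(improvement_per_day(29.9, 483))  # 980 over 780
print(improvement_per_day(28.2, 257))  # 980 Ti over 980
print(improvement_per_day(37.0, 360))  # 1080 over 980 Ti
```

Note that 29.9 over 483 days works out to about 0.06%/day, not 0.62%.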
 
inflation-thumb.jpg


But in all reality prices have gone up because the market dictates the price. People are paying these types of prices for cards so Nvidia and AMD will continue to raise the prices until they see a decline in demand.
Yeah, someone should show the price trend for a mid-range phone, and for a mid-range phone marketed as high end, from back then to now.
Mobile phones are an eye-watering example, and their economy of scale (number of sales) is so much better (in a totally different league) than 970/390-and-up discrete GPUs.
Cheers
 
Is the 1080 faster than the 980ti or any of Nvidia's other cards by a reasonable margin? It seems to be, so why does it matter how wide the memory bus is, or how big the die is?
 
I have some awesome well thought out advice for everyone arguing: If you can't afford it or don't like it, then perhaps don't buy it? :wacky:
 
Well, no fucking shit. Even a child can figure that out. The point is that it is getting out of hand, but hardly anyone here seems to give a shit. All we get are ignorant replies, like we did all the way back when this started happening with the 680. Oh, it's faster than the previous card, so it's high end. lol

Are you sure that nobody actually cares? We're here debating this, after all.

My take is that the unusually high pricing of the 1080, and the number of nerds who are falling all over themselves to buy them at that price, demonstrate that in the past, prices for the enthusiast class graphics cards were probably lower than they could have been. Clearly, at least right now, the market is willing to tolerate paying $700 for a graphics card that's really only useful for playing video games. It's so willing to tolerate it, in fact, that there's money to be made buying the things at that price, and then scalping them on ebay. This suggests to me that nVidia could have set the price for the 1080FE at $800, and probably still had every single one they could make sold by the 29th.
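That argument amounts to a simple fixed-supply model: while demand exceeds supply, raising the price raises revenue without losing a single unit sale. A toy sketch with an invented linear demand curve and a hypothetical launch supply (none of these numbers are real sales data):

```python
# Toy fixed-supply pricing model: every price up to the market-clearing
# point still sells out the launch stock, so revenue keeps rising.

def units_demanded(price):
    # invented demand: 30,000 buyers at $500, tapering to zero at $1,100
    return max(0, int(30_000 * (1_100 - price) / 600))

SUPPLY = 10_000  # hypothetical number of launch cards

for price in (600, 700, 800, 900):
    sold = min(SUPPLY, units_demanded(price))
    print(f"${price}: sell {sold} units, revenue ${price * sold:,}")
```

In this toy model the cards sell out at every price shown, which is the "could have charged $800 and still sold every one" point in miniature.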
 
The comparison between the 670 and 760 isn't terribly valid, since both are based on the same microarch (hence a refresh); it's more valid to compare an x70 from an old-gen arch to an x60 of a new gen (e.g. 960 vs 770).

However, the 960 is a massively shrunk die (227 mm²) compared to the 970/980 (398 mm²) or even the 670 (294 mm²), and both generations use the same node.

But I do agree with you that Maxwell's mid-range offering is lacking at best (my first impression of the 960 was basically "eugh"), though I think this is good, at least for AMD: it means AMD is generally the better buy for performance in that price range.

The specs don't matter at all, nor whether they're refreshes or not; what matters is how much relative performance you get per dollar. For a good while now you've had to spend $330-450 to get decent performance from Nvidia. The 560 Ti was the last decent card at $250.

We'll see how well the 1060 does given that the 1070 has jumped to $380, but I doubt it will come very close to 1070 performance at $250-280. And if, performance-wise, it lands between a 1070 and a GTX 970, then it isn't really a worthwhile upgrade for 970 owners. I hope AMD delivers something competitive in the $300 range. Competition is good for us all.
 
Hello. I think a contributing factor in this is that the global economy was in the toilet in 2010; it's better now, so they can charge more for the nerd crack.
 
I was hoping this would turn into an enlightening conversation, I guess I was wrong.

The OP makes too many logical leaps to arrive at the conclusion that "nVidia is jacking up the prices" for the sake of it.

However, the OP has completely and utterly failed to take the COST of the chip into account. If the cost of a 580 die and a 680 die is the same, and the performance increase is significant, why wouldn't nVidia try to sell it as a high-end chip?
Searching the internet, I found that producing a GF110 GPU cost around $205, meaning more than 100% markup at retail; a GTX 560 Ti, meanwhile, cost around $93 to produce.


The OP has also failed to compare the performance improvements between Fermi and Tesla cards of the same tier (say, 280 vs 480), then compare that to the improvements between Kepler and Fermi (say, 580 to 680), Maxwell and Kepler (980 vs 780), and finally Pascal and Maxwell (1080 vs 980), and then compare each scenario's lower tiers where die sizes are similar. If that did indeed show that their x80 cards are closer to mid-range-tier cards in terms of performance increase, the conclusion would be much more convincing.

Why would I compare two different architectures, when they have different CU/GPC/ALU/cache/geometry-processor organization? The performance difference will be there regardless, since the extra shaders allow more performance. But when you compare cards from the same architecture, you can see where they are positioned in the market.


GTX 480 vs GTX 460:
42% more shaders
50% more ROPs
384-bit vs 256-bit bus
59% better compute perf

GTX 580 vs GTX 560 Ti:
33% more shaders
50% more ROPs
384-bit vs 256-bit bus
25% compute perf difference

GTX 780 vs GTX 680:
50% more shaders
50% more ROPs
384-bit vs 256-bit bus
35% compute perf difference
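As a sanity check, the shader and ROP gaps listed can be recomputed from published spec-sheet numbers (the core and ROP counts below are the cards' public specs; treat this as an illustrative sketch):

```python
# Recomputing the big-chip-vs-cut-down gaps from published CUDA core and
# ROP counts: GTX 480/460 = 480/336 cores and 48/32 ROPs, GTX 580/560 Ti
# = 512/384 cores, GTX 780/680 = 2304/1536 cores.

def pct_more(big, small):
    """How much larger the bigger chip's count is, as a rounded percent."""
    return round(100 * (big / small - 1))

print(pct_more(480, 336))    # 480 vs 460 shaders -> 43 (the list rounds to 42)
print(pct_more(48, 32))      # 480 vs 460 ROPs -> 50
print(pct_more(512, 384))    # 580 vs 560 Ti shaders -> 33
print(pct_more(2304, 1536))  # 780 vs 680 shaders -> 50
```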




I think the main point of the OP is that he is ticked off that we are no longer getting the full-sized die like we used to back in pre-Kepler days, when the x80 chips were >500 mm² rather than the current ~300 mm², yet the prices within each tier have remained the same or increased over the years. That is, with the same money we are not buying the same amount of silicon anymore (at the 580/480 prices, the 780 Ti should have been priced the same as the 780, and the 980 Ti the same as the 980).

My argument would be: if that happened, what would become of AMD? If, instead of the 680 being a 680, it were a 670, with a hypothetical 680 Ti as the 680, would the 290X be that big of a hit anymore? If the 980 Ti were the 980, what would people think of the Fury X, or would there even BE a Fury X?
Nowadays the 680 performs at the level of AMD's 2012 midrange (Pitcairn), given driver optimization, and the "high end" 780 is as slow as the R9 280X. And if there had been a 780 Ti positioned as the 680, the 290X would still be at the same MSRP with similar performance.




The wonders of capitalism. OP needs to take an Economics 101 course at his local community college.

Also, his view of history is extremely limited. I'm sure he doesn't remember the 8800 Ultra that went for $700+ when it came out. So little extra performance for so much more money. People still bought them. People who have extra money feel they can spend that little extra to get what they really want. Sometimes it's not performance, it's status.
"From the 8800 Ultra ($800) to the 280 ($650) to the 285 ($400), prices decreased."

Yes, that is correct. :cool:
He is right, but no one cares, because most of us understand basic economics. If and when Nvidia actually prices something out of reach, the market will react and buy less. Nvidia will then slash prices.

When Nvidia releases the new architecture (Pascal keeps most of Maxwell) and adds a smaller node with HBM2/HMC (wait, HMC was too expensive, so they use HBM now), the GV104 (midrange) will cost more than a GTX 1080 does using 16nm and GDDR5X (a GDDR5 refresh).

You're right; it is amazing the lack of intelligence here, when people have to explain the elementary-school concepts of inflation, supply and demand, and perceived value.

The market dictated these prices, not Nvidia. Nvidia only sets the manufacturer's suggested retail price, the market dictates whether or not people will continue to buy at that price.
Producing a midrange/high-end Fermi GPU cost 50% less than MSRP. Now the price has doubled, and there wasn't 100% inflation.
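The inflation side of that claim is easy to check mechanically. A sketch; the 3% annual rate is an illustrative assumption, not actual CPI data:

```python
# Mechanical check of the inflation point: compound an older price at an
# assumed average inflation rate. The 3%/yr figure is an illustrative
# assumption, not actual CPI data.

def inflate(price, years, annual_rate=0.03):
    """Price restated in later dollars after compounding inflation."""
    return round(price * (1 + annual_rate) ** years, 2)

# A $500 launch price from six years earlier, in later dollars:
print(inflate(500, 6))  # ~597: well short of a 2x ($1000) price jump
```

At any plausible single-digit rate, a few years of inflation explains tens of percent, not a doubling.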



Actually, I don't.

There is a huge degree of certainty that there will be one, yes, given what they did with Kepler and Maxwell, but I wasn't entirely certain the 980 Ti was going to be a thing until it was actually released; the 680 never got a 680 Ti.

A huge degree of certainty is still not 100%, and it won't be until they announce it.

Whether one finds $700 acceptable or logical is a matter of personal opinion, but my opinion is this: if you look beyond the part number, going from the 580 to the 780 Ti very nearly doubled performance (according to TPU graphs) while going from a $500 part to a $700 part (the 780 Ti had a $700 MSRP), for similarly sized chips.

Now compare the similarly sized 980 and 1080 (the latter a $600 MSRP; only the FE is currently priced at $700, and there will undoubtedly be aftermarket cards closer to that $600). It, too, is almost double the performance, so in that comparison $600 isn't too stupid.
GP100
Inside Pascal: NVIDIA’s Newest Computing Platform
 
I said it pre-release of the 1080 & 1070 and I'll say it again: I can't wait for a few years from now, because with NV marking up their stuff like 300% (or up to 1000% by now?), there are bound to be GPU manufacturers popping up all over the place to take a piece of that pie for slightly less. Heck, there should be at least a few new manufacturers coming right from these forums.

I mean, it's easy, right? NV is obviously holding back on us too; they really could have easily released a chip 2x as fast at a lower price and still made money.
 
PontiacGTX said:
When Nvidia releases the new architecture (Pascal keeps most of Maxwell) and adds a smaller node with HBM2/HMC (wait, HMC was too expensive, so they use HBM now), the GV104 (midrange) will cost more than a GTX 1080 does using 16nm and GDDR5X (a GDDR5 refresh).

Producing a midrange/high-end Fermi GPU cost 50% less than MSRP. Now the price has doubled, and there wasn't 100% inflation.
You didn't read everything I wrote, did you? I mentioned two other things in that same sentence as inflation.

Inflation is only part of it. There's also supply and demand and perceived value.

Nvidia does some research. They think the market will pay a certain MSRP. They release product at a suggested price. The market either does or does not pay the MSRP. Prices adjust accordingly. Consumers are to blame for high prices, because the majority of customers perceive the card's worth as higher, and enough product is sold at the higher price to tell Nvidia their prices are justified.

Until there are more butthurt customers like yourself saying the cards aren't worth that price, the market won't react and lower prices. So feel free to continue rallying people, but it really is an uphill battle. My suggestion is to save a little more money, since a couple hundred is pocket change compared to other adult bills.
 
You didn't read everything I wrote, did you?

Inflation is only part of it. There's also supply and demand and perceived value.

Nvidia does some research. They think the market will pay a certain MSRP. They release product at a suggested price. The market either does or does not pay the MSRP. Prices adjust accordingly. Consumers are to blame for high prices, because the majority of customers perceive the card's worth as higher.

Until there are more butthurt customers like yourself saying the cards aren't worth that price, the market won't react and lower prices.
Asking for better prices is bad, then? Because prices will keep increasing and Nvidia won't care, since it has most of the market share; and if AMD doesn't get enough support, the only thing you will get are 104/114/204 GPUs for $1000, with higher prices just because they want more profit, blamed on inflation, competition, "low demand", etc.
 
This isn't hard to understand. You price your part for whatever the market will bear. So let's price things based solely on Nvidia products:

The GTX 460 1GB = $230, because it's no faster than the GTX 275, a card released a year earlier at $275. By the time the 460 launched, the 275 was selling for $230!

ZOTAC GeForce GTX 460 1 GB Review

Yeah, new features are nice, but if the card is too expensive, nobody will buy the new hotness. And the Core 448 was released late in life to compete with AMD's refresh parts (nothing to do with initial pricing of the part).

As for the GTX 680, its performance was 25% FASTER than a GTX 580, a card that was priced AT THE TIME at $400 (because of the 7970). So $500 was fitting :D

NVIDIA GeForce GTX 680 Kepler 2 GB Review

I don't see why you're getting so sad over this. The performance improvement over previous parts determines how good a deal the card is. Just because the GK104 released before the GK110 doesn't mean it was poor value AT THE TIME. AMD pulled the same stunt, waiting over a year to release Hawaii.

Now, $700 is pricey, but the card is 25% faster than a $600 GTX 980 Ti. The price will quickly fall after the "gotta have it now" folks are done, just like it always does.

Also, you people need to get used to the idea that prices may rise because the cost per transistor is rising with each die shrink.
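That last point about cost per transistor can be illustrated with toy numbers; everything below is an invented placeholder, not real foundry pricing:

```python
# Toy illustration of rising cost per transistor: wafer cost divided by
# transistors shipped per wafer. ALL figures are invented placeholders.

def cost_per_billion_transistors(wafer_cost, dies_per_wafer, bn_per_die):
    return wafer_cost / (dies_per_wafer * bn_per_die)

# Hypothetical: the new node packs 40% more transistors into a same-size
# die, but the wafer costs 60% more.
old_node = cost_per_billion_transistors(5000, 200, 3.0)
new_node = cost_per_billion_transistors(8000, 200, 4.2)
print(new_node > old_node)  # denser, yet MORE expensive per transistor
```

Whenever wafer cost rises faster than density, shrinking stops making transistors cheaper, which is the worry with recent nodes.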
 
Asking for better prices is bad, then? Because prices will keep increasing and the companies won't care, since they have most of the market share; and if AMD doesn't get enough support, the only thing you will get are 104/114/204 GPUs for $1000, with higher prices just because they want more profit, blamed on inflation, competition, etc.
No, it's not bad, it's just futile when you're only talking about a few dollars here and there. It's also futile unless you can get a massive following and sway the market. Based on all the backlash you're getting here, I would say the market can stomach quite a few more price increases.
 
Asking for better prices is bad then?

It's not bad. But while you're at it, why not demand nVidia sell us their products at cost? No individual can decide the fair price of a product. Only the market (and government regulators) can do that :D

Because prices will keep increasing and Nvidia won't care, since it has most of the market share; and if AMD doesn't get enough support, the only thing you will get are 104/114/204 GPUs for $1000, with higher prices just because they want more profit, blamed on inflation, competition, "low demand", etc.

nVidia will sure as hell care if people stop buying their products because they're too expensive. Guess what, people are still buying them - i.e. they're not too expensive. Why are we having so much trouble with basic economics in this thread?

I have no idea what you mean by AMD not getting enough support. Their sales and revenue are a result of the quality of their engineering, management and marketing abilities. They're not a damn charity.
 
Who cares about code names or architectures if you get the relative performance for the money spent? If you don't like how much performance you get per dollar, then don't buy it. Prices will then normalize.
 
I can only quote myself here:

There is already a new revision of Maxwell out.

Maxwell V1 = GM107
Maxwell V2 = GM204

We could try adding one more parameter, one that supports my claim and goes against yours:



Midrange SKU (GF114 - Tick) - 332 mm² - 256 bit / 128.26 GB/s
Highend SKU (GF100 - Tock) - 520 mm² - 384 bit / 192.384 GB/s
Midrange SKU (GK104 - Tick) - 294 mm² - 256 bit / 192.256 GB/s
Highend SKU (GK110 - Tock) - 561 mm² - 384 bit / 336.4 GB/s
Midrange SKU (GM204 - Tick) - 398 mm² - 256 bit / 224 GB/s
Highend SKU (GM200- Tock) - 550 mm² - 384 bit / 336 GB/s
Midrange SKU (GPxx4 - Tick) - ~300 mm² - (stacked memory, details not public yet)
Highend SKU (GPxx0 - Tock) - ~550 mm² - (stacked memory, details not public yet)

My upgrade path:
GF100 -> GK110 -> GM200 (true highend upgrade path)
A buyer on a more restricted budget should go:
GF114 -> GK104 -> GM204 (midrange upgrade path)

The only reason Nvidia can get away with selling Gxxx4 SKU's as "highend" is because Nvidia's midrange SKU's can compete with AMD's highend SKU's.

Again, look at the metrics.

I will now add some of the "missing data":

Midrange SKU (GF114 - Tick) - 332 mm² - 256 bit / 128.26 GB/s (GTX 560 Ti)
Highend SKU (GF100 - Tock) - 520 mm² - 384 bit / 192.384 GB/s (GTX 480)
Midrange SKU (GK104 - Tick) - 294 mm² - 256 bit / 192.256 GB/s (GTX 680, high end SKU by name only should have been 660 Ti)
Highend SKU (GK110 - Tock) - 561 mm² - 384 bit / 336.4 GB/s (GTX 780Ti /Titan)
Midrange SKU (GM204 - Tick) - 398 mm² - 256 bit / 224 GB/s (GTX 980)
Highend SKU (GM200 - Tock) - 550 mm² - 384 bit / 336 GB/s (GTX 980 Ti / Titan X)
Midrange SKU (GP104 - Tick) - 314 mm² - 256 bit / 230 GB/s (GTX 1080)
Highend SKU (GPxx0 - Tock) - ~550 mm² - (stacked memory, details not public yet)

I will return when the GTX 1080 Ti / Titan is released.

The pattern is clear: when AMD competition is absent, NVIDIA is able to charge premium prices for midrange GPUs:
GTX 680 vs 7970
GTX 1080 vs [missing competition]
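The tiers in that table lend themselves to being encoded as data; by these metrics, bus width alone separates the two classes. Die sizes and bus widths below are the figures quoted in the post (the not-yet-public GPxx0-style entries are omitted):

```python
# The midrange/high-end die pattern from the post's table, as data.
# Die sizes (mm^2) and bus widths are the figures quoted in the post.

chips = [
    # (chip,    tier,       die_mm2, bus_bits)
    ("GF114", "midrange", 332, 256),
    ("GF100", "high-end", 520, 384),
    ("GK104", "midrange", 294, 256),
    ("GK110", "high-end", 561, 384),
    ("GM204", "midrange", 398, 256),
    ("GM200", "high-end", 550, 384),
    ("GP104", "midrange", 314, 256),
]

# By the post's metrics, bus width alone predicts the tier label:
for chip, tier, die_mm2, bus_bits in chips:
    predicted = "midrange" if bus_bits == 256 else "high-end"
    assert predicted == tier, chip
print("256-bit = midrange die, 384-bit = high-end die, for every chip listed")
```

Whether that physical classification should drive pricing is exactly what the rest of the thread argues about.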
 
This isn't hard to understand. You price your part for whatever the market will bear. So let's price things based solely on Nvidia products:

The GTX 460 1GB = $230, because it's no faster than the GTX 275, a card released a year earlier at $275. By the time the 460 launched, the 275 was selling for $230!

ZOTAC GeForce GTX 460 1 GB Review

Yeah, new features are nice, but if the card is too expensive, nobody will buy the new hotness. And the Core 448 was released late in life to compete with AMD's refresh parts (nothing to do with initial pricing of the part).

As for the GTX 680, its performance was 25% FASTER than a GTX 580, a card that was priced AT THE TIME at $400 (because of the 7970). So $500 was fitting :D

NVIDIA GeForce GTX 680 Kepler 2 GB Review

I don't see why you're getting so sad over this. The performance improvement over previous parts determines how good a deal the card is. Just because the GK104 released before the GK110 doesn't mean it was poor value AT THE TIME. AMD pulled the same stunt, waiting over a year to release Hawaii.

Now, $700 is pricey, but the card is 25% faster than a $600 GTX 980 Ti. The price will quickly fall after the "gotta have it now" folks are done, just like it always does.

Also, you people need to get used to the idea that prices may rise because the cost per transistor is rising with each die shrink.

At the end of the day, this is all that matters: where the card falls in relation to existing products and the competition. Arguing about die sizes and bus widths, and trying to conclude that you're getting cheated, is fucking pointless. Names are marketing; code names are marketing. If they had changed the code name of GP104 to GP100 and never released "big" Pascal to the gaming market, would we be having this discussion? The 1080 is faster than the old market leader by a significant margin and costs less (at least the non-FE is supposed to). What's the problem? People in this thread will argue that the "04" chips are masquerading as high-end chips, but NV marketing would probably tell you that the 04 chips are the "new" high end, and that the "100" chips are a new "class" of chips that command a premium. And they can get away with making that assertion as long as the competition's "high end" chips are only as good as NV's "mid-range" chips.
 
Some of you will have a hard time convincing anyone these are midrange cards when they are blowing everything else out of the water. But keep trying I guess.
 
At the end of the day, this is all that matters: where the card falls in relation to existing products and the competition. Arguing about die sizes and bus widths, and trying to conclude that you're getting cheated, is fucking pointless. Names are marketing; code names are marketing. If they had changed the code name of GP104 to GP100 and never released "big" Pascal to the gaming market, would we be having this discussion? The 1080 is faster than the old market leader by a significant margin and costs less (at least the non-FE is supposed to). What's the problem? People in this thread will argue that the "04" chips are masquerading as high-end chips, but NV marketing would probably tell you that the 04 chips are the "new" high end, and that the "100" chips are a new "class" of chips that command a premium. And they can get away with making that assertion as long as the competition's "high end" chips are only as good as NV's "mid-range" chips.

Look at die size + memory bus.
You can try to "relabel" those; their measurements don't care.

Volta dies will follow the same pattern.
The 1080 is the replacement for GTX 980
The GTX 980Ti/Titan X will get their replacement SKU's.

And if you follow the logic of some people, that will be a NEW segment...above the GM200 SKU's.


You can't have your cake and eat it too.
 
OP,

We debated this to exhaustion in the past and as you can see, there are strong opinions on both sides of this argument.
From what I've gathered, since the GTX 680 the "100" chip went from increasing performance over the last gen by 30-40% to nearly 80%.
The Titan X OC was the same as two 980s in SLI.

Nvidia's performance skyrocketed during the GTX 600 series, and this enabled the 100 chip to be moved up and called the Titan.
I can't blame NV for making their small chips as fast as big chips. In the end we have all benefited from these faster mid-range parts, because beast performance like the Titan and Ti wouldn't come until the next gen, instead of in the same gen.
No more refreshes like the GTX 480 to GTX 580, one-year-cycle cards. Now cards can have two-year cycles, and the Titan up to a three-year cycle.

Titan: next-gen performance, today. It has lived up to that so far. The Titan X is still a viable option against the GTX 1080.
 
I can only quote myself here:



I will now add some of the "missing data":

Midrange SKU (GF114 - Tick) - 332 mm² - 256 bit / 128.26 GB/s (GTX 560 Ti)
Highend SKU (GF100 - Tock) - 520 mm² - 384 bit / 192.384 GB/s (GTX 480)
Midrange SKU (GK104 - Tick) - 294 mm² - 256 bit / 192.256 GB/s (GTX 680, high end SKU by name only should have been 660 Ti)
Highend SKU (GK110 - Tock) - 561 mm² - 384 bit / 336.4 GB/s (GTX 780Ti /Titan)
Midrange SKU (GM204 - Tick) - 398 mm² - 256 bit / 224 GB/s (GTX 980)
Highend SKU (GM200 - Tock) - 550 mm² - 384 bit / 336 GB/s (GTX 980 Ti / Titan X)
Midrange SKU (GP104 - Tick) - 314 mm² - 256 bit / 230 GB/s (GTX 1080)
Highend SKU (GPxx0 - Tock) - ~550 mm² - (stacked memory, details not public yet)

I will return when the GTX 1080 Ti / Titan is released.

The pattern is clear: when AMD competition is absent, NVIDIA is able to charge premium prices for midrange GPUs:
GTX 680 vs 7970
GTX 1080 vs [missing competition]

A couple of things: why do you start the chart with the GTX 560 Ti and then jump to the 480?

Where are the 460 and 560?

Where is the mention that the Fermi cards were 40nm parts?

Where is the transistor count per generation?

Why no mention of the 448-bit and 512-bit GTX 200 series?

Where are efficiency and targeted TDP numbers?

The GTX 480 is GF100 and the GTX 580 is GF110; why no mention of that? (By that part-number logic, shouldn't the 980 Ti/Titan X be slower than the GTX 980?)

How big was the jump in performance from the GTX 560 to the GTX 660? How much from the 570 to the 670, and from the 580 to the 680?

The GTX 680 launched at $499, the same as the GTX 580.

The HD 7970 launched at $549, $200 more than the HD 6970. In this case it was AMD that was shocked, and they then re-launched the HD 7970 as the GHz Edition, $50 cheaper, to be able to compete with the GTX 680.

Where is the pattern clear? The range of a card is not measured by shader count, bus width, or die nomenclature; it is given by the performance achieved, that's all. All the rest of the numbers mean shit if the performance is there.
 
NVidia is only doing what they should; the ones really at fault are AMD. Because of their failure to execute time and time again, beginning with Fermi, Nvidia has been able to pawn off what would have been a midrange chip as a high-end chip. It has happened again: the GP104 has clearly been designed as the midrange chip, but it so thoroughly dominates AMD that NVidia can pawn it off as the high end (fool me once, shame on you; there will be no fooling me twice). So I will wait patiently for the proper high-end chip, the GP102, or the ultra-high-end GP100.
 
A couple of things: why do you start the chart with the GTX 560 Ti and then jump to the 480?

Where are the 460 and 560?

Where is the mention that the Fermi cards were 40nm parts?

Where is the transistor count per generation?

Why no mention of the 448-bit and 512-bit GTX 200 series?

Where are efficiency and targeted TDP numbers?

The GTX 480 is GF100 and the GTX 580 is GF110; why no mention of that? (By that part-number logic, shouldn't the 980 Ti/Titan X be slower than the GTX 980?)

How big was the jump in performance from the GTX 560 to the GTX 660? How much from the 570 to the 670, and from the 580 to the 680?

The GTX 680 launched at $499, the same as the GTX 580.

The HD 7970 launched at $549, $200 more than the HD 6970. In this case it was AMD that was shocked, and they then re-launched the HD 7970 as the GHz Edition, $50 cheaper, to be able to compete with the GTX 680.

Where is the pattern clear? The range of a card is not measured by shader count, bus width, or die nomenclature; it is given by the performance achieved, that's all. All the rest of the numbers mean shit if the performance is there.

Try adding that info into the chart and see if it changes anything.
Wake me up if it does... if not, I fail to see the point.
 
Look at die size + memory bus.
You can try to "relabel" those; their measurements don't care.

Volta dies will follow the same pattern.
The 1080 is the replacement for GTX 980
The GTX 980Ti/Titan X will get their replacement SKU's.

And if you follow the logic of some people, that will be a NEW segment...above the GM200 SKU's.


You can't have your cake and eat it too.

You keep saying to look at die size and memory bus... why? I don't give a damn about those things. Bus width is irrelevant: bandwidth matters. Die size is irrelevant: shader and geometry throughput matter. You're making arguments based on single stats that have only a tenuous relation to performance, then arguing that those stats should determine the price, rather than letting the performance dictate the price. This is the same reason we don't compare clock speeds across dissimilar architectures, and why we recognize that one Intel core isn't the same as one AMD CPU core. AMD sells an "8 core" CPU for $150... does that mean Intel is taking the piss for selling a 6-core chip for $400? We all know better. If you really want somebody to blame, blame AMD for producing "high end" chips that only perform like "mid range" chips. But then again, I guess since the Fury X had a 4096-bit memory bus, it's the greatest GPU of all time.
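On the "bandwidth matters, not bus width" point: bandwidth is bus width times per-pin data rate, so a narrower bus with faster memory can land in the same place as a wider bus with slower memory. A quick check using the cards' published memory speeds:

```python
# Peak bandwidth = (bus width in bits / 8 bytes) x per-pin data rate.
# Memory speeds below are the cards' published spec-sheet numbers.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(384, 7))   # GTX 980 Ti: 384-bit, 7 Gbps GDDR5  -> 336.0
print(bandwidth_gbs(256, 10))  # GTX 1080: 256-bit, 10 Gbps GDDR5X -> 320.0
```

The 1080's 256-bit bus with GDDR5X delivers within a few percent of the 980 Ti's 384-bit bus, which is why bus width alone tells you little.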

OP,

We debated this to exhaustion in the past and as you can see, there are strong opinions on both sides of this argument.
From what I've gathered, since the GTX 680 the "100" chip went from increasing performance over the last gen by 30-40% to nearly 80%.
The Titan X OC was the same as two 980s in SLI.

Nvidia's performance skyrocketed during the GTX 600 series, and this enabled the 100 chip to be moved up and called the Titan.
I can't blame NV for making their small chips as fast as big chips. In the end we have all benefited from these faster mid-range parts, because beast performance like the Titan and Ti wouldn't come until the next gen, instead of in the same gen.
No more refreshes like the GTX 480 to GTX 580, one-year-cycle cards. Now cards can have two-year cycles, and the Titan up to a three-year cycle.

Titan: next-gen performance, today. It has lived up to that so far. The Titan X is still a viable option against the GTX 1080.

Solid summary of it right here
 
I remember this was a hot topic when the 680 & 670 came out, since they were missing much of the non-gaming compute power that the 580 & 570 had. People felt it was a re-badged mid-range chip because of that (aside from the chip code, the unexpectedly low TDP, and the narrower bus width).

Then people saw the bottom of many (not all) 670s and noticed they had short (historically mid-range-length) PCBs with longer fan shrouds to make them look like "high end" cards. Certainly plenty of conspiracy fodder.





The biggest assumption here is that Nvidia's chip nomenclature hasn't changed. If it hasn't changed and these really are mid-range GPUs, then this is just a result of AMD's inability to provide any real competition.

If Nvidia can beat AMD with their mid-range offerings, then why not? Blame AMD.
 
The performance of high-end video cards has improved several times over in the last few years. By comparison, the performance increase of midrange video cards has been quite small, and that of low-end cards smaller still.

The main reason for this is that AMD has NOT released anything new for the mid range or low end since the 7xxx series. All the newer AMD mid- and low-range video cards are just rebadged 7xxx-series GPUs. And without any new GPUs, AMD cannot put much pressure on Nvidia, so Nvidia can basically price things as it wishes.

MAYBE that will change this year when AMD finally releases something new in mid-range and low-end. MAYBE!
 