Some Perspective: Zarathustra's Nvidia Price History

Too lazy and tired to read all the comments, but including the performance jump from the previous generation might be neat to know.

The Ti 4600 was the first video card I bought, back in high school. Worked a ton of minimum-wage hours to play UT at an awesome (1600x1200) resolution. Haha. Was the king of LAN parties... in the gfx department only.
 
I said all that almost word for word during Pascal's release, when people were bitching up a storm about the pricing. But the truth doesn't appease those who just want to bitch, even when the data is only a few 30-second Google searches away, since most reviews of the video cards released in the last 17 years are still online.
 
I know, I know.

I actually made them JPEGs originally because I was on autopilot and not thinking about it. I later changed them to PNGs, but for some reason the editor seems to recompress them into WebP format, resulting in JPEG-like artifacting.

Not my fault, promise. :p

I'm guessing the front page wasn't originally designed with charts and graphs in mind.
Or, you know, just use 100% quality JPEGs. No artifacts.
 
I appreciate the table, and I get it; however, I seem to remember prices dropping much faster, cut-down cards not being so cut down, and so on.
Were previous-generation cards being cleared out too?
Then again, it's all foggy, because it's low priority to memorize these things anyway.
 
Or, you know, just use 100% quality JPEGs. No artifacts.

As someone with shitbox third-world rural New Zealand internet, PNG is almost always slower to load than JPG... at 100% quality it's pretty much a wash, and less hassle to do it that way.
 
Really need to include Titan data to make the chart and graph most useful.
 
I don't know why, but even if I account for inflation I can't get the same calibre of GPU now as I could in 2001.

In 2001 I could get a GeForce 2 GTS, which was about the GTX 1080 of its time, for 70K (GF2 Ultra = Titan XP, GF2 Ti = the 1080 Ti that just released). Inflation has been about 100% since then, so I should be able to get a 1080 for 140K now. But the cheapest 1080 right now is 200K.

So thank you very much for the comparison, but I'll continue to whine.

PS Prices in local currency.
 
I don't know why, but even if I account for inflation I can't get the same calibre of GPU now as I could in 2001.

In 2001 I could get a GeForce 2 GTS, which was about the GTX 1080 of its time, for 70K (GF2 Ultra = Titan XP, GF2 Ti = the 1080 Ti that just released). Inflation has been about 100% since then, so I should be able to get a 1080 for 140K now. But the cheapest 1080 right now is 200K.

So thank you very much for the comparison, but I'll continue to whine.

PS Prices in local currency.

What is your local currency? I would guess that its value relative to the US dollar could be the reason. Remember that NVIDIA, being an American company, prices things in USD. So you take the price in USD, multiply by the USD-to-local-currency exchange rate, and get what you'd expect to pay locally. Your currency may well be lower against the dollar now than it was in 2001; the dollar is quite strong at the moment.

So for example if in 2001 one dollar bought 140 of your currency but now one dollar buys 286 of your currency, that would explain the difference.
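
A quick sketch of that arithmetic, in Python purely for illustration; the $700 MSRP and both exchange rates below are just the figures floated in this thread, not real market data:

```python
def local_price(usd_price: float, usd_to_local_rate: float) -> float:
    """Convert a USD launch price into local currency at a given exchange rate."""
    return usd_price * usd_to_local_rate

msrp_usd = 700  # approximate 1080 Ti launch price mentioned in the thread
print(local_price(msrp_usd, 140))  # 98,000 at the hypothetical 2001-era rate
print(local_price(msrp_usd, 286))  # 200,200 at the hypothetical current rate
```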
 
I had 2 cards on that list, the GTX285 and the GTX980, both EVGA.

I just calculated what my Video Toaster 4000 card cost in today's dollars: $4,278.65. That is quite a bit of cash. It was $2,514.75 with tax when I bought it in 1993.
 
What I think is more interesting is the connectors at the end of those cards, the heat sinks, and the size of the cards. A Google search for a GeForce 2 card shows a single-slot card with a DVI and a VGA connector. Now look at the Founders Edition of the 1080 Ti: the connectors are DisplayPort (2006) and HDMI (2002). They have to be, because the resolution exceeds what VGA and DVI can carry.

Also... food for thought... the 1050, the lowest-performing card of this generation's 10-series, outperforms how many of those once "top-of-the-line" cards?

And then, what were the typical resolutions used in 2000 compared with today?
 
You forgot this beauty, at $700!

http://www.anandtech.com/show/1861

The GeForce 7800 GTX 512MB, released to compete with the X1800 XT. Factory-overclocked like a motherfucker, with a 30% faster core and 40% faster memory!

I had a 7900 that was a beast...

[Photos: 7900_VF_IV_1.jpg, 7900_VF_IV_2.jpg]
 
Personally, I would like to see AMD/ATi's prices on that graph too. Also, notice the gradual uptrend since 2009, and the 2006 prices due to the purchase of ATi by AMD; those numbers tell you pretty much what NVIDIA will do if AMD goes under or stops trying to compete at the high end.
 
What I think is more interesting is the connectors at the end of those cards, the heat sinks, and the size of the cards. A Google search for a GeForce 2 card shows a single-slot card with a DVI and a VGA connector. Now look at the Founders Edition of the 1080 Ti: the connectors are DisplayPort (2006) and HDMI (2002). They have to be, because the resolution exceeds what VGA and DVI can carry.

Also... food for thought... the 1050, the lowest-performing card of this generation's 10-series, outperforms how many of those once "top-of-the-line" cards?

And then, what were the typical resolutions used in 2000 compared with today?

I was running 1600x1200 back then, which isn't much different from 1920x1080, which is probably the typical resolution today.
 
This comparison is slightly off.

Nowadays NVIDIA is selling their upper-midrange cards (the xx70 and xx80, going by die size and transistor count) at high-end prices. I can't remember exactly when NVIDIA made that switch, but it probably had something to do with the lack of competition from AMD. The real high-end cards (the Ti and the Titans) usually come later, or are totally overpriced in the case of the Titans.
So while the chart implies the prices were relatively stable, it has a slight feeling of apples and oranges. Good job, NVIDIA marketing team.
 
GDI ZARA
I will spank you for using .jpg on this kind of image... PNG is your friend.
You don't get all those horrible artifacts in high-contrast areas, and you end up with a smaller file :D


I might be a bit hypersensitive because I have 65 books of approximately 200 pages each that some guy scanned into .jpg :banghead: and I have to clean them up and save them as .png.

While we're nitpicking: no polynomial trend lines. The maxima and minima of the fitted curve don't correspond to the actual data points.
 
Firstly, PC hardware in general has gotten cheaper over the years. One of the reasons we're so excited about Ryzen is that Intel's pricing isn't too different either, but then again, neither is their hardware: the Intel equivalent is $1000, while Ryzen is $330. So adjusting for inflation makes no sense.

Secondly, we all know why Nvidia released the 1080 Ti: the rumors about Vega. And it's surprisingly faster than a Titan, which is strange, because Tis are usually slower than Titans. Maybe it's because Vega was rumored to be faster than Nvidia's Titan? So for now Nvidia is enjoying the $700 sale price before Vega ships at whatever price it ships at. Considering AMD's recent pricing with the RX 480 and Ryzen, it's safe to say Vega will be cheaper.
 
A 6800 Ultra; not sure how it has survived my wife's constant efforts to eliminate every loose PC component from the house. Never much cared for the creepy scientist graphic.


[Photo: 20170311.jpg]
 
What is your local currency? I would guess that its value relative to the US dollar could be the reason. Remember that NVIDIA, being an American company, prices things in USD. So you take the price in USD, multiply by the USD-to-local-currency exchange rate, and get what you'd expect to pay locally. Your currency may well be lower against the dollar now than it was in 2001; the dollar is quite strong at the moment.

So for example if in 2001 one dollar bought 140 of your currency but now one dollar buys 286 of your currency, that would explain the difference.

It was in the same ballpark for ages. Around 280-300.
 
Doing what, playing COD Modern Warfare 1? Those had like 512MB, right? I had an 8800 GTS with 384MB and I thought that was the pinnacle of awesome back then. 2005?

lol... this awesome example of hi-tech video awesomeness... is now running in an MCE PC, running Kodi, emulators, and the occasional Portal/Half-Life game ;)
 
You are correct. These became very difficult decisions to make when creating this history. In the end I just had to pick one in some cases. Product cycles are not as straightforward and simple in practice as they are in theory. Any analysis like this is always going to be an approximation.

The main thing is that in the table you are clearly talking about the 8800 Ultra, but on the chart it's labeled as an 8800 GTX, so if someone only looks at the chart they're thinking about the wrong card. There actually could be another data point on the chart/table for the 8800 GTX, like there is for the 7xx, 9xx, and 10xx cards, and you'd see the same price inflation. If I had to guess, the 8800 GTX would probably plot just under $800, making it fall much more in line with the rest of the cards.

For those wondering, here is the [H] 8800 GTX article. It launched November 8, 2006, with a $599 price tag. http://www.hardocp.com/article/2006/11/08/bfgtech_geforce_8800_gtx_gts

The 8800 Ultra launched May 2, 2007, six months after the 8800 GTX. That card was about 10% faster for another $230 on the MSRP. http://www.hardocp.com/article/2007/05/02/nvidia_geforce_8800_ultra_sli

The 8800 GTX was actually a great value given the crazy performance jump; the 8800 Ultra was more of a cash grab for a higher-binned product.
 
Also... food for thought... the 1050, the lowest-performing card of this generation's 10-series, outperforms how many of those once "top-of-the-line" cards?

In very general terms, it wasn't unreasonable to expect the next generation's midrange card to be competitive with the previous generation's high-end card: the GTX 1060 is similar in performance to the GTX 980, the 960 is close to the 780, the 770 to the GTX 680, and so on. The only exceptions are the FX 5xxx and 9xxx series, where other issues held those generations back. I would expect a 1050 to be able to compete with a GTX 780, except in memory-limited scenarios like turning on AA. What tends to keep a high-end card fast is lots of memory bandwidth, which is usually cut down on lower-end cards. GPU processing power outpaces memory-bandwidth growth, so it can make for some interesting comparisons when you're deciding whether to keep using an older card. Without finding a review to know for sure, I'd guess the 1050 or 1050 Ti probably trades blows with a GTX 780: the 1050 might win at 1680x1050 with no AA, but the GTX 780 would likely come back at 1080p with 8x AA. There will also be a big swing between DX11 and DX9 titles, as the 780 is probably faster in DX9 and the 1050 faster in DX11.
 
That's not fair...now what am I going to whine about?

I personally don't have a problem with their hardware; my problem is their monopolistic ways and overall anti-consumer attitude.

Examples:

GameWorks' original release pretty much made games run like crap on anything not made by them.
Blocking consumers from using an AMD GPU as the primary card and an Nvidia card for PhysX, by disabling the card via the driver.
G-Sync.
Forcing the consumer to log in before allowing you to use GFE.
Installing "telemetry" (spyware, in my opinion) via GFE or the standalone drivers, without informing you and without an option to opt out unless it's manually uninstalled.

That last one really bothers me, since a driver has complete system access and gathers a lot of information that has no reason to be sent to them, and the worst part is that all these tech sites very conveniently never bother to mention it or make a fuss about it.

I do understand that neither AMD nor anyone else has provided any real competition to them; as a matter of fact, I have a 970 because of that. But I will jump to a Vega card if it is faster than what I currently have, just because of their crap.
 
Every single time a brand new high-end NVIDIA GPU is launched, we get the same responses. Some people think it is too expensive, while others think it is well worth the price considering the performance. What is and isn't worth it to you is obviously a very subjective matter, and as such we are not here to tell you what to think. It is, however, beneficial to have a common set of data to go back to when we have these discussions. And you know how HardOCP likes data! For about 7 years now, we have been compiling pricing data in the NVIDIA price history chart seen below, taking inflation into account so that we are making an "apples to apples" comparison of older and newer GPUs.

Our method is simple: pick NVIDIA's top-end GPU introduction price for each era and estimate inflation using the Bureau of Labor Statistics' published values for the Consumer Price Index (CPI-U) based on the launch date of each GPU. The end period used is January 2017, as these are the most recent published figures. (I doubt there has been much inflation since then.) Once that's done, we figure out the price in 2017 dollars for every card at launch and chart those on a graph. The astute reader will notice that Titan cards are absent from this comparison. This was a conscious decision, as Titan cards appear to be in a "halo class" of their own, and their pricing doesn't seem to obey the normal laws of physics (err, normal pricing trends), which made the analysis look odd. We also made the conscious decision to start in 2000, as that seemed like a nice round year to start things at.
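
(As an aside, the adjustment described above boils down to scaling each launch price by the ratio of the two CPI-U values. A minimal Python sketch; the $500 launch price and both CPI figures below are placeholders for illustration, not the actual BLS numbers:)

```python
def to_2017_dollars(launch_price: float, cpi_at_launch: float, cpi_jan_2017: float) -> float:
    """Express a launch-date MSRP in January 2017 dollars via the CPI-U ratio."""
    return launch_price * (cpi_jan_2017 / cpi_at_launch)

# Placeholder inputs -- substitute the published CPI-U values when building the real chart.
print(round(to_2017_dollars(500, cpi_at_launch=172.0, cpi_jan_2017=243.0), 2))  # 706.4
```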

Without further ado, let's look at the data:

As we can see from the chart, current pricing for the 1080 Ti is pretty much in line with where NVIDIA has typically been. When adjusted for inflation, the 1080 Ti almost exactly matches the price of the GeForce 2 Ultra from back in 2000. There is some notable fluctuation over the years, which mostly seems to track whether NVIDIA had true competition in the marketplace. When NVIDIA was on top and the competition had nothing, prices went up, as we can see with the 8800 Ultra. At other times, during periods of stronger competition, pricing was lower. You could argue that the 1080 Ti is actually under-priced for the market climate. Argue whichever way you want about the appropriateness of NVIDIA's pricing, but this information does show a trend much in line with the relative market position of the brand.

And

I dunno, the graph kind of looks all over the place to me.

But I get what you're saying; it's not wildly out of line with historical values, for the most part.

I believe it looks all over the place because we don't have any idea of their profit margin. If we looked at GPUs by GMROI$ (gross margin return on investment dollars), we'd see a much smoother trend line. GMROI accounts for the total cost of goods sold and takes into account all operating costs (buildings, employees, research, development, marketing). That would provide a much clearer picture, and you can see hints of it in the graph, where year-one cost is fairly high (likely down from the start of the company).

Unfortunately, we'd need internal numbers to get this data: cost, revenue, and units sold, each by product. You might be able to pull most of the information from earnings reports.
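
To make the idea concrete, here is a rough sketch of that kind of per-product calculation, following the description above; every field name and number is hypothetical, since the real internals aren't public:

```python
from dataclasses import dataclass

@dataclass
class ProductFinancials:
    # Hypothetical per-product figures -- not real NVIDIA data.
    revenue: float             # total revenue from the product
    cogs: float                # cost of goods sold
    allocated_opex: float      # share of operating costs (R&D, marketing, buildings, staff)
    avg_inventory_cost: float  # average inventory investment

def gmroi_dollars(p: ProductFinancials) -> float:
    """Margin left after COGS and allocated operating costs, per dollar of inventory investment."""
    margin = p.revenue - p.cogs - p.allocated_opex
    return margin / p.avg_inventory_cost

# Made-up example card
example = ProductFinancials(revenue=1_000_000, cogs=550_000,
                            allocated_opex=200_000, avg_inventory_cost=300_000)
print(round(gmroi_dollars(example), 2))  # 0.83 with these invented numbers
```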
 
Have you considered currency fluctuations? Perhaps a better comparison would be in the currency of the country of manufacture?

And



I believe it looks all over the place because we don't have any idea of their profit margin. If we looked at GPUs by GMROI$ (gross margin return on investment dollars), we'd see a much smoother trend line. GMROI accounts for the total cost of goods sold and takes into account all operating costs (buildings, employees, research, development, marketing). That would provide a much clearer picture, and you can see hints of it in the graph, where year-one cost is fairly high (likely down from the start of the company).

Unfortunately, we'd need internal numbers to get this data: cost, revenue, and units sold, each by product. You might be able to pull most of the information from earnings reports.


Well, these two assessments depend on a false assumption: that cost and profit margin have anything at all to do with pricing.

The "cost plus" model is laughed at and derided in finance circles as something used only by beginners, mom-and-pop shops, and industries where billing by the hour is the norm.

The price of any product is set without any regard whatsoever to what it costs to make. It is set based on market demand and on what the company believes a sufficient number of people are willing to pay, in order to maximize revenue. If the profit margin from following this model is not high enough, then they just don't sell the product (or they continue to sell it at reduced profit or at a loss, in order to stay in the market and do better next gen).

Pricing is completely and totally divorced from the cost of manufacturing and development, and is determined 100% by predicted market conditions and consumers' willingness to pay.
 
Well, these two assessments depend on a false assumption: that cost and profit margin have anything at all to do with pricing.

The "cost plus" model is laughed at and derided in finance circles as something used only by beginners, mom-and-pop shops, and industries where billing by the hour is the norm.

The price of any product is set without any regard whatsoever to what it costs to make. It is set based on market demand and on what the company believes a sufficient number of people are willing to pay, in order to maximize revenue. If the profit margin from following this model is not high enough, then they just don't sell the product (or they continue to sell it at reduced profit or at a loss, in order to stay in the market and do better next gen).

Pricing is completely and totally divorced from the cost of manufacturing and development, and is determined 100% by predicted market conditions and consumers' willingness to pay.

I wouldn't say that. I would say that the price is adjusted based on the cost of getting the product to market.

Cost-plus is also highly utilized in scenarios where you have to work within MSRP and MAP policies.
 
You should always start the y-axis at zero; otherwise the relative sense of scale is wrong. I think this looks better:
 

[Attachment: nvidia-gpu-price.png (the chart re-plotted with the y-axis starting at zero)]
You should always start the y-axis at zero; otherwise the relative sense of scale is wrong. I think this looks better:


As an engineer by training, often working in borderline scientific research, I would agree with that assessment. I thought I had started the axis at 0. Don't tell me the damned thing moved around on me and I didn't catch it?

Edit:

Damn. It did. Oh well. I think it still tells a good story, though.

I have no idea when that happened. I hate Excel's auto axes, but I could have sworn I set them to static a long time ago when I first started doing this chart. Maybe they got reset somehow?
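
(For anyone rebuilding the chart outside Excel, pinning the baseline is a one-liner; here is a minimal matplotlib sketch, with placeholder years and inflation-adjusted prices standing in for the real data:)

```python
import matplotlib.pyplot as plt

# Placeholder data points -- swap in the actual inflation-adjusted launch prices.
years = [2000, 2004, 2007, 2013, 2017]
prices_2017_usd = [700, 560, 980, 720, 699]

fig, ax = plt.subplots()
ax.plot(years, prices_2017_usd, marker="o")
ax.set_ylim(bottom=0)  # force the y-axis to start at zero
ax.set_xlabel("Launch year")
ax.set_ylabel("Launch price (2017 USD)")
plt.show()
```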
 
I don't know why, but even if I account for inflation I can't get the same calibre of GPU now as I could in 2001.

In 2001 I could get a GeForce 2 GTS, which was about the GTX 1080 of its time, for 70K (GF2 Ultra = Titan XP, GF2 Ti = the 1080 Ti that just released). Inflation has been about 100% since then, so I should be able to get a 1080 for 140K now. But the cheapest 1080 right now is 200K.

So thank you very much for the comparison, but I'll continue to whine.

PS Prices in local currency.

You do get the same caliber of GPU as you did in 2000, a better caliber in fact.

It is totally incorrect to assume that because the GeForce 2 Ultra was the highest-end card then, it must be equal to a Titan today. It's not, and it doesn't add up either. It's comparable to an x80, and back then it was more like the predecessor to the Ti series.
The Ti series added another rung to the ladder, but it's still an x80 in ranking.

A GeForce 2 GTS was about 340 bucks back in the day and maps just about exactly to a GTX 970 today.

For example, we were running Quake 3 Arena at 800x600 or 1024x768, but our CRTs could do upwards of 1600x1200. You could run Quake 3 on a GeForce 2 Ultra at 1600x1200, but your framerate would be sub-60 fps unless perhaps you had something like a 1 GHz T-Bird and turned some settings down. To push much past 60 fps at 1600x1200 on a GeForce 2 Ultra in Quake 3 Arena in the year 2000, you would need two GeForce 2 Ultras, about $1,100 new with tax.

Compare that with a regular 1080 today at 4K and they are similar; we can call 1600x1200 the 4K of the year 2000. The 1080 Ti is a great offering as these things go and surpasses that by a fair bit in some games, and yes, it costs a fair bit more than a $550 Ultra did in 2000, but compared to a Titan it costs a fair bit less.

In fact, I leave the Titan out of the equation entirely: first because it's a dumb buy, and second because it basically never had a place in the mainstream. It's the "ultra, I don't care about money" segment, like the niche dual-GPU stacker-card segment, and it certainly can't be compared to anything that came out in the year 2000. It has basically lost its way, and the only reason they seemingly keep the Titan around now is that people will actually buy it.

But the fact is we have basically always been able to count on the main three tiers being in the same general price/performance range, no matter what they call them: low, mid, high. Today it's the x60, x70, and x80; whether they add a Ti on the end six months later doesn't really matter, it's a refresh, but it's still the same x60/x70/x80 ranking with a bump in price to go along with it.

I imagine people who are used to these product cycles and are not willing to waste money on them any more either just buy the 1070 or the 1060, or want the extra bump of the x80 but are smart enough to wait it out until the x80 Ti. The regular x80 is dangling in irrelevancy just like the Titan, in my personal opinion. But I'm sure they sold a shit-ton of 1080s and quite a few Titans anyway.
 
You do get the same caliber of GPU as you did in 2000, a better caliber in fact.

It is totally incorrect to assume that because the GeForce 2 Ultra was the highest-end card then, it must be equal to a Titan today. It's not, and it doesn't add up either. It's comparable to an x80, and back then it was more like the predecessor to the Ti series.
The Ti series added another rung to the ladder, but it's still an x80 in ranking.

A GeForce 2 GTS was about 340 bucks back in the day and maps just about exactly to a GTX 970 today.

For example, we were running Quake 3 Arena at 800x600 or 1024x768, but our CRTs could do upwards of 1600x1200. You could run Quake 3 on a GeForce 2 Ultra at 1600x1200, but your framerate would be sub-60 fps unless perhaps you had something like a 1 GHz T-Bird and turned some settings down. To push much past 60 fps at 1600x1200 on a GeForce 2 Ultra in Quake 3 Arena in the year 2000, you would need two GeForce 2 Ultras, about $1,100 new with tax.

Compare that with a regular 1080 today at 4K and they are similar; we can call 1600x1200 the 4K of the year 2000. The 1080 Ti is a great offering as these things go and surpasses that by a fair bit in some games, and yes, it costs a fair bit more than a $550 Ultra did in 2000, but compared to a Titan it costs a fair bit less.

In fact, I leave the Titan out of the equation entirely: first because it's a dumb buy, and second because it basically never had a place in the mainstream. It's the "ultra, I don't care about money" segment, like the niche dual-GPU stacker-card segment, and it certainly can't be compared to anything that came out in the year 2000. It has basically lost its way, and the only reason they seemingly keep the Titan around now is that people will actually buy it.

But the fact is we have basically always been able to count on the main three tiers being in the same general price/performance range, no matter what they call them: low, mid, high. Today it's the x60, x70, and x80; whether they add a Ti on the end six months later doesn't really matter, it's a refresh, but it's still the same x60/x70/x80 ranking with a bump in price to go along with it.

I imagine people who are used to these product cycles and are not willing to waste money on them any more either just buy the 1070 or the 1060, or want the extra bump of the x80 but are smart enough to wait it out until the x80 Ti. The regular x80 is dangling in irrelevancy just like the Titan, in my personal opinion. But I'm sure they sold a shit-ton of 1080s and quite a few Titans anyway.

Phew, that post is way too long for trying to prove a point through the back door.

I don't care about resolutions and frames per second. The GF2 GTS was the second-best consumer card in NV's lineup until the GF2 Ti came out. Now the 1080 is the second-best card in the lineup, made third by the 1080 Ti. That's all. So it should have the same market position and price.

You might as well have pulled those resolution figures from your ass. There is absolutely no reason to declare 1600x1200 the 2001 equivalent of 4K in 2017. It makes no sense, apart from conveniently fitting the narrative you're trying to invent. If the second-best card of 2001 could only handle 1024x768, and the second-best card of 2017 can handle 4K, then 1024x768 is the 4K equivalent of 2001; it's only fair that way.

And PS: you know damn well that most people didn't have CRTs that could handle 1600x1200 at a reasonable refresh rate. Even though CRTs had no native resolution the way LCDs do, your eyes would have burst out the back of your skull if you tried to play at low refresh rates on one.

For example, I had a pretty high-end CRT at the time, and even that could only handle 1152x864 at an acceptable refresh rate, even though it was theoretically capable of 2048x1536 as well.
 
More food for thought....

The first nVidia card I had was a Ti 4400 (AGP), bought on closeout. When that card failed under warranty, the manufacturer replaced it with a 5900 Ultra (AGP). When the motherboard failed and I built a new computer in 2007, I got what I think was an 8600 GTS 320MB, because the graphics interface had switched from AGP to PCI Express. I replaced that card with a 460 in 2010. Then, when I built my current system, I got a 980.

Now, if you plot the performance of those cards on a chart, you see significant improvements at each step. At the 980 level, though, I hit a performance wall. It was NOT because of the video card, but because of the monitor: it doesn't matter if your card can do 300 fps at 1920x1080 if your monitor can only display 60 fps. I held off on upgrading my monitor until 27" G-Sync IPS monitors had been available for a while, and I got one last December. Now I'm looking at replacing my 980 with a 1080 Ti which, again, will be a significant improvement, especially at the monitor's max resolution of 2560x1440. However, unlike the 980, which I got two months after release (because of stock shortages), I think I'll hold off until Christmas for a 1080 Ti or whatever surprise nVidia has in store to respond to AMD.
 
My first video card was on the ISA bus, at a time when 3D accelerators didn't exist. :D I think it had 512k of VRAM.
 
Phew, that post is way too long for trying to prove a point through the back door.

I don't care about resolutions and frames per second. The GF2 GTS was the second-best consumer card in NV's lineup until the GF2 Ti came out. Now the 1080 is the second-best card in the lineup, made third by the 1080 Ti. That's all. So it should have the same market position and price.

You might as well have pulled those resolution figures from your ass. There is absolutely no reason to declare 1600x1200 the 2001 equivalent of 4K in 2017. It makes no sense, apart from conveniently fitting the narrative you're trying to invent. If the second-best card of 2001 could only handle 1024x768, and the second-best card of 2017 can handle 4K, then 1024x768 is the 4K equivalent of 2001; it's only fair that way.

And PS: you know damn well that most people didn't have CRTs that could handle 1600x1200 at a reasonable refresh rate. Even though CRTs had no native resolution the way LCDs do, your eyes would have burst out the back of your skull if you tried to play at low refresh rates on one.

For example, I had a pretty high-end CRT at the time, and even that could only handle 1152x864 at an acceptable refresh rate, even though it was theoretically capable of 2048x1536 as well.


60hz was brutal on the desktop back in the CRT days.

I found it much less bothersome in game though.

In 2001 I bought a 22" Diamondtron based Iiyama Visionmaster Pro 510 which I ran at 1600x1200 at 100hz.

(The official specs say 85hz at that resolution, but I am certain I was at 100. Maybe I was overdriving it?)

It could handle up to 2048 x 1536 at 80 Hz but i think anything much over 1600x1200 looked like ass, and I didn't have enough GPU horsepower to push that resolution anyway.

I was definitely in an extreme minority though. People saw that screen in my dorm room and found it to be humorously large, and to be honest, for the time, it was. Damned thing was heavy enough to flex my desk.

Most people were on 16" monitors that topped out at 1024x768 or 1280x1024 and 75 or 80hz.
 
Phew, that post is way too long for trying to prove a point through the back door.

I don't care about resolutions and frames per second. The GF2 GTS was the second-best consumer card in NV's lineup until the GF2 Ti came out. Now the 1080 is the second-best card in the lineup, made third by the 1080 Ti. That's all. So it should have the same market position and price.

You might as well have pulled those resolution figures from your ass. There is absolutely no reason to declare 1600x1200 the 2001 equivalent of 4K in 2017. It makes no sense, apart from conveniently fitting the narrative you're trying to invent. If the second-best card of 2001 could only handle 1024x768, and the second-best card of 2017 can handle 4K, then 1024x768 is the 4K equivalent of 2001; it's only fair that way.

And PS: you know damn well that most people didn't have CRTs that could handle 1600x1200 at a reasonable refresh rate. Even though CRTs had no native resolution the way LCDs do, your eyes would have burst out the back of your skull if you tried to play at low refresh rates on one.

For example, I had a pretty high-end CRT at the time, and even that could only handle 1152x864 at an acceptable refresh rate, even though it was theoretically capable of 2048x1536 as well.

It totally makes sense, and I didn't pull anything out of my ass. The GeForce 2 Ultra was the precursor to the Ti, and the Titan is an outlier. It only came along recently, and it is basically, as has been shown, an early-adopter's fee for the Ti series, for people running higher-than-average resolutions.

And I had a monitor in 2000 that could do 1600x1200 at 60 Hz, the latter being a common and acceptable gaming refresh rate at the time... and ever since, for that matter. I'm not saying it's great; that's just what a lot of people used. I think I started using 75 Hz across the board soon thereafter with my GeForce 3.

But the resolution point is entirely valid. Higher resolution demands from consumers have driven the small-market "ultra high end" GPU performance tier and created the "Titan" segment where it didn't exist before. Well, it kind of did: it was the dual-GPU cards and the high-dollar AIB overclocking cards, which are, again, outliers.
 
60hz was brutal on the desktop back in the CRT days.

I found it much less bothersome in game though.

In 2001 I bought a 22" Diamondtron based Iiyama Visionmaster Pro 510 which I ran at 1600x1200 at 100hz.

(The official specs say 85hz at that resolution, but I am certain I was at 100. Maybe I was overdriving it?)

It could handle up to 2048 x 1536 at 80 Hz but i think anything much over 1600x1200 looked like ass, and I didn't have enough GPU horsepower to push that resolution anyway.

I was definitely in an extreme minority though. People saw that screen in my dorm room and found it to be humorously large, and to be honest, for the time, it was. Damned thing was heavy enough to flex my desk.

Most people were on 16" monitors that topped out at 1024x768 or 1280x1024 and 75 or 80hz.

I didn't see a CRT larger than 19" until I started working at a company. Those were horrendously expensive back in the day, at least by my country's standards. So yes, most people had 14" or 15" monitors, capable of maybe 800x600 with a decent refresh rate. The lucky ones like me had 17", and a few rich kids had 19"s. I don't think 16" CRTs even existed; they liked the odd numbers, except for 14". But 14" monitors also vanished after the early nineties. So in the early 2000s the choice was 15", 17", 19", or 21", and I gave up a shot at a driving license for a 17".
 
It totally makes sense, and I didn't pull anything out of my ass. The GeForce 2 Ultra was the precursor to the Ti, and the Titan is an outlier. It only came along recently, and it is basically, as has been shown, an early-adopter's fee for the Ti series, for people running higher-than-average resolutions.

And I had a monitor in 2000 that could do 1600x1200 at 60 Hz, the latter being a common and acceptable gaming refresh rate at the time... and ever since, for that matter. I'm not saying it's great; that's just what a lot of people used. I think I started using 75 Hz across the board soon thereafter with my GeForce 3.

But the resolution point is entirely valid. Higher resolution demands from consumers have driven the small-market "ultra high end" GPU performance tier and created the "Titan" segment where it didn't exist before. Well, it kind of did: it was the dual-GPU cards and the high-dollar AIB overclocking cards, which are, again, outliers.
I don't see the Titan as an outlier; it's Nvidia positioning themselves higher in the price range. It's the best they can do, just as the Ultra was the best they could do back then. If they still made a dual-GPU card I'd call that an outlier too, but not the Titan. It's just the new Ultra to me. If we take the Titan out of the lineup, then what's the equivalent of the Ultra series?

Maybe among casuals 60 Hz was acceptable, but that was the time of the first 3D glasses. I had them too, and they needed 120 Hz as a minimum. I think the worst refresh rate I ever played games at was 85 Hz. Going back even to the Voodoo 1, I was already pushing the refresh rate to the max. I don't think I've played games at 60 Hz since the 486 era.

I assumed everyone was in the same boat.
 
I personally don't have a problem with their hardware; my problem is their monopolistic ways and overall anti-consumer attitude.

Examples:

GameWorks' original release pretty much made games run like crap on anything not made by them.
Blocking consumers from using an AMD GPU as the primary card and an Nvidia card for PhysX, by disabling the card via the driver.
G-Sync.
Forcing the consumer to log in before allowing you to use GFE.
Installing "telemetry" (spyware, in my opinion) via GFE or the standalone drivers, without informing you and without an option to opt out unless it's manually uninstalled.

That last one really bothers me, since a driver has complete system access and gathers a lot of information that has no reason to be sent to them, and the worst part is that all these tech sites very conveniently never bother to mention it or make a fuss about it.

I do understand that neither AMD nor anyone else has provided any real competition to them; as a matter of fact, I have a 970 because of that. But I will jump to a Vega card if it is faster than what I currently have, just because of their crap.

Game devs CHOSE to use GameWorks. It made it easier for them to implement graphics features. Did it help Nvidia? So what you're saying is they should have spent their dollars to create tools that would help them AND their competitor? Does Apple put out tools that make it easier to use their features on Android?

Same with AMD and PhysX. Why should they make it EASIER to use their exclusive features with other graphics cards? Should Ford make cars designed to use Chevy parts?

G-Sync is an objectively BETTER technology than FreeSync. Using a hardware board in the monitor allows much better control. Now, I'd love to see a better industry standard that everyone would adopt, sure. But honestly, gaming monitors are a very small portion of the market, so I can see why it's taken so long.

GFE is kind of a "typical" marketing decision these days. I don't like it either, but Nvidia is FAR from the only company doing this crap. They still DO allow full driver installs and such with no login, though, so if you want the GFE extras, you've got to bite the bullet. Same with the telemetry in GFE.

My point is that Nvidia isn't overly greedy or abusive. They're just a typical corporation, doing the same things every other corporation does.
 