Here's Why Graphics Card Pricing Could Shoot Up In 2020

Oh, I agree it's a completely shitty tradition. It mostly comes from the fact that I grew up with very little money, so I had to make do with what I had and save up in the meantime. I went to college in 2002 with a Voodoo3 2000. It was only because I overclocked it to the brink of destruction that I was able to keep using it. If you were to touch it while under load, it would reward you with a mild burn.

Old habits are hard to break I guess.

They are. The real reason I say it sucks is that I think the best upgrade one could make in the past 3-5 years is one of these new 144hz 1440p monitors. They make just about any game more enjoyable.
 
/sigh
I need to replace my RX 580... but damned if I'm going to spend over $400 to do it :/
 
I remember buying the fastest GPU money could buy in 2001, the GeForce3 Ti 500...

...for $350.

Now granted, that's about $500 in today's money with inflation, but compare that to a 2080 Ti at $1,200, or even a Titan RTX at $2,500. This shit is out of hand.
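For the curious, that inflation claim roughly checks out. A quick sketch, using approximate US annual-average CPI-U figures that I'm assuming here (not numbers from the post):

```python
# CPI values are approximate annual averages, assumed for illustration.
CPI_2001 = 177.1
CPI_2019 = 255.7

def adjust_for_inflation(price, cpi_then, cpi_now):
    """Convert a nominal price into later-year dollars via the CPI ratio."""
    return price * cpi_now / cpi_then

# The GeForce3 Ti 500's $350 launch price in today's money:
print(f"${adjust_for_inflation(350, CPI_2001, CPI_2019):.0f}")  # roughly $505
```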

It is completely out of hand, to the point where I’m considering just reusing my 1080Ti in my build next year. I’d like to get a 3080 Ti or whatever the next high-end card is, but I won’t be surprised if it is $1400 and I don’t think the value is there at that price. I still maintain (and perhaps I’m wrong) that part of the reason the RTX cards were priced so high was to make up for the collapse of the mining sales.
 
Even if I could easily afford it, I won't give in to these price-gouging businesses, so I'd rather sit it out. I'm currently on a 1070 Ti I got a good deal on last Black Friday, and I'm on a 1080p 144Hz monitor, so I'll be fine for a while. If the high GPU pricing keeps up, I'm sure the savior will be the pricing of the next-gen consoles: equally good hardware for much, much less.
 
The more expensive cards become, the fewer PC gamers there will be in the future, so even fewer will buy the next gen, and fewer still will fork over for next-gen cards.
 
They are. The real reason I say it sucks is that I think the best upgrade one could make in the past 3-5 years is one of these new 144hz 1440p monitors. They make just about any game more enjoyable.


I just recently went to 120hz after being at 60hz since the CRT era, when I ran at 90-100hz

Honestly? I can barely tell the difference.

I suspect a lot of the high refresh demands are driven entirely by placebo effect.
 
I just recently went to 120hz after being at 60hz since the CRT era, when I ran at 90-100hz

Honestly? I can barely tell the difference.

I suspect a lot of the high refresh demands are driven entirely by placebo effect.

I went to 120hz a couple months ago by changing my monitor input and settings. It helped in FPS games, but it's not something I couldn't live without. Cost to benefit, you really need to weigh these things out nowadays. Until a game I actually like drags my rig into the gutter, I'm holding.
 
I just recently went to 120hz after being at 60hz since the CRT era, when I ran at 90-100hz

Honestly? I can barely tell the difference.

I suspect a lot of the high refresh demands are driven entirely by placebo effect.

I'm willing to accept that it is also highly individual. Very little point in arguing over that.

Like you tho, the diff is not dramatic for me, personally.
 
I have to admit, after 60fps being the halo target framerate for like 15-20 years since the dawn of the FPS era, I was skeptical that added framerate would really be beneficial. I played with it a little in college, because Counter-Strike was so light on my GeForce3 Ti 500 and my Iiyama VisionMaster Pro 510 allowed me to hit 100hz at 1600x1200, but it was mostly an "I'm doing this for shits and giggles because I can" kind of thing. I never saw any real benefit from it.

After all the evangelizing on here and other forums for high refresh, I decided, why not, I'll give it a shot, and picked up the Asus XG438Q. What I really wanted was FreeSync/G-Sync to avoid tearing, but I got a 120hz-capable screen along for the ride, so why not give it a try?

I don't have many titles I can render at 4K 120hz, but CSGO was one of them. I tried it both at 60 and at 120, and the 120 may feel ever so slightly more fluid, but man, it is so subtle that I probably couldn't reliably pick the right one in a blinded test.

The real benefit of this screen is so that when I occasionally dip under 60fps in a more graphically demanding title, it doesn't result in tearing. Being able to survive the occasional 58fps without tearing is a godsend and opens some flexibility doors for graphical settings. :p

I just thought it would be more noticeable after doubling the framerate on my monitor; maybe I am just someone who doesn't see tearing that much. I got way more out of tuning for the game engine and assigning Steam to the even cores, etc. The lowest FPS I see is typically 90, unless there is a server or connection issue. The Source engine is simple and mostly CPU-bound, taking only about a gig of VRAM. An overclocked R9 290 @ 1920x1080 does just fine. Doom (2016) can be rendered at 60fps for the most part as well; Vulkan is great.
 
I just thought it would be more noticeable after doubling the framerate on my monitor; maybe I am just someone who doesn't see tearing that much. I got way more out of tuning for the game engine and assigning Steam to the even cores, etc. The lowest FPS I see is typically 90, unless there is a server or connection issue. The Source engine is simple and mostly CPU-bound, taking only about a gig of VRAM. An overclocked R9 290 @ 1920x1080 does just fine. Doom (2016) can be rendered at 60fps for the most part as well; Vulkan is great.

I find that the id Tech engine titles (Doom, Wolfenstein, etc.) do run very well even at Ultra settings on surprisingly modest hardware, but they also don't look as good as other titles that are harder on my hardware. I suspect an id Tech title's "Ultra" setting is more equivalent to a "Medium" setting in many other games. Their models seem to have lower polygon counts, their shading seems kind of disappointing, etc.

I just feel like they get credit for a great engine because things run really smoothly, when in reality they are just shipping a product with lower-quality graphics.
 
It is completely out of hand, to the point where I’m considering just reusing my 1080Ti in my build next year. I’d like to get a 3080 Ti or whatever the next high-end card is, but I won’t be surprised if it is $1400 and I don’t think the value is there at that price. I still maintain (and perhaps I’m wrong) that part of the reason the RTX cards were priced so high was to make up for the collapse of the mining sales.

There's really no question that was a big part of it, imo. Nvidia missed out on the bubble pricing the first go-round.
I have to admit, after 60fps being the halo target framerate for like 15-20 years since the dawn of the FPS era, I was skeptical that added framerate would really be beneficial. I played with it a little in college, because Counter-Strike was so light on my GeForce3 Ti 500 and my Iiyama VisionMaster Pro 510 allowed me to hit 100hz at 1600x1200, but it was mostly an "I'm doing this for shits and giggles because I can" kind of thing. I never saw any real benefit from it.

After all the evangelizing on here and other forums for high refresh, I decided, why not, I'll give it a shot, and picked up the Asus XG438Q. What I really wanted was FreeSync/G-Sync to avoid tearing, but I got a 120hz-capable screen along for the ride, so why not give it a try?

I don't have many titles I can render at 4K 120hz, but CSGO was one of them. I tried it both at 60 and at 120, and the 120 may feel ever so slightly more fluid, but man, it is so subtle that I probably couldn't reliably pick the right one in a blinded test.

The real benefit of this screen is so that when I occasionally dip under 60fps in a more graphically demanding title, it doesn't result in tearing. Being able to survive the occasional 58fps without tearing is a godsend and opens some flexibility doors for graphical settings. :p

That's really what I'm talking about: it's just better in every way. Higher resolution, higher refresh rate, lower response times, G-Sync so no tearing. It has just been great, honestly. It's quite an upgrade over the 120hz TN panel I had and a really nice HP 1200p 60hz monitor.
 
Oh, and as a reminder from that story I wrote for the [H] last year...

Numbers are a little outdated, as there has been continued inflation since then, but it is still pretty close.
I wouldn't use inflation, as some markets are immune to it. Also, if we're going into a depression in 2020, then higher-priced GPUs aren't going to sell either way. And yes, I said depression, not recession. Many people don't know what's going on in the economy, especially in the repo market.
 
I find that the ID Tech Engine titles (Doom, Wolfenstein etc.) do run very well even at Ultra settings on surprisingly modest hardware, but they also don't look as good as other titles that are harder on my hardware. I suspect a ID Tech Engine title "Ultra" setting is more equivalent to a "Medium" setting in many other games. Their models seem to have lower polygon counts, their shading seems kind of disappointing, etc. etc.

I just feel like they are getting a great engine because things run really smoothly, when in reality they are just shipping a product with lower quality graphics.
Looks like id is upping the detail for id Tech 7, so Doom Eternal should be better in that regard this time around. And thank god they are doing multiplayer themselves instead of outsourcing it.

From the id Tech 7 article:

"At QuakeCon 2018 id Software announced a new game in the Doom franchise called Doom Eternal which is powered by the id Tech 7 engine.[6] The company promises that the new engine is capable of delivering a 10-times increase in geometric detail vs. id Tech 6.[7] On PC, id Tech 7 will support Vulkan rendering only.[8]"
 
I think graphics cards will get much cheaper. With the next consoles coming out, Nvidia has more reason to drop prices than anyone else, which would force AMD to drop prices as well. Both AMD and Nvidia have poor sales as is, so high prices would just hurt sales even further. Then you have Intel, who plans to release a GPU next year as well, which would also lower prices. I believe 2020 will be a great year to buy a graphics card.

Didn't NV already start the "our cards are 100,000x better than any console" PR early?

I expect that when the next-gen consoles launch, GPU sales will slow for a while and pricing will drop. As long as people are willing to pay 2012 high-end prices for mid-range GPUs in 2020, they will continue to price them right there.
 
I think graphics cards will get much cheaper. With the next consoles coming out, Nvidia has more reason to drop prices than anyone else, which would force AMD to drop prices as well. Both AMD and Nvidia have poor sales as is, so high prices would just hurt sales even further. Then you have Intel, who plans to release a GPU next year as well, which would also lower prices. I believe 2020 will be a great year to buy a graphics card.

AMD would not be forced to lower their prices, since they are already priced competitively as it is. As for NV lowering the prices, good luck with that.
 
The more expensive cards become, the fewer PC gamers there will be in the future, so even fewer will buy the next gen, and fewer still will fork over for next-gen cards.

Nah - just built my brother-in-law a "new" PC for Christmas with a bunch of used parts (leftover HD, case, PSU, 1700X CPU) and a new mobo/RAM/card: Aorus M B450, cheap DDR4-3000, RX 590. It's plenty to game at 1080p and gets him in the door where he'll eventually upgrade.

When most newbs game at 1080p on 24" Dell monitors, it's fairly cheap to get "in" and out of being a console peasant. Starts 'em on the upgrade path.

It's only the grognards who consider themselves "competitive" who look for the superfast cards/monitors/4K gaming. IMO - speaking as one of the non-grognards.
 
Maybe, but maybe not with consoles.
HBM won't be such a stupid idea if VRAM is the issue.
 
Radeon 7500: $160 (2001) - https://www.extremetech.com/computing/58484-radeon-85007500-review
Radeon 8500: $225 (2001) - https://www.extremetech.com/computing/58484-radeon-85007500-review
Radeon 9800 XT: $499 (2003) - https://www.guru3d.com/articles-pages/ati-radeon-9800-xt-review,1.html
Radeon X850 XT PE: $549 (2004) - https://www.anandtech.com/show/1558/2
Radeon X1950 Pro: $199 (2006) - https://techreport.com/review/11032/atis-radeon-x1950-pro-graphics-card/
Radeon HD 2900 XT: $399 (2007) - https://www.legitreviews.com/ati-radeon-hd-2900-xt-video-card-review_503
Radeon HD 3870: $219 (2007) - https://www.anandtech.com/show/2376/5
Radeon HD 4890: $250 (2009) - https://www.anandtech.com/show/2745
Radeon HD 5870: $380 (2009) - https://www.anandtech.com/show/2841
Radeon HD 6970: $369 (2010) - https://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950
Radeon HD 7970: $549 (2011) - https://www.anandtech.com/show/6093...price-cuts-inbound-7970ge-available-next-week
R9 290X: $550 (2014) - https://www.fudzilla.com/home/item/35946-amd-drops-290-290x-hawaii-prices
Fury X: $650 (2015) - https://wccftech.com/amd-officially-launches-radeon-fury-x-nano-650/
RX 480: $229 (2016) - https://www.kitguru.net/components/...on/amd-radeon-rx-480-8gb-version-to-cost-229/
Radeon VII: $699 (2019) - https://www.rockpapershotgun.com/2019/01/09/ces-2019-amd-radeon-vii-price-specs-release-date/
RX 5700 XT: $449 (2019) - https://www.polygon.com/e3/2019/6/10/18660471/amd-rx-5700-xt-release-date-price-specs-e3

There are a shitload of red herrings in there, though: the 5870 sold over MSRP because of mining, same as the 7000 series, the 290X series, and the RX 480 cards.
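To compare those launch prices on equal footing, here's a rough sketch that normalizes a few of them to 2019 dollars. The CPI values are approximate US annual-average CPI-U figures I've assumed for illustration, not data from the thread:

```python
# Approximate US annual-average CPI-U values (assumed, for illustration only).
CPI = {2001: 177.1, 2003: 184.0, 2009: 214.5, 2014: 236.7, 2019: 255.7}

def to_2019_dollars(price, year):
    """Scale a nominal launch price to 2019 dollars via the CPI ratio."""
    return price * CPI[2019] / CPI[year]

for name, price, year in [("Radeon 8500", 225, 2001),
                          ("Radeon 9800 XT", 499, 2003),
                          ("Radeon HD 5870", 380, 2009),
                          ("R9 290X", 550, 2014)]:
    print(f"{name}: ${price} ({year}) is about "
          f"${to_2019_dollars(price, year):.0f} in 2019 dollars")
```

Even adjusted this way, the historical flagships land well under the 2080 Ti's $1,200.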
Good job, but the 9700 was the top card and a huge market leader at the time. Where is that? Also, what do you mean the 5870 was inflated due to mining? What the hell mining are we talking about? There was no inflated GPU pricing in that era. Unless I am not understanding your methodology.
 
Good job, but the 9700 was the top card and a huge market leader at the time. Where is that? Also, what do you mean the 5870 was inflated due to mining? What the hell mining are we talking about? There was no inflated GPU pricing in that era. Unless I am not understanding your methodology.
There was a shortage of 5000 series cards.

Bitcoin mining had just become mainstream.

Well, not mainstream, but widespread.
 
The more expensive cards become, the fewer PC gamers there will be in the future, so even fewer will buy the next gen, and fewer still will fork over for next-gen cards.

Yes and no. Honestly, the total cost of entry really hasn't gone up all that much overall, even with the increases in GPU prices. Newer stuff runs faster, which means 1080p/60 is viable on lower-end hardware, and that lower-end hardware still ends up either the same price as the previous gen or a little cheaper. Then, of course, with the price drops on memory and SSDs, those are cheaper than ever. Combine that with the incredible value you get out of Zen 2 parts and B450 boards and, if anything, you are getting better bang for your buck overall now than before. Buying used and/or open box improves the value even more. There is definitely a tipping point where increased GPU cost is going to override the price reductions in other areas, but it might take a while to get there.
 
They are. The real reason I say it sucks is that I think the best upgrade one could make in the past 3-5 years is one of these new 144hz 1440p monitors. They make just about any game more enjoyable.

I'd imagine you're completely correct. Unfortunately, the old monitor has become a matter of pride for me. It's a Samsung 245BW that just won't die. I think it must be over 12 years old now.

It did die once truth be told. The power supply has 3 large resistors in it that are a known weak point. Got some new ones off Amazon for less than $5 and soldered them in one evening. That was a couple years ago. Still running. One day I'll burn it out!
 
I almost feel like just playing on console going forward.
The new-gen Microsoft console is promoting 12 TF of GPU power (1080 Ti level) and an 8-core/16-thread Ryzen CPU. Since they are supposedly going to allow mouse and keyboard next gen, the lines are really getting blurred!
 
As long as people keep buying them at $1000+, they will keep driving prices up. It's as simple as that. People need to learn to vote with their wallet at some point, but unfortunately most enthusiasts don't care as long as they can still afford it somehow with no thought given to the next generation coming in at an even higher price and so on until they can't afford it and it finally affects them.
 
As long as people keep buying them at $1000+, they will keep driving prices up. It's as simple as that. People need to learn to vote with their wallet at some point, but unfortunately most enthusiasts don't care as long as they can still afford it somehow with no thought given to the next generation coming in at an even higher price and so on until they can't afford it and it finally affects them.

I think they have tbh. Sales have been sluggish iirc. It's kind of evident ho
I'd imagine you're completely correct. Unfortunately, the old monitor has become a matter of pride for me. It's a Samsung 245BW that just won't die. I think it must be over 12 years old now.

It did die once truth be told. The power supply has 3 large resistors in it that are a known weak point. Got some new ones off Amazon for less than $5 and soldered them in one evening. That was a couple years ago. Still running. One day I'll burn it out!

Ah, you repaired it, so you cheated! :p You can always keep it as a second monitor!
 
I think they have tbh. Sales have been sluggish iirc. It's kind of evident ho

Googling it, I don't see any hard sales numbers that show how well it's doing compared to the previous-gen cards in the same period after launch. All I see is mixed results showing both good and bad sales reported by retailers and Nvidia themselves, but still no hard numbers published. Just anecdotally going off the number of 2080 Ti owners on this forum, though, I don't see any fewer of them than I have for previous-gen flagship cards.
 
Googling it, I don't see any hard sales numbers that show how well it's doing compared to the previous-gen cards in the same period after launch. All I see is mixed results showing both good and bad sales reported by retailers and Nvidia themselves, but still no hard numbers published. Just anecdotally going off the number of 2080 Ti owners on this forum, though, I don't see any fewer of them than I have for previous-gen flagship cards.

I was going by Nvidia's reporting on sales, which can be misleading due to the general nature of those numbers. My anecdotal opinion is there are far fewer owners of top-end cards on this forum than there were for Pascal. I'm on the FS forum a decent bit and there are not many 2080 Tis for sale. If we did a poll thread, I do wonder what the results would be.
 
It's not odd at all. The 570 and 580 are better value at the moment. Not better performance, but better fps/$ at a lower performance level. This matters to most people (not necessarily here at [H], though). Steve at HU has shown this in the frames-per-dollar graphs at the end of some recent GPU videos. The price rises substantially to get more performance, making it less attractive for cost-sensitive consumers.
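For anyone wanting to run that comparison themselves, the frames-per-dollar math is trivial to sketch. The prices and average FPS figures below are placeholder examples I made up, not HU's actual benchmark results:

```python
# fps/$ comparison: higher is better value.
# Illustrative numbers only -- substitute real benchmark averages and
# real street prices to reproduce the idea.
cards = {
    "RX 570":   {"price": 130, "avg_fps": 60},
    "RX 580":   {"price": 180, "avg_fps": 70},
    "RTX 2060": {"price": 350, "avg_fps": 95},
}

# Sort by value, best first.
for name, c in sorted(cards.items(),
                      key=lambda kv: kv[1]["avg_fps"] / kv[1]["price"],
                      reverse=True):
    per_100 = c["avg_fps"] / c["price"] * 100  # frames per $100 spent
    print(f"{name}: {per_100:.1f} fps per $100")
```

With numbers shaped like these, the cheaper cards win on fps/$ even though the faster card wins on raw performance, which is exactly the pattern being described.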
 
I was going by nvidia's reporting on sales which can be misleading due to the general nature of those numbers. My anecdotal opinion is there are far fewer owners of top-end cards on this forum than there were for pascal. I'm on the FS forum a decent bit and there are not many 2080 ti's for sale. If we did a poll thread I do wonder what the results would be.

Why would FS/FT be a good indicator of 2080ti owners? It's the flagship card right now, so most people would be selling their previous gen 1080/ti cards in order to upgrade to the 2080ti. The only 2080ti sales would be from people who probably couldn't really afford it in the first place so they're unloading it, or they found another flavor 2080ti or Titan RTX card to replace their existing card(s). I'm going off of how many people I see with 2080ti's in their signature blocks and threads in the video card forum that I frequent as well, which is still pretty anecdotal compared to overall market share considering this is an enthusiast site, but at least in our realm the $1000+ cards still seem to be selling as well as the $750 cards of last generation from what I've observed.


I wouldn't expect the flagship cards, which represent the worst value on the market in performance per dollar, to be anywhere near the top 5 selling cards on Amazon. I don't find it odd at all that the RX 580/570 cards are leading in sales either, since they still offer the best value over nearly every other card on the market. My son has an 8GB RX 570 as well, and it's been a great card that can max out pretty much everything he plays at 1080p ultrawide res.

That the 2080S and Ti are at 10th and 12th actually tells me that they're still selling really well considering how bad of a value they are at their price point, and that's coming from a 2080 owner as well.
 
Why would FS/FT be a good indicator of 2080ti owners? It's the flagship card right now, so most people would be selling their previous gen 1080/ti cards in order to upgrade to the 2080ti. The only 2080ti sales would be from people who probably couldn't really afford it in the first place so they're unloading it, or they found another flavor 2080ti or Titan RTX card to replace their existing card(s). I'm going off of how many people I see with 2080ti's in their signature blocks and threads in the video card forum that I frequent as well, which is still pretty anecdotal compared to overall market share considering this is an enthusiast site, but at least in our realm the $1000+ cards still seem to be selling as well as the $750 cards of last generation from what I've observed.



I wouldn't expect the flagship cards, which represent the worst value on the market in performance per dollar, to be anywhere near the top 5 selling cards on Amazon. I don't find it odd at all that the RX 580/570 cards are leading in sales either, since they still offer the best value over nearly every other card on the market. My son has an 8GB RX 570 as well, and it's been a great card that can max out pretty much everything he plays at 1080p ultrawide res.

That the 2080S and Ti are at 10th and 12th actually tells me that they're still selling really well considering how bad of a value they are at their price point, and that's coming from a 2080 owner as well.

I'm just countering your anecdotal evidence with my own, but I will not attempt to act like it's any better than yours.
 
That the 2080S and Ti are at 10th and 12th actually tells me that they're still selling really well considering how bad of a value they are at their price point, and that's coming from a 2080 owner as well.

That's exactly it: 2080s are about the most horrible value in modern times, even for a top-of-the-line card. Hence, it's been reduced to almost out of the top 10 in Amazon sales. That is not a good showing for where it should be. Nvidia is pricing themselves out of the market.
 
Part of me feels that the majority of high-end graphics card owners are probably YouTubers who do new builds every other week (Linus, JayzTwoCents, etc.) and the poor saps who watch those videos AND get inspired to do the same.
 
I wouldn't use inflation as some markets are immune to it. Also if we're going into a depression in 2020 then higher priced GPU's aren't going to sell either way. And yes, I said depression not recession. Many people don't know what's going on in the economy, especially in the repo market.

A depression isn't a defined term in the field of economics. It's just something news organizations use as clickbait.

That said, a market downturn does seem likely, which sucks, because after the deficit-increasing 2017 tax cuts we don't have the means to deficit-spend our way out of a recession this time around, so it might get pretty bad.

Time will tell.
 
3080 Ti, $1,599.99: the more you buy, the more you save. Now with 2x the electrolytes... I mean, RT.
 
That's exactly it: 2080s are about the most horrible value in modern times, even for a top-of-the-line card. Hence, it's been reduced to almost out of the top 10 in Amazon sales. That is not a good showing for where it should be. Nvidia is pricing themselves out of the market.

I guess you didn't understand my post then; there are countless cards on the market at lower prices and better value than the 2080, so it showing up anywhere near the top 10 selling cards is a good showing for Nvidia.
 
I guess you didn't understand my post then; there are countless cards on the market at lower prices and better value than the 2080, so it showing up anywhere near the top 10 selling cards is a good showing for Nvidia.
I understand perfectly. No, it's not good when your flagship cards are at 10 and 12. Maybe you need to say that out loud to yourself a few times and think about it more.
 