Desktop GPU Sales Hit 20-Year Low

I believe AMD and Nvidia have stopped looking at market segments like "this is the 60 class and this is the 80 class" and have instead shifted to resolutions. This is your 1080p, your 1440p, your 2K, your 4K, ...
They are trying to argue that this card series covers everything you need to play at 1080p, this one covers 1440p, and over here is your 4K option. They are sort of right to do it that way, even if it feels weird. As a consumer and enthusiast I sort of hate it. The prices on the new stuff are stupid, and while the prices on the old stuff are getting better, they aren't there yet. I get that retailers are still marking things up because (1) the cards are still moving and (2) they are using that markup to cover the losses on unsold stock they purchased at the height of GPU prices and won't take a bath on. It's all just a clusterfuck and it makes me angry, though it is letting me go back and appreciate some of my much older stuff and realize it's still nice. I busted out the Wii U for the first time in like 8 years and remembered that stuff was actually fun.
 
I get everything you're saying and it's correct from a gamer's perspective.

But the traditional GPU review doesn't make sense anymore. These cards do wayyyyy more than just play 3D games. People are being sold on gaming performance when that's probably 33% of what the card can do. Imagine only using 33% of what a card can do after paying $1,600 for a 4090. GPU reviews should be three parts: gaming, productivity, and research. I know Linus can do it, but people generally want to see FPS bars, not render times and folding. I think it's important for people to look beyond gaming to find value in the current GPU market. The 4000 series dominates the 3000 series in productivity, but we will recommend a 3000 series card because of price without considering that it will also be used for work.

Uhh, LTT does go over productivity benchmarks in his 4070ti review.
 
You know the GPU market is in a rough state when Linus from LTT says that consoles like the PS5 and Xbox Series X may represent a better investment given soaring GPU prices. (It's towards the end of the clip.)


He's not wrong. I know some people who still assume GPU prices are screwed, and they have turned to consoles. Good chance that's how most non-tech-informed people are thinking.
And he has a good argument. We're now at the point where an NVIDIA xx70 series card is selling for at least $799. You're not even at the flagship level and you're already spending enough that a PS5 or XSX looks like a bargain.
PS5 and XSX are cheaper and you'll have enough money to buy a few games at least.
However large or small NVIDIA's profit margins are, that's not helping the PC gaming market one bit — it's walling off the high-end experience. And even a 3070 might be difficult to justify for someone who just wants the latest CoD game to run decently. And it'll definitely run decently on a cheaper and more accessible console.
This is what happened during the Xbox 360 era: ATI and Nvidia had also raised prices to insane levels, and gamers bought consoles instead of PCs, specifically GPUs. That's why the "But can it run Crysis?" meme got started: graphics cards cost too much.
 
I get everything you're saying and it's correct from a gamer's perspective.

But the traditional GPU review doesn't make sense anymore. These cards do wayyyyy more than just play 3D games. People are being sold on gaming performance when that's probably 33% of what the card can do. Imagine only using 33% of what a card can do after paying $1,600 for a 4090. GPU reviews should be three parts: gaming, productivity, and research. I know Linus can do it, but people generally want to see FPS bars, not render times and folding. I think it's important for people to look beyond gaming to find value in the current GPU market. The 4000 series dominates the 3000 series in productivity, but we will recommend a 3000 series card because of price without considering that it will also be used for work.

I would contend, by and large, that productivity people aren't buying a 4070 Ti, 4050, or 4060. Those cards can do the work, but that's not who is buying them and it's not their primary purpose. Productivity people are buying either a Quadro or the 4090, because the 4090 is powerful enough to be kind of like a discount Quadro.

I can buy a Ferrari to take my kids to school. You know, it’s a car, it can drive from point A to point B, but no one looking at a Ferrari is thinking about that as a primary use case, and anyone reviewing it (Top Gear or whatever) is not going to waste time discussing how good it is as a daily driver.
 
Again, Nvidia isn't pricing these products as gaming cards. When an x80 card was $500, it was a gaming card. Now it's a high-end professional card without the workstation price.
Gamers are saying the current prices are too much and I agree. So where's the value for a 4000 series card? It's definitely not in gaming.

Gamer-focused cards need to exist if possible. The 5800X3D is a good example of a stripped-down product focused on gaming that isn't as good in productivity. I would like to see more affordable GPUs dedicated to gaming.

AMD marketed the 5800X3D to gamers specifically. Nvidia is marketing these GPUs to gamers specifically. They market the Quadro to professionals. That's all you need to know about how these products should be priced and reviewed.
 
Maybe Nvidia and AMD should rush to come out with their RTX 4050 Ti and RX 7500 XT before cryptos climb the ladder again. :D
 
Quick FYI: the Quadro lineup is dead, they killed it after Turing.

It’s been replaced with the RTX A and now the RTX Ada series.

That said, I have a bunch of RTX A2000 6GB models; I put them in anything that needs 4 or more displays but doesn't need much power behind them. I am sad Nvidia never made a consumer variant of the card.
Now I would never game on a work computer… but I would certainly test the capabilities of a workstation before deployment to ensure it works as expected. And the A2000 6GB was a very capable card for 1080p.
 
I believe AMD and Nvidia have stopped looking at market segments like "this is the 60 class and this is the 80 class" and have instead shifted to resolutions. This is your 1080p, your 1440p, your 2K, your 4K, ...

That is kind of how it has been for a while.

xx30/xx40 - Generally going to need to turn down settings or play games that are a few years older at 1920x1080
xx50 - Generally 60 frame rates but may have to turn down some settings at 1920x1080
xx60 - Generally 60 frame rates with max settings at 1920x1080
xx70 - Generally 60 frame rates with max settings at 2560x1440
xx80 - Generally 60 frame rates with max settings at 3440x1440
xx80 Ti / xx90 / Titan - Generally 60 frame rates with max settings at 3840x2160

Of course not every game runs the same, but generally that seems to have held true.
 
Quick FYI: the Quadro lineup is dead, they killed it after Turing.

It’s been replaced with the RTX A and now the RTX Ada series.

That said, I have a bunch of RTX A2000 6GB models; I put them in anything that needs 4 or more displays but doesn't need much power behind them. I am sad Nvidia never made a consumer variant of the card.
Now I would never game on a work computer… but I would certainly test the capabilities of a workstation before deployment to ensure it works as expected. And the A2000 6GB was a very capable card for 1080p.

Whatever they call it now, they still make a professional card outside of their "gaming" cards. And they know they are different because they wall off the drivers between the two and artificially limit features in gaming cards.
 
He's not wrong. I know some people who still assume GPU prices are screwed, and they have turned to consoles. Good chance that's how most non-tech-informed people are thinking.
That's because GPU prices are screwed. They never unscrewed them after the crypto craze, blaming the chip shortage instead, but it won't last. They are already sitting on a ton of unmovable 4080 stock, and the 4070 Ti will be even harder to sell.
 
He's not wrong. I know some people who still assume GPU prices are screwed, and they have turned to consoles. Good chance that's how most non-tech-informed people are thinking.

PS5 and XSX are cheaper and you'll have enough money to buy a few games at least.

This is what happened during the Xbox 360 era: ATI and Nvidia had also raised prices to insane levels, and gamers bought consoles instead of PCs, specifically GPUs. That's why the "But can it run Crysis?" meme got started: graphics cards cost too much.
Speaking of the "Can it run Crysis?" era, my first custom build on my own dime was in the wake of its release - Q6600, 2 GB of DDR2, and a few extra months to wait for a GeForce 8800 GT because G92 seriously undercut NVIDIA's own G80 (8800 GTS 320/640 MB, GTX) lineup while keeping similar performance, now only $200-250.

That 8800 GT curbstomped the X360 and PS3 just a few short years into their lifespan, and the gulf only widened as GPUs improved.

Modern day NVIDIA will never release an amazing price-to-performance card like that again, I hate to say. The closest thing since was probably the GTX 970, and even that had the infamous 3.5 GB VRAM issue (which I sidestepped because I got a GTX 980 in late 2015 for effectively 970 price, $315 shipped).

Can we even name a card today that offers the sort of price to performance that the 8800 GT or GTX 970 did for under $350, or even under $300? There probably wouldn't be record lows for GPU sales if there were!
 
Given what AMD's mobile CPU lineup looks like (a mix and match of undecipherable and confusing names and architectures), I fully expect the rest of the Radeon RX 7000 series to be rebrands.
Nvidia said the rest of their product stack is the RTX 3000 series. This may be the new norm.

and it's depressing.
 
I fully expect the rest of the Radeon RX 7000 series to be rebrands.

Going by the estimated chip sizes:

Navi 21 (68xx/69xx) — replaced by Navi 31 (released at the end of 2022)

Navi 22 (67xx) — replaced by Navi 32 (expected by June 2023)

Navi 23 (66xx) — replaced by Navi 33 (laptop version announced January 2023, desktop version expected after Navi 32?)

Navi 24 (65xx/64xx) — no equivalent in RDNA 3. Expect a rebadge as 73xx, but who will buy a 4GB card in 2023?
 
 
Just checked Costco, Best Buy, and Amazon in my market; outside the expensive special editions (the multi-controller type), they still seem impossible to find.
 
Just checked Costco, Best Buy, and Amazon in my market; outside the expensive special editions (the multi-controller type), they still seem impossible to find.
I'm in BF-nowhere Canada and the local The Source has 2, one each with and without a disc drive.
 
You know the GPU market is in a rough state when Linus from LTT says that consoles like the PS5 and Xbox Series X may represent a better investment given soaring GPU prices. (It's towards the end of the clip.)

You're not even at the flagship level and you're already spending enough that a PS5 or XSX looks like a bargain. However large or small NVIDIA's profit margins are, that's not helping the PC gaming market one bit — it's walling off the high-end experience. And even a 3070 might be difficult to justify for someone who just wants the latest CoD game to run decently. And it'll definitely run decently on a cheaper and more accessible console.
Here's the thing: it has never been a good investment in an apples-to-apples comparison. Even mid-range cards, once you factor in the cost of the rest of the computer, have always tended to cost more than the current console. But that's the point: even if you see the PC as only a gaming machine (which it's not, but whatever), it's all about the choices you have. You have the choice not to go with a "mid-range" card if you don't want to. What's wrong with the $500 mid-range card from last generation, or the year before that, that it's so horrible you NEED to upgrade? You have a huge choice in whether or not you feel the need to have the latest-gen hardware at any level. Do you need 4K, ultra HDR, 1024x AA and all that stuff? Maybe you are saying yes, in which case fine, make whatever "investment" you feel is better. Meanwhile, I can still play Cyberpunk on my old GTX 970; sure, not at 4K, but if I decide 1080p low settings is fine, is there really anything wrong with that? (I played Far Cry 6 at 1440p on my GTX 970 and it was a largely acceptable playing experience.) Can you say the same thing with a PS3? No, you can't, because your PS3 can't play it. I also want to play FF8 on a PS5; can I? Not unless they made a special edition remake. But if I want to play Ultima I on my PC, I can. I also have a much larger range of games available to me on the PC, not just triple-A titles that each cost $60 (if you're lucky).
Ultimately a PC is more of an investment than a console, because its end of life is much further out. You're probably not playing Cyberpunk if you have a Viper V550 video card, but if you have an 8800 GTX you may be able to squeak out something (maybe; I'm not saying you can). And sure, $800 for a "mid-range" video card is crazy insane, but how many years will that card keep working for new games? How much longer is the PS5 going to be relevant? Another 2 years, maybe? And sure, if you're happy with the games you have, then absolutely squeeze more life out of it; meanwhile, the guy who paid too much for a video card (not me) is going to end up playing the latest and greatest games as they come out.
 
Evolution of GPU decision making:

  • Groceries, or new GPU
  • Car, or new GPU
  • Rent, or new GPU
  • 4 year College, or new GPU
  • Purchase pro NFL team, or new GPU
I like how you put rent as #3 instead of #2, simultaneously showing how ridiculous rent has become too. Of course, several decades ago the full cost of a four-year degree might have been #3 and a brand-new car just behind at #4...
 
I like how you put rent as #3 instead of #2, simultaneously showing how ridiculous rent has become too. Of course, several decades ago the full cost of a four-year degree might have been #3 and a brand-new car just behind at #4...
I'm throwing down the gauntlet to challenge Nvidia to produce the next consumer $10K GPU!!
 
I'm in BF-nowhere Canada and the local The Source has 2, one each with and without a disc drive.
A bit like Loblaws and pharmacies being good places to find them back in the days of the Wii madness; I didn't know The Source was still in the retail business.
 
I'm throwing down the gauntlet to challenge Nvidia to produce the next consumer $10K GPU!!
Don't encourage them, they might decide to jack prices up to 1990s Silicon Graphics workstation levels!

You know, a company whose entry-level model was a $5,000 Indy workstation - "an Indigo without the go" - rising to $30K+ for a decently specced Indigo2 or Octane and $200,000+ for an Onyx2 deskside graphics supercomputer, all short of the Origin/Onyx clustered setups, and all of it eventually outclassed by a humble GeForce 256 in a modestly priced PC.

I'll let you calculate for inflation on your own.

The current DGX Station A100 is $100,000+ and based on last-gen Ampere architecture, so I guess that's the Onyx of the modern age.

So that leaves me wondering - who's going to do to NVIDIA what they did to SGI?
 
You know, a company whose entry-level model was a $5,000 Indy workstation
Hey, if someone made a $5K desktop Windows gaming device that outperformed self-built gaming PCs with top-end CPUs like the 13900KS and 7950X3D several times over in CPU-intensive strategy and simulation games, I would very much consider that $5K device. I will pay for that several-fold increase in real-time "ticks", or for a small fraction of the normal wait time in a turn-based 4X game. More size, heat, power? No problem dealing with that either. But $1K for a graphics card where CPU-intensive PC games still run at the same speed and you're just getting prettier graphics on the same gameplay, and only on certain types of games? No thanks.
 
Don't encourage them, they might decide to jack prices up to 1990s Silicon Graphics workstation levels!

You know, a company whose entry-level model was a $5,000 Indy workstation - "an Indigo without the go" - rising to $30K+ for a decently specced Indigo2 or Octane and $200,000+ for an Onyx2 deskside graphics supercomputer, all short of the Origin/Onyx clustered setups, and all of it eventually outclassed by a humble GeForce 256 in a modestly priced PC.

I'll let you calculate for inflation on your own.

The current DGX Station A100 is $100,000+ and based on last-gen Ampere architecture, so I guess that's the Onyx of the modern age.

So that leaves me wondering - who's going to do to NVIDIA what they did to SGI?

OK but SGI made dinosaurs first so Nvidia dinosaurs when?
 
Hey, if someone made a $5K desktop Windows gaming device that outperformed self-built gaming PCs with top-end CPUs like the 13900KS and 7950X3D several times over in CPU-intensive strategy and simulation games, I would very much consider that $5K device. I will pay for that several-fold increase in real-time "ticks", or for a small fraction of the normal wait time in a turn-based 4X game. More size, heat, power? No problem dealing with that either. But $1K for a graphics card where CPU-intensive PC games still run at the same speed and you're just getting prettier graphics on the same gameplay, and only on certain types of games? No thanks.
The x86 CPU market is the most exciting it's been since the Pentium III vs. Athlon days, though!

Both Intel and AMD are leapfrogging each other to the point that neither can get too greedy; they have to adjust their prices in response to each other instead of neatly reserving rungs on a price ladder, which is definitely NOT what's happening in the GPU market right now.

HEDT, on the other hand... AMD let Threadripper languish in favor of EPYC because Intel's been dragging their heels on Sapphire Rapids, and CPUs that could be Threadrippers are instead sold as much more profitable EPYCs due to demand.

I feel your pain in wanting more CPU grunt for sims, though - admittedly less 4X and more vehicular on my end. DCS, IL-2, BeamNG, whatever, all will demand as much single-threaded performance as possible.
 
The x86 CPU market is the most exciting it's been since the Pentium III vs. Athlon days, though!

Both Intel and AMD are leapfrogging each other to the point that neither can get too greedy; they have to adjust their prices in response to each other instead of neatly reserving rungs on a price ladder, which is definitely NOT what's happening in the GPU market right now.

HEDT, on the other hand... AMD let Threadripper languish in favor of EPYC because Intel's been dragging their heels on Sapphire Rapids, and CPUs that could be Threadrippers are instead sold as much more profitable EPYCs due to demand.

I feel your pain in wanting more CPU grunt for sims, though - admittedly less 4X and more vehicular on my end. DCS, IL-2, BeamNG, whatever, all will demand as much single-threaded performance as possible.
I have a pair of Xeon Silver 4316s on the way because Threadripper is dead outside of Lenovo, and they have kinda pissed me off, so... there's that.
 
The x86 CPU market is the most exciting it's been since the Pentium III vs. Athlon days, though!
The GPU / 3D accelerator market is a bit of a snoozefest, though.
I miss the mid-90s to early 2000s - 3Dfx, Matrox, S3, Verite, ATi, early nVIDIA (Riva128), Intel i740, 3Dlabs, SiS, etc.
 
The GPU / 3D accelerator market is a snoozefest, though.
I miss the mid-90s to early 2000s - 3Dfx, Matrox, S3, Verite, ATi, early nVIDIA (Riva128), Intel i740, 3Dlabs, SiS, etc.
You forgot PowerVR in that long list! Also Number Nine, but they were about as irrelevant as SiS for serious graphics.

Granted, for as exciting as those times were, it was distressingly easy to bet on the wrong horse with something like a 3D Blaster VLB or an S3 ViRGE (the infamous 3D decelerator, since vindicated for good 2D performance and compatibility, making an excellent Voodoo1/2 companion).

Or worse yet, the NVIDIA NV1, with its very Sega Saturn-esque quadrilateral 3D architecture that was fundamentally incompatible with the triangle-based Direct3D and OpenGL, and whose built-in audio wasn't Sound Blaster-compatible, either.

Now, it's just an overwhelmingly dominant NVIDIA, AMD (formerly ATI), and Intel finally trying for a third time after the i740 flopped and Larrabee lost all pretense of being a GPU.
 
Here's the thing: it has never been a good investment in an apples-to-apples comparison. Even mid-range cards, once you factor in the cost of the rest of the computer, have always tended to cost more than the current console. But that's the point: even if you see the PC as only a gaming machine (which it's not, but whatever), it's all about the choices you have. You have the choice not to go with a "mid-range" card if you don't want to. What's wrong with the $500 mid-range card from last generation, or the year before that, that it's so horrible you NEED to upgrade? You have a huge choice in whether or not you feel the need to have the latest-gen hardware at any level. Do you need 4K, ultra HDR, 1024x AA and all that stuff? Maybe you are saying yes, in which case fine, make whatever "investment" you feel is better. Meanwhile, I can still play Cyberpunk on my old GTX 970; sure, not at 4K, but if I decide 1080p low settings is fine, is there really anything wrong with that? (I played Far Cry 6 at 1440p on my GTX 970 and it was a largely acceptable playing experience.) Can you say the same thing with a PS3? No, you can't, because your PS3 can't play it. I also want to play FF8 on a PS5; can I? Not unless they made a special edition remake. But if I want to play Ultima I on my PC, I can. I also have a much larger range of games available to me on the PC, not just triple-A titles that each cost $60 (if you're lucky).
Ultimately a PC is more of an investment than a console, because its end of life is much further out. You're probably not playing Cyberpunk if you have a Viper V550 video card, but if you have an 8800 GTX you may be able to squeak out something (maybe; I'm not saying you can). And sure, $800 for a "mid-range" video card is crazy insane, but how many years will that card keep working for new games? How much longer is the PS5 going to be relevant? Another 2 years, maybe? And sure, if you're happy with the games you have, then absolutely squeeze more life out of it; meanwhile, the guy who paid too much for a video card (not me) is going to end up playing the latest and greatest games as they come out.

I'm well aware of the arguments for the PC as a general-purpose machine, or running classic games... but those arguments only have a limited appeal to everyday buyers.

If you buy a console, you'll have solid performance for its entire lifespan because games will be targeted at its capabilities; there's no worrying that you'll need to settle for reduced detail or choppy frame rates. Now, that does mean you won't see huge leaps in graphics until the next console generation, but you also don't have to risk your GPU feeling inadequate after a few years. And that's a major factor for a tight-budgeted regular buyer who might rather pay $400 once in several years than $800 (or $400-500 a couple of times).

And frankly, your examples just aren't very good. The PS3 was released in 2006; even the fastest gaming PC from 2006 can't play Cyberpunk. Playing Ultima I is neat, but there's hardly a large audience (and virtually any PC can run it). You have a better case for FF8, but remember that you don't necessarily have to toss out an old console if you're particularly attached to a game... and ironically, that game has been available as a remaster for the PS4/Switch/Xbox since 2019, and it's even playable on phones. Not that you'd want to count on remasters to play your back catalog, but they do help.

You also don't seem to have much experience with console lifecycles. While consoles did have a roughly five-year time in the spotlight, that extended to seven years for the last generation. Moreover, game developers don't immediately put the brakes on games for these systems the moment a new model is out; there's usually a year or two before mainstream titles are either unavailable or are clearly much better on newer consoles. Buy a PS5 now and you'll likely be at the top of the console food chain until 2027, and might still get some extra relevancy beyond that.

I should stress that I'm not against someone knowingly pouring lots of money into PC gaming. It's a hobby, and it might still make financial sense if you don't see you or your family wanting to play in the living room (or you want to play away from home on a laptop). I'm more just explaining why it's a tough sell to the general public, and why GPU sales are so low.
 
I'm well aware of the arguments for the PC as a general-purpose machine, or running classic games... but those arguments only have a limited appeal to everyday buyers.

If you buy a console, you'll have solid performance for its entire lifespan because games will be targeted at its capabilities; there's no worrying that you'll need to settle for reduced detail or choppy frame rates.
While your point holds true for the most part, there's always a Cyberpunk 2077, PUBG, ARK: Survival Evolved, 7 Days to Die, or, rolling back even further to the X360/PS3 era, a Left 4 Dead showing up on consoles looking like absolute ass at low framerates, while a PC will brute-force through them handily.

I get appalled when I see friends of mine playing these games on X360 or base PS4, knowing they could've had a much better experience dropping a decent graphics card into a desktop PC they'd need to have for other things anyway.

Games running acceptably on consoles requires the developer to care enough about optimization to make it happen, and even on PS5, I still need to manually set games to target 60 FPS at the expense of ray-tracing or other eye candy superfluousness (Forspoken comes to mind).

Hopefully, the XSX/PS5 generation makes more of a push for consistent 60 FPS console gaming, if not 120 FPS (they support high refresh rates and VRR now!), kind of like 6th-gen consoles (DC/PS2/GCN/Xbox) often had 60 FPS releases compared to the later two generations where developers would rather sacrifice frame rate for more visual effects.
 
My 8-year-old has a 6600 XT. It's great for Roblox. No need for a 4090. Maybe many of us just don't need a new GPU every 3 months. Hrmmm, it might be that simple. Space the tech out a little longer, nV, sigh... I have a 6900 XT; I won't need a new GPU until at least the 8900 XTX / RTX 5090 generation. Same with my 3070 Ti laptop GPU.
 
My 8-year-old has a 6600 XT. It's great for Roblox. No need for a 4090. Maybe many of us just don't need a new GPU every 3 months. Hrmmm, it might be that simple. Space the tech out a little longer, nV, sigh... I have a 6900 XT; I won't need a new GPU until at least the 8900 XTX / RTX 5090 generation. Same with my 3070 Ti laptop GPU.
"I must have 8K HDR with 12.7 surround on my Apple Watch."

Really? Are you sure?

Nvidia: We don't know why things aren't selling.
 
While your point holds true for the most part, there's always a Cyberpunk 2077, PUBG, ARK: Survival Evolved, 7 Days to Die, or, rolling back even further to the X360/PS3 era, a Left 4 Dead showing up on consoles looking like absolute ass at low framerates, while a PC will brute-force through them handily.

I get appalled when I see friends of mine playing these games on X360 or base PS4, knowing they could've had a much better experience dropping a decent graphics card into a desktop PC they'd need to have for other things anyway.

Games running acceptably on consoles requires the developer to care enough about optimization to make it happen, and even on PS5, I still need to manually set games to target 60 FPS at the expense of ray-tracing or other eye candy superfluousness (Forspoken comes to mind).

Hopefully, the XSX/PS5 generation makes more of a push for consistent 60 FPS console gaming, if not 120 FPS (they support high refresh rates and VRR now!), kind of like 6th-gen consoles (DC/PS2/GCN/Xbox) often had 60 FPS releases compared to the later two generations where developers would rather sacrifice frame rate for more visual effects.
True, we shouldn't rule those out. I just don't think they're enough of a problem that console gamers should worry that the next big game might run poorly.

And while we'd ideally get the best of all worlds, I'm frankly happy that consoles now give you some choices for visual priorities (without making them overly complicated, as they sometimes are on PCs). You might see fewer compromises as more developers focus exclusively on the newest consoles. I don't know that we'll get 60FPS by default, especially not with ray tracing involved, but it should be more commonplace.
 