Desktop GPU Sales Hit 20-Year Low

...because they target reduced detail and choppy frames out the gate on console...
Tell me about it. When all my friends were gaming on the Xbox 360, PS3, and Xbox One, I just wanted to throw up in my mouth. First, you have to pay a monthly fee to play online; second, you can't play very popular MMORPGs; third, you get low detail and choppy frame rates -- computers were doing 60fps/60Hz for many years before consoles caught up. I currently have an Xbox X as a Blu-ray player and it's pretty good -- but it's still a kiosk that doesn't let me do lots of things a basic PC does.
 
I'm well aware of the arguments for the PC as a general-purpose machine, or running classic games... but those arguments only have a limited appeal to everyday buyers.
True enough, but what I'm throwing out there is the idea that you really can't do an apples-to-apples comparison of console vs. PC unless you absolutely restrict the PC to the "I'm only buying this to play games" argument.

And frankly, your examples just aren't very good. The PS3 was released in 2006; even the fastest gaming PC from 2006 can't play Cyberpunk.
My examples were just that: examples of things that can be done. And to refute the PS3 point: what if you didn't buy it in 2006? What if you bought it closer to the end of its life (or I should say, when the PS4 came out)? Consoles don't tend to drop in price unless stores are absolutely flooded with them, so the price tag from day one is the same in year six, with the exact same capabilities. I'm not sure if a PC from 2012 could play Cyberpunk, but again, I'm just using that as an example. I think by 2012 you had 3000-series CPUs from Intel and 600-series Nvidia GPUs? Yeah, absolutely potato graphics by today's standards, but not absolutely useless.

Playing Ultima I is neat, but there's hardly a large audience (and virtually any PC can run it).
That's kind of the point: any PC can run it. And yeah, maybe not that many people want to play Ultima 1, but what about 2, 3, 4, 5, 6, 7, or the tens of thousands of other games that came out between then and now? That was more my point.

You have a better case for FF8, but remember that you don't necessarily have to toss out an old console if you're particularly attached to a game... and ironically, that game has been available as a remaster for the PS4/Switch/Xbox since 2019, and it's even playable on phones. Not that you'd want to count on remasters to play your back catalog, but they do help.
Again, I wasn't trying to make a debate about this; I didn't do research to find out which games had a remaster or not. I will say consoles are getting better about letting you play "retro" games, though I do believe they fully expect you to repurchase them.

And all this to say PCs are better? No, just to say that comparing a PC to a console isn't exactly a black-and-white comparison, and using the current cost of the latest generation of hardware really is a bad metric to use.
 
My 8-year-old has a 6600 XT. It's great for Roblox. No need for a 4090. Maybe many of us just don't need a new GPU every 3 months. Hrmmm, it might be that simple. Space the tech out a little longer, nV. Sigh... I have a 6900 XT. I won't need a new GPU until at least the 8900 XTX / RTX 5090 generation. Same with my 3070 Ti laptop GPU.
I'm with you that I don't need a new GPU often, but I will probably want to upgrade in 2-3 years' time to play the latest titles, and I'll also likely need a better card for my line of work. I just don't want to come back to a GPU market that is even more prohibitively expensive and terrible than it is now. I feel like at some point they're going to lease GPUs to people like cars. Hopefully some sanity comes back to the market in that time. Either way, I'm super happy with the 6800 XT for the moment and the price I paid for it.
 
There's one thing that hasn't been brought up yet regarding this. With discrete card sales being so low, the used market is in really bad shape, at least with regard to good deals for buyers. For years people have been saying to wait and buy used cards on the cheap, but that's not going to work. The super high prices of new GPUs mean more people are going to hold onto their old cards and keep using them, and that scarcity in the used market will make used cards hold their value.

This is made even worse because people have already been doing this for several years. Prices have been quite high and value has been sorely lacking with new cards for many different reasons. There are many out there who are "due" for an upgrade, and may have been "due" for a few years now. They're still not likely to buy badly overpriced new cards (for the most part), which puts more demand on the used market.

In the long run this may very well be catastrophic for the gaming industry. The greed of the two major discrete GPU makers may well cause a collapse of the market, simply due to the desire to keep or increase margins by artificially restricting the number of GPUs for sale. It's not only GPU ownership that suffers; in the long run it will cause major stagnation in the game market, simply because not enough people will be able to afford cards that push graphics. We're already seeing it now: more people keep screaming for RT, and yet only a small fraction of the cards out there can even turn the feature on and have enough performance to run the games.
 
What if you bought it closer to the end of its life (or I should say, when the PS4 came out)? Consoles don't tend to drop in price unless stores are absolutely flooded with them, so the price tag from day one is the same in year six, with the exact same capabilities.

Well, that's categorically false. Just about every console drops its price by half or more by the end of its run. The aforementioned PS3 dropped from $500-$600 to about $250 by the time the PS4 was announced.
 
Yeah, my 360 consoles (for WMC extenders) only cost $179 new from Best Buy when I bought them -- $179 wasn't launch MSRP, not even for the low-end 360 model.
 
Maybe Nvidia and AMD should rush to come out with their RTX 4050 Ti and RX 7500 XT before cryptos climb the ladder again. :D
Crypto mining on GPUs will never come back. Proof of stake was pushed onto crypto because places all over the world got sick and tired of supplying electricity to these people, who went after the cheapest (often subsidized) electricity at the cost of the locals. Countries were ready to ban crypto and gave them an alternative: either become energy efficient or get banned. Proof of stake was the obvious choice.
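For the curious, here's the difference in a nutshell. This is a toy Python sketch, purely illustrative -- real chains like post-merge Ethereum are far more involved -- but it shows why one model eats GPUs and megawatts and the other doesn't: proof of work pays for security by brute-force hashing, while proof of stake replaces the grind with a stake-weighted lottery.

```python
import hashlib
import random

# Toy proof-of-work: grind nonces until the hash meets a difficulty target.
# Every guess costs real compute -- this is the part that burned electricity
# (and GPUs).
def proof_of_work(block_data, difficulty=4):
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# Toy proof-of-stake: the next block producer is picked at random, weighted
# by stake. No grinding, so the energy cost is negligible by comparison.
def proof_of_stake(stakes):
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return random.choices(validators, weights=weights, k=1)[0]

nonce, digest = proof_of_work("block #1")
print("PoW: found nonce", nonce, "->", digest[:16], "...")
print("PoS: validator is", proof_of_stake({"alice": 32.0, "bob": 64.0, "carol": 4.0}))
```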

There's one thing that hasn't been brought up yet regarding this. With discrete card sales being so low, the used market is in really bad shape, at least with regard to good deals for buyers. For years people have been saying to wait and buy used cards on the cheap, but that's not going to work. The super high prices of new GPUs mean more people are going to hold onto their old cards and keep using them, and that scarcity in the used market will make used cards hold their value.
People are going to hold onto old cards because the PS5 is the standard, and that standard is roughly five-year-old mid-to-high-end PC tech. Also, it has gotten to the point where people have to choose between essential living costs and a GPU.
In the long run this may very well be catastrophic for the gaming industry.
How is it a bad thing when we're not forced to buy new hardware? By the Xbox 360 and PS3 era we had basically made graphics so good that the difference on modern PS5/Xbox Series X hardware isn't very noticeable. Halo 3 vs. Infinite isn't that much of a difference. People care more about gameplay today.


Yeah, my 360 consoles (for WMC extenders) only cost $179 new from Best Buy when I bought them -- $179 wasn't launch MSRP, not even for the low-end 360 model.
They must have been used models, not that I haven't bought a used 360 myself.
I'm well aware of the arguments for the PC as a general-purpose machine, or running classic games... but those arguments only have a limited appeal to everyday buyers.
You say that while Sony's launch title for the PS5 is a remade PS3 game sold for $70. Sony's handling of the PS3 is so bad that PC gamers had to remake its networking service so they could enjoy the PS3 version of Demon's Souls with online functionality. There's clearly an appeal.

If you buy a console, you'll have solid performance for its entire lifespan because games will be targeted at its capabilities; there's no worrying that you'll need to settle for reduced detail or choppy frame rates.
You just enjoy reduced detail and choppy frame rates all the time on console. A Plague Tale: Requiem wasn't even 30fps at launch; the game often ran at 24fps. A performance patch made it 30fps.


Now, that does mean you won't see huge leaps in graphics until the next console generation, but you also don't have to risk your GPU feeling inadequate after a few years. And that's a major factor for a tight-budgeted regular buyer who might rather pay $400 once in several years than $800 (or $400-500 a couple of times).
Up until recently, most Steam gamers were using a six-year-old GTX 1060. The majority of GPUs on Steam are still GTX cards from around six years ago, and you still have GTX 970s and 960s hanging on. Seems most gamers aren't feeling inadequate with their old GPUs.
And frankly, your examples just aren't very good. The PS3 was released in 2006; even the fastest gaming PC from 2006 can't play Cyberpunk.
My PC at the time had an ATI Radeon X1950 GT, and that played Crysis just fine, though that card was released in 2007. Before that I had a GeForce 6800, but my PSU blew up and took it out; I'm sure it would have played Crysis just fine too, especially since I unlocked the cores. The GeForce 6800 was released in 2004.

but remember that you don't necessarily have to toss out an old console if you're particularly attached to a game...
But you do have to toss out the game if the console is broken and there's no alternative way to play it.
and ironically, that game has been available as a remaster for the PS4/Switch/Xbox since 2019, and it's even playable on phones. Not that you'd want to count on remasters to play your back catalog, but they do help.
Yeah, just buy a new console and the same game again to solve a problem that can be solved for free. On PC we don't need remasters because that's what texture packs and mods are for. Anyone who does create a remaster gets crucified, like Dark Souls. Also, "playable on phones".

Buy a PS5 now and you'll likely be at the top of the console food chain until 2027, and might still get some extra relevancy beyond that.
Or get a PC and make the PS5 part of your food chain.
I should stress that I'm not against someone knowingly pouring lots of money into PC gaming. It's a hobby, and it might still make financial sense if you don't see you or your family wanting to play in the living room (or you want to play away from home on a laptop). I'm more just explaining why it's a tough sell to the general public, and why GPU sales are so low.
Fun fact: you can use your PC in a living room just like a console, with a gamepad. Any gamepad.

I repeat, ANY GAMEPAD.
 
The GPU / 3D accelerator market is a bit of a snoozefest, though.
I miss the mid-90s to early 2000s - 3Dfx, Matrox, S3, Verite, ATi, early nVIDIA (Riva128), Intel i740, 3Dlabs, SiS, etc.
I do not; I remember the driver hell and game patches of that era... never again.
 
Desktop CPU sales are apparently their lowest in 30 years. I would love to see a graph correlating the two.
 
No way was that the price for new. Was it in 2013?

A simple Google search of Xbox 360 + $179.99 will show you that yes, it was new, and at that price plenty of times. And yes, that's around when I bought mine, though it had hit that price in sales even before then.

https://www.google.com/search?q="Xb...BAzYuMZgBAKABAcABAQ&sclient=gws-wiz-serp#ip=1

Edit: Best Buy still has it listed on its site at that price, even though it's no longer sold

https://www.bestbuy.com/site/reviews/microsoft-xbox-360-e-4gb-console-black/9267058?page=2
 
A simple Google search of Xbox 360 + $179.99 will show you that yes, it was new, and at that price plenty of times. And yes, that's around when I bought mine, though it had hit that price in sales even before then.

https://www.google.com/search?q="Xb...BAzYuMZgBAKABAcABAQ&sclient=gws-wiz-serp#ip=1

Edit: Best Buy still has it listed on its site at that price, even though it's no longer sold

https://www.bestbuy.com/site/reviews/microsoft-xbox-360-e-4gb-console-black/9267058?page=2
You mean you bought a 360 now? As in 2022 or 2023? I thought this was something you did back in 2008 or 2010, when it was relevant. There are so many better, cheaper alternatives for watching media. I wouldn't spend $180 today on a 360, despite the plethora of cheap, good games it has. Kodi, Plex, and Jellyfin are so much better than WMC, can run on practically anything, and most smart TVs can connect to them.
 
You mean you bought a 360 now? As in 2022 or 2023? I thought this was something you did back in 2008 or 2010, when it was relevant. There are so many better, cheaper alternatives for watching media. I wouldn't spend $180 today on a 360, despite the plethora of cheap, good games it has. Kodi, Plex, and Jellyfin are so much better than WMC, can run on practically anything, and most smart TVs can connect to them.
You're completely taking his post out of context.
 
You mean you bought a 360 now? As in 2022 or 2023? I thought this was something you did back in 2008 or 2010, when it was relevant. There are so many better, cheaper alternatives for watching media. I wouldn't spend $180 today on a 360, despite the plethora of cheap, good games it has. Kodi, Plex, and Jellyfin are so much better than WMC, can run on practically anything, and most smart TVs can connect to them.
Xbox 360 Arcades were under $200 as early as 2008, and were ~$150 MSRP by 2010.

There were other versions that were $199 MSRP in later years as well.
 
It's been a while, but I'm pretty sure I picked up a 360 in 2010 or 2011 for $199 that had built-in WiFi. My first one didn't.
 
You're completely taking his post out of context.
That he bought cheap 360s at some point to use as Windows Media Center devices? WMC takes me back; I had a setup long ago, in ancient times. Anyway, is he saying he bought a 360 recently or in the distant past, that it's some cheap gaming, or that consoles often get so cheap there's no need for a GPU? I'm not a mind reader if you're sketchy with the details.

The 360 isn't relevant today as a game console unless you want to experience some games from the past or already have a 360 collection. The only reason I referenced the 360 is because it's an example of a time in history when console gaming was truly cheaper than PC gaming, you know, like back in 2006. That isn't the case today. It wasn't even the case when RTX 3000 GPUs needed a bank loan to buy because PS5s were scalped to over $1k.

PC gaming today has a very low barrier to entry, and that includes not needing to buy a new GPU. I just built a $600 PC for a little girl with a Ryzen 5 5600G, and that includes a monitor and speakers. The iGPU in it isn't fast, but it'll play most games just fine, especially Roblox, which is the game she plays. You don't need a discrete GPU for most games.
 
So now we've gone from "No, there's no way you got a 360 for $179 in 2013" immediately to "You did get a 360 for $179 in 2013, so what?" 🤣

It won't kill you, ya know, to just go "Oh, I learned something, my bad!"

 
So now we've gone from "No, there's no way you got a 360 for $179 in 2013" immediately to "You did get a 360 for $179 in 2013, so what?" 🤣

It won't kill you, ya know, to just go "Oh, I learned something, my bad!"

I bought a 360 back in 2008 or 2009 for the purpose of playing Halo 3. I also modded it to run unauthorized applications; it eventually got the RROD and I used a heat gun to "fix it". I didn't pay $180 for it, and it was used; I think I paid $250 for it at GameStop or some local game store. Also, looking online I don't see a new 360 for $180, but plenty of refurbished ones at that price. Refurbished isn't new, especially with the way they "refurbish" these devices. What am I supposed to learn from this? I don't even know when you bought your 360s, as in more than one, and I don't even see what relevance this has to the conversation. Do you game on the 360s or exclusively use them for WMC? You have an RTX 3060, so I don't think you do.

 
I bought a 360 back in 2008 or 2009 for the purpose of playing Halo 3. I also modded it to run unauthorized applications; it eventually got the RROD and I used a heat gun to "fix it". I didn't pay $180 for it, and it was used; I think I paid $250 for it at GameStop or some local game store. Also, looking online I don't see a new 360 for $180, but plenty of refurbished ones at that price. Refurbished isn't new, especially with the way they "refurbish" these devices. What am I supposed to learn from this? I don't even know when you bought your 360s, as in more than one, and I don't even see what relevance this has to the conversation. Do you game on the 360s or exclusively use them for WMC? You have an RTX 3060, so I don't think you do.


k

 
By using only one (!) metric, one must conclude that CPU prices are too high too, then:
https://crast.net/137106/processor-sales-drop-to-30-year-low/

Or perhaps the picture is more complex than a single metric can indicate.
It's always "more complex", but in reality people are "happy with what they have"... which likely means what they have is more than enough for their needs. And perhaps there aren't enough people in a state of "need" with regard to upgrading, in terms of both GPU and CPU.

I know I tend to hold onto hardware at least twice as long as most people here. With that said, maybe the "need" (intense desire) to upgrade one's system annually is over (?)

Also, the big-buck margins are always server side. Perhaps Sapphire Rapids will help drive some things (?). Perhaps a drop in motherboard, DDR5, CPU, and GPU prices might make "upgrading" more desirable. Otherwise, people will skip, waiting for a bigger jump in performance/features.
 
It's always "more complex", but in reality people are "happy with what they have"... which likely means what they have is more than enough for their needs. And perhaps there aren't enough people in a state of "need" with regard to upgrading, in terms of both GPU and CPU.

I know I tend to hold onto hardware at least twice as long as most people here. With that said, maybe the "need" (intense desire) to upgrade one's system annually is over (?)

Also, the big-buck margins are always server side. Perhaps Sapphire Rapids will help drive some things (?). Perhaps a drop in motherboard, DDR5, CPU, and GPU prices might make "upgrading" more desirable. Otherwise, people will skip, waiting for a bigger jump in performance/features.

To some extent, GPUs are suffering from the same 'problem' that PCs as a whole, phones and other tech products have been dealing with for years: they've matured to the point where a years-old device is frequently "good enough." While an RTX 4070 Ti is a huge leap over any GTX 10 series card, you don't need it when Apex Legends or Valorant runs just fine. You just won't get the visual effects or performance you get from the latest GPUs and consoles.
 
To some extent, GPUs are suffering from the same 'problem' that PCs as a whole, phones and other tech products have been dealing with for years: they've matured to the point where a years-old device is frequently "good enough." While an RTX 4070 Ti is a huge leap over any GTX 10 series card, you don't need it when Apex Legends or Valorant runs just fine. You just won't get the visual effects or performance you get from the latest GPUs and consoles.
I remember when even a two-year-old PC during the tech arms race of the '90s and 2000s was hopelessly obsolete.

It wasn't "upgrade your GPU" obsolete, I'm talking "upgrade your whole system because you need a new mobo for CPUs that are literally twice as fast and also new GPUs that use these newfangled AGP/PCIe slots" outdated, and the newest games either ran like complete crap or were literally unplayable because your old GPU lacked programmable pixel and vertex shaders.

Then I went six years on a Q6600, then almost a whole decade on a 4770K, each later getting its RAM maxed and one or two GPU upgrades along the line. Only recently did I step up to a 12700K build, and the older systems are still quite viable for casual computing and lightweight gaming.

We really have it easy now, and ray-tracing isn't a feature that new games literally require just to run at all. (There are RT-required enhanced releases of existing games, but you could just play the original non-RT versions and it's still not the dramatic difference that GLQuake was to software-rendered Quake, for instance.)

The Steam Deck may have even emphasized "low-spec" PC gaming like never before, with its appealing price and portability; there's a lot of good games that still run comfortably on it, damn the maxed settings and RT.
 
I remember when even a two-year-old PC during the tech arms race of the '90s and 2000s was hopelessly obsolete.

It wasn't "upgrade your GPU" obsolete, I'm talking "upgrade your whole system because you need a new mobo for CPUs that are literally twice as fast and also new GPUs that use these newfangled AGP/PCIe slots" outdated, and the newest games either ran like complete crap or were literally unplayable because your old GPU lacked programmable pixel and vertex shaders.

Then I went six years on a Q6600, then almost a whole decade on a 4770K, each later getting its RAM maxed and one or two GPU upgrades along the line. Only recently did I step up to a 12700K build, and the older systems are still quite viable for casual computing and lightweight gaming.

We really have it easy now, and ray-tracing isn't a feature that new games literally require just to run at all. (There are RT-required enhanced releases of existing games, but you could just play the original non-RT versions and it's still not the dramatic difference that GLQuake was to software-rendered Quake, for instance.)

The Steam Deck may have even emphasized "low-spec" PC gaming like never before, with its appealing price and portability; there's a lot of good games that still run comfortably on it, damn the maxed settings and RT.
Tell me about it. I remember how people thought Quake requiring a Pentium 60 was a big deal since the CPU was only three years old.

You're right about the Steam Deck. I also think this helps the console market to a certain degree. If you're going to keep your gaming hardware for several years, it's very tempting to buy a $399 console instead of a $399 GPU (let alone a $799 one), at least if you get it in the first couple years of its lifecycle; you're getting a complete set of gaming hardware that will still be actively targeted even near the end of its useful life. A sufficiently powerful PC can outperform that console, but you're also spending more and not necessarily getting more longevity in return.
 
There's a whole class of consumer where "status" matters so much with regards to their own "esteem" (or other) that they have to run the very latest and greatest. Obviously, disposable income factors in greatly.
 
Yeah, I very quickly went from an Athlon 64 3200+ > Core 2 Duo E6300 > Core 2 Quad Q6600, and then stayed there for about 6 years until I built a 3570K system that I then kept for ~8 years.
Holy crap, that's the same lineup I had, to a tee.
 
Is there a tl;dr? Are there still supply issues and price gouging, or is that a thing of the past?
 
I remember when even a two-year-old PC during the tech arms race of the '90s and 2000s was hopelessly obsolete.

It wasn't "upgrade your GPU" obsolete, I'm talking "upgrade your whole system because you need a new mobo for CPUs that are literally twice as fast and also new GPUs that use these newfangled AGP/PCIe slots" outdated, and the newest games either ran like complete crap or were literally unplayable because your old GPU lacked programmable pixel and vertex shaders.

Then I went six years on a Q6600, then almost a whole decade on a 4770K, each later getting its RAM maxed and one or two GPU upgrades along the line. Only recently did I step up to a 12700K build, and the older systems are still quite viable for casual computing and lightweight gaming.

We really have it easy now, and ray-tracing isn't a feature that new games literally require just to run at all. (There are RT-required enhanced releases of existing games, but you could just play the original non-RT versions and it's still not the dramatic difference that GLQuake was to software-rendered Quake, for instance.)

The Steam Deck may have even emphasized "low-spec" PC gaming like never before, with its appealing price and portability; there's a lot of good games that still run comfortably on it, damn the maxed settings and RT.
I sure don't miss those days. As soon as you completed a new build, with each passing day you grew increasingly aware of how quickly it was becoming an old-and-slow build. Not only did you have to start planning your next build almost right away, you had to find somewhere to unload your most recent one. In my case I finally gave up and started chucking the things in the attic, along with all the empty Newegg boxes.
 
I sure don't miss those days. As soon as you completed a new build, with each passing day you grew increasingly aware of how quickly it was becoming an old-and-slow build. Not only did you have to start planning your next build almost right away, you had to find somewhere to unload your most recent one. In my case I finally gave up and started chucking the things in the attic, along with all the empty Newegg boxes.
I miss those days because that's when overclocking and tweaking actually meant something. I remember I overclocked my AMD Athlon XP Barton chip from 1.83GHz to 2.4GHz and saved $800 on the CPU cost alone. I went from PC gaming -> benchmarking -> crypto mining -> to now... I just surf the internet and post on forums.
 
I miss those days because that's when overclocking and tweaking actually meant something. I remember I overclocked my AMD Athlon XP Barton chip from 1.83GHz to 2.4GHz and saved $800 on the CPU cost alone. I went from PC gaming -> benchmarking -> crypto mining -> to now... I just surf the internet and post on forums.
You people that paid too little for CPUs that overclocked have caused the prices to rise on all CPUs that no longer overclock.
 
You people that paid too little for CPUs that overclocked have caused the prices to rise on all CPUs that no longer overclock.
I assume sarcasm? Otherwise I disagree. I think I paid $200 for my XP Barton 2500+. Inflation between 2005 and 2023 is about 1.5x, so that's $300 for a mid-range CPU today. I just bought my daughter a Ryzen 5 5500 for $98, and I frequently see the Ryzen 7 5700X under $200. Even the flagship 7950X is under $600 and comes with 32GB of RAM at Micro Center. Video card prices are a little out of control, but we have a healthy used market -- and I'd argue no one on a budget "needs" anything more than a $250 6600 XT or a $400 3070. I think I paid $549 new for a 7970 3GB; adjusted for inflation, that seems pretty close to current 7900 XT pricing.
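The arithmetic holds up, for what it's worth. A quick back-of-the-envelope in Python -- the 1.5x multiplier is the rough figure from this thread, and the ~1.3x for 2012 is a similar ballpark guess, not an official CPI number:

```python
# Back-of-the-envelope inflation adjustment for the prices in this thread.
# Multipliers are rough ballparks, not official CPI figures.
adjustments = [
    ("Athlon XP 2500+ (Barton), 2005", 200, 1.5),  # ~1.5x for 2005 -> 2023
    ("Radeon HD 7970 3GB, 2012",       549, 1.3),  # ~1.3x for 2012 -> 2023
]

for part, price_then, multiplier in adjustments:
    print(f"{part}: ${price_then} then ~= ${price_then * multiplier:.0f} in 2023 dollars")
```

That puts the Barton around $300 and the 7970 around $714 in today's money, which is roughly in line with current mid-range CPU and 7900 XT pricing.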
 
It's always "more complex", but in reality people are "happy with what they have"... which likely means what they have is more than enough for their needs. And perhaps there aren't enough people in a state of "need" with regard to upgrading, in terms of both GPU and CPU.

I know I tend to hold onto hardware at least twice as long as most people here. With that said, maybe the "need" (intense desire) to upgrade one's system annually is over (?)

Also, the big-buck margins are always server side. Perhaps Sapphire Rapids will help drive some things (?). Perhaps a drop in motherboard, DDR5, CPU, and GPU prices might make "upgrading" more desirable. Otherwise, people will skip, waiting for a bigger jump in performance/features.
I am well aware of the prices on the server side; I am an IT architect, and I design large VMware clusters for a living (VxRail and Cisco HyperFlex solutions).
But I agree with your proposition.
Most people are content with the hardware/performance they are getting (except for a small and too-vocal minority).
Did I NEED to upgrade from a 3090 to a 4090?
No.
Did I do it just because I could afford it and I like turning everything up to 11?
Yes indeed, guilty as charged.
I splurged on a luxury.
 
Yeah, I very quickly went from an Athlon 64 3200+ > Core 2 Duo E6300 > Core 2 Quad Q6600, and then stayed there for about 6 years until I built a 3570K system that I then kept for ~8 years.
I'm trying to remember what CPUs I've used, but this is what I recall: an Athlon XP 3000-something Mobile, to a Phenom II X4 955 Black Edition, to an FX-4200, to an FX-8350, to a Ryzen 1700, to my current Ryzen 2700X. I've had other CPUs before that, like a Pentium 90, a Pentium 166 MMX Overdrive, a Pentium II @ 233, and I believe I upgraded to a Pentium III 800MHz. My first Athlon wasn't even the Athlon Mobile, which was the one for overclocking, but some other Athlon XP that I've totally forgotten. I upgraded more often back then compared to today because upgrades today don't offer much. As it is, I have a Ryzen 3700X lying around to be installed, but I'm trying to fix an MSI B550 my cousin gave me because the motherboard doesn't work; looking at it, the board has a short, so I'm trying to see if a simple capacitor is shorted and hopefully not the southbridge chip. Otherwise I wouldn't spend the money to upgrade, since the benefits aren't there.
 
You people that paid too little for CPUs that overclocked have caused the prices to rise on all CPUs that no longer overclock.
People who bought cheap Intel CPUs and overclocked them are the reason why Intel only allows K-series CPUs to be overclocked -- you know, the most expensive of their CPUs. But AMD, for the most part, allows all their CPUs to be overclocked. The only reason Intel continues this practice is because "you people" only buy Intel CPUs. The same can be applied to Nvidia, as lots of "you people" only buy Nvidia, and of course Nvidia sets GPU prices for the industry. You know, the industry where only two companies, AMD and Nvidia, sell GPUs.
 
There's a whole class of consumer where "status" matters so much with regards to their own "esteem" (or other) that they have to run the very latest and greatest. Obviously, disposable income factors in greatly.

This is so true... quite sickening.
 

Gartner Says Worldwide PC Shipments Declined 28.5% in Fourth Quarter of 2022 and 16.2% for the Year

The EMEA PC market had a historical decline of 37.2% year over year, due to the intersection of political unrest, inflationary pressures, interest rate increases and a pending recession.

“A decline of this magnitude only happens when market demand effectively comes to a halt,” said Kitagawa. “Business and consumer confidence across EMEA has collapsed, leading to a huge drop in PC demand. A massive increase in inventory has also severely limited sell-in opportunities as sellers focus on moving old stock.”

The Asia Pacific market excluding Japan declined 29.4% year-over-year, mainly due to the market in China. While the fourth quarter has traditionally been peak season for China’s business PC market, budget cuts by the Chinese government and uncertainty around changing COVID policies led to a significant drop in overall PC demand.

Annual Overview: PC Market Collapses After COVID Boom
Worldwide PC shipments totaled 286.2 million units in 2022, a 16.2% decrease from 2021 and the worst annual shipment decline in Gartner’s PC tracking history (see Table 3).

 