Desktop GPU Sales Hit 20-Year Low

I can't believe all the POS miners that Nvidia and AMD were catering to, sending out pallets of GPUs while neglecting their "loyal for decades" PC gaming customer base, have left them high and dry. Who would've thought that chasing the fast buck at the expense of the customers who made you what you are would have consequences?

You know, you say that, but an unmineable PS5 was launched 2 years ago and is still hard to find. Pretty much no GPU will ever meet an ROI now, but the RTX 4090 and 7900 XTX are near impossible to find. There's more at play than just "Crypto bad."
 
You know, you say that, but an unmineable PS5 was launched 2 years ago and is still hard to find. Pretty much no GPU will ever meet an ROI now, but the RTX 4090 and 7900 XTX are near impossible to find. There's more at play than just "Crypto bad."

Yeah, I know there are other factors involved, but "crypto bad" is mostly to blame. They were the ones, along with the bottom-feeding scalpers, who wiped out global stock in no time and pounced on any new cards that made it to market. But then Nvidia, AMD, and the retailers went along with it. They could've easily put a stop to it by limiting GPUs to one per customer, and Best Buy and the like could've sold them in person only... but they didn't. They sold us out and went all in for the miners, sending out literal pallets of video cards to them in extreme cases.

Now I know that's how the market works and I'm not saying they did anything illegal or unethical. All I'm saying is if you crap on the customer base that has been loyally buying your products for the last few decades, making you what you are now, there are consequences.
 
Yeah, I know there are other factors involved, but "crypto bad" is mostly to blame. They were the ones, along with the bottom-feeding scalpers, who wiped out global stock in no time and pounced on any new cards that made it to market. But then Nvidia, AMD, and the retailers went along with it. They could've easily put a stop to it by limiting GPUs to one per customer, and Best Buy and the like could've sold them in person only... but they didn't. They sold us out and went all in for the miners, sending out literal pallets of video cards to them in extreme cases.

Now I know that's how the market works and I'm not saying they did anything illegal or unethical. All I'm saying is if you crap on the customer base that has been loyally buying your products for the last few decades, making you what you are now, there are consequences.

Personally, I think you are overstating the role of crypto, but that's just my opinion.
 
I'm honestly not surprised when the only new GPU that's even remotely worth it is the RTX 4090, and NVIDIA's being real stingy with the stock on that one.

If the local Micro Center happens to get new cards, even with a $100-200 AIB markup, they usually sell out within the first hour, sometimes even the first few minutes - though having only 1 to 5 RTX 4090s every restock sure isn't helping. As I understand it, while we gamers aren't competing with miners this time around, we are competing with people who need beefy GPUs with CUDA support for machine learning purposes, and AI seems to be taking off big time as of late.

Don't even get me started on Best Buy if you want a $1,600 Founders Edition. Just don't. Might as well be unobtainium. Still miffed that NVIDIA makes their FE cards exclusive to Best Buy, as I'd have a somewhat better chance nabbing one at Micro Center.

Honestly, I could probably keep my old GTX 980 going for a while longer yet when I'm still largely using 1080p displays, except for one little thing - VR - that demands literally as much GPU as you can throw at it, and for some titles, even the extravagantly expensive RTX 4090 is basically a minimum requirement for a good VR experience. (Usually because they're not optimized worth a damn.)

The plan: skip Pascal, wait for Turing's improvements - which didn't really net you more FPS for your dollar, so I waited for Ampere instead and would've bought then, except, you know, they were largely out of stock for TWO GODDAMN YEARS due to the crypto craze, and by the time they restocked, Ada Lovelace was around the corner, which had the FPS gains but was also a repeat of Turing MSRP-wise (huge overstock of last-gen cards after a crypto bust, so they priced high to incentivize buying the older stuff instead).

I wound up getting an RX 7900 XTX on day one instead after waiting so damn long for a new GPU, and while it is a considerable improvement, I wasn't expecting a defective vapor chamber and subpar VR performance (performs like a 3090, costs a few hundred dollars more). AMD managed to snatch defeat from the jaws of victory, so I'm just gonna have to bite the bullet on NVIDIA's price gouging. Seven years is a long time to go without GPU upgrades...
 
And the PS5 is basically at GTX 1070 or even 1060 levels of performance. Unfortunately, the Intel ARC GPUs are about the same as, if not worse than, a GTX 1060.
Jeez, I just posted this, but the PS5 is definitely not that low in performance. It is between a 6600 XT and a 6700 XT.

PS5: 2304 shading units for 10.25 TFLOPS
6600 XT: 2048 shading units for 10.6 TFLOPS
6700 XT: 2560 shading units for 13.21 TFLOPS
Despite the slightly lower TFLOPS potential compared to a 6600 XT, it has more shading units, more TMUs, and more compute units. Combined with the fact that it's dedicated hardware, the PS5 generally performs pretty close to a 6700 XT in rasterization. Actually, it's very similar in spec to the RX 6700; forgot about that one.
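For reference, those TFLOPS figures fall straight out of shader count x 2 FMA ops per clock x boost clock. A quick sketch (the clocks are the commonly quoted boost figures, so treat the last decimal as approximate):

[CODE=python]
# FP32 TFLOPS = shaders x 2 (one FMA = 2 ops) x clock. Clocks are commonly quoted boost figures.
cards = {
    "PS5":     (2304, 2.23),   # 36 CUs, up to 2.23 GHz
    "6600 XT": (2048, 2.589),  # 32 CUs, ~2589 MHz boost
    "6700 XT": (2560, 2.581),  # 40 CUs, ~2581 MHz boost
}

for name, (shaders, clock_ghz) in cards.items():
    print(f"{name}: {shaders * 2 * clock_ghz / 1000:.2f} TFLOPS")
# PS5: 10.28, 6600 XT: 10.60, 6700 XT: 13.21 -- matching the figures above
[/CODE]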


And Intel's ARC A770 basically always performs better than a 1060. The drivers need a lot of work to make game performance more consistent, but at its worst it's like an RTX 3060, and it frequently performs better than that. There are a couple of games where it performs as well as or better than a 3060 Ti.
It also seems to generally scale well with resolution, sometimes strangely well. If they can smooth the drivers out and drop the price to an even $300, they would have a great card.
Personally, I think you are overstating the role of crypto, but that's just my opinion.
Sales numbers came out a few months ago, which basically underlined that; indeed, a meaningful number of GPUs went to crypto.
 
You know, you say that, but an unmineable PS5 was launched 2 years ago and is still hard to find. Pretty much no GPU will ever meet an ROI now, but the RTX 4090 and 7900 XTX are near impossible to find. There's more at play than just "Crypto bad."

I presume you are or were a miner? A statement like this could only come from a miner.
 
Jeez, I just posted this, but the PS5 is definitely not that low in performance. It is between a 6600 XT and a 6700 XT.
Since A Plague Tale: Requiem was brought up, I started looking at benchmarks, and yeah, the PS5 isn't much faster than a GTX 1060. Digital Foundry, whose benchmarking I don't think is good, reckons this game performs the same on the PS5 as on an RTX 2070 Super, which gets an average frame rate of 32 fps.


Meanwhile, a GTX 1060 will average 25 fps on low settings, but you can at least tweak those settings to get nearly 60 fps, and if you get FSR working it can handle medium and high settings.

PS5: 2304 shading units for 10.25 TFLOPS
6600 XT: 2048 shading units for 10.6 TFLOPS
6700 XT: 2560 shading units for 13.21 TFLOPS
Despite the slightly lower TFLOPS potential compared to a 6600 XT, it has more shading units, more TMUs, and more compute units. Combined with the fact that it's dedicated hardware, the PS5 generally performs pretty close to a 6700 XT in rasterization. Actually, it's very similar in spec to the RX 6700; forgot about that one.
Paper specs don't matter, only benchmarks.
And Intel's ARC A770 basically always performs better than a 1060.
If it does; and when it does, it's barely a difference.
The drivers need a lot of work to make game performance more consistent, but at its worst it's like an RTX 3060, and it frequently performs better than that. There are a couple of games where it performs as well as or better than a 3060 Ti.
It also seems to generally scale well with resolution, sometimes strangely well. If they can smooth the drivers out and drop the price to an even $300, they would have a great card.
I'm not a fan of buying into the future because I've done it many times with AMD. My FX 8350 and Vega 56 did age well, but not in any meaningful way. I replaced my FX 8350, and my Vega 56 is doing nicely, but that's mostly because I use Linux. You won't get a GTX 1060 owner to go out and buy an ARC A770 for what little it offers.
 
Since A Plague Tale: Requiem was brought up, I started looking at benchmarks, and yeah, the PS5 isn't much faster than a GTX 1060. Digital Foundry, whose benchmarking I don't think is good, reckons this game performs the same on the PS5 as on an RTX 2070 Super, which gets an average frame rate of 32 fps.

A. The 2070 Super is A LOT faster than a 1060. Therefore, the PS5 is a lot faster than a 1060.
B. Being similar to the 2070 Super basically lines up with exactly what I said: between a 6600 XT and a 6700 XT. And then I remembered the RX 6700 non-XT, and indeed, that is VERY similar in spec to the PS5. And that would line up well with a 2070 Super or 2080, and sometimes the 2080 Super or 3060 Ti, depending upon the game engine.
 
I presume you are or were a miner? A statement like this could only come from a miner.
I miss mining so much. Curse you, PoS Ethereum. Now I am forced to look for video games to play, but none of them are as fun for me. I think miners certainly impacted the market: buying video cards on 0% interest, knowing they were going to pay themselves off every 3-6 months (in the good times), made it easy to just max out your credit cards and then start selling the coin to make your electric and credit card payments. I never mined and held, so obviously I never got rich from mining, but I've sold over 300 bitcoins in the last ten years and paid off a lot of video cards. For every 1 miner I think there are probably 50+ PC gamers. However, the average miner might have owned 7-15 video cards (wild guess); most of the miners I know had only 1 or 2 rigs, but of course there were full data centers of video cards too, so who knows how that skews the results. I know one guy on this forum claimed to have over 100 video cards in his house. I quit growing when I started tripping 15 amp breakers.
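To put rough numbers on that payback logic and the breaker limit (every figure below is a hypothetical example, purely to show the arithmetic):

[CODE=python]
# Back-of-envelope mining payback and circuit-load math. All inputs are hypothetical examples.
card_cost = 700.0       # USD, example GPU price
daily_revenue = 6.0     # USD/day of coin at good-times rates (hypothetical)
card_watts = 220.0      # tuned mining draw per card (hypothetical)
power_price = 0.12      # USD per kWh (hypothetical)

daily_power_cost = card_watts / 1000 * 24 * power_price      # ~$0.63/day
payback_days = card_cost / (daily_revenue - daily_power_cost)
print(f"Payback: ~{payback_days:.0f} days")                  # ~130 days, i.e. inside the 3-6 month window

# Why 15 amp breakers become the limit: 120 V x 15 A = 1800 W, and the usual 80% continuous-load
# rule leaves ~1440 W per circuit, so only ~6 cards at 220 W before the breaker trips.
usable_watts = 120 * 15 * 0.8
print(f"Cards per 15 A circuit: ~{int(usable_watts // card_watts)}")
[/CODE]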
 
I started looking at benchmarks, and yeah, the PS5 isn't much faster than a GTX 1060. Digital Foundry, whose benchmarking I don't think is good, reckons this game performs the same on the PS5 as on an RTX 2070 Super,
According to TechPowerUp, a 2070 Super has twice the performance of a 1060, which makes that a bit of a strange statement.
 
A. The 2070 Super is A LOT faster than a 1060. Therefore, the PS5 is a lot faster than a 1060.
That's some logic flow there. The PS5 gets 32 fps on Plague Tale: Requiem while a GTX 1060 will average 30 fps. Also, the PS5 version of Requiem does indeed use temporal upscaling for 4K, which means it already has some form of FSR or DLSS. Digital Foundry thinks the game on PS5 performs like an RTX 2070 Super, but you can achieve much higher frame rates on a 2070. The problem here is that Digital Foundry guesstimated it by comparing the graphics output of the game on PS5 vs PC, because it's not like game developers tell you what settings they use. We're assuming Digital Foundry is correct, which is probably not a good assumption. If the 2070 is twice as fast as the 1060, and therefore so is the PS5, then the PS5 should be able to achieve higher frame rates like the 2070, but it doesn't.
[Attached image: performance-1920-1080.png (1080p benchmark chart)]


The thing you need to understand is that a PS5 will always perform at 32 fps in A Plague Tale: Requiem. An RTX 2070 Super can go beyond that thanks to DLSS, and a GTX 1060 can also go beyond that thanks to FSR. More importantly, this also means people who don't want to buy new GPUs still don't need to. They also don't need to buy a PS5 to enjoy Requiem at 32 fps when they can already enjoy the game at 30 fps on a GTX 1060. Hence the lack of need to upgrade.
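For rough numbers on what those upscalers actually buy you: they render at a lower internal resolution and upscale to the output. The per-axis scale factors below are FSR 2's published ones (DLSS 2's are very similar):

[CODE=python]
# Internal render resolution for each FSR 2 quality mode at a 1080p output target.
modes = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}
out_w, out_h = 1920, 1080  # a GTX 1060-class 1080p target

for mode, ratio in modes.items():
    w, h = int(out_w / ratio), int(out_h / ratio)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{mode}: renders {w}x{h} (~{saved:.0%} fewer pixels shaded per frame)")
[/CODE]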

B. Being similar to the 2070 Super basically lines up with exactly what I said: between a 6600 XT and a 6700 XT. And then I remembered the RX 6700 non-XT, and indeed, that is VERY similar in spec to the PS5. And that would line up well with a 2070 Super or 2080, and sometimes the 2080 Super or 3060 Ti, depending upon the game engine.
Just because the 6600 XT is similar to the PS5 in specs doesn't mean the PS5 will perform like a 6600 XT. The PS5 shares 16 GB of memory between the CPU and GPU, which hurts performance because both can't access memory at the same time, plus the CPU has to use GDDR6 memory, which isn't ideal for a CPU due to high latency. There's a reason PCs use DDR5 for the CPU and GDDR6 for the GPU: that's optimal. That's also the reason we run benchmarks and don't just talk about paper specs, because there are things we don't see that can affect performance. When Gamers Nexus said the PS5 performs like a medium to high-end PC from 5 years ago, he was spot on if Requiem is anything to go by. A 1080 Ti is still very good at playing Plague Tale: Requiem, and it's not only better than a PS5 but also from 5 years ago. It's beating a 6600 XT, which again explains why not many people are buying new GPUs.
 
Top-end GPU launches are a minuscule percentage of the GPU pie. The total lack of functional mid- and low-end GPUs, on top of absurd pricing, has really hurt the market. AMD and nVidia need to understand that they can be replaced; having the king of performance means nothing when that's WAY under 1% of the GPU market.

I'd love to see AMD and nVidia go the way of Christmas tree farms around here, or at least get close. They had pretend shortages and gouged so hard every year for a decade that everyone bought plastic; now five out of seven of them are gone, and one of the two survivors is winding down the tree farm at the end of this year. Watching them dispose of all those trees that they were pretending didn't exist filled my heart with malicious glee.
Slightly off topic, hoping I don't get shot for this, but the only reason to have a real Christmas tree is to burn that bitch in July. They are spectacular. Far better than having a wet-ass tree in your house in December. Burn that dry fucker in July. You won't be disappointed.
 
I just paid someone to take one of my old 6900 XTs away. It's making saving up for a new-gen card even harder.
 
Who cares? What does that matter? kirbyrj speaks the truth.

Truth? What truth? The GPU industry is where it's at because of mining. The 4090 and 7900XTX are in short supply because AMD and Nvidia are limiting production to sell off the huge amounts of stock left over from when the mining bubble burst. Nvidia even came out with a statement saying that they were doing this.

Mining was mainly responsible for the GPU shortage for the last two years, and its after-effects are still causing problems now.
 
Neither AMD nor Nvidia wants to sell a lot of cards to consumers right now. There's too much demand in enterprise; between impending tariffs, an AI development war, and a bunch of other things, data centers are buying them as fast as TSMC can roll them out.

I don’t understand why people keep insisting this is the case. The consumer GPU market is worth billions of dollars and both AMD and Nvidia want to have that. Considering that they just cut orders at TSMC, it’s pretty clear this current situation is related to overall chip demand and not some weird conspiracy where Nvidia and AMD are trying to make it so difficult for normal people to buy GPUs that they just don’t in order to free up supply for enterprise consumers.
 
Truth? What truth? The GPU industry is where it's at because of mining. The 4090 and 7900XTX are in short supply because AMD and Nvidia are limiting production to sell off the huge amounts of stock left over from when the mining bubble burst. Nvidia even came out with a statement saying that they were doing this.

Mining was mainly responsible for the GPU shortage for the last two years, and its after-effects are still causing problems now.

If that's what you think, Jensen and Co. did a fantastic job brainwashing the masses.

In times when production is limited, they are going to prioritize high-margin products like data center, etc. At least this time, they told you that they weren't going to make 4090s because they were making professional GPUs they can charge even more for.

And there's pretty much no huge glut of product out there except in the used market.

I would never say crypto had no impact, but I don't think it was the impact you and others think it was.
 
The 4090 and 7900XTX are in short supply because AMD and Nvidia are limiting production to sell off the huge amounts of stock left over from when the mining bubble burst
That makes little sense to me; why is the 7900 XT already selling with a below-MSRP rebate? A yield issue would explain all of this.

It has been a long time since we saw a huge amount of supply (or even good prices) of high-end Ampere cards; not a single new 3090 is sold by Newegg:
https://www.newegg.com/p/pl?N=8000 4131&d=rtx+3090

No new 3080 either:
https://www.newegg.com/p/pl?N=8000 4814 4131&d=rtx+3080

No new 3080TI either:
https://www.newegg.com/p/pl?d=rtx+3080ti&N=8000

A large quantity of used mining cards in the channel is different from leftover new stock. In our "just in time" era, it would be hard to imagine still being in this situation almost 4 months after proof of stake. Wanting to sell $36,000 Hopper GPUs instead of $1,600 4090s (especially when the failed AD102 dies go in a pile, maybe for a future 4080 Ti), and suspecting there is not that much demand for a 4080 Ti, does not require limiting production to sell off a huge stock of 3080 to 3090 Ti cards that they do not even seem to be trying to sell via Newegg and Amazon, if they have them.
 
That makes little sense to me; why is the 7900 XT already selling with a below-MSRP rebate? A yield issue would explain all of this.

It has been a long time since we saw a huge amount of supply (or even good prices) of high-end Ampere cards; not a single new 3090 is sold by Newegg:
https://www.newegg.com/p/pl?N=8000 4131&d=rtx+3090

No new 3080 either:
https://www.newegg.com/p/pl?N=8000 4814 4131&d=rtx+3080

No new 3080TI either:
https://www.newegg.com/p/pl?d=rtx+3080ti&N=8000

A large quantity of used mining cards in the channel is different from leftover new stock.

Unfortunately, Nvidia Ampere cards are back to selling above MSRP.

AMD doesn’t have that market strength, so RDNA 2 cards are still selling at a discount to MSRP

If you are not gaming at 4K, the best strategy right now might be to buy a discounted Navi 22 card such as the 6750 XT / 6700 XT / 6700.

https://www.digitaltrends.com/computing/no-gpu-shortage-still-feels-like-one/

It might take some time for Nvidia prices to normalize, especially with AMD stumbling with the cooler on the reference 7900 XTX cards.
 
I would never say crypto had no impact, but I don't think it was the impact you and others think it was.
Crypto is the main driving force behind all the problems with GPU sales. What people are going to start learning soon is that COVID didn't cause a lot of the problems, but was a catalyst for an already existing problem. We've had problems with GPU prices twice in the past due to crypto, even before COVID. The 2016 crypto boom started this mess, and while crypto did crash in 2017, Nvidia was already resisting reducing prices and justified the price increase with ray tracing, which was pushed with Tom's Hardware telling everyone to "Just Buy It". Then the pandemic hit along with a massive surge in Bitcoin, which before 2020 was worth around $9,000; even after the Binance and FTX fiasco, it's still worth $16,000. Bitcoin is supposed to be failed monopoly money, and yet somehow it still retains a massive value of $16,000.

Nvidia's and AMD's situation is that they like the high prices and don't want to go back down, because they have shareholders. No amount of server AI bullshit is gonna make up for the lost crypto miner sales. Meanwhile, the GTX 1060 was the #1 GPU on Steam for 5+ years, because we went from an MSRP of $200 for the GTX 960, to $250 for the GTX 1060, to $360 for the RTX 2060, and $330 for the RTX 3060, and I guarantee you that nobody in the past 2 years could find an RTX 3060 at MSRP. $850 for an RTX 3060 was stupid. We also have Nvidia lying about who they sold their GPUs to. Crypto had such a big impact on Nvidia that they tried to lie about it in order to boost their share price. It clearly worked, and I hope Nvidia shareholders are left holding the bag. The one ray of hope is that nobody listened to Jayztwocents and went out to buy a GPU back in July, like the "idiots" Jayztwocents thinks you are. That is literally what he called people, not my words.
[Attached image: PCDIGA_RTX_3060_5.jpg]
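Just to spell out the x60-class price drift being described (MSRPs as quoted in the post above, so take them as the post's figures rather than gospel):

[CODE=python]
# Gen-over-gen launch MSRP drift for the x60-class cards, using the figures quoted above (USD).
msrps = [("GTX 960", 200), ("GTX 1060", 250), ("RTX 2060", 360), ("RTX 3060", 330)]

for (prev_name, prev), (name, cur) in zip(msrps, msrps[1:]):
    print(f"{prev_name} -> {name}: {(cur - prev) / prev:+.0%}")
# 960 -> 1060: +25%, 1060 -> 2060: +44%, 2060 -> 3060: -8%
# And the ~$850 street price mentioned above is roughly 2.6x the 3060's $330 MSRP.
[/CODE]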
 
The one ray of hope is that nobody listened to Jayztwocents and went out to buy a GPU back in July, like the "idiots" Jayztwocents thinks you are. That is literally what he called people, not my words.
How wrong was he, though? Has availability and/or pricing been significantly better since (beyond the usual improvement from time passing)?

There are now more 3060/3060 Ti cards than 1060s in the hardware survey (more 2060/2060 Supers as well); we should not over-read the artifact that the 1060 6GB and 3GB get combined into one entry while we have entered a world of ever more split SKUs.
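A contrived illustration of that SKU-splitting artifact, with completely made-up share numbers (not Steam survey data):

[CODE=python]
# Made-up shares purely to illustrate the artifact: one combined legacy entry vs. a split SKU family.
survey = {
    "GTX 1060 (6GB + 3GB combined)": 4.0,
    "RTX 3060": 2.6,
    "RTX 3060 Ti": 2.2,
    "RTX 3060 Laptop GPU": 1.8,
}

print("Top single entry:", max(survey, key=survey.get))                       # the combined 1060 line
print("3060 family total:", sum(v for k, v in survey.items() if "3060" in k)) # 6.6, well ahead of 4.0
[/CODE]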
 
I don’t understand why people keep insisting this is the case. The consumer GPU market is worth billions of dollars and both AMD and Nvidia want to have that. Considering that they just cut orders at TSMC, it’s pretty clear this current situation is related to overall chip demand and not some weird conspiracy where Nvidia and AMD are trying to make it so difficult for normal people to buy GPUs that they just don’t in order to free up supply for enterprise consumers.
It’s not a conspiracy theory when Nvidia puts the plan in writing and publicly states it to their board and investors only for all the news sites to put out articles about Nvidia “manipulating the market”.

There is also their public statement to the Chinese government about how they will do their best to supply their market before the new US embargo kicks in.
 
Since 2005?
iGPUs becoming good enough, and the popularity of Core i5-type CPUs that come with them, is probably the biggest force in the downtrend, followed by laptop popularity and the longer lifespan of computer hardware.
 
Jeez, I just posted this, but the PS5 is definitely not that low in performance. It is between a 6600 XT and a 6700 XT.

PS5: 2304 shading units for 10.25 TFLOPS
6600 XT: 2048 shading units for 10.6 TFLOPS
6700 XT: 2560 shading units for 13.21 TFLOPS
Despite the slightly lower TFLOPS potential compared to a 6600 XT, it has more shading units, more TMUs, and more compute units. Combined with the fact that it's dedicated hardware, the PS5 generally performs pretty close to a 6700 XT in rasterization. Actually, it's very similar in spec to the RX 6700; forgot about that one.


And Intel's ARC A770 basically always performs better than a 1060. The drivers need a lot of work to make game performance more consistent, but at its worst it's like an RTX 3060, and it frequently performs better than that. There are a couple of games where it performs as well as or better than a 3060 Ti.
It also seems to generally scale well with resolution, sometimes strangely well. If they can smooth the drivers out and drop the price to an even $300, they would have a great card.

Sales numbers came out a few months ago, which basically underlined that; indeed, a meaningful number of GPUs went to crypto.
It's absolutely not a 6700 XT in terms of performance; it's very close to the 6600 XT / 5700 XT.
 
It's absolutely not a 6700 XT in terms of performance; it's very close to the 6600 XT / 5700 XT.
Mhmm. See the part where I remembered the 6700 non-XT; indeed, the specs are VERY similar. It's between a 6600 XT and a 6700 XT.
 
That's some logic flow there. The PS5 gets 32 fps on Plague Tale: Requiem while a GTX 1060 will average 30 fps. Also, the PS5 version of Requiem does indeed use temporal upscaling for 4K, which means it already has some form of FSR or DLSS. Digital Foundry thinks the game on PS5 performs like an RTX 2070 Super, but you can achieve much higher frame rates on a 2070. The problem here is that Digital Foundry guesstimated it by comparing the graphics output of the game on PS5 vs PC, because it's not like game developers tell you what settings they use. We're assuming Digital Foundry is correct, which is probably not a good assumption. If the 2070 is twice as fast as the 1060, and therefore so is the PS5, then the PS5 should be able to achieve higher frame rates like the 2070, but it doesn't.

The thing you need to understand is that a PS5 will always perform at 32 fps in A Plague Tale: Requiem. An RTX 2070 Super can go beyond that thanks to DLSS, and a GTX 1060 can also go beyond that thanks to FSR. More importantly, this also means people who don't want to buy new GPUs still don't need to. They also don't need to buy a PS5 to enjoy Requiem at 32 fps when they can already enjoy the game at 30 fps on a GTX 1060. Hence the lack of need to upgrade.


Just because the 6600 XT is similar to the PS5 in specs doesn't mean the PS5 will perform like a 6600 XT. The PS5 shares 16 GB of memory between the CPU and GPU, which hurts performance because both can't access memory at the same time, plus the CPU has to use GDDR6 memory, which isn't ideal for a CPU due to high latency. There's a reason PCs use DDR5 for the CPU and GDDR6 for the GPU: that's optimal. That's also the reason we run benchmarks and don't just talk about paper specs, because there are things we don't see that can affect performance. When Gamers Nexus said the PS5 performs like a medium to high-end PC from 5 years ago, he was spot on if Requiem is anything to go by. A 1080 Ti is still very good at playing Plague Tale: Requiem, and it's not only better than a PS5 but also from 5 years ago. It's beating a 6600 XT, which again explains why not many people are buying new GPUs.


>shits on digital foundry's detailed in-depth analysis
>compares 1080p performance on gtx1060 to 1440p (fps-capped) performance on PS5...
>follows up with tldr screed on benchmarking rigor


I know you've got a hard-on for platform-warrioring, but it's okay to accept that consoles are, in fact, "okay". Your battlestation will still respect you in the morning.



By the by, the benchmark you want for comparison:

[Attached image: performance-2560-1440[1].png (1440p benchmark chart)]


Mmmm... cinematic...
 
Mhmm. See the part where I remembered the 6700 non-XT; indeed, the specs are VERY similar. It's between a 6600 XT and a 6700 XT.
The specs are similar, but the PS5 is not RDNA 2; it doesn't have all of the architectural advantages and increased IPC, which is why I compared it to a 5700 XT. It has some minor ray tracing features, but it would likely tie a 6600 XT. PC gaming needs to get cheaper, end of story. If people could buy a 6700 XT for $250 it'd be a better value proposition overall. The 6650 XT and 6600 XT are getting there, but it's still not the same value IMO. Long term, if you play a lot of games, it's probably cheaper to own a PC because games are generally cheaper, but still.
 
I'm straight up locked into PC, TBH, after all this time... Go console now? I'd rather just play games on low settings and wait to upgrade whenever I could.

Which is what I did with a 1080 FTW a friend let me borrow after I sold my 2070 Super literally right before the PC apocalypse started; in between selling it and going to get a new 3000 series card, poof, I couldn't 😭 I only sold the 2070 Super for ~$420 too 😭 I was playing what I could on my backup GT 1030 before my friend lent me the 1080 FTW 😭

Never again. 3060 Ti is staying my backup-backup card.

All my cards are/were EVGA too and you can't buy EVGA anymore 😭😭

Edit: I only got my 3060 Ti in April for MSRP (I held out, which is why it took so long). I don't plan on upgrading till the 5000 series, and it will probably be a 5050 Ti at the rate power consumption and prices are going 😭😭
 
The specs are similar, but the PS5 is not RDNA 2; it doesn't have all of the architectural advantages and increased IPC, which is why I compared it to a 5700 XT. It has some minor ray tracing features, but it would likely tie a 6600 XT. PC gaming needs to get cheaper, end of story. If people could buy a 6700 XT for $250 it'd be a better value proposition overall. The 6650 XT and 6600 XT are getting there, but it's still not the same value IMO. Long term, if you play a lot of games, it's probably cheaper to own a PC because games are generally cheaper, but still.
The PS5 is a custom chip, which has aspects of RDNA 1 and 2, as well as some custom bits (the geometry engine is one of those custom bits; it seems to be a custom approach to mesh shading or something dynamic like that).
It's unclear why they chose to do it like that. However, looking at the paper specs, it's clear that a fair bit of it is RDNA 2 based. The shader counts, TMUs, ROPs, compute units, and clock speeds are all indicative of RDNA 2.0-type silicon. And indeed, it supports ray tracing as well. And I'm not so sure the IPC of RDNA 2 is necessarily better than RDNA 1: the 5700 XT has more shaders than a 6600 XT, but a lower clock speed and fewer overall FLOPS, yet isn't much worse than a 6600 XT.

Despite all of the chatter about this near the beginning of the PS5's life, I don't think there is any problem here, performance-wise. I mean, if you look at that Digital Foundry video, the PS5 version isn't far behind the Series X version. And that is frequently the case with many games. Thus proving that the performance is comparable to RDNA 2.
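To put numbers on that 5700 XT comparison (same shaders x 2 x clock arithmetic as the earlier sketch, with commonly quoted boost clocks):

[CODE=python]
# Same FP32 arithmetic as the earlier sketch; boost clocks are the commonly quoted figures.
for name, shaders, clock_ghz in [("5700 XT", 2560, 1.905), ("6600 XT", 2048, 2.589)]:
    print(f"{name}: {shaders * 2 * clock_ghz / 1000:.2f} TFLOPS")
# 5700 XT: ~9.75 vs 6600 XT: ~10.60 -- more shaders, lower clock, slightly fewer FLOPS, as stated above.
[/CODE]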
 
I'm amused at the attempts to maul consoles in a thread discussing how dedicated GPU sales are tanking. It's like insisting your battleship's extra cannons will win the day while the ship is slowly sinking. Greater power doesn't mean much if fewer and fewer people can (or want to) use it.

Modern GPUs are amazing things, and there's no doubt that a well-equipped gaming PC can tap dance all over a PS5 or Xbox Series X. But PC gaming is an increasingly expensive hobby, and the truth is that multiple stars need to align for the advantages to justify the added expense (the right overall component mix, games that make full use of the hardware, and beyond). And the ugly truth is that many people would rather spend $399 on a console every several years than three times that amount on a PC that might need mid-cycle upgrades to keep up with the Joneses.
 
But PC gaming is an increasingly expensive hobby,
I would be curious about this; looking at Steam's most played:

https://store.steampowered.com/charts/mostplayed

Everything before the free-to-play Call of Duty would run quite well on a $500 pre-built desktop, or on a laptop with a GPU if you picked that instead of whatever other laptop you would have bought anyway.

The library of old titles that run out of the box, free of issues, is bigger than ever; Epic and Amazon give games away, and low-cost indie titles are a popular genre.

As for keeping up with the strongest machine to play on an OLED TV, we are back to early-'90s pricing, true, but I am not sure what percentage of the world's PC gamers that represents; the "runs on a Core 2 Duo, with players from Vietnam to everywhere" title scene is huge (maybe because you need to spend a lot to beat the PS5 experience, which is why the market gets pushed to the two extremes: very cheap and very expensive).
 

The thing is, I'm not sure how that graph works. You mean that in 3 months, across ALL the desktops surveyed, AMD dropped 10% of ALL the discrete desktop GPU market share? I find that hard to believe, as most people don't have a 3-month upgrade cycle. If someone had an AMD GPU in Q2'22, it's pretty likely they still had one in Q3'22. Likewise, even if someone upgraded from an older Nvidia card to a 4XXX one, they are still dropping one card and replacing it, which is a wash.

Unless the graph speaks specifically to GPU sales (which the graph doesn't indicate), in which case it makes sense that Nvidia sales would be significantly higher, because they released a new card in that timeframe and AMD did not.
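A toy example of that distinction, with completely made-up numbers, just to show why a 10-point swing in a single quarter points to a shipments chart rather than an installed-base chart:

[CODE=python]
# Made-up numbers: quarterly shipment share can swing hard while the installed base barely moves.
installed = {"Nvidia": 80_000_000, "AMD": 20_000_000}   # cards in use going into the quarter
shipped_q = {"Nvidia": 3_150_000, "AMD": 350_000}       # cards sold during the quarter

ship_total = sum(shipped_q.values())
base_total = sum(installed.values()) + ship_total
for vendor in installed:
    ship_share = shipped_q[vendor] / ship_total
    base_share = (installed[vendor] + shipped_q[vendor]) / base_total
    print(f"{vendor}: shipment share {ship_share:.0%}, installed base {base_share:.1%}")
# Shipments this quarter: 90% / 10%. Installed base afterwards: ~80.3% / ~19.7% -- barely moved.
[/CODE]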
 