GPU prices — is the worst behind us?

HUB Analysis on ideal GPU prices:

Here's what the lineup should have looked like at launch, up to the mid-range (a rough cost-per-frame sketch follows the link below):

  • The RTX 4070 should have debuted at $500, matched by AMD with the 7900 XT.
  • The 7800 XT should have been priced at $460, making it better priced than most without being exceptional.
  • Around $400, we should have seen the 7700 XT competing against the RTX 4060 Ti 16GB, with the Nvidia model priced at $370.
  • Below that, at $300, there should have been a battle between the RTX 4060 Ti 8GB and RX 7600 XT, presenting an interesting choice between faster performance with less VRAM, and slower performance with more VRAM.
  • At $250, the battle should have been between the RTX 4060 and RX 7600.
  • Rounding it out, last-generation GPUs like the RX 6600 at $170 and the RTX 3050 6GB at $110 should have provided solid entry points to PC gaming.
https://www.techspot.com/article/2817-price-is-wrong-gpu-reality-check/
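A quick way to sanity-check a lineup like that is dollars per frame. Here's a minimal sketch in Python; the prices are the "should have been" MSRPs from the list above, but the FPS numbers are invented placeholders (NOT HUB's measurements), so only the mechanics matter:

```python
# Hypothetical cost-per-frame table in the spirit of HUB's analysis.
# Prices: the "should have been" MSRPs above. FPS: made-up placeholders.
cards = {
    "RTX 4070":         (500, 100),
    "RX 7900 XT":       (500, 105),
    "RX 7800 XT":       (460,  95),
    "RX 7700 XT":       (400,  80),
    "RTX 4060 Ti 16GB": (370,  75),
    "RTX 4060 Ti 8GB":  (300,  73),
    "RX 7600 XT":       (300,  68),
}

# Sort by dollars per frame, cheapest frames first.
for name, (price, fps) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:17s} ${price:3d}  {fps:3d} fps  ${price / fps:.2f}/frame")
```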
They have to cater to their audience a bit.

They don't really test with DLSS and FSR when they make those performance/price comparisons, so it's not a useful guide. They also seem to forget that a global pandemic gave pretty much every company an excuse to raise prices on everything (inflation), and they are not going to lower them.
 
They have to cater to their audience a bit.

They don't really test with DLSS and FSR when they make those performance/price comparisons, so it's not a useful guide. They also seem to forget that a global pandemic gave pretty much every company an excuse to raise prices on everything (inflation), and they are not going to lower them.
How is it not useful? Seems pretty apples-to-apples to me. Not every game supports those features, and some support frame gen while others don't. This way they avoid the cherry-picked slides Nvidia loves to use.
 
It also misrepresents the performance in 5/6 of the titles they test with. New games DO support DLSS/FSR. And for something 5 years old that doesn't, we don't need performance graphs... the cards can all play those games.

Especially for the lower-tier cards, those features need to be included, because they benefit the most.

I like that HU is on the consumers' side, but they could improve their analysis methodology and test results. It's a lot of work for 2 people; they should grow their business and hire someone.
 
 
Only through a technicality of every original Game Boy shipping with it. But these days it's probably Minecraft or Fortnite or something.

Minecraft, with an estimated 300 million copies sold.
Well, 520M vs. 300M. I don't think they sold that many Game Boys, but it does have a few decades of age on it.
But I guess this is a little OT... To the point: not everyone uses the extra RT/FG features (I don't), so that post is relevant to some.
 
I know, but it's technically true. No idea what they were referring to...
They?

TBF, I misquoted Wikipedia, which says it's the best-selling game of all time. But with "nearly 140 million monthly active players as of 2023", you could make the case for most popular, probably.

Compare the Pokemon franchise, which, with dozens of different games across ten generations plus a whole bunch of side games, has 480 million units sold. Minecraft is one game (or 5, depending on how you count).
 
Well, 520M vs. 300M.
That's hundreds of different versions of the game across basically every platform. Minecraft, if you discount the spinoffs, has 2 (or 3, if you count the retired original mobile version). But then we start getting into the weeds.
 
It also misrepresents the performance in 5/6 of the titles they test with. New games DO support DLSS/FSR. And for something 5 years old that doesn't, we don't need performance graphs... the cards can all play those games.

Especially for the lower-tier cards, those features need to be included, because they benefit the most.

I like that HU is on the consumers' side, but they could improve their analysis methodology and test results. It's a lot of work for 2 people; they should grow their business and hire someone.
There actually aren't a lot of sites/channels consistently looking at performance with upscaling turned on.

HU kind of sidesteps it by being bullish about raw performance. (See: they think the 7900 XT should be priced like a 4070, and the 7700 XT like a 4060 Ti, due to AMD's lesser RT capability. But that would also skew things heavily in favor of raster, and then the 7800 XT gets put in a weird zone where it wouldn't make sense at all.
And cards lower than that, I guess, aren't worth it for RT, so there isn't a similar shuffle.)


In the end, it really highlights how weird everyone's product stacks and pricing are.
 
There actually aren't a lot of sites/channels consistently looking at performance with upscaling turned on.
With good reason: the second you start comparing with different upscalers turned on, it becomes quite hard.

You need to match upscaler quality. Say XeSS Quality is equivalent to DLSS Ultra Quality, but only with DLSS 3.7 and the latest XeSS; for a title still on 3.4, it's more like XeSS Quality = DLSS Balanced, and so on.
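To make the preset mismatch concrete, here's a rough sketch of the per-axis render scales behind the mode names. The DLSS and FSR 2 factors are the commonly documented ones; the XeSS 1.3 values are from memory, so treat all of them as approximate:

```python
# Per-axis render-scale factors behind the upscaler preset names.
# DLSS/FSR 2 values are the commonly documented ones; XeSS 1.3 values
# are from memory -- treat everything here as approximate.
PRESETS = {
    "DLSS":     {"Quality": 0.667, "Balanced": 0.580, "Performance": 0.500},
    "FSR 2":    {"Quality": 0.667, "Balanced": 0.590, "Performance": 0.500},
    "XeSS 1.3": {"Quality": 0.588, "Balanced": 0.500, "Performance": 0.435},
}

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and scale."""
    return round(out_w * scale), round(out_h * scale)

for tech, modes in PRESETS.items():
    for mode, scale in modes.items():
        w, h = internal_res(3840, 2160, scale)
        print(f"{tech:8s} {mode:11s} -> {w}x{h} internal for 4K output")
```

Note how XeSS 1.3 "Quality" (~2258x1270) lands near DLSS "Balanced" (~2227x1253): same label, different internal resolution, which is exactly the mismatch described above.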

Benchmarkers much prefer comparing the exact same visuals if they can. They could compare how fast FSR runs on each model, but because it's so similar there isn't much need or demand for it. XeSS can swing widely when it runs on Intel versus the rest, I think, but there's so little Intel interest...

Testing GPUs has gotten quite complicated, like back in the day when a game could look significantly different with or without Glide, with 16-bit vs. 24-bit color, with different anti-aliasing tech/capabilities, etc. It can't be summed up by numbers anymore.

It's been more than a year(?) since any hard-to-run title released without extensive upscaling options, and almost none of the new big games can run at 4K ultra settings, making upscaling something you always try, to see if it beats the lower settings you'd otherwise need, for those with that common resolution on their TV.
 
The less-popular Windows version has Nvidia-only ray tracing, sure. The main Java version still doesn't. But don't let that stop you from copping an attitude.
Wrong again. It has DXR. Honestly, did you even check the link or do any research before posting?
 
Wrong again. It has DXR. Honestly, did you even check the link or do any research before posting?
It didn't at first. I don't play the Windows version, so I didn't know they added AMD support. Meanwhile, you haven't shown that Java Minecraft has ray tracing.

You keep telling yourself how great you are for starting an Internet argument, though, bro.
 
I don't want to get into the thick of anything, but I'll confirm Minecraft runs agreeably enough with DXR ray tracing. DLSS is locked to Nvidia hardware, but I had no issues playing around with RT on Intel Arc hardware, even if it was pretty pokey on an Arc A380 for anything north of 720p.

Still pulling for Intel.
 
It didn't at first. I don't play the Windows version, so I didn't know they added AMD support. Meanwhile, you haven't shown that Java Minecraft has ray tracing.

You keep telling yourself how great you are for starting an Internet argument, though, bro.
Java edition requires Optifine and the ray tracing package in there, but it can be done; Bedrock edition is the only officially supported ray-traced version, though.
But before you get too far into it: nobody plays the Java edition purely vanilla. Optifine and Sodium are pretty much required for any degree of modern mod compatibility, and for any hope of an actual draw distance on modern hardware.
 
It didn't at first. I don't play the Windows version, so I didn't know they added AMD support. Meanwhile, you haven't shown that Java Minecraft has ray tracing.

You keep telling yourself how great you are for starting an Internet argument, though, bro.
I'm not telling myself anything. I'm simply correcting the incorrect information you posted. Honestly, I've never even played Minecraft; it's just not something I'm interested in. But I also fact-check myself before posting something so contentious. Perhaps you should start doing the same.
 
With good reason: the second you start comparing with different upscalers turned on, it becomes quite hard.

You need to match upscaler quality. Say XeSS Quality is equivalent to DLSS Ultra Quality, but only with DLSS 3.7 and the latest XeSS; for a title still on 3.4, it's more like XeSS Quality = DLSS Balanced, and so on.

Benchmarkers much prefer comparing the exact same visuals if they can. They could compare how fast FSR runs on each model, but because it's so similar there isn't much need or demand for it. XeSS can swing widely when it runs on Intel versus the rest, I think, but there's so little Intel interest...

Testing GPUs has gotten quite complicated, like back in the day when a game could look significantly different with or without Glide, with 16-bit vs. 24-bit color, with different anti-aliasing tech/capabilities, etc. It can't be summed up by numbers anymore.

It's been more than a year(?) since any hard-to-run title released without extensive upscaling options, and almost none of the new big games can run at 4K ultra settings, making upscaling something you always try, to see if it beats the lower settings you'd otherwise need, for those with that common resolution on their TV.
Yeah, it will be difficult and even a little subjective. But the newer features like frame gen and super resolution really do add a lot in terms of performance. Raster is only a piece now, not the be-all and end-all. And in some games, I've seen DLSS improve image quality; that's why it's wrong to keep ignoring it. DLSS can upscale an image and produce a better output than native. In most games, at a minimum (from my experience), the highest-quality DLSS setting = native in terms of PQ, while boosting performance significantly.
Find that old thread of mine where I posted screenshots of Jedi Survivor with no upscaling, with FSR, and then with DLSS from the mod. The DLSS was the best picture, even at 50% input resolution, which was completely bonkers. It's an amazing piece of technology. Yes, 50% input res, so a 1/2(x) by 1/2(y) input resolution upscaled to (x, y) looked better than native (x, y) or FSR at any quality level, and performed insanely better too. I put some of that blame on the game devs, and the rest on AMD for making it FSR-exclusive at launch, forcing an inferior upscaler on everyone. The people arguing that raster is all that matters are AMD owners. You don't really know any better if you have never tried DLSS (2.0 or higher) on an Nvidia card.
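For anyone wondering what "50% input res" works out to, a trivial sketch (the output resolution here is assumed for illustration, not taken from the original screenshots):

```python
# "50% input res": halving each axis quarters the number of pixels
# the GPU has to shade before the upscaler reconstructs the image.
out_w, out_h = 2560, 1440              # assumed output res, for illustration
in_w, in_h = out_w // 2, out_h // 2    # 50% per axis -> 1280x720
print(in_w * in_h / (out_w * out_h))   # 0.25: a quarter of the pixels
```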

It completely changes the placement of cards in various tiers.

As far as games supporting it, even old games are getting support if they're decently popular.

"I don't like raytracing!" well you can still use DLSS and Nvidia Reflex and get those benefits I'm describing without turning RT on, if you really want.

Why do you think the Nvidia cards are so popular? Because the technology works, and works well.
 
Yeah, it will be difficult and even a little subjective. But the newer features like frame gen and super resolution really do add a lot in terms of performance. Raster is only a piece now, not the be-all and end-all. And in some games, I've seen DLSS improve image quality; that's why it's wrong to keep ignoring it. DLSS can upscale an image and produce a better output than native. In most games, at a minimum (from my experience), the highest-quality DLSS setting = native in terms of PQ, while boosting performance significantly.
Find that old thread of mine where I posted screenshots of Jedi Survivor with no upscaling, with FSR, and then with DLSS from the mod. The DLSS was the best picture, even at 50% input resolution, which was completely bonkers. It's an amazing piece of technology. Yes, 50% input res, so a 1/2(x) by 1/2(y) input resolution upscaled to (x, y) looked better than native (x, y) or FSR at any quality level, and performed insanely better too. I put some of that blame on the game devs, and the rest on AMD for making it FSR-exclusive at launch, forcing an inferior upscaler on everyone. The people arguing that raster is all that matters are AMD owners. You don't really know any better if you have never tried DLSS (2.0 or higher) on an Nvidia card.

It completely changes the placement of cards in various tiers.

As far as games supporting it, even old games are getting support if they're decently popular.

"I don't like raytracing!" well you can still use DLSS and Nvidia Reflex and get those benefits I'm describing without turning RT on, if you really want.

Why do you think the Nvidia cards are so popular? Because the technology works, and works well.
With studio cuts and team clawbacks, expect DLSS and RT to only become more necessary. Why spend time and resources manually optimizing textures and lighting when you can leave them raw and let technologies like Nanite, Lumen, ray tracing, DLSS, and Frame Generation get 99% of the results with 1% of the effort?

For all the comments of "when I turn on ray tracing I barely notice a difference": that was what they wanted. The non-ray-traced lighting took whole teams and tens of thousands of man-hours; the ray-traced version was a few button clicks and a few weeks of automated rendering engines running calculations with barely any oversight.

Development costs in the gaming industry have exploded and weren't sustainable. The massive industry layoffs and consolidation efforts among studios are proof of this.

These technologies are the solution. Pair them with some AI to augment QA departments and you get virtually the same output with half the manpower on a third of the budget.

Like it or not, that's the reality of the industry right now.
 
Well, 520M vs. 300M. I don't think they sold that many Game Boys, but it does have a few decades of age on it.
But I guess this is a little OT... To the point: not everyone uses the extra RT/FG features (I don't), so that post is relevant to some.
OK, did some digging: they apparently sold 425M copies over JUST mobile platforms, so I'm guessing 99-cent downloads of a game that resembles the original... I mean, sure, I guess. But yeah, "only" 30M from Game Boy sales.
 
With studio cuts and team clawbacks, expect DLSS and RT to only become more necessary. Why spend time and resources manually optimizing textures and lighting when you can leave them raw and let technologies like Nanite, Lumen, ray tracing, DLSS, and Frame Generation get 99% of the results with 1% of the effort?

For all the comments of "when I turn on ray tracing I barely notice a difference": that was what they wanted. The non-ray-traced lighting took whole teams and tens of thousands of man-hours; the ray-traced version was a few button clicks and a few weeks of automated rendering engines running calculations with barely any oversight.

Development costs in the gaming industry have exploded and weren't sustainable. The massive industry layoffs and consolidation efforts among studios are proof of this.

These technologies are the solution. Pair them with some AI to augment QA departments and you get virtually the same output with half the manpower on a third of the budget.

Like it or not, that's the reality of the industry right now.

I think there needs to be one big asterisk here. With most games being multiplatform, and consoles using lower-end AMD GPUs, ray tracing will probably not become standard for a while. Though I haven't been watching console performance much, as the last home console I owned was a PS2, so maybe more games are using ray tracing with "good enough" upscaler results than I think.
 
I think there needs to be one big asterisk here. With most games being multiplatform, and consoles using lower-end AMD GPUs, ray tracing will probably not become standard for a while. Though I haven't been watching console performance much, as the last home console I owned was a PS2, so maybe more games are using ray tracing with "good enough" upscaler results than I think.
I'm not so sure. The new PS5 coming at the end of the year will support it, Microsoft will be forced to update the Xbox platform to even remain in the game, and the upcoming Switch will have ray tracing and all the DLSS goodies.
Expect non-ray-traced options to be the new potato mode within the next 2 years for sure. They still won't look bad by any means, just less dynamic: static shadows, flat coloring, a lack of gradients in shadows, and an over-reliance on global illumination instead of pointed light sources.
The newer texture compression methods and higher resolutions will cover some of the spread, but I certainly expect a step or 2 backwards in visual fidelity before things start moving forward again.
Gameplay-wise this could be a resurgence; for too long, games have overly relied on visuals to carry them, and that just may not be an option for the next year or 2. So we could see some interesting gameplay mechanics step up to take their place.
 