chameleoneel
XFX has a 6750 XT on Amazon and Newegg for $300
> HUB analysis on ideal GPU prices. Here's what the lineup should have looked like at launch, up to the mid-range:
> https://www.techspot.com/article/2817-price-is-wrong-gpu-reality-check/
> - The RTX 4070 should have debuted at $500, matched by AMD with the 7900 XT.
> - The 7800 XT should have been priced at $460, indicating it was better priced than most without being exceptional.
> - Around $400, we should have seen the 7700 XT competing against the RTX 4060 Ti 16GB, with the Nvidia model priced at $370.
> - Below that, at $300, there should have been a battle between the RTX 4060 Ti 8GB and RX 7600 XT, presenting an interesting choice between faster performance with less VRAM and slower performance with more VRAM.
> - At $250, the battle should have been between the RTX 4060 and RX 7600.
> - Rounding it out, last-generation GPUs like the RX 6600 at $170 and the RTX 3050 6GB at $110 should have provided solid entry points to PC gaming.
They have to cater to their audience a bit.
> They have to cater to their audience a bit.
How is it not useful? Seems pretty apples-to-apples to me. Not every game supports those features, and some support frame gen while others don't. So this way they avoid the Nvidia cherry-picked slides they love to use.
They don't really test with DLSS and FSR when they make those performance/price comparisons, so it's not a useful guide. I think they also seem to forget that a global pandemic gave pretty much every company an excuse to raise prices on everything (inflation), and they are not going to lower them.
> XFX has a 6750 XT on Amazon and Newegg for $300
Ahhh, seems normalcy is coming back, where last-generation hardware that is 3+ years old doesn't cost as much as it did when it was first released.
> It also misrepresents the performance in 5/6 of the titles they test with.
The most popular game of all time doesn't have ray tracing, DLSS, frame generation, or any of that.
> The most popular game of all time doesn't have ray tracing, DLSS, frame generation, or any of that.
Which game is that?
> Which game is that?
Tetris
i know, but it's technically true. no idea what they were referring to...
> i know, but it's technically true. no idea what they were referring to...
Only through a technicality of every original Game Boy shipping with it. But these days, probably Minecraft or Fortnite or something.
> Which game is that?
Minecraft, with an estimated 300 million copies sold.
> Only through a technicality of every original Game Boy shipping with it. But these days, probably Minecraft or Fortnite or something.
> Minecraft, with an estimated 300 million copies sold.
well, 520M vs. 300M. i don't think they sold that many Game Boys, but it does have a few decades of age on it.
> i know, but it's technically true. no idea what they were referring to...
They?
> well 520m vs 300m.
Hundreds of different versions of the game across basically every platform. Minecraft, if you discount the spinoffs, has 2 (or 3, if you count the retired original mobile version). But then we start getting into the weeds.
> It also misrepresents the performance in 5/6 of the titles they test with. New games DO support DLSS/FSR. And in something 5 years old that doesn't, we don't need performance graphs... the cards can all play those games.
There actually aren't a lot of sites/channels consistently looking at performance with upscaling turned on.
Especially in the lower tier cards, those features need to be included, because they benefit the most.
I like that HU is on the consumer's side, but they could improve their analysis methodology and test results. It's a lot of work for 2 people; they should grow their business and hire someone.
> There actually aren't a lot of sites/channels consistently looking at performance with upscaling turned on.
With good reason; the second you start comparing with different upscalers on, it becomes quite hard.
> Minecraft, with an estimated 300 million copies sold.
I guess you missed the fact that it indeed does have ray tracing.
> I guess you missed the fact that it indeed does have ray tracing.
The less-popular Windows version has Nvidia-only ray tracing, sure. The main Java version still doesn't. But don't let that stop you from copping an attitude.
https://www.minecraft.net/en-us/updates/ray-tracing
But, to hell with the facts, am I right?
> The less-popular Windows version has Nvidia-only ray tracing, sure. The main Java version still doesn't. But don't let that stop you from copping an attitude.
Wrong again. It has DXR. Honestly, did you even check the link or do any research before posting?
> Wrong again. It has DXR. Honestly, did you even check the link or do any research before posting?
It didn't at first. I don't play the Windows version, so I didn't know they added AMD support. Meanwhile, you haven't shown that Java Minecraft has ray tracing.
> It didn't at first. I don't play the Windows version, so I didn't know they added AMD support. Meanwhile, you haven't shown that Java Minecraft has ray tracing.
Java Edition requires OptiFine and a ray tracing package on top of it, but it can be done. Bedrock Edition is the only officially supported ray-traced version, though.
You keep telling yourself how great you are for starting an Internet argument, though, bro.
> It didn't at first. I don't play the Windows version, so I didn't know they added AMD support. Meanwhile, you haven't shown that Java Minecraft has ray tracing.
> You keep telling yourself how great you are for starting an Internet argument, though, bro.
I'm not telling myself anything. I'm simply correcting the incorrect information you posted. Honestly, I've never even played Minecraft; it's just not something I'm interested in. But I also fact-check myself before posting something so contentious. Perhaps you should start doing the same.
> With good reason; the second you start comparing with different upscalers on, it becomes quite hard.
Yeah, it will be difficult and even a little subjective. But the newer features like frame generation and super resolution really do add a lot in terms of performance. Raster is only a piece now, not the be-all end-all. And in some games, I've seen DLSS improve image quality; that's why it's wrong to keep ignoring it. DLSS can upscale an image and produce a better output than native. In most games, at a minimum (from my experience), the highest-quality DLSS setting matches native in picture quality while boosting performance significantly.
You need to match the upscaler quality: say, XeSS Quality is the same as DLSS Ultra Quality, but only with DLSS 3.7 and the latest XeSS. For a title still on 3.4, it's more like XeSS Quality = DLSS Balanced, and so on.
Benchmarkers much prefer comparing the exact same visuals if they can. They could compare how fast FSR runs on each model, but because it is so similar there isn't much need or demand for it. XeSS can swing widely, I think, when it runs on Intel versus the rest, but there's so little Intel interest...
Testing GPUs has gotten quite complicated. Like at certain points in the past, when games could look significantly different with or without Glide, or with 16-bit vs. 24-bit color, or with different anti-aliasing tech/capabilities, etc., it can't be summed up by numbers anymore.
It's been more than a year, maybe, since any hard-to-run title released without extensive upscaling options, and almost all the new big games can't run at 4K ultra settings natively. That makes upscaling something you always try, to see if it looks better than the lower settings otherwise needed to run them, for those with that common resolution on their TV.
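The preset-matching point above can be sketched numerically: compare presets by internal render scale rather than by preset name. A minimal sketch, assuming the commonly published per-axis render-scale ratios for each preset (exact values can shift between SDK versions, so treat them as illustrative):

```python
# Illustrative per-axis render-scale ratios (internal / output resolution).
# These are the commonly published values; actual ratios may vary by SDK version.
PRESETS = {
    "DLSS": {"Quality": 0.667, "Balanced": 0.580, "Performance": 0.500},
    "FSR2": {"Quality": 0.667, "Balanced": 0.588, "Performance": 0.500},
    "XeSS": {"Ultra Quality": 0.769, "Quality": 0.667, "Balanced": 0.588,
             "Performance": 0.500},
}

def render_resolution(output_w, output_h, upscaler, preset):
    """Internal render resolution for a given output resolution and preset."""
    s = PRESETS[upscaler][preset]
    return round(output_w * s), round(output_h * s)

def closest_match(upscaler_a, preset_a, upscaler_b):
    """Find the preset of upscaler_b whose render scale is closest to the
    given preset of upscaler_a, i.e. the fairest comparison point."""
    target = PRESETS[upscaler_a][preset_a]
    return min(PRESETS[upscaler_b],
               key=lambda p: abs(PRESETS[upscaler_b][p] - target))

# 4K output with DLSS Quality renders internally at roughly 1440p:
print(render_resolution(3840, 2160, "DLSS", "Quality"))
# Fairest XeSS preset to pit against DLSS Balanced:
print(closest_match("DLSS", "Balanced", "XeSS"))
```

This is the comparison benchmarkers actually care about: two presets are only "apples to apples" when their internal render resolutions line up, not when their marketing names do.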
> Yeah, it will be difficult and even a little subjective. But the newer features like frame generation and super resolution really do add a lot in terms of performance. ...
With studio cuts and team clawbacks, expect DLSS and RT to only become more necessary. Why spend time and resources manually optimizing textures and lighting when you can leave them raw and let technologies like Nanite, Lumen, ray tracing, DLSS, and Frame Generation get 99% of the results with 1% of the effort?
Find that old thread of mine where I posted screenshots of Jedi Survivor with no upscaling, FSR, and then DLSS from the mod. The DLSS was the best picture, even at 50% input resolution, which was completely bonkers. It's an amazing piece of technology. Yes, 50% input resolution: (x/2, y/2) upscaled to (x, y) looked better than native (x, y) or FSR at any quality level, and performed insanely better too. I put some of that blame on the game devs, and the rest on AMD for making it FSR-exclusive at launch, forcing an inferior upscaler on everyone. Everyone arguing that raster is all that matters is an AMD owner. You don't really know any better if you have never tried DLSS (2.0 or higher) on an Nvidia card.
It completely changes the placement of cards in various tiers.
As far as games supporting it, even old games are getting support if they are decently popular enough.
"I don't like raytracing!" well you can still use DLSS and Nvidia Reflex and get those benefits I'm describing without turning RT on, if you really want.
Why do you think the Nvidia cards are so popular? Because the technology works, and works well.
> well 520m vs 300m. i dont think they sold that many gameboys, but it does have a few decades of age on it.
Ok, did some digging: they apparently sold 425M copies over JUST mobile platforms, so I'm guessing 99-cent downloads of a game that resembles the original... I mean, sure, I guess. But yeah, "only" 30M from Game Boy sales.
but i guess this is a little OT... to the point: not everyone uses the extra RT/FG features (i don't), so that post is relevant to some.
For all the comments of "when I turn on ray tracing I barely notice a difference," that was what they wanted. The non-ray-traced lighting took teams and tens of thousands of man-hours; the ray-traced version was a few button clicks and a few weeks of automated rendering engines running calculations with barely any oversight.
Development costs in the gaming industry have exploded, and that wasn't sustainable. The massive industry layoffs and consolidation efforts among studios are proof of this.
These technologies are the solution; pair them with some AI to augment QA departments and you're seeing virtually the same output with half the manpower on a third of the budget.
Like it or not that’s the reality of the industry right now.
> I think there needs to be one big asterisk here. With most games being multiplatform, and consoles using lower-end AMD GPUs, ray tracing will probably not become standard for a while. Though I haven't been watching console performance much, as the last home console I owned was a PS2, so maybe more games are using ray tracing with "good enough" upscaler results than I think.
I'm not so sure. The new PS5 coming at the end of the year will support it, Microsoft will be forced to update the Xbox platform just to remain in the game, and the upcoming Switch will have ray tracing and all the DLSS goodies.