AMD RDNA 2 gets ray tracing

Looks like AMD fine wine, lol. The 5700 XT beats a 2070 Super at 1080p, ties it at 1440p, and isn't even significantly behind a 2080, and that's a card you could have gotten for $349 at Dell not too long ago. Looking at FidelityFX in RE3, it gives some nice improvements without the usual artifacts of regular texture sharpening. DLSS performance mode on this title looks like a non-starter and degrades the image in their samples, while quality mode definitely enhances IQ while improving performance. Maybe AMD will have a FidelityFX 2 for sharpening and upscaling. In any case, FidelityFX is better IQ-wise than DLSS 1.0, maybe even the 1.5-style version used in the original Control. Plus FidelityFX is not restricted to one brand of current-generation GPUs. Even 1080 Tis, 1080s, etc. can get the benefit of that tech, which is very cool.
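For the curious, here is a minimal numpy sketch of the "contrast-adaptive" idea behind CAS-style sharpening. This is an illustration under my own simplifications (function name, constants, grayscale-only), not AMD's actual shader, which works per channel in HLSL and can also upscale:

```python
import numpy as np

def cas_like_sharpen(img, strength=0.5):
    """Simplified contrast-adaptive sharpening for a grayscale image in [0, 1].

    Illustrative only: sharpen strongly where the 3x3 neighborhood has
    contrast headroom, and back off where it is already near clipping.
    That is what avoids the halos of naive unsharp masking.
    """
    h, w = img.shape
    pad = np.pad(img, 1, mode="edge")
    # min/max over the 3x3 neighborhood of every pixel
    shifts = np.stack([pad[dy:dy + h, dx:dx + w]
                       for dy in range(3) for dx in range(3)])
    mn, mx = shifts.min(axis=0), shifts.max(axis=0)
    # headroom: distance of the neighborhood from clipping at 0 or 1
    headroom = np.clip(np.minimum(mn, 1.0 - mx), 0.0, None)
    # adaptive weight: strongest where there is room to sharpen safely
    weight = strength * np.sqrt(headroom / np.maximum(mx, 1e-6))
    # unsharp-mask style blend against the 4-neighbor (cross) average
    cross = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
             pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    return np.clip(img + weight * (img - cross), 0.0, 1.0)
```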

Control used DLSS 1.9:


But trying to claim image-quality parity between DLSS 2.0 and FFX would be false...why am I not surprised you post like this?
 
Control used DLSS 1.9:


But trying to claim image-quality parity between DLSS 2.0 and FFX would be false...why am I not surprised you post like this?

They really only look at grass in this video. Other reviews show areas where DLSS falls flat. As with all upsampling, mileage varies by scene. It is not universal to say that DLSS will always be better; that's just not true. I posted a review from guru that looked at a different area and showed some serious artifacting problems with DLSS. Again, it depends on the scene.
 
Control used DLSS 1.9:


But trying to claim image-quality parity between DLSS 2.0 and FFX would be false...why am I not surprised you post like this?

You need to learn how to read; as usual your smear job fails. IQ-wise, DLSS 1.0 -> SUCKED! Blurry, with other artifacting issues. FidelityFX CAS does not have that problem: it looks better, works on more types of GPUs, and is easier to implement.

I too think DLSS 2.0 is better; I knew that way before you even understood it, and shouted it from the rooftops in the DLSS thread before anyone else, so to speak.

I would expect an update from AMD with a reconstruction-type scaler, and also for Radeon Boost, which is very game-restricted (hey, just like DLSS). Unless a game implements DLSS, it is a useless feature for that game. Most if not all next-generation console ports will likely have at least FidelityFX or something better.
 
You need to learn how to read; as usual your smear job fails. IQ-wise, DLSS 1.0 -> SUCKED! Blurry, with other artifacting issues. FidelityFX CAS does not have that problem: it looks better, works on more types of GPUs, and is easier to implement.

I too think DLSS 2.0 is better; I knew that way before you even understood it, and shouted it from the rooftops in the DLSS thread before anyone else, so to speak.

I would expect an update from AMD with a reconstruction-type scaler, and also for Radeon Boost, which is very game-restricted (hey, just like DLSS). Unless a game implements DLSS, it is a useless feature for that game. Most if not all next-generation console ports will likely have at least FidelityFX or something better.

We don't live in the past.
This is the DLSS 2.0 world.

But it does tell a lot when you have to compare present-day FFX to past-day DLSS in order to pretend to have a point.

And that you keep your fingers crossed for low adoption of DLSS 2.0 is really telling.
(But you have to ignore a lot of facts for that hope.)
FFX < DLSS 2.0...and that is unlikely to change, as bilinear+sharpen is subpar to the way DLSS works.

But it will be fun to watch people like you use more and more fallacious "arguments" as the gap between the two widens, in a futile attempt to make the case for "parity".

Only question is how deep down the rabbit hole you will fall.
Your "serial raytracing" failure will be hard to match, but it looks like you are heading for a repeat 😉
 
They really only look at grass in this video. Other reviews show areas where DLSS falls flat. As with all upsampling, mileage varies by scene. It is not universal to say that DLSS will always be better; that's just not true. I posted a review from guru that looked at a different area and showed some serious artifacting problems with DLSS. Again, it depends on the scene.

Link to said reviews?
Fluff is easy...and reviews doing pictures instead of video like DF in 2020 is hilarious.

But please, show me?
 
We don't live in the past.
This is the DLSS 2.0 world.

I've ignored nVidia for the past 2 years, still content with my 1060 3GB, though it's definitely time for an upgrade. RTX is nice, but not so noticeable that it's a reason to upgrade. I didn't care much for DLSS 1.0 either; it made IQ blurrier. I've been quite open to getting either Ampere or RDNA 2, whichever architecture gives me better performance around $200-250.

Now that we've seen what DLSS 2.0 can do? I'm not so sure about AMD anymore. Seeing Digital Foundry show that Control @ 540p upscaled to 1080p looks better than native 1080p was very impressive. I hope AMD releases similar upscaling technology so there's more competition. Right now though, anyone who denies that Nvidia has a legitimate advantage with DLSS 2.0 seems like a pretty obvious AMD fanboy to me. The ability to play at half-size rendering while barely noticing an IQ difference in the majority of cases could spell the end of native rendering. I don't care that there's actually 2560x1440 pixels rendered on my screen, but I do care if it looks as if there were while I get way higher performance. Unless AMD has something good to announce in this respect, seems like it's Ampere for me...
 
I've ignored nVidia for the past 2 years, still content with my 1060 3GB, though it's definitely time for an upgrade. RTX is nice, but not so noticeable that it's a reason to upgrade. I didn't care much for DLSS 1.0 either; it made IQ blurrier. I've been quite open to getting either Ampere or RDNA 2, whichever architecture gives me better performance around $200-250.

Now that we've seen what DLSS 2.0 can do? I'm not so sure about AMD anymore. Seeing Digital Foundry show that Control @ 540p upscaled to 1080p looks better than native 1080p was very impressive. I hope AMD releases similar upscaling technology so there's more competition. Right now though, anyone who denies that Nvidia has a legitimate advantage with DLSS 2.0 seems like a pretty obvious AMD fanboy to me. The ability to play at half-size rendering while barely noticing an IQ difference in the majority of cases could spell the end of native rendering. I don't care that there's actually 2560x1440 pixels rendered on my screen, but I do care if it looks as if there were while I get way higher performance. Unless AMD has something good to announce in this respect, seems like it's Ampere for me...

Prepare for a lot of FUD about "FFX = DLSS" in the near future...just saying 😉

But yes, DLSS 2.0 is a game changer.

And I suspect Nvidia is working hard on DLSS 3.0...they seldom rest on their laurels.
 
Link to said reviews?
Fluff is easy...and reviews doing pictures instead of video like DF in 2020 is hilarious.

But please, show me?

Since you asked: there is a video showing DLSS artifacts in this review; whether they are noticeable or annoying to an individual depends on many things.
https://www.dsogaming.com/screensho...ative-4k-vs-fidelityfx-upscaling-vs-dlss-2-0/
"Most of the times, these artifacts are not that easy to spot." So while it does have some artifacting, it seems not that big of a deal overall. Their conclusion is still that DLSS is a win overall, how much is obviously debatable and depends a lot on the person. What looks good to you on your monitor may not be the same that it looks to someone else on their monitor/tv and something that bothers one person can be completely acceptable to another person. It's seems silly to keep arguing about something that can vary from person to person. Like the shimmering that can take place with Fidelity FX.... some people notice it, others don't. To one it could be annoying and to others they may not notice it and it's perfectly acceptable. Some people things TAA looks like an oil painting, others think it's the best AA available. It's all subjective. All you're doing is trying to say you perceptions are more correct than everyone elses. I mean, maybe you're the most superior human on earth in all things image quality... but that still doesn't change how other people perceive things.
 
Prepare for a lot of FUD about "FFX = DLSS" in the near future...just saying 😉

But yes, DLSS 2.0 is a game changer.

And I suspect Nvidia is working hard on DLSS 3.0...they seldom rest on their laurels.
Yeah, there was a rumor not too long ago that nVidia's working on DLSS 3.0, which will work in any game that uses TAA and won't need to be specifically implemented like 2.0 does. The performance benefit likely won't be as high, though.
 
Since you asked: there is a video showing DLSS artifacts in this review; whether they are noticeable or annoying to an individual depends on many things.
https://www.dsogaming.com/screensho...ative-4k-vs-fidelityfx-upscaling-vs-dlss-2-0/
"Most of the times, these artifacts are not that easy to spot." So while it does have some artifacting, it seems not that big of a deal overall. Their conclusion is still that DLSS is a win overall, how much is obviously debatable and depends a lot on the person. What looks good to you on your monitor may not be the same that it looks to someone else on their monitor/tv and something that bothers one person can be completely acceptable to another person. It's seems silly to keep arguing about something that can vary from person to person. Like the shimmering that can take place with Fidelity FX.... some people notice it, others don't. To one it could be annoying and to others they may not notice it and it's perfectly acceptable. Some people things TAA looks like an oil painting, others think it's the best AA available. It's all subjective. All you're doing is trying to say you perceptions are more correct than everyone elses. I mean, maybe you're the most superior human on earth in all things image quality... but that still doesn't change how other people perceive things.
Pretty interesting conclusion from that review:

All in all, DLSS 2.0 is slightly better than both Native 4K and FidelityFX Upscaling. Performance-wise, both FidelityFX and DLSS 2.0 perform similarly. FidelityFX Upscaling comes with a sharpening slider via which it can provide a sharper image than both Native 4K and DLSS 2.0. However, there is more aliasing with FidelityFX Upscaling than in both Native 4K and DLSS 2.0. On the other hand, DLSS 2.0 can eliminate more jaggies, but also introduces some visual artifacts.

FidelityFX is usable on any DX12 GPU/video card, even Intel/AMD integrated graphics, while DLSS 2.0 is not. I would not be too surprised if developers just prefer FidelityFX as the option versus using the more limited DLSS 2.0. I would say most if not all next-gen console games will at least use FidelityFX plus any additional improvements for reconstruction; how many will then do the extra work of fitting in DLSS 2.0, with only slightly better results, when porting to PC remains to be seen.
 
most if not all next-gen console games will at least use FidelityFX plus any additional improvements for reconstruction; how many will then do the extra work of fitting in DLSS 2.0, with only slightly better results, when porting to PC remains to be seen.

I want to say you're probably right, but too often have I heard "AMD will dominate now because consoles are based on their architecture". And yet, since the PS4 era began, no such advantage has been visible in PC land. It sounds like a good argument, but so far the facts haven't validated the premise, probably because consoles use lots of custom coding that simply doesn't work on PC. Case in point: Death Stranding, a PS4 exclusive, had to be ported to DX12 for the PC release, and performs better with DLSS than with FFX CAS.
 
Pretty interesting conclusion from that review:



FidelityFX is usable on any DX12 GPU/video card, even Intel/AMD integrated graphics, while DLSS 2.0 is not. I would not be too surprised if developers just prefer FidelityFX as the option versus using the more limited DLSS 2.0. I would say most if not all next-gen console games will at least use FidelityFX plus any additional improvements for reconstruction; how many will then do the extra work of fitting in DLSS 2.0, with only slightly better results, when porting to PC remains to be seen.
Yeah, most reviews came to the conclusion that DLSS is better, but the amount of "better" ranges from a tiny bit to "OMG I love Nvidia and they can have my firstborn". In reality it's a good step forward, and I look forward to more iterations, but it's not like you're missing out on life if you don't have this feature right now. They perform similarly, and the outcome favors DLSS to some extent (in the eye of the beholder). You'd think from some of the comments here that DLSS makes a 720p render look better than 8K native. It's not that good, but it is better than the other options in this game in general.
 
You'd think from some of the comments here that DLSS makes a 720p render look better than 8K native. It's not that good, but it is better than the other options in this game in general.

I'll assume you're referring to me there, but I'd just point you to Digital Foundry's review of DLSS 2.0 in Control. 540p looked better than native 1080p, and a similar case was made for 720p vs native 1440p and 1080p vs native 4K, with the best quality being 1440p to native 4K. That means 50% resolution is giving similar or better results than native resolution. Check DF's coverage of Death Stranding and you'll see they find FFX CAS gives comparable performance at 75% resolution. That's a hefty 25% tax on your GPU that you can offset with DLSS 2.0; nobody is talking about exaggerated or rumored gains here. It's measured by DF very clearly and publicly available on YouTube.

If I can get an extra 25% of GPU performance on one brand vs the other, I'll get the one with better performance without thinking twice. If AMD equals that performance, I'd happily choose AMD. I have no beef with either of them; I buy whoever gives me better stuff at a given price point. Couldn't care less about "brand loyalty" (the stupidest, yet most successful, way to regularly siphon money out of customers' bank accounts).
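To put rough numbers behind those percentages (a back-of-the-envelope sketch, not DF's methodology): render scale is per axis, so the shaded pixel count falls with its square, which is why "50% resolution" buys so much. Real frame-time savings are smaller, since not all GPU work scales with pixel count:

```python
# Pixel-count arithmetic behind the render-scale figures above.
# 540p -> 1080p halves each axis, so only a quarter of the native
# pixels get shaded; a 75% render scale still shades ~56% of them.
def shaded_fraction(per_axis_scale: float) -> float:
    return per_axis_scale ** 2

for scale in (0.50, 0.75):
    print(f"{scale:.0%} render scale -> "
          f"{shaded_fraction(scale):.0%} of native pixels shaded")
```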
 
What's stopping anyone from using sharpening with DLSS 2.0? It's not like Freestyle becomes disabled. Given the choice I'd always pick DLSS 2.0, and while FFX may be open, like other AMD software initiatives it will likely fall flat on its face. I hope Nvidia improves DLSS to the point where it can run with any game. Right now DLSS is useless for me, as it's only used in single-player games and not multiplayer ones like Warzone and Apex Legends.
 
Yeah, there was a rumor not too long ago that nVidia's working on DLSS 3.0, which will work in any game that uses TAA and won't need to be specifically implemented like 2.0 does. The performance benefit likely won't be as high, though.

I am eleventy thousand percent sure NV will push DLSS and related upgrades hard.
Specifics will evolve, but I think it is incredibly safe to assume this field is going to be aggressively advanced.

And it makes sense, right? Anyone can, at least in theory, make a Giant Wall Of Shaders. Brute force is great, and it gets you in the game. But what if you had a wall of shaders AND advanced techniques that let you get more out of them? Even setting aside ray tracing for the moment.

That's how you will be able to charge more for your Wall Of Shaders than the other guy.
 
The 50th AE 5700 XT disagrees with you... they did do it. And they should do it more; as a collector those are the cards I'm interested in, a bit like the old 'Phantom Editions' (Platinum Edition?), e.g. the X800 XT PE. Those were vapourware back then though, I tried to get one lol.

They could have done a Vega PE version too if they wanted. Mine (like quite a few other late-production V64s) undervolts significantly better than most earlier cards: 1.6 GHz (max rated boost) at 180-190 W with 1100 MHz HBM2 in most titles, incl. Battlefront II, NMS, etc., is a big difference from 300 W... and it also makes the stock blower pretty damn quiet lol.

Kind of the reason I jumped on the reference RX 5700: the design was different, but the core runs the XT BIOS and comes within 10%.
 
I am eleventy thousand percent sure NV will push DLSS and related upgrades hard.
Specifics will evolve, but I think it is incredibly safe to assume this field is going to be aggressively advanced.

And it makes sense, right? Anyone can, at least in theory, make a Giant Wall Of Shaders. Brute force is great, and it gets you in the game. But what if you had a wall of shaders AND advanced techniques that let you get more out of them? Even setting aside ray tracing for the moment.

That's how you will be able to charge more for your Wall Of Shaders than the other guy.
Yeah the other part of that rumor is that nVidia might start enabling DLSS by default and try to get games benchmarked with it on to give them a performance advantage vs AMD. We'll see how good it gets.
 
I'll assume you're referring to me there, but I'd just point you to Digital Foundry's review of DLSS 2.0 in Control. 540p looked better than native 1080p, and a similar case was made for 720p vs native 1440p and 1080p vs native 4K, with the best quality being 1440p to native 4K. That means 50% resolution is giving similar or better results than native resolution. Check DF's coverage of Death Stranding and you'll see they find FFX CAS gives comparable performance at 75% resolution. That's a hefty 25% tax on your GPU that you can offset with DLSS 2.0; nobody is talking about exaggerated or rumored gains here. It's measured by DF very clearly and publicly available on YouTube.

If I can get an extra 25% of GPU performance on one brand vs the other, I'll get the one with better performance without thinking twice. If AMD equals that performance, I'd happily choose AMD. I have no beef with either of them; I buy whoever gives me better stuff at a given price point. Couldn't care less about "brand loyalty" (the stupidest, yet most successful, way to regularly siphon money out of customers' bank accounts).
I wasn't talking about anyone in particular and don't even remember your specific comments, so maybe just a guilty conscience? Like I said, it's cool and does a pretty good job, but you all keep referencing a single person who had an opinion. I pointed out others who had different opinions (and by different I mean they thought DLSS was good, but not perfect and not significantly different). I even posted a link to a video showing some of the quality issues that Digital Foundry either didn't see or failed to mention/show. They are minor, but there. If you have an Nvidia card, run DLSS; if you don't, run FFX and get about the same speed boost with slightly less quality. How much less, as I've been saying, is in the eye of the beholder, as can be seen by reading/watching more than a single source.
 
Yeah, most reviews came to the conclusion that DLSS is better, but the amount of "better" ranges from a tiny bit to "OMG I love Nvidia and they can have my firstborn". In reality it's a good step forward, and I look forward to more iterations, but it's not like you're missing out on life if you don't have this feature right now. They perform similarly, and the outcome favors DLSS to some extent (in the eye of the beholder). You'd think from some of the comments here that DLSS makes a 720p render look better than 8K native. It's not that good, but it is better than the other options in this game in general.
Being limited to a single game for comparison, while cool, could also come down to the implementation of DLSS or FidelityFX. If we had several games for analysis, that would make for a much sounder conclusion, in my view. It will be interesting as the upcoming releases occur. From what I can see, Nvidia has processing capability for this sort of thing way beyond what AMD will put out, thanks to the tensor cores. While processing ability does not always equate to game performance, it can be a good indicator.
 
I wasn't talking about anyone in particular and don't even remember your specific comments, so maybe just a guilty conscience? Like I said, it's cool and does a pretty good job, but you all keep referencing a single person who had an opinion. I pointed out others who had different opinions (and by different I mean they thought DLSS was good, but not perfect and not significantly different). I even posted a link to a video showing some of the quality issues that Digital Foundry either didn't see or failed to mention/show. They are minor, but there. If you have an Nvidia card, run DLSS; if you don't, run FFX and get about the same speed boost with slightly less quality. How much less, as I've been saying, is in the eye of the beholder, as can be seen by reading/watching more than a single source.

No worries, I had just mentioned specific cases, so it seemed like you were referring to my post. In the end it's all about using whatever performance boosts your hardware can deliver, you're right. I'm looking forward to seeing what both brands reveal soon, although the $200 market will probably take a few more months. As respected as DF is for these analyses, I'd definitely want to see more games (that I care about) and how they benefit from the new architectures. It's a waiting game now!
 
No worries, I had just mentioned specific cases, so it seemed like you were referring to my post. In the end it's all about using whatever performance boosts your hardware can deliver, you're right. I'm looking forward to seeing what both brands reveal soon, although the $200 market will probably take a few more months. As respected as DF is for these analyses, I'd definitely want to see more games (that I care about) and how they benefit from the new architectures. It's a waiting game now!
Yeah, it's hard to come to a conclusion from one game with limited samples, but so far it seems like a step in the right direction. I feel that if AMD doesn't have something better this release it's probably going to be okay, but they had better be working on something for the following release or they are going to struggle. I think image quality is going to come back to the forefront; recently it hasn't been a focus, because the same code on a shader produced the same results, so there weren't as many IQ comparisons going on as in the past. With these new techniques it will become very important again.
 
I feel that if AMD doesn't have something better this release it's probably going to be okay, but they had better be working on something for the following release or they are going to struggle. I think image quality is going to come back to the forefront

I’d be surprised if AMD doesn’t have anything new on the reconstruction front. It’d be extremely useful for consoles to achieve greater longevity this next gen.

That said, I agree that IQ seems to be at the forefront again. It's probably taken so long because of the market's obsession with resolution. I'll be really happy if we finally move away from brute force and forget about native resolutions. If reconstruction can manage comparable or better results, it's all upside. The way AI seems to be evolving, this is a likely future for games, and I'm all for it! Our CPUs already do all sorts of tricks, and you could say we all take clever shortcuts in everything: cooking, baking, building things... even our monitors use dithering to achieve a convincing appearance of higher bit depth. Why shouldn't our GPUs be as crafty?
 
I want to say you're probably right, but too often have I heard "AMD will dominate now because consoles are based on their architecture". And yet, since the PS4 era began, no such advantage has been visible in PC land. It sounds like a good argument, but so far the facts haven't validated the premise, probably because consoles use lots of custom coding that simply doesn't work on PC. Case in point: Death Stranding, a PS4 exclusive, had to be ported to DX12 for the PC release, and performs better with DLSS than with FFX CAS.
It actually has: 8 CPU cores being the sweet spot for gaming now isn't happenstance.
 
It actually has: 8 CPU cores being the sweet spot for gaming now isn't happenstance.

Of course, but that hasn't solely benefited AMD in the market. Obviously Ryzen are great CPUs, but as soon as Intel went up in cores - begrudgingly - they have also kept reaping the rewards. In GPUs, AMD has barely seen any performance advantage just because consoles are based on Radeon tech - if at all.
 
I’d be surprised if AMD doesn’t have anything new on the reconstruction front. It’d be extremely useful for consoles to achieve greater longevity this next gen.
It would certainly make sense, but from what I've read the new console APUs don't have any sort of AI hardware so it would be rather difficult to utilize a machine learning system like DLSS. Maybe they're just planning on relying on FidelityFX to improve current upscaling? That said, I believe Microsoft at least has been working on some sort of new upscaling system so we might hear more about that from them in the near future.
 
It's part of DirectX (DirectML), though I haven't seen much written or announced about it.

DirectML here is more akin to CUDA than to DLSS.
DirectML will run any ONNX "DLSS" algorithm, so it's kind of a false comparison ;)

So we have this:

CUDA -> DLSS (Warp-Level Primitives run on Tensor Cores via CUDA)
DirectML -> ????
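To make that concrete, here is a minimal Python sketch of what "DirectML runs any ONNX algorithm" looks like via onnxruntime's DirectML backend. The model file and its shapes are placeholders of my own; DLSS itself is proprietary and is not distributed as an ONNX model:

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime-directml

# "upscaler.onnx" is a hypothetical super-resolution network, standing in
# for whatever reconstruction model a vendor or developer supplies.
session = ort.InferenceSession(
    "upscaler.onnx",
    providers=["DmlExecutionProvider"],  # DirectML: any DX12-capable GPU
)

low_res = np.random.rand(1, 3, 540, 960).astype(np.float32)  # NCHW frame
outputs = session.run(None, {session.get_inputs()[0].name: low_res})
print(outputs[0].shape)  # e.g. (1, 3, 1080, 1920) for a 2x model
```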
 
DirectML here is more akin to CUDA than to DLSS.
DirectML will run any ONNX "DLSS" algorithm, so it's kind of a false comparison ;)

So we have this:

CUDA -> DLSS (Warp-Level Primitives run on Tensor Cores via CUDA)
DirectML -> ????
Right, sorry. What I meant to say, as you correctly pointed out, is that DirectML is the way Microsoft will implement some so-far-unnamed DLSS-like technology, from the looks of it anyway. I'd guess we'll hear about it sometime in the next couple of months when the new GPU architectures are revealed.
 
I'll assume you're referring to me there, but I'd just point you to Digital Foundry's review of DLSS 2.0 in Control. 540p looked better than native 1080p, and a similar case was made for 720p vs native 1440p and 1080p vs native 4K, with the best quality being 1440p to native 4K. That means 50% resolution is giving similar or better results than native resolution. Check DF's coverage of Death Stranding and you'll see they find FFX CAS gives comparable performance at 75% resolution. That's a hefty 25% tax on your GPU that you can offset with DLSS 2.0; nobody is talking about exaggerated or rumored gains here. It's measured by DF very clearly and publicly available on YouTube.

If I can get an extra 25% of GPU performance on one brand vs the other, I'll get the one with better performance without thinking twice. If AMD equals that performance, I'd happily choose AMD. I have no beef with either of them; I buy whoever gives me better stuff at a given price point. Couldn't care less about "brand loyalty" (the stupidest, yet most successful, way to regularly siphon money out of customers' bank accounts).
DLSS doesn't work equally well in all games in its first revision, and I don't see that changing for the second. The third maybe, with a more AMD-type solution. Open-world and non-linear games are not a free ride for this tech.
 
DLSS doesn't work equally well in all games in its first revision, and I don't see that changing for the second. The third maybe, with a more AMD-type solution. Open-world and non-linear games are not a free ride for this tech.
True. Then again, all it takes is for Nvidia to put an emphasis on the big-ticket titles most gamers are bound to be interested in, and if performance is better on those = $$$.
To be clear: I want AMD to have a response to this. I just don't see it yet, and FFX CAS is not good enough vs DLSS 2.0 (for now).
 
True. Then again, all it takes is for Nvidia to put an emphasis on the big-ticket titles most gamers are bound to be interested in, and if performance is better on those = $$$.
To be clear: I want AMD to have a response to this. I just don't see it yet, and FFX CAS is not good enough vs DLSS 2.0 (for now).
Fair approach, and it fits what they've done in the past. I'd just prefer a more universal solution (I play lots of different games these days), even if it might be a little inferior in supported titles.
This time around I'm open to running Nvidia, then selling when AMD has something ready. Depends on timing.
 
I'd just prefer a more universal solution (I play lots of different games these days), even if it might be a little inferior in supported titles.

I get it, although truly universal (GPU architecture agnostic) is surely harder to do. Sometimes I wish AMD would stop being "nice" in this way and just do proprietary stuff like Nvidia to even the waters. Anyway, in case you didn't notice, DLSS 2.0 seems more universal than it was before, as it no longer requires game-specific training:

"One of the biggest changes is that DLSS 2.0 is a general network, and no longer game-specific. This means that whereas DLSS 1.0 had to be implemented by the developer and trained specific to each game, DLSS 2.0 will work without additional training across a much wider range of games. It will still need to be implemented by the developers (which is supposedly relatively simple to do), but as a general AI network, it is much less work to integrate into each individual title." - from Tom's Hardware.

So, not universal, but apparently much easier to enable on any RTX GPU. If it's as "easy" as GameWorks to include in games, we might see a sudden spike in DLSS support in the next 6-12 months. And given the gains we're seeing, that would probably sway my purchasing decision towards Ampere. No telling, of course, until AMD shows its cards.
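For a sense of where that remaining developer work goes, here is a hypothetical sketch of the per-frame data a DLSS-2.0-style temporal upscaler consumes; the field names are mine for illustration, not NVIDIA's NGX API. Wiring up accurate motion vectors and camera jitter is reportedly the bulk of the integration effort:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class UpscalerFrameInputs:
    """Illustrative inputs for a DLSS-2.0-style temporal reconstruction pass."""
    color: np.ndarray            # low-res, jittered color buffer
    depth: np.ndarray            # low-res depth buffer
    motion_vectors: np.ndarray   # per-pixel screen-space motion
    jitter: tuple                # sub-pixel camera offset this frame (x, y)
    reset_history: bool = False  # True on cuts/teleports to drop stale samples
```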
 
I posted these in the Nvidia DLSS thread, but they are probably better here.

The upper shot is native rendering with FidelityFX off, the lower 75% render scale with FidelityFX on.

[Screenshot pairs: RAGE2_1_OFF.png / RAGE2_1_ON.png, RAGE2_2_OFF.png / RAGE2_2_ON.png, RAGE2_3_OFF.png / RAGE2_3_ON.png]

As you can see, a decent performance boost (about 25%) with minimal quality loss.

FidelityFX works better than some people would lead you to believe.

I will do some tests with Death Stranding once I get to that game.
 
I posted these in the Nvidia DLSS thread, but they are probably better here.

The upper shot is native rendering with FidelityFX off, the lower 75% render scale with FidelityFX on.

[Attachments 263423-263428: the same six Rage 2 screenshots, FidelityFX off/on pairs]

As you can see, a decent performance boost (about 25%) with minimal quality loss.

FidelityFX works better than some people would lead you to believe.

I will do some tests with Death Stranding once I get to that game.

No need to post pictures when DF makes videos better than anything you post:

 
FidelityFX works better than some people would lead you to believe.

Thanks for the shots! Those do indeed look nice, although I can see the downgrade much more obviously than I see it in DLSS 2.0. And like Factum said, those Digital Foundry videos are simply amazing. I still cannot believe the amount of quality coming from DLSS in Death Stranding. I hope AMD can match it, I want more competition! Because it means I'll spend less money :)
 
Like I posted earlier...expect a lot of FUD trying to equate FFX with DLSS...so easy to predict 😏
 
Like I posted earlier...expect a lot of FUD trying to equate FFX with DLSS...so easy to predict 😏
I never said FFX was as good as or better than DLSS (actually, if you look at my old posts, I say a lot about how good DLSS is).

I posted those pictures so people can judge FidelityFX on its own merits.
 
I never said FFX was as good as or better than DLSS (actually, if you look at my old posts, I say a lot about how good DLSS is).

I posted those pictures so people can judge FidelityFX on its own merits.

Yes, but you said something good about AMD and that's not allowed; you can only agree DLSS is best. I, however, appreciate users posting their experiences.
 