AMD FidelityFX Super Resolution

Take this as research for someone wondering if their computer could run the game, as that's what FSR is really made for, IMO...

It's my X470 with a 3700X and RX 5700. I decided on no HD pack, 1080p Medium with FSR at Ultra Quality and SAM on; the FPS is displayed in the top left.

 
What so many who are firmly in team green seem to say is that "it's not as good as DLSS." I think most on team red, or team neutral, are willing to admit this. However, with how good it is comparatively, and how easily it seems to be implemented by devs willing to invest the time, it's completely worth it.

PS... before the fanbois rail me, look at the card I'm running in my sig.

I personally don't use DLSS or FSR if I can avoid it. Will I in the future? Probably. The games that need it the most are the ones where Nvidia doesn't even need DLSS to destroy AMD cards.
 
Take this as research for someone wondering if their computer could run the game, as that's what FSR is really made for, IMO...

It's my X470 with a 3700X and RX 5700. I decided on no HD pack, 1080p Medium with FSR at Ultra Quality and SAM on; the FPS is displayed in the top left.


I just started playing this game at max settings with the HD pack; it's some of the best graphics I have seen in a PC game. I don't really need FSR, the game runs so well I'm getting an average of 80-100 fps.
 
I personally don't use DLSS or FSR if I can avoid it. Will I in the future? Probably. The games that need it the most are the ones where Nvidia doesn't even need DLSS to destroy AMD cards.
You mean raytracing stuff? That'll change. NV jumped on raytracing first and got a bit ahead. It won't last; it never does. AMD will catch up. I do think NV and AMD are onto something with DLSS and FSR, though, and I hope they turn into a standard API feature going forward. The way I want it to work is that a game can just turn scaling on in DX or Vulkan or whatever and it just works. The video card driver can take care of how, if it provides an implementation; otherwise you get something like AMD FSR. In other words, a game just turns on scaling: if you have a recent NV card you get DLSS, if you don't you get FSR, and if AMD or NV come up with something better in the future, you get that.
 
Games and game engines had already started to implement their own upscalers. And if you look at Epic's Temporal Super Resolution... they are hot on the heels of Nvidia without using AI. The only step missing is someone trying to do it driver-side... AMD could do that tomorrow, but I've got to imagine they haven't yet because, in its current state, it would mess with UI elements, and that might not look so hot in some games.
 
Another comparison, and another example showing that FSR does absolutely nothing to reconstruct native resolution: Click!
 
Check these shots I made. You can add FSR to almost any Windows game on Linux. And the results are decent.

https://twitter.com/cybereality/status/1450561479816753157

Cyberpunk 2077 loses a little picture quality but gets a 30% performance boost, and it still looks good in motion.

Doom Eternal has almost no picture quality loss. I had to check the settings multiple times, but it was working and looked the same.
 
Another comparison, and another example showing that FSR does absolutely nothing to reconstruct native resolution: Click!
I would agree with that on an RTX 3080, as tested, but I guess a Radeon card and driver cannot run DLSS 2.0 to find out whether the game engine is biased toward Nvidia. Even having DLSS 2.0 in the game kind of answers who paid the most... like Far Cry 6 is all for AMD.
 
You can also run FSR on just about any GPU, AMD or Nvidia, even on a GTX 970. It even works on Intel integrated GPUs.
 
it never claimed to ;)
Yes, except they kinda did before they released it
[image: FidelityFX Super Resolution marketing slide]

Four smaller consecutive frames => one bigger frame
FSR doesn't work like that at all. FSR takes a single image as input and upscales it, and AMD no longer advertises it the way they did back when it was "in development".

My guess is that they had something better planned but realized their plan was too ambitious: they would not be able to make it both fast and universal. So they made it fast and universal by using something simpler, and apparently it works...
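
To make the difference concrete, here is a toy sketch in Python (all names are mine; Pillow's Lanczos stands in for FSR's EASU pass, and the "temporal" function is only a naive static-scene version of what the slide implied, not anything AMD shipped):

```python
import numpy as np
from PIL import Image

def spatial_upscale(frame: np.ndarray, scale: int = 2) -> np.ndarray:
    """What shipped: one frame in, one bigger frame out (Lanczos as an EASU stand-in)."""
    h, w = frame.shape[:2]
    img = Image.fromarray(frame)
    return np.asarray(img.resize((w * scale, h * scale), Image.LANCZOS))

def temporal_reconstruct(frames, offsets):
    """What the slide implied: four jittered quarter-res frames interleaved into
    one 2x frame. Toy version that assumes a perfectly static scene; a real
    temporal upscaler also needs motion vectors and history rejection."""
    h, w = frames[0].shape[:2]
    out = np.zeros((h * 2, w * 2) + frames[0].shape[2:], dtype=frames[0].dtype)
    for frame, (dy, dx) in zip(frames, offsets):
        out[dy::2, dx::2] = frame  # each jitter offset fills one sample grid
    return out

# Usage: temporal_reconstruct([f0, f1, f2, f3], [(0, 0), (0, 1), (1, 0), (1, 1)])
# yields a full 2x-resolution frame; spatial_upscale(f0) only ever sees f0.
```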

What's also interesting about Deathloop is that there are subtle differences in the implementation of NVIDIA's Deep Learning Super Sampling (DLSS), NVIDIA's Deep Learning Anti-Aliasing (DLAA), and AMD's FidelityFX Super Resolution (FSR), which we are keen to find out more about.
"subtle difference" my ass

If AMD actually made it use temporal data, it could be a genuinely good upscaling solution that rivals DLSS, even without any AI stuff.
 
What's also interesting about Deathloop is that there are subtle differences in the implementation of NVIDIA's Deep Learning Super Sampling (DLSS), NVIDIA's Deep Learning Anti-Aliasing (DLAA), and AMD's FidelityFX Super Resolution (FSR), which we are keen to find out more about.


"subtle difference" my ass

If AMD actually made it use temporal data, it could be a genuinely good upscaling solution that rivals DLSS, even without any AI stuff.
At the least, Deathloop's FSR doesn't appear to include its own sharpening pass. Similarly, the sharpening in its DLSS seems relatively relaxed compared to other games. This aligns with Nvidia's recent tool updates for developers, which literally give them a sharpening slider.
 
Yes, except they kinda did before they released it
[image: the same FidelityFX Super Resolution marketing slide]
Four smaller consecutive frames => one bigger frame
FSR doesn't work like that at all. FSR takes a single image as input and upscales it, and AMD no longer advertises it the way they did back when it was "in development".

My guess is that they had something better planned but realized their plan was too ambitious: they would not be able to make it both fast and universal. So they made it fast and universal by using something simpler, and apparently it works...
Well, I would say this single slide is ambiguous at best. However, I don't pretend to have a catalogue of all the ways FSR was marketed throughout time. So let me redo my post and say
it never claimed to ;)
AMD didn't claim reconstruction, AA, or even a temporal element around release time.
 
If AMD actually made it use temporal data, it could be a genuinely good upscaling solution that rivals DLSS, even without any AI stuff.
It is good, though. I'm getting great results in many games; even with third-party tools it works fine.
 
It is good, though. I'm getting great results in many games; even with third-party tools it works fine.
This is exactly what they were targeting. They wanted something that could scale to any hardware, old and new, without needing custom silicon or an expensive GPU to use it.
 
I just started playing this game at max settings with the HD pack; it's some of the best graphics I have seen in a PC game. I don't really need FSR, the game runs so well I'm getting an average of 80-100 fps.
I finally moved up to 1080p Ultra with HD on... I did get a nice boost in frame rate just from turning FSR on, which, as I understand it, is what it was made to do for most any video card... frame rate in the upper left corner.

 
This is exactly what they were targeting. They wanted something that could scale to any hardware, old and new, without needing custom silicon or an expensive GPU to use it.
Right. I'm sure they could have designed some fancy AI algorithm that only works with a special chip on their most expensive new GPU, but that is not helpful.

FSR works on just about anything. I even got it working on this cheapo $200 PC I got with Intel integrated graphics. I mean, it doesn't run well, but it works.
 
Right. I'm sure they could have designed some fancy AI algorithm that only works with a special chip on their most expensive new GPU, but that is not helpful.
DLSS 1.0 was a fancy AI upscaling algorithm.
[image: DLSS 1.0 screenshot]


It was quite unique in how the image looked with it.
DLSS 2.0 works nothing like DLSS 1.0, though.
AMD could use a similar approach to DLSS 2.0 without tensor cores, but I doubt it could be made universal enough to run on any DX12 GPU, because of technical limitations and/or performance.

FSR works on just about anything. I even got it working on this cheapo $200 PC I got with Intel integrated graphics. I mean, it doesn't run well, but it works.
From what I can tell, FSR is pretty much the same thing as the Radeon Image Sharpening that AMD tried to sell as a DLSS alternative back in 2019, just with nicer packaging, hyped-up marketing, and the effect itself put on steroids to the point it looks completely broken. At least in some games that overdid the settings.
[screenshot: Far Cry 6, native vs. FSR Ultra Quality]


Far Cry 6, the best-case scenario for FSR because it is "Ultra Quality", and despite that, the image on the right side looks like crap. Completely broken. Radeon Image Sharpening usually looked better in screenshots... but back then what it had to compete with was DLSS 1.0, which had its own flaws (or, more like, did its own thing). Today it is DLSS 2.0, which usually looks pretty fantastic, so they needed to move things a few notches up, to the point where people cannot tell if what they see are details or garbled artifacts. Someone who doesn't see all that well might get the impression that the FSR image is even "better than native", meaning "sharper than native". That doesn't, however, have anything to do with quality.

Other than these older 5700 XT screenshots, I am not entirely sure what current Radeon GPU upscaling looks like, but as far as I can tell, Nvidia Turing upscales images in a way that is completely usable and doesn't look that different from FSR used without sharpening. I do not see artifacts on transparent textures or oversharpening all over the place with GPU upscaling; then again, I am not using sharpening, which guarantees the image will look good.
Contrast-adaptive sharpening can be enabled on both AMD and Nvidia cards, so the only difference FSR makes is having native-resolution HUDs, at the expense of:
- worse performance: the GPU needs not only to create native-resolution frame buffers but also to do the upscaling in shaders (instead of in the fixed-function pipeline of the chip's output stage), and to copy a lot of data from place to place. In some situations, even using a bilinear-filtered "resolution scale" instead of just reducing the resolution can make performance significantly worse, and of course image quality in that case would be worse too
- lack of control over the sharpening parameters, if the game does not expose the controls

I am not entirely sure what causes these broken transparent textures in games with FSR, but I guess the developers who set the sharpening settings do not know the word "moderation".

There are of course differences between the upscaling methods from AMD, Nvidia, and FSR, differences between sharpening implementations, etc., and there might be slight differences in, e.g., ringing artifacts that make one better than the others. The issue here, however, is that FSR is not considerably better. It doesn't look better, and it often actually looks worse. This Far Cry 6 screenshot does not look good at all. The FSR-vs-native screenshots people make, describing them as "looks almost as good as native", look nothing like native and show a significant amount of sharpening and ringing artifacts. FSR itself might have slightly fewer ringing artifacts than Lanczos-like GPU upscaling, and contrast-adaptive sharpening might have fewer ringing artifacts than normal sharpening (note: Nvidia sharpening is contrast adaptive), but if it is all cranked up to the point where there are more ringing artifacts and the sharpening makes small details less visible (as excessive sharpening does), then it doesn't matter.

People do not see better quality from FSR compared to other readily available methods. They only see excessive sharpening and, not seeing much detail anyway, assume it looks better. That is the only explanation.

BTW, Intel is also developing its own upscaling solution, called XeSS.
So far, from screenshots and videos, it looks very promising. It definitely looks better than FSR; there is no excessive sharpening nonsense visible in any of their marketing materials.

[image: Intel XeSS slide]
 
Not sure why you're on a crusade. I admitted that DLSS 2.0 is better, but at a high cost to compatibility. FSR, on the other hand, works on nearly any GPU that can play a modern game.

I haven't tried Far Cry 6 yet, so I don't know if they have a bad implementation. The AMD quality settings are based on a fixed resolution scale, but there is also an additional sharpness setting (range from 1 to 5) which can cause artifacts if you set it wrong.

It is content dependent, though: some games look better sharp, some soft, some in the middle. This is set by the developer and is an artistic decision (maybe Ubisoft chose wrong). And on Linux you can adjust this value yourself, as shown below.
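
For reference, the Linux knob comes from Proton-GE / FSR-patched Wine builds, which expose it through environment variables. A typical Steam launch-options line looks like this (strength runs 0-5; as I understand it, lower is sharper and 2 is the default, and the game has to run at a below-native fullscreen resolution for the scaler to engage):

```
WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
```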

Not sure if you saw my screenshots I posted earlier. FSR looks very good. Look at the DOOM shots; it is almost impossible to tell the difference (even though it's rendering half the pixels). And Cyberpunk looks softer but still very nice.

https://twitter.com/cybereality/status/1450561479816753157
 
Not sure why you're on a crusade.
Crusades are good for gaining valuable insight into topics.
E.g., Europeans didn't know much about the people in the Holy Land before they went on the crusades to kill and plunder them :)

Also, it is because of posts like this:
Look at the DOOM shots; it is almost impossible to tell the difference (even though it's rendering half the pixels). And Cyberpunk looks softer but still very nice.
https://twitter.com/cybereality/status/1450561479816753157
It looks nothing like native and shows way too much sharpening, with things like ringing artifacts and rendering quirks (on the floor of the Doom shot) getting overblown. Don't you see that?

FSR on its own is a very good general-purpose upscaling algorithm and works especially well for real-time applications like games. It is Lanczos with its major issue, ringing artifacts, eliminated, which by itself can make certain edges look more detailed and less bloated. If you start adding bloat to the image, then what you get is a bloated image.
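
For the curious, the deringing idea is easy to sketch: upscale with Lanczos, then clamp every output pixel to the brightness range of its local source neighborhood so the overshoot halos get clipped. This is a rough Python illustration of the principle, not AMD's actual EASU shader (which is edge-adaptive on top of this):

```python
import numpy as np
from PIL import Image
from scipy.ndimage import maximum_filter, minimum_filter

def upscale_dering(src: np.ndarray, scale: int = 2) -> np.ndarray:
    """Lanczos upscale, then clamp each output pixel to its local source
    min/max so the overshoot halos (ringing) cannot survive."""
    h, w = src.shape[:2]
    big = (w * scale, h * scale)
    lanczos = np.asarray(
        Image.fromarray(src).resize(big, Image.LANCZOS), dtype=np.float32)
    # 3x3 local brightness range of the source, upscaled with nearest so each
    # output pixel knows the range of the source area it came from.
    size = (3, 3, 1) if src.ndim == 3 else (3, 3)
    lo = np.asarray(Image.fromarray(minimum_filter(src, size=size))
                    .resize(big, Image.NEAREST), dtype=np.float32)
    hi = np.asarray(Image.fromarray(maximum_filter(src, size=size))
                    .resize(big, Image.NEAREST), dtype=np.float32)
    return np.clip(lanczos, lo, hi).astype(np.uint8)
```

This is exactly why FSR edges look clean where plain Lanczos leaves halos: the overshoot simply has nowhere to go.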
 
It looks nothing like native and shows way too much sharpening, with things like ringing artifacts and rendering quirks (on the floor of the Doom shot) getting overblown. Don't you see that?
No, I don't see that. I think you're just trolling at this point because the quality looks great.
 
No, I don't see that. I think you're just trolling at this point because the quality looks great.
Here you go, my friend:
[annotated screenshot highlighting the artifacts]


Instead of nonsense like "it is almost impossible to tell the difference", if I wanted to prove why FSR is a great algorithm to use, I would rather do something like this.

Take a bunch of rubble:
[screenshot: native render]


Upscale it using the most basic bilinear filtering. Older GPUs used this for upscaling, and most games with a 'resolution scale' option also use it:
[screenshot: bilinear upscale]

Looks quite blurry to me

Then upscale it using Lanczos. This is how games look on an RTX card when using GPU scaling to change resolutions:
[screenshot: Lanczos upscale]

Definitely looks sharper. It makes all the limitations of the rendering visible, though, and has the overshoots typical of Lanczos.

Old school anyone?
[screenshot: integer scaling]

Razor sharpness: you cannot go sharper than this and cannot see more detail than this. Pure perfection!!!1 It is almost pixel art :)

AMD FSR
[screenshot: FSR]

This actually looks very good. No overshoot of any kind; it even has a kind of high-end look to it.

Add 0.5 RCAS
[screenshot: FSR + 0.5 RCAS]

I would not go further with sharpening than this, as it already starts to be too much and some edges have obvious ringing artifacts.

No matter how I look at it, the best options from this bunch are FSR without sharpening and integer scaling. The latter is sharp without any bloat and in practice always plays great; FSR, though, also doesn't seem to have any bloat.
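
This ladder is easy to reproduce on any crop with a few lines of Pillow. Note the stand-ins: "rubble.png" is whatever test image you have, and the unsharp mask only approximates RCAS (it is not AMD's kernel); FSR itself would need the deringing step sketched a few posts up.

```python
from PIL import Image, ImageFilter

def comparison_set(path: str, scale: int = 2) -> dict:
    """Generate the same ladder of upscales from any source crop."""
    src = Image.open(path).convert("RGB")
    big = (src.width * scale, src.height * scale)
    lanczos = src.resize(big, Image.LANCZOS)
    return {
        "bilinear": src.resize(big, Image.BILINEAR),   # the blurry baseline
        "lanczos": lanczos,                            # sharp, but with overshoot
        "integer": src.resize(big, Image.NEAREST),     # "old school", whole-number scales only
        "sharpened": lanczos.filter(                   # RCAS stand-in, NOT AMD's kernel
            ImageFilter.UnsharpMask(radius=1, percent=60, threshold=0)),
    }

for name, img in comparison_set("rubble.png").items():
    img.save(f"compare_{name}.png")
```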

FSR solves the issue of not having decent upscaling. It does not make it possible to play games at a lower resolution and have them look like native resolution. Even DLSS does not look like native resolution. DLSS is literally designed to look exactly like DLAA, which by design looks quite a bit different than TAA. So if a game were natively running DLAA, then DLSS would be an upscaling method making it possible to have the same* quality with higher performance. *) When not moving too much, and when the game was properly coded so things like texture LOD bias are properly adjusted for each DLSS mode.

----------------
So where, in what way, am I trolling you?
I did a proper analysis of FSR. You did not.
Taking a game, tweaking it until it looks OK to you, not checking how it compares to the native render, and making ludicrous claims like "it is almost impossible to tell the difference" sounds more like trolling than me checking what you posted and finding obvious issues with your claims.
 

You're not wrong. I agree with your analysis.

My point is that if you have to take a static image and blow it up to 8x in Photoshop just to spot the artifacts then the algorithm is working.

While you are actually playing the game, in motion at high frame rate, it looks great and any slight picture quality loss (which is there, sure) is not obvious.

I'd much rather take a 30% boost in performance, with still great playable quality, at the expense of some small detail that is not apparent unless I take a screenshot and analyze it in Photoshop.
 
You're not wrong. I agree with your analysis.

My point is that if you have to take a static image and blow it up to 8x in Photoshop just to spot the artifacts then the algorithm is working.

While you are actually playing the game, in motion at high frame rate, it looks great and any slight picture quality loss (which is there, sure) is not obvious.

I'd much rather take a 30% boost in performance, with still great playable quality, at the expense of some small detail that is not apparent unless I take a screenshot and analyze it in Photoshop.

Not to mention being brand agnostic (while his preferred DLSS is not).
 
I think all of us (outside of multiplayer) will play native if it’s fast enough. This is for when it isn’t. And in those cases, you’re not gonna notice; it’s moving. Unless it gets weird, like Control’s smoke did.
 
I think all of us (outside of multiplayer) will play native if it’s fast enough.
Well I play at 144Hz at high resolutions, so there is never enough power. Granted, I am mostly CPU limited at around 120fps in newer games, and over 90fps is still great, but why not get free performance?
 
Well I play at 144Hz at high resolutions, so there is never enough power. Granted, I am mostly CPU limited at around 120fps in newer games, and over 90fps is still great, but why not get free performance?
If I’m over 100 FPS, I’ll take quality over any more perf. Sync makes it clean regardless at that point, and I play mostly single player. The 3090 lets me crank. My set refresh is normally 120 Hz.
 
If the gaming experience is better for you with FSR or DLSS, then I say go for it. I don't think there is any straight answer here. The problems I had with DLSS were shimmering and ghosting on movement; other times, like in Rise of the Tomb Raider, it is a fantastic improvement over what was available before. Doom Eternal also gets improved by using DSR and DLSS, in my view. In Far Cry 6 I did not like FSR, mostly because of more aliasing; other than that it looked pretty good, but since I had enough performance anyway I left it off. Blowing up images to find an issue is an eye-roll event for me, but that is just me.

A lot of videos dealing with either FSR or DLSS have motion blur on, meaning it will hide a lot of clarity and some artifacts, not even mentioning the quality loss from video compression. I like motion blur off, and depth of field on or off depending on its quality, with whichever option best suits me. People buy OLED TVs with outstanding motion clarity, or fast monitors with good motion clarity, and yet turn on motion blur :D. Well, do that if you want; it just does not make much sense to me. I'd rather have my full eye potential for motion than be handicapped by premature motion blurring.

I do like DLSS better than FSR so far; in some games I've found it usable and beneficial, in others not.
 
Little test
Attached to this post are screenshots of your typical run-of-the-mill game, Overload (do recommend it, it is awesome!), which tortures my GPU beyond the point where playing at 4K is a viable option, so I dropped the resolution to 1440p and tried the various scaling methods available to me to see which one is best.

Can you spot which one is using AMD FSR?
 

Attachments: A.png, B.png, C.png, D.png, E.png
Little test
Attached to this post are screenshots of your typical run-of-the-mill game, Overload (do recommend it, it is awesome!), which tortures my GPU beyond the point where playing at 4K is a viable option, so I dropped the resolution to 1440p and tried the various scaling methods available to me to see which one is best.

Can you spot which one is using AMD FSR?
Just a guess, but D? Seems to be overly sharpened to the point the walls look terrible.
 
Little test
Attached to this post are screenshots of your typical run-of-the-mill game, Overload (do recommend it, it is awesome!), which tortures my GPU beyond the point where playing at 4K is a viable option, so I dropped the resolution to 1440p and tried the various scaling methods available to me to see which one is best.

Can you spot which one is using AMD FSR?

Read what he said below...

You're not wrong. I agree with your analysis.

My point is that if you have to take a static image and blow it up to 8x in Photoshop just to spot the artifacts then the algorithm is working.

While you are actually playing the game, in motion at high frame rate, it looks great and any slight picture quality loss (which is there, sure) is not obvious.

I'd much rather take a 30% boost in performance, with still great playable quality, at the expense of some small detail that is not apparent unless I take a screenshot and analyze it in Photoshop.

If they can keep the game at a higher frame rate, with a not-completely-obvious loss of PQ while actually playing, in a package that is brand agnostic and works with every game? That's a win for everyone not using an RTX card, or playing a game Nvidia hasn't decided is worth putting out a DLSS profile for anyway...

No one is saying that DLSS can't be better at certain things, but the point of entry is lower, more accessible, and more universal.
 
Yeah, you convinced me, thank you guys
Instead of dropping resolution to get the best possible performance with the lowest possible input lag, and still getting to use variable refresh rate, I will now use hacky tools, otherwise called a "package that is brand agnostic and works with every game", to get marginally better upscaling. That only makes sense! 🤪

Let's get the facts straight:
- FSR needs to be implemented in games
- Magpie and Lossless Scaling are hacks that add input lag and disable VRR; until these issues are resolved or better hacks are developed, it is hardly a good option
- FSR is an upscaling algorithm, not a reconstruction algorithm: it cannot and does not attempt to add any missing details
- default GPU upscaling does not look nearly as bad as it did in the past; it uses Lanczos and is quite sharp

Well I play at 144Hz at high resolutions, so there is never enough power. Granted, I am mostly CPU limited at around 120fps in newer games, and over 90fps is still great, but why not get free performance?
If you cannot see details anyway, as you point out so often, then just drop the resolution. That is what gives the best performance and the lowest input latency, and it has properly working VRR support.

If, however, you somehow see a difference between FSR and, e.g., Lanczos, then you should also see a difference between FSR and native. You cannot claim to see one and not the other!

If they can keep the game at a higher frame rate, with a not-completely-obvious loss of PQ while actually playing, in a package that is brand agnostic and works with every game? That's a win for everyone not using an RTX card, or playing a game Nvidia hasn't decided is worth putting out a DLSS profile for anyway...
Again, FSR does not work with every game. At least so far, it cannot be made to without obvious issues.
FSR works, on any GPU, in the games that implemented it. This part is good; I like it and do not complain about another option. Especially since I actually like FSR, at least as long as it runs alone, without RCAS.

What AMD should do is make FSR their GPU upscaling. Nvidia should do it also!

No one is saying that DLSS can't be better at certain things, but the point of entry is lower, more accessible, and more universal.
I mentioned DLSS a lot at the beginning because I was disappointed with how FSR works. I thought it would be something like DLSS 2.0, just without the AI stuff. It would have been useful tech not only for PC but also for consoles.
I am still disappointed they didn't do it properly, but all things considered, once I investigated how FSR works for general upscaling, I am glad they developed it. They basically fixed Lanczos and made a high-performance shader version of it that can be used to upscale lots of stuff, e.g., photos, videos, games, etc.
 
What AMD should do is make FSR their GPU upscaling. Nvidia should do it also!
You mean you get FSR if you turn on GPU scaling in the control panel? That would be good for games that don't have scaling built in, which is most of them. I'd want DLSS as an option on NV RTX cards, though. No need to pick one; just add it to the settings in the control panel and let me set it per application, along with a system default. Similarly, I'd want whatever Intel is cooking and anything AMD comes up with in the future. I'd still rather have scaling just become an API (DirectX, Vulkan, etc.) and driver feature, though. The game dev just makes an API call to turn on scaling and you get whatever your GPU vendor provides, with tweaking options in their proprietary control panel along with vendor-specific tweaking APIs.
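
As a purely hypothetical sketch of that dispatch idea (none of these names exist in DirectX or Vulkan today; they are just here to illustrate it):

```python
from enum import Enum, auto

class Upscaler(Enum):
    DLSS = auto()  # vendor path, needs RTX-class hardware
    XESS = auto()  # vendor path, Intel
    FSR = auto()   # universal shader fallback

def enable_upscaling(driver_caps: set) -> Upscaler:
    """The game asks for 'scaling on'; the driver returns the best it can provide."""
    for preferred in (Upscaler.DLSS, Upscaler.XESS):
        if preferred in driver_caps:
            return preferred
    return Upscaler.FSR  # guaranteed floor on any modern GPU

# RTX card:  enable_upscaling({Upscaler.DLSS, Upscaler.FSR}) -> DLSS
# GTX 970:   enable_upscaling({Upscaler.FSR})                -> FSR
```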
 
I've tried Lossless Scaling. It does seem to work okay. I know FreeSync is not supported, but I didn't notice any obvious lag (maybe there is 1 frame delay, but at 144Hz that is acceptable to me).

It does seem to work better on Linux with Glorious Eggroll's Proton. Not sure if FreeSync works, since FreeSync is not supported yet on the latest Ubuntu, so I can't test that right now. But it looks great, and as I said, a 30% performance boost (going from 1080p to 1440p).

AMD should definitely just add FSR to the control panel as they did with CAS. CAS was a weaker algorithm, but was still better than bilinear and worked okay in some games like Rage 2. Having it in the driver was great, because every game worked right off the bat.

I did like Overload a lot. Great game. Honestly can't tell what is what with those images (aside from the native one) so I'm not sure. But I have tested FSR quite a bit and I love it.

The idea is to have a good enough solution that works with everything and can run in real-time (without expensive special chips). And AMD achieved that. No, it's not the most advanced, but you can run it on an Nvidia GTX 970 or even Intel integrated, as well as AMD's new parts.

AMD also released the source code for free; anyone can download the code and add it to their game or engine (which is why you see projects like Lossless Scaling and Glorious Eggroll able to add it so quickly).

Nvidia gates DLSS. Not only do you need a new, expensive GPU made by Nvidia, you also have to be one of their special partner developers. It's not like anyone can do what they want. This is not what is good for the industry or for consumers.
 
Nvidia gates DLSS. Not only do you need a new, expensive GPU made by Nvidia, you also have to be one of their special partner developers.
Wrong. It is offered openly by contacting them, and for Unity and Unreal Engine as a publicly available plugin.
 
I don't play at 4K and don't need either tech to have plenty of FPS at 1440p with a 6900 XT. It gets the job done just fine at native. The only time I can see a need for either is with ray tracing, and even then I would rather just turn ray tracing down if I needed to.
 
I know this is the FSR thread, but Nvidia has a new DLSS version and compared it directly to FSR in the same games.

I love FSR, but I have to admit DLSS does look much better.
 
I don't think there is much debate on whether DLSS looks better than FSR. It's not really up for contention, IMHO. DLSS 2.x does look better.

What they are really trying to draw more attention to is their drivers already having a spatial scaler built in. And while it is similar in that it uses the Lanczos method, it's missing parts of what make FSR better: FSR resolves edges better and fixes the ringing issues of regular Lanczos resizing. And when FSR is built in by the developers, they can choose not to have everything scaled by FSR, allowing other elements to remain properly sharp at native resolution.
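
A toy illustration of that integration order, with Pillow standing in for the real render pipeline and Lanczos for the FSR pass; the point is just that the HUD is drawn after the upscale, so it never passes through the scaler:

```python
from PIL import Image, ImageDraw

def present_frame(native=(1920, 1080), render_scale=0.67):
    low = (int(native[0] * render_scale), int(native[1] * render_scale))
    scene = Image.new("RGB", low, (40, 90, 160))  # stand-in for the low-res 3D render
    frame = scene.resize(native, Image.LANCZOS)   # scene-only upscale (the "FSR" step)
    ImageDraw.Draw(frame).text((20, 20), "HP 100  AMMO 42", fill="white")  # HUD at native res
    return frame

present_frame().save("frame.png")
```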

I also hope this gives AMD a nudge toward implementing an FSR-like scaler driver-side.
 