AMD Radeon Super Resolution is FSR but for all games

cybereality

https://videocardz.com/newz/amd-to-...lution-rsr-technology-that-works-in-all-games

According to our information, RSR is based on FSR 1.0 algorithm and will work through the Radeon software driver.
The technology will therefore overcome the biggest issue of proprietary super-resolution technologies, which is a requirement to be implemented in a specific place in the graphics pipeline.
This is usually needed to prevent issues with user interfaces, which are usually rendered at native resolution.
In other words, RSR will not require game developer support but will work independently through a Radeon driver.

This is amazing and what I've been saying forever. In January AMD owners will have FSR automatically in every game!

Though you can already do this with mods, it will be cool to have an official solution.
 
I always wondered why they didn't do this from the start.

It became apparent how easy that could have been when the Lossless Scaling app on Steam did exactly that.

(It works great, but you have to run the title in a borderless window for it to work, and when I did, that stopped FreeSync from working properly)

This is how it should work.

Still hoping they will release something that works similarly to DLSS though, as the quality is just that much better.
 
Yeah, you can do this on Linux with Glorious Eggroll; it works fine in fullscreen and everything. But it's a per-game thing, so you have to set launch arguments for each game, which is kind of annoying.
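
If I remember right, the per-game bit is just a launch option in each game's Steam properties (with a GE build of Proton selected), plus running the game fullscreen at a below-native resolution. Something like:

    WINE_FULLSCREEN_FSR=1 %command%
    WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%

The strength variable tunes the sharpening pass and is optional; I believe it takes values 0 through 5.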

AMD should have done this out of the gate; not sure why they didn't. I understand you can get better results with developer integration, but then you end up with a small list of supported games. Of course everyone wants every game to work.

DLSS is clearly superior, and Nvidia has their own FSR competitor, which does work on the driver level (maybe why AMD decided to do this now). But honestly, I don't like having a small list of games that support one feature.

If it works with everything, even if it is technically worse, that's still better overall.
 
I always thought it should be done driver-side, as a modified AA algorithm that the driver overrides if you enable the option for it.
 
Got to agree, it feels like NIS made AMD decide to do this.

AMD doing this made Nvidia announce NIS. NIS has been around in their drivers for a long time; they just reframed it to take the wind out of AMD's announcement.
 
Now what I want to know is how much of NIS Nvidia has been sitting on for years. Is it something they slapped together this year because they wanted something to go against FSR, or have they been sitting on it a while to push their DLSS stuff (which is the better tech, hands down)? Image upscaling techniques have been around a while, so that's not new; I just want to know how long it was in development as a solution. If they've been sitting on it for years, why not put it out sooner as an alternative option to DLSS? If they threw it together fast because they could, then I feel bad for AMD, who spent a long time on FSR.

I want to see some benchmarks for both, along with some side-by-side video. It is good to have options, I suppose; I might give it a go in a few titles that I'd like a bit of an FPS boost in that aren't too visually demanding.
 
Well NIS has existed in the driver for about 2 years already. Not sure what it was doing, since I never saw the option before, but I imagine it was used for other scaling the driver does.

However, they recently (as of November) gave it a name and promoted it on their website, along with integrating it into GeForce Experience and making the option easier to find.

https://www.nvidia.com/en-us/geforce/news/nvidia-image-scaler-dlss-rtx-november-2021-updates/

It's clear this is a response to FSR. They already have better technology with DLSS (far better), but AMD FSR was picking up steam, and it's possible they had insider information that AMD was launching RSR soon (this is not uncommon with big companies).

So, yes, it is a competition and I feel like AMD is finally taking things seriously. Of course, they have no chance of beating Nvidia any time soon, but they can add features like this to please their existing customers.
 

I don't think Nvidia is in the lead at all. They have ray tracing and marketing, but other than that, their entire product stack has been a response to AMD products for this entire generation.

NIS has been a little-used, now dated feature that Nvidia quickly polished the interface for as a response to RSR. FSR might be universal, but only if game companies implement it. RSR is baked in. This is a direct threat to DLSS, which also has to be coded for, because not only does it "just work," it works on older games and everything.

On top of that, Nvidia wants to leave the sub-$500 video card market, which means AMD has a feature that boosts gaming on everything from, what, Polaris on? Including all of their Ryzen onboard graphics?

Nvidia is only responding: to AMD, and now to Intel with the RTX 2050.
 
So NIS has existed for years and nobody used it. AMD copies it and makes a big "hey, we're doing this, yay team Red, aren't we awesome." Nvidia updates their interface and says, "hey, don't forget we have it too."

From what I've seen, NIS upscales everything when turned on, including menus and UI elements where it's not needed. I don't have any useful AMD hardware to see how theirs goes.
 
Nvidia has market share and mind share. I think AMD products are fine and I've been using them for a while (I use both brands; I have a few computers). AMD GPUs are honestly great. Especially back when you could get mid-range cards, they were very competitive.

Unfortunately people have this idea (whether true or not) that Nvidia is better, so they sell more and people respect them more. Nvidia also does research and comes up with new tech like ray tracing or DLSS, but sadly ties it to proprietary hardware on their expensive cards.

AMD has had innovation too. While Nvidia was first with G-Sync, there's no question that FreeSync won. AMD killed it; Nvidia still has their partners label monitors as "G-Sync Compatible," but it doesn't matter. FreeSync won.

AMD was first with Eyefinity, and always supported it better than Nvidia, with many more screens, different aspect ratios and resolutions, PLP mode, etc. So this was real innovation (and Nvidia copied it poorly).

But sadly people see that Nvidia technically has the "fastest card on the market," even though they can't afford it, when a slightly slower but reasonably priced card from AMD would suit them better.
 
Any technical demo available? It'd be nice to see what this can do so I can gauge how excited to be :)
 
No, technically it's just a rumor at this point, but it seems pretty solid.

However, the algorithm is FSR 1.0, so you can just play any FSR game to get an idea. Terminator: Resistance, I think, had the best implementation.
 
G-Sync is now basically a certification standard, like THX or Dolby. When G-Sync came out there was nothing that compared to what it did or how well those monitors performed or looked; they were beautiful. As the hardware that went into standard monitors got better and the average quality of panels improved, those features were gradually rolled into the VESA standard, but even now G-Sync monitors look beautiful and are very nice to game on.

With FreeSync, though, the FreeSync, FreeSync Premium, and FreeSync Premium Pro certifications cover very different performance ranges, and you really need the Premium Pro models to compare against the G-Sync ones. G-Sync and FreeSync obviously offer variable refresh rates, but that is the least interesting part of the certification at this stage.
 
FSR / RSR (which apparently is based on FSR 1.0) and Nvidia's scaling tech are pretty unimpressive thus far, IMO. For starters, the flickering artifacts introduced by FSR are quite a bit more prominent than DLSS artifacts, and FSR screws up reflections horribly in Anno 1800. But opinions change, as mine did with DLSS when version 2.0 landed, so I sincerely hope AMD makes a breakthrough that lets us have our cake and eat it too, kind of like DLSS 2.0, but I mean REALLY eat it (e.g. universal lossless performance hikes). As of right now, I'm still a bit underwhelmed.

 
Both technologies are crap.
 
I don't think I own anything that does FSR :(

Where do you typically see the option (when present)? Is it with the screen resolution settings?

I do play at 4K, so this could be of value to me.
 
AMD Radeon Super Resolution will be released in January 2022. It will be supported by the RDNA1 and RDNA2 architectures (RX 5000 and newer).

Where games put the settings for DLSS/FSR differs; sometimes it's with the resolution settings, other times they hide it down with the AA and detail settings.
 
Unneeded, and most people aren't professional esport players :rolleyes:. Bad faith arguments on your part.
I use "e-sports" just to describe it. If you look for it, you can see the noise in every game; if you don't care about it, never mind :)
But I can't accept buying an expensive video card that lowers quality in the games I play just to get a few more FPS; it's pointless for me.
So, it's a personal choice in the end :)
 
Unless you buy a new top card every year... you are going to run into some games that push your FPS lower than your refresh rate. I mean, if you just bought a top card, perhaps not today... but these techs are going to be a godsend for anyone wanting to push their card just a little longer, till the next card cycle.

If you can afford to game at 120 FPS in every game every year, then hey, you don't need this tech. Congrats on your success. lol Most of us don't budget $1200 for a GPU yearly.
 
What's crap to one person is treasure to someone else.

Hate tearing? Don't use VSYNC OFF. VSYNC OFF is crap.
Hate lag? Don't use VSYNC ON. VSYNC ON is crap.
Hate motion blur? Use strobing. Non-strobing is crap.
Hate flicker? Don't use strobing. Strobing is crap.
Hate stutters? Use VRR such as G-SYNC. Non-VRR is crap.
Hate the problems (lag, tearing) of framerates outside the VRR Hz range? Don't use VRR. VRR is crap.
Etc.

Someone may be more framerate-sensitive than noise-sensitive.
Someone may be more stutter-sensitive than lag-sensitive.
Someone may be more noise-sensitive than HDR-color-quality-sensitive.
Etc.

It's a game of compromises. No one technology package (GPU+screen) currently can do it all.

Everybody sees differently. I respect user choice on what's "vastly superior." What is faint stutter to one person is motion-sickness-inducing to another. Etc, etc, etc.

Pick your poison.
 
Also, FSR and similar technologies can increase picture quality. If you run FSR on Ultra Quality, it actually makes the image look better (it smooths aliasing artifacts) and you still gain some performance.
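
For reference, FSR 1.0's quality modes use fixed per-axis scale factors (the numbers below are from AMD's published FSR 1.0 documentation), so you can work out the render resolution behind any output with a quick Python back-of-envelope:

    # FSR 1.0 quality modes and their per-axis scale factors,
    # per AMD's FidelityFX Super Resolution 1.0 documentation.
    modes = {
        "Ultra Quality": 1.3,
        "Quality": 1.5,
        "Balanced": 1.7,
        "Performance": 2.0,
    }

    target_w, target_h = 3840, 2160  # 4K output
    for name, scale in modes.items():
        w, h = round(target_w / scale), round(target_h / scale)
        print(f"{name}: renders at {w}x{h}, upscaled to {target_w}x{target_h}")

At 4K, Ultra Quality works out to roughly a 2954x1662 render, which is why the quality hit is comparatively small.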
 
It's silly, but DOTA gives you a slider... I turn FSR on at 99%. lol I know it's just an image sharpener at that point, but wth, it's easy and built in.
 
I've seen the comparison pics for DOTA (I don't play the game), and it does help at 99%.
 
Joke's on him though; eSports players don't play with maxed-out settings, they play with the lowest possible settings to get maximum frame rates and the lowest input delay. :facepalm:
 
I'm not sure I'd even consider e-sports PC gaming. It's a business; they aren't playing for fun.

I mean, who wants to play games at the lowest settings, VSync off, at 640x480 resolution (with the wrong aspect ratio, no less)? LOL.
 

Just google "games with FSR" or something like that. It works on a HUGE range of hardware from Intel, Nvidia, and AMD.

This one looks like a pretty decent list.
https://www.okaygotcha.com/2021/12/amd-fsr-compatible-games.html

But I wouldn't get too excited. It's basically just a way to lower your resolution and not have it look like total crap. It still doesn't look as good as native resolution, but it gives you some wiggle room.

Erhm... FSR does not increase picture quality over native. It also doesn't apply any sort of AA or fix aliasing. It does a little edge reconstruction, but that's to help make the upscale look better, not to resolve AA issues.

*late edit* At least that is how I understand it at this point.
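
For the curious: the FSR 1.0 source is public, and structurally it's just two passes, an edge-adaptive spatial upscale (EASU) followed by an adaptive sharpen (RCAS). Below is a heavily simplified Python sketch of that two-pass shape. It's illustrative only: plain bilinear stands in for EASU's edge-adaptive kernel, and the sharpening weights are made up, not AMD's actual shader math.

    import numpy as np

    def upscale_easu_like(img, scale):
        # Pass 1: spatial upscale. Real EASU analyzes local edge
        # direction and adapts its kernel; plain bilinear stands in here.
        h, w = img.shape[:2]
        nh, nw = int(h * scale), int(w * scale)
        ys, xs = np.linspace(0, h - 1, nh), np.linspace(0, w - 1, nw)
        y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
        y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
        fy, fx = (ys - y0)[:, None, None], (xs - x0)[None, :, None]
        top = img[y0][:, x0] * (1 - fx) + img[y0][:, x1] * fx
        bot = img[y1][:, x0] * (1 - fx) + img[y1][:, x1] * fx
        return top * (1 - fy) + bot * fy

    def sharpen_rcas_like(img, strength=0.5):
        # Pass 2: RCAS-style sharpen. Boost the center pixel against its
        # 4 cross neighbors, limiting the negative lobe by the local
        # headroom so it can't ring past the neighborhood min/max.
        p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
        n, s = p[:-2, 1:-1], p[2:, 1:-1]
        w_, e = p[1:-1, :-2], p[1:-1, 2:]
        lo = np.minimum(np.minimum(n, s), np.minimum(w_, e))
        hi = np.maximum(np.maximum(n, s), np.maximum(w_, e))
        lobe = np.clip(-strength * np.minimum(lo, 1.0 - hi), -0.2, 0.0)
        out = (img + lobe * (n + s + w_ + e)) / (1.0 + 4.0 * lobe)
        return np.clip(out, 0.0, 1.0)

    frame = np.random.rand(180, 320, 3)  # stand-in for a 320x180 render
    output = sharpen_rcas_like(upscale_easu_like(frame, 4.0))  # ~1280x720

The point is just that both passes are purely spatial: no temporal data, no motion vectors. That's why it can sit in a driver and work on anything, and also why it can't reconstruct detail the way DLSS does.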
 
Yes, it does. At least on some games. Look at the shots I posted here and look carefully at the black power wires in the middle.

https://twitter.com/cybereality/status/1407728843675246599
I would qualify that as a nice side benefit vs. something it's doing purposefully. Officially, FSR doesn't try to fix a game's AA and relies heavily on the game having a good AA implementation. The edge reconstruction is supposed to just be about helping enhance the quality of the upscale.

Awesome comparison shots btw :)
 
Oh, and I took some FSR vs. native vs. NIS shots using the Lossless Scaling app and CS: Source on the map de_italy. I uploaded 'em to Flickr. You can download the images in full-resolution PNG if you like and use Nvidia's ICAT tool to compare them.

https://www.flickr.com/photos/194638445@N04/albums/72177720295354950

https://www.nvidia.com/en-us/geforce/technologies/icat/
 
Indeed, they lower settings to get more FPS, and if they turned on FSR/DLSS (upscaling) they would get even more FPS. But do you know why they don't do it?
Because they need to see exactly where their enemy is, not lose them in the artifacts smeared over the whole picture ;)

Screenshots can't show what happens when you move and try to hit/kill the target.
 
Generally, the increase in frame rate is only going to lead to more responsive gameplay. Plenty of vids/tests show that both FSR and DLSS reduce latency. However... I doubt you would need or see either tech in competitive play. A lot of those players are clueless about this stuff and just lazily crank the detail down to potato mode instead.

Also keep in mind that FSR doesn't suffer from the same level of motion artifacts that DLSS can. So in motion, FSR tends to look consistent with a native presentation, which also makes sense given how it works. DLSS, however, is trying to reconstruct from temporal data on the fly. Its stability is generally great... but it's not perfect. And because DLSS generally does such a stellar job, when it stumbles it tends to stand out. But again, in competitive play you're not likely to see either tech used, short of some niche scenarios.
 