AMD FidelityFX Super Resolution

Here are the same images, but as lossless PNGs so you can see actual quality.

https://postimg.cc/gallery/94Kfs33

This is on Ubuntu 21.04, Nvidia driver 465 on an RTX 2080 Ti.

Performance in Terminator is much better in borderless window; the previously mentioned performance drop was due to fullscreen mode.
Wow! The Ultra Quality has more detail on a number of textures, like the brick wall. Plus the power lines are less aliased than Native, which has jaggies. I will have to see this in motion.
 
AMD really, REALLY needs to get support into the bigger games that count, and that will drive more attention onto FSR: RDR2, R6 Siege, etc.
 
Fighting Nvidia is pricey... not because Nvidia is by default 100x better, they are just willing to throw money around more.

Same thing happened with Freesync. Nvidia paid a lot of monitor makers to include their tech and hype the hell out of it. AMD said, we are not going to pay companies to use our tech... no, instead of paying them we will just give them something that won't cost them a fortune, and charge them nothing at all for it. We know what happened there: Freesync has won... yeah, Gsync is still a thing, but you would have to be a hardcore Nvidia booster not to admit it's all but dead at this point.

The same will happen with this... Nvidia paid a bunch of (ok, not really, more like a small handful of) developers to implement their tech into games, and spent a lot of money hyping it up for them (those paid YouTube shill videos cost more than you would imagine, lol). AMD has come along with a solution and is probably not going to break open a bag of cash to drive its uptake. However, they have made it inexpensive to implement... and free to use (just like Freesync). So yes, the same thing will happen over time. I suspect a handful of games may not see FSR right away, as they have DLSS and, I assume, a bag of Nvidia cash with strings attached all over it.

I don't really see any way for Nvidia to save DLSS... I mean, if they backtrack now and find a way for it to work without tensor cores, they will look like jerks even more. If they build a DLSS lite that works the same way as AMD's FSR, they will also look like jerks. I expect they will double down and try to pay off a handful of developers they hope will have the smash hits to run DLSS and skip FSR inclusion. That would be peak Nvidia, and what else would anyone expect from them.
 
Quote of the day, ChadD!

https://twitter.com/KyleBennett/status/1407839113097666568
 
DOTA 2 just popped FSR.

https://www.dota2.com/newsentry/2992060508110412253

AMD FidelityFX Super Resolution

This update also adds support for AMD's FidelityFX Super Resolution. This technique allows the game to render at a lower resolution and then upscale the results with improved image quality. The result is high quality rendering at a lower performance cost than full resolution rendering, which allows for higher framerates even on less powerful graphics cards. Players can enable this setting in the Video options by turning the "Game Screen Render Quality" to less than 100%, and then turning on the "FidelityFX Super Resolution" checkbox. FidelityFX Super Resolution works on any GPU compatible with DirectX 11 or Vulkan.
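For reference, FSR 1.0's named quality modes map to fixed per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x), while Dota 2 instead exposes an arbitrary render-quality percentage. The C++ sketch below is just illustrative arithmetic showing what internal resolution each approach implies at 4K; it is not AMD's or Valve's code, and the struct/function names are made up for the example.

```cpp
// Illustrative only: computes the internal resolution a game would render at
// before FSR upscales back to the output resolution. The per-axis ratios are
// the ones AMD published for FSR 1.0; Dota 2 instead exposes a percentage.
#include <cstdio>

struct Resolution { int width; int height; };

// FSR 1.0 quality modes and their per-axis downscale ratios (output / input).
struct FsrMode { const char* name; double ratio; };
static const FsrMode kModes[] = {
    {"Ultra Quality", 1.3},
    {"Quality",       1.5},
    {"Balanced",      1.7},
    {"Performance",   2.0},
};

// Internal resolution for a given per-axis ratio (e.g. 1.5 means 67% per axis).
Resolution internalResolution(Resolution output, double ratio) {
    return { static_cast<int>(output.width  / ratio + 0.5),
             static_cast<int>(output.height / ratio + 0.5) };
}

int main() {
    const Resolution out4k = {3840, 2160};
    for (const FsrMode& m : kModes) {
        Resolution in = internalResolution(out4k, m.ratio);
        double pixelShare = static_cast<double>(in.width * in.height) /
                            (out4k.width * out4k.height) * 100.0;
        std::printf("%-13s -> %dx%d (%.0f%% of the output pixels)\n",
                    m.name, in.width, in.height, pixelShare);
    }
    // A percentage slider like Dota 2's maps more directly: 90% render quality
    // means 90% of the output resolution per axis, i.e. 81% of the pixels.
    Resolution dota90 = { static_cast<int>(out4k.width * 0.9),
                          static_cast<int>(out4k.height * 0.9) };
    std::printf("90%% render scale -> %dx%d\n", dota90.width, dota90.height);
    return 0;
}
```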
 
Anybody done the "Kyle Bennett FSR challenge" yet? I've got to get updated and try it...

https://twitter.com/KyleBennett/status/1407421909030125569
 
Well, they weren't kidding about games coming soon, were they. I mean, just about everyone can run Dota 2 maxed out. Having said that, I guess some people do reduce settings and go for stupid high frame rates with Dota... I guess now perhaps they can go Ultra Quality and leave everything cranked?

Cool, I do play Dota now and then; I'll have to go fire up a bot match so I can mess with the settings, haha.

EDIT: Valve implemented it a bit differently: if you select a render resolution less than 100%, they have added an FSR toggle. I ran out of time today to really check it out... but I slid my render res down to 90% and toggled it on; I can't say I could see any real difference in IQ, but I went from around 180 FPS to 200-210 or so. Not bad for what looks like free. I'll have to play a match or two tomorrow and see how far I can slide it down. If I can hit 240 FPS I guess there isn't much call for any more, not that I am running a 240 Hz monitor anyway.
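A rough sanity check on those numbers: 90% render scale per axis means only 81% of the pixels get shaded, and 180 FPS to roughly 205 FPS is about a 14% gain, or roughly 0.7 ms saved per frame. The C++ sketch below just runs that arithmetic; the 205 figure is my own midpoint of the reported 200-210, and the gain being smaller than the 19% pixel reduction is likely because the game isn't fully GPU-bound at those frame rates (plus FSR's own passes cost a little).

```cpp
// Back-of-the-envelope math on the numbers in the post above: 90% render
// scale and a jump from ~180 FPS to ~200-210 FPS. Purely illustrative.
#include <cstdio>

int main() {
    const double renderScale = 0.90;   // per-axis, as set in Dota 2
    const double fpsBefore   = 180.0;
    const double fpsAfter    = 205.0;  // midpoint of the reported 200-210

    // 90% per axis means 0.9 * 0.9 = 81% of the pixels get shaded.
    double pixelFraction = renderScale * renderScale;

    // Frame time saved: 1000/180 ms vs 1000/205 ms.
    double msBefore = 1000.0 / fpsBefore;
    double msAfter  = 1000.0 / fpsAfter;

    std::printf("Pixels shaded: %.0f%% of native\n", pixelFraction * 100.0);
    std::printf("Frame time: %.2f ms -> %.2f ms (saved %.2f ms)\n",
                msBefore, msAfter, msBefore - msAfter);
    std::printf("FPS gain: %.0f%%\n", (fpsAfter / fpsBefore - 1.0) * 100.0);
    return 0;
}
```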
 
I mean, freesync didn't really kill gsync like everyone proclaimed...
 
The problem with proprietary GPU features is that they only really stick around as long as the manufacturer wishes to sponsor their existence.

We've seen this a number of times, most notably with hardware PhysX (which is dead), SLI (dead), Nvidia 3D Vision (dead), and most recently GSync (which has lost all steam and will probably disappear in the next 1 or 2 years).

Yes, FreeSync won. It is not as good as GSync in some technical ways, but it is cheaper for companies to implement, cheaper for consumers, has much broader hardware support (not tied to Nvidia), and is mostly good enough.

We will see the same with FSR. It is technically not as good as DLSS (at this point in time) but it's cheaper and easier for everyone and still gives acceptable results. I'd say in the next few years we will see DLSS fade away.
 
GSync (which has lost all steam and will probably disappear in the next 1 or 2 years).

Again, that was said when FreeSync was first announced, yet here we are. It's still around, 8 years after Gsync's announcement and 7 years after FreeSync's. As we can see with many things, there's room for 'cheap' (so to speak) and 'premium' to coexist indefinitely. Not saying that will be the case, but to just assume one way or another that 'x' will definitely kill 'y' just seems fanboi-ish to me. There are also two tiers of Gsync now; that could have been the answer/death you might be predicting. Seems like some people only think competition is good and warranted when Nvidia has something AMD doesn't; the other way around, or when both have 'equal' offerings, their tune changes and they want the competition eliminated. 🤔
 
Competition in parts is GOOD. Competition in standards is anti-consumer and stupid.

As for Gsync, yes, it's not dead... nor is it really alive. Looking at my local parts supplier: 94 basic FreeSync monitors, 22 FreeSync Premium, 14 Gsync monitors (with most of the newer models being FreeSync with "G-Sync Compatible" badges)... and 2 listings for Asus ROG Gsync Ultimate monitors, which are cool if you're looking for a $2-6k gaming monitor.
 
I understand companies want to innovate, and sometimes it takes proprietary experimentation to prove a feature or product works.

But after that point, when the idea is validated, it should be a standard.
 
OK, so FSR has some limits. I was at my folks' place this evening and fired up a very, very old Core 2 Duo machine with a 750 Ti at 1080p. Dota 2, low settings: 40-ish FPS with some nasty lows. FSR at 75%, and I may have got a couple of extra FPS, but those lows, lol. I wasn't expecting much, but no doubt with a 4-core instead of that old-ass CPU it might actually have been playable.
 
The internet says Jeremy is a developer from Edge of Eternity

[attached screenshot]

Wow! AMD seems to be right about the ease of implementation. Need to look more at AMD Cauldron TAA, which I did not know about. Looking great so far.

I tried the Riftbreaker demo on the 3090. FSR increased FPS significantly and I didn't really see any significant difference in image quality. Now, the game plays so well at 4K even without FSR, using RT and max settings with the 3090, that it is not needed here; maybe at 8K this would be useful with this game and setup.
 
Compares the 580, 1060 and 970. Yaps a lot at the beginning, might want to skip ahead...
edit: he also mentions AMD has said that theoretically any GPU with Shader Model 5 should support FSR. iGPUs, see below...
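If the Shader Model 5 claim holds, it roughly lines up with Direct3D feature level 11_0, so one generic (Windows-only) way to check whether a given GPU clears that bar is to ask D3D11 for an 11_0 device, as in the sketch below. This is just a general capability check I'm assuming is relevant, not anything from the video or from AMD's FSR tooling.

```cpp
// Windows-only sketch: Shader Model 5.0 support roughly corresponds to
// Direct3D feature level 11_0, so we ask D3D11 whether an 11_0 device can be
// created on the default adapter. Generic capability check, not FSR-specific.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL got = {};

    // Null device/context pointers: no swap chain, just feature-level
    // negotiation on the default hardware adapter.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        &wanted, 1, D3D11_SDK_VERSION,
        nullptr, &got, nullptr);

    if (SUCCEEDED(hr) && got >= D3D_FEATURE_LEVEL_11_0) {
        std::puts("Feature level 11_0 available: Shader Model 5.0 supported.");
    } else {
        std::puts("No 11_0 device: this GPU likely predates Shader Model 5.0.");
    }
    return 0;
}
```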
 
Is it just me, or can I barely tell any difference at all in the look? But I guess that's not the point, it's just more FPS. Also, I'm not using a big-res monitor.
 
Not sure that is working right. First, I don't see any difference at all in PQ (maybe YouTube compression), but the FPS also isn't any better (or worse) with FSR. Not sure that is right.
 
I think the performance might be jumping without it because they have the resolution scaler on. So when it's off, it's off, but still at 75% resolution scale. So turning it on would take an FPS hit... but the IQ in theory should be closer to the 100% image. So if they turned it off and upped the scale to 100%, the FPS would be much lower. I don't know though, YouTube compression sucks; I can't tell for sure if it's working either.
 
Radeon driver modder enabling ReBAR and FSR all the way back to pre-GCN cards, including mobility graphics, here. Only requires Shader Model 5 or better for FSR. Gonna test my old 6950 DirectCU II this weekend.
 
I've seen some tests on older cards, and the general consensus is that AMD probably cut off support for FSR where they did because cards much older than that don't really have the headroom to truly benefit from FSR, resulting in only a handful of extra FPS.
 
Consoles are why AMD will overtake Nvidia with FSR implementation in games vs. DLSS.
I kind of figured they would; even before FSR was implemented, I worried that DLSS would only be available for a small number of games due to consoles being all AMD.
 
That, and (practically confirmed) rumor-wise, Nvidia is notoriously difficult to work with.
 
Yes, I've seen people use it on the GTX 970, but it only supports like 12 games right now.
 