Did.... did AMD just extend the life of my 1080ti by enabling FSR?

The rest of the year could go by with only a handful of games even implementing it, so don't count on this extending the life of your 1080 Ti too much.
 
I wouldn't be too surprised if, by the end of the year, there are more FSR-enabled games than DLSS ones. Why? The utterly easy implementation:

How does it work?

FidelityFX Super Resolution is a spatial upscaling technique, which generates a “super resolution” image from every input frame. In other words, it does not rely on history buffers or motion vectors. Neither does it require any per-game training.
https://gpuopen.com/fsr-announce/

No motion vectors and no frame history should make this a very speedy implementation, and maybe even a must-do for developers: their games will run on all sorts of hardware, and the better they run, the more copies they can sell.
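To make the "spatial, single-frame" point concrete, here is a toy sketch of what a spatial upscaler's data flow looks like: each output frame is computed from the current frame alone, with a sharpening pass on top. This is NOT AMD's actual FSR algorithm (which has not been released at this point); the bilinear upscale and unsharp mask below are generic stand-ins, chosen only to show that no motion vectors or history buffers are involved.

```python
# Toy spatial upscaler: one input frame in, one "super resolution" frame out.
# No history buffer, no motion vectors, no per-game training.

def bilinear_upscale(img, scale):
    """Upscale a 2D grayscale image (list of lists of floats) by `scale`."""
    h, w = len(img), len(img[0])
    out_h, out_w = h * scale, w * scale
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map each output pixel back to fractional source coordinates.
            sy, sx = min(y / scale, h - 1), min(x / scale, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

def sharpen(img, amount=0.5):
    """Simple unsharp-mask pass, standing in for the sharpening step."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            blur = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            out[y][x] = img[y][x] + amount * (img[y][x] - blur)
    return out

# Each frame is processed independently of every other frame:
frame = [[0, 0, 0, 0], [0, 255, 255, 0], [0, 255, 255, 0], [0, 0, 0, 0]]
upscaled = sharpen(bilinear_upscale(frame, 2))
print(len(upscaled), len(upscaled[0]))  # 8 8
```

Since the whole thing is a couple of pixel-shader passes over the current frame, you can see why integration would be far simpler than a temporal technique that needs the engine to export motion vectors and keep previous frames around.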
 

On the flip side, doesn't this mean it will be less effective than DLSS? The more specialized data you have from training and the like, the more you can improve.
 
People are going to excitedly turn this on for the first time, hoping to double their fps "for free" and then realize it's a blurry mess. And only 2 or 3 games will use it within the next year, then it disappears forever. Like everything else AMD has done, aside from FreeSync.
 
I would bet more games will be using it than DLSS by the end of the year. If it is as easy to implement as AMD says, it will be a no-brainer for developers to put it in or add it with an update. We just have to see if the benefits outweigh the negatives. As for DLSS, I've been very disappointed overall; I guess I expected much more than what I got.
 
DLSS2 is black magic. It can't be easy to copy something that seems like it shouldn't even be possible.
DLSS 2.0 is far from perfect, though. I really don't get how it can make some parts of the game look better while making other parts look worse than native resolution. For instance, in Cyberpunk 2077 you can look at one area and see a slight advantage from DLSS on Quality compared to native, then see several other areas where it actually degrades the image quite a bit, with much more crawling and flickering. To me, if they can reduce the aliasing on some objects, they should be able to reduce the aliasing on all objects. Bottom line: it just seems all over the place and not even remotely consistent. Of course, knowing this forum, I'm sure someone will chime in and say it's perfect in every regard and they've never seen anything negative when using it.
 
yup BUT only if devs implement it.
Are you kidding?? IF they implement it? It is all but guaranteed developers will implement it. First, since AMD makes the chips for the PlayStation and Xbox, it will have to be implemented by everyone who writes titles for the consoles, and porting it to PC games will be easy work. Second, developers have already been provided the porting kits by AMD. Third, since legacy Nvidia cards will be supported by the AMD software, there is added incentive for developers to implement it: a larger market for their games. Nobody will opt out, as the loss of sales to competitors who do implement it would be devastating to their success. DLSS will be dead within a year, and Nvidia will have to join in supporting this open standard.
 
BUT not IF.
 
I wouldn't be too surprised if by end of year there are more FSR enabled games than DLSS ones. Why? The utterly easy implementation:
Good luck with that.

There's no free performance lunch. And "easy to implement" = predictably looks like shit.
 
…DLSS will be dead within a year and Nvidia will have to join in supporting this open standard.
Saving another "DLSS will be dead within X timeframe" post; eventually someone will claim victory, even if Nvidia implements something better than its current setup.

Hell, I am glad we actually have side-by-side motion comparisons of more than one game that isn't powered by AMD cards, showing how much better FSR is!
 
DLSS 2.0 is far from perfect though
Sure it isn't, but for me it's still damn good enough. The only REAL issue I even noticed in Cyberpunk (at 3440x1440, running in "Performance" DLSS mode at that!) was that objects behind chain-link fences didn't always look right. But that, or any other minor thing I might be forgetting, doesn't change the fact that I literally nearly DOUBLED my frame rate while the game still looks nearly indistinguishable from native.

So yeah, it isn't perfect, but I wouldn't call it "far from" in the slightest. I played the whole game and not once did I think "DLSS is kinda some bullshit". No, I looked in the corner of my monitor and saw 60-120+ fps using maxed-out graphics on my 2080 Ti in the most demanding game ever made.
 
Well, that is your opinion, but as far as I'm concerned Performance mode looks like garbage compared to native resolution. Even Quality has its downsides in spots, and no way would I ever run any lower than that, but some of you are just oblivious to it, have lower standards, or just care about an extremely high frame rate. In my opinion the best overall implementation of DLSS is in Death Stranding, as that game looks horrific at native resolution and has some of the worst aliasing and crawling I have ever seen in any game. DLSS cleans that up drastically, and its few little downsides are worth it in that game for sure.
 
Good luck with that.

There's no free performance lunch. And "easy to implement" = predictably looks like shit.
How hard it is to implement has zero bearing on quality; it still may look like crap. We just have to see.

So far, for me, for the most part, DLSS = crap. Motion artifacting, crawling on high-frequency textures, and textures popping in and out are just too distracting. It seems tolerable only at 4K in Quality mode, but even then, as in Control, I could not really appreciate it.
 
First of all since AMD makes the chips for the PlayStation and Xbox and it will have to be implemented by all who write titles for the consoles. …
FSR isn't going to be available on consoles initially. It may or may not be in the future.
How hard to implement or not has zero bearing on quality, even though it still may look like crap. Just have to see.

So far for me, for the most part, DLSS = Crap. Motion artifacting, crawling on high frequency textures, pop in and out of textures are just too distracting. Seems to be tolerable only at 4K in Quality mode but even then like in Control I could not really appreciate.
Do you think FSR won't magically have similar issues? Control had DLSS 1.0 at release. It didn't get 2.0 until April last year.

https://www.nvidia.com/en-us/geforce/news/control-nvidia-dlss-2-0-update/
 
Not really. It will be similar to turning down the resolution to get more performance, with some extra filtering on top. It's not doing image reconstruction like DLSS, so while you can get a little better image quality than running at a lower resolution scale, it won't be like running at native high res, at least based on the images they've shown, which look like a traditional (dumb) upscaler versus an AI-enhanced upscaler.
 
^^^ Whatever it turns out to do to performance and IQ... the price of admission is right. That "F" word... Free... works for me.

Having the ability to enable FreeSync with my 1080 Tis was another cool freebie.
 
FSR isn't going to be available on consoles initially. It may or may not be in the future.

Do you think FSR won't magically have similar issues? Control had DLSS 1.0 at release. It didn't get 2.0 until April last year.

What I expect FSR to do is give a better upscaling experience than traditional means, with a performance increase. If it does that, it would be useful, especially if games actually adopt it, making it a transparent option for most people. Does it have to look exactly like, or better than, native? No. Just being a better option than before would make it useful. If it exceeds expectations, great.

As for the Control rave with DLSS 2.0: I could not get it to avoid artifacting on certain edges and textures (even when standing still, textures would crawl). No way did it effectively duplicate the quality of native resolution. This was at 1440p; even using DSR 4K resolution with the monitor at 1440p (so basically DLSS rendering at 1440p), the artifacting was still there. Of course it does some things extremely well, like hair and other stuff, and then falls apart elsewhere. Give and take: if one looks only at what it gets right and ignores the rest (still images versus motion, the filtering that wipes out rain, waterfalls, and special effects as in Death Stranding), is it the best thing ever? For me it is not, and it is almost a gimmick at this stage. Still, I like having the option, since you can trade some quality for increased performance and maybe enable RT for better overall quality. When I get to try it in the next game that has it, I will try it out again.
 
Yes, I expect FSR to be useful in exactly the way you describe. I do not expect it to be magical like DLSS2, but options are always good.
 
I really don't get how it can make some parts of the game look better while making other parts look worse than native resolution

It feels somewhat natural to me that this would be the case for some things. Text, for example: an AI could be really good at predicting mixed pixels and be better than native there, or for something like a power line in the sky, the edge of a straight building, or a leaf getting thinner and thinner in a very consistent way.
 
Anyone know if this will improve a card's performance for general productivity? I have three 10-series cards, and use them for photo/video editing...
 
I haven't had these issues with DLSS. To me it looks great, yes there may be some artifacts here and there, but nothing that is a deal breaker. Especially not when you can get double performance in some cases.

I also liked RIS/CAS and thought it helped in the games that supported it. Not DLSS quality, but if you need the performance to be able to play the game, it came through.

Whatever the quality, AMD needs it to make RT viable. Nvidia RTX without DLSS is also a slideshow, so FSR will be a godsend if it works even half as well as DLSS.
 
Not in any way that I know of; it's resolution scaling for games.
Thought so. No worries. My 1080 is adequate for photo editing, but eventually I'll be upping my video shooting and editing game, which is where the 5950 will be superb, of course, though graphics power will be needed. My Canon R5 can shoot 8K RAW video, and its 4K 120fps file sizes are also huge. See the link to a Milky Way timelapse from a month ago, which is amateurish for two reasons: a gap when a battery died, and it being edited at a way too slow frame rate. Clicking 2x playback speed helps, but it should be 4x faster. Watch on YT in 4K.
 

Did.... did AMD just extend the life of my 1080ti by enabling FSR?


Short answer: I doubt it

The amount of matrix math needed to make DLSS work takes several times the raw throughput of your Pascal card (50 for RTX 2060 tensors). Even if we assume half utilization, it's still 2.5x faster!

I don't think we will see a similar miracle shader-guesstimate release like FXAA/SMAA here, as upsampling is a lot harder (but FXAA did buy my GTX 460 1GB an additional two years!).

Shader-assisted AA has a much longer history... but using shaders to target perfect upsampling, without a huge performance hit, sounds impossible without adding dedicated tensor cores.
 
Well, in a couple of days we will see how this pans out. Since AMD has not shown too many examples beforehand, to brag, build excitement, and so on... we just have to see. Unreal Engine 5 did not need those tensor cores to have some incredible upsampling, and the 6800 XT somewhat pounds the 3080 in that demo's performance at this point in time.
 
Sometimes you have to look at the big picture.
The short-sighted question would be something like..."Why would you help your competition by allowing their GPUs access to your tech?"
The long-term would be the same...but for different reasons.
If AMD can release a technology which helps Nvidia customers who ALREADY own the cards, then it delays them buying a new Nvidia card.
It gives AMD more time and therefore a better chance to compete in those markets.
It also allows them to catch up or even surpass Nvidia in sales when those same guys decide to buy a newer GPU, as long as the experience was positive.
 
Yes, that is a great point. AMD breathes new life into your GTX 1060. Not only do you not need a new GPU, you will remember that AMD did you a solid.

Additionally, by making it easy to develop for and widely supported (not just on AMD's latest and greatest) there is a good chance developers will choose this over DLSS (if they are not already in Nvidia's pocket) as that would give the best benefit for cost to all parties.
 
Apples v. bicycles. DLSS is its own thing. Silly to try to frame it as anything else. They're not mutually exclusive.
 
I don't know. They seem to be there for the same reasons, even if the technology is different. It's the same thing.
 
I guess it will apply FSR in the game, as I would not think an Nvidia card could run an AMD driver. Also, I have 21.6.1 loaded already on an RX 5700, and I need to load up Borderlands 3. Wish you Nvidia fans all the best of luck; this is the most interesting outside-the-box thinking in a while, just like FreeSync working on Nvidia cards.
 
…I have 21.6.1 loaded already on an RX 5700 and I need to load up Borderlands 3…
Of course it will be within the game. And why are you talking about Borderlands 3? That game does not even have FSR at launch, as it is not one of the seven games, and it's not listed among any upcoming games that will support it.
 