AMD Could Tease DLSS 3-rivaling FSR 3.0 at GDC 2023

erek

Could FSR 3 bring AMD to parity with the competition in terms of features and performance?

"AMD frantically dropped in the first mention of FSR 3.0 in its Radeon RX 7900 series RDNA3 announcement presentation (slide below). The company let out precious little details of the new technology except the mention that it offers double the frame-rate versus FSR 2 (at comparable image quality). Does this involve a frame-rate doubling technology similar to DLSS 3? We don't know yet. It could just be a more advanced upscaling algorithm that doubles performance at a given quality target compared to FSR 2. We'll know for sure later this month. It would be a coup of sorts for AMD if FSR 3.0 doesn't require RX 7000 series GPUs, and can run on older Radeon GPUs, whereas DLSS 3 requires the latest GeForce RTX 40-series GPUs."

[Slide: FSR 3.0 mention from AMD's Radeon RX 7900 series announcement]


Source: https://www.techpowerup.com/305557/amd-could-tease-dlss-3-rivaling-fsr-3-0-at-gdc-2023
 
I was very much opposed to frame generation when I first heard of it.

I thought there would be no way it would be anything but terrible, with horrible input lag.

I tested it for shits and giggles when I played through Dying Light 2, and to my surprise I wound up leaving it on.

On my 4090 at 4K Ultra (with DLSS 2 on the Quality setting), it allowed me to go from ~90 FPS up to pinning my 120 Hz screen, with lower GPU load (thus cooler and quieter).

It was essentially just like getting free frame rate with no side effects. No graphical distortions and no noticeable increase in input lag (to me, at least, but I am in my 40s now; not going to claim that some young kid can't feel it).

If I were playing a competitive multiplayer title, I'd probably leave it off just in case, but other than that I am pretty impressed. No. I am flabbergasted.

I still don't like this. Ideally, I would like everything rendered at native resolution without frame generation, but I have to reluctantly admit that I am impressed.

What they have accomplished theoretically should not be possible.
 
I was very much opposed to frame generation when I first heard of it. …
I wonder if the latency problems identified earlier on with frame generation have been resolved?

 
I was very much opposed to frame generation when I first heard of it. …
Same boat... I swore I'd never use it... then I got a 4090 and tried it (as I game at 4K, so why not, right?). I used it in CP2077, MSFS 2020, and Portal RTX (the only games I own with the feature) and have left it on in all three. It looks and feels great. I was absolutely amazed, and it was not what I was expecting. I'd probably not use it in COD or BF, but single-player games? Absolutely. TBH, it performs so smoothly, and without any real input lag that I could notice, that I'd almost try it in an MP game too if they supported it and I needed it to get max frames with max graphics.
 
I wonder if the latency problems identified earlier on with frame generation have been resolved?

Not sure.

I think there is a huge difference in implementation from title to title though.

Dying Light 2 may have a very good implementation of DLSS 3 (or just happen to be well suited to it), while other titles may not.

I'd play it on a case-by-case basis. If it results in a bunch of artifacting or input lag, I'd leave it off; if not, it's free performance.
 
At first it did look like just a fancy extra that could get good in the future.

But many new games seem to have become quite CPU-limited again, which could make a good frame generation tech (say, what it will be by the end of 2023 or 2024) quite interesting, even for strong GPUs like the 4080, 4090, and 7900 XT/XTX. We already see a lot of sub-85 FPS lows show up in benchmarks, yet high enough to be in the DLSS 3 sweet spot.

People who want 300 FPS probably don't want it, but people who want a locked 120 FPS from a game that does 75-95 FPS without it seem to be in the sweet spot for the tech, and that scenario seems more common than one would have thought, even on a 4090 with DLSS 2 on.
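As a rough sketch of that sweet spot (a simplified back-of-the-envelope model, not AMD's or NVIDIA's actual pipeline; the 2x multiplier and the toy function below are assumptions for illustration): frame generation roughly doubles the presented frame rate up to the display refresh, while input is still sampled at the base rate, so a 75-95 FPS base is enough to pin a 120 Hz screen without the base frame time getting uncomfortably long.

# Back-of-the-envelope frame generation math (simplified model, assuming
# generated frames roughly double the presented frame rate up to the
# display refresh, while input is still sampled at the base rate).
def frame_gen_estimate(base_fps: float, refresh_hz: float = 120.0):
    presented_fps = min(2 * base_fps, refresh_hz)  # what the screen shows
    base_frame_ms = 1000.0 / base_fps              # input/simulation cadence
    return round(presented_fps, 1), round(base_frame_ms, 1)

for fps in (30, 75, 95, 150):
    presented, base_ms = frame_gen_estimate(fps)
    print(f"{fps} FPS base -> ~{presented} FPS presented, {base_ms} ms base frame time")

# 30  -> 60 FPS presented, 33.3 ms base frame time (laggy; artifacts linger)
# 75  -> 120 FPS presented, 13.3 ms (the sweet spot described above)
# 95  -> 120 FPS presented, 10.5 ms
# 150 -> capped at 120 FPS anyway; little reason to generate frames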
 
I was very much opposed to frame generation when I first heard of it. …
Zarathustra being impressed and using anything machine learning based does not compute. WHO ARE YOU.
 
Bringing it to all GPUs would count for something, but I guess some are happy with being walled off.
Yeah, as AMD tries to move the entire field forward with open standards, Nvidia innovates just for proprietary purposes. It's not hard to tell who is poisoning the industry. Keep buying Nvidia - $2000 GPUs coming soon. And then we get the countless crying threads on price. They've made their expensive Nvidia bed; now they can sleep in it. And before you all jump in and say AMD cards are expensive too, remember this: they're just following the market leader in pricing - and very competently priced at that.
 
Zarathustra being impressed and using anything machine learning based does not compute. WHO ARE YOU.

Not going to lie. I am still not convinced allowing AI tech to make decisions for humans is a good thing.

I don't want AI driving my car, making medical decisions or even placing entries in my day planner for me.

I'm all for using machine learning to find patterns for these things, and presenting those patterns to humans for further research in developing hypotheses as to why they work, and then testing those hypotheses, such that static algorithms can be validated and put in place, but I am opposed to any black box system making decisions for people.

If you are going to have an "expert system" make a decision on anything that matters, you had better be able to explain why and how it made that decision. Otherwise you could just as easily wind up with problems like the actual cases where an AI correlated the presence of a ruler in a picture with a positive melanoma diagnosis, or correlated older, grainier X-ray images with positive tuberculosis diagnoses, because the dataset it had been fed mostly contained positive tuberculosis samples from third-world nations that use older X-ray machines.

For anything where the decision actually matters, a human needs to be able to explain it, or it shouldn't be used.

But for something like a video game, especially a non-competitive single player title, where the downsides if it gets something wrong are insignificant in the grand scheme of things, why the hell not?

I would prefer native resolution and framerate rendering, but if I can't have that even on a 4090, without sacrificing quality settings, I guess I'll relent, as long as it works well.

It worked very well (at least for me) in Dying Light 2. I'm going to take it on a case by case basis though, as others have reported some issues in other titles. These may just be teething problems though.
 
AI is created by humans; therefore it is fallible. ChatGPT ranting and raving at users after a series of pointed questions is just the latest example. Maybe AI can HELP speed some things up, but in the end it will always have to be verified by a human. If not, chaos will ensue. At least as far as important things go.
 
What I like about these sort of things is that I can try it and turn it off if I'm not happy. I'm not paying extra for it unless I need to buy a new card, so I see it as a win.
 
I dunno, I don't have a problem just dropping the resolution down when I have to. I wish they spent their efforts on something more impactful.
 
What I like about these sort of things is that I can try it and turn it off if I'm not happy. I'm not paying extra for it unless I need to buy a new card, so I see it as a win.
It's the new thing - all cards will be getting this and you will pay for it in the next new card. Turning it off won't save you money.
 
I dunno, I don't have a problem just dropping the resolution down when I have to. I wish they spent their efforts on something more impactful.
Yeah, but that nice 4K monitor will produce garbage imaging when it has to scale up from below native.
 
I dunno, I don't have a problem just dropping the resolution down when I have to. I wish they spent their efforts on something more impactful.

I used to not mind lowering the resolution a little bit back in the CRT days, but these days if you run at anything other than the native resolution of your panel, it looks like crap, and the upscaling adds input lag.
 
Almost like they don't have the money to compete, and even if they did people would buy nvidia anyway
Yeah the multi billion dollar company is just some scrappy startup in the face of mean old Nvidia. Keep making excuses... AMD = Admittedly Mediocre Devices, and it's not for lack of funding.
 
Yeah the multi billion dollar company is just some scrappy startup in the face of mean old Nvidia. Keep making excuses... AMD = Admittedly Mediocre Devices, and it's not for lack of funding.
Nvidia's R&D is double AMD's.

Last year, AMD spent $1.98 billion on R&D, an increase of 38% over the last two years. However, NVIDIA spent $3.92 billion, up 65% over the last two years.
 
Yeah the multi billion dollar company is just some scrappy startup in the face of mean old Nvidia. Keep making excuses... AMD = Admittedly Mediocre Devices, and it's not for lack of funding.
Their development budget is nowhere near Nvidia's, and you have to be high to think it is. Just because their stock has a certain market cap doesn't mean they can just go all-in on development. They make their money in CPUs, and there's a long history of people not buying ATI or AMD and buying Nvidia even when ATI/AMD's products have been competitive or better (it's been a long time, admittedly). It's amazing to me they even bother with the discrete market anymore. No one is calling them a "scrappy start up", and I think both companies are fuckin horrible for consumers.
 
Almost like they don't have the money to compete, and even if they did people would buy nvidia anyway
Seeing their current "competitive" behavior, I'm starting to believe that's their intention.

Let Nvidia dominate, pick up their scraps. AMD has plenty of money now - more money (even accounting for inflation) than Nvidia had back in the GeForce 400 days when they started to dominate the market.
 
I find that it's not that bad when you're actually using it. The videos reviewing the technology slow things down and pinpoint specific issues (which are legitimate), but you might not ever actually see them when you're playing. The catch is that DLSS 3 only works on GPUs that are new and fast. It feels like tech that would be better served on older GPUs - the ones that actually need an FPS boost. Yet things don't work that way.
 
I dunno, I don't have a problem just dropping the resolution down when I have to. I wish they spent their efforts on something more impactful.
Do you have a really high resolution (like a 4K monitor), or one you sit far from? Because not running native on an LCD-type monitor tends to be a big deal.

The impact relative to the effort of something like this could be rather huge.
 
I have several 4K monitors. As long as I stick to the standard resolutions they look fine.
 
Unless you honestly need an Nvidia-only feature (like, say, you need CUDA cores to accelerate something, or you need NVENC), there is nothing wrong with Radeon hardware if the price/performance fits your needs. Fanboys be fanboys though.

I find that it's not that bad when you're actually using it. The videos reviewing the technology slow things down and pinpoint specific issues (which are legitimate), but you might not ever actually see them when you're playing. The catch is that DLSS 3 only works on GPUs that are new and fast. It feels like tech that would be better served on older GPUs - the ones that actually need an FPS boost. Yet things don't work that way.

It doesn't work as well at lower frame rates; that is part of the catch with the tech to start with. At lower frame rates you get two problems (rough numbers below):

1) Input lag gets too high.
2) Interpolation mistakes are more visible, as they stay on the screen longer.
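A quick hedged illustration of both points (a toy calculation, assuming the interpolator has to hold back one real frame, which is the usual cost of frame interpolation, and that each generated frame stays on screen for half a base frame time; the function name is just for illustration):

# Rough cost of frame interpolation at different base frame rates
# (assumption: one real frame is buffered, so ~one base frame time of
# extra latency, and each generated frame persists for half a base frame).
def interpolation_cost(base_fps: float):
    base_ms = 1000.0 / base_fps
    added_latency_ms = base_ms       # the buffered frame
    artifact_ms = base_ms / 2        # how long a bad generated frame shows
    return round(added_latency_ms, 1), round(artifact_ms, 1)

for fps in (30, 60, 90):
    latency, persist = interpolation_cost(fps)
    print(f"{fps} FPS base: ~{latency} ms added latency, "
          f"artifacts visible ~{persist} ms each")

# 30 FPS base: ~33.3 ms added latency, artifacts visible ~16.7 ms each
# 60 FPS base: ~16.7 ms added latency, artifacts visible ~8.3 ms each
# 90 FPS base: ~11.1 ms added latency, artifacts visible ~5.6 ms each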
 
It doesn't work as well at lower frame rates
It depends what we mean by that.

In the Linus Tech Tips blind test, people by far preferred 50-60 FPS with DLSS 3 to 30 FPS without it, and that was one of the cases where the advantage was the biggest.
 
From my understanding, games look fine if he lets his 4K monitor upscale from 1080p or 1440p, the well-known standard resolutions.

As the resolution gets higher, not rendering at native becomes less of an issue.
 
From my understanding, games look fine if he lets his 4K monitor upscale from 1080p or 1440p, the well-known standard resolutions.

As the resolution gets higher, not rendering at native becomes less of an issue.

I can see how 1080p would look good on a 4k screen if it is small enough.

No interpolation is necessary. You can simply fit 4 pixels in 1, so if the screen is small enough (or you sit far enough away) the experience shouldn't be significantly different from native 1080p
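A minimal sketch of that 1080p-on-4K integer scaling (a hypothetical toy example, assuming the display or GPU does pure 2x nearest-neighbour scaling rather than a smoothing filter):

import numpy as np

# 2x integer (nearest-neighbour) scaling: every source pixel becomes an
# exact 2x2 block of identical pixels, so no interpolation or blur occurs.
# A 2x2 toy image stands in for a 1920x1080 frame going to 3840x2160.
src = np.array([[10, 20],
                [30, 40]])

dst = np.repeat(np.repeat(src, 2, axis=0), 2, axis=1)
print(dst)
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]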
 
I can see how 1080p would look good on a 4k screen if it is small enough.

No interpolation is necessary. You can simply fit 4 pixels in 1, so if the screen is small enough (or you sit far enough away) the experience shouldn't be significantly different from native 1080p
1080 on a 4k monitor is such a waste. 1440 would be a mismatch mess.
 
I can see how 1080p would look good on a 4k screen if it is small enough.

No interpolation is necessary. You can simply fit 4 pixels in 1, so if the screen is small enough (or you sit far enough away) the experience shouldn't be significantly different from native 1080p
From my understanding that rarely how it is done, they use a very similar algorithm (if not exactly the same) for all resolutions, screen that allow pure pixel scaling are more the exception that the norm and I would imagine it is quite easy to beat anyway.

Playstation-Xbox console for example do not seem to worry and go all around 800p to 1800p with dynamic and zone resolution.
 