> People are going to excitedly turn this on for the first time, hoping to double their FPS "for free", and then realize it's a blurry mess. And only two or three games will use it within the next year, then it disappears forever. Like everything else AMD has done, aside from FreeSync.
lol, as much as I love AMD, my thoughts exactly.
> Are you kidding?? IF they implement it? It is all but guaranteed developers will implement it. First, since AMD makes the chips for the PlayStation and Xbox, it will have to be implemented by everyone who writes titles for the consoles, and porting it to PC games will be easy work. Second, AMD has already provided developers with the porting kits. Third, since legacy Nvidia cards will be supported by the AMD software, there is added incentive for developers to implement it, since they will have a larger market for their games. Nobody will opt out, as the loss of sales to competitors who do implement it would be devastating to their success. DLSS will be dead within a year and Nvidia will have to join in supporting this open standard.
The reason I don't think FSR will become too mainstream comes down to one thing: Nvidia's ability to work with *cough* bribe game developers, both large and small.
> lol, as much as I love AMD, my thoughts exactly.
https://www.techpowerup.com/review/...olution-quality-performance-benchmark/10.html looks very nice to me.
> Anyone know if this will improve a card's performance for general productivity? I have three 10-series cards, and use them for photo/video editing...
AMD has been asked by journalists/YouTubers, and their response is basically: our plans are for games, but it's an open-source shader, so someone could do it if they wanted to.
> Plenty of reviews are out and it looks fairly impressive. The consensus seems to be that if you're at 4K, then Ultra Quality and Quality mode are both very acceptable, and if you're at 1440p, you want to stick with Ultra Quality. Of course, if you've got a weak GPU then you're going to have to sacrifice visual quality even more and use a lower setting, but you're already having to lower settings anyway.
And as noted by a few places now, 1080p upscaled to 4K with FSR looks better than 1440p with normal resolution scaling to 4K.
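For anyone wanting the actual numbers behind those mode names: FSR 1.0's quality modes each correspond to a fixed per-axis scale factor (per AMD's published documentation: 1.3x Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance). A quick sketch of the render resolutions they imply at a 4K target:

```python
# Per-axis scale factors for FSR 1.0's quality modes, per AMD's docs.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(target_w, target_h, scale):
    """Source resolution the game renders at before FSR upscales to target."""
    return round(target_w / scale), round(target_h / scale)

for mode, scale in FSR_MODES.items():
    w, h = render_resolution(3840, 2160, scale)
    print(f"4K {mode}: renders at {w}x{h} ({scale}x per axis)")
```

Note that Quality at a 4K target renders at exactly 2560x1440, and Performance at 1920x1080, which is the "1080p upscaled to 4K" case discussed above.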
> No, you miss what I'm saying. FSR looks good, very good, and I do personally hope it catches on and more game devs implement it. I am also fully aware that it will be very easy to implement. But, that said, you underestimate (or don't grasp at all) the far reach Nvidia has into both large and small game developers. So it's not about how good it looks or how easily it can be implemented; Nvidia's reach into developers is the big wrench in the gears. It's always been the case: Nvidia works very, VERY closely with large and small game developers, providing them with equipment, software tools, personnel, and more importantly incentives to assist them with their games. THAT is where AMD will run into a wall with FSR.
Yeah, there is no doubt that Nvidia is going to double down on DLSS. I certainly won't complain about that, though, as it's getting better with each implementation. I just wish all these games would update to the newest version of DLSS so I don't have to manually go put a DLL file in.
> The reason I don't think FSR will become too mainstream comes down to one thing: Nvidia's ability to work with *cough* bribe game developers, both large and small.
With AMD's complete lock on two of the three main consoles, and those consoles upscaling sub-4K to 4K virtually 100% of the time when plugged into a 4K TV, it's hard to imagine FSR not rapidly becoming the norm if it is indeed a step up over the way upscaling is done now. And if it's not better than the current upscaling used on those AMD GPUs, then it's nothing special.
> To answer the title question, no. Those on even worse cards get higher FPS at lower display quality, and that's it. AMD will not convert a 1080 Ti into an RTX 3070. Calm down a bit; if performance came that easily, new series of graphics cards would not come out every year.
um, yeah, it will help.
> um, yeah, it will help.
I looked at Linus's coverage: much worse picture, and only a few games support it. Anyone who has already played on a weaker card at native resolution will certainly not go back to playing with a worse picture; better to lower the details in games and get a higher frame rate.
wat?
no one claimed that.
calm down and maybe watch/read some reviews...
> I looked at Linus's coverage: much worse picture, and only a few games support it. Anyone who has already played on a weaker card at native resolution will certainly not go back to playing with a worse picture; better to lower the details in games and get a higher frame rate.
wat? wat? and yeah.
The GTX 1050 Ti and GTX 1650 Super are capable of native resolution at medium quality without FSR, and the 1080 Ti is a high-end class card with no need for FSR.
But if FSR makes someone happy, let them use it.
> wat? wat? and yeah.
New generations of graphics cards come out every year; that's something to keep in mind.
> New generations of graphics cards come out every year; that's something to keep in mind.
thanks tips, I still can't make sense of half of what you're typing...
> If you're really serious about astrophotography/streaming, check out the MallinCam HyperPlus color camera. There are quite a few folks streaming deep-space video from around the globe using it in conjunction with their 12-inch-plus Schmidt-Cassegrain telescopes, and the imagery is stunning when conditions are right for viewing.
wat?!
What happens if you combine both FSR and DLSS? DLSS would be used to generate the initial frame from a lower resolution, like 720p -> 1440p, then FSR takes 1440p -> 4K. 3x FPS boost, maybe?
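No game actually exposes that chained setup, but the napkin math on pixel counts is easy to sketch. Note this only compares pixels shaded per frame; real FPS gains are far smaller, since geometry, game logic, and the upscale passes themselves aren't free:

```python
# Napkin math for the chained-upscaler idea: compare the pixels shaded
# per frame at native 4K vs. rendering at 720p and upscaling twice
# (720p -> 1440p -> 4K). Assumes a purely fill-rate-bound workload,
# which real games are not, so treat the ratio as an upper bound.

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
render_720p = pixels(1280, 720)  # 921,600 pixels

print(f"Native 4K shades {native_4k / render_720p:.1f}x the pixels of 720p")
# -> Native 4K shades 9.0x the pixels of 720p
```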
FSR sounds really simple to implement, so I hope it sees wide adoption. Though its output at lower resolutions needs some improvement; I guess that's unavoidable, since the source frame has too little data to work with.
> I hope there's a way to inject it into older games, because it looks pretty damn good, but a lot of the games that would benefit from it the most are stuff I already own and am 99% sure won't get a patch for it.
These are early days; who knows how many games will support it in the future. It could become so useful that it's just expected to be patched into games released within the last ~5 years.
Nothing that I currently play supports it.
> Will Battlefield 2042 have FidelityFX or DLSS / ray tracing, or all of them? As far as I know it's one of the few games that works better on ATI cards and is not sponsored by Nvidia.
Well, BF1 has RT, so I assume 2042 will also.
> These are early days; who knows how many games will support it in the future. It could become so useful that it's just expected to be patched into games released within the last ~5 years.
Warzone also, as far as I've heard.
One thing I want to know more about: I've heard some DLSS users complain that it feels like it adds a layer of latency, which is really felt when playing fast FPS games. Wonder if there's any truth to that at all...
> You mean BF5; BF1 has no DLSS or RT.
Whoops, yes, BF5 has RT.
> Hmm, why is everyone so concerned with super high resolutions like 4K? I play games at 1080p and it's super crisp. Well, perhaps if it does allow me to make 480p into 1080p with no performance impact, it'll be good. Thank you.
Well, maybe... AMD recommends a 570 or equivalent Nvidia card as the minimum, but people are finding it works on lower cards, some not so well though...
Edit: blimey, this will legit make 2GB cards relevant again!!
> If I had a 1080 Ti I'd be doing jumping jacks if I was just buying a 4K display. Honestly, I'm ready to get one for my card now.
My Pascal Titan X was what had me switch to a 4K display, and I happily gamed at that resolution until I replaced it with a 2080 Ti. There was no DLSS or FSR back then.
> Makes me wonder how far the VRAM can be pushed with texture packs plus FSR, as some games use over 5 GB at 1080p.
I'm sure you'll still want to stay within the VRAM limits...
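For what it's worth, FSR itself adds relatively little on top of what the game already uses: the main extra cost is the output-resolution color targets for its two passes (the EASU upscale and the RCAS sharpen), assuming the implementation can't write in place. A rough estimate, assuming 4 bytes per pixel (RGBA8; HDR formats would double this):

```python
# Rough VRAM cost of the extra output-resolution color buffers an FSR
# integration might allocate: one target for the EASU upscale pass and
# one for the RCAS sharpen pass. Assumes RGBA8 (4 bytes/pixel); an HDR
# format like RGBA16F would double the figure.

BYTES_PER_PIXEL = 4

def fsr_buffer_mib(out_w, out_h, num_buffers=2):
    """Approximate MiB for num_buffers color targets at output resolution."""
    return out_w * out_h * BYTES_PER_PIXEL * num_buffers / (1024 ** 2)

print(f"4K output: ~{fsr_buffer_mib(3840, 2160):.0f} MiB")    # two 4K RGBA8 targets
print(f"1440p output: ~{fsr_buffer_mib(2560, 1440):.0f} MiB")
```

So on the order of tens of MiB at 4K; the texture pack itself, rendered at the lower internal resolution, is still the thing that eats the VRAM.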
> If you're really serious about astrophotography/streaming, check out the MallinCam HyperPlus color camera.
/r/VXJunkies is leaking.
> Well, maybe... AMD recommends a 570 or equivalent Nvidia card as the minimum, but people are finding it works on lower cards, some not so well though...
Yeah, there is a limit to what this can revive. I confirmed I could make it work on a card as old as a 750 Ti (the last NV card I still had around before I went back to team red, lol), although the overhead it requires starts to add up. It still got me a few extra FPS, just not enough to really make any difference. lol