Nvidia Claims Latest BFV Update Can Increase RTX Performance by 40%

AlphaAtlas

EA DICE posted patch notes for Battlefield V yesterday, which briefly mentioned that they were adding DLSS and RTX "optimizations" to the game without going into any detail. That update was delayed until today, and now Nvidia has published some more specifics. They claim that, with the addition of Deep Learning Super Sampling (DLSS), framerates in Battlefield V can be increased by up to 40% with RTX on. Thanks to cageymaru for the tip.

Check out Nvidia's Battlefield V DLSS video here.

DLSS is available with the following GPUs, with DXR ray tracing turned on, at the resolutions listed below:
3840x2160: All RTX GPUs
2560x1440: RTX 2060, 2070, and 2080
1920x1080: RTX 2060 and 2070

Now, you can play Battlefield V with DXR ray tracing at higher settings, resolutions and detail levels across our entire range of GeForce RTX GPUs, thanks to the addition of DLSS. The following are our new recommended settings for 60 FPS Battlefield V gameplay:
RTX 2060: 2560x1440, High Quality Preset, Medium DXR, DLSS On
RTX 2070: 2560x1440, Ultra Quality Preset, Medium DXR, DLSS On
RTX 2080: 2560x1440, Ultra Quality Preset, Ultra DXR, DLSS On
RTX 2080 Ti: 3840x2160, Ultra Quality Preset, Ultra DXR, DLSS On
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?


Yeah, that one is a head-scratcher.
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?
Lame and disappointing, a lot like the game itself.
 
Because Nvidia loves to neuter some feature in their top-end cards. After people beg the appropriate amount, they may re-enable it.
 
Hmm, that was a bit of a surprise. I didn't expect that DLSS would be segmented by card. I can kind of see why, since DLSS is, at its heart, upscaling from a lower resolution. They may be saying that the 2080 Ti doesn't need it at anything lower than 4K, but cutting it out completely is kind of a surprise. This introduces even more fragmentation down the line, because if you're gaming with a 2080 Ti on a high-refresh 1440p monitor, you may or may not get DLSS support in any given title, even if other RTX cards do. Just... bleh.
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?

They probably haven't run it through their supercomputer yet, or however it works.

I'm surprised DLSS + DXR isn't a subscription service!
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?
So that your framerate doesn't get so high that you won't want to upgrade to Nvidia's next offering.
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?

I run the same setup as you. I believe it's because DLSS would make performance worse: it likely takes a fixed amount of time (let's say 5 ms) to process each frame, negating any performance benefit once you hit a certain framerate.
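If that's the mechanism, the arithmetic is easy to sketch. Here's a back-of-the-envelope model of it in Python; the 5 ms fixed cost and the 50% render-time saving are made-up illustrative numbers (nothing Nvidia has published), just to show how a fixed per-frame cost flips from a win to a loss as native framerate climbs:

```python
# Toy model of the tradeoff described above: DLSS renders at a lower internal
# resolution (saving shading time), then spends a roughly fixed amount of
# time upscaling each frame. All numbers are illustrative assumptions.

def fps_with_upscaling(native_fps: float,
                       render_scale: float = 0.5,
                       upscale_cost_ms: float = 5.0) -> float:
    """FPS after scaling render time by render_scale and adding a fixed cost.

    native_fps      -- framerate at full resolution, no upscaling
    render_scale    -- fraction of the full-res render time remaining after
                       dropping the internal resolution (0.5 = half)
    upscale_cost_ms -- assumed fixed per-frame cost of the upscaling pass
    """
    native_ms = 1000.0 / native_fps
    return 1000.0 / (native_ms * render_scale + upscale_cost_ms)

# 60 FPS native (16.7 ms/frame): 16.7 * 0.5 + 5 = 13.3 ms -> 75 FPS. A win.
print(round(fps_with_upscaling(60.0), 1))   # 75.0

# 165 FPS native (6.1 ms/frame): 6.1 * 0.5 + 5 = 8.0 ms -> ~124 FPS. A loss.
print(round(fps_with_upscaling(165.0), 1))  # 124.5
```

With those assumed numbers, the break-even point is a 10 ms native frame time (100 FPS): below that, the cheaper render saves more than the fixed pass costs; above it, DLSS would actually lose frames. That would line up with Nvidia gating DLSS away from the fastest card/resolution combos.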
 
Wow, does that blur a lot. The dirt in the first scene was really blurry, and then the grass in the next one looked really messed up.
 
I run the same setup as you. I believe it's because DLSS would make performance worse: it likely takes a fixed amount of time (let's say 5 ms) to process each frame, negating any performance benefit once you hit a certain framerate.
OK, that sounds reasonable; thanks for suggesting it. As this was just a video, do you have a link to Nvidia discussing this tradeoff in depth so I can learn more? I understand you might not have that, but if so, please share!
 
Why do they even publish this? It only makes the technique look bad. Even if the FPS is better, it just looks so much worse; so much blurrier, to my sharp eyesight anyway.
 
So much for DLSS being a thing for 2080 Ti owners, yet so many told me to just wait and see how much performance it would bring, and what superior image quality...
 
Yikes... DLSS looks like hot garbage. At least RTX offers noticeable visual improvements at times, but the degraded image quality with DLSS so far is not impressive, even if it offers a boost in FPS. From what I have seen, it appears to be barely better than using resolution scaling. Maybe over time DLSS will improve, but for now it does not impress.
 
Wouldn't it be better if the AA engine knew all the 3D models in the game, and just did AA off of that information?
 
Right away I thought the DLSS version looked blurrier. I wonder how it would look if they were rendering at 4K to begin with.
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?

Same boat for me, just with a 144 Hz panel; not really sure why it's like that.
 
Is this better? Kyle, plan on doing any image quality testing on this?

[Attached image: blur.jpg]
 
From that video, every time I paused, the image was nicer in the non-DLSS version... more blurry and less RT in the DLSS version, in my opinion.
 
I can't understand why DLSS wouldn't be available to the higher-end cards at lower resolutions. So if I picked up a 2080 Ti for my 1440p 165 Hz screen, I could not enable DLSS, but if I had gotten the 2080 or lower, I could. What benefit is there to locking things out in such a way?

Same boat for me, just with a 144 Hz panel; not really sure why it's like that.

I run the same setup as you. I believe it's because DLSS would make performance worse: it likely takes a fixed amount of time (let's say 5 ms) to process each frame, negating any performance benefit once you hit a certain framerate.

I think that is likely it. It becomes a self-negating boost that ultimately just lowers the visual quality. I was really hoping DLSS was going to be a better performance boost, but it lowers image quality from what I can see. The "deep learned" shortcut for AA can make stuff look worse. It's an interesting idea; maybe with more work it will improve, but I'm not sold on it yet.
 
I think that is likely it. It becomes a self-negating boost that ultimately just lowers the visual quality. I was really hoping DLSS was going to be a better performance boost, but it lowers image quality from what I can see. The "deep learned" shortcut for AA can make stuff look worse. It's an interesting idea; maybe with more work it will improve, but I'm not sold on it yet.

Originally I thought that DLSS would calculate the difference between 1080p and 4K and come up with an algorithm to go from 1080p to 4K with no perf hit. Marketing got me, I guess, lol.
 
What a clusterfuck. They are going to screw over everyone that has a 2080 Ti with a 144 Hz+ screen at 1440p? Really, Nvidia? I got a good deal on a 2080 Ti here in the forums, but man, let's fuck over those who paid 1300+ and don't really have 4K because they wanted higher-refresh monitors.

The principle of the thing bothers me. I never really cared about RTX or DLSS, but whatever made them leave the 2080 Ti out hanging at 1440p, if true, is a head-scratcher. I can see a lot of pissed-off people with 1440p 144 Hz+ monitors and a 2080 Ti.
 
DLSS was available for me on my 2080 Ti at 3440x1440... It looks terrible, man. At first I thought, I remember BFV looking sharper than this... then I looked in the options and saw DLSS was on. I turned it off and it was instantly nice and sharp...

Having said that, on the map I was playing I was routinely over 100 FPS with DXR on Ultra and DLSS off, with the odd dip down to 85-90 FPS... I don't really need DLSS... There is some weird stuttering going on, though; it almost feels like network lag, but my ping to the server was consistently 35 or so...

Honestly, the implementation of these new features leaves a lot to be desired in BFV... so buggy.
 
I played with the new update last night... DXR looks amazing. I can crank the settings to Ultra and the effect is really nice. Turn on DLSS... and the screen looks like one giant motion blur. Also, the per-card neutering is downright shitty of Nvidia... not to mention no CrossFire/SLI/mGPU support for people with a dual-card setup. EA DICE used to be really good about supporting multi-card setups... look at BFBC2/BF3/BF4: the game scaled really well. Sad about the non-support for multi-card setups.
 
I come from a photography background, and most AA efforts in that field have been detrimental to image quality.

While I understand the need for keeping the jaggies in check, nothing does it better than high resolution.

Everything else is a compromise, IMHO.
 
DLSS worked for me on my 3440x1440 monitor with my 2080 Ti. As per my post above, it looks horrible and I would avoid using it. I'd probably turn other options down instead if I were running short on framerate...
 