Battlefield V NVIDIA Ray Tracing RTX 2080 Performance @ [H]

FrgMstr

Battlefield V NVIDIA Ray Tracing RTX 2080 Performance

Can the NVIDIA GeForce RTX 2080 muscle up the performance to play Battlefield V with ray tracing enabled at 1080p and 1440p now that the new RTX performance patch has been released? We will find out using an ASUS ROG STRIX RTX 2080 OC and testing Battlefield V using 64-player MULTIPLAYER. Everything just works?

If you like our content, please support HardOCP on Patreon.
 
I really would like to know why there is such a big performance drop going from DX11 to DX12. It's not like DX12 gives you any enhanced graphics over what DX11 offers.
 
Nvidia's had performance issues on DX12 with the 10XX series and the 9XX series. I wonder if it's the architecture. Sure, DX12 is M$'s take on AMD's Mantle, but it's not like running Vulkan in Doom caused a penalty for Nvidia cards at the time.
 
I really would like to know why there is such a big performance drop going from DX11 to DX12. It's not like DX12 gives you any enhanced graphics over what DX11 offers.

DX12 is just pretty shit for performance in general.
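For context on why an unoptimized DX12 path can lose to a mature DX11 driver: under DX12 the engine, not the driver, owns command buffering and CPU/GPU synchronization. Below is a minimal, illustrative submission skeleton (assuming the standard d3d12.h headers; error handling omitted, and this is not DICE's code):

```cpp
// Minimal D3D12 submission skeleton (illustrative sketch; link against
// d3d12.lib). All of this bookkeeping is work the DX11 driver handled
// internally; an engine doing it naively can easily run slower than DX11.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SubmitOneFrame()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // The engine now owns queue, allocator, and command list lifetimes.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&list));

    // ... record draw calls and resource barriers here ...
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists);

    // Explicit CPU/GPU sync: fence plus event, no driver hand-holding.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEventW(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);  // naive stall; real engines buffer frames
    CloseHandle(done);
}
```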
 
Man, this just isn't going so well for Nvidia, is it? I get that real-time ray tracing is a monster. I get that it's new tech. But come on - at the end of the day - no one pays upward of $900 for a video card to get barely playable performance at 1080p.
 
Man, this just isn't going so well for Nvidia, is it? I get that real-time ray tracing is a monster. I get that it's new tech. But come on - at the end of the day - no one pays upward of $900 for a video card to get barely playable performance at 1080p.

I get that, and it's true. But I think back to the days of Crysis. No card could run that game at acceptable frame rates. It took quite a while until we finally got them. Ray tracing has always been one of the Holy Grails of computer graphics. You only have to listen to game devs talk about it with such enthusiasm. I know performance with these cards is nowhere near what we would like for the crazy prices Nvidia is asking. But it is a step in the right direction for the tech itself. Personally, I would love AMD's next generation to come out with it, even if the performance is comparable.
 
I get that, and it's true. But I think back to the days of Crysis. No card could run that game at acceptable frame rates. It took quite a while until we finally got them. Ray tracing has always been one of the Holy Grails of computer graphics. You only have to listen to game devs talk about it with such enthusiasm. I know performance with these cards is nowhere near what we would like for the crazy prices Nvidia is asking. But it is a step in the right direction for the tech itself. Personally, I would love AMD's next generation to come out with it, even if the performance is comparable.

While that's very true, nobody was led to believe that it would work well at the time. Can't do ray tracing playably at 4 megapixels? For that price? C'mon.
 
I agree, the price is by far the biggest fail with these new cards from Nvidia. If they were priced the same as or close to the last Pascal series, I doubt anyone would be moaning about the performance. Nvidia really made a meal of this launch.
 
I'm almost okay with this type of performance. It harkens back, in a way, to when maximum game settings didn't result in perfect performance.

On the other hand, it's a single technology holding back performance, and I don't know whether it's really worth it, as opposed to the games where it was multiple settings, and pushing all the polygons resulted in markedly increased fidelity.

Probably not articulating myself properly, but oh well.
 
So, ray tracing on an Nvidia RTX 2080 is about as good as VR was with an AMD RX 480? Perhaps we can call it a draw for the respective marketing depts? :D
 
I really like this article. Very informative. Can't wait for the 2080 Ti article to tie this all together.

I agree with the above comment that Nvidia maybe should have just left this to a Titan-type card to get everyone excited about it, then on the next series made it more available to mainstream-type cards.
I have witnessed going back a step in performance to get a new visual effect, like shadows, but this seems like two steps back instead of one.
 
[H] might want to try a higher thread count processor. I couldn't figure out why your performance is so far below mine, but I run a 2700X with 16 threads vs. the test setup's 7700K at 8 threads.

I know during development they targeted 12-thread CPUs. The recommended spec is an 8700K or above.

It could very well be the difference. Might help with the DX12 decrease too?

I know you're busy, but I think it makes sense.

Merry Christmas!!
 
[H] might want to try a higher thread count processor. I couldn't figure out why your performance is so far below mine, but I run a 2700X with 16 threads vs. the test setup's 7700K at 8 threads.

I know during development they targeted 12-thread CPUs. The recommended spec is an 8700K or above.

It could very well be the difference. Might help with the DX12 decrease too?

I know you're busy, but I think it makes sense.

Merry Christmas!!
Lol. Yeah. Four cores, eight threads at 5GHz is killing performance... just in ray tracing.

Straws?
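Either way, it's trivial to check what the game actually sees. A tiny standalone sketch (hypothetical helper, not part of [H]'s test suite) that reports the logical processor count being debated here:

```cpp
// Hypothetical standalone check of the logical processor count discussed
// above -- a 7700K reports 8, a 2700X reports 16.
#include <iostream>
#include <thread>

int main()
{
    // May return 0 if the count cannot be determined.
    unsigned n = std::thread::hardware_concurrency();
    std::cout << "Logical processors: " << n << '\n';
    std::cout << (n >= 12 ? "Meets" : "Below")
              << " the 12-thread development target mentioned above.\n";
}
```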
 
This will never be playable until someone can fix DX12 performance. I remember a review showing Nvidia cards taking a bigger performance hit with DX12 than AMD cards, but either way all cards took a hit.

Can't remember any other DX successor running slower with the same features.

Any game that supported Mantle ran much smoother, so there's something seriously broken with DX12.
 
Merry Christmas, Kyle, and thanks for posting something for us to read today.

I am just not seeing a good reason to upgrade these days. While I applaud Nvidia for bringing out a new technology, at this point it does not seem to provide any benefit. There is no game out where this makes for a better experience. Am I correct in thinking that Battlefield at 4K, or even 1440p, looks better than 1080p with RTX?
 
Why are you testing DXR with multiplayer? Nobody was going to be using it for MP to begin with, as most gamers want steady performance above all else. Plus the maps are significantly larger in multiplayer, with dozens more people than in single player. This is a rather poor series of articles that doesn't reflect where most gamers would use DXR to begin with. And agreeing with the person above, I also have much better overall stability and performance with DXR than what you're showing, using an overclocked 2080 at 1440p Ultra. That is in single player, however; MP is closer to your numbers.

As for the rest of the people calling it a failure of some sort: are you serious? Or entitled? Soft shadows: completely unplayable at release. Antialiasing: completely unusable at release. It took years and new techniques for them to become usable.

As for pricing? HardOCP already covered this. Top-of-the-line cards have always been really expensive, and have been for twenty years. The 2080 and 2070 are better values than the 1080 Ti and 1080 were at launch once you adjust for inflation. Hell, a GeForce 2 Ultra would be $700-750 in today's money. I feel salty gamers haven't been around long enough, or forgot the past along the way.
 
DX12 is just pretty shit for performance in general.

It's mostly just the Frostbite engine. If you look at other DX12 titles like Rise of the Tomb Raider, the differences in performance are pretty minimal. That being said, I feel developers really aren't bothering to take advantage of the features within DX12 and are just putting it in there to say it has it.
 
I find it hilarious that supposedly there's going to be an RTX 2060... What's the point of ray tracing if only the $1,400 Ti can do it with any kind of performance?
 
Thanks for a nice Christmas present, Mr Bennet! No surprises in your analysis. Coming from a 1070, I'm happy with my RTX 2080's performance in Battlefield with DXR on, bearing in mind the early-adopter tax, and even happier in all the other games I play without DXR. The RTX series is not only about "allowing entry into playable performance with DXR enabled," so I feel it's perhaps a little disingenuous to withhold a recommendation solely on that basis. If you keep in mind that this is first-gen tech, the RTX series is the best of both worlds, DXR on and off ☺
 
Why are you testing DXR with multiplayer? Nobody was going to be using it for MP to begin with, as most gamers want steady performance above all else. Plus the maps are significantly larger in multiplayer, with dozens more people than in single player. This is a rather poor series of articles that doesn't reflect where most gamers would use DXR to begin with. And agreeing with the person above, I also have much better overall stability and performance with DXR than what you're showing, using an overclocked 2080 at 1440p Ultra. That is in single player, however; MP is closer to your numbers.

As for the rest of the people calling it a failure of some sort: are you serious? Or entitled? Soft shadows: completely unplayable at release. Antialiasing: completely unusable at release. It took years and new techniques for them to become usable.

As for pricing? HardOCP already covered this. Top-of-the-line cards have always been really expensive, and have been for twenty years. The 2080 and 2070 are better values than the 1080 Ti and 1080 were at launch once you adjust for inflation. Hell, a GeForce 2 Ultra would be $700-750 in today's money. I feel salty gamers haven't been around long enough, or forgot the past along the way.
Nice one. Certainly are a lot of salty RTX haters lurking in the environs ☺
 
Glad I got the 2070; with the 2080 I would have paid 50% more for a 10% improvement in RT. Better to save the money for future cards that can max out RT at 1440p at a reasonable price.
 
Thanks for this three-parter. It has really shed clear light on the numbers comparisons for RT on these cards. Pretty obvious that NV's nickel-and-dime approach of a minute RT core increase (10 RT cores) makes little to no difference between these cards. I expect there will be a bit more of a gap for the Ti. I do wonder how much of the added cost in the RTX line really came from the RT/Tensor cores and how much was just made up for profit.
 
DX12 is just pretty shit for performance in general.
That sentiment really comes from the fact that DX12's main advantage, when game engines aren't specifically optimized for it, shows up on lower-end hardware. I see performance gains on my system in DX12 mode in the two games I have that can use it.

I'm almost okay with this type of performance. It harkens back, in a way, to when maximum game settings didn't result in perfect performance.

On the other hand, it's a single technology holding back performance, and I don't know whether it's really worth it, as opposed to the games where it was multiple settings, and pushing all the polygons resulted in markedly increased fidelity.

Probably not articulating myself properly, but oh well.
It's not just a single technology; the whole reason ray tracing was built into BFV was Nvidia's RTX cards. RTX, DXR, and BFV are all part of the same package, and it is not doing well. You would think that a new technology being launched, one currently exclusive to a single series of hardware, on one version of Windows, and used in only one game, would have been released in a best-case scenario. This is not the case.
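That hardware exclusivity is visible right in the API: DXR support is reported through a standard D3D12 feature query, and at the time only Turing RTX cards reported a supported tier. A minimal sketch (assuming the Windows 10 1809 SDK headers; error handling omitted):

```cpp
// Sketch: query whether the installed device/driver exposes DXR at all.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

bool SupportsDXR()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;

    // RTX (Turing) cards report TIER_1_0; everything else at the time
    // reported NOT_SUPPORTED, which is why BFV's DXR toggle was RTX-only.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```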
 
All things considered, the performance drop from just the change of API is already a real deal-breaker at 4K. I imagine some additional optimization can still be had just by improving the DX12 renderer, which could in turn make RT more palatable. That 17 fps at 1440p gets real damn close to a 60 fps average, which isn't bad at all for first-gen RT.

So, ray tracing on an Nvidia RTX 2080 is about as good as VR was with an AMD RX 480? Perhaps we can call it a draw for the respective marketing depts? :D

Not in this case. The RX 480 was marketed as a premium VR experience, but the competitor offered a much better one. Here, the competition does not offer anything better (or anything at all, for that matter) in the real-time ray tracing department.
 
"Similar to the previous review, we have an expensive $869.99 video card just being able to play 1080p resolution with NVIDIA Ray Tracing enabled. Doesn’t quite sound right to spend that much money just to play a game at 1080p with just playable performance does it? The price required to even get playable performance at a minimum seems too high for us to recommend it. It is better than the RTX 2070, but it just allows entry into playable performance with DXR enabled in Battlefield V. It’s quite a lot of money to just get entry performance into a new feature. Everything just works? Not really...at least not in NVIDIA's ray tracing launch title."

As someone who had an Orchid Righteous 3D card right when they became available, it was a great card for single-player games. Did I play competitive (vs. other opponents) Quake with it? No (at 640x480 it ran at around 30 FPS). Did I play single-player Quake with it? Yes, along with Tomb Raider (such an awesome game) and many other games once driver wrapper support became available. There should be a distinction between tech that supports 60 frames and 120 frames, or single-player and competitive multiplayer games. I've always felt this way about 3D Vision: it is absolutely fantastic for single-player titles but terrible for competitive games (well, perhaps not competitive driving games) due to the frame rate and the crosshairs being wonky. Bottom line: new tech that is not ready for the multiplayer mainstream yet. I could also mention that when NVIDIA released hardware T&L, it was slow.

There is a reason why 1080p 144Hz monitors are still what I suggest everyone buy: a consistent 100+ frame rate is king for many types of gaming, and it is hard to get that with a 1440p monitor. What does a 1440p monitor net you, better image quality? I will take the experience (high frame rate coupled with better responsiveness through controls) over better image quality (more pixels) almost every time.
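For context on why consistency matters as much as the average, here is a minimal sketch of the usual frametime math: average FPS plus the "1% low" figure that exposes stutter. The sample numbers are made up purely for illustration:

```cpp
// Sketch: average fps and 1%-low fps from a list of frametimes.
#include <algorithm>
#include <functional>
#include <iostream>
#include <numeric>
#include <vector>

int main()
{
    // Hypothetical capture: mostly ~9 ms frames with one 25 ms stutter.
    std::vector<double> frametimesMs = { 8.3, 9.1, 8.7, 25.0, 8.9, 9.0 };

    double avgMs = std::accumulate(frametimesMs.begin(), frametimesMs.end(), 0.0)
                   / frametimesMs.size();

    // 1% low: average fps over the slowest 1% of frames (at least one frame).
    std::vector<double> sorted = frametimesMs;
    std::sort(sorted.begin(), sorted.end(), std::greater<double>());
    size_t n = std::max<size_t>(1, sorted.size() / 100);
    double worstAvgMs = std::accumulate(sorted.begin(), sorted.begin() + n, 0.0) / n;

    std::cout << "Average fps: " << 1000.0 / avgMs << '\n';
    std::cout << "1% low fps:  " << 1000.0 / worstAvgMs << '\n';
}
```

A high average with a poor 1% low is exactly the "erratic" feel described in the multiplayer results above.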
 
For Part 3, I'd request that playable settings using DXR be explored for each card instead of just all maxed-out settings. As in: at 1440p on the 2070/2080/Ti using DXR, what are the maximum other graphics settings one can use with ray-traced reflections?

I do not see the win here with DXR in BFV: improved reflection IQ while degrading everything else at a much lower resolution. Really, going from 4K playability with maxed-out settings in DX11 down to choppy 1080p in DX12 with maxed-out settings and Low or Medium DXR!

CPU usage might be interesting to explore: does DXR use additional threads over non-DXR gameplay? Does it hit the CPU harder, or less?

As for DX12 vs. DX11, that in itself is destroying performance and thus DXR's potential. WTH is the issue there? That is what I am wondering, and probably only DICE/Nvidia would know. In short, BFV is a no-go for DXR technology so far.
 
I really would like to know why there is such a big performance drop going from DX11 to DX12. It's not like DX12 gives you any enhanced graphics over what DX11 offers.

Pretty unfortunate that their DX12 implementation somehow isn't up to par. With a Q6600 and a 7870, when the Mantle patch hit in BF4, it literally doubled or tripled my framerate on some maps. The performance uplift was absolutely absurd: like 35 fps to ~90 in extreme cases.

Was surprised they went for DX12 instead of Vulkan too...

I actually saw their rendering guy answer this question on Twitter some time back: "Frostbite's DX11 renderer is just _really_ good."
 
I really like this article. Very informative. Can't wait for the 2080 Ti article to tie this all together.

I agree with the above comment that Nvidia maybe should have just left this to a Titan-type card to get everyone excited about it, then on the next series made it more available to mainstream-type cards.
I have witnessed going back a step in performance to get a new visual effect, like shadows, but this seems like two steps back instead of one.

Loved the trip down memory lane there! I remember loading up F.E.A.R. for the first time and having to dial shadow quality back a whole bunch, but still being impressed the whole time. Ah, memories!
 
Without bothering to read all the posts, and to answer the question: after the update it's quite playable on an RTX 2080 at 1440p, getting around 75-85 fps with RTX set to Low.
 
Looking at this video playing at 4K Ultra, single- and multiplayer, the VRAM requirements with DXR are unreal, hitting close to 11GB! I wonder if the 2080/2070 are running out of VRAM in this title? Also, the CPU is an 8700K; this game may indeed need more than 8 threads or more than 4 cores with DXR, but I could not find any applicable data other than EA recommending an AMD 2700 or an 8700.

Edit: In the multiplayer section of the video, VRAM goes over 10GB; in single player it is over 8GB. Even at 4K that is a lot of VRAM being used, and maybe the 2070/2080 are running out of VRAM in multiplayer at 1440p. I do recommend tests be done with different CPU core counts. This game, or DXR, may have made 4-core CPUs obsolete.
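For anyone who wants to check VRAM pressure on their own rig, the budget/usage numbers overlays report come from a documented DXGI query. A minimal sketch (first adapter only; error handling omitted):

```cpp
// Sketch: report current VRAM usage vs. the OS-assigned budget via DXGI.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

void PrintVramUsage()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);   // first (primary) adapter

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);                 // QueryVideoMemoryInfo needs IDXGIAdapter3

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("VRAM in use: %.1f GB of %.1f GB budget\n",
                info.CurrentUsage / 1e9, info.Budget / 1e9);
}
```

Once CurrentUsage approaches the budget, the OS starts demoting resources to system memory, which would show up as exactly the kind of erratic framerates described here.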
 
Looking at this video playing at 4K Ultra, single- and multiplayer, the VRAM requirements with DXR are unreal, hitting close to 11GB! I wonder if the 2080/2070 are running out of VRAM in this title?

I thought the very same. I've noticed most sites have ceased posting VRAM usage metrics, and I apologize if I missed them in these two reviews. At 4K I'd say it's almost a given they're hitting the VRAM ceiling when using fully/manually maxed settings. It's even possible at 1440p. Most AAA games I've gotten since ROTTR/RE7/MEA have all hit over 8GB at 4K. When I had my 1080 SLI setup I often saw how it would cost me FPS. When I lowered settings to get under those 8GB I saw major improvements. I know the old belief that games will use VRAM if you have it, but these days I honestly feel it should be more like: if you don't have it, you won't get it. Regardless, can't wait until Kyle & team get pt. 3 out for the final conclusions.
 
I thought the very same. I've noticed most sites have ceased posting VRAM usage metrics, and I apologize if I missed them in these two reviews. At 4K I'd say it's almost a given they're hitting the VRAM ceiling when using fully/manually maxed settings. It's even possible at 1440p. Most AAA games I've gotten since ROTTR/RE7/MEA have all hit over 8GB at 4K. When I had my 1080 SLI setup I often saw how it would cost me FPS. When I lowered settings to get under those 8GB I saw major improvements. I know the old belief that games will use VRAM if you have it, but these days I honestly feel it should be more like: if you don't have it, you won't get it. Regardless, can't wait until Kyle & team get pt. 3 out for the final conclusions.
Also, the system RAM was over 11GB as well :LOL:; 16GB for a gaming machine is the bare minimum for an enthusiast. Anyway, the erratic framerates Brent saw make me wonder if the cause is actually VRAM limitations on the 2070 and the 2080 in multiplayer. I cannot find one review looking at CPU workload with and without DXR; since it is part of the DX12 API, it looks like it will take additional CPU threads/cores when used. It is a new technology which has not been explained at all in terms of how it affects your gaming rig.
 
I have to totally agree. Waiting for pt. 3, but I've run this game with the rig in my sig at 1080p/120Hz and 4K/60Hz with fully, manually maxed settings in DX12, except for DXR on Medium, blur-related effects off, and lens effects/aberration off, and it was impressive to me. I only played it in single player, but I did test all the single-player campaigns for about an hour each. France and Norway each had some impressive visuals. I'm using nearly identical settings to those Kyle used for his Strix 2080 Ti review, but I toned them down a little just so I could keep the fans under 75% and the card under 65C.

At the time I was testing this game I'd been 'benching my brains out' between the new GPU, OC'ing my CPU a bit further to 4.3, and tweaking fans etc. for optimal noise/temp compromises now that I only have one GPU, so I'm a bit foggy on my exact FPS. But I do remember being surprised (100-118 fps at 1080p and 50-60+ at 4K) considering all the bad hype back then. I did this all just after the 'optimized' RT drivers came out, but had not even tried at game release.
 
Just did some quick retesting at 4K (4096x2160). FPS are the same as I remember them. The CPU was at 40-54% usage. VRAM averaged 9.1-9.4GB, and it didn't seem to change at all going from Medium to Ultra DXR. I'm sure if I lowered the resolution to 3840x2160 it'd still be ~9GB, so I think this establishes that the game does use more than 8GB at 4K. I only retested the Norway/France campaigns since there's not much to ray trace in a desert ;)
 
Metro Exodus, or a good top-down isometric RPG without a lot of demands, might make the card shine.
 
Also would like to see playable settings with DXR enabled for all three cards. Maybe the 2070 would become playable with High BFV settings instead of Ultra?
Maybe you could run the 2080 at High/Medium at 1440p?

It's a lot of work for you though, and you should have some holiday time. :)
 