Battlefield V NVIDIA Ray Tracing RTX 2070 Performance @ [H]

FrgMstr

Battlefield V NVIDIA Ray Tracing RTX 2070 Performance

Can the NVIDIA GeForce RTX 2070 muscle up the performance to play Battlefield V with ray tracing enabled at 1080p and 1440p now that the new RTX performance patch has been released? We will find out using an ASUS ROG STRIX RTX 2070 OC and testing Battlefield V using 64-player MULTIPLAYER. Everything just works?

If you like our content, please support HardOCP on Patreon.
 
Great review.

I agree that BFV was probably not a good choice to launch with this technology.

It does actually look pretty nice when you stop to look for the reflections, but BF is a fast paced game and the effect is kind of wasted.

Hopefully we'll see better implementations in the future.
 
Harsh but fair. Gonna skip this generation as I need a card that drives VR and 4K without crunching my bank account.
 
Nice breakdown. A few questions:

1. JayZ saw better performance with RT on in multiplayer vs. single player. Did you see that?

2. Do you see RT as an "essential" part of BF V? I.e., are the non-RT modes so much worse looking that the game isn't worth buying if you don't turn RT on?

3. Did the cards run hotter/pull more power with RT active?
 
Summary of the entire line, really. By the time we get cards that can do RT respectably at reasonable prices, they will be powerful enough to do even better alternatives to RT anyway. The whole idea of RT is to reduce complexity and dev time, making current fidelity faster and easier and higher fidelity possible. But RT has plenty of drawbacks and needs its own hacks just to match what we do today, and adding it as another layer on top of rasterization only complicates matters further. RT isn't the answer yet, and we're not ready. Yay, value.
 
RTX in general is a bomb, from many perspectives. Sorry Nvidia, but you get $0.00 from me this gen, and possibly next gen if you keep this up.
 
I can't wait until something like Hitman or Assassin's Creed uses DXR. I think that, combined with DLSS, you'll see playable frame rates in those kinds of single-player games, and the reflections et al. could actually be an integral part of gameplay.

In higher-player-count, fast-paced games like BF or the various battle royale games it simply can't be used in as exciting a way, if it's even noticed at all. The other things the RT cores can do will IMHO become much more important than these tech demos show. For example, if there appears to be a gap in a fence or a car frame that you should be able to shoot through, the RT system could "see" that from your point of view and allow the shot.
 
Great review.

I agree that BFV was probably not a good choice to launch with this technology.

It does actually look pretty nice when you stop to look for the reflections, but BF is a fast paced game and the effect is kind of wasted.

Hopefully we'll see better implementations in the future.

Yea, I would like to see a stealth game utilizing this.
 
The whole idea of RT is to reduce complexity and dev time, making current fidelity faster and easier and higher fidelity possible. But RT has plenty of drawbacks and needs its own hacks just to match what we do today, and adding it as another layer on top of rasterization only complicates matters further.

This is a solid point. Ray tracing, in general, is meant to relieve developers of having to hand-author lightmaps and shadowmaps, or use cheap tricks for reflections, reducing their workload.

However, just to make NVIDIA's implementation of ray tracing (on reflections only) even close to playable, it required a buttload of extra work between NVIDIA and DICE to optimize it (and the answer was to reduce the amount of ray tracing, at that), and even then apparently only on the most expensive of the RTX video cards.

For something that is supposed to reduce workload, it seems to have so far done the opposite.
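To make the "skip the baked lightmaps/shadowmaps" point concrete: the core of a ray-traced shadow query is just "cast a ray toward the light and see if anything blocks it", with no precomputed data at all. Below is a deliberately tiny Python sketch of such a query against sphere occluders. It's purely illustrative; real DXR traverses a BVH of the whole scene on dedicated hardware, and every name and number here is made up for the example.

```python
import math

def ray_sphere_hit(origin, direction, center, radius, max_t):
    """True if the ray origin + t*direction (normalized) hits the sphere for 0 < t < max_t."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's 'a' term is 1
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_t  # small epsilon avoids self-intersection at the surface

def in_shadow(point, light, occluders):
    """Shadow test: cast one ray from the shaded point toward the light."""
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    d = [v / dist for v in to_light]
    return any(ray_sphere_hit(point, d, c, r, dist) for c, r in occluders)

light = (0.0, 10.0, 0.0)
occluders = [((0.0, 5.0, 0.0), 1.0)]                  # one sphere hovering above the origin
print(in_shadow((0.0, 0.0, 0.0), light, occluders))   # point directly under the sphere
print(in_shadow((5.0, 0.0, 0.0), light, occluders))   # point off to the side
```

The tradeoff the post describes falls straight out of this: the cost of every query moves from bake time to every single frame, which is exactly where the RTX performance hit comes from.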
 
Nice breakdown. A few questions:

1. JayZ saw better performance with RT on in multiplayer vs. single player. Did you see that?

2. Do you see RT as an "essential" part of BF V? I.e., are the non-RT modes so much worse looking that the game isn't worth buying if you don't turn RT on?

3. Did the cards run hotter/pull more power with RT active?
I like JayzTwoCents, but the moment he mentions NVIDIA I leave his video. He has seemed very shilly on all things NVIDIA to me lately. I also think Bitwit and Paul are not as blunt as necessary. Lately I like [H], Tech Jesus, and the UFD kid from South Africa for less diplomatic opinions on these things.
 
With DXR off, why such differences in fps between DX12 and DX11? I remember this being an issue when DX12 was introduced, but it's still happening? Why?
 
Great review.

I agree that BFV was probably not a good choice to launch with this technology.

It does actually look pretty nice when you stop to look for the reflections, but BF is a fast paced game and the effect is kind of wasted.

Hopefully we'll see better implementations in the future.

This needs to be emphasized. I like ray tracing, I want ray tracing in games. However, with the extreme performance hit on today's hardware ray tracing needs to be used in very select situations and games. The genre of game matters. This is not the type of game that demonstrates the technology well to the benefit of the gameplay experience.

Metro Exodus? That game might be a lot better for it.
 
Nice breakdown. A few questions:

1. JayZ saw better performance with RT on in multiplayer vs. single player. Did you see that?

2. Do you see RT as an "essential" part of BF V? I.e., are the non-RT modes so much worse looking that the game isn't worth buying if you don't turn RT on?

3. Did the cards run hotter/pull more power with RT active?

I haven't tested single player. Does single player even matter in this game?

No, I think the game looks great without ray-traced reflections... Does it look better? Only if you stop and look at it, and then you are ded, D E D, ded.

I have not looked at power or temperature while testing yet; I'll take note of it in the next article (we are going to do the 2080 next). Nothing stuck out at me while testing to give it any notice, if that means anything.
 
This needs to be emphasized. I like ray tracing, I want ray tracing in games. However, with the extreme performance hit on today's hardware ray tracing needs to be used in very select situations and games. The genre of game matters. This is not the type of game that demonstrates the technology well to the benefit of the gameplay experience.

Metro Exodus? That game might be a lot better for it.

I'm sure from NVIDIA's perspective it's better to have one AAA title that actually uses the tech than zero. Charging a huge premium for the new generation with no games currently using it certainly won't help them move units. Kinda reminds me of AMD and DX12: all that effort they put in to get ahead of NVIDIA in that department was a waste, because most people did not care, and for the longest time DX12 performance just sucked.
 
This is a solid point. Ray tracing, in general, is meant to relieve developers of having to hand-author lightmaps and shadowmaps, or use cheap tricks for reflections, reducing their workload.

However, just to make NVIDIA's implementation of ray tracing (on reflections only) even close to playable, it required a buttload of extra work between NVIDIA and DICE to optimize it (and the answer was to reduce the amount of ray tracing, at that), and even then apparently only on the most expensive of the RTX video cards.

For something that is supposed to reduce workload, it seems to have so far done the opposite.
It's also why I seriously doubt AMD's answer to RT will be RT. RT has lots of inherent limitations that prevent it from doing things rasterization currently does easily. It's like trading one set of cons that add tons of dev time to hack around for another set you have to learn to work with as effectively as possible. It's almost a step back, so to speak. They know this, so they use it as an addition, which turns an already high dev load, one that is actively being pushed onto third-party engine makers just to keep timelines sane, into an even larger time investment. As I said, we're already at the point where studios offload dev time to third-party engines or have spin-off teams working on in-house engines separately, largely because of this. AdoredTV has a really good video covering RT; I'll link it below. Anyone who hasn't seen it probably should, to understand why this RT hype is even less appealing than it seems.

 
With DXR off, why such differences in fps between DX12 and DX11? I remember this being an issue when DX12 was introduced, but it's still happening? Why?

Why indeed. That's a big part of why DXR is so slow; you are already starting from a decreased performance stance.

If I were DICE, I'd try to fix DX12 performance so it actually helped; then turning on DXR would not be so bad. /shrug
 
Thanks for the review. Hopefully the 3070 will be a worthwhile upgrade from a 1070. The 2070 certainly doesn't seem to be.
 
I can't wait until something like Hitman or Assassin's Creed uses DXR. I think that, combined with DLSS, you'll see playable frame rates in those kinds of single-player games, and the reflections et al. could actually be an integral part of gameplay.

Nvidia dropped the ball this gen on pricing/performance, and it's too early for RT. Games like the ones you mentioned can only use RT as an add-on. Are developers going to go way out of their way to inject it into their games? There isn't RT on consoles, and as much as we don't like them, I really think that is what will determine whether RT gets real use or is just bolted onto triple-A games. They would have been better off making a card that plays ALL games at 4K 60 fps + HDR with an HDMI 2.1 chipset (if not 120 fps).
 
Harsh but fair. Gonna skip this generation as I need a card that drives VR and 4K without crunching my bank account.

I agree. However, it's too soon to make a real judgment on this generation of RTX cards. I suspect there are still optimizations to be made in games, and the bottom line is this is the only RTX title out yet. Regardless, this is new tech, and new tech always seems slow in the beginning; it gradually gets better over time with improved software, drivers, and hardware. RT has to start somewhere. Also, DICE needs to get their DX12 act together, period.
 
Great review as always. Thanks. For fuck's sake, people, you should not mention Jayz in this forum; he has no clue...! :D
 
I tinkered with testing this in single player a couple of weeks back with my Strix 2080 Ti on a TV that supports HDR at 1080p/1440p/2160p. First, I have to say I was concerned about heat and didn't notice anything relevant. I'd been playing/testing that day for about 4-6 hours already, so everything was up to normal temps; if anything did spike, my fan curves held it. In terms of power, I didn't check my UPS. At anything less than 4K it held or exceeded 60 fps, no problem. At 4K I saw some dips into the 45-55 range, but it was trying to hold 60 fps. This was with all settings manually maxed, except any kind of blur turned off, and in DX12. I do totally agree that at higher resolutions DXR really shined. I spent a considerable amount of time in the French campaign just looking at things. I also agree this is totally the wrong genre for this tech. An RPG, or something like Resident Evil, would've been better suited.
 
It's amazing that after all this time DICE still hasn't gotten their DX12 implementation to perform properly at a base level; it's just throwing away underlying performance when running DXR. id manages just fine, as did IO with Hitman 2016, and DICE sure as shit has more resources than IO does.
 
Nice breakdown. A few questions:

1. JayZ saw better performance with RT on in multiplayer vs. single player. Did you see that?

That's not uncommon with BF games. They have a lot more client-side-only features enabled in single player that can't be enabled in multiplayer due to excessive server load/data transfer/risk of desync. E.g., there are typically more destructible/fully destructible buildings in SP, whereas in multiplayer it's just partial/scripted destruction. Also, from what I saw of the little SP I played, there's less texture pop-in/out, since high frame rates aren't as important there as they are in multiplayer.
 
Why indeed. That's a big part of why DXR is so slow; you are already starting from a decreased performance stance.

If I were DICE, I'd try to fix DX12 performance so it actually helped; then turning on DXR would not be so bad. /shrug

This. Granted, I know this game was supposed to showcase ray tracing for the RTX series of video cards and how purty it can make floors and windows look, but come on.
This is not a stroll-through-the-woods, look-at-the-pretty-textures game. The people I know who purchased an RTX card couldn't give a fuck about ray tracing in this game.

None of them have even enabled DXR, and the ones that have a 2070 WILL NOT run it in DX12, let alone at "Ultra" settings. As for myself, I use a mixture of high/medium with max draw distance in every BF game I have played. Max frames are key; fuck the aesthetics.

Thanks for your hard work Brent.
 
Thanks for the review, [H]!

I really want to like the RTX series.

But, I don't. These ridiculous prices don't provide any meaningful ROI, in my eyes.

If the price points shift to where they *should* be (like previous gens), then I would likely be a 2080Ti buyer in the $700-800 range to replace my $720 980Ti.
 
OK, screw the RTX series. I play at 1080p and the 2070 still doesn't cut it so I'll just buy a 1070Ti when I see them hit below $300 new hopefully soon.
 
I can't wait until something like Hitman or Assassin's Creed uses DXR. I think that, combined with DLSS, you'll see playable frame rates in those kinds of single-player games, and the reflections et al. could actually be an integral part of gameplay.

In higher-player-count, fast-paced games like BF or the various battle royale games it simply can't be used in as exciting a way, if it's even noticed at all. The other things the RT cores can do will IMHO become much more important than these tech demos show. For example, if there appears to be a gap in a fence or a car frame that you should be able to shoot through, the RT system could "see" that from your point of view and allow the shot.


Sorry to be the bearer of bad news, but DLSS and RT can't (realistically) be used at the same time. RT REQUIRES the tensor cores to denoise the raw, low-sample-count RT output; without denoising, that output looks about the same as white noise. Because those tensor cores are busy cleaning up the RT output, they can't ALSO perform DLSS on the image without a huge slowdown, which kind of misses the point...
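As a rough illustration of why a denoising pass is non-negotiable at one-ish ray per pixel: each pixel's estimate is essentially a coin flip around the true brightness, and only filtering turns it into a usable image. This toy 1-D sketch uses a plain box filter as a stand-in; it is not NVIDIA's tensor-core denoiser, just a way to show the idea, and all numbers are invented for the example.

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 0.5  # the brightness every pixel should converge to with enough rays

# 1 sample per pixel: each pixel randomly "sees" the light or doesn't -> pure noise
raw = [1.0 if random.random() < TRUE_VALUE else 0.0 for _ in range(256)]

def denoise(img, radius=4):
    """Toy spatial denoiser: average each pixel with its neighbors (box filter)."""
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

smooth = denoise(raw)
print(statistics.pstdev(raw))     # large: the raw 1-spp "image" is basically noise
print(statistics.pstdev(smooth))  # much smaller after filtering
```

Real denoisers are far smarter (they use motion vectors, depth, and learned weights to avoid smearing edges), but the basic job is the same: trade raw per-pixel variance for spatial/temporal reuse.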
 
Well, my replacement for my first MSI Duke 2070 will arrive Wednesday, since the first one space-invadered. Hopefully I'll get a chance to actually try out some games this time. Doesn't look like I'll get to enjoy any actual ray tracing, but I never expected to. I was hoping for DLSS, but I game at 1080p ultrawide (hence the 2070), and so far the only "game" to support DLSS only supports it at 4K...

But hey, hopefully this replacement card will actually work. My expectations are pretty damn low at this point. Should have bought the Vega 64...
 
The review is good, but I kind of disagree with your conclusion. The main problem is that BFV's DX12 is the same as BF1's, which was not up to par, so it is dragging RT down. That is not an NVIDIA issue but a DICE one;
DX12 should run at least at DX11 level.
So the way I see it, RT should have run around where DX12 without RT is now, basically a 25% hit relative to DX11-level performance.
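The arithmetic behind that estimate, with hypothetical FPS numbers (not [H]'s measurements): the hits multiply rather than add, so a DXR penalty taken on top of a DX12 deficit lands well below the same penalty taken off a clean DX11 baseline.

```python
def expected_fps(dx11_fps, dx12_penalty, dxr_penalty):
    """Multiplicative hits: DXR runs on top of the already-slower DX12 path."""
    dx12_fps = dx11_fps * (1.0 - dx12_penalty)
    return dx12_fps * (1.0 - dxr_penalty)

# Illustrative numbers only:
dx11 = 100.0
print(expected_fps(dx11, 0.00, 0.25))  # 75.0 -> the "should be" case: 25% off DX11
print(expected_fps(dx11, 0.15, 0.25))  # ~63.75 -> same RT hit stacked on a 15% DX12 deficit
```

That gap between 75 and ~64 fps is exactly the poster's complaint: the DX12 deficit gets billed to ray tracing even though DICE, not the RT hardware, is responsible for it.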
 
Almost ironically, not a very good value proposition over a year-plus-old Vega...
The review is good, but I kind of disagree with your conclusion. The main problem is that BFV's DX12 is the same as BF1's, which was not up to par, so it is dragging RT down. That is not an NVIDIA issue but a DICE one;
DX12 should run at least at DX11 level.
So the way I see it, RT should have run around where DX12 without RT is now, basically a 25% hit relative to DX11-level performance.
Nvidia has often taken more of a DX12 hit than their competitor.
 
So the way I see it, RT should have run around where DX12 without RT is now, basically a 25% hit relative to DX11-level performance.
Yeah, it should, but it does not.

The DX12 implementation is on DICE. But RTX sits on top of it, so it is still the performance you get, and NVIDIA picked this game as its RTX launch partner. DX12 may not be their direct fault, but it is still their problem.
 
It really makes one wonder why NVIDIA's overall RTX technology rollout has been so embarrassingly, mind-bogglingly bad... What the hell was NVIDIA thinking/smoking?

They pinned the rollout on a single game for demo purposes (Battlefield V), and it manages to completely underwhelm on all cylinders and come across as a completely half-baked thing with very little real, meaningful appeal as far as tech showcasing goes.

Hell, why didn't they just start with a series of canned, internally developed demos, something they could put out there to really show off the new tech's potential? (I.e., like demos they've released in the past that show off specific capabilities, where you can run them yourself and turn the tech on/off to demonstrate exactly why you would want it in a game to begin with and really showcase all of its goodness.) Something that actually lets you see the benefits and potential that could be realized by using it?

Sure, getting developers to adopt this tech and have it working within a game is a harder nut to crack, takes time, and requires a funding/investment/teaming push... but why the hell did they choose to do the rollout this way to begin with!? It's almost like they were hell-bent on providing a solution without a clear, demonstrable problem to solve, and really wanted to sabotage themselves as much as possible. It's not like they needed to rush this to market because of any real competition. They could have waited until it was at least baked to the point of being edible before rolling it out.
 
Reading that review is a bit of a surprise. I actually thought that RT on would bring it totally to its knees, as in single-digit figures. 1440p is pretty much unplayable, but 1080p doesn't appear too bad at all. NVIDIA certainly didn't choose the best showcase to show off its brand-new hardware, as DX12 performance is not as refined as DX11. I would actually like to see a Vulkan game running RT; this is where it truly could shine. Guess there might be a bit of a wait on that one.
 
I would actually like to see a Vulkan game running RT; this is where it truly could shine. Guess there might be a bit of a wait on that one.
NVIDIA has released extensions to support RT on Vulkan, so it's possible today.

However, DX12 and Vulkan are very similar and should have similar performance characteristics. It's really just up to the skill of the developer to optimize the code.

Granted, DICE are very good game developers, and their engines do look nice for sure, but they aren't optimized to the level of id's (well, no one's are, really).
 
Disappointing results, great review.

Lighting in this game uses normal hardware- and shader-based methods. No ray tracing there, which is the main reason to do ray tracing in the first place; it's just not working here.

All shadows use traditional game methods such as shadow maps, no ray tracing. It ain't working.

Ambient occlusion, how light really bounces around, shades, and bleeds color, where ray tracing is almost the only accurate approach, is not used here. "It just works" is not what an RTX card delivers; ray tracing does work, but what NVIDIA has does not.

Caustics and subsurface scattering are also not used.

So only one aspect of ray tracing was used, reflections, and it tanked performance, even though it only covers a limited amount of screen space. Real ray tracing would cover 100% of everything, shown or not. I would not even call this ray tracing, but a very limited, assisted use of ray tracing, and even that tanks performance.

NVIDIA has a very limited ray-tracing option for gaming. This is also an area where multiple-card support would have been key.
 
Agreed, noko. Even worse is that the one lobby floor in Rotterdam is so unrealistically mirror-like. The cheaper screen-space reflections actually look better.
 
The article contains blatantly wrong information: the amount of ray tracing has never been reduced. In fact, reflection quality has been improved with the recent patch. Reflection clarity is higher because the denoising is better, and reflections now include even more objects, like grass and leaves, through an SSR implementation on top of ray tracing. Before the patch, none of those objects were reflected in the first place.

See Digital Foundry's analysis, with input from DICE itself.

 
The review is good, but I kind of disagree with your conclusion. The main problem is that BFV's DX12 is the same as BF1's, which was not up to par, so it is dragging RT down. That is not an NVIDIA issue but a DICE one;
DX12 should run at least at DX11 level.
So the way I see it, RT should have run around where DX12 without RT is now, basically a 25% hit relative to DX11-level performance.

It kinda boggles the mind why DX12 is such a stumbling block for devs. I guess it's a clue to how, under DX11, a big fat driver layer is holding everybody's hands, rearranging shaders and injecting bits of code into the engine to paper over coding cracks. In DX12 that layer isn't there, and the intimate knowledge of GPU internals and coding best practices needed to get that performance, knowledge the driver teams have, just isn't there among game devs. I saw a blog years ago (that I'm trying to find again) in which an ex-driver dev laid out how bad game devs were at sticking to the DirectX spec and how janky their code was. The guy said that after they profiled games, they'd end up injecting custom shaders and wholesale "proper" bits of code into the game engine at runtime. It might not be the case now, but in the not-too-distant past the driver guys had to come clean up shoddy code all the time, which isn't an option in DX12 with its much thinner driver layer.
 