TechSpot Tests the Battlefield V DXR Ray Tracing Performance and Implementation

cageymaru

TechSpot has tested Battlefield V to see how the Microsoft DirectX Raytracing (DXR) implementation in the WWII shooter affects performance. They tested all 4 DXR settings in-game to see how they differ in looks and performance at various resolutions. The NVIDIA graphics cards used include the RTX 2080 Ti, RTX 2080, and RTX 2070. It is important to note that DICE is planning a patch to fix the Medium DXR setting as it is not properly applied in-game at this time.

The RTX 2070 deserves a special mention, because this card can barely be classed as ray tracing compatible. Delivering performance barely over 30 FPS at 1080p with DXR reflections set to low, it's clear that 36 RT cores and 6 gigarays per second of ray tracing performance doesn't cut it for actual real time ray tracing. We’d be bitterly disappointed if we bought an RTX 2070 for ray tracing, only to be left with that sort of performance.
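Quick back-of-envelope on that 6 gigarays figure, treating it as an ideal peak (it ignores BVH traversal, shading, and denoising, which eat most of the real budget):

Code:
# Idealized upper bound: rays per pixel per frame at 1080p/60,
# taking the quoted RTX 2070 figure at face value.
rays_per_second = 6e9        # marketing "6 gigarays/s" peak
pixels = 1920 * 1080         # 1080p
target_fps = 60

rays_per_pixel = rays_per_second / (pixels * target_fps)
print(f"~{rays_per_pixel:.0f} rays per pixel per frame")  # ~48

Even that optimistic ~48 rays per pixel assumes the card hits its peak number continuously; real workloads land far below it, which is how you end up at 30 FPS.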
 
IMO a somewhat flawed review, because they didn't bother benchmarking with other settings lowered from, say, ultra to high to see if that allowed a more playable frame rate with almost no loss in shadow/lighting/AA quality. They also put in the opinion at the end that other titles will have issues, when those may perform somewhat better given that DX12 is fairly broken in the Frostbite engine currently.

I do agree that the 2070 is a joke, though. They should not have bothered shipping the card with ray tracing as an advertised feature.
 
IMO a somewhat flawed review, because they didn't bother benchmarking with other settings lowered from, say, ultra to high to see if that allowed a more playable frame rate with almost no loss in shadow/lighting/AA quality. They also put in the opinion at the end that other titles will have issues, when those may perform somewhat better given that DX12 is fairly broken in the Frostbite engine currently.

I do agree that the 2070 is a joke, though. They should not have bothered shipping the card with ray tracing as an advertised feature.

I don't agree. People are paying a premium price for a 2080 Ti and expect ultra performance for what they pay for.
 
I don't agree. People are paying a premium price for a 2080 Ti and expect ultra performance for what they pay for.

It's one game. Ultra is relative. Expectations are subjective. RTX is an optional, bleeding-edge toggle.

All the bandwagon pseudo-outrage over how other people spend their money is kind of ponderous.
 
I think it's too early to say it's a total bust. Battlefield V is probably not a great candidate. I'd think slower-paced, story-based games, etc. might fare much better. Games with spell effects or MOBAs might look pretty cool with it.
 
It's going to take some significant process shrinks to make this viable long term; the Turing die size is already enormous, so tripling the RT core count is out of the question for the time being. And really, are we going to keep pumping out massively oversized dies at 7nm and 4nm to support 200+ RT units? Hopefully DLSS is a much tamer technology, which should give cards like the 2070 a better chance to justify their existence.
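To put rough numbers on the shrink argument, here's an idealized sketch; marketing node names like 7nm don't map cleanly to real feature sizes, and actual shrinks deliver less than this naive square law:

Code:
# Naive die-area scaling: area shrinks with the square of the
# linear feature size. TU102 (the 2080 Ti die) is ~754 mm^2 on 12 nm.
tu102_area_mm2 = 754

def scaled_area(area_mm2, old_nm, new_nm):
    # Ideal shrink only; real-world scaling is considerably worse.
    return area_mm2 * (new_nm / old_nm) ** 2

for node_nm in (7, 4):
    print(f"{node_nm} nm: ~{scaled_area(tu102_area_mm2, 12, node_nm):.0f} mm^2")
# 7 nm: ~257 mm^2, 4 nm: ~84 mm^2

Even by that generous math, tripling the RT hardware on top of everything else pushes the die right back toward Turing territory.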
 
I figure the backlash is more on QA than on performance at this point. We already knew RTX came with a huge penalty.
 
We already knew RTX came with a huge penalty.

Yes, but I'll admit to being surprised at the 2070 results. I took for granted (OK - shouldn't have) that an expensive card offered with RTX could do something useful with that feature. Had I bought the card with the expectation of eventually enjoying that capability, getting 30 FPS would leave me pretty unhappy. The quantitative results make this a very interesting review.
 
Wow...who would have thought that the first implementation of an emerging technology would yield mediocre results?
 
Wow...who would have thought that the first implementation of an emerging technology would yield mediocre results?
Should have been reserved for demos from Nvidia. Until it can be enabled without such a ridiculous performance impact, it has no business being on expensive, high-end consumer cards. Its very presence reduces the potential performance of these cards, because so much of the die is wasted on an immature tech you will only turn on long enough to say "neat," and then never use again. It's at a bare minimum one full generation away from being useful for the top end, never mind the mainstream.
 
I don't agree. People are paying a premium price for a 2080 Ti and expect ultra performance for what they pay for.

I bought a 2080 Ti. The expectation was to play at 4K with ultra settings at a decent frame rate in modern titles.

I never thought that enabling DXR, however, would allow me to do what I just mentioned. I've been around long enough to know that any time a big feature like this comes out, you aren't going to be able to crank the settings all the way up and still expect decent FPS, even with the best hardware you can buy. Throughout most of computing history it's been this way.

So I think it isn't very helpful to only show benchmarks with the rest of the settings at ultra, when you could have enabled RTX, lowered some settings to high, and maybe still achieved a decent frame rate.

This is why I like how HardOCP does reviews these days: they show me what settings I'd have to drop to achieve a playable frame rate at a given resolution. And if the sacrifice to get raytracing is that I only need to drop TAA to low, disable stupid stuff like motion blur, and drop shadows from ultra to high, I think that's worth doing if the payoff is raytracing.

The way the TechSpot review is designed essentially says: OMG, raytracing is unusable. That opinion is a perfectly valid one, but it can't be taken seriously, since they didn't do a more comprehensive rundown of what adjusting various settings would have done to make things playable. If the end result is that you'd have to drop all the other settings to medium/low to get raytracing running on a particular RTX card, then yes, the opinion that raytracing is unusable on that card becomes a bit more valid, as proven by the benchmarks.

But again, I think a lot of users would disagree that DXR isn't usable on a card if one only had to drop the rest of the settings from ultra to high. In most games, dropping things from ultra to high isn't even that noticeable compared to the payoff of having raytraced reflections/shadows.

Finally, I think Nvidia's marketing department screwed up using BF5 as the first consumer-available game for raytracing features. Given the poor performance in DX12 mode compared to DX11 mode, and BF5 generally being a faster game where FPS matters more, they'd have been better off making sure Shadow of the Tomb Raider or the new Metro game was actually the first. They made the wrong choice betting on DICE/EA here.
 
IMO a somewhat flawed review, because they didn't bother benchmarking with other settings lowered from, say, ultra to high to see if that allowed a more playable frame rate with almost no loss in shadow/lighting/AA quality. They also put in the opinion at the end that other titles will have issues, when those may perform somewhat better given that DX12 is fairly broken in the Frostbite engine currently.

I do agree that the 2070 is a joke, though. They should not have bothered shipping the card with ray tracing as an advertised feature.

If I spent $1200 on a video card, it had better play all the games with all the eye candy on max.
 
If I spent $1200 on a video card, it had better play all the games with all the eye candy on max.

You can believe that all you want, and that's your opinion; it has no historical basis for being correct.

I won't even touch on "max" being entirely subjective given DSR and differing resolution requirements, or whether you think 30/60/120 FPS is an acceptable frame rate.
 
Everyone already knew performance with RTX on would be horrible.

You don't buy an RTX 2080 Ti for ray tracing, lol. You buy it for the raw horsepower. I have a 2080 Ti and I never expected to use it for ray tracing. The performance won't be there for at least another four years, and that's only if they can increase RT performance 30% in 2020 and another 30% in 2022.
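For what it's worth, compounding those two hypothetical 30% bumps, with the review's ~30 FPS 2070 result at 1080p/low as a purely illustrative baseline:

Code:
# Two generations of +30% RT throughput compound multiplicatively.
gain = 1.30 * 1.30     # ~1.69x, i.e. +69% total
base_fps = 30          # roughly the 2070's 1080p DXR-low result
print(f"compound gain: {gain:.2f}x")             # 1.69x
print(f"projected: ~{base_fps * gain:.0f} FPS")  # ~51 FPS

Even after both jumps you'd still be shy of a locked 60, which is why I say at least another four years.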
 
DICE games generally are not that GPU-demanding for x80 Ti cards at ultra settings; 1440p at 140+ FPS and 4K at 70+ FPS per game/GPU generation is a good indicator of that. If this much performance loss happens with BF, I can only imagine the slideshow we'll see with Metro Exodus.

I agree with others: give us a GTX without the tensors, and cram as many other cores as possible on the die for a true generational gain.
 
Everyone already knew performance with RTX on would be horrible.

You don't buy an RTX 2080 Ti for ray tracing, lol. You buy it for the raw horsepower. I have a 2080 Ti and I never expected to use it for ray tracing. The performance won't be there for at least another four years, and that's only if they can increase RT performance 30% in 2020 and another 30% in 2022.

Then why did Nvidia market the card differently? RTX is all over the box and the hardware itself. Nvidia even had a huge press conference going over the details of it and proclaimed it the best thing since sliced bread in the PC gaming world. C'mon, you know people bought it for that! Yes, the raw performance is great; everyone already knew that. This card was marketed as a feature-set card.
 
Then why did Nvidia market the card differently? RTX is all over the box and the hardware itself. Nvidia even had a huge press conference going over the details of it and proclaimed it the best thing since sliced bread in the PC gaming world. C'mon, you know people bought it for that! Yes, the raw performance is great; everyone already knew that. This card was marketed as a feature-set card.

Because it's still the first card to support raytracing in hardware. Any company on the face of the earth would do what they did.
 
Because it's still the first card to support raytracing in hardware. Any company on the face of the earth would do what they did.

That's not what the issue is. The issue is that Nvidia didn't show their hand. They tiptoed around performance numbers, actually tried to hide performance numbers (Tomb Raider), and still marketed the card as revolutionary. Well, 40 FPS is not revolutionary, reflections or not.
 
 
IMO a somewhat flawed review, because they didn't bother benchmarking with other settings lowered from, say, ultra to high to see if that allowed a more playable frame rate with almost no loss in shadow/lighting/AA quality. They also put in the opinion at the end that other titles will have issues, when those may perform somewhat better given that DX12 is fairly broken in the Frostbite engine currently.

I do agree that the 2070 is a joke, though. They should not have bothered shipping the card with ray tracing as an advertised feature.

Why? Nvidia doesn't compromise on price, so why should owners of these cards compromise on performance?
 
1080p w/ low settings and barely a playable frame rate.

lolololol
If you like turn-based games and nausea, it can still be played at 1440p. :)
I managed to play about 15 minutes before I started feeling seasick.
Still kind of hoping it was just a clumsy add-on vs. something built specifically for RTX.
Does make you wonder why they'd ever go through with RTX support on BFV, if this is what their internal testing looked like.

Maybe they are trying to drive a resurgence of SLI?
 
I ran through and recorded part of one mission @ 1080p with everything maxed out, and displayed FPS + GPU usage.

[embedded YouTube video]

On my machine, a 2950X with a 2080 Ti, I had a solid 60 FPS practically the entire time. (I have two 2080 Tis, but SLI doesn't seem to be supported.)

I'll admit, the game looks beautiful, and far better than with RTX off or on low. YouTube, on the other hand, doesn't do it justice, as it's horribly compressed. I could have uploaded a 4K video to get better compression, but it would have been an 11 GB upload.

Now, the one MAJOR downside: despite having 60 FPS, input felt horribly laggy. I'm talking like half a second of input delay. There's absolutely no way I'd be able to play multiplayer with RTX on. With it off, input felt fine.
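To put that in perspective, taking my half-second estimate at face value:

Code:
# Half a second of input lag at 60 FPS, expressed in rendered frames.
delay_s = 0.5
fps = 60
print(f"~{delay_s * fps:.0f} frames between input and response")  # ~30 frames

Thirty frames of latency is unplayable for a shooter no matter what the FPS counter says.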
 
Skipping the 20 series till raytracing matures, along with cooling. Have a 1080 Ti; will wait a while to upgrade again.
 
At least you can notice the cool effects of ray tracing while stuttering along at 20-30 FPS!
 
I ran through and recorded part of one mission @ 1080p with everything maxed out, and displayed FPS + GPU usage.

[embedded YouTube video]

On my machine, a 2950X with a 2080 Ti, I had a solid 60 FPS practically the entire time. (I have two 2080 Tis, but SLI doesn't seem to be supported.)

I'll admit, the game looks beautiful, and far better than with RTX off or on low. YouTube, on the other hand, doesn't do it justice, as it's horribly compressed. I could have uploaded a 4K video to get better compression, but it would have been an 11 GB upload.

Now, the one MAJOR downside: despite having 60 FPS, input felt horribly laggy. I'm talking like half a second of input delay. There's absolutely no way I'd be able to play multiplayer with RTX on. With it off, input felt fine.


Thank you for your service. And yeah, you're right, the YouTube compression isn't doing it any favors.
 
IMO a somewhat flawed review, because they didn't bother benchmarking with other settings lowered from, say, ultra to high to see if that allowed a more playable frame rate with almost no loss in shadow/lighting/AA quality. They also put in the opinion at the end that other titles will have issues, when those may perform somewhat better given that DX12 is fairly broken in the Frostbite engine currently.

I do agree that the 2070 is a joke, though. They should not have bothered shipping the card with ray tracing as an advertised feature.

If I remember correctly from their video, they didn't see much difference in quality or frame rate on high and medium, so they skipped those... Turns out there was a bug (since patched) that meant the high and medium settings didn't actually apply when you clicked them unless you were starting from the low setting. I think you had to set it to low, quit out, and then come back in to change to medium or high.
 
RTX 2070 class-action lawsuit!... misleading advertising promoting the cards as ray-tracing compatible... so the 2080 Ti cards blow up and catch fire and the 2070 cards suck at ray tracing... why would anyone buy this new gen?
 
You can believe that all you want, and that's your opinion; it has no historical basis for being correct.

I won't even touch on "max" being entirely subjective given DSR and differing resolution requirements, or whether you think 30/60/120 FPS is an acceptable frame rate.
I dunno, 1200 dollars is a lot of long islands you can dump into a woman with a pretty mouth and severe daddy issues...
 