Battlefield V NVIDIA Ray Tracing RTX 2070 Performance @ [H]

The article contains blatantly wrong information: ray tracing has never been reduced, EVER. In fact, reflection quality has improved with the recent patch. Reflection clarity is higher because the denoising is better, and reflections now include even more objects, like grass and leaves, through an SSR implementation on top of the ray tracing. Before the patch, none of these objects were reflected in the first place.

Here, see the Digital Foundry analysis, with input from DICE itself.



The idea is that the first ray tracing implementation was slower, way slower. Given how ray tracing works, you can choose what you will and won't trace; it clearly takes a lot of power to ray trace things, which is why it was slower. Unless DICE simply didn't know how to do this, they deliberately limited which objects and landscapes were traced, and that part is documented.
Nvidia pays developers. I am very sure that when contracts are signed there is a clause saying they will defend, or retract, any kind of negative statement regarding the technology used. Free lip service, and Nvidia's mantra over the years when problems arise has been "it is not us, it is them".
It kinda boggles the mind why DX12 is such a stumbling block for devs. I guess it's a clue that with DX11 a big fat driver layer is there holding everybody's hands, rearranging shaders and injecting bits of code into the engine to paper over coding cracks. In DX12 that layer isn't there, and the intimate knowledge of GPU internals and coding best practice that the driver teams have just isn't there among game devs. I saw a blog years ago, which I'm trying to find again, where an ex driver dev laid out how bad game devs were at sticking to the DirectX spec and how janky their code was. He said that after they profiled games they'd end up injecting custom shaders and wholesale 'proper' bits of code into the game engine at runtime. It might not be the case now, but in the not too distant past the driver guys had to come and clean up shoddy code all the time, and that isn't an option in DX12 with its much thinner driver layer.

I would say the opposite: DX12 is not a stumbling block, it is where developers can get way more out of the API than what they are used to.
And what is known about DX12 on Nvidia is that their DX11 drivers already allow better usage of multiple threads.
 
Right, with BFx I think it is a combination of issues. For one, I don't think DICE ever really implemented a DX12 engine; it seems more like a patch, quick and dirty.
But that is the thing with all DX12 stuff: it takes time to do all the under-the-hood low-level work that could have been used to make a better core game.
Most devs are on a tight budget and timeline, so it is a tough thing to get done right.
It is definitely a marketing nightmare for anything new to come out, as it will all be DX12+. The only fix is time: getting rendering engines up to speed and making them flexible enough to be used across a wide range of game titles.
That is the only way I see it moving forward, and it only works for big AAA devs; smaller companies will have to use someone else's engine: Unity, Unreal, etc.
 
Of course DX12 is an opportunity. What I said is that nobody quite knows how to make the most of it yet (for knowledge-base reasons), or even turn it into a net gain. Surely you're answering your own question here: isn't DX11 multi-thread performance being better than DX12 performance down to a big thick driver layer doing all the complicated work for game devs? Without it, in DX12 land, those same devs can't make the magic happen again on their own.
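To make the threading point concrete, here is a toy Python model of the difference (purely illustrative, not real D3D12 code; the function names are invented). Under a DX11-style model everything is recorded and submitted through one immediate context, while DX12 lets the application record command lists on worker threads and submit them together:

```python
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draw_calls):
    """Pretend to encode draw calls into a command list (just a list here)."""
    return [f"draw:{d}" for d in draw_calls]

def dx11_style_submit(all_draw_calls):
    # One immediate context: recording and submission happen serially,
    # so a single (driver) thread is the bottleneck.
    return record_command_list(all_draw_calls)

def dx12_style_submit(all_draw_calls, workers=4):
    # The app splits the work, records command lists on several threads,
    # then submits them together on one queue.
    chunks = [all_draw_calls[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        lists = list(pool.map(record_command_list, chunks))
    return [cmd for cl in lists for cmd in cl]

draws = list(range(8))
# Same work reaches the GPU queue either way; DX12 just moves the burden
# of parallelizing the CPU-side recording from the driver to the app.
assert sorted(dx11_style_submit(draws)) == sorted(dx12_style_submit(draws))
```

The point of the sketch is exactly the complaint in the thread: the parallelism doesn't happen for free anymore, the engine has to do it.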

I agree, a whole bunch of DX12 implementations seem like quick and dirty patches. But by now you'd think some decent efforts would have come along.
 
DICE fell into several fps-crushing bugs in the first release. They corrected most of them and optimized the rest, and now they can execute the ray traced and rasterized work in parallel, which boosted fps by a lot. There is more optimization and bug crushing to be done, by the way, so expect more performance to be gained in the future.
 
AFAIK they have actually optimized the number of rays cast on certain objects. Leaves on the ground, for example, were receiving so many rays that reducing the count didn't affect image quality, so they cut the rays cast on those objects. They did this per-object, as well as per-area within each map. It is very much about rays cast, and about deciding what gets more or less ray calculation. The workload was overburdened in many cases prior to the update.

https://wccftech.com/battlefield-v-dxr-update-reflecting-on-improvements/

Optimizations will also come from denoiser and filter improvements, which will play a large part on the more specular surfaces that occur in snow maps such as Frozen Lake, and from BVHs, or Bounding Volume Hierarchies, which make ray tracing faster and more efficient at triangle intersections. The devs also found a bug to be the culprit behind major performance dips with RTX when destroying objects. Since Battlefield V comes with lots of destructible environments, the removal of the bug increases performance by a huge factor, allowing the ray tracing hardware to be utilized much more efficiently. This also applies to foliage and vegetation, which have been optimized to use ray tracing properly, as too many rays were falling on them.
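For readers wondering what the BVH part means in practice: instead of testing a ray against every triangle in the scene, the tracer walks a tree of bounding boxes and skips whole subtrees whose boxes the ray misses. A minimal sketch of the per-node ray/box "slab" test in Python (purely illustrative; this is not DICE's or Nvidia's code):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: the ray hits the box if the parameter intervals where it
    lies between each pair of axis-aligned planes all overlap.
    inv_dir holds 1/direction per axis, precomputed so the loop never divides."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        if t1 > t2:
            t1, t2 = t2, t1
        t_near = max(t_near, t1)
        t_far = min(t_far, t2)
    return t_near <= t_far

# Ray from the origin with direction (1, 0.5, 0.25), i.e. inv_dir (1, 2, 4).
# A box around the point the ray reaches at t=4 is hit...
assert ray_hits_aabb((0, 0, 0), (1.0, 2.0, 4.0), (3.5, 1.5, 0.5), (4.5, 2.5, 1.5))
# ...while the same box shifted away misses, so its whole subtree is skipped.
assert not ray_hits_aabb((0, 0, 0), (1.0, 2.0, 4.0), (3.5, 4.5, 0.5), (4.5, 5.5, 1.5))
```

One cheap test like this can cull thousands of triangle intersections, which is why a well-built BVH matters so much for DXR performance.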
 
Just curious - did you increase power limit or did you run the card at stock?
 
If it was so complicated for developers to implement the same things Nvidia implemented for DX11, why does Nvidia not supply source material (or other examples) so developers can reach the same performance?

It is odd that AMD does not have the same performance problems when we're talking about optimizing for multiple cores...
 
Yes, there were several bugs that spawned many unnecessary rays on several objects and compounded the cost of ray tracing. What you said in the review, that they downgraded the reflection quality, is not true at all; reflection quality has actually INCREASED.
 
So the quality of reflections at the Low setting increased in terms of IQ?
 
Yes. Again, look for the Digital Foundry analysis. Before the patch, denoising was a bit mediocre, resulting in slightly blurry reflections, and several objects were excluded from the ray traced reflections entirely (mostly small foliage: grass, leaves, etc.).

After the patch, denoising has improved, so reflections are now sharper. Also, previously excluded objects like grass, flying leaves, and LOD objects are now reflected with a screen space implementation on top of the ray tracing. So DICE increased the number of objects being reflected by using a smart combination of screen space and ray tracing.
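That hybrid can be thought of as a per-object fallback chain. A hypothetical sketch in Python (the function, names, and rule are invented for illustration, not DICE's actual logic): geometry in the ray-traced scene gets RT reflections, geometry only drawn on screen gets SSR, and everything else falls back to a cubemap.

```python
def pick_reflection_source(obj, rt_scene, on_screen):
    """Illustrative fallback chain for a hybrid reflection system:
    prefer ray tracing, fall back to screen-space reflections (SSR),
    and finally to a prefiltered environment cubemap."""
    if obj in rt_scene:
        return "ray_traced"
    if obj in on_screen:
        return "ssr"
    return "cubemap"

rt_scene = {"building", "tank"}           # geometry in the ray-traced set
on_screen = {"building", "tank", "grass"} # geometry the raster pass drew

assert pick_reflection_source("tank", rt_scene, on_screen) == "ray_traced"
# Post-patch behaviour: small foliage now gets an SSR reflection instead of none.
assert pick_reflection_source("grass", rt_scene, on_screen) == "ssr"
assert pick_reflection_source("offscreen_tree", rt_scene, on_screen) == "cubemap"
```

Pre-patch, in this sketch, the "grass" case simply returned nothing, which is why those objects were missing from reflections entirely.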
OK.
 
You might be confused about what I said. In terms of optimizing reflections, I never meant IQ was downgraded via the patch. In fact, I believe I stated in the conclusion that I noticed no difference in IQ before and after the patch. Check out my last sentence in "The Visual Impact" section.

In regards to quality differences, what I was referring to in the review was the difference that the in-game graphics setting of Low/Medium/High/Ultra makes on reflection quality. The quality options directly affect the resolution of the reflections. Low, as I stated, is pretty fuzzy and not as well defined, while going up to Ultra is pristine. I was simply explaining how the quality options in the game work.
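The "quality affects resolution" relationship can be sketched in a few lines (the scale factors here are made up for illustration; the game does not publish its exact numbers): each DXR quality preset maps to a fraction of screen resolution used for the reflection rays.

```python
# Hypothetical resolution scales per DXR quality preset (illustrative only).
DXR_QUALITY_SCALE = {"low": 0.25, "medium": 0.5, "high": 0.75, "ultra": 1.0}

def reflection_buffer_size(width, height, quality):
    """Resolution of the reflection buffer for a given screen size and preset."""
    scale = DXR_QUALITY_SCALE[quality]
    return int(width * scale), int(height * scale)

# At 1920x1080, Low traces far fewer reflection rays than Ultra, which is
# why Low looks fuzzy and Ultra looks pristine.
assert reflection_buffer_size(1920, 1080, "low") == (480, 270)
assert reflection_buffer_size(1920, 1080, "ultra") == (1920, 1080)
```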
 
Nice review [H], this is a review I've been waiting for.

As others have mentioned, DX12 being slower than DX11... that's on the game devs. I mean, shouldn't DX12 be faster than DX11, since it's closer to the metal?
 
IIRC Turing does very well in most DX12/Vulkan games. That's where you generally see 45% gains over a 1080 Ti. I would definitely put this on DICE. They only just recently fixed massive stuttering in DX12.

I think it's only better when it's designed from the ground up...
 
Interesting that DICE forged Mantle with AMD, with way better performance when using AMD cards, yet appears to have a stumbling block with DX12. Has anyone tried to isolate a setting in the game, like shadow quality, that hits DX12 hard while not DX11? Time consuming, but it could also reveal what needs improving in the game. Anyway, this is the worst performance degradation I've seen in a game going from DX11 to DX12, which does not help when combined with DXR.
 
So you are indicating that Nvidia's ray tracing ability was not enough to give accurate reflections, and that screen space reflections had to be added for performance's sake to fill in missing objects. That is a technique used for reflections in many games: race car hoods reflecting the environment, water scenes reflecting boats and surroundings. Actually this speaks rather badly of RTX's ability to do ray tracing, when reflections alone, a small slice of truly ray tracing a scene, have to be supplemented in order to get decent 1080p gaming speeds and quality. Plus, the denoising is in-house, correct? Or are they now using Nvidia's AI approach?

The review was right on in showing real performance, feel, and what you would expect in the best case on a 2070: Low and maybe Medium ray traced settings at 1080p, if below 60 fps is OK with you. For High quality, a 2080 or maybe even a 2080 Ti would be needed. 1440p? Can't wait for Brent's next review.

What I really find blatantly wrong is Nvidia/DICE putting forward the presumption that the game is even ray traced. Only one aspect is, while everything else is not, and even that one aspect, reflections, had to be supplemented with faster methods to make the reflections better. :LOL: I kinda find that funny but also sad :(. Even the reflections are not truly, fully ray traced, and they are the only thing using DXR/RTX.

I could not think of a worse way to showcase your biggest new feature than this, on a way overpriced (or price-hiked) card with a dismal failure rate.
 
Despite this, it doesn't add to the playability of the game and doesn't change the gameplay itself. The RTX 2060 apparently has RT cores as well. So do you think that as long as it's beautiful we can enjoy a slideshow? lol. It looks better, but looks don't make gameplay better. Which is a shame.
 
Seems like a fair review, potential improvements to RTX 2070 perf notwithstanding. The link to the Digital Foundry video was the cherry on top, though... so nice to view an analysis by guys who are enthused by an exciting new technology!
 
The RTX 2070 reminds me of Matrox Parhelia. Yeah, technically, it could do it, but in terms of actual gameplay....nope.
 
That judgement is probably 100% spot on. (Matrox Parhelia... the glory days of ATI stomping on Nvidia's dust busters.)

I feel that how one views the RTX series depends somewhat on one's expectations of 1st generation products.

I'm happy that my card can ray trace, I'm impressed with the reflections, and I'm satisfied with the BFV performance on a 2080 card at 3440x1440 with DXR Low to Medium, keeping in mind this is 1st gen tech.
If RTX 2070 owners absolutely must have MAX fps/fluidity, they can just turn RTX off and have perf within 15% or so of a 1080 Ti.

But BEST of all, going forward, if my whole life flashes before my eyes, I'll be able to say, at least some of it has been ray traced :)
 
I think where the disconnect comes in is that the 2070 is almost the same die size as a 1080 Ti, and Nvidia is pricing it like one. The problem is that a good portion of the die is for tech you can barely use, and if people don't care about RT or DLSS, or don't play the two games they're used in, the value is bad.

Personally, I would have much rather had more CUDA cores.
 
They wouldn't share their intellectual property under any circumstances. That's a given.
The DX12 driver is a much smaller, less resource-intensive program than the DX11 driver. There's a lot less impact that optimisations can have. Nvidia put together a very efficient DX11 driver, but in DX12 those optimisations count for much less.
 
Actually this speaks rather bad for RTX ability to do raytracing when only reflections, a small gamut of real ray tracing of a scene has to be supplemented in order to have decent 1080p gaming speeds and quality. Plus the denoise is in house correct? Or are they now using Nvidia way with AI?
Denoising is in-house, yes.
And no, this says nothing bad about RTX. This is the first implementation of the tech, and we already achieve good results. That's RAY TRACING, man! Before RTX you wouldn't even dream of running it at 10 fps at 720p.

For high quality a 2080 or maybe even a 2080 Ti would be needed. 1440p? Cant wait for Brent's next review.
A 2080 Ti runs 1440p60 Ultra RTX just fine.
The review was right on on showing real performance, feel and what you would expect in best case scenario on a 2070, low and maybe medium raytraced settings at 1080p if below 60 fps is ok
It's playable and doable for a lot of people. Also remember DICE is still working on fixing DX12 and adding more DXR performance, so things will improve even further.
 
It gives you the option to have higher image quality; that's important for some, you know.
 
I get it if you've got, like, a 2080 Ti or so. But if you can't play the game consistently on a card, what good is image quality? So yeah, with a 2080 Ti it would make sense. My point was that it's not worth it on the lower end of the spectrum.
 
I guess that is the big difference between AMD and Nvidia; we all saw what happened with Vulkan.
 
It's not important for playing FPS games such as BFV.
You'll get completely raped by other players while you're admiring DXR.

Ahhh but that map with the bridge that spans the map. It’s soooo good in the swamps under it.

You’re right though. I have to remind myself to look for enemies. The first couple times I got shot in the head while staring into the water.

Can’t wait for a game I care about and the right genre to release....
 
RTX is just this generation's HairWorks. It adds to the experience but at a high performance penalty, and will probably need 2nd generation hardware to run acceptably.

When I tried HairWorks in Witcher 3 on my 980 Ti it was playable but I didn't like the performance hit.

It was awesome when I replayed Witcher 3 on my overclocked Titan XP.
 
That's pretty much how this works. You have to start somewhere and the first generation suffers and after that the hardware and software is better. It's like people forget all the new tech in games that came before.

I get it that this round is HIGH priced. That doesn't eliminate the good tech that is evolving.
 
Well, I hope there are more optimizations and code improvements to allow more of the scene to be ray traced in real time, and not just reflections supplemented by screen space reflections (nothing wrong with that, actually). The end goal is to no longer need baked lighting and lightmaps, which themselves use ray tracing and a lot of render time to make the beautiful games we are used to. I'm afraid that is a very long way off: the processing used for some of the incredible scenes in games can take days of rendering on high-end render machines just to get the lighting right for a single room or scene, and even then it is static, apart from clever pixel shading. It would be wonderful if virtually everything were ray traced in real time, but so far it is not, and Battlefield V has a way to go before I'd call it real-time ray traced. Yes, reflections are a great place to start, and some of the images I've seen in Battlefield V do look great in that respect.
 
We did some testing of BFV on Saturday regarding VRAM usage and system RAM usage. We concluded that system RAM is not a problem; our test system maxed out at about 10GB of system RAM at the highest settings in the game with DXR enabled at Ultra at 1440p. However, it is clear that VRAM could be a limitation with DXR. It seems to demand a lot of VRAM; in fact, DX12 itself demands a lot of VRAM, about twice that of DX11 just from turning on DX12. With a 2080 Ti, here are our 1440p results.

DX11 - 1440p - Highest Game Settings
4503 MB VRAM
8723 MB System RAM

DX12 - 1440p - Highest Game Settings - No DXR
7976 MB VRAM
10862 MB System RAM

DX12 - 1440p - Highest Game Settings - Ultra DXR
8824 MB VRAM
10714 MB System RAM

From this testing, it seems the 8GB on the 2070 and 2080 might be bottlenecking DXR performance in the game at 1440p, as it can definitely exceed 8GB of VRAM with Ultra DXR at 1440p.
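As a quick sanity check, simple arithmetic on the figures reported above bears this out:

```python
# VRAM figures (MB) from the 2080 Ti test at 1440p, highest settings.
dx11 = 4503
dx12 = 7976
dx12_dxr = 8824

# Just switching to DX12 costs roughly 1.8x the VRAM ("about twice" DX11).
assert round(dx12 / dx11, 2) == 1.77

# Ultra DXR adds ~850 MB on top of DX12, pushing past 8 GB (8192 MB),
# which is exactly the amount of VRAM on a 2070 or 2080.
assert dx12_dxr - dx12 == 848
assert dx12_dxr > 8192
```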

We need to test 1080p in DX12 and Ultra DXR and see what the results are next.

We will include a table with all of this information in our 2080 Ti BFV Ray Tracing article to bring it all together and talk about it.
 
Thanks for taking the time to do that. That should put a few people's minds at rest.
 
It seems Ultra textures are just too much for most of the RTX lineup. Digital Foundry does a good job of showing this:


Those are some impressive gains on just a 6GB RTX 2060, if the visuals are not affected that much. BFV may have gone overboard with its Ultra textures. No doubt High textures use much less VRAM.
 