Battlefield V NVIDIA Ray Tracing RTX 2080 Performance @ [H]

Does DX12 on its own provide no visual improvements over DX11?
I really would like to know why there is such a big performance drop going from DX11 to DX12. It's not like DX12 gives you any enhanced graphics over what DX11 offers.
 
Strike 2, swing and a miss... Thanks for the review, guys! I'm wondering if this performance tanking with DX12 is NVIDIA specific, or does AMD suffer in the same manner?

On my machine I do notice higher CPU usage when I made the switch over to DX12 along with more memory being used. BFV uses all 12 of my threads, so perhaps there is something to the 7700K not having enough threads to push DX12 properly?

Would be an interesting follow-up article: a DX11/DX12 comparison from both camps using the 7700K as the baseline, perhaps mixing in a 2700X or 9900K and disabling threads along the way. I "feel" like DX12 looks better, but maybe that's just BS. Gonna switch back to DX11 and see, after a decent amount of time played in DX12.

Again, thanks guys! Just interested now in why there is such a precipitous drop-off in frames under DX12.
 
On my machine I do notice higher CPU usage when I made the switch over to DX12 along with more memory being used. BFV uses all 12 of my threads, so perhaps there is something to the 7700K not having enough threads to push DX12 properly?
We see only one instance where a single thread hits 100% usage, and it is there for a split second. Actually, DX11 shows higher CPU usage than DX12, and DX12 with DXR set to Ultra is less CPU intensive than DX12 alone. The fact of the matter is that DXR/NVIDIA Ray Tracing is GPU limited even at 1080p. So you can say what you want, but we did the research, and while BFV is surely CPU intensive for a 4C/8T CPU (and remember ours is overclocked to 5GHz across all cores), a six or eight core CPU is not in any way needed at the clocks we are running. Now using a 2080 Ti, we are likely moving to a 9600K, because yes, as the GPU scales higher, you are more likely to end up CPU limited. That has not been the case with the 2070 or 2080, but the 2080 is certainly pushing it.
 
Looking at all of the data from the 2070 and the 2080 reviews, I do have a suspicion that our rig will in fact be CPU bound with the 2080 Ti. Brent is checking right now. I also just got back from Microcenter and bought a 9700K for his testing which I will ship to him today. We will be able to do some comparison testing followups as well.
 
Thanks [H].

On the 2080 Ti review, can we also get the 980 Ti and 1080 Ti added to just the "No DXR" graphs? That way we could see the performance increases without regard to DXR, when the game's patch level, the OS's DirectX version, the rig's CPU, etc. are all the same.
 
Thanks [H].

On the 2080 Ti review, can we also get the 980 Ti and 1080 Ti added to just the "No DXR" graphs? That way we could see the performance increases without regard to DXR, when the game's patch level, the OS's DirectX version, the rig's CPU, etc. are all the same.
We have other articles that already fully cover that in other games. We will not be doubling or tripling the workload and time it takes to get these done in order to accomplish that.
 
I switched back today and can confirm I do see higher CPU usage in DX11, by 5-8%. I've been on DX12 for a while now, and my move to DX12 followed the first big patch, so perhaps that swayed things... looks like I was wrong at any rate! I notice VRAM and RAM usage is still lower under DX11, though. Frame times are a little better and more consistent under DX11 on the AMD side as well.

It's too bad they didn't make more of a push for DX12 optimizations earlier on. With ray tracing unfortunately tacked on at the end of the game's development, an earlier introduction probably would have provided more incentive. Perhaps we will see a focus on DX12 performance gains in future updates, given the two-part mandatory performance penalty (ray tracing plus DX12 itself) currently being handed to the green team in order to implement the star feature of Turing.

Looking forward to future game updates, and perhaps driver updates, to try and fix this sorry situation, and I appreciate you guys looking into all these things. Enjoy the 9700K; it looks to be a great gaming CPU these days, but I look forward to the day when AAA games can use all of your Threadripper! A few years to go, I'd imagine...
Is this game worth buying? I may try out DX11 and then DX12 with some Vegas and compare to a 1080 Ti. Wondering if it is solely a game engine issue, an NVIDIA issue, or both.

Even with the CPU not indicating 100%, you can have thread dependencies where one thread waits for results from another; if they land on the same physical core (too few independent cores), that causes stalls and contention for that core's cache. Anyway, it would be very interesting to see whether the larger caches on Ryzen show an advantage here as well.
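A minimal sketch of the stall being described, assuming a simple producer/consumer pair (all names here are invented for the demo, not from the game): the second thread blocks until the first finishes, so that serialization point keeps total CPU usage well below 100% even though the frame is waiting.

```python
# Illustrative: a "dependency stall" where one thread cannot proceed
# until another delivers its result, regardless of how many cores exist.
import threading
import queue

def producer(out_q):
    # Simulate a work item whose result the next stage needs.
    result = sum(range(100_000))
    out_q.put(result)

def consumer(in_q, results):
    # Blocks here until the producer is done: the serialization point.
    value = in_q.get()
    results.append(value * 2)

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q,))
t2 = threading.Thread(target=consumer, args=(q, results))
t2.start()  # consumer starts first but must wait on the queue
t1.start()
t1.join()
t2.join()
print(results[0])
```

While the consumer sits in `in_q.get()`, its thread is idle, which is exactly why a profiler can show modest CPU usage on a frame that is nonetheless CPU bound.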
 
Is this game worth buying? I may try out DX11 and then DX12 with some Vegas and compare to a 1080 Ti. Wondering if it is solely a game engine issue, an NVIDIA issue, or both.

Even with the CPU not indicating 100%, you can have thread dependencies where one thread waits for results from another; if they land on the same physical core (too few independent cores), that causes stalls and contention for that core's cache. Anyway, it would be very interesting to see whether the larger caches on Ryzen show an advantage here as well.

I'm loving this game; totally worth it in my opinion. It has the potential to be the best BF yet once they iron out a few bugs and add a little more content.
 
Interesting review; I am not seeing more than a 10% drop with DX12 in their tests for either NVIDIA or AMD. Interesting that the Vega 64 is almost as fast as a 2080 at 1080p at Ultra settings, only 3fps slower, with the Vega 56 and 2070 the same. Did not expect that.

Now the CPU scaling is very revealing; the game loves cores, with 6-core/6-thread and 8-core/8-thread significantly faster at the lower resolutions. Looking to be a very interesting game release, and the 2080 Ti also appeared to do decently at 4K with DXR as well. Absolutely gorgeous 4K video. I believe the RAM usage reported is from prior to the patch; videos after the patch are showing it much higher.
 
2080 Ti also appeared to do decently at 4K with DXR as well. Absolutely gorgeous 4K video.

The game does look pretty amazing at 4K. It's ironic: most people joke that RT games are not usually meant to be played as a walk-around-and-look-at-things experience, but it's hard not to. Some parts have just incredible eye candy.

Interesting is the Vega 64 is almost as fast as a 2080 at 1080p, only 3fps slower

Yeah, [H]ard's Vega 64 reviews have pretty much proved it's a great buy for 1080p gaming. At this point, with their 2070/2080 RT performance numbers, it'd be difficult to justify not getting a Vega 64 if someone was only going to game at 1080p. Can only imagine what the next refresh will improve on.
 
Is this game worth it to buy?

I enjoy it. They still have some work to do on it, but it's got a great footing to grow from. People hate on things just because they want to hear themselves talk at times, so issues get a little blown out of proportion once in a while. Runs awesome on my machine as well. Glad I went the Vega route; water block inbound!

Hopefully DX12 eventually shines, but it looks like it may be pulling load off the CPU, and loading up the GPU more, hurting performance... So not sure we will see performance parity, hopefully though!
 
We did testing on BFV on Saturday in regards to VRAM usage and system RAM usage. We have concluded system RAM is not a problem; our test system is maxing out at about 10GB of system RAM at the highest settings in the game with DXR enabled at Ultra at 1440p. However, it is clear that VRAM could be a limitation with DXR. It seems to demand a lot of VRAM; in fact, DX12 itself demands a lot of VRAM, about twice that of DX11 just from turning DX12 on. With a 2080 Ti, here are our 1440p results.

DX11 - 1440p - Highest Game Settings
4503 MB VRAM
8723 MB System RAM

DX12 - 1440p - Highest Game Settings - No DXR
7976 MB VRAM
10862 MB System RAM

DX12 - 1440p - Highest Game Settings - Ultra DXR
8824 MB VRAM
10714 MB System RAM

From this testing, it seems the 8GB on the 2070 and 2080 might be bottlenecking DXR performance in the game, as it can definitely exceed 8GB of VRAM with Ultra DXR at 1440p.

We need to test 1080p in DX12 and Ultra DXR and see what the results are next.

We will include a table with all of this information in our 2080 Ti BFV Ray Tracing article to bring it all together and talk about it.
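Quick arithmetic on the VRAM figures posted above (2080 Ti, 1440p, highest settings) makes the "about twice" claim and the 8GB concern concrete:

```python
# VRAM numbers copied from the posted 1440p results (MB).
vram_mb = {
    "DX11": 4503,
    "DX12 no DXR": 7976,
    "DX12 Ultra DXR": 8824,
}

dx12_over_dx11 = vram_mb["DX12 no DXR"] / vram_mb["DX11"]
dxr_overhead_mb = vram_mb["DX12 Ultra DXR"] - vram_mb["DX12 no DXR"]

print(f"DX12/DX11 VRAM ratio: {dx12_over_dx11:.2f}x")  # ~1.77x, i.e. "about twice"
print(f"Ultra DXR adds: {dxr_overhead_mb} MB on top of DX12")
# 8824 MB exceeds the 8192 MB on a 2070/2080, consistent with the
# suspicion that 8GB cards bottleneck Ultra DXR at 1440p.
```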
 
That titan purchase is making more sense now! :woot:

I could start another thread and write a book on the dramas with that thing. It was a graduation gift and I love it, but it is ultimately an expensive, bizarre beast. I will say that for games that support both SLI and DX12 well, it kicks butt. It's the last of my SLI stuff. Having two M.2 SSD RAIDs plus a hybrid HDD is pretty neat though.
 
Been experimenting with BF5 and getting some different results with Ryzen. With SMT off, 8-core 2700 OC'd to 4100MHz, 3200MHz memory (in sig), DX11 is much slower than DX12. CPU spikes to near 100% at times! Using a 16GB Vega FE OC.

Anyway, DX11 at 1080p is around 111 fps, DX12 145 fps. I am somehow frame limited to 145 fps no matter what I do. This is one hell of an interesting game engine, and it is using all 8 threads of the 2700 when SMT is turned off.

Found an outstanding tool that works with all APIs (I tested in Vulkan, DX11, and DX12), records just about anything, is totally customizable, and has low CPU overhead. Think Fraps, but a couple of orders of magnitude better. It does frame times for all APIs and benchmarks/records data. Very much worthwhile checking out for anyone interested. It is available on Steam but is not limited to Steam games; it will work on any game, in this case Origin BF5. FPS Monitor:

 
I am somehow frame limited to 145 fps no matter what I do

I know it has no relation to BFV directly, but I was recently reading a similar account over at GamersNexus regarding their 9700K testing and some unusual results in GTA V. It seems some engines have unusually high FPS caps and will in turn do odd things to CPU usage when those caps are hit. Might want to give it a read.
 
I know it has no relation to BFV directly, but I was recently reading a similar account over at GamersNexus regarding their 9700K testing and some unusual results in GTA V. It seems some engines have unusually high FPS caps and will in turn do odd things to CPU usage when those caps are hit. Might want to give it a read.
Will do. The game's options go as high as 200 fps, but it makes no difference between the 144 fps setting and 200 fps; it stops at 145 fps for me, clearly limiting. That is with and without FreeSync on. Tried different driver options, no VSync, no FreeSync, etc. - same.

This game supports FreeSync 2, as in HDR and FreeSync at the same time. HDR looks really good and gives much depth to the rendering, and it is one of the smoothest games I've ever played with FreeSync. I am using the game version of the Pro drivers - they are not as current as the new AMD Adrenalin drivers. I could load the new drivers, but that usually messes up my ability to switch back to the Pro driver easily.

This game engine can use a lot of threads even without DXR, and how it uses the CPU seems to be different; it is like processing is done in packets. The CPU constantly goes from above 60% up to almost 100% (that is with SMT off, so just 8 threads versus the normal 16), then sits at 20%-30% for a period, and then repeats. Very interesting; using FPS Monitor, it looks similar to a square wave when graphed in real time.

The performance and the IQ this game gives are utterly phenomenal! Love to see how it plays and looks with DXR, though I have to agree with Brent's 2070 review in the end: "This isn’t a game you sit around admiring the scenery" - this game really does not need DXR ( https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/10 ).
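One plausible reading of the 145 fps wall (an assumption on my part, not anything the game documents): an engine-side limiter clamped to the panel's 144Hz refresh plus one. A limiter works by giving each frame a fixed time budget and idling away whatever is left; this sketch just does the budget math, with invented function names:

```python
# Hypothetical frame-limiter budget math for a 145 fps cap.
def frame_budget_ms(fps_cap):
    """Fixed time budget per frame at a given cap, in milliseconds."""
    return 1000.0 / fps_cap

def sleep_needed_ms(work_ms, fps_cap):
    """Idle time the limiter must insert after the frame's work is done."""
    return max(0.0, frame_budget_ms(fps_cap) - work_ms)

budget = frame_budget_ms(145)
print(f"{budget:.2f} ms budget per frame at 145 fps")
print(f"{sleep_needed_ms(4.0, 145):.2f} ms of idle if a frame takes 4 ms")
```

Those idle slices would also explain the square-wave CPU usage FPS Monitor shows: bursts of work, then forced waiting, repeating every frame.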
 
Been experimenting with BF5 and getting some different results with Ryzen. With SMT off, 8-core 2700 OC'd to 4100MHz, 3200MHz memory (in sig), DX11 is much slower than DX12. CPU spikes to near 100% at times! Using a 16GB Vega FE OC.

Anyway, DX11 at 1080p is around 111 fps, DX12 145 fps. I am somehow frame limited to 145 fps no matter what I do. This is one hell of an interesting game engine, and it is using all 8 threads of the 2700 when SMT is turned off.

Thanks for those AMD results. I suspect that since AMD has less IPC than Intel, the DX11 CPU usage comes out higher than in the comparable Intel results. Since DX12 uses the main CPU less, those results are less affected, if at all.

What is interesting is that better performance is seen going from 6 cores to 8, which several people have mentioned. Being back at a place in gaming where we can be CPU limited is interesting, and it means it might get fun again to worry about overclocking, etc. =)
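A rough way to reason about those 6-to-8-core gains is Amdahl's law: speedup depends on how much of the frame's CPU work actually parallelizes. The parallel fraction below is purely illustrative, not measured from BFV:

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assumed: 90% of per-frame CPU work scales with cores
for cores in (4, 6, 8):
    print(f"{cores} cores: {amdahl_speedup(p, cores):.2f}x")
# Gains flatten as cores increase: 6 -> 8 helps less than 4 -> 6
# unless p is very high, which fits an engine that genuinely loads
# 8 threads still showing measurable (but shrinking) core scaling.
```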
 