Battlefield 1 Video Card DX12 Performance Preview @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,532
Battlefield 1 Video Card DX12 Performance Preview - The anticipated new game Battlefield 1 is now officially released. In this Performance Preview we take the AMD Radeon RX 480 and NVIDIA GeForce GTX 1060 and find out how DX11 and DX12 compare. We also compare both video cards head to head in DX11 and DX12 to find out which one rocks this game.
 
Wow, look at those frame time spikes in DX12. DX11 is still king.
 
What about DX12's ability to run multiple GPUs?

Mainly, I'm curious if an AMD APU system + RX480 might get a little 'boost' using the iGPU.

That said, I'm totally unaware if the multi-GPU offering of DX12 is handled in DX12 itself, or if a dev must specifically code to use multi-GPUs.
 
this is exactly what I experienced during the beta with my system. dx12 was a stuttery mess for me! dx11 was perfect! dx12 is better for me in everything else that I play.
 
But... but... flipping the DX12 switch is supposed to magically increase frames 20% across the board. It's going to save all the old shitty graphics cards people are still using, or the crappy games devs couldn't be bothered to optimize for PC. /s
 
dx12 does give me those "magical" improvements in everything (some things not as much) but bf1, which is weird...
vulkan is even better!
 
Hmmmm, using a 4-core/8-thread CPU. Any way you can use a 6-8 core with HT and see if DX12 shows improvement? Some benchmarks in other places show a pretty hefty increase in DX12 with more cores.

You guys plan to do a CPU test?
 
This seems to have been the case in just about all DX12 titles so far...worse on all cards, less worse for AMD.
 
6-core 6800K here and BF1 remains a stuttery mess on DX 12. DX11 is buttery smooth.
 
6-core 6800K here and BF1 remains a stuttery mess on DX 12. DX11 is buttery smooth.

Sounds like a Dice issue since it does it with AMD and Nvidia.

Did you notice better minimum framerates? I just stayed with DX11 for now. I want to go DX12 but stuttering = no go
 
DX12 has really turned out to be a disappointment so far. Why are we seeing these performance discrepancies, particularly as large as we see in BF1, between DX11 and DX12? I understand that DX11 is likely better optimized owing to developer familiarity along with the extensive optimizations that both AMD and NVIDIA have made over the years in the drivers, but DX12 by all rights should be a superior API. I don't get it.
 
DX12 has really turned out to be a disappointment so far. Why are we seeing these performance discrepancies, particularly as large as we see in BF1, between DX11 and DX12? I understand that DX11 is likely better optimized owing to developer familiarity along with the extensive optimizations that both AMD and NVIDIA have made over the years in the drivers, but DX12 by all rights should be a superior API. I don't get it.

Well, it's all on the developers. Every single DX12 game is all on the developers so far. Some are a complete mess, some work well.

Can't blame Nvidia/AMD.
 
Sounds like a Dice issue since it does it with AMD and Nvidia.

Did you notice better minimum framerates? I just stayed with DX11 for now. I want to go DX12 but stuttering = no go

DX12 is awful. I am beginning to think multiplayer is a totally different beast than the campaign in BF1.

My minimum is about 70 fps, though normally I am at 80 to 100 fps.
 
Well, it's all on the developers. Every single DX12 game is all on the developers so far. Some are a complete mess, some work well.

Can't blame Nvidia/AMD.
I'm not blaming NVIDIA or AMD. They have a vested interest in making DX12 work well, game developers not necessarily.
 
Well, it's all on the developers. Every single DX12 game is all on the developers so far. Some are a complete mess, some work well.

Can't blame Nvidia/AMD.

This really needs to be emphasized, and I had meant to write a line about that. It must be stressed that with DX12 a lot more responsibility now rests on the developer to extract performance, optimize for performance and efficiency, and also generate multi-GPU performance scaling. It all rides on the developer, and sometimes they take the easier/cheaper/port'ier way out.

So far DX12, on the whole, has been a rough start. It's a trend at this point, and not a good one. Out of all the games coming out this year I would have thought BF1 would have proven why we need and want DX12. /shrug

At least performance is really good this time in DX11. The visuals don't seem to be that much of an upgrade over BF4, though; it doesn't feel like a "generational" game engine visual upgrade, IMO.
 
Battlefield is supported by Artificial Aiming; who needs performance when you're not even doing your own shooting?

I just saved $59.99-$128.98
 
What about DX12's ability to run multiple GPUs?

Mainly, I'm curious if an AMD APU system + RX480 might get a little 'boost' using the iGPU.

That said, I'm totally unaware if the multi-GPU offering of DX12 is handled in DX12 itself, or if a dev must specifically code to use multi-GPUs.
First, the multi-GPU path would have to be enabled by the game dev. Second, using an APU plus an RX 480, I doubt you would get any boost; more than likely you would LOSE performance, since the APU's GPU will be much slower. Unless the game adjusts for that and shifts more of the workload to the RX 480, you would likely still end up waiting on the APU to finish the job it was given.

As I said, the game has to be coded for the multi-GPU thing, and that likely won't have good results for some time, since it would be somewhat trial and error to see what works well and what doesn't.
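The mismatched-GPU point above can be sketched with a toy model. All throughput numbers here are invented for illustration (they are not benchmarks), and split-frame rendering is only one of several possible DX12 multi-GPU schemes:

```python
# Toy model of split-frame rendering across mismatched GPUs.
# Throughput figures are hypothetical, purely for illustration.

def frame_time_ms(work_units, split, throughputs):
    """The frame finishes when the slowest GPU finishes its share."""
    return max(work_units * s / t for s, t in zip(split, throughputs))

# Hypothetical: an RX 480 processes 100 units/ms, an APU iGPU only 10.
fast, slow = 100.0, 10.0

# RX 480 alone: 1000 units / 100 units-per-ms = 10 ms.
alone = frame_time_ms(1000, [1.0], [fast])

# Naive 50/50 split: the iGPU's half dominates, frame time balloons to 50 ms.
naive = frame_time_ms(1000, [0.5, 0.5], [fast, slow])

# Split proportional to throughput (the best case): a small win, ~9.1 ms.
ideal = frame_time_ms(1000, [fast / (fast + slow), slow / (fast + slow)],
                      [fast, slow])
```

So even a perfectly tuned split only buys about 10% here, while a naive split is a 5x loss, which is the commenter's trial-and-error worry in numbers.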
 
At its core these games are still developed with D3D11 at heart; D3D12 is never going to shine until D3D11 support is retired completely.
 
The thing I find most interesting here is how badly DX12 compares to DX11 on AMD hardware. Not to knock on AMD, but because DICE supported Mantle pretty well in Battlefield 4, and it provided a solid FPS bump on most AMD cards. DICE is usually a very AMD friendly developer, and I would have thought that at least on AMD cards, the DX12 path would be pretty functional. That does not appear to be the case. I feel like there is probably a story here and I want to hear it.
 
The thing I find most interesting here is how badly DX12 compares to DX11 on AMD hardware. Not to knock on AMD, but because DICE supported Mantle pretty well in Battlefield 4, and it provided a solid FPS bump on most AMD cards. DICE is usually a very AMD friendly developer, and I would have thought that at least on AMD cards, the DX12 path would be pretty functional. That does not appear to be the case. I feel like there is probably a story here and I want to hear it.

Time.

And EA isn't pushing back the release date for dx12, hell no.

Vulkan, Doom, we have seen what happens when a developer has the desire and means to optimize their games for consumers. Everyone benefits.

At least with DICE, whatever their DX12 performance leaves to be desired, their DX11 is top-notch, and I bet that's a reason behind the lackluster DX12 performance. Why bother when DX11 is pretty amazing anyway?
 
DX 11 looks to be a rather smooth, consistent frame rate experience with some great performance. Seems like this game engine smokes! The DX 12 results are interesting, but I expect them to improve over time, since DICE is one hell of a developer and will probably push DX 12 improvements (which DX 12 gives the developer more room to do). Can HardOCP get an interview with DICE about this?

DX 11 continues to fight to stay relevant and will not die easily. Adding more powerful cores to get DX 12 performance up seems contrary to the goals of DX 12, which was supposed to let weaker CPUs be used fully with less of an impact on game results. It almost seems backwards.

I pretty much agree that BF1 was looked at as a game that would showcase DX12; so far it continues the trend from other DX 12 games. The only game I play that gets a significant increase in performance using DX 12 is Deus Ex: MD, and it seems to get better with each update as well.
 
On that note, I don't have knowledge regarding how the transition worked between DX9 and DX11. Did it take less time for games to start using the newer DX version than it is taking now, assuming DX9 to 11 improved performance? Were people too quick to assume DX12 adoption would be quick and painless?
 
Another disappointing result from DX12. The problem is, low-level APIs push too much work onto developers: not only do you have to optimize for each GPU variant from each GPU vendor, but you have to manage VRAM yourself. It requires immense investment, and no developer can afford to spend that kind of money and time optimizing for every GPU in existence.

I don't know why the PC gaming community accepted low-level APIs as a good thing. It seems like PC gamers forgot that there was a reason the PC always used high-level APIs. Low-level APIs need a single fixed hardware target for best results; they just don't work with the varied and constantly changing hardware of PCs.
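To give a feel for the "manage VRAM yourself" burden, here is a minimal, hypothetical residency-budget sketch. Real engines track heaps, residency priorities, and per-resource usage; this only shows the shape of the bookkeeping that an explicit API moves from the driver into the application (all names and sizes are invented):

```python
# Sketch of the residency bookkeeping an explicit API pushes onto the
# application: the app, not the driver, decides what gets evicted.

class VramBudget:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = {}  # name -> (size_mb, last_used_frame)

    def used(self):
        return sum(size for size, _ in self.resident.values())

    def request(self, name, size_mb, frame):
        # Evict least-recently-used resources until the new one fits.
        while self.used() + size_mb > self.budget and self.resident:
            lru = min(self.resident, key=lambda n: self.resident[n][1])
            del self.resident[lru]
        if size_mb > self.budget:
            raise MemoryError("resource exceeds total budget")
        self.resident[name] = (size_mb, frame)

heap = VramBudget(budget_mb=4096)
heap.request("terrain_atlas", 2048, frame=1)
heap.request("character_textures", 1536, frame=2)
heap.request("shadow_maps", 1024, frame=3)  # evicts terrain_atlas (LRU)
```

Get any of these decisions wrong on one vendor's memory layout and you stutter there while running fine elsewhere, which is exactly the per-GPU tuning cost being complained about.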
 
DX12 is bad for PC gaming, not good, and here is another showing of it.

It's only going to get worse and worse as future GPUs are released and games are not updated for proper support.

The entire concept is made for a fixed, static hardware solution, aka consoles, not the random mix of hardware and constant changes that the PC is.

In this case the DX12 port more or less seems like a rough console copy and not much else.
 
There are many, many devs who are ready to write off DX12 just like DX10 before it.

Edit: I decided I best explain a bit on why I said this.
Firstly, remember I'm a tester, not a developer. I do come into contact with some of the code but this is more to carry out my work as a senior tester rather than actually understanding the code as a full developer.
Many developers who are now wading into DX12 are taking a similar approach to DX12 as they did with the console LLAPIs. The concept is the same: you have more primitive calls in DX12 rather than the larger monolithic calls of DX11. Therefore, some devs think that what works in a low-level format on consoles should work similarly on PC.

And lo and behold, it doesn't. The immediate problem that many devs seem to encounter right away is CPU bottlenecks. DX12 is supposed to remove CPU bottlenecks, but instead I know the devs are running into them a lot more now. There are many fingers to point, but the biggest culprit seems to be that the Windows OS has a mind of its own in scheduling threads and assigning cores, which differs from consoles.

Am I surprised that DX12 on BF1 has stutter issues? No. Total War: Warhammer had similar problems. The problem gets worse the more tasks you have running in the background.

For a fun experiment, get Prime95 and run a few threads to load a core or two. Watch performance tank in DX12 whereas DX11 seems to get along fine.

Edit 2: I might as well plant my flag. I don't like the way DX12 is going. Looking at the industry and talking to others, DX12 is polarizing. You have champions who love DX12 and believe the future lies that way. There are others who think DX12 is insanity and is just bringing a fractious future. I'm in the second camp. They should have improved DX11, not created this fragmentation with DX12. It's not like we didn't have enough component permutations in the PC environment already without introducing code paths for multiple vendors!
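The Prime95 observation above fits a simple toy model: a DX12-style frame waits on all of its worker threads at a sync point, so one preempted thread stalls the whole frame, and more threads means more chances to eat a scheduler quantum. The preemption chance q below is a made-up stand-in for background CPU load, not a measured figure:

```python
# Toy model: if each worker thread independently risks losing a
# scheduler quantum with probability q, and the frame must wait for
# every thread (a sync point), the chance a frame hitches grows
# quickly with thread count.

def hitch_probability(q, threads):
    return 1.0 - (1.0 - q) ** threads

q = 0.02  # hypothetical per-thread preemption chance under load
for n in (1, 4, 8, 16):
    print(n, round(hitch_probability(q, n), 3))
# A single render thread (the DX11 shape) rarely collides with the
# scheduler; 16 fan-out workers (the DX12 shape) hitch far more often.
```

This is only a caricature of Windows scheduling, but it shows why background load hurts the many-thread path disproportionately even when total CPU throughput is fine.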
 
An interview with Dice would yield nothing interesting. Follow the money.

Maybe programmers just need experience with DX12 and their overlords aren't giving them pocket money until they graduate from DX12 school. Or it's all about economics at this point and as such, one has to wonder if the DX12 secret sauce isn't as enticing as it has been purported to be. If it were all that, one would think that they would be climbing over one another trying to be first to market with a full-on, ground-up DX12 title.

DX11 is proving to be the XP of APIs.
 
Battlefield 1 Benchmarked: Graphics & CPU Performance

CPU performance between DX11 and DX12.
[Chart: CPU scaling with a Fury X]

[Chart: CPU scaling with a GTX 1080]
 
Hmm, interesting. I'm running a 5820K @ 4.3GHz with 16GB DDR4 and a 390X, and BF1 in DX12 runs buttery smooth on my system, around 55 FPS at 1440p.
 
Excellent article. BF1 isn't my cup of tea, but I enjoyed the article nonetheless. When I read about the stutters my immediate reaction was "Sounds like insufficient VRAM", so when you proceed to your in-depth article could you investigate VRAM usage?
 
Excellent article. BF1 isn't my cup of tea, but I enjoyed the article nonetheless. When I read about the stutters my immediate reaction was "Sounds like insufficient VRAM", so when you proceed to your in-depth article could you investigate VRAM usage?

Yes, but I don't think it's a VRAM issue; the 480 has 8GB, and at 1080p I doubt it's eating 8GB of VRAM in campaign mode. My theory is it's related to either driver VRAM "management" (not capacity) or thread distribution.
 
Setting a higher priority for a DX 12 game, so Windows doesn't treat its multiple threads as separate programs, may be worth a try. I wonder if affinity settings may help as well? Some testing is in order.
 
There are many, many devs who are ready to write off DX12 just like DX10 before it.

Edit: I decided I best explain a bit on why I said this.
Firstly, remember I'm a tester, not a developer. I do come into contact with some of the code but this is more to carry out my work as a senior tester rather than actually understanding the code as a full developer.
Many developers who are now wading into DX12 are taking a similar approach to DX12 as they did with the console LLAPIs. The concept is the same: you have more primitive calls in DX12 rather than the larger monolithic calls of DX11. Therefore, some devs think that what works in a low-level format on consoles should work similarly on PC.

And lo and behold, it doesn't. The immediate problem that many devs seem to encounter right away is CPU bottlenecks. DX12 is supposed to remove CPU bottlenecks, but instead I know the devs are running into them a lot more now. There are many fingers to point, but the biggest culprit seems to be that the Windows OS has a mind of its own in scheduling threads and assigning cores, which differs from consoles.

Am I surprised that DX12 on BF1 has stutter issues? No. Total War: Warhammer had similar problems. The problem gets worse the more tasks you have running in the background.

For a fun experiment, get Prime95 and run a few threads to load a core or two. Watch performance tank in DX12 whereas DX11 seems to get along fine.

Edit 2: I might as well plant my flag. I don't like the way DX12 is going. Looking at the industry and talking to others, DX12 is polarizing. You have champions who love DX12 and believe the future lies that way. There are others who think DX12 is insanity and is just bringing a fractious future. I'm in the second camp. They should have improved DX11, not created this fragmentation with DX12. It's not like we didn't have enough component permutations in the PC environment already without introducing code paths for multiple vendors!

What about Vulkan? id Software did an exquisite job with their implementation in Doom. Is this API better than DX12? Also, why are the same developers who did a pretty good job with Mantle struggling so much with DX12? E.g., Nixxes did Thief using Mantle, which ran well, and then Hitman, Rise of the Tomb Raider, and Deus Ex: Mankind Divided, which all struggled in DX12 on AMD hardware with stuttering and other problems.

I would have thought their experience would carry over and we'd be seeing better results than with Mantle. Is DX 12 fundamentally flawed in some way?

I thought that with these lower-level APIs we could make games that were not possible under older APIs, since they enable orders of magnitude more draw calls per frame without a massive performance hit, and that is why developers believe they are the future. I believe that this future will come sooner if developers adopt Vulkan, since games can be developed from the ground up for that API without having to worry about how many people have Windows 10.
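The draw-call claim can be sanity-checked with back-of-envelope arithmetic. The per-call CPU costs below are rough illustrative figures, not measurements; real costs vary enormously by driver, vendor, and state changes per call:

```python
# Back-of-envelope: CPU time spent on draw submission per frame.
# Per-call costs are hypothetical round numbers for illustration.

def submit_ms(draw_calls, us_per_call):
    return draw_calls * us_per_call / 1000.0

budget_ms = 16.7   # one frame at 60 fps
dx11_cost = 2.0    # hypothetical driver-heavy call, ~2 us of CPU
dx12_cost = 0.2    # hypothetical thin explicit call, ~0.2 us of CPU

for calls in (5_000, 50_000):
    print(calls, submit_ms(calls, dx11_cost), submit_ms(calls, dx12_cost))
# At 50,000 calls the heavyweight cost blows the 16.7 ms budget
# (100 ms of submission alone) while the thin-call cost stays at 10 ms.
```

So the "more draw calls" promise is arithmetically real; the thread's complaint is that the CPU headroom it frees has so far been spent on stutter rather than on richer scenes.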
 
Hmm, interesting. I'm running a 5820K @ 4.3GHz with 16GB DDR4 and a 390X, and BF1 in DX12 runs buttery smooth on my system, around 55 FPS at 1440p.

Multiplayer or Single Player?
 
There is a setting in BF1 called GPU Memory Restriction, supposedly meant to provide stability by reducing the amount of VRAM used. I'm a little confused why BF1 wouldn't just use the total amount available. Anyone know more?
 
What about Vulkan? id Software did an exquisite job with their implementation in Doom. Is this API better than DX12? Also, why are the same developers who did a pretty good job with Mantle struggling so much with DX12? E.g., Nixxes did Thief using Mantle, which ran well, and then Hitman, Rise of the Tomb Raider, and Deus Ex: Mankind Divided, which all struggled in DX12 on AMD hardware with stuttering and other problems.

I would have thought their experience would carry over and we'd be seeing better results than with Mantle. Is DX 12 fundamentally flawed in some way?

I thought that with these lower-level APIs we could make games that were not possible under older APIs, since they enable orders of magnitude more draw calls per frame without a massive performance hit, and that is why developers believe they are the future. I believe that this future will come sooner if developers adopt Vulkan, since games can be developed from the ground up for that API without having to worry about how many people have Windows 10.

Mantle is not DX12. If I am not wrong, Mantle became Vulkan. DX12 is Microsoft's attempt at an LLAPI for the PC masses, and DX12 is more familiar to those who work with the Xbox SDK.
However, I'll honestly say that I don't know of any studio that is developing games in Vulkan. id Software did a stellar job with Vulkan on Doom, and it's truly a software technological marvel. But don't forget that id Software has some of the greatest brains in the 3D game engine industry.

I can't add much more on why former Mantle developers are currently suffering under DX12 as I have had no contact with Mantle.
 