Battlefield 1 Video Card DX12 Performance Preview @ [H]

What about Vulkan? id Software did an exquisite job with their implementation in Doom. Is this API better than DX12? Also, why are the same developers who did a pretty good job with Mantle struggling so much with DX12? For example, Nixxes did Thief using Mantle, which ran well, and then Hitman, Rise of the Tomb Raider, and Deus Ex: Mankind Divided, which all struggled in DX12 on AMD hardware with stuttering and other problems.

I would have thought their experience would carry over and we'd be seeing better results than with Mantle. Is DX12 fundamentally flawed in some way?

I thought that with these lower-level APIs we could make games that were not possible under older APIs, since they enable orders of magnitude more draw calls per frame without a massive performance hit; that is why developers believe they are the future. I believe that future will come sooner if developers adopt Vulkan, since games can be developed from the ground up for this API without having to worry about how many people have Windows 10.
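As a back-of-the-envelope illustration of the draw-call argument (all constants below are made up, purely for illustration): the win comes from replacing an expensive per-call trip through the driver with cheap command recording, which can also be spread across threads, with one submission paying the driver cost once.

```python
import math

# Hypothetical costs in microseconds: a DX11-style draw call goes through
# a heavyweight driver path every time; a DX12-style command is merely
# recorded into a command list, and one submit pays the driver cost once.
DX11_CALL_US = 50
RECORD_US = 1
SUBMIT_US = 50

def dx11_frame_us(draws):
    """CPU cost of a frame when every draw pays the full driver cost."""
    return draws * DX11_CALL_US

def dx12_frame_us(draws, record_threads=4):
    """CPU cost when command lists are recorded in parallel, then submitted once."""
    return math.ceil(draws / record_threads) * RECORD_US + SUBMIT_US
```

With 10,000 draws this toy model gives 500,000 µs versus 2,550 µs, which is where the "orders of magnitude" claim comes from, under these made-up numbers.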

Play Battlefield 4 with both DX11 and Mantle, then come back and tell me which gave you the better performance. Low-level APIs are not suitable for PC gaming, and Battlefield 4 with Mantle is another example to prove that.
 
Play Battlefield 4 with both DX11 and Mantle, then come back and tell me which gave you the better performance. Low-level APIs are not suitable for PC gaming, and Battlefield 4 with Mantle is another example to prove that.
That's crap. I played BF4 using Mantle on my 290X CF setup when Mantle was released and it was a great experience; it ran better than DX11. At the time it was the smoothest gaming experience I had ever seen in person.

Later AMD GPUs dropped full support for Mantle, and it was eventually recommended to run Battlefield 4 in DX11 on new AMD hardware, but that was because of architectural changes in AMD's later revisions of GCN, not a problem with the game API.
 
Mantle is not DX12. If I am not wrong, Mantle became Vulkan. DX12 is Microsoft's attempt at a low-level API (LLAPI) for the PC masses, and it is more familiar to those who work with the Xbox SDK.
However, I'll honestly say that I don't know of any studio that is developing games in Vulkan. id Software did a stellar job with Vulkan on Doom and it's truly a software technological marvel. But don't forget that id Software has some of the greatest brains in the 3D game engine industry.

I can't add much more on why former Mantle developers are currently suffering under DX12 as I have had no contact with Mantle.

I see. It's a pity about Vulkan. It's not just me; quite a few in the PC gaming community would like developers to focus on Vulkan, as it gives much more freedom in terms of OS, and the number of people with compatible hardware is significantly greater than those with Windows 10. There is also the fear that once Microsoft's new console comes out, DX12 development will become stagnant, as was the case in the past. They may well stop caring about PC gaming again.

I believe that the potential for progress is much greater with Vulkan and I don't know why the developers seem so disconnected from this.
 
I played the beta using DX11, and have been dabbling with the released game using DX12.

It appears to support my SLI 970s well enough, but my complaints are less about performance and more about the game itself ;)

(new game feels weird, etc...)
 
Could it simply be that they didn't have enough time to get DX12 right, ChrisC?

Hard to answer. The game itself is quite polished, and it looks like it's only the DX12 path that has problems, so I am going to assume they are facing the same issue my own studios seem to face: DX12 naivety. More time may help resolve the DX12-specific issues, but game development always has this horrible balance where the bean counters are looking at your progress and wanting to ship ASAP.

While again not being able to speak for DICE, I can imagine the pressure of demonstrating a completely fine and working DX11 version of the game while justifying more development time/budget to the bean counters for getting DX12 right.

I see. It's a pity about Vulkan. It's not just me; quite a few in the PC gaming community would like developers to focus on Vulkan, as it gives much more freedom in terms of OS, and the number of people with compatible hardware is significantly greater than those with Windows 10. There is also the fear that once Microsoft's new console comes out, DX12 development will become stagnant, as was the case in the past. They may well stop caring about PC gaming again.

I believe that the potential for progress is much greater with Vulkan and I don't know why the developers seem so disconnected from this.

Another thought came to mind. Before the Vulkan version of Doom came onto the scene, id Software was well known for its expertise in OpenGL game development. OpenGL is quite rarely used in games these days. Will Vulkan head nowhere the same way OpenGL did? Again, I cannot speak for everyone, but I can assure you nobody speaks Vulkan in my studio.

My personal opinion is this: the PC market is so severely fragmented that the lowest common denominator (wherever it is found) is always what most developers will go for. DX12 has a problem: there is a significant number of Windows 7/8 PCs out there that are not DX12-capable. As much as our instincts say that those still on Windows 7/8 are not the core PC gamers who have the hardware to play our games, the lowest common denominator here is DX11, which works perfectly fine on Windows 7/8 and Windows 10. It is going to be a long time before DX11 is merely the backup API, like in the DX9/11 days.


PS: Notice how we talk about DX9/11 and completely omit DX10? There are a lot of people I know who would like to see DX12 go the way of DX10.
 
Play Battlefield 4 with both DX11 and Mantle, then come back and tell me which gave you the better performance. Low-level APIs are not suitable for PC gaming, and Battlefield 4 with Mantle is another example to prove that.

Firstly, they are not low-level APIs, they are lower-level APIs. There is still some abstraction; at least that's what the experts were saying. Whether it is true is another matter, it seems.

Mantle doesn't run great in BF4 anymore because continued support was effectively dropped when the 380 series was released and Mantle was donated to Khronos. Anything beyond AMD's Hawaii never worked properly, as the changes required by the architectural differences were never made. AMD even said this: they had achieved their goal of steering the industry towards lower-level APIs, and Mantle had become Vulkan, so it was no longer necessary.

There have been lots of patches to the game since Mantle's initial release in BF4, and changes to the driver, so it probably doesn't even work properly on Hawaii anymore. But the point is that it did work properly while Mantle was under active development, giving better frame rates and frame times on AMD hardware, particularly in multiplayer.
 
That's crap. I played BF4 using Mantle on my 290X CF setup when Mantle was released and it was a great experience; it ran better than DX11. At the time it was the smoothest gaming experience I had ever seen in person.

Later AMD GPUs dropped full support for Mantle, and it was eventually recommended to run Battlefield 4 in DX11 on new AMD hardware, but that was because of architectural changes in AMD's later revisions of GCN, not a problem with the game API.

Now Mantle runs worse than DX11 on all AMD GPUs, including the 290X. Low-level APIs need long-term support from developers. Do you really believe developers will support their games years after release?
 
Now Mantle runs worse than DX11 on all AMD GPUs, including the 290X. Low-level APIs need long-term support from developers. Do you really believe developers will support their games years after release?

If it's the standard and they want to sell copies, they'll have to support older hardware that is 3-5 years old. Whether they'll patch the game engine to properly support newer hardware that comes out years after release, I'm not sure. The only way this will work is if the big engine developers manage most of this stuff for the game developers who just use the engine; this was stated when Mantle was released. Everyone else should use high-level APIs.
 
Now Mantle runs worse than DX11 on all AMD GPUs, including the 290X. Low-level APIs need long-term support from developers. Do you really believe developers will support their games years after release?
Read the post above yours.

The problem isn't that the developers needed to continue the support; the problem was that AMD changed the API, made changes to the hardware, and then stopped supporting the API themselves. That should not be a problem with either DX12 or Vulkan, as they are more mature now.
 
I think low DX12 performance probably has a few root causes:

1. This game just launched across many systems, and I'm guessing the hard work to make DX12 shine just hasn't been done. Triple-A game dev like this is always a rush job, and spending a bunch of dev time to improve performance on only one slice (W10-only) of the platforms you are launching on just isn't going to happen. Look at how BF4 was developed on PC across its lifetime; the game really didn't shine until DICE LA put in lots of time, including public test servers, long after its launch.

2. Pretty good DX11 performance. Similar to the above: if DX11 is performing quite well, why waste pre-launch dev time trying to boost a portion of a portion of the player base's experience?

3. Windows 10 performance issues. As someone still on 7, I haven't experienced this, but I've heard a lot about W10 suddenly using a ton more system resources for relatively unimportant processes/services. I think Luke at LTT is looking into this, as performance on W10 in general, unless you are running a beast of a machine, seems to be inconsistent due to W10 fiddling with shit in the background. I'd love to see if [H] or other readers have seen inconsistencies due to background W10 processes.

I really hope DICE develops BF1 long-term like they did BF4, and I hope they put in the time to make DX12 a viable and worthwhile option. Overall, though, I really wish more developers would embrace Vulkan, as we have seen it create big, tangible performance benefits that people still on W7 can access.
 
I think low DX12 performance probably has a few root causes:

1. This game just launched across many systems, and I'm guessing the hard work to make DX12 shine just hasn't been done. Triple-A game dev like this is always a rush job, and spending a bunch of dev time to improve performance on only one slice (W10-only) of the platforms you are launching on just isn't going to happen. Look at how BF4 was developed on PC across its lifetime; the game really didn't shine until DICE LA put in lots of time, including public test servers, long after its launch.
Unlikely to be the case as much here, because the Frostbite engine has its own dedicated development team that works on the engine outside of game-specific projects, IIRC.
 
Unlikely to be the case as much here, because the Frostbite engine has its own dedicated development team that works on the engine outside of game-specific projects, IIRC.
Hmmm, I wonder how many game-specific refinements they have to make. It's a shame that Battlefront only supports DX12 on Xbox; it would be interesting to see if the same performance issues were there too.
 
Read the post above yours.

The problem isn't that the developers needed to continue the support, the problem was that AMD changed the API, made changes to the hardware, and then stopped supporting the API themselves. That should not be a problem with either DX12 or Vulkan as they are more mature now.

I guess the only way to show people that low-level APIs don't work on PCs is to wait a few years. Time will prove that.
 
Hmmm, you're using a 4-core/8-thread CPU. Any way you can use a 6-8 core with HT to see if DX12 shows improvement? Some benchmarks elsewhere show a pretty hefty increase in DX12 with more cores.

Do you guys plan to do a CPU test?
Not specific to BF1, but they did do a couple articles looking at scaling between 4c/8t, 8c/16t and 10c/20t processors.

Introduction - DX11 vs DX12 Intel 4770K vs 5960X Framerate Scaling
DX12 Does What - DX11 vs DX12 Intel 6700K vs 6950X Framerate Scaling

What should be noted is that in a GPU-limited scenario, which is where most video games are, the more threads a CPU has available, the worse performance gets when moving to DX12.
Yes, but I don't think it's a VRAM issue; the 480 has 8GB, and at 1080p I doubt it's eating 8GB of VRAM in the campaign mode. My theory is it's related to either driver VRAM "management" (not capacity) or thread distribution.
One interesting thing of note, now that we're seeing more cross-platform titles come over from the Xbox One to the Windows Store, is console code that is never changed for the PC environment. In particular, Forza Horizon 3 seems to rely more on system memory than video card memory, even going so far as to create its own exclusive 1GB cache. I'm sure this coding is due to the unified memory environment on consoles. The result, of course, is a hitching mess of a game on most PCs. Patches have made performance better, but the root cause still exists. The texture streaming the game employs, combined with the memory management, means that hitching gets worse the higher the framerate goes. No one should ever need to limit their game to 30 FPS to get around these kinds of issues.

Now I'm not saying that this is what is happening in BF1, but to your point it is interesting nonetheless to see these kinds of issues being widespread.
 
Not specific to BF1, but they did do a couple articles looking at scaling between 4c/8t, 8c/16t and 10c/20t processors.

Introduction - DX11 vs DX12 Intel 4770K vs 5960X Framerate Scaling
DX12 Does What - DX11 vs DX12 Intel 6700K vs 6950X Framerate Scaling

What should be noted is that in a GPU-limited scenario, which is where most video games are, the more threads a CPU has available, the worse performance gets when moving to DX12.

Thanks Armenius, those are some good links and well worth reading.
 
...However I'll honestly say that I actually don't know any studio that is developing games in Vulkan....
I'm sure there are; Vulkan is the best thing since sliced bread.
DX12 is Win10-only, which I, like many, simply will not use.
 
I used to run WoW in OpenGL mode; ran a lot smoother than DirectX on my retired system.
 
There are many, many devs who are ready to write off DX12 like DX10 before it.

Edit: I decided I'd best explain a bit about why I said this.
Firstly, remember I'm a tester, not a developer. I do come into contact with some of the code, but this is more to carry out my work as a senior tester than to actually understand the code as a full developer would.
Many developers who are now wading into DX12 are taking a similar approach to it as they did with the console LLAPIs. The concept is the same: you have more primitive calls in DX12 rather than the larger monolithic calls of DX11. Therefore, some devs think that what works in a low-level format on consoles should work similarly on PC.

And lo and behold, it doesn't. The immediate problem that many devs seem to encounter right away is CPU bottlenecks. DX12 is supposed to remove CPU bottlenecks, but instead I know the devs are running into them a lot more now. There are many fingers to point, but the biggest culprit seems to be that the Windows OS has a mind of its own in scheduling threads and assigning cores, which differs from consoles.

Am I surprised that DX12 in BF1 has stutter issues? No. Total War: Warhammer had similar problems. The problem gets worse the more tasks you have running in the background.

For a fun experiment, get Prime95 and run a few threads to load a core or two. Watch performance tank in DX12 whereas DX11 seems to get along fine.
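If you don't have Prime95 handy, a rough stand-in for the experiment is easy to script. The snippet below is my own sketch, not anything from Prime95: it just spins up busy-work processes to occupy a couple of cores while you flip the game between DX11 and DX12.

```python
# Rough Prime95 stand-in: pin down N cores with pure busy-work while you
# alt-tab into the game and compare DX11 vs DX12 frame times.
import multiprocessing
import time

def burn(seconds):
    """Spin-loop for the given duration, eating roughly one core."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

def load_cores(n, seconds):
    """Start n busy-loop processes and return them for later join()."""
    procs = [multiprocessing.Process(target=burn, args=(seconds,))
             for _ in range(n)]
    for p in procs:
        p.start()
    return procs
```

For example, `load_cores(2, 300)` loads two cores for five minutes, similar to running two Prime95 worker threads.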

Edit 2: I might as well plant my flag. I don't like the way DX12 is going. Looking at the industry and talking to others, DX12 is polarizing. You have champions who love DX12 and believe the future lies that way. There are others who think DX12 is insanity and is just bringing a fractious future. I'm in the second camp. They should have improved DX11, not created this fragmentation with DX12. It's not like we didn't have enough component permutations in the PC environment before introducing code paths for multiple vendors!

The root issue here is simple: Developers have almost no control over thread scheduling on Windows.

Windows thread scheduling is almost comically simple, but effective: the highest-priority runnable thread(s) always execute. They will be assigned to any free CPU cores, regardless of where they previously executed. This maximizes total application throughput, but at the expense of latency (threads can stall out for a few ms, jumping across CPU cores can require data to be copied across CPU caches or additional memory reads, etc.).

Now you introduce a low-level API, where developers are attempting to do what they do on consoles and optimize every little facet of thread/memory management. And they are finding the OS is NOT going to cooperate with them. Windows is not a low-level RTOS and never will be, and trying to treat it like one is a recipe for disaster.

Take your Prime95 case: in a DX11 app, you have two VERY heavy threads that do about 90% of the work. On a quad-core CPU, Windows can schedule this without any major problems (albeit with some latency spikes), even with two Prime95 threads going. Now take DX12, where you have a lot of little threads, all of which want to run at the same time. Meanwhile, you only have two CPU cores free to use, because Prime95 is eating two of them. You now have the classic situation where all those threads are bumping each other, each one trying to complete, and as a result you suffer significant latency and performance loss.

This is the type of thing that, I'd like to point out, I predicted would happen. What I ultimately expect is that devs will learn their lesson and dial back to using just a handful of threads, rather than trying to micromanage everything to death.
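The contention effect described above can be put into a toy model (my own sketch; every constant is made up): frame time is the work divided across the free cores, plus a context-switch penalty that grows once there are more runnable threads than cores to run them.

```python
def frame_time(thread_costs, free_cores, switch_cost=0.1):
    """Toy model of frame time under round-robin timeslicing.

    thread_costs: per-thread work in ms (hypothetical);
    free_cores: cores not eaten by background load (e.g. Prime95);
    switch_cost: hypothetical ms lost per scheduler rotation per thread.
    """
    work = sum(thread_costs)
    runnable = len(thread_costs)
    # Every runnable thread beyond the free core count forces the
    # scheduler to rotate; each rotation touches every runnable thread.
    extra = max(0, runnable - free_cores)
    return work / free_cores + extra * runnable * switch_cost

# DX11-style: two heavy threads plus two light helpers, 2 cores free.
dx11 = frame_time([9, 9, 1, 1], free_cores=2)   # 10.8 ms
# DX12-style: the same 20 ms of work split across ten little threads.
dx12 = frame_time([2] * 10, free_cores=2)       # 18.0 ms
```

Same total work, far worse frame time once the little threads start bumping each other on two free cores, which is the shape of the Prime95 experiment above.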
 
I noticed in the review a slight edge was given to the GTX 1060 for having 2 FPS higher maximums. But on the RX 480, the minimums were 1-3 FPS better than the 1060's. Honestly, minimum FPS is where it matters, more so than maximums in this case. I'd say both of those cards are dead equal, without even a mention of a winner... within the margin of error, of course.

I focus on minimums and averages, and I couldn't give a rat's ass about the maximum number. Minimums and averages determine consistency and the overall enjoyability (if that's even a word) of the game.
 
Seems a wasted addition: it doesn't add any graphical effects and runs worse (despite DX12 supposedly running better). It's just a checkbox feature unless they add to it in the future.
 
Seems a wasted addition: it doesn't add any graphical effects and runs worse (despite DX12 supposedly running better). It's just a checkbox feature unless they add to it in the future.

Wasted for users, more or less, but not for DICE/EA. They know now that it works pretty darn well, all things considered, and DICE can work on whittling down the issues with the Frostbite engine and getting a solid base into their games (hopefully including BF1).
 
When I made the decision to skip Windows 10, I assumed it would start costing me frame rates in newer games. I never expected reality to be so funny :D

So far the only good low-level API game is Doom, which doesn't even require the Windows 10 junk.
 
Why is everyone scared to benchmark Battlefield 1 multiplayer on 64-slot servers?
 
I'm sure there are; Vulkan is the best thing since sliced bread.
DX12 is Win10-only, which I, like many, simply will not use.

Bingo. DX12 is a bust, and artificially limiting it to Windows 10 to try to force upgrades was also a fail. Now that Windows 7 is actually clawing back market share from 10 as people wake up and roll back, DX11 will remain the game development target for years to come.

Vulkan proliferation is what the industry really needs, so we move away from this ludicrous situation where the graphics API is shackled to Microsoft's marketing whims, and Doom was a great proof of concept of that.
 
DX12 is bad for PC gaming, not good. And here is another example of it.

It's only going to get worse and worse as future GPUs are released and games are not updated for proper support.

The entire concept is made for a fixed, static hardware platform, aka consoles. Not the random mix of hardware and constant changes that the PC is.

Yeah, it's looking more and more like that is the case. Programming "closer to the hardware" is a good idea in theory, but on a platform as diverse as the PC it has just as much potential to cause problems as to fix them. It has shown a benefit in a few games, but there seem to be an equal number of games where it actually hurts performance. I am beginning to wonder if some of those games just weren't well optimized for DX11 in the first place, too.

Doom saw big improvements with Vulkan, but that was also an OpenGL game by default, and AMD is notorious for having bad OpenGL drivers; even Nvidia probably doesn't put nearly as much effort into them as into DirectX, since that's what most games use these days.

In this case the DX12 port more or less seems like a rough console copy and not much else.

I am getting that impression from a lot of the Microsoft games ported from Xbone as well. None of them seem to run particularly well on PCs (even those with hardware many times faster than Xbone hardware). A lot of people thought the similar architectures between the new consoles and PCs would lead to better ports, but it may just be leading to easier/lazier ports for the devs while PC gamers see no benefit apart from a few more games being available on PC.
 
DX12 is the biggest gimmick and bullshit released by Microsoft to get people to upgrade to a shitty OS called Windows 10. I game on Windows 7 SP1, DX11 + SLI 1080s, and it is the ultimate gaming platform. Gotta laugh at people who thought DX12 would do something magical for them.
 
DX12 is the biggest gimmick and bullshit released by Microsoft to get people to upgrade to a shitty OS called Windows 10. I game on Windows 7 SP1, DX11 + SLI 1080s, and it is the ultimate gaming platform. Gotta laugh at people who thought DX12 would do something magical for them.

I have a feeling it's more about selling Xboxes than PCs ;)
 
DX12 is the biggest gimmick and bullshit released by Microsoft to get people to upgrade to a shitty OS called Windows 10. I game on Windows 7 SP1, DX11 + SLI 1080s, and it is the ultimate gaming platform. Gotta laugh at people who thought DX12 would do something magical for them.
It's great you have an opinion based on your experience with a particular set of hardware, but... for those of us with AMD setups, Win 10 is a considerable boon over 7, and DX12/Vulkan have been pretty kind as well.
 
Just because DX12 is tearing down your Nvidia performance does not mean the API is a non-starter. For AMD users it has helped in a number of games (Ashes, Deus Ex: MD, Hitman, the DX12 Microsoft games...). The API has much potential, way beyond the limitations of DX11; just give it time to mature and better tools to help developers do better as more experience is gained. The whining here, well, the way it sounds is kinda funny.
 
I urge some rational and objective thinking when it comes to flag waving for your preferred IHV please.

If you believe in DX12 for the benefits that DX12 is going to bring to the PC game development environment and its advances, good, keep that up.

However, if you are waving the flag of DX12 to cover up a certain IHV's ineptitude, then beware. DX12 does not favor one IHV over the other. One IHV had a huge lead due to their initial outreach efforts, while the other could not be bothered. Now that IHV has gone quiet, while the other IHV still cannot be bothered. All DX12 has done for one IHV is remove a lot of their processing bottlenecks. Don't forget that each time you read the data sheets, one IHV always seems to have a TFLOP advantage that never quite translates into an FPS advantage.

Another point not raised often enough: getting help with DX12 is even more intrusive than with DX11. More access to your source code is needed, and many studios balk at giving outsiders greater access to their full development code.

Curse of the even numbered DX version indeed.....
 
IMG_4768.JPG
4K gaming, yum.
 
I urge some rational and objective thinking when it comes to flag waving for your preferred IHV please.

If you believe in DX12 for the benefits that DX12 is going to bring to the PC game development environment and its advances, good, keep that up.

However, if you are waving the flag of DX12 to cover up a certain IHV's ineptitude, then beware. DX12 does not favor one IHV over the other. One IHV had a huge lead due to their initial outreach efforts, while the other could not be bothered. Now that IHV has gone quiet, while the other IHV still cannot be bothered. All DX12 has done for one IHV is remove a lot of their processing bottlenecks. Don't forget that each time you read the data sheets, one IHV always seems to have a TFLOP advantage that never quite translates into an FPS advantage.

Another point not raised often enough: getting help with DX12 is even more intrusive than with DX11. More access to your source code is needed, and many studios balk at giving outsiders greater access to their full development code.

Curse of the even numbered DX version indeed.....
Do you ever have anything pleasant to say? Always the bringer of the black cloud. And far too many assumptions about the actual involvement of the IHVs. Your position at your job at best gives you little insight into the vast market, and using it as a prop for the inclusion of your so-called facts is improper in any venue.

When one can distance themselves from petty pissing contests, aka FPS meters, they can then see what DX-whatever can bring and what its limitations are.

Oddly enough, looking at the graphs, DX12 looks very flat, with the plummets being the negatives and the points that need attention. But since DX11 is about as good as it is going to get, a reasonable observer can see the absolute benefit that DX12 can bring. It is far too early to complain that it hasn't supplanted its predecessor.
 
It is far too early to complain that it hasn't supplanted its predecessor.

In your view, when will there have been time enough? Quite some time has already passed. Are we talking 2017? 2020? 2025?
 
That benchmark is more than two years old. Battlefield 4 now runs far better in DX11.

Direct3D 11 vs. Mantle - Battlefield Hardline Performance Video Card Review

Battlefield 4 - AMD Radeon R9 Fury X Video Card Review

Remember, AMD invested millions of dollars to implement Mantle in Frostbite 3 and focused on optimizing it for a few AMD GCN1 GPUs, and this is what they achieved.
 