Battlefield 1 Video Card DX12 Performance Preview @ [H]

Do you ever have anything pleasant to say? Always the bringer of the black cloud. And far too many assumptions about the actual involvement of IHVs. Your position at your job gives you, at best, little insight into the vast market, and using it as a prop for the inclusion of your so-called facts is improper in any venue.

When one can distance oneself from petty pissing contests (aka FPS meters), one can then see what DX-whatever can bring and what its limitations are.

Oddly enough, looking at the graphs, DX12 looks very flat, with the plummets being the negatives and the points that need attention. But since DX11 is about as good as it is going to get, a reasonable observer can see the absolute benefit that DX12 can bring. It is far too early to be complaining that it hasn't supplanted its predecessor.

So be it. I'm not arguing with a flagwaver.
 
I urge some rational and objective thinking when it comes to flag waving for your preferred IHV, please.

If you believe in DX12 for the benefits that DX12 is going to bring to the PC game development environment and its advances, good, keep that up.

However, if you are waving the flag of DX12 to cover up a certain IHV's ineptitude, then beware. DX12 does not favor one IHV over the other. One IHV had a huge lead due to their initial outreach efforts while the other could not be bothered. Now that IHV has gone quiet, while the other IHV still cannot be bothered. All DX12 has done for one IHV is remove a lot of their processing bottlenecks. Don't forget that each time you read the data sheets, one IHV always seems to have a Tflop advantage that never seems to translate into an fps advantage.

Another point not raised often enough: getting help with DX12 is even more intrusive than with DX11. More access to your source code is needed, and many studios balk at giving outsiders greater access to their full development code.

Curse of the even numbered DX version indeed.....
Well, I hope that chain of thought reflects your view of yourself as well. I would like to hear from DICE what they think regarding DX12 in BF1 and the future of DX12. If there are issues with either DX12 or Windows, then those can be revealed and hopefully ironed out. DX11 plainly sucks and is indeed at its limits; from draw calls to effective usage of a CPU, there is reason enough to move on to a better API. If developers are having problems with DX12, I would like to hear from them why. If IHVs do not give developers enough support to easily tap into the hardware's strengths, then that can improve as well. There is no reason the Vulkan results with Doom on AMD cannot also be achieved with DX12 for AMD or Nvidia. It may take more work, but now developers can exploit the hardware, whereas before those features were just left dormant under DX11.

I really don't understand how hard it is to understand: AMD hardware was designed around threading, and giving it an API that is much less threaded will leave it with some handicaps. Vice versa, I believe, with Nvidia's more serial approach to the universe; maybe that is the reason Nvidia tanks in a lot of cases (by no means all) with DX12. What I believe does not matter; what are the facts, and what does the data indicate?
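To make the threading point concrete, here is a toy sketch (plain Python, not real graphics code; all names are invented for illustration) of the difference between DX11-style single-threaded command submission and DX12-style parallel command-list recording: the total work is identical, but an explicit API lets the recording happen on several cores at once.

```python
from concurrent.futures import ThreadPoolExecutor

def record_commands(draws):
    # Stand-in for filling one command list; in a DX11-style model all
    # recording funnels through a single immediate context/thread.
    return [f"draw:{d}" for d in draws]

draws = list(range(8000))

# DX11-style: one thread records everything.
serial = record_commands(draws)

# DX12-style: recording is split across threads into independent
# command lists, then submitted in order on one queue.
chunks = [draws[i::4] for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    lists = list(pool.map(record_commands, chunks))
parallel = [cmd for lst in lists for cmd in lst]

# Same work either way; only the recording parallelism differs.
assert sorted(parallel) == sorted(serial)
```

The sketch only models who gets to record work, not GPU execution; the claim in the post is that hardware designed for the parallel pattern is handicapped when forced through the serial one.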
 
Now DX 12 in the Beta is reported to have been the opposite of the launch version: even faster than DX 11. Kinda interesting thread:
Is dx12 working for any one?

If DX 12 performance was previously greater than DX 11, as well as greater than the current DX 11, then there may be a patch that will allow it to perform much better.
 

What do I think regarding the DX11 and DX12 situation? It is scarily mirroring the situation we had with DX9 and DX10.
DX10 brought benefits over DX9 and even some capabilities that DX9 does not have. However, DX10 was hampered by 2 major issues (imo): 1. Vista was universally hated and the platform was not well adopted; 2. DX10 did not bring enough to the table for people to feel they were missing out on a lot by not going DX10.
DX11 and DX12 are now in a similar boat. The problems DX12 faces again come down to (imo) 2 major reasons: 1. DX11 does everything DX12 can, albeit less efficiently; 2. There is still a boatload of people on Win7, even though Win10 isn't nearly as bad as Vista.

Will adoption of DX12 be better and faster? Maybe. But I say maybe because a) AMD's marketshare is so small that the push from them to go DX12 is relatively weak, and b) Nvidia really cannot be bothered. (If Nvidia reps want to challenge that viewpoint, please put up or shut up! Actions, not words!)

I'll repeat: the biggest challenge to DX12 adoption is that DX11 is good enough!

Of course, flagwavers of both IHVs have their viewpoints regarding DX12, but everything I have read so far seems to suggest they view this as a silo of us-vs-them and completely fail to take into account the development challenges of DX12. Does DX12 favor AMD? Yes, but not because DX12 inherently favors AMD. At the moment, most DX12 games fall into one of 2 buckets of performance advantage for AMD: either the game is sponsored by AMD and thus has many optimizations that work for AMD, or the game has had a lot of the DX11 bottlenecks that hamper AMD hardware removed.

As for Nvidia tanking in DX12, that is a perception thing, because I feel people think Nvidia needs to show gains similar to AMD's when moving from DX11 to DX12. Unfortunately for Nvidia, their horsepower is already tapped out in DX11, so going to DX12 they have nothing extra left to give. Hence the near-negligible gains seen by Nvidia hardware going from DX11 to DX12.
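The "tapped out" argument can be put into a back-of-the-envelope model (hypothetical numbers, purely illustrative, not measured data): a frame can't finish faster than the slower of GPU work and CPU-side submission work, so cutting submission overhead only helps the card that was CPU-limited in the first place.

```python
def fps(gpu_ms, cpu_submit_ms):
    # A frame is bounded by the slower of GPU work and CPU-side
    # driver/submission work (a deliberately crude model).
    return 1000.0 / max(gpu_ms, cpu_submit_ms)

# CPU-limited case: DX12 cutting submission overhead raises fps a lot.
print(fps(10.0, 16.0), "->", fps(10.0, 4.0))   # 62.5 -> 100.0
# GPU-limited case: the GPU is already tapped out, so the same
# overhead reduction barely moves the needle.
print(fps(16.0, 8.0), "->", fps(16.0, 4.0))    # 62.5 -> 62.5
```

Under this model, identical API improvements produce very different headline gains depending on where each vendor's bottleneck sat under DX11, which is the post's point.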

Working on a game costs a lot of money. Working on 1 DX version of a game already costs a lot. Working on 2? Not many studios can afford that. Like I mentioned before, some are investing for the future and taking this as a learning curve and cost. That is good for them, but many studios aren't doing that. So how else will studios adopt DX12? Simple: sponsor them. Is Nvidia keen to sponsor DX12? Doesn't seem so from my point of view. Where does that leave us? AMD, right? The question is, will they?

AMD never seems to have any long-term plans when it comes to software, and I believe that is biting them over and over again. Yet they never learn and let themselves be bitten over and over again. If my long memory serves me well, there were the whole shenanigans over the Intel compiler hampering their performance. Well, where's the AMD compiler then? Then the total lack of support for AMD's 3DNow!

People like to say AMD has good drivers, but really? AMD drivers are stable, yes, but good? Those drivers seem to be hindering the hardware's performance; how is that good? Is an old lady driving a Formula 1 car a good driver? She's stable and won't crash, but is that a good driver?

DX9 was supposed to be AMD's, and yet they failed to build on that base, instead handing the technological lead over to Nvidia to eventually build the most one-sided, IHV-favoring DX11. Who remembers Get In The Game rather than The Way It's Meant To Be Played? GPUOpen vs Gameworks? Bullet vs PhysX? DX12 is now here, and from the looks of it, it looks like another AMD DX. But is it? We saw an initial spurt of AMD support, but that's now dying out. AMD has gone very, very quiet on DX12 lately. Why? I don't know: ran out of money, ran out of momentum, seriously no idea. Anyone with any oomph with AMD, go kick their ass over this, thank you very much.

Vulkan? No chance. And that's my honest opinion. Before you climb on my back over Vulkan, be honest with yourself: how many games do you know of that are written on an OpenGL base? I know id Software did Vulkan Doom, but don't forget, John Carmack is one of the biggest proponents of OpenGL in the business. Every Doom and Quake I know of supports OpenGL. Beyond Vulkan Doom, who else is going to give Vulkan a serious look?

Ultimately guess who in my opinion will be the biggest enemy of DX12? Bean counters. That's who.

As I stated, right now we have 2 versions, DX11 and DX12. DX11 does everything DX12 does. There are advantages that DX12 can bring over DX11, but we can't fully realize them except on the really high-end GPUs. So if we target the mainstream and the majority (Win10 + Win7 platforms), DX11 addresses them all adequately. The hardware we all have will run both DX11 and DX12. It's not exclusive.

So the moment bean counters look at this, how will they ever justify the cost of developing the DX12 version?

Anyway, just to end, I'll give a very personal opinion here. Vulkan is the way to go.
DX12's problem isn't that DX12 sucks; it's Win10 adoption that is the problem. Bean counters look at the target audience with beautifully colored pie charts, see the chunk that doesn't support DX12, and go bananas. Vulkan, on the other hand, supports Windows 10/8/7. However, finding developers who can do Vulkan is way harder than finding developers who do DX. Not to mention the whole paradigm difference between Vulkan and that elephant in the room: the Xbox SDK.
 
Not gonna speak to more than the first part, as I don't feel like spending all night on it.

Win 10, by the Steam survey in September, is at >50%, with Win 7 at ~36-37%, so no, this isn't anything like Vista. So in part that is not even a good argument on your part.

Also, I note that you never seem to comment on the technical differences, only the ignorant views one would expect from those outside the industry with absolutely no knowledge. Why is that?
 
I think he has made good points. Vulkan also spans Linux, even if that's a rather small piece of the pie. So you have ~50% less audience if you go DX 12, and so far nothing really showcases DX 12's benefits -> bean counter -> go DX 11.

Going two paths is also more expensive, but Xbox is now DX 12, so I think developers who develop for Xbox and PC have at least two APIs. Then you add in the PS4, each with its own more to-the-metal API. I can see DX 11 going by the wayside when Windows 10 is at something like 80%. The 20% audience on Win 7 would probably be mostly business users and not gamers.
 
The problem is DX12 just isn't one DX12. You have to optimize for every single uarch and SKU out there, and it's quite obvious when you see DX12 games. We are talking multiple times the work with DX12 vs DX11, and for what benefit? And since the game price isn't going up, where do you take said money from? Where do you take the time from?

In the beginning, the most rabid DX12 supporters claimed DX12 was as easy as a checkbox. Now reality is setting in.

There is a reason why we haven't had this on the PC before, and why we still shouldn't have it. Maybe if there was only 1 IHV left, with barely any innovation, it would work.
 

Do Steam survey results matter?

As far as this forum goes, they don't count, because the Nvidia/AMD marketshare stats in them are clearly wrong. Am I supposed to now believe the survey correctly represents Win10 marketshare?


PS. Sorry I'm being deliberately snarky today. Shit day in the office. Why? DX12 bugs of course!
 
Because the technical differences don't matter if developers drop it due to the difficulty of implementing it on PC, and the money men drop it because it takes too much extra time and effort to implement correctly. Personally, I definitely see the benefit of the increased number of draw calls that are possible, but so far we haven't seen any released game that is able to use them to its advantage and create something really new and exciting. In my opinion, the real benefit will come from building a game on DX12 from the ground up. But who is going to be the first to take that leap? Even all the Xbox games coming to PC thus far have only supported feature level 11_0.
 
I think he has made good points. Vulkan also spans Linux, even if that's a rather small piece of the pie. So you have ~50% less audience if you go DX 12, and so far nothing really showcases DX 12's benefits -> bean counter -> go DX 11.

Vulkan is now the default graphics API for Android from 7.0 forward. I'd say that's significant long term, since developers will have that much more justification to the bean counters that standardizing on Vulkan is the way to go, so all platforms can be hit with one API, and they'll very easily be able to create scaled-down versions of their games for mobile thanks to SPIR-V.

Going two paths is also more expensive, but Xbox is now DX 12, so I think developers who develop for Xbox and PC have at least two APIs. Then you add in the PS4, each with its own more to-the-metal API.

"Xbox DX12" =/= Windows PC DX12. They are still two separate codepaths, despite MS marketing trying to create the perception that developers can target both platforms with a single API. Ask any developer: it's simply not true. DX on Xbox has always been a stripped-down and highly optimized/customized affair with little in common with the PC x86 version.

I can see DX 11 going by the wayside when Windows 10 is at something like 80%.

Not in its current clusterfuck, perpetual-beta state of broken updates and now shrinking marketshare. Last month saw Windows 10 actually lose marketshare while Windows 7 *gained*. This is unprecedented in major Windows releases. Tomorrow we'll get a new report, and I don't expect it's going to be good news for Windows 10, since nothing has improved since last month.

Windows 10 will never be "80%". Never. They'd need a major "return to sanity" release a la Windows 11 that removes all the consumer-hostile crap and gives users and businesses back options and control before any Windows version will ever reach anywhere near that marketshare. 10 has too much working against it now, most of all Microsoft.

The 20% audience on Win 7 would probably be mostly business users and not gamers.

Not sure where you get "20%" on Win7. If we go by the stats that the Win10 zealots are always quick to point out (the Steam HW Survey), we see Windows 10 at 47.28% (and it lost 0.16% recently), while Windows 7 + 8 + 8.1 make up a combined 46.41% (gained 0.31%). Bottom line: 7 & 8.x combined are nearly half the market, so it makes no financial sense for developers to target DX12 now or for years to come, especially with Win10 uptake sputtering.
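For what it's worth, the bean-counter arithmetic on those quoted Steam figures is short; this just restates the percentages from the post above as an addressable-audience calculation.

```python
win10 = 47.28        # Steam HW Survey shares quoted above, in percent
win7_8_81 = 46.41    # Windows 7 + 8 + 8.1 combined

dx12_reach = win10               # DX12 requires Windows 10
dx11_reach = win10 + win7_8_81   # DX11 runs on 7, 8, 8.1 and 10

print(f"DX12-only title reaches ~{dx12_reach:.2f}% of surveyed users")
print(f"DX11 title reaches      ~{dx11_reach:.2f}%")  # ~93.69%
```

Roughly 47% reach versus 94% reach for the same game, which is the whole financial argument in one subtraction.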
 
Well, I like the Vulkan idea; that would streamline the number of APIs: one for each console and then Vulkan for everything PC, which would support all OSes from Windows 7 up, as well as Linux. Macs? I would think the bean counters would go for that, plus Valve would probably help there as well. id Software, I can see, will have a very clear advantage here (leveraging the hardware with broad OS support much more easily). Agreed, Win 10 will have a hard time getting over a certain point for a while. Now, what surprised me was when I went to my wife's doctor's office: all their PCs had Win 10 on them. So Win 10 is starting to penetrate the professional market, at least in her doctor's office.
 
DX12 has really turned out to be a disappointment so far. Why are we seeing these performance discrepancies, particularly ones as large as we see in BF1, between DX11 and DX12? I understand that DX11 is likely better optimized, owing to developer familiarity along with the extensive optimizations that both AMD and NVIDIA have made in their drivers over the years, but DX12 by all rights should be the superior API. I don't get it.

In DX12, the developers have to do a LOT more low-level optimization, which is REALLY hard to do properly. There's a reason why the past 20 years of software APIs have done everything possible to hide low-level implementation details; suddenly exposing all of this to developers, many of whom have never had to deal with it before, is bound to have negative performance impacts.
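A tiny toy model (plain Python, invented names, no real D3D types) of the kind of bookkeeping that moves from the driver to the application in an explicit API: the app must now track resource states and issue the transitions itself, and forgetting one is an application bug rather than something the runtime silently fixes.

```python
class Resource:
    """Toy model of a GPU resource whose state the app must track.

    In a DX11-style runtime the driver inserts these transitions for
    you; in an explicit API, a missing one is a hazard you shipped.
    """
    def __init__(self, state="COMMON"):
        self.state = state

    def transition(self, new_state):
        self.state = new_state  # stand-in for issuing a barrier

    def use_as(self, required_state):
        if self.state != required_state:
            raise RuntimeError(
                f"hazard: resource is {self.state}, needs {required_state}")

tex = Resource()
tex.transition("RENDER_TARGET")
tex.use_as("RENDER_TARGET")          # fine: the app issued the barrier

try:
    tex.use_as("SHADER_RESOURCE")    # forgot the barrier back
except RuntimeError as e:
    print("caught:", e)
```

Multiply this by queues, fences, memory residency, and per-uarch tuning and you get the "REALLY hard to do properly" part of the post above.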
 
I thought that was what a number of developers wanted: low-level access. Those that do not want it can use DX 11 or OpenGL.
 
There is a difference between utopia and the real world. Even DICE falls flat on their ass with DX12.

It all comes down to time and money in the end. Let's say, hypothetically: would you pay $15 more for BF1 and wait 3 months longer for a good DX12 implementation? And do you think all buyers would? If either answer is no = failure.
 
The Beta DX 12 ran much better, faster than the then-current DX 11. That is what was reported. Will a patch catch it up to previous levels? Who knows. I would like DICE to chip in on this, since at this point in time they have a rather big influence on the industry.
 
Maybe DX 12 will separate the real men and women from the boys and girls. Yes, it may be harder to get results, but you can get results that you never could through a layered API. The real masters will come out, in other words.
 
If you mean those that have money and time versus those that don't and aren't willing to spend it? Are you willing to pay more for games and wait longer due to prolonged development times?

You know the famous triangle where you can only pick 2? With DX12, it sounds like you're trying to pick all 3.
 
No, I will buy the best games, and those that can create the best games will get my money. I couldn't care less what API it is; it's just that DX 12 gives the keen developer more options to differentiate themselves from the rest, e.g. id Software. You could not do what id Software did with Vulkan using OpenGL or DX 11; it takes a to-the-metal API, or at least the ability to go to the metal. Your hate for DX 12 is ever present. Or is it that your idol is not doing so well with DX 12 while supposedly having superior drivers (not!)? :p
 
The problem is OpenGL hasn't had the greatest focus, especially not from AMD. And then you have extreme types like id, who do OpenGL/Vulkan out of pure stupidity, to say it directly. Hence why they are the only company doing it. So unless you start to find a lot more companies doing it, it's quite a moot point. And as far as I know, DOOM runs fine in a high-level API.

Still, you didn't answer, while trying to go after the person with your nonsense. What part of the triangle has to give? You can't get everything for free, as you try to. Or is it the same mentality as when DX12 began, that it's "just a checkbox in the engine"?

Pick 2 and only 2.
ironTriangle.jpg
 
The point is that Doom runs slower using OpenGL both on Nvidia and, definitely, on AMD. If Nvidia is the gold standard for drivers (the best OpenGL, the best driver engineering team to optimize a given game) and id comes along and makes to-the-metal optimizations that beat Nvidia's best -> you do the math. Well, if you can add, that is. Low-level APIs do have great potential. Add in DX 12 multi-GPU, and the few games that use it are blowing away the DX 11 versions' SLI/CFX. It is time for the DX 12/Vulkan era to begin. DX 11 and OpenGL should slowly fade out.

DX is the de facto Windows standard; OpenGL isn't.

Again, you didn't answer the question about the triangle. Stop being so blinded and answer it. It's no different from your own workplace, assuming you have a job. Instead, you just ramble on, making even more work for the developer that adds more cost and time, while you praise the low-level APIs to the sky as if they had no downsides.

And we all know not to use sponsored titles, don't we? (Funny that extra money is needed for DX12 titles? :) )
gw4_1920.png
 
How about you pick two from that stupid triangle, seriously.
 

So what has to give? Will DX12 make the game take longer, cost more, or have reduced features/gameplay? :)
 

:ROFLMAO:, man, are you drunk or something? The game has to support DX 12 multi-GPU for it to happen. You didn't answer the question about your own triangle. :whistle:
 

Why is it Deus Ex got mGPU, but only for AMD? Or are we starting to see the limitations of sponsor money at work? Why is the reality never like it gets hyped to be? A checkbox for DX12 in the engine, EMA (explicit multi-adapter) everywhere for free.
 
Why is it Deus Ex got mGPU, but only for AMD? Or are we starting to see limitations from the sponsor money at work? Why is it the reality is never like it gets hyped to be? Checkbox for DX12 in engine, EMA everywhere for free.
I believe Nvidia is working with them on it. Well, in this game it plays better in DX12 for AMD.

Anyway, loaded up Rise of the Tomb Raider on the GTX 1060 (new build for my daughter) and DX11 was hitching in the benchmark; very high preset, SMAA, 1080p gave 77 fps. DX12 at the same settings was smooth as butter but 74 fps. Just the opposite of older versions. The 1060 is surprising me on performance; at 1440p it was 49 fps.
 
The benchmark always hitches on my PC, but it is smooth as silk during actual gameplay in both DX11 and DX12. I prefer the former because VXAO adds a lot of depth to the scene.
 
Only in DX11 for me on the 1060; AMD used to in DX12, but it runs well in either DX11 or DX12 now. Deus Ex runs better on the Nano than the 1070, so I'm playing it on that machine at 1440p. I also have to note, the 1060 looks a little washed out compared to the Nano. I've seen this over and over again between Nvidia and AMD. It is a 10-bit 4K monitor, with both cards running it at 10 bits. With the 1070 I did not notice a difference in color compared to a 290X, but that is an 8-bit monitor.
 
Better rerun these benchmarks. Other sites are showing the Fury X beating the 1070 in this game in DX12. There's been at least one AMD patch for this game.

My i7 4770K at 4.5GHz and my Fury X at stock clocks run this at 1440p and max settings with no slowdown whatsoever, with FreeSync on the Omen 32" 1440p 75Hz display.

My processor hovers around 50% utilization in DX12 and my graphics card is using VRAM in the low 3GB range. The swap file is huge at 12GB.

But no stuttering at all. None whatsoever. Catalyst 11-9 drivers.
 
Can you link those sites please? Curious about their methodology, including whether they analysed frame times, used a good tool to analyse DX12, etc.
TBH you will find performance can swing either way depending on when it's rebenched and who has the optimal drivers and developer patches at that point.
Case in point: AoTS now has Nvidia either beating or matching equal-tier AMD cards, but it was initially well back in DX12 with the Crazy setting.

Cheers
 
What's the deal with DX12? It's been out for a few years now and they still can't seem to get it right yet...
 
Same teething pains as DX10: the adoption of Win10 is not as rapid as believed, leading many developers to still need DX11. Also, DX11 is still more than capable, while most studios still lack DX12 expertise.

At worst, DX12 is heading the same route as DX10, but most likely we are at the stage where DX9 and DX11 used to coexist.
 

http://wccftech.com/battlefield-1-directx-12-benchmarks-amd-nvidia/
or
 

Ah, Khalid is a notorious AMD fanatic; he is citing Hardware.info and GameGPU in the AMD reddit thread, and GameGPU consistently has unusual performance benchmarks that at times can favor one manufacturer or the other.
I notice Hardware.info's performance results did not change from October to November. Another aspect is how well a site captures and analyses the DX12 frame times; to me only a few sites are doing that well, basing it on actual gameplay and doing pretty good work with PresentMon.
This may be further compounded by settings/resolution (unusually, Nvidia seems to be doing better at higher resolutions such as 4K), or even by maps.
BTW, PCGamesHardware.de used more recent drivers than Hardware.info (the site mentioned by Khalid): Hardware.info used 'AMD cards we used the Catalyst 16.10.1, Nvidia video cards, we used driver version 373.06', while PCGH used 'Geforce 375.63 WHQL, Radeon Software 16.10.2' from Oct 19;
And their results had custom 1070s 20% faster than the Fury X in DX11 (where the reviews mentioned by Khalid/reddit had AMD in front even there): http://www.pcgameshardware.de/Nvidi...e-GTX-1070-Xtreme-Gaming-Review-Test-1211600/
Albeit at 4K for those GPUs, but the performance has not changed greatly from the game's early days to that review I linked.
Here is the earlier benchmark test they ran specifically for BF1 with both DX11 and DX12: http://www.pcgameshardware.de/Battl...attlefield-1-Technik-Test-Benchmarks-1210394/
Note the single-game performance is about two-thirds of the way down the page.
And their results are pretty consistent at a glance with GamersNexus (I like how they measure games looking at the 1% and 0.1% lows and how they use PresentMon): http://www.gamersnexus.net/game-bench/2652-battlefield-1-graphics-card-benchmark-dx11-vs-dx12
At lower resolutions (specifically 1080p), GamersNexus's results for the 480 and 1060 (custom AIB cards, bear in mind) also kind of reflect the trend noticed in the review here at HardOCP, where the 480 has the slight edge, albeit with Nvidia showing better 1% and 0.1% behaviour; I would assume drivers and game updates will improve the 1% and 0.1% figures for AMD.

But as you say, it would be worth one of the more respectable review sites (including HardOCP) revisiting this once they have a better feel for the mechanics of the game and how it plays to the strengths/weaknesses of each card; they could then test on multiple maps/settings to highlight each manufacturer's strengths and weaknesses, or show whether the performance is consistent without changing much with maps, etc.
Cheers
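For what it's worth, the 1% and 0.1% "lows" mentioned above are easy to compute yourself from a PresentMon capture. A minimal sketch in Python, assuming you already have the per-frame times in milliseconds (the function name and the convention used are my own, not from any particular site):

```python
def fps_stats(frame_times_ms):
    """Return (avg_fps, 1%_low_fps, 0.1%_low_fps) from per-frame times in ms.

    "Lows" here are computed as the average FPS of the slowest 1% / 0.1%
    of frames, which is one common convention for PresentMon-style data.
    """
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)

    # Sort slowest-first so the worst frames are at the front.
    worst_first = sorted(frame_times_ms, reverse=True)

    def low_fps(fraction):
        k = max(1, int(n * fraction))   # at least one frame
        worst = worst_first[:k]
        return 1000.0 * len(worst) / sum(worst)

    return avg_fps, low_fps(0.01), low_fps(0.001)
```

Note that some sites instead report the FPS equivalent of the 99th-percentile frame time, which gives slightly different numbers, so compare like with like.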
 
Why is everyone scared to benchmark Battlefield 1 on multiplayer 64-slot servers?

Not scared, but it's hard to get a consistent (and comparable) run-through because of the game's dynamic multiplayer nature. Every battle is different, every scenario is different for each run-through, every set of circumstances is different. Since the game is new, it takes time to learn the maps and play style well enough to record a long timed gaming session and compare multiplayer performance.

That said, we did so for BF4, and you can bet on the fact I will also do so for BF1.

In time, I will be able to do multiplayer BF1 performance comparisons. I've been very busy lately with other reviews and games, so I haven't had the time to really get into it much outside of the campaign mode. That will change as I play the game more.
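As a side note on the consistency problem above: one way to sanity-check whether repeated multiplayer runs are comparable is to benchmark the same map several times and look at the spread of the per-run averages. A rough sketch (the FPS figures in the test are invented, and the helper is my own, not anyone's published methodology):

```python
import statistics

def run_consistency(avg_fps_per_run):
    """Summarize run-to-run spread for repeated benchmark passes.

    Returns (mean_fps, sample_stdev, coefficient_of_variation_percent).
    A CV of a few percent or less suggests the runs are stable enough
    to compare cards against each other.
    """
    mean = statistics.mean(avg_fps_per_run)
    stdev = statistics.stdev(avg_fps_per_run)   # sample standard deviation
    cv_pct = 100.0 * stdev / mean
    return mean, stdev, cv_pct
```

If the CV comes out high, the map, server population, or route through the level is probably varying too much between passes to draw conclusions.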
 
Just picked up BF1 during Origin's Black Friday sale. Ran it across 3 machines and it ran great on all three.

On my sig machine I cranked it up to Ultra @ 1440p with the GPU memory restriction set to off and it ran great. Very smooth, FPS in the 80-90 range, VRAM usage just under 4GB.
On a 2600K + HD 7970 I let the auto-detect do its thing. At 1080p it seemed to pick the perfect settings. VRAM usage was around 2.7-2.8GB, GPU pegged at 100%, FPS 60-80.
On an i5 3450 + GTX 680 2GB I again let auto-detect do its thing. 1080p once more, VRAM usage just under the 2GB limit. I forgot to toggle my overlay on this one, but performance was quite playable with no major dips. There was a noticeable degradation of image quality here versus the 7970, though, whereas on the 7970 I didn't really feel like I was missing out on much coming from the 980 Ti.

All my gameplay was done on full or nearly full 64 slot servers.
 