DX12 versus DX11 Gaming Performance Video Card Review @ [H]

I am more of the "I'll worry about the future when it gets here" crowd.

It's pretty hard to convince people to move to DX12 (and thus, W10) when no games perform visibly better under it.

I am on Win 10 by virtue of the fact that several of my games (GoW1, GoW4, and a few very old games, like early-2000s/late-1990s old) will not run on W7, and Win 7 does not support per-monitor DPI scaling or one of the newer USB protocols (can't remember the name right now). I was on Win 10 before my PSU bricked and my computer "fixer" ended up reinstalling Win 7; I only went back to Win 10 last week.

DX12 never really entered that argument. GoW4 might be the only exception, because it is exclusively a DX12 game, but I didn't migrate for DX12's performance over DX11; in fact, that performance is one of the biggest reasons I took so long getting back to it.
 
DX12 reminds me a lot of DX10 at this point, we've been through this before. You all know what happened to DX10, it was quickly replaced in favor of DX11, and DX11 has been great for games. I'm not sure this will happen to DX12, but no one can deny it's got itself off to a rough start. One thing DX11 really helped with was improving game visuals, image quality got better, they were able to do more with graphics. DX12 hasn't even reached this point yet, right now it's all about just making it at least perform as good as DX11, or a little better. It will take another leap to actually use the performance benefits to improve image quality with new graphical features and higher quality geometry and whatnot. It has a ways to go still.


I call this the Microsoft 90-degree course swerve effect. Basically MS carries on a certain development path for some years and some versions, and folks are happy. Then it finds it needs to change course. It does so at 90 degrees, and everyone hates the change in course and the resulting product. Then the next iteration comes out on that same course, and by then everyone accepts it and carries on.
 
Another fail brought to you by Microsoft.
How is smoother gameplay a fail on Microsoft's part? Just because [H] and other sites test best case scenarios, doesn't mean the same applies to all people. Since I can't afford to continually stay current with hardware, I am very thankful for DX12, since I see noticeable improvements in gameplay on my computer. I think DX12 is a win!
 
Nice comparison, but I'd also like to see if frame times are impacted by DX12. Subjectively, I feel like I experience smoother gameplay using DX12 over DX11 in Deus Ex:MD, even though the frame rate is ostensibly the same. I'd be interested to see if this is a real improvement or if it's all in my head.
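For anyone wanting to check whether that smoothness is real rather than in their head, frame-time consistency can be quantified from a capture log (the kind FRAPS or PresentMon produces). A minimal sketch in Python; the frame-time lists are invented sample data, not real captures:

```python
# Rough sketch: two runs with the same average fps but very different smoothness.
# The frame-time lists below are made-up illustration data.

def frame_stats(frametimes_ms):
    """Summarize a list of per-frame render times in milliseconds."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    p99 = sorted(frametimes_ms)[int(0.99 * (n - 1))]  # 99th-percentile frame time
    worst = max(frametimes_ms)
    return avg_fps, p99, worst

steady = [20.0] * 100                 # a flat 50 fps
spiky  = [15.0] * 90 + [65.0] * 10    # same average, with periodic hitches

for name, data in [("steady", steady), ("spiky", spiky)]:
    fps, p99, worst = frame_stats(data)
    print(f"{name}: avg {fps:.1f} fps, 99th pct {p99:.1f} ms, worst {worst:.1f} ms")
```

Both runs average 50 fps, but the spiky one has a 65 ms 99th-percentile frame time, which is exactly the "same fps, feels smoother" effect described above.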
 
"Wow, so why does DX12 suck so badly after all this time? Maybe it's time for DX13 ???"

It does not suck; it is just not being used in an optimal way. An API takes time for programmers to learn and get the most out of. If you take an average development time of 4 to 5 years, the games that have been featured now were not made for DX12. In a few years that cycle will be over, and in that time you will see better use of DX12.

I agree with you but ...

"avg development time of 4 to 5 years" - DX12 has been out for a few years already: "DirectX 12 was announced by Microsoft at GDC on March 20, 2014, and was officially launched alongside Windows 10 on July 29, 2015"

https://en.wikipedia.org/wiki/DirectX#DirectX_12

It just seems like after all this time, one would think DX12 would, at a bare minimum, be performing better than DX11 but it's actually doing worse in some cases. Me thinks it's time for DX13.
 
How is smoother gameplay a fail on Microsoft's part? Just because [H] and other sites test best case scenarios, doesn't mean the same applies to all people. Since I can't afford to continually stay current with hardware, I am very thankful for DX12, since I see noticeable improvements in gameplay on my computer. I think DX12 is a win!

Sounds like placebo, or you own Microsoft stock.
 
Sounds like placebo, or you own Microsoft stock.
Score!!! First time I've been accused of being a shill!

I know it's feeding the troll, but I can't help but take this bait -

Placebo effect or not, isn't that what gaming is about - my personal experience? If someone thinks that 100fps is a significant improvement in the gaming experience over 85fps when gaming on a 60Hz monitor, more power to them. If I perceive a better experience in DX12 than in DX11, then isn't that really what's important?
 
I just remembered that propaganda:
[attached image: dx12.png]


No, so far IQ is exactly the same between the APIs; DX12 isn't being used to enhance IQ at this point.
When I last checked it was not possible to enable the best possible AO in Tomb Raider (VXAO) in DX12. Is that still the case?
 
I am more of the "I'll worry about the future when it gets here" crowd.

It's pretty hard to convince people to move to DX12 (and thus, W10) when no games perform visibly better under it.
Exactly.

My gfx cards last between 1.5 and 2.5 years.
In that time they play everything admirably, and anything new and upcoming is catered for by the next card.
I always look forward to a gfx card update, not always because it's needed, but it's new and shiny and I can go silly on some gfx settings :)

This time it's useful, not for DX12, but for 4K DSR at 1080p (silly, heh) and better VR performance (much needed to reduce aliasing).
And I now have a decent route to a 4K PC display.
DX12 has brought nothing to my table; it has only been worse.
In Tomb Raider the VXAO option isn't available under DX12. DX12 looks worse.
 
DX12 reminds me a lot of DX10 at this point, we've been through this before. You all know what happened to DX10, it was quickly replaced in favor of DX11, and DX11 has been great for games. I'm not sure this will happen to DX12, but no one can deny it's got itself off to a rough start. One thing DX11 really helped with was improving game visuals, image quality got better, they were able to do more with graphics. DX12 hasn't even reached this point yet, right now it's all about just making it at least perform as good as DX11, or a little better. It will take another leap to actually use the performance benefits to improve image quality with new graphical features and higher quality geometry and whatnot. It has a ways to go still.
Not to sound like a troll, but for some that leap could be != Win10 aka Win11 / Win7 SP3 ;)
 
Sounds like placebo, or you own Microsoft stock.
Or not; I see it too. At least I did until my mobo died. My sig system saw improvement in almost all DX12 games, and the ones that didn't improve fps-wise seemed to play smoother, without as many hitches or stutters. CPU usage was visibly smoother in any monitoring software. I'm still sticking to the idea that DX12 shows better improvement on lower-spec'd systems.
 
DX12 reminds me a lot of DX10 at this point, we've been through this before. You all know what happened to DX10, it was quickly replaced in favor of DX11, and DX11 has been great for games. I'm not sure this will happen to DX12, but no one can deny it's got itself off to a rough start. One thing DX11 really helped with was improving game visuals, image quality got better, they were able to do more with graphics. DX12 hasn't even reached this point yet, right now it's all about just making it at least perform as good as DX11, or a little better. It will take another leap to actually use the performance benefits to improve image quality with new graphical features and higher quality geometry and whatnot. It has a ways to go still.

I totally agree and have been thinking the same for the last year. I kind of remember that the gap from DX5 or so to DX8 was almost unnoticeable in terms of my perceived experience. DX9.0a to DX9.0c was unusually dramatic visually, and Witcher 2 really pulled it off by showing how demanding even 9.0c could be. Dead Space 1-2 were cool; let's try to forget 3. I remember when DX10 came out and I thought the new textures looked cool. Problem was that you really needed Vista to get it. I got lucky on a Black Friday and got an ATI HD 2600 that let my P4 4.3GHz, first-gen SATA RAID, XP SP3 box have some of the DX10 stuff. It was cool. I think I barely had a year with it before DX11 started to appear, and in that time we all got to hear that famous question, "Can it run Crysis?". By then a Core 2 Quad and 8800 GT were fun.

When DX11 came out I remember being so mad, because for the first 12-24 months most games were still using 9.0c. Then when they did start using it, it became obvious I was going to need some serious GPU hardware to play at 1080p/60fps, i.e. Crysis 2-3, both Metros, Tomb Raider. At that point my SLI phase had started. DX11 isn't fully matured; texture packs and compression methods continue to grow and change. I believe in about a year we will start to see a more mainstream approach to DX12, especially now that NV has mostly finished releasing the latest gen cards and AMD is soon to follow.
 
Ok, got a chance to do my benchmarks. Really the first time I've done anything like this. I did three passes of the built-in benchmarks in DX11 mode, then switched to DX12 mode. Latest Win10 and game patches. Hardware, as mentioned before, is a Xeon E3-1230 v2, so not overclocked; Gigabyte Windforce Radeon R9 285 at factory settings; 8GB DDR3-1600 RAM. I used the default "high" settings, but turned off lens flare, motion blur, chromatic aberration (DX:MD), and film grain (RotTR). Screen resolution is my monitor's native 1600x1200.

First, these are the settings I have played through DX:MD, and gameplay seemed fine. Probably some stuttering, but not distracting from gameplay. I played through the game in DX12 mode. I haven't played RotTR yet, but figured I could use similar settings. Second, without the numbers, the benchmark seemed a bit smoother in DX12 than in DX11.

Here are the average numbers from the three passes:

DX:MD - DX11 - Min 32.67, Max 60.40, Avg 47.57. One of the passes had an abnormally low minimum; the other two had 37fps minimums.
DX:MD - DX12 - Min 37.60, Max 58.30, Avg 47.57. Surprised me that the averages were exactly the same!
Conclusion - no perceptible difference based on the numbers between DX11 and DX12, but DX12 felt smoother.

RotTR - DX11
Mountain Peak - Min 21.85, Max 131.15, Avg 56.37
Syria - Min 6.55, Max 92.52, Avg 54.29
Geothermal Valley - Min 6.47, Max 73.86, Avg 50.52
Overall - 53.67
RotTR - DX12
Mountain Peak - Min 41.87, Max 74.71, Avg 58.04
Syria - Min 29.62, Max 89.25, Avg 58.01
Geothermal Valley - Min 5.25, Max 82.67, Avg 55.91
Overall - 57.28
Conclusion - Lower highs, but better averages in DX12. Some of the minimums were way higher, and there was a definite feel of smoother motion. I think the extreme low in Geothermal Valley is due to one area of texture loading near the end, where the screen stuttered badly, nearly frozen, for about a second.
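For anyone repeating this, averaging the min/max/avg over multiple passes is trivial to script. A quick sketch in Python; the pass numbers here are placeholder values, not my actual runs:

```python
# Average min/max/avg fps across repeated benchmark passes.
# The tuples below are placeholder illustration values.

def average_passes(passes):
    """passes: list of (min, max, avg) fps tuples, one per benchmark run."""
    n = len(passes)
    return tuple(round(sum(run[i] for run in passes) / n, 2) for i in range(3))

passes = [(32.0, 60.0, 47.5), (37.0, 61.2, 47.6), (37.0, 60.0, 47.6)]
print(average_passes(passes))  # (35.33, 60.4, 47.57)
```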

So in my case, with these two games, there is noticeable improvement using DX12, on what would be considered good hardware for circa 2013. I can't afford to upgrade now, though something may be on the "hoRyzen," so I am certainly glad there have been some DX12 developments.

Now you all have given me the bug to try out different things and benchmark my system :p. Next thing you know I'll be downloading 3DMark and MSI Afterburner and tweaking like crazy!

DX 11 with VXAO, 3440x1440, SLI 1070 Max setting except Reflections on Normal - 71fps. Smooth fps no artifacts
DX 11 with HBAO+, same as above but AO - 79fps, Smooth fps no artifacts
DX 12 with HBAO+, Multi-GPU, same settings above - 80fps - Stuttering, rendering artifacts

Since the best image quality configuration benchmarks above the refresh rate of my 60Hz monitor, I can use adaptive sync and get the highest quality rendering and the smoothest gameplay: DX 11 with VXAO wins. More fps is not better here since, from my perspective, a 60Hz monitor cannot display the faster frame rate and would tear, making the image quality even worse.

Now, just because DX 12 in this game does not give any clear benefits over DX 11 does not mean other games will be the same. I still believe DX 12 has a lot more potential than DX 11 ever had.
 
I don't think any of those games are native DX12 games; they're just DX11 games with DX12 tacked on. Pointless article. PC gamers should know better.
 
I don't think any of those games are native DX12 games; they're just DX11 games with DX12 tacked on. Pointless article. PC gamers should know better.

True. But obviously there is this desire to compare DX11/OGL to DX12/Vulkan, and games that support both paths are the only way to do it. Though I think I get your point: these low-level APIs aren't going to spread their legs just by doing what the old ones did.
 
"Wow, so why does DX12 suck so badly after all this time? Maybe it's time for DX13 ???"



I agree with you but ...

"avg development time of 4 to 5 years" - DX12 has been out for a few years already: "DirectX 12 was announced by Microsoft at GDC on March 20, 2014, and was officially launched alongside Windows 10 on July 29, 2015"

https://en.wikipedia.org/wiki/DirectX#DirectX_12

It just seems like after all this time, one would think DX12 would, at a bare minimum, be performing better than DX11 but it's actually doing worse in some cases. Me thinks it's time for DX13.
Two years is absolutely nothing in terms of software development. When I saw the title of this article I was extremely worried, because if the games are ancient, and the ones included in this test are, then you are not going to see what DX12 is all about. This is why Sniper Elite 4 showed the biggest gains. Most of the games included in this test had DX12 bolted on as an afterthought; the game was not designed around it. This is an extremely important thing to note.

DX12 is basically about parallelism. All consoles since basically the PlayStation (you can include the Sega Saturn in that as well) have had many more processing cores than your average PC. The current consoles have 8, and I think the PS3 had more than that. If you wanted your game to not perform like ass, it behooved the developer to ensure the game was at least somewhat optimized for the architecture. In just about all of these consoles every processing core is utilized.

There is no such requirement on the PC. In fact, DX11 makes game development comparatively easy. In DX11 quite a bit of the optimization can be done by AMD and Nvidia, and quite a bit of the parallelism is handled automatically, albeit not all that well. In DX12 more of it is on the developers. This is a very good thing for those of us with 6- and 8-core machines. In so many words, DX12 forces developers to think about parallelism from the very beginning and to distribute the workload so that it can be dispersed more appropriately. Console developers have been doing this for quite some time, but PC developers have not. DX11 games fed through a DX12 path without optimization will most likely perform like ass, so it's paramount that the games tested are relatively new if you are looking for benefits.

Unfortunately this test gives the impression that there are no benefits to using DX12, and that couldn't be further from the truth.
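To make the parallelism point concrete: in DX11 one thread effectively talks to the immediate context, while DX12 lets each worker thread record its own command list, with the main thread submitting them in order afterwards. A toy sketch of that submission model, with Python standing in for the API; none of this is real D3D12 code:

```python
# Toy model of DX12-style parallel command recording:
# each worker records its own command list; submission stays serial.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(worker_id, draws):
    """Each worker builds an independent list of draw commands."""
    return [f"draw(obj={worker_id}.{i})" for i in range(draws)]

def submit(queue, command_lists):
    """The main thread submits the recorded lists in a fixed order."""
    for cl in command_lists:
        queue.extend(cl)

gpu_queue = []
with ThreadPoolExecutor(max_workers=4) as pool:
    # Recording happens concurrently; map() returns results in worker order,
    # so the final submission order is deterministic.
    lists = list(pool.map(lambda w: record_command_list(w, 3), range(4)))
submit(gpu_queue, lists)
print(len(gpu_queue))  # 12 commands recorded across 4 workers
```

The work of building the commands scales across cores, while the serial submit step stays cheap, which is the part DX11's immediate context could not split up.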
 
DX 11 with VXAO, 3440x1440, SLI 1070 Max setting except Reflections on Normal - 71fps. Smooth fps no artifacts
DX 11 with HBAO+, same as above but AO - 79fps, Smooth fps no artifacts
DX 12 with HBAO+, Multi-GPU, same settings above - 80fps - Stuttering, rendering artifacts

Since the best image quality configuration benchmarks above the refresh rate of my 60Hz monitor, I can use adaptive sync and get the highest quality rendering and the smoothest gameplay: DX 11 with VXAO wins. More fps is not better here since, from my perspective, a 60Hz monitor cannot display the faster frame rate and would tear, making the image quality even worse.

Now, just because DX 12 in this game does not give any clear benefits over DX 11 does not mean other games will be the same. I still believe DX 12 has a lot more potential than DX 11 ever had.
In my experience with Tomb Raider, the built-in benchmark loop produced hugely variable results with either API. DX12 was perceptibly smoother when actually playing the Geothermal Valley level, I guess because it is very cluttered and drawcall bottlenecks come into play even at relatively low framerates (which happens often in WoW as well; Suramar in particular suffers from this).

Natively multithreaded drawcall submission is almost always going to be hugely beneficial in open-world games, but it can be achieved just as easily with Vulkan. I don't see DX12 gaining traction over Vulkan in the long run, but who knows, I could be wrong.

As someone stated earlier in this thread, in many cases the DX12 paths of these games perform worse when heavily GPU-limited. This could be driver-side immaturity, or it could be developers not being able to match the efficiency of code written by the respective IHVs.

To put it in TL;DR form: Vulkan and DX12 allow developers to write more finely tuned code, but they also enable them to write horribly tuned code, far more horrible than they could have managed using DX11.

Who can you trust to do a good job with these APIs? Epic Games, Blizzard, DICE, Crytek, Rockstar, Valve, etc.

Who can you not trust? Bethesda (lol), IO Interactive (Hitman), EA, etc.

It's a massive investment of time, and therefore money, and it only makes sense if there will be a significant long-term return. All the developers I mentioned as being capable license their engines (or use them themselves) for many games, Rockstar being the exception, with one game that sells 100m lol
 
meh, I wish I could agree. AMD made claims about this from the very beginning, "40% performance increase with DX12," and two years later it is still getting beaten by DX11. Perhaps they should refrain from making false claims? Please name these benefits of DX12 and provide links to benchmarks or whatever. It appears to me that it's time for DX13 to put the issues of DX12 behind them.

Two years is absolutely nothing in terms of software development. When I saw the title of this article I was extremely worried, because if the games are ancient, and the ones included in this test are, then you are not going to see what DX12 is all about. This is why Sniper Elite 4 showed the biggest gains. Most of the games included in this test had DX12 bolted on as an afterthought; the game was not designed around it. This is an extremely important thing to note.

DX12 is basically about parallelism. All consoles since basically the PlayStation (you can include the Sega Saturn in that as well) have had many more processing cores than your average PC. The current consoles have 8, and I think the PS3 had more than that. If you wanted your game to not perform like ass, it behooved the developer to ensure the game was at least somewhat optimized for the architecture. In just about all of these consoles every processing core is utilized.

There is no such requirement on the PC. In fact, DX11 makes game development comparatively easy. In DX11 quite a bit of the optimization can be done by AMD and Nvidia, and quite a bit of the parallelism is handled automatically, albeit not all that well. In DX12 more of it is on the developers. This is a very good thing for those of us with 6- and 8-core machines. In so many words, DX12 forces developers to think about parallelism from the very beginning and to distribute the workload so that it can be dispersed more appropriately. Console developers have been doing this for quite some time, but PC developers have not. DX11 games fed through a DX12 path without optimization will most likely perform like ass, so it's paramount that the games tested are relatively new if you are looking for benefits.

Unfortunately this test gives the impression that there are no benefits to using DX12, and that couldn't be further from the truth.
 
I don't see any point in going to DX13. It seems a waste to me... let's let DX12 mature and be exploited before we consider moving to another API. Or update DX12 to make it easier for coders.
 
Two years is absolutely nothing in terms of software development. When I saw the title of this article I was extremely worried, because if the games are ancient, and the ones included in this test are, then you are not going to see what DX12 is all about. This is why Sniper Elite 4 showed the biggest gains. Most of the games included in this test had DX12 bolted on as an afterthought; the game was not designed around it. This is an extremely important thing to note.

DX12 is basically about parallelism. All consoles since basically the PlayStation (you can include the Sega Saturn in that as well) have had many more processing cores than your average PC. The current consoles have 8, and I think the PS3 had more than that. If you wanted your game to not perform like ass, it behooved the developer to ensure the game was at least somewhat optimized for the architecture. In just about all of these consoles every processing core is utilized.

There is no such requirement on the PC. In fact, DX11 makes game development comparatively easy. In DX11 quite a bit of the optimization can be done by AMD and Nvidia, and quite a bit of the parallelism is handled automatically, albeit not all that well. In DX12 more of it is on the developers. This is a very good thing for those of us with 6- and 8-core machines. In so many words, DX12 forces developers to think about parallelism from the very beginning and to distribute the workload so that it can be dispersed more appropriately. Console developers have been doing this for quite some time, but PC developers have not. DX11 games fed through a DX12 path without optimization will most likely perform like ass, so it's paramount that the games tested are relatively new if you are looking for benefits.

Unfortunately this test gives the impression that there are no benefits to using DX12, and that couldn't be further from the truth.


Two years building from scratch is absolutely nothing, but when you already have developed software and you are updating or optimizing it, it's a lot of time.

DX12 and Vulkan add the ability to use more CPU cores and to do more on your GPUs if they have free resources. But eking that extra performance out takes a lot more time in testing and programming. So there needs to be a balancing act for studios on the learning curve, because they tend not to set their own timelines; the timelines are set by their publishers, hence why the initial batch of DX12 games were all so-so.
 
I don't see any point in going with DX13. It seems a waste to me... lets let DX12 mature and be exploited before we consider moving to another API. Or update DX12 to make it easier for coders.


They are probably going to update DX12 to something in between low-level and high-level APIs; the removal of abstraction layers seems to be a bit premature for most developers. The cost (time) / benefit (performance) ratio is too low for most developers to consider it viable on the PC side.
 
In my experience with Tomb Raider, the built-in benchmark loop produced hugely variable results with either API. DX12 was perceptibly smoother when actually playing the Geothermal Valley level, I guess because it is very cluttered and drawcall bottlenecks come into play even at relatively low framerates (which happens often in WoW as well; Suramar in particular suffers from this).

Natively multithreaded drawcall submission is almost always going to be hugely beneficial in open-world games, but it can be achieved just as easily with Vulkan. I don't see DX12 gaining traction over Vulkan in the long run, but who knows, I could be wrong.

As someone stated earlier in this thread, in many cases the DX12 paths of these games perform worse when heavily GPU-limited. This could be driver-side immaturity, or it could be developers not being able to match the efficiency of code written by the respective IHVs.

To put it in TL;DR form: Vulkan and DX12 allow developers to write more finely tuned code, but they also enable them to write horribly tuned code, far more horrible than they could have managed using DX11.

Who can you trust to do a good job with these APIs? Epic Games, Blizzard, DICE, Crytek, Rockstar, Valve, etc.

Who can you not trust? Bethesda (lol), IO Interactive (Hitman), EA, etc.

It's a massive investment of time, and therefore money, and it only makes sense if there will be a significant long-term return. All the developers I mentioned as being capable license their engines (or use them themselves) for many games, Rockstar being the exception, with one game that sells 100m lol


I agree with most of what you stated.

The only part I don't is Vulkan vs DX adoption. DX will never lose its market share; MS evolves the API faster than the Khronos Group, and ever since DX9 MS has been able to set the API guidelines for the industry. Vulkan, although very close to DX12 in terms of features, is not adopted on consoles, so Xbox will be the driving force behind using DX12 over Vulkan, and that will keep DX12's market share stable.
 
I agree with most of what you stated.

The only part I don't is Vulkan vs DX adoption. DX will never lose its market share; MS evolves the API faster than the Khronos Group, and ever since DX9 MS has been able to set the API guidelines for the industry. Vulkan, although very close to DX12 in terms of features, is not adopted on consoles, so Xbox will be the driving force behind using DX12 over Vulkan, and that will keep DX12's market share stable.

That sounds about right, to be fair, but I was thinking more from the game development side for PC, where working with Vulkan is far more efficient because it can be used on multiple OSes. How that would hold up when the Xbone version is using DX12, I have no idea.

I expect DX11 to remain relevant for a while.
 
Yeah, I agree it would be more efficient working in Vulkan and then porting to DX12, but it's going to come from engine developers first, of course. Not sure how keen they will be on starting off with Vulkan, mainly because gaming outside of Windows is not a top priority.
 
I really hope Blizzard revamps the World of Warcraft engine and transitions to Vulkan; that's a game where you would see a double-digit performance improvement.
 
I really hope Blizzard revamps the World of Warcraft engine and transitions to Vulkan; that's a game where you would see a double-digit performance improvement.


Just curious do you have an article to direct me to I could read to understand that?
 
I really hope Blizzard revamps the World of Warcraft engine and transitions to Vulkan; that's a game where you would see a double-digit performance improvement.


Very true! Although I don't play MMOs or much in the way of online games, lol, yeah, Blizzard would benefit from that!
 
DX12 has been thought about since the HD 7950 and HD 7990! This is why the 290X also supports DX12, and some of these older cards may show some speed gains.
 
Two years building from scratch is absolutely nothing, but when you already have developed software and you are updating or optimizing it, it's a lot of time.

DX12 and Vulkan add the ability to use more CPU cores and to do more on your GPUs if they have free resources. But eking that extra performance out takes a lot more time in testing and programming. So there needs to be a balancing act for studios on the learning curve, because they tend not to set their own timelines; the timelines are set by their publishers, hence why the initial batch of DX12 games were all so-so.
Yeah, if the game is fully released, then 2 years is a lot from a game-production standpoint. But when it comes to DX12, it really takes an honest rewrite of the graphics path to see the benefits. Otherwise you are doing just enough to get through the execution errors and performance issues, which many of the early games did just to say they had a DX12 execution path.
 
I think DX12 and Vulkan are going to be the next DX9 in terms of adoption and lifespan. It's just that developers aren't fully up-to-speed yet, and most of the games you see had the newer APIs tacked on at the end.

For native DX12/Vulkan apps there is huge potential, especially for multi-threading (which was severely limited in older APIs). We are just at the point where 8-core CPUs are affordable, and the software can actually take advantage. But it's a hugely complex field, and I wouldn't expect developers to figure things out in only 2 years.

But in a few more years the situation could be very different. DX12 will likely remain popular on desktop due to Microsoft's dominance (and w/ XB1 as well) but Vulkan runs on a lot more systems, even on Windows (where 7 and 8 are supported but not w/ DX12) so there is potential there. We already saw this happen w/ Star Citizen dropping DX12 to go Vulkan due to the broader OS support.
 
I think DX12 and Vulkan are going to be the next DX9 in terms of adoption and lifespan. It's just that developers aren't fully up-to-speed yet, and most of the games you see had the newer APIs tacked on at the end.

For native DX12/Vulkan apps there is huge potential, especially for multi-threading (which was severely limited in older APIs). We are just at the point where 8-core CPUs are affordable, and the software can actually take advantage. But it's a hugely complex field, and I wouldn't expect developers to figure things out in only 2 years.

But in a few more years the situation could be very different. DX12 will likely remain popular on desktop due to Microsoft's dominance (and w/ XB1 as well) but Vulkan runs on a lot more systems, even on Windows (where 7 and 8 are supported but not w/ DX12) so there is potential there. We already saw this happen w/ Star Citizen dropping DX12 to go Vulkan due to the broader OS support.

Really? I thought Star Citizen dropped DX12 to go with Vulkan so that when they delay they can say, "Switching APIs was a lot harder than we thought."
 
Remember, DX11 came out many years ago. Games are simply bigger and taking longer to make these days. Sequels tend to start well before the first game has shipped. Just due to the nature of the industry/development cycle, I don't think we'll be seeing things like APIs picked up as quickly as it has in the past.

We'll be seeing a lot more Battlefield Hardline, Borderlands The Pre-Sequel and Batman Arkham Origins type of games where little time is given to innovate and make significant changes.
 
I can see Vulkan having more success on Android devices, where OpenGL is already the most prevalent API, but widespread adoption of Vulkan still has to overcome the hurdle of the platform's severe fragmentation, something that is not as much of an issue for Apple and its own Metal API. For the PC, MS's DirectX is still the dominant API, and I don't see that changing overnight.
 