It's 2016. Where dat DX12 game?

Please provide links and proof of future EA games that will be using DX12.

Johan Andersson, Frostbite Technical Director (who runs a tri-CrossFire Fury X setup), says Frostbite supports DX12. Why wouldn't they? They have one of the best game engines, one that keeps evolving with new features, which will debut on the EA flagship. A simple Google search will reveal it.
 
Johan Andersson, Frostbite Technical Director (who runs a tri-CrossFire Fury X setup), says Frostbite supports DX12. Why wouldn't they? They have one of the best game engines, one that keeps evolving with new features, which will debut on the EA flagship. A simple Google search will reveal it.

An engine supporting a thing and developers utilising a thing are not the same thing, not even close.

The point is that, in the real world, major games take a couple of years to make and on top of that are extremely expensive; devs (or, more accurately, the publishers that control their budgets) rarely have the time to risk new technologies or to give their engineers much time to experiment and get used to them.

The timeline is basically that some games early in production right now might use some easy/quick/safe-to-implement elements of DX12 and hit in 12-24 months; then, after that cycle, when it's more of a known quantity and the experience is out there (and there's a bigger install base that can actually benefit from it), it'll become closer to the norm.

So really it'll be more like 3 to 4 years before we see a lot of stuff fully exploiting it.
 
An engine supporting a thing and developers utilising a thing are not the same thing, not even close.

The point is that, in the real world, major games take a couple of years to make and on top of that are extremely expensive; devs (or, more accurately, the publishers that control their budgets) rarely have the time to risk new technologies or to give their engineers much time to experiment and get used to them.

The timeline is basically that some games early in production right now might use some easy/quick/safe-to-implement elements of DX12 and hit in 12-24 months; then, after that cycle, when it's more of a known quantity and the experience is out there (and there's a bigger install base that can actually benefit from it), it'll become closer to the norm.

So really it'll be more like 3 to 4 years before we see a lot of stuff fully exploiting it.

It doesn't take 3 or 4 years. Up until DX9, new versions were released every year or two (and I don't mean the .1, .2, etcetera versions either). 9 to 10 was 4 years. 10 to 11 was 3 years. 11 to 12 was an extreme example at 6 years, but we had games pretty much right away. Consider that the Xbox One also uses DX12, and it's really strange we don't have any DX12 games for PC.
 
It doesn't take 3 or 4 years. Up until DX9, new versions were released every year or two (and I don't mean the .1, .2, etcetera versions either). 9 to 10 was 4 years. 10 to 11 was 3 years. 11 to 12 was an extreme example at 6 years, but we had games pretty much right away. Consider that the Xbox One also uses DX12, and it's really strange we don't have any DX12 games for PC.

The Xbone didn't have DX12 until a few months ago; I think it was September of last year when DX12 was released for it. It had been released to certain developers (but not all) for an early preview, and that was about a year ago.
 
Not everyone owns DX12-compatible hardware/OS, so software publishers would be utterly STUPID to release DX12-exclusive games right now.

That's it.
 
Not everyone owns DX12-compatible hardware/OS, so software publishers would be utterly STUPID to release DX12-exclusive games right now.
That's it.

Well, you might think that targeting the common denominator would push sales more than anything else, but you end up with something that looks very dated. In the end it is not all about pushing boundaries but rather about progress: DX12 can get ahead where DX11/DX9 is not going to allow you to, even though for over 85% of the market DX9/11 would suffice.

There is a crossover point where the new API will take over due to portability to platforms other than just the PC: if you have a "shared" base for your game across Android/consoles/PC, the development would pay off, even though the three platforms don't have too much in common at first glance.
 
An engine supporting a thing and developers utilising a thing are not the same thing, not even close.

The point is that, in the real world, major games take a couple of years to make and on top of that are extremely expensive; devs (or, more accurately, the publishers that control their budgets) rarely have the time to risk new technologies or to give their engineers much time to experiment and get used to them.

The timeline is basically that some games early in production right now might use some easy/quick/safe-to-implement elements of DX12 and hit in 12-24 months; then, after that cycle, when it's more of a known quantity and the experience is out there (and there's a bigger install base that can actually benefit from it), it'll become closer to the norm.

So really it'll be more like 3 to 4 years before we see a lot of stuff fully exploiting it.

That's a rough argument, as the XB1 already has DX12 games with the features implemented. Ease of porting to PC was one of the big selling points of DX12. So why aren't devs doing it?

DX12 for the XB1 is a relatively minor change over what was already there. Devs have had years to run DX12 and test it; it's not like DX12 is something brand new nobody has seen before. Even on PC there were a handful of games with Mantle years ago, same general idea. The ONLY reason it isn't happening is that Nvidia cards aren't capable of running the paths without heavy modification, something that shouldn't have been necessary with DX12. A simple solution would be to use DX11 for Nvidia and DX12 for AMD (until Pascal, hopefully), but the marketing departments likely hate that. Have the DX12 gains on Nvidia really been enough to warrant the change from DX11 in the benchmarks we've seen so far?
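
To make that "DX11 for Nvidia, DX12 for AMD" idea concrete, here is a minimal sketch of how a game might pick its render path at startup from the PCI vendor ID. The PickRenderPath helper and the fallback policy are made up for illustration; the DXGI calls and vendor IDs are real.

[CODE]
// Minimal sketch: pick the render path by GPU vendor at startup.
// PickRenderPath() is a hypothetical helper, not from any engine;
// the DXGI calls and PCI vendor IDs are real. Link against dxgi.lib.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

enum class RenderPath { D3D11, D3D12 };

RenderPath PickRenderPath()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return RenderPath::D3D11; // no DXGI 1.1: take the safe path

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP/software adapters

        if (desc.VendorId == 0x1002) // AMD
            return RenderPath::D3D12;
        if (desc.VendorId == 0x10DE) // Nvidia
            return RenderPath::D3D11;
    }
    return RenderPath::D3D11; // unknown vendor: stick with the mature path
}
[/CODE]

An unknown vendor falls back to the mature DX11 path here, which is the conservative choice the marketing departments would presumably insist on anyway.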
 
It's not that simple; development for a closed-box system vs. an open-box system is quite different. PCs still don't have eDRAM (well, embedded SRAM) and probably never will. Also, we saw performance differences with async within AMD's own line, depending on the number of ACEs. There are many other factors too, but the end result is that with an open-box system like the PC, development is always more complex; the same optimizations for one system won't carry across to another system.
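
"Open box" in practice means you can't hard-code what the GPU can do the way a console title can; you have to ask at runtime. A rough sketch of what that looks like, assuming nothing beyond the stock D3D12 headers (QueryCaps is a hypothetical helper):

[CODE]
// Sketch: on an open platform you don't assume caps, you query them.
// QueryCaps() is a hypothetical helper; the D3D12 calls are the real API.
// Link against d3d12.lib. Error handling trimmed for brevity.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void QueryCaps()
{
    ComPtr<ID3D12Device> device;
    // nullptr adapter = system default; 11_0 is the minimum feature level
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return; // no D3D12 device at all: fall back to a D3D11 renderer

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &opts, sizeof(opts));

    // Things like resource binding tier and tiled resources vary per GPU
    // family, so a PC renderer branches on these values, not on assumptions.
    D3D12_RESOURCE_BINDING_TIER binding = opts.ResourceBindingTier;
    D3D12_TILED_RESOURCES_TIER  tiled   = opts.TiledResourcesTier;
    (void)binding; (void)tiled;
}
[/CODE]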
 
Johan Andersson, Frostbite Technical Director (who runs a tri-CrossFire Fury X setup), says Frostbite supports DX12. Why wouldn't they? They have one of the best game engines, one that keeps evolving with new features, which will debut on the EA flagship. A simple Google search will reveal it.

I wouldn't consider Frostbite one of the "best" engines; as far as the limited capacity in which we have seen it goes, it seems good.
 
It's not that simple; development for a closed-box system vs. an open-box system is quite different. PCs still don't have eDRAM (well, embedded SRAM) and probably never will. Also, we saw performance differences with async within AMD's own line, depending on the number of ACEs. There are many other factors too, but the end result is that with an open-box system like the PC, development is always more complex; the same optimizations for one system won't carry across to another system.

A bit more complex, but in most cases you have far more capable hardware. The XB1 is roughly a 7000-series AMD card; newer cards with better clocks, more memory, etc. should at the very least give acceptable performance with only moderate work. If the XB1 had a modern high-end GPU I could see that argument. eDRAM would require a change, but it's used on a platform with far less bandwidth; in the case of the PS4 it's not even used. Regardless of all of that, porting the DX12 paths should be easier than rewriting everything for DX11. There's an argument for a simplified DX12 path, but at the moment that means Win10/Nvidia. Is there enough of a market to warrant that, considering the gains that might be had over DX11?
 
While I think you are right in saying that porting Xbox One and PS4 titles to DX12 PCs should be easier than porting to DX11, it's not just about the video card; it's about the CPUs and subsystems too. The PC has so many options over either of the two systems in raw performance that it's hard to ignore all that extra performance. Currently most of these engines were programmed for DX11 hardware and then had DX12 feature sets added in, so it kind of goes both ways.
 
An engine supporting a thing and developers utilising a thing are not the same thing, not even close.

The point is that, in the real world, major games take a couple of years to make and on top of that are extremely expensive; devs (or, more accurately, the publishers that control their budgets) rarely have the time to risk new technologies or to give their engineers much time to experiment and get used to them.

The timeline is basically that some games early in production right now might use some easy/quick/safe-to-implement elements of DX12 and hit in 12-24 months; then, after that cycle, when it's more of a known quantity and the experience is out there (and there's a bigger install base that can actually benefit from it), it'll become closer to the norm.

So really it'll be more like 3 to 4 years before we see a lot of stuff fully exploiting it.

DICE created Frostbite; why wouldn't they use their own engine? You forget the Battlefield series, and EA uses Battlefield to showcase the engine. All EA games use the Frostbite engine except The Sims; you seem to forget that.
 
I don't think time spent leveraging more cores would really be a worthy investment. The i5 is likely the most popular gaming CPU, and the CPU isn't usually the bottleneck anyway. DX12 allowing gamers to utilize mismatched GPUs is way more useful. Have a 680 and buy a 980? Just throw it in with the 680! AMD is the only company that had more than 4 cores available to the average consumer for a while. If that's all that kept AMD from really competing, I bet they would have invested more in game dev on the CPU end as well. That's also why Intel kept at 4 for the most part: there's a diminishing return after 4, so it's better to invest in improving the core over adding more.
 
Show me an engine capable of Levolution and destruction.

Do you realize both of those things have very little to do with the engine and very much to do with game design built on features that can be put into an engine, right?

Take a look at the UE4 Blueprints on their forums; people have made these types of things through Blueprints without touching the engine code. Granted, they aren't as fully featured as what is in BF4, but something like that just takes more time. Rendering technology isn't the same thing as those two features, which are separate from the engine.

This is coming from someone who has developed on different game engines. Each engine has its limitations in what it can do. Is one better than the other? Not really; the only difference is what a game designer can do with what they have, and what limitations can be hidden in the final product.

It really comes down to how flexible the engine is for what the game developer needs.
 
I don't think time spent leveraging more cores would really be a worthy investment. The i5 is likely the most popular gaming CPU, and the CPU isn't usually the bottleneck anyway. DX12 allowing gamers to utilize mismatched GPUs is way more useful. Have a 680 and buy a 980? Just throw it in with the 680! AMD is the only company that had more than 4 cores available to the average consumer for a while. If that's all that kept AMD from really competing, I bet they would have invested more in game dev on the CPU end as well. That's also why Intel kept at 4 for the most part: there's a diminishing return after 4, so it's better to invest in improving the core over adding more.

Actually, AMD could have thrown all the money they wanted at it and it wouldn't have changed much. DX11 was the limiting issue with CPUs and core counts: there were limitations on how many cores could talk to the GPU. Splitting the work across cores (yes, Razor, we know not all work can be split; just referring to what can be) does reduce some latency and allow for higher fps, but as we have seen, never in multiples, rather in finer percentages. That single-core-at-a-time limitation is what hindered further progress; to what extent DX12 will increase fps because of this is unknown as of yet. Therefore AMD's more cores came just too early.
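
For what it's worth, that single-core-at-a-time point is exactly what DX12 command lists address: every worker thread records into its own allocator/list, and one thread submits the lot. A bare-bones sketch, with RecordInParallel and the worker split invented for illustration; the D3D12 calls themselves are the real API:

[CODE]
// Sketch of DX12's answer to the one-core-talks-to-the-GPU problem:
// each thread records its own command list, one thread submits them all.
// RecordInParallel() is invented for illustration; device/queue setup omitted.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue, int workers)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread> threads;

    for (int i = 0; i < workers; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        threads.emplace_back([cl = lists[i].Get()]
        {
            // ...record this thread's slice of draw/dispatch calls here...
            cl->Close(); // recording is free-threaded, unlike a DX11 context
        });
    }
    for (auto& t : threads) t.join();

    // Submission is still serialized on one thread, and that's fine:
    // the expensive part (recording/validation) already ran on all cores.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
[/CODE]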
 
Actually, AMD could have thrown all the money they wanted at it and it wouldn't have changed much. DX11 was the limiting issue with CPUs and core counts: there were limitations on how many cores could talk to the GPU. Splitting the work across cores (yes, Razor, we know not all work can be split; just referring to what can be) does reduce some latency and allow for higher fps, but as we have seen, never in multiples, rather in finer percentages. That single-core-at-a-time limitation is what hindered further progress; to what extent DX12 will increase fps because of this is unknown as of yet. Therefore AMD's more cores came just too early.

I agree with you that AMD's more cores came too early, but going too wide has disadvantages too. The overhead of going wide has to be taken into account; it's throughput vs. latency. There is a fine line between what is optimal for any given situation, and it shifts based on hardware and software and the needs of both.
 
Nvidia fanboy strikes again. I said show me; you did not provide evidence. Frostbite > Nvidia engine.

Levolution can be done in UE4/CryEngine; it is just scripted destruction. You can implement that kind of stuff in most engines if you put the time into it. Whether a game will have it depends on whether it fits and whether the developers want to take the time to create those features. Engines provide a lot, but they aren't a game. You can't take an engine, plop in some assets, hit a "port to DX12" button, and have a game. Each project changes and makes modifications to the engine.
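
To illustrate the "just scripted destruction" point, here is roughly what such a scripted event looks like in UE4 C++ instead of Blueprints. The AScriptedCollapseActor class and its properties are invented for this example; the component calls themselves are stock UE4 API:

[CODE]
// Illustrative UE4 actor: "Levolution"-style scripted destruction.
// The class and its properties are invented for this example; the
// component calls (SetSimulatePhysics, AddRadialImpulse, ...) are stock UE4.
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "ScriptedCollapseActor.generated.h"

UCLASS()
class AScriptedCollapseActor : public AActor
{
    GENERATED_BODY()

public:
    // The intact building shell, swapped out when the event triggers.
    UPROPERTY(EditAnywhere) UStaticMeshComponent* IntactMesh;

    // Pre-fractured chunks, hidden until the collapse.
    UPROPERTY(EditAnywhere) TArray<UStaticMeshComponent*> DebrisChunks;

    UFUNCTION(BlueprintCallable, Category = "Destruction")
    void TriggerCollapse(FVector BlastOrigin)
    {
        IntactMesh->SetVisibility(false);
        IntactMesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);

        for (UStaticMeshComponent* Chunk : DebrisChunks)
        {
            Chunk->SetVisibility(true);
            Chunk->SetSimulatePhysics(true);
            // Shove each chunk away from the blast point.
            Chunk->AddRadialImpulse(BlastOrigin, /*Radius=*/3000.f,
                                    /*Strength=*/50000.f,
                                    ERadialImpulseFalloff::RIF_Linear,
                                    /*bVelChange=*/true);
        }
    }
};
[/CODE]

The Blueprint versions people have posted on the UE4 forums do the same thing, just with nodes instead of C++.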
 
The designed-for-DX12 games will come in time. There's no sense in pushing them out at this point. The same crowd that is complaining about no DX12 games being on the market will switch to being the crowd that bitches at developers about the games being poorly optimized, pretty much because their single-GPU setups can't handle it :D
 
When nVidia has a proper hardware solution we'll see it. Until then nVidia will continue to pay devs to make what suits them. Since AMD seems to not be paying devs right now to do DX12 games, it isn't happening.
 
The designed-for-DX12 games will come in time. There's no sense in pushing them out at this point. The same crowd that is complaining about no DX12 games being on the market will switch to being the crowd that bitches at developers about the games being poorly optimized, pretty much because their single-GPU setups can't handle it :D

They already exist, though. They just aren't being put on the market. That's the problem.
 
Engines weren't ready with DX12 until the end of last year, so don't expect PC games to have it for another 6 months, which is still faster deployment than previous generations of DX games.
 
Engines weren't ready with DX12 until the end of last year, so don't expect PC games to have it for another 6 months, which is still faster deployment than previous generations of DX games.

I don't really expect a ton of DX12 games on PC. My guess is devs go for Vulkan instead, which would provide a larger market. Tomb Raider (2013) had a bunch of Linux files show up today, and it wouldn't be surprising if Square Enix was porting stuff toward a future Nintendo platform. The devs said DX12 for Rise was coming later, but Vulkan would make more sense there as well. The best guess for Vulkan is now GDC in March, so the timetables would be about right.
 
When nVidia has a proper hardware solution we'll see it. Until then nVidia will continue to pay devs to make what suits them. Since AMD seems to not be paying devs right now to do DX12 games, it isn't happening.

This makes no sense whatsoever. Paying for DX12 games? It is a Microsoft-only API. It is not a check-mark solution; it requires (if you don't have a game engine) a lot of work.

The only DX12 "alpha" out there is Ashes of the Singularity, and that is because they ported it from Mantle...
 
Fable is in alpha for DX12 as well, I believe.

That was the original leak, but things do change. It was originally stated to be async-heavy as well. I think there were released bench results of Fable, or maybe just a video.
 
If DX12 is a standard before 2018-19, I'll die of shock.

It can never be considered a standard; it won't ever work on the PS4 or on any Nintendo product ;). And even such a thing as Android or Linux support will never happen.
 
If DX12 is a standard before 2018-19, I'll die of shock.

It may well be that it is a big enough jump that unforeseen issues with the standard emerge, and we get a short-lived DX12 that evolves into a DX13 (rather than a rapid succession of 12.1, 12.2...) where the core concepts of DX12 are retained but laid out in a better fashion.

I don't think that will be the case (at all), but it wouldn't be entirely without prior history.
 