DX12 to be announced on March 20th

It seems that Maxwell is going to be a big flop. If DX12 will be out next year, who will buy Maxwell now?
 
No hardware can be ready for a spec that isn't finalized yet ;)
We're not sure what stage it's at. For all we know, it could've been finalized months ago. It's not a one-way, black-box process, either. IHVs work with Microsoft on Direct3D, much like IHVs (among others) comprise the OpenGL WG.

It's unlikely Microsoft is doing anything that IHVs aren't aware of. I also think it's unlikely, given the timing, that Maxwell/GCN would be unable to support at least a reasonable subset of the API.
 

The Maxwell architecture was finalized when the GTX 750 Ti came out.
The GTX 750 Ti does not support DX12, and the next Maxwell cards will not support it either.
 
You're operating under the assumption that DX12 will not work on current DX11 level GPUs. That could be the case, but given that MS is applying a forked version of DX12 to the XB1, and AMD, NVIDIA, and Intel are doing co-presentations with MS on DX12, I'd say that isn't a wise assumption. The premier feature of DX12 will be a thinner abstraction layer, as with Mantle. This does not require new hardware. The only way I can see DX12 requiring new hardware is if there is a new shader model in development, or radically new features. Neither of those scenarios is even remotely likely.

But we will see on the 20th. I have a feeling, as mentioned, that the main feature of DX12 will be abstraction-layer related. The intent will be to allow more direct GPU access and reduced driver/CPU overhead. These are things that likely will not require new hardware designs, but like I said, we'll see on the 20th of this month.
 
The Maxwell architecture was finalized when the GTX 750 Ti came out.
The GTX 750 Ti does not support DX12, and the next Maxwell cards will not support it either.
I was suggesting that we don't know at what stage Direct3D is, not Maxwell. We only know the 750 series doesn't support DX12 because DX12 hasn't been announced: we know absolutely nothing beyond that.

The 750 may, at some future time, support some subset of D3D12.
 
I still don't see a huge leap ahead for DX12 other than the unified memory stuff and the low-level API (another "Mantle"). We also haven't heard anything about Shader Model 6 (to replace SM5/DX11), unless they're keeping it firmly under wraps.

But hey, who the hell knows. We'll find out next week. Fun stuff.
 
The Maxwell architecture was finalized when the GTX 750 Ti came out.
No, it hasn't; the 750 Ti isn't even using 20 nm lithography (which is expected of proper Maxwell cards).

The 750 Ti is a testing ground for new tech, but it's hardly what the final version of Maxwell will be.
The GTX 750 Ti does not support DX12, and the next Maxwell cards will not support it either.
You don't know that. The GTX 750 Ti might support DX12 just fine (because DX12 might not require a new shader model).
 
You're operating under the assumption that DX12 will not work on current DX11 level GPUs.

History has taught us that. It's no assumption; it's a certainty.
I've never seen a minor version supported on old hardware, let alone a major version.
 
I was suggesting that we don't know at what stage Direct3D is, not Maxwell. We only know the 750 series doesn't support DX12 because DX12 hasn't been announced: we know absolutely nothing beyond that.

The 750 may, at some future time, support some subset of D3D12.

There is no subset; an API is either supported or it is not.
 
There is no subset; an API is either supported or it is not.
That's a silly comment. In OpenGL, for instance, Core is a non-strict (once strict) subset of Compatibility, and ES is a strict subset of OpenGL. You then have additional extensions, not part of any profile, which, when used in tandem with one of the profiles, extend the API into a superset.

Microsoft isn't quite so lenient, but they do have CAPS: you use them to query optional feature availability on fully conforming hardware, because those features aren't required for conformance. If it were an all-or-nothing game, CAPS would have no use. There's also the whole idea behind feature levels, which are just non-strict subsets of the core API. You can have a DX11.2-compliant part that only supports feature level 10_0!
 
Time will tell, but I'm quite sure that no current cards will support DX12.

Dude, you can't possibly be "quite sure" about an API specification that isn't even public yet. Also, API subsets have existed for years now, most recently with DX11 cards supporting either the 11.2 level or the 11.1/11.0 subsets. Please educate yourself before commenting on what you don't know.

Until the 20th, when DX12 is presented, nobody can know anything about it, since it's unknown. However, the inclusion of the Xbox in the announcement page implies that it will not require new hardware, since the XB1 supports Shader Model 5 only. SM6 would most likely require some new HW capabilities, but as long as DX12 keeps using SM5, chances are quite high that all current DX11 hardware will support most if not all DX12 features (or a subset that applies to gaming, as NVIDIA does to avoid having to support features that don't directly or greatly benefit gaming).
 
History has taught us that. It's no assumption; it's a certainty.
I've never seen a minor version supported on old hardware, let alone a major version.
History has shown exactly the opposite... I mean heck, DX 11.2 runs on DX9 cards :confused:

Not all of the latest-and-greatest features of the API are supported by DX9 hardware, but the DX 11.2 API will run on DX9, DX10, DX10.1, DX11, DX11.1, and DX11.2 cards. It has explicit support for older hardware through the feature-level system.

Microsoft introduced "feature levels" in DX11, which have gone a long way towards unlocking hardware from specific versions of DirectX.

There is no subset; an API is either supported or it is not.
Look into how feature levels work on DX11+...

It allows a developer to target the latest version of DX (right now, that would be 11.2) and still have their graphics engine run on older hardware. It's still a DX 11.2 graphics engine, still using the DX11.2 API, but it includes exceptions and substitutions for older hardware.

You can have a DX11.2-compliant part that only supports feature level 10_0!
This! Exactly this! Thank you for summing it up so well.
 

Crysis 3 didn't run on any cards below DX11, and COD Ghosts also requires DX11. BF4 requires DX11. This backward-compatibility stuff is just bull. The video card is either DX11 or it's not. This halfway-in stuff is all just a bogus gimmick. DX10 cards won't run these games. Why even say DX11 is backwards compatible when it's not?
 
It seems that Maxwell is going to be a big flop. If DX12 will be out next year, who will buy Maxwell now?


Considering this happens every DX cycle, I don't see how it'll be different. Based on early results, Maxwell is looking to kick ass and take names in power savings, heat, and performance charts.

You also have to remember it'll take another two years before DX12 really starts to make waves after its initial release and GPU support, like always. Consoles won't be pushing it like they were with DX9 and are going to continue to do with DX11, the latter of which will be with us for another decade unless we have a rapid five-year next-gen cycle.
 
Crysis 3 didn't run on any cards below DX11, and COD Ghosts also requires DX11. BF4 requires DX11. This backward-compatibility stuff is just bull.
You're describing feature level 11_0 and 11_1 games. Obviously they require GPUs that support those feature levels.

While it's not really what I'm talking about when I say that current-gen GPUs might support DX12 (I'm suggesting they may support feature level 12_0, assuming it exists as a feature level), the API itself isn't strictly coupled to any specific feature level support. A D3D11.2 renderer can totally run on hardware that supports only feature level, say, 10_0. It's still an 11.2 renderer, but only targeting that class of hardware which supports that feature level or above.
 

Yup. Hence why BFBC2 ran in DX11 on the 200 series: the feature level was only DX9, but they used the DX11 API for its efficiencies.
 
Crysis 3 didn't run on any cards below DX11, and COD Ghosts also requires DX11. BF4 requires DX11.
I never said developers couldn't choose to lock out older hardware. They're perfectly welcome to do so... A developer is also perfectly welcome to use the DX11.2 API to target DX9-level hardware.

Basically, the titles you've pointed out are DX11 games that require feature-level 11_0 and 11_1. There are DX11 games that only require feature-level 10_0 that will run on DX10 hardware.

This backward compatibility stuff is just bull.
That's because it's not "backwards compatibility."

It is, in fact, direct support for down-level hardware, in the latest API, should the developer wish to use it.

The video card is either DX11 or it's not. This halfway-in stuff is all just a bogus gimmick.
No, you simply don't understand the concept of feature levels.

You can write a rendering engine around the DirectX 11.2 API and have it run on a GeForce FX 5200 if you really wanted.

DX10 cards won't run these games.
Then the developer has chosen not to implement the feature level necessary to support your hardware.

But as has been stated previously, it's quite possible to write a graphics engine in DX11.2 that only requires feature level 10_0, which will then run happily on DX10 hardware like the GeForce 8 series.
 
I don't believe DX12 will require new hardware.

I think that's why they're driver-EOLing products, so that all (or at least most) DX11 cards will be able to run DX12.
 

I'm just talking from a user perspective, not a technical feature-level perspective of DX. In the real world you're not going to see DX9 or DX10 cards supported in most new games. Most developers don't take advantage of this backwards compatibility, which is why I say it's a gimmick. It's mostly useless. You're going to have to buy the latest graphics card anyway, because most games will only support the latest features from the latest hardware.
 
In the real world you're not going to see DX9 or DX10 cards supported in most new games.
It's already been pointed out that Battlefield Bad Company 2 supports running DX11 on DX10 hardware (feature level 10_0 is supported by the developer). It's not the only title that supports feature levels in order to enable broader hardware compatibility.

Most developers don't take advantage of this backwards compatibility, which is why I say it's a gimmick.
Gimmicks don't add value. Feature levels add value. Ergo, feature levels are not a gimmick.

It's not Microsoft's fault if developers choose not to take advantage of it, but it is a legitimately awesome thing to have on-hand. A developer can effectively ALWAYS use the latest version of the DirectX API and target any hardware set they like.

It's mostly useless.
Useless? Hardly! I have yet to run into a DX11.1 or DX11.2 application that won't run on DX11 (feature level 11_0) hardware. That's a very good thing!

If feature levels didn't exist, using the DX 11.2 API would exclude all DX11 and DX11.1 cards. That would SUCK.

You're going to have to buy the latest graphics card anyway, because most games will only support the latest features from the latest hardware.
Which has been true... pretty much never.

Almost ALL current games support some kind of fallback rendering mode: either feature levels or a full secondary DX9 renderer.
 
A quick look at Steam suggests that a lot of notable new games are still DX9. A few DX11 titles I see, like Rust, Arma 3, and F1 2013, must support lower feature levels (or alternative APIs), according to the published system specs: Arma 3 supports DX10-class hardware; F1 2013 supports DX10-class hardware; Rust seems to support DX9-class hardware.

I don't know if the "most games only support the latest features" claim really holds any water.
 
Click bait... Original article requires subscription :rolleyes:.

Giving hardware-level access through an API is not "integrating Mantle into DirectX 11" unless AMD has given their code base to Microsoft to integrate directly, which is highly unlikely. Mantle isn't the first API to do this anyway, since OpenGL is basically assembly for graphics cards.
 

This reeks of click bait.

Mantle is already its own complete API. Maybe they're adopting some similar concepts, but don't expect it to be rebadged.

Edit: ahaha, the second article's spiel about assembly is ridiculous.
 
Well, it's SemiAccurate, and Charlie D is a known idiot with a well-publicized agenda. So yeah, not worth reading.
 
It would be interesting to see benchmarks comparing Windows 8 vs. Windows 7 for DX12.
 
All I really care about is that Microsoft has somehow, God willing, come away with lessons learned from the whole desktop-user/Windows 8 fail and makes Windows 9/DirectX 12 a wonderful experience, not just for all users but for desktop users especially.

Microsoft is falling behind in so many areas (phones, mobile OS, console performance, leadership, etc.) that I'm worried we'll get more of the same old lackluster performance out of them.
 

Fortunately 2014 is the Year of the Linux Desktop.
 
Crytek has been "evaluating" the API for quite some time now, having shown interest back at the AMD Developer Summit. Since then, they have apparently made a clear decision on it. It is also not the first time that CRYENGINE has been publicly introduced to Mantle, with Chris Roberts' Star Citizen, also powered by the 4th Generation CRYENGINE, having announced support for the graphics API. Of course, there is a large gap between having a licensee do the legwork to include an API and having the engine developer provide you supported builds (that would be like saying UnrealEngine 3 supports the original Wii).

Hopefully we will learn more as GDC continues.

Editor's (Ryan) Take:

As the week at GDC has gone on, AMD continues to push forward with Mantle and calls Crytek's implementation of the low-level API "a huge endorsement" of the company's direction and vision for the future. Many, including myself, have considered that the pending announcement of DX12 would be a major setback for Mantle, but AMD claims that is "short-sighted" and that, as more developers come into the Mantle ecosystem, it is proof AMD is doing the "right thing."

Here at GDC, AMD told us they have expanded the number of beta Mantle members dramatically, with plenty more applications (dozens) waiting. Obviously this could put a lot of strain on AMD for Mantle support and maintenance, but representatives assure us that the major work of building out documentation and development tools is nearly 100% behind them.

If stories like this one over at SemiAccurate are true, and Microsoft's DirectX 12 will be nearly identical to AMD's Mantle, then it makes sense that developers serious about new gaming engines can get a leg up on projects by learning Mantle today. Applying that knowledge to the DX12 API upon its release could speed up development and improve implementation efficiency. From what I'm hearing from the few developers willing to even mention DX12, Mantle is much further along in its release (late beta) than DX12 is (early alpha).

AMD was indeed talking with and sharing the development of Mantle with Microsoft "every step of the way," and AMD has stated on several occasions that there were two outcomes for Mantle: it either becomes, or inspires, a new industry standard in game development. Even if DX12 is more or less a carbon copy of Mantle, forcing NVIDIA to implement that API style with DX12's release, AMD could potentially have the advantage in gaming performance and support between now and Microsoft's DirectX release. That could be as much as a full calendar year, from reports we are getting at GDC.
http://www.pcper.com/news/General-Tech/GDC-14-CRYENGINE-Support-Mantle-AMD-Gets-Another-Endorsement
 
If you've been paying attention to Charlie's stuff over the years there is a pretty clear bias.

Is bias another word for stupid?
If his website is as you describe it, then why would anyone go there in the first place?

Why would people actively fool themselves and pay for a subscription?
 
Frostbite, CryEngine, and Unreal are on board with Mantle. Who's left, Unity?

The benefit of Mantle is that you don't need to upgrade Windows like you do with DX.
 