sblantipodi
2[H]4U
- Joined
- Aug 29, 2010
- Messages
- 3,765
It seems that Maxwell is going to be a big flop. If DX12 comes out next year, who will buy Maxwell now?
> No hardware can be ready for a spec that isn't finalized yet.

We're not sure what stage it's at. For all we know, it could've been finalized months ago. It's not a one-way, black-box process, either. IHVs work with Microsoft on Direct3D, much like IHVs (among others) comprise the OpenGL WG.

It's unlikely Microsoft is doing anything that IHVs aren't aware of. I also think it's unlikely, given the timing, that Maxwell/GCN would be unable to support at least a reasonable subset of the API.
> The Maxwell architecture was finalized when the GTX 750 Ti came out.

I was suggesting that we don't know what stage Direct3D is at, not Maxwell. We only know the 750 series doesn't support DX12 because DX12 hasn't been announced; we know absolutely nothing beyond that.
The GTX 750 Ti does not support DX12, and the next Maxwell cards will not support it either.
> The Maxwell architecture was finalized when the GTX 750 Ti came out.

No it hasn't; the 750 Ti isn't even using 20 nm lithography (which is expected of proper Maxwell cards).
> The GTX 750 Ti does not support DX12, and the next Maxwell cards will not support it either.

You don't know that. The GTX 750 Ti might support DX12 just fine (because DX12 might not require a new shader model).
You're operating under the assumption that DX12 will not work on current DX11 level GPUs.
The 750 may, at some future time, support some subset of D3D12.
> There is no subset: an API is either supported or not supported.

That's a silly comment. In OpenGL, for instance, Core is a non-strict (once strict) subset of Compatibility, and ES is a strict subset of OpenGL. You then have additional extensions (not part of any profile) which, when used in tandem with one of the profiles, superset the API.
Time will tell, but I'm quite sure that no current cards will support DX12.
> History has taught that. It's no assumption; it's a certainty.

History has shown exactly the opposite... I mean heck, the DX 11.2 API runs on DX9 cards.
I've never seen a minor version supported on old hardware, let alone a major version.
> There is no subset: an API is either supported or not supported.

Look into how feature levels work on DX11+...
> You can have a DX11.2-compliant part that only supports feature level 10_0!

This! Exactly this! Thank you for summing it up so well.
Not all of the latest-and-greatest features of the API are supported by DX9 hardware, but the DX 11.2 API will run on DX9, DX10, DX10.1, DX11, DX11.1, and DX11.2 cards. It has explicit support for older hardware through the feature-level system.
Microsoft introduced "feature levels" in DX11, which have gone a long way towards unlocking hardware from specific versions of DirectX.
The feature-level system lets a developer target the latest version of DX (right now, that would be 11.2) and still have their graphics engine run on older hardware. It's still a DX11.2 graphics engine, still using the DX11.2 API, but it includes exceptions and substitutions for older hardware.
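Concretely, the mechanism being described is the feature-level array you pass to D3D11CreateDevice: the runtime creates the device at the first level in the array that the hardware supports. A rough Windows-only sketch (error handling omitted; assumes the Windows 8 SDK headers, and note that listing 11_1 fails with E_INVALIDARG on machines without the 11.1 runtime, in which case you'd retry without it):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

HRESULT CreateDeviceWithFallback(ID3D11Device** device,
                                 ID3D11DeviceContext** context,
                                 D3D_FEATURE_LEVEL* granted)
{
    // Newest level first, with down-level fallbacks after it; the
    // runtime grants the first entry the installed GPU supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_1,
    };

    return D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, ARRAYSIZE(requested), D3D11_SDK_VERSION,
        device, granted, context);
}
// On a DX10-class card, 'granted' comes back as D3D_FEATURE_LEVEL_10_0:
// same DX11.x API, older hardware.
```

The engine then checks `granted` and avoids features the granted level doesn't include.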
> Crysis 3 didn't run on any cards below DX11, COD Ghosts also requires DX11, and BF4 requires DX11. This backward compatibility stuff is just bull.

You're describing feature level 11_0 and 11_1 games. Obviously they require GPUs that support those feature levels.
While it's not really what I'm talking about when I say that current-gen GPUs might support DX12 (I'm suggesting they may support feature level 12_0, assuming it exists as a feature level), the API itself isn't strictly coupled to any specific feature level support. A D3D11.2 renderer can totally run on hardware that supports only feature level, say, 10_0. It's still an 11.2 renderer, but only targeting that class of hardware which supports that feature level or above.
> Crysis 3 didn't run on any cards below DX11, COD Ghosts also requires DX11, and BF4 requires DX11.

I never said developers couldn't choose to lock out older hardware. They're perfectly welcome to do so... A developer is also perfectly welcome to use the DX11.2 API to target DX9-level hardware.

Basically, the titles you've pointed out are DX11 games that require feature levels 11_0 and 11_1. There are DX11 games that only require feature level 10_0 and will run on DX10 hardware.

> This backward compatibility stuff is just bull.

That's because it's not "backwards compatibility." It is, in fact, direct support for down-level hardware, in the latest API, should the developer wish to use it.

> The video card is either DX11 or not. This halfway-in stuff is all just a bogus gimmick.

No, you simply don't understand the concept of feature levels. You can write a rendering engine around the DirectX 11.2 API and have it run on a GeForce FX 5200 if you really wanted.

> DX10 cards won't run these games.

Then the developer has chosen not to implement the feature level necessary to support your hardware. But as has been stated previously, it's quite possible to write a graphics engine in DX11.2 that only requires feature level 10_0, which will then run happily on DX10 hardware like the GeForce 8 series.
> In the real world you're not going to see DX9 or DX10 cards supported in most new games.

It's already been pointed out that Battlefield: Bad Company 2 supports running DX11 on DX10 hardware (feature level 10_0 is supported by the developer). It's not the only title that supports feature levels in order to enable broader hardware compatibility.
> Most developers don't take advantage of this backwards compatibility, which is why I say it's a gimmick.

Gimmicks don't add value. Feature levels add value. Ergo, feature levels are not a gimmick.
> It's mostly useless.

Useless? Hardly! I have yet to run into a DX11.1 or DX11.2 application that won't run on DX11 (feature level 11_0) hardware. That's a very good thing!
> You're going to have to buy the latest graphics card anyway, because most games will only support the latest features from the latest hardware.

Which has been true... pretty much never.
Rumor Mill says that DX12 is DX11 + Mantle.

Semiaccurate:
http://semiaccurate.com/2014/03/18/microsoft-adopts-mantle-calls-dx12/

WCCFTECH (shamelessly paraphrased the Semiaccurate article):
http://wccftech.com/microsoft-directx-12-mantle-integrated-performance-boost-xbox-questionable/

> Rumor Mill says that DX12 is DX11 + Mantle.

Clickbait... the original article requires a subscription.
It would be interesting to see benchmarks comparing Windows 8 vs. Windows 7 for DX12.
Well, it's SemiAccurate, and Charlie D is a known idiot with a well-publicized agenda. So yeah. Not worth reading.

What well-publicized agenda is that?
All I really care about is that Microsoft has somehow, God willing, come away with lessons learned from the whole desktop-user / Windows 8 failure and makes Windows 9 / DirectX 12 a wonderful experience for all users, desktop users included.

There are so many problems with Microsoft falling behind (their phones, mobile OS, console performance issues, leadership, etc.) that I'm worried we will get more of the same ol' lackluster performance out of them.
He is anti Nvidia and pro AMD.
[citation needed]
Crytek has been "evaluating" the API for quite some time now, showing interest back at the AMD Developer Summit. Since then, they have apparently made a clear decision on it. It is also not the first time that CRYENGINE has been publicly introduced to Mantle: Chris Roberts' Star Citizen, also powered by the 4th-generation CRYENGINE, has announced support for the graphics API. Of course, there is a large gap between having a licensee do the legwork to include an API and having the engine developer provide you supported builds (that would be like saying Unreal Engine 3 supports the original Wii).
Hopefully we will learn more as GDC continues.
Editor's (Ryan) Take:
As the week at GDC has gone on, AMD continues to push forward with Mantle and calls Crytek's implementation of the low-level API "a huge endorsement" of the company's direction and vision for the future. Many, including myself, have considered that the pending announcement of DX12 would be a major setback for Mantle, but AMD claims that is "short-sighted" and that, as more developers come into the Mantle ecosystem, it is proof AMD is doing the "right thing."
Here at GDC, AMD told us they have expanded the number of beta Mantle members dramatically with plenty more applications (dozens) in waiting. Obviously this could put a lot of strain on AMD for Mantle support and maintenance but representatives assure us that the major work of building out documentation and development tools is nearly 100% behind them.
http://www.pcper.com/news/General-Tech/GDC-14-CRYENGINE-Support-Mantle-AMD-Gets-Another-Endorsement

If stories like this one over at Semiaccurate are true, and Microsoft's DirectX 12 will be nearly identical to AMD's Mantle, then it makes sense that developers serious about new gaming engines can get a leg up on projects by learning Mantle today. Applying that knowledge to the DX12 API upon its release could speed up development and improve implementation efficiency. From what I am hearing from the few developers willing to even mention DX12, Mantle is much further along in its release (late beta) than DX12 is (early alpha).
AMD indeed was talking with and sharing the development of Mantle with Microsoft "every step of the way" and AMD has stated on several occasions that there were two outcomes with Mantle; it either becomes or inspires a new industry standard in game development. Even if DX12 is more or less a carbon copy of Mantle, forcing NVIDIA to implement that API style with DX12's release, AMD could potentially have the advantage of gaming performance and support between now and Microsoft's DirectX release. That could be as much as a full calendar year from reports we are getting at GDC.
If you've been paying attention to Charlie's stuff over the years, there's a pretty clear bias.