Thoughts on current lineup not supporting DX12?

Cygnus A

I am in the market for new monitor plus GPU. My plan is to go Ultrawide with Freesync/Gsync, but after reading about the DX12 issue I am not sure which to go with.

I had planned to go 980ti with the upcoming Acer XR34CKA, but now it seems the Fury X with the XR34CK may be more future proof for next years DX12 games.

I'd really like some unbiased input from those who have done the research. Getting locked into Freesync or Gsync is really going to suck long term, but that is where I am at.
 
https://www.reddit.com/r/nvidia/comments/3c5e0k/is_the_980ti_fully_dx12_complaint/

OK, I have read up on this time and time again. Fact is, on the Nvidia side, one of the "few" that support DX12 is the 980 Ti; the rest are hit and miss on certain feature levels and so forth. On the AMD side, the vast majority of their cards going back to the 7000 series (GCN 1/1.1/1.2 overall) support DX12 in full, for all but certain features, which I believe the 285/380/390s/Fury are the ones that support.

AMD and Nvidia should not have mixed and matched like that. Intel is bad enough when it comes to certain products supporting X while the next chip up or down may not support X but does support Y.

Anyways, DX10.1 and DX11.1/11.2 were pretty much only inherently supported by AMD graphics cards, as Nvidia felt they were "useless" features, so why bother supporting them. In the case of DX12, some of the more advanced features were and are taken directly from 11.1/11.2, so if a card does not support the previous version, then generally it cannot support the later one, as it requires the same baseline code/hardware to enable it.

So in general, yes, most Nvidia stuff going way back will "support" DX12. However, they will not/cannot fully support (or even support the majority of) what they did not have the hardware/software built into their products to allow, so basically it won't be until Pascal, which is ~2016, that Nvidia should be in a position to fully support DX12, whereas on the AMD side most of their products do in fact support the lion's share of DX12.

I suppose, in Nvidia's case, that is no surprise: tune and tweak for the moment to give performance for current stuff (even if that means screwing others over to make their own products appear better than they truly are), but when it comes to using other designs such as Mantle, or enabling DX10/11/12 to its fullest, why bother, THEY didn't make it (they had less than a minor hand in developing it, hence the not caring), so it's not important enough. "The Way We Screw You Over."
 
It is important to understand that you don't have to have DirectX 12 hardware to "support" DirectX 12. In no way, shape, or form is DirectX 12 "support" tied to Feature Level 12_0 or greater. DX12 was designed to work on current hardware to a large extent. Most of its improvements come from it being a low-level API, more so than from anything that can't be run on hardware that supports at least feature level 11_0.

Another thing that is important to understand is that a card supporting a certain level of DirectX does not guarantee support for a particular feature level. AMD 5000 and 6000 series cards, for example, support DirectX 11 but cannot run DirectX 12 at feature level 11_0. In addition, not being able to run at a certain feature level does not prevent any hardware capabilities above the supported feature level from being used. Nvidia Kepler hardware, for example, supports almost all of feature level 11_1. While it doesn't fully support feature level 11_1, there is nothing preventing developers from taking advantage of the portion that it does support.
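
To make the distinction concrete, here's a minimal sketch (generic D3D12, not tied to any specific card or anything in this thread) of how an app creates a DX12 device on feature level 11_0 hardware and then queries the optional caps separately:

Code:
// Minimal sketch: DX12 "support" only requires feature level 11_0 hardware;
// optional capabilities are queried per device rather than implied by the API version.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Ask only for feature level 11_0 on the default adapter -- that is all DX12 itself requires.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No DX12-capable device at feature level 11_0\n");
        return 1;
    }

    // Optional features (resource binding tier, tiled resources, etc.) are
    // reported per device and checked individually.
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options));
    std::printf("Resource binding tier: %d, tiled resources tier: %d\n",
                (int)options.ResourceBindingTier, (int)options.TiledResourcesTier);
    return 0;
}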
 
It is important to understand that you don't have to have DirectX 12 hardware to "support" DirectX 12. In no way, shape, or form is DirectX 12 "support" tied to Feature Level 12_0 or greater. DX12 was designed to work on current hardware to a large extent. Most of its improvements come from it being a low-level API, more so than from anything that can't be run on hardware that supports at least feature level 11_0.

Another thing that is important to understand is that a card supporting a certain level of DirectX does not guarantee support for a particular feature level. AMD 5000 and 6000 series cards, for example, support DirectX 11 but cannot run DirectX 12 at feature level 11_0. In addition, not being able to run at a certain feature level does not prevent any hardware capabilities above the supported feature level from being used. Nvidia Kepler hardware, for example, supports almost all of feature level 11_1. While it doesn't fully support feature level 11_1, there is nothing preventing developers from taking advantage of the portion that it does support.

I understand they will "support" DX12, but what about the single pipeline of Nvidia vs AMD's multiple pipelines? Isn't that why the 290X is matching the 980 Ti in the current DX12 benchmarks? It is frustrating that there are no Fury X benchmarks on this yet.
 
I am in the market for new monitor plus GPU. My plan is to go Ultrawide with Freesync/Gsync, but after reading about the DX12 issue I am not sure which to go with.

I had planned to go 980ti with the upcoming Acer XR34CKA, but now it seems the Fury X with the XR34CK may be more future proof for next years DX12 games.

I'd really like some unbiased input from those who have done the research. Getting locked into Freesync or Gsync is really going to suck long term, but that is where I am at.

You aren't going to get much of an unbiased input from these forums. There is a good discussion here: http://hardforum.com/showthread.php?t=1873640

DX 12 Features vs Tiers are confusing so you might want to read up here: http://www.extremetech.com/extreme/...what-amd-intel-and-nvidia-do-and-dont-deliver

All in all, if you don't need to upgrade now, don't. Wait and see once more games come out that utilize DX12. It looks like Nvidia is behind now, but both have some driver issues with Windows 10, and it's not certain whether it's architecture issues holding back Nvidia or bad drivers. Obviously the latter can be fixed while the former cannot, so it's best to wait and see.

Ark was supposed to get its DX12 release this weekend (for the free weekend) but it got delayed, which most people suspect was due to issues with Nvidia performance, as it's a GW title.
 
At this point you might as well just wait for the 16 nm generation.

DX12 is insanely over-hyped. This whole industry is just desperate right now to sell stuff.
 
Waiting isn't much of an option since I am transferring my current GPU to my HTPC so my kids can play stuff in the living room. Also, I am mainly prepping for Fallout 4 at this point, but I would rather have the better long-term solution, and that seems unclear at this point, mainly because I am investing in a G-Sync/FreeSync monitor.
 
Heyyo,

Tbh? I wouldn't worry so much over it... Look at the present. There's DirectX 11.0, 11.1 and 11.2... now, name how many games use DirectX 11.2? :p

I know the Frostbite engine that powers Battlefield 4 uses DirectX 11.1 stuff... but I don't know offhand how many games use the tiled resources of DirectX 11.2. If anyone knows and can link proof or something, that would be interesting to see. I didn't see much other than a Microsoft and NVIDIA blog talking about using tiled resources...
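
(For reference, this is roughly how an engine would even ask whether DX11.2 tiled resources are available; a minimal generic D3D11.2 sketch, not taken from any particular game:)

Code:
// Minimal sketch: asking a DX11 device whether the optional DX11.2 tiled resources feature exists.
#include <d3d11_2.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL fl;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &device, &fl, nullptr)))
        return 1;

    // Tiled resources are an optional cap reported as a tier, not a separate feature level,
    // which is part of why so few games bothered with them.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
    device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1, &opts1, sizeof(opts1));
    std::printf("Tiled resources tier: %d\n", (int)opts1.TiledResourcesTier);

    device->Release();
    return 0;
}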

Prime example of what I'm talking about:
http://www.dsogaming.com/interviews...n-physx-dx11-2-windows-8-global-illumination/

DSOGaming: What’s your opinion about Windows 8 and DX11.2? Will you be using any of DX11.2’s exclusive features?

BT: We will definitely make sure that our games have no problem running on Windows 8 and 8.1 but at this point we are not considering using any of the DX11.1 or 11.2 features. It is very important for us to keep the platform specific changes to a minimum while still making use of everything that’s available to us. Some of the DX11.1 features are actually available in DX11 via standard APIs, so we are using those in the development process but not necessarily in the released game.

So... what do I gather from that? Unless developers are building games for Xbox One that use DirectX 12.1 features (which I doubt it supports), odds are history will repeat itself and pretty much all games will use DirectX 12.0.
 
IMO, by the time you see a DX12 game that you want to play, chances are you'll want a faster card anyway.
 
Basically, if your game is a glorified bitcoin miner with a few low-res graphics tacked on, then AMD cards are going to have an advantage. If your game is the more traditional type that is graphics-bound, don't expect much of a DX12 miracle for either Nvidia or AMD cards.

But boy are the internets abuzz these days. I haven't seen it like this since the huge 3.5 GB "scandal" that turned out to be less than a molehill. More like an anthill.
 
In regards to G-Sync vs. FreeSync: nothing is really stopping Nvidia from supporting FreeSync, but they won't because it is not "theirs," whereas everyone else, to my knowledge, cannot support G-Sync, as that is another proprietary Nvidia thing, so you have to buy their cards and the part that enables it in the monitors. (Could it not also be put onto the motherboard instead of the monitor somehow, as some sort of added expansion card that links to the GPU, like SLI but for G-Sync?)

Anyways, I know Intel recently announced they will be supporting and getting behind FreeSync, and as far as I know others can use it when they see fit, as the rules/obligations are far more "lax" and don't require specific hardware (chip/card or whatever). So I personally think FreeSync will probably see more prevalence in the industry as a whole than G-Sync, as one can pretty much be used by everyone and the other cannot. And face it, at least currently, Intel is the head honcho when it comes to sheer numbers in the computer industry. Nvidia may be quite large, but Intel is gargantuan, so anything they get behind either A) falls on its face or B) has insane support levels from all kinds of other industry heavyweights.

Long story short, Nvidia should have opened that specific tech up. The actual design makes sense and probably does have more potential performance benefit from the way it directly links with the hardware compared to FreeSync, but there really was no need to make it specific to their GPUs when they could have made the "chip" usable/tunable more like, say, an FPGA, so anyone could make/create them, with possible license revenue, instead of just a "handful" of specific usage scenarios.
 
Basically, if your game is a glorified bitcoin miner with a few low-res graphics tacked on, then AMD cards are going to have an advantage. If your game is the more traditional type that is graphics-bound, don't expect much of a DX12 miracle for either Nvidia or AMD cards.

But boy are the internets abuzz these days. I haven't seen it like this since the huge 3.5 GB "scandal" that turned out to be less than a molehill. More like an anthill.

lol, glorified bitcoin miner. Radeons, for the most part, have always been more "multithreaded" than their GeForce counterparts, so anything that can effectively tap into the way they are built gains a massive amount of raw performance. But like most things, many take the easy, simple way out, so many Radeons from old to new do not deliver as much raw grunt as they are technically capable of.

Bad analogies incoming: Nvidia is like a muscle car with a supercharger, raw, simple performance and efficiency; Radeons are like a twincharged W12/rotary, fancier and far more complex to "tune/build" and generally not as efficient, but when you get it just right they bloody well fly.

It hasn't been all that many years since Nvidia cards were the inefficient gas guzzlers that would overheat and break down, until they went back to the drawing board and decided to make them fuel injected with a smaller bore :p
 
Bad analogies incoming: Nvidia is like a muscle car with a supercharger, raw, simple performance and efficiency; Radeons are like a twincharged W12/rotary, fancier and far more complex to "tune/build" and generally not as efficient, but when you get it just right they bloody well fly.

Some cars are better on drag strips and others on road courses. So it really depends on how you intend to use the car. To that end, I'm quite surprised to see just how seriously people are taking this AotS benchmark.
 
Basically, if your game is a glorified bitcoin miner with a few low-res graphics tacked on, then AMD cards are going to have an advantage. If your game is the more traditional type that is graphics-bound, don't expect much of a DX12 miracle for either Nvidia or AMD cards.

But boy are the internets abuzz these days. I haven't seen it like this since the huge 3.5 GB "scandal" that turned out to be less than a molehill. More like an anthill.

Ashes of the Singularity is really fun, man. It's Supreme Commander on steroids. I do understand that strategy games aren't for everyone though. :)
 
Waiting isn't much of an option since I am transferring my current GPU to my HTPC so my kids can play stuff in the living room. Also, I am mainly prepping for Fallout 4 at this point, but I would rather have the better long-term solution, and that seems unclear at this point, mainly because I am investing in a G-Sync/FreeSync monitor.

Well, in that case I'd definitely recommend AMD, as FreeSync monitors are much better for the price.

$800 for 42" AH-IPS hdmi 2.0, DP, no PWM, and freesync.

http://hardforum.com/showthread.php?t=1869589

There are tons of FreeSync monitors coming out / that have come out recently, and all are at least $100-200 less than a G-Sync equivalent.

I would ignore most of the posts in here, because you have people telling you that DX12 is not going to be worth it, even though they were pushing for it a few months ago.

All of the major studios will be pushing for DX12 games. EA/DICE will be including it in Battlefront and Battlefield, and so will all the other titles that use Frostbite (a lot of them). It's also supported in every major game engine and has been for most of the year or longer.

No games will require DX12 for a long time, but those that support it will get better image quality, better framerates, or both by utilizing it, so there's no reason for devs not to include it.
 
Safest option would be a temporary second-hand/refurb 290/970-class card until the dust settles, then jump to 16nm.

The only publicly available DX12 software is an alpha version of a game sponsored by AMD, from a developer who only made a Mantle benchmark before that.

On the other hand, top AMD cards have only 4GB of RAM, which might be problematic in the future, as 4GB cards are suffering in the latest CoD:

http://www.pcgameshardware.de/Call-...cials/BO3-Beta-Benchmarks-Windows-10-1169217/
 
Safest option would be a temporary second-hand/refurb 290/970-class card until the dust settles, then jump to 16nm.

The only publicly available DX12 software is an alpha version of a game sponsored by AMD, from a developer who only made a Mantle benchmark before that.

On the other hand, top AMD cards have only 4GB of RAM, which might be problematic in the future, as 4GB cards are suffering in the latest CoD:

http://www.pcgameshardware.de/Call-...cials/BO3-Beta-Benchmarks-Windows-10-1169217/

And you recommend a 970? With its gimped 3.5GB of RAM?
 
And you recommend a 970? With its gimped 3.5GB of RAM?

If you're not playing 4K resolutions and you have to get a relatively newish card to tide you over until Pascal releases next year without breaking the bank, you can get a 970 from EVGA's B Stock page for $250-$260 and save money towards a Pascal flagship which, according to the rumor mill, could be anywhere from a 50% to 100% improvement over the 980Ti.
 
Safest option would be a temporary second-hand/refurb 290/970-class card until the dust settles, then jump to 16nm.

The only publicly available DX12 software is an alpha version of a game sponsored by AMD, from a developer who only made a Mantle benchmark before that.

A game sponsored by AMD? You mean the game whose engine has code given to them by Nvidia, and whose developer has worked directly with Microsoft, Intel, Nvidia, and AMD, including giving them all source code for over a year?

Oh, and the same developers that, you know... worked on Civ 5 and others? Just because they are a new company doesn't mean they are inexperienced and never created games before. Hell, one of their members helped write parts of DX9/DX10.
 
If you're not playing 4K resolutions and you have to get a relatively newish card to tide you over until Pascal releases next year without breaking the bank, you can get a 970 from EVGA's B Stock page for $250-$260 and save money towards a Pascal flagship which, according to the rumor mill, could be anywhere from a 50% to 100% improvement over the 980Ti.

Come on, there are never any decent cards on that B Stock page... that's where they dump their POS 700 series and below for sale. Good luck EVER finding a 970 there for sale.
 
Come on, there are never any decent cards on that B Stock page... that's where they dump their POS 700 series and below for sale. Good luck EVER finding a 970 there for sale.

They had some in stock for the past several weeks. Just checked again and they're gone now.
 
Come on, there are never any decent cards on that B Stock page... that's where they dump their POS 700 series and below for sale. Good luck EVER finding a 970 there for sale.

I sold mine on eBay... there are a bunch of them.
 
I am in the market for new monitor plus GPU. My plan is to go Ultrawide with Freesync/Gsync, but after reading about the DX12 issue I am not sure which to go with.

I had planned to go 980ti with the upcoming Acer XR34CKA, but now it seems the Fury X with the XR34CK may be more future proof for next years DX12 games.

I'd really like some unbiased input from those who have done the research. Getting locked into Freesync or Gsync is really going to suck long term, but that is where I am at.
I wouldn't make a purchasing decision based on FUD and a war of words. All I've seen points to current NVIDIA hardware supporting async, but the hardware is just optimized for the DirectX 11 workload. Fact is nothing you buy at this moment is going to be future proof for anything.
 
Come on, there are never any decent cards on that B Stock page... that's where they dump their POS 700 series and below for sale. Good luck EVER finding a 970 there for sale.

I just bought a 970 from B Stock a week ago; you will find one easily if you keep checking every few days or so.
 
Frankly, if this asynchronous shader technology is not supported by the GTX 980, then I feel misled into purchasing (two of the suckers), as I expected full DX12 compatibility.

Also, there is a chicken-and-egg problem here. If NVIDIA does not support async, the install base for that feature is minuscule, and that feature is less likely to be supported. But NVIDIA sold a whole generation of cards based on DX12 compatibility. So you have a whole bunch of people who thought they were creating an install base for DX12, but in fact it is just gimped DX12?! Yet the consoles clearly use async. So now you have a situation where the consoles and AMD support one feature, but NVIDIA, the market leader in PC graphics cards, does not? Doesn't that sort of relegate a large portion of the PC install base to being behind technologically? Doesn't it just impose another barrier to bringing games to the PC? I don't see this as shifting business to AMD; I only see this as a net negative for the platform as a whole. Very stupid.
 
Frankly, if this asynchronous shader technology is not supported by the GTX 980, then I feel misled into purchasing (two of the suckers), as I expected full DX12 compatibility.

Also, there is a chicken-and-egg problem here. If NVIDIA does not support async, the install base for that feature is minuscule, and that feature is less likely to be supported. But NVIDIA sold a whole generation of cards based on DX12 compatibility. So you have a whole bunch of people who thought they were creating an install base for DX12, but in fact it is just gimped DX12?! Yet the consoles clearly use async. So now you have a situation where the consoles and AMD support one feature, but NVIDIA, the market leader in PC graphics cards, does not? Doesn't that sort of relegate a large portion of the PC install base to being behind technologically? Doesn't it just impose another barrier to bringing games to the PC? I don't see this as shifting business to AMD; I only see this as a net negative for the platform as a whole. Very stupid.

I think you are massively misinterpreting the situation.

It was Nvidia's choice to label their Maxwell2 cards as possessing both full DX12 support & async computing. This is not some conspiracy "gotcha" ruse by console developers, game developers & AMD.

If you believe the latest Beyond3D tests, Maxwell2 does indeed possess async computing, but it appears to perform very poorly relative to AMD. So unless someone from Nvidia decides to release specific technical information based on actual Maxwell2 architecture, it is going to be impossible to get down to the exact nitty gritty. Oxide developers already have stated that Maxwell2 does not "natively" support async, but until someone with actual uarch knowledge of Maxwell2 speaks up, it's a fool's errand to continue running down async computing in Maxwell2.
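
For anyone wondering what "async compute" even looks like from the API side: in D3D12 it is just submitting work on a separate compute-only queue next to the graphics queue, and whether the GPU actually overlaps the two is up to the hardware/driver. A minimal generic sketch (my own illustration, not Oxide's code):

Code:
// Minimal sketch: "async compute" in D3D12 is just a second, compute-only command queue.
// Whether the GPU actually overlaps this work with graphics is a hardware/driver question,
// which is exactly what the Maxwell2 vs GCN argument is about.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // Normal graphics queue: draws, copies, everything.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Separate compute queue: dispatch-only work (e.g. post-processing, particles)
    // that the driver is free to schedule alongside the graphics queue.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Synchronization between the two queues is done explicitly with ID3D12Fence,
    // e.g. computeQueue->Signal(fence, value); graphicsQueue->Wait(fence, value);
}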

The industry is going to move forward, with or without AMD, or Nvidia. If the CONSENSUS of developers is that we need async computing to achieve efficient and maximum performance, that is what the MARKET and industry will dictate. This is completely irrespective of AMD and Nvidia.

So there is absolutely no net negative for the industry here. Cards will still perform, and Maxwell2 does indeed fully support DX12; they just won't be performance leaders.

If you truly believe that AMD won't receive extra business out of this situation, you're just blind and dumb. IF it is true that a $250 AMD card today matches a $650 Nvidia card in DX12 @ 4K res, then you'd be a fool to leave that amount of value on the table as a consumer just for the sake of brand loyalty. We're now talking about a difference that pays for a CPU + mobo upgrade, or a 1 TB SSD, or a FreeSync monitor.

People will dictate the market, and the market will become more efficient.
 
Someone I know just bought two 980 ti cards for his system last week and is going full autist rage on his social media right now. This is fucking hilarious. I think a few Nvidia fanboys might actually suicide over this.
 
Someone I know just bought two 980 ti cards for his system last week and is going full autist rage on his social media right now. This is fucking hilarious. I think a few Nvidia fanboys might actually suicide over this.

Wait till he finds out that AMD made improvements to CFX under DX12.
 
I think you are massively misinterpreting the situation.

It was Nvidia's choice to label their Maxwell2 cards as possessing both full DX12 support & async computing. This is not some conspiracy "gotcha" ruse by console developers, game developers & AMD.

If you believe the latest Beyond3D tests, Maxwell2 does indeed possess async computing, but it appears to perform very poorly relative to AMD. So unless someone from Nvidia decides to release specific technical information based on actual Maxwell2 architecture, it is going to be impossible to get down to the exact nitty gritty. Oxide developers already have stated that Maxwell2 does not "natively" support async, but until someone with actual uarch knowledge of Maxwell2 speaks up, it's a fool's errand to continue running down async computing in Maxwell2.

The industry is going to move forward, with or without AMD, or Nvidia. If the CONSENSUS of developers is that we need async computing to achieve efficient and maximum performance, that is what the MARKET and industry will dictate. This is completely irrespective of AMD and Nvidia.

So there is absolutely no net negative for the industry here. Cards will still perform, and Maxwell2 does indeed fully support DX12; they just won't be performance leaders.

If you truly believe that AMD won't receive extra business out of this situation, you're just blind and dumb. IF it is true that a $250 AMD card today matches a $650 Nvidia card in DX12 @ 4K res, then you'd be a fool to leave that amount of value on the table as a consumer just for the sake of brand loyalty. We're now talking about a difference that pays for a CPU + mobo upgrade, or a 1 TB SSD, or a FreeSync monitor.

People will dictate the market, and the market will become more efficient.

Pretty much this. Right now it doesn't matter because we have jack shit for DX12 content. In a year or so, who knows?
 
Pretty much this. Right now it doesn't matter because we have jack shit for DX12 content. In a year or so, who knows?

The part that matters to me and probably many other people is whether to go Free-Sync or G-Sync monitor for the long term. If it turns out AMD is going to rule DX12 for a long time, then the answer is Free-Sync. If they just have a gentle head start but NVIDIA will take over anyways, then G-Sync is probably the answer.
 
The part that matters to me and probably many other people is whether to go Free-Sync or G-Sync monitor for the long term. If it turns out AMD is going to rule DX12 for a long time, then the answer is Free-Sync. If they just have a gentle head start but NVIDIA will take over anyways, then G-Sync is probably the answer.

I wish some of the monitor manufacturers would actually give us the things we really want in monitors. Not the least of which would be G-Sync and Freesync compatibility.
 
The part that matters to me and probably many other people is whether to go Free-Sync or G-Sync monitor for the long term. If it turns out AMD is going to rule DX12 for a long time, then the answer is Free-Sync. If they just have a gentle head start but NVIDIA will take over anyways, then G-Sync is probably the answer.

Intel is backing FreeSync (Adaptive-Sync), and there is no reason for monitors not to include it since... it's free and part of the DP standard. They have to pay ($100+?) for a G-Sync module.

Nvidia should support Adaptive-Sync in the future; there is no reason for them not to except to lock gamers into their hardware.
 
Here is the main question you have to ask between DX11 and DX12: is a full DX12 game going to "look" visually better than a DX11 game? I.e., are the graphical effects going to look superior to the DX11 equivalent?

I think, off the bat, no; developers are going to opt for the performance increase over the visual quality increase at first. Down the road a few years? I do not know.
 
Here is the main question you have to ask between DX11 and DX12: is a full DX12 game going to "look" visually better than a DX11 game? I.e., are the graphical effects going to look superior to the DX11 equivalent?

I think, off the bat, no; developers are going to opt for the performance increase over the visual quality increase at first. Down the road a few years? I do not know.

Naturally though, with the increase in performance, you'll have an increase in headroom. Developers will have more of a resource budget to crank out even better effects at good framerates. It's never really a question as to how good you want something to look, but how good CAN you make it look while performing well on a target platform.
 
Naturally though, with the increase in performance, you'll have an increase in headroom. Developers will have more of a resource budget to crank out even better effects at good framerates. It's never really a question as to how good you want something to look, but how good CAN you make it look while performing well on a target platform.

They also have to think of backwards compatibility with people not on DX12. The game is still going to have to run well on DX11 for a long while.
 
Someone I know just bought two 980 ti cards for his system last week and is going full autist rage on his social media right now. This is fucking hilarious. I think a few Nvidia fanboys might actually suicide over this.
Just tell him to return the cards or sell them himself and get Fury X's.
I'm sure if he complained enough to Amazon/Newegg they would take them back. :rolleyes:
 
Solid explanation of the current state of things: http://wccftech.com/nvidia-amd-directx-12-graphic-card-list-features-explained/

I feel that this is definitely a "bird in the hand is worth two in the bush" situation. There is no way I am going to pay full price for lower perf/$ in current games in order to harness some future theoretical performance gain. In a couple years when real AAA games use DX12, we will be ahead 1 or 2 hardware generations and GCN1.2 vs Maxwell 2.0 discussions will be long forgotten.
 
Solid explanation of the current state of things: http://wccftech.com/nvidia-amd-directx-12-graphic-card-list-features-explained/

I feel that this is definitely a "bird in the hand is worth two in the bush" situation. There is no way I am going to pay full price for lower perf/$ in current games in order to harness some future theoretical performance gain. In a couple years when real AAA games use DX12, we will be ahead 1 or 2 hardware generations and GCN1.2 vs Maxwell 2.0 discussions will be long forgotten.

First time I've read anything educational on WCCFTech :eek:
 
Honestly, I think it sucks for some people, but as a casual gamer who seemingly plays a lot of games based on DirectX 11 and such... it really doesn't affect me.

I built my computers not to really tech-demo the latest and greatest, but they still aren't bad computers, so I'm sure the AAA titles of 2015~2016 will run somewhat okay on the two computers.


I have really close friends who work at Hi-Rez, and I mostly play Smite. I recently got into the beta for Paladins, and if history repeats itself, I think the game probably isn't gonna be a tech demo for latest-gen tech, just solid, good graphics with nice gameplay.

The only game recently that I had issues with was Ark, and it was odd seeing a game run at 50~60fps on Epic settings at 1440p.
 