DX12 to be announced on March 20th

Sorry if this has been asked before, but does DX12 have a release date? TIA.

- sock
"you think thats air your breathing?" - keanu reaves, jurassic park
 
Microsoft announced that older GPUs will be able to support DX12.
DX12 changes the way shaders are used, so I don't know how an old GPU can handle such a new API.

IMHO the announcement was paid for by manufacturers like AMD and nVidia. :D
 
Microsoft announced that older GPUs will be able to support DX12.
DX12 changes the way shaders are used, so I don't know how an old GPU can handle such a new API.
Feature levels. Same thing DX11 uses to run on DX10 hardware.

Apparently, no changes to shaders are needed to get the massive reduction in CPU overhead that they demonstrated, so DX11 cards will be able to take advantage of DX12's headline improvement.
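For anyone wondering what "feature levels" actually look like in code: this is the existing D3D11 mechanism, not anything from the DX12 preview, but presumably DX12 will expose older cards the same way. Rough C++ sketch (the helper name is mine):

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Ask for the highest feature level first; the runtime walks down the
    // list, so a DX10-class card still ends up with a working D3D11 device.
    bool CreateDeviceOnWhateverHardwareWeHave(ID3D11Device** outDev,
                                              ID3D11DeviceContext** outCtx)
    {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0,   // true DX11 hardware
            D3D_FEATURE_LEVEL_10_1,   // DX10.1-class GPU
            D3D_FEATURE_LEVEL_10_0    // DX10-class GPU
        };
        D3D_FEATURE_LEVEL granted;    // what the driver actually gives us

        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            requested, sizeof(requested) / sizeof(requested[0]),
            D3D11_SDK_VERSION, outDev, &granted, outCtx);

        // Features above 'granted' simply aren't available; everything below works.
        return SUCCEEDED(hr);
    }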
 
So, as far as current GPUs go, when will we see this DX12 update?

As stated multiple times; preview in 2H 2014, full release 2H 2015.

That was referring to the release of DX12, not specifically to the update for already existing GPUs. I was asking specifically about the DX12 update for existing GPUs, especially NVidia.

Unless you can find a credible source to substantiate the claim, it remains unclear.

I contacted NVidia and they talked like the new driver support may come out in April for NVidia cards, but they can't publicly confirm it until Microsoft announces DX12. So we will get a driver update in April, but it won't be called a "DX12 driver update" or anything, even though it sort of is.

Ahh, then there's this: Nvidia Boasts New Driver Surpasses AMD's Mantle in Games
 
I contacted NVidia and they talked like the new driver support may come out in April for NVidia cards, but they can't publicly confirm it until Microsoft announces DX12. So we will get a driver update in April, but it won't be called a "DX12 driver update" or anything, even though it sort of is.

Ahh, then there's this: Nvidia Boasts New Driver Surpasses AMD's Mantle in Games
Which means that until now the Nvidia GPUs had 30% less performance ON PURPOSE... :rolleyes:
 
Well, I think it's a bit early to resort to that type of theory. I mean really, game optimizations happen all the time, and this supposed driver is giving something like a 5 fps increase in Thief, which pushes it ahead of Mantle. But then again, it's a 5-frame difference. Is that a 30% less performance figure? I don't think so.

I honestly don't feel NV would intentionally do something like that, but I'm sure it won't stop some of the out-there theories. But we'll see what the new driver does. This is apparently a problem that NV has worked on for a while, and their developers (including John McDonald, who has done numerous talks on the matter) have been working on lessening driver overhead across the board. So it could be that? Or maybe some partial DX12 beta features in the new driver? Who knows.

In any case, I think it's a bit early to throw out the "intentional driver gimping" theories. I really don't think that's the case. If they could have easily extracted 30% more performance, they would have done so to claim a commanding lead over the 290X. NV likes having a halo GPU title with no question; 30% would have done that. But I don't see 30%. Their slides indicate a 5 fps gain in Thief, which certainly isn't 30%.
 
Which means that until now the Nvidia GPUs had 30% less performance ON PURPOSE... :rolleyes:

Why do sites show graphs where 0 isn't the bottom of the scale? The "fast in practice" graph is misrepresented because it bottoms out at 48.
 
Why do sites show graphs where 0 isn't the bottom of the scale? The "fast in practice" graph is misrepresented because it bottoms out at 48.
Because all the bars are identical below 48.

The axis is clearly labeled, so it's not "misrepresented" unless you can't read...
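Just to put numbers on why a truncated baseline reads as misleading even when the axis is labeled (throwaway figures, not the ones from the actual chart):

    #include <cstdio>

    int main() {
        // Two hypothetical cards averaging 52 and 60 fps.
        double slow = 52.0, fast = 60.0;
        double axisFloor = 48.0;   // the chart's truncated baseline

        printf("real difference:  %.2fx\n", fast / slow);                              // ~1.15x
        printf("bar-height ratio: %.2fx\n", (fast - axisFloor) / (slow - axisFloor));  // 3.00x
        return 0;
    }

A roughly 15% gap gets drawn as bars three times taller, which is exactly the complaint.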
 
Confirmation that I was correct:

Tested: Nvidia's Performance-Boosting GeForce 337.50 Driver

"Nvidia has posted a new driver - the GeForce 337.50 Beta driver. Normally we wouldn't give driver releases all that much attention, though this one is a little different.

Nvidia claimed that this driver will bring some very big performance improvements in a whole number of games. Some games will experience performance improvements of up to 64 percent on single GPUs, while SLI performance can be boosted by as much as 71 percent. The optimizations have been done through optimizations with DirectX, specifically reducing CPU overhead in certain games.........."


I contacted NVidia and they talked like the new driver support may come out in April for NVidia cards, but they can't publicly confirm it until Microsoft announces DX12. So we will get a driver update in April, but it won't be called a "DX12 driver update" or anything, even though it sort of is.

Ahh, then there's this: Nvidia Boasts New Driver Surpasses AMD's Mantle in Games
 
You're going to have to fill us in on how this driver update "sort of is" the DX12 driver.
 
Ok, so it's a good driver but not a panacea, as is to be expected. What happens when they don't use a powerful CPU?
 
You're going to have to fill us in on how this driver update "sort of is" the DX12 driver.

Because they all but admitted it.

(Instead of complaining all the time (not directed at anybody in particular) ... there are simply far too many negative-Nancy nit-pickers offering inaccurate opinions before they've even read the relevant articles. I recommend a new strategy: actually read the articles first *BEFORE* whining. It's just basic common sense.)

Below are quotes from NVidia that resemble many of the claims from the March 20th event on DX12:

"Headline features of today’s 337.50 Beta driver include DirectX 11 and SLI performance optimizations..."

"By remaining laser-focused on DirectX 11 (and the forthcoming DirectX 12), we have dramatically enhanced the efficiency of our DirectX 11 driver, enabling performance gains of up to 71% in DirectX 11 games, and additional benefits in games with multi-GPU SLI support. Unlike changes tailored around custom graphics APIs, our new optimizations can benefit all DirectX 11 titles the second they go on sale, regardless of developer or publisher, and apply to all DirectX 11-capable NVIDIA GPUs."

"Compared to our competitor’s new, much-discussed API, our testing found the combination of the new 337.50 Beta drivers, and DirectX 11, to offer a faster experience with suitably-matched hardware, as found in enthusiast systems"

"Unlike our competitor’s API solution, the DirectX 11 enhancements in 337.50 Beta benefit all GeForce GTX DirectX 11 GPUs, not just the latest and greatest graphics cards."

"In 337.50 Beta we’ve dramatically reduced multi-GPU CPU overhead, resulting in considerable performance gains in CPU-bound games, and additional performance gains in the majority of SLI-enabled titles, as the chart below illustrates:"

GeForce 337.50 Beta Performance Drivers Increase Frame Rates By Up To 71%
http://www.geforce.com/whats-new/articles/nvidia-geforce-337-50-beta-performance-driver

"The drivers enhance DirectX in such a manner that they free up CPU overhead."

http://www.guru3d.com/news_story/geforce_337_50_beta_sli_performance_benchmarks.html

Sadly, the claimed performance increases will mostly be in games that are CPU-heavy (e.g. Hitman) and in systems with more than one GPU. So, apparently, a single 760 will see little advantage at this point. Perhaps a later driver update will help with that ... we'll see.
 
"The drivers enhance DirectX in such a manner that they free up CPU overhead."

Right, which has always been an optional portion of the DirectX 11 spec. Microsoft has always had provisions in DirectX 11 for reducing CPU overhead; it's just that neither AMD nor Nvidia bothered to implement them (until now).

So this is just "a more complete DirectX 11 implementation," nothing to do with the new features coming to DirectX 12...
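The post doesn't name the mechanism, but the usual example of those DX11 "provisions" is deferred contexts plus driver command lists (multithreaded command recording), and native driver support for them is exactly the kind of thing a driver update can change. Rough sketch, with made-up function names for the app side:

    #include <d3d11.h>

    // Worker thread: record draw calls on a deferred context instead of the
    // immediate context, so command building doesn't serialize on one core.
    void RecordOnWorkerThread(ID3D11Device* device, ID3D11CommandList** outList)
    {
        ID3D11DeviceContext* deferred = nullptr;
        if (FAILED(device->CreateDeferredContext(0, &deferred)))
            return;

        // ... set state and issue draws on 'deferred' here ...

        deferred->FinishCommandList(FALSE, outList);  // bake into a command list
        deferred->Release();
    }

    // Main thread: replay the pre-recorded commands on the immediate context.
    void SubmitOnMainThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
    {
        immediate->ExecuteCommandList(list, FALSE);
        list->Release();
    }

As far as I know, CheckFeatureSupport with D3D11_FEATURE_THREADING reports whether the driver executes command lists natively or the runtime has to emulate them, which is where a lot of the CPU cost hides.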
 
Sadly, the claimed performance increases will mostly be in games that are CPU-heavy (e.g. Hitman) and in systems with more than one GPU. So, apparently, a single 760 will see little advantage at this point. Perhaps a later driver update will help with that ... we'll see.
Well, Crysis 3 is extremely CPU-heavy as well; I had dips under 30 fps quite a few times with a CrossFire setup of R9 290s at 1080p.
Also, the upcoming Watch Dogs appears to be extremely CPU-heavy, so that will benefit for sure.
 
"The drivers enhance DirectX in such a manner that they free up CPU overhead."
They do not enhance DirectX. They enhance NVIDIA's implementation of DirectX (11). This has nothing to do with DirectX 12. NVIDIA says specifically that the enhancements are to their DX11 driver.
 
This was not my comment, it was a quote:

"The drivers enhance DirectX in such a manner that they free up CPU overhead."

http://www.guru3d.com/news_story/geforce_337_50_beta_sli_performance_benchmarks.html

You're right that it's not DX12, though. It's just a coincidence that this DX11 update coincides with all the DX12 and Mantle talk, and that has probably caused some confusion. However, these new features will likely be a part of DX12 as well.

They do not enhance DirectX. They enhance NVIDIA's implementation of DirectX (11). This has nothing to do with DirectX 12. NVIDIA says specifically that the enhancements are to their DX11 driver.
 
Quite frankly, I'm not so sure why Nvidia is so upset about Mantle; maybe it has further reach/support than I'm currently aware of. Good that they at least got CryEngine working better, with more games coming on that one.
 
Quite frankly, I'm not so sure why Nvidia is so upset about Mantle; maybe it has further reach/support than I'm currently aware of. Good that they at least got CryEngine working better, with more games coming on that one.

Because AMD has all three console wins, therefore the PC is Nvidia's only refuge. It's in Nvidia's interest to be as dominant in the PC market as possible, but they have an API disadvantage.
 
Quite frankly, I'm not so sure why Nvidia is so upset about Mantle; maybe it has further reach/support than I'm currently aware of. Good that they at least got CryEngine working better, with more games coming on that one.

I don't think they are upset with Mantle. So far it has been buggy and underperforming.

NVIDIA has been able to get similar performance gains without having to resort to a separate API.
 
Mantle is in its infancy; iterative gains will overtake Nvidia's DX11 optimizations as developers become more familiar with it.
 
Anyone with an iota of understanding of the software stack would concur that DirectX 11 optimizations have no correlation with Mantle or DirectX 12.

Basically, in order to leverage Mantle, game developers have to work closely with AMD, so how could NVidia all of a sudden come up with the magic sauce to make Mantle work with their drivers? If they did, then I commend their brilliance and would only buy their stuff, knowing they have some wizards in the drivers department.

So for the naive, it's preposterous to mix two different sets of technologies. End of discussion!
 
So for the naive, it's preposterous to mix two different sets of technologies. End of discussion!
Only if it's a desperate attempt to make your "team" look better; for more on that, we go to PRIME1:
 
Only if it's a desperate attempt to make your "team" look better; for more on that, we go to PRIME1:

Here comes the "team" crap again.

My post was pure sarcasm; if you are dumb enough not to understand it, then I cannot do anything. I have both Tri-Cool R9 290s and a GTX 780 Ti, by the way.

If you plan to simply ignore what I wrote between the lines, then it's up to you. Let me reiterate: "it is impossible for NVidia to leverage DX12- or Mantle-related APIs in DirectX 11".
 
Mantle is in its infancy; iterative gains will overtake Nvidia's DX11 optimizations as developers become more familiar with it.

By the time Mantle is out of its "infancy", developers will be using OpenGL 4.4 and DirectX 12, which work on all hardware and provide similar close-to-the-metal programming.

AMD should abandon its proprietary API and get its drivers working with all games using open standards.
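For reference, the GL 4.4 feature usually cited for that "close to the metal" claim is immutable, persistently mapped buffer storage (the AZDO approach). A minimal C++ sketch, assuming a 4.4 core context and a function loader (GLEW/GLAD or similar) are already set up; the helper name is mine:

    #include <GL/glcorearb.h>   // headers only; entry points come from the loader mentioned above

    GLuint CreatePersistentBuffer(void** outPtr, GLsizeiptr size)
    {
        // Immutable storage, mapped once for the buffer's lifetime:
        // per-frame updates become plain memcpy instead of driver calls.
        GLuint buf;
        glGenBuffers(1, &buf);
        glBindBuffer(GL_ARRAY_BUFFER, buf);

        const GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;
        glBufferStorage(GL_ARRAY_BUFFER, size, nullptr, flags);        // GL 4.4 / ARB_buffer_storage
        *outPtr = glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);   // stays mapped; no re-map per frame

        // Caller writes into *outPtr every frame and fences (glFenceSync)
        // before reusing a region the GPU might still be reading.
        return buf;
    }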
 
By the time that happens, Mantle will be mature and updated.

DX12 adoption will be at least 18 months out, and may require updated hardware. Mantle won't.
 
By the time that happens, Mantle will be mature and updated.

DX12 adoption will be at least 18 months out, and may require updated hardware. Mantle won't.

DX12 does not require new hardware and is not locked to a small percentage of the market like Mantle. You sure are pushing hard for a proprietary API...
 
Not really; I'm pushing for the industry to advance.

If Mantle forces MS to step it up, I'll support it as much as possible.

And so far it looks to be doing its intended job.

DX12 WILL require new hardware to leverage all its functions; MS has said as much. Base functionality won't, but full support will.
 
Not really; I'm pushing for the industry to advance.

If Mantle forces MS to step it up, I'll support it as much as possible.

And so far it looks to be doing its intended job.

DX12 WILL require new hardware to leverage all its functions; MS has said as much. Base functionality won't, but full support will.

OpenGL 4.4 was announced before Mantle and is about as open a standard as you can get. Why are you not supporting that, instead of a completely closed and vendor locked API like Mantle? I know why.
 
Because the developers don't give a crap about OGL.

If they did, I'd support it as well.

What was the last AAA OGL title?

Rage?

Give me a break...
 
OpenGL 4.4 was announced before Mantle and is about as open a standard as you can get. Why are you not supporting that, instead of a completely closed and vendor locked API like Mantle? I know why.

Why are you such a hypocrite? You complain about the proprietary Mantle, but go on praising G-Sync and other proprietary Nvidia tech. Don't pretend you care about open standards, considering you say anything that is non-Nvidia is garbage and anything Nvidia is amazing. You are like those Steam fanboys who want a monopoly for some reason. I guess if you want to pay $1000+ for mid-range performance, a monopoly would be great.
 
Why are you such a hypocrite? You complain about the proprietary Mantle, but go on praising G-Sync and other proprietary Nvidia tech. Don't pretend you care about open standards, considering you say anything that is non-Nvidia is garbage and anything Nvidia is amazing. You are like those Steam fanboys who want a monopoly for some reason. I guess if you want to pay $1000+ for mid-range performance, a monopoly would be great.

Mantle is a flop; Nvidia proved it could do better without a proprietary API.
I hope that AMD will do the same with regard to G-SYNC.
 
Personally, I don't care as long as the games run fine.

I downloaded the Unreal 4 demos that someone over at NeoGAF compiled, and I'm running them at over 60 fps at 1440p... OK, they're not real games, nor do the demos have any driver optimisation, but one of them already looks 10 times better than 99% of the games out there.

I prefer engine optimisation to driver and API battles.
 
Personally, I don't care as long as the games run fine.

I downloaded the Unreal 4 demos that someone over at NeoGAF compiled, and I'm running them at over 60 fps at 1440p... OK, they're not real games, nor do the demos have any driver optimisation, but one of them already looks 10 times better than 99% of the games out there.

I prefer engine optimisation to driver and API battles.

Where did you download it?
 
Personally, I don't care as long as the games run fine.

I downloaded the Unreal 4 demos that someone over at NeoGAF compiled, and I'm running them at over 60 fps at 1440p... OK, they're not real games, nor do the demos have any driver optimisation, but one of them already looks 10 times better than 99% of the games out there.

I prefer engine optimisation to driver and API battles.

Where did you download it?

What he said.
 