Why didn't Nvidia implement DX10.1?

BioSergeant

Why did they decide not to implement it? Anybody know why? I find it interesting that they didn't.
 
They just haven't gotten around to it because they haven't really changed their architecture since the G80. I'm sure their next gen will support it. It seems to be a very minor thing overall, and right now only Assassin's Creed even has support for it.

It's just not an important consideration in this current generation of cards. That will change when we get to the next gen, as more support comes online.
 
There's a reason why it's called DirectX 10.1 and not 11.

Remember when MS released DirectX 8.1? Same deal.
 
Because there's no reason to. As far as I know, all it does is make a few things standard that don't really affect current hardware or games.
 
LOL, I was going to post about that. I can remember all the Nvidia fans talking about how the 6600/6800 was better than the X800 line simply because they supported SM3. I wonder how many of those who posted in this thread claiming the lack of DX10.1 isn't a big deal are the same people....

I think heatless has it right. The 8800 simply hasn't changed since it was introduced. They've increased clock speeds and enabled extra stream processors, but the chip is basically the same. Perhaps when the real 9000s get here they will add 10.1, tessellation, etc.
 
Shader Model 3 ring a bell? The lack of DX9c was one of the major issues that hurt the X800 line.

Even though it caused no real-world problems: by the time SM 3.0 was actually put into real use, that generation of cards didn't have a chance at running it.
 
Even though it caused no real-world problems: by the time SM 3.0 was actually put into real use, that generation of cards didn't have a chance at running it.
Then why is there a shader patch for BioShock that allows it to run on later SM 2.0 cards? ;)
 
Because it still doesn't matter.

The point is, SM 3.0 is a trivial issue for that series of cards. Any game that makes extensive use of it has to be played on low settings anyway, so you can just set it not to use SM 3.0, or download a patch to disable it.
 
Shader Model 3 ring a bell? The lack of DX9c was one of the major issues that hurt the X800 line.

Shader Model 3 is completely independent of DirectX. It is a hardware feature and is utilized by both DirectX and OpenGL. There was no "hardware support" for a/b/c. The changes in DirectX 9 happened because of changes in the hardware, not the other way around, as is happening here and as happened with DirectX 8.

Even so, the change from 10 to 10.1 is minuscule.
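
To make that concrete, here's a minimal C++/D3D9 sketch (illustrative only, not from any particular engine) of how a game actually detects SM3. It queries the hardware's caps through the runtime, so an X800 running on the 9.0c runtime still reports PS 2.x; the runtime version never enters into it.

```cpp
#include <d3d9.h>   // link with d3d9.lib

// Query the hardware's shader caps through the D3D9 runtime.
// The runtime version (9.0c or otherwise) plays no part here:
// an X800 on 9.0c still reports PS 2.x, a 6800 reports PS 3.0.
bool SupportsSM3(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                  D3DDEVTYPE_HAL, &caps)))
        return false;

    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
           caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}
```

Which is why "DX9.0c card" was always a misnomer: everyone runs the same runtime, only the reported caps differ.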
 
Summing up what the others have said: because it doesn't matter unless you're running ATI hardware with a horribly broken AA implementation and have to run everything through the stream processors. Since Nvidia isn't... there's no compelling reason to devote time to it.

Also, I ran an X800 XT prior to my 8800 GTXs. You know what I lost as a result of no SM3.0? HDR support in some titles (Oblivion and Test Drive Unlimited come to mind) and some stuff in Far Cry, I think. I lost no sleep over it.
 
Shader Model 3 is completely independent of DirectX. It is a hardware feature and is utilized by both DirectX and OpenGL. There was no "hardware support" for a/b/c. The changes in DirectX 9 happened because of changes in the hardware, not the other way around, as is happening here and as happened with DirectX 8.

Even so, the change from 10 to 10.1 is minuscule.

DX9c added support for SM3. Lacking SM3, the X800 line was unable to realize DX9c's full capability.

Nvidia didn't add 10.1 because they didn't think they would need it, and unless ATI can get devs to start using it en masse before Nvidia's next chip arrives (assuming it has 10.1), Nvidia will have made a good choice.
 
It's still much earlier in the DX10 era than it was in DX9 when SM3 came out, so 10.1 at this point is irrelevant save for one game. With new parts coming in the next few months, 10.1 support matters even less.
 
DX9c added support for SM3. Lacking SM3, the X800 line was unable to realize DX9c's full capability.

Which is what I said.

DX9c added support for a change in hardware. The difference between DX9c and earlier versions exists because the hardware changed. The difference between DX10 and DX10.1 (as well as DX8 and DX8.1 earlier) exists because the feature set of DirectX changed, not the hardware. To use that new feature set, however, new hardware had to be developed.

In other words, there is no such thing as a "DirectX 9.0c" card, because all DX9 cards provide full support for that version of DirectX. Some features that DirectX 9.0c supported (but did not introduce) could not be used for hardware reasons outside of DirectX. SM3.0 is one of these. DirectX 9.0c does NOT equal SM3.0.

However, to provide hardware-level support for DX10.1, a DX10.1 card is needed.

The difference here is that the changes were mandated by MS, not by Nvidia/ATI.
 
To use DX9c to its fullest, you needed an SM3-compatible card. Not all cards had SM3, and thus not all cards fully supported DX9c. I am aware that DX9c != SM3; I am just using it as the talking point since it was the issue most discussed back in the day. In short:
New hardware was needed to support DX9c
New hardware is needed to support DX10.1

I understand what you're saying about being able to run DX9 games without HDR (as is relevant to SM3), but those cards did not support the full feature set available to devs via DX9c.
 
Why did they decide not to implement it? Anybody know why? I find it interesting that they didn't.

Because DX10.1 is just a minor upgrade. It brings almost nothing new to the table, and no games support it; even though AC does seem to use it, that support will be removed in a patch.
 
I think people are blowing the DX10.1 thing out of proportion. If Nvidia is strongarming Ubisoft because it's a TWIWMTBP game, that is totally unethical, but it's all speculation at this point. Ubi says that their implementation removed a render pass, and that is why it allows some improvement in fps. Is it really that much of an improvement when AA is enabled? Are there hard numbers?
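
For context on the render pass claim: the commonly cited mechanism is that D3D10.1 lets a shader read a multisampled depth buffer directly, so an engine no longer needs a separate depth-only pass when AA is on. Here's a rough C++ sketch of the resource setup that enables this. It's my own illustration of the technique, not Ubisoft's actual code, and the typeless-format setup is an assumption about how it would typically be done.

```cpp
#include <d3d10_1.h>   // link with d3d10_1.lib

// Sketch: create an MSAA depth buffer that can also be bound as a
// shader resource. D3D10.1 hardware permits an SRV over a multisampled
// depth resource; on 10.0 the engine would instead have to re-render
// depth into a plain texture in a separate pass.
HRESULT CreateReadableMsaaDepth(ID3D10Device1* dev, UINT w, UINT h,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = w;
    td.Height           = h;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS; // typeless: shared by DSV and SRV
    td.SampleDesc.Count = 4;                          // 4x MSAA
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = dev->CreateTexture2D(&td, nullptr, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view for rendering the scene's depth.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = dev->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view so later passes can read the same depth,
    // per sample, without a redundant depth-only pass.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return dev->CreateShaderResourceView(*tex, &sd, srv);
}
```

If that's what their renderer does, the fps delta with AA enabled is just the cost of the pass that 10.0 hardware has to keep.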
 
'From the above graph, it is obvious that AMD's ATI Radeon HD 3870 enjoys a considerable performance boost with Windows Vista Service Pack 1. In fact, the Radeon HD 3870 averaged about 34% faster with Vista SP1 at 1600x1200 with 2X MSAA.'

so 34% is minor :)
 
Nice read. I guess Nvidia is being an ass after all. Not that it matters for me :p
 
'From the above graph, it is obvious that AMD's ATI Radeon HD 3870 enjoys a considerable performance boost with Windows Vista Service Pack 1. In fact, the Radeon HD 3870 averaged about 34% faster with Vista SP1 at 1600x1200 with 2X MSAA.'

so 34% is minor :)
In a "w00t, I can finally match the competition" kinda way, yes...
 
The reason Nvidia is still releasing "new" GPUs based on a shrunk version of the G80 from 2006 is that ATI/AMD hasn't been competitive. The difference between the 5800 and 6800 was huge, and every following "generation" has been a big step up from the previous one... *except* for the 9x00 cards. The only card that's significantly faster is the GX2, and that's because it's two cards bolted together. The others are basically re-branded 8x00 cards meant to flood the market, give Nvidia cards more "shelf space", and confuse customers. Had the Radeon 3870 been significantly faster than the 8800U, the 9800 would surely have been a completely different card. I'm sure Nvidia is sitting on some much more interesting tech, but they see no reason to release it to consumers at this time.

It's similar to the situation with the GeForce 3/4 several years ago. When it turned out that ATI's Radeon 8500 was no match for the aging GeForce3, Nvidia deliberately delayed the GeForce4 and re-released the GeForce3 as the GeForce3 Ti instead. When they finally gave us the GeForce4, ATI's surprise move with the 9700 caught them off guard, and they got what they deserved for trying to milk every last penny from obsolete products instead of making their best technology available to gamers.
 
It's really not a big deal right now, and probably won't matter much for a while. The sole current DX10.1 game is having that functionality removed anyway.

It's not a big upgrade like DX9.0b -> DX9.0c, and even as that showed, it took a while for SM3 games to start showing up. If you want to take anything from this, the faster HD 38xx cards may be slightly more future-proof than the 8800 and 9800 series cards, if future games can use DX10.1 to significantly speed up functionality that otherwise performs poorly on the HD 38xx series. :p
 