Is anyone SM3.0 / DX9.0c compliant?

chrisf6969

[H]F Junkie
Joined
Oct 27, 2003
Messages
9,011
Just a funny turn of events:

First ATI tries to ignore SM3.0 and says how unimportant it is... yadda yadda yadda...

Then they tout SM3.0 done right.

Then Nvidia pisses in ATI's cereal with "SM3.0 NOT done right," b/c it's missing Vertex Texturing.

Now ATI is pissing right back saying Nvidia didn't implement SM3.0 right either.

Ok, so now it looks like NEITHER of them did it right? SHIT, now what do I buy..... Matrox's Paraphalegic? :)

If you feel you need a warning for no flame war, you're probably inviting one. Keep it clean and we'll keep the discussion. - p[H]
 
Until a game comes out that uses Vertex Texturing, it's pointless to try and say ATi screwed up. Unified is going to be used, from what I have read anyway. Same goes for NV.
 
I hope WGF & unified pipeline/shaders whatever will eliminate all the variations like SM1.1, 1.2, 1.3, 1.4, etc...

But there will undoubtedly be variations b/c of different HW & different drivers.

HOPEFULLY, Microsoft's restrictions for hardware compliance will be strict enough to make EVERYTHING perfectly compatible, but loose enough to give hardware makers the flexibility to come up with their own creative ways to meet that compatibility.
 
fallguy said:
Until a game comes out that uses Vertex Texturing, it's pointless to try and say ATi screwed up. Unified is going to be used, from what I have read anyway. Same goes for NV.
As pointed out basically every time it comes up, that's what the IL-2 series uses for its water effects.

Granted, that's only 3 games. But...still....

(Whole thing is amusing. So, neither is fully SM3.0 compliant, huh? Nice.)
 
speaking of "IL"

I heard Infinium Labs' console will be SM3.0 compliant when it comes out next week! ;) j/k

edit: oh and BitBoys' videocard too! LOL
 
dderidex said:
As pointed out basically every time it comes up, that's what the IL-2 series uses for its water effects.

Granted, that's only 3 games. But...still....

(Whole thing is amusing. So, neither is fully SM3.0 compliant, huh? Nice.)

That's an NV-specific optimization, if memory serves, so it's likely it wouldn't work for ATi anyway. Even so, vertex textures can still be done in software, so it's moot.

The point being, none of this matters. It doesn't matter to me that NV fails these tests unless there is a game, or games, where it makes a difference. The same goes for ATi.
 
The question is, which implementation is faster?
Once we know which one is faster while achieving the same IQ, then if devs still aren't programming for it, we know one company is shafting us.
 
fallguy said:
That's an NV-specific optimization, if memory serves, so it's likely it wouldn't work for ATi anyway.

Correct, it's an NV-specific OpenGL call. It would never run on the ATI cards even if they had the proper hardware (not without a patch from the dev or a special driver tweak).

Anyway, what's sad is that both are spending time trying to market "faults" with the other, when that extra time/effort could have been better used elsewhere...
 
Whole thing is amusing. So, neither is fully SM3.0 compliant, huh? Nice.

X1K fails no DCT tests. 7800 fails a whole bunch of DCT tests. Draw your own conclusion.
 
Apple740 said:
X1K fails no DCT tests. 7800 fails a whole bunch of DCT tests. Draw your own conclusion.

Did you run this yourself? I thought the article said they were in the process of testing the XL.
 
Maybe they should change their marketing slogans.

ATi: SM3.0 done more right, but not completely compliant.

NV: SM3.0 for longer than the competition, but slightly less compliant.
 
Has anyone noticed the lack of full 3.0 compliance in their games? Developers will surely work around these minor failings in both companies' parts, so like the VTF situation this is a real non-issue IMO.

And, yes, hopefully the next DirectX iteration puts a complete halt to this history of bifurcated shader models that really should be named after a particular IHV rather than being numbered.
 
this is funny cuz back in the day, when people posted "x800 xl or 6800gt?" threads, many nvidia people said 6800gt cuz it has SM3. hahahaha, i laugh now cuz i saved 100 bucks and got the x800 xl
 
I'm running the test on my 6800 GT right now. So far 6000 tests have passed and 24 failed on Pixel Shader 3.0. It looks like it could take a while, and it's Quake 4 time, so I'm gonna cut it short; I'll run it later if someone else doesn't do it.
 
For me the problem isn't really nvidia not fully supporting SM3.0, but basing their whole "the 6800 is maybe less performant, but it's more future-proof" argument on SM3.0... now somebody actually decided to test it, and they're not even compliant... this is just laughable.

Doing that is illegal in Europe; they could get sued big time...
 
I think both companies get waivers from MS for failed tests, since some of the problems are related to DCT itself... I could be wrong though :)
 
Jima13 said:
I think both companies get waivers from MS for failed tests, since some of the problems are related to DCT itself... I could be wrong though :)

You are wrong. ATi hasn't failed any DX9 test. Vertex texture fetch is optional; it's not in the DCT test. ATi is DX9 compliant. NV doesn't appear to be; it fails parts of the DCT test.

But as I said, I don't think it matters, unless it's proven to matter in a game, which it hasn't yet.
 
heh... how ironic. I like how there are more nvidia fans than ATI fans here too, but I don't blame them, because ATI fumbled the ball getting their R520 out the door.

I find it extremely ironic because there was one thing missing on the X1K series that developers could work around, and people were bitching about it nonstop. Now this happens and people make excuses and don't bitch about it at all when nvidia fails more than one test. That's humans for you...
 
pArTy said:
I find it extremely ironic because there was one thing missing on the X1K series that developers could work around, and people were bitching about it nonstop.

Again (since you didn't seem to read the thread), ATi didn't fail anything. They designed it the way they did. It's not broken, it didn't fail any test, and it's 100% DX9 compliant.
 
edit: i'm off topic with that
on topic: since both companies lack some 3.0 features, why don't we call it even and discuss performance/IQ? since it's impossible to discuss this stuff without flame wars breaking out

If you knew you were offtopic, you shouldn't have posted it in the first place. - p[H]
 
once again we now understand why the [H] does the reviews the way they do. look at the games, how they run, and tell us how fast they run and how good they look....

thanks kyle =D
 
emailthatguy said:
once again we now understand why the [H] does the reviews the way they do. look at the games, how they run, and tell us how fast they run and how good they look....

thanks kyle =D

Well, you can't say that the recent R520 review wasn't at least a little biased, given all the ATI bashing that was done (not saying that some of it wasn't warranted, considering the facts). I think there needs to be an article on this topic from a more reliable source than the Inquirer, though. How about it, Kyle?
 
I find this whole thing sucks. Not only did ATI try to pull a fast one, but now nVidia seems to be stiffing us (the consumers) as well. What's the next thing for us to do, a lawsuit? I for one will be sending letters out asking why my card does not fully support SM3.0 as advertised...
 
Oberon said:
Well, you can't say that the recent R520 review wasn't at least a little biased, given all the ATI bashing that was done (not saying that some of it wasn't warranted, considering the facts). I think there needs to be an article on this topic from a more reliable source than the Inquirer, though. How about it, Kyle?

Biased how? I'm glad that the XT wasn't reviewed. [H] should review it when it hits retail which should be in November. Should be.
 
[H]ard's review did seem a little biased to me too, in the way things were worded. (Again, I am not sure it was unwarranted, given how late to market they were and [H] thinking ATI had been using them for PR.) I still remember the ending, something along the lines of "ATI's card is ok, but if you're a real gamer then buy a 7800". I was like, wtf, didn't the review just say performance was about equal and ATI brought some new features to the table? I saw little phrases like this throughout the review that just seemed a little off base.
 
so it seems i am in a predicament.

i, the consumer, have the option to buy a 7800 series card (or two) from nvidia, and rip through current and upcoming games at speeds which challenge the gods themselves.

or, i can go the ATI route - but only after selling my firstborns for a couple grand...

of course, its not like my firstborns would really understand the importance of SM3 - hell, I don't even understand it that much!

well, i mean, HL2's HDR looks fine and dandy on any of the cards :rolleyes: ...

my point, as stated already by some people: as long as what i see on my screen looks good, it's all good.
 
This is too good for words, given the rampant nvidia !!!!!!ism on these boards and on the site in general. The R520 is SM3.0 compliant while the 7800 series doesn't quite measure up in that area. It's just interesting to see all the nvidia !!!!!!s change their tune from "hopefully the R520 somehow isn't SM3.0 compliant" to "nothing is SM3.0 compliant" (I mean, if nvidia can't get it right, how dare any other company have the foresight to do it right ;) ). The truth remains that per the originators of the SM3.0 spec, MS, the R520 is fully compliant while the 7800 is most definitely not even close.

In the end, as has been said, non-compliance doesn't mean a thing unless it somehow limits performance and/or IQ. It's just nice seeing the nvidia !!!!!!s eating crow (any !!!!!!s eating crow is nice, but given the annoyingness of the nvidia ones lately, they deserve it more than any others atm). Of course, this just means the !!!!!! fight will move on to the next battleground. It never ends, but it's always fun to watch.
 
phez said:
my point, as stated already by some people: as long as what i see on my screen looks good, it's all good.

Yup, that enhanced shimmering is all good ;)
 
The only issue I see with this is that of false advertising....

Seriously, as far as I'm concerned, if the card does the job, and does it well, then really there should be no problem. Can you see the lack of full SM3 and DX9c support in a game? I can't, and my 6600GT is "DX9c compliant" and "has SM3 support".

Personally, I don't see this as a big practical issue. But from where the industry stands at the moment, there will no doubt be a moral (if that's the right term to use) argument by the corporations and the f@nb0ys about the other's "shady deals" and improper advertising. It may reach the courts, but from a gamer's point of view, it shouldn't be too much of a problem.

That's my 2 cents.
 
4keatimj said:
The only issue I see with this is that of false advertising....

Yep, that it is, big time.

People gave ATI a hard time because of "SM 3.0 done right", and I think it is only fair that Nvidia gets some heat from this one, if it really is true.

So the Nvidia side has been bashing the ATI side for months now about their SM 3.0 support, just to find that their cards ain't any better (in advertising terms). Ironic is the word I'm looking for :D
 
does this thread even matter? a few days later, M$ can change the spec of SM3.0 and move all the "failed" tests to a newer version called SM3.01b, just like what ati calls its X8XX series' SM2.0b... no point, seriously. all we need to know is: if the game supports SM3.0, does my GF6xxx series run it? if it does, everyone is a winner; if not, lawsuit. happy now? same applies to ati: if a game uses the vertex texturing thing and has to run it in software mode or else SM3.0 mode gets disabled, then we're looking at a lot of pissed-off RED team; else, we see a lot of happy red riding hoods
 
The real-world tests tell all. SM3 done right or not, without the software to make the comparison it doesn't matter.

Furthermore, even if the X1K line of cards does perform on par with the 7800s, where the ball was dropped on ATI's behalf was with the Crossfire implementation. I mean, a cable bridge? I like my refresh rates, thanks, and if you do too, you'd go SLI so you weren't limited by a bridge cable. Furthermore, Crossfire's ability to run different cards simultaneously is moot, because the faster card automatically scales down to the speed of the slower card. So if you thought you'd get off buying a Master Card X1K while retaining your 800 series, you've cheated yourself out of money and performance on the X1K.

No, I am not a !!!!!!; I go with the hardware that performs best at the time I buy, be it green or red. Nvidia holds the crown, and until the XT launches they continue to do so.
 
By the time we need to care about SM3.0, X1800 and 7800 will be old news and they will have a new implementation.
 
kcthebrewer said:
You guys do know that it could just be the driver holding back full compliance?


Could be, but I hear the 6x00 cards also fail the same tests. Given the fact that the 6x00 has been out for a long time now... maybe...
 
Psylenced said:
The real-world tests tell all. SM3 done right or not, without the software to make the comparison it doesn't matter.

Furthermore, even if the X1K line of cards does perform on par with the 7800s, where the ball was dropped on ATI's behalf was with the Crossfire implementation. I mean, a cable bridge? I like my refresh rates, thanks, and if you do too, you'd go SLI so you weren't limited by a bridge cable. Furthermore, Crossfire's ability to run different cards simultaneously is moot, because the faster card automatically scales down to the speed of the slower card. So if you thought you'd get off buying a Master Card X1K while retaining your 800 series, you've cheated yourself out of money and performance on the X1K.

No, I am not a !!!!!!; I go with the hardware that performs best at the time I buy, be it green or red. Nvidia holds the crown, and until the XT launches they continue to do so.
The X1K isn't affected by the Crossfire resolution/refresh limits that the X800 has, as far as I know.

I find this thread funny. ATI had VTF missing, and it spawned several discussions and heated arguments. Now, with this, barely a simmer. Where is everybody?
 
I think there needs to be an article on this topic from a more reliable source than the Inquirer, though. How about it, Kyle?

Seconded! :D

Additionally, I would just like to say that the sooner everybody accepts Zoop as being the best game ever, the sooner we can all just relax and destroy our eyesight in peace :p
 