Is anyone SM3.0 / DX9.0c compliant?

chrisf6969

[H]F Junkie
Joined
Oct 27, 2003
Messages
9,013
Just a funny turn of events:

First ATI tries to ignore SM3.0 and say how unimportant it is.... yadda yadda yadda...

Then they tout SM3.0 done right.

Then Nvidia pisses in ATI's cereal with SM3.0 NOT done right, b/c it's missing Vertex Texturing.

Now ATI is pissing right back saying Nvidia didn't implement SM3.0 right either.

Ok, so now it looks like NEITHER of them did it right? SHIT, now what do I buy..... Matrox's Paraphalegic? :)

If you feel you need a warning for no flame war, you're probably inviting one. Keep it clean and we'll keep the discussion. - p[H]
 

fallguy

2[H]4U
Joined
Sep 8, 2001
Messages
3,964
Until a game comes out that uses Vertex Texturing, it's pointless to try and say ATi screwed up. Unified is going to be used, from what I have read anyways. Same goes for NV.
 

chrisf6969

[H]F Junkie
Joined
Oct 27, 2003
Messages
9,013
I hope WGF & unified pipeline/shaders whatever will eliminate all the variations like SM1.1, 1.2, 1.3, 1.4, etc...

But there will no doubt be variations b/c of different HW & different drivers.

HOPEFULLY, Microsoft's restrictions for hardware compliance will be strict enough to make EVERYTHING perfectly compatible, but loose enough to give hardware makers the flexibility to come up with their own creative ways to meet that compatibility.
 

dderidex

Supreme [H]ardness
Joined
Oct 31, 2001
Messages
6,328
fallguy said:
Until a game comes out that uses Vertex Texturing, it's pointless to try and say ATi screwed up. Unified is going to be used, from what I have read anyways. Same goes for NV.
As pointed out basically every time it comes up, that's what the IL-2 series uses for its water effects.

Granted, that's only 3 games. But...still....

(Whole thing is amusing. So, neither is fully SM3.0 compliant, huh? Nice.)
 

chrisf6969

[H]F Junkie
Joined
Oct 27, 2003
Messages
9,013
speaking of "IL"

I heard Infinium Labs' console will be SM3.0 compliant when it comes out next week! ;) j/k

edit: oh and Bitboys' video card too! LOL
 

fallguy

2[H]4U
Joined
Sep 8, 2001
Messages
3,964
dderidex said:
As pointed out basically every time it comes up, that's what the IL-2 series uses for its water effects.

Granted, that's only 3 games. But...still....

(Whole thing is amusing. So, neither is fully SM3.0 compliant, huh? Nice.)

That's an NV specific optimization if memory serves. So it's likely it wouldn't work for ATi anyways. Even so, vertex textures can be done, just via software. So it's moot.

The point being, none of this matters. It doesn't matter to me that NV fails these tests, unless there is a game, or games, that it makes a difference in. The same goes for ATi.
 

{NG}Fidel

Supreme [H]ardness
Joined
Jan 17, 2005
Messages
6,286
The question is, which implementation is faster?
Once we know which one is faster while achieving the same IQ, then even if devs aren't programming for it, we still know one company is shafting us.
 

Jbirney

Gawd
Joined
Nov 14, 2003
Messages
528
fallguy said:
That's an NV specific optimization if memory serves. So it's likely it wouldn't work for ATi anyways.

Correct, it's an NV-specific OpenGL call. It would never run on the ATI cards even if they had the proper hardware (not without a patch from the dev or a special driver tweak).

Anyways, what's sad is both are spending time trying to market "faults" with the other where that extra time/effort could have been better used elsewhere...
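(Side note, and a sketch only, not IL-2's actual code path: in plain OpenGL 2.0 / ARB_vertex_shader, the generic way a game would even ask whether vertex texturing exists is a one-line caps query, roughly like this, assuming a GL context is already up:)

```cpp
// Rough sketch only (not IL-2's actual code): the generic GL 2.0 /
// ARB_vertex_shader query for how many texture image units the vertex
// shader stage exposes. Assumes a GL rendering context is already
// current on this thread (created via wgl/GLUT/SDL beforehand).
#include <windows.h>   // must precede GL/gl.h on Windows
#include <GL/gl.h>

#ifndef GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS_ARB
#define GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS_ARB 0x8B4C  // value from the ARB_vertex_shader spec
#endif

// A card reporting 0 here can still run SM3.0-class pixel shaders; it just
// can't sample textures from the vertex shader, so a displacement-mapped
// water effect has to be skipped or done some other way.
GLint VertexTextureUnits()
{
    GLint units = 0;
    glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS_ARB, &units);
    return units;
}
```

A card that answers 0 there isn't "broken", it just can't do that particular trick in hardware.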
 

Apple740

Gawd
Joined
Oct 21, 2004
Messages
641
Whole thing is amusing. So, neither is fully SM3.0 compliant, huh? Nice.

X1K fails no DCT tests. 7800 fails a whole bunch of DCT tests. Draw your own conclusion.
 
Joined
Jul 7, 2005
Messages
839
Apple740 said:
X1K fails no DCT tests. 7800 fails a whole bunch of DCT tests. Draw your own conclusion.

Did you run this yourself? I thought the article said they were in the process of testing the XL.
 

Un4given

Gawd
Joined
Nov 8, 2003
Messages
779
Maybe they should change their marketing slogans.

ATi: SM3.0 done more right, but not completely compliant.

NV: SM3.0 for longer than the competition, but slightly less compliant.
 

John Reynolds

Limp Gawd
Joined
Aug 13, 2003
Messages
203
Anyone notice the lack of full 3.0 compliance in their games? Developers will surely work around these minor failings for the parts of both companies, so, like the VTF situation, this is a real 'non' issue IMO.

And, yes, hopefully the next DirectX iteration puts a complete halt to this history of bifurcated shader models that really should be named after a particular IHV rather than being numbered.
 

tvdang7

Supreme [H]ardness
Joined
Jun 8, 2005
Messages
4,302
this is funny cuz back in the days when ppl posted the "x800 xl or the 6800gt" threads, many nvidia people said 6800gt cuz it has SM3. hahahaha, i laugh now cuz i saved 100 bucks and got the x800 xl
 
Joined
Jul 7, 2005
Messages
839
I'm running the test on my 6800 GT right now. So far 6,000 tests have passed and 24 have failed on Pixel Shader 3.0. It looks like it could take a while, and it's Quake 4 time, so I'm gonna cut it short; I'll run it later if someone else doesn't do it.
 

LyCoS

Limp Gawd
Joined
Aug 15, 2004
Messages
205
For me the problem isn't really nvidia not supporting SM3.0, but basing their whole "6800 is maybe less performant, but is more future proof" argument on SM3.0... now somebody actually decided to test it, and they're not even compliant.... this is just laughable.

Doing that is illegal in Europe, they could get sued big time...
 

Jima13

Gawd
Joined
Feb 19, 2002
Messages
878
I think both companies get waivers from MS for failed tests since some of the problems are related to DCT itself.....................I could be wrong though :)
 

fallguy

2[H]4U
Joined
Sep 8, 2001
Messages
3,964
Jima13 said:
I think both companies get waivers from MS for failed tests since some of the problems are related to DCT itself.....................I could be wrong though :)

You are wrong. ATi hasn't failed any DX9 test. Vertex texture fetch is optional, it's not in the DCT test. ATi is DX9 compliant. NV doesn't appear to be. It fails parts of the DCT test.

But as I said, I don't think it matters. Unless it's proven to matter in a game, which it hasn't yet.
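(To put "optional" in concrete terms, here's a minimal sketch, assuming d3d9.h from the DX9.0c SDK, of how a D3D9 app asks the runtime whether any vertex texture format is usable. The supported-format list is left up to the vendor, so reporting none is legal, which is why, as said above, it isn't something the DCT test can flag:)

```cpp
// Minimal sketch, assuming d3d9.h from the DX9.0c SDK (link d3d9.lib).
// In D3D9, vertex texture fetch support is advertised per texture format
// through CheckDeviceFormat with D3DUSAGE_QUERY_VERTEXTEXTURE; the spec
// does not require any particular format to be supported.
#include <d3d9.h>
#include <cstdio>

bool SupportsVertexTexture(IDirect3D9* d3d, D3DFORMAT fmt)
{
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,               // display mode format
                                        D3DUSAGE_QUERY_VERTEXTEXTURE,  // "can the VS sample this?"
                                        D3DRTYPE_TEXTURE,
                                        fmt);
    return SUCCEEDED(hr);
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // NV4x/G70 parts historically answer yes for these two float formats;
    // a part that doesn't fetch vertex textures in hardware answers no to
    // everything and is still SM3.0 compliant on paper.
    std::printf("R32F:          %s\n", SupportsVertexTexture(d3d, D3DFMT_R32F) ? "yes" : "no");
    std::printf("A32B32G32R32F: %s\n", SupportsVertexTexture(d3d, D3DFMT_A32B32G32R32F) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```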
 

pArTy

Limp Gawd
Joined
Jul 21, 2004
Messages
433
heh... How ironic. I like how there's more nvidia fans than ATI also, but I don't blame them because ATI has fumbled the ball with getting their R520 out the door.

I find it extremely ironic because there was one thing that failed on the X1K series and developers could work around it, but people were completely bitching about it. Now this, and people make excuses and don't bitch about it at all when nvidia fails more than one test. That's humans for you...
 

fallguy

2[H]4U
Joined
Sep 8, 2001
Messages
3,964
pArTy said:
I find it extremely ironic because there was one thing that failed on the X1K series and developers could work around it, but people were completely bitching about it.

Again (since you didn't seem to read the thread), ATi didn't fail anything. They designed it the way they did. It's not broken, it didn't fail any test, it's 100% DX9 compliant.
 

Digital Viper-X-

[H]F Junkie
Joined
Dec 9, 2000
Messages
14,764
edit: i'm off topic with that
on topic: since both companies lack some 3.0 features, why don't we call it even and discuss performance/IQ? since it's impossible to discuss this stuff without flame wars breaking out

If you knew you were offtopic, you shouldn't have posted it in the first place. - p[H]
 

emailthatguy

Limp Gawd
Joined
May 4, 2002
Messages
503
once again we now understand why the [H] does the reviews the way they do. look at the games, how they run, and tell us how fast they run and how good they look....

thanks kyle =D
 

Oberon

n00b
Joined
Jun 23, 2004
Messages
40
emailthatguy said:
once again we now understand why the [H] does the reviews the way they do. look at the games, how they run, and tell us how fast they run and how good they look....

thanks kyle =D

Well, you can't say that the recent R520 review wasn't at least a little biased, given all the ATI bashing that was done (not saying that some of it wasn't warranted considering the facts). I think there needs to be an article on this topic from a more reliable source than the Inquirer, though. How about it, Kyle?
 

Zardoz

2[H]4U
Joined
Aug 27, 2000
Messages
3,251
I find the whole thing sucks. Not only did ATI try to pull a fast one, but now nVidia seems to be stiffing us (the consumers) as well. What's the next thing for us to do, a lawsuit? I for one will be sending letters out to find out why my card does not support SM3.0 as advertised...
 

KENNYB

2[H]4U
Joined
Jul 26, 2004
Messages
3,147
Oberon said:
Well, you can't say that the recent R520 review wasn't at least a little biased, given all the ATI bashing that was done (not saying that some of it wasn't warranted considering the facts). I think there needs to be an article on this topic from a more reliable source than the Inquirer, though. How about it, Kyle?

Biased how? I'm glad that the XT wasn't reviewed. [H] should review it when it hits retail, which should be in November. Should be.
 

ClearM4

2[H]4U
Joined
Oct 5, 2005
Messages
2,320
[H]ard's review did seem a little biased to me too in the way things were worded. (Again, I am not sure it was unwarranted, given how late to market they were and [H] thinking ATI had been using them for PR.) I still remember the ending. Something along the lines of "ATI's card is ok, but if you're a real gamer then buy a 7800". I was like wtf, didn't the review just say performance was about equal and ATI brought some new features to the table? I saw little phrases like this throughout the review that just seemed a little off base.
 

phez

n00b
Joined
Apr 17, 2005
Messages
27
so it seems i am in a predicament.

i, the consumer, have the option to buy a 7800 series card (or two) from nvidia, and rip through current and upcoming games at speeds which challenge the gods themselves.

or, i can go the ATI route - but only after selling my firstborns for a couple grand...

of course, it's not like my firstborns would really understand the importance of SM3 - hell, I don't even understand it that much!

well, i mean, HL2's HDR looks fine and dandy on any of the cards :rolleyes: ...

my point, as reiterated already by some people: as long as what i see on my screen looks good, it's all good.
 

OldBoy

Limp Gawd
Joined
Oct 3, 2005
Messages
308
This is too good for words given the rampant nvidia !!!!!!ism on these boards and on the site in general. The R520 is SM3.0 compliant while the 7800 series doesn't quite measure up in that area. It's just interesting to see all the nvidia !!!!!!s change their tune from hoping that the R520 somehow isn't SM3.0 compliant to "nothing is SM3.0 compliant" (I mean, if nvidia can't get it right, how dare any other company have the foresight to do it right ;) ). The truth remains that per the originators of the SM3.0 spec, MS, the R520 is fully compliant while the 7800 is most definitely not even close.

In the end, as has been said, non-compliance doesn't mean a thing unless it somehow limits performance and/or IQ. It's just nice seeing the nvidia !!!!!!s eating crow (any !!!!!!s eating crow is nice, but given the annoyingness of the nvidia ones lately, they deserve it more than any others atm). Of course, this just means the !!!!!! fight will move on to the next battleground. It never ends, but it's always fun to watch.
 

OldBoy

Limp Gawd
Joined
Oct 3, 2005
Messages
308
phez said:
my point, as reiterated already by some people: as long as what i see on my screen looks good, it's all good.

Yup, that enhanced shimmering is all good ;)
 

4keatimj

[H]ard|Gawd
Joined
Oct 16, 2003
Messages
1,327
The only issue I see with this is that of false advertising....

Seriously, as far as I'm concerned, if the card does the job, and does it well, then really there should be no problem. Can you see the lack of SM3 and DX9c support in a game? I can't, and my 6600GT is "DX9c compliant" and "has SM3 support".

Personally, I don't see this as a big practical issue. But from where the industry stands at the moment, there will no doubt be a moral (if that's the right term to use) argument by the corporations and the f@nb0ys about the other side's "shady deals" and improper advertising. It may reach the courts, but from a gamer's point of view, it shouldn't be too much of a problem.

That's my 2 cents.
 

Alarmer

n00b
Joined
Jul 3, 2003
Messages
31
4keatimj said:
The only issue I see with this is that of false advertising....

Yep, that it is, big time.

People gave ATI a hard time because of "SM 3.0 done right", and I think it is only fair that Nvidia gets some heat from this one, if it really is true.

So the Nvidia side has been bashing the ATI side for months now about their SM 3.0 support, just to find that their cards ain't any better (in advertising terms). Ironic is the word I am looking for :D
 

allenpan

[H]ard|Gawd
Joined
Jul 27, 2005
Messages
1,724
does this thread even matter? a few days later, M$ can change the spec of SM3.0 and move all the "failed" tests to a newer version called SM3.01b, just like what ati calls its X8XX series' SM2.0b.... no point, seriously. all we need to know is: if the game supports SM3.0, does my GF6xxx series run it? if it does, everyone is a winner; if not, lawsuit. happy now? same applies to ati: if a game uses that vertex texturing thing and "has to" run it in software mode or else SM3.0 mode gets disabled, then we're looking at a lot of pissed RED team; else, we see a lot of happy red riding hoods
 

Psylenced

n00b
Joined
Jun 27, 2005
Messages
60
The real world tests tell all. SM3 done right or not, without the software to make the comparison it doesn't matter.

Furthermore, even if the X1K line of cards does perform on par with the 7800s, where the ball was dropped on ATI's behalf was with the Crossfire implementation. I mean, a cable bridge? I like my refresh rates, thanks, and if you do too you'd go SLI so you weren't limited by a bridge cable. Furthermore, Crossfire's ability to run different cards simultaneously is moot, because the faster card automatically scales down to the speed of the slower card. So if you thought you'd get off buying a Master Card X1K card while retaining your 800 series, you've cheated yourself out of money and performance on the X1K.

No I am not a !!!!!!, I go with the hardware that performs best at the time I buy, be it green or red. Nvidia holds the crown, and until the XT launches they continue to do so.
 

Devnull

[H]F Junkie
Joined
Apr 21, 2000
Messages
8,486
By the time we need to care about SM3.0, X1800 and 7800 will be old news and they will have a new implementation.
 

Jbirney

Gawd
Joined
Nov 14, 2003
Messages
528
kcthebrewer said:
You guys do know that it could just be the driver holding back full compliance.


Could be, but I hear the 6x00 cards also fail the same tests. Given the fact that the 6x00 has been out for a long time now... maybe...
 

Sc4freak

Limp Gawd
Joined
Dec 23, 2004
Messages
320
Psylenced said:
The real world tests tell all. SM3 done right or not, without the software to make the comparison it doesn't matter.

Furthermore, even if the X1K line of cards does perform on par with the 7800s, where the ball was dropped on ATI's behalf was with the Crossfire implementation. I mean, a cable bridge? I like my refresh rates, thanks, and if you do too you'd go SLI so you weren't limited by a bridge cable. Furthermore, Crossfire's ability to run different cards simultaneously is moot, because the faster card automatically scales down to the speed of the slower card. So if you thought you'd get off buying a Master Card X1K card while retaining your 800 series, you've cheated yourself out of money and performance on the X1K.

No I am not a !!!!!!, I go with the hardware that performs best at the time I buy, be it green or red. Nvidia holds the crown, and until the XT launches they continue to do so.
The X1K isn't affected by the Crossfire resolution/refresh limits that the X800 has, as far as I know.

I find this thread funny. ATI had VTF missing, and it spawned several discussions and heated arguments. Now, with this, barely a simmer. Where is everybody?
 

Slartibartfast

Supreme [H]ardness
Joined
Sep 25, 2004
Messages
7,275
I think there needs to be an article on this topic from a more reliable source than the Inquirer, though. How about it, Kyle?

Seconded! :D

Additionally, I would just like to say that the sooner everybody accepts Zoop as being the best game ever, the sooner we can all just relax and destroy our eyesight in peace :p
 