ATi and Valve alliance bad for nVidia?

Dyne

I have started this topic as a constructive alternative to the infamous "Valve Sucks" thread, which is nothing but a flame war...Hopefully we can accomplish something here rather than flaming users who prefer one camp over another...

USELESS POSTS / FLAMES WILL BE DELETED!!
ATI and Valve have been known to bundle their games together since ATI's 9600XT and 9800XT cards. No one had a problem with that until nVidia users with GeForce FX Series cards realized that they were NOT seeing DirectX 9 features. The game was being run in DirectX 8.1 compatibility mode, which eliminates most of the Pixel Shader 2.0 features the FX Series had been touting since the (horrible) 5800 Ultra.

All other games show a virtual dead heat between the 5900 Ultra and the 9800 Pro, except in the case of the Source Engine. Here, nVidia users are seeing less than half the frame rates of their competitors, which, quite frankly, has many of us pissed off to say the least.

Valve originally had a Mixed Mode for the NV3x which used FP32 but dropped to FP16 for those shaders that did not need full precision. Somewhere in the various delays, this mode was never completed. Therefore Valve wasted time developing a mode to help the NV3x series, and it became a moot point.

Why did Valve waste time developing a mode that essentially got dropped? Why does masking the device ID to appear as a Radeon card fix DX9 mode on the FX Series, while the default device ID results in a ton of glitches? Why didn't Valve implement partial precision for the FX? Is it a conspiracy? Are ATi and Valve trying to eliminate nVidia competition? Has the FX Series been deemed useless? Maybe we will get some answers from either ATi or Valve concerning the piss-poor FX Series performance as well.
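For what it's worth, here is a rough sketch of why the device ID trick could even matter. Nobody outside Valve knows how Source actually picks its render path, so this is NOT their code; it just shows the stock Direct3D 9 mechanism an engine could use to branch on the adapter's vendor/device ID, which is why a third-party tool that spoofs the reported ID can flip an FX onto the full DX9 path. The vendor ID values are the real PCI ones; the path names and the policy are made up for illustration.

// Sketch only -- assumed policy, not Valve's actual path selection.
#include <windows.h>
#include <d3d9.h>

enum RenderPath { PATH_DX81, PATH_DX9_FULL };

RenderPath PickRenderPath(IDirect3D9* d3d, UINT adapter)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(adapter, 0, &id)))
        return PATH_DX81;                    // be conservative if the query fails

    const DWORD VENDOR_ATI    = 0x1002;      // PCI vendor IDs
    const DWORD VENDOR_NVIDIA = 0x10DE;

    if (id.VendorId == VENDOR_ATI)           // R300-class parts: full DX9 path
        return PATH_DX9_FULL;
    if (id.VendorId == VENDOR_NVIDIA)        // NV3x would be filtered by DeviceId here
        return PATH_DX81;

    return PATH_DX9_FULL;                    // everyone else: trust the caps bits
}

If the path really is chosen off a table like that instead of off the caps bits, then an FX reporting an ATI device ID lands in the DX9 branch, glitches and all, which is exactly what people are describing.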
 
Dyne said:
I have started this topic as a constructive alternative to the infamous "Valve Sucks" thread, which is nothing but a flame war...Hopefully we can accomplish something here rather than flaming users who prefer one camp over another...

USELESS POSTS / FLAMES WILL BE DELETED!!

Oh really? You suddenly a mod?

And just as many good points have been brought up in that thread too.
 
Dyne said:
All other games show a virtual dead heat between the 5900 Ultra and the 9800 Pro, except in the case of the Source Engine. Here, nVidia users are seeing less than half the frame rates of their competitors, which, quite frankly, has many of us pissed off to say the least.

Oh and uh...No that's not the case
 
Show me a game where the 9800 Pro KILLS the 5900...Besides HL2 and the Source Engine...Why is it that all OTHER games have the cards relatively close in performance?
 
Wasn't there a [H] front page article about Valve programming against nVidia cards, and a workaround for it?

I don't see it as bad for Nvidia.
In fact Nvidia pulled the same crap with FarCry. Who remembers all the ATI owners moaning when the game first came out because it wouldn't even run?
 
Dyne said:
Show me a game where the 9800 Pro KILLS the 5900...Besides HL2 and the Source Engine...Why is it that all OTHER games have the cards relatively close in performance?

Farcry

The end!
 
Netrat33 said:
Oh really? You suddenly a mod?
LOL
---

Who cares anyways? There's more Doom3 engine games coming out than Source engine games. Geez, there's probably more CryTek engine games coming out than Source engine games too.
 
hey wait - there is that vampire one that looks about as impressive as quake 3
 
pxc said:
LOL
---

Who cares anyways? There's more Doom3 engine games coming out than Source engine games. Geez, there's probably more CryTek engine games coming out than Source engine games too.

Doom3 engines more than source: How do you know? At least Source has 1 already (granted, not as flashy, but still), and I'm sure it has more to follow too. NOT saying there will be no Doom3 engine games. Just saying that's a pretty bold statement with no lineup.

I'd like to see more crytek engine games though. See what people can do with it.
 
Netrat33 said:
Doom3 engines more than source: How do you know?
How do you not know?

Q4, SOF3, COD2, unnamed Splash Damage game (probably RTCW2)
 
pxc said:
How do you not know?

Q4, SOF3, COD2, unnamed Splash Damage game (probably RTCW2)

*laughing* I'm not saying! I'm askin! :D

Quake 4 yes,
COD2, has that even been announced?

Splash damage game?
 
Netrat33 said:
*laughing* I'm not saying! I'm askin! :D

Quake 4 yes,
COD2, has that even been announced?

Splash damage game?

COD2 has been announced but nothing more than that.
 
Well, the main thing that is bugging me is the "magic" ATI device ID fix for the DX9 path on GFFX cards....Can anyone with, say, a Matrox card see if there are errors that are fixed the same way?
 
Dyne said:
Well, the main thing that is bugging me is the "magic" ATI device ID fix for the DX9 path on GFFX cards....Can anyone with, say, a Matrox card see if there are errors that are fixed the same way?

I don't think Matrox cards are made for games anyway.
 
Should someone about to buy a 6800 nVidia card be worried about this? I'm still not going to buy an ATi card right now (nothing they have that I want, plus I don't like their shady market tactics)..
 
DragonGX said:
Should someone about to buy a 6800 nVidia card be worried about this? I'm still not going to buy an ATi card right now (nothing they have that I want, plus I don't like their shady market tactics)..

No, you're fine. Only the 5700 and lower have an issue.
 
I think the main thing here is that you can get better IQ and almost as good performance just by tricking HL2 into thinking you have an ATI card. Also, ATI having full access to test and benchmark their cards while nVidia couldn't is another issue. I don't believe that is common among game developers.
 
obs said:
I think the main thing here is that you can get better IQ and almost as good performance just by tricking HL2 into thinking you have an ATI card. Also, ATI having full access to test and benchmark their cards while nVidia couldn't is another issue. I don't believe that is common among game developers.

Yes, this is exactly the kind of thing I think we deserve to know! I mean, how unfair and just morally WRONG is that? An FX masked as an ATI card works fine, but an FX as an FX has flaws....I believe something is not right...
 
DragonGX said:
Should someone about to buy a 6800 nVidia card be worried about this? I'm still not going to buy an ATi card right now (nothing they have that I want, plus I don't like their shady market tactics)..

I run at 1600x1200 8xAA/2xAF fine, and my PC ain't all that either.

If it gets to a really busy bit and you can't handle the slight fps loss, then go to 1200x1000 and 4xAF - plays great then.
 
DragonGX said:
Should someone about to buy a 6800 nVidia card be worried about this? I'm still not going to buy an ATi card right now (nothing they have that I want, plus I don't like their shady market tactics)..

The 6800 GT is faster than the x800 Pro in Doom 3, Far Cry and HL2. And ATI doesn't sell a product comparable to the 6800 (faster than 9800 XT, slower than x800 Pro, but cheaper than either).

For instance:

http://graphics.tomshardware.com/graphic/20041004/vga_charts-08.html
http://graphics.tomshardware.com/graphic/20041004/vga_charts-09.html
http://www.hardocp.com/article.html?art=NjkyLDI=
http://www.hardocp.com/article.html?art=NjkyLDM=

So I wouldn't worry about it.
 
DragonGX said:
Should someone about to buy a 6800 nVidia card be worried about this?
Naw, nothing to worry about. Even a 6600GT plays HL2 well at 1600x1200.

I'm not saying to buy one card over another one. Everyone should do research before they buy.
 
Dyne said:
ATI and Valve have been known to bundle their games together since ATI's 9600XT and 9800XT cards.

You say that like Valve makes tons of games; the only two games I recall are HL1 and HL2. I believe CS came from the user community, as did DOD (not too sure about DOD).
 
Dyne said:
All other games show a virtual dead heat between the 5900 Ultra and the 9800 Pro, except in the case of the Source Engine. Here, nVidia users are seeing less than half the frame rates of their competitors, which, quite frankly, has many of us pissed off to say the least.
Not even close to the truth. Just about any DX9 game shows the 9800Pro ahead........ Nice try though. :)

Dyne said:
Valve originally had a Mixed Mode for the NV3x which used FP32 but dropped to FP16 for those shaders that did not need full precision. Somewhere in the various delays, this mode was never completed. Therefore Valve wasted time developing a mode to help the NV3x series, and it became a moot point.

Why did Valve waste time developing a mode that essentially got dropped? Why does masking the device ID to appear as a Radeon card fix DX9 mode on the FX Series, while the default device ID results in a ton of glitches? Why didn't Valve implement partial precision for the FX? Is it a conspiracy? Are ATi and Valve trying to eliminate nVidia competition? Has the FX Series been deemed useless? Maybe we will get some answers from either ATi or Valve concerning the piss-poor FX Series performance as well.
It came down to a cost/benefit issue: spend an unknown amount of extra time fixing the FX shortcomings for 2.55 percent of their users, or just release the game. We all know which won!
 
R1ckCa1n said:
Not even close to the truth. Just about any DX9 game shows the 9800Pro ahead........ Nice try though. :)

It came down to a cost/benefit issue: spend an unknown amount of extra time fixing the FX shortcomings for 2.55 percent of their users, or just release the game. We all know which won!
Spend an unknown amount of time to accomplish something that a couple of people did with third-party tools? They could have added the option for DX9 and run the same path as the 9800 Pro while using 16-bit shaders (roughly what the compile sketch below shows).

Although I do agree, Valve's motto should be "We cater to the majority." One of these days I might get Steam to work right.
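Just to pin down what "16-bit shaders" would mean in practice: in DX9 HLSL you can either declare values as half instead of float, or compile with the stock D3DX partial-precision flag so the NV3x can run the same ps_2_0 shaders at FP16. This is not Valve's code, just a sketch of the standard D3DXCompileShader call; the "main" entry point name is an assumption.

// Sketch only: compile the same ps_2_0 shader with or without _pp hints.
#include <windows.h>
#include <cstring>
#include <d3dx9.h>
#pragma comment(lib, "d3dx9.lib")

// hlslSrc is whatever shader source the engine already uses on the 9800 Pro.
HRESULT CompilePixelShader(const char* hlslSrc, bool fp16, ID3DXBuffer** byteCode)
{
    // D3DXSHADER_PARTIALPRECISION asks the compiler to emit partial-precision
    // (_pp) instructions, which NV3x hardware can execute at FP16 speed.
    DWORD flags = fp16 ? D3DXSHADER_PARTIALPRECISION : 0;
    ID3DXBuffer* errors = NULL;

    HRESULT hr = D3DXCompileShader(
        hlslSrc, (UINT)strlen(hlslSrc),
        NULL, NULL,            // no #defines, no custom include handler
        "main", "ps_2_0",      // entry point (assumed) and shader profile
        flags,
        byteCode, &errors, NULL);

    if (errors) errors->Release();
    return hr;
}

Whether that would actually have been fast enough on the FX is another question (the long post further down says it wasn't), but the mechanism itself is a one-flag change, not a rewrite.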
 
No, it's just like nVidia working with devs to get the flashy TWIMTBP logo; in fact, it's not nearly as bad since you don't get an ATI splash screen.
And you fail to mention how the FX series FSAA is nowhere near the level of 9500+ FSAA.
4x FX AA = 2x 9500+ AA
 
Well here we go again....

The question no one is asking and everyone is overlooking is the premise. How does this "hurt" Nvidia? It seems to me this hurts Nvidia *users* a lot more than Nvidia *the company*. Why? Because all the POS FX5xxx cards are at EOL, more or less. I doubt HL2 is hurting Nvidia sales. In fact, this problem may actually increase Nvidia sales. Why? Because people tend to be unusually loyal to either Nvidia or ATI. If I were an FX5xxx owner who really wanted great DX9 performance out of HL2, what do you think I'd do? That's right, I'd go and buy *another* new Nvidia card.

Oh yeah, and by the way, if you haven't heard me say this for the thousandth time yet, I own an Nvidia card (not an FX5xxx obviously) and I'm happy with it.
 
I'm not quite sure why we are beating this horse to death again in a whole new thread when there are many others already out there.

However, I will clarify the salient point once again..

First, if you run the various hacks to get PP DX9.0 in HL2 on an FX5800-FX5950 card you WILL be running slower than you were under DX8.1. Period. Proven already on many threads by people who ACTUALLY tested it. Forcing 9.0 and the ATI path slows you WAY down. Hacking in 100% PP brings your frames up a significant amount, enough to make the game playable on an enthusiast's PC, but NOT enough to make the game playable on an average home user's PC, and NOT as high as you had with DX8.1.

So all Valve probably did was test the mixed precision code and DX9.0 on a "normal" PC (i.e., 256MB RAM, 1.8GHz P4, built-in sound, etc.) with an FX5800+ card.... And of course found the game unplayable. Then they tried it in DX8.1 mode on the same PC and found the game playable at 1024x768 (which I think most would agree is the minimum resolution a company should shoot for), and so they used DX8.1.


Basically all I am saying is there is a simple explanation for why Valve would choose 8.1 over 9.0. I am not defending their programming abilities, since they maybe could have programmed the game better in general so 9.0 would work on an FX series card. However, I think a conspiracy against Nvidia is a much more complex explanation, and therefore far less likely, than a simple decision made based on testing of the near-final product on a "normal" PC.
 
What's the point of this thread? Nothing useful for me particularly... but what's the point except the ATI/Valve alliance, and I knew from reading the title that this is already gonna be a flame war part 2 or something...
 
Please God make it stop :eek: Owning both an Nvidia card and an ATI card, I can't tell the difference :p They both play games, both cost $$$ too much, and both cause more problems than the Civil War and World War 2 combined. (I guess we will have to get used to this with the holidays upon us and more children posting :D)
The [H]ard [C]hildcare Centre
 
I don't blame ATI for anything, I blame Valve for being money hungry. Before HL2 came out I said to myself that this is the one game everyone needs to own without trying to pirate it or whatever, but after seeing how money hungry they are, taking dirty money and making another 1/4 of gamers feel like crap, I couldn't care less for their game now, and I hope everyone that pirated the game doesn't feel guilty.
 
HeavenX said:
I don't blame ATI for anything, I blame Valve for being money hungry. Before HL2 came out I said to myself that this is the one game everyone needs to own without trying to pirate it or whatever, but after seeing how money hungry they are, taking dirty money and making another 1/4 of gamers feel like crap, I couldn't care less for their game now, and I hope everyone that pirated the game doesn't feel guilty.
Is the 1/4 of gamers the nvidia !!!!!!s, or uninformed people who bought the FX?
Even using FP16 it still is slow.
The DX8.1 mode isn't that bad either... still looks excellent with my 8500 64MB with details maxed + 16x AF.
Plays well for the most part also, a few slowdowns, but changing the texture detail to medium fixes it.
 
I know I should not respond...but...

Dyne said:
ATI and Valve have been known to bundle their games together since ATI's 9600XT and 9800XT cards. No one had a problem with that until nVidia users with GeForce FX Series cards realized that they were NOT seeing DirectX 9 features.


Valve made this quite clear at Shader Day LAST year: even after adding a mixed mode path, the FX still was faster in the DX8.1 code path:

http://www.anandtech.com/video/showdoc.aspx?i=1863&p=5


Dyne said:
The game was being run in DirectX 8.1 compatibility mode, which eliminates most of the Pixel Shader 2.0 features the FX Series had been touting since the (horrible) 5800 Ultra.

Valve made the choice that this gave the most bang for the buck for FX users.


Dyne said:
All other games show a virtual dead heat between the 5900 Ultra and the 9800 Pro, except in the case of the Source Engine. Here, nVidia users are seeing less than half the frame rates of their competitors, which, quite frankly, has many of us pissed off to say the least.

All of the other games that use shaders are part of NV's TWIMTBP program. HL2 is not.


Dyne said:
Valve originally had a Mixed Mode for the NV3x which used FP32 but dropped to FP16 for those shaders that did not need full precision. Somewhere in the various delays, this mode was never completed. Therefore Valve wasted time developing a mode to help the NV3x series, and it became a moot point. Why did Valve waste time developing a mode that essentially got dropped?

id spent over a YEAR developing the NV30 path in Doom3. It was dropped after NV shader replacement/driver optimizations brought the FX cards up to speed. I don't see you complaining about that, so what's the point here?

Dyne said:
Why does masking the device ID to appear as a Radeon card fix DX9 mode on the FX Series, while the default device ID results in a ton of glitches?

Maybe the FX has driver issues with HL2. There was a banding issue in FarCry with the FX cards before the 1.3 patch (or the 1.1 patch, I forget which now). The fix for FX users was to use the same device ID trick that FX users use in HL2, namely use the ATI device ID. When they did that, the banding went away.

Remember this: http://www.nvnews.net/vbulletin/showthread.php?t=30854&highlight=farcry

Dyne said:
Why didn't Valve implement partial precision for the FX?

They tried. It still was much slower than the ATI parts and slower than the DX8.1 path. Now you have about 18% of the total HL2 players using FX cards. However, most of them are using either the 5200 or the 5600 series of FX cards, which are still TOO SLOW using mixed precision. Thus only the people with a 5700 or higher can run it in mixed mode. That results in about 4.8% of the total user base. Now you're Gabe. You still have to implement HDR. Still have to add SM3.0. Still have to add ATI's 3Dc. What are you going to do? Go back and spend time to add a mixed mode path that only helps 5% of your user base and is still SLOWER than the default?


Dyne said:
Is it a conspiracy? Are ATi and Valve trying to eliminate nVidia competition?

Good grief. Use your head. If ATI/Valve wanted to pull a fast one, then why would they make the DX8.1 path the default for the FX? I mean, if you're gonna screw NV, force all FX users to use the full DX9 path. And you know just how slow that will make the FX cards run....


Dyne said:
Has the FX Series been deemed useless?

Far from it. It's a valid card that has some worth. It just requires special cases: either NV optimizing their drivers (a la Doom3) or using mixed mode everywhere. For the last two years developers have had to bend over to support the mixed mode just to get similar frame rates vs. ATI. These developers have been in NV's TWIMTBP program. Here is a case of one developer that's not. And as I pointed out, they gave FX users the best performance for the IQ...

Dyne said:
Maybe we will get some answers from either ATi or Valve concerning the piss-poor FX Series performance as well.

No, the first synthetic benchmarks over 2 years ago showed us the piss-poor performance of the FX series. Users here and elsewhere chose to ignore it.

The fact of the matter is that it would have been nice to have it in HL2. But it's not there. You also have to remember there are many things missing in HL2, like PS3.0, 3Dc, HDR, etc. Maybe Valve will come back and add it later? Valve is not the first company to drop the mixed mode path, as id did that as well with Doom3. For whatever reason it was dropped....as this thread should be....
 
Mods,

PLEASE just add this whole thread to the end of the "Valve Sucks" thread. We really don't want to have to see R1ckCa1n mistake 18% for 2.55% another 1000 times.
 
Badger_sly said:
Mods,

PLEASE just add this whole thread to the end of the "Valve Sucks" thread. We really don't want to have to see R1ckCa1n mistake 18% for 2.55% another 1000 times.

I don't think vBulletin has the capability to concatenate threads onto each other... at least I don't think so.
 
Easy answers:

1. If your FX card does not play Half-Life 2 up to the standard you expected - get rid of it and buy something that does!!

2. My old FX card was red and I liked it - so those who bitch should erm shut up or whatever.

3. Go outside, there is more to life than games on your word processor, get laid, get drunk etc - it's more fun!!!
 
wickfut said:
Easy answers:

1. If your FX card does not play Half-Life 2 up to the standard you expected - get rid of it and buy something that does!!

2. My old FX card was red and I liked it - so those who bitch should erm shut up or whatever.

3. Go outside, there is more to life than games on your word processor, get laid, get drunk etc - it's more fun!!!

Video games are less risky, and a lot of the time more fulfilling. :)
 
Badger_sly said:
Mods,

PLEASE just add this whole thread to the end of the "Valve Sucks" thread. We really don't want to have to see R1ckCa1n mistake 18% for 2.55% another 1000 times.

I know, right? Makes me want to throw out BS figures to make ATi look good just to watch the poor sap quote me all over the place.

Bottom line, it would be nice to have the option for FX users...but in all reality Valve barely got the game out the door a year late without doing a pp path...

Give credit where credit is due, the R300 was the right card at the right time...the NV3x was the wrong card at the wrong time...if you have an FX class card...upgrade...I did...and I'm much happier for it...*pets his 6800GT*
 