I have started this topic as a constructive alternative to the infamous "Valve Sucks" thread, which is nothing but a flame war. Hopefully we can accomplish something here rather than flaming users who prefer one camp over the other...
USELESS POSTS / FLAMES WILL BE DELETED!!
ATI and Valve have been known to bundle the game together since ATI's 9600 XT and 9800 XT cards. No one had a problem with that until nVidia users with GeForce FX series cards realized that they were NOT getting DirectX 9 features: the game was running in DirectX 8.1 compatibility mode, which eliminates most of the Pixel Shader 2.0 features the FX series has been touting since the (horrible) 5800 Ultra. All other games show a virtual dead heat between the 5900 Ultra and the 9800 Pro, except the Source engine. Here, nVidia users are seeing less than half the frame rates of their ATI counterparts, which quite frankly has many of us pissed off, to say the least.
Valve originally had a Mixed Mode for the NV3x, which kept FP32 precision where it was needed and dropped to FP16 for those shaders that did not need it. Somewhere in the various delays, this mode was never completed, so the time Valve spent developing a mode to help the NV3x series became a moot point.
Why did Valve spend time developing a mode that essentially got dropped? Why does masking the device ID to appear as a Radeon card fix DX9 mode on the FX series, while the default device ID results in a ton of glitches? Why didn't Valve implement partial precision for the FX? Is it a conspiracy? Are ATI and Valve trying to eliminate nVidia as competition? Has the FX series been deemed useless? Maybe we will get some answers from either ATI or Valve concerning the frankly piss-poor FX series performance as well.