Valve sucks

palabared said:
lol funny he hasn't said a word since :p

that's cuz we pwned his @$$... he's either a total n00b or he didn't bother to read a thing before posting his bullcrap opinion- it's the radeon cards that are favored, not that the fx cards are just bad.

tho if he shows up again with more bs, he's gonna wind up on my don't trade with list for chronic moron syndrome... i shoulda put him on there in the first place but it'd look funky now if i did.
 
tazz said:
half life 2 looks sweet on my 6800 GT card.
it looks even better than the pic that's on the first post
That's cause you're running it in full precision, not FP16

starhawk said:
p3n00b... if you recall... the fx5950's were neck-and-neck with the radeon 9800's... i personally planned on getting an albatron fx5950 ultra (second best of the pack) before the 6800's came out- and i loathe to h*** going with anything worse than second-best.
Neck and neck on anything NOT DX9.
 
cpu mag said in their pc modder issue that the two were basically equal... and they tested both dx8 and dx9... cuz they had a chaintech ti4*** in there...
 
Flaming and Name Calling will get you banned.

This has been a great discussion, please keep it on an adult level.
 
Met-AL said:
That's cause you're running it in full precision, not FP16
FP16 shouldn't make that big of a difference. I'm a little disappointed with my Mobility 9600 because there's visible banding in some places, but it's not a big deal.

Neck and neck on anything NOT DX9.
true.dat. The 9800 had far superior DX9 performance to the NV35/NV38. No point in arguing that.

The only thing I bring up is that every other DX9 game, with the exception of HL2, was playable on the FX 59x0 cards. Sure using a 59x0 card gives up FSAA in most cases (and definitely @ 1600x1200), but the other DX9 games were very playable.
 
gordon151 said:
The point that a lot are missing (especially ruined, who is going around to multiple forums posting ridiculous statements on this) is that the performance boost is measured against the default performance after they have *forced* the game to run in DX9 mode, as well as using the 9800 device ID. Performance in that mode was already unbearably low, most likely because many of the shaders were running in full precision mode, which is why forcing partial precision improved performance. The thing I'm wondering is, how does performance after doing that compare to the default nv3x mode for HL2? If it compares well and the IQ gain is worth the performance loss, then the question people should be asking is "why did they remove the mixed-mode?".

That's exactly the question that needs to be asked.

Unfortunately, I can't test it, as I don't have Half-Life 2. However, as noted in the Guru3d thread, many others have posted and it DOES appear to be working. Using partial precision hints (part of the DX9 spec), the game would, apparently, be MUCH faster in DX9 mode on GeForce FX cards.

THAT is the real crux of the issue. The users that just keep posting useless trolling like "FX cards suck" and such aren't helping.

I mean, really, on WHAT do you base that? On Gabe's assertions over the past couple of years that the FX cards suck? Well, it looks like he intentionally sabotaged them TO suck.

On Carmack needing to code a separate path for them in Doom3 for performance? Well, wait, he didn't have to after all, huh? The FX cards run the same ARB2 path as ATI cards do...and run it just as fast, too.

On Far Cry? Granted, the FX cards ARE slower than their ATI peers...but still offer competitive framerates.

The point of all this is that the FX cards do NOT suck in Direct3D if you code a game properly to the DX9 spec. And, obviously, we know they don't suck in OpenGL.

A poster a few posts up asked if this trick can be used in other games - in fact, it can be used in ANY game. HOWEVER....some games actually *need* at least FP24 precision to run, so the FX cards must run them in FP32. If they don't - if they run them in FP16 - there isn't enough precision for some of the calculations and the effect is....weird.

For example, posted in the Guru3d forums:
{HLH} said:
[image: halo_fp16_fp32.jpg - Halo rendered in FP16 vs. FP32]

Which very clearly shows that FP16 is not enough for Halo in those cases.
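
For anyone wondering why FP16 is enough for HL2's shaders but falls apart in a shot like that, the rough numbers for a standard 16-bit float (1 sign bit + 5 exponent bits + 10 mantissa bits, so roughly 11 bits of precision) work out like this:

    step size near 1.0  : about 0.001 - far finer than an 8-bit framebuffer can even display
    step size near 2048 : 2.0 - so 2048 + 1 rounds right back to 2048

Plain colour math never notices the difference, but shaders juggling big texture coordinates or long chains of calculations run out of precision, which is presumably what's going blocky in that Halo shot.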

And that makes it even more startling how good Half-Life 2 looks using FP16 - if there were artifacts because it NEEDED at LEAST 24-bit precision, they'd be pretty obvious, as you can see.

That nobody can find any yet....is pretty damning.
 
dderidex said:
Well, the funny thing is that Valve coded Half-Life 2 to use FP24 shaders all the time every time. And it's really not needed. Nope. In fact, FP16 seems to do the trick all the time - as seen in that above pic. FP16 and FP24 are indistinguishable in Half-Life 2.

No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24. NV chose to use FP32 and FP16 with the FX cards. It's not Valve's fault for going with the spec. It was NV's (bad) decision to not use FP24.

The FX cards are slower at DX9, get over it. Upgrade to a newer/better card.
 
amdownzintel said:
Sorry, I must be a nub, but how do you run HL2 in DirectX 9.0? Can anyone help me :(
Add -dxlevel 90 to the HL2 shortcut target, after the "\hl2.exe" part.

If you're running it through Steam, add that same command to the launch options for Half-Life 2. Right-click it and select Properties to get to the launch options.
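
To make that concrete, a tweaked shortcut target ends up looking something like this (the install path here is just an example - point it at wherever your copy actually lives):

    "C:\Games\Half-Life 2\hl2.exe" -dxlevel 90

One caveat, if I remember right: forcing a -dxlevel also resets your other video settings when it kicks in, so double-check them afterwards.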
 
Ok, here's the point that some ATI fanboys aren't getting.

Instead of putting a tag in there that forces FX cards to run DX8 (which looks like ass compared to DX9), why not properly code the game to use FP16 where 16 will suffice, instead of forcing 24, which makes all of NV's cards use FP32? That hurts them all, but MOSTLY the FX line, b/c they were fairly crappy cards! It makes for an inefficient engine; proper coding might have improved performance enough to allow a higher resolution on all cards (possibly).

If they had efficiently coded the game to use FP16 where it was sufficient, and FP24/32 when higher precision was needed, it would improve performance on ALL cards, but mostly Nvidia cards (and especially the FX line, percentage-wise), and probably make Nvidia win all the benchmarks, which ATI would be PISSED about since they paid big bucks to put those little coupons in like two years ago.

No one is disputing that the FX line of cards sucked when running FP32, compared to ATI's equivalent cards at FP24.
 
fallguy said:
No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24. NV chose to use FP32 and FP16 with the FX cards. It's not Valve's fault for going with the spec. It was NV's (bad) decision to not use FP24.

The FX cards are slower at DX9, get over it. Upgrade to a newer/better card.
No, it's Valve's bad decision to use FP24 all the time (sticking to the DX9 spec) but then not ALSO using 'partial precision hints' to let cards know when FP24 was not needed (and that's ALSO in the DX9 spec).

IOW, they were picking and choosing which part of the DX9 spec to stick to in order to hurt the FX cards the most.

Heck, you only have to look at what happens to the DX9 mode when you tell Half-Life 2 you have an ATI card! That's all you have to do to clean up all the artifacts from running an FX card in DX9 mode: just tell Half-Life it's not an FX card but an ATI card. Suddenly, *poof*, all the artifacting is gone.
 
Ok what we need here is some benchies and IQ testing.

And what FP does ATi's 9800/X800 series use in HL2?
 
chrisf6969 said:
Instead of putting a tag in there that forces FX cards to run DX8, why not properly code the game to use FP16 where 16 will suffice, instead of forcing 24, which makes all of NV's cards use FP32, which hurts them all, but MOSTLY the FX line, b/c they were fairly crappy cards!
The point of this thread is why HL2 runs DX9 slowly on the FX cards. Speculation about the motive Valve had is all over the place. So let me add my theory. ;)

Supporting the extra path was taking up development time. The game was late, very late. Support for the mixed-mode path was probably dropped for those reasons, nothing sinister. I believe that's the main reason.

$6 million of loyalty and pressure from ATI probably had a little to do with it, but I doubt that was the main reason.
 
GabooN said:
Ok what we need here is some benchies and IQ testing.

And what FP does ATi's 9800/X800 series use in HL2?
ATI does all PS2.0 calculations in FP24, IIRC.

And yeah, IQ testing is needed. I can do some on my 5200 later today (LOL). But I want to play with my new x700 Pro 256MB. :mad:
 
So it seems that these tweaks are only needed for Nvidia's FX line, correct?

Can users of the 6800 series not benefit from the tweaks, or do they just not need them?

I am pretty sure my 6800 GT is doing more than fine the way it is.
 
fallguy said:
No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24. NV chose to use FP32 and FP16 with the FX cards. It's not Valve's fault for going with the spec. It was NV's (bad) decision to not use FP24.

The FX cards are slower at DX9, get over it. Upgrade to a newer/better card.

It's Valve's fault for not delivering the best gaming experience for nVidia users. While they were spending all that time making "ATi levels" and bragging about the performance of ATi cards, they could have been flipping some switches to try to boost performance on the FX series, which is an extremely popular line of cards. But hey, if marketing dollars are more important to you than your customers getting the most out of your game, then go ahead...stick "ATi" all over the box and CD and include vouchers and coupons for ATi cards all over the place...
 
Well I believe that Valve should release a patch or something so that NVIDIA users can gain some fps here and there. They might as well. Discoveries like these are very bad publicity and should be handled with care.
 
If anyone with an NVIDIA card has tried this tweak, can you please run a regular benchmark and then an FP16 benchmark with the following timedemos and report back the FPS:

HARDOCP TIMEDEMOS:
http://www.fileshack.com/file.x?fid=5857

ATI TIMEDEMOS:
http://www.tommti-systems.de/main-Dateien/misc/HL2timedemos.zip

To do a timedemo, go to advanced keyboard options and enable the developer console, then hit ~ and type timedemo demofile.dem
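
In other words, with one of the demo files from those links dropped into your hl2 game folder, the console session is just this, run once at stock settings and once with the FP16 force applied (demofile.dem stands for whichever demo you actually grabbed, and the exact wording of the output will differ):

    // first pass: stock settings - note the average FPS the console prints at the end
    timedemo demofile.dem
    // second pass: quit, apply the FP16/device-ID tweak, relaunch, and run it again
    timedemo demofile.dem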

Let's see what the results would have been if Valve had coded their shaders properly and efficiently, using FP16 where FP32 was not necessary, for most of the game.
 
fallguy said:
No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24.
Actually, according to Microsoft, partial precision is part of the PS2.0 standard. http://msdn.microsoft.com/library/d...ixelShaders/Instructions/Modifiers_ps_2_0.asp I posted that on the first page. _pp is only a hint and can be ignored, which is what ATI does when it runs a shader with that hint.

But yeah, nVidia blew it big time by failing to support FP24 on the FX. I think that was meant to be a jab at ATI (or a really bad early design choice) and it backfired.
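
For anyone who hasn't clicked that MSDN link: in ps_2_0 assembly the hint is literally a per-instruction suffix. A made-up two-instruction fragment (not taken from HL2's real shaders, just to show the syntax):

    // full precision - at least FP24, which on a GeForce FX means FP32
    mul r0, r0, r1
    // same math with the partial precision hint - the FX is allowed to drop to FP16 here,
    // while ATI hardware simply ignores the _pp and keeps using FP24
    mul_pp r0, r0, r1

And as far as I know, if you write the shaders in HLSL you get the same effect by declaring values as half instead of float - the compiler emits the _pp forms for you.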
 
I agree with Trudude... they should release a patch... I have two BFG Tech cards and am formerly a 9500@9700 owner... I'm not a !!!!!! of any one camp but simply switched to what I COULD GET MY HANDS ON at the time. Thus far the cards work great, and a patch that helps Half-Life 2 based games perform better would be most welcome.
 
I think it is now apparent that this is beside the point for the 6800 series. The 6800s run in FP32, where they have strong performance. The whole point of the argument is that the FXs run very poorly with FP32, which they default to since they can't run FP24.

So again, I think it's totally beside the point for the 6800 series. For those of us who have it, carry on. :)
 
Yeah, dang that Valve for going with DX9 specs.

Everyone and their mother knew the FX cards didn't do DX9 nearly as well as the ATi counterparts. I bought one too, but after Far Cry came out, I sold it pretty fast. It was a lot slower than my 9800XT, the AA didn't look as good, and it also took a huge hit when running any AA. Far Cry shows a huge performance lead for ATi cards when looking at the last gen too. I guess they can't code either, according to your logic.

If you want FP16, stick to synthetic benchmarks. I'd rather not have it in my games. Heaven forbid you lay the blame where it belongs, on NV's doorstep. THEY chose to use FP16 and FP32, and not go with the minimum DX9 spec of FP24.

Just get over the fact that the FX cards were lesser cards than the R3xx cards in just about every way, except OpenGL. The R3xx cards' AA looked better, they took less of a hit with AA enabled, they are almost always faster at high res + AA/AF, and they are overall faster in DX9 games.

Some people like to go and laugh at "Shader day", when Gabe said the ATi cards were much, much faster than the NV cards. It appears now that it was true. If you got an FX card intending to play HL2, it's nobody's fault but your own.
 
Why would Valve intentionally do that when people are buying either 6XXX series cards or X800 series cards, when both can run the game about the same?
Let's see some screenshots!!
Like the previous poster said, you still have inferior FSAA and AF; that's totally nVidia's fault.
I think we should make sure there aren't places where FP16 isn't enough before jumping the gun on Valve - they have nothing to gain from it, since it didn't ship during the FX or 9700/9800 product life.
 
^eMpTy^ said:
It's Valve's fault for not delivering the best gaming experience for nVidia users. While they were spending all that time making "ATi levels" and bragging about the performance of ATi cards, they could have been flipping some switches to try to boost performance on the FX series, which is an extremely popular line of cards. But hey, if marketing dollars are more important to you than your customers getting the most out of your game, then go ahead...stick "ATi" all over the box and CD and include vouchers and coupons for ATi cards all over the place...

Maybe they shoulda made Doom 3 more forgiving for the Ati cards ... maybe just maybe the same thing was done on Doom 3 ... they intentionally made X800 cards run slower .. but that doesn't matter because ATi people don't care .. their cards run the game WELL ...

You do notice that some companies are selling Doom 3 games with their nVidia cards .. right ? so why can't ATi do the same ?

Keep this debate smart .. and stop attacking the wrong people ... sponsoring is one thing and cheating is another ...
 
ajm786 said:
I think it is now apparent that this is beside the point for the 6800 series. The 6800s run in FP32, where they have strong performance. The whole point of the argument is that the FXs run very poorly with FP32, which they default to since they can't run FP24.

So again, I think it's totally beside the point for the 6800 series. For those of us who have it, carry on. :)

Actually, if you have a 6x00, 9x00, or X800 series card, it would also help your performance. Not to the same degree or percentage that it would help the FX line, b/c their architecture was much weaker, but more performance can never hurt. It will just let you play at one higher resolution or AA setting, etc... so why not optimize your engine to be EFFICIENT?

ATI cards would benefit a little.
NV 6x00 cards would benefit decently.
FX 5x00 cards would benefit A LOT percentage-wise.
 
fallguy said:
Yeah, dang that Valve for going with DX9 specs.

Look, I keep bringing this point up, and you keep ignoring it.

VALVE DIDN'T FOLLOW THE DX9 SPEC!!!

So they went with FP24, big deal, DX9 spec ALSO calls for using partial precision hints when possible, and they DIDN'T do that.

AGAIN: VALVE "PICKED AND CHOSE" WHICH PART OF THE DX9 SPEC TO FOLLOW TO HURT THE FX CARDS THE MOST.
 
well considering that this thread made it to the front page, i'll bet that the [H] staff (prob. Kyle and Brent) will be looking into this.
in the meantime though, i'm quite curious as to how much this tweak will help out the nv4x based nvidia cards, since they are already superior to the nv3x in every way, and not terribly far behind ati in most benchmarks.
also, this would explain quite well why ati scales better with resolution and aa
 
mohammedtaha said:
Maybe they shoulda made Doom 3 more forgiving for the Ati cards ... maybe just maybe the same thing was done on Doom 3 ... they intentionally made X800 cards run slower .. but that doesn't matter because ATi people don't care .. their cards run the game WELL ...

Doom3 just followed the OpenGL spec; they didn't do anything WITH it to intentionally harm ATI cards....it's just that ATI cards suck in OpenGL. (Seriously - in ALL OpenGL games they are slower: Quake 3, Call of Duty, etc. - in anything OpenGL, ATI is slower than nVidia)
 
dderidex said:
Look, I keep bringing this point up, and you keep ignoring it.

VALVE DIDN'T FOLLOW THE DX9 SPEC!!!

So they went with FP24, big deal, DX9 spec ALSO calls for using partial precision hints when possible, and they DIDN'T do that.

AGAIN: VALVE "PICKED AND CHOSE" WHICH PART OF THE DX9 SPEC TO FOLLOW TO HURT THE FX CARDS THE MOST.

"when possible" I guess you're a coder now, right?

Just get over the fact the FX cards are not very good compared to the ATi counterparts in DX9 games, and upgrade or stop crying about it every week.
 
Just a couple of things to point out.

Definition-wise, you don’t “force” FP24, you actually “force” FP16 – the default for compilation of shaders under DirectX is “full precision”, and the hardware requirement to be classified as full precision states that you must support at least FP24, so Valve is just using the default precision for DX; FP16 is part of the specification as an optional “Partial Precision”, but you have to explicitly request it.

Now, given that Valve had already stated that the low end FX series would be treated as DX8, the only boards that they intended for the “Mixed mode” to operate with would be the 5800/5900/5950. If you take a look at the Steam video card stats this constitutes 2.55% of their install base – conversely, 30% of their install base is running DX9 ATI boards that would receive no performance or quality differences from this path.

Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should, is the further coding and support requirements worth it for 2.5% of your userbase?
 
dderidex said:
Doom3 just followed the OpenGL spec; they didn't do anything WITH it to intentionally harm ATI cards....it's just that ATI cards suck in OpenGL. (Seriously - in ALL OpenGL games they are slower: Quake 3, Call of Duty, etc. - in anything OpenGL, ATI is slower than nVidia)
Actually, there's a thread at Beyond3D which explains how Doom3 was coded for cards which didn't have strong FPU performance (read: the FX series), and that actually hurts performance for both the 6800 and the X800.
 
DaveBaumann said:
Now, given that Valve had already stated that the low end FX series would be treated as DX8, the only boards that they intended for the “Mixed mode” to operate with would be the 5800/5900/5950. If you take a look at the Steam video card stats this constitutes 2.55% of their install base

Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should, is the further coding and support requirements worth it for 2.5% of your userbase?
if valve increased performance with nv3x based cards, i'll be willing to bet that that 2.5% would jump up significantly because more people will want to try it out, knowing that it will run somewhat decently.
kind of a catch-22 there huh? ;)
 
DaveBaumann said:
Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should, is the further coding and support requirements worth it for 2.5% of your userbase?

Except, for all intents and purposes, this would require NO extra coding. Using 3dAnalyze, we can just do all the calculations meant to run in FP24 in FP16 instead, and there is no visual quality difference.

Would it be better if they wrote separate shaders designed to use FP16 instead? Sure - but we can already see that's not necessary; there is no discernible image quality difference when simply running their existing shaders using only 16-bit precision.

(And, FWIW, this was *tested* on a GeForce FX 5900 - so in their default DX9 mode these cards are NOT running in 'mixed mode', regardless of what Valve *said*)
 
(cf)Eclipse said:
i'm quite curious as to how much this tweak will help out the nv4x based nvidia cards, since they are already superior to the nv3x in every way, and not terribly far behind ati in most benchmarks.


How would this help out any of the NV4X based video cards? We don't want to run HL2 in FP16; we want to run it in FP32.
 
mohammedtaha said:
Maybe they shoulda made Doom 3 more forgiving for the Ati cards ... maybe just maybe the same thing was done on Doom 3 ... they intentionally made X800 cards run slower .. but that doesn't matter because ATi people don't care .. their cards run the game WELL ...

You do notice that some companies are selling Doom 3 games with their nVidia cards .. right ? so why can't ATi do the same ?

Keep this debate smart .. and stop attacking the wrong people ... sponsoring is one thing and cheating is another ...

This isn't a thread about Doom3 performance...

Bundling a game is one thing; giving away free vouchers with all ATi cards, making ATI-specific levels for the game, paying $6 million, coming out with early benchmarks to support one company when the game was nowhere near completion, and then having ATi stamped all over the box and on the CD....now that's a completely different thing...

Pull out your Doom3 box; you won't see a single nVidia logo anywhere on it.
 
ajm786 said:
How would this help out any of the NV4X based video cards? We don't want to run HL2 in FP16; we want to run it in FP32.
but if you really don't need fp32... even if the nv40 is way stronger with fp32 than the nv30/35 is, fp16 is still faster. if it can be proven that there is no discernible image difference, why not do it?
 
fallguy said:
"when possible" I guess you're a coder now, right?

Just get over the fact the FX cards are not very good compared to the ATi counterparts in DX9 games, and upgrade or stop crying about it every week.
Actually, yes, I am.

Anyway, I have no problem with "the fact the FX cards are not very good compared to the ATI counterparts in DX9 games"....but there is a rather significant chasm between "not very good" and "godawful".

The FX cards - as has been demonstrated using 3dAnalyze - are still able to achieve perfectly playable framerates with near identical image quality.

As FAST as ATI's? No. But, then, they are still playable, so why does it matter?

The complaint is not to try and make these cards as good as ATI's in DX9 or something - that's obviously impossible, and nobody is trying. The complaint is more - WHY DID VALVE MAKE THEM SUCK SO BAD in Half-Life 2? They punished their performance FAR more than necessary!

And while the FX 59x0 series is only 2.5% of their install base (I believe that), the FX 5600, 5700, and 5800 can all realistically run in DX9 mode using this tweak, too - and I think the total number of FX cards across ALL those models is a little higher! (I'll grant it's probably a little unrealistic to expect the FX 5200 and 5500 to run the full DX9 mode, even WITH partial precision hints correctly implemented)
 
(cf)Eclipse said:
if valve increased performance with nv3x based cards, i'll be willing to bet that that 2.5% would jump up significantly because more people will want to try it out, knowing that it will run somewhat decently.
kind of a catch-22 there huh? ;)

Not really; those owners can run it in the default path (DX8) and still get better FPS. Your average Joe is not going to know how to force rendering modes. He is going to run it out of the box (like HardOCP tested), which means it defaults to DX8 for all but 2.5% of the FX cards out there. 2.5% is a drop in the virtual bucket. There are about 1100 shaders used in HL2; verifying they all look good in mixed mode would add time to development. Most of us had already given up on them....

Also consider this: ATI's HDR and 3Dc are STILL NOT USED in HL2. If they are late getting in the features that "ATI paid" for, then why wouldn't they be late with other stuff?
 
(cf)Eclipse said:
but if you really don't need fp32... even if the nv40 is way stronger with fp32 than the nv30/35 is, fp16 is still faster. if it can be proven that there is no discernible image difference, why not do it?

It was my understanding that FP32 was slightly more detailed than FP16; then again, you said if it can be proven. If this is the case, then I stand corrected.

BTW, what mode does the ATI X800 series run with by default? FP24? Or do they also do FP32?
 