Valve sucks

Discussion in 'Video Cards' started by dderidex, Nov 30, 2004.

Thread Status:
Not open for further replies.
  1. palabared

    palabared [H]ard|Gawd

    Messages:
    1,073
    Joined:
    Jul 2, 2003
    lol funny he hasn't said a word since :p
     
  2. starhawk

    starhawk [H]ardForum Junkie

    Messages:
    8,909
    Joined:
    Oct 4, 2004
    that's cuz we pwned his @$$... he's either a total n00b or he didn't bother to read a thing before posting his bullcrap opinion- it's the radeon cards that are favored, not bad.

    tho if he shows up again with more bs, he's gonna wind up on my don't trade with list for chronic moron syndrome... i shoulda put him on there in the first place but it'd look funky now if i did.
     
  3. Met-AL

    Met-AL [H]ardness Supreme

    Messages:
    7,874
    Joined:
    Apr 9, 2002
    That's 'cause you're running it in full precision, not FP16.

    Neck and neck on anything NOT DX9.
     
  4. starhawk

    starhawk [H]ardForum Junkie

    Messages:
    8,909
    Joined:
    Oct 4, 2004
    cpu mag said in their pc modder issue that the two were basically equal... and they tested both dx8 and dx9... cuz they had a chaintech ti4*** in there...
     
  5. amdownzintel

    amdownzintel Limp Gawd

    Messages:
    201
    Joined:
    Oct 24, 2004
    Sorry I must be a nub, but how do you run HL2 in DirectX9.0? Can anyone help me :(
     
  6. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    47,987
    Joined:
    May 18, 1997
    Flaming and Name Calling will get you banned.

    This has been a great discussion, please keep it on an adult level.
     
  7. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    FP16 shouldn't make that big of a difference. I'm a little disappointed with my Mobility 9600 because there's visible banding in some places, but it's not a big deal.
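
    The banding actually makes sense if you run the numbers. A quick back-of-envelope sketch (my own arithmetic, just plugging in the mantissa widths of each format, so treat it as illustration only):

        #include <cstdio>
        #include <cmath>

        int main() {
            const char* name[]   = { "FP16", "FP24 (R3xx)", "FP32" };
            const int mantissa[] = { 10, 16, 23 };
            for (int i = 0; i < 3; ++i) {
                // Step size around 1.0 is 2^-mantissa. FP16's ~1/1024 step
                // is finer than an 8-bit framebuffer's 1/256, so a single op
                // looks fine -- but error accumulates over long shader
                // chains, which is where visible banding can come from.
                std::printf("%-12s step(1.0) = %.9f\n",
                            name[i], std::ldexp(1.0, -mantissa[i]));
            }
            return 0;
        }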

    true.dat. The 9800 had far superior DX9 performance to the NV35/NV38. No point in arguing that.

    The only thing I bring up is that every DX9 game, with the exception of HL2, was playable on the FX 59x0 cards. Sure, using a 59x0 card gives up FSAA in most cases (and definitely @ 1600x1200), but the other DX9 games were very playable.
     
  8. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    That's exactly the question that needs to be asked.

    Unfortunately, I can't test it, as I don't have Half-Life 2. However, as noted in the Guru3d thread, many others have posted and it DOES appear to be working. Using partial precision hints (part of the DX9 spec), the game would, apparently, be MUCH faster in DX9 mode on GeForce-FX cards.
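
    For anyone wondering what a "partial precision hint" even looks like from the developer's side, here's a rough sketch (my own illustration, NOT Valve's code): in HLSL you just declare 'half' instead of 'float' where FP16 is enough, and the ps_2_0 compiler tags those instructions with the _pp modifier.

        #include <d3d9.h>
        #include <d3dx9.h>
        #include <string.h>

        // Trivial ps_2_0 shader using 'half' where FP16 is plenty (plain
        // color math). The compiler emits _pp on those ops; FP24 hardware
        // (R3xx) is free to ignore the hint, while NV3x can run them at
        // FP16 instead of crawling along at FP32.
        static const char* kPixelShader =
            "sampler baseMap : register(s0);              \n"
            "half4 main(float2 uv : TEXCOORD0) : COLOR {  \n"
            "    half4 c = tex2D(baseMap, uv);            \n"
            "    return c * 0.5;                          \n"
            "}                                            \n";

        HRESULT CompileExample(LPD3DXBUFFER* ppCode, LPD3DXBUFFER* ppErrors)
        {
            return D3DXCompileShader(kPixelShader, (UINT)strlen(kPixelShader),
                                     NULL, NULL, "main", "ps_2_0",
                                     0 /* flags */, ppCode, ppErrors, NULL);
        }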

    THAT is the real crux of the issue. The users that just keep posting useless trolling like "FX cards suck" and such aren't helping.

    I mean, really, on WHAT do you base that? On Gabe's assertions over the past couple of years that the FX cards suck? Well, it looks like he intentionally sabotaged them TO suck.

    On Carmack needing to code a separate path for them in Doom3 for performance? Well, wait, he didn't have to after all, huh? The FX cards run the same ARB2 path as ATI cards do...and run it just as fast, too.

    On Far Cry? Granted, the FX cards ARE slower than their ATI peers...but still offer competitive framerates.

    The point of all this is that the FX cards do NOT suck in Direct3d if you code a game properly to DX9 spec. And, obviously, we know they don't suck in OpenGL.

    A poster a few posts up asked if this trick can be used in other games - in fact, it can be used in ANY game. HOWEVER....some games actually *need* at least FP24 precision to run, so the FX cards must run them in FP32. If they don't - if they run in FP16 - there isn't enough precision for some of the calculations and the effect is....weird.

    For example, screenshots posted in the Guru3d forums show obvious artifacts when Halo is forced to FP16 - very clearly showing that FP16 is not enough in those cases for Halo.

    And that makes it even more startling how good Half-Life 2 looks using FP16 - if there were artifacts because it NEEDED at LEAST 24-bit precision, as you can see, they'd be pretty obvious.

    That nobody can find any yet....is pretty damning.
     
  9. fallguy

    fallguy 2[H]4U

    Messages:
    3,953
    Joined:
    Sep 8, 2001
    No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24. NV chose to use FP32 and FP16 with the FX cards. It's not Valve's fault for going with the spec. It was NV's (bad) decision to not use FP24.

    The FX cards are slower at DX9, get over it. Upgrade to a newer/better card.
     
  10. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    add -dxlevel 90 to the HL2 shortcut target, after the closing quote that follows the \hl2.exe part.

    If you're running it through steam, add that same command to the launch options for half-life 2. Right click it and select properties to get to the launch options.
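
    The target line ends up looking something like this (the path here is just an example - yours will differ):

        "C:\Program Files\Valve\Steam\SteamApps\youraccount\half-life 2\hl2.exe" -dxlevel 90

    In the Steam launch options box you only need the switch itself:

        -dxlevel 90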
     
  11. chrisf6969

    chrisf6969 [H]ardForum Junkie

    Messages:
    9,014
    Joined:
    Oct 27, 2003
    OK, here's the point that some ATI fanboys aren't getting.

    Instead of putting a tag in there that forces FX cards to run DX8 (which looks like ass compared to DX9), why not properly code the game to use FP16 where 16 will suffice instead of forcing 24, which makes all of NV's cards use FP32? That hurts them all, but MOSTLY the FX line, b/c they were fairly crappy cards. Forcing full precision makes for an inefficient engine; allowing FP16 might (possibly) have improved performance enough to allow a higher resolution on all cards.

    If they had efficiently coded the game to use FP16 where it was sufficient, and FP24/32 where higher precision was needed, it would improve performance on ALL cards, but mostly Nvidia cards (and especially the FX line, %-wise), and probably make Nvidia win all the benchmarks, which ATI would be PISSED about since they paid big bucks to put those little coupons in like 2 years ago.

    No one is disputing that the FX line of cards sucked when running FP32, compared to ATI's equivalent cards at FP24.
     
  12. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    No, it's Valve's bad decision to use FP24 all the time (sticking to the DX9 spec) but then not ALSO using 'partial precision hints' to let cards know when FP24 was not needed (and that's ALSO in the DX9 spec).

    IOW, they were picking and choosing which part of the DX9 spec to stick to in order to hurt the FX cards the most.

    Heck, you only have to look at what happens in DX9 mode when you tell Half-Life 2 you have an ATI card! That's all you have to do to clean up all the artifacts from running an FX card in DX9 mode: just tell Half-Life it's not an FX card but an ATI card. Suddenly, *poof*, all the artifacting is gone.
     
  13. GabooN

    GabooN Limp Gawd

    Messages:
    401
    Joined:
    Jun 14, 2004
    Ok what we need here is some benchies and IQ testing.

    And what FP does ATi's 9800/X800 series use in HL2?
     
  14. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    The point of this thread is why HL2 runs slowly. Speculation about Valve's motive is all over the place. So let me add my theory. ;)

    Supporting the extra path was taking up development time. The game was late, very late. The mixed-mode path was probably dropped for those reasons, nothing sinister. I believe that's the main reason.

    $6 million of loyalty and pressure from ATI probably had a little to do with it, but I doubt that was the main reason.
     
  15. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    ATI does all PS2.0 calculations in FP24, IIRC.

    And yeah, IQ testing is needed. I can do some on my 5200 later today (LOL). But I want to play with my new x700 Pro 256MB. :mad:
     
  16. mentok1982

    mentok1982 [H]ardness Supreme

    Messages:
    4,359
    Joined:
    Sep 17, 2004
    So it seems that these tweaks are only needed for Nvidia's FX line, correct?

    The users of the 6800 series can't/don't need to benefit from the tweaks?

    I am pretty sure my 6800 GT is doing more than fine the way it is.
     
  17. ^eMpTy^

    ^eMpTy^ 2[H]4U

    Messages:
    3,233
    Joined:
    Jul 21, 2004
    It's Valve's fault for not delivering the best gaming experience for nvidia users. While they were spending all that time making "ATi levels" and bragging about the performance of ATi cards, they could have been flipping some switches to boost performance on the FX series, which is an extremely popular line of cards. But hey, if marketing dollars are more important to you than your customers getting the most out of your game, then go ahead...stick "ATi" all over the box and CD and include vouchers and coupons for ATi cards all over the place...
     
  18. trudude

    trudude [H]ard|Gawd

    Messages:
    1,647
    Joined:
    Jul 17, 2003
    Well I believe that Valve should release a patch or something so that NVIDIA users can gain some fps here and there. They might as well. Discoveries like these are very bad publicity and should be handled with care.
     
  19. tranCendenZ

    tranCendenZ 2[H]4U

    Messages:
    3,844
    Joined:
    Jun 6, 2004
    Anyone with an NVIDIA card who has tried this tweak, can you please run a regular benchmark, then an FP16 benchmark with the following timedemos and report back the FPS:

    HARDOCP TIMEDEMOS:
    http://www.fileshack.com/file.x?fid=5857

    ATI TIMEDEMOS:
    http://www.tommti-systems.de/main-Dateien/misc/HL2timedemos.zip

    To do a timedemo, go to advanced keyboard options and enable developer console, then hit ~ and type timedemo demofile.dem
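
    If you want the whole run in one place, it looks something like this (the demo filename is just a placeholder - use whichever downloaded demo you dropped into your hl2 folder):

        hl2.exe -dxlevel 90 -console
        timedemo hardocp1.dem

    When the demo finishes, the console prints the frame count, elapsed time, and average FPS - that last number is the one to report.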

    Let's see from the results whether Valve coded their shaders properly and efficiently, using FP16 where FP32 wasn't necessary for most of the game.
     
  20. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    Actually, according to Microsoft, partial precision is part of the PS2.0 standard. http://msdn.microsoft.com/library/d...ixelShaders/Instructions/Modifiers_ps_2_0.asp I posted that on the first page. _pp is only a hint and can be ignored, which is what ATI does when it runs a shader with that hint.
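
    For completeness, there's even a blanket way to request it at compile time without touching the HLSL - a sketch of the documented D3DX flag (hypothetical usage, the filename is made up; this is NOT something Valve shipped), which is roughly the wholesale effect 3DAnalyze forces from the outside:

        #include <d3dx9.h>

        void CompileWithBlanketPartialPrecision()
        {
            LPD3DXBUFFER code = NULL, errors = NULL;
            // D3DXSHADER_PARTIALPRECISION forces all computations in the
            // resulting shader to partial precision (_pp on every op).
            // ATI parts ignore the tags; NV3x gets FP16 across the board.
            HRESULT hr = D3DXCompileShaderFromFile("water.psh", NULL, NULL,
                                                   "main", "ps_2_0",
                                                   D3DXSHADER_PARTIALPRECISION,
                                                   &code, &errors, NULL);
            if (SUCCEEDED(hr) && code) code->Release();
            if (errors) errors->Release();
        }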

    But yeah, nvidia blew it big time by failing to support FP24 on the FX. I think that was meant to be a jab at ATI (or a really bad early design choice) and it backfired.
     
  21. Skids1

    Skids1 Gawd

    Messages:
    862
    Joined:
    Jul 16, 2004
    I agree with Trudude... they should release a patch... I have two BFG Tech cards and am formerly a 9500@9700 owner... I'm not a !!!!!! of any one camp but simply switched to what I COULD GET MY HANDS ON at the time. Thus far the cards work great, and a patch that helps Half-Life 2 based games perform better would be most welcome.
     
  22. jyi786

    jyi786 [H]ardness Supreme

    Messages:
    5,460
    Joined:
    Jun 13, 2002
    I think it is now apparent that this is beside the point for the 6800 series. The 6800s run in FP32, where they have strong performance. The whole point of the argument is that the FXs run very poorly in FP32, which they default to since they can't run FP24.

    So again, I think it's totally beside the point for the 6800 series. For those of us who have it, carry on. :)
     
  23. fallguy

    fallguy 2[H]4U

    Messages:
    3,953
    Joined:
    Sep 8, 2001
    Yeah, dang that Valve for going with DX9 specs.

    Everyone and their mother knew the FX cards didn't do DX9 nearly as well as the ATi counterparts. I bought one too, but after Far Cry came out, I sold it pretty fast. It was a lot slower than my 9800XT, the AA didn't look as good, and it took a huge hit when running any AA. Far Cry shows a huge performance lead for ATi cards when looking at the last gen, too. I guess they can't code either, according to your logic.

    If you want FP16, stick to synthetic benchmarks. I'd rather not have it in my games. Heaven forbid you lay the blame where it belongs: on NV's doorstep. THEY chose to use FP16 and FP32, and not go with the minimum DX9 spec of FP24.

    Just get over the fact that the FX cards were lesser cards than the R3xx cards in just about every way except OpenGL. The R3xx cards' AA looked better, they took less of a hit with AA, they're almost always faster at high res + AA/AF, and they're faster overall in DX9 games.

    Some people like to go and laugh at "Shader day", when Gabe said the ATi cards were much, much faster than the NV cards. It appears now it was true. If you got an FX card intending to play HL2, it's nobody's fault but your own.
     
  24. Moloch

    Moloch [H]ard|Gawd

    Messages:
    1,027
    Joined:
    Sep 11, 2004
    Why would valve intentionally do that when people are either buying 6XXX series cards or X800 series cards, when both can run the game about the same?
    Lets see some screen shots!!
    Like the previous poster said, you still have inferior FSAA and AF; it's totally nvidia's fault.
    I think we should make sure there aren't places where FP16 isn't enough before jumping the gun on Valve - they have nothing to gain from it, since the game didn't ship during the FX or 9700/9800 product life.
     
  25. mohammedtaha

    mohammedtaha 2[H]4U

    Messages:
    3,652
    Joined:
    Oct 14, 2004
    Maybe they shoulda made Doom 3 more forgiving for the Ati cards ... maybe just maybe the same thing was done on Doom 3 ... they intentionally made X800 cards run slower .. but that doesn't matter because ATi people don't care .. their cards run the game WELL ...

    You do notice that some companies are selling Doom 3 games with their nVidia cards .. right ? so why can't ATi do the same ?

    Keep this debate smart .. and stop attacking the wrong people ... sponsoring is one thing and cheating is another ...
     
  26. chrisf6969

    chrisf6969 [H]ardForum Junkie

    Messages:
    9,014
    Joined:
    Oct 27, 2003
    Actually, if you have a 6x00, 9x00, or X800 series card it would help your performance, too. Not to the same degree or % that it would help the FX line, b/c their architecture was much weaker. More performance can never hurt. It will just let you play up one more resolution or AA setting, etc... so why not optimize your engine to be EFFICIENT?

    ATI cards would benefit a little.
    NV 6x00 cards would benefit decently.
    FX 5x00 cards would benefit A LOT, percentage-wise.
     
  27. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    Look, I keep bringing this point up, and you keep ignoring it.

    VALVE DIDN'T FOLLOW THE DX9 SPEC!!!

    So they went with FP24, big deal, DX9 spec ALSO calls for using partial precision hints when possible, and they DIDN'T do that.

    AGAIN: VALVE "PICKED AND CHOSE" WHICH PARTS OF THE DX9 SPEC TO FOLLOW TO HURT THE FX CARDS THE MOST.
     
  28. (cf)Eclipse

    (cf)Eclipse Freelance Overclocker

    Messages:
    30,028
    Joined:
    Feb 18, 2003
    well considering that this thread made it to the front page, i'll bet that the [H] staff (prob. Kyle and Brent) will be looking into this.
    in the meantime though, i'm quite curious as to how much this tweak will help out the nv4x based nvidia cards, since they are already superior to the nv3x in every way, and not terribly far behind ati in most benchmarks.
    also, this would explain quite well why ati scales better with resolution and aa.
     
  29. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    Doom3 just followed the OpenGL spec; they didn't do anything WITH it to intentionally harm ATI cards....it's just that ATI cards suck in OpenGL. (Seriously - ATI is slower than nVidia in anything OpenGL: Quake 3, Call of Duty, etc.)
     
  30. fallguy

    fallguy 2[H]4U

    Messages:
    3,953
    Joined:
    Sep 8, 2001
    "when possible" I guess you're a coder now, right?

    Just get over the fact the FX cards are not very good compared to the ATi counterparts in DX9 games, and upgrade or stop crying about it every week.
     
  31. DaveBaumann

    DaveBaumann n00bie

    Messages:
    39
    Joined:
    Jun 24, 2003
    Just a couple of things to point out.

    Definition-wise, you don’t “force” FP24 - you actually “force” FP16. The default for compilation of shaders under DirectX is “full precision”, and the hardware requirement to be classified as full precision states that you must support at least FP24, so Valve is just using the default precision for DX; FP16 is part of the specification as an optional “partial precision”, but you have to explicitly request it.

    Now, given that Valve had already stated that the low end FX series would be treated as DX8, the only boards that they intended for the “Mixed mode” to operate with would be the 5800/5900/5950. If you take a look at the Steam video card stats this constitutes 2.55% of their install base – conversely, 30% of their install base is running DX9 ATI boards that would receive no performance or quality differences from this path.

    Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should, are the further coding and support requirements worth it for 2.5% of your userbase?
     
  32. Moloch

    Moloch [H]ard|Gawd

    Messages:
    1,027
    Joined:
    Sep 11, 2004
    Actually, there's a thread over at Beyond3D which explains how Doom3 was coded for cards that didn't have strong FPU performance (read: the FX series), and that actually hurts performance for both the 6800 and X800.
     
  33. (cf)Eclipse

    (cf)Eclipse Freelance Overclocker

    Messages:
    30,028
    Joined:
    Feb 18, 2003
    if valve increased performance with nv3x based cards, i'd be willing to bet that that 2.5% would jump up significantly, because more people would want to try it out knowing that it will run somewhat decently.
    kind of a catch-22 there, huh? ;)
     
  34. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    Except, for all intents and purposes, this would require NO extra coding. Using 3dAnalyze, we can just do all the calculations meant to run in FP24 in FP16 instead, and there is no visual quality difference.

    Would it be better if they wrote separate shaders designed to use FP16 instead? Sure - but we can already see that's not necessary; there is no discernible image quality difference when simply running their existing shaders at 16-bit precision.

    (And, FWIW, this was *tested* on a GeForce FX 5900 - so using the default DX9 mode for them, these cards are NOT running in 'mixed mode' when doing DX9 regardless of what Valve *said*)
     
  35. jyi786

    jyi786 [H]ardness Supreme

    Messages:
    5,460
    Joined:
    Jun 13, 2002

    How would this help out any of the NV4X based video cards? We don't want to run HL2 in FP16; we want to run it in FP32.
     
  36. ^eMpTy^

    ^eMpTy^ 2[H]4U

    Messages:
    3,233
    Joined:
    Jul 21, 2004
    This isn't a thread about Doom3 performance...

    Bundling a game is one thing, giving away free vouchers with all ATi cards, making ATI specific levels for the game, paying $6 million, coming out with early benchmarks to support one company when the game was nowhere near completion, and then having ATi stamped all over the box and on the cd....now that's a completely different thing...

    Pull out your doom3 box, you won't see a single nvidia logo anywhere on it.
     
  37. (cf)Eclipse

    (cf)Eclipse Freelance Overclocker

    Messages:
    30,028
    Joined:
    Feb 18, 2003
    but if you really don't need fp32... even if the nv40 is way stronger with fp32 than the nv30/35 is, fp16 is still faster. if it can be proven that there is no discernible image difference, why not do it?
     
  38. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    Actually, yes, I am.

    Anyway, I have no problem with "the fact the FX cards are not very good compared to the ATI counterparts in DX9 games"....but there is a rather significant chasm between "not very good" and "godawful".

    The FX cards - as has been demonstrated using 3dAnalyze - are still able to achieve perfectly playable framerates with near identical image quality.

    As FAST as ATI's? No. But, then, they are still playable, so why does it matter?

    The complaint is not to try and make these cards as good as ATI's in DX9 or something - that's obviously impossible, and nobody is trying. The complaint is more - WHY DID VALVE MAKE THEM SUCK SO BAD in Half-Life 2? They punished their performance FAR more than necessary!

    And while the FX 59x0 series is only 2.5% of their install base (I believe that), the FX 5600, 5700, and 5800 can all realistically run in DX9 mode using these tweaks, too - and I think the total number of FX cards across ALL those models is a little higher! (I'll grant it's probably a little unrealistic to expect the FX 5200 and 5500 to run the full DX9 mode, even WITH partial precision hints correctly implemented.)
     
  39. Jbirney

    Jbirney Gawd

    Messages:
    528
    Joined:
    Nov 14, 2003
    Not really; those owners can run it in the default path (DX8) and still get better FPS. Your average Joe is not going to know how to force rendering modes. He is going to run it out of the box (like HardOCP tested), which means it defaults to DX8 for all but 2.5% of the FX cards out there. 2.5% is a drop in the virtual bucket. There are about 1100 shaders used in HL2; verifying they all look good in mixed mode would add time to development. Most of us had already given up on them....

    Also consider that ATI's HDR and 3Dc are STILL NOT USED in HL2. If they are late getting in these features that "ATI paid" for, then why would they not be late with other stuff?
     
  40. jyi786

    jyi786 [H]ardness Supreme

    Messages:
    5,460
    Joined:
    Jun 13, 2002
    It was my understanding that FP32 was slightly more detailed than FP16; then again, you said if it can be proven. If this is the case, then I stand corrected.

    BTW, what mode does the ATI X800 series run with by default? FP24? Or do they also do FP32?
     