Valve sucks

Discussion in 'Video Cards' started by dderidex, Nov 30, 2004.

Thread Status:
Not open for further replies.
  1. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,329
    Joined:
    Oct 31, 2001
    First (before I explain how this is relevant to the "video card" thread), guess what graphics card this is rendered on? It's not my pic (I'm just hosting at the moment). Do pay attention to the FPS counter in the lower right.

    [IMG]
     
  2. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,329
    Joined:
    Oct 31, 2001
    Another hint....some tweak was done....

    Using the Valve CS:Source stress test, this tweak boosted performance on a certain card from 36 FPS to 64 FPS with virtually no image quality loss.

    Guess the card yet?

    OR the tweak?
     
  3. OriginalReaper

    OriginalReaper Bad Trader

    Messages:
    224
    Joined:
    Mar 10, 2004
    FX 5700?
    tweak = textures medium?
     
  4. 6800GTOwned

    6800GTOwned Limp Gawd

    Messages:
    204
    Joined:
    Sep 27, 2004
  5. bboynitrous

    bboynitrous 2[H]4U

    Messages:
    2,528
    Joined:
    Feb 29, 2004
    Your 5700, and you switched to OpenGL.
     
  6. SuX0rz

    SuX0rz [H]ard|Gawd

    Messages:
    1,292
    Joined:
    Jul 30, 2002
    Radeon 9200, because Valve belongs to ATi ;)
     
  7. archevilangel

    archevilangel [H]ard|Gawd

    Messages:
    1,825
    Joined:
    Jan 15, 2003
    heh, can this tweak be performed on other nvidia cards? Not that I'd need it, but it'd be nice.
     
  8. 6800GTOwned

    6800GTOwned Limp Gawd

    Messages:
    204
    Joined:
    Sep 27, 2004
    :mad: Don't toy with my emotions!!!



    ........



    That's not possible!..... is it?
     
  9. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,329
    Joined:
    Oct 31, 2001
    Oh, it's pretty sad really.

    Basically, some guys on Guru3d figured out what Valve did to cripple nVidia cards.

    First off, you need 3dAnalyze. I'm assuming everyone knows that you can force HL2 to run in DX9 mode on FX cards, right? Only, you get artifacts in the water and other areas?

    Well, that's pretty easy to fix. Just have the 3dAnalyze util report your card as an ATI Radeon instead of a GeForce FX.

    *taddah* All the artifacts go away, and you get true DX9 reflections!

    Okay, but there IS a performance hit doing that. How to get around that?

    Well, the funny thing is that Valve coded Half-Life 2 to use FP24 shaders all the time, every time. And it's really not needed. Nope. In fact, FP16 seems to do the trick most of the time - as seen in that above pic. FP16 and FP24 are indistinguishable in Half-Life 2 for the most part.

    Again, using 3dAnalyze you can test this. It is capable of forcing a card to use only FP16 shaders no matter what is requested. You'll see virtually no image quality difference doing that - just a HUGE performance boost. Why? Well, because while FP16 is all that Half-Life 2 *needs* almost all the time, if they let the GeForce FX cards do THAT, they might have been competitive! So, instead, they forced full precision in every shader op (unneeded), which caused the GF-FX cards to render the DX9 mode in FP32 all the time. With the obvious associated performance hit.
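
    To make that concrete, here's roughly what the choice looks like at the HLSL level. Just a generic DX9 sketch I threw together - NOT Valve's actual shader code, the sampler names and math are made up:

    Code:
    sampler baseMap  : register(s0);
    sampler lightMap : register(s1);

    // Full precision everywhere - what HL2 asks for. A Radeon runs
    // this at its native FP24, but a GeForce FX has no FP24 mode,
    // so it gets bumped up to slow FP32.
    float4 PSFull(float2 uv : TEXCOORD0) : COLOR
    {
        float4 base  = tex2D(baseMap, uv);
        float4 light = tex2D(lightMap, uv);
        return base * light;
    }

    // Same math with partial precision: 'half' makes the compiler
    // emit the _pp hint, so a GeForce FX can run it at fast FP16.
    // A Radeon ignores the hint and still runs FP24, so its output
    // doesn't change at all.
    half4 PSPartial(float2 uv : TEXCOORD0) : COLOR
    {
        half4 base  = tex2D(baseMap, uv);
        half4 light = tex2D(lightMap, uv);
        return base * light;
    }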

    Try it yourself. The link to the article is here. Download 3dAnalyze and follow the instructions there.
    Amazing, huh?

    AND NOW, AN EDIT:
    With more data!


    Now, this tweak is GREAT for proving what kind of performance hit Valve kindly provided GeForce FX users....but it's obviously not usable to play the game with. Why?

    1) Well, first, 3dAnalyze is simply not stable enough to use as a workaround for playing the whole game.
    2) In the case of some specific shaders (some windows, a few surfaces), there ARE visible artifacts - color banding - as a result of forcing partial precision. Is this a problem? Not really - the whole point of this observation is that Valve should have allowed partial precision MOST of the time, not ALL of the time. GeForce FX cards have a more-than-is-needed full precision mode (which they are stuck running full-time currently) that would be perfectly suitable for the few times full precision is actually NEEDED in the game.

    So, in short, Valve handicapped the GeForce FX cards by 'picking and choosing' which parts of the DX spec to follow - they chose not to use the partial precision hints the spec allows for, which are obviously usable in almost every case in the game and would have made the GeForce FX cards competitive!
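
    To show what 'MOST of the time, not ALL of the time' would look like in practice, here's a hypothetical mixed-precision sketch (my guess at the kind of mix Valve could have shipped - the Fresnel math and names are made up, not their code). The banding-prone gradient stays at full precision; the plain color blend, which is the vast majority of the work, drops to partial:

    Code:
    sampler reflMap : register(s0);

    float4 PSMixed(float2 uv : TEXCOORD0, float3 eyeVec : TEXCOORD1) : COLOR
    {
        // FP16 only has ~10 mantissa bits, so a smooth per-pixel
        // gradient like this Fresnel term can visibly band at
        // partial precision - keep it at full precision.
        float fresnel = pow(1.0 - saturate(dot(normalize(eyeVec),
                            float3(0.0, 0.0, 1.0))), 4.0);

        // Plain color blends survive FP16 just fine - this is the
        // overwhelmingly common case.
        half4 refl = tex2D(reflMap, uv);
        half4 deep = half4(0.05, 0.15, 0.20, 1.0);
        return lerp(deep, refl, (half)fresnel);
    }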
     
  10. joecuddles

    joecuddles [H]ard|Gawd

    Messages:
    1,103
    Joined:
    Aug 16, 2004
    Ouch, tsk tsk Valve. Anyone going to do some in depth benches / results on this? Brent? :p
     
  11. bboynitrous

    bboynitrous 2[H]4U

    Messages:
    2,528
    Joined:
    Feb 29, 2004
    lol, I don't think so. I was just saying random crap hoping to be right.
     
  12. Dallows

    Dallows [H]ardness Supreme

    Messages:
    6,832
    Joined:
    Jun 18, 2004
    any effect on 6800 series?
     
  13. tranCendenZ

    tranCendenZ 2[H]4U

    Messages:
    3,847
    Joined:
    Jun 6, 2004
    lol, I called this so long ago

    Funny, even at FP32 all the time, the 6800 still kicks this game's ass. But it would be even faster if Valve had used good shader programming and put in partial precision calls where appropriate.
     
  14. archevilangel

    archevilangel [H]ard|Gawd

    Messages:
    1,825
    Joined:
    Jan 15, 2003
    hmmm, even as a 6800 GT owner I need some side-by-side screenshots before I jump on this bandwagon. Maybe I'll have time to test it out, but it is sad if true, considering that the 6800 GT still ties with the X800 Pro.
     
  15. Mr Mean

    Mr Mean [H]ard|Gawd

    Messages:
    1,358
    Joined:
    Oct 2, 2004
    Hey Kyle and Brent, are you going to investigate this? I would like a second opinion on this.
     
  16. sakurakana1003

    sakurakana1003 Gawd

    Messages:
    719
    Joined:
    Nov 22, 2004
    Top-of-the-line video cards from both companies are supposed to be neck and neck with each other this round of the fight anyway. Good discovery, though. :D
     
  17. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,329
    Joined:
    Oct 31, 2001
    Yeah, that's the real kicker.

    All 3dAnalyze can do is force FP16 *all the time*. Although all the screenshots of it taken to date show *no* image quality difference....you gotta wonder.

    All that means is that it's a damn shame Valve didn't code this right. Let's assume we DO somewhere find some artifacts by forcing it to FP16 (none found yet, just saying). Let's say as much as 5% of the rendering needs at least 24-bit floating point instructions to actually render properly.

    If they'd used _pp hints for the REST, then 95% of the time, the FX cards would be running FP16, and 5% of the time running FP32 (instead of 100%, as they are now). There would be literally *NO* image quality difference in this hypothetical situation - anywhere, at all - and the performance would still be 95% of the boost we are seeing by forcing it to always use FP16.
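
    Quick back-of-envelope using the stress test numbers from my earlier post (rough, and it assumes the frame time is completely shader-bound):

    Code:
    36 FPS all-FP32 = ~27.8 ms/frame
    64 FPS all-FP16 = ~15.6 ms/frame
    95/5 mix: 0.95 * 15.6 + 0.05 * 27.8 = ~16.2 ms = ~62 FPS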

    Valve could have done this, and then the FX cards would be running in DX9 mode perfectly competitively instead of using DX8 mode. I'll grant the FX 5900s would probably still lose to the Radeon 9800 Pros and XTs....but we would be talking about playable framerates still, and identical image quality - rather than the 9800s slaughtering the FX cards we 'see' now.

    (Course, it's *entirely* possible Half-Life 2 really doesn't EVER need more than FP16, and forcing it to use that 100% of the time will come up with no artifacts at all. In which case....shame on Valve, seriously!)
     
  18. geekcomputing

    geekcomputing [H]ard|Gawd

    Messages:
    1,100
    Joined:
    Aug 30, 2004
    The 9700/9800 line and relatives do FP24 precision, while the FX line does FP16 or FP32, depending.

    So yes... it's official. By forcing the program to do FP24 and not letting the card decide, it's like Valve locked in ATI and excluded nVidia. Can you say blackmail for not bidding the highest?

    Haah, what BS.

    Fuck Valve... and fuck Steam.

    Too bad for the FX owners...

    But the 6800 or 6600 owners should be just fine.
     
  19. glynn

    glynn [H]Lite

    Messages:
    104
    Joined:
    Nov 17, 2004
    Unfortunately, because the competition is so tight between both vendors, we are gonna start seeing more instances like this, I think personally :( It really is not on when companies start to disadvantage one card in preference to the other :mad:
    I just REALLY hope to God that game devs don't adopt this trend more often!
     
  20. Moloch

    Moloch [H]ard|Gawd

    Messages:
    1,030
    Joined:
    Sep 11, 2004
  21. Mister E

    Mister E ?

    Messages:
    2,558
    Joined:
    Sep 14, 2004
    It was a shame that Doom 3 wasn't coded more appropriately for Radeon cards too.
     
  22. Chippy

    Chippy [H]Lite

    Messages:
    87
    Joined:
    Oct 8, 2004
    Er, in what way?

    As I understand it, the performance delta between the 6800 and X800 in D3 has more to do with ATI's inferior OpenGL implementation than with how D3 is coded.

    Or are you suggesting otherwise?

    Chip
     
  23. CrimandEvil

    CrimandEvil Dick with a heart of gold

    Messages:
    20,312
    Joined:
    Oct 22, 2003
    He's just trying to cover for ATi having horrendous OpenGL support at the time. D3 uses the same code path for both NV and ATi, which is something he should have known by now. :rolleyes:

    Thanks for the tipoff, Dderidex - really makes you wonder what the fuck Valve was doing with all that time. :rolleyes: Maybe Valve will now release a patch to fix the numerous bugs as well as their intentional crippling of NV cards..... yeah right. Not after they took all that red ATI money. ;)

    So has anyone run any extensive benchmarks, as well as played through the game fully, to make sure there weren't any problems?

    Honestly, I would feel sort of conflicted. This is pretty much the same thing as the "Humus tweak" (without the fact that NV didn't release the info, though), since it gives a performance increase while it "lowers" image quality. But I really don't like Valve, nor do I respect them enough to worry about it myself. This isn't really like the whole D3 thing anyway. ;)
     
  24. mohammedtaha

    mohammedtaha 2[H]4U

    Messages:
    3,936
    Joined:
    Oct 14, 2004
    If Valve did really tweak to improve performance for ATi .. then that's their own decision .. they can do whatever they want to improve performance for the company that sponsors them .. or did you miss the ATi sign on the CD or whatever you used to purchase the game ...

    If this is indeed what happened .. yes, I do agree .. shame on Valve .. but you CAN'T say they suck ... cause HL2 is an AMAZING game ... I love the music .. they brought back the feel of the old HL to the new one, with amazing physics and graphics ..

    there's no other game out there that can prove itself against the HL2 engine .. not even D3 ... cause all I've seen in D3 is shadows and skin texture and lighting ... everything else was dark or hidden in shadows ...

    There's no war to start over the different engines ... it's obvious which one is better NOW ...
     
  25. defiant

    defiant Limp Gawd

    Messages:
    461
    Joined:
    Jun 14, 2003
    Can't you read? They didn't just tweak performance on ATI cards, they crippled performance on nVidia cards. That's just low.

    On a side note, I think this thread should be stickied and renamed to something like "Half-Life 2 Performance Tweak for nVidia Cards" so people actually know what it's about.
     
  26. Legend

    Legend n00bie

    Messages:
    16
    Joined:
    Oct 23, 2004
    Er...can you read?

    First of all, you're assuming something that hasn't been proven yet, by you especially or anyone for that matter.......

    Also, I'm in agreement that they didn't "cripple" performance on nVidia cards if this does turn out to be true. They optimized for ATI, just like software companies do for Intel processors. So you're saying that if some software company makes their product work better on processors with SSE3 (which is done, btw), that means they're "crippling" Athlon 64 CPUs? Didn't think so.

    I think you're right about the sticky part, though, if this does turn out to work.
     
  27. DropTech

    DropTech Limp Gawd

    Messages:
    296
    Joined:
    Oct 11, 2004
    This in no way helps out ATi, as they can run variable FP also. They hindered the lower-end graphics users because they made it run at FP32 all the time. This isn't a proprietary issue... (which you're saying it is with your instruction example), it's an issue that Valve basically went out of their way (FP32 I assume would take longer than variable FP16-32... or do they have to write code sets for both?) to cripple the FX series when they didn't have to.
     
  28. obs

    obs [H]ardness Supreme

    Messages:
    4,395
    Joined:
    Nov 4, 2002
    There is a reasonable expectation in this industry that all video cards are supported as well as they can be. Sure, in some cases extra features that may only exist on one brand are utilized, much like SSE2/3 with processors. However, when you have to tell the game that your card is some other card in order to get the game to run properly, there is a problem. If this problem is real, I really hope Valve issues a patch to fix it. I am going to be optimistic here and think they will.
     
  29. DanK

    DanK Gawd

    Messages:
    949
    Joined:
    Apr 21, 2004
    The "war" over which engine is technically better was over before it was started... and the winner was Doom 3. Half-Life 2 looks excellent... because Valve has amazing texture artists. id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to remove textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenomenal: D3's engine lighting HL2's world.

    Don't get me wrong, I'm not saying that Source is a bad engine. It's just that a lot of people are saying things about the engines when they are really commenting on things not related to the actual engines. HL2 is, imo, a much better game than D3. But D3's engine is capable of much more, I think.
     
  30. Met-AL

    Met-AL [H]ardness Supreme

    Messages:
    7,905
    Joined:
    Apr 9, 2002
    So, it's Valve's fault that they used the DX9 spec? FP16 is not DX9 spec. FP24 or HIGHER is DX9 spec. FP16 is LOWER than FP24.

    It isn't like anyone was forcing you to buy an FX card. It was well known that they sucked in this respect and in shader performance. There was no conspiracy to pull a fast one. Go back a year and these boards were full of information pointing to the fact that the FX line had problems... big problems. You chose not to take that advice, and now you're mad that you have to run a hack to get your card to run a game in DX9 mode.
     
  31. gamer1drew

    gamer1drew [H]Lite

    Messages:
    98
    Joined:
    Jul 17, 2004
    Any chance this tweak works with CS:S or Vampire: The Masquerade?
     
  32. Santroph

    Santroph Limp Gawd

    Messages:
    133
    Joined:
    Mar 27, 2003
    That really depends on what you're thinking about when you talk about the WHOLE engine. I really, really love the way Half-Life 2 looks graphically, but I think that INDOORS Doom 3 probably has the better engine; outdoors I don't know, since D3 has none.

    BUT:
    If you think about the physics engine, Source can't be matched by D3.
    When you think about the facial emotion, again Source can't be matched, and this time by no one.

    Obviously, not everyone wants to see or even cares for these things, but for those who do, like me, Source is a better engine than D3. Not that I didn't like the engine of D3 (the engine, because D3 sucked for me), but Source is better in my opinion.

    P.S.: Source, as far as I can see, is way more forgiving while still giving a good image.
    I can run HL2 really fine at 1280 on my 9700 Pro, but can't even get good FPS at 1024 in D3. :(
     
  33. obs

    obs [H]ardness Supreme

    Messages:
    4,395
    Joined:
    Nov 4, 2002
    Read the post. First, he had to tell HL2 he had a different card to get it to run without artifacts. That is the real issue here, as of course the Radeon card he was pretending to have would use FP24, as that is its native mode. However, the OP found a perfectly acceptable way to run DX9 on an FX card, and all you can say is "he shouldn't have bought that card."
     
  34. chrisf6969

    chrisf6969 [H]ardForum Junkie

    Messages:
    9,151
    Joined:
    Oct 27, 2003
    There is no comparison to D3. That game is coded very efficiently and runs the same path on both cards. It just comes down to ATI's OpenGL support sucking compared to nVidia's.

    Games are supposed to be coded as efficiently as possible and only use resources when needed. That would be like a game developer forcing SM3.0 at all times when only PS1.1 is needed.

    Valve purposely coded everything to FP24, which is not really that much better than FP16. You can notice a difference if you compare pixels, but not really from gameplay. The main difference is on FX cards... FP24 is bumped up to FP32, b/c they don't have an FP24 mode. They could have coded most of the game in FP16, and only used FP24 (32) where needed. For efficiency's sake, like many games do.

    More people could probably be playing at higher resolutions, etc... It should help ATI cards (and NV 6x00 cards) some, and NV FX 5x00 cards A LOT.
     
  35. DanK

    DanK Gawd

    Messages:
    949
    Joined:
    Apr 21, 2004
    True, there are areas that I didn't consider, such as facial animation, where Source wins hands down.

    As far as Source vs. D3 performance, Source was made to run on ATi's 9xxx series, so it's not surprising that it works well on that hardware.

    I think both engines will give us some great games in the future, though.

    On an unrelated note, I thought DX9 allowed for partial precision shaders in situations where it would help performance, so long as the original shader length was maintained (i.e., by using temporaries)? Can someone provide a link to relevant documentation? (I'd look for it, but I have class! gotta go.)
     
  36. Elrein

    Elrein n00bie

    Messages:
    25
    Joined:
    Aug 15, 2004
    Funny, everything I throw at my X800 XT PE runs super sweet :D I'm sure if I had a 6800 Ultra I'd be saying the same.

    Get over it and just play the f'n game :p
     
  37. Jbirney

    Jbirney Gawd

    Messages:
    530
    Joined:
    Nov 14, 2003
    Valve did have a mixed mode for FX cards. They showed the mixed mode results during the Shader Day event last year. They also showed the results of running it in full DX9 mode. In the end, they said it was faster to run HL2 on an FX card in DX8 (8.1) mode. That's something we all knew about LAST year. Why is this news now? Are you sure when you force something to run you're not breaking something else? I mean, you force it to use the ATI ID and then force 3DAnalyze to run something? Are you 100% sure that's not breaking something in the chain? Are those results valid? Are you 100% sure the IQ is the same? But no, instead of trying to make 100% sure of all this, you start a post. Nice :confused:
     
  38. Met-AL

    Met-AL [H]ardness Supreme

    Messages:
    7,905
    Joined:
    Apr 9, 2002
    No, you read the posts. The problem is that DX9 requires FP24. Since the FX can do 16 or 32, it has to do 32, which kills the performance. The hack makes the Source engine render in FP16, which is not DX9 spec. Complaining that Valve stuck to the DX9 spec and saying they suck is stupid. The problem is not with Valve. It is with the hardware of the FX cards. Same goes for the D3 engine on ATi cards.
     
  39. jon67

    jon67 Gawd

    Messages:
    585
    Joined:
    Oct 29, 2004
    I think you need to clarify your perception of what an "engine" is. I guess your definition of "engine" excludes everything that HL2 does better than D3 , i.e. textures, water, physics simulation, AI, long-distance rendering... everything but the lighting of course, since D3 is nothing but lighting effects.
     
  40. pxc

    pxc Pick your own.....you deserve it.

    Messages:
    35,300
    Joined:
    Oct 22, 2000
    Uh, actually DX9 does allow FP16 as part of the spec for PS2.0 (_pp = partial precision, which is what 3D-Analyze is forcing): http://msdn.microsoft.com/library/d...ixelShaders/Instructions/Modifiers_ps_2_0.asp

    nVidia will probably just force the option in future drivers anyway through app detection. Nice trick to see, though, and interesting that Valve intentionally tanked performance.
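
    For anyone curious, the modifier is literally just a suffix on individual shader instructions. Hand-written ps_2_0 sketch (not HL2's actual compiled output):

    Code:
    ps_2_0
    dcl t0.xy             // texture coordinate
    dcl_2d s0             // 2D sampler
    texld_pp r0, t0, s0   // _pp: this op may run at FP16
    mul_pp r0, r0, r0     // square the color, also fine at FP16
    mov oC0, r0           // write the final color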
     