Valve sucks

Discussion in 'Video Cards' started by dderidex, Nov 30, 2004.

Thread Status:
Not open for further replies.
  1. Moloch

    Moloch [H]ard|Gawd

    Messages:
    1,027
    Joined:
    Sep 11, 2004
    There's nothing wrong with nvidia letting [H] do some benchmarking though right?
    am I rite?
     
  2. Jbirney

    Jbirney Gawd

    Messages:
    528
    Joined:
    Nov 14, 2003
    It did not help them at all. Users on 6800s over in the NV forum reported a 2% gain.
     
  3. SuX0rz

    SuX0rz [H]ard|Gawd

    Messages:
    1,292
    Joined:
    Jul 30, 2002
    I want to see some numbers before I start freaking out about this.
     
  4. Jbirney

    Jbirney Gawd

    Messages:
    528
    Joined:
    Nov 14, 2003

    fastZreject will be enabled on ATI cards in a future patch, yet Brent chose not to benchmark that. Whatever they do, be consistent. Don't benchmark one and not the other...
     
  5. Spank

    Spank Limp Gawd

    Messages:
    378
    Joined:
    Dec 10, 2003
    i just looked at my box and don't see a single ati logo on there; there is a small black/white ati logo on the dvd though
     
  6. psikoticsilver

    psikoticsilver [H]Lite

    Messages:
    92
    Joined:
    Oct 6, 2004
    ID did NOT program Doom3 for better NVidia performance--in fact, if you remember back in the days of their first Doom3 display, they were using r300 silicon and touting IT as the better performer for Doom3. This thread is NOT about Doom3 performance, nor is it about "well ID did it too!"

    As far as I can tell this is just Valve trying to compensate for sitting around an entire year doing nothing but masturbating and watching football on the 6 million dollars that they got from the auction. Then suddenly in January they woke up and remembered, "oh crap, we have to write that game, don't we?" Allowing partial precision use for even just one card would require the coding they weren't prepared to do because they were so behind--using 3DAnalyze is NOT the same thing as making your game run _pp hints natively.

    (Another interesting thing to remember is that this is business. Valve held an AUCTION for the HL2 boxing rights, ATi forked over more cash than NVidia, thus they got the rights to put HL2 in their boxes. ATi didn't just hand Valve money and say "make us a game." This sort of thing seems to be said a lot and it's false.)

    Bring on the benchmarks.

    *edit*
    To fit in with this thread better, I must provide the complimentary flame...
    Fl4mz0rezzz NVIDIOTS SUX0rz FANATics Sux0rzrzz!
     
  7. (cf)Eclipse

    (cf)Eclipse Freelance Overclocker

    Messages:
    30,028
    Joined:
    Feb 18, 2003
    yeah, fp32 is more detailed than fp16, but only in situations that cause significant rounding errors with fp16. it seems that the codepath hl2 uses is not complicated enough to create these kinds of errors.
    also, ati doesn't have fp32 or fp16 right now. all the dx9 based cards they have can only do fp24. nvidia just has fp32/fp16


    i think you misunderstood what i said. think it over again while keeping in mind that it would be valve putting in a slight bit of code that forces fp16 for some of the fp24 code paths that don't cause the rounding errors i spoke of before (as someone else pointed out before, most likely a very significant majority of the shaders can be run at fp16)
    this seriously wouldn't take too long to code, and it would have increased their install base among the people with a nv3x based card.



    can you point me to where you saw that? i just took a look through [H]'s nvidia forum, and didn't see a single thread about this. and changing from fp32 to fp16 should net them a much larger performance gain than 2%
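
    To put a rough number on how small the fp16 rounding error actually is, here's a quick standalone C++ sketch (just an illustration, not anything from Valve or NVIDIA) that chops a float down to fp16's 10 mantissa bits and prints the relative error for some typical shader-sized values:

```cpp
// Sketch: quantize an IEEE-754 float (fp32) to roughly what fp16 can represent
// (10 mantissa bits instead of 23) and report the relative error.
// Illustration of precision loss only; handles ordinary normalized positive
// values and ignores fp16's smaller exponent range, denormals and overflow.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <cstring>

float quantize_to_fp16(float x) {
    uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);
    // Drop the low 13 mantissa bits (23 -> 10), rounding to nearest.
    uint32_t rounded = (bits + 0x00001000u) & 0xFFFFE000u;
    float out;
    std::memcpy(&out, &rounded, sizeof out);
    return out;
}

int main() {
    // Typical shader quantities: color components, dot-product results, etc.
    const float samples[] = {0.73f, 0.9999f, 12.345f, 0.0001234f};
    for (float v : samples) {
        float q = quantize_to_fp16(v);
        std::printf("fp32=%.7g  fp16~=%.7g  rel.err=%.2e\n",
                    v, q, std::fabs(v - q) / v);
    }
    return 0;
}
```

    The relative error comes out around 5e-4, which is why most color and lighting math gets away with fp16; it only bites when a shader accumulates lots of operations or needs a big dynamic range.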
     
  8. DaveBaumann

    DaveBaumann n00bie

    Messages:
    39
    Joined:
    Jun 24, 2003
    The default path for these boards is now DX8.1, not DX9, so their performance is already there, since the DX8.1 path is faster.

    If you want to hack it, then the 5 minutes of analysis that's probably gone on here will probably tell you there are no discernible IQ differences. If you want to produce a large game, verify that it operates on tens of different types of graphics chips, on thousands of different configurations, and that each of the thousands of shaders under each mode (DX8, DX8.1, DX9 full precision, DX9 partial precision) produces the correct output in each of the different scenarios you are using them in, then the validation is going to take some time. For a game that's already late, is it worth it for the 2.5% of the install base when they will get perfectly good performance and pretty good quality under the DX8.1 mode anyway?
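
    To make the scale of that concrete, here's a toy C++ sketch of the validation matrix. The shader count and the matchesReference helper are illustrative stand-ins, not anything from the Source engine:

```cpp
// Sketch of the validation burden described above: every shader in the game has
// to produce artist-approved output under every codepath it might run in.
// Numbers and helpers are illustrative stand-ins, not Source engine code.
#include <cstdio>
#include <string>
#include <vector>

enum class Mode { DX8, DX8_1, DX9_Full, DX9_Partial };

// Stand-in for "render this shader under this mode and compare against the
// reference image the artists approved". Always passes here; in reality this
// is the slow, human-involved step.
bool matchesReference(const std::string& /*shader*/, Mode /*mode*/) { return true; }

int main() {
    std::vector<std::string> shaders(1000, "shader");  // HL2 reportedly ships 1000+ shaders
    const Mode modes[] = {Mode::DX8, Mode::DX8_1, Mode::DX9_Full, Mode::DX9_Partial};

    long cases = 0, failures = 0;
    for (const auto& s : shaders)
        for (Mode m : modes) {
            ++cases;
            if (!matchesReference(s, m)) ++failures;
        }

    // Adding one more precision mode multiplies this whole matrix again.
    std::printf("%ld shader/mode combinations to validate (%ld failures)\n",
                cases, failures);
    return 0;
}
```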
     
  9. arentol

    arentol 2[H]4U

    Messages:
    2,712
    Joined:
    Jun 15, 2004
    Wow! So you work for Valve then? I mean you would have to work for them to KNOW they were intentionally trying to HURT FX cards wouldn't you?

    Here's the thing... Most FX cards aren't that great. The FX5950 MAY be halfway decent, but when coding for FX cards Valve had to code for the LOWEST COMMON DENOMINATOR, which is the FX5200.

    So the only valid test of whether Valve was trying to hurt FX cards, rather than simply choosing the code path that would allow ALL FX cards to play the game effectively, would be to run the game at default settings on an "average" PC (2.0GHz Intel or AMD 2000+, 512MB of RAM, etc.) with an FX5200 card, then do the same with the original poster's various "tweaks" all turned on, and compare the two, keeping in mind at which settings and resolutions the game is "playable" (runs at 40fps average with minimums not lower than 20fps, not counting the ~0fps spikes that sometimes happen in this game).

    If that can't be done, then at least compare it on an average PC with an FX5900 at 1024x768 and then cut the resulting FPS in half... which is about the typical speed difference at that resolution for a 5200 versus a 5900.

    By accusing Valve of intentionally gimping the FX series, while only comparing the fast FX cards, and not the SLOWEST FX CARD, you are doing Valve a serious disservice as well as invalidating any comments you may care to make.


    (In other words, if DX9.0 with Partial Precision at 1024x768 doesn't run well on an average PC and an FX5200 then Valve made the right choice, and if it does then they MAY have intentionally screwed FX owners... MAYBE.).
     
  10. Dallows

    Dallows [H]ardness Supreme

    Messages:
    6,816
    Joined:
    Jun 18, 2004
    I'm trying this on my 6800GT for shits and giggles because for some reason my system refuses to perform as it should, regardless of the newly formatted hdd.
     
  11. jyi786

    jyi786 [H]ardness Supreme

    Messages:
    5,460
    Joined:
    Jun 13, 2002
    I think the important thing to remember here is that IF what dderidex has suggested doing is enough to get the FX series of cards to work properly in DX9, and FP16 IS the lowest pixel precision allowed in DX9, why did Valve default the FX series to the DX8.1 codepath when it could do DX9 at almost the same FPS? That doesn't make sense to me. More like a marketing trick to make more people buy ATI.

    I guess it is all about business, after all. :rolleyes:
     
  12. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    Except they are *already* treating the FX 5200 differently from the FX 59xx - IE., they aren't treating them all the same, anyway.

    The FX 5800 and 5900 run the DX8.1 path, and the FX 5200/5600/5700 run the DX8 path.

    And the FX 5200 doesn't even do THAT well (IE., it's not playable at 1024x768), so holding the whole line back for that one card is a little absurd.
     
  13. espent

    espent n00b

    Messages:
    5
    Joined:
    Aug 23, 2004
    If people took time to analyze all the games out there I'm sure they would find games that give nVidia or ATI an advantage. At least 50% of the games I play show me a nVidia logo before I get into the game. If I were paranoid I could say that these games have been programmed to give nVidia an advantage over ATI.

    People should stop crying and flaming Valve. They may have a reason for this, or it can be some simple mistake.
     
  14. Spank

    Spank Limp Gawd

    Messages:
    378
    Joined:
    Dec 10, 2003
    why would they put pp hints in their dx9 path? no card would benefit from it. You are supposed to be running dx8 on FX cards.

    Show me some benchmarks where you get a better gaming experience using dx9 with pp over running the intended dx8 path
     
  15. maleficarus

    maleficarus Gawd

    Messages:
    744
    Joined:
    Nov 17, 2004
    "(Course, it's *entirely* possible Half-Life 2 really doesn't EVER need more than FP16, and forcing it to use that 100% of the time will come up with no artifacts at all. In which case....shame on Valve, seriously!)"

    And of course, as any good lawyer would tell you, assuming gets you nowhere and doesn't prove your point :p

    Basically what you are doing is assuming that for 100% of this game there would be no artifacts or any issues associated with the above. Well, you can't prove that no matter how much you would like to. So having said that, you are basically accusing Valve of coding for a certain company with no proof, just accusations and assumptions. Shame on you, not Valve!
     
  16. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    It's not for performance reasons, it's for quality.

    If running the DX9 path with partial precision hints gets you even the same (or very close) performance as the DX8 path... that would be a better option, then, wouldn't it? You'd be getting more realistic reflections, more realistic lighting, etc.
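
    For what it's worth, the mechanical part of forcing partial precision really is small. Here's a rough sketch using the stock DX9 D3DX shader compiler (a Windows/DirectX 9 SDK illustration only, not Valve's build pipeline; the shader source is a made-up stand-in) that builds the same trivial pixel shader with and without the flag that forces partial precision:

```cpp
// Sketch (not Valve's pipeline): compile one DX9 pixel shader twice, once at
// full precision and once with D3DXSHADER_PARTIALPRECISION, which forces the
// resulting shader's computations to partial precision. The switch itself is
// one flag; the contentious part is re-validating every shader built this way.
#include <d3dx9.h>    // DirectX 9 SDK; link with d3dx9.lib
#include <cstdio>
#include <cstring>

static const char* kShaderSrc =
    "float4 main(float2 uv : TEXCOORD0) : COLOR0 {\n"
    "    return float4(uv, 0.5f, 1.0f);\n"    // trivial stand-in shader
    "}\n";

LPD3DXBUFFER compilePS(DWORD flags) {
    LPD3DXBUFFER code = nullptr, errors = nullptr;
    HRESULT hr = D3DXCompileShader(kShaderSrc, (UINT)std::strlen(kShaderSrc),
                                   nullptr, nullptr, "main", "ps_2_0",
                                   flags, &code, &errors, nullptr);
    if (FAILED(hr)) {
        if (errors) std::printf("%s\n", (const char*)errors->GetBufferPointer());
        return nullptr;
    }
    return code;
}

int main() {
    LPD3DXBUFFER full    = compilePS(0);
    LPD3DXBUFFER partial = compilePS(D3DXSHADER_PARTIALPRECISION);
    std::printf("full-precision build: %s, partial-precision build: %s\n",
                full ? "ok" : "failed", partial ? "ok" : "failed");
    if (full)    full->Release();
    if (partial) partial->Release();
    return 0;
}
```

    The flag is cheap; the argument in this thread is over whether validating the results of 1000+ shaders compiled that way is worth it for the FX install base.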
     
  17. DaveBaumann

    DaveBaumann n00bie

    Messages:
    39
    Joined:
    Jun 24, 2003
    Somebody hacking it and thinking there is no discernible difference is different from a developer validating that the output that is generated is the output that the artists expect to see. By altering the precision it operates at you are potentially altering the results that are intended to be produced – this already has to be tested for full precision (probably both full precisions), and adding another precision means an extra layer of tests through all the usage scenarios of the shaders within the game.

    The fuss that occurred around Doom3’s performance arose because JC decided to remove higher precision vector normalisation by math (something that ATI cards like) and made the lower quality vector normalisation via cubemaps (something that was originally used in the NV30/20 paths) the default for everything. Why? Not because he was biasing away from ATI, but because he wanted to make the output as consistent as possible across each of the available boards so that the artists wouldn’t be wondering if the output was what they expected under each different path. It turns out that there were few output differences, but to get the game out this is the choice he made; the same logic applies in Valve’s case – there are fewer usage scenarios to test for.
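
    As a toy illustration of why even a "tiny" precision change has to be signed off, here's a standalone C++ sketch (nothing to do with id's or Valve's actual code) comparing full-math normalisation against a faked cubemap-style lookup quantised to 8 bits per component:

```cpp
// Illustration of the trade-off described above: normalizing a vector with
// full math versus reading a pre-normalized value from a coarse cubemap.
// The "cubemap" is faked by quantizing each component to 8 bits, roughly what
// a normalization cubemap texel stores. Not id's or Valve's code.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 normalizeMath(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

float quantize8(float c) {                  // simulate an 8-bit texel channel
    return std::round((c * 0.5f + 0.5f) * 255.0f) / 255.0f * 2.0f - 1.0f;
}

Vec3 normalizeCubemapStyle(Vec3 v) {
    Vec3 n = normalizeMath(v);
    return {quantize8(n.x), quantize8(n.y), quantize8(n.z)};
}

int main() {
    Vec3 v{0.3f, -0.7f, 0.644f};            // arbitrary light/view vector
    Vec3 a = normalizeMath(v);
    Vec3 b = normalizeCubemapStyle(v);
    float dot = a.x * b.x + a.y * b.y + a.z * b.z;
    // The angular error is tiny, but it is a real output difference that the
    // artists would have to sign off on for every path that uses it.
    std::printf("angle between results: %.4f degrees\n",
                std::acos(std::fmin(dot, 1.0f)) * 180.0f / 3.14159265f);
    return 0;
}
```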
     
  18. prince of the damned

    prince of the damned n00b

    Messages:
    50
    Joined:
    Nov 5, 2004
    I agree with espent; nearly every game I have played has the nvidia logo, but until hl2 I had never seen an ati logo ....

    But if valve did this on purpose then it's really going to get ugly.
     
  19. Netrat33

    Netrat33 [H]ardness Supreme

    Messages:
    4,894
    Joined:
    Aug 6, 2004

    You also won't see one X800 series card recommended on the box either, yet you do see the 6800 series. Bet you didn't notice that :D

    the early benchmarks weren't valve's or ati's fault. That was a stolen copy.

    A free voucher is pretty much the same thing as bundling the game with it. It just wasn't out yet. It's much like the 5XXX series cards saying "recommended for doom3" before the game was out.
     
  20. Jbirney

    Jbirney Gawd

    Messages:
    528
    Joined:
    Nov 14, 2003
    Again, maybe if it was a 3rd rate game. But this is the GOTY, with 1000+ shaders. And you want to make sure all of those switches on the compiler produce good mixed-mode code. That's not a trivial thing to do... and yes, it would add time. And it wouldn't increase the install base at all. Those with FX cards either already know they are weak at DX9 or don't know the difference as they are not PC savvy. Most of the hardcore [H]'ers have since upgraded...



    Again, this tweak only seems to help FX users and not 6800 users. Here is one user who gained a whopping 2 fps on his 6800:
    http://www.nvnews.net/vbulletin/showthread.php?t=41625&page=2&pp=15
     
  21. maleficarus

    maleficarus Gawd

    Messages:
    744
    Joined:
    Nov 17, 2004
    Good point! Seeing an ATi logo on a game is as rare as seeing an AMD TV ad.
     
  22. jyi786

    jyi786 [H]ardness Supreme

    Messages:
    5,460
    Joined:
    Jun 13, 2002
    Although I do see the logic in this, I still can't fathom why it would make any difference whether the FX series does partial precision in DX9 or just defaults to DX8.1. It still has to do with marketing (ATI over nVidia).
     
  23. Netrat33

    Netrat33 [H]ardness Supreme

    Messages:
    4,894
    Joined:
    Aug 6, 2004
    I almost bought a videocard thinking nvidia was the only way to play (we are talking about a gamer that pretty much abandoned PC gaming during the voodoo/geforce 2 days, and that was still good enough). Then, doing research, I saw that ATI was bringing da heat.
     
  24. maleficarus

    maleficarus Gawd

    Messages:
    744
    Joined:
    Nov 17, 2004
    What I think the thread starter forgets is that NVIDIA has been doing this for years with ID. Ever wonder why Quake3 always ran better on NVIDIA cards? Or how about Far Cry? I mean dammit, there is a cg_compiler folder in the friggen game for crying out loud LOL. For all who don't know, Cg was NVIDIA's compiler code. What does that say? That would be like having a GLIDE folder in Unreal :D

    So please spare me this "let's hate Valve" crap, cause NVIDIA is more guilty than anyone for this. And what I love the most is the fact that [H] makes a big deal about it, posting a direct link to this BS thread on their start page (right under their "let's all hate Valve for what they did to CS" section).
     
  25. dderidex

    dderidex [H]ardness Supreme

    Messages:
    6,313
    Joined:
    Oct 31, 2001
    If you actually read the thread instead of spouting off uninformed, you'd see that this has already been addressed several times.
     
  26. Hooligan

    Hooligan Gawd

    Messages:
    834
    Joined:
    Aug 7, 2001
    Matrox Parhelia 4 life
     
  27. -=bladerunner=-

    -=bladerunner=- n00b

    Messages:
    23
    Joined:
    Nov 17, 2004
    Except that the 'ARB2' path was optimized for nV cards. Humus's patch showed that pretty well.

    yes, but in Far Cry the FX (NV3x) doesn't run the same path as the R3xx

    they suck in DX9, that's for sure
     
  28. PSYKOMANTIS

    PSYKOMANTIS [H]ard|Gawd

    Messages:
    1,128
    Joined:
    Sep 20, 2002
  29. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    There's a small ATI logo on the back of my HL2 box, between the Vivendi logo and closed captioned logo.
     
  30. -=bladerunner=-

    -=bladerunner=- n00b

    Messages:
    23
    Joined:
    Nov 17, 2004
    That's not true. In CoD, ATI cards run at least as fast as nV cards, if not faster.
     
  31. maleficarus

    maleficarus Gawd

    Messages:
    744
    Joined:
    Nov 17, 2004
    Uninformed?

    No, what I see is someone picking at a game company (thread titled: Valve sucks!) and then going on some tirade, kinda like what [H] does all the time LOL
     
  32. espent

    espent n00b

    Messages:
    5
    Joined:
    Aug 23, 2004
    Maybe Valve decided that the best balance between quality and performance for FX cards was DX8.1. Complaining about DX9 performance is useless when the game doesn't officially support DX9 on these cards.

    Or maybe Valve found an issue when running DX9 on some of these FX cards and decided to go the DX8.1 route so your average gamer wouldn't run into these problems.
     
  33. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    The Humus patch helped very little after the 4.9 beta(?) showed the problem was mostly OpenGL/in game AF related. IOW, it was a driver problem.
     
  34. kakarotxiv

    kakarotxiv Gawd

    Messages:
    651
    Joined:
    Jun 21, 2004
    lol doom3 was coded for nvidia but with hl2 being a massively superior game i understand why nvidia users are complaining
     
  35. maleficarus

    maleficarus Gawd

    Messages:
    744
    Joined:
    Nov 17, 2004
    Fact of the matter is, the actual quality differences between DX8.1 and DX9.0 would be like looking for a needle in a haystack. There just isn't anything different other than some water reflections that you wouldn't really see, considering the only time in HL2 you see water is when you've got the hammer down doing like 600 knots in a small boat jumping over everything you can think of LOL
     
  36. Wixard

    Wixard 2[H]4U

    Messages:
    2,119
    Joined:
    Sep 14, 2002
    Yep, fixed in a driver.

    Too bad by the time ATi got its act together the dust had settled and reviews and benchmarks were more or less dying down.
     
  37. CGFMaster

    CGFMaster Limp Gawd

    Messages:
    165
    Joined:
    Jul 27, 2004
    Wow! This article is amazing!

    So far we have:

    9,592 views

    156 posts
    ~and the most astounding~
    0 benchmark results
     
  38. Gavinni

    Gavinni Limp Gawd

    Messages:
    403
    Joined:
    Sep 29, 2004
    No... the 4.9+ had the lookup table problem fixed in the driver, so the driver tells the game to apply the humus tweak, basically...
     
  39. pxc

    pxc [H]ard as it Gets

    Messages:
    33,064
    Joined:
    Oct 22, 2000
    No. Read the B3D thread.

    Beta 8.07 had the hacks (plural) added.
     
  40. Chris_B

    Chris_B [H]ardness Supreme

    Messages:
    4,997
    Joined:
    May 29, 2001
    Personally I think if games companies are optimising for one gfx card or another it's total bullshit, and it's something that should be stopped before it goes too far.

    We pay $300-500 (£400 for me) or more for these cards and we expect to get our money's worth. I don't agree with this "the way it's meant to be played" horseshit and I don't agree with Valve taking 6 million from ATI and "recommending" that one card is more suited for a game.
     