Valve sucks

^eMpTy^ said:
This isn't a thread about Doom3 performance...

Bundling a game is one thing, giving away free vouchers with all ATi cards, making ATI specific levels for the game, paying $6 million, coming out with early benchmarks to support one company when the game was nowhere near completion, and then having ATi stamped all over the box and on the cd....now that's a completely different thing...

Pull out your doom3 box, you won't see a single nvidia logo anywhere on it.
There's nothing wrong with nvidia letting [H] do some benchmarking though right?
am I rite?
 
ajm786 said:
How would this help out any of the NV4X based video cards? We don't want to run HL2 in FP16; we want to run it in FP32.

It did not help them at all. Users on 6800s over in the NV forum reported a 2% gain.
 
Moloch said:
There's nothing wrong with nvidia letting [H] do some benchmarking though right?
am I rite?


fastZreject will be enabled on ATI cards in a future patch, yet Brent chose not to benchmark that. Whatever they do, be consistent. Don't benchmark one and not the other...
 
^eMpTy^ said:
and then having ATi stamped all over the box and on the cd

i just looked at my box and don't see a single ati logo on there; there is a small black/white ati logo on the dvd though
 
ID did NOT program Doom3 for better NVidia performance--in fact, if you remember back in the days of their first Doom3 display they were using r300 silicon and touting IT as the better performer for Doom3. This thread is NOT about Doom3 performance, nor is it about "well ID did it too!"

As far as I can tell this is just Valve trying to compensate for sitting around an entire year doing nothing but masturbating and watching football on the 6 million dollars that they got from the auction. Then suddenly in January they woke up and remembered, "oh crap, we have to write that game, don't we?" Allowing partial precision use for even just one card would require the coding they weren't prepared to do because they were so behind--using 3DAnalyze is NOT the same thing as making your game run _pp hints natively.

(Another interesting thing to remember is that this is business. Valve held an AUCTION for the HL2 boxing rights; ATi forked over more cash than NVidia, thus they got the rights to put HL2 in their boxes. ATi didn't just hand Valve money and say "make us a game." This sort of thing seems to be said a lot and it's false.)

Bring on the benchmarks.

*edit*
To fit in with this thread better, I must provide the complimentary flame...
Fl4mz0rezzz NVIDIOTS SUX0rz FANATics Sux0rzrzz!
 
ajm786 said:
It was my understanding that FP32 was slightly more detailed than FP16; then again, you said if it can be proven. If this is the case, then I stand corrected.

BTW, what mode does the ATI X800 series run with by default? FP24? Or do they also do FP32?
yeah, fp32 is more detailed than fp16, but only in situations that cause significant rounding errors with fp16. it seems that the codepath hl2 uses is not complicated enough to create these kinds of errors.
also, ati doesn't have fp32 or fp16 right now. all the dx9 based cards they have can only do fp24. nvidia just has fp32/fp16
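(to put a number on the rounding-error point, here's a quick numpy sketch, nothing to do with HL2's actual shader code, just showing how much earlier fp16 rounds off than fp32, and why a single fp16 rounding is still finer than one 8-bit colour step:)

```python
# rough illustration only: numpy's float16/float32 stand in for the FP16/FP32
# shader formats here (FP24 sits in between and has no numpy equivalent)
import numpy as np

x = np.float32(1.0) / np.float32(3.0)   # a value with a long binary fraction
print(np.float32(x))                    # fp32 keeps roughly 7 decimal digits
print(np.float16(x))                    # fp16 keeps roughly 3 decimal digits

# one fp16 rounding introduces a relative error of at most 2**-11 (~0.05%),
# which is smaller than one step of an 8-bit colour channel (1/255 ~ 0.4%)
err = abs(float(np.float16(x)) - float(x)) / float(x)
print(err, err < 1.0 / 255.0)
```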


Jbirney said:
Not really, those owners can run it in the default path (DX8) and still get better FPS. Your average Joe is not going to know how to force rendering modes. He is going to run it out of the box (like HardOCP tested), which means it defaults to DX8 for all but 2.5% of the FX cards out there. 2.5% is a drop in the virtual bucket. There are about 1100 shaders used in HL2. Verifying they all look good in mixed mode will add time to development. Most of us had already given up on them....

Also consider that ATI's HDR and 3Dc are STILL NOT USED in HL2. If they are late getting in these features that "ATI paid" for, then why would they not be late with other stuff?
i think you misunderstood what i said. think it over again while keeping in mind that it would be valve putting in a slight bit of code that forces fp16 for some of the fp24 code paths that don't cause the rounding errors i spoke of before (as someone else pointed out before, most likely a very significant majority of the shaders can be run at fp16)
this seriously wouldn't take too long to code, and it would have increased their install base among the people with an nv3x based card.
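(for what it's worth, here's roughly the kind of automated check that demotion could be based on. this is just a python sketch under my own assumptions, not anything valve actually ships: run a shader-like calculation at full and at half precision over a sweep of inputs, and only flag it as safe to demote if the outputs never differ by more than one 8-bit colour step.)

```python
# sketch only: "specular_term" is a made-up stand-in for a real pixel shader,
# and the 1/255 threshold assumes an 8-bit-per-channel framebuffer
import numpy as np

def specular_term(n_dot_h, power):
    # simple Blinn-style highlight, standing in for one of HL2's ~1100 shaders
    return np.clip(n_dot_h, 0.0, 1.0) ** power

def max_fp16_error(shader, samples):
    # evaluate at full and at half precision over the same inputs and compare
    full = shader(*[s.astype(np.float32) for s in samples])
    half = shader(*[s.astype(np.float16) for s in samples]).astype(np.float32)
    return float(np.max(np.abs(full - half)))

n_dot_h = np.linspace(0.0, 1.0, 1024)
power = np.full_like(n_dot_h, 8.0)
err = max_fp16_error(specular_term, (n_dot_h, power))
print(err, "safe to demote to fp16" if err <= 1.0 / 255.0 else "keep full precision")
```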



Jbirney said:
It did not help them at all. Users on 6800s over in the NV forum reported a 2% gain.
can you point me to where you speak of? i just took a look through [H]'s nvidia forum, and didn't see a single thread about this. and changing from fp32 to fp16 should net them a much larger performance gain than 2%
 
(cf)Eclipse said:
if valve increased performance with nv3x based cards, i'll be willing to bet that that 2.5% would jump up significantly because more people will want to try it out, knowing that it will run somewhat decently.
kind of a catch-22 there huh? ;)

The default path for these boards is now DX8.1, not DX9, so the performance for them is already there since the DX8.1 path is faster.

dderidex said:
Except, for all intents and purposes, this would require NO extra coding. Using 3dAnalyze, we can just do all the calculations meant to run in FP24 in FP16 instead, and there is no visual quality difference.

If you want to hack it, then the 5 minutes of analysis that's probably gone on here will probably tell you there are no discernible IQ differences. If you want to produce a large game, verify that it operates on tens of different types of graphics chips and thousands of different configurations, and that each of the thousands of shaders under each mode (DX8, DX8.1, DX9 full precision, DX9 partial precision) produces the correct output in each of the different scenarios you are using them in, then the validation is going to take some time. For a game that's already late, is it worth it for 2.5% of the install base when they will get perfectly good performance and pretty good quality under the DX8.1 mode anyway?
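(Back-of-the-envelope only, using the rough figures from the post above plus an assumed scenarios-per-shader count, just to show why adding one more precision mode isn't free:)

```python
# back-of-the-envelope only: the shader count and the four render paths come
# from the post above, the scenarios-per-shader figure is a pure assumption
shaders = 1000
paths = ["DX8", "DX8.1", "DX9 full precision", "DX9 partial precision"]
scenarios_per_shader = 5

checks_without_pp = shaders * (len(paths) - 1) * scenarios_per_shader
checks_with_pp = shaders * len(paths) * scenarios_per_shader
print(checks_with_pp - checks_without_pp, "extra shader/scenario checks to validate")
```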
 
dderidex said:
AGAIN: VALVE "PICKED AND CHOSE" WHICH PART OF THE DX9 SPEC TO FOLLOW TO HURT THE FX CARDS THE MOST.

Wow! So you work for Valve then? I mean you would have to work for them to KNOW they were intentionally trying to HURT FX cards wouldn't you?

Here's the thing... Most FX cards aren't that great. The FX5950 MAY be halfway decent, but when coding for FX cards Valve had to code for the LOWEST COMMON DENOMINATOR, which is the FX5200.

So the only valid test of whether Valve was trying to hurt FX cards, rather than simply choosing the code path that would allow ALL FX cards to play the game effectively, would be to run the game at default settings on an "average" PC (2.0GHz Intel or AMD 2000+, 512MB of RAM, etc.) with an FX5200 card, then do the same with the original poster's various "tweaks" all turned on, and compare the two. Keep in mind at which settings and resolutions the game is "playable" (runs at 40fps average with minimums not lower than 20fps, not counting the ~0fps spikes that sometimes happen in this game).

If that can't be done then at least compare it on an average PC with an FX5900 at 1024x768 and then cut the resulting FPS in half... which is about the typical speed difference at that resolution for a 5200 versus a 5900.

By accusing Valve of intentionally gimping the FX series, while only comparing the fast FX cards, and not the SLOWEST FX CARD, you are doing Valve a serious disservice as well as invalidating any comments you may care to make.


(In other words, if DX9.0 with Partial Precision at 1024x768 doesn't run well on an average PC and an FX5200 then Valve made the right choice, and if it does then they MAY have intentionally screwed FX owners... MAYBE.)
 
I'm trying this on my 6800GT for shits and giggles because for some reason my system refuses to perform as it should, regardless of the newly formatted hdd.
 
DaveBaumann said:
If you want to hack it, then the 5 minutes of analysis that's probably gone on here will probably tell you there are no discernible IQ differences. If you want to produce a large game, verify that it operates on tens of different types of graphics chips and thousands of different configurations, and that each of the thousands of shaders under each mode (DX8, DX8.1, DX9 full precision, DX9 partial precision) produces the correct output in each of the different scenarios you are using them in, then the validation is going to take some time. For a game that's already late, is it worth it for 2.5% of the install base when they will get perfectly good performance and pretty good quality under the DX8.1 mode anyway?

I think the important thing to remember here is that IF what dderidex has suggested doing is enough to get the FX series of cards to work properly in DX9, and FP16 IS the lowest pixel precision allowed in DX9, why did Valve default the FX series to the DX8.1 codepath when it could do DX9 at almost the same FPS? That doesn't make sense to me. More like a marketing trick to make more people buy ATI.

I guess it is all about business, after all. :rolleyes:
 
arentol said:
(In other words, if DX9.0 with Partial Precision at 1024x768 doesn't run well on an average PC and an FX5200 then Valve made the right choice, and if it does then they MAY have intentionally screwed FX owners... MAYBE.)
Except they are *already* treating the FX 5200 differently from the FX 59xx - i.e., they aren't treating them all the same, anyway.

The FX 5800 and 5900 run the DX8.1 path, and the FX 5200/5600/5700 run the DX8 path.

And the FX 5200 doesn't even do THAT well (i.e., it's not playable at 1024x768), so holding the whole line back for that one card is a little absurd.
 
If people took time to analyze all the games out there I'm sure they would find games that give nVidia or ATI an advantage. At least 50% of the games I play show me an nVidia logo before I get into the game. If I were paranoid I could say that these games have been programmed to give nVidia an advantage over ATI.

People should stop crying and flaming Valve. They may have a reason for this, or it could be some simple mistake.
 
dderidex said:
So they went with FP24, big deal, DX9 spec ALSO calls for using partial precision hints when possible, and they DIDN'T do that.

AGAIN: VALVE "PICKED AND CHOSE" WHICH PART OF THE DX9 SPEC TO FOLLOW TO HURT THE FX CARDS THE MOST.

why would they put pp hints in their dx9 path? no card would benefit from it. You are supposed to be running dx8 on FX cards.

Show me some benchmarks where you get a better gaming experience using dx9 with pp over running the intended dx8 path
 
"(Course, it's *entirely* possible Half-Life 2 really doesn't EVER need more than FP16, and forcing it to use that 100% of the time will come up with no artifacts at all. In which case....shame on Valve, seriously!)"

And of course, like any good lawyer would tell you, assuming gets you nowhere and doesn't prove your point :p

Basically what you are doing is assuming that for 100% of this game there would be no artifacts or any issues associated with the above. Well, you can't prove that no matter how much you would like to. So having said that, you are basically accusing Valve of coding for a certain company with no proof, just accusations and assumptions. Shame on you, not Valve!
 
Spank said:
why would they put pp hints in their dx9 path? no card would benefit from it. You are supposed to be running dx8 on FX cards.

Show me some benchmarks where you get a better gaming experience using dx9 with pp over running the intended dx8 path
It's not for performance reasons, it's for quality.

If running the DX9 path with partial precision hints even gets you the same (or very close) performance as the DX8 path....that would be a better option then, wouldn't it? You'd be getting more realistic reflections, more realistic lighting, etc.
 
ajm786 said:
I think the important thing to remember here is that IF what dderidex has suggested doing is enough to get the FX series of cards to work properly in DX9, and FP16 IS the lowest pixel precision allowed in DX9, why did Valve default the FX series to the DX8.1 codepath when it could do DX9 at almost the same FPS? That doesn't make sense to me.

Somebody hacking it and thinking there is no discernible difference is different from a developer validating that the output that is generated is the output that the artists expect to see. By altering the precision it operates at you are potentially altering the results that are intended to be produced; this already has to be tested for full precision (probably both full precisions), and adding another precision means an extra layer of tests through all the usage scenarios of the shaders within the game.

The fuss that occurred around Doom3's performance arose because JC decided to remove higher precision vector normalisation by math (something that ATI cards like) and made the lower quality vector normalisation via cubemaps (something that was originally used in the NV30/20 paths) the default for everything. Why? Not because he was biasing away from ATI, but because he wanted to make the output as consistent as possible across each of the available boards so that the artists wouldn't be wondering if the output was what they expected under each different path. It turns out that there were few output differences, but to get the game out this is the choice he made; the same logic applies in Valve's case: there are fewer usage scenarios to test for.
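(For anyone wondering what "normalisation via cubemap" versus "by math" actually means in practice, here's a small Python sketch. The cubemap is faked by snapping the direction to a coarse grid, which is my own simplification rather than id's actual implementation; the point is only that the lookup quantises the direction while the math version does not.)

```python
# sketch only: a real normalisation cubemap stores pre-normalised vectors per
# texel; snapping the direction to a coarse face grid approximates that effect
import numpy as np

def normalize_math(v):
    # "higher precision vector normalisation by math": exact up to fp error
    return v / np.linalg.norm(v)

def normalize_cubemap(v, face_size=32):
    # fake a face_size x face_size normalisation cubemap lookup
    major = np.argmax(np.abs(v))
    face = v / np.abs(v[major])                                # project onto the cube face
    face = np.round(face * (face_size / 2)) / (face_size / 2)  # snap to the texel grid
    face[major] = np.sign(v[major])                            # keep the major axis exact
    return face / np.linalg.norm(face)

v = np.array([0.3, 0.7, 0.2])
print(normalize_math(v))       # reference direction
print(normalize_cubemap(v))    # visibly coarser with a small cubemap
```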
 
I agree with espent, nearly every game I have played has the nvidia logo, but until hl2 I had never seen an ati logo....

But if valve did this on purpose then it's really going to get ugly.
 
^eMpTy^ said:
This isn't a thread about Doom3 performance...

Bundling a game is one thing, giving away free vouchers with all ATi cards, making ATI specific levels for the game, paying $6 million, coming out with early benchmarks to support one company when the game was nowhere near completion, and then having ATi stamped all over the box and on the cd....now that's a completely different thing...

Pull out your doom3 box, you won't see a single nvidia logo anywhere on it.


You also won't see one X800 series card recommended on the box either, yet you do see the 6800 series. Bet you didn't notice that :D

the early benchmarks weren't valve's or ati's fault. That was a stolen copy.

A free voucher is pretty much the same thing as bundling the game with it. It just wasn't out yet. It's much like the 5XXX series cards saying "recommended for doom3" before the game was out.
 
(cf)Eclipse said:
i think you misunderstood what i said. think it over again while keeping in mind that it would be valve putting in a slight bit of code that forces fp16 for some of the fp24 code paths that don't cause the rounding errors i spoke of before (as someone else pointed out before, most likely a very significant majority of the shaders can be run at fp16)
this seriously wouldn't take too long to code, and it would have increased their install base among the people with an nv3x based card.

Again, maybe if it was a 3rd rate game. But this is the GOTY, that has 1000+ shaders. And you want to make sure all of those switches on the compiler produce good mixed mode code. That's not a trivial thing to do....and yes it would add time. And it wouldn't increase the install base at all. Those with FX cards either already know they are weak at DX9 or don't know the difference as they are not PC savvy. Most of the Hard Core H's have since upgraded...



can you point me to where you speak of? i just took a look through [H]'s nvidia forum, and didn't see a single thread about this. and changing from fp32 to fp16 should net them a much larger performance gain than 2%

Again this tweak only seems to help FX users and not 6800 users. Here is one user on a 6800 that gained a whopping 2 fps on his 6800:
http://www.nvnews.net/vbulletin/showthread.php?t=41625&page=2&pp=15
 
prince of the damned said:
I agree with espent, nearly every game I have played has the nvidia logo, but until hl2 I had never seen an ati logo....

But if valve did this on purpose then it's really going to get ugly.

Good point! Seeing an ATi logo on a game is as rare as seeing an AMD TV ad.
 
DaveBaumann said:
Somebody hacking it and thinking there is no discernible difference is different from a developer validating that the output that is generated is the output that the artists expect to see. By altering the precision it operates at you are potentially altering the results that are intended to be produced; this already has to be tested for full precision (probably both full precisions), and adding another precision means an extra layer of tests through all the usage scenarios of the shaders within the game.

The fuss that occurred around Doom3's performance arose because JC decided to remove higher precision vector normalisation by math (something that ATI cards like) and made the lower quality vector normalisation via cubemaps (something that was originally used in the NV30/20 paths) the default for everything. Why? Not because he was biasing away from ATI, but because he wanted to make the output as consistent as possible across each of the available boards so that the artists wouldn't be wondering if the output was what they expected under each different path. It turns out that there were few output differences, but to get the game out this is the choice he made; the same logic applies in Valve's case: there are fewer usage scenarios to test for.

Although I do see the logic in this, I still can't fathom why letting the FX series do partial precision in DX9 would be any different from just defaulting them to DX8.1. It still has to do with marketing (ATI over nVidia).
 
prince of the damned said:
I agree with espent, nearly every game I have played has the nvidia logo, but until hl2 I had never seen an ati logo....

But if valve did this on purpose then it's really going to get ugly.

I almost bought a video card thinking nvidia was the only way to play (we are talking about a gamer that pretty much abandoned PC gaming during the voodoo/geforce 2 days, and that was still good enough). Then doing research I saw that ATI was bringing da heat.
 
What I think the thread starter for this thread forgets is that NVIDIA has been doing this for years with ID. Ever wonder why Quake3 always ran better on NVIDIA cards? Or how about Far Cry? I mean dammit, there is a cg_complier folder in the friggen game for crying out loud LOL. For anyone that doesn't know, cg was NVIDIA's compiler code. What does that say? That would be like having a GLIDE folder in Unreal :D

So please spare me this "let's hate Valve" crap cause NVIDIA is more guilty than anyone for this. And what I love the most is the fact that [H] makes a big deal about it, posting a direct link to this BS thread on their start page (right under their "let's all hate Valve for what they did to CS" section).
 
maleficarus said:
What I think the thread starter for this thread forgets is that NVIDIA has been doing this for years with ID. Ever wonder why Quake3 always ran better on NVIDIA cards?

If you actually read the thread instead of spouting off uninformed, you'd see that this has already been addressed several times.
 
dderidex said:
That's exactly the question that needs to be asked.

On Carmack needing to code them their own path in Doom3 for performance? Well, wait, he didn't have to after all, huh? The FX cards run the same ARB2 path as ATI cards do...and run it just as fast, too.

Except that the 'ARB2' path was optimized for nV cards. Humus's patch showed that pretty well.

dderidex said:
On Far Cry? Granted, the FX cards ARE slower than their ATI peers...but still offer competitive framerates.

yes, but in Far Cry the NV3x FX cards don't run the same path as the R3xx

dderidex said:
The point of all this is that the FX cards do NOT suck in Direct3d if you code a game properly to DX9 spec. And, obviously, we know they don't suck in OpenGL.

they suck in DX9, that's for sure
 
Spank said:
i just looked at my box and don't see a single ati logo on there; there is a small black/white ati logo on the dvd though
There's a small ATI logo on the back of my HL2 box, between the Vivendi logo and closed captioned logo.
 
dderidex said:
Doom3 just followed the OpenGL spec, they didn't do anything WITH it to intentionally harm ATI cards....it's just that ATI cards suck in OpenGL. (Seriously - in ALL OpenGL, they are slower in Quake 3, Call of Duty, etc - anything OpenGL, ATI is slower in than nVidia)
That's not true. In CoD, ATI cards run at least as fast as nV cards, if not faster.
 
dderidex said:
If you actually read the thread instead of spouting off uninformed, you'd see that this has already been addressed several times.

Uninformed?

No, what I see is someone picking at a game company (thread titled: Valve sucks!) then going on some tirade, kinda like what [H] does all the time LOL
 
Maybe Valve decided that the best balance between quality and performance for FX cards was DX8.1. Complaining about DX9 performance is useless when the game doesn't officially support DX9 on these cards.

Or maybe Valve found an issue when running DX9 on some of these FX cards and decided to go the DX8.1 route so your average gamer wouldn't run into these problems.
 
-=bladerunner=- said:
Except that the 'ARB2' path was optimized for nV cards. Humus's patch showed that pretty well.
The Humus patch helped very little after the 4.9 beta(?) showed the problem was mostly OpenGL/in game AF related. IOW, it was a driver problem.
 
lol doom3 was coded for nvidia but with hl2 being a massively superior game i understand why nvidia users are complaining
 
espent said:
Maybe Valve decided that the best balance between quality and performance for FX cards was DX8.1. Complaining about DX9 performance is useless when the game doesn't officially support DX9 on these cards.

Or maybe Valve found an issue when running DX9 on some of these FX cards and decided to go the DX8.1 route so your average gamer wouldn't run into these problems.

Fact of the matter is, the actual quality differences between DX8.1 and DX9.0 would be like looking for a needle in a haystack. There just isn't anything different other than some water reflections that you wouldn't really see, considering the only time in HL2 you see water is when you've got the hammer down doing like 600 knots in a small boat jumping over everything you can think of LOL
 
pxc said:
The Humus patch helped very little after the 4.9 beta(?) showed the problem was mostly OpenGL/in game AF related. IOW, it was a driver problem.

Yep, fixed in a driver.

Too bad by the time ATi got its act together the dust had settled and reviews and benchmarks were more or less dying down.
 
Wow! This article is amazing!

So far we have:

9,592 views

156 posts
~and the most astounding~
0 benchmark results
 
pxc said:
The Humus patch helped very little after the 4.9 beta(?) showed the problem was mostly OpenGL/in game AF related. IOW, it was a driver problem.
No... the 4.9+ had the lookup table problem fixed in the driver, so the driver tells the game to apply the humus tweak basically...
 
Gavinni said:
No... the 4.9+ had the lookup table problem fixed in the driver, so the driver tells the game to apply the humus tweak basically...
No. Read the B3D thread.

Beta 8.07 had the hacks (plural) added.
 
Personally I think if game companies are optimising for one gfx card or another it's total bullshit, and it's something that should be stopped before it goes too far.

We pay $300-500 (£400 for me) or more for these cards and we expect to get our money's worth. I don't agree with this "the way it's meant to be played" horseshit and I don't agree with valve taking 6 million from ati and "recommending" that one card is more suited for a game.
 