Valve sucks

^eMpTy^ said:
I can't believe you even went there...but ok Chris...ask and ye shall receive:

ATI:

"SLI is a horrible idea"



*LAUGHING* They NEVER said that!
the only thing they expressed was a concern...AND JUST A CONCERN...and it was ONE PERSON from ATI who said it about two different cards from different companies at different speeds
 
Chris_B said:
Can't recall ATI putting shader replacement into the drivers; there's a HACK (I'll save you the time of capitalising it like a 2-year-old) but it's not officially from ATI.
It's called Catalyst A.I....
Chris_B said:
They can all be disabled in ATI's drivers; as far as I'm aware only specific optimisations can be disabled in ForceWare.
I call bullshit on that one, since when other people disabled the optimizations via a registry hack, the performance loss was much more significant.
Chris_B said:
If they sound like a hairdryer they are; if not, I don't have a problem.
Nobody asked what you thought. ATi said they were evil, now they need them, so they are magically cool...that's called a "lie" Chris...
Chris_B said:
How many games are using PS 3.0? This "must have omgomg get it now" feature sure has had a lot of uses thus far.
Way to miss the point and buy into another ATi lie...you can't have PS 3.0 games before PS 3.0 hardware...if you can do everything PS 3.0 can do with PS 2.0b, then why is the R520 PS 3.0 compliant? huh Chris?
Chris_B said:
Jury's still out on SLI.
:)
Again, you miss the point...I don't know why I even waste my time arguing with a person so poorly informed and absent of insight...ATi said SLI was useless...then a month later they announced AMR...again...that's called a "lie" Chris...
 
^eMpTy^ said:
Hellllllllllooooooooo???

Nvidia marketing nvidia cards makes sense...VALVE marketing ati cards does not make sense...

Valve should be impartial...putting "ATi" all over the box and even the cds and having an ati coupon in the box is just over the line...


Well then yell at Crytek for slapping Nvidia logos all over their CDs and game and website.
 
Netrat33 said:
*LAUGHING* They NEVER said that!
the only thing they expressed was a concern...AND JUST A CONCERN...and it was ONE PERSON from ATI who said it about two different cards from different companies at different speeds

now we're both laughing...wow you're uninformed...
 
Just an FYI to everyone bragging about HL2 physics - Valve didn't write most of the physics code in HL2. It was licensed from a company called Havok.

The unofficial benchmarks over at Guru of 3D seem to show about a 45% gain from using FP16. I'll see if I can do some benchmarks tonight.

I have an FX 5900XT, I own Half-Life 2, and I also have a Mobility Radeon 9700.

We need the following benchmarks:

- FX5900, DX 8.1
- FX5900, DX9, Identify card as FX5900, 32-bit
- FX5900, DX9, Identify card as R9800, 32-bit
- FX5900, DX9, Identify card as R9800, 16-bit
- Radeon 9800, DX9, Identify card as R9800, 24-bit

Everyone, please use the following settings:

- All in-game quality settings on maximum (excluding AA and AF)
- 1024x768x32, No AA, No AF
- Stock clocks for the Radeon 9800 pro (380/680) and GeForce FX 5900XT (400/700) so that we can compare data
 
^eMpTy^ said:
It's called Catalyst A.I....

I call bullshit on that one, since when other people disabled the optimizations via a registry hack, the performance loss was much more significant.

Nobody asked what you thought. ATi said they were evil, now they need them, so they are magically cool...that's called a "lie" Chris...

Way to miss the point and buy into another ATi lie...you can't have PS 3.0 games before PS 3.0 hardware...if you can do everything PS 3.0 can do with PS 2.0b, then why is the R520 PS 3.0 compliant? huh Chris?

Again, you miss the point...I don't know why I even waste my time arguing with a person so poorly informed and absent of insight...ATi said SLI was useless...then a month later they announced AMR...again...that's called a "lie" Chris...




ATI didn't say SLI was useless; they said that as a tech demonstration it was interesting but crazy to implement. It's amazing you would automatically assume that because ATI dismisses it, they wouldn't go on to develop their own version of it. Companies put their own spin on things so they go in their favour; if it's a feature the opposition has, do you think ATI is going to come out and say "yeah, go buy that, it's obviously better than what we have"? If any company took that attitude they wouldn't last long. It's called "business", kid, get that drilled into your head.

PS 3.0 hardware = zzzz; it does nothing that 2.0b can't do, it's just another checkbox feature that probably won't be used properly until 1-2 years down the line.

As for the new cooler, last I checked that pic was from the Inq, not exactly reliable, plus it looked photoshopped; until I see it from ATI I'm not really believing it.
 
Netrat33 said:
Well then yell at Crytek for slapping Nvidia logos all over their CDs and game and website.

*looks at farcry cds*

I don't see any nvidia logo? and I don't see any nvidia coupons either...care to pull anything else out your ass?
 
^eMpTy^ said:
now we're both laughing...wow you're uninformed...


Dude, I read the article straight from HardOCP's front page when it was there months ago. It was only a concern...FIND ME...SHOW ME THEN...WHERE ATI states "SLI sucks, horrible idea". If you actually SHOW me where they say that...I'll apologize...but I bet you won't find it ANYWHERE.
 
^eMpTy^ said:
*looks at farcry cds*

I don't see any nvidia logo? and I don't see any nvidia coupons either...care to pull anything else out your ass?


[image: fc_nvidia2.gif]



What?

http://www.farcry.ubi.com/
 
Chris_B said:
ATI didn't say SLI was useless; they said that as a tech demonstration it was interesting but crazy to implement. It's amazing you would automatically assume that because ATI dismisses it, they wouldn't go on to develop their own version of it. Companies put their own spin on things so they go in their favour; if it's a feature the opposition has, do you think ATI is going to come out and say "yeah, go buy that, it's obviously better than what we have"? If any company took that attitude they wouldn't last long. It's called "business", kid, get that drilled into your head.

PS 3.0 hardware = zzzz; it does nothing that 2.0b can't do, it's just another checkbox feature that probably won't be used properly until 1-2 years down the line.

As for the new cooler, last I checked that pic was from the Inq, not exactly reliable, plus it looked photoshopped; until I see it from ATI I'm not really believing it.

finally a sign of life...that's the point Chris...it's all marketing...and lately ATi has been in full-on denial mode...denying that everything nvidia has is useful...and what does that tell you? nvidia is in the technological driver's seat this year...and ATi is riding bitch...
 
Netrat33 said:
*snickering* You don't have to be suck a d**K about it ;)

My bad! But it is on the box, and INSIDE THE BOX there were nvidia advertisements.

lol...nice typo...honestly...that's hysterical...

and yeah nvidia does the TWIMTBP nonsense...and yes the intro screen is obnoxious...but I think it's safe to say that Valve has gone out of their way to sell ATi cards...and that alone is enough to make me suspicious when I see these forced FP16 screenshots...

that's all this thread boils down to...the whole situation is "suspicious"...
 
^eMpTy^ said:
lol...nice typo...honestly...that's hysterical...

and yeah nvidia does the TWIMTBP nonsense...and yes the intro screen is obnoxious...but I think it's safe to say that Valve has gone out of their way to sell ATi cards...and that alone is enough to make me suspicious when I see these forced FP16 screenshots...

that's all this thread boils down to...the whole situation is "suspicious"...


DAMN THE TYPO!! *shaking fist*

also nvidia is in the manual. I just know I saw nvidia all over farcry and really thought it was on the cd...oh well.

I think it's the same with what Valve did in advertising their "space", and ATI won that.
Like I posted earlier, I almost thought the only cards out there to buy were nvidia cards and that they had the best performance. I was coming from a Voodoo5 and a GF2 card and didn't really do any research. When I did, I was kind of shocked. That form of advertising works in selling your cards whether it's true or not. I can't blame either company for trying.

If Valve TRULY and maliciously sabotaged the FX line...yeah, that's pretty crappy. But I don't think this is the case. I'm still holding out. It doesn't make sense for ANY video game COMPANY to make their game suck on one type of card. They are out to make money. And how do you get more money? By making it available to as many people as possible. It doesn't make sense to me.
 
Netrat33 said:
Dude, I read the article straight from HardOCP's front page when it was there months ago. It was only a concern...FIND ME...SHOW ME THEN...WHERE ATI states "SLI sucks, horrible idea". If you actually SHOW me where they say that...I'll apologize...but I bet you won't find it ANYWHERE.

dude...I'm not imagining things...promise...I'll see if I can't find you the link...they basically said it didn't make any sense to market SLI...and they expressed all kinds of unfounded "concerns" about it...how nice of them to be "concerned"...
 
If, as the thread starter says, there is no image quality difference between FP16 and FP32, then why even have anything above 16? Although the post above me shows that there is an image quality difference.
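
For reference, the tiers differ mainly in mantissa bits: FP16 keeps roughly 10, FP24 roughly 16 and FP32 roughly 23, so the loss only shows up where a shader needs fine steps across a large range (water/glass reflections, long chains of dependent math). Here's a rough C++ sketch of the rounding error, purely illustrative; the quantize helper is hypothetical and only approximates the real formats (it ignores range limits, denormals and exact rounding modes):

[code]
#include <cmath>
#include <cstdio>

// Round a value to roughly 'mantissaBits' bits of mantissa, to approximate
// the precision of FP16 (~10 bits), FP24 (~16 bits) and FP32 (~23 bits).
// Illustrative only: ignores exponent range, denormals and exact rounding.
double quantize(double v, int mantissaBits) {
    int exp = 0;
    double m = std::frexp(v, &exp);              // v = m * 2^exp, 0.5 <= |m| < 1
    double scale = std::ldexp(1.0, mantissaBits);
    return std::ldexp(std::round(m * scale) / scale, exp);
}

int main() {
    // e.g. a texture coordinate scaled up for a large surface
    double v = 512.37;
    std::printf("exact: %.5f\n", v);
    std::printf("~FP16: %.5f\n", quantize(v, 10));   // visible error
    std::printf("~FP24: %.5f\n", quantize(v, 16));   // error is tiny
    std::printf("~FP32: %.5f\n", quantize(v, 23));   // effectively exact here
    return 0;
}
[/code]

Which would line up with most colour math being fine at FP16 while a handful of shaders (the glass/water cases above) genuinely want more.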
 
Netrat33 said:
If Valve TRULY and maliciously sabotaged the FX line...yeah, that's pretty crappy. But I don't think this is the case. I'm still holding out. It doesn't make sense for ANY video game COMPANY to make their game suck on one type of card. They are out to make money. And how do you get more money? By making it available to as many people as possible. It doesn't make sense to me.

I think it was more of a mismanagement issue on the part of Valve...they were short on time and beholden to ATi to deliver performance on their cards...so I think they cheesed out on the FX support, and ATi seized the opportunity to show that their performance was vastly superior...clearly a pp patch would do some good...but they don't seem to have time...too busy programming more obnoxious activation techniques into Steam, no doubt...
 
^eMpTy^ said:
dude...I'm not imagining things...promise...I'll see if I can't find you the link...they basically said it didn't make any sense to market SLI...and they expressed all kinds of unfounded "concerns" about it...how nice of them to be "concerned"...

I think it's been a while since BOTH of us read the article. I see the concerns as valid.
And like I said, it was one person from ATI.
 
HOLY CRAP, what's wrong with you people? I just went through all 14 pages of this topic and no benches. I've got the topic starter's bench, which isn't enough. One person vaguely mentions another topic's bench of the 6800, which isn't enough. And finally someone posted an image quality test, but you don't know the exact card they used or if they even did it right.

Dderidex, is the glass messed up in your tests like the pictures above or is it normal?

People, stop arguing about BS and start benching with your 5xxx/6xxx cards.
 
Why does every thread remotely related to ATI/NVidia/HL2 end up in a discussion about marketing crap... Honestly, enthusiasts who can read through the marketing pay more attention to it than the people it's actually aimed at (average consumers, who just end up asking the enthusiasts for advice on what to buy; either that or they go with the defaults from their OEM)... Cut it out, it's irrelevant, it's just marketing.

Those screenies Spank posted are probably one of the reasons why they went with FP32... The glass shaders tend to be some of the more intensive ones in HL2, so it's not surprising; still, I'd hope they would invest some time in a mixed-mode path for future versions of the engine that they'll be licensing.
 
Sly said:
I'm not sure if that's a valid argument for what you're trying to say. It's a logo for a program for getting games to run reliably on their hardware.

nVidia - TWIMTBP
http://www.nzone.com/object/nzone_twimtbp_gameslist.html

ATI - GITG
http://www.ati.com/gitg/gaming/gametitlesALL.html

And both companies have the same program. FarCry is listed on both. If you're condemning nVidia for TWIMTBP, you'll be doing the same for GITG.

Yeah, gotta love how UT2003 and UT2004 have that giant 3D nvidia intro, yet ATI still has bids on it. The one I love the most is how Tribes Vengeance uses the UT2004 engine setup almost completely (it's basically a mod of UT2004), yet they claim it works best on ATI cards.
 
NEVERLIFT said:
Anyone know if this works with Vampire The Masquerade Bloodlines?

It works with any game. But the question is, will there be any noticeable difference when you do so.
 
Sly said:
I'm not sure if that's a valid argument for what you're trying to say. It's a logo for a program for getting games to run reliably on their hardware.

nVidia - TWIMTBP
http://www.nzone.com/object/nzone_twimtbp_gameslist.html

ATI - GITG
http://www.ati.com/gitg/gaming/gametitlesALL.html

And both companies have the same program. FarCry is listed on both. If you're condemning nVidia for TWIMTBP, you'll be doing the same for GITG.

So why isn't there an ATI logo when I start up Far Cry? Could it be because TWIMTBP is also a marketing program!?
 
Spank said:
Did you get that using the 3dAnalyze tweak?

What kind of difference in score is there?

In any case, if this is indeed legitimate evidence of FP16 not providing enough precision in that part of the game, then it's as I thought: MOST of the time it's enough, SOMETIMES it isn't.

Which is, realistically, what should be expected. What Valve should have done is code their FP24 shaders wherever they wanted to, and in 95% of them provide a partial precision option. In some cases - such as the glass above (again, IF this is, in fact, legit) - it would make sense to use the full-precision DX9 spec.

That IS why nVidia offered multiple tiers of precision, after all. To get the speed benefit of the low-precision options, and the choice of using high precision in the few cases it was anticipated it would ACTUALLY be needed.
 
Yeah, where are all the benches. Goddamn NV slackers, bench and post that shit! Go go go!

*cracks whip*

:p
 
dderidex said:
Which is, realistically, what should be expected. What Valve should have done is code their FP24 shaders wherever they wanted to, and in 95% of them provide a partial precision option. In some cases - such as the glass above (again, IF this is, in fact, legit) - it would make sense to use the full-precision DX9 spec.

Please note: there is no such thing as an "FP24 shader"; there is only a "shader" (implicitly full precision) or a "shader which contains partial precision hints".

However, the problem with your suggestion above is that it only accounts for that shader alone - it does not account for how that shader interacts with other shaders, or for all the different surfaces/materials in the rest of the game it is used on. You have to test all the usage scenarios, as you can't necessarily predict how reducing the precision will affect that shader when it interacts with other things.
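
To make the "partial precision hints" bit concrete, here's a rough D3D9-era sketch (not Valve's code; the HLSL is a made-up toy shader). The only difference between the two variants is float vs half; compiled for ps_2_0, the half version comes out with _pp modifiers on its instructions, which FX hardware can run at FP16 while R3x0 parts ignore the hint and run everything at FP24 anyway:

[code]
#include <d3dx9.h>   // D3DX9 shader compiler; link with d3dx9.lib
#include <cstdio>
#include <cstring>

// Two versions of the same toy pixel shader: full precision and
// partial-precision-hinted. The maths is identical; only the types differ.
static const char kFull[] =
    "sampler s0;\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    float4 c = tex2D(s0, uv);\n"
    "    return c * c.a;\n"
    "}\n";

static const char kPartial[] =
    "sampler s0;\n"
    "half4 main(half2 uv : TEXCOORD0) : COLOR {\n"
    "    half4 c = tex2D(s0, uv);\n"   // 'half' -> _pp hints in the ps_2_0 output
    "    return c * c.a;\n"
    "}\n";

static void compile(const char* src, const char* label) {
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(src, (UINT)std::strlen(src), NULL, NULL,
                                   "main", "ps_2_0", 0, &code, &errors, NULL);
    std::printf("%s: %s\n", label, SUCCEEDED(hr) ? "compiled" : "failed");
    if (code)   code->Release();
    if (errors) errors->Release();
}

int main() {
    compile(kFull,    "float version (full precision everywhere)");
    compile(kPartial, "half version  (carries _pp hints)");
    return 0;
}
[/code]

Which is exactly why the caveat above matters: the hint is per-shader, but whether FP16 is actually safe depends on every surface and interaction that shader ends up used on.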
 
Warning – long post, please don't read if you don't like reading long posts. While this does indeed seem obvious, the world seems to be plagued by people who, despite knowing what they dislike, go ahead and do it anyway simply to complain about it afterwards ;)

My thoughts…

I borrowed a copy of HL2 from a mate the day after it was released (he was having problems with his computer), but I only had access for one, maybe two days, so I ripped through the single player double time; over the course of about 8-10 hours I got through 12 of the 14 chapters before returning the game. I played on Medium difficulty, BTW.

I played it through with default settings; in fact for my rig (XP3000 @ 11.5x200, 1GB PC3200 RAM, FX 5900 Ultra modded to FX 5950 Ultra, further clocked to 550/975) I used 1280x960 with 2xQ AA (looks more or less like 4x with the performance of 2x) and 4xAF, with all the settings set to the highest, in DX8.1 mode.

The game was incredible and I loved it so much that I actually bought it the day after, which is a rare occurrence for me ;)

I then played the game through again over a period of about 2-3 days, again on Medium, reaching my previous point and finishing the remainder of the game in only a few more hours. Great game, although I do think there are some disappointing aspects (not to be mentioned here).

I'm now on my 3rd pass. I decided to check out what the game would look like in DX9 mode, and knowing that my video card isn't the best performer when it comes to heavy DX9 implementations, I was incredibly surprised by the results.

I again played with full settings, but upped the DX specification to 9.0; I dropped 4xAA to 2xAA and the resolution from 1280x960 to 1024x768, and left AF as it was. It turned out the game really didn't look much different; quality-wise there were a few areas where the 2.0 shaders stood out, the light blooms being the main one, but most of the reflective areas seemed no different.

The only bug I've found so far is some isolated problems with reflections in water, first noticed at the start of the canals. Other than that the game runs just as well, and because of the great job 2xQ AA does it really doesn't look that much worse off; in fact I'd go as far as to say the overall IQ is higher and the game experience more realistic.

While I've seen no one go through the entire game and take exact duplicate screenshots of every shader instance, I don't think anyone has really pointed out any large, obvious or frequently recurring problems with forcing 16-bit FP precision using the hack. I'm fairly sure, given the nature of some people's obsession with video card brands, at least one person would be quick to point out all the flaws with this method, but nothing so far. That is of course not to say they don't exist, I can't claim to know that, but the outlook is good so far.

It's funny how the visual artefacts with the 2.0 shaders disappear when the vendor ID of the video card is changed to match that of certain ATI video cards; obviously some level of detection is going on, which should make fixes easy for Valve to implement. The FX59XX range might suffer shader 2.0 difficulties due to its architecture, but that isn't really an excuse to remove the choice from the gamer. At the end of the day it's our choice, and many people bought this range of cards with overclocking in mind, and so can actually run shader 2.0 games with far less of a performance impact; I can confirm this from my own use.
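
For anyone curious what "the vendor ID" actually is: an engine can ask Direct3D for the adapter identifier at startup, and wrappers like 3DAnalyze simply report a different vendor/device ID to the game. A minimal sketch of that query is below; the path-picking logic at the end is purely hypothetical and not Valve's actual code (0x10DE and 0x1002 are the standard PCI vendor IDs for NVIDIA and ATI):

[code]
#include <d3d9.h>    // link with d3d9.lib
#include <cstdio>

int main() {
    // Create the D3D9 object and ask the default adapter who it is.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        std::printf("GPU: %s (vendor 0x%04lX, device 0x%04lX)\n",
                    id.Description, id.VendorId, id.DeviceId);

        // Hypothetical vendor-based path selection of the kind being
        // discussed. A wrapper such as 3DAnalyze can report 0x1002 on FX
        // hardware, which is why "identify as Radeon 9800" changes paths.
        if (id.VendorId == 0x10DE)
            std::printf("-> would pick the DX8.1 / mixed path\n");
        else if (id.VendorId == 0x1002)
            std::printf("-> would pick the full DX9 path\n");
    }
    d3d->Release();
    return 0;
}
[/code]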

Ok, so the average Joe wants to run the game out of the box without much messing around, that's great: simply make the default DX version on these cards DX8.1, but leave DX9.0 in as an option. Even a message box warning that using this mode with your current hardware would severely degrade performance, while still allowing it without console commands or specific switches, would have been preferable to what they did.

However, the ability to run the game with forced 16-bit FP precision and seemingly little IQ degradation is interesting, that's for sure. At the end of the day different gamers can't tell certain visual effects apart; for instance some of my family who have played parts of the game cannot tell the difference if the resolution is changed significantly, or if different levels of Anisotropic Filtering are used, just as some cannot tell that 2.0 shaders are used instead of their fallback counterparts. It doesn't really make sense to keep these options from users if they don't cause significant problems, and I find it hard to believe that it would take a company such as Valve much effort to do properly what this crude hack mimics.
Again, this does not have to be the default, or even be available outside "advanced" options, but why would it hurt to offer gamers such a choice? We can already drop our texture resolution to something really bad to allow an increase in other areas, for instance screen resolution or any number of other game features.
I do understand that the artists and coders want to see their game played under the conditions they built it for; however, the changes made to the game's visuals once it has been run through drivers, which alter god knows what with their own optimisations and tweaks, are often far worse, and drivers might even implement this very feature themselves in the near future. (In my opinion Nvidia is doing well with game-specific settings and optimisations at the moment.)

Add to that the fact that ATI and Valve are working hand in hand, lots of money has changed hands, and Valve previously promoted ATI's hardware as the must-have hardware for their product, and there are grounds for some suspicion I suppose. Anyone who thinks this sort of thing doesn't happen in industry and commerce in general is really kidding themselves; it happens all the time, so what's stopping it happening in this case? Well, nothing really. Although we probably shouldn't jump to conclusions, it's definitely fishy.

To the question "could Valve have done more, with minimal effort, to improve the game experience for FX 5 series users?" I would have to say that these options really should have been left in; certainly at least the option to use DX9.0 from the menu, so people can try it and decide for themselves rather than be dictated to.

However, has Valve deliberately sabotaged these same users? Well, probably not.

Consider this: we knew a full year back what the performance would be like on most of these cards, and a lot of this had already been figured out way back then. A year later we see little to no effort on the part of Valve to optimise their engine's use of shaders. And this isn't just for Nvidia users; it's simply good coding practice, and running things at an unnecessarily high precision isn't good practice.

Some of the arguments I've read about them having to stick to the DX9.0 specification misuse it: while the spec does call for at least 24-bit precision for 2.0 shaders, the WHOLE engine does not rely on those shaders; in fact what appears to be a fairly small percentage of the shaders actually needs 24-bit precision.

The DX9.0 specification does NOT say that 24-bit precision is now needed to render older shader models, and that covers a seemingly large percentage of the shaders in HL2.

Laying the fault on Nvidia for producing a video card that runs at either 16-bit or 32-bit precision is not really fair; used correctly, their hardware can produce some good results. While no one is going to argue that it's as good as or better than ATI, it's certainly arguable that you can pull respectable DX9.0 performance out of the range.

Some things I’d like to see:

- Kyle to throw together an IQ comparison to see if he can scrape up many differences.
- Pictures and examples of shader effects being rendered significantly incorrectly due to forced 16-bit precision
- A count of the shaders in the game that NEED to run under Shader Model 2.0 and therefore need at least 24-bit precision (a rough sketch of how one might count partial-precision hints follows below).
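
On that last point, one rough, measurable proxy is to disassemble compiled shaders and count how many instructions carry the _pp modifier, i.e. the parts explicitly marked as safe for reduced precision. A hypothetical sketch using the D3DX disassembler on a toy shader (not HL2's actual shader set):

[code]
#include <d3dx9.h>   // link with d3dx9.lib
#include <cstdio>
#include <cstring>

// Hypothetical helper: compile an HLSL snippet for ps_2_0, disassemble the
// byte code, and count how many instructions carry the _pp (partial
// precision) modifier. Not HL2's shaders; just a way to audit your own.
static int count_pp(const char* src) {
    LPD3DXBUFFER code = NULL, errs = NULL, disasm = NULL;
    int hits = -1;
    if (SUCCEEDED(D3DXCompileShader(src, (UINT)std::strlen(src), NULL, NULL,
                                    "main", "ps_2_0", 0, &code, &errs, NULL)) &&
        SUCCEEDED(D3DXDisassembleShader((const DWORD*)code->GetBufferPointer(),
                                        FALSE, NULL, &disasm))) {
        hits = 0;
        for (const char* p = (const char*)disasm->GetBufferPointer();
             (p = std::strstr(p, "_pp")) != NULL; ++p)
            ++hits;
    }
    if (code)   code->Release();
    if (errs)   errs->Release();
    if (disasm) disasm->Release();
    return hits;
}

int main() {
    const char* shader =
        "sampler s0;\n"
        "half4 main(half2 uv : TEXCOORD0) : COLOR {\n"
        "    half4 c = tex2D(s0, uv);\n"
        "    return c * c.a;\n"
        "}\n";
    std::printf("_pp hints emitted: %d\n", count_pp(shader));
    return 0;
}
[/code]

Run over a dump of an engine's compiled shaders, a count like this would at least put some numbers behind the "fairly small percentage" claim.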

Apologies for the long post; I'm covering a lot of ground from the first 13-odd pages of this thread. Very interesting indeed.
 
tranCendenZ said:
lol fallguy, he said he spent all that time making a partial precision path which the game does not have. That is part of the point: either that statement was pure BS or the pp path was discarded on purpose.


So John Carmack working on the NV30 path in D3 for at least one year must also be pure BS, right? I mean, same thing: when the game shipped, no NV30 path... Really, your logic is twisted beyond belief...
 
Moloch said:
Unless there is some bug, it would appear a large number of you underestimated Valve's ability to code a DX9 game :D
Hate to say I told ya so (I said make sure there is no IQ diff before jumping the gun).
As was hypothesized pretty early on, there may well be SOME cases where FP24 really IS the minimum precision needed.

But Valve never 'tells us' - the engine doesn't indicate where FP24 is needed or where partial precision can be used instead.

That's why I didn't post this as "a solution to poor GeForceFX performance in DX9" or anything. Hell, 3dAnalyze isn't even stable enough to PLAY the game!

The whole point of this thread is that "Valve sucks" because they chose to not implement part of the DX9 spec (partial precision) that would have allowed the FX cards to play competitively.

All the 3dAnalyze 'hack' does is prove that MOST of the game can run in partial precision with no flaws.

So, now we have proof that it's not 100% of the game that can run fine in partial precision. Alright, that makes sense. But just because a couple shaders need to be run in FP24/32 doesn't mean ALL of them do.
 
Jbirney said:
So John Carmack working on the NV30 path for at least one year must also be pure BS, right? I mean, same thing: when the game shipped, no NV30 path... Really, your logic is twisted beyond belief...

Yeah and that was a good reason why quite a few people bought their nv30 cards, especially in light of the early benchmarks that he released which showed the 5900 way ahead of the 9800 pros.
 
Moloch said:
You should learn what it actually does before spouting bullshit.
Typical nvidiot...
lol chill dude. You make it seem like the Israelis vs. Palestinians war.

Nothing will change by calling someone a name. You should know that.
 
palabared said:
lol chill dude. You make it seem like the Israelis vs. Palestinians war.

Nothing will change by calling someone a name. You should know that.
I'm calling him what he is: in every thread relating to nvidia or ATI he constantly bashes ATI, claiming they're doing the kind of shader replacement nvidia was famous for during the FX era :rolleyes:
 
While writing that marathon I decided to try out some of this stuff.

I downloaded the program, installed it and tried it out; it wouldn't load HL2 itself, but testing showed that it was in fact affecting HL2 when it was loaded and HL2 was run from Steam.

GameSettings:
1024x768
Maximum everything
DX9.0 enforced, with bugs fixed (using the ATI 9800 product ID method)


Driver Settings:
Driver Version 61.77
2xQ AA (acts like 4xAA with the speed of 2xAA; AA that I have grown to love)
4xAF
High Quality
V-sync: OFF
Trilinear Optimisations: OFF
Anisotropic Filtering: OFF

http://www.hackerz.tc/hzinternal/tot/HL2-PixelShader-16bit-and-32bit.jpg

The picture is around 700k and is a side-by-side comparison of the two 1024x768 screenshots, combined in Adobe Photoshop and saved to JPG at maximum quality (100) with no other alterations made.

The "cl_showfps 1" command was used to display the current average FPS and the screenshots were taken with the in game screenshot capture.

16-bit is on the left, 32-bit is on the right; frame rates are roughly 29 FPS and 41 FPS respectively, and performance was a lot better in-game with 16-bit forced, obviously. While this area ran particularly badly compared to most other areas, I considered 30 FPS playable, but with 41 FPS I could easily up the resolution one step to 1280x960.

Machine specs for anyone who missed them:

XP3000 @ 11.5x200
1Gb PC3200 Ram @ 200
FX 5900 Ultra Modded to FX 5950 Ultra, further overclocked to 550/975

Let me know if the screenshot method is not accurate enough; if you guys want it done again with other methods it will have to wait until tomorrow, I'm afraid.
 