Valve sucks

Jbirney said:
Pssst, I think he is agreeing with you and informing the others that a partial mode for HL2 just is not going to happen :)
QFT
QFT means quoted for truth...
 
Scali said:
Erm... You are talking about supersampling, but all modern cards use multisampling instead, which is quite different from rendering the screen at a higher resolution. And for the record... you need 4x (ordered grid) supersampling to render at twice the screen resolution (width*2 and height*2).
With modern multisampling it's often much faster to render with 4xAA than to increase the resolution by 1 or 2 notches, to get the same perceived smoothness.

My bad on multisampling versus supersampling. But even with a rotated grid, there are still gaps at 4xAA that, by the most strict definition, account for at least 1/9th of the higher resolution. And, while you are correct about multisampling being more efficient and in some cases more attractive, the hardware still has to calculate the color of that virtual pixel just like any real pixel. With all of today's shader programs and the texture manipulation involved therein, the only real difference between multisampling and supersampling is that the virtual pixel information is not written to the framebuffer.
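
To make the grid point concrete, here's a rough sketch in Python (the sample offsets are illustrative only, not any card's real pattern) of the coverage steps you get from ordered versus rotated 4x grids:

Code:
# Subsample positions inside one pixel (0..1 range). These offsets are
# illustrative only, not the actual pattern of any particular card.

# 4x ordered grid: equivalent to rendering at 2x width and 2x height.
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4x rotated grid: same sample count, but no two samples share an x or y
# coordinate, so near-horizontal/vertical edges get four coverage steps.
rotated = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage(samples, y_edge):
    # Fraction of subsamples below a horizontal edge crossing the pixel.
    return sum(1 for (x, y) in samples if y < y_edge) / len(samples)

for y_edge in (0.2, 0.4, 0.6, 0.8):
    print(y_edge, coverage(ordered, y_edge), coverage(rotated, y_edge))
# The ordered grid only ever yields 0, 0.5 or 1.0 here; the rotated grid
# also hits 0.25 and 0.75, which is why its edge gradients look smoother.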

Scali said:
I don't think Valve meant that kind of 'customizable'.
I think they were talking about the ability of developers to customize the engine to suit their needs. You are talking about the users customizing a game to suit their needs. That is quite a different thing.
Obviously any developer can choose to add NV3x-specific paths to their Source-engine powered game, if they wish to do so.

I concede that they did not mean user-customizable, but that's what most hardcore gamers always liked about the Quake series and Half-Life: it allowed one to change everything about the 3D engine and optimize it for their specific system, their own way.

Scali said:
I just think you're making a huge deal out of nothing.

Obviously we disagree here, and admittedly, I would not be making such a big deal if I did not have a GFFX myself, but I would still be disappointed in Valve for this even if I had an ATI X800 PRO PE.

PE = (Phantom Edition) :p

rcolbert said:
Best answer is to go 1600x1200 with 4XAA on. Everyone can do that, right?

I still can... in everything except Half-Life 2. Of course, to be really playable, I'd have to drop Doom 3 to 1024x768 and most other games (Far Cry included) to 1280x960. But in Half-Life 2, I can only run at 800x600 with all the highest settings. And I haven't even tried AA or AF for benchmarking purposes, for fear of getting single-digit framerates.
 
Let this thread sink to the bottom. Why do you people keep bumping this crap up?
 
Optimus said:
I still can... in everything except Half-Life 2. Of course, to be really playable, I'd have to drop Doom 3 to 1024x768 and most other games (Far Cry included) to 1280x960. But in Half-Life 2, I can only run at 800x600 with all the highest settings. And I haven't even tried AA or AF for benchmarking purposes, for fear of getting single-digit framerates.
Dude, something is wrong with your system!
I played HL2 at 1024 maxed with a 2500+ (stock) and a Radeon 8500.
FPS was mostly around 40, and that was with 16x AF, no AA (obviously).
What driver settings?
Gee, sorry about bumping it, but 800x600 maxed is weak!
 
I can add that if you are running on a 262,144-colour (18-bit) LCD instead of a good old-fashioned CRT, you might not notice the difference in higher colour depths or precision. Nvidia would look the same as ATi in that case.

Some people do notice the difference between 16-bit and 24-bit precision, though. I can easily spot the difference between 16x and 8x AF, although many tell me that I'm crazy.

I can usually spot the small differences between 16-bit and 24-bit precision when looking at a 21-inch CRT. The difference between 24-bit and 32-bit defies my visual senses, though.
 
ZenOps, you coming to Lanageddon? :D

Gotta agree, LCDs do not match a CRT at all. I'm running a 22" Samsung SyncMaster 1200NF (20" viewable, perfectly flat) and there is no comparison between a high-quality CRT and any LCD I've seen. That's not to say LCDs aren't good, but they aren't as good as CRTs ;)

This reminds me of the 3dfx 24-bit vs. 32-bit arguments back in the day... I remember when I used to run Quake 3 on a Voodoo5 5500 (then later on a Radeon 64MB DDR VIVO SE) at 16-bit with 32-bit textures... or maybe it was the other way around. Got really good performance with only a very small hit to visual quality. Pair it up with 8x or higher anisotropic and it was great ;) Was able to run Q3 @ 2048x1536 with those settings at decent FPS.
 
Optimus said:
My bad on multisampling versus supersampling. But even with a rotated grid, there are still gaps at 4xAA that, by the most strict definition, account for at least 1/9th of the higher resolution. And, while you are correct about multisampling being more efficient and in some cases more attractive, the hardware still has to calculate the color of that virtual pixel just like any real pixel. With all of today's shader programs and the texture manipulation involved therein, the only real difference between multisampling and supersampling is that the virtual pixel information is not written to the framebuffer.

On the contrary. For every pixel in the poly (on screen), the shader is executed only once. Only z/stencil are done at higher resolution. Unlike with supersampling, where you would indeed have to perform shading for every subpixel. So there is a HUGE difference between the two approaches when shaders are involved.

This difference is exactly why multisampling can have aliasing in some cases (bumpmapping for example), and centroid sampling is required to avoid sampling outside the polygon edges.
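
To put rough numbers on that difference, here's a toy sketch in Python (the 4-subsample count and the frame size are my own illustrative assumptions, not anything from a real card):

Code:
# Toy cost model for one frame, assuming 4 subsamples per pixel.
SUBSAMPLES = 4
PIXELS = 1280 * 960

# Supersampling: the pixel shader runs for every subsample.
ssaa_shader_runs = PIXELS * SUBSAMPLES

# Multisampling: the shader runs once per covered pixel (sampled at the
# centre, or at the centroid with centroid sampling); only the cheap
# z/stencil test is repeated per subsample.
msaa_shader_runs = PIXELS * 1

print(ssaa_shader_runs)  # 4,915,200 shader executions
print(msaa_shader_runs)  # 1,228,800 shader executions
# The single shaded colour per pixel is also why shader-generated detail
# (bumpmapping, for example) can still alias under multisampling.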
 
Scali said:
On the contrary. For every pixel in the poly (on screen), the shader is executed only once. Only z/stencil are done at higher resolution. Unlike with supersampling, where you would indeed have to perform shading for every subpixel. So there is a HUGE difference between the two approaches when shaders are involved.

This difference is exactly why multisampling can have aliasing in some cases (bumpmapping for example), and centroid sampling is required to avoid sampling outside the polygon edges.

This I did not know. Yet again, I stand corrected... or sit... nevermind.

And correct me if I'm wrong:
Z/stencil is where only the depth of the "virtual" pixels is calculated in order to decide the proper blending between the two possible pixel colors?

Moloch said:
Dude, something is wrong with your system!

Yeah, it has an FX in it. :D

ZenOps said:
I can add that if you are running on a 262,144-colour (18-bit) LCD instead of a good old-fashioned CRT, you might not notice the difference in higher colour depths or precision. Nvidia would look the same as ATi in that case.

I think this could be the case, as I play on a CRT and all my friends with ATI cards use LCDs. So I can also agree that it is possible that ATI has better gamma and texture settings than NVidia. But then, I never intentionally disagreed with that possibility. I simply could not be more than neutral on the topic, as I have no evidence of either being better in my own past experience.

--------------------------------------------------
Off topic:
trudude said:
Let this thread sink to the bottom. Why do you people keep bumping this crap up?

"Bumping" is when someone replies to a topic with irrelevant banter for the express purpose of moving the discussion to the top of the messageboard. To be honest, your post is the most "bumping" of any here, even when considering that the opposite effect was intended.
 
Optimus said:
This I did not know. Yet again, I stand corrected... or sit... nevermind.

And correct me if I'm wrong:
Z/stencil is where only the depth of the "virtual" pixels is calculated in order to decide the proper blending between the two possible pixel colors?

Basically, the z/stencil buffer determines visibility of each pixel. So you know the coverage of the poly inside the pixel by counting how many subpixels pass the z/stencil test, and you can use that as a blend factor.
And this explains why you can't get antialiasing with alpha testing either. The alpha is part of the framebuffer, not of the z/stencil buffer, and it is not sampled at a higher resolution.

But still, multisampling is an attractive approach, since it can be performed very cheaply. For polygon edges it's an excellent solution. And you can always code manual supersampling if you so desire; NVIDIA was actually promoting this with the introduction of the NV4x.
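
As a rough illustration of that coverage blend, here's a sketch in plain Python with made-up subsample results (no real chip resolves exactly this way):

Code:
# One pixel on a polygon edge: colour is shaded once, but each of the
# four subsamples keeps its own z/stencil pass/fail result.
def resolve_pixel(shaded_colour, background, subsample_z_pass):
    # Coverage = fraction of subsamples where the poly won the z test.
    coverage = sum(subsample_z_pass) / len(subsample_z_pass)
    return tuple(c * coverage + b * (1.0 - coverage)
                 for c, b in zip(shaded_colour, background))

# 3 of 4 subsamples pass: the red poly blends 75/25 with the background.
print(resolve_pixel((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), [1, 1, 1, 0]))

# Alpha test, by contrast, rejects the fragment in the single shader
# pass, so all four subsamples live or die together: coverage is only
# ever 0 or 1, and the alpha-tested edge stays jagged.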
 
Optimus said:
Yeah, it has an FX in it. :D
Ya, but an FX should be able to pwn an 8500; they're both running the game in DX8.1 mode, yet I'm able to play at 1024 maxed with 16x AF.
I was using a stock 2500+ at the time; maybe he has a slower CPU.
 
LyCoS said:
Where the heck is the "unsubscribe from this thread" button??

go to subscribed threads

check this thread ... on the far right

at the bottom pick delete subscription

and then press go
 
Moloch said:
Ya, but an FX should be able to pwn an 8500; they're both running the game in DX8.1 mode, yet I'm able to play at 1024 maxed with 16x AF.
I was using a stock 2500+ at the time; maybe he has a slower CPU.

This is exactly what I mean. As you can see in my signature, my system is old but by no means is it weak. And this is proven by the fact that it performs above average in all games except HL2. There is something in HL2 that is actively crippling Nvidia FX video cards. ATI and/or Valve may or may not have intended for it to be there, but the fact remains that it is there.

I am utterly certain that Valve is going to fix this issue. I would just prefer that they do so immediately after finishing the game (Valve Anti-Cheat is not yet finished, thus HL2 is not yet finished).
 
Optimus said:
This is exactly what I mean. As you can see in my signature, my system is old but by no means is it weak. And this is proven by the fact that it performs above average in all games except HL2. There is something in HL2 that is actively crippling Nvidia FX video cards. ATI and/or Valve may or may not have intended for it to be there, but the fact remains that it is there.

I am utterly certain that Valve is going to fix this issue. I would just prefer that they do so immediately after finishing the game (Valve Anti-Cheat is not yet finished, thus HL2 is not yet finished).
Valve aren't crippling Nvidia cards as you say; using 3D-Analyze to make the game see an Nvidia card as an ATI one gains you like 3 fps.
the 5200 is just a pos, it's that simple.
Isn't it a 2 pipe card?
I forgot how crappy the 5200 was :eek:
 
Moloch said:
Valve aren't crippling Nvidia cards as you say; using 3D-Analyze to make the game see an Nvidia card as an ATI one gains you like 3 fps.

As I have said before, I don't think Valve did it on purpose. It is conceivable, though highly unlikely, that ATI "helped" them optimize their shaders "for" ATI hardware, but I find that hard to believe considering they know how it feels to be on the receiving end of that kind of activity.

Moloch said:
the 5200 is just a pos, it's that simple.
Isn't it a 2 pipe card?
I forgot how crappy the 5200 was :eek:

Yeah, I wouldn't touch the FX 5200 with a 50 foot pole.
My 5900 is pretty good though. It's even better after a 5950 Ultra BIOS Upgrade.
 
Optimus said:
As I have said before, I don't think Valve did it on purpose. It is conceivable, though highly unlikely, that ATI "helped" them optimize their shaders "for" ATI hardware, but I find that hard to believe considering they know how it feels to be on the receiving end of that kind of activity.



Yeah, I wouldn't touch the FX 5200 with a 50 foot pole.
My 5900 is pretty good though. It's even better after a 5950 Ultra BIOS Upgrade.
Nvidia has had plenty of time to hack the shaders to speed them up; Nvidia just isn't as strong in HL2.
Didn't Valve basically have an auction, and ATI won it?
 
Valve isn't really crippling the FX series. The FX series is crippled by design.
The main difference is that Valve didn't allow the FX series to run their ps2.0 path, while others do.
By not running the ps2.0 path, NVIDIA has no option to replace the shaders and artificially inflate performance, like they've done in Doom3, FarCry, and many other games with ps2.0 shaders.
 
Scali said:
By not running the ps2.0 path, NVIDIA has no option to replace the shaders and artificially inflate performance, like they've done in Doom3, FarCry, and many other games with ps2.0 shaders.

I see what you mean, but it's still a little unfair to those who don't know about command line flags in HL2, don't care what brand they have in their box, and happened to end up with an FX (5800 or higher).

It's all moot at this point anyway, as someone has optimized NVidia's 71.24 drivers for FX performance. They still look good (minus some very minute banding at random times) and I get a stable 45 fps in the CS: Source video stress test at dxlevel 90. Incidentally, there is no difference in performance (+ or - .01%) between dxlevel 81 and dxlevel 82 (mixed mode) with these drivers.
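
(For anyone who wants to try the paths themselves: the Source engine takes the rendering path as a launch option, so adding -dxlevel 81, -dxlevel 82, or -dxlevel 90 to the HL2 shortcut or to the game's launch options in Steam switches it, and as far as I know the setting sticks until you change it again.)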
 
Optimus said:
I see what you mean, but it's still a little unfair to those who don't know about command line flags in HL2, don't care what brand they have in their box, and happened to end up with an FX (5800 or higher).

I doubt those people would even be able to tell that they're running the DX8.1 path instead of the DX9.0 path.
It's not like the visuals are THAT much uglier.
I still think it'd be more unfair to default to the DX9 path (or mixed, for that matter), which results in below-average performance. With DX8.1 you at least get excellent playability on the FX series, even the lower models. The framerates and resolutions are now in line with other games; if they chose the other option, they wouldn't be.
 
The FX 5900 can handle HL2 at a decent resolution with 60+ fps with all details up in DX9, minus the AA and AF, so I really don't see what you're going on about. It's not like HL2 is the backbreaker you're making it out to be...

Just saw this....
Scali said:
By not running the ps2.0 path, NVIDIA has no option to replace the shaders and artificially inflate performance, like they've done in Doom3, FarCry, and many other games with ps2.0 shaders.
What's wrong with optimizing code for hardware?
 
Scali said:
Valve isn't really crippling the FX series. The FX series is crippled by design.
The main difference is that Valve didn't allow the FX series to run their ps2.0 path, while others do.
If you had seen the FX's ps2.0 shader performance in Halo, you would be praising Valve for not having the FXes default to that. Halo on the FX cards runs at half the speed it does with ps1.1.
 
I saw some pictures of the new DoD: Source today and I just want to say again that VALVE SUCKS!

I purchased a legitimate copy of HL2, and because I didn't download a Silver or Gold edition for $60 or more off their site, I won't get to play a mod that should be available for free to begin with.

And I'm still pissed off at their crappy packaging in the retail copies. Paper sleeves and no instruction manual is just pitiful for a $50 game.
 
Lord of Shadows said:
What's wrong with optimizing code for hardware?

The main problem here is that it seems to give false expectations about the capabilities of the hardware. Here is one game that does not have these optimizations, and look at the topic of this thread!
So there's your problem... any game that the driver is not optimized for will perform quite badly in comparison to the ones that are.
 
shadowbreaker513 said:
If you had seen the FX's ps2.0 shader performance in Halo, you would be praising Valve for not having the FXes default to that. Halo on the FX cards runs at half the speed it does with ps1.1.

That is exactly the point I've been trying to make for ages now. But apparently some FX owners want the high-quality graphics instead of the performance.
If they had informed themselves properly before buying, they'd be owning Radeons now and have both... but of course it's better to blame the game developers and cook up all kinds of conspiracy theories instead.
 
Scali said:
That is exactly the point I've been trying to make for ages now. But apparently some FX owners want the high-quality graphics instead of the performance.
If they had informed themselves properly before buying, they'd be owning Radeons now and have both... but of course it's better to blame the game developers and cook up all kinds of conspiracy theories instead.

Or they would have bought the 6800 series. Either way, I personally think this thread has gone on long enough, but I'm not a mod :p
 
lopoetve said:
Or they would have bought the 6800 series. Either way, I personally think this thread has gone on long enough, but I'm not a mod :p
6800?
They bought the 5XXX series like a year ago or something.
 
If people actually bought FX cards after the GF6 series was released, that'd be even more pathetic than going FX before :)
 
Moloch said:
6800?
They bought the 5XXX series like a year ago or something.

I meant they could have waited till now, or bought their way out of them, or bought the 9800 and then the 6800...

Scali said:
If people actually bought FX cards after the GF6 series was released, that'd be even more pathetic than going FX before :)

I know quite a few people that have done so.
 
Lots of people bought the FX because the ps1.1 performance was great, and when the card came out there were no DX9 games to show that the card sucked. So for a year people were buying cards without knowing that they suck.
 
ludachaz said:
Lots of people bought the FX because the ps1.1 performance was great, and when the card came out there were no DX9 games to show that the card sucked. So for a year people were buying cards without knowing that they suck.

It's been said often enough, but I'll repeat it, since you must have missed it: 3DMark03... and to a certain extent also Tomb Raider: The Angel of Darkness.
 
The fact that the Radeon 9700 Pro came out more than a year before the FX 5900 and still owned it more often than not should have been a clue to folks that Nvidia wasn't putting forth their best effort. The fact that Nvidia on the NV3x core ALWAYS sucked out of the gate on a given game and then gained a 30% performance increase with the next driver release should have clued people in to the fact that Nvidia was writing game-specific optimizations which were nothing more than cheats most of the time. The fact that we know that instruction order on the NV3x products can drastically change the number of times a pixel has to go through the pipeline, when common sense tells us that shouldn't be the case, should have been the red flag to tell people that buying the last gen of Nvidia cards was a bad idea.

Oh yeah, and I'm replacing my 6800GT today with a 6800 Ultra. If anyone is still fondly clinging to their FX5900/5950, I suggest you get over it and look for my 6800GT on eBay within the week (unless I decide to build an SFF system instead).
 
I don't know why people attack Valve and Steam so much...

1: They have brought us two of the greatest games of all time (Half-Life 1 and 2).
2: The Steam system updates your games to the newest version with no work on your part.
3: Steam has eliminated the need to re-install any of the games from their library off CDs or DVDs.
4: Despite how buggy the game was upon release, within 3 months they have turned Half-Life 2 and CS: Source into near-flawless products. Few companies can claim this.

Who cares if your 3-year-old Nvidia card runs like crap in HL2? My brand-new ATI card runs like crap in Doom 3. But you won't see me posting an "OMG! My c4rd Runz 1ike p00 in D00m!! ID SUX MY BAWLS!" on the HardForums.

Your posts say it clearly: you can spoof the game into thinking it's an ATI card, but it crashes after about 5 minutes. This tells me that you are forcing the program to do something it wasn't meant to do. It'd be like me overclocking my processor to double the speed and complaining that Windows won't stay up for more than an hour.

So either grow up and quit complaining, or get a job and buy a card worthy of the Game of the Year.
 
chiablo said:
3: Steam has eliminated the need to re-install any of the games from their library off CDs or DVDs.
I haven't bashed Valve for anything more than the original point in this thread: dropping the already-working mixed-mode DX9 path. HL2 to this day still stands alone as the only DX9 game that the FX5900/FX5950 doesn't run acceptably.

---
But back to the point... Steam is unreliable. Four times in the last week there were outages lasting several minutes or more, and several times in between, games wouldn't even start because Steam was too busy or down. And last week wasn't unique.

I didn't pay for HL2 (free HL2 scratch-off card), so I'm not complaining too much about it in forums. This is my first post about it. If I had paid $50, I would be very angry now.
 
pxc said:
HL2 to this day still stands alone as the only DX9 game that the FX5900/FX5950 doesn't run acceptably.

And that couldn't possibly have anything to do with the fact that HL2 has the most diverse and advanced shading seen in any game so far? (Apart from the lack of shader replacements in the drivers, of course.)
 
Lord of Shadows said:
We need to have someone give scali an "I <3 fx5900" title =)

Why?
I happen to hate the FX series, and anyone trying to defend it.
I think Optimus deserves this title.
 