Valve sucks

Frosteh said:
While writing that marathon I decided to try out some of this stuff.

I downloaded the program, installed it and tried it out. It wouldn't load HL2 itself, however testing showed that it was in fact affecting HL2 when HL2 was loaded and run from Steam.

Game Settings:
1024x768
Maximum everything
DX9.0 enforced with fixed bugs (using the ATI 9800 product ID method)


Driver Settings:
Driver Version 61.77
2xQAA (acts like 4xAA with the speed of 2xAA; AA that I have grown to love)
wtf... it acts like 2x FSAA with a blur filter :eek: :confused:
 
I'm using an FX5600[Refuse to type these two letters] to run Vampire: Bloodlines right now, and from what I've heard:
The dxlevel flags no longer work
The default render level is DX9 (I think ..... the oohhh-pretty water effects are all there)

So basically every time I step over a puddle in tha ..... t. ..... g ..... a ....... ...... ...... m ............ .e..... m...y frame rates drop down so low you can watch the burn-in start on my monitor.

So if V:TM:B really does force DX9, which in turn forces FP24 which forces FP32 on my poor old card, this FP16 hack could let me play some levels without resorting to things like the "no textures hack" (nVidia Control Panel -> Advanced Options (somewhere) -> Set all textures to minimum mipmap levels)
 
Why are people still claiming it forces FP24, when dave-b already corrected you guys: it forces full precision, which is FP32 for NVIDIA and FP24 for ATI.
Please note: there is no such thing as an "FP24 shader"; there is only a "shader" (implicitly full precision) or a "shader which contains partial precision hints".
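For anyone wondering what the hint actually is: in DX9 HLSL it's the `half` type (or the `_pp` modifier in shader assembly), and it only gives the driver permission to drop to FP16; it never guarantees it. Here's a rough NumPy sketch (illustrative only, obviously not Source engine code) of what that permission amounts to for a typical bit of shading math:

```python
import numpy as np

# A made-up per-pixel term (e.g. a specular highlight). "Full precision"
# means FP32 on NVIDIA / FP24 on ATI; a partial precision hint lets the
# hardware evaluate it at FP16 instead.
def specular(n_dot_h, power, dtype):
    x = dtype(n_dot_h)
    return np.power(x, dtype(power))

full    = specular(0.997, 64.0, np.float32)   # what the card does today
partial = specular(0.997, 64.0, np.float16)   # what a _pp hint would allow

print(float(full), float(partial), abs(float(full) - float(partial)))
# The two answers differ by well under 1%, which is why the hint is safe
# for most shading math -- and why it has to stay a per-shader choice for
# the few effects where the error does become visible.
```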
 
Frosteh said:
While writing that marathon I decided to try out some of this stuff.

I downloaded the program, installed it and tried it out. It wouldn't load HL2 itself, however testing showed that it was in fact affecting HL2 when HL2 was loaded and run from Steam...

What did you use to get in DX8.1 mode on the same settings?

Also, will people stop saying the FX sux, because it doesn't.
 
Well, essentially AA is blurring. To me it's a really good balance of AA; it works well with mid-range resolutions such as 1280x960. Sometimes I push 1600x1200 with 2xAA, rarely drop to 1024x768 and use 8xAA, and pretty much never use 4xAA.

But that's not really important here. How's the pic, everyone? To be perfectly honest I can't see any difference, and while I may be wrong, I thought the most likely place for 2.0 shader effects would be transparency, with further layers of reflection as well as diffraction etc.

I can't tell the difference between the two looking at the pictures, let alone in game :}
 
R1ckCa1n said:
Will this thread ever die? We now see this presents no real performance gain and a lot of smoke.

Take a look at this http://www.steampowered.com/forums/showthread.php?threadid=193324 and decide if it is worth it.

Basically the FX cards suck at DX9 and were sent down to DX8 for a reason. Get over it.
you killed the steam forums.....good job!

Going by the screenshots of the glass in a previous post, no wonder they use full precision.
 
gordon151 said:
What did you use to get in DX8.1 mode on the same settings?

Also, will people stop saying the FX sux, because it doesn't.

I assume you mean frame rate. I can check that now; I know it's a fair bit more, because when I played in DX8.1 mode my settings were slightly higher, I think I used 1280x960.

Give me a min to see exactly :p
 
Frosteh said:
Well, essentially AA is blurring. To me it's a really good balance of AA; it works well with mid-range resolutions such as 1280x960. Sometimes I push 1600x1200 with 2xAA, rarely drop to 1024x768 and use 8xAA, and pretty much never use 4xAA.

But that's not really important here. How's the pic, everyone? To be perfectly honest I can't see any difference, and while I may be wrong, I thought the most likely place for 2.0 shader effects would be transparency, with further layers of reflection as well as diffraction etc.

I can't tell the difference between the two looking at the pictures, let alone in game :}
The only FSAA that blurs the image was the Voodoo 5 style, since it didn't adjust the LOD, IIRC.
AA has nothing to do with blurring.
Read some text in a game with 2xQ..
Like dave-b said, it would have been a ton of work to test the whole game to see which areas really need full precision and which only need partial precision.
 
Met-AL said:
you killed the steam forums.....good job!

Going by the screenshots of the glass in a previous post, no wonder they use full precision.
works fine here
 
Frosteh said:
16-bit is on the left, 32-bit is on the right; frame rates are roughly 29 FPS and 41 FPS respectively, and performance was obviously a lot better in game with 16-bit forced. While this area ran particularly badly compared to most other areas, I considered 30 FPS playable, but with 41 FPS I could easily up the resolution one step to 1280x960.

Ummmm...you sure about that?

16 bit on the left and 32 bit on the right with the framerates of 29 and 41 respectively?

As in:
16 bit = 29fps
32 bit = 41fps

?

That doesn't sound right?
 
Yeah, we've seen the reduction in IQ; this is now a moot point. I hope Hard does the right thing and updates their main page about it...
 
Met-AL said:
Going by the screenshots of the glass in a previous post, no wonder they use full precision.

The point is they were supposed to do mixed mode: doing everything that looks right in 16-bit, and 32-bit only on the things that showed obvious precision round-off errors, like the glass.
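For what it's worth, here's a plausible reason the glass in particular falls apart at FP16 (my guess, not anything Valve has said): refraction-style shaders perturb a screen-space texture coordinate, and FP16 only carries about 11 significand bits, so near the right-hand side of the screen the coordinate can only move in steps of roughly half a pixel at 1024x768. A rough NumPy illustration of that quantisation:

```python
import numpy as np

width = 1024                      # screen width in pixels
base  = 0.73                      # screen-space u coordinate of a glass pixel

# Sweep a tiny refraction offset, the way a normal map would across the pane.
offsets = np.linspace(0.0, 0.002, 9)

uv32 = np.float32(base) + offsets.astype(np.float32)
uv16 = np.float16(base) + offsets.astype(np.float16)

print("offset(px)   fp32 lookup(px)   fp16 lookup(px)")
for o, a, b in zip(offsets * width, uv32 * width, uv16 * width):
    print(f"{o:10.2f}   {float(a):15.3f}   {float(b):15.3f}")

# FP32 moves the lookup smoothly; FP16 can only step in increments of
# ~0.5 px near u = 0.73, so neighbouring pixels snap to the same sample
# and the refraction bands instead of bending smoothly.
```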
 
zeplar said:
I'm not sure if these were the ones referenced:

Reservations on NVIDIA's SLI

NVIDIA SLI "crazy"

One was! (for me at least)

I guess I can see how someone could read this as saying they think it's a stupid idea.
But they mostly sound like they are NOT worried about "buzz words" (smaller market, sounds cool but most people will probably never see this setup in their homes).

They still bring up interesting points.
What if ATI makes their next chip quieter and less power hungry than the Ultra in SLI?

*I just noticed this is off the subject originally posted*
 
Moloch said:
The only FSAA that blurs the image was the voodoo 5 style, since it didnt adjust the LOD IIRC.


Never had any blurring with my 5500, :confused:
 
I don't think anyone here was under the illusion we were going to be able to play HL2 through with 16-bit precision and see all the shaders display exactly what they are supposed to. On that note, is the glass sheet shot, which is in DX9, using the fix for the shader bugs (the method of fooling HL2 into thinking you have an ATI video card)? Just curious.

However, in my position I'm presented with several options...

Play through with DX8.1 and not see a massive difference in the shader detail in general.
Play through with DX9.0 as normal with the additional shader fix.
Play through with DX9.0 with the shaders set to use 16-bit precision.

I've done the first: got amazing video settings, nice quality and great speed with no stuttering.

The second I started playing through again for the third time, this time with hard AI, up until around the areas the screenshots were taken, basically, and up until then I was able to play with ~30 FPS minimum on the video settings given. Not exactly bad considering the game is set to its highest video settings.

Or I can do what I'm doing now, and give myself 10 FPS extra at worst, probably more in other areas, and maybe bump the rest back up to 1280x960 (I used to play with DX8.1 the first time around).

On that note, if I decide to use this fix I can play through the game with DX9.0 settings with minimal shader quality degradation, using the same in-game settings I would use if I was playing with DX8.1. While the frame rate will differ, it won't be by enough to let me use more detail, as a good balance doesn't really exist without pushing the frame rate that little bit too far.

*edit*

In short, for my situation I can play HL2 with DX9.0 settings on my setup. Many people who got FX 5xxx cards overclocked them a lot, and most even bumped the cheaper versions right up to the 5950 versions with the BIOS mod, so to say that HL2 is unplayable on these cards is a little unfair, I have to say.

I've been playing through HL2 with these new DX9.0 settings and the game does look better, although only really in several places; the use of 2.0 shaders isn't exactly frequent, it seems. Even for those that could never play at this frame rate and demand 60 FPS or whatever you enthusiasts demand nowadays, I could still drop a lot of video settings, that's for sure.

Alas, it was too much work for Valve to cater for everyone.
 
So, if someone does some FP16 benches on FX hardware, they should do some FP16 on ATI too.. better see if "FP24" is as much of an advantage as you guys seem to be making it out to be..

Kind of reminds me of the 24bit vs. 32bit discussions back when 3dfx was still around.. you guys never change :rolleyes:
 
STEvil said:
So, if someone does some FP16 benches on FX hardware, they should do some FP16 on ATI too.. better see if "FP24" is as much of an advantage as you guys seem to be making it out to be..

Kind of reminds me of the 24bit vs. 32bit discussions back when 3dfx was still around.. you guys never change :rolleyes:
ATI cards are only capable of FP24.
 
Mchart said:
STEvil said:
So, if someone does some FP16 benches on FX hardware, they should do some FP16 on ATI too.. better see if "FP24" is as much of an advantage as you guys seem to be making it out to be..

Kind of reminds me of the 24bit vs. 32bit discussions back when 3dfx was still around.. you guys never change :rolleyes:
ATI cards are only capable of FP24.
And the 3dfx debate was 16-bit COLOR vs 32-bit COLOR - nothing to do with arithmetic precision.

Wow, how uninformed are you?
 
As an engine designer, Valve should try to appeal to the largest range of graphics cards possible. In this case they failed to accommodate the GeForce FX 5xxx series to the best of their ability.

I think an important question to consider is whether the shaders themselves are not labeled with partial precision, or whether the engine refuses to set such a flag. If the engine will not run _pp shaders, then game developers will shy away from an engine that runs poorly on a great number of graphics cards. If Valve wants to license their engine, it is in their best interest to have the best performance on the greatest number of cards.

IMO Valve has the right to code how they see fit, but in the long run they only hurt themselves by creating an engine with less scalability.

As for the two comparison images I've seen in this thread, the glass could easily have been flagged for full precision. There are many other objects in the scene using shaders, none of which show any visual problems.
 
STEvil said:
So, if someone does some FP16 benches on FX hardware, they should do some FP16 on ATI too.. better see if "FP24" is as much of an advantage as you guys seem to be making it out to be..

Kind of reminds me of the 24bit vs. 32bit discussions back when 3dfx was still around.. you guys never change :rolleyes:

I believe the ATI cards only run at a fixed 24-bit precision, therefore dropping to 16-bit would make no real difference, I think :p
 
dderidex said:
And the 3dfx debate was 16-bit COLOR vs 32-bit COLOR - nothing to do with arithmetic precision.

Wow, how uninformed are you?

... Huh? Why are you quoting me, I was talking about something completely different.
 
Frosteh said:
I believe the ATI cards only run at a fixed 24-bit precision, therefore dropping to 16-bit would make no real difference, I think :p

You are correct, R300 and up can only output FP24, but internally the calculations are done at FP32.
 
HeavenX said:
32-bit is a lot better than 24-bit if you have the right monitor...
We are talking about precision, not the color quality you are talking about.
 
I thought that when using PS 3.0 shaders games have to run at 32-bit precision, so since HL2 is PS 2.0, why is the 6800 running the whole game at 32-bit precision??? Why not blend 16 and 32 together to gain performance???
 
HeavenX said:
I thought that when using PS 3.0 shaders games have to run at 32-bit precision, so since HL2 is PS 2.0, why is the 6800 running the whole game at 32-bit precision???

Because the 6800 does not support FP24, and HL2 runs at "full precision" by default, which means FP24 or FP32 depending on the card.

There are no PP hints to force it down to FP16, and there really doesn't need to be; the GF6 series is powerful enough to run it at FP32.
 
Loto_Bak said:
As an engine designer, Valve should try to appeal to the largest range of graphics cards possible. In this case they failed to accommodate the GeForce FX 5xxx series to the best of their ability.

I think an important question to consider is whether the shaders themselves are not labeled with partial precision, or whether the engine refuses to set such a flag. If the engine will not run _pp shaders, then game developers will shy away from an engine that runs poorly on a great number of graphics cards. If Valve wants to license their engine, it is in their best interest to have the best performance on the greatest number of cards.

IMO Valve has the right to code how they see fit, but in the long run they only hurt themselves by creating an engine with less scalability.

As for the two comparison images I've seen in this thread, the glass could easily have been flagged for full precision. There are many other objects in the scene using shaders, none of which show any visual problems.

I think this post sums up the situation pretty well. There's no real reason Valve couldn't have checked these shaders early on, while the game was being developed, noted which ones REQUIRE 24-bit precision, and then, when it came to the crunch, made the game more optimised.

It's not just FX 5xxx users that will suffer a performance loss because of this sloppy method of coding; all cards which can use two or more different levels of precision will suffer if they always have to run at the highest possible. It's less noticeable on other cards, but that's no reason not to do it.

Let's not forget, unless ATI have something really special up their sleeve, they're going to have to run 32-bit precision soon to ensure future Shader 3.0 capability. What are they going to do, run everything in 32-bit? I don't think so, so a similar dual-precision ability seems favourable.

It's just a case of wasted power: why spend more time calculating something to a greater accuracy than required?
 
HeavenX said:
I thought that when using PS 3.0 shaders games have to run at 32-bit precision, so since HL2 is PS 2.0, why is the 6800 running the whole game at 32-bit precision??? Why not blend 16 and 32 together to gain performance???
You're so confused you're confusing me in the process.

NVIDIA cards have to run games which are programmed for FP24 at FP32, because they are only capable of FP16 and FP32, so to match FP24 quality, FP32 must be used. Now the problem is that the game can mostly be run in FP16, because FP24 makes almost no difference there. So what some developers do is tell the card to use FP16 in most areas, then use FP32 in the areas where it is needed. Valve chose not to do this, even though it clearly could have.
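For reference, the only real difference between the three formats is how many bits they spend on the number: FP16 is 1 sign / 5 exponent / 10 mantissa bits, ATI's FP24 is 1 / 7 / 16, and FP32 is 1 / 8 / 23. A quick sketch of the relative precision each one gives (which is why FP24 vs FP32 is basically never visible, while FP16 occasionally is):

```python
# Relative step size implied by each format's mantissa width
# (plus the implicit leading 1 bit).
formats = {
    "FP16 (NV partial precision)": 10,
    "FP24 (ATI R3xx pixel pipe)":  16,
    "FP32 (NV full precision)":    23,
}

for name, mantissa_bits in formats.items():
    eps = 2.0 ** -mantissa_bits
    print(f"{name:29s} ~1 part in {2 ** mantissa_bits:>9,d}  (eps = {eps:.1e})")

# FP16 resolves ~1 part in 1,024, FP24 ~1 part in 65,536, FP32 ~1 part in
# 8,388,608. Forcing FP32 buys precision the FP24 target never had, at the
# cost of speed; FP16 is a much coarser grid -- fine for most shading math,
# risky for a few effects.
```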
 
Mchart said:
You're so confused you're confusing me in the process.

NVIDIA cards have to run games which are programmed for FP24 at FP32, because they are only capable of FP16 and FP32, so to match FP24 quality, FP32 must be used. Now the problem is that the game can mostly be run in FP16, because FP24 makes almost no difference there. So what some developers do is tell the card to use FP16 in most areas, then use FP32 in the areas where it is needed. Valve chose not to do this, even though it clearly could have.

GF6 series cards seem to do just fine at FP32.

It's the GFFX that have trouble with FP32 and need the PP hints to run at FP16 in PS/VS 2.0 because of performance issues.
 
Could someone help clarify what exactly happened to Valve's plans for mixed mode, which was supposed to do this? It was obviously scrapped somewhere along the line, but for what reason? Development time?
 
Brent_Justice said:
GF6 series cards seem to do just fine at FP32.

It's the GFFX that have trouble with FP32 and need the PP hints to run at FP16 in PS/VS 2.0 because of performance issues.
Yes, I know this.
 
Frosteh said:
Could someone help clarify what exactly happened to Valve's plans for mixed mode, which was supposed to do this? It was obviously scrapped somewhere along the line, but for what reason? Development time?

*cough*
$6 million from ATI
*cough*
 
Frosteh said:
Could someone help clarify what exactly happened to Valve's plans for mixed mode, which was supposed to do this? It was obviously scrapped somewhere along the line, but for what reason? Development time?
In that issue of PC Gamer where they explain the technical shit of HL2, they just say they scrapped it, but they didn't give a real answer. My guess, and many other people's guess, is that since ATI paid them a shitload of money, they decided not to do it, just so it would run better on ATI cards. It could also have been because they were out of time, but that's a lame excuse because they had at LEAST two years to work it out. Not only that, but NVIDIA works really hard to get developers to use this in their games, and even makes it easier by providing a utility that basically converts the stuff on the fly.
 
Brent_Justice said:
GF6 series cards seem to do just fine at FP32.

It's the GFFX that have trouble with FP32 and need the PP hints to run at FP16 in PS/VS 2.0 because of performance issues.
Do you know how much performance HL2 would have gained if Valve had coded it with both 16 and 32? Just give it to me off the top of your head, you don't have to be exact...
 
HeavenX said:
Do you know how much performance HL2 would have gained if Valve had coded it with both 16 and 32? Just give it to me off the top of your head, you don't have to be exact...
Read all 16 pages of this thread, it's in there. But it's a lot. You can roughly say a 50% performance gain, which would bring it back up to par with the ATI cards.
 
HeavenX said:
Do you know how much performance HL2 would have gained if Valve had coded it with both 16 and 32? Just give it to me off the top of your head, you don't have to be exact...
Well, given that just a few posts above, someone noted that going from 32-bit precision 'all the time' to 16-bit precision 'all the time' kicked their FPS from 29 to 41 in one level....

And in the CS:Source test, going from 32-bit 'all the time' to 16-bit 'all the time' kicked the FPS from 36 up to 64....

All that matters is how often partial precision can be used instead of full precision; by all accounts, probably 95% of the time. Which means the poster right above me is dead on: if Valve had coded the game right, using partial precision hints where they could be used and only forcing full precision WHEN IT WAS ACTUALLY NEEDED, GeForce FX users *would* be seeing something like a 50% performance boost over their current numbers in DX9 mode.
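To put numbers on that claim, here's the same back-of-the-envelope written out (a crude model that treats frame time as pure shader cost, with the quoted fraction eligible for the FP16 speedup; real frames obviously aren't that simple):

```python
def mixed_mode_fps(fps_fp32, fps_fp16, pp_fraction):
    """Estimate FPS if only `pp_fraction` of the shader cost runs at FP16.

    The PP-eligible portion of the frame time scales by the observed
    all-FP16 speedup; the rest stays at full-precision cost.
    """
    t32, t16 = 1.0 / fps_fp32, 1.0 / fps_fp16
    t_mixed = pp_fraction * t16 + (1.0 - pp_fraction) * t32
    return 1.0 / t_mixed

# Figures quoted in the thread, assuming ~95% of shading can take the hint:
print(round(mixed_mode_fps(29, 41, 0.95), 1))   # HL2 level above  -> ~40 fps
print(round(mixed_mode_fps(36, 64, 0.95), 1))   # CS:Source test   -> ~61 fps
```

On those two data points that works out to roughly a 40% to 70% gain depending on the scene, so the 50% ballpark mentioned earlier isn't unreasonable.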
 