bboynitrous said:
Your 5700, and you switched to OpenGL.
Amazing, huh?

Presi said:
Open it and follow the numbers:
1. select the HL2.exe file in the Half-Life 2 folder
2. select any file inside the Half-Life 2\bin folder
3. select Steam.exe
then check these options:
- Under the section Pixel and Vertex Shader: FORCE LOW PRECISION PIXEL SHADER
- Under the section Remove stuttering: PERFORMANCE MODE
- on the bottom left: FORCE HOOK.DLL
If you haven't changed the dxsupport.cfg file with the method described at the beginning of this thread, you can obtain the same result by typing the ATI Vendor and Device ID into the DIRECTX DEVICE ID'S section; there are only two device fields, though.
....
In the end 3D-Analyze gave me an error, CREATEPROCESS FAILED, but I launched HL2 anyway. The water looked awesome, with awesome detail, and I noticed a boost in performance too, I think around 20/30%, which allowed me to play the WATER HAZARD level with these settings: 1024x768, everything max, water reflection set to ALL, 2xAA, 4x anisotropic, with frame rates ranging from 40 to over 150 fps.
Now, this tweak is GREAT for proving what kind of performance hit Valve kindly provided GeForce FX users with... but it's obviously not usable to play the game with. Why?

Frosteh said:
While writing that marathon I decided to try out some of this stuff.
I downloaded the program, installed it, and tried it out. It wouldn't launch HL2 itself; however, testing showed that it was in fact affecting HL2 when it was loaded and HL2 was run from Steam.
Game Settings:
1024x768
Maximum everything
DX9.0 enforced with the bugs fixed (using the ATI 9800 product ID method)
Driver Settings:
Driver Version 61.77
2xQAA (acts like 4xAA at the speed of 2xAA; AA that I have grown to love)
4xAF
High Quality
V-sync: OFF
Trilinear Optimisations: OFF
Anisotropic Filtering: OFF
http://www.hackerz.tc/hzinternal/tot/HL2-PixelShader-16bit-and-32bit.jpg
The picture is around 700 KB and is a side-by-side comparison of the two 1024x768 screenshots, combined in Adobe Photoshop and saved to JPG at maximum quality (100) with no other alterations made.
The "cl_showfps 1" command was used to display the current average FPS and the screenshots were taken with the in game screenshot capture.
32-bit is on the left, 16-bit is on the right; frame rates are roughly 29 FPS and 41 FPS respectively, and performance was obviously a lot better in game with 16-bit forced. While this area ran particularly badly compared to most other areas, I considered 30 FPS playable, but with 41 FPS I could easily raise the resolution one step to 1280x960.
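As a quick sanity check on those numbers, the relative gain from forcing 16-bit precision works out like this (a small calculation using only the two frame rates quoted above):

```python
# Frame rates reported above for the same scene at the two precisions.
fps_fp32 = 29.0  # 32-bit (full precision) path
fps_fp16 = 41.0  # 16-bit (forced partial precision) path

# Relative speedup from the precision drop.
speedup = (fps_fp16 - fps_fp32) / fps_fp32
print(f"forcing FP16 is about {speedup:.0%} faster here")
```

That is a bit above the "boost of 20/30%" estimated earlier in the thread, which fits since this area was noted as running particularly badly.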
Machine specs for anyone who missed them:
XP3000 @ 11.5x200
1Gb PC3200 Ram @ 200
FX 5900 Ultra Modded to FX 5950 Ultra, further overclocked to 550/975
Let me know if the screenshot method is not accurate enough; if you guys want it done again with other methods it will have to wait until tomorrow, I'm afraid.
lol, I don't think so. I was just saying random crap hoping to be right.

6800GTOwned said:
Don't toy with my emotions!!!
........
That's not possible!..... is it?
Yeah, that's the real kicker.

tranCendenZ said:
lol i called this so long ago
Funny, even at FP32 all the time, the 6800 still kicks this game's ass. But it would be even faster if Valve had used good shader programming and put in partial-precision calls where appropriate.
It was a shame that Doom 3 wasn't coded more appropriately for Radeon cards too.

dderidex said:
it's a damn shame Valve didn't code this right.
Er, in what way?

Mister E said:
It was a shame that Doom 3 wasn't coded more appropriately for Radeon cards too.
Can't you read... they didn't just tweak performance on ATI cards, they crippled performance on Nvidia cards. That's just low.

mohammedtaha said:
If Valve really did tweak to improve performance for ATi, then that's their own decision. They can do whatever they want to improve performance for the company that sponsors them. Or did you miss the ATi sign on the CD or whatever you used to purchase the game?
Er... can you read?

defiant said:
Can't you read... they didn't just tweak performance on ATI cards, they crippled performance on Nvidia cards. That's just low.
On a side note, I think this thread should be stickied and renamed to something like "Half-Life 2 Performance Tweak for Nvidia Cards" so people actually know what it's about.
This in no way helps out ATi, as they can run variable FP also. They hindered the lower-end graphics users because they made it run at 32 all the time. This isn't a proprietary issue (which you're saying it is with your instruction example); it's an issue of Valve basically going out of their way (FP32 I assume would take longer than variable 16-32... or do they have to write code sets for both?) to cripple the FX series when they didn't have to.

Legend said:
Er... can you read?
First of all, you're assuming something that hasn't been proven yet, by you especially or anyone for that matter.......
Also, I'm in agreement that they didn't "cripple" performance on nVidia cards, if this does turn out to be true. They optimized for ATI, just like software companies do for Intel processors. So you're saying that if software company X makes their product work better when a processor supports SSE3 (which is done, btw), that means they're "crippling" Athlon 64 CPUs? Didn't think so.
I think you're right about the sticky part, though, if this does turn out to work.
There is a reasonable expectation in this industry that all video cards are supported as best they can be. Sure, in some cases extra features that may only exist on one brand are utilized, much like SSE2/3 with processors. However, when you have to tell the game that your card is some other card in order to get the game to run properly, there is a problem. If this problem is real, I really hope Valve issues a patch to fix it. I am going to be optimistic here and think they will.

Legend said:
Er... can you read?
First of all, you're assuming something that hasn't been proven yet, by you especially or anyone for that matter.......
Also, I'm in agreement that they didn't "cripple" performance on nVidia cards, if this does turn out to be true. They optimized for ATI, just like software companies do for Intel processors. So you're saying that if software company X makes their product work better when a processor supports SSE3 (which is done, btw), that means they're "crippling" Athlon 64 CPUs? Didn't think so.
I think you're right about the sticky part, though, if this does turn out to work.
The "war" over which engine is technically better was over before it started... and the winner was Doom 3. Half-Life 2 looks excellent... because Valve has amazing texture artists; id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to take textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenomenal: D3's engine lighting HL2's world.

mohammedtaha said:
there's no other game out there that can prove itself against the HL2 engine... not even D3... cause all I've seen in D3 is shadows and skin texture and lighting... everything else was dark or hidden in shadows...
There's no war to start over the different engines... it's obvious which one is better NOW...
That really depends on what you're thinking about when you talk about the WHOLE engine. I really, really love the way Half-Life 2 looks graphically, but I think that INDOORS Doom 3 probably has the better engine; outdoors I don't know, since D3 has none.

DanK said:
The "war" over which engine is technically better was over before it started... and the winner was Doom 3. Half-Life 2 looks excellent... because Valve has amazing texture artists; id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to take textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenomenal: D3's engine lighting HL2's world.
Don't get me wrong, I'm not saying that Source is a bad engine. It's just that a lot of people are saying things about the engines when they are really commenting on things not related to the actual engines. HL2 is, imo, a much better game than D3. But D3's engine is capable of much more, I think.
Read the post. First, he had to tell HL2 he had a different card to get it to run without artifacts. That is the real issue here, as of course the Radeon card he was pretending to be would use FP24, since that is its native mode. However, the parent found a perfectly acceptable way to run DX9 on an FX card, and all you can say is "he shouldn't have bought that card."

Met-AL said:
So, it's Valve's fault that they used the DX9 spec? FP16 is not DX9 spec. FP24 or HIGHER is DX9 spec. FP16 is LOWER than FP24.
It isn't like anyone was forcing you to buy an FX card. It was well known that they sucked in this respect and in shader performance. There was no conspiracy to pull a fast one. Go back a year and these boards were full of information pointing to the fact that the FX line had problems... big problems. You chose not to take that advice, and now you're mad that you have to run a hack to get your card to run a game in DX9 mode.
No, you read the posts. The problem is that DX9 requires FP24. Since the FX can do 16 or 32, it has to do 32, which kills the performance. The hack makes the Source engine render in FP16, which is not DX9 spec. Complaining that Valve stuck to the DX9 spec and saying they suck is stupid. The problem is not with Valve; it is with the hardware of the FX cards. The same goes for the D3 engine on ATi cards.

obs said:
Read the post. First, he had to tell HL2 he had a different card to get it to run without artifacts. That is the real issue here, as of course the Radeon card he was pretending to be would use FP24, since that is its native mode. However, the parent found a perfectly acceptable way to run DX9 on an FX card, and all you can say is "he shouldn't have bought that card."
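The precision gap being argued over here is easy to demonstrate outside a shader. This is a minimal sketch (Python/NumPy standing in for the GPU, nothing from HL2 itself) of how few significant digits a 16-bit float actually keeps compared with a 32-bit one, which is why globally forcing FP16 can produce banding even though it is faster:

```python
import numpy as np

# Store the same number at the two precisions the FX series supports.
# (FP24, the DX9 baseline the Radeons use, sits between these two.)
value = 0.1
as_fp16 = float(np.float16(value))  # ~10-bit mantissa, about 3 decimal digits
as_fp32 = float(np.float32(value))  # ~23-bit mantissa, about 7 decimal digits

print(as_fp16)  # 0.0999755859375, already wrong in the 4th decimal place
print(as_fp32)  # wrong only around the 8th decimal place

# Smallest representable step near 1.0 at each precision:
print(float(np.finfo(np.float16).eps))  # 2**-10, about 1e-3
print(float(np.finfo(np.float32).eps))  # 2**-23, about 1.2e-7
```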
I think you need to clarify your perception of what an "engine" is. I guess your definition of "engine" excludes everything that HL2 does better than D3, i.e. textures, water, physics simulation, AI, long-distance rendering... everything but the lighting, of course, since D3 is nothing but lighting effects.

DanK said:
The "war" over which engine is technically better was over before it started... and the winner was Doom 3. Half-Life 2 looks excellent... because Valve has amazing texture artists; id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to take textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenomenal: D3's engine lighting HL2's world.
Don't get me wrong, I'm not saying that Source is a bad engine. It's just that a lot of people are saying things about the engines when they are really commenting on things not related to the actual engines. HL2 is, imo, a much better game than D3. But D3's engine is capable of much more, I think.
Uh, actually DX9 does allow FP16 as part of the spec for PS2.0 (_pp = partial precision, which is what 3D-Analyze is forcing): http://msdn.microsoft.com/library/d...ixelShaders/Instructions/Modifiers_ps_2_0.asp

Met-AL said:
So, it's Valve's fault that they used the DX9 spec? FP16 is not DX9 spec. FP24 or HIGHER is DX9 spec. FP16 is LOWER than FP24.
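To see why partial precision is usually safe for plain color math (the "where appropriate" point made earlier in the thread), here is a rough illustration. Python/NumPy stands in for the shader ALU, and the `modulate` function is made up for this sketch, not any D3D API. Since the framebuffer quantizes to 8 bits per channel anyway, FP16's rounding error typically vanishes in the quantization:

```python
import numpy as np

def modulate(base, light, dtype):
    # Toy stand-in for a pixel shader multiplying a texture sample
    # by a light color at a chosen internal precision.
    return (base.astype(dtype) * light.astype(dtype)).astype(np.float32)

# A "texture sample" and a "light color" in the normalized [0, 1] range.
base  = np.array([0.730, 0.412, 0.095], dtype=np.float32)
light = np.array([0.900, 0.850, 0.700], dtype=np.float32)

full    = modulate(base, light, np.float32)  # full-precision path
partial = modulate(base, light, np.float16)  # the _pp / FP16 path

# Quantize to the 8-bit framebuffer, as the display ultimately does.
def to_8bit(c):
    return np.round(c * 255).astype(int)

print(to_8bit(full))
print(to_8bit(partial))  # matches, or differs by at most 1 step per channel
```

Where FP16 does break down is in long dependent instruction chains and large-range values such as texture coordinates, which is why _pp has to be applied per instruction by the shader author, and why globally forcing it with 3D-Analyze produces artifacts.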