Apple740 said: Firingsquad is showing some benches where it looks like Nv has a problem with water shaders.
http://www.firingsquad.com/hardware/half-life_2_performance/default.asp

Did they use the same drivers as [H]?
^eMpTy^ said: why the hell would you have a 2 page report...and have 1/6th of it devoted to AA modes that no one uses? why would you even bother comparing super sampling modes on the 6800s to the 6x mode on the XTPE??? that makes ZERO sense.
how about comparing the 6x mode to nvidia's 4x and seeing how much of a tangible difference in image quality it makes, since we all know that the super sampling modes on nvidia hardware are COMPLETELY DIFFERENT. at least give a shot of what super sampling can do vs multi-sampling...this blind comparison with no explanation is very confusing, especially given the brevity of the preliminary benches...
I give it an hour before people are in here talking about how the XTPE is 100% faster than the 6800U based on those graphs...this hasn't been in any other [H] vid card review that I've seen...so why start now?

Calm down dude.
FingerSlut said: anyone try this game with 2 gigs of ram? I hear the review systems had 2 gigs. I've tried with 1 gig and 1.5 gigs, and the stuttering did go down, just not totally; I'm just out of RAM slots.

Yeah, the Norton defragger did help a lot with the stuttering I had prior. I used to use Executive Software's Diskeeper defragger, but it didn't do as good a job defragmenting with Half-Life 2 as the Norton software did. It cut the stuttering by a significant margin.
heh, read the benchmarks, n/m
Moloch said: Didn't read the whole thread, but unless Brent turned on +r_fastzreject 1, the benchmarks are not good enough for me. The nvidia cards have this enabled by default and the ATI ones don't; in CPU-limited levels it could be slower, but the gains from said tweak are 40+ fps.
jon67 said: What are you saying here...? Is this a command for the HL2 console? Will this tiny command improve the performance of any non-CPU-limited ATI card? Including my 9800PRO?

http://www.beyond3d.com/forum/viewtopic.php?t=18170&sid=f63f6abc11083b04dbcc6f0a8bb28da4
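For anyone wondering how a tweak like this is actually applied: r_fastzreject is a Source-engine console variable, so it can be set in-game, at launch, or via a config file. A minimal sketch, assuming the cvar name as given in the thread (the default value reportedly differs per GPU vendor):

```
// Option 1: open the developer console (~) in HL2 and type:
r_fastzreject 1

// Option 2: pass it as a launch option so it is set before the map loads:
hl2.exe +r_fastzreject 1

// Option 3: add the line "r_fastzreject 1" to hl2\cfg\autoexec.cfg
// so it is applied every time the game starts.
```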
^eMpTy^ said: why the hell would you have a 2 page report...and have 1/6th of it devoted to AA modes that no one uses? why would you even bother comparing super sampling modes on the 6800s to the 6x mode on the XTPE??? that makes ZERO sense.
how about comparing the 6x mode to nvidia's 4x and seeing how much of a tangible difference in image quality it makes, since we all know that the super sampling modes on nvidia hardware are COMPLETELY DIFFERENT. at least give a shot of what super sampling can do vs multi-sampling...this blind comparison with no explanation is very confusing, especially given the brevity of the preliminary benches...
I give it an hour before people are in here talking about how the XTPE is 100% faster than the 6800U based on those graphs...this hasn't been in any other [H] vid card review that I've seen...so why start now?
Moloch said: But it's supposed to be head to head.
I agree if you're going for an out-of-box experience, but I think it would be worthwhile to see what happens when you enable that command, since users are reporting such large gains.
It seems kinda silly not to do it, if you even knew about the command, just to satisfy curiosity.
Mr Mean said: Hey Brent, did you try using the driver-forced AA/aniso and disabling the in-game AA? Are the results the same?
Moloch said: Didn't read whole thread, but unless brent turned on +r_fastzreject 1, the benchmarks are not good enough for me, as the nvidia cards have this enabled by default and the ati ones don't; in cpu-limited levels it could be slower, but the gains of said tweak are 40+ fps.

Why don't you ask Catalyst Maker about this? Seems like a waste not to make good use of his expertise while he's here.
Moloch said:Calm down dude.
Apple740 said: Firingsquad is showing some benches where it looks like that Nv has a problem with water shaders.

I've already mentioned this earlier in the thread. It looks like the 66.93 drivers that ATI and FS used struggle on the Canal level; the 67.02 drivers that [H] and Anand used seem much better. Not sure why that is, but 67.02 is supposed to have better shader performance and the Canal level seems pretty shader-intensive.
coz said: Why don't you ask Catalyst Maker about this? Seems like a waste not to make good use of his expertise while he's here.

http://www.beyond3d.com/forum/viewtopic.php?t=18170
spyderz said: anybody care to explain how to launch the [H] demos?

It's in the article.
spyderz said: anybody care to explain how to launch the [H] demos?
Brent_Justice said: put them in your half life 2/hl2 directory (unzipped, of course).
Bring down the console. If you just want to play the demos at regular speed, type: playdemo demoname
(demoname = the name of the demo)
If you want to run a timedemo, then type: timedemo demoname
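Put together, the console session Brent describes would look something like this (the demo name below is a placeholder; substitute the actual name of the .dem file you extracted, without the extension):

```
// copy the unzipped .dem files into ...\half-life 2\hl2\
// then open the developer console (~) and run:
playdemo hardocp_demo      // plays the demo back at normal speed
timedemo hardocp_demo      // runs it as a benchmark and reports avg FPS
```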
spyderz said: anybody care to explain how to launch the [H] demos?

https://mywebspace.wisc.edu/phora/web/HL2_demos/
macatak said: Hi everyone, I need some help with running the HL2 HardOCP timedemos. I've downloaded and extracted the demos to my HL2 folder and enabled the developer console, but I'm not sure what to type after "timedemo". I've tried a few different commands but I can't get the demo to run. So if someone could give me (a noob) some instructions, I would appreciate it.
thanks
Brent_Justice said:It would be nice if the timedemo result HL2 gives you also shows the Min FPS. Right now all it shows is the AVG FPS.
When we do gameplay evaluation using HL2 we will of course note the Min FPS in our graphs using FRAPS as usual. IMO the Min FPS is more important than average FPS.
Moloch said: Didn't read whole thread, but unless brent turned on +r_fastzreject 1, the benchmarks are not good enough for me, as the nvidia cards have this enabled by default and the ati ones don't; in cpu-limited levels it could be slower, but the gains of said tweak are 40+ fps.
digitalwanderer said: True, but that doesn't really mean the cards were compared exactly "apples to apples".
Next comes the image quality thing, I just got done comparing in-game AA/AF to forced thru control panel/radlinker and the forced one is just drop-dead gorgeous to me....and I'm getting amazing framerates at 6xAAt2 16xAF!
I LOVE THIS GAME!!!!
Next up, I gotta somehow tear myself away from playing on me X800 to see how it looks on a GT....but I probably won't be able to until I finish the game.
Good benchies, sorry about me earlier confusion about v-sync....I was still pre-coffee/my head in HL2 and I thought the graphs all topped out at 100.
(BTW-HI TERRY!!!!)
Gavinni said: How's it apples to apples if it's enabled on one and not on the other? Sounds like laziness to me...
Gavinni said: How's it apples to apples if its enabled on one and not on the other? Sounds like laziness to me...

Jesus-Hopscotching-Christ, it's enabled by default on NV cards BUT not by ATi. It's ATi/NV's fault for that; go cry to them about it. It's not Brent's or any other reviewer's job to enable it on both, since that's not the experience the end user will get "out of the box."
CrimandEvil said:Jesus-Hopscotching-Christ, it's enabled by default on NV cards BUT not by ATi. It's ATi/NV's fault for that, go cry to them about it. It's not Brent or any other reviewer's job to enable it on both since thats not the experience that the End user will get "out of box."
Gavinni said: It's actually not ATi/NV's fault, it's Valve's, but w/e...
burningrave101 said: CP AA/AF is only there for games that do not have the option of enabling it in-game. CP AA/AF doesn't apply it the same way the game developer intended, and it's usually more optimized to try and boost fps. Control Panel-enabled AF on ATI hardware is more optimized, especially for DX9. "Optimized" means more things like brilinear filtering. You'll get a little better performance with a little lower IQ.

True to an extent; I forced it through Radlinker using the "force all levels" AF setting to improve the visuals, even though it's a bit of a performance hit.
burningrave101 said: Ya know, it just seems like it's been one thing after another this year that people have screamed needs to be enabled on the X800s to make the performance "fair" against the NV40s, lol.
burningrave101 said: And whose fault is it that it's disabled by default on ATI cards? It's sure the hell not Brent's fault, and it's not his responsibility to use console commands to try and make the benchmarks seem more "fair".

I never said it was his fault. I said that if he had known about it (which he didn't) he should enable it to see how much of a gain there is; it's like enabling SM3 for nvidia cards in Far Cry to see what kind of gains are to be had.
CP AA/AF is only there for games that do not have the option of enabling it in-game. CP AA/AF doesn't apply it the same way the game developer intended, and it's usually more optimized to try and boost fps.
Control Panel-enabled AF on ATI hardware is more optimized, especially for DX9. "Optimized" means more things like brilinear filtering. You'll get a little better performance with a little lower IQ.
And the benchmarks were very much "apples to apples" because it was default to default, and out of the box to out of the box. If you start changing settings around to try and boost performance on one card just because it doesn't have something enabled by default, then it's no longer apples to apples, because the cards aren't running at their default settings.
And like Brent said, the majority of NV40 and X800 users don't have a clue about some console command from the Beyond3D forums that boosts performance on ATI cards, so why show results the majority of users will never see without that command?
SnakEyez187 said:Why does it seem like you and some other users are trying to "sell" one company over another? Why must you make posts just to put doubt in peoples minds over their own decision on what to spend their own money on?
burningrave101 said: Because maybe that person that wants to spend $400-$500+ would like to spend their money on the "best" video card, and with the help of people like me and others who argue which is better, they are able to get all the facts laid out in front of them and don't just go off some half-baked review they read, or what some friend of theirs said who had no clue what the hell they were talking about.

That's just the thing... I think I know a bit about what I'm talking about, having played with both, and there just ISN'T a "best card" right now; it comes down to a matter of preference.