According to Valve's Gary McTaggart...

Michael Younger

This will be old news for many of you, but Valve Software's resident technical guru, Gary McTaggart, was questioned about the latest generation of graphics technology.

When asked which technology was the best for Half-Life 2 - the NV40 series (GeForce 6800) or the R420 series (Radeon X800) - he responded that the X800 is... 20% FASTER!

Ho-lee!

That's quite a difference, in my opinion.

He was then asked to put together three PC setups that would be sufficient to play Half-Life 2 on - a low-end, a mid-range, and a high-end system.

Low-End:
2 GHz P4
512MB RAM (he would prefer that you have 1GB)
ATI Radeon 9600 PRO 128MB

Mid-Range:
3 GHz P4
1GB RAM
ATI Radeon 9800 PRO 128MB

High-End:
3.6 GHz P4
2GB RAM
ATI Radeon X800 XT 256MB

Note that his systems all have an ATI video card in them!
 
lol... ATI and Valve are partners; there are "optimizations" in the HL2 engine for ATI-based cards. HL2 is supposed to be the ATI "killer app".
 
HL2 had better be an ATI killer app... I plan on killing it daily, and I have an ATI card.

~Adam
 
But what happens if, over the course of the next 36 months, many of the great games end up using the Source engine?

I'm really stoked about Half-Life 2.

I'm also pretty stoked about Vampire: The Masquerade - Bloodlines, which of course also uses the Source engine.

What difference does it make if Valve is optimizing its engine for ATI cards - wouldn't we be much happier gamers if every gaming company took the time to optimize their software for specific cards? 20% is 20% in my mind - who cares how they achieved it?

The only people I would foresee objecting to this 20% increase would be... oh, let me go out on a limb here and say... uhm, nVidia owners!
 
HIGH END

He says 2GB of memory - would that really help in Half-Life 2? Exactly how does it help the game: playing it on a 23" LCD at 1920 res, or just playing it at 1280 res with 6xAA?
 
Michael Younger said:
But what happens if, over the course of the next 36 months, many of the great games end up using the Source engine?

I'm really stoked about Half-Life 2.

I'm also pretty stoked about Vampire: The Masquerade - Bloodlines, which of course also uses the Source engine.

What difference does it make if Valve is optimizing its engine for ATI cards - wouldn't we be much happier gamers if every gaming company took the time to optimize their software for specific cards? 20% is 20% in my mind - who cares how they achieved it?

The only people I would foresee objecting to this 20% increase would be... oh, let me go out on a limb here and say... uhm, nVidia owners!

JEEZE dude... calm down... you gotta realise... this game won't rape your card the way Doom 3 does... it looks almost as good and is A LOT easier on the system... NVIDIA cards WILL STILL PLAY IT... 20% isn't a lot when you're talking 100fps at 16x12 with 2xAA and 8xAF... yes, there are ATI optimizations... but the NVIDIA card is STILL badass and will handle them very well... calm down...
 
How does having 2 gigs of RAM help while playing games, you ask? It helps because the game can keep more data in memory instead of having to use the paging file. Any time a program has to use the paging file, it takes a huge performance hit, because the hard drive has to be written to and read from in order to keep doing its thing. With 2 gigs of RAM, the program gets a lot of storage in RAM instead of the paging file.
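To put rough numbers on that, here's a back-of-the-envelope sketch (Python; the latencies are ballpark figures I'm assuming, not anything measured in HL2):

Code:
# Rough illustration of why hitting the paging file hurts.
# Ballpark latencies only - assumed, not measured.
RAM_ACCESS_NS = 100   # ~100 ns to touch main memory
PAGE_FAULT_MS = 10    # ~10 ms for a hard-drive seek + read

page_fault_ns = PAGE_FAULT_MS * 1_000_000  # ms -> ns
ratio = page_fault_ns // RAM_ACCESS_NS
print(f"A hard page fault is roughly {ratio:,}x slower than a RAM access")
# -> roughly 100,000x slower, which is why keeping game data resident
#    in 2GB of RAM beats spilling to the paging file.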
 
This shouldn't stop me from buying a 6800nu, eh? I wouldn't think so; it'll still run well, right? The next games I'll be playing are Far Cry, HL2, and Stalker. It's not like I would be better off getting a 9800pro, right? This seems like just more nvidia-vs-ati propaganda.
 
I seriously doubt ATI cards will be much faster than NVIDIA cards, at least with NVIDIA's driver optimizations.
 
No, it wouldn't stop you from buying a 6800nu... but it might give you good reason to save for the GT or X800 Pro.

~Adam
 
I'm willing to bet you will be able to have an enjoyable gaming experience on both nvidia and ati current-gen hardware...
 
Brent_Justice said:
I'm willing to bet you will be able to have an enjoyable gaming experience on both nvidia and ati current-gen hardware...

Yea, once NV gets their driver optimizations out, they'll catch up to the inherent ATI optimizations. But even if they don't, a 10% or 20% hit is not all that bad for just one game engine, IMO.

~Adam
 
Didn't they say ATI would be 30-40% faster a year or so ago? Funny how that number dropped. What's to say it won't drop again?

Besides, when ATI has their d*cks in Valve's mouth, what do you expect them to say? :rolleyes:

Canadian, my 6800nu gets 100.33fps on the video stress test @ 1024x768 4xAA/4xAF... just a little FYI for ya.
 
Weren't they saying the same thing about CS:S? Wasn't that the whole purpose of the VST? Wasn't it all a bunch of horse shit?

This is getting downright retarded. :rolleyes: I bet we'll see more like 5%, lol.
 
I think you all are still stuck in the old 3dfx mentality that says xxx% faster = better.

Such is not the case anymore with these games and video cards; there are other factors and variables.

Even if Card A is "20% faster" (and I hate even typing that out), that still doesn't mean you won't have a wonderful, enjoyable gaming experience on Card B; each might have different properties, for example IQ. Also, 20% could be the difference between 100fps and 120fps, heh, BOTH of which would be fine.

The point is, don't get so caught up in "bla bla bla is xx% faster" etc. etc.

What matters is the overall gameplay each card delivers in a game.

And with that, I'm willing to bet you will have a great time gaming with hardware from nvidia or ati.
 
HL2 isn't out yet... it's coming out in mid-November, and I bet the 4.11 Catalysts will come out very close to that, which could potentially mean the kind of speed increase ATI and Valve have been boasting about. MAYBE. Obviously I'm hopeful, because I have an ATI card, heh.
 
Doh-nut, I love the quote in your sig. It really applies to this thread. :D
 
Brent, you of all people should know that at stressful settings, where 20% of 30fps is the difference between playable and unplayable, a 20% hit is a killer. However, I said what I said above because most people don't play at stressful levels. Me, for example: I play at 1024x768 to maximize my AA and AF while staying around 85fps (my beautiful refurbed Trinitron 21-inch monitor's max refresh at up to 1600x1200) to lower eye strain, since I have weak eye muscles (there's a specific name for that, I forget it) that force me to zone out (and, infrequently, get migraines, though my strategy keeps those rare), and that way I keep my concentration at max longer while still getting good IQ. So anyhow, back on topic: it really depends on the type of play you're looking at. 20% isn't so bad at 100fps, but 20% can be a killer at 50fps for playability.
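Just to spell out the arithmetic behind that point, here's a quick sketch (the baseline framerates are purely illustrative):

Code:
# What a flat 20% deficit means at different baseline framerates.
# Illustrative numbers only.
def after_20_percent_hit(fps):
    return fps * 0.8

for baseline in (100, 50, 30):
    print(f"{baseline} fps -> {after_20_percent_hit(baseline):.0f} fps")
# 100 fps -> 80 fps: still perfectly smooth.
#  50 fps -> 40 fps: noticeable, borderline in a firefight.
#  30 fps -> 24 fps: the difference between playable and not.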

~Adam
 
CleanSlate said:
Brent, you of all people should know that at stressful settings, where 20% of 30fps is the difference between playable and unplayable, a 20% hit is a killer. However, I said what I said above because most people don't play at stressful levels. Me, for example: I play at 1024x768 to maximize my AA and AF while staying around 85fps (my beautiful refurbed Trinitron 21-inch monitor's max refresh at up to 1600x1200) to lower eye strain, since I have weak eye muscles (there's a specific name for that, I forget it) that force me to zone out (and, infrequently, get migraines, though my strategy keeps those rare), and that way I keep my concentration at max longer while still getting good IQ. So anyhow, back on topic: it really depends on the type of play you're looking at. 20% isn't so bad at 100fps, but 20% can be a killer at 50fps for playability.

~Adam
Agreed. Smooth framerates (i.e. a faster card) lead to a much more enjoyable gaming experience than squinting at the screen trying to figure out if your anisotropic filtering is REALLY running at 6x. Anisotropic filtering has nothing to do with gameplay. Framerate DOES.

You will never get killed online because your trilinear filtering isn't as "pure" as it "should be".

But you will always die when you are trying to turn around in-game and the framerate drops to 5 frames per second.
 
KingPariah777 said:
But you will always die when you are trying to turn around in-game and the framerate drops to 5 frames per second.

ROFL, yea 5fps is like a slide show on crack, and you feel the crack seeping into your head from the screen.

~Adam
 
Nvidia will definitely catch up with ATI in HL2 performance after seeing how many cards Doom 3 helped them sell. Don't forget that the 6800s are faster than the X800s in CS:S today.
 
sodaz said:
Nvidia will definitely catch up with ATI in HL2 performance after seeing how many cards Doom 3 helped them sell. Don't forget that the 6800s are faster than the X800s in CS:S today.

Not when the ATI beta 8.07s are used: http://www.xbitlabs.com/articles/video/display/cs-source.html

These are what the 4.11 Catalysts will be based on (support for 256MB, among other things). Other benchmarks from other sites use NVIDIA's latest betas, but only Catalyst 4.9.
 
All I know is that I get 115 fps in the video stress test with my 6800 Ultra at 1600x1200, all details maxed, 4xAA 8xAF. So if ATI cards get 20% more than me, good for them, but frankly I don't think anyone will notice a difference between 115 and 138 fps. I don't think ATI cards will get 138 fps; truth be told, I think they will be 5-10% better at MOST, if even that. So nvidia owners, do not worry!
 
Brent_Justice said:
Even if Card A is "20% faster" (and I hate even typing that out), that still doesn't mean you won't have a wonderful, enjoyable gaming experience on Card B; each might have different properties, for example IQ. Also, 20% could be the difference between 100fps and 120fps, heh, BOTH of which would be fine.

Pretty much what I was gonna say. I go to movies that run at, what, 24fps, and they look perfectly smooth. The max refresh rate on my LCD panel is 60Hz, which means each pixel can only be cycled through 60 times per second anyway.

Sure, I like high framerates, but if my fps never dips below 30, then odds are pretty good that I will never notice.
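For the curious, the refresh-rate cap works out like this (a quick sketch, assuming the 60Hz panel mentioned above):

Code:
# A 60 Hz panel redraws every 1/60 s (~16.7 ms), so it can show at most
# 60 distinct frames per second, no matter what the GPU renders.
REFRESH_HZ = 60

def frames_actually_shown(rendered_fps, refresh_hz=REFRESH_HZ):
    # Frames rendered beyond the refresh rate never reach the screen whole.
    return min(rendered_fps, refresh_hz)

for rendered in (30, 60, 115, 138):
    print(f"GPU renders {rendered} fps -> panel shows at most "
          f"{frames_actually_shown(rendered)} fps")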
 
doh-nut said:
Not when the ATI beta 8.07s are used: http://www.xbitlabs.com/articles/video/display/cs-source.html

These are what the 4.11 Catalysts will be based on (support for 256MB, among other things). Other benchmarks from other sites use NVIDIA's latest betas, but only Catalyst 4.9.

Thanks for posting the link, but did you even check it? Let me count how many times the PE was faster than the Ultra.

4 times, vs. 14 times where the Ultra was faster.

I didn't count the CPU-dependent levels Havana and Office. ATI takes the lead in Havana, and NV in Office, so it wouldn't make a big difference anyway.

Yes, the PE is faster at 1600x1200 and in Eye Candy mode, but overall you can't say that's a "total win" for ATI, not even at that particular resolution, since the NV cards are not that far away or in unplayable territory. Based on that, I will not buy an ATI card. I will wait to see what HL2 does first, but considering this X-bit labs benchmark on more or less "even" beta drivers (IQ seems very similar based on their jpg screenshots), I would rather get a GT as my next card, and it's also cheaper over here than the X800 Pro (non-VIVO) anyway.

ATI needs the "magic HL2" 20% for me to even re-consider it.
 
A PR rep says their partner's card is going to be 20% faster than the opposing release partner (NVIDIA). NVIDIA and its partner (id) got their game out on time, while Valve has missed, I think, 4 release dates spanning almost two generations of technology (which both cards have had much time to improve over), and Valve's partner's (ATI) card got hammered at the beginning (but is starting to catch up, thankfully)...

Hmmm? Irony?

And I highly doubt it will be 20%; maybe 5-10%, like other people have mentioned, if that.

Just my $.02.

Oh, and if you say "Well, nVidia called that on ATi, and look at what happened" - well, we all know ATi had crappy OpenGL driver support at the time (and somewhat still does), but look at what's happened recently with these magical 8.07 betas... They're catching up.

P.S. I also think there will be a new nVidia driver set before Half-Life 2.
 
DropTech said:
P.S. I also think there will be a new nVidia driver set before Half-Life 2.
Hopefully, since there hasn't been an official, non-beta WHQL driver released since July.
 
Blackdog said:
Hopefully, since there hasn't been an official, non-beta WHQL driver released since July.

True, however the 66.81s have been WHQL'd for the 6 series. They are still beta though - the kind of thing ATI calls "hotfixes", IMHO a better name, since a lot of people think "beta" is bad.

Actually, I hate that game developers get "sponsored" by IHVs. If this trend (one-sided optimizations) keeps getting stronger, then I simply won't buy games like that anymore, and I don't care if it's a big title or not. :mad:

In my book it's okay to optimize for a slow card, to make the game playable and the way the devs meant it to be played. However, the hardware/drivers should make the difference in how well the cards perform in games, and the game developers should not take sides. Lazy devs suck. It's easy to "optimize" only for one side and count the IHV money. :rolleyes: If this continues, our rigs won't have SLI (using that just as shorthand for best performance); they will need one vid card from each IHV. :rolleyes:
 
CleanSlate said:
Brent, you of all people should know that at stressful settings, where 20% of 30fps is the difference between playable and unplayable, a 20% hit is a killer. However, I said what I said above because most people don't play at stressful levels. Me, for example: I play at 1024x768 to maximize my AA and AF while staying around 85fps (my beautiful refurbed Trinitron 21-inch monitor's max refresh at up to 1600x1200) to lower eye strain, since I have weak eye muscles (there's a specific name for that, I forget it) that force me to zone out (and, infrequently, get migraines, though my strategy keeps those rare), and that way I keep my concentration at max longer while still getting good IQ. So anyhow, back on topic: it really depends on the type of play you're looking at. 20% isn't so bad at 100fps, but 20% can be a killer at 50fps for playability.

~Adam

That's why you find the settings that are playable on a certain card (resolution/AA/AF).

Why play at a setting that isn't playable?

Most people adjust their settings to get the highest IQ possible that is playable on their card.
 
CS:S plays fine at 1600x1200 4xAA/8xAF on high-end nvidia cards, so being 20% slower is not a problem. Even if HL2 is more demanding, it should play fine. We'll know soon.

Remember, the problem with Doom 3 was that the framerates weren't playable on ATI's top-end card (X800 XT PE) at 1600x1200 4xAA/8xAF, so being 50% slower in that case *was* a problem. The shader replacements in Catalyst A.I. have fixed that now, though.
 
I think it's fine to optimize against a given feature set - e.g. Far Cry optimizing for SM3.0 - as long as the competitor has the option of putting out hardware that supports that feature set and seeing the same kind of performance increase. I hope the rumors about future SM3.0 support in Source are true; it looks like it's about to give the CryEngine a nice little boost.

Meh. My GT kicked ass in Doom3, and I'm betting it'll have no trouble with HL2. Without a doubt, it'll handle Quake 4 well. I don't think there's anything coming out before the next Unreal Engine that my card can't handle. So, why should Nvidia owners worry about their performance? We're doing just fine, thank you.
 
I dislike this new penchant for making games meant for a specific card. Sure, at the high end it isn't such a big deal (even if I am rather unhappy with D3 performance on my XT PE).
However, I feel that on mid- and low-end gamers' machines it suddenly gets a lot more noticeable.
 
The latest and greatest from both camps will perform very well when Half-Life 2 comes out. I personally won't care one bit that my XT PE will outperform a 6800. I bought a high-end card this generation because I wanted to turn all the eye candy on, play games at 1600x1200, and actually get some decent fps.
 
Worldhammer said:
I dislike this new penchant for making games meant for a specific card. Sure, at the high end it isn't such a big deal (even if I am rather unhappy with D3 performance on my XT PE).
However, I feel that on mid- and low-end gamers' machines it suddenly gets a lot more noticeable.

I just had a flashback to pre-VESA days. You couldn't get past 640x480x16 unless the program was made for it :(

I hope we don't get a rerun of the dark ages.
 
I'm still in limbo about whether HL2 will use SM 3.0 - does anyone know?
 
Michael Younger said:
Note that his systems all have an ATI video card in them!

And they all have Intel processors! I guess HL2 will perform better on an Intel Pentium 4-based system than on an AMD Athlon 64 one. :rolleyes:
 
Borealis said:
Didn't they say ATI would be 30-40% faster a year or so ago? Funny how that number dropped. What's to say it won't drop again?

Besides, when ATI has their d*cks in Valve's mouth, what do you expect them to say? :rolleyes:

Canadian, my 6800nu gets 100.33fps on the video stress test @ 1024x768 4xAA/4xAF... just a little FYI for ya.
That was with the R3xx vs. the NV3x, and it holds true; those cards run the game in mixed mode.
I don't think nvidia will ever catch up to ATI there; the XT has good reason to be faster than the 6800 Ultra. The Pro, however, won't be much faster than the GT, if at all.
BattleMaster said:
And they all have Intel processors! I guess HL2 will perform better on an Intel Pentium 4-based system than on an AMD Athlon 64 one. :rolleyes:
There's nothing wrong with doing a little marketing for whoever is giving you a bunch of money.
 