ATi and Valve alliance bad for nVidia?

obs said:
I think the main thing here is that you can get better IQ and almost as good performance just by tricking HL2 into thinking you have an ATI card. Also, ATI having full access to test and benchmark their cards while nvidia couldn't is another issue. I don't believe that is common among game developers.

no it's not common...the ATi/Valve deal was the biggest and most one-sided thus far...I think it's a shitty way to do business...mostly I blame Valve...ATi is doing what they need to do to sell video cards and that's fine...Valve selling out to the highest bidder is garbage...not to mention lying about the game being done to begin with...and then the whole steam debacle once you actually get the game...and then the game is buggy after all that...personally I think Valve has some serious management issues...
 
Moloch said:
is the 1/4 gamers the nvidia !!!!!!s, or uninformed people who bought the FX?
Even using FP16 it still is slow.
The DX8.1 mode isn't that bad either.. still looks excellent with my 8500 64MB with details maxed + 16AF.
Plays well for the most part too, a few slowdowns, but changing the texture detail to medium fixes it.

LOL...as much as you run your mouth around here and you have an 8500...
 
^eMpTy^ said:
no it's not common...the ATi/Valve deal was the biggest and most one-sided thus far...I think it's a shitty way to do business...mostly I blame Valve...ATi is doing what they need to do to sell video cards and that's fine...Valve selling out to the highest bidder is garbage...not to mention lying about the game being done to begin with...and then the whole steam debacle once you actually get the game...and then the game is buggy after all that...personally I think Valve has some serious management issues...

I agree with everything there.
 
^eMpTy^ said:
no it's not common...the ATi/Valve deal was the biggest and most one-sided thus far...I think it's a shitty way to do business...mostly I blame Valve...ATi is doing what they need to do to sell video cards and that's fine...Valve selling out to the highest bidder is garbage...not to mention lying about the game being done to begin with...and then the whole steam debacle once you actually get the game...and then the game is buggy after all that...personally I think Valve has some serious management issues...

The money was for the coupon to be included with the cards, so the game would be free. It wasn't to make ATi's cards faster than NV's in the game, despite what you might think.
 
CastleBravo said:
The 6800 GT is faster than the x800 Pro in Doom 3, Far Cry and HL2. And ATI doesn't sell a product comparable to the 6800 (faster than 9800 XT, slower than x800 Pro, but cheaper than either).

For instance:

http://graphics.tomshardware.com/graphic/20041004/vga_charts-08.html
http://graphics.tomshardware.com/graphic/20041004/vga_charts-09.html
http://www.hardocp.com/article.html?art=NjkyLDI=
http://www.hardocp.com/article.html?art=NjkyLDM=

So I wouldn't worry about it.

you are totally wrong ...

you should compare 6800 gt with the x800xt ...........not the 12 pipes pro !!!

and you should compare the 6800 nu with x800 pro.....both 12 pipes

results : ATI owns both !!
 
SoLiD_MasteR said:
you are totally wrong ...

you should compare 6800 gt with the x800xt ...........not the 12 pipes pro !!!

and you should compare the 6800 nu with x800 pro.....both 12 pipes

results : ATI owns both !!


The cards are completely different. 12 pipes on one != 12 pipes on another. It is similar to a car engine. One engine may be a V8, but perform better than a V10. If you want to compare cards, compare them as a whole, not one minute detail of the cards. The X800XT is the top-of-the-line card from ATI, no? Then let's compare it to the 6800 Ultra, which is in the SAME PRICE RANGE.

If you still feel like making a stupid comparison, lets look at how much they cost. I just got a BFG 6800 OC for $245, and it came with a free copy of HL2. I look on newegg, and the lowest priced X800 Pro is $390. Price/performance is far better on the 6800 if you look at it this way.

results: you need to go back to school.

As for trying to call me a nvidia !!!!!! or something similar - I have a Radeon 8500 that will be retired when I install my 6800. I don't prefer one company over the other. I'll buy what gives me the best performance for the least cost. I like value, as many people should. What it boils down to is that you should want both cards to do very well in comparison to each other. If one does poorly, then one company gets more market share, and video card prices go up.

Anyway, I'm done responding to this.
 
^eMpTy^ said:
LOL...as much as you run your mouth around here and you have an 8500...
I don't play games that need more (read: next gen games), but I am getting at least a 6800 NU soon.
I don't play FPS games religiously like most gamers do (at least here), I find it rather boring, so I just beat Doom3 using cheats, same with HL2, I just played to see the story. And the graphics, which again, played fine at 1024, max details and 16AF.
I'm into RPG games like NWN, KOTOR, and Vampire: Bloodlines, and all those games play fine at 1024; HL2 actually plays the best- I can max the details.
Btw, running my mouth? You're the king of that, you crap all over ati threads, I really have nothing on you :p
I didn't see much point in getting the 9700 cuz back then there weren't games that used DX9, and while FSAA is obviously one reason to buy it, or any 1st gen DX9 part, that's all secondary.
And I'm obviously going to get one of the 2nd gen parts, depending on the pricing in early January.
I'm not sure if you realize this, but there's a thing called the internet, which you're using to crap all over ati threads, and on this "internet" there are "sites" that review computer hardware, and you can read these reviews and decide if you like this part, and can suggest the part to people you know based on the review, and you can also talk to users of that card, or use that part yourself.
 
SoLiD_MasteR said:
you are totally wrong ...

you should compare 6800 gt with the x800xt ...........not the 12 pipes pro !!!

and you should compare the 6800 nu with x800 pro.....both 12 pipes

results : ATI owns both !!

The x800 Pro is the same price as the 6800 GT. The x800 XT is the same price as the 6800 Ultra. So my comparison is right, and you are being a mindless ATI slut.

It would be exactly like me comparing a 5950 Ultra to a 9600 XT just so I could declare nVidia the winner, even though one is about three times as expensive as the other. Total bullshit.

If the 12 pipeline x800 Pro can't compete with the 16 pipeline 6800 GT at the $400 price point, that's ATI's fault for selling a nerfed product. If they priced the x800 Pro closer to the 6800, the situation would be the exact reverse.
 
The main thing I don't understand is this:

I currently have a non-pro Radeon 9600 and I am upgrading to a faster card: the FX 5900XT

In EVERY review I have seen, the 9600 gets its ass handed to it on a silver platter by the 5900XT, so I consider it an upgrade....Until you throw HL2 into the mix...In which case the LOWLY 9600 whips the 5900XT's ass? I think something is REALLY wrong with that...
 
SoLiD_MasteR said:
you are totally wrong ...

you should compare 6800 gt with the x800xt ...........not the 12 pipes pro !!!

and you should compare the 6800 nu with x800 pro.....both 12 pipes

results : ATI owns both !!
Except that the NVIDIA cards are both $100 cheaper in those cases.
 
fallguy said:
The money was for the coupon to be included with the cards, so the game would be free. It wasn't to make ATi's cards faster than NV's in the game, despite what you might think.

So I guess shader day, the public benchmarks, and all the offhanded statements about ATi cards being so much faster on "next generation hardware" were just out of the kindness of Valve's little heart? :rolleyes:
 
Moloch said:
I don't play games that need more (read: next gen games), but I am getting at least a 6800 NU soon.
I don't play FPS games religiously like most gamers do (at least here), I find it rather boring, so I just beat Doom3 using cheats, same with HL2, I just played to see the story. And the graphics, which again, played fine at 1024, max details and 16AF.
I'm into RPG games like NWN, KOTOR, and Vampire: Bloodlines, and all those games play fine at 1024; HL2 actually plays the best- I can max the details.
Btw, running my mouth? You're the king of that, you crap all over ati threads, I really have nothing on you :p
I didn't see much point in getting the 9700 cuz back then there weren't games that used DX9, and while FSAA is obviously one reason to buy it, or any 1st gen DX9 part, that's all secondary.
And I'm obviously going to get one of the 2nd gen parts, depending on the pricing in early January.
I'm not sure if you realize this, but there's a thing called the internet, which you're using to crap all over ati threads, and on this "internet" there are "sites" that review computer hardware, and you can read these reviews and decide if you like this part, and can suggest the part to people you know based on the review, and you can also talk to users of that card, or use that part yourself.

IMHO if you don't even have one of the cards you like to run your mouth about all the time...you have absolutely no room to talk...but then again I don't think people who don't know what they are talking about should talk either...so now you have not one, but two good reasons to keep your mouth shut... :)

You calling me a thread crapper is like charles manson calling the pope a religious zealot...
 
Firstly, I really think that using the DX8 path over the DX9 path for the FX series was based on performance. The only chips in the FX series that remain playable using the hack are the top 2 or 3. And they are still much slower than when using the DX8 path.

Secondly, even if it was a purposeful oversight, not optimizing for nvidia FX series performance.. this has been going on since 3dfx got its first true competitor, Nvidia.. They both did it then.. Nvidia, and now ATi, do it now.. Can't tell you how many games have the nvidia splash screen on them. Do you think the game devs put them there for free? I don't like it, but it is the way it has really always been.. You are going to be nice to people who line your pockets..

At least we no longer have exclusivity. I remember when a number of games would only run in Glide, in 3D anyway. Even after the TNT 1 came out, some games were written exclusively in Glide, and a great number of others ran very poorly, or at least not as well, in OGL or D3D..
 
^eMpTy^ said:
So I guess shader day, the public benchmarks, and all the offhanded statements about ATi cards being so much faster on "next generation hardware" were just out of the kindness of Valve's little heart? :rolleyes:

Unless you can prove that Valve made ATi cards faster because of money, stop spreading ignorance. :)
 
How about this...

Can we think of a reason why Valve would have dropped the Nvidia-specific code path that isn't a lame one?

Because I've seen a lot of responses to the code path being dropped which, at the end of the day, are really just as lame as a conspiracy theory.

I get the feeling the Valve team had some upper management decision makers who really didn't know what they were doing. I mean, for starters we all have to put up with Steam....don't even get me started on Steam.

Also, can Valve explain to us why the problems with the FX series in DX9 mode are easily removed with a vendor ID change for the video card? I mean, fine, for whatever reasons they want to deny us a mixed mode path, but then leaving FX 5xxx users no other reliable means of DX9.0, when it's obviously perfectly possible, is really just the icing on the cake.

It's almost as if Valve is saying to us, "hey look, you CAN run DX9.0 with the FX 5xxx range, but it turns out crap anyway", and I refuse to believe they couldn't do it due to time constraints or whatever, simply because home users can do it in one small easy step.

Anyway, I'll say it again: with my overclocked FX5950 Ultra I'm playing the game through again on the DX9 path with no partial precision, but with the vendor ID fix (so no shader faults). The settings are slightly lower, but they're not what I'd call crap, and I'm enjoying it a lot.
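Since several posts in this thread turn on the vendor ID trick, here is a minimal sketch of what ID-keyed path selection could look like. This is purely illustrative, not Valve's actual code; only the PCI vendor IDs (0x10DE for nVidia, 0x1002 for ATi) are real values, and the function and path names are made up.

```python
# Illustrative sketch of vendor-ID-keyed render path selection.
# The PCI vendor IDs are real; everything else is hypothetical.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def choose_render_path(vendor_id, supports_ps20):
    """Pick a shader path the way the thread describes HL2 behaving:
    keyed off vendor ID rather than off reported capabilities."""
    if vendor_id == VENDOR_NVIDIA:
        return "dx8.1"   # FX cards forced down the DX8.1 path
    if supports_ps20:
        return "dx9"     # other PS 2.0 hardware gets the full DX9 path
    return "dx8.1"

# An FX card gets DX8.1 despite supporting PS 2.0...
print(choose_render_path(VENDOR_NVIDIA, True))  # dx8.1
# ...but spoofing the vendor ID (the "fix" described above) flips it to DX9.
print(choose_render_path(VENDOR_ATI, True))     # dx9
```

This is exactly why a one-line vendor ID change can unlock the DX9 path: the check gates on who made the card, not on what the card reports it can do.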
 
fallguy said:
Unless you can prove that Valve made ATi cards faster because of money, stop spreading ignorance. :)
That is all they have on this theory............. :D
 
Badger_sly said:
Mods,

PLEASE just add this whole thread to the end of the "Valve Sucks" thread. We really don't want to have to see R1ckCa1n mistake 18% for 2.55% another 1000 times.
Don't be foolish son..... I will explain it to you one more time.

Of the 18 percent of Steam users with FX cards (which includes the 5200, 5600, 5700, and 5900), only the 5900 owners, 2.55 percent of users surveyed, could benefit from the mixed mode. So at the end of the day, only 2.55 percent of the Steam users surveyed would benefit. :p
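Taking the two figures quoted in this thread at face value (18% and 2.55% are the posters' numbers, not independently verified), the arithmetic of the argument works out like this:

```python
# Back-of-envelope check using the percentages quoted above.
fx_share = 0.18        # Steam users with any FX card, per the survey
fx5900_share = 0.0255  # users with a 5900-class card specifically

# Fraction of FX owners who would actually benefit from mixed mode:
benefit_ratio = fx5900_share / fx_share
print(round(benefit_ratio, 3))  # 0.142 -- about 1 in 7 FX owners
```

So even among FX owners, roughly six out of seven have cards too slow to benefit from the mixed mode path, which is the crux of the 18%-vs-2.55% dispute.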
 
Why does everyone keep saying only 18% have an FX? What about people who chose NOT to respond to the survey? What about those with custom drivers / OEM / manufacturer drivers?

WAY more people have BOTH makes of GFX cards than a single survey shows, in which case Valve should ALSO be catering to the other camp as well, and not making them suffer because they wanted to assure ATi that they would be faster than nVidia, while nVidia would still run it, albeit at lower quality....
 
The cards can run the DX9 path just fine, with lower settings. Each person is different; some people don't really need 100fps, some are happy with a steady 30. Some people don't need high res or loads of AA or AF because they just don't notice the difference.

Considering they were developing the mixed mode already, and they'd specifically told us that they'd spent longer doing it, you'd have thought that not completing it would be a huge waste of time, given how long they'd spent on it.

Does anyone have any idea exactly how many of the shaders in HL2 require DX9.0, rather than a lower version of the shader engine which can run at 16-bit precision?

Did I read about 5% somewhere?

Given the lack of any drop in quality when you reduce the precision to 16-bit on all shaders while still using the ATI fix, I wouldn't think it would be much more than that.

*edit*

Another thing about Valve's statistics collected from people's computers: did anyone else notice that the survey didn't appear to be conducted, or at least displayed to us, very well?

One of the basic things I learnt in GCSE (UK high school) statistics is not to overlap ranges, so that people do not fall into more than one range.

Yet in the stats I see between 512 and 1024 RAM, and between 256 and 512, etc.

The most common values are taken, meaning if you have 512 RAM, which category do you fit into? 256 to 512, or 512 to 1024? Maybe it was just a mistake in relaying the info back to us rather than in actually collecting the data, but it certainly doesn't inspire confidence in me about the people working at Valve if they can't get something like this right. It could also be used to manipulate figures, although I don't think this applies to video cards specifically, so a bit off topic.
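For what it's worth, the textbook fix for the overlap complained about above is half-open intervals, so a boundary value like 512 lands in exactly one bucket. A quick sketch (the bucket boundaries are the ones mentioned in the post; the function name is made up):

```python
# Half-open [low, high) buckets: 512 MB falls into exactly one range.
RANGES = [(0, 256), (256, 512), (512, 1024), (1024, 4096)]

def ram_bucket(ram_mb):
    for low, high in RANGES:
        if low <= ram_mb < high:
            # label as low..high-1 so adjacent labels never overlap
            return f"{low}-{high - 1} MB"
    return "4096+ MB"

print(ram_bucket(512))  # 512-1023 MB -- unambiguous
print(ram_bucket(511))  # 256-511 MB
```

Whether Valve actually binned this way and only mislabeled the ranges in the published charts, or double-counted boundary values, is exactly what the post is questioning.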
 
Is it possible that Valve was working on the mixed mode for the leaked beta and then, to save work, made mixed mode use DX8.1 with DX9 shaders mixed in instead of FP16 and FP32? It just seems like a LOT of work to drop the mixed mode completely...
 
Dyne said:
Why does everyone keep saying only 18% have an FX? What about people who chose NOT to respond to the survey? What about those with custom drivers / OEM / manufacturer drivers?

Well, because that's Valve's tool and it's their number one way to gauge the market. If you spent money and time building that type of reporting into Steam, are you just going to ignore it?

WAY more people have BOTH makes of GFX cards than a single survey shows, in which case Valve should ALSO be catering to the other camp as well, and not making them suffer because they wanted to assure ATi that they would be faster than nVidia, while nVidia would still run it, albeit at lower quality....

If you have any proof that Valve and ATI did this on purpose, please share it. If not, then please stop these posts. I have already shown you that other companies (id) dropped a mixed mode path after spending years on it. And we both know that IF Valve wanted to stick it to NV, they would have defaulted to the DX9 path and FX users would have suffered with super low frame rates..
 
^eMpTy^ said:
So I guess shader day, the public benchmarks, and all the offhanded statements about ATi cards being so much faster on "next generation hardware" were just out of the kindness of Valve's little heart? :rolleyes:

No, it was more like the truth. Remember, back then you just had the TR:AOD benchmarks and the first Halo PC benchmarks that showed the FX getting killed by the R3xx cards. Pretty much every benchmark or game we had that used PS2.0 showed the FX being slower. Of course NV was able to get most of the performance back with heavy use of mixed mode and driver optimizations....

Besides, whatever happened then was no more shady than NV being the sole sponsor of the official Doom3 benchmarks....
 
Ok, if you held an auction for the rights to box HL2 and a company bid 8 MILLION dollars, would you NOT go out of your way to keep the business relationship alive and well? I know I surely would....Which suggests that Valve made it so the game plays FASTER on ATi...

I'm not saying the FX series didn't have its pitfalls, because we ALL know it did, but the way Valve is making them function is really beginning to piss me off.....

Ok, DX8.1 is faster than DX9, we all know that....But if we are buying a DirectX 9 card to play a DirectX 9 game, shouldn't it be run in DX9? I don't want to hear "Oh, only the 5900 and up can run it decently"...MAKE it run decently! What about Doom 3 (AWESOME looking graphically!), which uses the same path for BOTH cards?

If I am buying a new card as an upgrade, and it KILLS my old card, which is inferior except in one game engine, I do NOT want it to bite me (the consumer) in the ass because of people out to strengthen a business relationship...

Are you telling me that upgrading from my 2-year-old Radeon 9600 to an FX 5900 XT isn't an upgrade?

Let's just get Valve to do one of these:

A) Finish the mixed mode, since there are a FEW things wrong with it
B) Implement partial precision hints
C) Use the nVidia FX series tools to improve performance...
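On point B: a partial precision hint just tells the driver that a shader value may be computed in FP16 instead of FP32. Python's struct module can round-trip a value through IEEE half precision, which gives a rough feel for how much accuracy the hint gives up. This is a sketch of the numeric effect only; it has nothing to do with Valve's actual shaders.

```python
import struct

def to_fp16(x):
    # Round-trip a Python float through IEEE 754 half precision
    # ('e' is the half-precision format code in the struct module).
    return struct.unpack('e', struct.pack('e', x))[0]

v = 0.1234567
half = to_fp16(v)
# FP16 keeps ~11 significant bits (~3 decimal digits), so a small
# error appears -- usually invisible in an 8-bit color channel, which
# is why partial precision was viable for many shaders on the FX.
print(abs(half - v) < 1e-3)  # True: the error is tiny
print(to_fp16(0.5) == 0.5)   # True: powers of two survive exactly
```

The catch, as the thread notes, is that some DX9 effects (long dependent texture chains, HDR-style math) do need the extra range and precision of FP32, which is why a mixed mode rather than a blanket FP16 switch was the supposed plan.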

If Doom 3 plays wonderfully on an FX, why the hell should the possible engine of the year be any different? Wouldn't it just hurt the other game companies, because people would know their computer can't run it like everyone else's?
 
^eMpTy^ said:
Give credit where credit is due, the r300 was the right card at the right time...the nv3x was the wrong card at the wrong time...if you have an FX class card...upgrade...I did...and I'm much happier for it...*pets his 6800GT*

*laughing* I don't know why, but that touched me right here *banging chest*
 
Dyne said:
Why does everyone keep saying only 18% have an FX? What about people who chose NOT to respond to the survey? What about those with custom drivers / OEM / manufacturer drivers?

WAY more people have BOTH makes of GFX cards than a single survey shows, in which case Valve should ALSO be catering to the other camp as well, and not making them suffer because they wanted to assure ATi that they would be faster than nVidia, while nVidia would still run it, albeit at lower quality....

Simple logic tells you the percentage of people with FX cards who didn't respond to the survey would be the same as for people with R3xx and other cards, keeping the percentages the same.

But hey.. keep that tinfoil cap on, thinking that Valve made NV cards slower on purpose. It's not like they didn't spend FAR more time on them than on ATi's cards or anything.
 
Jbirney said:
No, it was more like the truth. Remember, back then you just had the TR:AOD benchmarks and the first Halo PC benchmarks that showed the FX getting killed by the R3xx cards. Pretty much every benchmark or game we had that used PS2.0 showed the FX being slower. Of course NV was able to get most of the performance back with heavy use of mixed mode and driver optimizations....

Besides, whatever happened then was no more shady than NV being the sole sponsor of the official Doom3 benchmarks....

ah...classic

Nvidia was NOT the sponsor of the official Doom3 benchmarks...id was...and everyone else seems to know that by now...why don't you?

Moreover, neither nvidia nor ati got to see doom3 before the benchmarks, so nobody had time to optimize...for HL2 ATi had full access and nvidia didn't...

And when Halo was released, it wasn't bungie or gearbox bashing the nv3x, it was the hardware enthusiast websites...bottom line...Valve is the only game company to actively campaign for a video card company...with EVERY other game, it was nvidia or ati talking smack...

Game developers should be agnostic...and Valve isn't...
 
fallguy said:
Simple logic tells you the percentage of people with FX cards who didn't respond to the survey would be the same as for people with R3xx and other cards, keeping the percentages the same.

But hey.. keep that tinfoil cap on, thinking that Valve made NV cards slower on purpose. It's not like they didn't spend FAR more time on them than on ATi's cards or anything.

fallguy, I know you're of the red persuasion, so I'll run this past you for your approval...

don't you think it's a little sketchy that they dropped the partial precision path after ATi paid them all that money?

I mean, they're actively and openly supporting ATi, and they happen to drop the partial precision path for nvidia cards and say "the FX series can't do dx9"...don't you think that's a little too much to be a complete coincidence? don't you think ATi's $$ might have influenced that decision just a little bit? or at least that the situation is questionable enough that this ATi/HL2 voucher deal should be looked down on by the industry?

I honestly don't think you can say that Valve tried to screw nvidia over...but it's sketchy enough that I would prefer the game developers and video card makers didn't have such exclusive relationships...because all it does is screw over the customers...i.e. us....
 
^eMpTy^ said:
fallguy, I know you're of the red persuasion, so I'll run this past you for your approval...

don't you think it's a little sketchy that they dropped the partial precision path after ATi paid them all that money?

I mean, they're actively and openly supporting ATi, and they happen to drop the partial precision path for nvidia cards and say "the FX series can't do dx9"...don't you think that's a little too much to be a complete coincidence? don't you think ATi's $$ might have influenced that decision just a little bit? or at least that the situation is questionable enough that this ATi/HL2 voucher deal should be looked down on by the industry?

I honestly don't think you can say that Valve tried to screw nvidia over...but it's sketchy enough that I would prefer the game developers and video card makers didn't have such exclusive relationships...because all it does is screw over the customers...i.e. us....


Yes, this is exactly what I have been trying to convey, and I also believe what you are saying ^eMpTy^...

I know if I were Valve, I would fix this before even more people get pissed like we have
 
^eMpTy^ said:
fallguy, I know you're of the red persuasion, so I'll run this past you for your approval...

don't you think it's a little sketchy that they dropped the partial precision path after ATi paid them all that money?

I mean, they're actively and openly supporting ATi, and they happen to drop the partial precision path for nvidia cards and say "the FX series can't do dx9"...don't you think that's a little too much to be a complete coincidence? don't you think ATi's $$ might have influenced that decision just a little bit? or at least that the situation is questionable enough that this ATi/HL2 voucher deal should be looked down on by the industry?

I honestly don't think you can say that Valve tried to screw nvidia over...but it's sketchy enough that I would prefer the game developers and video card makers didn't have such exclusive relationships...because all it does is screw over the customers...i.e. us....


First off, I'm not nearly as biased as you. I have a 6800GT. I do like ATi because I usually root for the underdog.

Secondly, you DON'T know why, or when exactly, they dropped it. Show me where ATi said "the FX series can't do dx9", thanks. HL2 isn't the only DX9 game where the FX cards do worse than R3xx cards; Farcry with 1.3? Yeah. I don't think it's wrong at all to pay for a game bundle. They paid money to have a game bundled with their cards. If the game had come out when it was supposed to, it would have ROCKETED ATi's sales.

Doom3 is (or was) bundled with a certain NV card. I don't see you complaining about that.
 
^eMpTy^ said:
IMHO if you don't even have one of the cards you like to run your mouth about all the time...you have absolutely no room to talk...but then again I don't think people who don't know what they are talking about should talk either...so now you have not one, but two good reasons to keep your mouth shut... :)

You calling me a thread crapper is like charles manson calling the pope a religious zealot...
So I'm sure when you didn't have a 6800, you didn't talk at all?
I don't know what I'm talking about?
haha..
What don't I know about?
What am I wrong about?
I've stated numerous times the X800 Pro is slower than your precious 6800GT, which is really the only thing people can be on either side of the fence about, because they're so close in performance (Doom3 not counting).
The X800XT and the non-existent X800XT-PE are faster than the 6800 Ultra and the Extreme Edition.
So what's the problem, ace?
At least I don't think the FX series was good until HL2 came around :D
Those kids are hilarious, they seem to have forgotten about Far Cry and all the other DX9 games.
Poor kiddies, their cards were fine in the DX8.1 games, but throw in some games they're supposed to be able to run, and they fall flat on their little old faces.
Oh man.. this is too fun.
 
fallguy said:
First off, I'm not nearly as biased as you. I have a 6800GT. I do like ATi because I usually root for the underdog.

Secondly, you DON'T know why, or when exactly, they dropped it. Show me where ATi said "the FX series can't do dx9", thanks. HL2 isn't the only DX9 game where the FX cards do worse than R3xx cards; Farcry with 1.3? Yeah. I don't think it's wrong at all to pay for a game bundle. They paid money to have a game bundled with their cards. If the game had come out when it was supposed to, it would have ROCKETED ATi's sales.

Doom3 is (or was) bundled with a certain NV card. I don't see you complaining about that.

you're right...you're not biased at all...what was I thinking... :rolleyes:

Doom3 was bundled with the video cards by the manufacturer (BFG, PNY, etc)...it was NOT a deal between id and nvidia...when you buy Doom3, do you get an nvidia coupon in the box? no, I don't think so...*glares at his ATi coupon from the HL2 box*

No, I don't know exactly when they dropped it, but I know for a fact that they dropped it AFTER the ATi deal...and that IS all I said, isn't it?
 
Moloch said:
So I'm sure when you didn't have a 6800, you didn't talk at all?
I don't know what I'm talking about?
haha..
What don't I know about?
What am I wrong about?
I've stated numerous times the X800 Pro is slower than your precious 6800GT, which is really the only thing people can be on either side of the fence about, because they're so close in performance (Doom3 not counting).
The X800XT and the non-existent X800XT-PE are faster than the 6800 Ultra and the Extreme Edition.
So what's the problem, ace?
At least I don't think the FX series was good until HL2 came around :D
Those kids are hilarious, they seem to have forgotten about Far Cry and all the other DX9 games.
Poor kiddies, their cards were fine in the DX8.1 games, but throw in some games they're supposed to be able to run, and they fall flat on their little old faces.
Oh man.. this is too fun.

And yet, as it falls on its face, it's still faster than your 8500...hahaha...
 
^eMpTy^ said:
you're right...you're not biased at all...what was I thinking... :rolleyes:

Can you not read? I said I'm not nearly AS biased as you. I admitted I like ATi, yet I own a 6800GT. You wouldn't own an ATi card. All you do is post anti-ATi and pro-NV. It gets old, old enough to put your worthless posts on ignore.
 
fallguy said:
Can you not read? I said I'm not nearly AS biased as you. I admitted I like ATi, yet I own a 6800GT. You wouldn't own an ATi card. All you do is post anti-ATi and pro-NV. It gets old, old enough to put your worthless posts on ignore.
Funny, I see fallguy as one of the more level-headed guys on this forum. I too have a 6800GT but still think ATI has better products, and they don't release broken parts.

Empty (and batman, robin, burning, ruined or trans, depending on his mood) only posts anti-ATI / pro-NV. Worst part is, empty is barking up a tree over 2.55 percent of the Steam users.
 
^eMpTy^ said:
And yet, as it falls on its face, it's still faster than your 8500...hahaha...
Wow, good one mate, I got this thing years ago for 200.
Next time I play a game that lags like hell, I'll be thinking of you.
 