Intel Core 2 Gaming Performance

StealthyFish said:
read that. He's got a great point
Thanks for that as I missed it. After reading this, how can anyone take the CPU reviews here seriously when they're so biased and two-faced?
 
haelduksf said:
No matter, an overclocked Opteron 175 matches THAT performance at the same price.

That knife cuts both ways.


And that lower-priced Core 2 also OCs VERY, VERY well from what I have seen.

just a thought
 
dark_reign said:
Thanks for that as I missed it. After reading this, how can anyone take the CPU reviews here seriously when they're so biased and two-faced?

[H]ardOCP did a CPU review here: Intel Core 2 Music, Images, & Movie Performance

What you are bitching about is something that is explained in the fucking first line of the Intel Core 2 Gaming Performance article:

We test Intel's Core 2 Duo and Extreme using real-world gaming. Don't let a bunch of canned benchmarks lie to you about gaming performance, real gameplay experience tells a different story. Unless of course you game at 800x600.

And they even elaborate here:

As alluded to above, we can easily remove the GPU as the bottleneck in the system, but this requires running low resolution benchmarks at 640x480 or 800x600, and we all know that people that are looking at buying a new processor are not using these resolutions to game at. Basing benchmarks on this old thinking will certainly show you how true processor power scales, but it does not tell you how the gamer gets to use that processor power.

But I guess we can judge from people's posts who actually READ the review...and who just looked at the funny pics.

Terra - And that tells a lot about the people bitching... :rolleyes:
 
Elios said:
And that lower-priced Core 2 also OCs VERY, VERY well from what I have seen.

just a thought

Shame on you for bringing logic into this thread! ;) :D

Terra...
 
LOL! Terra, that's the point....

Why all of a sudden does that matter? When [H] tested the AMDs, they had no problem using "canned benchmarks", yet now that the Conroe actually beats out the K8 in "canned benchmarks", it doesn't matter anymore.....and shouldn't be used to measure the power of Conroe.
 
Terra said:
[H]ardOCP did a CPU review here: Intel Core 2 Music, Images, & Movie Performance

What you are bitching about is something that is explained in the fucking first line of the Intel Core 2 Gaming Performance article:



And they even elaborate here:



But I guess we can judge from people's posts who actually READ the review...and who just looked at the funny pics.

Terra - And that tells a lot about the people bitching... :rolleyes:
So, let's just forget about the past?
 
OK, here is what the K8 review said about low-resolution benchmarks:

"As usual, we use low-resolution benchmarks to take the video card as far out of the equation as possible. Do not think that the below benchmarks represent a true gaming experience. Today's games are terribly dependent on video cards for their performance , but that does not mean that your CPU still does not play a growing role in your gaming, especially as newer games arrive. (Please check out our article on CPU scaling and your video card from last summer.) Your CPU still has many calculations to do in order to address such things as in-game world physics and AI, plus many other things. Our benchmarks below are designed to show you what CPU can give you the greatest power benefit when gaming although that may not always turn up in a frames per second graph of actual gameplay."

And now from the Conroe review:

""Let's just cut to the chase. You will see a lot of gaming benchmarks today that just simply lie to you. That is right, you will see frames per second numbers that are at best total BS, and at their worst a terrible representation of what difference a new Intel Core 2 processor will make in your gaming experience. The old ways of video game benchmarking do little to tell you about exactly how a new CPU will affect how you play your games or what experience your system supplies to you. Having more CPU power is a very cool thing, but being able to utilize it is not an easy thing to do nowadays."

I tried to bold the areas I'm trying to point out from each of those, but I'm not sure if it worked :p

Anyway, the K8 review specifically states that the low-resolution benchmarks do not represent a true gaming experience.

In the Conroe quote, he states that the old ways (i.e. low-resolution benchmarking) do little to tell you how the processor will actually improve your real-world performance. It sounds to me like, essentially, they've decided to ditch low-resolution benchmarking and only show what the real-world effects are. Now, if [H] discontinues this and goes back to low-resolution benchmarking for the next AMD release or whatever, then yes, you can call them out on it. Until then, quit whining.

Being able to go and brag about how many frames per second you can get at the highest settings at 640x480 or 800x600 is all fine and dandy, but here's a case in point: last fall, when I was building my X2 system and trying to decide whether the 7800GT was sufficient or if I should go with a 7800GTX, determining which to go with was rather difficult. I already knew which monitor I was planning on buying, and knew that I was going to run at 1680x1050. Thus, I looked for reviews that showed potential real-world performance between 1280x1024 and 1600x1200, for a rough approximation. Guess what? It was relatively difficult, even for simple GPU reviews, as the highest resolution many sites ran at was 1280x1024, and more commonly 1024x768. How was that useful to me? And if those are essentially the normal resolutions most people run at the moment, how is 800x600 "real-world" useful for them? It isn't.

There's one other issue now that leads somewhat towards this: current games that bring systems to their knees. We've never quite had games that truly brought even the most powerful of cards down like we do now (such as F.E.A.R., Oblivion, etc.). Before, there were games that might briefly stress the top cards, but a few months later a new card would debut that ultimately wasn't pushed by that game. Thus, the rare game that did push a system to the limits ultimately only did so for a limited amount of time. Then, when SLI came along, we saw the struggle between intense games and graphics cards move even further in the direction of the GPU. With games such as F.E.A.R. now, though, that can stress out even SLI and Crossfire, it's important to note how the real-world performance of a new CPU paired with the current top cards can improve gaming. That's my only issue with the review: why they didn't make use of a rev. 304 Bad Axe with Crossfire. Ultimately however, very few people can afford it, and thus while it may show the real-world gaming environment for 0.2% of the overall computer market (if that), it doesn't help the casual enthusiast who invests in a powerful yet cost-effective solution (such as a single-card 7900GT/GTX or X1900XT/X) to see how investing in yet another upgrade will benefit them.

At least, this is my take on it.

Edit - it was easier to edit and underline the quotes I wanted to show :p
 
Perhaps you missed where we stated that we tried 7950 GX2 at first and it did not work properly.

That's cool, but then shouldn't you have tried Crossfire?

According to Anand, SLI doesn't work right now because Intel and Nvidia are having a little fight. So the sensible thing to do if you want dual-GPU benches is use the one that works right now, Crossfire.

Unless you guys don't have a Crossfire rig or something.
 
ignoring the [H] agenda on intel "cronies" for a minute...

here is another question:

You want real-world performance (RWP) for the good of gamers, eh?

Thus, I'm sure you meant RWP for the majority of users. Well, guess what...the majority of users have an Nvidia 6600 GT video card. How does Conroe help them!?

Primary resolutions used: 1280x1024 and 1024x768.

Indeed, I would think that less than 2% of gamers have the video card used in your review :)

http://www.steampowered.com/status/survey.html
 
Sharky974 said:
That's cool, but then shouldn't you have tried Crossfire?

According to Anand, SLI doesn't work right now because Intel and Nvidia are having a little fight. So the sensible thing to do if you want dual-GPU benches is use the one that works right now, Crossfire.

Unless you guys don't have a Crossfire rig or something.


this is so ridiculous

you don't think hardocp knows about the issues that nvidia & intel are having?

NO %%%% SLI WON'T WORK

i bet you that hardocp spent no more than 1 boot up in trying to get SLI to work.

this whole review was cherry-picked to show Conroe in the weakest way

and it still came out on top

seriously kyle & co, what is up with this

hardocp is a complete waste of time. this has been a complete ploy to bring in hits and cause a huge long drawn out thread.

Sensationalist bullshit

that's what hardocp has come to
 
b0bd0le said:
this is so ridiculous

you don't think hardocp knows about the issues that nvidia & intel are having?

NO %%%% SLI WON'T WORK

i bet you that hardocp spent no more than 1 boot up in trying to get SLI to work.

this whole review was cherry-picked to show Conroe in the weakest way

and it still came out on top

seriously kyle & co, what is up with this

hardocp is a complete waste of time. this has been a complete ploy to bring in hits and cause a huge long drawn out thread.

Sensationalist bullshit

that's what hardocp has come to

Why would [H] use SLI? Right now, the only ones running Conroe with SLI are on prototype nForce 570/590 boards with a less-than-optimized BIOS. Your logic concerning SLI doesn't make sense, seeing as how Anand didn't use SLI, others didn't use SLI, etc.

Now why didn't they use Crossfire? That's the better question. And Redstar, I'd like to see where you got the statistic that most users are running a 6600GT. The 7600 GT has largely replaced it, and the performance-to-price ratio of cards such as the 7900 GT has driven sales of such cards way up. Remember, the type of gamer who is going to purchase the likes of a 7600 GT isn't likely to be as concerned with upgrading to a new CPU. On the other hand, the majority of those who do follow the news, have been following Conroe, etc., are far more likely to spend the extra money on a higher-level single-card solution such as the 7900 series or X1800/X1900 series. Then of course there's the very high end who can afford to easily run Crossfire, etc., but those constitute an even smaller minority, and are probably already planning out their full Conroe systems. Thus, I think the logic behind the [H] review still holds, although it would have been nice to have shown single-card solutions for budget, intermediate and top-end, and then Crossfire for dual-card solutions. They might have planned more extensive GPU testing, but remember, their testing time was more than cut in half.

And b0bd0le, stop the rhetoric. The article clearly stated that when it came to gaming, they were going for a real-world performance view. If you really think Kyle and the rest were attempting to bash Conroe, they wouldn't have published the other benchmarks. As it stands, you and the rest here are whining about a single part of the review, when the overall review shows just how powerful Conroe is. Almost every other site went with the standard suite of benchmarks, and then did low-resolution benchmarking for games. It's nice for saying "ooh, ahhh, look at the insanely high fps that my eye can only perceive 1/4th or 1/8th of" and for bragging rights, but that's about it. Even when it comes to scaling for future GPUs, it's still not a very effective tool. Thus, [H] went with a different approach. Accept it. If you don't like it, go read Tom's Intelware.
 
"And Redstar, I'd like to see where you got the statistic that most users..."

Then why didn't you click on the link I posted so you could see where I got the data from?

:)
 
RedStarSQD said:
"And Redstar, I'd like to see where you got the statistic that most users..."

Then why didn't you click on the link I posted so you could see where I got the data from?

:)

OOOOHHH, sorry, I didn't realize that every single gamer in the world used Steam. My bad. :rolleyes:

Edit - sarcasm aside, I was hoping for more than a Steam survey to indicate that most people used a 6600GT.
 
I'm sure you will agree 700k is a statistically significant sample... by all means, let's see your stats to counter the data.

But perhaps you are right in that only Valve customers love to use prehistoric stuff. lol :)
 
RedStarSQD said:
I'm sure you will agree 700k is a statistically significant sample... by all means, let's see your stats to counter the data.

But perhaps you are right in that only Valve customers love to use prehistoric stuff. lol :)

It is a significant sample. However, take myself for example: at the time of that survey, I believe I was using a 2.8C Northwood with a Radeon 9800. A little while after, I went up to my X2 3800+ and a 7800GT. Soon, I'll be on a Conroe E6600 or E6700 and later this year, an R600. Thus, the hardware is ever-changing. And while I'm certain that's not the case for most of the people on that survey, it'll still change somewhat significantly at times, in some way or another.

However, I do agree on one thing: Valve customers do seem to generally use prehistoric hardware ;) At least, by my standards. :p
 
Of course hardware changes...this data was updated in April.

I don't think we are looking at much of a shift in so short a time.

After all, when did the 6600 GT come out? That's right, a long time before that.

The minority of gamers change hardware frequently.

So again I ask you to provide stats to back up your assertion. 1 person does not beat a 700k sample :)
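For what it's worth, the raw sample-size argument holds up on its own terms. Here is a minimal sketch of the usual margin-of-error math, assuming simple random sampling (which a self-selected Steam survey is not, so this says nothing about whether Steam users represent gamers at large):

```python
import math

n = 700_000   # approximate sample size cited in this thread
p = 0.5       # worst-case proportion (maximizes the error bound)
z = 1.96      # ~95% confidence

margin = z * math.sqrt(p * (1 - p) / n)
print(f"95% margin of error: +/- {margin * 100:.2f} percentage points")
# -> roughly +/- 0.12 percentage points
```

In other words, random noise isn't the issue with a sample that size; the real argument here is about selection bias, i.e. whether Steam users look like gamers at large.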
 
ToastMaster said:
OOOOHHH, sorry, I didn't realize that every single gamer in the world used Steam. My bad. :rolleyes:

Edit - sarcasm aside, I was hoping for more than a Steam survey to indicate that most people used a 6600GT.

lol are you serious? Do you have some reason to believe that the users who play Steam games are not representative?

You've just lost what little credibility you ever had. There is no conceivable reason why the huge number of people playing Steam games would not be representative of gamers at large, within some reasonable percentage. Claiming otherwise is simply stupid.
 
ToastMaster said:
why would [H]
Now why didn't they use Crossfire? That's the better question.

Because:

1) Crossfire pissed them off.

2) It doesn't work on the P5B.
 
I will be going for a Core 2 Duo E6600. I have X1900XTs in Crossfire, so it should scale much better :) @ 1600x1200...

[H]'s results are not surprising :p

Not much choice of motherboards atm tho
:p
 
I'm sorry, but this review of Conroe gaming performance is almost pointless - all these numbers are so GPU-limited that we may as well have saved our time reading it and concluded:

Current high-end CPUs from AMD and Intel are GPU-limited on a 7900 GTX when running the latest games at 1600x1200 with 4xAA/16xAF.

While this may be 'real world' to Kyle, and I'm not one to attack subjective opinions, may I point out that many LCDs only support up to 1280x1024 resolutions, and indeed have native resolutions at this setting. Many people are still running older CRT monitors, and 1600x1200 is really only viable on 20" CRTs and up as well.

While we are on the subject of 'real world' gaming, who here thinks that running at the absolute limits of playability isn't always the best way to go?

Personally, I hate slowdowns below 30fps when I play online FPS like BF2 and CS:S. My aim goes to shit and the jerkiness gives me headaches. This in turn makes me uncompetitive, and when I'm uncompetitive I get angry... and, you know the rest. ;)

I noted that on many of the [H] tests, the 'playable' framerates are rather low. Of course 'playability' is totally subjective, but I personally wouldn't play at those framerates, I'd rather ease up on the AF/AA a tad and enjoy healthier framerates all round.
 
harpoon said:
While this may be 'real world' to Kyle, and I'm not one to attack subjective opinions, may I point out that many LCDs only support up to 1280x1024 resolutions, and indeed have native resolutions at this setting. Many people are still running older CRT monitors, and 1600x1200 is really only viable on 20" CRTs and up as well.

They are only "real world" in the benchmarking sense. And with current video hardware, these so-called gaming benchmarks were predictable and pointless. I think even Kyle would agree with that since nothing was really gained from it. The application benchmarks at least made a little more sense.
 
forcefed said:
Battle? What battle?
The Crossfire battle. Everyone knows it's a hassle to get set up; [H]ardOCP has never once gotten it to work out of the box in their reviews. Given that they had <2 weeks, I can fully understand not using Crossfire. I still wish they had used a P5W/7950GX2, but it's still a good review.
 
harpoon said:
While this may be 'real world' to Kyle, and I'm not one to attack subjective opinions, may I point out that many LCDs only support up to 1280x1024 resolutions, and indeed have native resolutions at this setting. Many people are still running older CRT monitors, and 1600x1200 is really only viable on 20" CRTs and up as well.

To be honest I would say 1680x1050 is a more common resolution for most of us than 1280x1024. If you check out the display forum you will see that some of the most popular LCDs right now are 20.1" widescreens like the Dell 2005FPW/2007WFP, which have a native resolution of 1680x1050. That's the size of LCD I use, because I feel that the 24" LCDs require too much graphics power to drive them in today's games. I would rather settle for a little smaller screen real estate and get decent frames in games like Oblivion.

dark_reign said:
They are only "real world" in the benchmarking sense. And with current video hardware, these so-called gaming benchmarks were predictable and pointless. I think even Kyle would agree with that since nothing was really gained from it. The application benchmarks at least made a little more sense.

Definitely. The application benchmarks they ran started to show the real story behind Conroe. People should already know by now that a CPU upgrade alone isn't going to make a huge difference in gaming while programmers are still putting all the heavy work on the graphics card and very few games make any effective use of dual core. Conroe will stretch its lead out more, though, at higher resolutions when we start to see DX10 cards like the R600 show up in a few months.
 
I personally think the review was good. I've read a lot of the Conroe reviews; it's refreshing to see a different spin put on this.

I was thinking real hard about a Conroe upgrade real soon, but I think I'm going to hold off for a while now.

I've got an X2 4400+ that does 2.8 on air (I'm going to water very soon), 7900 GTXs SLI'd, an MSI K8N Diamond Plus, 2 gigs of RAM, and a 20" widescreen LCD. This box is a great gaming rig, the best I've ever had, and I don't really feel the need to spend another 1000.00 bux on an upgrade for another 10fps, or the usual platform-shift problems that always occur (BIOS revisions, platform kinks, SLI support, etc.).

Conroe is impressive, don't get me wrong, but Kyle was right: if you've got a high-end X2 system, the upgrade worthiness just isn't there, at least for me at this time.
 
burningrave101 said:
To be honest I would say 1680x1050 is a more common resolution for most of us than 1280x1024.

You base this on what?
Wanna bet that the majority of people here on [H]ard don't have big LCDs?
Or just people that game?
We could start a poll? :)

Terra - CRT > LCD for gaming...still true...
 
Okay, new theory:

Previously, such as with AM2, [H] was using low-res benchmarks to show which CPU had more raw power. Now, with Conroe, [H] is using high-res benchmarks to show the actual framerate increase someone would see when upgrading to Conroe. I can think of a few reasons why they would do this:

1. The low-res benchmarks had been posted way too many times already - they would have been nothing new.

2. [H] has always been a fan of real-world video card tests, and they wanted to try the same approach.

3. [H] was afraid too many people would get the wrong impression from the low-res benchmarks, especially because of all the hype this release has gotten.

4. [H] is AMD-biased.

5. [H] wanted to post different results in order to get more hits.

Everyone seems to be assuming #4 or #5, probably because there are more pro-AMD comments in the review than pro-Intel comments. But we still shouldn't jump to conclusions and bash Kyle for it.

So, let's ask the only person who knows the answer:

Kyle, tell us. Why did you decide to change your testing methods with the release of Conroe?

In the meantime, I stand my ground. What needs to be done to the article is:

1. Remove the comment about AMD price cuts. The processors still won't be much cheaper than Conroe, and Conroe will give you more power that could be used in the future.

2. Remove the comment about overclocking. Conroe can probably overclock even higher than AMD.

3. Make it more clear that you've changed your testing methods, and explain why you've changed them.

4. Remove the comment that other benchmarks lie. They don't lie, they just test a completely different aspect of the processor. Explain this difference to the readers.

5. Make the conclusion less of an "AMD is still better" and more of a "Conroe is better, but its power cannot be utilized by today's games because they are GPU-limited."

Do that, and the review is fine, everyone will stop complaining, those who want to know which processor is more powerful will know the truth, and those who want to know if upgrading now is economical will also know the truth. It will please everyone. I don't see any purpose in leaving those comments in the review.

But to everyone who's been saying that [H] is now the worst website ever, totally biased, and managed by the modern equivalent of Boss Tweed, please STOP BASHING KYLE AND GO BACK TO THE COMPETING FORUM WHERE YOU CAME FROM. As I said before, many people don't realize how hard it is to do a real-world performance review. [H], unlike every other hardware review site on the planet, doesn't take the easy route and post "normal" benchmarks. They take the extra time required to test every system to its limits in every game, in every niche market (widescreen, quad-SLI, unmatched Crossfire) as well. This time, problems with the hardware and limited review time did not allow them to do all that they wanted to. Still, it was a great review. Somehow, a few biased comments made their way into that review. It's not too late to fix it, however.

But whatever you do, don't change the way you do your awesome video card reviews.
 
HOCP4ME said:
Okay, new theory:

Previously, such as with AM2, [H] was using low-res benchmarks to show which CPU had more raw power. Now, with Conroe, [H] is using high-res benchmarks to show the actual framerate increase someone would see when upgrading to Conroe.

I didn't even notice that, but yeah, they used regular timedemos rather than "real world" benchmarks with the AM2 review. And after showing a minimal increase they still gave the AM2 a gold award, while the Conroe got nothing.

HOCP4ME said:
4. [H] is AMD-biased.

That about sums it up, sad.
 
HOCP4ME said:
But whatever you do, don't change the way you do your awesome video card reviews.

Wait a second, you bash their CPU evaluation, but love their video card review? :confused: They're both reviewed based on real-world gameplay.
 
"
Wait a second, you bash their CPU evaluation, ..."

As [H] has said, this was not a CPU evaluation. But [H] did make a CPU conclusion :)

[H] only needs to make a few corrections as stated a few posts above :)


But, the "cronies" statement is still resonating with me. Its only a couple days ago that an inquirer dude said the same thing about sony (well more harsh language) :))

It has been an eye-opening week for me when it comes to objectivity from "impartial" sources. :)
 
HOCP4ME summed everything up nicely. I know [H] isn't a democracy, but if we start a civilized petition maybe they will listen, yeah?
 
Mr. Miyagi said:
Wait a second, you bash their CPU evaluation, but love their video card review? :confused: They're both reviewed based on real-world gameplay.

Well, the thing is, the only place that a video card's power matters is in real-world gameplay. Therefore it makes sense to test its real-world gameplay. But with a CPU, real power and real-world gameplay are two very different things. [H] has tested the real-world gameplay, but as RedStar said, they made a conclusion about the CPU's power based on that. You can do that with a GPU, but not with a CPU.

Also, I'm not bashing their review, just pointing out that a few small changes need to be made.

I know [H] isn't a democracy, but if we start a civilized petition maybe they will listen, yeah?

I thought that would be a good idea myself, so I started a poll. Go ahead and vote, we need all the support we can get.
 
haelduksf said:
No matter, an overclocked Opteron 175 matches THAT performance at the same price.

That knife cuts both ways.


That's the way I went; looks like it was a good decision after all.

Besides, I knew no games were CPU limited at high settings.


Of course, Quake 3 at 640x480 might show a different story, right... lol

:D :D :D :D
 
Terra said:
You base this on what?
Wanna bet that the majority of people here on [H]ard don't have big LCDs?
Or just people that game?
We could start a poll? :)

Terra - CRT > LCD for gaming...still true...

I'd bet it would be a close count. CRTs are only good for gaming at one thing...high refresh rates that give smoother gameplay.

LCDs are a close second in smoothness, but when all advantages are considered...screw CRTs.
 
HOCP4ME said:
Well, the thing is, the only place that a video card's power matters is in real-world gameplay. Therefore it makes sense to test its real-world gameplay. But with a CPU, real power and real-world gameplay are two very different things. [H] has tested the real-world gameplay, but as RedStar said, they made a conclusion about the CPU's power based on that. You can do that with a GPU, but not with a CPU.

Also, I'm not bashing their review, just pointing out that a few small changes need to be made.



I thought that would be a good idea myself, so I started a poll. Go ahead and vote, we need all the support we can get.


Besides the lies and cronies comments, here are my main issues -

1. Real World Gameplay by whose standards? How many people actually play a game at 1600x1200 8xAA/16xAF with a single video card? If we take last year's monitor sales, over 73% sold were 19" or under. Why not show resolutions that the majority of people with new monitors utilize, and then go up the resolution ladder for results?

2. The games that are CPU bound or truly benefit from raw CPU power were not included in this "real world" gameplay article. Why not? These games represent the majority of PC game sales at this time and from benchmarks on other sites, Core 2 Duo does make a difference in gaming even at the higher resolution.

3. How do we know what sections of the games were used for the FRAPS captures? Depending on the section, each one of the games listed can have a wide variability in the performance results. You can take FRAPS results at the menu and game start and get a result that benefits your desired outcome, or take one with 32 players on screen in the middle of a firefight for another (see the sketch at the end of this post).

4. How many "real" gamers are going to accept below 30FPS results? The eye candy will get turned down and you will see the CPU make a difference.

5. Kyle complained about the timelines; well, it seems as though everyone else had the same ones and managed to get CrossFire numbers out in time, along with detailed comparisons against other CPUs. Before I hear the "it was not a CPU article" line one more time, why not show Pentium D 805 and AMD X2 3800+ results? Those two will be the new bargain-basement dual-core CPUs, so let's see how these "real world" tests work out, as most real-world gamers will probably be using systems with this type of CPU power.
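On point 3, here is a minimal sketch of why the capture window matters so much. The frame times below are made up for illustration (they are not real FRAPS data from the review), but the arithmetic is the same averaging that any frame-time capture implies:

```python
def avg_fps(frame_times_ms):
    """Average FPS over a capture: frames recorded / seconds elapsed."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical frame times (milliseconds per frame) for two capture windows
# in the same game on the same hardware.
menu_and_load = [8.0] * 1200    # ~125 fps: staring at the menu / level load
big_firefight = [33.0] * 1200   # ~30 fps: 32 players on screen in a firefight

print(f"menu capture:      {avg_fps(menu_and_load):.0f} fps")
print(f"firefight capture: {avg_fps(big_firefight):.0f} fps")
```

Same game, same hardware, and the reported "average FPS" differs by a factor of four purely because of where the run was captured. Without knowing where in each game the captures were taken, numbers like these are hard to compare across reviews.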
 
bingo13 said:
Besides the lies and cronies comments, here are my main issues -

1. Real World Gameplay by whose standards? How many people actually play a game at 1600x1200 8xAA/16xAF with a single video card? If we take last year's monitor sales, over 73% sold were 19" or under. Why not show resolutions that the majority of people with new monitors utilize, and then go up the resolution ladder for results?

2. The games that are CPU bound or truly benefit from raw CPU power were not included in this "real world" gameplay article. Why not? These games represent the majority of PC game sales at this time and from benchmarks on other sites, Core 2 Duo does make a difference in gaming even at the higher resolution.

3. How do we know what sections of the games were used for the FRAPS captures? Depending on the section, each one of the games listed can have a wide variability in the performance results. You can take FRAPS results at the menu and game start and get a result that benefits your desired outcome, or take one with 32 players on screen in the middle of a firefight for another.

4. How many "real" gamers are going to accept below 30FPS results? The eye candy will get turned down and you will see the CPU make a difference.

5. Kyle complained about the timelines; well, it seems as though everyone else had the same ones and managed to get CrossFire numbers out in time, along with detailed comparisons against other CPUs. Before I hear the "it was not a CPU article" line one more time, why not show Pentium D 805 and AMD X2 3800+ results? Those two will be the new bargain-basement dual-core CPUs, so let's see how these "real world" tests work out, as most real-world gamers will probably be using systems with this type of CPU power.

QFT
 
b0bd0le said:
this whole thread and site and review is lame

I think the whole point of this so-called "CPU review" was to be inflammatory; this review has spawned countless threads throughout the internet. I can't even fathom how much cash you have made off this one review.

This is disgusting; with your business model you make cash off of stirring the pot instead of delivering real empirical data. You light the fires under the ATI vs. NVIDIA and AMD vs. INTEL arguments. HardOCP is the Rush Limbaugh, the Michael Moore, of this culture.

Congrats on not having any substance, Kyle & co.

Congrats on not getting the point of the review.
 