Real-World Gaming CPU Comparison with BFGTech 8800 GTX SLI

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
52,960
Real-World Gaming CPU Comparison with BFGTech 8800 GTX SLI - We are back again with a real-world CPU gaming shootout. With NVIDIA's 8800 series GPU supplying a big jump in performance over the last generation of video cards, we decided we needed to revisit just what these new CPUs do for you when you are gaming. Is one better?


With today’s faster video cards, we certainly saw more differences between our AMD and Intel processors than we did at the launch of the Core 2 Duo, and certainly more than we saw in March of last year. Undoubtedly, the Core 2 Duo allowed us a better gaming experience overall. We also saw some examples where we were again GPU-bottlenecked in games, even with these massively fast BFGTech 8800 GTX cards.
 

J-Mag

2[H]4U
Joined
Dec 21, 2004
Messages
3,640
Could it be that 768 MB of RAM on the video card is simply not enough for certain settings like 2560x1600 4X AA in Oblivion or motion blur being enabled in Need for Speed Carbon?

There is a console command in Oblivion that brings up the total amount of GPU memory used. I forgot what it was, though. I never saw it go above ~400MB, and that was at 1600x1200 4xAA (no HDR) with a modded ini for extreme LOD, texture replacement mods, parallax mods, etc...
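For a rough sense of scale (back-of-the-envelope math with assumed buffer formats; none of these numbers come from the article): the framebuffer itself is only a small slice of 768MB even at 2560x1600 with 4xAA, which is why it's the textures and mods that actually push usage toward numbers like that ~400MB.

Code:
# Rough framebuffer math for 2560x1600 with 4x multisampling.
# Buffer formats are assumptions; real drivers allocate more on top
# (textures, geometry, overhead), so treat this as a floor.
width, height = 2560, 1600
bpp = 4                    # assumed 32-bit color
samples = 4                # 4x AA

color = width * height * bpp * samples    # multisampled color buffer
depth = width * height * 4 * samples      # assumed 24-bit depth + 8-bit stencil
resolve = width * height * bpp            # resolved buffer for scanout

print(f"~{(color + depth + resolve) / 2**20:.0f} MB")  # ~141 MB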
 

chrisf6969

[H]F Junkie
Joined
Oct 27, 2003
Messages
9,013
I still think you're "gaming" (benching) at too high of resolutions / settings. If your min. hits 0,2, 18,20 o 22 fps (some samples from review) or even under 30 fps, you need to back down the resolution/game settings. IMHO.

I'd prefer to play at 1600x1200 with high AA/AF that's buttery smooth, rather than at a higher resolution that gets choppy sometimes. Nothing spoils your fun more than a choppy section of the game causing you to die. Most games still look incredible at even 1280x1024 with max settings & high levels of AA/AF.

Personally, I prefer to keep my min. fps around 60+! But I'm probably on the other end of the extreme.
 

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
52,960
I still think you're gaming at too high of resolutions / settings. If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even dips under 30 fps, you need to back down the resolution/game settings. IMHO.

I'd prefer to play at 1600x1200 with high AA/AF that's buttery smooth, rather than at a higher resolution that gets choppy sometimes. Nothing spoils your fun more than a choppy section of the game causing you to die.

Personally, I prefer to keep my min. fps around 60+! But I'm probably on the other end of the extreme.


Thanks for your thoughts on how we test; they are noted. We actually play the games and use our own feel for what level of smoothness is acceptable. Your mileage may vary.
 

J-Mag

2[H]4U
Joined
Dec 21, 2004
Messages
3,640
If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even dips under 30 fps, you need to back down the resolution/game settings. IMHO.

Some games just have bad mins and there is nothing you can do about it, no matter the settings. It may have something to do with the way the program loads new areas.
 

Brent_Justice

Moderator
Joined
Apr 17, 2000
Messages
17,755
I still think you're "gaming" (benching) at too high of resolutions / settings. If your min. hits 0,2, 18,20 o 22 fps (some samples from review) or even under 30 fps, you need to back down the resolution/game settings. IMHO.

I'd prefer to play at 1600x1200 with high AA/AF that's buttery smooth, rather than at a higher resolution that gets choppy sometimes. Nothing spoils your fun more than a choppy section of the game causing you to die. Most games still look incredible at even 1280x1024 with max settings & high levels of AA/AF.

Personally, I prefer to keep my min. fps around 60+! But I'm probably on the other end of the extreme.

2560x1600 is the native resolution of the Dell 3007WFP. This resolution is where 8800 GTX SLI shines, and what we chose to focus on for the ultimate gaming experience.

Note that none of the minimum framerates were sustained; they were only brief drops in some places, which did not negatively impact our gaming experience.
 

mentok1982

Supreme [H]ardness
Joined
Sep 17, 2004
Messages
4,366
I hope [H]ard|OCP will write another kick-ass article like this once the really cool multi-core DX10 games are available. Let's have a quad-core face-off between AMD and Intel!
 

Mohonri

Supreme [H]ardness
Joined
Jul 29, 2005
Messages
5,757
It's interesting to see how the CPU affects performance with a top-of-the-line graphics setup, but I wonder how differing CPUs would affect gameplay at the level of us mere mortals. For example, how would the CPU affect performance when you have a single X1950 Pro or 7900GT? My impression is that with a tighter GPU bottleneck, the CPU would become even less relevant than this article shows.

Gee, I sure would like to have a Min. FPS of 30, or even 20....
 

HaMMerHeD

[H]F Junkie
Joined
Jul 13, 2004
Messages
10,398
I still think you're "gaming" (benching) at too high of resolutions / settings. If your min. hits 0,2, 18,20 o 22 fps (some samples from review) or even under 30 fps, you need to back down the resolution/game settings. IMHO.



Many times, the minimum frame rate is due to a loading/caching issue which affects the computer for just a few seconds at a time. It's not usually a prolonged dip.
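To put a number on it, here is a minimal sketch (invented frame times, not data from the review) of how a single loading hitch drags the reported minimum into single digits even though the run feels smooth the whole way:

Code:
# Hypothetical frame times (ms): a smooth ~60 fps run with one 250 ms loading stall.
frame_times_ms = [16.7] * 300 + [250.0] + [16.7] * 300

instantaneous_fps = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"min FPS: {min(instantaneous_fps):.0f}")  # 4  -- set entirely by the one stall
print(f"avg FPS: {avg_fps:.0f}")                 # ~59 -- what the run actually felt like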
 

MrGuvernment

Fully [H]
Joined
Aug 3, 2004
Messages
20,571
It's interesting to see how the CPU affects performance with a top-of-the-line graphics setup, but I wonder how differing CPUs would affect gameplay at the level of us mere mortals. For example, how would the CPU affect performance when you have a single X1950 Pro or 7900GT? My impression is that with a tighter GPU bottleneck, the CPU would become even less relevant than this article shows.

Gee, I sure would like to have a Min. FPS of 30, or even 20....

You should check out their first review of AMD vs. Intel on the Core platform, where they basically said it doesn't matter whether you go AMD or Intel because the video card is the limit. Many got upset at that review, since it showed that buying a Core 2 Duo over an AMD made no difference at "mortal" resolutions and such. I'm sure plenty of people then went out, bought SLI cards, and had crap experiences. Now this review shows a difference, so there are going to be some upset people claiming "but [H] said before it didn't matter what CPU I got."


It should be interesting when the DX10 games arrive and multi-core support improves... even a Core 2 Duo could end up being the bottleneck.
 

chrisf6969

[H]F Junkie
Joined
Oct 27, 2003
Messages
9,013
2560x1600 is the native resolution of the Dell 3007WFP. This resolution is where 8800 GTX SLI shines, and what we chose to focus on for the ultimate gaming experience.

Note that none of the minimum framerates were sustained; they were only brief drops in some places, which did not negatively impact our gaming experience.

I know that's where the 8800s shine, but I thought this was a re-review of the CPUs.

I just think that no matter what video card you use, if you keep pushing it that far to the edge, where the minimums are so low, you're not going to show very much difference between any two high-end CPUs, because you can always crank the resolutions/settings on the most recent games to the max. Like now you're using BF2142, which is a little more demanding, instead of BF2, so you're "re-bottlenecking" the GPU.

Luckily, two 8800GTXs are really hard to bottleneck, so you could show some differences between the CPUs.
 

whrswoldo

2[H]4U
Joined
Jun 6, 2002
Messages
3,138
It's interesting to see how the CPU affects performance with a top-of-the-line graphics setup, but I wonder how differing CPUs would affect gameplay at the level of us mere mortals. For example, how would the CPU affect performance when you have a single X1950 Pro or 7900GT? My impression is that with a tighter GPU bottleneck, the CPU would become even less relevant than this article shows.

Gee, I sure would like to have a Min. FPS of 30, or even 20....

Take a look at the Core 2 Intel gaming comparison article here. They used a 7900GTX, which is not terribly dissimilar from an overclocked 7900/ATI equivalent.
 

magoo

[H]F Junkie
Joined
Oct 21, 2004
Messages
14,489
Enjoyable reading.
Not feeling guilty anymore about using the C2Ds instead of FXs. :D
 

rinaldo00

[H]ard|Gawd
Joined
Mar 9, 2005
Messages
1,860
SWEET!

-THIS- was an article I was REALLY hoping to see right before I buy/build my new system!

Good job!
 

HaMMerHeD

[H]F Junkie
Joined
Jul 13, 2004
Messages
10,398
I know that's where the 8800s shine, but I thought this was a re-review of the CPUs.

I just think that no matter what video card you use, if you keep pushing it that far to the edge, where the minimums are so low, you're not going to show very much difference between any two high-end CPUs, because you can always crank the resolutions/settings on the most recent games to the max. Like now you're using BF2142, which is a little more demanding, instead of BF2, so you're "re-bottlenecking" the GPU.

Luckily, two 8800GTXs are really hard to bottleneck, so you could show some differences between the CPUs.

I think you have missed the point entirely. 8800GTX in SLI on a 30" LCD is currently the most [H]ardcore gaming experience you can get. When you're spending that much dough on the video cards and monitor, it is nice to know exactly what your best bet is with the processors, since the options are so different. Would it have been useful to know about lower resolutions? Yes, it would. Would it have still made the point Brent and Kyle were trying to make? No, it would not.
 
Joined
Apr 28, 2005
Messages
585
Let me preface this statement for the forum denizens first by saying that I read/heard about this article the same way the rest of the public did. I do cooling not videocards/CPUs so I had no knowledge that this article was even being written.

With that out of the way, I have to say that this is the best CPU article yet. As much as I like the "real world" style of benching, I also like to see the "apples to apples" comparison. Having them both together just makes it perfect.
Hopefully this puts the debate over that to rest for good.

On a side note... as sick as the 3007WFP is, it cost more than my entire computer.
 

Ardrid

Limp Gawd
Joined
Jun 10, 2004
Messages
153
Excellent article. It's nice to see C2D vindicated. Looks like Intel wasn't "lying" after all, eh?
 

rinaldo00

[H]ard|Gawd
Joined
Mar 9, 2005
Messages
1,860
On a side note... as sick as the 3007WFP is, it cost more than my entire computer.
UltraSharp 3007WFP 30 inch Wide-Screen Black Flat Panel Monitor, LCD with Height Adjustable Stand, Requires a DVI-D Dual-Link Compatible Graphics Card
$1,499

Your computer cost less than $1,500? How old is it?
 

jstnomega

[H]ard|Gawd
Joined
Oct 4, 2001
Messages
1,706
Forgive me for restating the obvious, but in other words...

If one elects to stay with a smaller display monitor, say 17" or 19", whether CRT or LCD, and one is also willing to settle for a max resolution of 1024x768, or maybe up to 1600x1200 depending on the game, then one needs neither a high-end CPU nor a high-end GPU to experience gaming with all the eye candy maxed out.

If that's the case, then it should be emphasized that as far as gaming is concerned, display monitor SIZE should be the determining factor guiding CPU/GPU purchases, and those still running smaller display monitors require the high end of neither.
 

Geo Fry

Limp Gawd
Joined
Feb 27, 2006
Messages
483
I have a question about the M2: Total War bench... What unit scale did you pick (Small, Medium, Large, Huge)? I didn't see it stated in the review. I did see that you said 900 vs. 1000, but it seems that the unit scale will cripple old CPUs if set too high, regardless of how many units...

My P4 can barely hang on if it's set to Large, so I have to keep it at Medium. From playing this game for a month now, this setting seems to have a bigger impact on performance than most of the other settings, at least for my rig anyway (can anyone say CPU bottleneck :p)
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
60,279
Since I have an 8800GTX SLI setup and a Core 2 Duo with a Dell 3007WFP, I can safely say that you want all the processing power you can get at 2560x1600. In some games the differences between my C2D's stock speed and overclocked speed (3.6GHz) are noticeable. They haven't been extreme so far, but that may change when Crysis and some other games start hitting the market.
 

savantu

[H]ard|Gawd
Joined
Nov 13, 2002
Messages
1,325
I think you have missed the point entirely. 8800GTX in SLI on a 30" LCD is currently the most [H]ardcore gaming experience you can get. When you're spending that much dough on the video cards and monitor, it is nice to know exactly what your best bet is with the processors, since the options are so different. Would it have been useful to know about lower resolutions? Yes, it would. Would it have still made the point Brent and Kyle were trying to make? No, it would not.

Not at all.

The lower the resolution, the greater the performance advantage of Conroe.
 

Mazgazine1

Limp Gawd
Joined
Feb 26, 2005
Messages
345
Now we need an "overclocking for gaming" article, to see if hitting 3.5GHz with your C2D is worth anything more on the gaming end of things. Maybe pit it against the X6800 at stock speeds and see who wins?
 

InorganicMatter

[H]F Junkie
Joined
Oct 19, 2004
Messages
15,464
Hey, you've got a small typo that makes a huge difference:

Given that gaming has yet to truly embrace multi-core CPUs, we are going to pit AMD’s high end Socket 939 Athlon 64 FX-62 (2.8GHz) against Intel’s Core 2 Duo X6800 (2.93GHz).
The FX-62 and Asus M2N32 are a Socket AM2 platform. Might want to fix that, as it will get the AMD guys even more upset than they already are. ;)

Other than that typo, GREAT article. I love it when [H] clears up the BS.

Not feeling guilty anymore about using the C2Ds instead of FXs. :D

You shouldn't have felt guilty to begin with. Brand loyalty to AMD does nothing more than convey a message that says "don't bother innovating, since I'll buy your useless crap either way."
 

MrWizard6600

Supreme [H]ardness
Joined
Jan 15, 2006
Messages
5,779
Get over yourselves. Other reviews test with resolution, aniso, AA, and eye candy as constants and FPS as the variable. You guys test with FPS as the constant and resolution, eye candy, aniso, and AA as the variables.

The reason resolution works better as a constant is that it IS a constant. It doesn't change when someone throws a smoke bomb. You guys deem for us what is "tolerable" FPS-wise, which is crap. I can do 30, and I'll go as low as 25 so long as I don't have to spray five people. My friend Jamie plays WoW on his 6600LE at 15 fps average and doesn't mind it. Some of us are on a 7900GTX SLI setup and already in contact with EVGA to step up to two 8800GTSes. You guys can tolerate only the max refresh rate, which is understandable.

You guys telling us 40 is acceptable is unacceptable by my standards. FS's reviews are much more in-depth, as they cover everything from 800x600 to 2500x2000. We all saw what happened when you increased the resolution. We don't actually need you to tell us.

"but no body on a Core 2 Duo system would play at 800 X 600"
your hugly mistaken. hundreds of thousands play on memron, or Conroe procs at 800X600 because they have integrated graphics. They have the system they saw on sale at futureshop featuring a Core Two Duo at 2.2 Ghz for $599, and thought it was an awsome buy. They have no idea what a graphics card is, other then that a "256 deticated" is good (which of course, we all know can be a load of bull).

Edit: oh yeah, and there's a typo there. It's a Socket AM2 FX-62. There is no S939 590 platform (is there?!?).

Edit 2: you state that even though the FPS was high, you were experiencing choppiness. What was the status of V-sync?
 

Shotglass01

[H]ard|Gawd
Joined
Aug 26, 2005
Messages
1,967
You guys telling us 40 is acceptable is unacceptable by my standards. FS's reviews are much more in-depth, as they cover everything from 800x600 to 2500x2000. We all saw what happened when you increased the resolution. We don't actually need you to tell us.

I disagree. I like knowing how far a setup will push a game. When you know where the highest limits are, you also know that anything less than the highest will improve speed and response. 2500x2000 @ 25 fps with no candy is just as bad as 640x480 @ 25 fps with no candy. At least here, you can get a sense of what resolution paired with what graphics options should work for you.
 

Mohonri

Supreme [H]ardness
Joined
Jul 29, 2005
Messages
5,757
The lower the resolution, the greater the performance advantage of Conroe.
I was under the impression that given a specific GPU, as resolution decreases, the CPU has a decreasing impact on performance. Do you have a link to support what you said?
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
60,279
I was under the impression that given a specific GPU, as resolution decreases, the CPU has a decreasing impact on performance. Do you have a link to support what you said?

The poster means that when you decrease the resolution significantly, the CPU ends up doing most of the work. At 640x480 and at 800x600 this is true with most games. As the resolution increases, the workload shifts to the graphics hardware more.

At lower resolutions, lesser graphics hardware doesn't falter as badly because the workload becomes easier. At low resolutions, comparisons between various CPUs will show a larger difference than you would see with an 8800GTX and that same range of CPUs, up to a point. This article illustrates that the CPU comes back into play when using ultra-high resolutions like those native to the Dell 3007WFP.

In essence, whether we are CPU-bound or GPU-bound almost always depends on the resolution used.
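One way to picture it is a toy model (the per-frame costs below are invented for illustration, not measurements): each frame takes roughly as long as the slower of the CPU's work and the GPU's work, and only the GPU's share grows with pixel count.

Code:
# Toy bottleneck model: frame time = max(CPU work, GPU work).
# CPU_MS and GPU_MS_PER_MP are made-up costs for illustration only.
CPU_MS = 8.0           # per-frame CPU cost: game logic, physics, draw calls
GPU_MS_PER_MP = 6.0    # GPU cost per million pixels shaded

def fps(width, height):
    gpu_ms = GPU_MS_PER_MP * width * height / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)   # the slower stage sets the pace

for w, h in [(800, 600), (1280, 1024), (1600, 1200), (2560, 1600)]:
    gpu_ms = GPU_MS_PER_MP * w * h / 1e6
    side = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{w}x{h}: ~{fps(w, h):.0f} fps ({side}-bound)")

A faster CPU only moves the needle in the rows where the CPU is the larger cost, which is why low-resolution tests exaggerate CPU differences and why 2560x1600 hides them until the graphics cards get fast enough.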
 

Obi_Kwiet

2[H]4U
Joined
Dec 25, 2004
Messages
3,858
Wouldn't it be more useful to run the tests with resolutions and specs that some of us might actually be able to use? I doubt very many people here have a 3007WFP and two 8800GTXs. I think one 8800GTX and a range of resolutions like 1280x1024, 1600x1200, and then 2560x1600 would have been a lot more useful. This setup is so much more than any of us will ever have that I'm not sure whether the results will necessarily scale down to us.
 

deceiver

n00b
Joined
Jan 22, 2007
Messages
6
I'd like to see an article that outlines the card that is recommended for each of the common gaming resolutions.

I run at 1280x1024 - would I really see an improvement beyond a certain point, even for games like Oblivion?

Maybe a series of articles, one resolution a week or something.
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
60,279
Wouldn't it be more useful to run the tests with resolutions and specs that some of us might actually be able to use? I doubt very many people here have a 3007WFP and two 8800GTXs. I think one 8800GTX and a range of resolutions like 1280x1024, 1600x1200, and then 2560x1600 would have been a lot more useful. This setup is so much more than any of us will ever have that I'm not sure whether the results will necessarily scale down to us.

This is a pretty standard complaint. While I don't do video card reviews and can't really address what those guys will or will not do going forward, I can tell you that benchmarking all hardware in the most ideal conditions is probably the best way to go.

The 8800GTX SLI setup needed to be pushed to its limits in order to show the difference a CPU could make with that type of setup. In this context, I think it was the correct choice. Plus, you test with an X6800 CPU to make sure you've removed as much of the CPU bottleneck as possible, so that the only variable changing is your graphics card (in the case of graphics card reviews).

In the [H] video card reviews, if you'll notice, many resolutions and monitor configurations are often tested. I would think this would give you the information you need most. There should be no doubt that at anything less than 2560x1600, the 8800GTX is STILL going to be the fastest card out today.
 

fatboy32

[H]ard|Gawd
Joined
Mar 29, 2002
Messages
1,447
I have to say thank you to the [H]ardocp crew for doing this comparison. I'm doing research for my next gaming rig and this helped a lot. Keep up the good work guys.
 

Donnie27

Supreme [H]ardness
Joined
Dec 22, 2004
Messages
5,616
Wouldn't it be more useful to run the tests with resolutions and specs that some of us might actually be able to use? I doubt very many people here have a 3007WFP and two 8800GTXs. I think one 8800GTX and a range of resolutions like 1280x1024, 1600x1200, and then 2560x1600 would have been a lot more useful. This setup is so much more than any of us will ever have that I'm not sure whether the results will necessarily scale down to us.

I second that motion! 1280x1024 is also a very common LCD res, and I have my eye on at least one 8800GTX; I wasn't that impressed with one 8800GS, though. Besides DX10, it was barely faster than an X1950XT. 1600x1200 even :)
 

savantu

[H]ard|Gawd
Joined
Nov 13, 2002
Messages
1,325
The faster the graphics system, the more important it is to have a system/CPU that can feed it fast enough.

True. It was excellent that the conclusion explained which CPU has more oomph, something the previous article failed miserably to do.
 

digitalfreak

Limp Gawd
Joined
Apr 18, 2005
Messages
356
If that's the case, then it should be emphasized that as far as gaming is concerned, display monitor SIZE should be the determining factor guiding CPU/GPU purchases, and those still running smaller display monitors require the high end of neither.

Not size, but resolution. Back in the CRT days, you could run any resolution you wanted (to the monitor's max) and the picture quality was pretty much the same. With LCDs, you have a fixed resolution, and anything below that needs to be scaled to fit (or you get letterboxing/sidebars).
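As a toy sketch of what that scaling means (assuming a panel or driver that scales aspect-correctly; the resolutions are just examples):

Code:
# Aspect-preserving fit of a lower game resolution onto a fixed-resolution LCD.
def aspect_fit(game_w, game_h, panel_w, panel_h):
    scale = min(panel_w / game_w, panel_h / game_h)    # fit without stretching
    out_w, out_h = round(game_w * scale), round(game_h * scale)
    return out_w, out_h, panel_w - out_w, panel_h - out_h  # image size + bars

print(aspect_fit(1280, 1024, 2560, 1600))  # (2000, 1600, 560, 0): sidebars
print(aspect_fit(1600, 1200, 2560, 1600))  # (2133, 1600, 427, 0): sidebars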
 

digitalfreak

Limp Gawd
Joined
Apr 18, 2005
Messages
356
I'm a bit confused about something. In Medieval II, the C2D could manage 4x TRMSAA, but the FX couldn't handle AA at all. I was under the impression that AA performance was directly related to the video card itself, and not the CPU.
 

qkool

Limp Gawd
Joined
Nov 13, 2003
Messages
357
In this evaluation, especially under Oblivion and some other games, we talked about some odd performance experiences. We said that at certain very high settings we experienced a “choppy” or “laggy” feeling in the game even though the framerate indicated playable framerates.

I get this no matter what resolution I use on my system (a 3000+, 2 gigs of RAM, and a 7800GT). You can't see it, but when you move the mouse, you literally have to wait for the camera to catch up. Yet everything else in the game is moving along just fine. This happens to me in Far Cry too, although much less, which is the reason I never really played that game. I can play many other games with settings that would slow my system down much more than Oblivion does, but still not have the camera problem. I think it's just the game.
 
