Real-World Gaming CPU Comparison with BFGTech 8800 GTX SLI

FrgMstr

Real-World Gaming CPU Comparison with BFGTech 8800 GTX SLI - We are back again with a real-world CPU gaming shootout. With NVIDIA's 8800 series GPUs supplying a big jump in performance over the last generation of video cards, we decided we needed to revisit just what these new CPUs do for you when you are gaming. Is one better?


With today’s faster video cards, we certainly saw more differences between our AMD and Intel processors than we did at the launch of the Core 2 Duo, and certainly more differences than we saw in March of last year. Undoubtedly, the Core 2 Duo allowed us a better gaming experience overall. We also saw some examples where we were again GPU bottlenecked in games, even with these massively fast BFGTech 8800 GTX cards.
 
Could it be that 768 MB of RAM on the video card is simply not enough for certain settings like 2560x1600 4X AA in Oblivion or motion blur being enabled in Need for Speed Carbon?
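For anyone wondering how quickly 768 MB could actually disappear at that setting, here is a rough back-of-envelope sketch (in Python) of render-target memory alone. The assumptions are mine, not from the article: 32-bit color, a 32-bit depth/stencil buffer, 4 stored samples per pixel for 4X MSAA, plus one resolved back buffer; driver overhead, HDR formats, and the texture budget are not counted.

# Rough estimate of render-target memory at a given resolution with MSAA.
# Assumptions (not from the article): 32-bit color, 32-bit depth/stencil,
# one color and one depth sample stored per pixel per AA sample,
# plus a resolved 32-bit back buffer. Textures and driver overhead excluded.
def render_target_mb(width, height, msaa_samples, bytes_per_pixel=4):
    pixels = width * height
    color = pixels * bytes_per_pixel * msaa_samples    # multisampled color buffer
    depth = pixels * bytes_per_pixel * msaa_samples    # multisampled depth/stencil
    resolve = pixels * bytes_per_pixel                 # resolved back buffer
    return (color + depth + resolve) / (1024 ** 2)

print(f"2560x1600 4X AA: ~{render_target_mb(2560, 1600, 4):.0f} MB")  # ~141 MB
print(f"1600x1200 4X AA: ~{render_target_mb(1600, 1200, 4):.0f} MB")  # ~66 MB

Even with those generous simplifications, the buffers alone roughly double going from 1600x1200 to 2560x1600, and that is before Oblivion's high-resolution textures or Carbon's extra motion-blur buffers claim their share of the 768 MB.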

There is a console command in Oblivion that brings up the total amount of GPU memory used. I forget what it was, though. I never saw it go above ~400 MB, and that was at 1600x1200 4xAA (no HDR) with a modded ini for extreme LOD, texture replacement mods, parallax mods, etc...
 
I still think you're "gaming" (benching) at too high of resolutions/settings. If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even under 30 fps, you need to back down the resolution/game settings. IMHO.

I'd prefer to play at 1600x1200 with high AA/AF that's buttery smooth, rather than a higher resolution that gets choppy sometimes. Nothing spoils your fun more than a choppy section of the game causing you to die. Most games still look incredible at even 1280x1024 with max settings & high levels of AA/AF.

Personally I prefer to make my min. fps around 60+ fps! But I'm probably on the other end of the extreme.
 
I still think you're gaming at too high of resolutions/settings. If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even under 30 fps, you need to back down the resolution/game settings. IMHO.

I'd prefer to play at 1600x1200 with high AA/AF that's buttery smooth, rather than a higher resolution that gets choppy sometimes. Nothing spoils your fun more than a choppy section of the game causing you to die.

Personally I prefer to make my min. fps around 60+ fps! But I'm probably on the other end of the extreme.


Thanks for your thoughts on how we test; they are noted. We actually play the games and use our own feel for smoothness that is at an acceptable level. Your mileage may vary.
 
If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even under 30 fps, you need to back down the resolution/game settings. IMHO.

Some games just have bad mins and there is nothing you can do about it, no matter the settings. It may have something to do with the way the program loads new areas.
 
I still think you're "gaming" (benching) at too high of resolutions/settings. If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even under 30 fps, you need to back down the resolution/game settings. IMHO.

I'd prefer to play at 1600x1200 with high AA/AF that's buttery smooth, rather than a higher resolution that gets choppy sometimes. Nothing spoils your fun more than a choppy section of the game causing you to die. Most games still look incredible at even 1280x1024 with max settings & high levels of AA/AF.

Personally I prefer to make my min. fps around 60+ fps! But I'm probably on the other end of the extreme.

2560x1600 is the native resolution of the Dell 3007WFP. This resolution is where 8800 GTX SLI shines, and what we chose to focus on for the ultimate gaming experience.

Note that none of the minimum framerates are sustained; they are only brief drops in some places that did not negatively impact our gaming experience.
 
I hope [H]ard|OCP will write another kick-ass article like this once the really cool multi-core DX10 games are available. Let's have a quad-core face-off between AMD and Intel!
 
It's interesting to see how the CPU affects performance with a top-of-the-line graphics setup, but I wonder how differing CPUs would affect gameplay on the level of us mere mortals. For example, how would CPU affect performance when you have a single X1950 Pro or 7900GT? My impression is that with a tighter GPU bottleneck, the CPU would become even less relevant than this article shows.

Gee, I sure would like to have a Min. FPS of 30, or even 20....
 
I still think you're "gaming" (benching) at too high of resolutions/settings. If your min. hits 0, 2, 18, 20, or 22 fps (some samples from the review), or even under 30 fps, you need to back down the resolution/game settings. IMHO.



Many times, the minimum frame rate is due to a loading/caching issue which affects the computer for just a few seconds at a time. It's not usually a prolonged dip.
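If anyone wants to check that on their own machine, looking at how long the dips last tells you more than the single lowest number. Here is a quick sketch (Python) assuming you have a plain text log with one per-frame render time in milliseconds per line; the filename and the ~30 fps threshold are just examples, not anything from the review.

# Separate momentary hitches from sustained slowdowns in a frametime log.
# Assumes a plain text file with one per-frame time in milliseconds per line.
# The filename and the 33.3 ms (~30 fps) threshold are only examples.
def slow_stretches(path, threshold_ms=33.3):
    with open(path) as f:
        times = [float(line) for line in f if line.strip()]
    stretches, run = [], 0.0
    for t in times:
        if t > threshold_ms:
            run += t                      # accumulate consecutive slow frames
        elif run:
            stretches.append(run)         # a slow stretch just ended
            run = 0.0
    if run:
        stretches.append(run)
    return stretches

runs = slow_stretches("frametimes.txt")
print(f"{len(runs)} dips below ~30 fps, longest lasted {max(runs, default=0) / 1000:.2f} s")

A one-off 50 ms hiccup while the game streams a new area shows up as a single short stretch; a real bottleneck shows up as long stretches that keep coming back.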
 
It's interesting to see how the CPU affects performance with a top-of-the-line graphics setup, but I wonder how differing CPUs would affect gameplay on the level of us mere mortals. For example, how would CPU affect performance when you have a single X1950 Pro or 7900GT? My impression is that with a tighter GPU bottleneck, the CPU would become even less relevant than this article shows.

Gee, I sure would like to have a Min. FPS of 30, or even 20....

You should check out their first review of AMD vs Intel on the Core platform, where they basically said it doesn't matter if you go AMD or Intel, the vid card is the limit. Many got upset at that review because it showed that buying a Core 2 Duo over an AMD made no difference at "mortal" resolutions and such. I'm sure many people then went out and bought SLI cards and had crap experiences, and now this review shows a difference. Sure, there are going to be some upset people claiming "but [H] said before it didn't matter what CPU I got."


It should be interesting when the DX10 games arrive and multi-core support improves... even a Core 2 Duo could end up being the bottleneck...
 
2560x1600 is the native resolution of the Dell 3007WFP. This resolution is where 8800 GTX SLI shines, and what we chose to focus on for the ultimate gaming experience.

Note that none of the minimum framerates are sustained; they are only brief drops in some places that did not negatively impact our gaming experience.

I know that's where the 8800s shine, but I thought this was a re-review of the CPUs.

I just think no matter what video card you use, if you keep pushing it that close to the edge, where the minimums are so low, you're not going to show very much difference between any two high-end CPUs, because you can always crank up the resolutions/settings on the most recent games to the max. Like now you're using BF2142 instead of BF2, which is a little more demanding, so you're "re-bottlenecking" the GPU.

Luckily, two 8800 GTXs are really hard to bottleneck, so you could show some differences between the CPUs.
 
It's interesting to see how the CPU affects performance with a top-of-the-line graphics setup, but I wonder how differing CPUs would affect gameplay on the level of us mere mortals. For example, how would CPU affect performance when you have a single X1950 Pro or 7900GT? My impression is that with a tighter GPU bottleneck, the CPU would become even less relevant than this article shows.

Gee, I sure would like to have a Min. FPS of 30, or even 20....

Take a look at the Core 2 Intel gaming comparison article here. They used a 7900GTX, which is not terribly dissimilar from an overclocked 7900/ATI equivalent.
 
enjoyable reading.
not feeling guilty anymore about using the C2Ds instead of FXs.:D
 
SWEET!

-THIS- was an article I was REALLY hoping to see right before I buy/build my new system!

Good job!
 
I know that's where the 8800s shine, but I thought this was a re-review of the CPUs.

I just think no matter what video card you use, if you keep pushing it that close to the edge, where the minimums are so low, you're not going to show very much difference between any two high-end CPUs, because you can always crank up the resolutions/settings on the most recent games to the max. Like now you're using BF2142 instead of BF2, which is a little more demanding, so you're "re-bottlenecking" the GPU.

Luckily, two 8800 GTXs are really hard to bottleneck, so you could show some differences between the CPUs.

I think you have missed the point entirely. 8800GTX in SLI on a 30" LCD is currently the most [H]ardcore game experience you can get. Spending that much dough on the video cards and monitor, it is nice to know exactly what your best bet is with the processors, since the options are so different. Would it have been useful to know about lower resolutions? Yes, it would. Would it have still made the point Brent and Kyle were trying to make? No, it would not.
 
Let me preface this statement for the forum denizens first by saying that I read/heard about this article the same way the rest of the public did. I do cooling not videocards/CPUs so I had no knowledge that this article was even being written.

With that out of the way, I have to say that this is the best CPU article yet. As much as I like the "real world" style benching I also like to see the "apples to apples" comparison. Having them both together just makes it perfect.
Hopefully this puts the debate over that to rest for good.

On a side note... As sick as the 3007WFP is, it cost more than my entire computer.
 
Excellent article. It's nice to see C2D vindicated. Looks like Intel wasn't "lying" after all, eh?
 
On a side note... As sick as the 3007WFP is, it cost more than my entire computer.
UltraSharp 3007WFP 30 inch Wide-Screen Black Flat Panel Monitor, LCD with Height Adjustable Stand, Requires a DVI-D Dual-Link Compatible Graphics Card
$1,499

Your computer cost less than $1,499? How old is it?
 
Forgive me for restating the obvious, but in other words...

If one elects to stay with a smaller display monitor, say 17" or 19", whether CRT or LCD, and as long as one is also willing to settle for a max resolution of 1024x768, up to maybe 1600x1200 (which will be game-experience-dependent), then one needs neither a high-end CPU nor a high-end GPU to experience gaming with all the eye candy maxed out.

If that's the case, then it should be emphasized that, as far as gaming is concerned, display monitor SIZE should be the determining factor guiding CPU/GPU purchases, and those still running smaller display monitors require the high end of neither.
 
I have a question about the M2: Total War bench... What unit scale size did you pick (Small, Medium, Large, Huge)? I didn't see it stated in the review. I did see that you said 900 vs 1000, but it seems that the unit scale will cripple old CPUs if set too high, regardless of how many units...

My P4 can barely hang on if set to Large, so I have to keep it at Medium. From playing this game for a month now, this setting seems to have a bigger impact on performance than most of the other settings, at least for my rig anyway (can anyone say CPU bottleneck :p).
 
Since I have an 8800GTX SLI setup and a Core 2 Duo with a Dell 3007WFP, I can safely say that you want all the processing power you can get at 2560x1600. In some games the differences between my C2D's stock speeds and overclocked speeds (3.6GHz) are noticeable. They haven't been an extreme difference so far, but that may change when Crysis and some other games start hitting the market.
 
I think you have missed the point entirely. 8800GTX in SLI on a 30" LCD is currently the most [H]ardcore game experience you can get. Spending that much dough on the video cards and monitor, it is nice to know exactly what your best bet is with the processors, since the options are so different. Would it have been useful to know about lower resolutions? Yes, it would. Would it have still made the point Brent and Kyle were trying to make? No, it would not.

Not at all.

The lower the resolution the greater the performance advantage of Conroe.
 
Now we need an "overclocking for gaming" article, to see if hitting 3.5GHz with your C2D is worth anything more in the gaming end of things. Maybe pit it against the X6800 at stock speeds and see who wins?
 
Hey, you've got a small typo that makes a huge difference:

Given that gaming has yet to truly embrace multi-core CPUs, we are going to pit AMD’s high end Socket 939 Athlon 64 FX-62 (2.8GHz) against Intel’s Core 2 Duo X6800 (2.93GHz).
The FX-62 and Asus M2N32 are a Socket AM2 platform. Might want to fix that, as it will get the AMD guys even more upset than they already are. ;)

Other than that typo, GREAT article. I love it when [H] clears up the BS.

not feeling guilty anymore about using the C2Ds instead of FXs.:D

You shouldn't have felt guilty to begin with. Brand loyalty to AMD does nothing more than convey a message that says "don't bother innovating, since I'll buy your useless crap either way."
 
Get over yourselves. Other reviews test resolution, aniso, AA, and eye candy as constants and FPS as the variable. You guys test FPS as the constant and resolution, eye candy, aniso, and AA as the variables.

The reason resolution works better as a constant is that it IS a constant. It doesn't change when someone throws a smoke bomb. You guys deem for us what is "tolerable" FPS-wise, which is crap. I can do 30; I'll go as low as 25 so long as I don't have to spray five people. My friend Jamie plays WoW on his 6600 LE at 15 FPS average and doesn't mind it. Some of us are on a 7900GTX SLI setup and already in contact with EVGA to step up to two 8800 GTSs. You guys can tolerate only the max refresh rate, which is understandable.

You guys telling us 40 is acceptable is unacceptable by my standards. FS's reviews are much more in depth, as they cover everything from 800x600 to 2500x2000. We all saw what happened when you increased the resolution. We don't actually need you to tell us.

"But nobody on a Core 2 Duo system would play at 800x600"
You're hugely mistaken. Hundreds of thousands play on Merom or Conroe procs at 800x600 because they have integrated graphics. They have the system they saw on sale at Future Shop featuring a Core 2 Duo at 2.2GHz for $599 and thought it was an awesome buy. They have no idea what a graphics card is, other than that "256 dedicated" is good (which, of course, we all know can be a load of bull).

edit: Oh yeah, and there's a typo there: Socket AM2 FX-62. There is no S939 590 platform (is there?!?).

edit2: You state that even though the FPS was high, you were experiencing choppiness. What was the status of V-sync?
 
You guys telling us 40 is acceptable is unacceptable by my standards. FS's reviews are much more in depth, as they cover everything from 800x600 to 2500x2000. We all saw what happened when you increased the resolution. We don't actually need you to tell us.

I disagree. I like knowing how far a setup will push a game. When you know where the highest limits are, you'll also know that anything less than the highest will improve speed and response. 2500x2500 @25FPS with no candy is just as bad as 640x480 @ 25FPS with no candy. At least here, you can get a sense of what resolution paired up with what graphic options should work for you.
 
The lower the resolution the greater the performance advantage of Conroe.
I was under the impression that given a specific GPU, as resolution decreases, the CPU has a decreasing impact on performance. Do you have a link to support what you said?
 
I was under the impression that given a specific GPU, as resolution decreases, the CPU has a decreasing impact on performance. Do you have a link to support what you said?

The poster means that when you decrease the resolution significantly, the CPU ends up doing most of the work. At 640x480 and at 800x600 this is true with most games. As the resolution increases, the workload shifts to the graphics hardware more.

At lower resolutions, lesser graphics hardware doesn't falter as badly as the workload becomes easier. At low resolutions, comparisons between various CPUs will show a larger difference than you would have using an 8800GTX and that same range of CPUs up to a point. This article illustrates that at some point, a CPU again comes into play when using ultra high resolutions like those native to the Dell 3007WFP.

In essence, whether we are CPU bound or GPU bound depends on the resolution used.
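A toy model makes that point concrete: treat frame time as whichever of the two costs is slower, a roughly resolution-independent CPU cost versus a GPU cost that scales with pixel count. The sketch below is Python, and the numbers are invented purely for illustration, not measured from the article's hardware.

# Toy model: frame rate is limited by the slower of the CPU and GPU per-frame cost.
# CPU cost is treated as resolution-independent; GPU cost scales with pixel count.
# Both constants are invented for illustration only.
CPU_MS = 8.0                 # hypothetical per-frame CPU cost in ms
GPU_MS_PER_MPIXEL = 5.0      # hypothetical GPU cost per million pixels in ms

for w, h in [(800, 600), (1280, 1024), (1600, 1200), (2560, 1600)]:
    gpu_ms = GPU_MS_PER_MPIXEL * (w * h) / 1e6
    frame_ms = max(CPU_MS, gpu_ms)       # the bottleneck is the slower of the two
    limit = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{w}x{h}: ~{1000 / frame_ms:.0f} fps ({limit} bound)")

In that picture, a pair of 8800 GTXs effectively shrinks the per-megapixel GPU cost, which is how the CPU can come back into play even at 2560x1600, exactly what the article found.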
 
Wouldn't it be more useful to run the tests with resolutions and specs that some of us might actually be able to use? I doubt very many people here have a 3007WFP and two 8800GTXs. I think one 8800GTX and a range of resolutions like 1280x1024, 1600x1200, and then 2560x1600 would have been a lot more useful. This setup is so much more than any of us will ever have that I'm not sure whether the results will necessarily scale down to us.
 
I'd like to see an article that outlines the card that is recommended for each of the common gaming resolutions.

I run at 1280x1024 - would I really see an improvement beyond a certain point, even for games like Oblivion?

maybe a series of articles, one resolution a week or something.
 
Wouldn't it be more useful to run the tests with resolutions and specs that some of us might actually be able to use? I doubt very many people here have a 3007WFP and two 8800GTXs. I think one 8800GTX and a range of resolutions like 1280x1024, 1600x1200, and then 2560x1600 would have been a lot more useful. This setup is so much more than any of us will ever have that I'm not sure whether the results will necessarily scale down to us.

This is a pretty standard complaint. While I don't do video card reviews and can't really address what those guys will or will not do going forward, I can tell you that benchmarking all hardware in the most ideal conditions is probably the best way to go.

The 8800GTX SLI setup needed to be pushed to its limits in order to show the difference a CPU could make with that type of setup. In this context, it was the correct choice, I think. Plus, you test with an X6800 CPU to make sure that you've removed as much of the bottleneck as possible, so that the only variable changing is your graphics card (in the case of graphics card reviews).

In the [H] video card reviews, if you'll notice, many resolutions and monitor configurations are often tested. I would think this would give you the information you need most. There should be no doubt that at anything less than 2560x1600, the 8800GTX is STILL going to be the fastest card out today.
 
I have to say thank you to the [H]ardocp crew for doing this comparison. I'm doing research for my next gaming rig and this helped a lot. Keep up the good work guys.
 
Wouldn't it be more useful to run the tests with resolutions and specs that some of us might actually be able to use? I doubt very many people here have a 3007WFP and two 8800GTXs. I think one 8800GTX and a range of resolutions like 1280x1024, 1600x1200, and then 2560x1600 would have been a lot more useful. This setup is so much more than any of us will ever have that I'm not sure whether the results will necessarily scale down to us.

I second that motion! 1280x1024 is also a very common LCD res, and I have my eye on at least one 8800GTX. I wasn't that impressed with one 8800GTS, though; besides DX10, it was barely faster than an X1950XT. 1600x1200 even :)
 
The faster the graphics system the more important it is to have a system/CPU that can feed it fast enough.

True. It was excellent that the conclusion explained which CPU has more oomph, something the previous article failed to do.
 
If that's the case, then it should be emphasized that, as far as gaming is concerned, display monitor SIZE should be the determining factor guiding CPU/GPU purchases, and those still running smaller display monitors require the high end of neither.

Not size, but resolution. Back in the CRT days, you could run any resolution you wanted (to the monitor's max) and the picture quality was pretty much the same. With LCDs, you have a fixed resolution, and anything below that needs to be scaled to fit (or you get letterboxing/sidebars).
 
I'm a bit confused about something. In Medieval II, the C2D could manage 4x TRMSAA, but the FX couldn't handle AA at all. I was under the impression that AA performance was directly related to the video card itself, and not the CPU.
 
In this evaluation, especially under Oblivion and some other games, we talked about some odd performance experiences. We said that at certain very high settings we experienced a “choppy” or “laggy” feeling in the game even though the framerate indicated playable framerates.

I get this no matter what resolution I use on my system. It's a 3000+, 2gigs of ram, 7800gt. You can't see it, but when you move the mouse, you literally have to wait for the camera to catch up. Yet, everything else in the game is moving along just fine. This happens to me in Far Cry also, although much less, which was the reason why I really never played the game. I can play many other games with settings that would slow my system down much more than Oblivion, but still not have the camera problem. I think it's just the game.
 