Is my E6300 @ 3.0 holding my 8800 GTX back?

RooK

[H]ard|Gawd
Joined
Aug 19, 2001
Messages
1,108
So I've just upgraded from my 8800 GTS 320 to a GTX... right off the bat, Crysis runs fantastic on high settings.

But... how much is the processor holding it back? I'm thinking a little, but I do have it OC'ed to 3.0....

I realize the L2 cache is small, but still...

Any ideas?
 
You won't notice more than, say, 2-8 FPS at high settings with a more powerful CPU. Crysis is mostly GPU-limited.
 
Cool deal... I mainly play BF2... didn't know if I would notice much of a difference.

I'll just chill with my cheap e6300 (which I love!) =D
 
Yeah, probably very little. During gameplay in Crysis my E4500 @ 3.2 GHz never goes over 75% usage.
 
Because he thinks it's common knowledge/sense. However, not everyone is a hardware/PC guru or inclined that way. The vast majority of people on this earth aren't; same goes for cars. I know a couple of PC "geeks", enthusiasts, users, what-have-you who don't know squat about cars or maintenance. They get ripped off by letting others do simple service to their cars, or they buy a car brand new (when they don't have to) instead of getting a used one at two-thirds the cost with about 15k miles...

Not everyone's a pro at everything; there are plenty of things that we're all "dumb-asses" or ignorant about. I know it's annoying on the forums with all the same questions. Get used to it, it's the web, man <3 :).
 
We see too many of these posts. At times when your graphics card isn't being taxed as hard (i.e. low resolution or not a lot of action) you could probably get a few more FPS with a faster CPU. Those additional frames wouldn't do you any good because you're already running at a decent framerate at those times. In scenes of heavy action/explosions, etc., where your GPU is being maxed out, more CPU won't help.

I can't think of any current games that would realistically benefit from more than two 3.0 GHz C2 cores. (SupCom, maybe?)

So in other words: for benchmarks, yes, your CPU is "holding you back". For real-world gaming, no. Your CPU is fine.
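
To put the "GPU-limited" idea in concrete terms, here's a toy sketch (my own illustration with made-up frame times, not benchmark data): whichever of the CPU or GPU takes longer per frame sets the frame rate, so a faster CPU only helps when the CPU is already the slower side.

```python
# Toy bottleneck model: the slower of the CPU and GPU sets the frame time.
# All numbers are made up for illustration; they are not benchmarks.

def fps(cpu_ms, gpu_ms):
    """Frames per second when each frame takes max(cpu_ms, gpu_ms) milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms_current = 12.0  # hypothetical per-frame CPU work on an E6300 @ 3.0
cpu_ms_faster  = 9.0   # hypothetical per-frame CPU work on a faster chip
gpu_ms_high    = 33.0  # hypothetical per-frame GPU work at high settings

print(fps(cpu_ms_current, gpu_ms_high))  # ~30.3 FPS
print(fps(cpu_ms_faster, gpu_ms_high))   # ~30.3 FPS -- unchanged, the GPU is the wall
```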
 
So I've just upgraded from my 8800 GTS 320 to a GTX... right off the bat, Crysis runs fantastic on high settings.

But... how much is the processor holding it back? I'm thinking a little, but I do have it OC'ed to 3.0....
Generally, not much, if at all. There are only a handful of games that will bring your CPU to its knees before the GPU hits its own wall. Simulators, particularly Flight Simulator X (at any resolution), will be more CPU-bound than anything else.

Crysis does have its moments of intense CPU usage, but overall, it is more reliant on the GPU than anything else. I'd say you're good to go for now.
 
Any C2D at 3 GHz+ is not going to hold back any current GPU. In specific games that rely heavily on the CPU, like RTS titles with hundreds of units on screen, sure, a slightly faster processor will help a bit, but we are not talking much.

Buying a quad core is questionable since most games still multithread poorly, and the single-thread performance of your processor is still among the best. The E6300 only has 2MB of cache, but that only translates into a 3-5% performance loss versus a 4MB-cache E6320. Your processor at 3.0 GHz is still faster than a stock E6700 and probably close to matching a stock X6800.
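
As a rough back-of-envelope on those comparisons (my own sketch, assuming performance scales with clock speed within the same core family, minus the 3-5% cache penalty quoted above):

```python
# Crude estimate only: performance ~ clock speed x cache factor.
# The ~4% cache factor is the middle of the 3-5% range quoted above.

CACHE_PENALTY_2MB = 0.96

e6300_oc = 3.00 * CACHE_PENALTY_2MB  # E6300 @ 3.0 GHz with 2MB L2 -> ~2.88 "effective GHz"
e6700    = 2.66                      # stock E6700, 4MB L2
x6800    = 2.93                      # stock X6800, 4MB L2

print(f"E6300 @ 3.0 (2MB): ~{e6300_oc:.2f}")  # ahead of a stock E6700...
print(f"E6700 stock (4MB):  {e6700:.2f}")
print(f"X6800 stock (4MB):  {x6800:.2f}")      # ...and within a couple percent of an X6800
```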
 
Any C2D at 3 GHz+ is not going to hold back any current GPU. In specific games that rely heavily on the CPU, like RTS titles with hundreds of units on screen, sure, a slightly faster processor will help a bit, but we are not talking much.

Buying a quad core is questionable since most games still multithread poorly, and the single-thread performance of your processor is still among the best. The E6300 only has 2MB of cache, but that only translates into a 3-5% performance loss versus a 4MB-cache E6320. Your processor at 3.0 GHz is still faster than a stock E6700 and probably close to matching a stock X6800.

This is pretty much what I've thought - and thanks to the rest of you who chimed in as well. (You saved me some money... now what is your bank account number so I can distribute it amongst you? Did I mention I'm in Nigeria... j/k ;) )
 
Your CPU at that speed will not hold your GPU back at all. Sit back & enjoy it.
 
No no no no, this CPU bottlenecking thing is a load of BS, unless you still have a 2 GHz A64.
 
Well, as most gamers now have widescreen resolutions of 1680x1050 and above, the CPU will not be a bottleneck if it is a modern CPU.

Trying to play at 1024x768 is a different story, though most gamers don't really game at that res.
 
Well, as most gamers now have widescreen resolutions of 1680x1050 and above, the CPU will not be a bottleneck if it is a modern CPU.

Trying to play at 1024x768 is a different story, though most gamers don't really game at that res.

I honestly don't believe that most gamers are at 1680x1050 and above... not even close... maybe 25-35% if I had to guess, maybe 50%, but anyway that's not the point. The point was that CPU bottlenecking an 8800 series is not BS by any means; it's just dependent on resolution. It is, however, fairly understood that if you're buying an 8800 series with a 1024 monitor, you're either an unwise spender or intending to buy a better monitor very soon.
 
I honestly don't believe that most gamers are at 1680x1050 and above... not even close... maybe 25-35% if I had to guess, maybe 50%, but anyway that's not the point. The point was that CPU bottlenecking an 8800 series is not BS by any means; it's just dependent on resolution.

Ahh, but why would you have an 8800 series card if you didn't game at a mid-high res? CPU bottlenecking is nearly nonexistent if you have a dual-core CPU and game at a resolution over 1280x1024.
 
So I've just upgraded from my 8800 GTS 320 to a GTX... right off the bat, Crysis runs fantastic on high settings.

But... how much is the processor holding it back? I'm thinking a little, but I do have it OC'ed to 3.0....

I realize the L2 cache is small, but still...

Any ideas?



Not much. If you think it is, or want more, just overclock it a little or a lot.
 
L2 cache doesn't really make too much of a difference, well, it didn't use to :p
 
Ahh, but why would you have an 8800 series card if you didn't game at a mid-high res? CPU bottlenecking is nearly nonexistent if you have a dual-core CPU and game at a resolution over 1280x1024.

Indeed, but considering he initially bought a 320 GTS and upgraded to the GTX, I would think he's gaming at a fairly low resolution, since most on this forum know the 320 wouldn't be much good for anything over 1280x1024.

Also, he mentioned Crysis runs fantastic on high, which would also point to a lower resolution.

Anyway, it's all a moot point since we don't know the details. I didn't mean to flame you or anything, just pointing out that CPU bottlenecking is indeed real, just likely not in the OP's case.
 
....I didn't mean to flame you or anything, just pointing out that CPU bottlenecking is indeed real, just likely not in the OP's case.

It's ok, I'm just getting tired of these threads where people think their 3 GHz dual core is bottlenecking a single card. /yawn.
 
Is my Q6600 @ 3.4 holding my 8800 GTS SLI rig back? :rolleyes: :D
 
Is my PII bottlenecking my Voodoo 2s in SLI? Will it run fine with 128MB of RAM?
 
I honestly don't believe that most gamers are at 1680x1050 and above... not even close... maybe 25-35% if I had to guess, maybe 50%, but anyway that's not the point.

I know you weren't trying to make a point with the above percentage, but just in case anyone is interested: according to the Valve hardware survey, only about 13% of gamers, at least those with Steam installed, play at a resolution of 1680x1050 and above. About 30% still play at 1024x768 and about 40% play at 1280x1024.
http://www.steampowered.com/status/survey.html
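
For anyone curious why resolution keeps coming up: the GPU's per-frame work grows roughly with the number of pixels it has to draw, while the CPU's work (game logic, AI, draw calls) is largely resolution-independent. A quick pixel-count comparison of the resolutions mentioned in the thread:

```python
# Pixel counts for the resolutions discussed in this thread. Higher pixel
# counts mean more GPU work per frame, pushing the bottleneck away from the CPU.

resolutions = [(1024, 768), (1280, 1024), (1680, 1050)]
base = 1024 * 768

for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:>9,} pixels ({pixels / base:.2f}x vs 1024x768)")

# 1024x768:    786,432 pixels (1.00x vs 1024x768)
# 1280x1024: 1,310,720 pixels (1.67x vs 1024x768)
# 1680x1050: 1,764,000 pixels (2.24x vs 1024x768)
```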
 
I just bought a new MSI 8800GT OC, but like someone in the posts above said, I've got an Athlon 64 3500+ (not overclocked). I play at 1680x1050 on a 22" monitor. How badly is the CPU bottlenecking my card since I play at a high res??
 
I just bought a new MSI 8800GT OC, but like someone in the posts above said, I've got an Athlon 64 3500+ (not overclocked). I play at 1680x1050 on a 22" monitor. How badly is the CPU bottlenecking my card since I play at a high res??

Not too bad, but the game is fighting for CPU cycles with Windows. Getting a dual core would get rid of this, as it would offload all the other processes onto the other core.
 
Indeed, but considering he initially bought a 320 GTS and upgraded to the GTX, I would think he's gaming at a fairly low resolution, since most on this forum know the 320 wouldn't be much good for anything over 1280x1024.

Also, he mentioned Crysis runs fantastic on high, which would also point to a lower resolution.

Anyway, it's all a moot point since we don't know the details. I didn't mean to flame you or anything, just pointing out that CPU bottlenecking is indeed real, just likely not in the OP's case.

I would say I'm at a pretty good res... 1680x1050

And for the trolls, I have nothing for you to feed on besides your own elitism. Sorry. I have a day job and I don't sit around keeping up with hardware like I did when I was 21. But I have been a member of the forums since '01, and reading before that... so if you want to compare size, besides the fact I'm 6'2"... bring it.

And I'm also very sick and very grumpy, so bite me.
 
I would say I'm at a pretty good res... 1680x1050

And for the trolls, I have nothing for you to feed on besides your own elitism. Sorry. I have a day job and I don't sit around keeping up with hardware like I did when I was 21. But I have been a member of the forums since '01, and reading before that... so if you want to compare size, besides the fact I'm 6'2"... bring it.

And I'm also very sick and very grumpy, so bite me.

Haha, oh man. We got a REAL LIFE tough guy.
 
I really hope this thread is a joke

I really think you chew your food twice since your head is so far up your ass.

To the OP:

The lower the resolution, the more the load shifts from your GPU onto your CPU. I have the same CPU (see sig) and I don't worry about playing Crysis and whatnot; it won't affect it at all. Also, you may get fewer marks in 3DMark06, but I'm a gamer, and 12K in 3DMark06 doesn't mean squat if I can play Crysis on high settings.
 
Mainly depends on resolution, but either way that chip will OC plenty to make sure it isn't a bottleneck.
 
Except you're wrong, it's not a load of BS, it's totally dependent on resolution.

No, it doesn't. By the time you lower the resolution and graphics settings to the point where your CPU is the bottlenecking component, you'd be running at a frame rate much higher than 60, and that makes the whole bottleneck a moot point.

The point of discussing bottlenecks is that you don't want any one component in your PC to limit the rest in a way that hinders the enjoyment of your gaming experience at the settings you desire. If you lower your resolution and settings to the point where you're bottlenecked by the CPU, the limitations posed by your CPU will not affect your gaming experience at that resolution whatsoever.

This might change when newer games come out with more physics and better AI (Alan Wake might be one), but as it stands right now, a C2D being a CPU bottleneck is BS.
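
A quick sketch of that reasoning (again, made-up frame times, purely illustrative): the CPU's per-frame cost stays roughly flat, so the only way to make it the limiting factor is to shrink the GPU's cost, and by that point the frame rate is already well past 60.

```python
# Illustrative max(cpu, gpu) frame-time model with made-up numbers: the CPU
# only becomes the limit once the GPU time drops below it, and by then the
# frame rate is already far above 60 FPS.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms      = 14.0  # hypothetical per-frame CPU cost, roughly resolution-independent
gpu_ms_high = 33.0  # hypothetical GPU cost at 1680x1050, high settings
gpu_ms_low  = 8.0   # hypothetical GPU cost at 1024x768, lowered settings

print(fps(cpu_ms, gpu_ms_high))  # ~30 FPS: GPU-limited, a faster CPU barely helps
print(fps(cpu_ms, gpu_ms_low))   # ~71 FPS: now CPU-limited, but already well past 60
```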
 