CPU bottleneck at which GPU

MirageMobile

Limp Gawd
Joined
Jan 22, 2007
Messages
489
Q9450 at 3.5GHz.

Game at 1080p.

Currently have a 4870 and looking to upgrade.

Any help is appreciated.
 
At 3.5GHz, I don't think you will see any kind of bottleneck with current cards.
 
Not for a single card anyway. SLI/CF is doable, but tri-SLI and 3-way CF are probably out. What do you plan on running? You should be good with just about any single card or a 590/6990.
 
that cpu is fine for any card you want at 1920x1080. sure, it may not give you 100% of what a gtx570 or 6970 can do in every game, but it's not enough to matter.
 
YES it will be a bottleneck for current gen cards, no doubt. A Q9650 clocked at 4.0GHz is a bottleneck for a 5870.

Proof:
http://www.youtube.com/watch?v=uiqS341ibbc
nice try but that is at 1600 with HBAO off, not 1920 and max settings. at 1920 and max settings, the 2500k would probably only be 15-20 fps faster in BC2, and that is about as cpu intensive as it gets. and sure, it's a "bottleneck" if you want to run fraps and stare at the framerate. let me know though if you can tell the difference once you get above 80-90 fps.

so the point here is that the OP has a 4870 and wants to upgrade so there is NO reason not to get the fastest gpu he can afford to play at 1920.
 
nice try but that is at 1600 with HBAO off, not 1920 and max settings. at 1920 and max settings, the 2500k would probably only be 15-20 fps faster in BC2, and that is about as cpu intensive as it gets. and sure, it's a "bottleneck" if you want to run fraps and stare at the framerate. let me know though if you can tell the difference once you get above 80-90 fps.

so the point here is that the OP has a 4870 and wants to upgrade so there is NO reason not to get the fastest gpu he can afford to play at 1920.

Try some math.

1920x1080 is 2073600 pixels
1600x1200 is 1920000 pixels

that's a 7.4% difference in the number of pixels. For it to make the difference you claim, there would have to be a 30-40% difference. Furthermore, with the C2Q in the video, the GPU was not seeing 100% usage because the CPU could not feed it enough work. The Sandy Bridge in the video kept the GPU at 100% usage.

AKA, nice try to you; it will still be a bottleneck.

And OP was asking about bottlenecks, not about the magnitude of the bottleneck. No matter what you say, a Core 2 series chip will bottleneck current gen video cards in current gen games, and significantly so.
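The pixel arithmetic in that post is easy to double-check. A quick Python sketch (the resolutions are the ones quoted above; this just verifies the percentage):

```python
# Resolutions from the comparison above.
w1, h1 = 1920, 1080
w2, h2 = 1600, 1200

p1 = w1 * h1  # 2,073,600 pixels
p2 = w2 * h2  # 1,920,000 pixels

# Difference expressed relative to the larger frame, as in the post.
diff_pct = (p1 - p2) / p1 * 100  # about 7.4%
print(p1, p2, round(diff_pct, 1))
```

So the step from 1600x1200 to 1920x1080 adds only about 8% more pixels to render, which is the core of the argument that the resolution change alone cannot explain a large framerate gap.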
 
The issue here is that Bad Company 2 is bottlenecked by a q6600, not that the Op's quad would bottleneck a certain tier of GPU in every case. You should absolutely go for the best card you can afford right now.

I can tell you from my experience ([email protected]), that my FPS doubled in full multiplayer games when I moved to an i7 2600k (running 6048x1080 in both cases on a single 5870).
 
You should absolutely go for the best card you can afford right now.

I need to emphasize that I agree with this. But the underlying issue is that the card will not see full usage until OP gets a new processor. So yes, OP, do future proof your GPU by getting the best thing available. Just realize that you will need a better processor to realize its full potential.
 
Try some math.

1920x1080 is 2073600 pixels
1600x1200 is 1920000 pixels

that's a 7.4% difference in the number of pixels. For it to make the difference you claim, there would have to be a 30-40% difference. Furthermore, with the C2Q in the video, the GPU was not seeing 100% usage because the CPU could not feed it enough work. The Sandy Bridge in the video kept the GPU at 100% usage.

AKA, nice try to you; it will still be a bottleneck.

And OP was asking about bottlenecks, not about the magnitude of the bottleneck. No matter what you say, a Core 2 series chip will bottleneck current gen video cards in current gen games, and significantly so.
please learn to READ more carefully. I said 1920 AND max settings would make that 2500k advantage much smaller. AGAIN, that guy was at 1600x1200 WITHOUT HBAO and only using 4x AA. go to 1920x1080, turn on HBAO and use 8x AA and poof... nearly all the advantage of the 2500k would be gone. and also AGAIN, we are talking about framerates above 80-90 fps here.

the extra gpu power will massively help in maxing games like Just Cause 2, Metro 2033, Witcher 2, Crysis, Warhead, and Alien vs Predator.
 
I need to emphasize that I agree with this. But the underlying issue is that the card will not see full usage until OP gets a new processor. So yes, OP, do future proof your GPU by getting the best thing available. Just realize that you will need a better processor to realize its full potential.
because of ONE game running at 1600 and not max settings you draw that conclusion? yes, his cpu will be a bottleneck in some cases, BUT it will NOT be a NOTICEABLE bottleneck with a gpu upgrade. and again, having more gpu power will let him run way better settings and get way better performance than he can with his 4870. in many games even a gtx570 or so will still be the main limitation on max settings at 1920.
 
At 3.5GHz, I don't think you will see any kind of bottleneck with current cards.

:rolleyes:

Dude its a 3 year old processor with a mild overclock.

because of ONE game running at 1600 and not max settings you draw that conclusion? his cpu will NOT be a NOTICEABLE bottleneck with a gpu upgrade. and again, having more gpu power will let him run way better settings and get way better performance than he can with his 4870.

Any proof to back that up?

Don't link me to some shitty anandtech benchmark. They do by far the worst cpu benchmarks.
 
:rolleyes:

Dude its a 3 year old processor with a mild overclock.



Any proof to back that up?

Don't link me to some shitty anandtech benchmark. They do by far the worst cpu benchmarks.
do you actually think there will be any cpu tests done at 1920 with max settings using a core 2 quad OCed to 3.5? I will be the first to say there is a bottleneck in there somewhere, but some of you are being ridiculous. by far his MAIN limitation is the 4870.
 
CPU vs. GPU bottlenecks depend on the game being run and the settings used. Graphics settings tend to change the gpu load significantly while having little effect on cpu load. While most game engines require surprisingly little cpu throughput, the gpu throughput demanded by the graphics engine tends to be high when settings are maxed. However, some game engines do require a powerful cpu; these tend to be PC exclusives, but not always.

For most modern games in most situations, a mildly OCed core 2 duo/quad is still sufficient. In strategy games when you have a lot of units moving around at once, or in a few other games with cpu heavy engines like BFBC2 when a lot of stuff is happening at once (mainly explosions and building/terrain destruction), the cpu load can reach 100% on some cores and begin bottlenecking the gpu. But this is usually minimal and temporary. The idea that someone needs a totally up to date, top of the line cpu to play modern video games without a cpu bottleneck is ridiculous, especially when we live in an era filled with console ports.
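The back-and-forth in this thread boils down to a simple model: each frame costs the CPU some time and the GPU some time, and whichever is slower paces the game. A minimal sketch (illustrative Python with made-up frame times, not measurements from any benchmark):

```python
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate when CPU and GPU work overlap: the slower one sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical numbers for illustration only.
# Low settings: GPU work is light, so the CPU is the bottleneck.
low = fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=6.0)    # CPU-bound

# Max settings: GPU cost triples while CPU cost barely moves,
# so the GPU becomes the bottleneck and the CPU gap stops mattering.
high = fps(cpu_ms_per_frame=10.5, gpu_ms_per_frame=18.0)  # GPU-bound
```

This ideal-overlap model is a simplification (real engines interleave CPU and GPU work imperfectly), but it captures why cranking up resolution and settings hides CPU differences, and why low-res CPU benchmarks exaggerate them.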
 
do you actually think there will be any cpu tests done at 1920 with max settings using a core 2 quad OCed to 3.5? I will be the first to say there is a bottleneck in there somewhere, but some of you are being ridiculous. by far his MAIN limitation is the 4870.

Oh, I agree with you there. 4870 is not a current card though. A three year old C2Q will hold back with something a little more current like a GTX560ti.
 
Oh, I agree with you there. 4870 is not a current card though. A three year old C2Q will hold back with something a little more current like a GTX560ti.
not enough to matter. if you want to run low res cpu limited tests then sure but at 1920, his 3.5 Core 2 quad will NOT be an issue with just a gtx560 Ti.
 
not enough to matter. if you want to run low res cpu limited tests then sure but at 1920, his 3.5 Core 2 quad will NOT be an issue with just a gtx560 Ti.

:rolleyes:

Do you honestly believe that a newer cpu like a 2500k won't make a difference from a three year old cpu with gaming? I've seen a few games, I've posted benchmarks. I'm not going to get into it here. You can check my post history. There are a few threads on this very forum that pretty much back up what I have seen. You can think what you will.
 
:rolleyes:

Do you honestly believe that a newer cpu like a 2500k won't make a difference from a three year old cpu with gaming? I've seen a few games, I've posted benchmarks. I'm not going to get into it here. You can check my post history. There are a few threads on this very forum that pretty much back up what I have seen. You can think what you will.
you can roll your eyes all you want. I am not going to dig through your posts so how about you show me a core 2 quad at 3.5 being a noticeable bottleneck for a gtx560 at 1920 on max settings or stfu?
 
you can roll your fucking eyes all you want. how about you show me a core 2 quad at 3.5 being a noticeable bottleneck for a gtx560 at 1920 on max settings or stfu?

I'm not the one making claims without doing any research and then being a condescending prick about it. PCGamesHardware did some overclocking comparisons with core 2 quads on some older graphics cards at real resolutions. Alien Babel Tech did a nice comparison. There's more out there; look.
 
I'm not the one making claims without doing any research and then being a condescending prick about it. PCGamesHardware did some overclocking comparisons with core 2 quads on some older graphics cards at real resolutions. Alien Babel Tech did a nice comparison. There's more out there; look.
and? there is not even a Core 2 quad in that test at ABT. the games with the big increase using faster cpus are using 5870 crossfire which is much faster than a single gtx560 ti. also all the quads there are delivering framerates well in excess of 90 fps in UT3 which you linked to.
 
how about some reality for you. even the Core 2 quad at just 2.83 is already matching 2500k at just 1680 in Aliens vs Predator. http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-20.html

at 1920 the 2500k is 26% faster than the 2.83 Core 2 quad in Metro 2033. the OP's core 2 quad is clocked 24% faster so do the math. http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-18.html

here the 2500k does have a huge advantage over the 2.83 core 2 quad in this very cpu heavy game. still the OP's cpu at 3.5 would do much better and deliver well over 60fps. http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-19.html

now that is two gpu heavy games and one cpu heavy game. that is also with a gtx580 and NO AA. so just turning on 4x AA in that one cpu intensive game would have the OP's 3.5 Core 2 quad almost dead even with a 2500k at 1920 even with a freaking gtx580.

yet in your mind his 3.5 Core 2 quad is still too slow for a gtx560 Ti? lol

so AGAIN, run max settings at 1920, especially with AA, and the OP's 3.5 Core 2 quad will not be a NOTICEABLE bottleneck with a gtx560 Ti or even gtx570. in fact in some games like Aliens vs Predator or Just Cause 2, even the gtx560 Ti will still be the only real limiting factor.
 
Last edited:
When I got my GTX570 for a really good price, I was curious to see how much of a bottleneck my Q9550 @ 3.4 would be. I saw the silly video comparing the 2500k in BC2 on a near static screen, and NOT even maxed out! I ran the Zero Dark Thirty level just like [H] does at the same settings and got almost the same fps. Granted they use a Core i7 920 at 3.6; even then, my fps wasn't too far off from theirs: min was 41, avg 71. That's enough proof for me, thank you very much.
 
Just get the best GPU that
A) fits your budget
B) is faster than what you have
C) and works with your PSU

And don't sweat it

It's still going to be faster than what you currently have.

Yes, we all know the newer CPUs rock, but it's not like he's sporting a single core Atom CPU there.
 
Some of you are forgetting that increasing graphics settings makes the cpu LESS LIKELY to bottleneck the gpu since gpu load goes up, not the other way around. If you want to guarantee a cpu bottleneck run an older game on very low settings. CPU demand does not always go up with a faster gpu, there are many factors to consider, which is why benchmarks are needed. Typically the most important thing to look at is how cpu intensive a particular engine is and is your cpu powerful enough to handle it? Strategy games for example will typically require a much better cpu than rpgs to run smoothly, regardless of video card.

and? there is not even a Core 2 quad in that test at ABT. the games with the big increase using faster cpus are using 5870 crossfire which is much faster than a single gtx560 ti. also all the quads there are delivering framerates well in excess of 90 fps in UT3 which you linked to.

Agreed. I see many people attempting to compare two cpus in a benchmark where both are capable of achieving framerates higher than 60fps average, then attempting to use that as justification for buying the better cpu. It doesn't matter whether the cpu can achieve 75 or 90 fps under lower settings, since both are over 60 fps, and when you raise the settings or run a game with a more shader heavy engine, the gpu will become the bottleneck and your framerate may drop well below 60 fps. It is important to note that a core 2 quad at 3.2GHz is roughly equal to a Phenom II at 3.6GHz, and in the alienbabeltech benchmarks the 3.8GHz Phenom II (which is slower than the OP's cpu) was doing just fine.

Most cpu game benchmarks are run on older games with lower settings achieving framerates above 120 fps on todays gpus, in these scenarios the cpu becomes far more likely to bottleneck the gpu but once you crank up the settings your framerate drops due to a gpu bottleneck and you begin to see a big difference in performance between different video cards.
 
Last edited:
Some of you are forgetting that increasing graphics settings makes the cpu LESS LIKELY to bottleneck the gpu since gpu load goes up, not the other way around. If you want to guarantee a cpu bottleneck run an older game on very low settings. CPU demand does not always go up with a faster gpu, there are many factors to consider, which is why benchmarks are needed. Typically the most important thing to look at is how cpu intensive a particular engine is and is your cpu powerful enough to handle it? Strategy games for example will typically require a much better cpu than rpgs to run smoothly, regardless of video card.


actually you are a little bit off there. most newer games need faster and faster cpus too. everything from Crysis 2 to Witcher 2 is way more cpu intensive than its predecessors. sometimes it's just sloppy porting alone that needs more cpu power, but that is the reality of pc gaming.

that being said the OP is just fine with a Core 2 quad at 3.5 running any single gpu he can afford at 1920. personally I would go with a 6970, 6950 2gb or gtx570. a 6950 1gb or gtx560 Ti is okay but games like GTA 4 and Crysis 2 with high res texture pack and DX11 need more than 1gb of vram at 1920.
 
I didn't say cpu demand was going down; I said the chance of running into a cpu bottleneck goes down because gpu load goes up. When I run an older game or a game at lower settings I need less cpu throughput, but the amount of gpu throughput needed is proportionally far less. And as we know, the component that is the most overtaxed with work is the bottleneck.
 
Of all people Cannondale saying something isn't CPU limited enough to matter, when that happens it's probably true ;)
 
There is no way that a Q9450 with a decent overclock is a crippling bottleneck for any video card. Yes you will pick up extra frames with an i5-2500K, but that doesn't mean that a C2Q at 3.5GHz can't still give you plenty of performance. If the OP upgrades to a GTX570 for example, he'll still be able to shred any game at his resolution. Now I agree that a 2500K would give him more shredability, but in a lot of games we're talking 85 fps with the C2Q and 95 with a 2500K. Either way, he's still getting great framerates.

This whole "bottleneck" thing gets thrown around way too much. A true bottleneck for, say, a GTX570 would be something like a Pentium 4 with HT. That would choke the crap out of a 570. A C2Q running 3.5GHz would not.

Here is a good review from Anand that backs that up. http://www.anandtech.com/show/2777/7

It's comparing some Nehalem processors to C2Qs and Phenom X4s. The i7s are faster, but not by much. And when you factor in that they're testing at medium resolutions with medium settings, that difference is gonna shrink real fast when they turn up the resolution and settings.

Here is one from Overclockers Club testing an i7-870. They used to use high resolutions and high settings for their CPU tests; they've lately switched back to medium settings and 1680x1050 and below. This one, however, goes up to 1920x1200 and max settings. The differences are marginal at best.

http://www.overclockersclub.com/reviews/intel_corei5750_corei7870/

So in closing, OP would be better off with a Sandy Bridge, but wouldn't we all. That being said, he could upgrade to a GTX580 and still get great performance with his Q9450, albeit not quite as much as if he had a SB in his rig.
 
Not a bottleneck worth noting, or one that should keep you from upgrading your gpu before your CPU. Get what you can, OP. I like the 6950/6970, GTX 560 Ti, or GTX 570 as choices.
 
lol the user "BababooeyHTJ" just went MIA and gave up arguing. Bottom line is bottleneck or not, a new gpu will still give the OP additional frames. He can upgrade his cpu in the future to get even more frames. It's a win-win situation really.
 
I'm not the one making claims without doing any research and then being a condescending prick about it. PCGamesHardware did some overclocking comparisons with core 2 quads on some older graphics cards at real resolutions. Alien Babel Tech did a nice comparison. There's more out there; look.

You know, cannondale06 is making some good arguments and you really haven't backed any of your claims with valid sources.

I don't believe that a quad-core @ 3.5GHz would be a bottleneck for any single-GPU graphics card to date. SLI or CrossFireX may bottleneck though, definitely.

Also, the lower the resolution, the more work the CPU has to do. The higher the res, the more work the GPU does, just fyi.

Still waiting to hear a response from you though. ;)

Bottom line is bottleneck or not, a new gpu will still give the OP additional frames.
+1 to this, a newer GPU will help a lot.
 
A Q9450 may not be the best CPU out there right now, and yes, it will be a limitation in some games, but compared to an HD4870, it is by no means the weak link in the system. The majority of new titles will hold 60fps with that CPU even at stock, let alone overclocked - while you should consider a CPU upgrade in the future, for now just upgrade graphics and see how you go.
 
Thanks for the clarification.

My thoughts exactly.

Might grab 6950/6970 if new cards force some price drops.
 
It will bottleneck, but not a terrible, system crippling one. I used to run a Q9400 and a GTX 275, moved up to a 460, and even the 275 wasn't being used to its fullest with my C2Q; I gained over 12FPS in almost every game I own just by swapping to a newer CPU (i7 950). TL;DR version: yes, it will bottleneck any card newer than a 4870, but it will work for one more mid/high end GPU upgrade if you're on a budget.
 
It will bottleneck, but not a terrible, system crippling one. I used to run a Q9400 and a GTX 275, moved up to a 460, and even the 275 wasn't being used to its fullest with my C2Q; I gained over 12FPS in almost every game I own just by swapping to a newer CPU (i7 950). TL;DR version: yes, it will bottleneck any card newer than a 4870, but it will work for one more mid/high end GPU upgrade if you're on a budget.
you do know a gtx460 is only about a 10-15% faster card than your gtx275, so that is hardly what I would call an upgrade. and the OP has a 9450, which has more cache than your 9400 and is clocked at 3.5. his cpu would literally be about 35% faster than yours, resulting in little to no bottleneck for a gtx570 level card at 1920 with max settings. even a gtx580 would not be bottlenecked at all in many games if running on the best settings for a card like that. only in very cpu intensive games would the OP's cpu give up anything compared to an i5/i7, and even then playability is not impacted.
 
you do know a gtx460 is only about a 10-15% faster card than your gtx275, so that is hardly what I would call an upgrade. and the OP has a 9450, which has more cache than your 9400 and is clocked at 3.5. his cpu would literally be about 35% faster than yours, resulting in little to no bottleneck for a gtx570 level card at 1920 with max settings. even a gtx580 would not be bottlenecked at all in many games if running on the best settings for a card like that. only in very cpu intensive games would the OP's cpu give up anything compared to an i5/i7, and even then playability is not impacted.

The q9400 was running at 3.8; yes, smaller cache, but not too sure about 35% faster though... also I have SLI 460's :)
 
The q9400 was running at 3.8; yes, smaller cache, but not too sure about 35% faster though... also I have SLI 460's :)
well if you were at 3.8 then your 9400 would have been just as fast or even faster than his 9450 at 3.5. ;)
 