That's my concern. I might just have to hold off until I can do a complete system.

Coming from a GTX 260 you should get a nice boost in most games, but yeah, that CPU is really going to hold back any GPU upgrade. In some cases you may not get much more than half of what even a 7850 is capable of.
It depends on the game as to how much of a "bottleneck" the CPU is.
In 99% of games you'll be more than happy with the performance upgrade and the CPU will barely make a difference. Pretty much everything will look and run at about 4x the frame rate on a 7850 compared to a 260, so you'll experience much higher settings at a much higher frame rate.
If you're happy with a frame rate of ~40-50 in BF3 multiplayer at ultra settings, then you'll be fine for a long while too.
If you choose to overclock your processor to 3GHz or 3.3GHz (possible if you already have a decent heatsink/mobo) you'll be more than fine for a decent experience in basically every game for a while.
CPUs have only really increased in capability at a rate of ~10% a year for the past 5 years, but GPUs have improved way faster. Moving from your CPU to an Ivy Bridge CPU is basically the difference of about one year of graphics cards on average; the main speed boost people notice from a new PC is from the SSD that they buy with it.
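The "~10% a year" figure above is the poster's estimate, not a benchmark, but taking it at face value the compounding works out to roughly a 1.6x total CPU improvement over five years:

```python
# Compound the claimed ~10%/year CPU improvement over 5 years.
yearly_gain = 0.10   # the poster's rough estimate, not a measured figure
years = 5
total = (1 + yearly_gain) ** years
print(f"Compounded CPU gain over {years} years: {total:.2f}x")  # ~1.61x
```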
Not really. Here's a video of BF3 on a similar setup to having a 7850 GPU + Quad 8300 CPU; in fact a 7850 is much better than the one in the video.

http://www.youtube.com/watch?v=9SoC2fZx9q0

I really don't see how having an Ivy Bridge processor would improve the experience much, and this is in an "intense" CPU game.

What does that video prove? It's a Q6600 OCed to 3.2, so it's about a 30% faster CPU than a stock Q8300. A Q8300 would easily drop into the low 30s a lot and probably average low 40s at best, whereas a 3570K would nearly double that.
A Q6600 at 3.2 would be over 30% faster than a 2.5 Q8300. And random YouTube videos are not a way to prove what you are trying to prove. It's a fact that in multiplayer the newer i5 CPUs are vastly better than a Q8300 would be, and a 3570K would double what a Q8300 could do with a full map, and that's a fact too.

EDIT: Maybe this will sink in for you, lol. His CPU would be slower than that X4 620 you see next to last.
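The clock-speed part of the argument above can be sketched as simple arithmetic. Both chips are Core 2 quads, so this ignores the cache and FSB differences the thread is disputing (the "over 30%" vs "about 20%" figures); the clock numbers are the ones quoted in the posts.

```python
# Pure clock-ratio comparison of the two CPUs discussed above.
# This ignores architectural differences (cache size, FSB), which is
# exactly what the 30%-vs-20% disagreement in the thread is about.
q6600_oc_ghz = 3.2   # overclocked Q6600 from the video
q8300_ghz = 2.5      # stock Q8300
ratio = q6600_oc_ghz / q8300_ghz
print(f"Clock-for-clock speedup: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}% faster)")  # 1.28x, ~28%
```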
4x the power? You are REALLY far off. Even with no CPU limitations at all, a 7850 is not quite twice as fast as a GTX 260. And before you argue with that: a 7850 is a little slower than a GTX 570, which is right at twice as fast as a GTX 260.

To conclude:
Effectively with the upgrade to 7850 the OP is guaranteed:
Much higher image quality.
At a higher frame rate.
In every game.
About a 4x raw performance difference.
With an absolute worst case game of ~40-50FPS in a 64 player server in BF3.
But that's running at ~1/3 of your resolution; you could potentially run the game at up to double the FPS with a high-end CPU.
I'll leave it to the OP to make his decision from here.
The Q6600 @3.2 is about 20% faster than the Q8300. You didn't take architecture into account. Here's an i5-2500K with a 560 Ti OC http://www.youtube.com/watch?v=YAggH3bEkZI running at 51FPS average. Do you still think an i5-3570K would somehow double that?
Well, the 620 is clocked at 3.0, so it's every bit as fast or faster than the Q8300 at 2.5, which was already a cut-down Core 2 Quad.

Clock for clock the Athlon X4s are not nearly as good as the Core 2 Quads. His CPU would certainly be better than the second-to-last Athlon X4, but I agree with the rest of what you're saying. An i5 or i7 would be a huge boost.
Well, I am just looking at benchmarks, and it would be around twice as fast at the same settings. And again, that is with no CPU bottlenecking either. Some of the higher settings you think he would run would also impact his CPU, not just the GPU. Bottom line is he will not even come close to getting what a 7850 can fully deliver in most cases.

Sorry, I was getting confused with a 250; a 3x raw performance difference is more comparable. Regardless, comparing cards from such different generations is difficult. There are features in the latest architectures which provide way more performance under certain loads (think the kind of loads that next-gen engines will move towards) and much more stability in FPS compared to earlier generations. Comparing old games between these cards will only show performance differences in features that date back to the 260; none of the new features of the card are used.
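The relative-performance chain argued earlier in the thread (a GTX 570 at "right at twice as fast" as a GTX 260, with the 7850 "a little slower" than a 570) can be multiplied out. The 0.9x figure for "a little slower" is an assumption for illustration, not a number anyone in the thread gave:

```python
# Multiply out the relative-performance claims made in the thread.
gtx570_vs_260 = 2.0    # "right at twice as fast", per the thread
hd7850_vs_570 = 0.9    # assumed value for "a little slower than a GTX 570"
hd7850_vs_260 = gtx570_vs_260 * hd7850_vs_570
print(f"Implied HD 7850 vs GTX 260: ~{hd7850_vs_260:.1f}x")  # ~1.8x
```

That lands closer to 2x than to the 3-4x figures claimed above, which is the crux of the disagreement.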
At a much deeper level, you can easily get 10x improvements in some workloads due to different caching methods, better multiplexing of parallel workloads with different instructions, better local data share, synchronisation and cache coherency, better compilers that help take advantage of these features, and better debugging abilities which allow for better optimisation/performance analysis.
The graph states 2.6GHz.
(Super colorful graph that doesn't have anything to do with the OP)
Lol, who plays at that resolution? Are they using an iPhone as a monitor? I'm not going to argue that the 8300 isn't going to limit higher-end GPUs, but it's not as dramatic as the graph shows. At that low a resolution, a lot of the work is being put on the CPU.

Perhaps you missed the whole point of the graph, which was to show what an 8300-level CPU could do. Would it make you feel better to see it getting 40-45 FPS average and sub-30 FPS minimums at 1920x1080? If it's sluggish at 640x480 then it's going to be just as bad at 1920x1080.
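A first-order way to see the low-resolution point is to model delivered FPS as limited by whichever of the CPU or GPU is slower per frame; dropping the resolution mostly shrinks the GPU's work while the CPU's per-frame work stays fixed, which is why a 640x480 run exposes the CPU ceiling. The numbers below are purely illustrative:

```python
# Simple bottleneck model: each frame needs both CPU and GPU work,
# so delivered FPS is roughly min(cpu_fps, gpu_fps).
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """FPS limited by whichever side is slower per frame."""
    return min(cpu_fps, gpu_fps)

# Lowering resolution raises gpu_fps but leaves cpu_fps unchanged,
# so a CPU-bound game sits at the same frame rate either way.
print(delivered_fps(cpu_fps=45, gpu_fps=200))  # low res: CPU-bound at 45
print(delivered_fps(cpu_fps=45, gpu_fps=60))   # high res: still CPU-bound at 45
```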
I have a similar situation: Dell PC, stock i7 2600 and a 6970. Would jumping up to a GTX 680 see a bottleneck from that i7, making it not worth grabbing a 680?