Is the new 5870 pointless on anything less than an i7?

Have had a couple of people tell me there is no point upgrading my current 8800GTX to a 5870, as I wouldn't see much if any performance difference since I am severely limited by my QX6700 CPU running at 3.2 GHz. My LCD's native resolution is 1920x1200, so I have had to pass on a lot of the most recent titles; I am just not able to run them with all the bells and whistles I would like, and I was hoping that upgrading my video card would let me run some of these games.

Common consensus amongst my nerd peeps is that I would see a bigger gain by ditching my current CPU and motherboard (Bad Axe 2) and keeping my 8800GTX than I would by buying a new video card. Needless to say, I am kind of torn. I passed on the last gen of video cards, not seeing a big enough performance jump from my 8800GTX, but after reading through the reviews, maybe it is the newer CPUs that are making just as much of a difference as the video card. Would a newer card be totally wasted on my nearly defunct current rig, or would I see at least some slight gains in more GPU-intensive titles?


Sure, you will see a gain from the 58xx card, no doubt. However, you will probably see the same gain from a 4890 or 1GB 4870 at that resolution. If you just wait a few weeks for the 58xx fury to subside, the prices on the 48xx cards will make your decision very easy!

Personally, I would probably go for the 5870 card and then upgrade to an i5 sometime in the near future. I would not spend a whole lot of money on a new board for your QX6700.
 
I don't see how any quad-core clocked over 3.4-3.5GHz will bottleneck this gen of cards, other than the 5870X2. Games that are heavy on CPU calculations, such as RTSes, will be faster on i7s, but that's true even with current cards.
 
Yeah @ 1920x1200 with filtering the 5870 will be fine with any fairly modern CPU.

In less demanding titles you might lose some FPS, but in those cases the FPS is already superfluous. Imagine getting 104FPS instead of 120 in an ancient title or one that uses an old engine.

Like Ramon said, games that are CPU-limited will be running at high FPS most of the time, so you won't feel a difference.

The 5870 will matter in very demanding titles where it is shouldering the load.
 
This is one of those highly underreported topics that keeps coming up. Guru3D is the only place that's done a comprehensive article on i7 versus C2D/Q when it comes to SLI/CF GPU bottlenecking, though I know [H] has a useful graph buried in here somewhere.

The short of it is this - you will see a massive improvement from an 8800 series card to a 5870. Will that improvement be less than what it could be if you don't have an i5/i7? Probably. But so what?

Except for someone sitting on a pot of gold, almost EVERYONE is "bottlenecked" in CPU or GPU somewhere along the line... actually, the word bottleneck is insanely misused. A bottleneck is running a Celeron with a GTX 280, or an i7 with a GeForce 4 MX. What you will experience is a somewhat-less-than-full-size neck.

In short, it's worth it to get the new tech. AMD is not stupid; they would not develop a card that is only useful if you're one of the minority of gamers who have moved to the i7/i5 platform.
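To illustrate the "everyone is bottlenecked somewhere" point, here's a toy model (my own simplification, not a benchmark): per-frame time is roughly whichever of the CPU's and GPU's per-frame work takes longer, so a bottleneck is a matter of degree, not an on/off switch. All the millisecond numbers below are invented for illustration.

```python
# Toy model: frame time is roughly the slower of the CPU's and GPU's
# per-frame work, so the "bottleneck" is a matter of degree.
def fps(cpu_ms, gpu_ms):
    """Approximate FPS when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Made-up numbers for illustration only:
print(fps(cpu_ms=8.0, gpu_ms=20.0))  # 50.0  - GPU-bound: a faster CPU changes nothing
print(fps(cpu_ms=8.0, gpu_ms=10.0))  # 100.0 - mildly GPU-bound
print(fps(cpu_ms=8.0, gpu_ms=4.0))   # 125.0 - CPU-bound: a faster GPU changes nothing
```

A "real" bottleneck like the Celeron + GTX 280 example is just the extreme case where one side's time dwarfs the other's.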
 
I'm moving from an 8800GTX on an E8400 @ 4GHz to an ATI 5870. I don't see any reason to upgrade to an i7 unless a ton of titles suddenly support 4 cores / 8 hyperthreaded threads.

When the i7s came out, I remember seeing benchmarks of my E8400 beating the i7 at similar clocks in games because of their lack of support for all cores. So I will enjoy the massive performance boost, which I only really need for ARMA2 atm.

So I doubt your ol' Quad at 3.2 will hurt ya too much. You will see a fantastic boost, I imagine.
 
Just to put your mind even more at ease: I took delivery of my 5870 about 3 hours ago and have been playing Crysis Warhead for about the last hour. The difference between the 5870 and my 4850 (which has similar performance to the 8800GTX) is nothing short of staggering. I can actually max out the game at 19x12 along with 2x AA and still maintain playable frame rates. The only time I noticed a noticeable dip in frame rates is when there are lots of fires/explosions/smoke in the scene. I'm hopeful that with upcoming driver updates, perhaps as early as Cat. 9.10 next month, even those dips will be far less noticeable.

Oh, and this is on a Q6600 @ 3.2GHz. Basically, my experiences with Crysis Warhead mimic almost to a T the [H]ard review for this game, despite them running it on an i7 920 @ 3.6GHz.
 
this is the kind of feedback i'm looking for! keep your first impressions coming RamonGTP :)
 
THIS statement comes from a guy who used to sit on a pot of gold. Now I live month to month, pay my bills, and am going back to school, taking life as it comes thanks to the economic turmoil, but happiness counts the most to me and my family. Leaving that aside: I used to upgrade my system like there was no tomorrow. I have had a Q9550 for a while and haven't had the money to go i7, and I saved pennies over months to upgrade my graphics card. I probably shouldn't spend money on that because I am pretty broke, but to me, if I can't enjoy myself once in a while, then life ain't worth living.

My Q9550 runs at 4.1GHz when I game, and if someone is gonna tell me that I am gonna get bottlenecked with the HD 5870 (which I haven't gotten yet), then I will probably throw a bottle at their neck. i7 or Q9550, if you run them at crazy speeds, I don't think they will be a bottleneck for another two generations of graphics cards, until maybe the 7870 comes out.
 
Yeah, I think his friends just don't want him having a better video card than they do!
 
There is a Newegg review where the user states he thinks his i7 @ 3.65GHz is bottlenecking his 5870... good for a laugh.
 
Good 'ole Newegg. Where people think they're an experienced computer technician because they know where the power button is.
 
You will hear "yes," as adamantly as "no," on this topic. Take your pick.

What I see so far is that while neither the HD5850 nor the HD5870 is pointless on non-i5/i7 hardware, it may well be pointless on non-quad-core hardware, if for no other reason than CPU-bounding rearing its ugly mug. If a game is CPU-bound on your current setup (and you have a dual-core CPU), that is a pretty good indication that a quad-core CPU will take the boundaries away, especially if the CPU that is bound is already overclocked. While the number of multicore games and applications is still small, multitasking is on the rise, and that alone is making quad-core pretty much a requirement.

That is why I won't be pulling the trigger on HD5850 until after I swap in a Q9550.

More data - Newegg purchaser running his HD5870 with a C2D (replaced CF'd HD4850s) http://www.newegg.com/Product/ProductReview.aspx?Item=N82E16814161301

Never thought I'd see one (single GPU) card replace two single-GPU cards and have both frame rates and detail levels go up.
 
All games using DX9 or DX10 have only one render thread.

Games using DX11 may use tessellation, which means the CPU doesn't need to calculate as much data for the GPU as it did with previous DirectX versions.
 
It's the same as if you had a 4870X2 or GTX 295; imo an E8400 @ 3.8 or a Q9550 @ 3.6 will be enough.
 
The only things that get bottlenecked without an i7 are extreme multi-GPU setups like 285 tri-SLI or 4870X2 CrossFireX (x2). Then again, in the [H] review a Q9650 @ 3.6 = i7 @ 3.6 in Crysis FPS (with two 4870X2s in CrossFireX).

Also, there is no such thing, really, as bottlenecking or unbottlenecking, because every new CPU architecture automatically deems the older arch a bottleneck due to the performance difference between the two. Actual end-user experience will be indistinguishable most of the time, but in benchmarks it's "ZOMFG CORE i256 is 50% faster than Core i128, 128-core CPUs are bottlenecking GTA 6 and Crysis 3. ALSO 2 2kW power supplies are bottlenecking my GTX 595XXX ULTRA ZOMFG HEXA SLI RIG FUUUUUUUU (hexa -16)."
 
By "bottlenecking" does everyone mean just the max FPS? My only two concerns are min FPS and average FPS, and I would guess that any "bottlenecking" of those two is minimal; min FPS might not be affected at all.
 
Average and minimum.
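For what it's worth, both numbers fall straight out of per-frame render times; here's a quick sketch (the frame-time samples are made up):

```python
# Average and minimum FPS derived from per-frame render times (in ms).
frame_times_ms = [16.7, 16.7, 33.3, 16.7, 50.0, 16.7]  # made-up sample

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)  # the slowest frame sets the minimum

print(round(avg_fps, 1), round(min_fps, 1))  # 40.0 20.0
```

A CPU limit tends to cap the fast frames first, which is why max and average FPS usually move before the minimum does.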
 
Any Phenom II will run the 5870 fine. You will always have bottlenecks. Depending on the game and your resolution, either your CPU or GPU will be bottlenecked (the higher your resolution, the less of a bottleneck your CPU becomes).

The question is will all current games be perfectly playable? The answer is probably yes.

It's kind of a funny way to think of a CPU bottleneck but you can always get a bigger monitor (or more monitors) if your CPU is a bottleneck :p

In my experience money spent on GPUs is more game play rewarding than money spent on CPUs. In other words a video card that costs $200 makes more difference in gaming than a CPU-Mobo combo that costs $200 more.
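The resolution point can be sketched with a toy model: assume CPU work per frame is roughly resolution-independent while GPU work scales with pixel count, and the slower side sets the frame time. Both cost constants below are invented purely for illustration.

```python
CPU_MS = 9.0           # hypothetical per-frame CPU cost, same at every resolution
GPU_MS_PER_MPIX = 6.0  # hypothetical GPU cost per million pixels rendered

def fps_at(width, height):
    """Frame rate when the slower of CPU and GPU sets the frame time."""
    gpu_ms = GPU_MS_PER_MPIX * (width * height) / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)

for w, h in [(1280, 720), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}: {fps_at(w, h):.0f} FPS")
```

With these made-up constants, the CPU sets the frame rate at 720p, and past roughly 1.5 megapixels the GPU takes over; that crossover is the "higher resolution = less CPU bottleneck" effect.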
 
Then the wallet becomes the bottleneck :p

And that's the worst type to have.
 
After playing with my 5870 all weekend.. here is my story about CPU and bottlenecking..

I have a Q6600 overclocked to ~3.1-3.2GHz, but when I installed my 5870 I noticed Need for Speed Shift was not running as well as I thought it would. I had it set at 1920x1080, 2xAA, and everything up high. I decided to lower the graphics to 720p, and it performed the EXACT same. I checked my processor speed and it was showing as 2.4GHz. My overclock was not on! So I went into the BIOS and overclocked back to 3.15. With that overclock I gained what appears to be a good 30% increase in frames. My Need for Speed went from being unplayable (to me) at 1080p to being as smooth as silk. Every game I throw at it runs perfectly, and I think "perfectly" is not dipping below 50fps except for the ones like Crysis that make good use of motion blur. My 3DMark Vantage score went from ~5000 to 10000, but I really don't care much about Vantage tbh.

The point I am trying to make is your PC will MOST DEFINITELY see the difference. Anyone who says otherwise is an idiot. Tell your friends to come over to this board and start up this debate... they won't last long.
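A quick sanity check on the numbers in the post above (the ~30% figure is the poster's own estimate): restoring the overclock took the chip from 2.4GHz back to 3.15GHz, which is about a 31% clock increase, so a ~30% frame-rate gain is consistent with the game being almost entirely CPU-bound at the stock clock.

```python
# Clock gain from re-applying the overclock (2.4 GHz stock -> 3.15 GHz).
clock_gain = (3.15 - 2.4) / 2.4
print(round(clock_gain * 100))  # ~31 (percent)
```

When FPS scales nearly 1:1 with CPU clock like this, the GPU still has headroom to spare.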
 
Hrmm... maybe you didn't need a new GPU after all if the CPU was back to stock speed.
 
If all of the other games hadn't seen such a significant boost, I would have been likely to agree with you. In this case it was mainly Need for Speed Shift. In fact, it was running choppy and smooth at the same time; just bad stuttering. Crysis, Red Faction Guerrilla, Grand Theft Auto, and Fallout 3 were running fine @ 1080p with all the eye candy up. Ya have to understand, it wasn't until very recently my BIOS changed. No matter how much I overclocked, the 4850 I had was not doing it for me the way I like to run games.
 