Terminal GPU upgrade for Sandy Bridge?

Furystrike

My overclocked 2500K is paired with a 1440p 60Hz display driven by a GTX 970. Thinking about giving this system one last GPU upgrade. I'm drawn to the GTX 1660 Ti or RTX 2060. Cost really isn't an issue as long as there is a performance benefit, but I don't want to waste money on more GPU than this system can use.

Worth it, or too CPU-bottlenecked? This rig will become a Plex server in 6 months or so and will be using GPU transcoding, so low power utilization is also a consideration.
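For when it does become a Plex box, here's roughly how I plan to sanity-check power draw and GPU load while a transcode is running: just polling nvidia-smi from Python. The field list, single-GPU assumption, and one-second interval are my own choices for illustration, nothing Plex-specific.

Code:
import subprocess
import time

# Fields to pull from nvidia-smi: power.draw is in watts, the two
# utilization fields are percentages. Illustrative choices only.
QUERY = "power.draw,utilization.gpu,utilization.memory"

def sample(gpu_index=0):
    """Return (power_w, gpu_util_pct, mem_util_pct) for one GPU."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    power, gpu_util, mem_util = (v.strip() for v in out.split(","))
    return float(power), int(gpu_util), int(mem_util)

if __name__ == "__main__":
    # Kick off a transcode, then watch the numbers. Ctrl+C to stop.
    while True:
        power, gpu, mem = sample()
        print(f"{power:6.1f} W   GPU {gpu:3d}%   MEM {mem:3d}%")
        time.sleep(1)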
 
The CPU can be a bottleneck, but I think a GPU upgrade would still have some benefit.

The 1660 Ti is mostly a card for 1080p ultra settings, but it can do 1440p (I even got some games to run at 4K with tweaked settings).

You'd be looking at a 30-50% uplift in performance coming from a GTX 970, assuming the rest of your system doesn't hold you back.
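To put rough numbers on that (purely illustrative, not from a benchmark): a scene that averages 45 fps on the 970 would land somewhere between 45 × 1.3 ≈ 59 fps and 45 × 1.5 ≈ 68 fps after that kind of uplift, which at 1440p/60 is roughly the difference between dipping under 60 and holding it.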

If cost is not a problem, I'd probably look at the RTX 2060. The 1660 Ti is also decent but will be just barely cutting it at 1440p.
 
I had a 4c/4t Ivy Bridge i5 with a 970 (for a long time) and picked up my minimum frames with an i7 drop-in at the same clocks, for a net of maybe 50 bucks after selling the old chip. Run your RAM above 1600 if you can.
 
Ran an 8370 (yes, it was a dog) and an EVGA 970 for a while, and it did alright. I did start to see the 4 GB hold it back, though. With a 2500K CPU I would think it would do a lot better than my old setup.
 
Had a 2600K@4.5 with a GTX 1080 at 1440p. Going to a 1080 Ti brought no gains in some games vs. the non-Ti, so somewhere around that performance level is where I would draw the line for the Sandy i7. For the i5 I'm not sure, since some games seem to demand more threads these days.
 
Either of those GPUs would do fine at 1440p/60 with that processor. If your goal were high refresh rates, that would change things.
 
Thanks for the advice. I went with the 2060. GPU and CPU utilization both range between 90 and 100% in my preliminary testing, so they seem pretty well balanced. I might try to get a bit more out of this CPU; 4.7 or 4.8 GHz doesn't seem impossible.

If I can ride out this system that I built in 2011 until the next big CPU and GPU refreshes, I'll be exceptionally happy.
 
Kinda similar situation to the OP: I'll be running a 1080 into the sunset with this 2600K system, which is quite close to your overall performance balance.

Sandy Bridge forever!

UPDATE 04-24: I lied, I'm fucking weak haha, dropped in a 2070 but still on Sandy
 
Great choice. I think you'll be happy.

Also be sure to overclock your system RAM speed if you can. That can make a difference too.
 
Chucked a Vega 64 in mine, and at 4.4 I'm still CPU-limited in some games. But it's noticeably smoother than the 290X at 1440/60, and the extra VRAM helps at the highest settings. I won't put a bigger GPU in this rig in the future unless it goes 4K. Will be doing a Zen 2 build this year if it has the clocks I've been expecting (minimum 4.5), as the IPC jump is nice.
Have the 290X in a 2600X mITX workstation currently and it's yet to be tested. It'll be good to see if it has the same CPU limitations or not.
 
Either of those GPUs would do fine at 1440p/60 with that processor. If your goal were high refresh rates, that would change things.
A 2500K stands no chance of keeping 60 fps in some newer games. Even my 4770K at 4.3 has all 8 threads pegged and drops below 60 in several games, and even below 50 in a couple. Most modern games will have that 2500K pegged nearly the whole time, so even if the fps numbers are fine there will be some hitching and stuttering in quite a few games.
 
A 2500K stands no chance of keeping 60 fps in some newer games. Even my 4770K at 4.3 has all 8 threads pegged and drops below 60 in several games, and even below 50 in a couple. Most modern games will have that 2500K pegged nearly the whole time, so even if the fps numbers are fine there will be some hitching and stuttering in quite a few games.
What games? I've had no problems. I also don't play unoptimized garbage like Anthem, so maybe that's the issue?
 
What games? I've had no problems. I also don't play unoptimized garbage like Anthem, so maybe that's the issue?
I can see I'd be wasting my time listing any games, since they'll all be considered "unoptimized".
 
See the benchmarks here: https://www.gamersnexus.net/guides/2773-intel-i5-2500k-revisit-benchmark-for-2017/page-3

Definitely an older chip and struggling on the lows, but should still be playable.
In most games it can still hold 60 fps, but in many newer games it can't. Again, though, the issue in many newer games is some stuttering, and even panning around is not always smooth since the CPU is pegged anytime it's not GPU-limited. If someone is playing without vsync, or using G-Sync, they will probably not notice anything as long as they stay GPU-limited.
 
I just watched a YouTube vid (I forget the dude and will try to post the link later) where he compared an FX-8350 vs. an overclocked Ryzen (about as fast as Zen will go).
I was shocked how well the FX did, especially at 4K where the GPU is the big bottleneck. Now, the FX can run 8 threads, which may help it over the long run vs. an i5, but the IPC of Intel chips from that era was so far superior that I would think the i5 will perform best in all but the most heavily threaded games. Anyway, it was definitely an eye-opener to me how well older architectures can perform when given a good GPU at higher resolutions.
He was using a Radeon VII with 6 very modern games for testing, btw.
 
I just watched a YouTube vid (I forget the dude and will try to post the link later) where he compared an FX-8350 vs. an overclocked Ryzen (about as fast as Zen will go).
I was shocked how well the FX did, especially at 4K where the GPU is the big bottleneck. Now, the FX can run 8 threads, which may help it over the long run vs. an i5, but the IPC of Intel chips from that era was so far superior that I would think the i5 will perform best in all but the most heavily threaded games. Anyway, it was definitely an eye-opener to me how well older architectures can perform when given a good GPU at higher resolutions.
He was using a Radeon VII with 6 very modern games for testing, btw.
That old POS FX-8350 can't even keep 60 fps in probably nearly half of the newer games. Making yourself GPU-limited at 4K and getting 40 fps is not the best answer for most people looking for better overall performance.
 
That old POS FX-8350 can't even keep 60 fps in probably nearly half of the newer games. Making yourself GPU-limited at 4K and getting 40 fps is not the best answer for most people looking for better overall performance.
I understand that, but at higher resolutions, like the OP at 1440p, the older CPUs are surprisingly not as big a bottleneck as one would expect. "That old POS" held its own and in most of the tests was very close to the overclocked Ryzen, in the sense that it was very playable. Something that I, and you apparently, had deep biases against.

Here is the link.
 
I understand that, but at higher resolutions, like the OP at 1440p, the older CPUs are surprisingly not as big a bottleneck as one would expect. "That old POS" held its own and in most of the tests was very close to the overclocked Ryzen, in the sense that it was very playable. Something that I, and you apparently, had deep biases against.

Here is the link.
Yes, I keep up with that YouTube channel.

I am not talking about CPU bottlenecks just for the sake of pointing out that the CPU limits the GPU. I am also talking about playable bottlenecks, which for me means sub-60 fps and/or some stuttering. Raising the resolution to the point where the GPU is the cause of crappy framerates is not the solution for people like me. The FX-8350 will perform worse overall than even a stock 2500K, but the 4 cores of the 2500K will certainly be fully pegged most of the time, causing some hitches and stuttering in some games even if it can hit 60 fps.
 
Yes, I keep up with that YouTube channel.

I am not talking about CPU bottlenecks just for the sake of pointing out that the CPU limits the GPU. I am also talking about playable bottlenecks, which for me means sub-60 fps and/or some stuttering. Raising the resolution to the point where the GPU is the cause of crappy framerates is not the solution for people like me. The FX-8350 will perform worse overall than the OCed 2500K, but the 4 cores of the 2500K will certainly be fully pegged most of the time, causing some hitches and stuttering in some games even if it can hit 60 fps.
Yes for people like you it is time to move on. No time to waste on that upgrade. I would suggest starting right now.
 
I understand that, but at higher resolutions, like the OP at 1440p, the older CPUs are surprisingly not as big a bottleneck as one would expect. "That old POS" held its own and in most of the tests was very close to the overclocked Ryzen, in the sense that it was very playable. Something that I, and you apparently, had deep biases against.

Here is the link.

Wow! I thought FX would be much worse. It's still surprisingly in the game.
 
I run my 2080 Ti on an OC'd 3930K; at 4K gaming the CPU isn't a bottleneck.
 
I am seeing some bottleneck here with a stock 1070 Ti at 1080p; I think it's pretty much the top card for my Z68/2600K build. Coming from a 970, then a 1060 6GB OC, the 1070 Ti is a solid improvement. At 1440p or 4K you would probably still benefit in most games, even with a 2080 Ti, just not as much as on an 8th/9th gen platform.
 
The biggest concern is the lows. The older CPUs can deal with the lightest parts of recent games without a problem, but when the action gets heavy you'll have to deal with the chop and more charitable definitions of "playable".

Even with just a pair of GTX 970s, my last upgrade from a 4.8 GHz 2500K to a same-speed 6700K was like night and day in terms of actual playability.
 