Upgrade CPU or GPU? ([email protected]/GTX 780)

Stiletto

Supreme [H]ardness
Joined
Jul 13, 2008
Messages
6,434
Been rocking this setup for many years now, but I'm feeling the weight of progress slowing me down. Which is the bigger bottleneck for 1080p/144Hz gaming on modern games? Full specs:
i5-2500k @ 4.0GHz
16GB Corsair Vengeance DDR3-1600
EVGA GTX 780 FTW

Is there a worthy CPU upgrade for socket 1155 that might show me some serious gains? Or should I consider a GTX 1070 for a boost?
 
It depends on the games, but I think you'd notice a graphics card upgrade more readily than a CPU upgrade.

In some games, you need more than four threads these days (see: Assassin's Creed Origins/Odyssey), but it's not universal. You could upgrade to a 3770K, which would eliminate that problem for you, but I don't think it would be dramatically faster on a per-core basis, and that's mostly what still matters.

So, if you can't upgrade both, I'd do some testing in the games you intend to play and see which part of your system is being maxed out, and which parts are idling, and then upgrade accordingly.
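One quick way to do that check is to log per-core CPU utilization and GPU utilization while playing (MSI Afterburner, Task Manager, etc.) and eyeball the numbers. A toy sketch of the logic, with made-up thresholds and sample readings:

```python
# Rough bottleneck heuristic: feed it utilization samples captured while gaming.
# Thresholds here are illustrative, not gospel.
def diagnose(cpu_core_loads, gpu_load):
    """cpu_core_loads: per-core utilization in %, gpu_load: GPU utilization in %."""
    max_core = max(cpu_core_loads)
    if max_core >= 95 and gpu_load < 90:
        return "CPU-bound"    # one core is pegged while the GPU sits waiting
    if gpu_load >= 95 and max_core < 90:
        return "GPU-bound"    # GPU is maxed while the CPU has headroom
    return "balanced / inconclusive"

# Made-up example: one core pegged at 97%, GPU idling at 60%
print(diagnose([97, 60, 55, 40], 60))  # CPU-bound
```

In practice, a single pegged core with the GPU loafing means a CPU upgrade pays off; a pinned GPU with CPU headroom means the card is the limit.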
 
I’ve still got my 2500K running at 4.5 GHz (4.6 until two weeks ago, when it pooped itself over 1866 RAM speeds). At 144Hz you would need to upgrade both, but it also depends on which games you’re referring to specifically. I was running really well at 60Hz in Destiny 2 with a 2080 Ti while I waited for more parts to come in, but if you want stability way up there, you need way more cores, period.

I think if you wanted to push your luck, you could get a 2600K, clock it as high as it will go at ~1.375V, and somehow get 2133 RAM running with stability. Then run that with a 1070 and I imagine you would survive for a time. BUT that’s not at highest settings. On a 1070 I was getting a smooth 60fps at 1080p, but at absolutely max settings, games like The Witcher 3 would fluctuate in the busiest locations on the map. That was on a laptop, but it wasn’t a Max-Q chip, and I had the GPU overclocked and undervolted and the CPU (a 6820HK) overclocked to 3.8 GHz all-core.
 
GPU, 100%.

I had a 2600K until recently and it was chugging along fine with a 1080 Ti. I probably wouldn't go beyond a 1080 Ti, so if you're looking at a 1070, it would be a solid upgrade.
 
CPU now, GPU soon*. I had a 2500K OCed to 4.7 GHz and upgraded it last year. Then I upgraded my GTX 970 a couple months later after the Super cards were released. I wouldn't do it any differently.

Your 2500K will bottleneck any new video card you pair it with. There's no point in spending $500 on a new video card when the CPU/RAM will drag it down to the speed of a $400 card.


* Actual time frame depends on how much you want to spend on a GPU, if you want to wait ~6 months for the expected refresh from AMD and Nvidia this year, etc. Another valid option would be to wait until you have enough money to replace everything at once.
 
I owned a GTX 780ti and it was a dog in modern games. The architecture and VRAM buffer just kills the experience. If you had a R9 290x then that would have been a different story. The late Kepler cards aged horribly for current games.

Upgrade your GPU. 2060 super would do perfectly at the very least.
 
Agreed. That or a used 1070 is a phenomenal value for 1080p.
 
GPU.

Even a cheap used RX 580 would be a nice upgrade over the 780 for under $150.
 
At lower resolutions and higher FPS, the CPU is usually the bottleneck already, and the 2500K is still just a quad core, so both of your parts need the upgrade. If you're willing to push clocks a bit, a 2600K or 2700K would get you a lot of mileage with twice the threads, and they can be had pretty cheap here or on eBay. Then grab that 1070, also on here, and you might be out only 250ish dollars if you're lucky.

If you can only pick one, in absolute terms it depends on what you want: the most FPS or eye candy. CPU for the first and GPU for the second, in most cases. However, you will probably notice FPS boosts even with just the GPU since it is sooooo old. My old GTX 770 wasn't pulling its weight for a long time before I replaced it, but I don't know how much better the 780 is by comparison.
 
At 1440p, GPU for sure. But a 2600K would be good as well. Maybe a 1070 Ti and 2600K if you can manage that. I was using that combo up until Dec last year.
 
Agree with the above: a used 1070 for about $200 would be your best bang for buck. I was running my 1070 with a 3770 before I upgraded, and it was doing fine at 1440p/100Hz.
A 2060 will get you more future-proofing, but costs in the $350-400 range depending on the brand/model.
If you want to drop your power and heat, you can pick up a 3770 non-K for about $55-60 and get the same or better performance as your OC'ed 2500K (assuming your motherboard has a BIOS update to support Ivy Bridge). ~ I have that 3770 for sale if you're interested.
 
Had a 2500K @ 4.5 until about a year ago and it did great with most games. The GPU is definitely the better option here, especially if you're getting a 1070 (at a great price I hope!). There also may be room left in that 2500K depending on the cooling/power delivery/mobo.
 
While my motto is always upgrade GPU first, you want to play at over 100fps, so I'll have to say both.

A faster card will be bottlenecked by your CPU, and your current GPU won't benefit as much from a faster CPU.
 
GPU makes sense, because it will give you good (>60) framerates in pretty much anything out there right now. I wouldn't upgrade the CPU unless you're also planning a significant GPU upgrade (better than a 1070), because you're not going to get those super-high framerates in modern games without a significant outlay of funds.
 
Depends on the kind of games you play. AAA games running lower-level APIs will let you get away with a weaker/older CPU. Indie games or games from smaller studios don't tend to use those APIs, or aren't as well optimized, and will often be more CPU-bottlenecked.
 
As others said, it depends on what type of game you’re playing. For example, if you’re playing anything based on Frostbite (Battlefield V, etc.), then both the CPU and the GPU in your setup are going to be slow. Just upgrading the GPU will not help you in this case, as that engine is designed for six-core processors. I discovered this with my i5 4670K before I upgraded to my Ryzen 3700X. Did a GPU upgrade first and was still only pulling around 60 FPS in Battlefield V, and after checking my monitoring, I saw the GPU was only under something like 60% load.

This is why it’s always important to see how hardware performs in the games you actually play, not just in, say, 3DMark.
 