Any reason to upgrade a 2500k for 1440p?

ramborage (n00b, joined Oct 3, 2017, 17 messages)
Gaming on a 144Hz G-Sync monitor with a GTX 1070 and a 2500K overclocked to 4.8GHz.

Only reason I ask is that all the processor benchmarks I see are at 1080p.
 
That's a heck of an overclock :)

Some videos here.
https://www.youtube.com/results?search_query=i5+2500k+gtx+1070+1440p

The i5 2500K will get you by, it's a hell of a chip. I'd say at anything over 1080p, start looking for a replacement. You will probably be bottlenecked some in your minimum frames.

A good alternative would be a Ryzen 1700. Microcenter has good combo deals on those.

My dad is using my old i5 2500K, but he only games at 1050p (1680x1050) with a GTX 970.
 
Gaming on a 144Hz G-Sync monitor with a GTX 1070 and a 2500K overclocked to 4.8GHz.

Only reason I ask is that all the processor benchmarks I see are at 1080p.
Early on in PUBG, I went from a 2500K at maybe 4.2GHz (I can't remember) to an i7 7700K with a GTX 1070 and saw a 30 fps improvement. I can't remember if that was on my old 1050p monitor or my new 1440p one; I think it was the new one.

If you're at 1440p or higher then without a doubt it's holding you back. You're likely not getting close to 100 fps unless you're playing CS:GO or something with similarly light hardware requirements.
 
I sure wish there was a Microcenter near me, if for no other reason than to have a tech place to go do some drooling. I even miss CompUSA sometimes, from when I lived in S. Florida. Now, the little city I live in has absolutely nothing. Even Staples and Radio Shack are gone.
 
Upgrade to what? Scaling the resolution up actually diminishes CPU impact.
 
The resolution isn't the issue; it's whether your minimum framerates are high enough.
If gaming at up to 60fps you could get by, but at 144fps you will see a big improvement.
Any time your GPU isn't maxed out in an AAA title, your CPU is most likely the problem. (Slow RAM can also kill min fps.)
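To put numbers on the 60fps vs 144fps point: the per-frame time budget shrinks from roughly 16.7ms to under 7ms, so a CPU that comfortably feeds a frame at 60fps can easily miss the deadline at 144fps. A quick illustrative sketch (plain Python, not tied to any tool):

```python
# Per-frame time budget at a given target framerate. At 144fps the CPU
# has less than half the time per frame that it has at 60fps, which is
# why CPU-bound dips show up much sooner on a 144Hz monitor.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to prepare and render one frame."""
    return 1000.0 / target_fps

for fps in (60, 100, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```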
 
Any reason to upgrade a 2500k for 1440p?

- Battlefield 1

game is practically unplayable with only 4 cores
 
Any reason to upgrade a 2500k for 1440p?

- Battlefield 1

game is practically unplayable with only 4 cores


I don't get it. Looks playable to me? I get 100% CPU utilization in Witcher 3 a lot, but the GPU stays at 97-100% with excellent frames: 60+ at max details.

Only thing I can think of is some slight stutters maybe?
 
Back to my original post: every benchmark I've seen so far tests at 1080p. Are 1440p and higher not so CPU dependent now, such that the GPU is the only component that makes a difference at higher res?
 
Back to my original post: every benchmark I've seen so far tests at 1080p. Are 1440p and higher not so CPU dependent now, such that the GPU is the only component that makes a difference at higher res?
That depends on the GPU, game and game settings.

What a slow CPU or memory can do is reduce the minimum framerate so the GPU doesn't get used fully, regardless of GPU and resolution. It has the same effect on low framerates at 1080p as at 1440p or 4K. Even if you have a GPU capable of hitting the desired framerate, it can be gimped by the CPU.

If your GPU is maxing out all the time, then a faster CPU won't help much (there are occasions when hitches may be reduced, but the framerate can't improve).
If you are not hitting your target framerate all the time and your GPU is not being fully utilised at those times, then it's likely a faster CPU, or CPU + memory, will sort it.
The only caveat is if the game engine itself has a bottleneck.

When looking at CPU benchmarks at 1080p, look at the minimum framerates, because if they are not high enough at 1080p, they won't be fast enough at higher resolutions either.
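The diagnostic rule above (slow frames + idle GPU = CPU or memory limit; slow frames + pegged GPU = GPU limit) can be sketched as a tiny classifier. The sample numbers and the 95% utilisation cutoff are made up for illustration; in practice you'd read fps and GPU load from an overlay like MSI Afterburner:

```python
# Rough bottleneck heuristic: during frames below target, low GPU
# utilisation points at the CPU (or memory); high GPU utilisation
# means the GPU itself is the limit. Thresholds here are assumptions.
def classify(fps: float, gpu_util_pct: float, target_fps: float = 100.0) -> str:
    if fps >= target_fps:
        return "hitting target"
    if gpu_util_pct >= 95.0:
        return "GPU-bound"
    return "likely CPU/memory-bound"

# Hypothetical (fps, GPU % during that stretch) samples from a session.
samples = [(140, 98), (72, 60), (55, 99)]
for fps, util in samples:
    print(f"{fps} fps at {util}% GPU -> {classify(fps, util)}")
```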
 
Back to my original post: every benchmark I've seen so far tests at 1080p. Are 1440p and higher not so CPU dependent now, such that the GPU is the only component that makes a difference at higher res?
It's not that simple. Realize that IPC in current-gen Intel processors is 30-40% better than Sandy Bridge, and certain types of mathematical and bit operations are much faster than that. Look at post #3 in this thread. Yes, it's true that, generally speaking, CPU dependency decreases as resolution increases. But CPU utilization never tells the whole tale.

To answer your original question, I think it would be worthwhile to upgrade to a Skylake generation CPU (Skylake, Kaby Lake, Coffee Lake).
 
Worth upgrading, as you can get 6-core i5s these days. A few games already take advantage of that, and more will in the future. Ryzen should have similar IPC as well, and you can get 6C/12T for a bit less than Intel's offering. You will also be able to upgrade to Zen 2 in 2019 thanks to the socket. If IPC is very important, though, I'd probably consider the 6C/6T i5.
 
I went from a 2500K to a 4770K. GPU usage shot up to 99% and my min frames increased. That is your minimum upgrade path, along with 16 GB of RAM and an SSD. I am holding out for the next-gen CPUs to upgrade the whole system.
 
I went from a 2500K @ 4.6GHz to a 6700K @ 4.8GHz. It helped average FPS, but the min FPS is what I noticed the most. In Witcher 3, the 2500K would dip to the mid-30s for 1-2 seconds in new areas; the 6700K never dropped below 70 fps.
 
I don't get it. Looks playable to me? I get 100% CPU utilization in Witcher 3 a lot, but the GPU stays at 97-100% with excellent frames: 60+ at max details.

Only thing I can think of is some slight stutters maybe?

Finally made the switch to a Ryzen 2600X, best $200 ever spent.

The difference is very evident, I think: 55% CPU usage tops, min fps of 80, less input lag, better frame pacing, etc.

 