How much of an FPS gain to expect going from an i5 2500K to a recent i7 on a 1080 Ti?

rolltide78
n00b | Joined: Jul 20, 2018 | Messages: 25
I'm contemplating a jump to a 1440p 144Hz gsync monitor and curious if my 2500K will be able to handle framerates over 90, for example. I'm wondering if I'll need to upgrade my CPU as well.

Thanks,
G
 
"curious if my 2500K will be able to handle framerates over 90, for example"

A 2500K can't even keep a minimum of 60 fps in some games. Heck, it can't even average 60 fps in some really CPU-demanding games, especially at stock clocks. It just really depends on the game and where in the game you're testing.

I would wait for the 9-series CPUs and get at least a 6-core/12-thread CPU and 16 GB of fast RAM.
 
These benchmarks are for a 1080, but the short answer is: no, a 2500K won't do well at framerates above 90, and you may see a performance deficit as high as 50% versus a modern i7, or even greater in some cases.

The long answer is that it heavily depends on what games you're playing. Benchmarks only provide limited information, and they are generally run on titles that are more GPU-bottlenecked than CPU-bottlenecked, so I'd say most benchmarks are being generous toward old CPUs.
 
Thanks. My main go-tos are Witcher 3, Kingdom Come Deliverance, and Skyrim, but I also play Doom, Wolfenstein, and stuff like Origins. I can get way up in framerate on Doom and Wolfenstein, but my monitor currently just discards anything over 85. I know Skyrim only runs best at 60 anyway.

My CPU is at 4.6 GHz currently and seems to do pretty well. I think I'll consider the upgrade after I can find a good but budget-friendly gsync 1440p 144Hz monitor.
 
Depends on the game and setup.

Running at 8K display resolution is a lot less stressful for the CPU than running at 640x480, because at high resolutions the GPU becomes the bottleneck long before the CPU does.
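That intuition can be sketched with toy numbers: CPU time per frame is roughly constant regardless of resolution, while GPU time scales with pixel count. The frame times below are hypothetical, purely to illustrate where the crossover happens:

```python
# Toy model of why resolution moves the bottleneck: CPU frame time is
# roughly constant, GPU frame time scales with pixel count.
# All numbers here are hypothetical illustrations, not measurements.
CPU_MS = 8.0           # imagined CPU time per frame (game logic, draw calls)
GPU_MS_AT_1080P = 5.0  # imagined GPU time per frame at 1920x1080

def fps_and_limit(width, height):
    """Estimate fps and which side sets the pace at a given resolution."""
    gpu_ms = GPU_MS_AT_1080P * (width * height) / (1920 * 1080)
    frame_ms = max(CPU_MS, gpu_ms)        # the slower side sets the pace
    return 1000 / frame_ms, ("CPU" if CPU_MS >= gpu_ms else "GPU")

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    fps, limit = fps_and_limit(w, h)
    print(f"{name}: ~{fps:.0f} fps, {limit} limited")
```

With these made-up numbers the game is CPU-limited at 1080p and GPU-limited at 1440p and above, which is why benchmarks run at low resolution expose CPU differences and benchmarks at high resolution hide them.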
 
1440p is my desire.

This is a broad generalization again, because different games have a different load balance between CPU and graphics.

But at 1440p we still see some huge CPU bottlenecks, even in modern games. Newer games tend to put more GPU load than CPU load into the same rendered frame, because graphics work is "easier to scale" than the tasks the CPU does.


Without overclocking, I believe your CPU would be strained to deliver 144 fps at that resolution.
Even with overclocking, I would expect your CPU to bottleneck that graphics card at that resolution in most games.


In short, I would not advise a 1080 Ti for that resolution with that CPU.
 
I'm running a 7700K and 1080 Ti with a 1440p 165Hz monitor. I feel the resolution and refresh are a good match for my system. With my settings cranked on most games I hover in the low 100s, but of course that varies by title.
 

Yeah, I'm overclocked to 4.6 GHz, so I am taking advantage of that.
 
I honestly don't see a 2600K over a 2500K giving much of a boost in most games, but I also don't play a lot of modern AAA games anymore.

I just don't see a lot of games taking advantage of the parallelism of a 2600K over a 2500K (Doom 2016 being an exception).
Of course, if you can get a 2600K free or near free, I wouldn't stop you.
 
Well, that is flat-out not true, as pretty much every modern game goes well over 50% CPU usage even on my 4770K at 4.3 GHz. In fact, I get all 8 threads pegged in several modern games, and turning off hyper-threading would be an absolute mess in some games such as Watch Dogs 2 and Mafia 3. Four older cores absolutely do not cut it in some games.
 
Wow so a 2600k is a big jump from a 2500k?
 
It would really depend on which game and where you're testing. There are parts of some games where a 2600K is well over 50% faster. In fact, I can tell you that Mafia 3 is not even very playable on a 2500K, as it will be fully pegged the whole time trying to push a 1080 Ti. Even at 1440p, Mafia 3 pushes my overclocked 4770K over 90% CPU usage. Parts of Watch Dogs 2 also completely peg my CPU and my frame rate drops into the 50s, so I can't even imagine having no hyper-threading there.
 
It's situational. Hyperthreading will make a significant difference in games that can make use of it.

Skyrim, for instance (like many open-world games), used to stutter and hitch when loading scenes on my old 7600K. It otherwise played smooth as butter, but you could tell when it was hammering the CPU.

I had a good opportunity and upgraded to a 7700k, and that stuttery hitch disappeared entirely. Nothing else in my system changed, I just upgraded from a non-hyperthreading processor to a hyperthreading one.
 

Crazy, I had no idea. These are pretty much what I play in rotation (video game ADD) at the moment: Witcher 3, Kingdom Come Deliverance, and Skyrim, but I also play Doom, Wolfenstein, and stuff like Origins for now. So far I haven't noticed anything negative with the 2500K, but then again I don't really know how to spot bottlenecks.
 
Well, I know that Witcher 3 will absolutely peg all four cores of a 2500K in certain parts of the game for sure. There are quite a lot of demonstrations of that on YouTube.
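If you want a rough way to check for a bottleneck yourself, compare per-core CPU load against GPU load while playing (Task Manager plus a GPU overlay such as Afterburner will show both). The rule of thumb can be sketched as a toy classifier; the sample numbers below are hypothetical:

```python
# Rough rule of thumb for spotting a CPU bottleneck in a running game.
# In practice you'd read per-core utilization from Task Manager or an
# overlay; the numbers used below are made-up samples from a 4-core CPU.

def diagnose(core_loads, gpu_load):
    """Classify where the limit probably is from utilization samples (%)."""
    if max(core_loads) > 95 and gpu_load < 90:
        return "cpu-bound"          # one core pegged while the GPU waits
    if gpu_load > 95:
        return "gpu-bound"          # GPU saturated, CPU keeping up
    return "unclear (vsync/frame cap?)"

print(diagnose([98, 60, 55, 50], gpu_load=70))  # -> cpu-bound
print(diagnose([70, 65, 60, 55], gpu_load=99))  # -> gpu-bound
```

The key sign of a CPU bottleneck on an old quad-core is one core pinned near 100% while the GPU sits well under full load; frame drops in towns or crowds with low GPU usage point the same way.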
 

Yeah, I'm sure. I have noticed my 60-80 fps dropping into the 40s at times. I guess ignorance is bliss :) Wonder if my existing board could handle a 2600K.
 
Just get gsync and it won’t matter. You’ll be amazed how little you’ll pay attention to FPS with gsync or freesync.

Your Sandy Bridge CPU, even if overclocked, shows its age more on minimum FPS than on maximum FPS.
 

Thanks. I've been looking for a deal on that Dell 27 gsync one so hopefully I'll get lucky.
 
Ok, I'm a moron. I have the i7 2600K, already overclocked to 4.6 GHz.


Well, that will make a big difference. A 2600K can pull far better minimums than a 2500K. I know... I own both in still-active systems. Gamers Nexus put the 2600K at 4.7 GHz in the range of first-gen Ryzen chips for games that don't leverage a lot of cores. That said, you will not be getting the most out of a 1080 Ti with that CPU if you're gaming at 1080p or lower. However, it will still give you a good experience and should run most games well over 60 fps. A G-Sync monitor will help with stutter if you drop below your refresh rate. I suggest 1440p or better. Check out this vid.

 
It really depends on the game. The higher the resolution, the less of a factor the CPU usually is. You'd be surprised how well old hardware can hang.

For example, watch how well the Xeon X5675 compares to the brand-new Coffee Lake 8700K.

Yes, there are a couple of games where the i7 8700K comes out ahead by 30%, but there are also many games where the FPS is virtually identical on the older X5675 system.
 
Copied and pasted from another thread I replied to... And yes, I ran the same GTX 1070 in the last system as this one! ;)

"I went from a 2700K @ ~4.6 to a 6700K @ ~4.6, and most games, especially big MP games like BF1, ARE running MUCH better, so I imagine even more so with an 8700K! As far as possible lower frames, this may shed some light on the issue: when I first got the 6700K I actually had insanely high CPU usage and lower FPS until I turned off all the power-saving junk in the BIOS, and BAM! My FPS and smoothness in gameplay went through the roof!"
 
Nice. You are tempting me now. Maybe I can find a used combo on the cheap sometime soon. I'm not having much luck finding my Dell 27 gsync monitor.
 
3 years ago I upgraded from an overclocked 2500K to an overclocked 6600K, simply to maintain 60 fps in Witcher 3 and to raise the framerate in Project Cars in VR.
It made a tremendous difference, not only to the minimum framerate but to the smoothness.
Half a year later I changed to an overclocked 6700K, and again that is smoother.

edit
Something's wrong with my timeline there; I wasn't playing Project Cars in VR 2.5 years ago.
My head's too fugged today to work it out :)
 
At this point in time, with games finally starting to use more threads, it's pretty safe to say a decently overclocked Nehalem i7 is better than a 2500K. Four threads is just a big limiting factor with a modern OS like Win10 and games that can use more than a couple of cores.
Moving to a new i7 of any kind will be a big leap forward on a high-end GPU like the 1080 Ti, and ESPECIALLY in scenarios where you need high FPS for a 144 Hz display.
 
Well, that is flat-out not true, as pretty much every modern game goes well over 50% CPU usage even on my 4770K at 4.3 GHz. In fact, I get all 8 threads pegged in several modern games, and turning off hyper-threading would be an absolute mess in some games such as Watch Dogs 2 and Mafia 3. Four older cores absolutely do not cut it in some games.

First of all, I believe you mean logical cores and not threads. CPUs do not have threads.
Threads are a division of software, not of your physical cores.

Second, just because you see utilization on all 8 logical cores averaged over time does not mean it all happened at the same time.
You have to realize that what you are measuring is an average over time.

This has been explained time and again on this forum.
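The averaging point can be illustrated with a toy simulation: assume a workload with only 2 busy threads that the scheduler migrates across 8 logical cores every tick. The time-averaged counters show moderate load on every core, even though only 2 cores were ever busy at any instant:

```python
# Toy simulation: 2 busy threads migrated across 8 logical cores.
# At every instant exactly 2 cores are 100% busy, yet the time-averaged
# per-core counters show moderate load everywhere.
import random

random.seed(0)
CORES, SAMPLES = 8, 1000
busy_ticks = [0] * CORES

for _ in range(SAMPLES):
    # the "scheduler" picks 2 cores to run the 2 busy threads this tick
    for core in random.sample(range(CORES), 2):
        busy_ticks[core] += 1

avg_util = [100 * t / SAMPLES for t in busy_ticks]
print([round(u) for u in avg_util])   # every core shows roughly 25%
# ...so "all 8 cores show load" does not prove 8 threads ran concurrently
```

This is why a utilization graph alone can't distinguish 8 concurrent threads from a few threads bouncing between cores; you'd need per-instant samples or a profiler to tell them apart.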


The only semi-valid point you have is that you can see more than 50% CPU usage. However, due to thread conflicts, going above that does not necessarily mean more performance. Thread conflicts from SMT can even increase the "load" on your CPU, because the CPU is now slower to execute the code and therefore has to reserve more time for the same amount of work in a thread.
But even if I granted you that point, we are still left without empirical data to support your claim; it's just anecdotal.

If you can provide some real evidence that pretty much every modern game benefits from more than 4 (logical) cores, I would be highly interested.
 