Kaby Lake 7700K vs Sandy Bridge 2600K IPC Review @ [H]

Seems like Kaby Lake really only makes the case for an upgrade if you are still on a 1st Gen i7.

The IPC uplift would probably jump to 30% or more in that case.

Wish it were possible to get up to 4.5GHz on this old beast. Using around 1.4V just to get stable at 4GHz lol.
 
I feel weird sitting pretty with my 6 year old cpu.

I am interested in the real game benchmarking though, I know a GPU upgrade would be most worth it in my system, but I'm also flirting with going X99.
 
This is super interesting and really informative. I'm looking at upgrading my [email protected] to a Kaby Lake in August or so. I'd have thought there'd be more of an increase in performance from 6+ years of CPU progress, but without AMD pushing Intel, I guess not.

I'd imagine that advances in higher bandwidth DDR4 might boost that some. Maybe Ryzen will be the push Intel needs to actually advance things a bit
 
Looking forward to the gaming benchmarks. That was a great but depressing read lol. I won't be moving off the 5820k for 5 or 6 more years beyond its death. I do want Ryzen for an HTPC upgrade from a tricore turd. Tempted by a good i3 or decent i5, but want to wait and see what pricing and real performance on Ryzen is.
 
:D
ahh guys, it's all relative

if you happen to have a Sandy Bridge that is a stellar overclocker, then I'm not surprised you're not jumping ship


but if you happen to have a dud :ROFLMAO:


well I'm pushing my Kaby Lake to 5GHz Prime95-stable on air
5.2GHz in every other burn-in tool/test I can find
though it does get toasty on air, even after delid

an AiO is arriving soon

running into a wall at 5.3GHz (temps jump massively :ninja: )
gonna have a look at how voltages get applied (load line and some automatic voltage stuff) once I'm running water instead of air
 
Kyle, thanks for the review.

Also, thanks for confirming I haven't gotten too old. This Ivy Bridge setup is the longest I've ever held onto a 'base' system (mobo/cpu/ram), and I was starting to think I might be maturing or something, not having to upgrade so often.

Thanks for confirming I've not upgraded, because it hasn't been worth it to upgrade!
 
The 5820k and X99 still seem like they were an excellent buy at launch. Not much more expensive than the 4790k and Z97, with 50% more cores, more PCIe lanes, and M.2 slots.
 
Yea, I'm really eyeing one of those Samsung m.2 drives.

Just hard to pull the trigger when mine isn't really that bad right now. My son built a newer machine (6700K) with a standard SSD. It's faster than mine, noticeably. But I wouldn't say it's $800 noticeable...

Even with a better HDD, I don't know if I could justify the cost.
Chances are you won't see any difference between standard SSDs and newer NVMe drives.
 
[QUOTE="WhoBeDaPlaya, post: 1042755975, member: 56245"]
Easy - not as well for two reasons.
1) Doesn't clock as high as Sandy
2) ~10% lower IPC

1 + 2 together kind of kills Nehalem / Westmere. Now if we're talking about them compared to a non-HT Sandy, then it's somewhat of a tossup.[/QUOTE]

But from a gaming perspective, how well does it hold up? What's the maximum GPU it can keep up with?

These are the questions people want answered before they upgrade, but few sites bother to benchmark. If an i7 920 performs about the same as a stock 2500k, doesn't that imply that even the i7 920 isn't significantly CPU-limited in gaming situations?

Those are the questions I would like answered.
 
So, what you are saying is that my Haswell i7-4790K will last me for quite a while, since it is two generations later than the Sandy Bridge architecture. OK, fair enough. It should be more than adequate for my gaming needs for a few years, not to mention virtualization and video editing. Heck, my two Sandy Bridge i5-2400s are more than powerful enough for testing Windows 10 slow and fast rings, plus some beta testing. I do have one quibble with the article.

For these gaming benchmarks we are using a NVIDIA TITAN video card. We are far from GPU limited.

While it may be an inappropriate question for [H]ardOCP, I'm still going to ask it... what is the graphics performance with the built-in GPU on the processor? My expectation is that it isn't going to be as good as an AMD or NVIDIA graphics card, but what kind of FPS are we going to get at the various resolutions?
 
This is a great test - thanks very much for putting in the time to do it!

I've found myself curious on many occasions regarding the newer generations of Intel CPU. Now I'm happy to stick with my i7-2600k I've had in my rig for what seems an eon. :)

Very curious in particular to see the gaming benchmarks, as you say, I expect that delta to narrow significantly.
 
So, are we complaining that our older cpus are still up to scratch? I'm quite happy - more money can be spent on gpus, instead of having to get a new cpu and a gpu every couple of years for gaming. Not that any games really push the cpu hard anyway - there really is no need for faster cpus for gaming.
 
So, are we complaining that our older cpus are still up to scratch? I'm quite happy - more money can be spent on gpus, instead of having to get a new cpu and a gpu every couple of years for gaming. Not that any games really push the cpu hard anyway - there really is no need for faster cpus for gaming.
You must not be playing new games or be sensitive to framerate.
My 2500K causes the framerate to drop below 60 FPS in nearly every game that I bought in 2016.
This stagnation is killing game performance because an extra 30% from Kaby Lake is not enough to keep new games above 60 FPS minimums as they get more demanding.
I can only hope that more games start to take better advantage of multi-threading because quad-cores just aren't enough for demanding games now.
I'll either be upgrading to an 8-core Ryzen or waiting for Skylake-X once we have details on final Ryzen hardware. (Skylake-X should be a known quantity - Skylake IPC with more cores)
 
So, unless a CPU design singularity is achieved, it would seem that we won't get significant performance increases until Black Phosphorus becomes actually usable (and even then it will be a one-time-only speed bump due to the massive increase in frequency).

Well, hopefully AMD can show us something neat in the coming weeks, because we sure as heck could use the competition in these stagnant times.
 
You must not be playing new games or be sensitive to framerate.
My 2500K causes the framerate to drop below 60 FPS in nearly every game that I bought in 2016.
This stagnation is killing game performance because an extra 30% from Kaby Lake is not enough to keep new games above 60 FPS minimums as they get more demanding.
I can only hope that more games start to take better advantage of multi-threading because quad-cores just aren't enough for demanding games now.
I'll either be upgrading to an 8-core Ryzen or waiting for Skylake-X once we have details on final Ryzen hardware. (Skylake-X should be a known quantity - Skylake IPC with more cores)
I had a 3570k and didn't notice cpu usage reaching 90%.
Now I have a 5820k - it has a similar IPC. Don't know what you are playing, but BF1 doesn't hit a CPU bottleneck for me, just a GPU bottleneck. Rise of the Tomb Raider also hit a GPU bottleneck for me. I am quite sensitive to fps; I actually replaced 660 Ti SLI with a 980 Ti because the 660 Ti SLI could barely keep 60fps in Fallout 4 - but the 980 Ti didn't help in some cases as the game is really badly made (sub-30fps in some places and low CPU and GPU usage). Anyway, as I said - I don't know what kind of game hits a serious CPU bottleneck before it hits a GPU bottleneck with a 2500k and a decent GPU. Then again, the 2500k doesn't have HT and that's what may be keeping it down in some titles, as most new games do quite well with more cores.
Also, your 2500k is quite old at this point and I don't know why it's bad that it held up quite well for so long. Were it an i7 with HT you'd probably be happy with it for a bit longer. If a 7700k were to be enough again for 4-5 years - that's pretty good. But I have a feeling that games will start using more and more cores and 4 physical cores may not be enough in the future.
I'm still happy that a cpu is usable for longer and is a good investment (unlike a gpu).
 
My 2500K causes the framerate to drop below 60 FPS in nearly every game that I bought in 2016.
This stagnation is killing game performance because an extra 30% from Kaby Lake is not enough to keep new games above 60 FPS minimums as they get more demanding.

I mentioned this above, but it bears repeating: hyperthreading helps more than it did back when we chose between the 2500k and 2600k. The point is that while the jump from 2600k to 7700k (etc.) is showing ~30%, it'll be bigger for the 2500k/3570k/etc.
 
I mentioned this above, but it bears repeating: hyperthreading helps more than it did back when we chose between the 2500k and 2600k. The point is that while the jump from 2600k to 7700k (etc.) is showing ~30%, it'll be bigger for the 2500k/3570k/etc.
Apples to apples, ie. non-HT -> non-HT / HT -> HT.
So you should see a similar ~25% increase in perf going from a 2500K -> 6600/7600K.
 
I had a 3570k and didn't notice cpu usage reaching 90%.
Now I have a 5820k - it has a similar IPC. Don't know what you are playing, but bf1 doesn't hit a cpu bottleneck, just a gpu bottleneck for me. Rise of the Tomb Raider also hit a gpu bottleneck for me. I am quite sensitive to fps
CPU usage doesn't have to reach 100% to bottleneck a game.
You need to be looking at per-core usage, and even then some games will bottleneck when it is below 100%. (possibly a sign that you don't have enough memory bandwidth/speed)
It's easy to tell when your CPU is holding you back because performance will be low but GPU usage will drop below 100%.
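To make the per-core point concrete, here's a minimal sketch (my own illustration, assuming Python with the psutil package installed; an Afterburner/RTSS OSD does the same job in-game) that samples each core and compares the busiest core against the average:

[CODE]
# Rough illustration: why "overall CPU usage" hides a bottleneck.
# A game that pegs one thread shows up in the busiest core, not the average.
import psutil

def sample_cores(samples=10, interval=1.0):
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        avg = sum(per_core) / len(per_core)
        busiest = max(per_core)
        print(f"avg {avg:5.1f}%  busiest core {busiest:5.1f}%  all cores {per_core}")
        # If the busiest core sits near 100% while the average looks modest
        # (and GPU usage in your OSD is below ~99%), the game's main thread
        # is the limit even though Task Manager never shows 100% overall.

if __name__ == "__main__":
    sample_cores()
[/CODE]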

i actually replaced 660ti sli with a 980ti because the 660ti sli could barely keep 60fps in fallout 4 - but 980ti didn't help in some cases as the game is really badly made (sub 30fps in some places and low cpu and gpu usage)
That's a telltale sign of a CPU bottleneck.
Low GPU usage indicates a CPU bottleneck.
The low CPU usage at the same time typically indicates that it's not highly multi-threaded, and your CPU's per-core performance isn't high enough.
Fallout 4 also really likes memory bandwidth. DDR3 or lower speed DDR4 doesn't cut it.

Anyway, as I said - don't know what kind of game hits a serious cpu bottleneck before it hits a gpu bottleneck with a 2500k and a decent gpu. Then again, 2500k doesn't have HT and that's what may be keeping it down in some titles as most new games do quite well with more cores.
Dark Souls 3, XCOM2, Civilization VI, Dishonored 2, ABZU, Deus Ex: Mankind Divided, HITMAN, Batman: The Telltale Series, Watch Dogs 2, No Man's Sky are all the 2016 games I played which were CPU-limited on my 2500K @ 4.5GHz, causing the games to drop below 60 FPS and stutter.
DOOM was CPU-limited too, but that meant it dropped from >160 FPS to <80 FPS.
The minimum remained above 60 FPS, but those framerate drops still hurt the experience.
I've seen a lot of people complaining about their CPU bottlenecking them in Battlefield this year, though it's not a game that I played.

Also, your 2500k is quite old at this point and I don't know why it's bad that it held up quite well for so long. Were it an i7 with HT you'd probably be happy with it for a bit longer. If a 7700k were to be enough again for 4-5 years - that's pretty good. But I have a feeling that games will start using more and more cores and 4 physical cores may not be enough in the future.
I'm still happy that a cpu is usable for longer and is a good investment (unlike a gpu).
The fact that it's been six years and upgrading is only going to bring ~30% performance is not a good thing.
Lack of progress is the only reason that old CPUs are still viable today.
I would much rather have had to upgrade my CPU because things were so much faster today that it couldn't compete any more.

I mentioned this above, but it bears repeating: hyperthreading helps more than it did back when we chose between the 2500k and 2600k. The point is that while the jump from 2600k to 7700k (etc.) is showing ~30%, it'll be bigger for the 2500k/3570k/etc.
Yes, that's true.
In the games which can scale beyond four threads, like Gears of War 4, an i7-2600K can outperform an i5-7600K.
That's a very small number of games right now, but hopefully that's going to change - especially if Ryzen is priced correctly to make 8-core CPUs mainstream.
 
Apples to apples, ie. non-HT -> non-HT / HT -> HT.
So you should see a similar ~25% increase in perf going from a 2500K -> 6600/7600K.

Not to disagree (you are right), but the further implication is that while the 2600k felt like an unreasonable expense over the 2500k for the performance at the time for some of us, that's no longer true: these days, I don't even bother recommending the i5s, nor would I use one myself, for gaming workloads.

We've crossed the point where that extra US$100 for HT actually makes sense so long as it doesn't bust the budget outright.
 
Yes, that's true.
In the games which can scale beyond four threads, like Gears of War 4, an i7-2600K can outperform an i5-7600K.
That's a very small number of games right now, but hopefully that's going to change - especially if Ryzen is priced correctly to make 8-core CPUs mainstream.

Games don't even have to use it for it to matter. Unless you're using a machine that has been loaded specifically for benchmarking, HT can keep things running smoothly by handling low-intensity background tasks that would otherwise have caused a fully loaded core to hop threads. We have so much stuff running in the background these days that there's a clear benefit to the overall experience.
 
Not to sound like an AMD fanboi, but I hope Ryzen delivers a decent pimp slap to Intel so that
we can at least get 6c/12t and 6c/6t as the mainstream top-end parts.

We've been dicking around the ~4GHz mark since f*cking Core2 (not counting suicide P4 runs),
and dicking around 4c since ... Core2.

Pretty soon (maybe even now?), we'll all need Viagra just to get [ H ]ard about computing :|
 
CPU usage doesn't have to reach 100% to bottleneck a game.
You need to be looking at per-core usage, and even then some games will bottleneck when it is below 100%. (possibly a sign that you don't have enough memory bandwidth/speed)
It's easy to tell when your CPU is holding you back because performance will be low but GPU usage will drop below 100%.

That's a telltale sign of a CPU bottleneck.
Low GPU usage indicates a CPU bottleneck.
The low CPU usage at the same time typically indicates that it's not highly multi-threaded, and your CPU's per-core performance isn't high enough.
Fallout 4 also really likes memory bandwidth. DDR3 or lower speed DDR4 doesn't cut it.

Dark Souls 3, XCOM2, Civilization VI, Dishonored 2, ABZU, Deus Ex: Mankind Divided, HITMAN, Batman: The Telltale Series, Watch Dogs 2, No Man's Sky are all the 2016 games I played which were CPU-limited on my 2500K @ 4.5GHz, causing the games to drop below 60 FPS and stutter.
DOOM was CPU-limited too, but that meant it dropped from >160 FPS to <80 FPS.
The minimum remained above 60 FPS, but those framerate drops still hurt the experience.
I've seen a lot of people complaining about their CPU bottlenecking them in Battlefield this year, though it's not a game that I played.

The fact that it's been six years and upgrading is only going to bring ~30% performance is not a good thing.
Lack of progress is the only reason that old CPUs are still viable today.
I would much rather have had to upgrade my CPU because things were so much faster today that it couldn't compete any more.

Yes, that's true.
In the games which can scale beyond four threads, like Gears of War 4, an i7-2600K can outperform an i5-7600K.
That's a very small number of games right now, but hopefully that's going to change - especially if Ryzen is priced correctly to make 8-core CPUs mainstream.
Well, what I meant is that I like that the software does not really require one to upgrade the cpu as often - some people can't afford that, and a slight increase in minimum framerate is not worth buying a new pc over (it's not just the cpu, but the rest of the components that need to be changed as well). I do like progress and it is annoying that the new cpus are "meh", but on the other hand I still remember when I had to buy a whole new system to play some new games and that sucked. And, as I said, most newer games are quite well multithreaded and HT helps a ton.
As for cpu bottlenecks - when I wrote "cpu usage" I meant each core. I have the MSI Afterburner OSD set up to show each core/thread separately and none were reaching 99% usage. I do remember what CPU bottlenecks look like, because that was the reason why I went from a Q6600 to a 3570k - it was a night and day difference.
I think everyone smart enough is hoping for Ryzen to bring affordable and well-performing hexa- or octa-cores with their version of HT (simultaneous multithreading, SMT) to the masses and not mess it up this time.
As for FO4 - I spent two days overclocking my ram and playing with timings, yet in the end the result was 0fps difference. When fps drops to 30 or even below that I can see that neither any of the cpu cores nor the gpu are overloaded. Usually it's somewhere in the city and it's not graphically intensive or anything. I just started ignoring that as it's quite rare and the rest of the game runs well (it's still a crappy engine when one compares graphics quality and performance with something more modern or even gta5).
 
Nice one Kyle!

Kinda backed up everything I'd thought myself - and I've got an Ivy 3770K at 4.5GHz so I won't be upgrading any time soon either.

I sure do miss the 90s arms race of AMD and Intel with double the performance every year and a bit
 
How much is SB memory-limited compared to SKL/KBL? And the KBL seems to be performing as if it had 2133MHz memory.

(attached: benchmark screenshots)
 
What is the relevance of gaming benches at 640 x 480 res if as stated:

As always, these benchmarks in no way represent real-world gameplay. These are all run at very low resolutions to try our best to remove the video card as a bottleneck. I will not hesitate to say that anyone spouting these types of framerate measurements as a true gameplay measuring tool in today’s climate is not servicing your needs or telling you the real truth.

I realize these ultra low resolutions were pertinent 25 years ago, but they serve absolutely no purpose today.
 
The site has been killing it lately. I started reading the articles again!

Thanks for this piece. I never expected my 2500k to last me this long. I mean I have the itch but I need a better ryzen to upgrade than meager 5% improvements and faster iGPUs along with comparable or worse clocks. It's funny, when I bought my CPU back in March 2011 I said to myself my next CPU shall be an 8 core. Some promises I keep
 
The BEST THREAD EVER!! Well done. Now think about the average user: why would they upgrade?
 
The site has been killing it lately. I started reading the articles again!

Thanks for this piece. I never expected my 2500k to last me this long. I mean I have the itch but I need a better ryzen to upgrade than meager 5% improvements and faster iGPUs along with comparable or worse clocks. It's funny, when I bought my CPU back in March 2011 I said to myself my next CPU shall be an 8 core. Some promises I keep

Honestly makes me wonder why Intel doesn't make larger improvements. Is it that they are incapable or that they choose to? I get the feeling it's choice, which doesn't make sense to me. I'll upgrade when something much better comes along, meaning Intel isn't only competing with AMD but is competing with themselves also. Give me annual ~5% gains, that's fine, I'll sit on my 2500K for 8+ years. Give me 20% a year and I'll buy a new chip and motherboard platform every 2-3 years. I'm sure many people are the same, enthusiasts will upgrade for large gains frequently. Heck lots of "normal" people will upgrade their "slow computer" with something new if it's faster.
 
Honestly makes me wonder why Intel doesn't make larger improvements. Is it that they are incapable or that they choose to? I get the feeling it's choice, which doesn't make sense to me. I'll upgrade when something much better comes along, meaning Intel isn't only competing with AMD but is competing with themselves also. Give me annual ~5% gains, that's fine, I'll sit on my 2500K for 8+ years. Give me 20% a year and I'll buy a new chip and motherboard platform every 2-3 years. I'm sure many people are the same, enthusiasts will upgrade for large gains frequently. Heck lots of "normal" people will upgrade their "slow computer" with something new if it's faster.

I think people forget that all IPC gains since the Pentium Pro have been minor. Everything else came from something else. Frequency, integration like IMC, new instructions and so on. And back then a desktop CPU was 20-30W, not 65-140W.
 
Honestly makes me wonder why Intel doesn't make larger improvements. Is it that they are incapable or that they choose to? I get the feeling it's choice, which doesn't make sense to me. I'll upgrade when something much better comes along, meaning Intel isn't only competing with AMD but is competing with themselves also. Give me annual ~5% gains, that's fine, I'll sit on my 2500K for 8+ years. Give me 20% a year and I'll buy a new chip and motherboard platform every 2-3 years. I'm sure many people are the same, enthusiasts will upgrade for large gains frequently. Heck lots of "normal" people will upgrade their "slow computer" with something new if it's faster.

The ones who care, we're a niche. Surely they've done the math and found out it's not worth it. No doubt they can offer more cores at reasonable prices even without dropping the iGPUs. Just no financial incentive
 
The ones who care, we're a niche. Surely they've done the math and found out it's not worth it. No doubt they can offer more cores at reasonable prices even without dropping the iGPUs. Just no financial incentive

If you just want more cores you can buy cheap Xeons for $300 and up for 8 cores. You can even get 10 cores for $600. The problem is that more cores for a consumer are pretty much worthless outside the sub-1% crowd.

And the consumer wants small, portable, efficient.
 
If you just want more cores you can buy cheap Xeons for $300 and up for 8 cores. You can even get 10 cores for $600. The problem is that more cores for a consumer are pretty much worthless outside the sub-1% crowd.

And the consumer wants small, portable, efficient.

I am aware of the market realities. We all are.

And what 300 euro Xeons? Read what you just said and tell me this is fair - 8 cores for 300, 10 cores for 600 hahah. To hell with Intel's pricing structure. I was about to get a "cheap" 4-core Xeon this summer. Almost hit Buy when I realized I was about to pay 4x the cost for 2x the silicon found in my HTPC CPU. Not buying on principle alone.
 
I am aware of the market realities. We all are.

And what 300 euro Xeons? Read what you just said and tell me this is fair - 8 cores for 300, 10 cores for 600 hahah. To hell with Intel's pricing structure. I was about to get a "cheap" 4-core Xeon this summer. Almost hit Buy when I realized I was about to pay 4x the cost for 2x the silicon found in my HTPC CPU. Not buying on principle alone.

http://ark.intel.com/products/family/91287/Intel-Xeon-Processor-E5-v4-Family#@Server

You can also get super cheap older Xeons from datacenters that replace them due to TCO.
 
Thanks Kyle for doing this, and Nimisys for generously providing the sandy bridge hardware. Much appreciated guys.

The results do make me sad though. I know you guys all know this already, but the desktop enthusiast market is quite small (and shrinking). The CPU demands of the vast majority of consumer and business users have not increased in any significant way in the last 5 years. In an increasingly mobile world where other factors, like power consumption/efficiency, size, heat, etc. really do matter, I think speed (especially for the shrinking desktop segment) is the last thing Intel really cares about. When consumers buy any sort of computing device, I don't even think "speed" is a key factor anymore. Form factor, build quality, portability, user experience, battery life, convenience - that's what's advertised.

I'm not going to pretend that I have any idea whether Intel can produce much faster chips. I just know there isn't much business need to do so, so they don't have to.

In this community, we think differently of course. We enjoy computing hardware that pushes the limits of what is possible. I think that's why we're all hoping Ryzen will, to any reasonable degree, live up to its hype - if only to put pressure on Intel to innovate more aggressively in this segment.

At least we're still getting nice improvements on the GPU side. With VR, >4k resolutions, and high refresh rates gaining popularity, we're going to need those gains more than ever. We just seem to be living in a world now where GPU matters a lot more than CPU.

I've been visiting here more often lately, and just want to say I really appreciate this thread, this review, and this community. You guys still are awesome.

Edit: So, haven't posted here in a while, despite lurking quite a bit. Noticed the build in my signature that has been out of service for many years now. Enjoying this blast from the past Saturday morning.
 
Oh, you mean second hand or cheap LGA2011 Xeons hehe. That whole platform felt like a scam to me.

Anyway. Listen, I don't need your recommendations. You seem like a nice person but your opinions on hardware are rather fanboyish. You're shilling most of the time and I'm not sure to what end.

If it all turns out that you feel entitled to cheap, fast-clocked 8 cores or more on a mainstream platform, then so be it. But don't try and blame someone else for it with the infantile name-calling.
 
If it all turns out that you feel entitled to cheap, fast-clocked 8 cores or more on a mainstream platform, then so be it. But don't try and blame someone else for it with the infantile name-calling.

Well, I am as entitled to a reasonably priced 8 core chip as Intel is to my money. You're free to try and spin it any way you want.
 
What is the relevance of gaming benches at 640 x 480 res if as stated:
I realize these ultra low resolutions were pertinent 25 years ago, but they serve absolutely no purpose today.
To benchmark the effects of CPU on a game, you must first eliminate any possibility of the GPU affecting the results.
Using the lowest resolution available is a good way to achieve that.
 
Honestly makes me wonder why Intel doesn't make larger improvements. Is it that they are incapable or that they choose to? I get the feeling it's choice, which doesn't make sense to me. I'll upgrade when something much better comes along, meaning Intel isn't only competing with AMD but is competing with themselves also. Give me annual ~5% gains, that's fine, I'll sit on my 2500K for 8+ years. Give me 20% a year and I'll buy a new chip and motherboard platform every 2-3 years. I'm sure many people are the same, enthusiasts will upgrade for large gains frequently. Heck lots of "normal" people will upgrade their "slow computer" with something new if it's faster.

You can't be serious. I'd rather spend my money on a video card or a nice display than on a shitty CPU that I put inside the box.
 