Kaby Lake 7700K vs Sandy Bridge 2600K IPC Review @ [H]

I remember the days of 486/25 to 486/60 to Pentium 90 to 166 to 300 etc etc. Every single jump was like double the performance.

The minor benefit of a new processor over my 2500K just doesn't seem to be worth it, especially since, coupled with a GTX 1080, every game I have runs flawlessly on my 60Hz 2560x1600 screen. Maybe Kyle's VR benches will convince me otherwise.
 
I swear I am doing that same thing with my Gulftown Xeons and original 2013 Nvidia GTX Titan. This build is the longest I have ever used a single computer without modification because it was simply so powerful when put together. I think the 2nd longest I ever went was with my ATI 9700 Pro... which I still have somewhere.

Oh that 9700 Pro was legendary, wasn't it? But we're starting to age ourselves a little here ;)
 
Great article - this is why I still read hardocp.com! I've got my 2600K running at 4.4 paired with a 970 video card. It still handles all the games I'm playing just fine (1080p), and the other tasks I do work great on it.
As others have pointed out, the only reasons I have to upgrade right now would be more PCIe lanes, storage options, and the like.
I'm very interested in the VR performance. If I do any upgrades, it would be a better display (still at 1080p) and GPU (the 970 is not bad at 1080p, but entry level for VR). I also have not bought into VR yet - I've tried it and think it's awesome, but don't have wife-permission to move forward. $800 is too much to "sweep under the rug" :)
 
Nimisys, thank you for sending it in, and Kyle, thank you for this report. If you need another 2600k, let me know. I'm local in DFW and can just drive it over. In any case, this just verifies my thinking and goes to show AMD that there is a large number of the community waiting for them to get their act together, and I admit that I am one of them.
 
The motherboard used for the 7700K system is the ASUS ROG Maximus IX Formula. Sorry for the stock picture, I have already turned over the motherboard to Dan for further review, so I did not have it in the office.

Frigg'n dumbass. You were supposed to take pictures before you sent it anywhere. Pics or shens! :p
 
Huge props to Nimisys for sending over his hardware to Kyle.

Huge thank you to Kyle for actually going through the time to do these benchmarks.

So many people are running that generation of processor that it's an oft-burning question: how much faster is it than my current CPU? This goes a long way toward showing people exactly what they are or are not getting from it. And I think we are going to see Kaby Lake's margin decrease once you run some VR testing on it.

Thanks again guys, great article!
 
Kyle, any chance of running a quick benchmark using a "real" GPU and resolution?

GTX1080 or Titan at 1080p or 4K?

I doubt there would be any major difference, but it would remove all doubt.
 
Kyle, any chance of running a quick benchmark using a "real" GPU and resolution?

GTX1080 or Titan at 1080p or 4K?

I doubt there would be any major difference, but it would remove all doubt.

Resolution isn't as important as settings; there are actually settings that have real weight on CPU performance. So as I said in an earlier thread, and I think someone else said on the first page, with a real-world resolution and real settings I'm 100% sure and confident it will show an appreciable, measurable difference. That was the reason I had to keep my 6700K when I was doing my tests, which I do with every generation of CPUs.

The game suite doesn't help either, since everything in it is very old with the exception of AoTS. 1080p with a Titan X at max settings will easily expose any CPU bottleneck in games like Fallout 4 or GTA V, to name just two, and that's without even counting heavier cases like Crysis 3 or multiplayer Battlefield.
 
Thanks for doing this article Kyle/[H] staff. Had my Z77 board die recently and pretty much came to the conclusion that it's better to just buy another used motherboard rather than spend $500+ on a 6700K/7700K for little benefit (the machine is used mostly for gaming). One request I have (if you guys can do it): take a game like BF1 and do a CPU test across the 2/3/4/5/6 series, all at the same clocks, and compare the results. I know the memory would also be a variable.
 
Thank you Nimisys ! And thank you Kyle.

I thought it'd be a little more of a spread. I've been holding out because my 2600K isn't really slow for what I do with it. I WANT to upgrade and have been looking for reasons to do so. I think the only thing pushing me now is a 5+ GHz overclock and that's only for the e-peen status.

I wish Intel would put a little more effort into the enthusiast market. I know I'm not alone in that I'm ready and wanting to upgrade to something significant. I want to give Intel my money, but I want something that's worth the money. If I were building a brand new system, I'd go Kaby Lake in a heartbeat. But, I have zero reason to upgrade my 2600K....

This write up pretty much solidified my decision to not upgrade yet...

Bittersweet, though. Great to see that I am not throwing my money at Intel for a 'small' upgrade, but upset that I'm not upgrading.
 
I am going to run real world VR gaming benchmarks with the 2600K. Don't worry. We will see if the latest tech beats on it or not. And I now have two Titan X Pascal cards here, so we will be the least GPU-encumbered that we can be.
 
Tears-Of-Joy.jpg


So much thanks... anxiously awaiting those "4 fps increases" when you run identical GPUs on both parts :) Or so I predict... maybe I'm wrong... don't think I'm wrong. I'm ready to do a new build, but now I know I really am just buying updated hardware, not necessarily performance-changing numbers. Awaiting the gaming benchmarks. Thanks again for doing this, I suspect a lot of people have an interest in this.
 
Thanks for doing this article Kyle/[H] staff. Had my Z77 board die recently and pretty much came to the conclusion that it's better to just buy another used motherboard rather than spend $500+ on a 6700K/7700K for little benefit (the machine is used mostly for gaming). One request I have (if you guys can do it): take a game like BF1 and do a CPU test across the 2/3/4/5/6 series, all at the same clocks, and compare the results. I know the memory would also be a variable.
Just think about how many hours of testing you are asking for...... We don't do "one run and its done" shit around here. We MAKE SURE.
 
Excellent article! Many thanks to Nimisys for ponying up his hardware! Looking forward to the VR review.

I mainly upgraded from my 2500K to a 5930K for Star Citizen and to be VR ready. My dad has the old box now, still runs like a champ.

Very nice work Kyle. You really hit the nail on the head by showing us that Intel has only delivered 20-25% in nearly 6 years of millions, possibly billions, of dollars' worth of alleged Intel R&D time. It tells me two things: we NEED AMD to be competitive, and it also shows us the possibility that the current "Core" architecture is simply maxed out, and maybe we will get a 5% IPC increase when the "next gen" gets released.

Right now, it's AMD's time to make it, or break it, for all of us.

Another interesting point I would like to raise, one that nobody, not even you Kyle, has mentioned (to the best of my knowledge): despite 6 years of architecture optimisation, as well as process and feature improvements/shrinks, Intel's CPUs are still just as hot and voltage hungry when moderately overclocking as they have always been. I have always wondered why this is the case. It used to be that when we had a process shrink, we could overclock the same at lower voltage and get lower temperatures, or overclock higher and get the same sort of temperatures and voltages as before. Yet now you get a process shrink, or even two, and you get maybe 200MHz more overclocking at the same temps as before, and almost the same voltage. Even Intel's new transistors (which I think have been changed 3 times since the 2600K) have not really changed this behaviour. We are still running just about any high-end Intel CPU at or very close to 4.5GHz, and still getting the same kind of temps generation after generation. Sometimes I wonder if Intel really is doing any kind of work on these CPUs...

I really hope Zen has good performance but I will be floored if it manages to beat Intel by much (if any) for total performance, my main hope is it has better price/performance vs. Intel.

The fact of the matter is we hit the speed and size limit for Si transistors a while ago. They packed more of them in with better process tech by basically standing them up on end. We also hit a voltage and frequency wall, so all you can do is optimize power and try to go more parallel. We need a new (or old, GaAs) substrate material, or a significant modification to Si (strained silicon, graphene doping, etc.), to move out of the lab and become producible in quantity before we get another big speed boost.

These are the reasons Intel has mostly been making their chips more power efficient, with fairly minimal gains in IPC, while packing more cores onto the higher-end chips. They do work on it, but it's a hard problem to get more real single-thread performance. Look at ARM: a few years ago there were a ton of people expecting it to take over servers like it had mobile. Didn't happen; a bunch of companies made systems and they ended up being duds. Meanwhile, all-in-one x86 devices are starting to displace ARM tablets.

GPUs have the advantage that their workload is VERY friendly to parallelization so we've seen better gains from process tech there but they're still hit by the Si speed limits as well.
 
Awesome review, I'm interested to see some more "real world" gaming tests. I feel like most people who have a Sandy Bridge i7 are probably running something mid to high tier from this gen or last gen (GTX 970/980/1060/1070 or AMD 280x/290/290x/390/480), a mid range card from AMD & NVIDIA would be cool to see at 1080p. I would expect to see minimal differences in most games, it really is remarkable how well SB has aged.
 
Many thanks Kyle & Nimisys. I'm still amazed at my Sandy Bridge 2600k. 6 years of running at 4.8 GHz and still relevant. Knock wood that the cpu & the Asrock board don't go tits up on me. The cpu has been water cooled since I got it. It was said previously, this cpu was one of the best pc related purchases that I've made. Looking forward to the real life game testing. I'm hoping you do a 4k resolution comparison as well. VR comparison would be slick as well. Thanks again!
 
Well then.

I'll continue to sit on my 2600K until it explodes (it seems to have degraded as 4.4 is no longer stable but 4.2 is). Sounds like I'll be hitting guides and reading to see if there are settings I should optimize to squeeze more speed out of it.

Isn't it really odd? There was a time when I'd have declared holding the same system for more than 2 years was PREPOSTEROUS !!
 
I still have a 2500K sitting in the box since the Asus Z77 motherboard died a few years ago. I guess I should put it in the FS forum if it is still competitive.
 
Awesome review Kyle (and thanks Nimisys)! I'm still rocking a 2500K and it hasn't let me down yet. I have the same motherboard that you used here too. This year I added more RAM (went from 8GB to 16GB) and got a new GPU (RX 480) and can run every game I've thrown at it just fine at 1080p. It's really telling that the IPC gains have only been about 25% over the last 6 years. I guess if you look at stock clocks that's another 20% or so but since the Sandys are so easy to OC it's not a big deal.

I'm definitely planning on keeping my system going for the near future. I mainly use my system for gaming and photo/video editing. For gaming I probably wouldn't see much, if any, benefit to an upgrade. I would for photo and video editing but my system is "good enough" for now. I really hope RyZen is competitive with what Intel has to offer. If the IPC is somewhere between Sandy Bridge and Kaby Lake, but with twice the cores (and SMT), it will be a definite upgrade for me (at least for photo/video editing). If the price is right I would definitely go for AMD next time just to spite Intel.
 
Is anyone really surprised, since the general advice between processor generations with i3/5/7 is don't bother to upgrade? Intel has a 24-core/48-thread Xeon for the server market, where the real money is. How is that for sitting on their asses? It is nice to see the benchmark, if only to offer some reassurance that you don't need to upgrade. Has Intel abandoned the enthusiast market? Mostly yes, but not entirely, since IPC and clock rates have gone up and power consumption has generally gone down. Hoping AMD catches up? Most speculation is that AMD is running at Broadwell IPC levels. Is AMD going to be a game changer? I'm not holding my breath, but their live-streamed DOTA2 demo appeared to pwn Intel.
 
Intel has milked us with these small upgrades, but that is what happens when they have no competition..
I sure hope AMD lives up to the hype....This time!
 
Thanks for the write-up, [H]. Looks like I'm not upgrading from Ivy unless Zen rocks.

I hope it does, just to poke Intel in the eye. And of course to give consumers more choice.

Not holding my breath, though.
 
I know those drives can be used on the older motherboards via adapters. I was talking about booting to NVMe drives, which isn't supported because the UEFI of the day didn't have the feature. I don't know how well modifying the UEFI on those works. As I understand it, that's hit and miss at best. That's not a solution everyone will want to try, I'm sure.
I actually have a BIOS for my board (Asus P8 Z68-V Gen3) modified and ready to flash, just waiting on the M.2 SSD and adapter :D

I'll let you know how it goes, though I don't expect any trouble (I'm using the same board as the person who developed this hack in the first place, paired with a Core i7 3770k).
Also, thinking about it, there's an even simpler way to run Windows from NVMe on Z68/Z77 without any modification to the UEFI.

Make sure there's also a SATA disk in the system and put the bootloader (JUST the bootloader) on that, point it at a Windows install resident on the NVMe storage device using BCDedit, boot up... Done deal.
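For anyone wanting to try the SATA-bootloader trick, here's a rough command sketch (not tested on this exact setup; the W: and S: drive letters are placeholders for the NVMe Windows partition and the SATA disk's EFI partition, respectively - adjust for your own layout):

```shell
:: Run from an elevated prompt (Windows PE or a working install).
:: Assumptions: W: = Windows partition on the NVMe drive,
::              S: = EFI system partition on the SATA disk.

:: Copy the boot files (bootloader + BCD store) onto the SATA disk,
:: pointing them at the Windows install that lives on NVMe:
bcdboot W:\Windows /s S: /f UEFI

:: Sanity check: dump the new BCD store and confirm the entry's
:: device/osdevice point at the NVMe Windows partition:
bcdedit /store S:\EFI\Microsoft\Boot\BCD /enum
```

The firmware only ever has to read the SATA disk; once winload takes over, Windows loads its own NVMe driver and carries on from there, which is why no UEFI modification is needed.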

Edit: Or get a 950 Pro, which can boot on Z68 systems with no modification (it contains its own NVMe option ROM).
 
I'm not upset that Kaby Lake may not really be a worthy upgrade right now. I'm actually happy that the Sandy and Ivy Bridge CPUs were so revolutionary, and had such great headroom, that they are still relevant today. I recently upgraded my 3570K to a 3770K @ 4.7 GHz and my computer kicks ass again, and I didn't have to upgrade anything else to make that happen.
 
Nice article!

Looks to be a ballpark 5-10% increase per "generation" from 2K to 7K, depending on process of course.

Those with 2K-series systems, especially ones that have been heavily worked, might find good justification to dish out the coin. Later generations, not so much (unless you have to be a part of the peeing contest).
 
My 3570K is at 4896MHz, which ends up slightly better than the Sandy Bridge chip and honestly not far from Skylake. Kyle is right: I have enthusiast dollars burning a hole in my pocket, so as long as AMD doesn't suck, I'm in, just for the fun of building a new system.
 
Nice article. Someone mentioned QuickSync, good point. I think benches of AVX, AES, integrated graphics, and power consumption at ~3 GHz would illustrate improvement in necessary directions. Probably some bug fixes along the way too.
 
I built my new rig last summer (went from a C2Quad to an i7 6700K).
Yeah, I'm looking at not upgrading that anytime soon, unless Intel really decides to change the game later. This is crazy, 5 years and a 20%-ish increase? That's absurd!

I really hope Kyle's right about Ryzen and it changing the game. If AMD makes a big leap this time, they could seriously catch intel napping!
 
As a quick laugh, I would love to see one or two gaming benchmarks where you rely on the iGPU only for Sandy vs. Kaby. I remember when people overclocked the iGPU on Sandy and Ivy, has that even been tried with Kaby?
 
As a recent 2600K Sandy Bridge user who moved to the 6700K: it was not the CPU that got me to move, it was features on the newer chipset/motherboard that I wanted. And my older ASUS MAXIMUS IV Extreme and 2600K still run great as a second system.
 
Grateful for the article!

I'd certainly go AMD, but only if their motherboards don't suck, and looking at the rumored chipset features they look rather... boring. So don't suck, AMD!
 
Nice article. Someone mentioned QuickSync, good point. I think benches of AVX, AES, integrated graphics, and power consumption at ~3 GHz would illustrate improvement in necessary directions. Probably some bug fixes along the way too.

As a quick laugh, I would love to see one or two gaming benchmarks where you rely on the iGPU only for Sandy vs. Kaby. I remember when people overclocked the iGPU on Sandy and Ivy, has that even been tried with Kaby?

We did two benchmarks specific to Quick Sync in our 7700K IPC review. Basically no difference between Skylake and Kaby.

As for "gaming" Kaby is a big jump...if your standards are that low for "gaming."
 
Understatement.

I'm sitting on Haswell and not feeling compelled to do one thing about it. Talk about a generational sweet spot...

Given that we had several generations between Sandy (Ivy, Haswell, Broadwell, Sky) and Kaby Lake, I'm kind of shocked to see that the single-threaded gain is still in the single digits (9%) for some tests. Multiply that by 4 (36%) and we're still not seeing equivalent scaling (closer to 25% on average) at the multithreaded level in the other tests...

Which really leads me to wonder whether all the improvements are not from architectural changes, but just physics: shrinking the process and rearranging the die. (Not to mention the slight RAM speed difference between the setups.)

Yeah, this really is AMD's chance to shine. Hope they don't fuck it up.
That statement says a lot, especially after 6 years. As Kyle has pointed out, this new chip is all about what you're going to be doing with it. I thought I was falling far behind and needed a boost; I think I will relax a bit till AMD comes into the picture. Give Intel credit, their R&D is busy, but if no one is pushing Intel with new products, why would a company put new items on the market when it doesn't need to - out of loyalty? Intel is a business and runs itself as such. One doesn't put out a "new and improved" just for the hell of it.
 