Kaby Lake 7700K vs Sandy Bridge 2600K IPC Review @ [H]

Perhaps try to be a bit more useful with your post next time; maybe explain why he's misinformed...

As shown in the above post, and over and over in this thread, that poster is ignoring everything and just repeating himself ad nauseam at this point.
 
Hi Kyle
I have a feature request for the reviews. Instead of Handbrake presets, can you focus on x264 at maybe 1080p and x265 at 1080p or 4K?

I still have an i7 920 myself, and one of the biggest improvements in x265 is its AVX and FMA3 optimizations. Testing it would contrast IPC improvements with instruction-set optimization improvements.
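
For what it's worth, x265's CLI lets you cap the instruction sets it uses, so the IPC-vs-SIMD split can be measured on a single machine. A minimal sketch in Python (the file names are made up; it assumes the x265 CLI is installed, and uses its `--asm` option to pin the SIMD level):

```python
def x265_cmd(src, out, asm_level):
    """Build an x265 command line pinned to a given SIMD level.

    `--asm` caps the instruction sets the encoder may use, e.g.
    'sse4.2' to mimic a pre-AVX CPU like an i7 920, or 'avx2'
    for Haswell and newer. Timing the same clip at both levels
    isolates instruction-set gains from clock and IPC gains.
    """
    return ["x265", "--input", src, "--asm", asm_level, "--output", out]

# Same clip, two SIMD levels; run each under `time` and compare.
sse_cmd = x265_cmd("clip.y4m", "sse.hevc", "sse4.2")
avx_cmd = x265_cmd("clip.y4m", "avx.hevc", "avx2")
```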

Thanks for all the hard work (ha, because HardOCP).
 
Thank you for jumping straight to the heart of the matter. I (and many others, including review sites) am more interested in 'realistic scenarios' than in the theoretical. I just find it humorous that others have not been able to grasp that and basically insist that the theoretical is more valid because it shows bigger differences. Differences that are of no practical use to anyone whatsoever.
How are you not getting this?
Low resolution tests are indicative of real-world performance.

[Benchmark screenshots at 800x600, 3326x1871, and 5120x2880 omitted]

The only thing that changes with resolution is GPU load.
Performance in a CPU-bound game is identical up to the point at which GPU load reaches 100%.
What happens then? The framerate drops lower, which reduces the CPU load.
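
The cap described above can be sketched as a toy model (all numbers are made up for illustration):

```python
def framerate(cpu_fps, gpu_fps_at_res):
    # Delivered framerate is capped by whichever side is slower.
    return min(cpu_fps, gpu_fps_at_res)

# Hypothetical CPU-bound game: the CPU can feed 120 frames/s at any
# resolution, while GPU throughput falls as resolution rises.
cpu_fps = 120
gpu_fps = {"800x600": 400, "3326x1871": 150, "5120x2880": 60}

for res, g in gpu_fps.items():
    print(res, framerate(cpu_fps, g))
# At the two lower resolutions the CPU limit (120) shows; at 5120x2880
# the GPU becomes the bottleneck and the CPU partially idles.
```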

The only thing that this 7700K vs 2600K test failed to show was what the performance difference is going to be like in new games.
Nobody cares if Lost Planet runs at 302 FPS or 362 FPS.
I sure as hell care that a 2500K can't keep new games like Deus Ex: Mankind Divided above 60 FPS at all times, and would like to know if a 7700K is capable of achieving that, or if I would need a 6 or 8 core CPU.
If it's only 20-30% faster, then that's not fast enough. (30% faster than a ~33.5 FPS dip is still only 43.5 FPS.)
 
Thanks for the article. It really makes me happy since I recently put my old 2600K system back together. I only needed a new PSU, case, and OS SSD for it. I still had the mobo, CPU, RAM, two HDDs, and two GTX 970s lying around. I even used one of the Gigabyte SLI bridges that came with my 1080s for some extra flair. Using it in the living room for 1080p/120Hz gaming/web browsing/HTPC. Couldn't be happier. Six years later and it's still a relevant system.

I, of course, love my newer system (still young at 2-3 years), but I try to build systems to last 5-10 years with upgrades, and that 2600K is my second to do so. It replaced a Pentium 4 system clocked at 4.3GHz (still have it in a closet) that also lasted six years. That had my first RAID, XP, and its last GPU was something like a Radeon HD 2600 that provided some DirectX 10 to XP.
 
Hi Kyle
I have a feature request for the reviews. Instead of Handbrake presets, can you focus on x264 at maybe 1080p and x265 at 1080p or 4K?

I still have an i7 920 myself, and one of the biggest improvements in x265 is its AVX and FMA3 optimizations. Testing it would contrast IPC improvements with instruction-set optimization improvements.

Thanks for all the hard work (ha, because HardOCP).

Since the 2600K does not support this, it would be hard to compare.

That said, I did cover this in our recent 6700K vs 7700K IPC Review.
 
/Still rocking the 3930K, 16GB of Corsair DDR3, and the Maximus IV Formula I bought from Phillyboy in the FS/FT forum in November 2013.

/which says a lot about the industry, because I used to upgrade whenever the wind shifted. That was back before Moore's Law died.
 
Thank you for jumping straight to the heart of the matter. I (and many others, including review sites) am more interested in 'realistic scenarios' than in the theoretical. I just find it humorous that others have not been able to grasp that and basically insist that the theoretical is more valid because it shows bigger differences. Differences that are of no practical use to anyone whatsoever.

Yes they are. Take this example: you have two CPUs on two different platforms which cost about the same. Which one should you purchase?

1080p: Both CPUs perform the same

Based on this, both CPUs are equally powerful and both should last the same amount of time.

480p: CPU1 is 20% faster than CPU2

Based on this, you know CPU1 is more powerful than CPU2 and will likely last longer before needing to be upgraded.

So it's more than theoretical. There is reason to run these types of benchmarks.
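
The argument can be made concrete with a toy calculation (hypothetical numbers; the delivered framerate is just the minimum of the CPU and GPU rates):

```python
def fps(cpu_fps, gpu_fps):
    # Delivered framerate: whichever side is slower wins.
    return min(cpu_fps, gpu_fps)

cpu1, cpu2 = 120, 100                    # CPU1 is 20% faster
print(fps(cpu1, 500), fps(cpu2, 500))    # 480p, GPU unloaded: 120 vs 100
print(fps(cpu1, 80), fps(cpu2, 80))      # 1080p, GPU-capped: 80 vs 80
# A future, heavier game that halves per-frame CPU throughput:
print(fps(cpu1 / 2, 80), fps(cpu2 / 2, 80))  # 60.0 vs 50.0
```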
 
Minimum FPS is another. And if you want 60 FPS in Fallout 4 anywhere on the map, a 6700K/7700K plus fast memory is the answer.
 
Very disappointed. For work I have a couple of dual-Xeon Ivy Bridge systems (a couple of 10-core/20-thread chips) with 128GB of RAM. I've toyed with possibly upgrading the systems but frankly haven't been impressed by any of the newer dual-Xeon chip reviews I've seen. And prices have stayed the same or even gone up for these.

I would also settle for a single-CPU system with maxed-out RAM, but only if the performance of that single CPU would match the dual Ivy Bridge Xeons. That hasn't looked too promising either.
 
Thank you for jumping straight to the heart of the matter. I (and many others, including review sites) am more interested in 'realistic scenarios' than in the theoretical. I just find it humorous that others have not been able to grasp that and basically insist that the theoretical is more valid because it shows bigger differences. Differences that are of no practical use to anyone whatsoever.

Jesus Christ.

Look at the conclusion section, the first two sentences, and come back here and post what it says.

Does it mention that Kyle is coming back with real-world comparisons, by any chance?
Did Kyle mention something about VR and how it handles on the old 2600K?

Or did you only look at the pictures?

Edit:
And please point me in the direction of any gaming test of Sandy Bridge vs Kaby Lake.
 
I was still happy with my 2500k @ 4.4 but I really wanted an upgraded platform, so I jumped to Skylake. Plus, 4 years is enough for one build. I have to treat myself once in a while.
 
Only reason I upgraded my 2500K @ 4.5 system is that the motherboard died. But the DDR3-2400 is nice in the new system, vs. 1600 in the old system, so there's that.
 
Only reason I upgraded my 2500K @ 4.5 system is that the motherboard died. But the DDR3-2400 is nice in the new system, vs. 1600 in the old system, so there's that.

When I went from a 2500K to a 4790K (both @ 4.4), I got some noticeable gains in CPU-bound games. But at first I still had the old RAM (1600); I picked up some 2133 sticks later (doubling the amount at the same time) and was actually shocked to see yet more gains in those CPU-bound games (I play a lot of older titles and don't go nuts with AA, so it's more than just a few titles for me). So at the end of the day it did feel like a good upgrade. But yeah, Sandy Bridge was amazing and can definitely still hold its own very, very well.
 
Not surprised. My launch 2600k is still happily humming along in my son's system. Unfortunately, his MB is going bad, which would be the second time I've had to replace an MB for that chip. Probably not worth buying another used one at this point even though the CPU is perfectly good. Intel has completely slacked off on meaningful upgrades for desktop CPUs and AMD hasn't given them any reason to push the boundaries.
 
wow, I was expecting closer to 40%, not 25, maybe 30%...


AMD PLEASE DON'T BLOW IT LIKE YOU ALWAYS DO.

Remember that this is a pretty highly overclocked 2600K. Base clock is 3.4GHz, so you're probably getting 10-15% from the overclock alone. So a stock i7-2600K vs a stock i7-7700K is probably more like 35-45%. Now, one can easily argue that a 2600K overclocks well and easily, but not everyone did overclock their 2600Ks. I had played with overclocking before getting my 2600K system and just never bothered, opting for a more compact case without the airflow I'd have wanted for overclocking.
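
The compounding in that estimate is easy to check; a quick sketch using the post's own figures (a ~25% clock-matched IPC gap, 10-15% from the overclock):

```python
ipc_gain = 0.25                  # clock-matched gap from the review
for clock_gain in (0.10, 0.15):  # the post's estimate for the 2600K OC
    total = (1 + ipc_gain) * (1 + clock_gain) - 1
    print(f"{clock_gain:.0%} clock -> {total:.1%} stock-vs-stock")
# 10% clock -> 37.5%, 15% clock -> 43.7%: the post's 35-45% ballpark
```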
 
When I went from a 2500K to a 4790K (both @ 4.4), I got some noticeable gains in CPU-bound games. But at first I still had the old RAM (1600); I picked up some 2133 sticks later (doubling the amount at the same time) and was actually shocked to see yet more gains in those CPU-bound games (I play a lot of older titles and don't go nuts with AA, so it's more than just a few titles for me). So at the end of the day it did feel like a good upgrade. But yeah, Sandy Bridge was amazing and can definitely still hold its own very, very well.
You went from a non-HT part to an HT part (and the L3 cache also increased from 6MB to 8MB), so the difference will be exaggerated.
 
This makes me feel better about my current i7-980x

I want to upgrade, but I'm probably going to hold out for Ryzen. Primarily because I don't want Intel ME though.
 
This makes me feel better about my current i7-980x

I want to upgrade, but I'm probably going to hold out for Ryzen. Primarily because I don't want Intel ME though.

Same here. Holding out for Ryzen. i7-3770k was my last upgrade.
 
Same here. Holding out for Ryzen. i7-3770k was my last upgrade.

I just hope they are not stingy and all into monetizing PCIe lanes. I need something for my main server that can crunch encoding workloads while running multiple RAID cards.
 
Sandy Bridge supports AVX 1 just fine. I'm currently running x265 on my i7 920, but it's very slow. It is made to fall back on SSE if need be.
Sorry, I was talking about H.265 QuickSync support, my bad.
 
AMD used to blow away Intel back in the original Athlon and Thunderbird days.
I thought the Athlon XP (Thoroughbred, Barton...) was only neck and neck with the Pentium 4 (Willamette, Northwood). AMD was only able to surpass Intel when they released the ClawHammer/Newcastle Athlon 64 and Intel released the Prescott P4 dud? Correct me if I'm wrong.
 
Thanks for the review, and to the person who donated the hardware. It was great.

It's depressing to see improvements come so slowly to the desktop. My 2600k is six years old, power supply and case are ten years old, monitor is six or seven. Guess it's just GPUs and monitors for the next while. 4k 30+ inch with a very high variable refresh rate and a GPU that can handle it would be great.
 
I thought the Athlon XP (Thoroughbred, Barton...) was only neck and neck with the Pentium 4 (Willamette, Northwood). AMD was only able to surpass Intel when they released the ClawHammer/Newcastle Athlon 64 and Intel released the Prescott P4 dud? Correct me if I'm wrong.
I love the history of the good old days!
The P4 had abysmal IPC compared to the Athlon/Athlon XP (and the P3). Intel led in clock speed, so AMD brought back the PR rating system. (Side note: AMD claimed the PR number ranked the Athlon XP against the original Athlon, but everyone knew it was actually intended to compare the AXP to the P4.)
In the beginning, the PR ratings were actually conservative, so a PR-rated AXP was generally faster than a P4 clocked 100MHz above its rating (the AXP 1700+ at 1.47GHz was generally as fast as a 1.8GHz P4). As Intel continued to ramp up the P4's clock speed and bus speed and added cache to the SKUs, the PR rating started to fall apart, and by the end of the AXP's life it was overestimating the chip's performance.
When the Athlon 64 came out, AMD was still using the high-IPC/lower-clock approach and continued the PR rating scheme, but the PR-rated Athlon 64 chips once again outperformed P4s at the corresponding clock speeds.
 
On another note: at home I've been using an Opteron (the equivalent of a quad-core Phenom II) for ages now, and the upgrade itch has been killing me for a while. My PC at work is an Ivy Bridge i7.
I've been waiting for Zen to be released, but decided I was done waiting, so I ordered an i7 7700K. The gear came in this weekend, and I've got everything set up. Subjectively speaking, the Kaby Lake blows away my work PC's Ivy Bridge. I haven't OCed the 7700K, and of course the work PC's Ivy isn't overclocked either, but at stock speeds the Kaby is just so ridiculously fast. As these things go, most of the time either chip would be sitting idle at its ~1GHz idle clock, but when opening Chrome, or Visual Studio, or what have you, the Kaby just boosts up, does its job immediately, and is done, while the Ivy just takes longer. (It's also crazy hearing the fan on the Kaby speed up for a second while the chip clocks up and then idle back down immediately.) Both systems have equivalent SSDs, so it's the processor that's making the difference.

I totally get that the Ivy overclocked probably is a stone's throw from the Kaby. But man, I am just super impressed with this system. Here's to holding on to this setup until 2025.
 
I am not an overclocker, so I wish these showdowns would show comparisons at stock speeds. I have a 2600k running at stock speeds and would love to know what kind of performance improvement I can expect moving to a 7700k at stock.

Your wishes will not be granted at HardOCP as what you are wishing for is not the focus of this website.
 
You could have said that to begin with. I am new to the forums here, and it is disheartening to be made to feel like I am not welcome after my first post.

I have not been able to get much of an overclock out of my 2600K, so I stopped bothering years ago. It is not just your site; everyone seems to like to do same-clock comparisons of CPUs. That is great if you are able to overclock your previous CPU to the clock rate of the new one. In the case of Kaby Lake, the difference doesn't seem to be enough for people to recommend an upgrade, but that is only because the Sandy Bridge CPU is clocked to the same rate as the stock Kaby Lake. In my case, since I am not overclocking my 2600K, the difference may be worth the upgrade. I am just trying to find some benchmarks to confirm that.

I don't think I've ever seen a 2600K fail to hit 4.5GHz. To me it's not the chip, but either the motherboard/memory or user error.

I mean, 4.5GHz is so easy on Sandy Bridge... easiest CPU I ever overclocked.
 
I don't think I've ever seen a 2600K fail to hit 4.5GHz. To me it's not the chip, but either the motherboard/memory or user error.

I mean, 4.5GHz is so easy on Sandy Bridge... easiest CPU I ever overclocked.

Last time I tried to clock it up that high, my machine was overheating when video encoding with Handbrake.

I have an ASUS P8Z68-V PRO/GEN3 Motherboard and a Coolermaster Hyper 212 EVO CPU cooler.
 
Last time I tried to clock it up that high, my machine was overheating when video encoding with Handbrake.

I have an ASUS P8Z68-V PRO/GEN3 Motherboard and a Coolermaster Hyper 212 EVO CPU cooler.

Sounds like a cooling problem then, not a CPU issue. Sandy Bridge CPUs run very cool. Now, if it were Ivy Bridge I could understand, but Sandy Bridge CPUs were top-notch for overclocking.
 
Sounds like a cooling problem then, not a CPU issue. Sandy Bridge CPUs run very cool. Now, if it were Ivy Bridge I could understand, but Sandy Bridge CPUs were top-notch for overclocking.

My CPU has never run very cool. When I first got the machine, I had to use the stock cooler for a few days until my Coolermaster arrived. With the stock cooler and at stock speeds, the CPU temp would go up over 90 degrees. With the Hyper 212 EVO, it gets up in the 70s at full load. If I overclock, it goes back up to over 90 degrees.
 
My CPU has never run very cool. When I first got the machine, I had to use the stock cooler for a few days until my Coolermaster arrived. With the stock cooler and at stock speeds, the CPU temp would go up over 90 degrees. With the Hyper 212 EVO, it gets up in the 70s at full load. If I overclock, it goes back up to over 90 degrees.

Sounds like user error to me. The stock cooler is supposed to be perfectly fine for a 2600K at stock. Make sure you are seating the cooler correctly, and make sure you didn't use too much thermal paste. Did you remove the clear plastic film from the bottom of the heatsink before installing it?

The more information you keep giving us, the more it sounds like user error to me.
 
Sounds like user error to me. The stock cooler is supposed to be perfectly fine for a 2600K at stock. Make sure you are seating the cooler correctly, and make sure you didn't use too much thermal paste. Did you remove the clear plastic film from the bottom of the heatsink before installing it?

The more information you keep giving us, the more it sounds like user error to me.

I've been building PCs since I was 8 years old. I know how to properly seat a CPU cooler.

Don't know if it makes a difference, but I live in South Florida, and right now the ambient temperature of my home office, where the computer is, is around 83°F. It doesn't help that I have two desktops and two servers running in that room.
 
You could have said that to begin with. I am new to the forums here, and it is disheartening to be made to feel like I am not welcome after my first post.

I have not been able to get much of an overclock out of my 2600K, so I stopped bothering years ago. It is not just your site; everyone seems to like to do same-clock comparisons of CPUs. That is great if you are able to overclock your previous CPU to the clock rate of the new one. In the case of Kaby Lake, the difference doesn't seem to be enough for people to recommend an upgrade, but that is only because the Sandy Bridge CPU is clocked to the same rate as the stock Kaby Lake. In my case, since I am not overclocking my 2600K, the difference may be worth the upgrade. I am just trying to find some benchmarks to confirm that.

Oh boy! You need to have a thick skin to hang around here. Kyle doesn't take any shit and generally tells it like he sees it. My advice to you, get used to that attitude. :)


Kyle, how are you going to handle an IPC test of Intel vs AMD when Ryzen becomes available? Clock for clock, same core count, SMT off/on, etc.?
 
I've been building PCs since I was 8 years old. I know how to properly seat a CPU cooler.

Don't know if it makes a difference, but I live in South Florida, and right now the ambient temperature of my home office, where the computer is, is around 83°F. It doesn't help that I have two desktops and two servers running in that room.
Don't know what to tell ya, but there is something seriously wrong.

It isn't Kyle's fault you can't clock your CPU to 4.5GHz when 99.99999999999999% of Sandy Bridge CPUs can, so it is a totally valid comparison. I am sorry you are having heating issues, but it sounds more and more like it is not the CPU that is the issue.

Good luck
 
Don't know what to tell ya, but there is something seriously wrong.

It isn't Kyle's fault you can't clock your CPU to 4.5GHz when 99.99999999999999% of Sandy Bridge CPUs can, so it is a totally valid comparison. I am sorry you are having heating issues, but it sounds more and more like it is not the CPU that is the issue.

Good luck

Why not? It is definitely possible I just got a badly binned CPU. Not every CPU is a winner.

Also, I never blamed Kyle for my lack of overclock. I was just taken aback by his snarky comment to what I thought was a legitimate question.
 
Why not? It is definitely possible I just got a badly binned CPU. Not every CPU is a winner.

Sure, you could say that, but if you haven't tested it in another system to see if it's the CPU, you can't make that claim. It's super rare for a CPU to be that bad, and Intel will not ship a CPU with a stock cooler that can't keep it cool. That's why I keep saying there is something else wrong if you can't even run it at stock.

Try it in another system and see if you have the same issue. If it still runs hot there, then I could agree with you.
 
Whoa whoa whoa, just because a CPU isn't a stellar OCer doesn't mean that it is bad. My 4790K only does 4.5; then it's a voltage wall at 4.6/4.7, and it's not stable at 4.8 at any voltage I'm willing to try. My 2500K would play Quake @ 5.2GHz on air cooling with some voltage, though I dared not fire up Prime95, hahaha.
 