Intel Skylake Core i7-6700K IPC & Overclocking Review @ [H]

Did that already. I already replaced my PSU, and it's still shutting down with my 980 Ti SLI setup no matter which slots I use. PCIe slot 4 doesn't even work. It's either the motherboard or one of my 980 Tis, and since slot 4 doesn't work with either card, I'm willing to bet it's the motherboard. Ironically this didn't happen for 2+ years with my OG 6GB Titans. Not until I installed my 2nd 980 Ti; then it all went FUBAR. Gigabyte POS motherboard, 2 weeks past warranty. :rolleyes:

Anyways, MA tax-free weekend is next week. I can pick up a 5820K in store @ Micro Center for $299. The MSI X99S SLI Plus is on sale @ Newegg this weekend for $150 after rebates and discounts. All I need is DDR4 and a cooler and I'm all set.

Hmm. I wonder if the TX tax free weekend includes computer parts (probably not). If Micro has a deal on Skylake :cool:
 
Can one delid the CPU and run the cooling solution directly on the die, as it should be? If not, I'll keep my 5930K @ 4.8GHz (will delid it when the enthusiast warranty runs out) and my delidded 4790K at 5GHz, and continue on with my happy geek dance.
You hear about how Haswell-E supposedly uses solder under the heatspreader? If that's true, you sure you wanna de-lid it? Probably a greater risk of damaging the die. I know people were seeing solder on the engineering samples. I don't recall if I read that the retail CPUs were using solder as well.
 
You hear about how Haswell-E supposedly uses solder under the heatspreader? If that's true, you sure you wanna de-lid it? Probably a greater risk of damaging the die. I know people were seeing solder on the engineering samples. I don't recall if I read that the retail CPUs were using solder as well.

Yeah, you're right that delidding Haswell-E carries a good chance of destroying the die. He's better off selling it when something better comes out rather than destroying it for another 0.1 or 0.2GHz.
 
Great review. However, I can't help feeling a bit let down by Intel. Power efficiency, more USB ports, or even faster storage: these are all marginal improvements, especially when the GPU is the bottleneck. Where is the USP (unique selling proposition)? Where's the 6-core/12-thread consumer CPU overclockable to 5GHz? After all these years, I regret not going for a 2600K and cheaping out on a 2500K instead, given how long it has lasted...

I think the marginal items you mentioned were certainly necessary. I agree that more performance would have been nice, of course. I don't think of Skylake as a letdown. If I thought about it only in comparison to Devil's Canyon and Haswell, I'd agree. When you compare it to Sandy Bridge, I think it's a decent improvement. Again, not only because of the CPU. The platform gets major improvements, many of which are relevant now. As for 6c/12t CPUs, I don't think you'll see that on the consumer side anytime soon. It's unfortunate, but that's how things are going to be for a while, it seems. Intel really likes this Pentium / Pentium Pro type business model: one CPU and platform for the regular folk, and one for the professionals, enthusiasts and server-class systems. The latter isn't always better for home usage and gaming, either.


More on the i7-5775C beating the i7-6700 in discrete-card gaming despite a 500MHz clock disadvantage:
https://youtu.be/3PpbZmX_dwo?t=46m0s

Intel really needs to add that Iris Pro stuff to their Skylake chips.

Evidently you missed one very important thing in the Tech Report's actual data, which is posted in their article on Skylake. They do compare it to Broadwell, but they keep Skylake at DDR4-2133 clocks. In most cases the difference between Broadwell, Skylake and Haswell is very small in games. We've shown, and other sites have shown, that Skylake can improve dramatically as you scale the memory speeds upwards, a fact that the Tech Report either didn't explore or was unaware of. I believe this would have altered their data and conclusions substantially.

As it is, in productivity and synthetic benchmarks Broadwell falls behind Skylake, substantially in some cases. In the gaming benchmarks Broadwell did indeed do better much of the time, but not all the time. Far Cry 4 is one example where it didn't win, or the results were too close to call. There was only one part of that test where Broadwell did better: "time spent beyond 16.7ms," specifically. Four games is hardly a huge sample, and in many of the Tech Report tests Broadwell wasn't that far ahead of Skylake. When you are talking about 1 or 2 FPS and you're running DDR4 at relatively low speeds, that difference is pretty easy to explain. Again, I think testing Skylake beyond DDR4-2666 would have put things in a different light. We've shown that memory clocks matter with Skylake, and we've seen less impact from memory speeds on other platforms than on Skylake.

For an enthusiast, who cares about stock performance?

I do agree that Iris Pro would have had benefits, especially in mobile applications. According to Intel, the development cycle for Skylake took around 6 years to complete. It was probably designed completely independently from Broadwell. I don't think they can just slap a new iGPU into a CPU die once they reach a certain point in development. Again, Skylake is targeted toward more enthusiast-type applications, which makes the inclusion of Iris Pro less important. I don't believe Intel considered it an advantage for gamers and the like, who will depend primarily on discrete graphics. The tests you mentioned in which Broadwell was the "winner" are actually fairly few and far between, or were won by minuscule margins. We are also comparing Z97 and DDR3 vs. Z170 and DDR4. The latter platform hasn't been out a week, there is potentially a lot more that can be done to improve the UEFI BIOSes on these motherboards, and again, with lower-latency and higher-clocked DDR4 modules, I think Skylake would take the cake.
 
I don't think of Skylake as a letdown. If I thought about it only in comparison to Devil's Canyon and Haswell, I'd agree. When you compare it to Sandy Bridge, I think it's a decent improvement.

Sandy Bridge is almost five years old. It's hard to believe we're comparing a tech product to something 4-5 years old to find something good to say about it. Do we do that with any other technology company? Imagine ARM, Nvidia, or AMD releasing a product that was nearly identical to the previous generation and then saying, "yeah, but it's pretty decent compared to their products from five years ago."

Skylake is a completely new architecture, and compared to Haswell it's on a smaller node too. And there's practically nothing to talk about here. There was a tick and a tock and the hand didn't move.

Maybe I'm being too cynical... but again, if we at least got something that used less power and generated less heat, then we could say that we got something out of 14nm.

People have criticized TSMC for their woes. Intel 14nm is a complete trainwreck and nobody wants to talk about it.
 
People have criticized TSMC for their woes. Intel 14nm is a complete trainwreck and nobody wants to talk about it.

Well, with each successive die shrink, said shrink becomes more and more technologically difficult.

We are quickly closing in on the limits of what is possible with silicon.

This is more than likely the new normal, and it will only get worse.
 
Hmm. I wonder if the TX tax free weekend includes computer parts (probably not). If Micro has a deal on Skylake :cool:

The problem for me is whether the local MA Micro Center stores will actually have the 6700K in stock by next weekend; if they do, I'd probably go with Skylake. Right now it's a tossup for me. It's either go with a 5820K or a 6700K. I really don't know what to do at this point. I have to decide if I ever want to go 3-way SLI. Probably not, but it's a nice option to have.

Also, I know the 5820Ks don't overclock well, if at all, which is a problem. So there's that too.

Either way, if I don't make my decision by tomorrow I'm going to miss out on that MSI X99 Plus motherboard @ Newegg for $150. That's a great deal.

Someone help me decide!!!:D
 
Sandy Bridge is almost five years old. It's hard to believe we're comparing a tech product to something 4-5 years old to find something good to say about it. Do we do that with any other technology company? Imagine ARM, Nvidia, or AMD releasing a product that was nearly identical to the previous generation and then saying, "yeah, but it's pretty decent compared to their products from five years ago."

Skylake is a completely new architecture, and compared to Haswell it's on a smaller node too. And there's practically nothing to talk about here. There was a tick and a tock and the hand didn't move.

Maybe I'm being too cynical... but again, if we at least got something that used less power and generated less heat, then we could say that we got something out of 14nm.

People have criticized TSMC for their woes. Intel 14nm is a complete trainwreck and nobody wants to talk about it.

In the history of PC computing, we've had long periods of stagnation like this, with slow adoption rates and high prices due to components being sourced or controlled by a single manufacturer, pretty much since the inception of the PC.
 
In the history of PC computing, we've had long periods of stagnation like this, with slow adoption rates and high prices due to components being sourced or controlled by a single manufacturer, pretty much since the inception of the PC.

I can't believe I'm going to say this (publicly), but if next year AMD actually came to the table with a 6-8 core chip w/ a very small GPU and good performance, power-efficiency, and all the latest chipset features at a reasonable price, they'd have a chance to win some people back. It shouldn't be too hard to catch somebody that's nearly standing still. Intel treats the desktop market similar to how MS treats PC gamers.

edit: I mean... even I would consider buying something from AMD again... which is saying a lot.
 
I can't believe I'm going to say this (publicly), but if next year AMD actually came to the table with a 6-8 core chip w/ a very small GPU and good performance, power-efficiency, and all the latest chipset features at a reasonable price, they'd have a chance to win some people back. It shouldn't be too hard to catch somebody that's nearly standing still. Intel treats the desktop market similar to how MS treats PC gamers.

edit: I mean... even I would consider buying something from AMD again... which is saying a lot.

We can only hope AMD gets back in the fight, but unless they acquire some amazing technology through a purchase like they did before, I just don't see it. They lack the R&D budget to compete with Intel. People talk about AMD's glory days with the Athlon CPUs, but the reality is that they bought technology and hired engineers from two companies to pull that off.

AMD lost just about all the staff responsible for that.
 
SemiAccurate throwing shade…

The Z170 has a few nice features: 10 USB 3.0 ports, 14 USB 2.0, GbE, six SATA ports (presumably 6Gb/s), and support for NVMe. The really interesting part is that there are 20 PCIe 3.0 lanes off the chipset. A good thing, right? Umm, no, it is monumentally stupid. Why? Because the link above connecting the CPU and the Z170 is DMI 3.0, which is four lanes of roughly PCIe 3.0 bandwidth, ~32Gbps.

That means you can hook up 20 8Gbps PCIe 3.0 lanes (160Gbps for the math-averse), plus 36Gbps worth of SATA and 50Gbps of USB 3.0, and it all talks over the DMI 3.0 link. 230Gbps doesn't fit well into 32Gbps, last time I checked, not that we ever expect anyone to come close to this setup. That said, we do expect a buyer of a high-end gaming desktop to have an NVMe drive or two, and, well, you are already over the DMI bandwidth limit there. Plug in a USB 3.0 peripheral and you are choking.

If you have a GPU in the x16 slot off the CPU, then you are pretty much locked out of any kind of storage speed. Those meager four lanes omitted from the CPU for no sane reason would have been a lot more useful than you might expect. If anyone at Intel is wondering why we keep laughing every time we hear about how you really want to revive the desktop, stop and think about the stupid decisions you are making. Cutting PCIe lanes off the CPU and chaining way, way too many ports to the meager bandwidth the chipset has is not a way to win back the enthusiasts you screwed previously. Then again, charging them a 10+% premium to buy back the overclocking features they removed a few generations ago isn't much on the placation side either. This architecture is just stupid and wrong.
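SemiAccurate's arithmetic can be sanity-checked with a quick back-of-the-envelope script. A sketch only: the per-port figures are theoretical maximums taken from the quote above, and the exact total depends on which ports you count (the article quotes ~230Gbps; summing every port listed gives a bit more). Either way it dwarfs the uplink:

```python
# Rough worst-case aggregate bandwidth hanging off the Z170 PCH
# versus the DMI 3.0 uplink (~4 lanes of PCIe 3.0 bandwidth).
# These are theoretical per-port maximums, not realistic simultaneous loads.

pcie3_lane_gbps = 8             # PCIe 3.0: 8 GT/s per lane (ignoring encoding overhead)
chipset_pcie_lanes = 20         # PCIe 3.0 lanes off the Z170
sata_ports, sata_gbps = 6, 6    # SATA 6Gb/s ports
usb3_ports, usb3_gbps = 10, 5   # USB 3.0: 5 Gb/s each

downstream = (chipset_pcie_lanes * pcie3_lane_gbps
              + sata_ports * sata_gbps
              + usb3_ports * usb3_gbps)
dmi3_uplink = 4 * pcie3_lane_gbps   # DMI 3.0 ~ PCIe 3.0 x4, ~32 Gb/s

print(f"downstream aggregate: {downstream} Gb/s")   # 246 Gb/s
print(f"DMI 3.0 uplink:       {dmi3_uplink} Gb/s")  # 32 Gb/s
print(f"oversubscription:     {downstream / dmi3_uplink:.1f}x")
```

The point stands regardless of the exact tally: the chipset's downstream ports are oversubscribed several times over, which only matters if you actually drive many of them at once.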
 
It's easy to delid a soldered useless heat spreader with the right (simple) tools and a tad of skill. No big deal at all.

I just want to know: is the IHS soldered on Skylake? Haven't read one way or the other.
 
I find it interesting that my Sandy-E can hit 5.0GHz with a little coaxing, but these newer chips can't.

I know the IPC makes up for it; while each generation's gain was small, four generations of marginal IPC improvements have added up.
 
It's easy to delid a soldered useless heat spreader with the right (simple) tools and a tad of skill. No big deal at all.

I just want to know: is the IHS soldered on Skylake? Haven't read one way or the other.

I don't know that anyone has been told this by Intel. I asked, but didn't get an answer.
 
Zarathustra[H];1041782822 said:
I find it interesting that my Sandy-E can hit 5.0GHz with a little coaxing, but these newer chips can't.

I know the IPC makes up for it; while each generation's gain was small, four generations of marginal IPC improvements have added up.

Intel stressed this point when they briefed us on Skylake.
 

Wow, he needs a different career, this one is too much for him.
He claims Intel tells everyone nothing, and then claims that the DMI 3.0 interface is no match for what will connect to it, by a 1:9 ratio. That sounds bad.
I will seek other sources for confirmation, he appears to have little information and is pretty pissed because of it.
 
I guess we'll have to wait until IDF for Intel to give out some more interesting details.
 
Wow, he needs a different career, this one is too much for him.
He claims Intel tells everyone nothing, and then claims that the DMI 3.0 interface is no match for what will connect to it, by a 1:9 ratio. That sounds bad.
I will seek other sources for confirmation, he appears to have little information and is pretty pissed because of it.

He's not entirely wrong. DMI 3.0 basically has a PCIe x4 link on the back end, for a maximum throughput of roughly 3940MB/s. A dual M.2 RAID 0 stripe can saturate that, and anything that connects to the PCH has to go through that link. Now, that sounds bad on paper, but we had far less with DMI 2.0, and effectively we are doing the same stuff over DMI 3.0 that we did over 2.0. The real truth is you will rarely saturate these connections fully. You won't be hammering a USB 3.1 port or your SATA ports all the time. It isn't as bad as it sounds.
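That ~3940MB/s figure falls straight out of the PCIe 3.0 link math. A minimal sketch, assuming DMI 3.0 is electrically equivalent to a PCIe 3.0 x4 link with 128b/130b line encoding:

```python
# DMI 3.0 is electrically equivalent to a PCIe 3.0 x4 link.
lanes = 4
gt_per_s = 8.0          # PCIe 3.0 raw signalling rate per lane (GT/s)
encoding = 128 / 130    # 128b/130b line encoding: ~1.5% overhead

usable_gbps = lanes * gt_per_s * encoding   # ~31.5 Gb/s of payload bandwidth
usable_mb_s = usable_gbps * 1000 / 8        # ~3938 MB/s

print(f"DMI 3.0 usable throughput: {usable_mb_s:.0f} MB/s")  # ~3938 MB/s
```

Since each PCIe 3.0 x4 M.2 SSD has the same x4 back end, two fast drives striped in RAID 0 can in principle saturate the uplink on their own, which is the scenario being described.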

The fact of the matter is that Intel wants people doing heavy I/O workloads to run X99. You have far more PCIe lanes that don't go through the PCH on X99. This is intentional. People who don't need it can buy a cheaper product that makes them happy. People who do need it will pay a premium to get it, and Intel knows it.
 
Honestly, anybody who has Sandy Bridge or newer can hold off on upgrading unless they are making an i5-to-i7 transition.
 
Thanks for the link. :D

Np.

I've never delidded, but if it lets me get over 5GHz on water, I'll gladly try. Doesn't seem overly complicated... just hope the MSI Z170 Titanium comes with their delid spacer.
 
Well, I think you have to separate the questions. For some of us it's "Should I upgrade?" That's a decision I have already made, because my old i7-920 motherboard is lacking features I want, but there could be any number of reasons to want or need to upgrade.

The question I am struggling with is "What should I upgrade to?"

I am struggling with the i7-5820K vs. the i7-6700K.

I use my PC for a lot of things, so the extra cores, cache and quad-channel memory would be nice, but the 6700K has a higher clock speed, is easier to overclock, and will have many more options for nice motherboard features.

Both will do fine in most any games I play, and odds are I will be just as happy even when using the few multithreaded apps I use, because it isn't like I sit at my PC waiting for stuff to finish.

I am very much leaning toward Skylake, mainly because of the motherboards. I love my i7-920, but Intel quickly moved away from the LGA 1366 socket for consumers and focused on the next thing, and I feel the same will happen with Haswell-E.

I do not want to be disappointed in my choice, but after reading all the info I don't feel like one or the other is going to make enough difference, so I might as well buy the newest once it is available at MC.
 
Not so.
I am CPU limited in 3 games with a 2500K @ 4.3GHz.
You want to make a bet on that? I bet we can compare benchmarks and game performance, and there will be far more than 3 games where your CPU is holding back the full potential of a 980 Ti. Of course, in most cases it's not even close to a playable limitation, so maybe that's what you're referring to with the 3-games claim.
 
I don't disagree.
I have 3 games that are CPU limited.
 
So is my i7-930 holding back my 980 and 1080p G-Sync monitor?

I ran that Heaven benchmark and it's purring along at 144fps at 640x480 :D
 
Well, I think you have to separate the questions. For some of us it's "Should I upgrade?" That's a decision I have already made, because my old i7-920 motherboard is lacking features I want, but there could be any number of reasons to want or need to upgrade.

The question I am struggling with is "What should I upgrade to?"

I am struggling with the i7-5820K vs. the i7-6700K.

I'm in exactly the same spot, struggling to make the same choice.

Part of me believes that by the time either CPU becomes limiting, the extra cores may make more of a difference than the clock speed.

OTOH, if we get really good procs from Kaby Lake or Cannonlake, we can always sell off our X99 stuff and upgrade.
 
I don't disagree.
I have 3 games that are CPU limited.

CPU limitation in modern titles is relatively rare, at least at the frame rates most people play at. Just about every CPU out there released in the last 5 years can handle most titles at the 60+ fps level. There are a few exceptions, but not too many.

That is, until you add SLI or Crossfire into the mix.

Usually CPU load is independent of resolution; for instance, 60fps at 640x480 loads the CPU identically to 60fps at 4K.

This isn't the case when using SLI or Crossfire. Enabling SLI or Crossfire instantly ups the CPU load, and it only gets worse the higher the resolution you set.

For instance, Red Orchestra 2 - for me - plays MUCH better at 4.8GHz than at 4GHz. A lot of the frame time spikes disappear, and the game plays better.
 
Zarathustra[H];1041783237 said:
CPU limitation in modern titles is relatively rare, at least at the frame rates most people play at. Just about every CPU out there released in the last 5 years can handle most titles at the 60+ fps level. There are a few exceptions, but not too many.

That is, until you add SLI or Crossfire into the mix.

Usually CPU load is independent of resolution; for instance, 60fps at 640x480 loads the CPU identically to 60fps at 4K.

This isn't the case when using SLI or Crossfire. Enabling SLI or Crossfire instantly ups the CPU load, and it only gets worse the higher the resolution you set.

For instance, Red Orchestra 2 - for me - plays MUCH better at 4.8GHz than at 4GHz. A lot of the frame time spikes disappear, and the game plays better.

I am not mistaken.
Watching my CPU use, GPU use and framerate makes it apparent.
 
Witcher 3, GTA V and Project Cars.

Thanks for the response. I was wondering if you'd say GTA V. I'm not sure I can pinpoint where or when, but I also feel CPU bottlenecked in this game, and I like to play it a lot. I run 670 SLI.
 
Thanks for the response. I was wondering if you'd say GTA V. I'm not sure I can pinpoint where or when, but I also feel CPU bottlenecked in this game, and I like to play it a lot. I run 670 SLI.

Use MSI Afterburner to log CPU use on the graph.
You can see when it approaches max and GPU use drops.
 
I am not mistaken.
Watching my CPU use, GPU use and framerate makes it apparent.
CPU usage does not always tell you if you are CPU limited. Most games do not use every bit of a CPU, and in some games you can be CPU limited at 50% usage on a quad core.
 
CPU usage does not always tell you if you are CPU limited. Most games do not use every bit of a CPU, and in some games you can be CPU limited at 50% usage on a quad core.

^This is very correct. A game can be limited by a single thread, and that's ~12% usage on a 4c/8t chip.
 
CPU usage does not always tell you if you are CPU limited. Most games do not use every bit of a CPU, and in some games you can be CPU limited at 50% usage on a quad core.

Easily verified by varying the clock speed of the CPU and memory.
When CPU use gets beyond 90% average and the framerate drops, it is usually due to a CPU bottleneck. A bump in RAM speed sometimes helps.
If you don't think this is the case, explain why.

Please don't give mystical answers.

^This is very correct. A game can be limited by a single thread, and that's ~12% usage on a 4c/8t chip.
When considering a CPU bottleneck, each core is analysed individually.
It only takes one to bottleneck.
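To illustrate the per-core point, here's a toy heuristic over hypothetical per-core usage samples. The 90% threshold and the sample numbers are made up for illustration, and this is not a rigorous diagnostic, just the arithmetic behind the discussion:

```python
# Toy illustration: an aggregate CPU usage meter can hide a single-core
# bottleneck. Samples are hypothetical per-logical-core utilisation (%)
# on a 4c/8t chip.

def looks_core_bound(per_core, threshold=90.0):
    """Flag a likely single-core bottleneck: one logical core pegged
    while overall utilisation stays low. Returns (flag, overall %)."""
    overall = sum(per_core) / len(per_core)
    return max(per_core) >= threshold, overall

# One game thread pinned to a single logical core, seven nearly idle:
sample = [98, 5, 4, 6, 3, 5, 4, 2]
bound, overall = looks_core_bound(sample)
print(bound, f"{overall:.1f}%")  # True 15.9%

# A purely single-threaded load on 8 logical cores shows up as only
# 100 / 8 = 12.5% in an aggregate CPU meter, hence the "12%" above.
```

In practice you'd feed this from a per-core monitor (MSI Afterburner's logging, or a tool like psutil), but the takeaway is the same: look at the busiest core, not the average.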
 
Easily verified by varying the clockspeed of the cpu and memory.
When CPU use gets beyond 90% average and framerate drops, it is usually due to cpu bottleneck. A bump in ram speed sometimes helps..
If you dont think this to be the case, explain why.

Please dont give mystical answers.


When considering a CPU bottleneck, each core is considered individually.
It only takes one to bottleneck.
Um, what? Most of that is nonsense, and you don't really seem to have a grasp on this subject at all. The easiest and best way to check for a CPU bottleneck is simply to lower the resolution and look at the framerate change.
 
Um, what? Most of that is nonsense, and you don't really seem to have a grasp on this subject at all. The easiest and best way to check for a CPU bottleneck is simply to lower the resolution and look at the framerate change.

Look, if you are going to talk bollocks I'll just ignore you.
Lowering the res doesn't confirm the issue; it's part of a diagnosis.

Please explain what I posted that is nonsense.
 