Intel Skylake Core i7-6700K IPC & Overclocking Review @ [H]

Look, if you are going to talk bollocks I'll just ignore you.
Lowering the res doesn't confirm the issue; it's part of a diagnosis.

Please explain what I posted that is nonsense.
Talk bollocks? You might want to review your own previous nonsensical comment, as it contained no logic. And yes, lowering the res will confirm whether the CPU is a limitation in that spot. Anyway, I will just unsubscribe, as there is no point in arguing when you don't seem to understand.
 
Talk bollocks? You might want to review your own previous nonsensical comment, as it contained no logic. And yes, lowering the res will confirm whether the CPU is a limitation in that spot. Anyway, I will just unsubscribe, as there is no point in arguing when you don't seem to understand.

Chickening out because your ball-aching didn't win you any brownie points :p
Lowering the res does not confirm a CPU bottleneck on its own.
You haven't answered anything I asked.
What did I post that is nonsense, and why is it nonsense?
Let's test your logic.
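To spell out what I mean by "part of a diagnosis": here is the shape of the test as a rough Python sketch (the fps numbers and the 10% threshold are invented for illustration):

[code]
# Toy sketch of the res-drop test; all numbers are made up.
def res_drop_hint(fps_native, fps_low, threshold=0.10):
    """A hint, not a verdict: fps caps, vsync, etc. still need ruling out."""
    gain = (fps_low - fps_native) / fps_native
    if gain < threshold:
        return "fps barely moved: GPU wasn't the limit, so suspect the CPU (or a cap)"
    return "fps scaled with resolution: GPU-bound in that spot"

print(res_drop_hint(52.0, 54.0))   # -> suspect the CPU (or a cap)
print(res_drop_hint(52.0, 110.0))  # -> GPU-bound
[/code]

Even the CPU branch there is only a suspicion until you rule the other causes out. That is the point.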
 
The real-world gains of Skylake over Haswell are a FRACTION of those of Haswell over Sandy Bridge. I can't help but think that Broadwell could have matched or beaten Skylake on every measure if it weren't for the overhead of the freakishly powerful Iris 6200.

First AMD with the 7970/280x/380x, and now this; it almost feels like the last 3-4 years of computing advancement went into a rest cycle.
 
Zarathustra[H];1041783418 said:
Not saying you were. Those 3 titles might be those rare ones :p. I have no idea; I've never run any of them.

I struck unlucky, by the looks of it :p
 
Zarathustra[H];1041783237 said:
CPU limitation in modern titles is relatively rare, at least at the frame rates most people play at. Just about every CPU out there released in the last 5 years can handle most titles at the 60+ fps level. There are a few exceptions, but not too many.

That is, until you add SLI or Crossfire into the mix.

Usually CPU load is independent of resolution; for instance, 60fps at 640x480 loads the CPU identically to 60fps at 4K.

This isn't the case when using SLI or Crossfire. Enabling SLI or Crossfire instantly ups the CPU load, and it only gets worse the higher you set the resolution.

Man, this is absolutely wrong. How can you say the load at 640x480 is the same as at 4K? Draw distance is directly affected by resolution scaling, so the higher the resolution, the more CPU resources you need to fill all the extra space on screen...
 
I am just going to point out that CPU reviews typically test at low resolution so you can see the effect of the CPU (like the review we all just read about the Skylake CPUs).
 
Man, this is absolutely wrong. How can you say the load at 640x480 is the same as at 4K? Draw distance is directly affected by resolution scaling, so the higher the resolution, the more CPU resources you need to fill all the extra space on screen...

It may increase CPU load slightly, but I have never seen it be significant. It stayed down in the noise until I used Crossfire, and now that I've added SLI.
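To put toy numbers on that (illustrative figures only, and assuming the per-frame CPU cost really is near-constant across resolution on a single GPU):

[code]
# Toy frame-time model, not measurements: a frame ships when the
# slower side finishes. The CPU's per-frame work is treated as fixed;
# only the GPU's cost grows with resolution.
CPU_MS = 8.0                                         # per-frame CPU cost
GPU_MS = {"640x480": 2.0, "1080p": 10.0, "4K": 30.0}

for res, gpu_ms in GPU_MS.items():
    fps = 1000.0 / max(CPU_MS, gpu_ms)
    bound = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{res:>8}: {fps:5.1f} fps, {bound}-bound")

# SLI/CFX breaks the fixed-CPU assumption: the driver's per-frame
# coordination work raises CPU_MS itself.
[/code]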
 
I struck unlucky, by the looks of it :p

So, I should clarify what I said above.

Those titles may be high CPU load titles, but something still doesn't seem right here.

Are you trying to vsync at 120Hz or something?

Are you running SLI/CFX?

I mean, a 2500K at 4.3 SHOULD be within ~10% of a stock 6700K at its base clock.

If these games were so demanding, I feel like I would have heard about it.

Something just doesn't make sense with your experiences. Makes me wonder if something else is wrong with your rig.
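For reference, the napkin math behind that "within ~10%", assuming a ~20% IPC gain from Sandy to Skylake and the 6700K's 4.0GHz base clock (and treating perf as clock times relative IPC, which is itself a rough model):

[code]
# Napkin math only: perf ~ clock * relative IPC.
sandy   = 4.3 * 1.00   # 2500K at 4.3GHz, Sandy IPC as the baseline
skylake = 4.0 * 1.20   # 6700K base clock, assumed ~20% IPC gain
print(f"2500K@4.3 is about {(1 - sandy / skylake) * 100:.0f}% behind")   # ~10%
[/code]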
 
Did I miss where it said how much better it overclocked?

Weird that they delidded and then didn't even test an overclock.

http://www.overclock.net/t/1568357/skylake-delidded ... in this link from earlier they did a before/after test.

Test used:
wPrime 1024M, 5.1GHz, 1.48V

Before: hit 96C and failed within seconds (hard to tell what the load temperature after 2 minutes would have been; probably 105C+)
After: 78C on the warmest core, and it passed

Running a 6700K at 5.1GHz stable sounds pretty damn good, if true for most users. Wish they had some benchmarks at that speed.
 
Weird that they delidded and then didn't even test an overclock.

http://www.overclock.net/t/1568357/skylake-delidded ... in this link from earlier they did a before/after test.

Test used:
wPrime 1024M, 5.1GHz, 1.48V

Before: hit 96C and failed within seconds (hard to tell what the load temperature after 2 minutes would have been; probably 105C+)
After: 78C on the warmest core, and it passed

Running a 6700K at 5.1GHz stable sounds pretty damn good, if true for most users. Wish they had some benchmarks at that speed.

My 4790K, when naked with an EK block, runs 18C cooler than with the useless IHS in place. I've had it up to 5.2GHz, stable as a rock. I will wait and see how the Skylake revs and steppings go. The 5930K is a screamer also, so there really is no reason for me to upgrade. But I do enjoy playing with new junk. :)
 
Skylake is a completely new architecture, and relative to Haswell it's on a smaller node too. And there's practically nothing to talk about here. There was a tick and a tock, and the hand didn't move.

Maybe I'm being too cynical... but again, if we at least got something that used less power and generated less heat, then we could say that we got something out of 14nm.


Isn't Skylake more of a CPU aimed at notebooks? I keep reading of Skylake's "30% battery performance increases", 16-41% HD Graphics performance gains, etc., and it seems Skylake was intended to impact the mobile market more than the desktop. Desktops have kind of maxed out relative to most people's purposes.

Realistically, most people don't really care about overclocking anymore because it barely makes a difference. Take away the bench jockeys' apps and nobody can tell a 4.2 from a 5.0. A few extra frames per second on an already able PC means little outside niche forums. If people are driving Corvettes in Manhattan, it matters little if they upgrade to a Bugatti Veyron, because they aren't going anywhere with the extra performance anyway. Roads, laws, and traffic just don't allow performance above a certain level.

Desktop CPUs are on the edge of the envelope. No reason Intel or anyone should care about aiming to make a 6.0GHz CPU that can be OCed to 7.0. Having more cores makes more sense, but apps etc. don't drive much of a need for that either yet. Desktop hardware is kind of dead in the water, and the usual hype, like the silly militarized marketing of power supplies, can't hide the fact that there's barely a reason to upgrade much these days. Barely a reason to OC either.


Intel's Skylake expected to bring stamina and speed to your laptop
http://www.techradar.com/news/compu...ring-stamina-and-speed-to-your-laptop-1300173
 
Weird that they delidded and then didn't even test an overclock.

http://www.overclock.net/t/1568357/skylake-delidded ... in this link from earlier they did a before/after test.

Test used:
wPrime 1024M, 5.1GHz, 1.48V

Before: hit 96C and failed within seconds (hard to tell what the load temperature after 2 minutes would have been; probably 105C+)
After: 78C on the warmest core, and it passed

Running a 6700K at 5.1GHz stable sounds pretty damn good, if true for most users. Wish they had some benchmarks at that speed.

Was it stable? Sounds like it died after 2 minutes, so it's probably not gonna hit 5. Still, it'd be nice if Intel used better TIM. I get why they're not using solder, but there's really no reason not to use better TIM. Alternatively, they could just ship a loose IHS, let you use your own TIM, and then put the IHS over the CPU.

If it's for enthusiasts, then there's really no reason not to do one of those.
 
Isn't Skylake more of a CPU aimed at notebooks? I keep reading of Skylake's "30% battery performance increases", 16-41% HD Graphics performance gains, etc., and it seems Skylake was intended to impact the mobile market more than the desktop. Desktops have kind of maxed out relative to most people's purposes.

Just looking at the i7-6700K, it does not appear to be more power-efficient than Haswell. It has a slightly higher TDP, and the power charts I saw show Skylake drawing more or about the same. Maybe that'll be different for their notebook CPUs.

I do agree about the performance situation, though. You can only get so much. If it ran cooler and quieter and came smaller than the previous gen, it would at least be considered an improvement, IMO.
 
Things like VR and AR are going to be very CPU hungry.

But you're right. The reason we don't get more performance on the desktop is that 90% of desktops don't need more performance.

Another reason is that datacenters are limited by power and cooling, not by the number of sockets they can fit in a rack.

If either of those changes and there is a compelling need for more power in a single socket, I'm sure the brilliant minds at Intel can make it happen. But right now they are just solving other issues: trying to get x86 into cell phones and tablets, trying to keep ARM out of laptops.

Mobile computing is the future of computing. Intel knows it and is working hard to make sure it stays relevant.
 
Zarathustra[H];1041783553 said:
So, I should clarify what I said above.

Those titles may be high CPU load titles, but something still doesn't seem right here.

Are you trying to vsync at 120Hz or something?

Are you running SLI/CFX?
Seems fine to me; nothing else is out of order.
I'm using a single 1080p 60Hz display with a single 980 Ti.

I mean, a 2500K at 4.3 SHOULD be within ~10% of a stock 6700K at its base clock.

If these games were so demanding, I feel like I would have heard about it.

Something just doesn't make sense with your experiences. Makes me wonder if something else is wrong with your rig.
I will confirm the difference; there is a reported 20%+ IPC difference, and I will be clocking it higher.
If I can hit 5GHz (I am on water), that will net me another 16%.
And yep, I have got a crap 2500K :p
But it has worked well for over 4.5 years, initially at 4.6GHz, though that slowly crept down.
It has no faults other than needing a lot of vcore to clock.

Btw, I'm not saying I need that much extra CPU performance; I reckon about 10% extra will do me fine with the current crop of games.
If my 2500K were at 4.7GHz, I should be OK hitting 60fps constant (i.e. minimum fps). Unfortunate, but then again, I get to upgrade :)
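For the curious, the arithmetic behind those figures, assuming CPU-bound fps scales roughly linearly with clock (optimistic) and using an invented 55fps minimum as the starting point:

[code]
# Rough arithmetic only; linear scaling with clock is an assumption.
ipc_gain   = 1.20        # the reported 20%+ Sandy -> Skylake IPC gap
clock_gain = 5.0 / 4.3   # the extra ~16% if 5GHz pans out on water
print(f"clock alone: +{(clock_gain - 1) * 100:.0f}%")
print(f"combined:    +{(ipc_gain * clock_gain - 1) * 100:.0f}%")

# The shape of the '4.7GHz for a 60fps minimum' calc: if minimums
# were 55fps at 4.3GHz in a CPU-bound spot (55 is an invented figure):
print(f"clock needed for 60fps: {4.3 * 60 / 55:.1f}GHz")
[/code]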
 
Not really sure why this is such a good upgrade path for 2500K/2600K users. The improvements seem very minor.
Not going to upgrade because of the new chipset either. We all know how Intel can be with chipsets.
 
Not really sure why this is such a good upgrade path for 2500K/2600K users. The improvements seem very minor.
Not going to upgrade because of the new chipset either. We all know how Intel can be with chipsets.

Yep, it is minor if you can overclock beyond 4.6GHz on SB.
Even below that, you need a fast card to get framerates high enough to hug 60fps or higher.
Higher fps uses more CPU, and some of today's games push the CPU very hard.
You can reduce settings to alleviate the issue without paying anything.
It might not even matter to you if fps drops to, say, 50 at times.

If you want a minimum of 60fps at max settings in everything, upgrading architecture is an option.
Or if you are bored and want some new hardware.
Both my criteria :)
 
OTOH, maybe a FreeSync/G-Sync monitor is the way to spend the money for smooth gaming.

No more 60fps minimum requirement.
 
Having had G-Sync monitors before, I can tell you it is still a requirement.

I can remember when 30fps was all people wanted (back in the WC days). Our movies are at 24fps and TV around 30... that said, those don't do well if you pan quickly to the left or right (which some directors seem to forget).
 
So there's still not a monumental increase in speeds. My 2600K is at 4.5GHz, and upgrading won't double my performance. I wonder how much of the difference is the CPU and how much is the RAM speed. Doesn't really matter, I suppose, as the encoding benchmarks would have shown it if there were any.
 
So there's still not a monumental increase in speeds. My 2600K is at 4.5GHz, and upgrading won't double my performance. I wonder how much of the difference is the CPU and how much is the RAM speed. Doesn't really matter, I suppose, as the encoding benchmarks would have shown it if there were any.

I'm disappointed as well. Last month my 2600K@5GHz had to retire after my custom loop leaked and destroyed the board. Picked up a 5960X and was surprised to see my min fps in BF4 jump dramatically.
 
I can remember when 30fps was all people wanted (back in the WC days). Our movies are at 24fps and TV around 30... that said, those don't do well if you pan quickly to the left or right (which some directors seem to forget).

Since Castle Wolfenstein and the birth of the first-person shooter, I've always remembered 30 as the accepted minimum and 60 as the desired minimum.
 
I can remember when 30fps was all people wanted (back in the WC days). Our movies are at 24fps and TV around 30... that said, those don't do well if you pan quickly to the left or right (which some directors seem to forget).
TV is 60; interlaced or not, it's still 60 temporal frames.

I've never ever, ever been happy with 30fps.
ever
 
I can remember when 30fps was all people wanted (back in the WC days). Our movies are at 24fps and TV around 30... that said, those don't do well if you pan quickly to the left or right (which some directors seem to forget).

I find that 30fps LOOKS fine, but the mouse response/feel is not very good.

I'm OK with 30fps in single-player FPS or strategy games, but in multiplayer FPS games I demand that the frame rate never dip below a flatline of 60fps, preferably with all the eye candy turned on, because - well - that's the way the developers intended it to look :p
 
Zarathustra[H];1041784417 said:
I find that 30fps LOOKS fine, but the mouse response/feel is not very good.

I'm OK with 30fps in single-player FPS or strategy games, but in multiplayer FPS games I demand that the frame rate never dip below a flatline of 60fps, preferably with all the eye candy turned on, because - well - that's the way the developers intended it to look :p

Same. Can't take 30 FPS. An occasional dip, sure.
 
I really have to laugh at folks with old i7s that are still questioning an upgrade. Just admit you're either a tightass or that slow is cool. There are these comical holdouts that want to be part of a wave but refuse to pay and play. If an old Ivy or Sandy is still throwing a rooster tail for you, then why are you here?

I have fun Z97 and X99 machines. But what if Skylake turns into the next Celeron 300A or 2.4C? They do a rev and a stepping, right? I may own 30 of them and bin them out.

I stayed on a Q6600 for years at 4.8GHz. It took the 4790K to make me hit the buy button. Then I saw the speed. Then went to X99. Thus I'm always looking for the next fun thing.

There is no sane reason for me to upgrade. But if there is a cool toy, I might buy.
 
I really have to laugh at folks with old i7s that are still questioning an upgrade. Just admit you're either a tightass or that slow is cool. There are these comical holdouts that want to be part of a wave but refuse to pay and play. If an old Ivy or Sandy is still throwing a rooster tail for you, then why are you here?

I have fun Z97 and X99 machines. But what if Skylake turns into the next Celeron 300A or 2.4C? They do a rev and a stepping, right? I may own 30 of them and bin them out.

I stayed on a Q6600 for years at 4.8GHz. It took the 4790K to make me hit the buy button. Then I saw the speed. Then went to X99. Thus I'm always looking for the next fun thing.

There is no sane reason for me to upgrade. But if there is a cool toy, I might buy.

Nah, their point is valid.

Max-overclock-to-max-overclock performance has only gone up 10-15% in the four generations since Sandy.

In the golden age of our shared hobby, around 2000-2005, we'd see double or even triple that improvement in a single generation.

And very few modern titles or applications even push the meager improvements we get generation over generation.

Back then, if you didn't upgrade your CPU for a year, you'd be seriously behind. Today? Four years later, my CPU is still only 10-15% behind a max-overclocked 6700K, and it doesn't even matter.

There simply isn't much of a reason to upgrade anymore.

Or - rather - the raw performance reason to upgrade is gone, so instead they have replaced it with other "features" like M.2 slots (and the ability to boot from them) and DDR4 RAM.

When I finally upgrade the 3930K, it will likely be because the motherboard is falling apart or missing some feature I need, not because I need the performance.

What a stark contrast to back in the day, when you never had to worry about degrading your CPU by overvolting it, because it would be obsolete before it died anyway...
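To put the stagnation in rough numbers (the percentages are guesses for illustration, not benchmarks):

[code]
# Rough compounding, not measured data.
total_since_sandy = 1.15                 # ~10-15% over four generations
per_gen_now = total_since_sandy ** 0.25
print(f"per generation since Sandy: +{(per_gen_now - 1) * 100:.1f}%")

per_gen_then = 1.30                      # assumed 2000-2005 pace
print(f"four generations at that pace: +{(per_gen_then ** 4 - 1) * 100:.0f}%")
[/code]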
 
I really have to laugh at folks with old i7s that are still questioning an upgrade. Just admit you're either a tightass or that slow is cool.

I have an i7-870. I hardly ever push it these days. It's hardly slow. I probably wouldn't even notice an upgrade to Skylake. So what would be the point, other than to make Intel richer and me poorer?

Intel hauls in like a trillion dollars every two minutes, so they can bite me. I'm not going to be a part of that unless I'm getting something worthwhile in return, since I have some self-control and don't jump all over these things because of a little bit of glitzy marketing.

Intel has given up on the desktop market anyway. All they care about now is selling laptops and notebooks, and their high-margin enterprise business.
 
I really have to laugh at folks with old i7s that are still questioning an upgrade. Just admit you're either a tightass or that slow is cool. There are these comical holdouts that want to be part of a wave but refuse to pay and play. If an old Ivy or Sandy is still throwing a rooster tail for you, then why are you here?

I have fun Z97 and X99 machines. But what if Skylake turns into the next Celeron 300A or 2.4C? They do a rev and a stepping, right? I may own 30 of them and bin them out.

I stayed on a Q6600 for years at 4.8GHz. It took the 4790K to make me hit the buy button. Then I saw the speed. Then went to X99. Thus I'm always looking for the next fun thing.

There is no sane reason for me to upgrade. But if there is a cool toy, I might buy.

If it doesn't double my speed, then it's not worth an upgrade. Taking a 10-minute task to 7 minutes just isn't that big of a deal to me. Unless it can change a task that takes longer than overnight into something that takes an hour, it's not going to change my workflow. lol, 30% faster just isn't enough to fuck with.
 
I really have to laugh at folks with old i7s that are still questioning an upgrade. Just admit you're either a tightass or that slow is cool. There are these comical holdouts that want to be part of a wave but refuse to pay and play. If an old Ivy or Sandy is still throwing a rooster tail for you, then why are you here?

I have fun Z97 and X99 machines. But what if Skylake turns into the next Celeron 300A or 2.4C? They do a rev and a stepping, right? I may own 30 of them and bin them out.

I stayed on a Q6600 for years at 4.8GHz. It took the 4790K to make me hit the buy button. Then I saw the speed. Then went to X99. Thus I'm always looking for the next fun thing.

There is no sane reason for me to upgrade. But if there is a cool toy, I might buy.

Contradict yourself much?

The original i7 was a huge jump over the Q6600; then, when Sandy Bridge added IPC improvements and even more clock, you still didn't jump.

It took you another tock-plus (Ivy Bridge) and then finally a tick before you jumped!!!

The knee of this curve really seems to be Sandy Bridge. The jump from Nehalem (top clocks ~3.8-4.2GHz) to Sandy (~4.4-4.8GHz) was a good chunk, plus the IPC. The last real get-excited-for-it launch. Even then, if you had a Nehalem, that wasn't quite enough to justify the jump.

People like to feel like their upgrade means something. For every benchmark you can find that shows 20, 30, or even 40% jumps to Skylake, I can find another that's basically a push. If you're basically a gamer, it's hard to look at most of the benchmarks, which show at best 3-5% bumps, and get excited. Especially when upgrading your graphics card will get you a much bigger bang for your buck. At some point we'll all need to upgrade. Maybe this is the tipping point, maybe not.
 
I have an i7-870. I hardly ever push it these days. It's hardly slow. I probably wouldn't even notice an upgrade to Skylake. So what would be the point, other than to make Intel richer and me poorer?

Intel hauls in like a trillion dollars every two minutes, so they can bite me. I'm not going to be a part of that unless I'm getting something worthwhile in return, since I have some self-control and don't jump all over these things because of a little bit of glitzy marketing.

Intel has given up on the desktop market anyway. All they care about now is selling laptops and notebooks, and their high-margin enterprise business.

You couldn't be more wrong =D. Going from your 870 (even if it's highly OC'd) to a stock SB 2600K/2700K will be a big and strongly noticeable upgrade, especially if you compare directly to a 4.5GHz+ SB 2600K. Your chip isn't even Socket 1366, so it doesn't even have the advantage of triple-channel memory.

If you say you wouldn't notice a difference, it's because you have never used anything newer than Nehalem. That chip is old, and it's actually slow to the point that it will be a bottleneck in a lot of games. It's even worse if you compare it to high-end X79 chips, so let's not even talk about Haswell X99 here.
 
I'm curious whether Intel Hyper-Threading will increase DX12 gaming performance yet (i.e. whether it will increase it more than 1%). I'm guessing that if the focus is on multithreading draw calls, then you would need an SLI/Crossfire setup to match.
 
I'm curious whether Intel Hyper-Threading will increase DX12 gaming performance yet (i.e. whether it will increase it more than 1%). I'm guessing that if the focus is on multithreading draw calls, then you would need an SLI/Crossfire setup to match.

No. DX12 does some cool things, but it's not a free lunch.

Also, it's more about enabling developers to do new things with 10x the draw calls. Since current games already fit in the CPU budget, there will be no immediate jump in perf.
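The structural idea, as a hand-wavy sketch: this is not real D3D12 code, and Python threads don't actually run CPU work in parallel, so only the shape matters here:

[code]
# Conceptual sketch only -- not real D3D12 code. DX12's pitch is that
# each thread records its own command list, so draw submission stops
# being single-threaded. It only helps if draw calls were the limit.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draws):
    # stand-in for per-thread command-list recording
    return [("draw", d) for d in draws]

draw_calls = list(range(10_000))
chunks = [draw_calls[i::4] for i in range(4)]   # one chunk per thread

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_command_list, chunks))

# one cheap submit at the end, in the spirit of ExecuteCommandLists
print(sum(len(cl) for cl in command_lists), "draws recorded on 4 threads")
[/code]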
 
No. DX12 does some cool things, but it's not a free lunch.

Also, it's more about enabling developers to do new things with 10x the draw calls. Since current games already fit in the CPU budget, there will be no immediate jump in perf.

I actually disabled Hyper-Threading on mine recently.

I was having some frame-time spikes in SLI, and I wanted to check if maybe it had anything to do with threads accidentally winding up on the same physical core.

It didn't seem to make much of a difference :p
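For anyone who wants to test the same theory without a trip to the BIOS, here is a rough Linux-only sketch; it assumes the usual sysfs topology files, needs psutil, and I have not tried a Windows equivalent:

[code]
# Rough Linux-only sketch, assuming the usual sysfs topology layout.
# Pins the current process to one logical CPU per physical core,
# which approximates disabling HT, for this process, without a reboot.
import glob

import psutil

keep = set()
for path in glob.glob(
        "/sys/devices/system/cpu/cpu*/topology/thread_siblings_list"):
    with open(path) as f:
        # the file looks like "0,4" or "0-1"; keep the first sibling
        first = f.read().strip().replace("-", ",").split(",")[0]
        keep.add(int(first))

psutil.Process().cpu_affinity(sorted(keep))
print("pinned to logical CPUs:", sorted(keep))
[/code]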
 