i5-6600k vs i7-6700k - do they achieve similar speeds after overclocking?

vmirjamali

Hi, I'm wondering if these two chips achieve similar speeds after overclocking them. I've seen a few posts on here stating that the i5-6600k has been achieving 4.2-4.4 GHz, which is i7-6700k territory. Is this true, and if so, does this make the 6600k the better chip price-wise at the same frequency after overclock?
 
You need to do better homework.. nope.. even at the exact same frequency the 6700K will still be 30-40% faster in multi-threaded applications and a handful of games, due to Hyper-Threading and the extra cache.. you are comparing 4c/4t vs 4c/8t.. and also remember that the 6700K's default turbo is 4.2 GHz, so even at stock speed vs a 4.4 GHz 6600K, the 6700K will still be faster.
 
The CPUs will have the same single-threaded performance when overclocked. Each seems to be hitting around 4.6 GHz on air:

http://hwbot.org/hardware/processor/core_i7_6700k/

http://hwbot.org/hardware/processor/core_i5_6600k/

These CPUs will have the exact same performance in games that utilize four cores or less.

It's your call whether you want the extra multithreaded performance (20-30% higher). In most games you won't notice the difference (with a single GPU).

I mostly agree, but if you are playing games that barely run at 60 FPS with everything turned on, or you have a monitor that runs above 60 Hz, it can be an advantage to have the extra power. Minimum frame rates should be higher.

I think general usage is much faster with HT, and I can't imagine going back to an i5, but if your PC is 100% a gaming machine, maybe an i5 is all you need.

Tell yourself whatever is needed to justify buying the fastest toy out there!
 
I mostly agree, but if you are playing games that barely run at 60 FPS with everything turned on, or you have a monitor that runs above 60 Hz, it can be an advantage to have the extra power. Minimum frame rates should be higher.

I think general usage is much faster with HT, and I can't imagine going back to an i5, but if your PC is 100% a gaming machine, maybe an i5 is all you need.

Tell yourself whatever is needed to justify buying the fastest toy out there!

Okay, so basically you're lying to yourself to justify your purchase. Real smooth there, high-roller :D

No, you don't get any real-world performance increase with those extra threads. It's all in your head. Unless you're running Cinebench all day, those threads are sitting idle.

Anandtech tested Broadwell Core i5 and Core i7 processors. They came so close in performance that you can't tell the difference in games: both average and minimum frame rates are the same.

http://www.anandtech.com/show/9320/intel-broadwell-review-i7-5775c-i5-5675c/9

Three modern games were tested for minimum framerate:

Grand Theft Auto V - the minimum framerates are so close you wouldn't notice: only a 1.75% difference.

In GRID Autosport the minimum-framerate difference is 1.5 frames per second. Barely a 1% difference.

In Shadow of Mordor, they score the same minimum frame rates, even when running SLI or CFX.
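Since the thread keeps throwing percentages around, here's a quick sketch of the arithmetic in Python. The fps numbers below are illustrative placeholders, not the review's actual data:

```python
def pct_diff(i5_fps: float, i7_fps: float) -> float:
    """Percentage difference of the i7's minimum fps relative to the i5's."""
    return (i7_fps - i5_fps) / i5_fps * 100

# Illustrative values only: a 1.5 fps gap at a ~150 fps minimum
# works out to a 1% difference, in line with the GRID Autosport figure above.
print(round(pct_diff(150.0, 151.5), 2))  # → 1.0
```

A 1-2% gap like this is well inside run-to-run variance, which is the point being made.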

zaniix, the reason you see the huge gap in performance between the 6600k and the 6700k in benchmarks is that Intel has factory-overclocked the 6700k by 500 MHz. Since both CPUs seem to be capable of the same overclocked speeds, that difference goes away.
 
Fair enough. But that only applies if you're using an Nvidia card, in a very specific subset of DX11 games.

It still won't affect the rest of your system, and how it feels in everyday use.
 
Sometimes I wish all I cared about was gaming. Then I could easily get away with buying the cheap CPUs.
 
Anandtech tested Broadwell Core i5 and Core i7 processors. They came so close in performance that you can't tell the difference in games: both average and minimum frame rates are the same.

I thought those benchmarks were very interesting as well. The biggest difference between an i5 and an i7, aside from HT, is usually the cache. The 6700k vs the 6600k is 8MB vs 6MB of L3 cache. The 5775C vs the 5675C is 6MB vs 4MB of L3 cache, but Broadwell also has that massive 128MB L4 cache to fall back on. I wonder to what extent that L4 cache is able to mitigate the difference in L3 cache.

As far as HT goes, I view it more as insurance for the not-so-distant future. DX12 has a lot of potential to increase the number of cores games are able to take advantage of. We might very well find ourselves in a position where 6-core processors (Haswell-E, etc.) benefit greatly and even take the spotlight. If you go Skylake, those 4 extra threads on the i7 will certainly help hedge your bets in that regard.
 
As far as HT goes, I view it more as insurance for the not-so-distant future. DX12 has a lot of potential to increase the number of cores games are able to take advantage of. We might very well find ourselves in a position where 6-core processors (Haswell-E, etc.) benefit greatly and even take the spotlight. If you go Skylake, those 4 extra threads on the i7 will certainly help hedge your bets in that regard.

I don't see a problem with that viewpoint; the extra threads are more worthwhile than they were four years back. At least there are a few popular games out there that can use more threads.

But just be aware that the existing games that can't hit a 45fps minimum (Nvidia GPU) on a stock Core i5 AND benefit from those extra threads on the i7 number fewer than the fingers on one hand. The total sales of 8-thread processors will continue to be low so long as Intel charges a $100-150 premium for them, so game developers will continue to target Core i3 and Core i5 gaming systems as entry-level and high-end.

Just look at how well the i3 4330 does on those benchmarks. Not a bad value gaming CPU! Yeah, it gets destroyed on Crysis 3, sure, but it will play everything else with ease.

So no fear-mongering to encourage sales here. It still doesn't make enough sense from a "value" standpoint, unless you're a hardcore 120Hz gamer. :D
 
The total sales of 8-thread processors will continue to be low so long as Intel charges a $100-150 premium for them, so game developers will continue to target Core i3 and Core i5 gaming systems as entry-level and high-end.

On the flip side, both the Xbox One and the PS4 use 8-core processors. The number of console gamers dwarfs the number of PC gamers many times over. I feel that provides significant incentive for developers to indeed target >4 cores. We've seen a number of console ports already; however, those have all been ports to DX11, with all of its limitations. Yet despite that, games like GTA5 are still amazingly multi-threaded compared to their predecessors. Once console games are finally being ported over to an equivalent low-level API on the PC (such as DX12), I do feel that we will see a large increase in core count utilization, but only time will tell.
 
On the flip side, both the Xbox One and the PS4 use 8-core processors. The number of console gamers dwarfs the number of PC gamers many times over. I feel that provides significant incentive for developers to indeed target >4 cores. We've seen a number of console ports already; however, those have all been ports to DX11, with all of its limitations. Yet despite that, games like GTA5 are still amazingly multi-threaded compared to their predecessors. Once console games are finally being ported over to an equivalent low-level API on the PC (such as DX12), I do feel that we will see a large increase in core count utilization, but only time will tell.

Only six of the eight cores were available to games until recently. Do not forget that when you compare them with PCs.

And again, money talks. You completely stepped around my argument. A game that has six threads will run just fine on a four-thread CPU like a Core i3 or Core i5. Until Intel unveils a 12-thread CPU on the mainstream platform, the 8-thread processors will not be affordable. So they will still target 60fps for the Core i5, end of story.

Speaking of money talks, that's precisely the reason they went with 8 Jaguar cores: people complained about high power draw during gaming and the lack of power management at idle, so they used an off-the-shelf low-power core that had full power management. AMD was the only source for this in 2013 that also offered a powerful integrated GPU.

They had to choose 8 Jaguar cores over 4 Steamroller cores because Steamroller was not ready to ship in 2013, and Steamroller isn't an FPU powerhouse anyway. So if they wanted to launch in 2013, Jaguar was the only way.

It wasn't a preference for more cores. With 2-wide decode and out-of-order execution, Jaguar's per-clock floating-point throughput is significantly higher than Xenon's. But at only half the clock speed, that advantage is fairly small, so the 3 extra cores (6 total available to games until recently) are critical to getting more performance than the 360 (this is why Microsoft is pushing cloud compute for select titles). And it's simply no faster than the Cell in games that could make full use of it (which is why the PS4 went beefy on the GPU, to make it feel like more of an "upgrade").

And how does that compare to your average desktop chip with 4-wide decode running well above 3 GHz? Most analysis puts the performance below Core i3 level (you can estimate it by multiplying the multithreaded benchmark scores of the 4-core Kabini Athlon 5150 by 1.5x, for the 6 cores available to the game). So much for the CPU powerhouse :rolleyes:

http://www.anandtech.com/bench/product/1224?vs=1197

You can pick up that 3.7 GHz monster right now for $110, so the console CPU is not exactly the target performance level when they port a game to the PC. There's so much more there to be had :D
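The back-of-the-envelope estimate above can be written out explicitly in Python. The benchmark score below is a placeholder, not a real result; the premise is that the Athlon 5150 is a 4-core Kabini chip built on the same Jaguar core at a similar clock:

```python
CONSOLE_CORES_FOR_GAMES = 6   # of the 8 Jaguar cores, games could use six
ATHLON_5150_CORES = 4         # 4-core Kabini, same Jaguar core

# Hypothetical multithreaded benchmark score for the Athlon 5150;
# substitute a real score from the AnandTech Bench link above.
athlon_5150_score = 100.0

# Scale linearly by core count to estimate the console CPU's throughput.
console_estimate = athlon_5150_score * (CONSOLE_CORES_FOR_GAMES / ATHLON_5150_CORES)
print(console_estimate)  # → 150.0
```

Linear scaling by core count is optimistic (it ignores shared memory bandwidth and imperfect threading), so treat the result as an upper bound on the console CPU estimate.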
 
Yes, but again money talks. You completely stepped around my argument.

What argument are you referring to exactly? It was never my intention to become another player in a back and forth bickering match, I was just responding with thoughts related to the specific text that I quoted. I never intended to imply that an 8-core AMD chip would compare to a 4-core Intel chip, or imply that they chose an 8-core chip for any specific reason. My point was only that the console developers will be forced to develop games that are as multi-threaded as possible in order to take advantage of the hardware at their disposal. The fact that they are indeed dealing with 8 slower cores instead of 4 faster cores only increases the need for games to be multi-threaded. I have to imagine that will result in multi-threading benefits on the PC also.
 
What argument are you referring to exactly? It was never my intention to become another player in a back and forth bickering match, I was just responding with thoughts related to the specific text that I quoted. I never intended to imply that an 8-core AMD chip would compare to a 4-core Intel chip, or imply that they chose an 8-core chip for any specific reason. My point was only that the console developers will be forced to develop games that are as multi-threaded as possible in order to take advantage of the hardware at their disposal. The fact that they are indeed dealing with 8 slower cores instead of 4 faster cores only increases the need for games to be multi-threaded. I have to imagine that will result in multi-threading benefits on the PC also.

*SIX* THREADS. If you're going to badger on all day about something so unimportant, the least you could do is get right the number of cores that a Bone/PS4 game can use.

I can't see why this matters. If most people don't care about fps above 60, why will they care that the engine is capable of handling more cores?

Until the next round of console hardware (rating games from complex down to less complex), a Core i3 will be more than enough for 40-75fps, the Core i5 will be more than enough for 60-120fps, and the Core i7 will be more than enough for 80-140fps when the game is ported to the PC. This is assuming your console game targets 30 to 60fps and has 6+ threads.

How many people can tell the difference between 60 and 80fps? Most of the world doesn't care about the 30 to 60 you see on most console games. So again it comes down to VALUE - what do I get for that extra $150 that I can actually SEE?

Until we get more capable consoles, there's not much value in a Core i7. Better than there used to be, but still not very high.
 
*SIX* THREADS. If you're going to badger on all day about something so unimportant, the least you could do is get the number of cores right.

I can't see why this matters. If most people don't care about fps above 60, why will they care?

Until the next round of console hardware (rating games from complex down to less complex), a Core i3 will be more than enough for 45-75fps, the Core i5 will be more than enough for 60-120fps, and the Core i7 will be more than enough for 80-140fps. This is assuming your console game targets 30 to 60fps.

How many people can tell the difference between 60 and 80fps? Most of the world doesn't care about the 30 to 60 you see on most console games. So again it comes down to VALUE - what do I get for that extra $150 that I can actually SEE?

I was never actually debating value, or what is "enough". I was just adding a few thoughts and opinions as to why I feel more threads will become useful in the future. Value is often in the eye of the beholder, and gambling on the future is a large part of that - in this case, speculation about the degree to which games will make use of extra cores.
 
I decided not to speculate that those hyperthreaded cores would provide some value in the future, and just ordered a 6600k to upgrade my 2500k. If the 6700K does end up giving more than a 10% performance boost once games start appearing that take advantage of DirectX 12, then I may go ahead and upgrade to the 6700k and sell the 6600k. Not a big deal, and not worth an extra $100 to me at this time. I've never been a big fan of hyperthreading, so until I see some real benchmarks showing me that it offers tangible benefits in my games, I'll stick with the i5.
 
If the system is strictly for gaming, I doubt it'll matter much; then again, if it's strictly for gaming, I wouldn't necessarily upgrade from Sandy.
 
.....
So no fear-mongering to encourage sales here. It still doesn't make enough sense from a "value" standpoint, unless you're a hardcore 120Hz gamer. :D

I'm not a hardcore 120Hz gamer, but I am doing research now ahead of my impending Skylake build. The last time I did a full build was in 2008, with an ABit IP35Pro and a Q6600 CPU. The CPU has since been upgraded to a Q9650, when that became available for what I felt was a reasonable price.

When I do this Skylake build, I'm building with an i7, not an i5. Not because I'm a "high roller" or just don't care about the money, but rather because I'm looking for the longest-term components that I can use as far into the future as possible. In this mindset, I feel another $150 for double the threads and the other i7 features (larger cache, higher stock clocks, etc.) is a no-brainer. I might actually break 10 years of use on this next machine - and yes, the CPU may be upgraded at some point, along with the video card, memory, hard drive, etc.

If I were looking for absolute maximum value and a use-horizon of, say, 12-24 months, I might consider saving the $150 and putting it towards a better video card. For the way I use machines, and the length of time I use them, double the threads + i7 features for 50% more cost on the CPU - even if those threads/features aren't maximally utilized right now - makes sense. But then, I have an ABit IP35Pro I still use daily with over 60,000 power-applied hours (26ish thousand of actual up-time) on it. I use it to game (Diablo 3 at 2560x1440, mainly), run Photoshop, Office apps and web browsers. I don't have much time for writing software any more, but it's also perfectly capable of running Visual Studio or whatever IDE you prefer. As long as I never saw a newer machine compile anything, I might even be happy with how quickly my old ABit compiled something! :D

Bottom line for me: saying more threads aren't useful just because they're not useful right this second is too similar to saying 4 cores is not much different from 2 cores, and we know how that one ends. ;)

Edit: Just read the post right above mine - if strictly for gaming, I wouldn't be upgrading. Need upvote button for that post! :)
 
I bought a 6600K for the higher IPC, good clockability, and because they are in the shops.
It's a lot better than a 2500K for my gaming, 1080p @ 60fps.
It will improve gaming at 120Hz+ a lot too.

If I need a 6700K later, I'll get one.
All that's available for now is photographs.

PS: I'm no longer CPU-bound in any game (with a clocked 980ti), so a 6700K won't help at all atm.
 