Skylake vs. overclocked 2500K reviews?

quiktake

So, I have been reading Skylake reviews for a week and still have not seen a review with an overclocked 2500k included for comparison. I have seen a lot of conjecture and math trying to estimate a comparison, but I think a real, gamer-oriented overclock comparison of the non-hyperthreaded Intel CPUs from the last 5 years is warranted.

Are there any reviews with a 2500k overclocked out there? I assume it's just a time constraint issue for the reviewers.
 
Kyle's review is fantastic, but it makes me sad. There is very little compelling reason to upgrade from a Sandy Bridge system, and some of my friends are still on Nehalem systems they don't care to bother upgrading.

I really wanted Skylake to be a 20%+ IPC bump over Haswell. I actually bought my 2500k and a cheap motherboard (which I have since replaced with a P8Z68-V PRO) as a quick upgrade for my existing computer (Q6600) 4 years ago, with the idea that I'd do a full build later on when something better came out. Other than picking up some video cards (which I actually bought for Bitcoin mining), I haven't done anything with the system other than a mild (for Sandy Bridge) overclock.

Now all my hopes are riding on AMD to actually pull off something that's at least competitive with... I guess Kaby Lake? Is that what Intel will be selling by the time Zen is released? I'm not that convinced, but I'm willing to buy one even if it's just competitive with Intel's best. I really miss upgrading, and yes, I know that's sad. At this point I may just have to buy a new CPU/mobo/RAM for boot-from-PCIe-SSD support, and that's just difficult to swallow.
 
Almost no point upgrading from OC'd Sandy.

Every time I read a Skylake review, that "What a piece of junk!" line from Star Wars comes to mind, from when Luke first sees the Millennium Falcon.

It's just depressing how little innovation Intel has delivered in the past 8 or so years. We have what, onboard GPUs? Slightly smaller/faster chips that have basically hit a seemingly impenetrable CPU wall?

I hope something like HP's memristors comes along and eventually dumps them on their head in the server space, and that ARM continues to trounce them in mobile. Maybe we'll eventually see something on the desktop other than this broken x86 monopoly.
 
Do you guys think there will be any advantage to having an onboard GPU in the near future?

Thinking of making the jump from my old trusty i7-860 :D
 
Do you guys think there will be any advantage to having an onboard GPU in the near future?

Thinking of making the jump from my old trusty i7-860 :D

No, I don't see any near-term benefit from onboard GPUs. Also, I bet that if there is a benefit, you'll have to buy a new one to get it anyway.

But, if you're on a Nehalem, the performance increase from buying and overclocking a 6700k might be worth it. If it were me I'd ask myself, "Am I having performance issues based on my CPU in anything I run on a regular basis?" and go from there.
 
If you are asking "Do I need to upgrade my CPU?", ask yourself these questions (a rough sketch of the same logic follows the list):

1. Do I have an SSD? If no, buy an SSD. If yes, continue on.
2. Do I have more than 4GB of RAM? If no, upgrade the RAM. If yes, continue on.
3. Does my CPU usage go above 90% for the stuff that I do on it? If no, save the money for something else. If yes, upgrade the CPU.
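For what it's worth, here is the same checklist as a minimal Python sketch. It assumes Python 3 with the third-party psutil package installed; the have_ssd flag is a hypothetical stand-in, since there's no portable one-liner to detect an SSD.

```python
# Minimal sketch of the upgrade checklist, assuming psutil is installed
# (pip install psutil). The have_ssd flag is a hypothetical stand-in --
# detecting SSD vs. HDD portably is not a one-liner.
import psutil

def upgrade_advice(have_ssd: bool) -> str:
    # Step 1: storage first.
    if not have_ssd:
        return "Buy an SSD before touching the CPU."

    # Step 2: more than 4 GB of RAM?
    total_ram_gb = psutil.virtual_memory().total / 2**30
    if total_ram_gb <= 4:
        return "Upgrade the RAM before touching the CPU."

    # Step 3: sample overall CPU usage for a few seconds while your
    # usual workload is running; sustained >90% suggests a CPU bottleneck.
    usage = psutil.cpu_percent(interval=5)
    if usage > 90:
        return f"CPU at {usage:.0f}% -- a CPU upgrade may actually help."
    return f"CPU at {usage:.0f}% -- save the money for something else."

print(upgrade_advice(have_ssd=True))
```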

If I hadn't killed my i7-920 back in the day by spilling water over the motherboard (don't ask), I would still be running it.
 
3. Does my CPU usage go above 90% for the stuff that I do on it? If no, save the money for something else. If yes, upgrade the CPU.

That hasn't necessarily been relevant since the single-core era. A poorly multi-threaded game or app might only make use of one or two cores, and two cores fully utilized would only show up as 50% usage on a quad-core CPU (assuming no hyper-threading). Upgrading to a new CPU would still increase performance in those games and apps, so if you spend a significant amount of time playing/using them, a CPU upgrade could easily be justified. Unfortunately, there are more games that are poorly multi-threaded than properly multi-threaded; otherwise everyone would be going to Haswell-E and Skylake would be the new Celeron.
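To make the arithmetic concrete, here's a minimal sketch (again assuming Python 3 with psutil installed) showing how the overall CPU percentage can hide two saturated cores on a quad-core:

```python
# Per-core vs. overall CPU usage. On a quad-core with no HT, two cores
# pegged at 100% while the other two idle averages out to ~50% overall,
# which looks fine but is still a CPU bottleneck for that app.
import psutil

per_core = psutil.cpu_percent(interval=1, percpu=True)  # one sample per logical CPU
overall = sum(per_core) / len(per_core)

print("per-core usage:", per_core)
print(f"overall usage: {overall:.0f}%")

if max(per_core) > 90 and overall < 60:
    print("Some cores are saturated despite modest overall usage -- "
          "a poorly threaded app may be CPU-bound here.")
```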
 
I loved Kyle's review, but I don't think anyone here would game at 1024x768 or 640x480 at low settings. I don't think that is a realistic way to compare chips. It's an artificial, CPU-limited scenario, but it does show a hypothetical (best-case) 25% increase in FPS over an overclocked 2500k.

Are there any more "natural" comparisons out there? Maybe I am just being too picky. It just seems a lot of advice in the forums says a 25% speed boost is worth it. I think looking at FPS minimums instead of just averages would be useful.
 
All the gaming benchmarks on Skylake at HardOCP were done at super-low resolution settings to maximize the CPU influence and minimize the GPU influence on the results.

While the numbers paint a picture of performance slightly increasing year over year with the different architectures, at resolutions people actually play at it really doesn't matter at all whether you have a 2500k or a 6700k. This makes Skylake pretty much worthless for PC gamers if you ask me (if you already have a SB or newer system). The only thing to really look forward to now is Zen.

Now, a really budget Skylake-H Pentium or Celeron in a sub-$200 notebook still remains pretty interesting, for the IGP alone.
 
Two reviews with an overclocked 2500k vs. Broadwell/Skylake.
I just wish there were some good overclocked 5820k tests.

PCLab.pl: Core i5-6600K and Core i7-6700K - Intel Skylake (LGA1151) tested in Windows 10. Is Skylake a worthy successor to Sandy Bridge?
http://translate.googleusercontent....6.html&usg=ALkJrhj0alS52oor9_1qlHHzwwoFrn2_1g


PurePC
https://translate.google.com.au/tra...ake_mocne_cztery_rdzenie?page=0,33&edit-text=

Here is another interesting review, but unfortunately they don't provide any test-system info, and there is some question about their accuracy, given some stutters seen on the 4790k system that others claim shouldn't happen.
EUROGAMER
Core i7 6700K @ 4.4GHz vs. Core i7 4790K @ 4.4GHz vs. Core i7 3770K @ 4.4GHz vs. Core i7 2600K @ 4.4GHz
http://www.eurogamer.net/articles/digitalfoundry-2015-intel-skylake-core-i7-6700k-review
 
I can only assume (based on their test platform page) that the graph you posted is CPU vs. CPU with integrated graphics....

An overclocked 2500k vs. Skylake will not show that large a gap in gaming. A small gap when overclocked equally? Sure, but not 33%.
 
They are using a 980 Ti Strix at PCLab;
PurePC is using a 980 G1.

In GPU-limited games the difference will be lower.
For their CPU benchmarks they chose CPU-limited sections of CPU-heavy games, unlike many other reviews that do the opposite and use on-rails sections of gameplay that can often be much less CPU-intensive.
Some of AnandTech's CPU benchmarks, for example, show a Pentium being faster than a 4790k... that's how badly GPU-bottlenecked they are.
 
I can only assume (based on their test platform page) that the graph you posted is CPU vs. CPU with integrated graphics....

An overclocked 2500k vs. Skylake will not show that large a gap in gaming. A small gap when overclocked equally? Sure, but not 33%.

Actually, you can check the performance in GTA V, The Witcher 3, and Watch Dogs here: http://pclab.pl/art65154-29.html

Crysis 3 and Far Cry 4 here: http://pclab.pl/art65154-28.html

As you can see, yes, the differences can be BIG, and that's what most people just refuse to see... and BTW, all the tests were done with a 980 Ti Strix.
 
*looks at overclocked 2500K system*
*looks at disappointing OC'd 6600K benchmarks*
*closes browser tabs looking at new motherboards and DDR4*
 
Not against everyone else's findings, just against lazy review sites that are running GPU-bottlenecked tests, which unfortunately is most English-language sites.

Same thing with DDR3 RAM performance reviews: a lot of them show no difference due to a GPU bottleneck, when there is a 7-15% gain in CPU-limited games.

All most English-language reviews are showing you is that in a lot of games an older CPU is fast enough. The trouble is, sometimes that's only true if you play the sections of the game they tested. BF4, for example: a single-player mission's train ride or jet flight, all on rails, is not even close to representative of the CPU load in multiplayer gameplay.
 
Everyone's assumptions also leave out the difference between PCIe 2.0 on SB and 3.0 on newer systems. PCIe scaling has not been retested on the newest games with the 980 or 980 Ti that I can find.
 
Two reviews with an overclocked 2500k vs. Broadwell/Skylake.
I just wish there were some good overclocked 5820k tests.

Not sure I trust the numbers on the Polish site any more than anywhere else. They have the 2500k being bested by the 2600k by almost 11% in BF4, etc.
 
I loved Kyle's review, but I don't think anyone here would game at 1024x768 or 640x480 at low settings. I don't think that is a realistic way to compare chips. It's an artificial, CPU-limited scenario, but it does show a hypothetical (best-case) 25% increase in FPS over an overclocked 2500k.

Are there any more "natural" comparisons out there? Maybe I am just being too picky. It just seems a lot of advice in the forums says a 25% speed boost is worth it. I think looking at FPS minimums instead of just averages would be useful.

Tests at resolutions like that are purely for the purpose of isolating the CPU. It isn't meant to be a real-world gameplay test.
 
Actually it has been.

http://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/21.html

That's the exact article that spurred me to switch my X79 Dark (running a 3930K) from PCIe 2.0 over to PCIe 3.0. It might not be officially supported, but it does work (I did have to disable the RAM XMP profile and set speed/timings manually).

Going from PCIe 2.0 x8 to PCIe 3.0 x8 or x16, I saw an increase of 3.5 FPS in the Valley benchmark at the 1080p Extreme preset. That's 100 FPS on the nose to 103.5, a 3.5% gain. Granted, it isn't a big difference, but it was free! Also, it was right on the money on the % difference.
 
I'm really getting the itch to upgrade, but it looks like I'll be keeping my 2500K at 4.8GHz.
 
I'm really getting the itch to upgrade, but it looks like I'll be keeping my 2500K at 4.8GHz.

If anything, I'd get a G-Sync monitor - as long as you stay above 30 FPS it will be beautiful! Now if only there were plans to equip a large TV with G-Sync, then I'd be set.
 
I loved Kyle's review, but I don't think anyone here would game at 1024x768 or 640x480 at low settings. I don't think that is a realistic way to compare chips. It's an artificial, CPU-limited scenario, but it does show a hypothetical (best-case) 25% increase in FPS over an overclocked 2500k.

Are there any more "natural" comparisons out there? Maybe I am just being too picky. It just seems a lot of advice in the forums says a 25% speed boost is worth it. I think looking at FPS minimums instead of just averages would be useful.

That's the main reason why I have loved everything at [H]OCP for the past 9 years except the gaming CPU reviews...

They have the famous Real World Gameplay CPU scaling piece, and that was the last review I really enjoyed, but it's actually 6+ years old now... so ancient it's useless at this point.
 
Actually it has been.

Dan, I am aware of that article, and it is almost a year old. I guess I mean specifically testing on The Witcher 3/GTA V, which seem to be the most demanding games yet, and using the 980 Ti.

That article does show a measurable difference in Ryse when using a 980. For SLI users - who would be limited to x8 on either chipset - there is a 19% difference between 3.0 x8 and 2.0 x8 at 1920.

If the Ti is more powerful, and the newer games are more demanding, the gap could grow (or be nonexistent).
 
Dan, I am aware of that article, and it is almost a year old. I guess I mean specifically testing on The Witcher 3/GTA V, which seem to be the most demanding games yet, and using the 980 Ti.

That article does show a measurable difference in Ryse when using a 980. For SLI users - who would be limited to x8 on either chipset - there is a 19% difference between 3.0 x8 and 2.0 x8 at 1920.

If the Ti is more powerful, and the newer games are more demanding, the gap could grow (or be nonexistent).

http://www.guru3d.com/articles-pages/pci-express-scaling-game-performance-analysis-review,1.html

You can see the gap starting here, and like you said, it will continue to grow. PCIe 3.0, dual x16, extra CPU cores, and fast RAM are all things everyone likes to say make little difference, but it adds up. It may not be a huge difference, but when I'm paying through the nose for my GPU and trying to milk every frame out of GTA V at 4K, I don't want to be bottlenecked in any shape or form.
 
Yeah, that's a much better-looking test. Something is seriously up with that Polish site. Either the testing methodology is bad (a 2600k should not be beating a 2500k by almost 12% in the same game) or the numbers are just flat-out made up.

Why not?
HT can affect average FPS by about that much if the game can make use of 8 threads. The trouble with HT is that gameplay is often no smoother despite the FPS increase.
 
Why not?
HT can affect average FPS by about that much if the game can make use of 8 threads. The trouble with HT is that gameplay is often no smoother despite the FPS increase.

HT doesn't really double the cores; it allows a single core to do work on a second thread simultaneously if that thread is asking to use a part of the core that the first thread isn't utilizing. It doesn't help in games most of the time, and it can actually hurt performance if the OS isn't scheduling the cores well. If HT alone boosted performance that much, almost no one here would be buying i5s for their rigs. That being said, it does seem to help in certain cases and hurt in others. There is a lot of conflicting information out there regarding HT, as it really isn't that well understood by us enthusiasts.

Here is a thread comparing the performance in BF4 on an Ivy Bridge i7 with HT vs. no HT:

http://www.overclock.net/t/1480050/...-off-win8-1-vs-win7-new-nvidia-337-50-drivers
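As an aside, you can at least see the logical-vs-physical split HT creates from software. A minimal sketch, again assuming Python 3 with psutil installed:

```python
# Compare physical core count against logical CPU count; with HT (SMT)
# enabled, the OS sees twice as many logical CPUs as physical cores.
import psutil

physical = psutil.cpu_count(logical=False)  # may return None on some platforms
logical = psutil.cpu_count(logical=True)

print(f"physical cores: {physical}, logical CPUs: {logical}")
if physical and logical and logical > physical:
    print("Hyper-Threading (SMT) appears to be enabled.")
else:
    print("One thread per core -- HT is off or not present.")
```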
 
Gah I'm in the same boat....

My 2500k continues to hum along at a healthy 4.7GHz, but I absolutely loathe the Gigabyte GA-Z68XP-UD3 board it sits on, for various reasons I won't get into here.

I thought the time was right now that DDR4 is here and my 2500k is going on four years old, and those i5-6600k + mobo + RAM combos on Newegg look so tempting... but I'm not seeing a huge leap in performance in the reviews and benchmarks. Not compared to the increase I saw when I went from a Core 2 Quad Q9450 to the i5-2500k, anyway.

Decisions, decisions....
 
HT doesn't really double the cores; it allows a single core to do work on a second thread simultaneously if that thread is asking to use a part of the core that the first thread isn't utilizing. It doesn't help in games most of the time, and it can actually hurt performance if the OS isn't scheduling the cores well. If HT alone boosted performance that much, almost no one here would be buying i5s for their rigs. That being said, it does seem to help in certain cases and hurt in others. There is a lot of conflicting information out there regarding HT, as it really isn't that well understood by us enthusiasts.

Here is a thread comparing the performance in BF4 on an Ivy Bridge i7 with HT vs. no HT:

http://www.overclock.net/t/1480050/...-off-win8-1-vs-win7-new-nvidia-337-50-drivers

Generally, Hyper-Threading has a minimal negative impact in gaming; I think many gamers leave it on most of the time, or opt for CPUs that don't have it. That link shows HT in BF4, which by itself shows decent gains. Generally you will lose a few frames, or there will be no meaningful change in games, with Hyper-Threading enabled. BF4 is heavily threaded, as it is designed to leverage CPUs with more than 4 threads. Clock speed still matters more, but all things being equal, the CPU with the most cores wins in BF4.
 
If I didn't live near a Micro Center I would still be rocking my 2550k, or maybe even my 920. At this point I can usually upgrade for damn near free if I catch the right sale and sell my used stuff. The move to acrylic tubing will probably slow down my upgrades now, though; it's a lot more work than I thought it would be to bend all that tubing.
 
Generally, Hyper-Threading has a minimal negative impact in gaming; I think many gamers leave it on most of the time, or opt for CPUs that don't have it. That link shows HT in BF4, which by itself shows decent gains. Generally you will lose a few frames, or there will be no meaningful change in games, with Hyper-Threading enabled. BF4 is heavily threaded, as it is designed to leverage CPUs with more than 4 threads. Clock speed still matters more, but all things being equal, the CPU with the most cores wins in BF4.

Gains with HT on? I don't see any; it looks like they are playing multiplayer, so the difference doesn't seem to be beyond typical variance in a non-standard test. There seems to be zero practical effect from HT in BF4 on a quad-core. That being said, I have seen some tests where dual cores with HT do benefit a good amount in BF4.
 
That's the main reason why I have loved everything at [H]OCP for the past 9 years except the gaming CPU reviews...

They have the famous Real World Gameplay CPU scaling piece, and that was the last review I really enjoyed, but it's actually 6+ years old now... so ancient it's useless at this point.

Making a newer article based on the same line of thinking would be useless as well. How many API and driver updates have there been in the past 6 years? How many games have been developed and released that are written to utilize those updates? How many new GPU architectures have been released that have a real impact on increasing real-world gaming performance and allow for higher resolutions, refresh rates, and eye candy?

My point is, most CPUs released in the past 5 years, paired with an even more recent GPU (or GPUs), are going to do the job just fine for almost every PC gaming enthusiast, since the largest determining factor for gaming performance at 1080p and above is the GPU being used.
 
Gains with HT on? I don't see any; it looks like they are playing multiplayer, so the difference doesn't seem to be beyond typical variance in a non-standard test. There seems to be zero practical effect from HT in BF4 on a quad-core. That being said, I have seen some tests where dual cores with HT do benefit a good amount in BF4.

According to the overclock.net thread, there were some gains from a simple driver change, and the HT-on tests showed some gains over the HT-off tests. The gains, when they are there, do show more than a 7% change from enabling HT. I get what you're saying about the non-standard nature of the test, but if the tester followed the same path in his testing, I'm willing to give him some benefit of the doubt. I've done that type of testing; it's hard to do, but you can get reasonable consistency.
 
According to the overclock.net thread, there were some gains from a simple driver change, and the HT-on tests showed some gains over the HT-off tests. The gains, when they are there, do show more than a 7% change from enabling HT. I get what you're saying about the non-standard nature of the test, but if the tester followed the same path in his testing, I'm willing to give him some benefit of the doubt. I've done that type of testing; it's hard to do, but you can get reasonable consistency.

Since there isn't an actual standard BF4 MP test, everyone's numbers are going to be different. The only way to paint a decent picture is to get a good sample pool by running roughly the same test with the same hardware over and over again and taking the average. That's what the guy did in that forum post, and if you look at the overall averages he put on the spreadsheet, he actually lost performance with HT on with the older drivers in Win 7 vs. having it off, and with the newer drivers it was even. Go look at the spreadsheets he provided again and look at his averages at the bottom of each one. Even so, I think the 6 FPS difference in the OLD vs. OLD test was just error. It could have also been an OS thing, as the discrepancies weren't there in Win 8.1, which has smarter scheduling for HT, which would help it not lose performance vs. having HT off.
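To make that concrete, here's a minimal sketch of the run-averaging idea: benchmark each configuration several times, then compare the gap between averages against the run-to-run spread. The FPS numbers below are invented purely for illustration, not taken from those spreadsheets.

```python
# Average repeated benchmark runs per configuration and check whether
# the difference exceeds run-to-run variance. All numbers are made up.
from statistics import mean, stdev

ht_off = [96.0, 101.5, 98.2, 99.7, 97.9]   # FPS per run, HT disabled
ht_on  = [97.5, 100.1, 99.0, 98.8, 101.2]  # FPS per run, HT enabled

gap = mean(ht_on) - mean(ht_off)
noise = max(stdev(ht_on), stdev(ht_off))   # worst-case per-config spread

print(f"HT off: {mean(ht_off):.1f} FPS avg, HT on: {mean(ht_on):.1f} FPS avg")
print(f"difference: {gap:+.1f} FPS, run-to-run spread: ~{noise:.1f} FPS")

if abs(gap) <= noise:
    print("Within normal variance -- probably not a meaningful HT effect.")
else:
    print("Exceeds run-to-run variance -- the difference may be real.")
```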
 