Intel's 8th Generation Core Family - Coffee Lake (LGA 1151, 6C/12T)

Where do you expect Core i7-8700K's Turbo to land?

  • 3.8/3.9 GHz: 0 votes (0.0%)
  • 4.0/4.1 GHz: 3 votes (23.1%)
  • 4.2/4.3 GHz: 6 votes (46.2%)
  • 4.4/4.5 GHz: 3 votes (23.1%)
  • 4.6/4.7 GHz: 1 vote (7.7%)
  • Total voters: 13
  • Poll closed.
As an Intel user I will say this...

There is no point in trying to "combat" certain people on this forum. Some individuals have a distinct tendency to post obscure "facts and references" that prove the point they are trying to make, not just in this thread but all over the forum (an egregious example being the slides posted not long ago showing the 8700K was significantly better than Ryzen at 720p resolutions... pretty much a useless metric;

Those 720p benches are historically made by one of the more famous and rigorous review sites in the world. The technical reason for performing those low-resolution tests (known as "CPU tests" in the industry) has been explained to you and others about a dozen times. Those 720p tests are routinely mentioned even by AdoredTV, which is not precisely an Intel fanboy. :rolleyes:

Testing Ryzen only at 4K resolutions will create GPU bottlenecks and hide Ryzen's gaming deficits. AMD itself tried to manipulate Ryzen reviews by suggesting that reviewers increase the resolution when testing. Fortunately, reviewers gave the finger to AMD and performed scientific tests to measure CPU performance.

or early on, when the argument was whether or not the 6C/12T mainstream part was a response to AMD, which was roundly criticized at the time by certain individuals even though the supply still isn't sufficient to drive the price down to MSRP, unlike with previous Intel releases, but no, it wasn't a "rushed" launch), but then dismiss any "facts and references" that paint any other brand in a good light (multi-threaded Cinebench scores, for example, or the lack of any difference while gaming at 4K despite costing far less).

I find it interesting how some people believe that everything made by Intel is always a response to AMD. I guess these people believe that AMD is at the center of the universe or something. Yes, some things made by Intel are in response to AMD, but the inverse is also true.

Intel had originally planned a post-Skylake 8C mainstream part well before Zen taped out. Then the difficulties with 14nm forced a delay of the original tick-tock roadmap (including the 8C 10nm parts) and forced Intel to introduce the tick-tock-optimization-2nd-optimization roadmap with the new Kaby Lake and Coffee Lake.

I find it funny how some claim that Intel introducing a 6C Coffee Lake is a response to AMD, but AMD introducing an 8C Zen is not a response to Intel, despite Intel having 8C on its roadmap before Zen.

Funny how the same people that now accuse Intel of a paper launch forget the initial supply problems with the AM4 platform, or forget that Zen was initially scheduled for 2016 and now pretend that Zen was always on track.

No one dismisses Cinebench scores. They measure what they measure. What is being refuted is the pretension of certain people who mention only CB scores and ignore everything else. This is from the Ars Technica review of Coffee Lake:

Even though it has two fewer cores than the Ryzen 1800X (a CPU that costs a hefty £437), the 8700K comes in faster in many production workloads. It's four seconds quicker in Blender at stock, and 11 seconds quicker when overclocked. It's faster at Handbrake video encoding too, and miles ahead in 7-Zip's synthetic benchmark, which tends to favour clock speed even in multithreaded mode. It's only in PovRay and Cinebench that 1800X comes out on top—and only then by a small amount.

What we are saying is that Cinebench is not representative of those "many production workloads" where Ryzen loses by a large margin. We are saying that Cinebench is not representative of average performance.

You can report the performance of any chip in two ways: either you give a list of benchmarks and scores, or you give the average performance. What you cannot do for Ryzen is mention only a favorable case such as Cinebench and ignore the rest.
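To make the "average performance" point concrete, here is a minimal sketch in Python; the relative scores are invented purely for illustration (not taken from any review), and it just shows how results across several benchmarks are usually averaged with a geometric mean, so a single favorable result such as Cinebench cannot carry the average on its own:

from math import prod

# Hypothetical scores relative to a baseline chip (baseline = 1.00, higher is better).
relative_scores = {
    "Cinebench": 1.05,   # the single favorable case
    "Blender":   0.92,
    "Handbrake": 0.90,
    "7-Zip":     0.80,
    "POV-Ray":   1.02,
}

# The geometric mean is the usual way to average normalized benchmark ratios,
# so no single outlier dominates the way it would in a plain arithmetic mean.
geo_mean = prod(relative_scores.values()) ** (1 / len(relative_scores))
print(f"Average relative performance: {geo_mean:.2f}x")  # ~0.93x with these made-up numbers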

In fact, even on the value proposition, they refuse to acknowledge that any other brand might be worthwhile. Personally, I see fantastic "value" in a sub-$200 6C/12T Ryzen 1600 chip (or a $230 8C/16T part when it's on sale).

"Value" is subjective.


When exactly was the last time you had a 720p screen? I haven't had one since 2007...a useless metric.

Oh lord, and here we go again...
 

My point is that no matter what anyone says that could possibly paint AMD in a positive light, certain people will argue against it. You're one of the only people who will even argue against the "value" argument.

But it's funny that you criticize me for not using "production workloads" but then defend a 720p screen resolution, which isn't a modern real-world scenario :p. And then dismiss the argument with "oh lord, and here we go again". I don't really care. I was using it as an example, and TaintedSquirrel and IdiotInCharge discussed it yesterday.

I've never said that AMD doesn't respond to Intel. I think it would be foolish to argue that they don't, specifically on pricing.

I didn't have AM4 supply problems. I read some reviews and picked up a 1700 (non-X) to play around with at MSRP within a week of launch. My point isn't to praise AMD; it's to point out that this Intel launch was very un-Intel, and it's reasonable to believe it was pushed up to have a product available (or sort of available) for the holiday season, as opposed to a Q1/Q2 launch, which has been more normal for Intel's mainstream parts at least since 2nd-gen Core.
 
Summing up 96 pages:

1. Intel has stronger per-core performance.
2. AMD has lower per-core performance, but typically more cores at the same price point.
3. Honestly, both have really nice parts which will do very well in some areas, but are still pretty good in the rest.

See, I'm a peacemaker.
 
But then you have to wonder why you'd spend extra money for no (or maybe minimal is a better word) tangible difference at higher resolutions. Who really cares what is taken into account if the performance is indistinguishable? Like I said earlier, the ~$200 difference is essentially the difference between a 1070 Ti/1080 and a 1080 Ti in a budget.

But is there minimal to no difference? The answer is no, unless you stick to short benchmark runs and a limited game selection. Not to mention minimum FPS, and that you most likely upgrade your GPU more often than the CPU.

You can't keep a steady 60 FPS at any resolution on anything AMD-based in SP3, for example.

And this is today; it only goes downhill over time.
 
My point is that no matter what anyone says that could possibly paint AMD in a positive light, certain people will argue against it. You're one of the only people who will even argue against the "value" argument.

My only remark was "Value is subjective", and you are criticizing me just for noticing that value judgments aren't universal. In fact, if you check the Amazon best-selling chip list, the R5 1600 is currently in position #4, with three Intel chips among the top positions (#2 is the 8700K Coffee Lake). The R5 1600 is at #4 because more people disagree with your "fantastic value" proposition than agree with you.

But it's funny that you criticize me for not using "production workloads" but then defend a 720p screen resolution, which isn't a modern real-world scenario :p. And then dismiss the argument with "oh lord, and here we go again". I don't really care. I was using it as an example, and TaintedSquirrel and IdiotInCharge discussed it yesterday.

We already explained to you why 720p tests are made and why they are known in the industry as "CPU tests". What is sad is that after correcting people like you a dozen times, you still believe that a 720p test is meant to test gaming on old monitors and that 720p tests are irrelevant to a "modern real world scenario". :facepalm:
 
720p tests are pretty useful to me, not for showing the average FPS in a game but for showing how much a CPU will dip in the minimums. I find the lower-resolution game tests pretty accurate at exposing CPU-bound situations where a game drops from 100 fps to 50-60 fps.
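For what it's worth, here is a rough Python sketch of how those numbers are derived from a frame-time log (the summarize helper and the frame times are hypothetical, just for illustration); the "1% low" is the figure that captures the dips rather than the average:

def summarize(frame_times_ms):
    """Turn a list of per-frame render times (ms) into average FPS and an approximate 1% low."""
    fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)       # ascending, slowest frames first
    avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    one_percent_low = fps_per_frame[int(len(fps_per_frame) * 0.01)]  # ~99th-percentile frame time as FPS
    return avg_fps, one_percent_low

avg, low = summarize([10.0] * 95 + [18.0] * 5)  # hypothetical run: mostly 10 ms frames, a few 18 ms stutters
print(round(avg), round(low))                   # ~96 average FPS, ~56 for the 1% low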
 
720p tests are pretty useful to me, not for showing the average FPS in a game but for showing how much a CPU will dip in the minimums. I find the lower-resolution game tests pretty accurate at exposing CPU-bound situations where a game drops from 100 fps to 50-60 fps.
I know, I was looking at lots of 720p tests to see how my 4K monitor will do.
 
Your 4K 60 Hz monitor would be fine with any recent CPU.
I know. :) That is my point. If you're looking for examples, look for what you are running.
1080p and up is what 99% (?) of people run. That should be the baseline.
 
The entire point of 720p tests isn't to show current system performance. It is to show what your future holds if you keep the same CPU but get a Wickedly Fast new GPU, which effectively eliminates your GPU bottleneck on whatever resolution and refresh rate you have. That effect is simulated in current titles, current systems, current games by lowering the resolution. All the same paths are hit, but with much less overhead on your current shitty GPU. Thus, the bottleneck then shifts away from the GPU to places unknown, but commonly the CPU.

For people buying a CPU, this may be interesting information.

To keep decrying it with a statement of "But I don't game at 720p" is aggressively missing the point.
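A toy Python model of that argument (the fps helper and all the per-frame costs are made up purely for illustration, not measured data): each frame takes roughly the longer of the CPU's work and the GPU's work, and only the GPU side scales with pixel count.

def fps(cpu_ms, gpu_ms_at_1080p, pixel_scale):
    gpu_ms = gpu_ms_at_1080p * pixel_scale   # GPU cost roughly tracks pixels drawn
    return 1000.0 / max(cpu_ms, gpu_ms)      # the slower side sets the frame rate

# Two hypothetical CPUs: one needs 8 ms of game logic per frame, the other 12 ms.
for cpu_ms in (8.0, 12.0):
    print(cpu_ms, "ms CPU:",
          round(fps(cpu_ms, 14.0, 0.44)), "fps at 720p,",  # 720p is ~0.44x the pixels of 1080p
          round(fps(cpu_ms, 14.0, 4.0)), "fps at 4K")      # 4K is 4x the pixels of 1080p
# At 4K both CPUs land on the same GPU-limited ~18 fps; at 720p the faster CPU
# pulls clearly ahead (125 vs 83), which is the gap a much faster future GPU
# would expose at any resolution.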
 
Now answer the real question: when is the 10nm+ 8C 9th-gen CPU coming out? 2H 2018 like Eurocom mentioned, or Q1 2019?
 
The entire point of 720p tests isn't to show current system performance. It is to show what your future holds if you keep the same CPU but get a Wickedly Fast new GPU, which effectively eliminates your GPU bottleneck on whatever resolution and refresh rate you have. That effect is simulated in current titles, current systems, current games by lowering the resolution. All the same paths are hit, but with much less overhead on your current shitty GPU. Thus, the bottleneck then shifts away from the GPU to places unknown, but commonly the CPU.

For people buying a CPU, this may be interesting information.

To keep decrying it with a statement of "But I don't game at 720p" is aggressively missing the point.

Quoted for troof.
 
The entire point of 720p tests isn't to show current system performance. It is to show what your future holds if you keep the same CPU but get a Wickedly Fast new GPU, which effectively eliminates your GPU bottleneck on whatever resolution and refresh rate you have. That effect is simulated in current titles, current systems, current games by lowering the resolution. All the same paths are hit, but with much less overhead on your current shitty GPU. Thus, the bottleneck then shifts away from the GPU to places unknown, but commonly the CPU.

For people buying a CPU, this may be interesting information.

To keep decrying it with a statement of "But I don't game at 720p" is aggressively missing the point.
You should look at info based on what you have and what you run. The 720p test is for max CPU usage. That is why at 4K it doesn't matter what CPU I have, as I am GPU bound.
As the resolution gets higher, the demand shifts from the CPU to the GPU, unless I am wrong. So I would want gaming benchmarks close to the hardware I have. Common sense.
 
You should look at info based on what you have and what you run. The 720p test is for max CPU usage. That is why at 4K it doesn't matter what CPU I have, as I am GPU bound.
As the resolution gets higher, the demand shifts from the CPU to the GPU, unless I am wrong. So I would want gaming benchmarks close to the hardware I have. Common sense.

720 is not for max CPU usage. It is for minimal GPU usage. This is what CPU tests and reviews do - test the CPU. You minimize other factors, or you aren't actually testing the CPU. As you say, common sense.

Not everyone is only interested in finding the solution to your specific criterion of "does just fine in cases where the CPU doesn't matter". This is why my partner's laptop has an ancient i5. It doesn't matter in that context. But those needs are not the same as my desktop's needs.

Your point is taken - CPU performance isn't always the most important thing. But it is also not nothing, and we're trying to look at and discuss how various architectures handle various tasks. Many want to know what the absolute best is for their usage, not just what some would call "good enough".

There is value in determining the best product for many use cases. For yours, I may suggest one product to someone. For my cases, I may suggest another. I like to have all these data points to make a decision.

I'm honestly not trying to bust your chops, but rather ask that you not be so dismissive of tests which are interesting for other people's use cases. They ARE valid, even if not for you.
 
720 is not for max CPU usage. It is for minimal GPU usage. This is what CPU tests and reviews do - test the CPU. You minimize other factors, or you aren't actually testing the CPU. As you say, common sense.

Not everyone is only interested in finding the solution to your specific criterion of "does just fine in cases where the CPU doesn't matter". This is why my partner's laptop has an ancient i5. It doesn't matter in that context. But those needs are not the same as my desktop's needs.

Your point is taken - CPU performance isn't always the most important thing. But it is also not nothing, and we're trying to look at and discuss how various architectures handle various tasks. Many want to know what the absolute best is for their usage, not just what some would call "good enough".

There is value in determining the best product for many use cases. For yours, I may suggest one product to someone. For my cases, I may suggest another. I like to have all these data points to make a decision.

I'm honestly not trying to bust your chops, but rather ask that you not be so dismissive of tests which are interesting for other people's use cases. They ARE valid, even if not for you.
Yes, if people are going to run 720p, yeah.
 
That early? Wow. I figured it would be a bit later.

We'll see if it turns out to be accurate. Let's hope Intel doesn't bend over the Z370 crowd with the promise of DDR4 2800 support and extra magical power pins in the Z390 to support the additional cores.
 
Yes, if people are going to run 720p, yeah.

We can explain it for you, but we cannot understand it for you.
Ever taken a look at HardOCP's CPU tests, btw?

https://www.hardocp.com/article/2017/08/10/amd_ryzen_threadripper_1950x_1920x_cpu_review/6

I'll quote for you:
No matter how many times I write this paragraph, a lot of folks do not seem to "get it." These are very much "benchmarks." These are good in helping us understand how well CPUs are at performing calculations in 3D gaming engines. These benchmarks in no way represent real-world gameplay. These are all run at very low resolutions to try our best to remove the video card as a bottleneck. I will not hesitate to say that anyone spouting these types of framerate measurements as a true gameplay measuring tool in today’s climate is not servicing your needs or telling you the real truth.



The gaming tests below have been put together to focus on the processor power exhibited by each system. All the tests below consist of custom time demos built with stressing the CPU in mind. So much specialized coding comes into the programming now days we suggest that looking at gaming performance by using real-world gameplay is the only sure way to know what you are going to get with a specific game.
 
On the other hand, while 720p tests "artificially remove the GPU as a bottleneck", I would argue that high-resolution gaming benchmarks artificially remove the CPU.

If all you do is play single-player games on demo levels, great, but multiplayer games have completely different CPU usage profiles from the traditional single-player benchmark runs, and almost no reviewers take this into account. Sometimes you see a BF1 MP benchmark run, but even that is rare.
 
On the other hand, while 720p tests "artificially remove the GPU as a bottleneck", I would argue that high-resolution gaming benchmarks artificially remove the CPU.

If all you do is play single-player games on demo levels, great, but multiplayer games have completely different CPU usage profiles from the traditional single-player benchmark runs, and almost no reviewers take this into account. Sometimes you see a BF1 MP benchmark run, but even that is rare.

To be honest, even single-player games will have vastly varying CPU and GPU requirements from one location to another over the course of a playthrough.

That's why you can't just say "look, they benchmarked this game at 120 fps with those parts, so I can play 100% of the game without ever dipping below 120 fps with the same parts". WRONG.

A game can be mostly GPU bound but sometimes be CPU/RAM bound, or the other way around. Benchmarks are only good for comparing performance between different products. 720p benchmarks clearly show that Intel still has a considerable edge for gaming, and it does matter regardless of the resolution you play at. Even when playing at 1440p/4K your CPU will sometimes get hammered. Benchmarks can't always show this because reviewers don't have unlimited time to finish every single-player game or play hundreds of hours of MP titles.
 
PUBG ate my Ryzen 1600 @ 4 GHz system alive at times, while the game is like butter on the 8700K system. I am talking up to 50% faster @ 1440p on a 1080 Ti, frequently.
 
PUBG is also a broken mess that shows much lower utilization on an AMD 6-core than on a CFL 6-core.

Does it? And how is it broken? I have a feeling the response will be more bad excuses. You should just have bought the better product from the start.
 
You should look at info based on what you have and what you run. The 720p test is for max CPU usage. That is why at 4K it doesn't matter what CPU I have, as I am GPU bound.
As the resolution gets higher, the demand shifts from the CPU to the GPU, unless I am wrong. So I would want gaming benchmarks close to the hardware I have. Common sense.

Yes, if people are going to run 720p, yeah.

And you got it wrong again...

720p tests today aren't made to measure performance on 4K today. We run 4K tests for that.

720p tests today aren't made to measure performance on 720p screens.

720p tests today can be used to know how the CPU will perform in 4K gaming in the future, when faster GPUs eliminate or reduce the GPU bottleneck.
 
I might give the silicon lottery another chance. I always stood by the first sample I received even if it was a bad one, but yeah, an 8600K at 4.7 GHz and 1.31 V is a bit hard to stomach, haha, and perhaps this way I might break my bad streak. I won't return it though, as I'm not a big fan of that procedure morally and don't want to be workstation-less in the meantime, so I will sell it at a bit of a loss.

Considering the temps are good for an air setup (Phanteks PH-TC14PE) and non-delidded, around ~80C during a 30 min to 1 hr Prime95 non-AVX load in a well-ventilated case with 3x120 mm intake and 3x120 mm + 1x140 mm exhaust at ~22C ambient, it's a bit of a letdown to say the least, and my e-peen is having a hard time stomaching this one.

If another sample doesn't get close to average clocks of, say, 4.9~5 GHz at 1.32-1.35 V, I suppose something else must be wrong; the temps seem too okay for this to be a bad mounting issue (can a lack of pressure from the cooler onto the IHS cause such poor OC results, for example? Could it perhaps be something about the motherboard? ASRock Taichi, btw, and yes, even the motherboard's own 4.8 GHz auto-overclock wasn't perfectly stable at 1.36 V (I reckon it would need ~1.38 V), and I've double-checked all BIOS settings and compared with other guys with the same mobo hitting 5 GHz below 1.3 V, just in case it's something I missed).
 
Great result! I need 1.328 V to run my 6700K at 4.5 GHz, and the next step is 1.36 V for 4.6 GHz stable. You got a really nice sample, but I'll wait for the 8-core personally :D

Oh boy, your 6700K seems to be below average. I'd say 1.36 V could get people to around 4.7 GHz; Kaby brings that up by another 100-200 MHz, and CFL brings that up by another 100 MHz while adding 2 cores.

I want to know about 10nm++: it can probably add 2 more cores while keeping decent efficiency, plus 5-5.1 GHz on all 8 cores, but that's like 2020. 10nm+ will do for now if it comes out in 2018.
 
Oh boy, your 6700K seems to be below average. I'd say 1.36 V could get people to around 4.7 GHz; Kaby brings that up by another 100-200 MHz, and CFL brings that up by another 100 MHz while adding 2 cores.

I want to know about 10nm++: it can probably add 2 more cores while keeping decent efficiency, plus 5-5.1 GHz on all 8 cores, but that's like 2020. 10nm+ will do for now if it comes out in 2018.

Yeah, it's definitely not the best sample, but I wouldn't say it's below average; it's on the lower side of average :D Not much real-world difference between 4.5-4.6-4.7-4.8 though, especially at the 1440p I'm running. I studied numerous benchmarks and settled for lower temps and 100% stability. I might get luckier with Intel's mainstream 8-core next time. By then (presuming a 2019 release) it will have been a 4-year run with my 6700K, which will go to HTPC pastures.
 
My 8700k is finally up and running.

I got a little spooked at first because I was seeing zero performance improvement over my 4790K rig in games (though pure CPU benchmarks looked fine). Turns out the BIOS of my GA Z370 Aorus Gaming 5 was just faulty: with XMP enabled on my 4000 MHz sticks I was getting the total bandwidth of single-channel 2133 MHz RAM! Ouch. Updating to the latest BIOS fixed it, and now I'm seeing the performance improvements I was expecting.

Gains are huge sometimes. AC Origins went from 70 to 100fps (maxed, 1080p), Arma 3 AI benchmark from 30 to 40 (4k, ultra, 12k view distance) etc. Huge improvements in minimums across the board as well.

It's also way cooler and uses considerably less power than my 4790k, which actually gives me quite a bit of headroom to play around with the clocks even without delidding.
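For scale, a quick back-of-the-envelope Python sketch of what that BIOS bug was costing (theoretical peak numbers only; real measured bandwidth is lower, and the helper name is just for illustration):

def ddr4_peak_gb_s(transfers_mt_s, channels):
    """Theoretical peak bandwidth: DDR4 moves 8 bytes per channel per transfer."""
    return transfers_mt_s * 8 * channels / 1000.0   # MT/s * bytes * channels -> GB/s

print(ddr4_peak_gb_s(4000, channels=2))  # ~64 GB/s, what dual-channel DDR4-4000 should deliver
print(ddr4_peak_gb_s(2133, channels=1))  # ~17 GB/s, what the broken single-channel 2133 setting gave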
 
My 8700k is finally up and running.

I got a little spooked at first because I was seeing zero performance improvement over my 4790K rig in games (though pure CPU benchmarks looked fine). Turns out the BIOS of my GA Z370 Aorus Gaming 5 was just faulty: with XMP enabled on my 4000 MHz sticks I was getting the total bandwidth of single-channel 2133 MHz RAM! Ouch. Updating to the latest BIOS fixed it, and now I'm seeing the performance improvements I was expecting.

Gains are huge sometimes. AC Origins went from 70 to 100fps (maxed, 1080p), Arma 3 AI benchmark from 30 to 40 (4k, ultra, 12k view distance) etc. Huge improvements in minimums across the board as well.

It's also way cooler and uses considerably less power than my 4790k, which actually gives me quite a bit of headroom to play around with the clocks even without delidding.

This reminds me, how is Kaby vs Ivy? AFAIK Haswell runs hotter than Ivy, right, being second-gen 22nm vs first-gen 22nm? What about Kaby vs Ivy in general?
 
Due to constant stress and pressure from juanrga and Shintai pushing Intel's propaganda, I caved in and bought a 5.2 GHz 8700K from SL. Time to play.

Meanwhile, waiting for the 9800K...
 
$660 - looks like the only ones not sold out are the top tier. $430 for a 5.3 GHz 8600K is not that bad of a deal, especially considering some people are really getting burned on those clocks after spending $300+.
 
Micro Center is now at $229 for the 8600K, or bundle it with a mobo for $30 off the combo. This was what I was waiting for, if I hadn't stumbled into a non-working 7700K for $67.
 