Intel's 8th Generation Core Family - Coffee Lake (LGA 1151, 6C/12T)

Where do you expect Core i7-8700K's Turbo to land?

  • 3.8/3.9 GHz

    Votes: 0 0.0%
  • 4.0/4.1 GHz

    Votes: 3 23.1%
  • 4.2/4.3 GHz

    Votes: 6 46.2%
  • 4.4/4.5 GHz

    Votes: 3 23.1%
  • 4.6/4.7 GHz

    Votes: 1 7.7%

  • Total voters: 13
  • Poll closed.
Ice Lake? 10nm+, 8 cores, September of this year? WTF, that was 3 months ago. How does that work?

What did you expect? A CPU gets released and everyone is ready the next day with a full QA run? There is a reason why Zen still has a shitload of issues and broken chips and platforms, and it's called lack of QA.
 
I don't care about Zen. I only wanted to know if ICL is still coming in the 2nd half of 2018, because we all want that 8-core 5GHz chip, yet the roadmap shows neither ICL nor an 8-core CFL-S.
 
Roadmaps change as the target comes closer; Cascade Lake wasn't on there before either.

Proper QA for a CPU takes 12-15 months, and it has been that way for ages.

You also wanted a 6-core at 5GHz, didn't you? Did you buy one? I guess not.
 
Nope, couldn't get a 5GHz 6-core, it was sold out too fast. Also, the laptop I want to put a 6-core CPU into didn't really change its heatsink design, and it's barely enough for a 4-core CPU. Once 8-core hits there will be TWO laptops that can take it, and I'll look then.

Remember what Eurocom confirmed, an 8-core Z390 next year? I'm hoping that's still coming and Intel is just purposely not showing it on the roadmap so that people buy 6-cores for now.
 
There is some news on the Z370 drama front: apparently some guys in Poland managed to get an i3-8350K running on a Z170 board. They used a custom BIOS which works with everything except the IGP.

The source is in Polish.

So they got a quad-core working in a mobo that supports up to quad-cores? Color me impressed! :rolleyes:
 
I find this frustrating.

If you're not pleased with Shintai and Juanrga's posts and information then come back at them with facts and references. Can we please stop with the one-liner snipes at them for having a bias? A bias is irrelevant if they are posting verifiable facts. Do the same to combat them. If you do not, most of us will assume you don't actually have an answer.
 
Or put them on ignore. It has made this forum a better place in my opinion.
 
I find this frustrating.

If you're not pleased with Shintai and Juanrga's posts and information then come back at them with facts and references. Can we please stop with the one-liner snipes at them for having a bias? A bias is irrelevant if they are posting verifiable facts. Do the same to combat them. If you do not, most of us will assume you don't actually have an answer.
It would just make it easier for them to do the one-liners. It's not like you will get an unbiased answer, which doesn't really help anybody.

I hope 10nm+ goes well.
 
I find this frustrating.

If you're not pleased with Shintai and Juanrga's posts and information then come back at them with facts and references. Can we please stop with the one-liner snipes at them for having a bias? A bias is irrelevant if they are posting verifiable facts. Do the same to combat them. If you do not, most of us will assume you don't actually have an answer.

As an Intel user I will say this...

There is no point in trying to "combat" certain people on this forum. Some individuals have a distinct tendency to post obscure "facts and references" that prove whatever point they are trying to make, not just in this thread but all over the forum. An egregious example: not long ago there were several slides pointing out that the 8700k was significantly better at 720p resolutions than Ryzen, which is pretty much a useless metric. Another: early on, the argument that the 6C/12T mainstream part was a response to AMD was roundly criticized by certain individuals, yet supply still isn't sufficient to drive the price down to MSRP, unlike previous Intel releases, but of course it wasn't a "rushed" launch. Meanwhile, they dismiss any "facts and references" that paint other brands in a good light, such as multi-threaded Cinebench scores, or the lack of any difference while gaming at 4K resolutions while costing far less. Even on the value proposition they refuse to acknowledge that any other brand might be worthwhile.

Personally, I see fantastic "value" in a sub-$200 6C/12T Ryzen 1600 chip (or a $230 8C/16T part when it's on sale). I've been called an AMD shill for pointing this out even though I'm running an Intel setup. I played through Assassin's Creed Origins recently on a Ryzen system and never felt like I was missing out on anything I would have gotten with an Intel setup. In my mind, the consumer wins when there's no significant difference that affects the overall experience (even if it gets 10 fewer FPS on Ultra at 1080p running a 1080Ti, according to Techspot).

I've gotten 4 or 5 "3 day bans" for debating back and forth with certain individuals because they hit the report button over nonsense, claiming a personal attack (at least twice because of this thread). I find that far more frustrating than anything Hagrid has ever posted, even if I don't agree with everything Hagrid posts. I've been much more satisfied with certain individuals on my ignore list, as they seem to be on personal missions to slay the opposition for no good reason.

At the end of the day, if people want to spend $400 on an 8700k, that's fine. I won't spend that kind of money because:

1. The cost of Ryzen parts (CPU and motherboard), the commitment to the AM4 platform through 2020, and the relative performance of these parts compared to Intel.
2. The relative cost of a 7820X when it's on sale ($450 from Newegg last month) and the 7800X (sub-$300), along with the probability of running another generation of chips on the X299 platform and the low cost of off-lease Xeons in the future, which are now locked out of the mainstream platform post-Haswell.
3. The impending launch of 8-core mainstream parts sooner than later, along with the Z390 chipset also coming sooner than later, with only speculative support for new CPUs on the Z370 platform (similar to the way Intel abandoned the Z270 in 8 months).
4. The fact that I would have been more than happy to drop an 8700k into my perfectly functioning high-end Z270 board, but I'm not dropping another $200 for a similar high-end board with a similar chipset on the same process node and the same architecture just because of some magical power pins which may or may not actually make a difference.

THIS IS MY OPINION. Unless someone's name has "Official ____ representative" in their title and has been authorized to make statements to the public, we are all sharing opinions and have different expectations and uses for our computers. I'm not attacking anyone who purchased an 8700k, nor anyone that likes their Ryzen setup. If it performs like you want and you're happy with it then I'm happy for you.
 
8700k was significantly better at 720p resolutions than Ryzen...pretty much a useless metric

This is an AMD fan talking point and it goes against 20+ years of CPU benchmarking. It needs to die.
 
Yup. Straight up scientific method, and suddenly we should pretend that science is bad!

It's faster at gaming, every day of the week. It's faster today, and it'll be faster tomorrow, and given how long people keep CPUs, that's an incredibly important piece of information for potential purchasers.
 
This is an AMD fan talking point and it goes against 20+ years of CPU benchmarking. It needs to die.

When exactly was the last time you had a 720p screen? I haven't had one since 2007...a useless metric.

But well done in proving my point...a petty meme posted based on one line that caught your eye rather than a coherent response to the overall idea of what I was getting at.
 
Yup. Straight up scientific method, and suddenly we should pretend that science is bad!

It's faster at gaming, every day of the week. It's faster today, and it'll be faster tomorrow, and given how long people keep CPUs, that's an incredibly important piece of information for potential purchasers.
But is it faster in a way that some/most people will notice? Why spend hundreds of extra $$ if they will see no difference? That is also a point. :)
 
When exactly was the last time you had a 720p screen? I haven't had one since 2007...a useless metric.

But well done in proving my point...a petty meme posted based on one line that caught your eye rather than a coherent response to the overall idea of what I was getting at.
When was the last time anyone played CineBench for fun? That's never stopped certain people from citing the score as a performance metric, along with a dozen other synthetic benchmark tools.

We can say that 720p testing is the most accurate since it does the best job at removing all other bottlenecks in the system. However, deciding what metrics are "useless" is a pointless debate since it's going to differ from person to person. Its usefulness depends on what monitor/GPUs you plan on running during the lifespan of your CPU. For example, if someone doesn't plan on getting a GPU faster than the 1080 Ti before they replace their CPU (3-5 years for the average person?) then 720p benchmarks aren't useful for them, since they will always be bottlenecked by modern GPU's performance.

In other words, as GPUs get faster, the amount of error from today's non-720p CPU benchmarks will increase.
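To put rough numbers on that, here's a minimal sketch of the usual mental model (all FPS figures below are made-up placeholders, not benchmark results): delivered FPS is roughly capped by whichever of the CPU or the GPU is slower, so a CPU gap that's invisible under today's GPU ceiling shows up once that ceiling rises.

```python
# Toy bottleneck model: delivered FPS is capped by the slower of CPU and GPU.
# All numbers are hypothetical placeholders, not measured results.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """FPS the player actually sees: min of CPU-bound and GPU-bound rates."""
    return min(cpu_fps, gpu_fps)

cpu_a = 160.0  # hypothetical faster CPU (frames it can prepare per second)
cpu_b = 120.0  # hypothetical slower CPU

for gpu_ceiling in (100.0, 140.0, 200.0):  # today's GPU, next gen, the one after
    a = delivered_fps(cpu_a, gpu_ceiling)
    b = delivered_fps(cpu_b, gpu_ceiling)
    print(f"GPU ceiling {gpu_ceiling:5.0f} fps -> CPU A: {a:5.0f}, CPU B: {b:5.0f}, gap: {a - b:4.0f}")
```

At a 100 fps GPU ceiling the two CPUs look identical; at a 200 fps ceiling the full 40 fps gap appears, which is exactly the gap a low-resolution test would have exposed on day one.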

I would just like to point out 2 things:

1. The AMD community has been a staunch defender of fairness in benchmarks, re: AMD vs Nvidia, but for some reason they are totally cool with artificial GPU bottlenecks when it comes to CPU testing. I have a feeling if the results were flipped they would also be defending 720p tests. I still would be too, obviously.
2. Rather than debate a more complex issue it's a lot easier to just say "hurr durr I don't play games at 720p". It's a childish response. These are the same people who use phrases like "Intel's TIM is toothpaste" which I've ranted about in the past.
 
But is it faster in a way that some/most people will notice? Why spend hundreds of extra $$ if they will see no difference? That is also a point. :)

Notice when?

Game development isn't standing still, and even most enthusiasts don't upgrade their CPU more than every three or four cycles. It makes sense to get the fastest reasonable thing you can.
 
When was the last time anyone played CineBench for fun? That's never stopped certain people from citing the score as a performance metric, along with a dozen other synthetic benchmark tools.

We can say that 720p testing is the most accurate since it does the best job at removing all other bottlenecks in the system. However, deciding what metrics are "useless" is a pointless debate since it's going to differ from person to person. Its usefulness depends on what monitor/GPUs you plan on running during the lifespan of your CPU. For example, if someone doesn't plan on getting a GPU faster than the 1080 Ti before they replace their CPU (3-5 years for the average person?) then 720p benchmarks aren't useful for them, since they will always be bottlenecked by modern GPU's performance.

In other words, as GPUs get faster, the amount of error from today's non-720p CPU benchmarks will increase.

I would just like to point out 2 things:

1. The AMD community has been a staunch defender of fairness in benchmarks, re: AMD vs Nvidia, but for some reason they are totally cool with artificial GPU bottlenecks when it comes to CPU testing. I have a feeling if the results were flipped they would also be defending 720p tests. I still would be too, obviously.
2. Rather than debate a more complex issue it's a lot easier to just say "hurr durr I don't play games at 720p". It's a childish response. These are the same people who use phrases like "Intel's TIM is toothpaste" which I've ranted about in the past.

I replace my GPUs pretty much annually, so I guess FOR ME it is a useless metric. I understand what you're saying, and I don't think any rational person is going to say that Intel isn't faster than AMD in CPU bottleneck situations. My point is that at a modern resolution with a modern graphics card there isn't a significant difference to justify almost twice the price unless you are in a situation where the absolute performance is necessary (twitch gaming maybe?). A $200+ difference in price is $200 in someone's pocket and maybe the difference between a 1080 and a 1080Ti.

So I guess, if I'm going to boil it down, it's an economic argument, something like FPS per $. Personally, I'd take a Ryzen/1080Ti over an 8700k/1080 combo.
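As a quick, back-of-the-envelope illustration of that FPS-per-$ framing (the prices and frame rates below are invented placeholders, not figures from any review):

```python
# Rough FPS-per-dollar comparison between two hypothetical builds.
# Prices and average FPS are made-up placeholders for illustration only.

builds = {
    "Ryzen 1600 + 1080 Ti": {"cpu": 200, "gpu": 700, "avg_fps": 110},
    "i7-8700K + 1080":      {"cpu": 400, "gpu": 500, "avg_fps": 100},
}

for name, b in builds.items():
    total = b["cpu"] + b["gpu"]
    print(f"{name}: ${total} total, {b['avg_fps']} avg fps, "
          f"{b['avg_fps'] / total:.3f} fps per dollar")
```

The point isn't the exact numbers, just that shifting budget from CPU to GPU can buy back more frames than the faster CPU gives up at GPU-bound settings.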
 
I replace my GPUs pretty much annually, so I guess FOR ME it is a useless metric.

You're making the argument for us- you replace your GPUs annually, and many replace their GPUs every other generation on average, but do not replace their CPUs.

That difference between platforms is minimal if you consider replacing your CPU (and probably whole platform) earlier because it cannot feed your latest GPU in the latest games.
 
My point is that at a modern resolution with a modern graphics card there isn't a significant difference to justify almost twice the price unless you are in a situation where the absolute performance is necessary (twitch gaming maybe?).
People on 144 Hz monitors, mostly. But that goes back to the "usefulness" debate.
Ryzen dominates on value even in 720p tests. Nobody is questioning that... Well somebody might question that, but not me.
 
I guess that's always been my point: there is a value consumer who will settle for 85% of the performance for 50% of the price, and it's not a terrible option. I wasn't trying to be belligerent before with the 720p test statement.

You're making the argument for us- you replace your GPUs annually, and many replace their GPUs every other generation on average, but do not replace their CPUs.

That difference between platforms is minimal if you consider replacing your CPU (and probably whole platform) earlier because it cannot feed your latest GPU in the latest games.

I misread what he said before. I get it from what you're saying here and in the other thread. Case in point: the Assassin's Creed Origins CPU test. 1080p is a common average-Joe gaming resolution, and it's a pretty CPU-heavy game (if only because of the DRM). I'm not sure the average gamer is going to be disappointed, or even notice the difference between the 8700k and the 1600X, when he has an extra $200 in his pocket. And I say that while noting that you're not an average gamer: you hold your system to a higher standard and purposely spend extra money to get the performance you're looking for. In two years, with a 2080Ti or whatever comes out and a newer game engine, I don't know for sure that the difference is going to be more than the same 10% or so with the same CPUs.
 
Well, my standard isn't just mine: I'm applying logic learned from decades of upgrading. And that's this: GPU goes longer than CPU in the near term, and CPU goes longer than GPU in the long term.

My advice would be to not short-change yourself on either.
 
I guess that's always been my point: there is a value consumer who will settle for 85% of the performance for 50% of the price, and it's not a terrible option. I wasn't trying to be belligerent before with the 720p test statement.



I misread what he said before. I get it from what you're saying here and in the other thread. Case in point: the Assassin's Creed Origins CPU test. 1080p is a common average-Joe gaming resolution, and it's a pretty CPU-heavy game (if only because of the DRM). I'm not sure the average gamer is going to be disappointed, or even notice the difference between the 8700k and the 1600X, when he has an extra $200 in his pocket. And I say that while noting that you're not an average gamer: you hold your system to a higher standard and purposely spend extra money to get the performance you're looking for. In two years, with a 2080Ti or whatever comes out and a newer game engine, I don't know for sure that the difference is going to be more than the same 10% or so with the same CPUs.


I just want to say that for anyone looking to upgrade for a game like PUBG, the 8700K is worth every single penny over a Ryzen 1600 @ 4GHz, even at 1440p or 4K.

The Ryzen system felt okay for the game, but with the 8700K you would think you're running Team Fortress 2 or something.
 
This is an AMD fan talking point and it goes against 20+ years of CPU benchmarking. It needs to die.

Didn't this prove to be BS when comparing the 2500K to Bulldozer at very low resolutions? It was predicted that the 2500K would be better 5 years down the road based on this metric, and 5 years later the similarly priced Bulldozer has pretty much caught up at normal resolutions.
 
Ah, here is one of those tests:
https://www.computerbase.de/2017-02/cpu-skalierung-kerne-spiele-test/#diagramm-watch-dogs-2-fps

Yeah, the 2500K can overclock better, but we do not see the same 20% disparity between the 2500K and Bulldozer that showed up when they tested those CPUs at 640x480 back in the day.

The same may be true for comparing the R5 to an i5 without HT today.
We were discussing equal core chips (R5 1600/X vs 8700K).
If somebody wants to gamble on extra core utilization (R7 vs 8700K) then that's their own risk. Benchmarks can't predict the future.

But do you really want to sit on an inferior chip for 5 years hoping for it to be competitive some day? I challenge you to find a Sandy owner who wishes they got Vishera instead.

I wouldn't compare Sandy/Vishera to Ryzen/KBL/CFL at all, though.
 
If somebody wants to gamble on extra core utilization (R7 vs 8700K) then that's their own risk. Benchmarks can't predict the future.

So, I want to take a moment to explore this idea- not as a refutation or even an argument really-

We can't predict, going forward, what will actually be more important for gaming, or any application. Will six real cores (8600k) be enough? Will four real cores with four hyperthreaded cores (quad-core R5's, 7700k) be enough? Hell, will six cores with hyperthreading (R5 1600+, 8700k) be enough? How will applications balance increased threaded resources, versus increased single-core performance?

If I were to make a bet, it would be that the answer is somewhere between the two: however, if I were to take a 'worst case' assessment, it'd be that with CPUs with similar aggregate performance, say an 8700k and the R7's, higher single-core performance has a greater chance of providing better performance in future applications than greater threaded resources.

Part of the reasoning is like this. While the R7's have similar number-crunching ability, and are duly impressive, not everything is parallelizable. This means that the performance floor in less parallelizable tasks, if you will, is bound to single-core performance. This is what drives maximum frametimes, and what really affects how 'smooth' a game is perceived to 'feel'.
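That floor is essentially Amdahl's law applied to a single frame: whatever slice of the frame can't be spread across threads is paid at single-core speed no matter how many cores are available. A minimal sketch, using a completely made-up 4 ms serial / 6 ms parallel split:

```python
# Amdahl's-law-style toy: frame time = serial work + parallel work / cores.
# The 4 ms / 6 ms split is a made-up example, not a measurement of any game.

def frame_time_ms(serial_ms: float, parallel_ms: float, cores: int) -> float:
    """Time to finish one frame when only the parallel part scales with cores."""
    return serial_ms + parallel_ms / cores

serial_ms, parallel_ms = 4.0, 6.0  # hypothetical 10 ms frame on a single core

for cores in (1, 2, 4, 6, 8, 16):
    t = frame_time_ms(serial_ms, parallel_ms, cores)
    print(f"{cores:2d} cores: {t:5.2f} ms/frame ({1000 / t:6.1f} fps)")
```

No core count gets the frame under the 4 ms serial floor, so past a handful of cores, shrinking that serial slice (faster cores, higher IPC) buys more than adding threads does.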

The other part is that I'm betting games will become more complex over time. With increasing complexity will come increasing difficulty in splitting up workloads and making use of many-thread resources. Based on that bet and understanding how single-core performance affects overall gaming performance, I'd prefer the CPU with the faster cores and higher IPC (together!) so long as it has enough cores to keep the chosen application fed and keep OS and other background tasks out of the way.


And here's where my bet fails: on the off chance that games don't get more complex, that whatever complexity does come about doesn't eat up the resources (potentially) freed by low-overhead APIs like Vulkan and DX12, or that developers get much better at splitting work across threads and manage to mitigate the need for higher single-core performance, then betting on Ryzen may have been the better bet.


And in summary, I feel that both will happen. One artifact is that consoles have had their hardware die cast, more or less, so development toward better threading and low single-core usage has been ingrained if not wholly successful, and another artifact is that developers and publishers are seeing the usefulness (and market!) of PC gaming stay steady and are willing to put in the resources to make stunning if not entirely compelling games.
 
When exactly was the last time you had a 720p screen? I haven't had one since 2007...a useless metric.

But well done in proving my point...a petty meme posted based on one line that caught your eye rather than a coherent response to the overall idea of what I was getting at.

We can also argue that gaming benchmarks at higher resolutions don't take the entire game into account; for example, the only CPU that hits a steady 60 FPS anywhere in FO4, at any resolution, is an OCed 6700K with fast memory.
 
Didn't this prove to be BS when comparing the 2500K to Bulldozer at very low resolutions? It was predicted that the 2500K would be better 5 years down the road based on this metric, and 5 years later the similarly priced Bulldozer has pretty much caught up at normal resolutions.

No, it didn't. The BS was the claim that FX would age better; it was nothing but a fantasy to try to excuse a product that was outdated at launch.

Let's just look at SP3 as an example: Ryzen is at roughly SB IPC, and FX is way, way behind as usual. And then we don't even have to talk about overclocking potential. Ryzen will age just as badly for the same reasons: low OC and low IPC.
(attached benchmark screenshot)
 
Ah, here is one of those tests:
https://www.computerbase.de/2017-02/cpu-skalierung-kerne-spiele-test/#diagramm-watch-dogs-2-fps

Yeah, the 2500K can overclock better, but we do not see the same 20% disparity between the 2500K and Bulldozer that showed up when they tested those CPUs at 640x480 back in the day.

The same may be true for comparing the R5 to an i5 without HT today.

How many SB owners do you know running 1333MHz memory, versus 1866MHz for the FX owners?

Right, a fixed benchmark is what it was. AMDbase.de did a good job with the selection too :D
 
We can also argue that gaming benchmarks at higher resolutions don't take the entire game into account; for example, the only CPU that hits a steady 60 FPS anywhere in FO4, at any resolution, is an OCed 6700K with fast memory.

But then you have to wonder why you'd spend extra money for no (or maybe minimal is a better word) tangible difference at higher resolutions. Who really cares what is taken into account if the performance is indistinguishable? Like I said earlier, the ~$200 difference in a budget is essentially the difference between a 1070Ti/1080 and a 1080Ti.
 
Average fps at high resolution is far less important than minimums when you are looking at the CPU. There might only be a small difference in the average, but one could be much choppier.

Fallout 4 is a prime example of a game where GPU bound tests won't tell you much about relative CPU performance.
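A quick illustration of why the average hides choppiness: take two made-up frame-time traces with roughly the same average FPS and compare their worst frames (the numbers are invented, not from any benchmark):

```python
# Two hypothetical frame-time traces (in ms). Invented for illustration only.
smooth = [16.7] * 99 + [18.0]       # steady ~60 fps throughout
choppy = [14.0] * 95 + [50.0] * 5   # mostly fast, with a few visible hitches

def summarize(name, frames_ms):
    avg_fps = 1000 * len(frames_ms) / sum(frames_ms)
    worst_fps = 1000 / max(frames_ms)  # FPS during the single worst frame
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps")

summarize("smooth", smooth)
summarize("choppy", choppy)
```

The "choppy" trace actually has the higher average, but it dips to 20 fps on its worst frames, which is exactly the kind of difference a GPU-bound average-FPS bar chart will never show.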
 