Intel's 8th Generation Core Family - Coffee Lake (LGA 1151, 6C/12T)

Discussion in 'Intel Processors' started by -Sweeper_, Apr 19, 2017.

Where do you expect Core i7-8700K's Turbo to land?

Poll closed Jul 25, 2017.
  1. 3.8/3.9 GHz

    0 vote(s)
    0.0%
  2. 4.0/4.1 GHz

    3 vote(s)
    23.1%
  3. 4.2/4.3 GHz

    6 vote(s)
    46.2%
  4. 4.4/4.5 GHz

    3 vote(s)
    23.1%
  5. 4.6/4.7 GHz

    1 vote(s)
    7.7%
  1. RPGWiZaRD

    RPGWiZaRD Gawd

    Messages:
    700
    Joined:
    Jan 24, 2009
    Word, I'm not that kind of person, and it wouldn't be worth the hassle to me to be without a workstation for that long just to pull the CPU and install a new one; it takes forever to get those CPU cooler heatsink screws tightened down! But I'd hope my luck would turn at least once. I always get slightly below-average samples, though this one would be the worst of the lot I've had. I figured the worst 8600K samples would do 4.8 GHz at 1.32~1.33 V (4.9 GHz at ~1.4 V or so), while the best 8600K I've seen so far did 5.2 GHz at 1.35 V stable on air, or 5.3 GHz at 1.41 V (with high temps).

    Maybe one day I'll get the average+ chip, one day!!
     
    Last edited: Dec 7, 2017 at 6:27 PM
    Nightfire likes this.
  2. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    If an enthusiast overclocker gets a chip in the, let's say, bottom 10th percentile (4.8 GHz at 1.35 V), I'm cool with them returning it if they want to. There's a certain point where a K-series chip loses its value; people deserve to get what they paid for. And if someone isn't happy with their purchase, they're allowed to return it, usually with a 10-15% restocking fee. Newegg Premier and Amazon Prime exist for situations like this, too. The cost of convenience is already covered in your membership.

    If they want to go through a dozen CPUs to get a golden sample, then it's the retailer's responsibility to cut them off.

    Also if I had a chip that bad, I'd worry about early failure. That's some weak silicon.
     
    Last edited: Dec 7, 2017 at 6:51 PM
    Nightfire likes this.
  3. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    Anybody know if the CPU will fry if you get LM on the 4 contacts to the bottom-left of the die?

    [IMG]
     
  4. legcramp

    legcramp [H]ardForum Junkie

    Messages:
    10,512
    Joined:
    Aug 16, 2004
    How long do you guys stress your CPU and under what program until you declare it stable?
     
  5. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    I do 3 or 4 hours of RealBench Encoding + Multitasking. However, all of my unstable OCs have failed in less than 1 hour (every OC of mine that survived more than 1 hour has turned out to be stable). The new version of RealBench supports AVX if that's something you want to test.

    Prime95 hasn't been reliable for me since Haswell.
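    If you want to script the same kind of timed window outside a GUI tool, here's a minimal sketch, assuming a console stress tool of your choice; the command in the example is just a placeholder, not a RealBench or Prime95 invocation:

    [CODE]
    # Minimal sketch of a timed stability window (Python). Assumption: you point
    # it at any console stress tool you already trust; the example command below
    # is a placeholder, not a RealBench/Prime95 command line. A hard crash or
    # BSOD obviously can't be logged by the script itself.
    import subprocess
    import time

    def timed_stability_run(stress_cmd, hours=3.0):
        """Launch a stress command and watch it for a fixed window.

        Mirrors the idea above: unstable overclocks tend to fall over well
        inside the first hour, so any early exit of the stress tool is
        treated as a failed run.
        """
        end = time.time() + hours * 3600
        proc = subprocess.Popen(stress_cmd)
        while time.time() < end:
            if proc.poll() is not None:   # tool exited before the window closed
                return False, hours * 3600 - (end - time.time())
            time.sleep(10)
        proc.terminate()                  # survived the full window
        return True, hours * 3600

    # Example with a placeholder command (stress-ng runs until terminated):
    # stable, seconds = timed_stability_run(["stress-ng", "--cpu", "0"], hours=3)
    [/CODE]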
     
  6. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,474
    Joined:
    Jun 13, 2003
    No OCCT?

    I've been trying to use it, and it's hard to get stuff stable...
     
  7. legcramp

    legcramp [H]ardForum Junkie

    Messages:
    10,512
    Joined:
    Aug 16, 2004
    Thanks guys. I have been doing 3-4 hours of the latest versions of both Prime95 (small FFT) and RealBench to get my stable overclock. Maybe it's overkill, but I find I need 1.360 V vcore to keep Prime95 stable for that long with my delidded 8700K at 5 GHz. RealBench seems to need less vcore for me, and it's maybe a more realistic load. My chip seems pretty average, but I am not using any AVX offset and I am also enabling all the power-saving features. (Idles in the mid-20s and load is in the high 70s at these settings.)
     
  8. oleNBR

    oleNBR Limp Gawd

    Messages:
    171
    Joined:
    Dec 9, 2016
    Ice Lake? 10nm+ with 8 cores in September of this year? WTF, that was 3 months ago; how does that work?
     
  9. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,401
    Joined:
    Jul 1, 2016
    What did you expect? A CPU released and everyone ready the next day with a full QA run? There is a reason why Zen still has a shitload of issues and broken chips and platform problems. And it's called lack of QA.
     
    Last edited: Dec 8, 2017 at 4:10 AM
  10. oleNBR

    oleNBR Limp Gawd

    Messages:
    171
    Joined:
    Dec 9, 2016
    I don't care about Zen. I only wanted to know if ICL is still coming in the 2nd half of 2018, because we all want that 8-core 5 GHz chip, yet the roadmap shows neither ICL nor an 8-core CFL-S.
     
  11. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    I'm down to 1.248v on 5 GHz. I might crank up my fans and see if I can actually hit 5.3.
     
    Shintai and IdiotInCharge like this.
  12. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,401
    Joined:
    Jul 1, 2016
    Roadmaps change as the target comes closer; Cascade Lake wasn't on there before either.

    Proper QA for a CPU is 12-15 months, and it's been so for ages.

    You also wanted a 6-core at 5 GHz, didn't you? Did you buy one? I guess not.
     
  13. oleNBR

    oleNBR Limp Gawd

    Messages:
    171
    Joined:
    Dec 9, 2016
    Nope, couldn't get a 5 GHz 6-core; it sold out too fast. Also, the laptop I want to put a 6-core CPU into didn't really change its heatsink design, and it's barely enough for a 4-core CPU. Once 8-core hits there will be TWO laptops that can take it, and I'll look then.

    Remember Eurocom confirmed an 8-core Z390 for next year? I'm hoping that's still coming and Intel is just purposely not showing it on the roadmap so that people buy 6-cores for now.
     
    Speedeu4ia likes this.
  14. juanrga

    juanrga [H]ard|Gawd

    Messages:
    1,157
    Joined:
    Feb 22, 2017
    So they got a quad-core working in a mobo that supports up to quad-cores? Color me impressed! :rolleyes:
     
    Shintai likes this.
  15. juanrga

    juanrga [H]ard|Gawd

    Messages:
    1,157
    Joined:
    Feb 22, 2017
    Shintai likes this.
  16. Hagrid

    Hagrid Kyle's Boo

    Messages:
    6,560
    Joined:
    Nov 23, 2006
    Technical in that they want to milk more money. Nothing out of the ordinary.
     
    Nightfire, oleNBR and kirbyrj like this.
  17. juanrga

    juanrga [H]ard|Gawd

    Messages:
    1,157
    Joined:
    Feb 22, 2017
    Indeed, remember that Intel and Nvidia are pure evil! There are absolutely no technical reasons behind their decisions. [End sarcasm]
     
  18. Hagrid

    Hagrid Kyle's Boo

    Messages:
    6,560
    Joined:
    Nov 23, 2006
    No, what you're saying is Intel rules and AMD sucks.
     
  19. PhaseNoise

    PhaseNoise Gawd

    Messages:
    647
    Joined:
    May 11, 2005
    I find this frustrating.

    If you're not pleased with Shintai and Juanrga's posts and information then come back at them with facts and references. Can we please stop with the one-liner snipes at them for having a bias? A bias is irrelevant if they are posting verifiable facts. Do the same to combat them. If you do not, most of us will assume you don't actually have an answer.
     
  20. Brackle

    Brackle Old Timer

    Messages:
    6,962
    Joined:
    Jun 19, 2003
    Or put them on ignore. It has made this forum a better place in my opinion.
     
  21. OmegaStarAlpha

    OmegaStarAlpha n00bie

    Messages:
    8
    Joined:
    Aug 26, 2015
    I'm tempted to wait for that.
     
  22. Hagrid

    Hagrid Kyle's Boo

    Messages:
    6,560
    Joined:
    Nov 23, 2006
    It would just make it easier for them to do the one-liners. It's not like you will get an unbiased answer, which doesn't really help anybody.

    I hope 10nm+ goes well.
     
  23. kirbyrj

    kirbyrj Why oh why didn't I take the BLUE pill?

    Messages:
    21,887
    Joined:
    Feb 1, 2005
    As an Intel user I will say this...

    There is no point in trying to "combat" certain people in this forum. Some individuals have a distinct tendency to post obscure "facts and references" that prove the point they are trying to make, not just in this thread but all over the forum. One egregious example: not long ago there were several slides pointing out that the 8700K was significantly better than Ryzen at 720p, which is pretty much a useless metric. Another: early on, when the argument was whether the 6C/12T mainstream part was a response to AMD, certain individuals roundly criticized the idea at the time, yet supply still isn't sufficient to drive the price down to MSRP (unlike previous Intel releases), but supposedly it wasn't a "rushed" launch. Those same people then dismiss any "facts and references" that paint other brands in a good light, such as multi-threaded Cinebench scores, or the lack of difference while gaming at 4K for far less money. Even on the value proposition they refuse to acknowledge that any other brand might be worthwhile.

    Personally, I see a fantastic "value" in a sub-$200 6C/12T Ryzen 1600 (or a $230 8C/16T part when it's on sale). I've been called an AMD shill for pointing this out even though I'm running an Intel setup. I played through Assassin's Creed Origins recently on a Ryzen system and never felt like I was missing out on anything I would have gotten with an Intel setup. In my mind, the consumer wins when there isn't a significant difference that affects the overall experience (even if it gets 10 fewer FPS on Ultra at 1080p with a 1080 Ti, according to Techspot).

    I've gotten 4 or 5 "3 day bans" for debating back and forth with certain individuals because they hit the report button over nonsense, claiming a personal attack (at least twice because of this thread). I find that far more frustrating than anything Hagrid has ever posted, even if I don't agree with everything Hagrid posts. I've been much more satisfied with certain individuals on my ignore list, as they seem to be on personal missions to slay the opposition for no good reason.

    At the end of the day, if people want to spend $400 on an 8700K, that's fine. I won't spend that kind of money because:

    1. The cost of Ryzen parts (CPU and motherboard), the commitment to the AM4 platform through 2020, and the relative performance of those parts compared to Intel.
    2. The relative cost of a 7820X when it's on sale ($450 from Newegg last month) and the 7800X (sub-$300), along with the probability of running another generation of chips on the X299 platform and the low future cost of off-lease Xeons, which are now locked out of the mainstream platform post-Haswell.
    3. The impending launch of 8-core mainstream parts sooner rather than later, along with the Z390 chipset also coming sooner rather than later, with only speculative support for new CPUs on the Z370 platform (similar to the way Intel abandoned Z270 in 8 months).
    4. The fact that I would have been more than happy to drop an 8700K into my perfectly functioning high-end Z270 board, but I'm not dropping another $200 on a similar high-end board with a similar chipset on the same process node and the same architecture just because of some magical power pins which may or may not actually make a difference.

    THIS IS MY OPINION. Unless someone's name has "Official ____ representative" in their title and has been authorized to make statements to the public, we are all sharing opinions and have different expectations and uses for our computers. I'm not attacking anyone who purchased an 8700k, nor anyone that likes their Ryzen setup. If it performs like you want and you're happy with it then I'm happy for you.
     
  24. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    [IMG]

    This is an AMD fan talking point and it goes against 20+ years of CPU benchmarking. It needs to die.
     
    juanrga, Shintai and IdiotInCharge like this.
  25. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,474
    Joined:
    Jun 13, 2003
    Yup. Straight up scientific method, and suddenly we should pretend that science is bad!

    It's faster at gaming, every day of the week. It's faster today, and it'll be faster tomorrow, and given how long people keep CPUs, that's an incredibly important piece of information for potential purchasers.
     
    juanrga, kalston and Shintai like this.
  26. kirbyrj

    kirbyrj Why oh why didn't I take the BLUE pill?

    Messages:
    21,887
    Joined:
    Feb 1, 2005
    When exactly was the last time you had a 720p screen? I haven't had one since 2007...a useless metric.

    But well done in proving my point...a petty meme posted based on one line that caught your eye rather than a coherent response to the overall idea of what I was getting at.
     
  27. Hagrid

    Hagrid Kyle's Boo

    Messages:
    6,560
    Joined:
    Nov 23, 2006
    But is it faster in a way that some/most people will notice? Why spend hundreds of extra $$ if they'll see no difference? That is also a point. :)
     
  28. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    When was the last time anyone played CineBench for fun? That's never stopped certain people from citing the score as a performance metric, along with a dozen other synthetic benchmark tools.

    We can say that 720p testing is the most accurate, since it does the best job of removing all other bottlenecks in the system. However, deciding which metrics are "useless" is a pointless debate, since the answer differs from person to person. Usefulness depends on what monitor and GPUs you plan on running during the lifespan of your CPU. For example, if someone doesn't plan on getting a GPU faster than the 1080 Ti before they replace their CPU (3-5 years for the average person?), then 720p benchmarks aren't useful for them, since they will always be bottlenecked by their GPU's performance.

    In other words, as GPUs get faster, the amount of error from today's non-720p CPU benchmarks will increase.
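    A toy frame-time model makes that point concrete. This is just an illustration of the reasoning with made-up numbers, not benchmark data:

    [CODE]
    # Toy bottleneck model: the slower of the CPU and GPU pipelines sets the
    # frame time. All numbers are hypothetical, purely to illustrate the point.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_fast, cpu_slow = 5.0, 7.0    # hypothetical per-frame CPU cost (ms)
    gpu_4k, gpu_720p = 12.0, 3.0     # hypothetical per-frame GPU cost (ms)

    print(fps(cpu_fast, gpu_4k), fps(cpu_slow, gpu_4k))      # ~83 vs ~83: GPU-bound, the CPUs look identical
    print(fps(cpu_fast, gpu_720p), fps(cpu_slow, gpu_720p))  # 200 vs ~143: the CPU gap is exposed
    [/CODE]

    As the GPU cost per frame shrinks with faster cards, the high-resolution numbers drift toward the 720p-style numbers, which is exactly the "error" growing over time.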

    I would just like to point out 2 things:

    1. The AMD community has been a staunch defender of fairness in benchmarks, re: AMD vs Nvidia, but for some reason they are totally cool with artificial GPU bottlenecks when it comes to CPU testing. I have a feeling if the results were flipped they would also be defending 720p tests. I still would be too, obviously.
    2. Rather than debate a more complex issue it's a lot easier to just say "hurr durr I don't play games at 720p". It's a childish response. These are the same people who use phrases like "Intel's TIM is toothpaste" which I've ranted about in the past.
     
    juanrga, Shintai and IdiotInCharge like this.
  29. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,474
    Joined:
    Jun 13, 2003
    Notice when?

    Game development isn't standing still, and even most enthusiasts don't upgrade their CPU more often than every three or four cycles. It makes sense to get the fastest reasonable thing you can.
     
  30. kirbyrj

    kirbyrj Why oh why didn't I take the BLUE pill?

    Messages:
    21,887
    Joined:
    Feb 1, 2005
    I replace my GPUs pretty much annually, so I guess FOR ME it is a useless metric. I understand what you're saying, and I don't think any rational person is going to say that Intel isn't faster than AMD in CPU bottleneck situations. My point is that at a modern resolution with a modern graphics card there isn't a significant difference to justify almost twice the price unless you are in a situation where the absolute performance is necessary (twitch gaming maybe?). A $200+ difference in price is $200 in someone's pocket and maybe the difference between a 1080 and a 1080Ti.

    So I guess, if I'm going to boil it down, it's an economic argument: something like FPS per $. Personally, I'd take a Ryzen/1080 Ti over an 8700K/1080 combo.
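    Just to make that framing explicit, here's a trivial FPS-per-dollar sketch; the prices and FPS figures are hypothetical placeholders, and the arithmetic is the only point:

    [CODE]
    # FPS-per-dollar sketch. Every number here is a hypothetical placeholder,
    # not a quoted price or a measured benchmark result.
    builds = {
        "8700K + GTX 1080":     {"usd": 400 + 550, "fps": 100},
        "Ryzen 1600 + 1080 Ti": {"usd": 200 + 750, "fps": 110},
    }
    for name, b in builds.items():
        print(f"{name}: {b['fps'] / b['usd']:.3f} FPS per $")
    [/CODE]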
     
  31. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,474
    Joined:
    Jun 13, 2003
    You're making the argument for us: you replace your GPUs annually, and many people replace their GPUs every other generation on average, but do not replace their CPUs.

    That price difference between platforms is minimal if you end up replacing your CPU (and probably your whole platform) earlier because it cannot feed your latest GPU in the latest games.
     
  32. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    People on 144 Hz monitors, mostly. But that goes back to the "usefulness" debate.
    Ryzen dominates on value even in 720p tests. Nobody is questioning that... Well somebody might question that, but not me.
     
    kirbyrj likes this.
  33. kirbyrj

    kirbyrj Why oh why didn't I take the BLUE pill?

    Messages:
    21,887
    Joined:
    Feb 1, 2005
    I guess that's always been my point: there is a value consumer who will settle for 85% of the performance for 50% of the price, and that's not a terrible option. I wasn't trying to be belligerent before with the 720p test statement.

    I misread what he said before. I get it from what you're saying here and in the other thread. Case in point: the Assassin's Creed Origins CPU test. 1080p is a common average-Joe gaming resolution, and it's a pretty CPU-heavy game (if only because of the DRM). I'm not sure the average gamer is going to be disappointed, or even notice the difference between the 8700K and 1600X, when he has an extra $200 in his pocket. And I say that while noting that you're not an average gamer: you hold your system to a higher standard and purposely spend extra money to get the performance you're looking for. In two years, with a 2080 Ti or whatever comes out and a newer game engine, I don't know for sure that the difference will be more than the same 10% or so with the same CPUs.
     
    IdiotInCharge likes this.
  34. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,474
    Joined:
    Jun 13, 2003
    Well, my standard isn't just mine: I'm applying logic learned from decades of upgrading. And that's this: GPU goes longer than CPU in the near term, and CPU goes longer than GPU in the long term.

    My advice would be to not short-change yourself on either.
     
    kirbyrj likes this.
  35. legcramp

    legcramp [H]ardForum Junkie

    Messages:
    10,512
    Joined:
    Aug 16, 2004

    I just want to say that for anyone looking to upgrade for a game like PUBG, the 8700K is worth every single penny over a Ryzen 1600 @ 4Ghz even at 1440p or even 4K.

    The Ryzen system felt okay for the game but with the 8700K you would think you're running Team Fortress 2 or something.
     
  36. Nightfire

    Nightfire Limp Gawd

    Messages:
    329
    Joined:
    Sep 7, 2017
    Didn't this prove to be BS when comparing the 2500K to Bulldozer at very low resolutions? It was predicted that the 2500K would be better 5 years down the road based on this metric, and 5 years later the similarly priced Bulldozer had pretty much caught up at normal resolutions.
     
  37. Nightfire

    Nightfire Limp Gawd

    Messages:
    329
    Joined:
    Sep 7, 2017
  38. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,584
    Joined:
    Aug 5, 2013
    We were discussing equal core chips (R5 1600/X vs 8700K).
    If somebody wants to gamble on extra core utilization (R7 vs 8700K) then that's their own risk. Benchmarks can't predict the future.

    But do you really want to sit on an inferior chip for 5 years hoping for it to be competitive some day? I challenge you to find a Sandy owner who wishes they got Vishera instead.

    I wouldn't compare Sandy/Vishera to Ryzen/KBL/CFL at all, though.
     
    Last edited: Dec 10, 2017 at 4:55 AM
  39. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,474
    Joined:
    Jun 13, 2003
    So, I want to take a moment to explore this idea- not as a refutation or even an argument really-

    We can't predict, going forward, what will actually be more important for gaming, or any application. Will six real cores (8600k) be enough? Will four real cores with four hyperthreaded cores (quad-core R5's, 7700k) be enough? Hell, will six cores with hyperthreading (R5 1600+, 8700k) be enough? How will applications balance increased threaded resources, versus increased single-core performance?

    If I were to make a bet, it would be that the answer is somewhere between the two: however, if I were to take a 'worst case' assessment, it'd be that with CPUs with similar aggregate performance, say an 8700k and the R7's, higher single-core performance has a greater chance of providing better performance in future applications than greater threaded resources.

    Part of the reasoning is like this. While the R7's have similar number-crunching ability, and are duly impressive, not everything is parallelizable. This means that the performance floor in less parallelizable tasks, if you will, is bound to single-core performance. This is what drives maximum frametimes, and what really affects how 'smooth' a game is perceived to 'feel'.
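    That floor is basically Amdahl's law. A minimal sketch, assuming a made-up parallel fraction purely for illustration:

    [CODE]
    # Amdahl's-law sketch of the performance-floor argument above.
    # The 60% parallel fraction is a made-up illustration, not a measured figure.
    def speedup(parallel_fraction, cores):
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    for cores in (4, 6, 8, 12, 16):
        print(cores, round(speedup(0.60, cores), 2))
    # 4 -> 1.82, 6 -> 2.0, 8 -> 2.11, 12 -> 2.22, 16 -> 2.29
    # Gains flatten quickly; the serial 40% (and thus per-core speed and IPC)
    # sets the frame-time floor no matter how many cores you add.
    [/CODE]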

    The other part is that I'm betting games will become more complex over time. With increasing complexity will come increasing difficulty in splitting up workloads and making use of many-thread resources. Based on that bet and understanding how single-core performance affects overall gaming performance, I'd prefer the CPU with the faster cores and higher IPC (together!) so long as it has enough cores to keep the chosen application fed and keep OS and other background tasks out of the way.


    And here's where my bet fails: in the off-chance that games don't get more complex, that whatever complexity that does come about doesn't eat up the resources (potentially) freed by low-overhead APIs like Vulkan and DX12, or if developers get much better at splitting out resources across threads and manage to mitigate the need for higher single-core performance, then betting on Ryzen may have been the better bet.


    And in summary, I feel that both will happen. One artifact is that consoles have had their hardware die cast, more or less, so development toward better threading and lower reliance on single-core performance has been ingrained, if not wholly successful. Another is that developers and publishers see the usefulness (and market!) of PC gaming staying steady and are willing to put in the resources to make stunning, if not entirely compelling, games.