Intel's 8th Generation Core Family - Coffee Lake (LGA 1151, 6C/12T)

Where do you expect Core i7-8700K's Turbo to land?

  • 3.8/3.9 GHz

    Votes: 0 0.0%
  • 4.0/4.1 GHz

    Votes: 3 23.1%
  • 4.2/4.3 GHz

    Votes: 6 46.2%
  • 4.4/4.5 GHz

    Votes: 3 23.1%
  • 4.6/4.7 GHz

    Votes: 1 7.7%

  • Total voters
    13
  • Poll closed.
Dear lord,

Okay, certain reviewers produce results that are 20-40% higher than the mean because MCE is active, and they then claim it as stock performance. The rest of the reviewers have it disabled, or have no such feature at all, and give a representation of the product out of the box, i.e. as it should be. The consumer buys the product expecting the former's results but gets the latter's. This is already a misrepresentation, one that gives rise to a cause of action where party A puffs or touts X performance but it doesn't happen. The loser is the person who shelled out X dollars to buy it. Now stop being dense; you owe nothing to Intel, and they certainly don't give a crap about you.

You're talking about motherboards and motherboard reviewers missing a feature- and you want to blame that on Intel?
 
It's in small part the motherboard manufacturers' fault for sending review samples with a cheat feature enabled; it should have been off by default.
It's almost entirely the reviewers' fault, both for not knowing what MCE is and for not noticing their clocks were high during benchmarking.

IDK if Intel had anything to do with it. I wouldn't attribute it to malice either, since it doesn't seem like they were intentionally trying to inflate numbers. The feature is on by default because 99.9% of people who buy these mobos will want the free extra clock speed.

Just by browsing around a bit, it seems my preconceived notion that JayzTwoCents is a complete moron has been confirmed a bit more.
 
Wrong. XFR is an extended boost available on only 3-4 SKUs, and AMD made it well clear how it operates.

MCE is a permanent 4.7 GHz overclock across all cores. The 8700K's 4.3 GHz all-core turbo is circumstantial, depending on various factors; it is a best case. The extreme splits between no MCE and MCE were something like 1300 vs 1550 in CB15, massively boosted scores. Also, XFR doesn't operate in CB15 MT tests; the 1700-1800X all run 3.5-3.6 GHz across all cores, and lastly, all XFR runs were marked as XFR-enabled. So yes, MCE is an overclock, it runs outside Intel's listed turbo states, it is motherboard-vendor controlled, and Intel tried to cover its boost policy in a shady way.

Then Guru3D posted about Intel refusing to disclose turbo clocks, then backtracking to say they will only disclose base clock and single-core turbo in the future.

All reviews, as Steve Burke stated, need to be re-tested with MCE off, since this feature is only offered on higher-end Z370 boards; not all Z370s have it, and further chipsets to come do not feature this option. B- and H-series boards are usually locked and OC limits are extremely low. I am clearly not the only one who called this a spade: a gimmick to artificially push the 8700K above the 7700K.


AMD didn't explain well how XFR operates. In fact, sites such as CanardPC reported that AMD deleted all info about XFR from technical docs just before launch and provided only marketing material to reviewers. And I remember how initially people believed only X models have XFR, until it was later found that non-X models also have XFR boost. One can make similar comments about the close relationship between memory clock and IF clock, which wasn't initially communicated by AMD and was only discovered by reviewers later, when they found huge variations in performance when overclocking RAM. Finally, we could also devote chapters to all the dirty tricks that AMD used during early demos of Zen to increase performance. Your Steve Burke mentioned some of those tricks, and how AMD tried to influence his review to favor RyZen with weird gaming settings. All this happened, but I have never seen you complain about any of it. On the contrary, you defended AMD then.

So stop your AMD-is-fair but Intel-is-evil propaganda. The points relevant to this discussion are:
  1. Reviews of Zen used XFR activated and 140W-rated coolers for R7 models. No problem with using that hardware feature, but there is a problem with reviews not disclosing that their scores (e.g., CB15 ST) are with XFR activated. You never complained.
  2. Reviews of Zen had memory overclocked to 3200MHz or higher, which overclocks the chip interconnect. No problem with using overclocked RAM, but there is a problem with reviews not disclosing that their scores (e.g. games) aren't for stock clocks, because the interconnect is overclocked. You never complained.
  3. Some sites that enabled MCE are also reporting temperatures and power consumption as if they were stock, and then pretending that CFL is hot and power hungry. You ignore this part.
  4. Sites that tested with MCE both enabled and disabled report a smaller gap than you pretend. It is 1429 vs 1583 or 1448 vs 1578, not "1300 vs 1550" as you claim. That 20-40% higher performance is only in your mind. Even the Steve Burke review, which you quote repeatedly in this thread, gives 9% higher performance with MCE enabled.
  5. You make it sound as if all reviews enabled MCE and misled people, but this is far from true. The CB15 MT scores of reviews are: TweakTown (1395); TrustedReviews (1390); ArsTechnica (1530); ExtremeTech (1446); LegitReviews (1449); PCWorld (1400); HotHardware (1522); Guru3D (1296); AnandTech (1364); KitGuru (1404). So only two sites tested with MCE enabled; the rest did so with MCE disabled. And one of the two that tested with MCE enabled devoted paragraphs to silly rants about power consumption, TIM, and temperatures. I am not surprised, because Ars Technica is pro-AMD and the guy is completely ignorant.
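For what it's worth, the gaps in point 4 are easy to check; a quick sketch using the CB15 MT score pairs quoted there:

```python
# Percent gain from enabling MCE, using the CB15 MT score pairs quoted above.
pairs = [(1429, 1583), (1448, 1578)]  # (MCE off, MCE on)
for off, on in pairs:
    gain = (on - off) / off * 100
    print(f"{off} -> {on}: +{gain:.1f}%")
# Both pairs land in the ~9-11% range, nowhere near 20-40%.
```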
 

Holy cow, I was gonna say it's all the reviewers' fault for not knowing the features of the mobo, and you came in with this fat comment. tl;dr.

OrangeKhrush, it's true, reviewers need to pay attention to these things. If they did 7700K reviews less than a year ago, a simple comparison can easily show there's something wrong.

7700K: 4 cores, 4.5 GHz all-core turbo
8700K: 6 cores, 4.3 GHz all-core turbo

50% more cores at a lower frequency will result in around a 40-45% increase in well-optimized software like CB15, yet they hit over 50%. Either their 8700K review is very wrong or their 7700K review is very wrong; it's not that hard. If they can't figure this out, they have terrible insight and there's no point following them further. This is also why I love AdoredTV's video.
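The estimate above can be sketched with simple arithmetic, assuming CB15 MT scales near-linearly with cores times all-core clock (an idealization; real scaling is slightly worse):

```python
# Rough throughput proxy: cores * all-core turbo clock (GHz).
# Assumes CB15 MT scales near-linearly with both, which is an idealization.
i7_7700k = 4 * 4.5   # 18.0
i7_8700k = 6 * 4.3   # 25.8
uplift = (i7_8700k / i7_7700k - 1) * 100
print(f"Expected 8700K uplift over 7700K: ~{uplift:.0f}%")  # ~43%
```

Anything well above ~45% at stock settings would suggest one of the two reviews, or the clocks actually applied, is off.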
 


Foaming at the mouth, boy, even raising issues of non-contention here. It would only be Juan who talks about AMD when AMD wasn't mentioned once until you brought it up. Cool, so according to you XFR works on all models; I'll add that to my list of unicorns for future debate. Address the topic and stop sidestepping; I know you do that very well.

FYI, you can watch the video and see that a great many more tested with MCE, and the ones in the 1300s didn't have the feature on, or the vendor has it off by default.
 

I mentioned AMD only because you are accusing Intel of playing dirty on reviews, yet you always avoid mentioning how dirty AMD plays. Do I need to give you another summary of all the dirty tricks AMD used in Zen demos, or of how AMD tried to manipulate early RyZen reviews?

I said that initially people believed only X models such as the 1800X or 1700X have XFR, and it was later found that non-X models have XFR as well. The 1700 has it; the 1600 has it; the 1400 has it; the 1200 has it. I didn't write that "XFR works on all models"; I didn't write the word "all", so stop pretending that I did.

And of course, you ignore the rest of the points in my message, such as my demonstration that most mainstream reviews tested with MCE disabled, or that those that tested with MCE both disabled and enabled found a performance boost of about 9-11%, instead of the nonsensical 40% that you keep repeating.
 
This is AnandTech's bench testbed spec: almost running it overclocked. Almost. I guess all reviewers ran Ryzen with OC memory.

This is TechSpot's review of Ryzen running DDR4 at 2666MHz, while even running Intel on higher-spec memory:

https://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page2.html

So it seems the troll struck again.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
So, it is all fine when AMD reviews used XFR and 140W coolers, despite this being a kind of automatic overclock. It is all fine when AMD reviews use overclocked RAM (which automatically overclocks the chip via Infinity Fabric). It is also fine when reviews test AMD chips overclocked but label them as stock in graphs.

But if Intel reviews use MCE, then it is all unfair and unethical. LOL

Stop, please. Even Intel calls MCE an overclock.



9% is accurate when comparing 4.3 GHz vs 4.7 GHz with MCE activated. But remember, Intel does not actually guarantee even the 4.3 GHz all-core clock of Turbo Boost 2.0. Several things have to be ideal: type of workload, number of active cores, estimated current consumption, estimated power consumption, and temperature. This explains the lower scores of Guru3D and others. Most review chips and premium motherboards dish out 4.3 GHz no problem. However, if you get a real turd of a chip, it could be as low as 3.7 GHz all-core, which is a 27% difference from MCE's 4.7 GHz.
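A minimal sketch of the clock-ratio arithmetic behind these figures (clock values taken from the post above; treating performance as proportional to all-core clock is the simplifying assumption):

```python
# Clock-ratio arithmetic for the 9% and 27% figures above.
mce       = 4.7  # GHz, MCE pins the single-core turbo across all cores
spec      = 4.3  # GHz, typical (not guaranteed) all-core Turbo Boost 2.0
turd_chip = 3.7  # GHz, a worst-case poorly binned chip
print(f"MCE vs spec all-core:  +{(mce / spec - 1) * 100:.1f}%")       # +9.3%
print(f"MCE vs worst all-core: +{(mce / turd_chip - 1) * 100:.1f}%")  # +27.0%
```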
 






Intel can be implicated in that they choose which boards and chips to send to "friends of Intel", and the scraps go elsewhere: normally the entry Z370 boards and likely repurposed chips, or i5s as a "you can have this as a token of what we think of you".
 
That explains why the review scores were all over the place. MCE being enabled allows the CPU to be overclocked for benchmarks, and it's unlikely to be able to sustain those clocks much beyond that due to thermals. That is pretty shady of Intel, allowing that to default to on, but then again they don't even want to bother telling you the multi-core clocks anymore.
 
I get this is a Coffee Lake thread, but AFAIK there is no 10nm Intel products thread, and I thought some would like this somewhat good news:

 
MCE is enabled by default on a lot of high-end ASUS boards. When you have a K-SKU CPU, a good mobo, and good cooling, all-core turbo has virtually always worked on Intel CPUs since Sandy Bridge days, and nobody complained once until the 8700K showed up and started kicking the 8C/16T Ryzen's butt.

I'm all for reviewers going apples to apples, stock vs stock, but if they are going to do that it needs to be MCE disabled on Intel with RAM at defaults, and the same with AMD: no RAM overclocking. JayzTwoCents did it the right way in his revised review, and guess what, nothing really changed; both CPUs just went down a notch in performance overall. The 8700K still won easily overall, only losing the core-count-happy Cinebench R15 test, which nobody cares about unless you're into software rendering, and if you are heavily into that, Threadripper is clearly the CPU and platform of choice, not a consumer platform meant for gaming and general tasks.
 
I posted this in our review thread, but it deserves to be said again here. We worked with ASUS years ago to get MCE into the UEFI when TurboBoost was new as we were having issues getting the clocks locked across all cores. This feature is literally years old. If you are reviewing CPUs and are not aware of MCE, you have no business reviewing CPUs. If you are reviewing CPUs and NOT checking core clocks before running a single benchmark, you should not be reviewing CPUs. Who the hell does NOT check the core clocks before running a single benchmark? The fact that "reviewers" don't check the basics is simply crazy to me.
 
Yep, it's been around so long I was pretty much dying laughing yesterday when I saw the YouTubers posting videos about it and people losing their minds in the comments. It was pretty good.
 
https://software.intel.com/sites/de...tion-set-extensions-programming-reference.pdf

Updated Instruction Set from Intel, includes Icelake, Cannonlake and Goldmont Plus
 
I guess all reviewers ran Ryzen with OC memory

Not all reviewers test RyZen with OC memory. And my complaint wasn't about overclocking memory (and thus automatically overclocking the Zen-based CPU's IF). My complaint was about reviews that do, but then present the Zen-based chip as stock in graphs or in the main text of the review. I don't care if AnandTech or Ars Technica run a RyZen chip with the interconnect overclocked to reduce latency and increase performance. But I expect them to add an "OC" label in the graphs, because the scores aren't for stock settings.

The fact that any reviewer knows that overclocking RAM on RyZen/ThreadRipper chips overclocks the chip (via the relation between IF and RAM) makes it clear that those reviewers are simply misleading readers.

Not one of you has written a single line of complaint about those reviews. Yet several of you come here to start another conspiracy theory when two or three reviews tested CFL with MCE activated and labeled the chip as if it were stock. The immense majority of reviews tested with MCE disabled when reporting stock settings. I gave you a list.

Stop, please. Even Intel calls MCE an overclock.

No one said MCE is not an overclock. So stop attributing to me things I didn't say, and pay attention to my point, because you have ignored it completely.


9% higher performance is what the GN review measured. The same GN review that OrangeKrush used to try to convince us that performance was 20-40% higher with MCE activated.

With MCE disabled, other sites measured 1429cb and 1448cb. Guru3D measured 1296cb, the lowest score of all the reviews that I listed.

Guru3D is a biased and useless site. They always obtain lower scores for Intel chips and higher scores for AMD chips than everyone else. This is not new. MCE doesn't have anything to do with it.
 
Guru3D bias. TechSpot is also useless, if I recall older threads.

Can we get a list of IDF-approved sites so we don't have to go through this every time?
 
Only an irresponsible reviewer would run benchmarks they are planning to publish without first verifying how many cores are running and at what speed. If reviewers were being more thorough, they would have seen that something was amiss, and we would not be having this ridiculous discussion.
 

Reviewers make money on how controversial they are, as that makes them "different" and thus linked everywhere and clicked more often.

Nothing is more controversial than sheer incompetence plus bravado, hence the Donald Trump sites keep Donald Trumping.

"Top 10 ways AMD is going to cause Intel to go bankrupt today! Click now!"

There are very few ways to be correct, but oh so many flavorful and interesting ways to be wrong.
 
I would expect a good reviewer to know about this. Shocked that some websites did not. Glad Kyle knows what's up. Damnit Kyle, you need to do more YouTube shit!!! Make Linus look like a fool, please!
 
Foaming at the mouth boy

LMAO

This is why Tom's Hardware is still the best; Paul is a legit reviewer, and I trust him way more than the Tom's site itself and AnandTech.
For storage reviews, Chris from Tom's and Allyn from PCPer win hands down.
 
On a completely different note, I built an i3-8100 on an ASRock Z370M-ITX/ac today.

Honestly, quad-core i3 CPUs... It's ludicrously fast for $120.

An i3 on a Z370 chipset is a killer, but I do like the ASRock boards, especially their ITX; big fan of ASRock for sure.
 
Guru3D bias. TechSpot is also useless, if I recall older threads.

It is interesting that you mention both sites, because both do weird things such as testing engineering samples of Intel chips but labeling them in the graphs as if they were retail chips. When someone tests an engineering sample, I expect an "[ES]" label in the graphs.
 
I delidded my i3-8350k this morning. This is not only my first build but the first time I tried delidding a processor also. I watched a bunch of videos and got the Rockit.88 kit. The delidding was a breeze. I fix small musical instrument parts for a living so I wasn't extremely worried about it, but then again I've never done this before so didn't know exactly what to expect. The delidding process went without a hitch.

I ordered some of those plastic razors I saw Kyle Bennett use in his 8700K delid. Those worked great for removing the old glue without damaging anything.

I used Thermal Grizzly Conductonaut. Seems like people like that stuff. I did not leave very much on there... All the videos I watched said don't leave blobs. There's LM on there, but it's a very thin layer. Hopefully that's what everyone meant. I used tape to cover the chip and was pretty obsessive about not shorting anything out with stray LM. Kind of a scary thought since this computer is 2 days old and worth almost 3 grand... Yikes.

I used Loctite superglue gel to re-lid. I saw a few instances of guys like KB using red RTV, but I don't really see the point... The Silicon Lottery guys were asking people on the OC-UK forums why anyone even bothers to glue them back together, since the mounting pressure will keep the lid on, but I decided somewhere in the middle made sense; also, I'm impatient and didn't want to wait more than about 30 minutes for the thing to stick together. The gel worked great and was more than stable enough after 20-30 minutes to install back in the CPU socket.

So... I put everything back together (no easy task in this tiny case) and did some really tricky cable management that I'd been putting off until after the delid, and couple hours later I was ready to check 'er out. Plug in, press power, nothing... Shizzz. Not even a sizzle! Pulled the fan mount off, hooked my worklight back up, and prepared for the worst. Thank goodness, I noticed almost immediately that buried deep inside the atrocious cable management maze, the power cables to the PSU had simply become unplugged. Pressed those back in, and boy was I relieved when the thing turned back on without a blazing fireworks festival.

I was able to easily overclock to 5.2 GHz and the thing booted right up. Haven't run any stress tests yet, but I am beyond thrilled about owning a computer that can break 5 GHz!!! That's insane for me, coming from the world of Macbook Professionally Slow laptops.

I flew around in the flight sim DCS World 2 for about 15 minutes and then looked at temps. When I overclocked the CPU to 5.0 GHz before delidding, I was getting spikes to 96C and hovering around 75-80C just sitting there doing nothing. Now, after gaming for a bit, I was in the mid-50Cs, and idle is 36-40C. I will run some stress tests tomorrow, but it appears to have been successful.

Sorry for the long-winded post; I'm just excited. Now I'm definitely ready to do this again whenever stores finally get some friggin' 8700Ks in stock, perhaps in the year 2025??

EDIT: I just ran Prime95 for a bit and I'm getting temps around 80-85C at full load (individual cores lower, CPU package at 80-85C). Is that too high? Did I do something wrong? I am under the impression that it'll never reach the temps Prime95 creates while simply gaming or doing other normal tasks. BTW, 5.2 GHz crashed Prime95 immediately, but the 5.0 GHz stock ASUS OC works for now, and that's plenty fast... I guess you could play around with voltages, but 5.0 GHz is fine for me.
 

Attachments: IMG_8161.JPG through IMG_8168.JPG, IMG_8178.JPG (delid photos)
80-85c in prime95 is not bad at all. Don't worry about it.

You must be getting stellar performance in DCS with a 5 GHz CPU. I'm jealous. I could actually buy an 8350K too, but I'd rather wait for the six-core ones to be in stock.
 

Ok great! Thanks for the tip; that's good to know, I wasn't sure what the targets were. It never reaches those temps with normal tasks.

Yes, DCS is working pretty awesomely at 5 GHz on the CPU with a 1080 Ti GPU. I maxed the settings in DCS 1.5 and it doesn't hiccup. Pretty much rules. Dream come true. I've wanted a killer gaming PC since I was a little kid.

The best part for me is how it runs music software. I just can't believe how fast it is. On this 4-core i3 at 5 GHz I can open 4 instances of an incredibly intense synth patch at the lowest possible latency buffer of 64 samples (i.e. the fastest latency response), whereas I could not even run 1 instance of that patch on higher latency buffers on my Mac. Goal achieved.
 
Is a Cryorig H7 tower cooler going to be enough to get a 5 GHz overclock out of an 8700K?

For every Intel *5*0K/*7*0K/*600K/*700K mainstream CPU from Ivy Bridge onwards, you shouldn't be using anything less than

Thermalright True Spirit 140 Direct

https://www.amazon.com/Thermalright-TRUE-SPIRIT-140-DIRE/dp/B01MQCK1PJ/


or

Thermalright True Spirit 140 Power

https://www.amazon.com/Thermalright-True-Spirit-140-Power/dp/B00IYEEOMO/





Anything less than that and you will grace the forums with the old

"OMG PEOPLE TOLD ME Hyper 212 would be best cooler and would be perfect!"

"OMG 7700K is OVERHEATING WITH MY PREMIUM HYPER 212 COOLER EVERYONE RECOMMENDED!"

etc. etc. etc.

Especially if you are on the newfangled trend of cases with no airflow that has been popular since 2014 or so.
 
Probably not. It's only slightly better than the 212 EVO.
If you get a golden chip or delid, then yes. If you're still on the market then I'd recommend the Scythe Fuma.

https://www.amazon.com/Scythe-FUMA-Rev-B-Cooler-SCFM-1100/dp/B075FX95F2



My friend has an H7 on his 7700K and he swears he's stable at 5.1 GHz below 75C. So what do I know?

That would be the HSF I would >>least<< advise using, as it applies massively more than Intel's allowed clamping pressure.

You WILL cause noticeable warping of your CPU package/motherboard.

It also comes with sleeve bearing fans, so plan on having to replace them within 1-3 years.

It would be MUCH safer for you to simply delid your CPU instead of using mounting pressure as massively out of spec as the Scythe Fuma's.

Skylake and onwards have even weaker CPU packages than the ones that came before, so this is extremely important to note.
 
Instead of posting news about Intel's 10nm products every once in a while, I decided to create a separate thread.


Sorry if the formatting and whatnot isn't that great; it's the first thread I've created :)
 
I delidded my i3-8350K this morning. This is not only my first build but also the first time I've tried delidding a processor. I watched a bunch of videos and got the Rockit 88 kit. The delidding was a breeze. I fix small musical instrument parts for a living, so I wasn't extremely worried about it, but then again I'd never done this before, so I didn't know exactly what to expect. The delidding process went without a hitch.

I ordered some of those plastic razors I saw Kyle Bennett use in his 8700K delid. Those worked great for removing the old glue without damaging anything.

I used Thermal Grizzly Conductonaut. People seem to like that stuff. I did not leave very much on there... all the videos I watched said don't leave blobs. There's LM on there, but it's a very thin layer; hopefully that's what everyone meant. I used tape to mask the chip and was pretty obsessive about not shorting anything out with stray LM. Kind of a scary thought, since this computer is 2 days old and worth almost 3 grand... yikes.

I used Loctite superglue gel to re-lid. I saw a few instances of guys like KB using red RTV, but I don't really see the point... The Silicon Lottery guys were asking people on the OC-UK forums why anyone even bothered to glue them back together, since the socket pressure will keep the lid on, but I decided somewhere in the middle made sense, and also I'm impatient and didn't want to wait more than about 30 minutes for the thing to stick together. The superglue gel worked great and was more than stable enough after 20-30 minutes to reinstall in the CPU socket.

So... I put everything back together (no easy task in this tiny case), did some really tricky cable management that I'd been putting off until after the delid, and a couple hours later I was ready to check 'er out. Plugged in, pressed power, nothing... Shizzz. Not even a sizzle! Pulled the fan mount off, hooked my worklight back up, and prepared for the worst. Thank goodness, I noticed almost immediately that, buried deep inside the atrocious cable-management maze, the power cables to the PSU had simply come unplugged. Pressed those back in, and boy was I relieved when the thing turned back on without a blazing fireworks festival.

I was able to easily overclock to 5.2 GHz and the thing booted right up. I haven't run any stress tests yet, but I am beyond thrilled about owning a computer that can break 5 GHz+!!! That's insane for me, coming from the world of MacBook Professionally Slow laptops.

I flew around in the flight sim DCS World 2 for about 15 minutes and then looked at temps. When I overclocked the CPU to 5.0 GHz before delidding, I was getting spikes to 96C and hovering around 75-80C just sitting there doing nothing. Now, after gaming for a bit, I was in the mid-50Cs, and idle is 36-40C. I will run some stress tests tomorrow, but it appears to have been successful.

Sorry for the long-winded post; I'm just excited. Now I'm definitely ready to do this again whenever stores finally get some friggin' 8700Ks in stock, perhaps in the year 2025??

EDIT: I just ran Prime95 for a bit and I'm getting temps around 80-85C at full load (individual cores lower, CPU package at 80-85C). Is that too high? Did I do something wrong? I'm under the impression that simply gaming and other normal tasks will never reach the temps Prime95 creates. BTW, 5.2 GHz crashed Prime95 immediately, but the 5.0 GHz stock Asus OC works for now, and that's plenty fast... I guess you could play around with voltages, but 5.0 GHz is fine for me.
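For anyone wondering how much margin those Prime95 numbers leave: Coffee Lake's Tjunction max (the point where the CPU starts throttling) is 100C per Intel's spec pages, so 80-85C under a worst-case synthetic load is still 15-20C away from throttling. A minimal sketch of that arithmetic, assuming the 100C Tjmax figure:

```python
# Quick sanity check on Prime95 load temps for a Coffee Lake chip.
# Assumption: Tjmax (the thermal-throttle point) is 100C for this
# family, per Intel's published specs.
TJMAX_C = 100

def headroom_c(load_temp_c, tjmax_c=TJMAX_C):
    """Degrees Celsius of margin left before the CPU starts throttling."""
    return tjmax_c - load_temp_c

# The 80-85C package temps reported above leave 15-20C of margin:
for temp in (80, 85):
    print(f"{temp}C under Prime95 -> {headroom_c(temp)}C of headroom")
```

Games rarely push the package as hard as Prime95's small-FFT loads, which is why gaming temps sit so much lower.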

5 GHz at what voltage?
 
Yea, I do plan to wait for the 8600K to be in stock. I game at 1080p, so Intel > AMD for 1080p.

Now I'm curious: is there any place that will delid a CPU for you? TBH I would do it myself, but I don't trust my gorilla-ass hands.
 
Yea, I do plan to wait for the 8600K to be in stock. I game at 1080p, so Intel > AMD for 1080p.

Now I'm curious: is there any place that will delid a CPU for you? TBH I would do it myself, but I don't trust my gorilla-ass hands.

Silicon Lottery.

TBH it's very safe and easy to do even with gorilla hands if you have a tool for it, but if you're just looking for a one-time delid and done, SL is your safest bet.
 