Those who got the 8700k

I disagree with this statement. Can your 2600k run dat sweet 60hz 4k adult film? I need an 8700k for research purposes.

Considering I game at 4K with 120+ FPS, I would say yes... But just to answer the troll post, I have also watched 4K movies at 60 Hz on my system as well.
 
LOL You call a year from now "right around the corner"? What universe do you live in?

Personally, I am getting old enough that a year from now is practically right around the corner... especially given that nothing has emerged to displace my 2600K in what... six years?
 
Delid is too risky... I just need water, preferably a custom loop; I rather detest a closed system. I still don't know why more people don't use one. It's super simple to assemble and hook up, but it's as if people don't know the difference between a fitting and a hose, I swear. Lol
 
Delidding was the best thing I did for this chip. It runs 30°C cooler on average and lets me hit high clocks with little compromise. No electromigration or gate issues to report.
 
Maybe if I ever acquire a proper delid tool. They say the TIM under the lid is not very good quality. That said, with the EK kit I would be happy if we ran in the 50s-60s Celsius range more.
I planned on keeping and running this one a while, OCed of course. Guess it will be a fierce shootout with the TIMs on hand, the Ectotherm and some MX-4. MX-4 does seem to work OK and performs pretty well against aluminum.
 
The TIM is great quality. The reason for delid/relid is to remove/reduce the gap between the die and the copper IHS.
 
If this were true, then regular paste would be more viable instead of always using LM. Look at GPU dies, or the interface between the CPU heatsink and IHS, where LM is worth about 2°C over regular paste.

Die size comparisons:
7700K, 126 mm²
8700K, 151 mm²
GTX 1080, 314 mm²
GTX 1080 Ti, 471 mm²

Conductivity comparisons:
MX-4, 8.5 W/mK
Kryonaut, 12.5 W/mK
Liquid Ultra (CLU, liquid metal), 38.4 W/mK
Conductonaut (TGC, liquid metal), 73 W/mK

My guess is that since the CPU die is so small, there's not enough surface area for paste to be thermally conductive enough, so something like LM, which moves a lot more heat over a smaller area, can do its job properly. This is also why the IHS works so well with regular paste: its surface area is like 10x larger than the die's, so there's plenty of space to spread the heat out as it travels to the heatsink. Also worth mentioning: on delidded CPUs, TGC is only a few degrees cooler than CLU despite having almost twice the conductivity, due to diminishing returns of conductivity versus surface area -- the same type of result we see with paste vs LM over large areas.

The gap isn't causing any problems (well maybe a small part of the problem) and the paste is only "bad" in the sense that it's not metal/solder. When you remove the gap and use better quality paste you shave about 5C off.

It's just an oddity that people will say "Intel uses shitty TIM" and then in the same breath say "When you delid, don't use regular paste, it's not worth it" and also "Don't use LM on your GPU, it's not worth it" without asking why it's so inconsistent. If Intel's TIM is so bad, why does better paste show barely any improvement? If LM is so good, why do GPUs show barely any improvement?
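The area argument above can be sanity-checked with a rough 1-D bond-line estimate, ΔT ≈ P·t/(k·A), using the die sizes and conductivities listed earlier. The ~50 µm bond line and the 150 W / 250 W loads are my own ballpark assumptions, not measured values:

```python
# Rough 1-D bond-line estimate: delta_T = P * t / (k * A).
# Bond-line thickness and wattages are assumed ballpark figures.

def bondline_dt(power_w, thickness_m, k_w_mk, area_mm2):
    """Temperature drop across a TIM layer, in Kelvin."""
    area_m2 = area_mm2 * 1e-6
    return power_w * thickness_m / (k_w_mk * area_m2)

t = 50e-6  # assume a ~50 micron bond line

# 8700K die (151 mm^2) at an assumed 150 W overclocked load
cpu_paste = bondline_dt(150, t, 12.5, 151)   # Kryonaut
cpu_lm    = bondline_dt(150, t, 73.0, 151)   # Conductonaut

# GTX 1080 Ti die (471 mm^2) at an assumed 250 W load
gpu_paste = bondline_dt(250, t, 12.5, 471)
gpu_lm    = bondline_dt(250, t, 73.0, 471)

print(f"CPU die: paste {cpu_paste:.1f} K vs LM {cpu_lm:.1f} K")
print(f"GPU die: paste {gpu_paste:.1f} K vs LM {gpu_lm:.1f} K")
```

Even with these crude numbers, the paste-vs-LM gap on the big GPU die comes out under 2 K, while on the small CPU die it's roughly double that, which lines up with the pattern described above.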



 


Idontcare on AT tested it, until he was chased away (Results). His results were quite telling about quality, the gap, etc. As you also show, something like AS5, which has problems over time, doesn't bring much benefit.

You are certainly right about heat density as well. It's also why the same HEDT cores are much easier to cool.
 
They have always soldered. Then they switched to TIM and temps soared.

The TIM is not shitty. The method they went with is.

I agree with your claims Taint.
 
Idontcare on AT tested it, until he was chased away (Results). His results were quite telling about quality, the gap, etc. As you also show, something like AS5, which has problems over time, doesn't bring much benefit.

You are certainly right about heat density as well. It's also why the same HEDT cores are much easier to cool.

I'll chime in on that one. It's fairly accurate; right now, using the D14 and Liquid Pro, it can still get into the 70s and almost hit 80°C. That's the reason I'm going back to water.
 
Why be mad? Be happy! Be happy that Intel is coming out with an 8-core consumer CPU. Why? Simple: more cores from AMD.

I hope we see a 10 or 12 Core Zen 2 CPU announced soon.

That would be totally awesome.

The 8C/16T mainstream CPU from Intel was planned before Zen. It appeared in roadmaps before Zen taped out. I wonder who is copying whom.

Also, I always read how we have to be grateful to AMD for something; I never read praise for Intel. AMD moving from CMT to SMT, moving from a speed-demon uarch to a brainiac uarch, moving from SOI to FinFETs... these are consequences of Intel offering all of that first.
 
Intel just cheaped out on their chips to save cash; that happens when an accountant makes the call rather than an engineer. It's so bad now that pretty much everyone here who overclocks their Intel chip has to delid it to get reasonable temps.
 
That's what the uninformed would claim. The first excuse was that they cheaped out on consumers while Xeons would be soldered. Yet here we stand today. :)

AMD is pretty much the last company that uses solder, and even they dabble in solder replacements on their lower-end chips.
 
Just an added value for AMD, and go figure, it was on their cheapest chips that they looked at not using solder. But you like to apologize for Intel, so don't let me stop ya. Also, the Xeons would run much cooler if they were soldered, and if AMD reaches parity with Intel at the 7nm node, it will become a big issue for Intel.
 
Yes and the RMA rates for Xeons would be higher too with solder. But that's the difference between someone working with datacenters and someone just making up their own views.

Solder isn't some kind of universal savior. Not that this is an issue that hasn't been covered countless times over the last few years.

And the fabled 7nm node saves it all! :D

#keepwaitforthenextproduct
 
Yeah, you keep saying it. Prove they had a higher failure rate with solder vs using TIM. The only people having issues with solder were extreme overclockers using LN2 multiple times, which can shock the solder and cause it to fail. You keep apologizing for Intel rather than proving it's actually better, when all the current owners see is higher temps, and it's pretty bad when you can't overclock your K chip because it's already close to thermally throttling at stock speeds.
 
Delid is too risky... I just need water, preferably a custom loop; I rather detest a closed system. I still don't know why more people don't use one. It's super simple to assemble and hook up, but it's as if people don't know the difference between a fitting and a hose, I swear. Lol

Disagreed, my 8700K under a custom loop before delid still hit 90+ under 1.3v. Took the lid off and put liquid metal on there and I dropped 15+ degrees with 1.360v. But I do test my CPU to be stable using a few hours of prime95 w/ AVX and no offset instead of an hour or two of Realbench like most do nowadays before calling it stable.

Agreed on the loop though, and maybe some chips are lidded better at the factory than others, so I guess my post would be a YMMV depending on how well your chip was lidded before delidding.
 
So back to the original question, no I don't regret the 8700k one bit. The dang CPU is a crazy beast. I did delid it, and yes it dropped my temps 15-20C. Similar to legcramp's post, mine is now 4.8 stable, on air, with prime 95 AVX or hours of Realbench. I could put in a lot of work and try to take it to 5 or higher, but my goal was going for full stable under any workload and it's there.

And as for the current argument about solder or TIM, how about Intel put the nail polish on themselves, and add the liquid metal TIM themselves. They could even design in a ridge on the IHS that would help keep the liquid metal located. But unfortunately, we know that on the automated assembly line the application of the stuff would be a whole lot more involved than "squirt and close."

But you've got to admit that it would be really really nice on a $400+ CPU to have this kind of stuff done under the hood already. Either that or sell them in a 8700e (enthusiast) variant where the IHS is already off and they've applied protectant to the top of the chip. They can even send a packet of approved high temp silicone to put the IHS on with. Retaining the warranty for enthusiast builders would be easier this way, rather than having all of us more or less forced to delid to get decent temps. Yes, forced. As in, if you want to run your chip [H], you are going to have to delid.
 
Yeah, you keep saying it. Prove they had a higher failure rate with solder vs using TIM. The only people having issues with solder were extreme overclockers using LN2 multiple times, which can shock the solder and cause it to fail. You keep apologizing for Intel rather than proving it's actually better, when all the current owners see is higher temps, and it's pretty bad when you can't overclock your K chip because it's already close to thermally throttling at stock speeds.

https://overclocking.guide/the-truth-about-cpu-soldering/
 
Disagreed, my 8700K under a custom loop before delid still hit 90+ under 1.3v. Took the lid off and put liquid metal on there and I dropped 15+ degrees with 1.360v. But I do test my CPU to be stable using a few hours of prime95 w/ AVX and no offset instead of an hour or two of Realbench like most do nowadays before calling it stable.

Agreed on the loop though, and maybe some chips are lidded better at the factory than others, so I guess my post would be a YMMV depending on how well your chip was lidded before delidding.

Yeah, it depends on the particular chip; the loop may not be that significant vs air sometimes, but your average temps are usually a little better on the loop. So it's still cooler, and every little bit counts, I guess. I need to get a delid tool myself; I still have some liquid metal.
 
So the chip is specced at 4.3 GHz (all-core). It is very easy to run at least 4.7 GHz on good air cooling. I find it hard to sympathize with the position that anyone is really getting screwed.

Maybe something could be better optimized in some cases, but the thing does exceed what it is sold to do by a pretty good amount.
 
Cool, an article that even mentions intense thermal cycling, which no normal CPU use goes through (it mentions using LN2). Now, I asked for failure rates due to solder failure, and why does the supposedly inferior AMD have no issues with it and still use it?

The article mentions more things, including a "Stop hating on Intel". And where is it stated that "inferior AMD" doesn't have issues? How many EPYC/TR are used in mission-critical machines?
 
Once again, prove they have failures with solder in normal use, not LN2. The article even mentions it takes extreme thermal cycling, which is not normal use. Plenty of old Opterons are still used by people, and older soldered Xeons, without issue. Your excuse train never gets old, and you constantly deflect rather than answer.
 
The solder was better, but for some reason they decided to go to a TIM that isn't very good...
 
Once again, prove they have failures with solder in normal use, not LN2. The article even mentions it takes extreme thermal cycling, which is not normal use. Plenty of old Opterons are still used by people, and older soldered Xeons, without issue. Your excuse train never gets old, and you constantly deflect rather than answer.

You come here first to claim that Intel uses TIM only to save pennies. You come here to say that AMD technology is superior and has no issues. You are given an article with some info about the advantages of TIM over solder. You read only part of it and, moreover, you expect others to prove their claims when you are not proving anything on your side?

You are the one who comes here (an Intel thread) to make strong anti-Intel and pro-AMD claims. The burden of proof is on you.
 
All I saw is that it costs about $5 or less to use indium, since it's a rare metal, which goes in line with the "Intel does it to save a few bucks" idea.

And when cycled from -55 to 125 degrees Celsius, the bonding gets microfractures. That is not only quite an unusual temperature range, but other materials can get microfractures as well when subjected to temperatures well outside what they are meant to be used in.

And the warped IHS was an issue for a long time, but the worst offender I had, an old Q6600, got a temp decrease of 5 degrees. A delid nets you at least 10, sometimes 15, as I have seen. There are no advantages of TIM over solder except price and ease of use.

There are some Kaby Lakes that don't have a planar IHS; Skylakes didn't either.

And about the microfractures: something tells me those CPUs aren't rated to be used in temps ranging from -55 to 125.
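The thermal-cycling point can be made concrete with a rough CTE-mismatch estimate: the strain a die-to-solder joint sees per cycle scales with the difference in expansion coefficients times the temperature swing. The coefficients below are textbook ballpark values, and the "normal use" swing is my own assumption:

```python
# Rough CTE-mismatch strain estimate between a silicon die and indium solder.
# Strain per cycle ~ |alpha_In - alpha_Si| * delta_T; bigger swings mean more
# plastic strain per cycle and faster fatigue wear-out.

ALPHA_SI = 2.6e-6   # 1/K, silicon (ballpark)
ALPHA_IN = 32e-6    # 1/K, indium (ballpark)

def mismatch_strain(t_min_c, t_max_c):
    return abs(ALPHA_IN - ALPHA_SI) * (t_max_c - t_min_c)

stress_test = mismatch_strain(-55, 125)   # the article's cycling range
normal_use  = mismatch_strain(25, 85)     # assumed desktop idle-to-load swing

print(f"-55..125 C cycle strain: {stress_test:.2e}")
print(f"  25..85 C cycle strain: {normal_use:.2e}")
print(f"ratio: {stress_test / normal_use:.1f}x")
```

The -55 to 125 qualification range stresses the joint three times harder per cycle than a plausible desktop swing, which is the crux of the "that's not normal use" objection.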
 
Dying CPUs with solder from sudden thermal stress isn't uncommon. Plenty of forum threads have been created about the issue, no less on servers. It's simply a reliability issue; the consumer parts were just the testing grounds to gather the statistics before moving to Xeons. And now Xeons and everything else are fully TIM. I don't think any company but AMD still uses solder at this point.

HPC processors of various kinds, FPGAs, etc. all use TIM.
 
And sooner or later AMD will abandon solder. It is only that AMD comes years later than other companies to the better technology: recall how much later AMD embraced SMT, a brainiac uarch, FinFETs, DDR4, PCIe 3...
 
I don't think any company but AMD still uses solder at this point.

Which other CPU manufacturers use an IHS, besides Intel (all your examples come from an Intel fab) and AMD?

Is the POWER9 soldered? Frankly, I don't know.

I'm not bashing Intel, but saving $5 on every CPU they make, i3/i5/i7/i9, Xeons and so on (those must number in the millions each year) to improve a profit margin sounds exactly like what a publicly traded company would do.

If you run the CPUs within spec, everything is fine after all.

If anything, desktop users were the canaries to test whether a simple TIM works fine before Intel switched the professional CPUs from solder to TIM.
 
And sooner or later AMD will abandon solder. It is only that AMD comes years later than other companies to the better technology: recall how much later AMD embraced SMT, a brainiac uarch, FinFETs, DDR4, PCIe 3...

Or dual cores or 64-bit consumer CPUs... oh wait... that doesn't fit your anti-AMD narrative. We're just lucky Intel still isn't on NetBurst or you'd be telling us how great it is.

Dying CPUs with solder from sudden thermal stress isn't uncommon. Plenty of forum threads have been created about the issue, no less on servers. It's simply a reliability issue; the consumer parts were just the testing grounds to gather the statistics before moving to Xeons. And now Xeons and everything else are fully TIM. I don't think any company but AMD still uses solder at this point.

HPC processors of various kinds, FPGAs, etc. all use TIM.

I don't have a problem with TIM if they'd actually apply it in a way that the average consumer didn't have to delid it to fix it.
 
To be honest, I think decreasing the z-height and reducing the gap between the IHS and die would already do a lot.

The two surfaces (die, IHS) don't have actual contact, so a medium in between has to bridge the gap, and solder (indium solder is around 83 W/mK, I think) and LM are just so much better at heat transfer than any other paste can be.

But then, being stricter on IHS and glue tolerances would increase cost again, and I don't think that's going to happen.
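The z-height point can be sketched with the same 1-D estimate: for a fixed material, the temperature drop across the joint scales linearly with the gap it has to bridge. The gap thicknesses and the 150 W load here are assumptions for illustration, using the 151 mm² 8700K die and the ~83 W/mK indium figure from the post:

```python
# Delta_T across the die-to-IHS joint: delta_T = P * gap / (k * A).
# Gap thicknesses and load are assumed ballpark figures.

P = 150.0    # W, assumed overclocked CPU load
A = 151e-6   # m^2, 8700K die (151 mm^2)

def joint_dt(gap_m, k_w_mk):
    """Temperature drop across the gap-filling material, in Kelvin."""
    return P * gap_m / (k_w_mk * A)

for gap_um in (25, 100):
    gap = gap_um * 1e-6
    paste  = joint_dt(gap, 12.5)   # Kryonaut-class paste
    solder = joint_dt(gap, 83.0)   # indium solder, ~83 W/mK per the post
    print(f"{gap_um:>3} um gap: paste {paste:.1f} K, solder {solder:.1f} K")
```

Quadrupling the gap quadruples the drop for either material, which is why shrinking the gap helps even before you argue about which TIM to use; the high-conductivity materials just make the remaining gap nearly free.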
 
To the OP's initial question, I just upgraded to an 8700K from an i7 920 system I built at the end of 2009. I probably won't build another system for another 5 years (or longer), which may put me in the minority around here. However, I couldn't care less what comes out in 6 months. This was my time to upgrade, I'm enjoying it, and I'll be happy regardless of what comes next.
 
Which other CPU manufacturers use an IHS, besides Intel (all your examples come from an Intel fab) and AMD?

Is the POWER9 soldered? Frankly, I don't know.

I'm not bashing Intel, but saving $5 on every CPU they make, i3/i5/i7/i9, Xeons and so on (those must number in the millions each year) to improve a profit margin sounds exactly like what a publicly traded company would do.

If you run the CPUs within spec, everything is fine after all.

If anything, desktop users were the canaries to test whether a simple TIM works fine before Intel switched the professional CPUs from solder to TIM.

It includes many more companies and product types than just CPUs, Intel, and AMD. Name me a single other company still using solder besides AMD for high-performance parts.

And $5? I think we're talking sub-$1, not counting the R&D investment into TIM, requalification, etc.
 
I don't have a problem with TIM if they'd actually apply it in a way that the average consumer didn't have to delid it to fix it.

The problem is if you reduce the gap to nothing you get other issues like potentially crushed dies. Remember it needs to be for the 99%.
 
Or dual cores or 64-bit consumer CPUs...oh wait...that doesn't fit your anti-AMD narrative.

I was mentioning modern technologies that were embraced by the rest of the companies (Apple, IBM, Intel, Sun/SPARC...) much earlier than AMD did. But if you want to discuss the 'prehistory', then let me remind you that AMD's "64-bit consumer CPUs" came several years later than 64-bit consumer CPUs from other companies. Check the history of 64-bit computing.
 

x86-64 is licensed from AMD by Intel, and for the rest of us that is all that matters when it comes to 64-bit computing. Also, Intel's/HP's 64-bit solution was not for consumers; it was strictly for business, and they just hoped to spread it to the mainstream.
 
Back
Top