High core count Comet Lake's gonna run hot.

???

Compared to the 3950X and 3900X, yes, it's a lot.
[power consumption chart]
Courtesy of Legit Reviews

Tom's HW found the two to be much more similar in power consumption. It is true that the 9900K/KS's power explodes while overclocked. The 3900X's climbs significantly as well, just not as much.

But it's not like we're looking at 300W on one team and 150W from the equivalent product from the other team.
 
Yes, you are correct that temperature and power are different metrics. That's why I mentioned each of them separately and noted that the two chip makers' products are roughly equal on both metrics. Fun fact: nobody cares about the labeling of the process outside of fanbois from one camp or the other. 300W on 14nm is still less power than 400W on 7nm.

You mentioned them separately, but conveniently only mentioned the chip that (*shock*) is significantly larger. Fun fact: nobody cares about process now that Intel is up to two years from catching up to AMD according to their roadmaps.

I saw numerous reviews stating 400-450W on stock 2970WXes. Even still, if we go with your 300W number (which is higher than AMD's stated rating, btw), we're in exactly the same place as I said before: similar power consumption for similar performance. I'm happy you agree with me there.

Links, please? I did a quick search and the highest number I saw was 310 W. With regards to the power consumption being higher than the stated TDP... uhhh, so? TDP is not the same thing as power consumption for either camp and hasn't been for a very long time. If you want to try to use that as a point against AMD, though, how about the 165W TDP 9980XE drawing nearly 200W stock in reviews?
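The TDP-vs-draw point can be made concrete with a toy sketch of Intel's PL1/PL2/tau boost limits, which is one reason a "165W TDP" chip shows up near 200W (or 300W with the limits raised) in reviews. All wattages and the time constant below are illustrative assumptions, not measurements of any specific chip:

```python
# Toy sketch of why rated "TDP" and observed power draw differ, using
# Intel's PL1/PL2/tau boost limits as the example. All numbers here
# are illustrative assumptions, not measurements of any specific chip.

def package_power(t_seconds, pl1=165.0, pl2=250.0, tau=28.0):
    """Crude model: the chip may draw up to PL2 for roughly tau seconds
    of sustained load, then falls back to the PL1 (TDP-ish) limit."""
    return pl2 if t_seconds < tau else pl1

# Sample a 60-second all-core load every 5 seconds.
samples = [package_power(t) for t in range(0, 60, 5)]

peak = max(samples)       # what a short benchmark run sees: 250.0 W
sustained = samples[-1]   # what a long render settles at: 165.0 W
print(peak, sustained)
```

Many boards ship with tau effectively unlimited, in which case the "peak" number is what you see all the time, stock.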

Also, awwww, you're using the "thanks for agreeing with me" bit because you don't have an actual argument. It's cute.

The reasons I didn't mention the TR 3000 series are:
1) Nothing in Intel's desktop lineup can match that performance level, so it becomes an impossible comparison. Intel is definitely playing catch up here. Or rather, they need to start - they're so far behind right now, it isn't clear that they're even playing.
2) The "hurrdurr intel so hot" mindset predates the 3000 series by several years. It's been a false feel-good the entire time.

Ah, so we can only make comparisons on your terms. Got it. How does this mesh with your Ryzen 4000 dodge below?

Right. This is exactly what I'm mocking. The chips aren't particularly hot and don't consume a noteworthy amount of power when you compare them to the equivalent AMD unit.

They do though. The 9900K consumes >80% more power than the 3700X, a chip that is single digits away in performance at stock, and is quite difficult to cool if you want to do something as simple as run all cores at the maximum boost frequency. This is not to say AMD's parts aren't toasty (the Ryzen 3000 series certainly is), but to claim that Intel's parts aren't is dishonest.

I'm not really interested in comparing hypothetical numbers for a product that will, someday, exist in the market against something I can walk into a store and buy. Until the chip hits the wild, any comparison is BS.

It's in the hands of reviewers and has product listings all over the place. Outside factors disrupting the supply chain don't make it any less real. Also, if we're disregarding chips you can't buy, why even bother pushing back against anecdotes about a product that won't even be announced until the end of this month and won't be launched until the end of May?
 
You mentioned them separately, but conveniently only mentioned the chip that (*shock*) is significantly larger. Fun fact: nobody cares about process now that Intel is up to two years from catching up to AMD according to their roadmaps.

Why does chip size matter? Are you trying to shift the goalposts to now be about "watts per mm2?" Great, another thing that nobody cares about.

Nobody cared about process sizing before AMD caught up either. There was a time a long time ago when process sizing served as a useful metric for broadly hypothesizing about the performance of upcoming platforms which didn't exist yet. It's probably been a decade since then. But, in the context of this thread, it is still entirely irrelevant because we have real numbers from real parts that we can use instead of having to try to make guesses about chips that don't exist yet.

Links, please? I did a quick search and the highest number I saw was 310 W. With regards to the power consumption being higher than the stated TDP... uhhh, so? TDP is not the same thing as power consumption for either camp and hasn't been for a very long time. If you want to try to use that as a point against AMD, though, how about the 165W TDP 9980XE drawing nearly 200W stock in reviews?

"Ryzen Threadripper 2970WX can hit 450W during everyday operation."

If you're reading reviews that show the 9980XE at only 200W, you're reading some pretty poor reviews. Mine, at completely stock settings, is currently reporting 307W. Maybe this is the source of the discrepancy.

**mid-post update** It has now come down to reporting 301W.

Also, awwww, you're using the "thanks for agreeing with me" bit because you don't have an actual argument. It's cute.
Technically, that bit means that we're in agreement and you're simply declaring you're disagreeing because feeling like you're the rightest helps you sleep better at night. This is the part of the script where you tell me I'm wrong while restating what I said.


This is not to say AMD's parts aren't toasty (the Ryzen 3000 series certainly is), but to claim that Intel's parts aren't is dishonest.

To claim that I claimed Intel's parts don't consume power at high (compared to room temp) temps is dishonest. In fact, it's you inventing something to feel "right" about. Another victory! Congrats!


It's in the hands of reviewers and has product listings all over the place. Outside factors disrupting the supply chain don't make it any less real. Also, if we're disregarding chips you can't buy, why even bother pushing back against anecdotes about a product that won't even be announced until the end of this month and won't be launched until the end of May?

The 4000 is so real that there's a brand new thread on [H] today talking about rumors of a fall release date.
 
(Wondering how much money Intel is spending on "hacking" the AMD CPUs looking for "killer" problems and weaknesses. I bet it's more than people think.)

They don’t have to, they hired Jim Keller, the guy who designed Ryzen.
 
That's a liability waiting to happen. I'd be pretty sure they won't even include a cooler, because fuck you, we are Intel and you are a sucker for buying our obsolete, overheating, over-hyped, insecure crap for the sake of 3 fps in some edge-case twitch FPS and muh 5 gigahurtz.

I don't want another fucking Intel heatsink to throw out. FFS. For all the non-Ks I bought a few years ago, I still spent the $20 or so on the Hyper series. Actually, I use the stock ones to cool 50W LEDs, because they're bad for the environment, can't be recycled here municipally, and aren't worth the $1 you'd get at a recycling center. You can use them for 100W LEDs if you want an entire row of COBs to blow, because they can't handle 100W even for LEDs with Tj maxes over 105C.
 
Tom's HW found the two to be much more similar in power consumption. It is true that the 9900K/KS's power explodes while overclocked. The 3900X's climbs significantly as well, just not as much.

But it's not like we're looking at 300W on one team and 150W from the equivalent product from the other team.

The 4000 is so real that there's a brand new thread on [H] today talking about rumors of a fall release date.

4000-series laptops are available in retail stores. There are a bunch of reviews where they are faster and consume less power, sometimes significantly so. Feel free to look them up. The thread you're referring to is about the desktop 4000 series (Zen 3).

Also, numerous sites report somewhere around 230W for the 3950X and 280W for the 9900KF; the 3700X is at around 150W. Do you honestly still fail to get the jokes about Intel consuming a lot of power? They just do. Comet Lake will simply increase the gap; there's no way around it.
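For what it's worth, those figures also line up with the ">80% more power" claim made earlier in the thread. A one-liner to show the arithmetic (the wattages are the rough review numbers quoted above, not official specs):

```python
# Quick arithmetic behind the ">80% more power" comparison, using the
# approximate review figures cited in this thread (not official specs).
power_9900kf = 280.0  # W, all-core load (as quoted above)
power_3700x = 150.0   # W

extra_pct = (power_9900kf - power_3700x) / power_3700x * 100
print(round(extra_pct))  # 87, i.e. ">80% more power"
```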
 
They don’t have to, they hired Jim Keller, the guy who designed Ryzen.


The man is a damned god of chip design.

People debate how much credit he should get for the Athlon, but he was at AMD while that was happening (lead architect for the original Athlon 64). He went to Apple, where he put their CPU division way ahead (I'm a Pixel and Samsung user but still recognize that the one thing I think Apple does better is the CPU). Then he went to Tesla for their car chips, which are solid; maybe his only project that wasn't a home run, but still impressive. It says something about you when the Tesla computer is arguably your least successful project.

Then Ryzen was developed under his leadership at AMD. Now he is at Intel. If anyone can turn around Intel's CPU designs, it's him. It might take a couple of years like it took him at AMD, but yeah. Whenever a company is struggling with CPU design, they hire him.

Edit: on topic, when Keller was hired at AMD, the last round of Bulldozer came out while he was busy making Ryzen. I'm hoping that, similarly, Comet Lake will be the last round of chips from Intel with basically the same architecture they've used over the last 5 years or so.
 
C'mon, they haven't included coolers with the K series since around Haswell. And their reasoning was sound: if you were buying that CPU, you were likely OCing, and if you weren't, well, you probably didn't buy the right CPU.

I bought a lower-end Coffee Lake not too long ago and guess what, it came with the HSF!
Of course. But a majority of K series chips in the field aren't OC'd anyway.. they are just bought because 'intel da bess'. In that situation a stock cooler (if actually designed properly) would be fine. Interesting to hear they included one with your CL though...
The AMD included coolers are fine for getting a rig running and, to be honest, are pretty adequate for your average user. I used one for 4-5 months (the Wraith Spire, whatever comes with the 2600X) and it was inaudible and didn't run too hot in a damn mITX case. That would cost $5-10 max to include, more for the TR models. If Intel wants to compete, they should probably look at doing something similar.
The only chip AMD doesn't bundle a cooler with, from memory, is the 3950X, for the same reasoning as Intel. At the prices they are charging, it's a pretty poor showing that Intel doesn't include something reasonable with the rest of their non-halo products below those points.
Remember the recalls on the Xeon coolers late last year? They include them there and used that 2000s extruded alu shitfest that barely handles 50W.

I had a 4,000-watt Peltier-cooled Titan SLI / 5960X rig at one point.
Holy shit was it water or air cooled pelt? What temps did you get?
I've been looking at ways to do that with a new heatspreader material for Zen 2 or 3... I may have a very interesting solution that can make one hell of a thermal gradient with over 10x the conductivity of copper and no heatpipes...
 
The man is a damned god of chip design.

People debate how much credit he should get for the Athlon, but he was at AMD while that was happening (lead architect for the original Athlon 64). He went to Apple, where he put their CPU division way ahead (I'm a Pixel and Samsung user but still recognize that the one thing I think Apple does better is the CPU). Then he went to Tesla for their car chips, which are solid; maybe his only project that wasn't a home run, but still impressive. It says something about you when the Tesla computer is arguably your least successful project.

Then Ryzen was developed under his leadership at AMD. Now he is at Intel. If anyone can turn around Intel's CPU designs, it's him. It might take a couple of years like it took him at AMD, but yeah. Whenever a company is struggling with CPU design, they hire him.

Edit: on topic, when Keller was hired at AMD, the last round of Bulldozer came out while he was busy making Ryzen. I'm hoping that, similarly, Comet Lake will be the last round of chips from Intel with basically the same architecture they've used over the last 5 years or so.

IIRC Tiger Lake will be the first chip to have "Keller DNA". Don't expect a killer chip right off the line. It will be gains over several releases due to how Intel's development cycles work. He's not there to invent an architecture. He's already said numerous times that he's a problem solver and is very focused on that, as it keeps him busy/not bored.
 
Of course. But a majority of K series chips in the field aren't OC'd anyway.. they are just bought because 'intel da bess'. In that situation a stock cooler (if actually designed properly) would be fine. Interesting to hear they included one with your CL though...
The AMD included coolers are fine for getting a rig running and, to be honest, are pretty adequate for your average user. I used one for 4-5 months (the Wraith Spire, whatever comes with the 2600X) and it was inaudible and didn't run too hot in a damn mITX case. That would cost $5-10 max to include, more for the TR models. If Intel wants to compete, they should probably look at doing something similar.
The only chip AMD doesn't bundle a cooler with, from memory, is the 3950X, for the same reasoning as Intel. At the prices they are charging, it's a pretty poor showing that Intel doesn't include something reasonable with the rest of their non-halo products below those points.
Remember the recalls on the Xeon coolers late last year? They include them there and used that 2000s extruded alu shitfest that barely handles 50W.


Holy shit was it water or air cooled pelt? What temps did you get?
I've been looking at ways to do that with a new heatspreader material for Zen 2 or 3... I may have a very interesting solution that can make one hell of a thermal gradient with over 10x the conductivity of copper and no heatpipes...

I used a fluid. Around -10C. I was afraid to go much lower with my job on the fritz at the time. I bought a large cooler chest (the kind you'd use for picnics) and ran water lines, power, and USB/monitor extension wires through holes in the side of it that I caulked. I used the Peltiers on three 5"x12" copper and aluminum flat plates, also inside that chest. Also 5 lbs of desiccant and a humidity monitor. Basically it was in my basement with wires run upstairs. Also dumped the heat into my pool. Wife was like, "why is the pool 80F in April?"

I was the fastest 24/7 2 card rig on Firestrike at the time lol. Was it functionally worth it? No. But it was fun.
 
Why does chip size matter? Are you trying to shift the goalposts to now be about "watts per mm2?" Great, another thing that nobody cares about.

Nobody cared about process sizing before AMD caught up either. There was a time a long time ago when process sizing served as a useful metric for broadly hypothesizing about the performance of upcoming platforms which didn't exist yet. It's probably been a decade since then. But, in the context of this thread, it is still entirely irrelevant because we have real numbers from real parts that we can use instead of having to try to make guesses about chips that don't exist yet.

Die size matters because, as I already stated, consuming the same power with a larger die will lead to lower temperatures. Try to keep up.


I guess you just blanked that part where I said "stock" out of your memory? Either you're arguing in bad faith (likely), or you didn't even bother to check the power consumption page of that very review where it shows the 2970WX consuming a maximum of 210 W under any of their test scenarios unless you overclock it.

If you're reading reviews that show the 9980XE at only 200W, you're reading some pretty poor reviews. Mine, at completely stock settings, is currently reporting 307W. Maybe this is the source of the discrepancy.

**mid-post update** It has now come down to reporting 301W.

This really doesn't help your case.

Technically, that bit means that we're in agreement and you're simply declaring you're disagreeing because feeling like you're the rightest helps you sleep better at night. This is the part of the script where you tell me I'm wrong while restating what I said.

What does this even mean? The fact that you can't see the difference in our arguments makes it seem like you just really don't understand the core concepts at play.

To claim that I claimed Intel's parts don't consume power at high (compared to room temp) temps is dishonest. In fact, it's you inventing something to feel "right" about. Another victory! Congrats!

I never said anything like that, so ok?

The 4000 is so real that there's a brand new thread on [H] today talking about rumors of a fall release date.

That's the desktop part, you were complaining about the laptop parts. AMD's bad naming conventions aside, I can't honestly believe you already forgot about that, so this must again be a point made in bad faith. I'm done trying to reason with a troll, so have fun!
 
what a great "internet knowledge pissing contest" we've got going here...
IF this chip is real, it won't come with a cooler; gonna need that chiller from the last ones.
 
what a great "internet knowledge pissing contest" we've got going here...
IF this chip is real, it won't come with a cooler; gonna need that chiller from the last ones.

FWIW the 9700F came with a cooler, as did the 9900 (non-K).
 
Not sure why anyone is surprised that 14nm+++++ procs are going to run hot. Also, not really sure why we would compare 7nm procs temps since they obviously use less power. Power is heat. Intel is in a bad way currently and is doing the best they can. Sucks to be them. This too shall pass.
 
Not sure why anyone is surprised that 14nm+++++ procs are going to run hot.

No one is "surprised".

 
Also, not really sure why we would compare 7nm procs temps since they obviously use less power. Power is heat.

Power is heat, but heat is not temperature. Less heat in a smaller area (really volume, but we don't really care about the height of the chips yet) can still result in higher temperatures. Add in the fact that AMD has changed the way temperatures are reported, and you end up with chips that at least appear to run hotter even though they consume a fraction of the power. My 3800X, for example, is cooled by an Optimus Foundation water block in a loop with three 360 mm radiators and the hottest temp reported by the chip while writing this post is a good 25C over the coolant temp (so sitting at 55C in this 28C room). The "real" (average) temp is actually only 38C, but that's still warm considering the cooling being used.
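A minimal sketch of the heat-vs-temperature point, assuming a simple flux-times-resistance model. The die areas, wattages, and the resistance constant are all made-up illustrative numbers, not specs for any real chip:

```python
# Toy model of "less heat in a smaller area can still mean a higher
# temperature." Die areas, wattages, and the thermal-resistance
# constant below are rough assumptions for illustration only.

def hotspot_rise_c(power_w, die_area_mm2, r_area=25.0):
    """Temperature rise over coolant: heat flux (W/mm^2) times an
    assumed area-specific resistance (C*mm^2/W) for the die-to-cooler
    stack. Real spreading resistance is messier, but the trend holds."""
    flux = power_w / die_area_mm2
    return flux * r_area

# A big monolithic die drawing more total power...
big = hotspot_rise_c(power_w=150.0, die_area_mm2=175.0)   # ~21 C rise
# ...versus a small chiplet drawing far less total power:
small = hotspot_rise_c(power_w=90.0, die_area_mm2=75.0)   # ~30 C rise
print(big, small)
```

The small die dissipates 40% less power yet reports the higher temperature, because its heat flux (W per mm²) is roughly 40% higher.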
 
My 3800X, for example, is cooled by an Optimus Foundation water block in a loop with three 360 mm radiators and the hottest temp reported by the chip while writing this post is a good 25C over the coolant temp (so sitting at 55C in this 28C room). The "real" (average) temp is actually only 38C, but that's still warm considering the cooling being used.

Sigh. My 3600X gets a lot hotter than that (79C on Cinebench R20), on a Corsair H150i. I've taken it off and replaced it a couple of times now and I don't know what's wrong.
 
Sigh. My 3600X gets a lot hotter than that (79C on Cinebench R20), on a Corsair H150i. I've taken it off and replaced it a couple of times now and I don't know what's wrong.

My 3600 runs at 64C boosting between 4.1 and 4.2 GHz with 560 mm of radiator.

Tinyyyyy offset die = hot.

Der8auer released a contraption to help center the cooler over the die; supposedly it helps temps a bit.
 