High core count Comet Lake's gonna run hot.

1_rick
Maybe they should rename it Lava Lake.

A leaked slide (so take it with the usual grain of salt) shows the i9-10900F (not the K!) with a PL1 of 170W and a PL2 of 224W, using as much power as an RTX 2080. Note the PL2 figure was at an all-core 4.5GHz. The good news is you won't need a heater come winter. Also, it was running at 93C in AIDA64.

https://www.pcgamer.com/intel-comet-lake-power-draw-turbo/

Can we embed twitter? (Edit: either no, or I did it wrong. Anyway, click through for screenshots.)

https://twitter.com/9550pro/status/1247542868668862465
 
Reminds me of when the 480 came out:
[attached image]
 
This is no surprise. All of Intel's 8c/16t-and-up CPUs are insanely hot when their clock speeds go up. This one shouldn't be that bad compared to the 12+ core HEDT chips when they are overclocked. The 10980XE is the worst offender; at 4.7GHz I've seen them hit close to 110C under full load.
 
You people damn well know they're still going to stick that flimsy little aluminum heatsink in there. They won't even spring for the copper-slug version.
That's a liability waiting to happen. I'd be pretty sure they won't even include a cooler, because fuck you, we are Intel, and you are a sucker for buying our obsolete, overheating, over-hyped, insecure crap for the sake of 3 fps in some edge-case twitch FPS and muh 5 gigahurtz.
 

[Attachment: greta does not approve of intel.jpg]
Maybe this is the reason they included some slower base-clocked chips in the bunch. I was interested, but heat destroys PC parts. Nobody wants a hot chip, and this is where AMD's design really paid off. Still rocking an 8700K, but interested in next-gen AMD.
 
That's a liability waiting to happen. I'd be pretty sure they won't even include a cooler, because fuck you, we are Intel, and you are a sucker for buying our obsolete, overheating, over-hyped, insecure crap for the sake of 3 fps in some edge-case twitch FPS and muh 5 gigahurtz.

C'mon, they haven't included coolers with the K series since around Haswell. And their reasoning was sound: if you're buying that CPU you were likely overclocking, and if you weren't, well, you probably didn't buy the right CPU.

I bought a lower-end Coffee Lake not too long ago and guess what, it came with the HSF!
 
This is no surprise. All of Intel's 8c/16t-and-up CPUs are insanely hot when their clock speeds go up. This one shouldn't be that bad compared to the 12+ core HEDT chips when they are overclocked. The 10980XE is the worst offender; at 4.7GHz I've seen them hit close to 110C under full load.

Sure, but IIRC the unlocked parts were rumored to have a 125W TDP, and also IIRC TDP == PL1, which is spec'd at the base clock, not the all-core boost. Bumping PL1 by 45W seems like an open admission of failure.
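For anyone fuzzy on how those limits interact: as I understand Intel's published turbo behavior, the package can draw up to PL2 in bursts while a rolling average of power has to stay at or below PL1; once the average catches up, you're held to PL1, which is why PL1 (not PL2) is what "TDP" is supposed to reflect. A rough Python sketch of that budgeting loop, using the leaked 170W/224W figures and an assumed Tau (boards set their own):

```python
# Rough model of the PL1/PL2 turbo budget (my reading of the public spec,
# not Intel's actual firmware). PL1/PL2 are from the leaked slide; TAU is
# an assumption, since motherboard vendors set their own window.
PL1 = 170.0      # W, sustained limit (the number "TDP" is meant to track)
PL2 = 224.0      # W, short-term limit during turbo
TAU = 28.0       # s, averaging window (illustrative guess)
DT  = 1.0        # s, simulation step

def step(avg_power, requested_power):
    """One tick: grant up to PL2 while the rolling average is under PL1,
    otherwise clamp to PL1, then update the exponential moving average."""
    granted = min(requested_power, PL2 if avg_power < PL1 else PL1)
    alpha = DT / TAU
    return (1 - alpha) * avg_power + alpha * granted, granted

avg = 0.0
for t in range(121):
    avg, drawn = step(avg, requested_power=224.0)  # all-core load asking for PL2
    if t % 15 == 0:
        print(f"t={t:3d}s  drawing {drawn:5.1f} W  (rolling avg {avg:5.1f} W)")
```

Run it and you get a stretch at 224W until the average catches up, then it settles at 170W sustained; it's PL1, not the flashy PL2 number, that decides what the chip does under a long all-core load.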
 
Not surprising. Monstrous TDPs and toasty chips always happen when one company is way behind the other in architecture and manufacturing process node. It happened with the Bulldozer/Excavator chips (FX-9590 at 220W) when Intel was ahead. It happened with the famously toasty Pentiums when AMD was ahead.
 
Could someone tell me when it's likely I'll be able to upgrade my 2600k to Intel's next (non-Skylake-based) architecture? Because that's what I'm waiting for.
 
Could someone tell me when it's likely I'll be able to upgrade my 2600k to Intel's next (non-Skylake-based) architecture? Because that's what I'm waiting for.
Well excuuuuuuuuuuuuuuuuuuuse the pandemic for messing up your upgrade life! Put your patience hat on for 2021 or go Ryzen. ;)
 
Is this actually a lot? I find that my 9980XE pulls about 300W at 100% load (mostly F@H WUs this week) and sits happily at 80C for unlimited duration. That's 30C below the throttling temp, and this is all on stock clocks with the CPU showing 4.1GHz all-core during the computes.

Compare this against, say, a 2970WX, which is rated by AMD at 250W but reportedly hits 400W+ routinely in real use. The two are basically the same.

If we look at temps, we see that Intel and AMD report temps from different locations. When we go apples-to-apples, we see one CPU happy at 80C and the other one happy at 75C. The two are basically the same.

So, I'm curious where this "omg Intel chipz r hottt n use 2many powers!!2" attitude comes from. It seems like an apples-to-apples compute comparison shows roughly equal power draw at roughly equal temps.

Don't forget that AMD's 0.01% laptop marketshare is because they can't make CPUs that are as cool-running and power-efficient as Intel's.
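Side note for anyone who'd rather measure than argue: on Linux, if the intel_rapl powercap driver is loaded, the package energy counter lives under /sys/class/powercap and a few lines of Python turn it into watts. Sketch only; the domain path varies by machine, newer kernels may want root to read it, and RAPL reports package power, not wall power:

```python
# Quick package-power readout from the Linux powercap/RAPL interface.
# Assumes intel-rapl:0 is the package-0 domain on this machine; adjust
# the path (or iterate the domains) for yours.
import time

DOMAIN = "/sys/class/powercap/intel-rapl:0"

def read_uj(path):
    with open(path) as f:
        return int(f.read())

def package_watts(interval=1.0):
    max_uj = read_uj(f"{DOMAIN}/max_energy_range_uj")
    e0 = read_uj(f"{DOMAIN}/energy_uj")
    time.sleep(interval)
    e1 = read_uj(f"{DOMAIN}/energy_uj")
    delta = (e1 - e0) % max_uj        # counter wraps at max_energy_range_uj
    return delta / 1e6 / interval     # microjoules -> joules -> watts

for _ in range(10):
    print(f"package: {package_watts():6.1f} W")
```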
 
Could someone tell me when it's likely I'll be able to upgrade my 2600k to Intel's next (non-Skylake-based) architecture? Because that's what I'm waiting for.

Probably 2021, Rocket Lake, if I'm remembering correctly?
 
Is this actually a lot?

If you're an average desktop user used to a nominal 65W TDP (like my work PC, a Dell 8700, which absolutely will not pull more than that), then yeah. Especially if you live in a place where electricity's expensive, or it's hot in summer and you don't have AC (which I've seen more than one person here say applies to them).
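To put a rough number on the electricity angle (the hours at load and the rate below are placeholders, swap in your own), the gap between the leaked 224W PL2 and a nominal 65W desktop works out to something like:

```python
# Back-of-the-envelope running-cost delta. Purely illustrative inputs:
# hours_per_day and rate are assumptions, not measurements.
delta_watts = 224 - 65      # leaked PL2 vs a nominal 65W desktop part
hours_per_day = 4           # assumed time spent at heavy load
rate = 0.25                 # assumed electricity price in $/kWh

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year extra -> ${kwh_per_year * rate:.0f}/year")
```

Not ruinous, but it's real money, and every one of those watts ends up as heat in the room, which is the part that hurts without AC.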
 
Is this actually a lot? I find that my 9980XE pulls about 300W at 100% load (mostly F@H WUs this week) and sits happily at 80C for unlimited duration. That's 30C below the throttling temp, and this is all on stock clocks with the CPU showing 4.1GHz all-core during the computes.

Compare this against, say, a 2970WX, which is rated by AMD at 250W but reportedly hits 400W+ routinely in real use. The two are basically the same.

If we look at temps, we see that Intel and AMD report temps from different locations. When we go apples-to-apples, we see one CPU happy at 80C and the other one happy at 75C. The two are basically the same.

So, I'm curious where this "omg Intel chipz r hottt n use 2many powers!!2" attitude comes from. It seems like an apples-to-apples compute comparison shows roughly equal power draw at roughly equal temps.

Don't forget that AMD's 0.01% laptop marketshare is because they can't make CPUs that are as cool-running and power-efficient as Intel's.

Well, considering you're asking if it's a lot and need to use HEDT processors to justify the question....
 
If you're an average desktop user used to a nominal 65W TDP (like my work PC, a Dell 8700, which absolutely will not pull more than that), then yeah. Especially if you live in a place where electricity's expensive, or it's hot in summer and you don't have AC (which I've seen more than one person here say applies to them).

But that's not apples-to-apples. That's comparing an entry level Intel CPU to a top of the line [also Intel] CPU. Of course the one that is 4 times as powerful computationally is going to pull appreciably more power than the baseline. This top end model is going to pull the same 65W (or less) during times when it is doing the same compute as that 65W baseline CPU.
 
Well excuuuuuuuuuuuuuuuuuuuse the pandemic for messing up your upgrade life! Put your patience hat on for 2021 or go Ryzen. ;)

I've already had my patience hat on for maybe 4 years. Due to such minimal progress being made between CPU releases, I haven't felt a need to upgrade. And upgrading at the tail-end of this architecture would likely result in me regretting that decision for the following years after Intel's next generation releases.

I'm also waiting for Intel in the hope that it will handle Windows 7 out of the box, and some other small things, better and with less tweaking than AMD's CPUs. That might not be the case, but I hope it is.
 
Well, considering you're asking if it's a lot and need to use HEDT processors to justify the question....

Also, for comparison, the screenshot in the Twitter link I provided showed a package temp of 93C, individual core temps around 85-95C, and one core only 8 degrees from max temp. It looks like the AIDA64 screenshot showed thermal throttling, but at a low number, like 2%. And again, that's at an all-core 4.58GHz, meaning if you could get a 10-core overclock over 5GHz, you could maybe pull 300W+, with all the additional heat that comes with it. That doesn't look great compared to the 9980XE results Thunderdolt mentions.
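A rough sanity check on that 300W+ guess, using the usual dynamic-power scaling P ∝ V²·f. The voltages below are pure assumptions for illustration (nobody outside Intel knows what a 10-core needs for 5.1GHz all-core), and leakage only makes it worse as temps climb:

```python
# Scale the leaked 224W all-core figure to a hypothetical overclock using
# P ~ V^2 * f. Both voltages are assumed values, picked only to illustrate
# how quickly the power balloons.
p0, f0, v0 = 224.0, 4.58, 1.20   # leaked PL2, all-core clock, assumed voltage
f1, v1 = 5.1, 1.35               # hypothetical all-core OC and assumed voltage

p1 = p0 * (f1 / f0) * (v1 / v0) ** 2
print(f"estimated package power at {f1} GHz all-core: {p1:.0f} W")
```

That lands a bit over 300W, so the guess isn't crazy, and it's all going through one modest-sized die.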
 
Well, considering you're asking if it's a lot and need to use HEDT processors to justify the question....

I'm asking because it seems like apples-to-apples performance actually comes with apples-to-apples temps and power consumption.

Why do you think comparing a low-end CPU's power consumption against a high-end CPU's power consumption makes sense?
 
But that's not apples-to-apples. That's comparing an entry level Intel CPU to a top of the line [also Intel] CPU. Of course the one that is 4 times as powerful computationally is going to pull appreciably more power than the baseline. This top end model is going to pull the same 65W (or less) during times when it is doing the same compute as that 65W baseline CPU.

True. And in absolute terms, yeah, it's a lot of power and heat. As I said in my other comment, the 10900F (not K) is hotter and slower than your 9980XE, with about half the cores.

If you expect that, are OK with that, can afford it, etc., sure, no problem. It does seem kinda noteworthy, while not of earth-shattering importance.
 
Is this actually a lot? I find that my 9980XE pulls about 300W at 100% load (mostly F@H WUs this week) and sits happily at 80C for unlimited duration. That's 30C below the throttling temp, and this is all on stock clocks with the CPU showing 4.1GHz all-core during the computes.

Compare this against, say, a 2970WX, which is rated by AMD at 250W but reportedly hits 400W+ routinely in real use. The two are basically the same.

If we look at temps, we see that Intel and AMD report temps from different locations. When we go apples-to-apples, we see one CPU happy at 80C and the other one happy at 75C. The two are basically the same.

So, I'm curious where this "omg Intel chipz r hottt n use 2many powers!!2" attitude comes from. It seems like an apples-to-apples compute comparison shows roughly equal power draw at roughly equal temps.

Don't forget that AMD's 0.01% laptop marketshare is because they can't make CPUs that are as cool-running and power-efficient as Intel's.

The mental gymnastics here are fantastic. Here we go.

1) Heat (or power consumption) != temperature. For the same power consumption, these chips will run hotter than their regular desktop counterparts because the heat is concentrated into a smaller area (rough numbers in the sketch after this list). (Fun fact, one of the ways Intel compensated for the heat generated by the higher clocks of their 14++ parts with respect to the 14+ versions was relaxing the gate pitch and spreading everything out a bit.)
2) The 2970WX consumes nowhere near 400W stock. You may see a bit above 300W on workloads that really stress the CPU like Blender, but that's about it. You will see more when overclocking or enabling PBO, but that ceases to be an apples-to-apples comparison. All of this is unimportant, though, because you may be aware of this little thing called the Threadripper 3000 series that has supplanted the entire WX range of Threadripper 2000 series processors with significantly higher performance and efficiency. Intel doesn't have a part to compete with that series on either front right now.
3) The "omg Intel chipz r hottt n use 2many powers!!2" attitude came from the fact that they're hot and consume a lot of power. Pretty self-explanatory. It's not unheard of for someone to be unable to hit 5.0 GHz all-core on a 9900K because the required voltage (and the resulting temps) is too high for even something like a decent AIO.
4) Re: AMD's laptop chips, have you been intentionally avoiding all of the coverage of the 4000 series parts, or...? That's also not to mention the gobs of money Intel throws at laptop vendors to ensure their chips make it into the best possible designs. It's a legitimate advantage to exercise, but let's not pretend it's because their technology is leaps and bounds ahead of AMD's.
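To put ballpark numbers on the density point in 1) above: the die areas here are guesses for illustration only, not measurements, but even at lower total power the mainstream 10-core part is pushing noticeably more watts through each square millimetre than the big HEDT die people keep comparing it against in this thread:

```python
# Rough power-density comparison. Die areas are assumed ballpark figures;
# power numbers are the leaked PL2 and the ~300W load draw quoted upthread.
chips = {
    "10-core Comet Lake (assumed ~200 mm^2)":    (224, 200),
    "18-core Skylake-X HCC (assumed ~485 mm^2)": (300, 485),
}
for name, (watts, area_mm2) in chips.items():
    print(f"{name}: {watts / area_mm2:.2f} W/mm^2")
```

Same idea as the 14++ gate-pitch relaxation mentioned in point 1: more area per watt is easier to keep cool.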
 
I'm asking because it seems like apples-to-apples performance actually comes with apples-to-apples temps and power consumption.

Why do you think comparing a low-end CPU's power consumption against a high-end CPU's power consumption makes sense?

I just mentioned an 8700 as a data point. But go ahead and compare 10 cores, 90+C at 4.6GHz and 224W to your 9980XE: 3/4 the power, half the cores. Better have a big rad if you want 5GHz on the 10900K.

I made the post because I thought it might be interesting to people. I wasn't saying it was a bad thing.
 
I'm asking because it seems like apples-to-apples performance actually comes with apples-to-apples temps and power consumption.

Why do you think comparing a low-end CPU's power consumption against a high-end CPU's power consumption makes sense?

You want apples-to-apples by comparing a mainstream-socket CPU to HEDT-socket ones? It's more than a highly overclocked (non-extreme OC methods) 9900KS, and that is a notorious power hog. So yes, in the context of what it is, it is a lot.
 
I couldn't care less about TDP. I had a 4,000-watt Peltier-cooled Titan SLI / 5960X rig at one point.

It is hilarious to see the tables turned, though.
 
I couldn't care less about TDP. I had a 4,000-watt Peltier-cooled Titan SLI / 5960X rig at one point.

I started down that road once and ran into problems due to having to run it off two circuits. Still have all the Peltier stuff somewhere in a box in the basement...
 
Well, perhaps they should add a warning label on the box, similar to the signs that were posted at McDonald's after the hot-coffee-burned-customer crap....

"WARNING: all CPUs are served extremely hot right out of the box, please be certain to take the necessary precautions to avoid injury"

HAHAHAHAHAHAHAHAHA ...:D..:eek:..:LOL:..:ROFLMAO:..:sorry:
 
Well, perhaps they should add a warning label on the box, similar to the signs that were posted at McDonald's after the hot-coffee-burned-customer crap....

"WARNING: all CPUs are served extremely hot right out of the box, please be certain to take the necessary precautions to avoid injury"

HAHAHAHAHAHAHAHAHA ...:D..:eek:..:LOL:..:ROFLMAO:..:sorry:
[image: GTX 480 warning sticker]


Just like the 480.
 
Is this actually a lot? I find that my 9980XE pulls about 300W at 100% load (mostly F@H WUs this week) and sits happily at 80C for unlimited duration. That's 30C below the throttling temp, and this is all on stock clocks with the CPU showing 4.1GHz all-core during the computes.

Compare this against, say, a 2970WX, which is rated by AMD at 250W but reportedly hits 400W+ routinely in real use. The two are basically the same.

If we look at temps, we see that Intel and AMD report temps from different locations. When we go apples-to-apples, we see one CPU happy at 80C and the other one happy at 75C. The two are basically the same.

So, I'm curious where this "omg Intel chipz r hottt n use 2many powers!!2" attitude comes from. It seems like an apples-to-apples compute comparison shows roughly equal power draw at roughly equal temps.

Don't forget that AMD's 0.01% laptop marketshare is because they can't make CPUs that are as cool-running and power-efficient as Intel's.


???

Compared to the 3950X and 3900X, yes, it's a lot.
[attached chart]

Courtesy of Legit Reviews
 
The mental gymnastics here are fantastic. Here we go.

1) Heat (or power consumption) != temperature. For the same power consumption, these chips will run hotter than their regular desktop counterparts because the heat is concentrated into a smaller area. (Fun fact, one of the ways Intel compensated for the heat generated by the higher clocks of their 14++ parts with respect to the 14+ versions was relaxing the gate pitch and spreading everything out a bit.)

Yes, you are correct that temperature and power are different metrics. That's why I mentioned each of them separately and noted that the two chip makers' products are roughly equal on both metrics. Fun fact: nobody cares about the labeling of the process outside of fanbois from one camp or the other. 300W on 14nm is still less power than 400W on 7nm.

2) The 2970WX consumes nowhere near 400W stock. You may see a bit above 300W on workloads that really stress the CPU like Blender, but that's about it. You will see more when overclocking or enabling PBO, but that ceases to be an apples-to-apples comparison. All of this is unimportant, though, because you may be aware of this little thing called the Threadripper 3000 series that has supplanted the entire WX range of Threadripper 2000 series processors with significantly higher performance and efficiency. Intel doesn't have a part to compete with that series on either front right now.

I saw numerous reviews stating 400-450W on stock 2970WXes. Even still, if we go with your 300W number (which is higher than AMD's stated rating, btw), we're in exactly the same place as I said before: similar power consumption for similar performance. I'm happy you agree with me there.

The reasons I didn't mention the TR 3000 series are:
1) Nothing in Intel's desktop lineup can match that performance level, so it becomes an impossible comparison. Intel is definitely playing catch up here. Or rather, they need to start - they're so far behind right now, it isn't clear that they're even playing.
2) The "hurrdurr intel so hot" mindset predates the 3000 series by several years. It's been a false feel-good the entire time.

3) The "omg Intel chipz r hottt n use 2many powers!!2" attitude came from the fact that they're hot and consume a lot of power. Pretty self-explanatory. It's not unheard of for someone to be unable to hit 5.0 GHz all-core on a 9900K because the required voltage (and the resulting temps) is too high for even something like a decent AIO.

Right. This is exactly what I'm mocking. The chips aren't particularly hot and don't consume a noteworthy amount of power when you compare them to the equivalent AMD unit.

4) Re: AMD's laptop chips, have you been intentionally avoiding all of the coverage of the 4000 series parts, or...? That's also not to mention the gobs of money Intel throws at laptop vendors to ensure their chips make it into the best possible designs. It's a legitimate advantage to exercise, but let's not pretend it's because their technology is leaps and bounds ahead of AMD's.

I'm not really interested in comparing hypothetical numbers for a product that will, someday, exist in the market against something I can walk into a store and buy. Until the chip hits the wild, any comparison is BS.

You want apples-to-apples by comparing a mainstream-socket CPU to HEDT-socket ones? It's more than a highly overclocked (non-extreme OC methods) 9900KS, and that is a notorious power hog. So yes, in the context of what it is, it is a lot.

I honestly could not care less about whatever marketing names get assigned. I'm comparing equivalent computational performance between a desktop chip at the high end of the market and another desktop chip at the same end of the market that has had a marketing department apply HEDT branding to the retail box.

If your primary buying decision is the branding on the retail box, I think one of us is in the wrong place.
 