The Core i9-12900K used 2.3x the power of the Ryzen 9 5950X in Intel's official benchmark

Solhokuten

Three years later, and they still can't match the efficiency of TSMC 7nm!

Their only hope is to keep adding factory-overclocked P-cores, chased with a few efficiency cores!
 
I really hope this isn't true. This is coming from someone with a 5950X (yes, my sig is out of date). If Intel kept the 5950X limited to 105W while letting its own chip boost out the wazoo, AND they didn't install the fixes for Ryzen's caches... then that paints this chip in a pretty poor light. My 5950X can drink 200W easily when boosting with PBO2, so I suspect it's either much closer, or worse, Intel is still lagging behind. We need actual competition and advancement for a healthy industry.
 
I really hope this isn't true. This is coming from someone with a 5950X (yes, my sig is out of date). If Intel kept the 5950X limited to 105W while letting its own chip boost out the wazoo, AND they didn't install the fixes for Ryzen's caches... then that paints this chip in a pretty poor light. My 5950X can drink 200W easily when boosting with PBO2, so I suspect it's either much closer, or worse, Intel is still lagging behind. We need actual competition and advancement for a healthy industry.
Don’t the 3900, 3950, 5900 and 5950 pull 142W at usual max power without PBO?
 
Don’t the 3900, 3950, 5900 and 5950 pull 142W at usual max power without PBO?
I don't recall, but perhaps that was with PBO 1? 105W is the official TDP of the 5950X. For fun I just did a quick Cinebench R20 run, and per HWiNFO the CPU package pulled 236W.
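For anyone who'd rather log this on Linux than eyeball HWiNFO, here's a minimal sketch using the kernel's RAPL powercap counters. The sysfs paths and the one-second sampling window are assumptions about your setup (recent kernels expose both Intel and AMD package energy this way, usually root-only):

```python
# Sketch: compute average CPU package power from two samples of the
# RAPL energy counter. The counter reports microjoules and wraps
# around at max_energy_range_uj, so account for that.

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"        # package 0 energy counter
RAPL_RANGE  = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def avg_power_watts(e0_uj: int, e1_uj: int, seconds: float,
                    max_range_uj: int) -> float:
    """Average package power between two energy_uj samples,
    allowing for the counter wrapping back past zero."""
    delta = e1_uj - e0_uj
    if delta < 0:                    # counter wrapped during the window
        delta += max_range_uj
    return delta / 1e6 / seconds     # microjoules -> joules -> watts

# Usage (needs root on most distros):
#   e0 = int(open(RAPL_ENERGY).read()); time.sleep(1)
#   e1 = int(open(RAPL_ENERGY).read())
#   print(avg_power_watts(e0, e1, 1.0, int(open(RAPL_RANGE).read())))
```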
 
Gasp! Oh no! Not the scawy power usage monster! If it's a badass, it's a badass. Seriously, who cares? This is [H].
I've got enough capacity to cool anything AMD, Intel, Nvidia or anyone else can dream up; as long as I can slap a block on it I'm gtg. I ain't scurred 😂
My only interest is high-end bang for the buck, whenever I can finally be bothered to find a GPU for a decent price. If ever. The more time that goes by, the less I seem to care.
 
Gasp! Oh no! Not the scawy power usage monster! If it's a badass, it's a badass. Seriously, who cares? This is [H].
I've got enough capacity to cool anything AMD, Intel, Nvidia or anyone else can dream up; as long as I can slap a block on it I'm gtg. I ain't scurred 😂
My only interest is high-end bang for the buck, whenever I can finally be bothered to find a GPU for a decent price. If ever. The more time that goes by, the less I seem to care.
You missed the point. They crippled the competition by limiting it to 105W. My 5950X can easily stretch its legs to 236W, which would have been a much more even playing field against Intel's 241W. If this report is true, it means that on an even playing field Intel's newest chips would likely be stomped by AMD's existing offerings.
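To make the "even playing field" point concrete, here's the efficiency arithmetic as a tiny sketch. Every score and wattage below is a made-up placeholder, purely to illustrate why a power-capped chip vs. an uncapped one isn't a fair fight:

```python
# Back-of-the-envelope performance-per-watt comparison with
# hypothetical numbers (NOT measurements).
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

capped_5950x   = perf_per_watt(10_000, 142)  # stock PPT limit
uncapped_5950x = perf_per_watt(11_000, 236)  # PBO, limits lifted
uncapped_intel = perf_per_watt(11_500, 241)  # boosting freely

# Lifting the cap buys ~10% more score for ~66% more power, so the
# capped chip always "wins" efficiency -- which is exactly why capping
# only one side of a comparison is misleading.
```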
 
You missed the point. They crippled the competition by limiting it to 105W. My 5950X can easily stretch its legs to 236W, which would have been a much more even playing field against Intel's 241W. If this report is true, it means that on an even playing field Intel's newest chips would likely be stomped by AMD's existing offerings.
Ah, in that case boooo!
 
Perfect for the winter season 😌

lol.... It's an all-in-one sorta deal. Buy a CPU upgrade and you also get a space heater for wintertime....

Might be a problem for some of the existing heatsinks out there....
 
Gasp! Oh no! Not the scawy power usage monster! If it's a badass, it's a badass. Seriously, who cares? This is [H].
Data centers and servers are going to be a different story, and if the heat and power consumption reports are true, Intel is going to lose more ground to AMD and ARM variants.
The only thing keeping them afloat is vendor lock-in, preexisting contracts, and supply chain issues rendering Intel the only short-term option.

For personal use they can lean on marketing to keep competing, but each iteration looks grimmer as time passes.
 
Data centers and servers are going to be a different story, and if the heat and power consumption reports are true, Intel is going to lose more ground to AMD and ARM variants.
The only thing keeping them afloat is vendor lock-in, preexisting contracts, and supply chain issues rendering Intel the only short-term option.

For personal use they can lean on marketing to keep competing, but each iteration looks grimmer as time passes.
Don't care about data centers or servers. Neither interests me. I can only hope that someone eventually develops a legit ARM desktop contender or two or three! This bloodsucking duopoly has needed a fresh pair of eyes for far too long. Intel will do what they do and continue to make billions. They have sunk roots into virtually everything cutting-edge and/or profitable in tech. That money runs deep. My only concern with Intel is that they continue to compete in the desktop space and that AMD stays strong. I don't care how. As long as they force each other to innovate, I'm good. AMD will punch back if AL is in fact a badass. It isn't as if it will be a world-beater. If it isn't a badass, it just means no price breaks for the value-minded. The opposite, in fact, which will suck.
 
Go go go Pentium 5.... ah, I mean One Two Ninetieth Hundred, or is that One Three Sixty Sixty Six Sixty KFGLMNOP... ahh, I mean Generation Current, err, I mean 11, no 11.5, ahhh, 12 I think. Ahh, Alder Lake, go Alder Lake.

haha, I joke... but real-world testing done by someone other than Intel should be interesting.

So a crippled OS... a crippled power envelope, and still ignoring multithread numbers. Although I still believe Alder Lake isn't necessarily a terrible product quite yet, I am starting to doubt Intel's claim that the crown is once again theirs.
 


I was going to pick one up, but I'm going to wait for Meteor Lake, or I might be better off buying a new GPU. It won't be much of an upgrade from my 10850K, which I picked up right after X-Mas using a gift card. Plus it's going to run hot, so I wonder what the life cycle of the CPU will be. You gotta figure $670 for the CPU, $300 for the MB, and then DDR5 is going to be priced really high at first if you don't settle for a DDR4 MB.
 
Anyone actually skim the linked article?
It's a kid speculating about slides he doesn't seem to fully understand, trying to interpret what they may mean and making assumptions about relative performance.

FWIW the 12900K may well use "way more power" but let's see the hands-on tests. There are thousands of these thirsty blogs pumping out speculative infotainment daily, trying to take advantage of the information vacuum before the embargo lifts.
 
It surprises me that this many people still fall for clickbait this easily. Anyone actually open the link?

It's some kid babbling about slides he doesn't fully understand, speculating about what they may mean and making flawed assumptions.

Let's wait for real tests, and resist the urge to click on every thirsty no-name blog.
Yeah, it was hard to read.
 


I was going to pick one up, but I'm going to wait for Meteor Lake, or I might be better off buying a new GPU. It won't be much of an upgrade from my 10850K, which I picked up right after X-Mas using a gift card. Plus it's going to run hot, so I wonder what the life cycle of the CPU will be. You gotta figure $670 for the CPU, $300 for the MB, and then DDR5 is going to be priced really high at first if you don't settle for a DDR4 MB.

Pretty nice uplift in Cyberpunk. Looks like a decent chip.
 
So we'll bitch about a CPU's power draw but willingly throw handfuls of money at GPUs that practically require a kilowatt-class PSU? C'mon... pick your battles; is this the [H]ardLaptop forum!?!?

Now, having said all that, if we find it's only on par with Ryzen at tons of watts, then clearly there's work to be done. But I don't think anyone should doubt Intel's ability to do great things; they just have to want to do them. Ryzen 2 and 3 were really the first chips to put them on notice, and that isn't that long ago (in engineering and development years).
 
So we'll bitch about a CPU's power draw but willingly throw handfuls of money at GPUs that practically require a kilowatt-class PSU? C'mon... pick your battles; is this the [H]ardLaptop forum!?!?

Now, having said all that, if we find it's only on par with Ryzen at tons of watts, then clearly there's work to be done. But I don't think anyone should doubt Intel's ability to do great things; they just have to want to do them. Ryzen 2 and 3 were really the first chips to put them on notice, and that isn't that long ago (in engineering and development years).
You missed the point. If you limit one system to a lower wattage, it can’t boost. They crippled the Ryzen system to make Alder Lake look good. It’s not that Alder Lake used 250W; it’s that Ryzen was limited to 105W.
 


I was going to pick one up, but I'm going to wait for Meteor Lake, or I might be better off buying a new GPU. It won't be much of an upgrade from my 10850K, which I picked up right after X-Mas using a gift card. Plus it's going to run hot, so I wonder what the life cycle of the CPU will be. You gotta figure $670 for the CPU, $300 for the MB, and then DDR5 is going to be priced really high at first if you don't settle for a DDR4 MB.

The problem with this video is that almost all the games are being tested in GPU-bound scenarios. The only one that isn't is Horizon Zero Dawn, and the new chip has a 20% advantage in that one, which jibes with what we've been hearing about the Alder Lake advantage.

Of course, I expect the Zen 3 refresh with stacked cache to be similar in performance, and Zen 4 to probably beat it handily. Still, if someone already has a 3090 and wants the biggest benchmark scores at 720p, Alder Lake looks like it can give you that little boost right now.
 
Anyone actually skim the linked article?
It's a kid speculating about slides he doesn't seem to fully understand, trying to interpret what they may mean and making assumptions about relative performance.

FWIW the 12900K may well use "way more power" but let's see the hands-on tests. There are thousands of these thirsty blogs pumping out speculative infotainment daily, trying to take advantage of the information vacuum before the embargo lifts.
Yeah, there is no such thing as PL1 on AMD CPUs. AMD rates TDP in thermal watts (how much heat the cooler needs to dissipate) while Intel measures in electrical watts (how much power is consumed for a given workload). The two cannot be compared directly at all. Makes for good clickbait headlines, though.
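As a rough illustration of why Intel's power spec doesn't reduce to a single TDP number: the firmware tracks a moving average of package power, allows bursts up to PL2 while that average is under PL1, and clamps to PL1 afterward. Here's a toy simulation of that budget; all constants are illustrative, not from any datasheet:

```python
# Toy model of Intel's PL1/PL2/Tau turbo budget (illustrative numbers).
PL1, PL2, TAU = 125.0, 241.0, 56.0       # watts, watts, seconds

def step(ewma: float, requested_w: float, dt: float = 1.0):
    """One tick: return (power granted this tick, updated moving average)."""
    cap = PL2 if ewma < PL1 else PL1     # bursting allowed until avg hits PL1
    granted = min(requested_w, cap)
    ewma += (granted - ewma) * (dt / TAU)
    return granted, ewma

# A heavy all-core load that wants 400 W forever:
ewma, boost_seconds = 0.0, 0
for _ in range(300):
    granted, ewma = step(ewma, 400.0)
    if granted > PL1:
        boost_seconds += 1
# It runs near PL2 for a stretch (~40 ticks with these constants),
# then settles at PL1 -- so "125 W" and "241 W" are both true.
```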
 
As always, I'm waiting for reviews before I decide on anything. I was sorta underwhelmed by Intel's slides, so when the proper reviews come out it won't surprise me if it's nothing special.
 
It seems Intel admits they need to retest Ryzen on Windows 11 as the cache patches were not installed. I guess we'll have to wait for reviews before we can verify the power disparity claims.

https://www.pcgamer.com/intel-claim...est-gaming-processor-but-has-to-retest-ryzen/

It will also be interesting to see how the Ryzen 3D V-Cache chips compare next year.
Against the updated-cache Ryzens... it's pretty clear: AMD will spank them. As much as games love single-thread performance, they love cache more.
At this point Intel needs to pray the silicon shortage lasts long enough for them to get to generation 13 before it ends. They have guaranteed sales for now, as long as these don't self-destruct or something; it's not like AMD can flood the market.
If Intel can hang in (and due to shortages they will likely sell all of these they can make), who knows, it's possible AMD misses a launch window on Zen 4... and Meteor Lake doesn't. Going to be a fun few years... nice to have competition from very different products. It feels like the '90s again. lol
 
It's easy: never take "official" numbers from manufacturers about the performance of their products at face value, ever.
 
The problem with this video is that almost all the games are being tested in GPU-bound scenarios. The only one that isn't is Horizon Zero Dawn, and the new chip has a 20% advantage in that one, which jibes with what we've been hearing about the Alder Lake advantage.

Of course, I expect the Zen 3 refresh with stacked cache to be similar in performance, and Zen 4 to probably beat it handily. Still, if someone already has a 3090 and wants the biggest benchmark scores at 720p, Alder Lake looks like it can give you that little boost right now.

But it's a realistic test of what you would expect when gaming with this CPU. No one games at 720p low settings (well, maybe some low-spec gamers), so that is not indicative of real-world performance.

Also, most of the games had a nice uplift of 10-20% even in GPU-limited scenarios, so it shows the CPU is good and will deliver gains in real life.
 
But it's a realistic test of what you would expect when gaming with this CPU. No one games at 720p low settings (well, maybe some low-spec gamers), so that is not indicative of real-world performance.

Also, most of the games had a nice uplift of 10-20% even in GPU-limited scenarios, so it shows the CPU is good and will deliver gains in real life.
Among my own first tests will be frametimes in VR, and I expect a nice drop there.
 
I mean, I could cool it too, no problem. Doesn't mean I want to dump that much more excess heat into the PC room and ramp the rad fans up more.

Heat matters beyond just whether you can keep the CPU cool or not.

Data centers and servers are going to be a different story, and if the heat and power consumption reports are true, Intel is going to lose more ground to AMD and ARM variants.
The only thing keeping them afloat is vendor lock-in, preexisting contracts, and supply chain issues rendering Intel the only short-term option.

For personal use they can lean on marketing to keep competing, but each iteration looks grimmer as time passes.

Yep: my temporary cooling solution for my temporary 11900K (which is neither tuned nor specially overclocked) on an open-air bench, a Hyper 212 with two spare Noctua F-12 fans in place of the stock fan, barely keeps the chip cool. Compare that to a mildly overclocked 3770K at the office, also under a Hyper 212, which has a much easier time restraining the older chip's heat output.

According to Videocardz, an Epyc Zen 5 chip will have a 600-watt profile. Of course, that chip is supposedly going to have 256 cores, so allowances should be made, unlike for Intel's 28-core Skylake.
 