Intel Core i9-13900K Raptor Lake CPU Offers Same Performance As Core i9-12900K With “Unlimited Power” at Just 80W

The 300W number is a worst-case scenario for power consumption, by the way. In gaming, a worst-case scenario like Civ 6 consumes around 190W. Still worse than AMD, but let's get some context in the discussion. The 13900K isn't going to be consuming 300W all the time.
That is fair... of course, if you're buying either the 13900 or 7950 and all you're doing is gaming, you've probably got your budget priorities a little messed up.
If you're doing the things that you buy a high-core-count CPU for... the 13900 throttles after 17 s under water, and does suck double the power.
With light loads like gaming, sure, it's not insanely more... I mean, sitting idle it is probably not sucking any more power at all. Not being a jerk, just... yeah, I think we all know the flagship CPUs are a waste of money for gaming alone. I mean, if you're buying flagship CPUs you're almost certainly trying to game at 4K... and, well, basically every mid-range or better CPU from the past two gens will be damn close to equal.
 
Derbauer... I think the 13900K capped at 90W is very interesting, as it performs like a 12900K. I mean, I realize you can run it to 350W+... and it's "fast"... but at quite a cost.

IMHO, there's still value in the AMD platform, but this new stuff from Intel is very interesting.
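
If anyone wants to play with a cap like that from software instead of the BIOS, here's a minimal sketch using Linux's RAPL powercap sysfs interface; the zone path is an assumption, it needs root, and board/BIOS limits can still override whatever you set there:

```python
# Minimal sketch: cap the long-term package power limit (PL1) at ~90 W via
# Linux's RAPL powercap sysfs interface. The zone path, and whether the
# board/BIOS actually honors the value, are assumptions; run as root.
ZONE = "/sys/class/powercap/intel-rapl:0"  # package-0 RAPL zone (assumed name)

def set_pl1(watts: float) -> None:
    uw = int(watts * 1_000_000)            # sysfs takes microwatts
    with open(f"{ZONE}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(uw))                   # constraint 0 = long-term limit (PL1)

def get_pl1() -> float:
    with open(f"{ZONE}/constraint_0_power_limit_uw") as f:
        return int(f.read()) / 1_000_000

if __name__ == "__main__":
    set_pl1(90)                            # the 90 W cap discussed above
    print(f"PL1 is now {get_pl1():.0f} W")
```

On Windows, the equivalent PL1/PL2 knobs usually live in the BIOS or in Intel XTU.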

 
Testing, to me, has gotten quite complicated all around (GPU and CPU), and both the tester and the receiver of the info can be a bit overwhelmed.

How well AM5 will age versus a dead-end platform is a big variable if you are going to put a lot of money into a motherboard, which we cannot escape.

But it would be interesting to see tests at similar price points: how a similarly priced Raptor Lake setup performs against an AM4 system in particular (both being dead-end platforms), what significant spec differences you get at those prices (a 5900X vs. 5800X3D vs. something like a 7600X/7700X/13600K, with the CPU + motherboard + RAM kit landing at a similar total price), and how much you gain by spending more on a nicer DDR5 version of the 13600K or Zen 4.

We have created a media environment in which we get many dozens, maybe hundreds, of quite redundant reviews from small reviewers, but not two or three from giant consumer outlets, à la the '90s, with a staff that can go over all of this in a short amount of time.

It feels a bit overwhelming for everyone involved, and yet maybe close to irrelevant, as there is probably no bad option at the asking prices and only a marginal difference from whatever you decide in the end anyway.
 
The 300W number is a worst-case scenario for power consumption, by the way. In gaming, a worst-case scenario like Civ 6 consumes around 190W. Still worse than AMD, but let's get some context in the discussion. The 13900K isn't going to be consuming 300W all the time.
The Hardware Unboxed review showed the 13900K hitting 90°C while playing Cyberpunk 2077. Those temps were recorded using a 360mm AIO. I understand if the CPU hits those temps while running Cinebench, but not while gaming.
 
The 300W number is a worst-case scenario for power consumption, by the way. In gaming, a worst-case scenario like Civ 6 consumes around 190W. Still worse than AMD, but let's get some context in the discussion. The 13900K isn't going to be consuming 300W all the time.
No, but you'd better have some extra headroom on your power supply in case a CPU spike lines up with a transient load spike on anything 3090-class or higher. 1000W minimum.
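Rough back-of-the-envelope math on that (the spike figures below are assumed round numbers for illustration, not measurements from any review):

```python
# Back-of-the-envelope PSU sizing. All figures are assumed round numbers.
cpu_spike_w      = 300   # 13900K worst-case package power mentioned above
gpu_transient_w  = 500   # brief transient on a 3090-class card (assumed)
rest_of_system_w = 100   # board, RAM, drives, fans (assumed)
margin           = 1.10  # ~10% headroom so spikes don't trip the PSU's protection

recommended_w = (cpu_spike_w + gpu_transient_w + rest_of_system_w) * margin
print(f"Recommended PSU: ~{recommended_w:.0f} W")  # ~990 W, i.e. the 1000 W minimum above
```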
 
Like this:
https://www.amazon.com/gp/product/B0BCF54SR1/ref=ewc_pr_img_1?smid=ATVPDKIKX0DER&psc=1

Shipped and sold by Amazon for $802.89. Looks like Amazon is officially a gouger.
The link you have there is up to $1,484.77 now, and it shows as shipped and sold by "M-X-C Tech".
I love that they have "Mac OS X El Capitan 10.11" listed as a supported platform for the chip.

1666304896257.png
 
Weird, still shows this for me:

View attachment 520117
Amazon US vs. Amazon Canada, so that's my bad; it's auto-redirecting that link to the CA site for me.
Edit:
In the US, if the order is processed and shipped from an Amazon warehouse, they will just show Amazon.com as the shipper and reseller, but in Canada that doesn't work, so for tax reasons they have to list who Amazon is acting as the middleman for. I didn't know that before now, so yay, our convoluted tax laws are doing something for me for once.
 
With Amazon you always have to watch who the seller is... sold by a third party and warehoused by Amazon, or actually sold by Amazon. Sold and shipped by Amazon always means good delivery and genuine products, but it's rarely the cheapest option.
As a Canadian, though, I love hunting Amazon for mis-posted deals from US sellers. The Canada markup on everything is super annoying. It's crazy: on things like art supplies, they normally have exactly the right amount of "shipping and import fees" so that the total exactly equals the Canadian sale price. It's fixed. But now and then you find a listing that isn't posted properly and shows nothing for import fees... and if Amazon doesn't charge it at checkout, you don't pay it. I've gotten plenty of deals on things that way... well, deals for me; freaking everyday US pricing for most people reading this. I know I got a deal if I check the listing the next day and it's been updated to say +$60 in shipping and import fees.
 
The 13600K is the best deal, obviously. The pricing for the 13700K and 13900K just feels so off to me. They couldn't find a half-dead E-core to make the 13700K more desirable? 8P/12E. Instead you get only 2 more P-cores for $100 more?
 
So yeah, Intel and AMD both cranked their flagships to 11 so as to get the highest possible benchmark numbers. When properly power-tuned, both are decent at efficiency but nothing exceptional. Even the not-so-extreme 13600K loses to its Zen 3 counterparts in efficiency by something like 50%.

Hoping Intel and AMD will release non-K and non-X SKUs that are more dialed in for efficiency.
 
Either way, no reason to buy either one right now when the X3D models will be out in, what, 3-4 months, IMO.
 
Either way, no reason to buy either one right now when the X3D models will be out in, what, 3-4 months, IMO.
My issue: AMD's value proposition (CPU + Mobo) is so bad right now compared to 13th gen, even with the performance boost from X3D (combined with the cost increase) they will barely bridge the gap.
Example, the 7700X is ~$70 more expensive than the 13600K, comparable mobo is ~$100 more expensive. That's $150-$200 more money for a slower chip. That's all ground the 3D chips need to cover.

If AMD can't compete in performance with a more expensive CPU right now, why would an even more expensive 3D chip solve that problem? 7600X and 7700X need their prices cut by $100, X670 boards need a $50-$100 cut.

Not worth waiting for unless you're buying the extreme of extreme CPUs... 13900K for gaming? Sure, wait for 3D instead.
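
Just putting those deltas together (approximate figures from above, not exact street prices):

```python
# Platform premium implied by the deltas quoted above (approximate figures).
cpu_delta_usd  = 70    # 7700X over 13600K
mobo_delta_usd = 100   # comparable AM5 board over a comparable Intel board
premium = cpu_delta_usd + mobo_delta_usd
print(f"AM5 platform premium: ~${premium}")  # ~$170, the $150-$200 gap the 3D chips must cover
```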
 
My issue: AMD's value proposition (CPU + Mobo) is so bad right now compared to 13th gen, even with the performance boost from X3D (combined with the cost increase) they will barely bridge the gap.
Example, the 7700X is ~$70 more expensive than the 13600K, comparable mobo is ~$100 more expensive. That's $150-$200 more money for a slower chip. That's all ground the 3D chips need to cover.

If AMD can't compete in performance with a more expensive CPU right now, why would an even more expensive 3D chip solve that problem? 7600X and 7700X need their prices cut by $100, X670 boards need a $50-$100 cut.

Not worth waiting for unless you're buying the extreme of extreme CPUs... 13900K for gaming? Sure, wait for 3D instead.
I'm looking to give this whole gen a pass. The X3D parts, whenever TSMC gets that sorted, will certainly interest me from an engineering perspective, but until I move up from 1440p, what I have is more than sufficient, and my time allowance for gaming lately just makes it hard for me to justify that cost to myself.
I see this as the junction point where the big three are at a level where the gaming industry just hasn't put out anything that can really push even their middle-of-the-pack releases unless you are playing at full 4K. And the way I see it, the consoles and pandemic pricing are going to hold things where they are for at least another three years.
You see AMD and Nvidia more or less abandoning the lower-end parts; I strongly believe it's because they can just discount their previous-gen parts and have them fill in the gap, and with AMD's current platform cost it would make no sense for them to put out a low-end CPU. Intel is being weird; they are behaving like an underdog and it is unsettling. Somebody tell them to stop.
 
You see AMD and Nvidia more or less abandoning the lower-end parts; I strongly believe it's because they can just discount their previous-gen parts and have them fill in the gap, and with AMD's current platform cost it would make no sense for them to put out a low-end CPU. Intel is being weird; they are behaving like an underdog and it is unsettling. Somebody tell them to stop.
After years of Intel shilling 4c/8t high-end parts, they deserve the lower-margin parts of the business.

4 years later, it feels like we’re still in the same spot. Intel pushing megawatts of power chasing performance.

 
My issue: AMD's value proposition (CPU + Mobo) is so bad right now compared to 13th gen, even with the performance boost from X3D (combined with the cost increase) they will barely bridge the gap.
Example, the 7700X is ~$70 more expensive than the 13600K, comparable mobo is ~$100 more expensive. That's $150-$200 more money for a slower chip. That's all ground the 3D chips need to cover.

If AMD can't compete in performance with a more expensive CPU right now, why would an even more expensive 3D chip solve that problem? 7600X and 7700X need their prices cut by $100, X670 boards need a $50-$100 cut.

Not worth waiting for unless you're buying the extreme of extreme CPUs... 13900K for gaming? Sure, wait for 3D instead.
AM4 is still a thing. The 3D chip is perfectly fine there.
 
After years of Intel shilling 4c/8t high-end parts, they deserve the lower-margin parts of the business.

4 years later, it feels like we’re still in the same spot. Intel pushing megawatts of power chasing performance.


I get it, but at that time was there anything really pushing them to offer more than that?
Even now, some four years later, we're at a point where 6c/12t is still the sweet spot for most things unless you are in a professional or semi-professional environment; for the average home user, Intel could probably just launch a cheap chip with 8 E-cores and it would be more than enough.
Core counts are going up, but even in gaming there isn't a huge benefit to them, and at some point they become almost detrimental because there just aren't enough memory channels to feed them. I get that we need more cores now so game developers can utilize them later, once the average core count has increased down the road, but it just feels so meh to me. AMD and Intel are both so desperate to keep their consumer parts from stepping up to their enterprise ones that they are cutting them off at the knees. I don't know exactly what I expected from 13th gen and Zen 4, but it was more than what both of them delivered, and that is totally on me; I get that, but it's still a bummer.
While I didn't expect to see a huge jump in PCIe lanes, I at least hoped for 4 more; I would have been happy with even 2. Same with memory channels: 2 is starting to get stagnant, and DDR5 opens up a lot of possibilities that get really cool when you have at least 4, especially with the core counts we are starting to see. I mean, 20 cores in a mainstream part? Yeah, I get that 8 of those are low-power "efficient" cores, but those cores still have about as much oomph as an 8th-gen Coffee Lake core, which lots of people out there are still rocking happily.
It's late and I'm tired; I probably just need some sleep, and I can hold out hope for next year. Maybe one of the game companies out there will announce something that gives me something to look forward to.
 
Sidenote: imagine an entire generation only knowing hardware reviews built around outrage/shock-faced YouTube thumbnails and algorithm pandering. How did guys like Kyle Bennett ever manage to get through a hardware review without all the dumb faces?

View attachment 520061

Edit: If there has to be a thumbnail? This hardware review gets my business every time.

View attachment 520065
Really miss the old days of Kyle and his staff's hardware reviews in non-video format with actual reading and graphs.
Brings this to mind, though whatever it takes to get views in this era...

pc8ee1h4qs481.jpg
 
Hmm, I don't feel compelled to upgrade my CPU with any of these new offerings. Might wait and see what Zen 4 with 3D V-Cache and Meteor Lake will bring next year.
 
Hmm, I don't feel compelled to upgrade my CPU with either of these new offerings. Might wait and see what Zen 4 with 3D V-Cache and Meteor Lake will bring next year.
But then Zen 5 is right around the corner; AMD is already saying it is on track for 2024, will be a big.LITTLE design, and should have a larger IPC uplift than the Zen 3 to Zen 4 transition.
 
Sitting on a 4790K + RTX 3060 Ti; that is, I have put Linux on my 3210M / 640M LE Vario to save power. I play Half-Life 2 and some other games I have overlooked. :) But when power is "cheap" (<0.30 EUR) I replay Crysis. :)

I will upgrade when power prices really come down, maybe with new transistors in 2025-27.

I think the many cores will be for AI. Maybe it will not all be cloud-based. Do you have any ideas?
 
But why stop there? Wait a few more months and Meteor Lake will be out in the same year...
One thing I would not rely on whatsoever is Intel releasing a product on time. I don't think they have released a product on time in the last few years.
 
I don't know, every time I watch one of these videos I keep thinking that the X3D variant of Zen 4 is worth waiting for.
Yeah you could be right, but I'm tired of playing the waiting game.
Plus it's like getting a 12900k for an extreme discount. :)
 
Yeah you could be right, but I'm tired of playing the waiting game.
Plus it's like getting a 12900k for an extreme discount. :)
I'm still on a 9900K and haven't felt like I need more CPU. Maybe when I get a 4080 or 4090 (or amd equiv) that will change.
 
They will. Knowing that the AM5 platform will last until 2025+, they will come out, whereas Z690/Z790 is EOL.

Right now Intel is the price/performance king.
From what I've read here, there are technical difficulties with the TSMC process that they need to get around first. Whether AMD will spend the time and money to get a 3D V-Cache version into production is the big question.
 
I'm still on a 9900K and haven't felt like I need more CPU. Maybe when I get a 4080 or 4090 (or amd equiv) that will change.
My 10700K is basically a 9900K, and I'm going to turn it into another MediaCoder/HandBrake box to add to my fleet.
It still has balls enough for gaming, but I'm looking at getting a 3090 from my cousin on the cheap.

I should be set for another 10 years after that upgrade.
 
In short, a move to AMD's 7000 series will cost you more and deliver less in many cases. Well done, Intel, taking advantage of AMD's radical platform move. Thus, for many considering an upgrade, Intel becomes the better overall value, which may switch many AMD fans over to Intel (again?). We'll see.

With that said, and there's nothing definitive on this, if AMD does significantly lower the prices of the 7000 series (essentially flushing/destroying the prior gens in the channel, though), AMD might become the value proposition again. GamersNexus suggested maybe at or before CES?
 
From what I've read here, there are technical difficulties with the TSMC process that they need to get around first. Whether AMD will spend the time and money to get a 3D V-Cache version into production is the big question.
TSMC is having no issues with 5nm. Besides, AMD is announcing X3D at CES. Pretty sure those chips are already in production.

Not sure it will be enough to increase sales. Time will tell.
 