Intel Core i9-13900KS Review: The World's First 6 GHz

erek

[H]F Junkie

The fastest and most power-hungry PC chip ever.

"The KS model has the same default profile as the 13900K but also has its own new Extreme Power Profile that allows for a 320W PL1/PL2 and 400A ceiling. You'll need to ensure that your motherboard can deliver the peak current if you want to unleash the full power of the KS, as not all motherboards can for a long period of time. Motherboard vendors allow assigning a higher ICCMax value in the BIOS, typically under settings like "Core/CPU Current Limit" (the name varies by mobo maker), but that doesn't mean the motherboard can actually deliver that amount of current. Obviously, B- and H-series boards don't make the cut.

Intel defines these recommended power profiles but allows motherboard vendors to ignore them completely, and exceeding the default values doesn't void the warranty. Thus, by default, most motherboard makers completely ignore the limits and assign the maximum values for PL1, PL2, and ICCMax, resulting in higher performance and more heat.

Even at stock settings, the Core i9-13900KS hit up to 328W of power and 100C in our testing. We'll see what power draw, performance, and thermals look like, including gaming and productivity benchmarks, on the following pages."



https://www.tomshardware.com/reviews/intel-core-i9-13900ks-cpu-review
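The bit about boards ignoring the recommended limits is easy to check for yourself. As a minimal sketch (assuming a Linux system with the intel_rapl powercap driver loaded; the sysfs layout and constraint ordering can differ by platform), this reads back the package power limits the firmware actually programmed:

```python
#!/usr/bin/env python3
# Sketch: read back the package power limits (PL1/PL2) currently in effect
# via the Linux powercap/intel-rapl sysfs interface.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")   # package power domain

def read(name: str) -> str:
    return (RAPL / name).read_text().strip()

print("domain:", read("name"))                    # usually "package-0"
for c in (0, 1):                                  # 0 is typically long_term (PL1), 1 short_term (PL2)
    label = read(f"constraint_{c}_name")
    watts = int(read(f"constraint_{c}_power_limit_uw")) / 1_000_000
    window = int(read(f"constraint_{c}_time_window_us")) / 1_000_000
    print(f"{label}: {watts:.0f} W over a {window:.3f} s window")
```

On boards that "completely ignore the limits," as the review puts it, expect to see something far above 253/320 W reported here.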
 
Anyone who bitches about heat or power draw, there is a box by the door where you can turn in your [H]
You would figure it's obvious that anyone who purchases such a product has already thrown caution to the wind when it comes to power draw, and yet we get people cracking jokes about extreme products. Look no further than the 4090 Ti thread. The only humorous part is that the people cracking those jokes aren't the ones the card is targeted toward. :rolleyes:

While I am currently not in the market for such a beast, I still eagerly await benches to see just what this thing is capable of.
 
I'm not sure I'd agree. Obviously, more testing and more information is required, but I tested the Core i9 10980XE @4.8GHz and it pulled over 400w and hit temperatures of 108c. Those throttle at 110c, not 100c. The 13900KS is no doubt faster than the 10980XE, but I don't think it's the most power hungry chip ever. Let's not forget those Xeon W3175X chips that were known to pull over 400w regularly.
 
Anyone who bitches about heat or power draw, there is a box by the door where you can turn in your [H]
Pretty much, particularly since that is the POINT of such a part. The 13th gen is actually a pretty power-efficient architecture if you want it to be. The 13900K limited to around 90-100 watts just crushes it in performance-per-watt metrics vs most other CPUs at a similar power level (a similarly power-limited 7950X is slightly better per watt). However, that's not the target; the target is the high end, so instead they push the power as hard as they can, which gives you more total compute horsepower, just not in a linear fashion, and thus makes the performance-per-watt numbers tank. The KS pushes shit even harder, for even lower performance per watt, to hit that elusive 6GHz number.

As an aside, if anyone is worked up about power draw and has a 13900K or KF (why?), you can pretty easily limit it in your BIOS to 90-100 watts and have a chip that will perform on par with a 12900K but use less than half the power. You can have it both ways, if you like.
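For anyone who'd rather not reboot into the BIOS for this, here's a minimal sketch of the same idea from software, assuming a Linux box with the intel_rapl powercap driver loaded and root privileges (constraint 0 being the long-term/PL1 limit is the usual layout, but it can vary; the board's own VRM and ICCMax limits are unaffected by this):

```python
#!/usr/bin/env python3
# Sketch: cap the package long-term power limit (PL1) to ~90 W via the Linux
# powercap/intel-rapl sysfs interface, as a software stand-in for the BIOS
# power-limit option mentioned above. Requires root; values are microwatts.
from pathlib import Path

PKG = Path("/sys/class/powercap/intel-rapl:0")   # package power domain
TARGET_W = 90

pl1 = PKG / "constraint_0_power_limit_uw"        # constraint 0 = long_term on most systems
old_w = int(pl1.read_text()) / 1_000_000
pl1.write_text(str(TARGET_W * 1_000_000))
print(f"PL1: {old_w:.0f} W -> {TARGET_W} W")
```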
 
I'm not sure I'd agree. Obviously, more testing and more information is required, but I tested the Core i9 10980XE @4.8GHz and it pulled over 400w and hit temperatures of 108c. Those throttle at 110c, not 100c. The 13900KS is no doubt faster than the 10980XE, but I don't think it's the most power hungry chip ever. Let's not forget those Xeon W3175X chips that were known to pull over 400w regularly.
Xeon 8280 enters the chat... Especially on the handful of boards that would override the boost limits...

Consumer non-HEDT sure. But HEDT/Professional parts... yeaaaahhhh.
 
Pretty much, particularly since that is the POINT of such a part. The 13th gen is actually a pretty power-efficient architecture if you want it to be. The 13900K limited to around 90-100 watts just crushes it in performance-per-watt metrics vs most other CPUs at a similar power level (a similarly power-limited 7950X is slightly better per watt). However, that's not the target; the target is the high end, so instead they push the power as hard as they can, which gives you more total compute horsepower, just not in a linear fashion, and thus makes the performance-per-watt numbers tank. The KS pushes shit even harder, for even lower performance per watt, to hit that elusive 6GHz number.

As an aside, if anyone is worked up about power draw and has a 13900K or KF (why?), you can pretty easily limit it in your BIOS to 90-100 watts and have a chip that will perform on par with a 12900K but use less than half the power. You can have it both ways, if you like.
Fun fact: Picked up a 12900 non-K (NUC 12 Extreme) the other day. Chip is fast as hell, even with a 65W TDP. You don't HAVE to go power crazy if you don't want to.

Then again, my lab draws in the kW range from the wall, so...
 
Fun fact: Picked up a 12900 non-K (NUC 12 Extreme) the other day. Chip is fast as hell, even with a 65W TDP. You don't HAVE to go power crazy if you don't want to.

Then again, my lab draws in the kW range from the wall, so...
Now I'm curious to see how the 13900KS performs at 65W. Not that I will keep it there, but I'm interested. der8auer already proved that it is amazingly powerful and efficient at 90W.
 
I have the 13900KS, and yes, it can and does pull up to 400W if it feels like it. Good thing I have triple radiators and the new EK Quantum Velocity 2 CPU block dedicated to the CPU alone; the 4090 is air-cooled.
The 4090's cooler really doesn't need to be swapped out for WC unless you are desperate for the slot real estate. That cooler is just on another level.
 
Well, I agree with the other posters' statements about power draw and "who this was designed for". This was clearly not designed for me. Y'all enjoy.
 
The 4090's cooler really doesn't need to be swapped out for WC unless you are desperate for the slot real estate. That cooler is just on another level.
Yea lol, that's exactly why I sold my 3080 Ti Hydrocopper and went with an air-cooled 4090 Gaming X, whose triple-slot cooler is about the size of two 360mm rads put together. If you think about it, that's actually MORE cooling than if I just water-cooled the GPU, because it's like having two additional 360mm rads along with the three rads I already have in the system. The cooler on these 4090s is just massive, I love them lol. As for slots, I can still fit a Sound Blaster AE-5 in the second-to-last slot and I still have an open slot at the very bottom, so I'm ok. But yeah, this way the 13900KS gets ALL the radiators to itself because the 4090 is self-contained. I don't think I'll ever go back to water-cooling a GPU if air coolers are this good :)
 
Yea lol, that's exactly why I sold my 3080 Ti Hydrocopper and went with an air-cooled 4090 Gaming X, whose triple-slot cooler is about the size of two 360mm rads put together. If you think about it, that's actually MORE cooling than if I just water-cooled the GPU, because it's like having two additional 360mm rads along with the three rads I already have in the system. The cooler on these 4090s is just massive, I love them lol. As for slots, I can still fit a Sound Blaster AE-5 in the second-to-last slot and I still have an open slot at the very bottom, so I'm ok. But yeah, this way the 13900KS gets ALL the radiators to itself because the 4090 is self-contained. I don't think I'll ever go back to water-cooling a GPU if air coolers are this good :)
You still have fan noise, though. I like having enough heat capacity in the water to set a very slow ramp on the radiator fans; that way you never notice the gradual increase in noise. Boil the frog!

Plus there's still the size. If you're in a full tower it's not a big deal, but I don't believe a 4090 would even fit in my O11 non-XL given where the rad and pump are located - not without being water-cooled. Just not enough space in there.
 
You still have fan noise, though. I like having enough heat capacity in the water to set a very slow ramp on the radiator fans; that way you never notice the gradual increase in noise. Boil the frog!

Plus there's still the size. If you're in a full tower it's not a big deal, but I don't believe a 4090 would even fit in my O11 non-XL given where the rad and pump are located - not without being water-cooled. Just not enough space in there.
Yea, I'm in a full tower with 15 or 16 fans, all set on low speed. Honestly, you can't hear the GPU any more than the rest of the case fans, and that's just leaving it stock. The 4090 cooler can be left silent, I know it's hard to believe. We're not even talking about tuning it with an undervolt or custom fan curves; it's just silent at stock lol. That's 400-600W kept out of the loop and it's still silent. Remember, it has a massive cooling apparatus on it; it doesn't need much fan speed.
 
This benchmark looks GPU-limited. In gaming at 4K with a good CPU you're going to be within margin of error most of the time, because it's usually GPU-limited. Most modern CPUs are good enough for gaming, especially at 4K where the GPU matters more. If you drop the resolution and push the clocks past 6GHz like skatterbencher does, you could see up to a 20% improvement, so calling it weak is ridiculous; it's actually the fastest CPU ever lol.
 
This benchmark looks GPU-limited. In gaming at 4K with a good CPU you're going to be within margin of error most of the time, because it's usually GPU-limited. Most modern CPUs are good enough for gaming, especially at 4K where the GPU matters more. If you drop the resolution and push the clocks past 6GHz like skatterbencher does, you could see up to a 20% improvement, so calling it weak is ridiculous; it's actually the fastest CPU ever lol.
All that particular screenshot demonstrates is that for Grand Theft Auto 5 (a 9 year old game) any of the higher end CPUs perform so close as to make no difference. Seriously, nobody is going to give a fuck, or even be able to see, the difference between 180 and 186fps, never mind finding a 4k screen that supports those framerates.

For most games the CPU is just not that big a deal. Unless you are doing shit like trying to run Counter-Strike at insane FPS because you are convinced not enough FPS is why you are hard stuck in Silver 3, the limit is usually the GPU or monitor refresh rate, even when you have an older CPU. I was using an 8700k until just last week, before I finally upgraded to a 13900k. Difference in gaming? None that I've noticed so far. The CPU was just not the issue. The only case where I could see it mattering is Elder Scrolls Online, and then not because the game used all of the 8700k - it didn't come close - but because it is a highly single-thread-limited game and the 13900k has much faster cores. So there I would see an improvement, though I don't play it at the moment.

It really isn't worth fretting over FPS differences with CPUs; that just isn't the issue for games, particularly with chonkers like the 13900 or 7950. They are SO much more powerful than games need these days.
 
All that particular screenshot demonstrates is that for Grand Theft Auto 5 (a 9 year old game) any of the higher end CPUs perform so close as to make no difference. Seriously, nobody is going to give a fuck, or even be able to see, the difference between 180 and 186fps, never mind finding a 4k screen that supports those framerates.

For most games the CPU is just not that big a deal. Unless you are doing shit like trying to run Counter-Strike at insane FPS because you are convinced not enough FPS is why you are hard stuck in Silver 3, the limit is usually the GPU or monitor refresh rate, even when you have an older CPU. I was using an 8700k until just last week, before I finally upgraded to a 13900k. Difference in gaming? None that I've noticed so far. The CPU was just not the issue. The only case where I could see it mattering is Elder Scrolls Online, and then not because the game used all of the 8700k - it didn't come close - but because it is a highly single-thread-limited game and the 13900k has much faster cores. So there I would see an improvement, though I don't play it at the moment.

It really isn't worth fretting over FPS differences with CPUs; that just isn't the issue for games, particularly with chonkers like the 13900 or 7950. They are SO much more powerful than games need these days.
That's great if it makes no difference to you, for me the CPU is the limiting factor in almost every game I play and it has been so for years.

It's not only because I have a top-end GPU, but I like to run my games at a high refresh rate, plus I play a lot of VR (especially simulations) and older games that are very much single-threaded (or at least lightly multithreaded), so the sky is the limit as far as CPU performance goes for me. For example, most flight sims in VR are a struggle to run at 90, let alone 120fps, and you end up playing them at 45-60ish fps 99% of the time because of the CPU. While it's mostly playable like that thanks to clever software tricks, pure 120hz/120fps VR (heck, even just a stable 90fps/90hz) is absolute bliss and incredibly comfortable and immersive, definitely worth chasing.

There is also the impact on 1% and 0.1% low even when doing 4k Ultra+++ RTX flat screen gaming which can be very noticeably different between CPUs, but that's another can of worms.
 
I'm not sure I'd agree. Obviously, more testing and more information is required, but I tested the Core i9 10980XE @4.8GHz and it pulled over 400w and hit temperatures of 108c. Those throttle at 110c, not 100c. The 13900KS is no doubt faster than the 10980XE, but I don't think it's the most power hungry chip ever. Let's not forget those Xeon W3175X chips that were known to pull over 400w regularly.
Just going to leave this here...

 
Monster CPU, but its only purpose is to show how good the 13600k and the 13700k are.
 
That's great if it makes no difference to you, for me the CPU is the limiting factor in almost every game I play and it has been so for years.

It's not only because I have a top-end GPU, but I like to run my games at a high refresh rate, plus I play a lot of VR (especially simulations) and older games that are very much single-threaded (or at least lightly multithreaded), so the sky is the limit as far as CPU performance goes for me. For example, most flight sims in VR are a struggle to run at 90, let alone 120fps, and you end up playing them at 45-60ish fps 99% of the time because of the CPU. While it's mostly playable like that thanks to clever software tricks, pure 120hz/120fps VR (heck, even just a stable 90fps/90hz) is absolute bliss and incredibly comfortable and immersive, definitely worth chasing.

There is also the impact on 1% and 0.1% low even when doing 4k Ultra+++ RTX flat screen gaming which can be very noticeably different between CPUs, but that's another can of worms.
Sadly no one benchmarks or reviews for that. High end VR is such a niche that it just doesn’t quite justify concentrated reviews it seems.

Since I don't do simulations I'm still on an original Rift, and even the Quest 2 doesn't take nearly that much horsepower to feed, it seems.

Then again, outside of simulations VR is kinda dead now - I just play Beat Saber. Haven't found many other games that weren't part of the original release tranche from way, way back when.
 
That's great if it makes no difference to you, for me the CPU is the limiting factor in almost every game I play and it has been so for years.
Ok, then I would suggest looking at performance graphs to see how they do for those games, at the resolutions you play, not for GTA5. I'm not saying it can NEVER make a difference, but I think it makes a difference far less than people think. As an example of a modern game, I didn't notice any difference in Resident Evil Village at 4k120 with a 3090. Usually the limiting factor was the GPU, with the game running in the 80-90fps range. Sometimes the limiting factor was the scan out of the TV, which is limited to 120Hz at least in 10-bit HDR mode.

Don't get me wrong, I'm not hating on new, fast, CPUs. I literally just spent over $2k getting a new CPU, mobo, RAM, etc. I am just saying that when it comes to gaming, it really isn't that big a deal for most people, the issue is the GPU and/or display not the CPU. Goes extra super double for the new CPUs, and hence why you see sites recommending a 13700 or 13600 for gamers because the 13900 just doesn't matter.
 
Ok, then I would suggest looking at performance graphs to see how they do for those games, at the resolutions you play, not for GTA5. I'm not saying it can NEVER make a difference, but I think it makes a difference far less than people think. As an example of a modern game, I didn't notice any difference in Resident Evil Village at 4k120 with a 3090. Usually the limiting factor was the GPU, with the game running in the 80-90fps range. Sometimes the limiting factor was the scan out of the TV, which is limited to 120Hz at least in 10-bit HDR mode.

Don't get me wrong, I'm not hating on new, fast, CPUs. I literally just spent over $2k getting a new CPU, mobo, RAM, etc. I am just saying that when it comes to gaming, it really isn't that big a deal for most people, the issue is the GPU and/or display not the CPU. Goes extra super double for the new CPUs, and hence why you see sites recommending a 13700 or 13600 for gamers because the 13900 just doesn't matter.
QFT. Check reviews for the games you actually play then make CPU decisions based on that. Lots of folks overspend on the CPU when that budget could go to something else.
 
QFT. Check reviews for the games you actually play then make CPU decisions based on that. Lots of folks overspend on the CPU when that budget could go to something else.
Or just admit you want the shiniest toy because you want it and don't have a good reason. That's me. I got a 13900k because I wanted a top end CPU, I can afford it, and, well, that's the only reason. Realistically my 8700k was still working great. Nuendo was the only place you could argue more CPU would be useful, and then only somewhat (I can always just freeze tracks).

But I certainly wouldn't try to argue it, or any other high-end CPU, was worth it based on GTA 5 benchmarks, nor would I worry about the difference between 180 and 186fps, which is the spread seen on the high end. Like if I had a 7950X I'm not going to run out and buy a 13900K because it gets 3 more FPS, which is well within the margin of error anyhow.

For me personally, I'm not even interested unless something is over about a 5% difference, because you can see variation of more than that based on other factors.
 
Or just admit you want the shiniest toy because you want it and don't have a good reason. That's me. I got a 13900k because I wanted a top end CPU, I can afford it, and, well, that's the only reason. Realistically my 8700k was still working great. Nuendo was the only place you could argue more CPU would be useful, and then only somewhat (I can always just freeze tracks).

But I certainly wouldn't try to argue it, or any other high-end CPU, was worth it based on GTA 5 benchmarks, nor would I worry about the difference between 180 and 186fps, which is the spread seen on the high end. Like if I had a 7950X I'm not going to run out and buy a 13900K because it gets 3 more FPS, which is well within the margin of error anyhow.

For me personally, I'm not even interested unless something is over about a 5% difference, because you can see variation of more than that based on other factors.
If you know you’re buying it because it’s shiny, that is perfectly understandable. If you buy it because it’ll make you game faster because the internet talking heads said it does when paired with a gpu that’s not in the budget, that’s being suckered.
 
Or just admit you want the shiniest toy because you want it and don't have a good reason. That's me. I got a 13900k because I wanted a top end CPU, I can afford it, and, well, that's the only reason. Realistically my 8700k was still working great. Nuendo was the only place you could argue more CPU would be useful, and then only somewhat (I can always just freeze tracks).

But I certainly wouldn't try to argue it, or any other high-end CPU, was worth it based on GTA 5 benchmarks, nor would I worry about the difference between 180 and 186fps, which is the spread seen on the high end. Like if I had a 7950X I'm not going to run out and buy a 13900K because it gets 3 more FPS, which is well within the margin of error anyhow.

For me personally, I'm not even interested unless something is over about a 5% difference, because you can see variation of more than that based on other factors.
What games do you play? GTA V Online is very popular, and its performance is thus relevant to many.
 
What games do you play? GTA V Online is very popular, and its performance is thus relevant to many.
Varies a lot, plenty are not high end and thus not relevant. The most recent higher end ones have been RE: Village and Hitman 3.

The thing about that GTA 5 benchmark, though, is that it doesn't matter if the game is popular; it isn't a good benchmark to get mad at the KS over, because:

1) The game is old; it doesn't do a good job of making use of a modern CPU. It is fine to say "If you play this game, here's how it performs," but I'm not going to base how good, or bad, a CPU is on it, because it was coded for i7-2600k-era CPUs, based on when it was released.

2) The benchmark is running at 4k, which loads up the GPU quite a bit. While that is fine for a realistic play test, if the idea is to expose CPU performance then that isn't the right answer.

3) This is the big one: all the results are WAY ABOVE what matters. Most 4k displays cap out at 120Hz, a few at 144Hz. There is only one I know of, the Neo G8, that gets up to where any of this might make a difference. Even the 12600k's 95th percentile is above the scanout of anything but the G8, so it just doesn't matter.


It is very clearly a game that pretty much any new CPU can play as fast as your GPU or monitor will handle in basically all cases, and thus not a good reason to buy one. So no, the 13900KS isn't impressive for it... none of the high-end CPUs are, particularly. The game is clearly capping out on other things. If it is what you play, a slower CPU will do you perfectly well.

If you want to test actual CPU performance, you need something where they aren't constrained by other things, so generally gaming is not the best metric for the high end. For gaming tests, I think the question is not if there are tiny differences, but at what level does a CPU become enough that it is no longer the problem for a given game? If even a high-end GPU is constraining the FPS under realistic settings, or if it is just pegging out the refresh rates of modern displays, then call it good.
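To put that last point in numbers, here's a quick frame-time sketch using only the figures already discussed in this thread (nothing measured):

```python
# Frame times for the refresh rates and FPS figures discussed above.
for fps in (120, 144, 180, 186):
    print(f"{fps:3d} fps -> {1000 / fps:5.2f} ms per frame")
# 180 vs 186 fps is a difference of about 0.18 ms per frame, and both are
# already faster than a 120 Hz (8.33 ms) or 144 Hz (6.94 ms) panel can scan out.
```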
 
Varies a lot, plenty are not high end and thus not relevant. The most recent higher end ones have been RE: Village and Hitman 3.

The thing about that GTA 5 benchmark, though, is that it doesn't matter if the game is popular; it isn't a good benchmark to get mad at the KS over, because:

1) The game is old; it doesn't do a good job of making use of a modern CPU. It is fine to say "If you play this game, here's how it performs," but I'm not going to base how good, or bad, a CPU is on it, because it was coded for i7-2600k-era CPUs, based on when it was released.

2) The benchmark is running at 4k, which loads up the GPU quite a bit. While that is fine for a realistic play test, if the idea is to expose CPU performance then that isn't the right answer.

3) This is the big one: all the results are WAY ABOVE what matters. Most 4k displays cap out at 120Hz, a few at 144Hz. There is only one I know of, the Neo G8, that gets up to where any of this might make a difference. Even the 12600k's 95th percentile is above the scanout of anything but the G8, so it just doesn't matter.


It is very clearly a game that pretty much any new CPU can play as fast as your GPU or monitor will handle in basically all cases, and thus not a good reason to buy one. So no, the 13900KS isn't impressive for it... none of the high-end CPUs are, particularly. The game is clearly capping out on other things. If it is what you play, a slower CPU will do you perfectly well.

If you want to test actual CPU performance, you need something where they aren't constrained by other things, so generally gaming is not the best metric for the high end. For gaming tests, I think the question is not if there are tiny differences, but at what level does a CPU become enough that it is no longer the problem for a given game? If even a high-end GPU is constraining the FPS under realistic settings, or if it is just pegging out the refresh rates of modern displays, then call it good.
Ya, good explanation. I mainly got the 13900KS because I wanted to give my 4090 the best platform to run on. There have been many tests where the limiting factor for the 4090 has been CPU performance. So it only made sense for me to get the "best" or "fastest" CPU you can get at the moment.
Also, looking at the stock benchmarks for a 13900KS isn't looking at the whole picture. The 13900KS can be overclocked further than any other CPU, up to and beyond 6GHz. That is where its single-threaded performance is better than any other CPU's, giving it the edge in comparisons. The edge is small, but it is the fastest. So the fastest CPU mated to the fastest GPU, and if you can get it, the fastest RAM, gives you the best platform.
I didn't do it for bragging rights, I just don't wanna deal with upgrading every cycle. I wanna chill for a few years, relax, and enjoy my system while pumping all my games up to max at 4k144.
 
And here he takes it up to a much more reasonable 6.3 GHz for everyday driving.

A 21% increase over stock is fantastic; that's massive.


 
Varies a lot, plenty are not high end and thus not relevant. The most recent higher end ones have been RE: Village and Hitman 3.

The thing about that GTA 5 benchmark, though, is that it doesn't matter if the game is popular; it isn't a good benchmark to get mad at the KS over, because:

1) The game is old; it doesn't do a good job of making use of a modern CPU. It is fine to say "If you play this game, here's how it performs," but I'm not going to base how good, or bad, a CPU is on it, because it was coded for i7-2600k-era CPUs, based on when it was released.

2) The benchmark is running at 4k, which loads up the GPU quite a bit. While that is fine for a realistic play test, if the idea is to expose CPU performance then that isn't the right answer.

3) This is the big one: all the results are WAY ABOVE what matters. Most 4k displays cap out at 120Hz, a few at 144Hz. There is only one I know of, the Neo G8, that gets up to where any of this might make a difference. Even the 12600k's 95th percentile is above the scanout of anything but the G8, so it just doesn't matter.


It is very clearly a game that pretty much any new CPU can play as fast as your GPU or monitor will handle in basically all cases, and thus not a good reason to buy one. So no, the 13900KS isn't impressive for it... none of the high-end CPUs are, particularly. The game is clearly capping out on other things. If it is what you play, a slower CPU will do you perfectly well.

If you want to test actual CPU performance, you need something where they aren't constrained by other things, so generally gaming is not the best metric for the high end. For gaming tests, I think the question is not if there are tiny differences, but at what level does a CPU become enough that it is no longer the problem for a given game? If even a high-end GPU is constraining the FPS under realistic settings, or if it is just pegging out the refresh rates of modern displays, then call it good.
I've got a 4K 144Hz display from LG that's over two years old. The 4090 can't even come close to a sustained 120 fps average in GTA Online, let alone a sustained 144. For me it's a big problem.
 
I've got a 4K 144Hz display from LG that's over two years old. The 4090 can't even come close to a sustained 120 fps average in GTA Online, let alone a sustained 144. For me it's a big problem.
Ok well then get any modern CPU, according to that chart. Even the 12600k was beyond that on their test...

...or wait, maybe the problem is that they turned down the detail, and when you turn that up the 4090 can't sustain 120fps? In that case, we are back to the "You don't need a super high-end CPU for good gaming" argument. If the GPU is the limiting factor, then a higher-end CPU doesn't get you more FPS.

As Raytracing keeps increasing I think we'll just see that more and more. Control isn't new but has beautiful graphics and great RT... which just slams GPUs. It is another one where the CPU was just not the issue, the GPU was always pegged at 100%, except for the rare case where you were in an area without much on screen and it was able to peg the refresh rate for a brief bit.

Basically, my issue is that you are shitting on the 13900KS as being "weak" because of a GTA 5 benchmark, when it sounds like the CPU is not the limiting factor. That seems to generally be the thing with games: they are usually GPU- or scanout-limited, not CPU-limited.
 
I'm not sure I'd agree. Obviously, more testing and more information is required, but I tested the Core i9 10980XE @4.8GHz and it pulled over 400w and hit temperatures of 108c. Those throttle at 110c, not 100c. The 13900KS is no doubt faster than the 10980XE, but I don't think it's the most power hungry chip ever. Let's not forget those Xeon W3175X chips that were known to pull over 400w regularly.
I was about to buy one of those chips (W3175X) until I saw the Gamers Nexus review.
 
Ya, good explanation. I mainly got the 13900KS because I wanted to give my 4090 the best platform to run on. There have been many tests where the limiting factor for the 4090 has been CPU performance. So it only made sense for me to get the "best" or "fastest" CPU you can get at the moment.
Also, looking at the stock benchmarks for a 13900KS isn't looking at the whole picture. The 13900KS can be overclocked further than any other CPU, up to and beyond 6GHz. That is where its single-threaded performance is better than any other CPU's, giving it the edge in comparisons. The edge is small, but it is the fastest. So the fastest CPU mated to the fastest GPU, and if you can get it, the fastest RAM, gives you the best platform.
I didn't do it for bragging rights, I just don't wanna deal with upgrading every cycle. I wanna chill for a few years, relax, and enjoy my system while pumping all my games up to max at 4k144.
It's a fair reason to do it. There's nothing wrong with wanting "the best" if you can afford it. I did, or I suppose second best since I got a K and could have waited a few more days for a KS. Either way nothing wrong with saying "I want top of the line," if you are in the fortunate position to afford it.
 
Ok well then get any modern CPU, according to that chart. Even the 12600k was beyond that on their test...

...or wait, maybe the problem is that they turned down the detail, and when you turn that up the 4090 can't sustain 120fps? In that case, we are back to the "You don't need a super high-end CPU for good gaming" argument. If the GPU is the limiting factor, then a higher-end CPU doesn't get you more FPS.

As Raytracing keeps increasing I think we'll just see that more and more. Control isn't new but has beautiful graphics and great RT... which just slams GPUs. It is another one where the CPU was just not the issue, the GPU was always pegged at 100%, except for the rare case where you were in an area without much on screen and it was able to peg the refresh rate for a brief bit.

Basically, my issue is that you are shitting on the 13900KS as being "weak" because of a GTA 5 benchmark, when it sounds like the CPU is not the limiting factor. That seems to generally be the thing with games: they are usually GPU- or scanout-limited, not CPU-limited.
I prefer to run at max settings and it can't even come close to a solid 120. It dips quite a bit even with a high-end CPU. I've seen it drop to 45fps with the rig in my sig.
 
I prefer to run at max settings and it can't even come close to a solid 120. It dips quite a bit even with a high-end CPU. I've seen it drop to 45fps with the rig in my sig.
Because of the CPU?
 
Because of the CPU?
No, the CPU is great; there's just a few percent difference between it and the X3D.

There isn't a CPU out there that can make up the difference, or even come close to what's required for 144+ fps at 4K in GTA at maximum settings.
 
And here he takes it up to a much more reasonable 6.3 GHz for everyday driving.

A 21% increase over stock is fantastic; that's massive.



Don't you have a 13900KS? With a shitload of rad space? Why don't you OC it and share the results?
I'd rather see a [H] member OC this and see the outcome than watch some random YouTuber I've never heard of do it.
 
No, the CPU is great; there's just a few percent difference between it and the X3D.

There isn't a CPU out there that can make up the difference, or even come close to what's required for 144+ fps at 4K in GTA at maximum settings.
...then why are you using it as a benchmark to call the KS "weak"? If it isn't the CPU that is limiting GTA, which is completely unsurprising, then it just again goes to the argument I have been making: The CPU is just not that big a deal for gaming, and using games as a benchmark for how good a CPU is doesn't make sense.
 