Intel Core i9-9900KS Review: The Fastest Gaming CPU Bar None

I am not writing games. I agree that for UI responsiveness in a game you don't want 100% CPU usage on all cores.

That... really depends on how well the code is written.

But if it's not parallelizable, like so much game code isn't, you're going to want faster cores, as more cores won't help ;).
 
The shared resource is the RAM, which is split between system and graphics. That means the 8GB is not dedicated to just system or just graphics but split between the two, and that handicaps the performance of the system. Consoles are not as future-proof as you think. They have a shelf life and then can't be used anymore; the current-generation consoles will get maybe 6 years max, and the previous generation got 7 years. My last PC lasted me 8 years and was playing new games every year. The biggest difference is that you can swap parts out and update your PC without having to buy a whole new system and all new games. That's where the PC has the advantage. Consoles only have a price advantage, and that gap is closing with each new generation.
I'm not talking about anything having to do with heat issues with consoles, because let's face it, that's exactly why consoles are inferior by design... they are purposely limited.
You do realize every single CPU today is governed by form factor and heat, right? Feel free to put any CPU in that form factor and see what happens.

Also shared GDDR5 isn't the same as shared DDR4.
 
If you have enough cores, clockspeed matters. If you don't have enough cores, both matter.

Put another way, once you have enough cores, adding more cores does nothing, but adding clockspeed (and / or IPC) does.

Currently, that number of cores for gaming is six. That's why AMD's 1600 / 2600 / 3600 have been heralded as such good buys for gamers.


Also, your "consoles have eight cores" argument has already been shown to be bunk above. Jaguar cores are not comparable to desktop x86 cores.
Jaguar cores are desktop cores; the only differences are the memory controller and the number of cores offered for desktop (4 vs. 8).

My point isn't that one is better than the other. My point is that all of it matters.
 
You do realize every single CPU today is governed by form factor and heat, right? Feel free to put any CPU in that form factor and see what happens.

Also shared GDDR5 isn't the same as shared DDR4.
You are making my point.....Thank you.
 
Intel finding new and "exciting" ways to re-release the same chips and get more reviews to attract clicks. It's all just mildly refined 2017 dies, be it this or the HEDT refresh.

But reviewers need the clicks too, it's not like they're gonna say "meh it's a clock bumped 9900K, not worth a review".


All CPUs are generally that way: keep refining and release as new :)
 
Jaguar cores are desktop cores; the only differences are the memory controller and the number of cores offered for desktop (4 vs. 8).

Nope.

My point isn't that one is better than the other. My point is that all of it matters.

One is better than the other, so much so that there is a tremendous gap in performance potential on the desktop side.

Only if you completely ignore the fact that all PCs are governed by TDP.

They're not.
 
Jaguar Core

"used in AMD's Semi-Custom Business Unit as a design for custom processors and is used by AMD in four product families: Kabini aimed at notebooks and mini PCs, Temash aimed at tablets, Kyoto aimed at micro-servers, and the G-Series aimed at embedded applications. Both the PlayStation 4 and the Xbox One use chips based on the Jaguar microarchitecture, with more powerful GPUs than AMD sells in its own commercially available Jaguar APUs"

So a beefed up GPU on a notebook and appliance APU core.
 
Tell that to my Athlon 5350-based system, which is a desktop computer.

I can run Windows on an ARM tablet and set it on a desk, making it a 'desktop computer' too. Don't be too harsh on your Athlon 5350.

So a beefed up GPU on a notebook and appliance APU core.

Yeah, low-power CPU and high (for a console) power GPU, combined into an APU.
 
I can run Windows on an ARM tablet and set it on a desk, making it a 'desktop computer' too. Don't be too harsh on your Athlon 5350.



Yeah, low-power CPU and high (for a console) power GPU, combined into an APU.

Dude, the Athlon is a desktop x86 CPU and thems the facts, jack. (Truth hurts.)
 
Dude, the Athlon is a desktop x86 CPU and thems the facts, jack. (Truth hurts.)

The Athlon is whatever AMD says it is. They have super low-power versions for embedded use too, that you would really never want to run desktop Windows on. It's just a trademark.
 
The Jaguar cores were designed for APUs, not for full desktops.

It is a compact 3.1 mm² core that targets 2-25W devices, in particular tablets, microservers, and consoles.
https://www.realworldtech.com/jaguar/

They target consoles and lower-powered devices, not desktop computers.

And yet, it is in my Desktop Computer, go figure. Hmmmm..........

No longer available and yet, listed under the desktop processors: https://www.newegg.com/amd-athlon-5350/p/N82E16819113364
 
Running at 1.8Ghz...
Has that been confirmed? So, five-plus years of development and a new process, and you honestly think they'll do 1.8GHz like Jaguar? They're about double the IPC, and the 3700X does 3.6GHz base and 4.4GHz max boost in 65W... The PS4 Pro can draw up to 310W, leaving plenty for the system and GPU with a 65W CPU TDP...
It's not an inefficient 9999KFXD blast furnace you might be used to.
 
So, five-plus years of development and a new process, and you honestly think they'll do 1.8GHz like Jaguar?

AMD had >4.0GHz CPUs then too. Jaguar could have been designed (or tweaked) to run at higher core speeds (AMD was and is certainly capable of delivering that), but that's not what Sony and Microsoft ordered.

Expect them to lean more toward efficiency, especially as CPU requirements haven't risen nearly as much as GPU requirements have, and both parts together have to fit within the TDP limits set by AMD's customers.
 
Expect them to lean more toward efficiency, especially as CPU requirements haven't risen nearly as much as GPU requirements have, and both parts together have to fit within the TDP limits set by AMD's customers.

Exactly. The ~2.5GHz 8-core/16-thread Zen 2 CPU, programmed for dedicated hardware, will be extremely power efficient and an incredible leap in processing power over last gen, leaving the majority of the juice for the GPU.
And look how far last gen pushed 8 lowly cores. Current consoles are no slouch, which is obvious to anyone who plays the games rather than pigeonholing them into "appliance" space. :ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO:
 
Quit your whining and get a Motorola 68030 running under a real OS like AmigaOS 3.1. ;)
 
Exactly. The ~2.5GHz 8-core/16-thread Zen 2 CPU, programmed for dedicated hardware, will be extremely power efficient and an incredible leap in processing power over last gen, leaving the majority of the juice for the GPU.
And look how far last gen pushed 8 lowly cores. Current consoles are no slouch, which is obvious to anyone who plays the games rather than pigeonholing them into "appliance" space. :ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO:

Current consoles are impressive for what they are, but I can still see dips into what looks like around 30Hz. This gen Ryzen should fix that for the most part though, even at ~2.5Ghz.

I know VRR TVs are coming out so if that all links together that would be an even better improvement.

For desktop I’d personally go the KS over 3900x. I can’t even saturate the 8 cores of my 9900KF. The extra four cores are useless to me. Even Adobe Premiere Elements only uses 8 threads (so I’d imagine it’s mostly saturating the CPU). It’s kind of strange logic but for me, the KS is faster at everything for the same price. I’d really have to compare it to the 3800x which has nothing on an OC’d KS except a minimal price difference (in the scheme of a total system cost).

The 3900x/3950x is basically the same as Tensor cores on the RTX cards. It’s AMD trying to leverage their (massive) advantage with chiplets in the HEDT/enterprise space into the consumer realm even though it makes little sense/adds little value for the consumer.
 
And yet, it is in my Desktop Computer, go figure. Hmmmm..........

No longer available and yet, listed under the desktop processors: https://www.newegg.com/amd-athlon-5350/p/N82E16819113364
"The AMD Athlon 5350 Kabini is a budget-friendly, low-power System on Chip (SoC) processor that incorporates CPU, GPU and I/O controller in a single package." - Directly from the Newegg description.

Low-budget CPU and GPU... you're trying to make an argument, but I just don't know what it is. I get that it's in your desktop, but I know for a fact that your desktop is not blazing up the benchmarks... which is the whole point of this topic... it's not made to run a desktop at high performance.
People are arguing that the old Jaguar cores are going to blow away specific desktop CPUs; it's just not going to happen.
 
"The AMD Athlon 5350 Kabini is a budget-friendly, low-power System on Chip (SoC) processor that incorporates CPU, GPU and I/O controller in a single package." - Directly from the Newegg description.

Low-budget CPU and GPU... you're trying to make an argument, but I just don't know what it is. I get that it's in your desktop, but I know for a fact that your desktop is not blazing up the benchmarks... which is the whole point of this topic... it's not made to run a desktop at high performance.
People are arguing that the old Jaguar cores are going to blow away specific desktop CPUs; it's just not going to happen.

All I said is it is a desktop x86 processor, despite the claims of others.
 
Running at 1.8Ghz...

Has that been confirmed? So, five-plus years of development and a new process, and you honestly think they'll do 1.8GHz like Jaguar?

An estimate, but they will be closer to 1.8GHz than they will be to desktop speeds of 4GHz or higher. 2.5GHz would be a good guess.

They're about double IPC ....

Exactly. The ~2.5GHz 8-core/16-thread Zen 2 CPU, programmed for dedicated hardware, will be extremely power efficient and an incredible leap in processing power over last gen, leaving the majority of the juice for the GPU.
And look how far last gen pushed 8 lowly cores. Current consoles are no slouch, which is obvious to anyone who plays the games rather than pigeonholing them into "appliance" space. :ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO:

I do expect them to outperform the previous-gen (right now's current-gen) consoles. I don't think anyone has suggested otherwise, just that expecting 'PC' levels of performance is probably overstating things.
Game devs will make it work as they always have: working with known quantities helps with coding efficient game engines, assets can be fit into the texture budget, etc.
 
"The AMD Athlon 5350 Kabini is a budget-friendly, low-power System on Chip (SoC) processor that incorporates CPU, GPU and I/O controller in a single package." - Directly from the Newegg description.

Low-budget CPU and GPU... you're trying to make an argument, but I just don't know what it is. I get that it's in your desktop, but I know for a fact that your desktop is not blazing up the benchmarks... which is the whole point of this topic... it's not made to run a desktop at high performance.
People are arguing that the old Jaguar cores are going to blow away specific desktop CPUs; it's just not going to happen.

Neither is an i3. But it's in desktops... lots of them. So are i5s. Are they blazing up the charts? No. Why? Because they aren't for that market.

No one here has said the Jaguar chip is the fastest thing since sliced bread. What it is, though, in those consoles is an example of how more cores can offset lower clock speeds when TDP is crucial. When it launched it was faster than what Intel had in the same TDP range.

Consoles were multi-cored years before desktops were. The PS2 is probably still the most multi-cored system ever produced.

The only reason we are even talking about Jaguar in the first place is because people have somehow gone back to the clockspeed-is-king mantra. It's not; it's one factor of many. As of right now, the 9900KS is pushed so far to the limits of the process that some testing has revealed performance regressions. That's a problem and not a good thing.
 
The 8K claim is completely false; they don't even have video cards that can properly do that now... and the PS5 is based on current hardware. We just started getting cards that can properly do 4K.
It's not based on current hardware. The video specs are RDNA2. There's not a single card out with that architecture upgrade.

Having said that I too doubt the 8K bullet point.
 
The only reason we are even talking about Jaguar in the first place is because people have somehow gone back to the clockspeed-is-king mantra. It's not.

At any particular number of cores on a specific architecture, clockspeed increases are the only increases you can get.

And when talking about embedded cores that can only run so fast, more cores are the only way to do it. But outside of that? Jaguar cores were the slowest of their kind upon release.

They were used because they were fast enough, and AMD was so poor that they were willing to accept pennies for the design. They were also the only company that had both CPU and GPU IP that could be easily combined at that moment, and with every other product failing, having just enough performance (they didn't lead in any category at the time) and being dirt cheap saved them.
 
Current consoles are impressive for what they are, but I can still see dips into what looks like around 30Hz. This gen Ryzen should fix that for the most part though, even at ~2.5Ghz.

I know VRR TVs are coming out so if that all links together that would be an even better improvement.

For desktop I’d personally go the KS over 3900x. I can’t even saturate the 8 cores of my 9900KF. The extra four cores are useless to me. Even Adobe Premiere Elements only uses 8 threads (so I’d imagine it’s mostly saturating the CPU). It’s kind of strange logic but for me, the KS is faster at everything for the same price. I’d really have to compare it to the 3800x which has nothing on an OC’d KS except a minimal price difference (in the scheme of a total system cost).

The 3900x/3950x is basically the same as Tensor cores on the RTX cards. It’s AMD trying to leverage their (massive) advantage with chiplets in the HEDT/enterprise space into the consumer realm even though it makes little sense/adds little value for the consumer.

Newer APIs are getting better at making full use of all cores. This is a screenshot of Doom 2016 running the Vulkan renderer; that's not jumping cores, that's using all available cores. Yes, they're not being used 100%, but the fact that they're being used at all reduces the ever-increasing need for higher clock speeds, knowing only too well that we're hitting the limits of silicon technology.

[Screenshot: Doom 2016 (Vulkan renderer) with CPU load spread across all cores]
 
Newer APIs are getting better at making full use of all cores. This is a screenshot of Doom 2016 running the Vulkan renderer; that's not jumping cores, that's using all available cores. Yes, they're not being used 100%, but the fact that they're being used at all reduces the ever-increasing need for higher clock speeds, knowing only too well that we're hitting the limits of silicon technology.


If I am not mistaken, one core is still pegged?
 
Feelings? You are projecting, aren't you? I stated objective fact, you stated subjective opinion and feelings........

You're still crapping on an Intel thread because someone pointed out that AMD used the Athlon branding for a CPU that competes with the Raspberry Pi :ROFLMAO:

Newer APIs are getting better at making full use of all cores.

The API for graphics is one part of game logic, and it's also a dependent part of the game logic as a whole. As Dayaks mentions above, one core is pegged.

Now, I'm not arguing against games spreading the workload out, I'm just pointing out that games are an example prevalent in the desktop enthusiast world that are dependent on a single thread and therefore still benefit from increased single-core performance.
 