Intel Core i9-9900KS Review: The Fastest Gaming CPU Bar None

There are proof-of-concept tools for Meltdown that would be interesting to see run on this CPU, which has flags set telling the OS it has hardware mitigations, to see if they're actually effective. So far I've only seen testing that takes Intel at their word... but I'm not sure who would trust Intel at all.

C'mon Phoronix and AnandTech... run https://github.com/IAIK/meltdown and the Spectre tools...
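For anyone who wants to start checking rather than taking Intel at their word, here's a minimal sketch (assuming a Linux box with a reasonably recent kernel) that dumps what the kernel believes about the CPU's mitigation status. It only reports the kernel's view, which is derived from those same CPU flags; actually proving the mitigations hold still means running the PoC tools above against the machine.

```python
# Minimal sketch: print the Linux kernel's reported mitigation status
# for each known CPU vulnerability. This reflects what the kernel
# believes (based on CPU flags like the ones the 9900KS advertises),
# not an actual attack test -- the IAIK PoC tools cover that part.
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(VULN_DIR.iterdir()):
    # Each file (meltdown, spectre_v1, spectre_v2, mds, ...) holds a
    # one-line status such as "Not affected" or "Mitigation: PTI".
    print(f"{entry.name}: {entry.read_text().strip()}")
```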
 
And I'll give you the benefit of the doubt that you weren't the one that tattled to the mods about the funny comment I made about your name. ;)

weak sauce bro. :[H]4life:

yeah, ok... #(s)oft ocp

OK, I bit, since you're attacking me and I have no idea why. I scrolled up; are you confusing me with your post where you changed Dookey's name to Donkey? I couldn't find any jokes about me.
 
One could argue that the Intel top dog is also outside of the "typical" consumer area and into the pro / hobbyist area instead.

Outside of that area, just about anything works.

Given the nature of the bleeding-edge segment, and that mostly advanced users will buy those higher core counts, it's safe to assume a high percentage of those users will use it outside of gaming... don't you think? Like VMs, encoding, even streaming...

Streaming is something that you run on the host CPU only if you have to. Otherwise, if you're using high core counts -- for which development VMs are unnecessary -- you're probably doing some level of commercial work, and either you can't afford a real workstation or you're doing it for personal education.

Having more cores than you need is nice, but not when sacrificing single-core performance.

Now, if you actually need more cores...


Gotta ask, are you paid per post by the blue team?

You're making an accusation here that you cannot back up. Perhaps read what I have actually recommended versus discussions of the merits of niche products.
 
You're making an accusation here that you cannot back up. Perhaps read what I have actually recommended versus discussions of the merits of niche products.

Didn't mean to attack you; I saw your other posts too ;) I'm just a little tired of reading that same sentence over and over again, but yeah, you do recommend AMD in some other cases, point given. Please don't reply with it again lol, everyone has their own personal use and they can decide what works better for them. I chose the 2700X because at the time it was the most useful for the tasks I had, and I didn't need nor want the TR.

I use VMs instead of having multiple dedicated computers, I find it easier to maintain and easier on the power bills and that's for hobby at home.
 
I use VMs instead of having multiple dedicated computers, I find it easier to maintain and easier on the power bills and that's for hobby at home.

Same; well, I use at least two systems, but one can be extremely low-powered as it'll be running Pi-hole. The other does everything else, starting with being a NAS.
 
OK, I bit, since you're attacking me and I have no idea why. I scrolled up; are you confusing me with your post where you changed Dookey's name to Donkey? I couldn't find any jokes about me.

Looks like the original comment got edited or deleted. I believe it was something to the effect of him telling you to eat your name after taking the a's out of it.
 
Looks like the original comment got edited or deleted. I believe it was something to the effect of him telling you to eat your name after taking the a's out of it.

I was curious what someone could say about an alias like Dayaks. He's creative, I'll give him that. Heh...
 
Outside of that area, just about anything works.



Streaming is something that you run on the host CPU only if you have to. Otherwise, if you're using high core counts -- for which development VMs are unnecessary -- you're probably doing some level of commercial work, and either you can't afford a real workstation or you're doing it for personal education.

Having more cores than you need is nice, but not when sacrificing single-core performance.

Now, if you actually need more cores...

If the IPC difference was a blowout, sure. But it's not, which was my point. If the IPC difference was as huge as it used to be, an Intel anything at 5.1+ should blow the doors off in all benchmarks, especially anything lightly threaded, but that's not the case.

Right now most people reference Windows benchmarks; in Linux it's even worse. Intel is up what, 500 MHz or more? It should win in everything minus serious workstation or server workloads. But that's not the case.

Factor in the wattage too and it starts knocking on the door of Bulldozer. Granted, it's not that bad. But if Intel is pushing the clocks and wattage up to this level, it better deliver a K.O., and this SKU does none of that.
 
Factor in the wattage too and it starts knocking on the door of Bulldozer.

It really doesn't -- Bulldozer, like Netburst, was a regression in IPC versus the product it was meant to replace. At least clockspeeds went up with Netburst and it was still competitive until AMD put their memory controller on-die.

If the IPC difference was a blowout, sure. But it's not, which was my point. If the IPC difference was as huge as it used to be, an Intel anything at 5.1+ should blow the doors off in all benchmarks, especially anything lightly threaded, but that's not the case.

The point is that there's an edge there, and this new part operates at that edge. It's a refinement of an Intel part that's already at that edge, with better clockspeeds and lower power usage (and hardware mitigations in place).

But if Intel is pushing the clocks and wattage up to this level, it better deliver a K.O., and this SKU does none of that.

This is your opinion.
 
Intel's skimming off the best they have on a manufacturing process that's mature and running at its peak potential.

AMD is matching them on a process that's the opposite, without the money to push low-yield but cream-of-the-crop silicon just to brag.

I doubt AMD is worried about any of these latest offerings from Intel. But then, I doubt Intel is really worried about AMD in general, as it would take years of whittling away market share and funds for AMD to basically flip positions with them. For now, there's room for both to be happy.

It's just disappointing that Intel, continually over the past decade-plus, has been allowing AMD to innovate... while sitting on vastly more cash... while it just overcharges its customers and abuses its market position. Intel should be the one bringing us exciting new things, since it has so much more ability to invest in R&D... but instead it repeatedly looks like they're letting complacency, laziness, and basically greed drive them forward.

When was the last time Intel had a new (not already done by AMD) CPU innovation that was exciting to anyone besides the NSA and hackers (and that was actually sold in a sub-$600 CPU)?
 
It's just disappointing that Intel, continually over the past decade-plus, has been allowing AMD to innovate... while sitting on vastly more cash... while it just overcharges its customers and abuses its market position. Intel should be the one bringing us exciting new things, since it has so much more ability to invest in R&D... but instead it repeatedly looks like they're letting complacency, laziness, and basically greed drive them forward.

You do realize that they're over four years behind on their own roadmaps? That they're losing market share right now due to their own mistakes?

When was the last time Intel had a new (not already done by AMD) CPU innovation that was exciting to anyone besides the NSA and hackers (and that was actually sold in a sub-$600 CPU)?

Pretty much every major architectural release. AMD's only innovation has been to put all their eggs into the TSMC basket with an architecture that is easy to produce and sell with more cores than most consumers need.
 
It really doesn't -- Bulldozer, like Netburst, was a regression in IPC versus the product it was meant to replace. At least clockspeeds went up with Netburst and it was still competitive until AMD put their memory controller on-die.
I'm not talking about a regression in IPC. I'm talking about taking the architecture to its limits, to the point where wattage goes through the roof. It sucks down more power than a 3900X.

The point is that there's an edge there, and this new part operates at that edge. It's a refinement of an Intel part that's already at that edge, with better clockspeeds and lower power usage (and hardware mitigations in place).
Lower power usage? Do you have a link to a review showing this?

This is your opinion.
Isn't being a human being great?!? You have an opinion and I have an opinion. We're twinsies!
 
By how much? And I am not talking about now (even though performance has improved and is still improving since day 1 reviews), but about next year, when new consoles drop and every Tom, D!ck, and Hairy have an 8c/16t Zen 2 in their living room?!
You do realize that that CPU put inside a console is going to be severely handicapped? It's not a PC; there isn't a discrete video card and RAM. It's all shared resources. That's like buying a 64-core Xeon processor and putting it in a desktop computer... for web surfing.
 
It really doesn't -- Bulldozer, like Netburst, was a regression in IPC versus the product it was meant to replace. At least clockspeeds went up with Netburst and it was still competitive until AMD put their memory controller on-die.

Dude, the Athlon XP was trading blows with the P4 at a lower clock speed, and then the Athlon 64 straight blew them out of the water. AMD FX did have lower IPC but ran at much HIGHER CLOCK SPEEDS and with more threads, therefore getting better performance than the product it replaced. Just so you know.
 
...and slower.

Just thought you all would find this interesting: looks like after the new AGESA update, the 3800X is basically matching the performance of the 9900K, but at a lower clock speed.

It won't let me post the link right, so I'm gonna put a space in it:

www.reddit.com /r/Amd/comments/dpf037/3800x_1004_beta_bios/
 
You do realize that that CPU put inside a console is going to be severely handicapped? It's not a PC; there isn't a discrete video card and RAM. It's all shared resources.


You sure? I don't think they've released the rest of the specs yet. Every generation it's getting closer and closer to being just an actual PC, and if they're saying it's going to be running games at possibly 8K, then there can't be but so much gimping they're going to be able to do.
 
You sure? I don't think they've released the rest of the specs yet. Every generation it's getting closer and closer to being just an actual PC, and if they're saying it's going to be running games at possibly 8K, then there can't be but so much gimping they're going to be able to do.
The 8K claim is completely false; they don't even have video cards that can properly do that now... and the PS5 is based on current hardware. We just started getting cards that can properly do 4K.
 
Just thought you all would find this interesting: looks like after the new AGESA update, the 3800X is basically matching the performance of the 9900K, but at a lower clock speed.

It won't let me post the link right, so I'm gonna put a space in it:

www.reddit.com /r/Amd/comments/dpf037/3800x_1004_beta_bios/

I remember when Intel was worth buying.

Back in 2016.

Also when Skylake 1 was released; now we have Skylake 4: The Lakening.
 
Dude, the Athlon XP was trading blows with the P4 at a lower clock speed, and then the Athlon 64 straight blew them out of the water. AMD FX did have lower IPC but ran at much HIGHER CLOCK SPEEDS and with more threads, therefore getting better performance than the product it replaced. Just so you know.

Quite aware, thanks -- the Athlon 64 was when AMD put the memory controller on-die.

Just thought you all would find this interesting: looks like after the new AGESA update, the 3800X is basically matching the performance of the 9900K, but at a lower clock speed.

It won't let me post the link right, so I'm gonna put a space in it:

www.reddit.com /r/Amd/comments/dpf037/3800x_1004_beta_bios/

I'm on [H] specifically because I have no intention of wading into the cesspool that is r/amd, thanks.
 
Oh, OK. I'll go back to my eight-year-old 4.5GHz 6c/12t 3930K now.

At this point, the bigger appeal to me for upgrading is the associated hardware that comes with the chipset upgrade: faster PCIe, NVMe (or faster/multiple NVMe), more USB 3+. I honestly haven't felt any CPU upgrade in the last several years. The other stuff is a noticeable improvement though.
 
At this point, the bigger appeal to me for upgrading is the associated hardware that comes with the chipset upgrade: faster PCIe, NVMe (or faster/multiple NVMe),
PCIe 3.0? Yeah, got it. Intel doesn't even have 4.0. A vid card won't come close to saturating 3.0.
NVMe? OK, let's boot a second faster than a SATA3 SSD, or I'll just wake my computer up. There's really nothing here.
more USB 3+.
Moar. USB isn't a big deal. I'd just buy a hub.
I honestly haven't felt any CPU upgrade in the last several years.
That's what I'm talking about.
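For what it's worth, the "won't come close to saturating 3.0" claim is easy to sanity-check with napkin math. A quick sketch of per-direction x16 bandwidth (8 GT/s per lane for PCIe 3.0, 16 GT/s for 4.0, both with 128b/130b line encoding):

```python
# Back-of-the-envelope PCIe x16 bandwidth, per direction.
# PCIe 3.0 signals at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both
# using 128b/130b encoding (128 payload bits per 130 bits on the wire).
def pcie_x16_gb_per_s(gt_per_s: float, lanes: int = 16) -> float:
    encoding = 128 / 130            # usable fraction after line encoding
    bits_per_s = gt_per_s * 1e9 * encoding * lanes
    return bits_per_s / 8 / 1e9     # bits -> gigabytes

print(f"PCIe 3.0 x16: ~{pcie_x16_gb_per_s(8):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: ~{pcie_x16_gb_per_s(16):.1f} GB/s")  # ~31.5 GB/s
```

Roughly 15.8 GB/s each way on a 3.0 x16 link, which current GPUs rarely come close to sustaining.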
 
Well, I guess if you're "pretty sure," that is good enough for me. ;) Suck it. If I am taking the time to OC to 5.3GHz, don't you think I am smart enough to kill off a bunch of Hz-sucking programs while I am gaming? I will put my 8600K up against that 9900KS any day.

Only 5.3 Ghz! ...seems a bit flaccid, not to mention very out of character to me. I encourage you to aim for at least 5.5, preferably on air, which would be more in keeping with the spirit of the [H] :D


 
The 8K claim is completely false; they don't even have video cards that can properly do that now... and the PS5 is based on current hardware. We just started getting cards that can properly do 4K.


Yep, people that aren't savvy see the 8K bullet point and think that it will be for games; if they actually knew the power required, they would understand that it's basically for video playback.
 
You do realize that that CPU put inside a console is going to be severely handicapped? Its not a PC there isn't a discrete Video card and RAM. Its all shared resources. Thats like buying a 64 core Xeon processor and putting it in a desktop computer.....for web surfing.
If performance is curtailed in order to stay within TDP limits, that does not make the hardware less of a "PC" than anything else.

In addition, the shared resources you are speaking of have some noticeable differences that are actually better than what you can get in your everyday PC. For example, the PS4 packs 8GB of GDDR5. It's well over 5 years old and was packing more available memory than your average PC at the time, and more than most graphics cards at the time. The idea that consoles are some low-powered paperweights really only started when AMD started supplying materials, which is not synonymous with low performance.

Consoles are relatively more future-proofed than your average PC. They almost always have more bandwidth / better memory-allocation systems, and the same went for CPU core counts within the current generation of consoles. They were packing 8 cores five years ago!
 
Please read the thread title.
I also wanted to respond directly to this.

Regarding clock speed being more important than cores in gaming, which is absurd: we are already, right now, at the point where games perform best at 8 cores, with 6 being the sweet spot. Why is this? Well, because the main consoles have 8.

All this talk of games not being able to take advantage of more cores has got to be the silliest thing I've heard in a while. Of ALL computer software there is today, games, virtualization, and HPC are the three main categories that benefit the most from having extra resources.

One of the reasons, aside from Intel previously keeping everyone stuck at 4 cores, was that most game engines 5 years ago only accounted for 4 cores. Meaning, if you developed a game, it was centered around 4 physical cores being in the system, so having 4 cores was the best option. This was the case for a very long time, until the PS4 / Xbox One came along. With these consoles having 8 physical cores in them, that provided a really good reason to move beyond the 4-core barrier.

It's a ballet for sure, but we are well past the point where software development needs to evolve toward parallel tasks being performed on more than one core. This response does not address another key thing in gaming performance, but I'll address it later if it arises.
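To make the "engines used to assume 4 cores" point concrete, here's a hypothetical sketch of the task/job-system pattern newer engines lean on instead: frame work gets broken into small independent jobs and fed to a pool sized to whatever core count is actually present. All names are illustrative, not from any real engine, and real engines do this with native threads; Python uses processes here just to sidestep the GIL.

```python
# Hypothetical job-system sketch: split frame work into independent
# jobs and drain them on a pool sized to the actual core count,
# instead of hard-coding four threads. All names are illustrative.
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_task(task_id: int) -> int:
    # Stand-in for one unit of frame work (an AI tick, a physics
    # island, an animation update, ...). Pure CPU busy-work.
    return sum(i * i for i in range(100_000)) + task_id

if __name__ == "__main__":
    jobs = range(64)  # one frame's worth of independent jobs
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(simulate_task, jobs))
    print(f"ran {len(results)} jobs across up to {os.cpu_count()} cores")
```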
 
If performance is curtailed in order to stay within TDP limits, that does not make the hardware less of a "PC" than anything else.

In addition, the shared resources you are speaking of have some noticeable differences that are actually better than what you can get in your everyday PC. For example, the PS4 packs 8GB of GDDR5. It's well over 5 years old and was packing more available memory than your average PC at the time, and more than most graphics cards at the time. The idea that consoles are some low-powered paperweights really only started when AMD started supplying materials, which is not synonymous with low performance.

Consoles are relatively more future-proofed than your average PC. They almost always have more bandwidth / better memory-allocation systems, and the same went for CPU core counts within the current generation of consoles. They were packing 8 cores five years ago!
The shared resource is the RAM, which is split between system and graphics. That means that 8GB is not dedicated to just the system or just graphics, but split between the two. That handicaps the performance of the system. Consoles are actually not as future-proof as you think; they have a shelf life and then can't be used anymore. They get maybe 6 years max out of the current generation of consoles; the previous generation was 7 years. My last PC lasted me 8 years and was playing new games every year. The biggest difference is that you can change parts out and update your PC without having to buy a whole new system and all new games. That's where the PC has the advantage. Consoles only have a price advantage, but that gap is closing with each new generation.
I'm not talking about anything having to do with heat issues with consoles, because let's face it, that's exactly why consoles are inferior by design... they are purposely limited.
 
Regarding clock speed being more important than cores in gaming, which is absurd: we are already, right now, at the point where games perform best at 8 cores, with 6 being the sweet spot. Why is this? Well, because the main consoles have 8.

If you have enough cores, clockspeed matters. If you don't have enough cores, both matter.

Put another way, once you have enough cores, adding more cores does nothing, but adding clockspeed (and / or IPC) does.

Currently, that number of cores for gaming is six. That's why AMD's 1600 / 2600 / 3600 have been heralded as such good buys for gamers.


Also, your "consoles have eight cores" argument has already been shown to be bunk above. Jaguar cores are not comparable to desktop x86 cores.
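For what it's worth, the "enough cores" intuition falls straight out of Amdahl's law. A quick sketch, where the 90% parallel fraction is purely an assumed figure for illustration:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of the workload that parallelizes. Once p / n gets small,
# extra cores stop helping and only faster serial execution
# (clock speed and/or IPC) moves the needle.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

P = 0.90  # assumed parallel fraction, purely illustrative
for cores in (2, 4, 6, 8, 12, 16):
    print(f"{cores:2d} cores: {speedup(P, cores):.2f}x")
# ~4.0x at 6 cores vs ~6.4x at 16: past a point, more cores barely help.
```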
 
All this talk of games not being able to take advantage of more cores has got to be the silliest thing I've heard in a while. Of ALL computer software there is today, games, virtualization, and HPC are the three main categories that benefit the most from having extra resources.

Not extra cores. RAM, if there isn't enough; IPC, if there are enough cores, sure.
 
If the app is threaded, ideally no core reaches 100% utilization. So, if no cores are at 100% or fully utilized, increasing single-core speed or IPC is pointless.
 
The shared resource is the RAM, which is split between system and graphics. That means that 8GB is not dedicated to just the system or just graphics, but split between the two. That handicaps the performance of the system. Consoles are actually not as future-proof as you think; they have a shelf life and then can't be used anymore. They get maybe 6 years max out of the current generation of consoles; the previous generation was 7 years. My last PC lasted me 8 years and was playing new games every year. The biggest difference is that you can change parts out and update your PC without having to buy a whole new system and all new games. That's where the PC has the advantage. Consoles only have a price advantage, but that gap is closing with each new generation.
I'm not talking about anything having to do with heat issues with consoles, because let's face it, that's exactly why consoles are inferior by design... they are purposely limited.

So, when I buy a new console, my old one just suddenly stops working? Oh wait, you still have to buy new games anyway, whether it is a console or a PC, if you want to play those new games. And to be clear, you were not playing the newest games on an 8-year-old system. For instance, RDR2 is not going to play on a system from 2011, and if you have to upgrade parts in it, it is no longer an 8-year-old system.
 
If the app is threaded, ideally no core reaches 100% utilization. So, if no cores are at 100% or fully utilized, increasing single-core speed or IPC is pointless.

As a software engineer, I would argue that you are wrong on both points, although it depends on what type of application we are talking about. For some of the stuff I do, my goal is to have 100% utilization on all cores all the time.
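A minimal sketch of what that goal looks like in practice, timing the same CPU-bound batch serially and then across every core (the workload and job count here are arbitrary):

```python
# Sketch of the "saturate every core" goal: run the same CPU-bound
# batch serially, then across all cores. On an n-core machine the
# pooled run should approach n times faster, with every core pegged
# near 100% while it executes.
import os
import time
from multiprocessing import Pool

def crunch(seed: int) -> int:
    return sum((seed + i) % 7 for i in range(2_000_000))

if __name__ == "__main__":
    jobs = list(range(32))

    t0 = time.perf_counter()
    serial = [crunch(j) for j in jobs]
    t1 = time.perf_counter()

    with Pool(os.cpu_count()) as pool:
        parallel = pool.map(crunch, jobs)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial:   {t1 - t0:.2f}s")
    print(f"parallel: {t2 - t1:.2f}s on {os.cpu_count()} cores")
```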
 
As a software engineer, I would argue that you are wrong on both points, although it depends on what type of application we are talking about. For some of the stuff I do, my goal is to have 100% utilization on all cores all the time.

If that's not the case, your code is sub-par to say the least.

Edit: assuming we are talking about a CPU-"limited" app, per the subject.
 
Why does it seem the actual in-depth reviews of this thing are as elusive as can be right now?
 
So, when I buy a new console, my old one just suddenly stops working? Oh wait, you still have to buy new games anyway, whether it is a console or a PC, if you want to play those new games. And to be clear, you were not playing the newest games on an 8-year-old system. For instance, RDR2 is not going to play on a system from 2011, and if you have to upgrade parts in it, it is no longer an 8-year-old system.
You missed the point. In order to play the newest games beyond this generation, you have to buy a new console, because the games will not play on old consoles. What's hard to figure out about that? You are trying to take my statement too literally. PC games play on any PC. I played new games on my old PC up until I replaced it (thanks for telling me what I was able to do with my system)... I had my PC for 8 years... console generations only last around 6 years... and the games they release that are new for the older consoles are chopped down.

One part makes your system brand new? That's something I had never heard before... and it also plays right into my argument of why changing one part makes the PC better. You don't have to scrap and replace everything to continue playing games... but you already knew that. Stop playing devil's advocate.
 
If that's not the case, your code is sub-par to say the least.

The goal is to perform an operation as fast as you can on a specific hardware platform, where taking more time could waste money. If you can get 100% utilization of resources, you have done a great job. Sadly, it's not even possible in most cases.
 
The goal is to perform an operation as fast as you can on a specific hardware platform, where taking more time could waste money. If you can get 100% utilization of resources, you have done a great job. Sadly, it's not even possible in most cases.

We all understand the "goals". Sucking up the CPU with a 100%-utilization game loop is not one of them; you are creating a bottleneck.
 