Intel's i9-9900K Is Only 12% Faster than AMD's 2700X at Gaming, but 66% Pricier

I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping Xeons in favor of Epyc chips.
 
Not really sure what you are on about with regards to FreeSync/G-Sync. With 144 Hz, 165 Hz, and 240 Hz monitors, you are probably going to need that extra 12%.

Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144 fps, let alone 240 fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? Hey, let's build a 4K gaming rig, drop quality, and chase that fps; makes no sense.
 
I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping Xeons in favor of Epyc chips.
They aren't dropping Xeons (unless I missed a news item); they're just recommending Epyc as an alternative to (some) customers in the face of current supply constraints.
 
And these people are not going to notice this extra performance compared to the cheaper CPU from the next tier down.

Which people? You cannot speak for everyone as everyone has different goals.
 
Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144 fps, let alone 240 fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? Hey, let's build a 4K gaming rig, drop quality, and chase that fps; makes no sense.

G-Sync with a lower framerate doesn't look or feel anywhere near as smooth as just having a higher framerate/refresh rate in the first place, at least for me. It's a nice thing to have if your framerate is dropping low, but it's no substitute for just having more frames. I agree pushing for 144 and 240 Hz is a bit pointless, though. I love gaming at high refresh rates (which is why I'm still using a 1080p G-Sync monitor), but beyond maybe 100 Hz there are seriously diminishing returns.
 
Paper tape? What luxury! I still have the cards. You fatcats enjoy your paper tape!
80-column cards were far better than paper tape; you just had to remember to set up the program ("drum") card on your IBM 029 properly.
ALWAYS sequence-number your cards. And at least you could read an 80-column card.

I carried many a box of punch cards to the CDC 6600 I had access to back in 1975. I was running a multi-particle gravitational simulation on it. Every Thursday evening, I'd bring in the code (Fortran 77) and 200 cards with the particle data from the last run, which would be punched when the operator sent the task the stop command. Good times. Never accomplished anything, but I was just a kid.

Punch cards also make good shims, and can stop drafts around windows, BTW. They're at least as versatile as Zoolander.

Paper tape, punch cards, cassette tapes, floppies of all sizes: I've used them all.
Used to own a plane of core memory, even, and 16kbit DRAM chips.
Been doing computer stuff since 1971.
 
I used to have to load a 16K program into a PDP-11 with no console that was a piece of test equipment.
Every time the power failed.
I used to have to enter the boot loader into a PDP-11 using the front-panel toggle switches, one word at a time.
Every time the power failed. An 11/34 or 11/40, I think.

Had to periodically clean the fingers on the cards with a pencil eraser too. :)
 
G-Sync with a lower framerate doesn't look or feel anywhere near as smooth as just having a higher framerate/refresh rate in the first place, at least for me. It's a nice thing to have if your framerate is dropping low, but it's no substitute for just having more frames. I agree pushing for 144 and 240 Hz is a bit pointless, though. I love gaming at high refresh rates (which is why I'm still using a 1080p G-Sync monitor), but beyond maybe 100 Hz there are seriously diminishing returns.

Really depends on your eyes, I guess, lol. For me, anything within the FreeSync range of my monitor is smooth as butter.
 
Fact of the matter is, Intel and Nvidia continue to use performance as an excuse to charge a high premium for their products. If you're budget-minded, broke, or running a business... AMD is your only good option.
 
G-Sync with a lower framerate doesn't look or feel anywhere near as smooth as just having a higher framerate/refresh rate in the first place, at least for me. It's a nice thing to have if your framerate is dropping low, but it's no substitute for just having more frames. I agree pushing for 144 and 240 Hz is a bit pointless, though. I love gaming at high refresh rates (which is why I'm still using a 1080p G-Sync monitor), but beyond maybe 100 Hz there are seriously diminishing returns.

Eh, it's nice to have regardless. Unless you are able to push well above 120/144 fps (which is hard to do right now at 4K), it's really nice to have.
 
Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144 fps, let alone 240 fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? Hey, let's build a 4K gaming rig, drop quality, and chase that fps; makes no sense.

I absolutely cannot stand first-person games with a mouse at 60 Hz and never could, honestly. It's like steering a boat. I kept my CRT until 2014 for that reason. Most people playing online shooters like Black Ops 4, PUBG, or CS:GO will lower settings for peak frame rate. I can't think of any FPS game I have that doesn't top 100 fps, honestly. Far Cry 5 is the worst performer of the year for me, and even it is usually over 100 fps on high settings. I'm locked solid above 144 on Black Ops 4 in deathmatches, all settings high. I'm sure a 2080 Ti on a 9700K is way faster than my system at 1080p as well. I agree with the other poster: FreeSync/G-Sync is no substitute for raw frame rate. I have it disabled for online shooters.

I've also noticed that games with SSAA like Metro Redux can stay at 144 fps on my Vega 64 and look very sharp at 1080p. I prefer that to 4K in that game any day.

Anything else with a controller that doesn't have to respond fast, I play at 4K/60, like Skyrim or TW3.
 
G-Sync with a lower framerate doesn't look or feel anywhere near as smooth as just having a higher framerate/refresh rate in the first place, at least for me. It's a nice thing to have if your framerate is dropping low, but it's no substitute for just having more frames. I agree pushing for 144 and 240 Hz is a bit pointless, though. I love gaming at high refresh rates (which is why I'm still using a 1080p G-Sync monitor), but beyond maybe 100 Hz there are seriously diminishing returns.

I completely agree. My other rig is a 2600K/1080 Ti/1440p 144 Hz G-Sync setup. I play with everything at ultra/manually maxed. Most modern games I play are so demanding they average 60-110 fps. Very, very rarely do I see them hit 144 fps. I can't really tell the difference above 120, but the sweet spot seems to be ~90-110. SOTTR gave occasional drops into 45-55, and even though there was no noticeable tearing, the drop was definitely perceptible. At this point I'd rather see increased color accuracy/HDR/contrast ratios than Hz.

Good for PT for re-running these. For most of the last 5-10 years we've seen many chasing 5 GHz+ as the ultimate target speed for 4c or 4c/8t. Anything above 4 GHz mostly met the needs of 1080p/1440p, since those resolutions have significant CPU involvement (especially 1080p). The 5 GHz bar still seems to be the sought-after goal, now with 8c or more, but it's losing some relevance since many older DX11 apps/engines can't properly scale to the additional cores/threads. For price/performance AMD definitely gets my vote, especially with the 2700X-2950X. Having that much is a bit of future planning as DX12 gains adoption. However, we're at one of those points where some of the older software/APIs are still widely used but current hardware exceeds what they can optimally use. Very similar to the 32-bit to 64-bit transition we saw a while back. I think any builders using these benches need to assess what they intend to use their new rigs for while understanding the correlation between the numbers and software limits.
 
I completely agree. My other rig is a 2600K/1080 Ti/1440p 144 Hz G-Sync setup. I play with everything at ultra/manually maxed. Most modern games I play are so demanding they average 60-110 fps. Very, very rarely do I see them hit 144 fps. I can't really tell the difference above 120, but the sweet spot seems to be ~90-110. SOTTR gave occasional drops into 45-55, and even though there was no noticeable tearing, the drop was definitely perceptible. At this point I'd rather see increased color accuracy/HDR/contrast ratios than Hz.

Just as a curiosity, what is the bottleneck at 1440p in your system? The 2600K is very long in the tooth; at least at 1080p you would get significantly higher frames with a new i7. At 1440p I'm not sure. I have a 4790K that is the bottleneck in almost all of my shooters at 1080p (paired with a Vega 64).

I would love to see 1440p 144 Hz, but probably not until my next build in a year or two.

As for sweet spots, that varies from person to person. I've been so spoiled by over 2,000 hours of CS:GO that I could honestly tell you if the framerate is under 200. In less competitive games I don't mind 100 fps, but I do notice the blur when I move fast.
 
That's if you can even find it at the retail price of $530. Aren't most retailers setting the pre-order prices around $600? My 2700X was total overkill for my needs, but I bought it to support AMD for what I felt was a pretty good product. I've always bought Intel, so it was time for a change.
 
Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144 fps, let alone 240 fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? Hey, let's build a 4K gaming rig, drop quality, and chase that fps; makes no sense.

My guess is you've never played any FPS games on anything other than a 60 Hz monitor. Or any games, for that matter.

The difference is enormous, especially in FPS games where fast aiming is crucial; those extra frames make the motion much more fluid. Say you move from looking at point A to point B super fast, in 1/10th of a second for argument's sake. If you're at 60 fps, you'll see only 6 frames in that tenth of a second. That could be 6 frames for a 180-degree turn. At 144 fps you are going to see about 14 frames, and at 240 fps, 24 frames. That's where the difference is worth it for tons of people. Most people don't seem to comprehend this for some reason. Tons of FPS gamers made the switch, and tons of FPS gamers turn their graphics down to achieve these framerates; however, some games on ultra can hit these frame rates on high-end systems.
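
For anyone who wants to sanity-check that arithmetic, here's a minimal Python sketch. The 0.1-second flick duration and the refresh rates are just assumptions for illustration, and it assumes the game actually sustains the monitor's refresh rate:

    # Frames displayed during a fast flick, assuming the game holds the refresh rate.
    flick_seconds = 0.1  # a quick 180-degree turn, roughly 1/10th of a second (assumed)

    for refresh_hz in (60, 144, 240):
        frames_seen = refresh_hz * flick_seconds
        degrees_per_frame = 180 / frames_seen
        print(f"{refresh_hz:>3} Hz: ~{frames_seen:.0f} frames shown, "
              f"~{degrees_per_frame:.1f} degrees of rotation between frames")

At 60 Hz that works out to ~30 degrees of rotation between frames for the flick, versus ~7.5 degrees at 240 Hz, which is why the motion reads as so much smoother.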
 
My clients do, if I read your comment and phrasing correctly.

If I'm asked to build a rig without any preferences except that it just runs, then I can build one with the goal of 90% of the performance at 70% (or less) of the price.

Last year and this year I built such a rig. For $1000 it performs as well as many $1400 builds, at 90% or better of their performance.

So I just saved my client $400 on something where they will hardly notice the difference. It adds up: you can build 4 competitive rigs for the price of 3 purchased elsewhere.

This is the reason why I choose certain parts at certain market price points: I know there is a certain value that applies to my customer base, and everyone is happy.
This info does not come free or easy. It's 28 years of building rigs and going on 32 years overall (cry), longer if you count me messing with Trash-80s back in late '79.

Or messing with the teletypes (a landline to the local university at the time) trying to learn computer programming... God, this dates me...

I am still (somewhat) proudly running a 2500K. Never bought into the HT fad when it comes to low-latency applications, which is what games are, so I got that instead of paying good money for a 2600K.

I thought my sarcasm was as obvious as the other guy's, but apparently it is not.
 
Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144 fps, let alone 240 fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? Hey, let's build a 4K gaming rig, drop quality, and chase that fps; makes no sense.

That is not entirely accurate. Simply slamming the sliders to the right and maxing things out, at least at 4K, does not automatically mean the game looks better. Reducing some settings to get playable framerates while still looking good is the thing to do. At 1080p, higher fps with a monitor that can match it probably does make a difference; I just have never tried it myself.
 
Which people? You cannot speak for everyone as everyone has different goals.
Gamers, as this CPU is obviously aimed at gamers since the test results are all games. Obviously a 9900K is good for productivity as well.

I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping Xeons in favor of Epyc chips.
I'm not happy with AMD either, as AMD didn't exactly keep their CPU prices in line with their own previous generation. AMD is pricing their products based on what Intel charges, and Intel long ago left the realm of reasonable prices. You can't expect game developers to use more than 2-3 cores if the majority of people have 2-4 core CPUs. Anything above $200 is something people avoid buying for CPUs, and the benefit to games beyond 4 cores diminishes greatly.

Which is why I believe AMD has a plan to bring ray tracing to the CPU rather than put it entirely on the GPU. It makes sense, as ray tracing can max out a CPU, while today's games barely touch the CPU in terms of loading up all the cores. So something like a 16-core CPU looks attractive if you play games with ray tracing, as it'll scale just fine with more cores, while today's games generally don't care if you have more cores and can even slow down, because more cores usually means lower clock speeds.
 
Which people? You cannot speak for everyone as everyone has different goals.

The same people you referenced in your original quote: "if there is only one product at the top, is how much extra are people willing to pay for that last % of performance."

I didn't say everyone.
 
This isn't new though, right? Incremental performance at the top end is never on a linear perf/$ line. You want top dog? Pay out the arse.

Yeah, for other people, they'd be much better served with a 2700X, 9700, or a rung down from those. If you give a crap about perf/$. Some don't.

Thing is, the 9700K puts everything in a weird spot for Intel. It'll more than likely game very much like the 9900K, but get beaten in productivity by the 2700X, all the while costing more than the 2700X. And considering that the 2700X is already this close to the top dog in gaming and possibly even closer in productivity, Intel looks more attractive to those either looking to spend a lot for overall performance (9900K) or wanting one that just games fine and can overclock (9600K).
 
I don't see the logic in paying 66% more for a chip with broken security for a negligible performance increase. It may well be necessary to disable hyperthreading if additional exploits related to Foreshadow/TLBleed are discovered.

Unless you really only use your ridiculously expensive CPU for gaming...
 
...which is what we expected?

Yes, you're going to pay for that extra 12%+. That's also to be expected, and as alluded to above, you cannot get this performance in an AMD product, so Intel is free to price as they wish.

Yes, and that's why they were so upfront about it from the get-go, amirite? Of course everybody thinks just like that, so why not tell the truth out of the gate? Right?!
 
That is not entirely accurate. Simply slamming the sliders to the right and maxing things out, at least at 4K, does not automatically mean the game looks better. Reducing some settings to get playable framerates while still looking good is the thing to do. At 1080p, higher fps with a monitor that can match it probably does make a difference; I just have never tried it myself.
Unless you buy a 2080 Ti :)
 
Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144 fps, let alone 240 fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? Hey, let's build a 4K gaming rig, drop quality, and chase that fps; makes no sense.

I know what you mean. A high-end build is kind of like a Ferrari. Only a fool would allow himself to be seen driving one around with anything but the most super-hot babe in the neighbourhood.

The purpose of FreeSync/G-Sync is to mitigate the damage; they don't completely solve anything. It's simple: 60 Hz, while smooth to you, is twice the latency of 120 Hz.
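
The latency point is just frame time: 1000 ms divided by the refresh rate. A minimal sketch, with the refresh rates chosen only for illustration:

    # Frame time (ms between displayed frames) at a few common refresh rates.
    for hz in (60, 120, 144, 240):
        frame_time_ms = 1000 / hz
        print(f"{hz:>3} Hz -> {frame_time_ms:.1f} ms per frame")

60 Hz is ~16.7 ms per frame versus ~8.3 ms at 120 Hz, hence "twice the latency."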
 
Somehow I don't think paying 66% more money for only a 12% performance difference is very economical.

Yes, but that is only 12% in gaming performance; I would bet it isn't 12% better in other, non-gaming tasks.
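
For the price/performance angle, here's a rough sketch of the headline numbers. The $530 figure for the 9900K comes up elsewhere in the thread; the ~$320 street price for the 2700X is an assumption for illustration only.

    # Rough perf-per-dollar comparison using the article's headline numbers.
    amd_price, amd_perf = 320.0, 1.00      # 2700X as the baseline (price assumed)
    intel_price, intel_perf = 530.0, 1.12  # ~12% faster in gaming, per the article

    price_premium = intel_price / amd_price - 1
    perf_premium = intel_perf / amd_perf - 1
    perf_per_dollar_ratio = (intel_perf / intel_price) / (amd_perf / amd_price)

    print(f"Price premium:   {price_premium:.0%}")                       # ~66%
    print(f"Perf premium:    {perf_premium:.0%}")                        # ~12%
    print(f"Perf per dollar: {perf_per_dollar_ratio:.2f}x of the 2700X")  # ~0.68x

Under those assumptions the 9900K delivers only about two-thirds of the gaming performance per dollar of the 2700X.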
 
Seeing as only those with money to burn upgrade every single cycle.

Go AMD... save enough money to upgrade to a new AMD chip next cycle, because all you will have to do is pick up a new CPU and plop it in. Counting the eBay resale value of your still-current, year-old AMD chip, it's likely you will still have enough money to do it again the cycle after.

No doubt your twice-upgraded, same-money AMD chip will destroy your now 2-3 year old Intel chip.

Intel systems are buy-it-complete-once type deals. Chances are the next 2-3 Zen revisions will be drop-in upgrades.

That's one of the reasons why I went with AMD for all my extended family. Upgrades just trickled down the chain; even my grandfather would get free drop-in upgrades every so often.

I would rather go [H]ard so I can get nice rigs for me and my wife to game together than go [H]ardest on a single rig and have her (or me, if she plays on the beast rig) lag to death.
 
 
Yes, but that is only 12% in gaming performance; I would bet it isn't 12% better in other, non-gaming tasks.
If you look at Principled Technologies' report, a number of the games chosen are old and favor Intel a lot. Like World of Warcraft, which is a 2004 game engine with updates and runs 30 fps faster on Intel vs the 2700X. Also, that's not an easy game to benchmark, as it's an online game with no built-in benchmark. It does have DX12 right now, and multithreading will be added in patch 8.1, but for now the game has always favoured Intel. Just ask the MMO-Champion guys, as they circlejerk Intel chips for this reason.

Counter-Strike: Global Offensive is another game built on a 2004 game engine, and it favours Intel by a lot: Intel 442 fps vs AMD 298 fps. That's just nuts how much of a difference there is. The only reason to add this benchmark is to pad Intel's numbers, as anything above 120 fps is overkill and the game runs fine either way. Also, Valve needs to update their Source engine, as it really needs tweaks for AMD.

But then you have games like Rise of the Tomb Raider, which is only 15 fps faster on Intel, and Far Cry 5, which has a 22 fps increase over AMD. I know some games don't even care what CPU you use, as you always get the same fps, but nothing like that shows up in Principled Technologies' tests; the nearest I can see is PUBG, with Intel having a 10 fps advantage over AMD.

Pretty easy to fuck with benchmarks when you just hand pick old games or games using old engines that clearly never got optimized for Ryzen.
 
I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping Xeons in favor of Epyc chips.
LOL, right.
HPE is full-on Xeon. They're moving so many Xeons out the door they can't keep up with current orders.
Very few Epyc offerings right now; hopefully that'll change soon.
Check your facts.
 
LOL, right.
HPE is full-on Xeon. They're moving so many Xeons out the door they can't keep up with current orders.
Very few Epyc offerings right now; hopefully that'll change soon.
Check your facts.
Any links for this?
 
Gamers, as this CPU is obviously aimed at gamers since the test results are all games. Obviously a 9900K is good for productivity as well.

I will say I will take every last bit of CPU performance I can afford for the RTS games I like to play. Vulkan and DX12 for RTS haven't quite made it big yet.

The same people you referenced in your original quote: "if there is only one product at the top, is how much extra are people willing to pay for that last % of performance."

I didn't say everyone.

In that case your response doesn't even make any sense.
 
Given the diminishing returns of cores beyond 4, the 9700K with 8 real cores is interesting. With nearly the same clocks, I'll be curious whether real software tests show the 9700K and 9900K aren't all that different.
 
Other way around: you post something and provide no facts.
Uh, YOU posted your false nonsense first, with no facts. Where's your proof HPE is "dropping" Xeons? You got called out on that twice and still haven't provided any proof. We're all waiting for this great industry insight you have.
 