Intel's i9-9900K Is Only 12% Faster than AMD's 2700X at Gaming, but 66% Pricier

Oh that's rich. You know, some of us aren't made of money ...
And you don't buy Teslas, Porsches, Rolexes, bespoke suits, or Intel i9 processors. I get that.

What I don't get is your objection to the prices those products are sold at.
Are there no Kias, Timexes, Men's Wearhouse suits, and Celerons for you to buy?

If you don't want to pay for cake, then buy bread. There's plenty of both available.

[[Full disclosure: I do not own a Tesla, a Porsche, a Rolex or an Intel i9 processor. Yet.]]
 
The 12% is meaningless when you factor in FreeSync or G-Sync. It's a preference, AMD or Intel, and how much you are willing to spend. How many games scale better with 6 or 8 cores? The same goes for AMD vs Nvidia.
Kyle did a test some time ago where players were not aware which system they were playing on.
A small % of subjects picked the AMD over the Nvidia, proving the above point.
 
Man, even my FX-4100 Bulldozer is better than the new Intel chips, and I have had it since launch. AMD will take the lead with the new Zen until 10nm.
 
Or messing with the teletypes (land line to the local university at the time) trying to learn computer programming.... god this dates me...
Ah, ASR33s and 110baud modems, what memories. And did you keep your paper tape in old 35mm film canisters?
 
And you don't buy Teslas, Porsches, Rolexes, bespoke suits, or Intel i9 processors. I get that.

What I don't get is your objection to the prices those products are sold at.
Are there no Kias, Timexes, Men's Wearhouse suits, and Celerons for you to buy?

If you don't want to pay for cake, then buy bread. There's plenty of both available.

[[Full disclosure: I do not own a Tesla, a Porsche, a Rolex or an Intel i9 processor. Yet.]]

Suits are workwear. Heh.

I think there's no need to rub salt in the wound here, though.
 
But has it turned people off? 2080 Tis can't stay in stock, despite the 'poor' price/performance being the consensus criticism among reviewers.



Something to consider: the general advice when purchasing computer hardware seems to be 'buy what you need now'. If you don't need the best single-core performance alongside the best multi-threading performance you can get concurrently, then you don't need the 9900k, and it is then a poor price/performance proposition for you.

For me, I will say outright that I don't need the 9900k either- I didn't really need the 8700k over the 6700k I had before, but I upgraded partly because I perceived the 8700k as being relevant for quite a bit longer, and because I had a use for the 6700k and its platform elsewhere.

But what attracts me to it is that I could most likely get higher stable clock speeds than my 8700k manages, while getting two more cores. And as I've been homelabbing a bit, I'd have an 8700k to use for something else!

Now, I don't plan on doing that- as an aside, I'd like to get something server-grade for homelabbing, specifically something with gobs of ECC- and I just don't use all of the 8700k I already have.
Maybe it can't stay in stock because they are not being made that fast? There are lots of other possible reasons as well.
 
The 12% is meaningless when you factor in FreeSync or G-Sync. It's a preference, AMD or Intel, and how much you are willing to spend. How many games scale better with 6 or 8 cores? The same goes for AMD vs Nvidia.
Kyle did a test some time ago where players were not aware which system they were playing on.
A small % of subjects picked the AMD over the Nvidia, proving the above point.
Not really sure what you are on about with regard to FreeSync/G-Sync. With 144Hz, 165Hz, and 240Hz monitors, you are probably going to need that extra 12%.
 
Seeing as only those with money to burn upgrade every single cycle.

Go AMD... save enough money to upgrade to a new AMD chip next cycle, because all you will have to do is pick up a new CPU and plop it in. Counting the eBay resale value of your still-current, year-old AMD chip, it's likely you will still have enough money to do it again the cycle after.

No doubt your twice-upgraded, same-money AMD chip will destroy your now 2-3 year old Intel chip.

Intel systems are buy-once, complete-system type deals. Chances are the next 2-3 Zen revisions will be drop-in upgrades.
 
Why are people whining about price here?

We are fucking [H]ard......... GET HARD! and BE HARD!
 
Yes, less toys - this is what is pissing me off. I would certainly have upgraded my 1080 Ti now, and my 7940X with the new Skylake refreshes, and would go for the 9900K for an upcoming gaming build for my kids, but the whole DAMN game has gotten to another financial level! Phew.

Seriously, you can always stay a step back, play 2-5 year old games, and be done with it. The only reason to keep upgrading (i.e., burning money to depreciation) is for business. If more processing power per hour = more $ per hour, then by all means do the calculations and go for it. Otherwise it is for fun, and fun costs.
 
Not really sure what you are on about with regard to FreeSync/G-Sync. With 144Hz, 165Hz, and 240Hz monitors, you are probably going to need that extra 12%.

Beat me to it.

While I'm not in this category, there are people here that pre-ordered two 2080 Tis because what they had wasn't fast enough. Less than US$600 for a CPU to make better use of them?

And it still won't be fast enough.

This thread appears to have lost its [H].
 
Beat me to it.

While I'm not in this category, there are people here that pre-ordered two 2080 Tis because what they had wasn't fast enough. Less than US$600 for a CPU to make better use of them?

And it still won't be fast enough.

This thread appears to have lost its [H].
Not enough of a percentage to get some people [H]ard?
Two 2080 Tis would be good if most of the games you play can use them.
 
Not enough of a percentage to get some people [H]ard?
Two 2080 Tis would be good if most of the games you play can use them.

Figure many/most of those guys- including yourself!- are using HEDT. Most were using HEDT before the 8700K dropped. Many/most didn't want to drop to the slower Ryzen CPUs.

But eight Intel cores without HEDT baggage at top clockspeeds? That might turn a few more heads.
 
I guess Intel had to figure out where they were going to get all the extra pennies for the solder usage, to revamp some of their lines for 22nm and 14nm, and maybe, just maybe, to get out the 10nm that should have shipped 3+ years back (which likely will not end up being as good as they wanted once it finally releases).

Good on PT for going back to the drawing board; shame they did not take that "extra time" to dial in proper memory timings/subtimings.

It would also be interesting to see a list of games/apps that use the "horsepower" and have as little vendor bias as possible (though this is likely "impossible": one way or another, if using x86 it will be predominantly Intel-biased, x86-64 will lean somewhat to the AMD side, and then of course there are potential dirty tricks from the GeForce GPU).

Thanks for the revised report highlight Megalith o7
 
...
Or messing with the teletypes (land line to the local university at the time) trying to learn computer programming.... god this dates me...

Ah, ASR33s and 110baud modems, what memories. And did you keep your paper tape in old 35mm film canisters?

I rebuilt a wire-wrap backplane for a PDP-8 that was given to our school, and our interface was a TTY console (solid metal, 1,200 lbs) or switches.

Yes, we kept our paper tape in 35mm cans, or 35mm projector film cans for the long ones.

It also would do 80-column cards, but if one got misplaced, it would shred the rest of the stack.

So we used paper tape. :)

The last thing I did was add a cassette interface: I made each row on the paper tape a tone and built an encoder/decoder.

This was maybe 1979...

I didn't have a modem until later. I remember connecting to the local DOD facility to play "Lunar Lander" on a TRS-80 Color Computer; if the tape wasn't loaded, someone would notice and spool it for us. :)

The password for the System user was the default, admin, lol.
 
Not terribly surprising. CPUs like this are never the "best" option for a pure gaming rig unless you just want the best of the best, or something that could be amazing for several years. If you're trying to balance a budget, it has always been a better option to take a couple steps down from the top end and go from there. I could see this being an interesting option for some productivity applications, but we'll see.
 
Even If your employee is hired to play games he won't finish games 12% faster :D
But some employees are required to compile the software you oh-so-conveniently download as a binary (when it's finished), and that would likely also show a similar increase.

Not defending Intel here, but I feel "end users" are getting a bit *too* bold these days.
 
Considering the 2080, and especially the 2080ti, can show CPU bottlenecks, you'd want to throw as much CPU speed as you can at it. That's where the i9 will rule the roost. An extra $200 to not cut your shiny new GPU off at the knees would be worth it. Doubly so considering how much this gen of GPUs cost.
 
Considering the 2080, and especially the 2080ti, can show CPU bottlenecks, you'd want to throw as much CPU speed as you can at it. That's where the i9 will rule the roost. An extra $200 to not cut your shiny new GPU off at the knees would be worth it. Doubly so considering how much this gen of GPUs cost.
At 1080p 240Hz?
 
I am happy with my Ryzen 2700X. I also ran an 8700K for a while as my main rig. With the same GPU (a 1080 at the time), my benchmarks were all higher with the 2700X. I think I am gonna stay with this Ryzen 2700X for a while. Not seeing a good reason to go for an i9-9900K.
 
Considering the 2080, and especially the 2080ti, can show CPU bottlenecks, you'd want to throw as much CPU speed as you can at it. That's where the i9 will rule the roost. An extra $200 to not cut your shiny new GPU off at the knees would be worth it. Doubly so considering how much this gen of GPUs cost.

I disagree with this comment. I'm running a 5820K, and while it isn't the fastest part available by any measure these days, even if I upgrade to a 2080 Ti I'm still not going to be anywhere near 100+ FPS @ 4K w/ max settings in current games - and that situation would be the same with the latest and greatest CPU. Upgrading off my 5820K might get me 10 FPS, best case scenario, @ 4K. We are still very much GPU limited, not CPU limited. As long as you have a decently clocked part with 4+ physical cores, the major constraint is still the GPU.
 
Somehow I don't think paying 66% more money for only a 12% performance difference is very economical.
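To put rough numbers on that, here's a quick back-of-the-envelope sketch. The prices below are illustrative placeholders roughly in line with launch street pricing, not figures taken from the PT report:

```python
# Rough price/performance sketch. The prices and the 12% gaming delta are
# illustrative assumptions, not numbers pulled from the PT report.
i9_9900k_price = 499.0   # assumed 9900K street price at launch (USD)
r7_2700x_price = 300.0   # assumed 2700X street price at the time (USD)
gaming_speedup = 1.12    # the ~12% average-FPS advantage from the headline

price_premium = i9_9900k_price / r7_2700x_price - 1
cost_per_perf = (i9_9900k_price / gaming_speedup) / r7_2700x_price

print(f"Price premium: {price_premium:.0%}")                        # ~66%
print(f"Relative cost per frame vs 2700X: {cost_per_perf:.2f}x")    # ~1.48x
```

On those assumed numbers you are paying roughly 1.5x as much per frame, which only makes sense if you actually need the frames.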

I thought it would have AVX-512 for that cost too, but nope. Basically twice the price.

The 8-core X processor would be a better deal, at least for me, given the 44 lanes and AVX-512.
 
Look,

I agree that it is a large price premium for a small performance benefit, but that's always been the case at the high end. Prices never increase linearly with performance.

If I were in the market today, and I didn't have a soft spot for AMD, this is probably the CPU I would buy.

If you want the best possible per core performance without sacrificing the number of cores, there are no other options. This CPU is it.
 
I thought it would have AVX-512 for that cost too, but nope. Basically twice the price.

The 8-core X processor would be a better deal, at least for me, given the 44 lanes and AVX-512.

I agree: if you're looking at this type of CPU, the X-series / X299 makes more sense, and if you want something for gaming on the Intel side, stick with the i7.
 
Not for gaming, they won't. Of course, gaming isn't everything people use these things for - and I know you understand that; I'm just pointing it out to keep the argument balanced :).
Nothing in the Principled Technologies report is about productivity.


Joke's on you!

[Had to- I used one of these myself for a sibling's build, and it was and remains a very decent CPU for general usage, but gaming was never a strong point if you weren't playing Crysis :D ]
I've seen people make comments that an FX chip can't game, like at all, but honestly it runs just fine for games. I was expecting games to make more use of the available cores, and I was very wrong. What's interesting is that Blizzard announced that WoW patch 8.1 will now have multithreading optimizations; that game heavily favored Intel for such a long time because of its higher IPC. The patch isn't even out yet, but tests show that DX12 with multithreading will give you a 33% boost in performance. Which is sad for two reasons: one, I don't use my FX 8350 for gaming anymore, as it sits in my HTPC, and two, I don't play WoW anymore, as the game has been kinda bullshitty lately. Also, it invalidates Principled Technologies' test results yet again.
 
Paper tape? What luxury, I still have the cards. You fatcats enjoy your paper tape!

I still have the tape, lol.

I used to have to load a 16K program into a PDP-11 with no console; it was a piece of test equipment.

Every time the power failed. :)

I finally noticed the chick from accounting I went to lunch with could do it perfectly, in about 10 minutes, so we started diversifying some jobs. :)

This was in 85, lol.
 
Nothing in the Principled Technologies report is about productivity.

Agreed- but it's one of the reasons to pick the 9900K over an 8700K or even 6700K/7700K.

I've seen people make comments that an FX chip can't game, like at all, but honestly it runs just fine for games.

I'm earnestly just picking on you a bit- got a buddy that plays everything on one!
 
pcgeekesq Price-performance is not your thing. Point taken. Most of us take price-performance into consideration, but not blindly. A few % moar performance for a significant load of $ will turn a lot of people off. Intel & Nvidia have picked up this bad habit lately. I would have been [hard] on the 9900K and the 2080 Ti, but the price-perf (or the price-fun) is just too awful for ME, even though money is not a problem.

Price/performance goes out the window once you get to the top-tier parts. The real question at the top, if there is only one product at the top, is how much extra people are willing to pay for that last % of performance. The only time price/performance makes sense is for equivalently performing products or equivalently priced products.
 