The Radeon Technology Group (RTG) has received its first Zen 2 sample!

Nothing I have seen looks impressive, and it's looking to be priced way too high as well. 12% faster in gaming than a 2700X is just not that impressive, especially when you factor in price.

About 17% faster with a GPU bottleneck.

[Image: relative gaming performance chart, 9900K vs 2700X]


Problem I see is that for almost the same cash you can buy a Threadripper, which is far better at multithreaded performance. I just don't see the average person spending that much more for such a little gain.

The i9 is better at both gaming and applications. The Threadripper only excels in highly threaded applications such as rendering.

[Image: relative-performance-cpu.png]

[Image: relative-performance-games-1280-720.png]
 
17% faster vs 12%!

Forgets to mention it's currently 90% more expensive and out of stock...
 
Yeah, it doesn't take a rocket scientist to figure out a CPU with the same number of cores, slightly higher IPC, and a way higher clock would beat one that is slower in both. Kinda wondering the point of showing an overclocked number though (100 MHz? LOL!!). Where are the AMD OC numbers?

Sorry OP, but AMD is still kicking the shit outta Intel in the price/performance war by a pretty big margin. Now that Intel has given up on the whole 10nm thing and is stuck on 14nm, I wonder how long they will be left eating AMD's dust.
 
/facepalm. I was wondering where he was. The 9900k launched and not a peep from him. Guess Intel was late on their payment to him.
He's a busy boy, lots of forums to shit up, and only so much time in the day.

His title helps keep people aware of his BS, so maybe he's moved onto a new, less suspecting, target.
 
From what I can tell looking at all the reviews, overall performance difference between 9900k and 2700X is ~16%. In some rendering tasks, it's a bit lower of a delta. In some single-thread heavy tasks, it's a bit higher. But ~16%, overall.

The 9700k is only a bit better than the 8700k, overall, and is roughly equivalent to the 2700X, though as with the 8700k vs 2700X, the choice will boil down to use case and price.

If you have a 2700X, there's no reason to switch to a 9900k. If you're building a new box, the 9900k is a solid choice - I'd build with one if I were building today. The 2700X still competes well enough with the 9700k at its price tier. If AMD wishes to respond to the 9900k with Zen 2, they need 16% between both frequency and IPC gains. Definitely plausible, given both a process shrink and architectural revisions. But by no means guaranteed, either.

None of this is exactly unexpected, given the specs of the 9900k and 9700k. Though lol juanrga for using PCGamer as a source. PCGamer is trash.
 
From what I can tell looking at all the reviews, overall performance difference between 9900k and 2700X is ~16%. In some rendering tasks, it's a bit lower of a delta. In some single-thread heavy tasks, it's a bit higher. But ~16%, overall.

The 9700k is only a bit better than the 8700k, overall, and is roughly equivalent to the 2700X, though as with the 8700k vs 2700X, the choice will boil down to use case and price.

If you have a 2700X, there's no reason to switch to a 9900k. If you're building a new box, the 9900k is a solid choice - I'd build with one if I were building today. The 2700X still competes well enough with the 9700k at its price tier. If AMD wishes to respond to the 9900k with Zen 2, they need 16% between both frequency and IPC gains. Definitely plausible, given both a process shrink and architectural revisions. But by no means guaranteed, either.

None of this is exactly unexpected, given the specs of the 9900k and 9700k. Though lol juanrga for using PCGamer as a source. PCGamer is trash.

I evaluate content itself, not the producer of that content.

You admit the gap is ~16%, and the graph I shared shows a 17% gap. So I was sharing accurate info regardless of whether you like or dislike PCGamer.
 
If you're okay with slower ;)

Yes, but that "pure" metric doesn't make sense either.
I don't have the exact numbers in front of me, but it's something like a 70% cost increase going from a 2700X to an i9-9900K for something like a 16% increase in performance, and only in some tests?
Generally for this sort of thing there is an easily comparable car analogy to go along with it, but at the very least, suffice it to say for a huge chunk of users and workloads, Intel's absolute performance in some tests isn't worth their poor value proposition.

(EDIT: I guess the car analogy would be like spending $250k on a Lambo Huracan versus $80k on a Skyline GTR. The absolute performance might be in the Lambo's favor, but for most, the absolute cost makes no sense, unless of course you're in an environment in which you can use that additional power in a meaningful way.) Just like in the car example, ignoring the cost just so you can have the best performance is neither logical nor feasible for everyone, especially with that level of diminishing returns.

Additionally, in highly threaded workloads AMD is still performing very competitively. If you're doing anything with video editing, rendering, or any form of compute, you'd have to be a complete fool not to at least consider AMD. To add insult to injury, most of the things Intel does beat AMD at don't even make that much difference to the end user (as Intel is basically winning in gaming, the area in which most people are far more likely to be GPU bound rather than CPU bound).
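The price/performance math in that post can be sketched in a few lines. The dollar figures below are assumed street prices for illustration, not quoted numbers, and the ~16% delta is the figure from this thread:

```python
# Rough price/performance sketch. Prices are assumptions for
# illustration; the ~16% performance delta comes from the thread.
PRICE_2700X = 330.0   # assumed street price, USD
PRICE_9900K = 560.0   # assumed street price, USD
PERF_2700X = 1.00     # normalized overall performance
PERF_9900K = 1.16     # ~16% faster overall, per the thread

cost_increase = (PRICE_9900K - PRICE_2700X) / PRICE_2700X
perf_per_dollar_2700x = PERF_2700X / PRICE_2700X
perf_per_dollar_9900k = PERF_9900K / PRICE_9900K

print(f"Cost increase: {cost_increase:.0%}")                    # ~70%
print(f"2700X perf-per-dollar advantage: "
      f"{perf_per_dollar_2700x / perf_per_dollar_9900k:.2f}x")  # ~1.46x
```

Under those assumed prices, the 2700X delivers roughly 1.46x the performance per dollar, which is the "poor value proposition" being argued.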
 
Generally for this sort of thing there is an easily comparable car analogy to go along with it, but at the very least, suffice it to say for a huge chunk of users and workloads, Intel's absolute performance in some tests isn't worth their poor value proposition.

I can get equal to the 2700X in a great many workloads while exceeding it in gaming with an 8700K.

The 9900K exceeds the 2700X even more. You pay for it. You don't have to buy it. You may not need it! You may not even be able to use it!

But it's available, and there is no competing solution from AMD. If there was, I'd likely own it already.

[I'm not interested in the 9900K beyond the novelty any more than I'm interested in the 2080Ti- I'm just pointing out that value is only a part of the consideration, and it should be further mentioned that AMD is no stranger to stratospheric CPU pricing when they claim a performance lead]
 
I can get equal to the 2700X in a great many workloads while exceeding it in gaming with an 8700K.

The 9900K exceeds the 2700X even more. You pay for it. You don't have to buy it. You may not need it! You may not even be able to use it!

But it's available, and there is no competing solution from AMD. If there was, I'd likely own it already.

[I'm not interested in the 9900K beyond the novelty any more than I'm interested in the 2080Ti- I'm just pointing out that value is only a part of the consideration, and it should be further mentioned that AMD is no stranger to stratospheric CPU pricing when they claim a performance lead]


Sure, everything is pros and cons. All tech generally has some level of diminishing returns. Suffice it to say, though, most wouldn't notice much difference in any of their workloads. Much less in something like gaming. "Is it worth it?" is always in the eye of the beholder.
 
Much less in something like gaming.

This is where I'm disagreeing the most: gaming is precisely where you'd notice the difference, when there is a difference to be noticed. 15%-20% is nothing to scoff at when 'normal' is just too slow.

Now, in most other workloads, where reaction time and frame consistency aren't part of the fundamental equation, at most the difference is time to complete an operation. For consumers that's not really a big deal unless it's a really big difference (I upgraded laptops from a 7500U to an 8550U for this reason, for example), but nearly everyone can wait another second or two. If it were to the point where time and money became interchangeable, there's always Threadripper ;).
 
This is where I'm disagreeing the most: gaming is precisely where you'd notice the difference, when there is a difference to be noticed. 15%-20% is nothing to scoff at when 'normal' is just too slow.

I don't claim to know everything, so you can school me on this. I'm aware of the 15-20% gap, but my statement about people "not noticing" has to do with scaling. If you're primarily playing at 4k, you are much more likely to be limited by your GPU than by your CPU. My understanding from what I've read and seen (as let's be honest, I'm not testing every platform or piece of hardware, so I can't know first hand) is that only at 1080p is there this wide gap. So, that said, it makes far more sense for people to spend $200 less on their CPU and $200 more on their GPU (an oversimplification, obviously) if they are gaming at 4k or a good chunk of the time at 2560x1440. 1080p is totally different. To me, though, even with the 20% gain, you're already pushing so many frames it doesn't matter (for most).

To reiterate, there are some oversimplifications in my statements, and a lot of it also comes down to the complexity of what the GPU has to do (at any resolution). But that is the reason why I stated that the gains on Intel CPUs won't matter to a lot of folks even in the gaming arena. Now, if my assessment is wrong, okay, I can handle that. But some of our "argument" might just come down to what we believe is "worth it" and our generalizations about where the gains are in terms of importance. But generally, from what I've seen, the GPU will matter far more at 4k than the choice between a 2700X and a 9900K.
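The GPU-bound argument here boils down to a simple bottleneck model: the framerate you actually get is capped by the slower of the CPU and GPU. A toy sketch, with made-up numbers:

```python
# Toy bottleneck model: delivered framerate is capped by the slower
# of the CPU and GPU. All framerate numbers are made up to illustrate
# the scaling argument, not measured results.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Framerate the player actually sees."""
    return min(cpu_fps, gpu_fps)

# 1080p: the GPU has headroom, so a ~20% faster CPU shows up directly.
print(delivered_fps(cpu_fps=120, gpu_fps=200))  # 120 (CPU-bound)
print(delivered_fps(cpu_fps=144, gpu_fps=200))  # 144 (CPU-bound)

# 4k: the GPU is the cap, so the same CPU uplift changes nothing.
print(delivered_fps(cpu_fps=120, gpu_fps=60))   # 60 (GPU-bound)
print(delivered_fps(cpu_fps=144, gpu_fps=60))   # 60 (GPU-bound)
```

Which is exactly why the CPU gap shows up in 1080p benchmarks and mostly vanishes at 4k.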
 
but my statement about people "not noticing" has to do with scaling. If you're primarily playing at 4k, you are much more likely to be limited by your GPU than by your CPU. My understanding from what I've read and seen (as let's be honest, I'm not testing every platform or piece of hardware, so I can't know first hand) is that only at 1080p is there this wide gap. So, that said, it makes far more sense for people to spend $200 less on their CPU and $200 more on their GPU (an oversimplification, obviously) if they are gaming at 4k or a good chunk of the time at 2560x1440. 1080p is totally different. To me, though, even with the 20% gain, you're already pushing so many frames it doesn't matter (for most).

For perspective: 1080p benchmarks on high-end cards with high-end CPUs (and 720p benchmarks, etc.) are not generally meant to demonstrate user experience; they're meant to show two basic things: whether the system is scaling, and what would happen with an increase in GPU performance over the card reviewed. Benchmarking at a lower resolution shows you what is likely to happen if a faster GPU is dropped in, and demonstrates the system's 'legs'.

This is of course relative to the user, and it's a bit of forecasting too, but it's a worthy exercise and useful as a decision point.

Also, one thing to consider beyond framerates is frametimes: the framerates you feel. If your frametimes are not under 16ms, you're not getting a 'real' 60 FPS. GPUs generally constrain maximum framerate, but CPUs keep maximum frametimes down, or not.
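The frametime/framerate relationship is simple arithmetic: a steady N FPS requires every frame to finish within 1000/N milliseconds. A quick sketch:

```python
# Frametime budget per target framerate: budget_ms = 1000 / fps.
# A single frame over budget is felt as a stutter, even if the
# average framerate still reads the target number.
def frametime_budget_ms(target_fps: float) -> float:
    return 1000.0 / target_fps

for fps in (30, 60, 144):
    print(f"{fps:>3} FPS -> {frametime_budget_ms(fps):.1f} ms per frame")
```

So 'real' 60 FPS means every frame under ~16.7 ms, and 144 Hz leaves the CPU less than 7 ms per frame, which is where CPU differences start to bite.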
 
I honestly don't know why people discount 16% as not being a “worthy” enough of a difference to consider. But, for sake of argument, if I can process my video encoding 16% faster, I can get 16% more work units done in the same time it would take me to render on the slower system. Since my work units are paid for, that directly translates into dollars into my pocket for the life of the platform. Heck, I've already paid for the entire system with one job, so everything after the first job is bonus.

People around here may be [H]ard, but while you sit and argue over perceived value of the system, sometimes it's not just about what you've spent but it's about what you'll earn.
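The payback argument reduces to simple throughput math. A sketch, where the per-job payment, weekly throughput, and price premium are entirely hypothetical figures chosen for illustration:

```python
import math

# Break-even sketch for the "it pays for itself" argument.
# All dollar and throughput figures below are hypothetical.
speedup = 1.16           # ~16% faster, per the thread
pay_per_job = 100.0      # hypothetical payment per work unit, USD
jobs_per_week_slow = 10  # hypothetical throughput on the slower box
premium = 250.0          # hypothetical extra cost of the faster system

extra_jobs_per_week = jobs_per_week_slow * (speedup - 1.0)
extra_pay_per_week = extra_jobs_per_week * pay_per_job
weeks_to_break_even = math.ceil(premium / extra_pay_per_week)
print(f"Premium recovered after ~{weeks_to_break_even} weeks")
```

Under those assumptions the premium is recovered in a couple of weeks, and everything after that is margin for the life of the platform.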
 
I honestly don't know why people discount 16% as not being a “worthy” enough of a difference to consider. But, for sake of argument, if I can process my video encoding 16% faster, I can get 16% more work units done in the same time it would take me to render on the slower system. Since my work units are paid for, that directly translates into dollars into my pocket for the life of the platform. Heck, I've already paid for the entire system with one job, so everything after the first job is bonus.

People around here may be [H]ard, but while you sit and argue over perceived value of the system, sometimes it's not just about what you've spent but it's about what you'll earn.

Agreed. But when doing those kinds of workloads, we could just as easily start discussing much more expensive Threadripper or Skylake-X parts, so you also have a very clear, very deliberate cost cutoff. $570 was viable for you, but perhaps $1700 on a CPU wasn't (and those parts of course offer much more than a 16% increase in performance, meaning by your own logic they should also be worth considering). And even if it was within your budget, we could still argue you're not hard enough unless you spend $100k on a render farm to do your video work for you. The point being: there is always a budget, and if you can't see that for other folks, then you must apply that same logic to yourself. All you're really saying is that a certain dollar-to-performance ratio makes sense for you; that doesn't necessarily make sense for everyone else. The other unfortunate part of this conversation is that at the [H], most never discuss things outside of gaming performance.

For what it's worth, I'm with you. I don't game competitively, nor do I care much for gaming performance. However, video rendering is of chief importance to me. Still, workflow, cost, and time all must be evaluated at the individual level to determine what is "worth it".

I run into this problem all the time, especially in the Apple subforum or on YouTube, where there are a lot of gamers who don't understand why anyone would buy an iMac Pro (it starts at $5k and tops out around $13k). And indeed, it doesn't make sense at all as a gaming platform. But when you realize that it eats Red RAW 8k 3:1 compressed footage all day every day, can play it back in real time with LUTs and color grades applied, and can render complete footage at near 1:1 speed in FCPX, it becomes very obvious that it's an incredibly purpose-built tool (let alone the full-coverage DCI-P3 5k display, that it does all this near silently, and that it takes the smallest footprint as a workstation). But some don't get why that's "worth it".

Value is in the eye of the beholder. All we're discussing is simply how we are weighting those things. And I think that's a worthwhile thing to consider as we all do not have the same budget. Nor do we all actually profit from the hardware (most here are gamers, and this is all just hobby money. There are far fewer creative professionals or even miners).


EDIT: Also, for what it's worth, I'm strongly considering the i9 9900k to build a hackintosh with. But my choosing it also has to do with limitations on what I can use for that sort of system: AMD's side lacks Quick Sync, which has massive adverse effects on FCPX, not to mention it requires far more hacks/hassle to get macOS running properly. If not for those things, I would very likely consider a 2700x. But like I said, with all of these things, the considerations are different for different folks.
 
Yes, but that "pure" metric doesn't make sense either.
I don't have the exact numbers in front of me, but it's something like a 70% cost increase going from a 2700X to an i9-9900K for something like a 16% increase in performance, and only in some tests?
Generally for this sort of thing there is an easily comparable car analogy to go along with it, but at the very least, suffice it to say for a huge chunk of users and workloads, Intel's absolute performance in some tests isn't worth their poor value proposition.

(EDIT: I guess the car analogy would be like spending $250k on a Lambo Huracan versus $80k on a Skyline GTR. The absolute performance might be in the Lambo's favor, but for most, the absolute cost makes no sense, unless of course you're in an environment in which you can use that additional power in a meaningful way.) Just like in the car example, ignoring the cost just so you can have the best performance is neither logical nor feasible for everyone, especially with that level of diminishing returns.

Additionally, in highly threaded workloads AMD is still performing very competitively. If you're doing anything with video editing, rendering, or any form of compute, you'd have to be a complete fool not to at least consider AMD. To add insult to injury, most of the things Intel does beat AMD at don't even make that much difference to the end user (as Intel is basically winning in gaming, the area in which most people are far more likely to be GPU bound rather than CPU bound).
For work, you could offset a 70% higher initial investment pretty quickly, surely within the lifetime of the computer, as long as it's less than $1000 or so (which it should be). Especially if it's left stock, as power draw should be less than a Ryzen 2700X build (though maybe not by much; stock vs stock it's pretty close, under load at least).

For gaming, there's no such justification. You just have to want the extra performance enough to pay the premium. Whether you think it's worth it is personal and your opinion will be different than someone else's.
 
I evaluate content itself, not the producer of that content.

You admit the gap is ~16%, and the graph I shared shows a 17% gap. So I was sharing accurate info regardless of whether you like or dislike PCGamer.

You spin things to suit your narrative. Like your choice of the word "admit". I'm not 'admitting' that one CPU is slower than the other. I'm stating a fact that one is slower than the other. The difference is that using the word "admit" implies I'm taking a side, i.e. AMD fanboy or something. If called out on this, you'll try to escape by saying that's not what you really meant. Passive-aggressive spin. Annoying, but most folks see through it, you know.

PCGamer is a poor source. You know it, I know it. Use Anand, or Toms, or even [H]ardOCP (there's a thought, given where we are posting, right?). The gap will be roughly similar, so it shouldn't change your conclusion, and then you won't have to worry about the credibility of the source. It's like when people cite wccftech. Yeah sure, they could be right - but I would not use them as a crutch in a debate.
 
I honestly don't know why people discount 16% as not being a “worthy” enough of a difference to consider. But, for sake of argument, if I can process my video encoding 16% faster, I can get 16% more work units done in the same time it would take me to render on the slower system. Since my work units are paid for, that directly translates into dollars into my pocket for the life of the platform. Heck, I've already paid for the entire system with one job, so everything after the first job is bonus.

People around here may be [H]ard, but while you sit and argue over perceived value of the system, sometimes it's not just about what you've spent but it's about what you'll earn.

Any performance difference is worthy enough to consider unless it's within the margin of error. 16% is a pretty sizable gap, too. That's not a gap you just hand wave away. The tough sell with the 9900k is that it's not much (if any) faster in gaming than a 9700k. And the 16 core Threadripper doesn't cost that much more, but provides more multithreaded performance, generally. The 12 core Threadripper may even wind up cheaper (the 2920X) depending on supply and sale prices and such.

So the 9900k occupies a curious market position, that of someone who wants a CPU with good multithreaded horsepower, but isn't willing to pay a whole lot, but still probably wants good gaming/single-threaded performance too. Ironically, that's me. That's why I have a 2700X, it was the best value in that segment when I bought it. It's not worth changing my whole platform over, but if I were buying the parts for a full fresh build today, the 9900k would be at the top of my list, most likely.
 
So the 9900k occupies a curious market position,
Which will become even more curious mid-next year. I can't wait to see how that goes down when Intel doesn't have much of a 10nm answer for at least a year or two from now and has no SC/IPC advantage.
 
To me, it seemed like one of the Ryzen mantras was, ‘it's not much slower than an Intel processor but it's got more threads for doing more than gaming’ (the gamer with 200 web page tabs open; the streaming gamer; the gamer who crunches 4K video on the side; enter your favorite multi-threaded scenario here). The second mantra seemed to be ‘if you're not playing at 1080p you're GPU limited anyway’.

For the first mantra, we now have a relatively high-priced fast-at-gaming and good-at-multithreaded-workloads response. The second mantra is slowly dying now that Nvidia has released their RTX line (love it or hate it, and I'm in the hate-it category) and we're seeing CPU limitations in the 144/1440 arena (as predicted). For the hobbyist and casual gamer, Ryzen still holds a commanding lead on value and price. For anyone who is a 'professional' gamer, is paid for work, or both, Intel finally has a worthy contender. This is what competition can do.
 
Technically, Intel did have their 6 and 8 core Extreme Editions but those were priced high enough that you should have gone Xeon anyway. (y)
 
<bla bla bla>

PCGamer is a poor source. You know it, I know it. Use Anand, or Toms, or even [H]ardOCP (there's a thought, given where we are posting, right?). The gap will be roughly similar, so it shouldn't change your conclusion, and then you won't have to worry about the credibility of the source. It's like when people cite wccftech. Yeah sure, they could be right - but I would not use them as a crutch in a debate.

Your own words:

From what I can tell looking at all the reviews, overall performance difference between 9900k and 2700X is ~16%. In some rendering tasks, it's a bit lower of a delta. In some single-thread heavy tasks, it's a bit higher. But ~16%, overall.

The graph I gave shows 17%. I also gave another review. 17% again. Enough said.

You use your technique of splitting reviews into good and bad based on the name. I will continue using my technique of selecting reviews with good data regardless of who wrote them or where they were published. For your info, there are Anandtech or Tom's Hardware reviews that I would never cite, because I know they are wrong.
 
Your own words:
The graph I gave shows 17%. I also gave another review. 17% again. Enough said.

You use your technique of splitting reviews into good and bad based on the name. I will continue using my technique of selecting reviews with good data regardless of who wrote them or where they were published. For your info, there are Anandtech or Tom's Hardware reviews that I would never cite, because I know they are wrong.

So the 9900k is 17% faster, 7 months late, runs hotter, uses more power, and is over 80-90% more expensive than the 2700x... Oh, let's not forget it doesn't come with a retail heatsink.

Good Job Intel for catching up to AMD!
 
The graph I gave shows 17%. I also gave another review. 17% again. Enough said.

You use your technique of splitting reviews into good and bad based on the name. I will continue using my technique of selecting reviews with good data regardless of who wrote them or where they were published. For your info, there are Anandtech or Tom's Hardware reviews that I would never cite, because I know they are wrong.

You missed the point entirely. It sailed right over your head. The ~16% value (or 17% if you prefer - I don't care) is right - I keep saying it, and for some reason you ignore it and say "but it's 17%!!!" duh, man. We know. Everybody knows.

The source you are using is poor. The reasons should be obvious to you. With Anand, Toms, [H], and many other sites, the test setups are well documented. Overclocking potential is more fully explored. In most cases, FPS values are expressed for averages and minimums. Frametime data is given. When one of them makes a mistake (which can, does, and has happened - Anand made a big one back in April) it's often found quickly and called out early, since everything is documented. PCGamer doesn't do this. They are quite casual about it. They don't even test all that many games.

You used to make a big deal of Hardware.fr's superior methodology (and to be fair, hardware.fr was pretty good), before they closed down testing. That's why your use of PCGamer as a source is confusing. There is no reason to use an inferior source in this case. Did you just google this shit at random, and that was the first result? Or was there some reason you chose it? You could have used any of the above sites and they would give you a similar value. So why did you choose that one?
 
When will AMD's Zen 2 and the X570 motherboards actually be available? Any potential delays? Guesstimated prices - relative to current prices or will there be a price premium?

I'm not sure if I should wait for Zen 2 in spring of 2019 or just settle for an X470? I waited for the 9900k but it's just way too hot and expensive - even more so when one considers liquid cooling - just not worth it.
 
When will AMD's Zen 2 and the X570 motherboards actually be available? Any potential delays? Guesstimated prices - relative to current prices or will there be a price premium?

I'm not sure if I should wait for Zen 2 in spring of 2019 or just settle for an X470? I waited for the 9900k but it's just way too hot and expensive - even more so when one considers liquid cooling - just not worth it.

Meh, if I were building today... that 9900k is pretty sweet. And the 2700X doesn't run all that cool either. Don't let the heat/cooling scare you. It's a beast!

AMD has held its release cadence for CPUs twice already. If they do a third time (seems likely), then you are looking at April for Zen 2 mainstream and X570. Prices... probably broadly similar to now, unless Zen 2 is particularly awesome in performance, in which case it might run a bit more.
 
Meh, if I were building today... that 9900k is pretty sweet. And the 2700X doesn't run all that cool either. Don't let the heat/cooling scare you. It's a beast!

AMD has held its release cadence for CPUs twice already. If they do a third time (seems likely), then you are looking at April for Zen 2 mainstream and X570. Prices... probably broadly similar to now, unless Zen 2 is particularly awesome in performance, in which case it might run a bit more.

The nice part about AMD is that the X470 chipset will also work with Zen 2. So no need to upgrade to X570!
 
I honestly don't know why people discount 16% as not being a “worthy” enough of a difference to consider. But, for sake of argument, if I can process my video encoding 16% faster, I can get 16% more work units done in the same time it would take me to render on the slower system. Since my work units are paid for, that directly translates into dollars into my pocket for the life of the platform. Heck, I've already paid for the entire system with one job, so everything after the first job is bonus.

People around here may be [H]ard, but while you sit and argue over perceived value of the system, sometimes it's not just about what you've spent but it's about what you'll earn.
That doesn't make much sense. The 9900K costs too much for the gain it brings. Your argument would make sense if it were the top choice, but there are faster CPUs for rendering that aren't that much more expensive.
 
That doesn't make much sense. The 9900K costs too much for the gain it brings. Your argument would make sense if it were the top choice, but there are faster CPUs for rendering that aren't that much more expensive.

You're initiating a circular argument here. Defenders of Ryzen have (almost) always defended the slower cores by claiming that 'not everyone games' or 'not everyone exclusively games' so the extra cores are 'worth it'. Intel now has a faster, higher core count part. Simply stating that there are more expensive parts out there that are faster doesn't negate the original point: Intel now has a faster gaming part that is no longer as vulnerable to the 'but Ryzen does more work' argument.

My secondary argument is that value is relative; specifically if you get paid or not for your work. If you don't, then Intel may not have the best value for you. If you do, you can easily justify the extra cost because it will pay for itself. It's personally not worth it to me because I game on my gaming computer and I work on my workstations. However, I used a game/work scenario to show how easily I could justify building a machine that worked as well as it gamed.
 