AMD Ryzen 9 3900X Review Round-Up

The whole gaming argument is still bogus in my opinion. The only area where you'll get anything going Intel at this point is if you are a low-resolution, very-high-frame-rate gamer. Anyone else gaming at 1440p+ these days won't see any benefit from going 9900K over the 3800X/3900X when it comes to gaming. The vast majority are going to be better off spending that $500 on the 3900X and getting superior MT performance going forward.

I agree with this statement, but to be honest, this is the way young gamers are trending these days: sacrificing image quality, resolution and all sorts of things just to chase higher framerates for a perceived edge in competitive multiplayer games. It's quite dumb, but that's where gaming has headed for the majority of young people. It's not about the experience anymore; it's all about beating others in online games and becoming the next great streamer. That's all they care about.

Now I question the benefit of super high framerates, but it almost doesn't matter if there is a real quantifiable benefit to them or not. It's all about the perception.
 
It could be so. Although AMD did not have a clock regression at 7nm.

Evidently that was a surprise even to AMD. It looks like Intel's current 10 nm parts may not reach the same peak clocks (judging by the upcoming laptop parts), and those seem to track relatively closely with the Desktop parts in that respect. 8th gen laptop parts have a peak only 200 MHz lower than their desktop counterparts, yet the fastest Ice Lake part is sitting at 3.9 GHz single core boost, 900 MHz down from Coffee Lake.
 
I really don't get the super-high-refresh-rate-at-low-res argument. I don't want my game to look like a goddamn potato while I am playing it. I have had a high refresh panel at 1080p, and I love my bog-standard 1440p with great color reproduction far more, and I can still kill people just fine in a first-person shooter. Your connection to the server is far more important than your actual frame rate.
 
I agree with this statement, but to be honest, this is the way young gamers are trending these days: sacrificing image quality, resolution and all sorts of things just to chase higher framerates for a perceived edge in competitive multiplayer games. It's quite dumb, but that's where gaming has headed for the majority of young people. It's not about the experience anymore; it's all about beating others in online games and becoming the next great streamer. That's all they care about.

Now I question the benefit of super high framerates, but it almost doesn't matter if there is a real quantifiable benefit to them or not. It's all about the perception.

People have been trying to get the highest framerate possible in competitive multiplayer since competitive multiplayer was a thing.
 
People have been trying to get the highest framerate possible in competitive multiplayer since competitive multiplayer was a thing.

Yeah, but competitive multiplayer was always a relative niche until the stupid streaming era. Now it is dominant to the point where almost nothing else matters. I'm surprised we even get any single player games anymore.
 
Yeah, but competitive multiplayer was always a relative niche until the stupid streaming era. Now it is dominant to the point where almost nothing else matters. I'm surprised we even get any single player games anymore.

If you thought competitive gaming was a niche you weren't paying attention. Q3A, UT, CS, etc were MASSIVELY popular games.
 
If you thought competitive gaming was a niche you weren't paying attention. Q3A, UT, CS, etc were MASSIVELY popular games.

I know. I played CS for years (~1999 to ~2005) and ran 3 of the most popular public servers on the East coast for many years.

When you said "competitive" I thought you meant the narrow subset of people who actually played it in actual competitions, not just for fun.

At least in my time we were still trying to enjoy the game, took whatever graphical quality we could get, and weren't trying to manipulate and disable whatever graphics settings we could in some misguided attempt to get some ridiculous framerate.

One stupid trend I see kids doing now is playing at a 4:3 resolution stretched to 16:9 in some misguided effort to make it easier to aim and get an advantage.

The "gamer" kids these days are so dumb it just makes me want to repeatedly bash my head against a wall.
 
Evidently that was a surprise even to AMD. It looks like Intel's current 10 nm parts may not reach the same peak clocks (judging by the upcoming laptop parts), and those seem to track relatively closely with the Desktop parts in that respect. 8th gen laptop parts have a peak only 200 MHz lower than their desktop counterparts, yet the fastest Ice Lake part is sitting at 3.9 GHz single core boost, 900 MHz down from Coffee Lake.

Keep in mind that Intel has effectively abandoned 10nm. I don't even know why people keep bringing it up regarding Intel CPUs since it was Intel itself which said that 7nm was the future after the 10nm flop. Just because Intel is coming out with some 10nm parts doesn't mean they're putting much into 10nm. They already have the process and conversions at fabs have already been done so they need to make some use of what they currently have. At the moment that means low clock, low power and low core count parts with likely low yields. If Intel had anything truly good for the desktop from 10nm they would be shouting it from the rooftops so loudly the dead could hear.

Simply put, Intel won't have anything new and in volume until 7nm, and that assumes no problems with Intel's 7nm. To think otherwise would require that you assume that Intel, one of the largest semiconductor fabricators in the world, was totally wrong about how flawed its own 10nm process is.

Personally, I think AMD has time to fix or refine the current Zen architecture enough to increase clocks a bit more before Intel has anything out that's actually new and in volume. I would absolutely love it if somehow AMD would be able to pull out another Thoroughbred-A to Thoroughbred-B stepping refinement. I don't expect anything like that to happen by any means but it would be nice to see it.
 
Simply put, Intel won't have anything new and in volume until 7nm, and that assumes no problems with Intel's 7nm. To think otherwise would require that you assume that Intel, one of the largest semiconductor fabricators in the world, was totally wrong about how flawed its own 10nm process is.

I mean, how long did they hang onto that "flawed" process again? The fact that they won't produce anything really meaningful on it doesn't change the fact that they will likely still see clock regressions moving to 7 nm as well. AMD's result was the exception, not the rule.
 
I know. I played CS for years (~1999 to ~2005) and ran 3 of the most popular public servers on the East coast for many years.

When you said "competitive" I thought you meant the narrow subset of people who actually played it in actual competitions, not just for fun.

At least in my time we were still trying to enjoy the game, took whatever graphical quality we could get, and weren't trying to manipulate and disable whatever graphics settings we could in some misguided attempt to get some ridiculous framerate.

One stupid trend I see kids doing now is playing at a 4:3 resolution stretched to 16:9 in some misguided effort to make it easier to aim and get an advantage.

The "gamer" kids these days are so dumb it just makes me want to repeatedly bash my head against a wall.

The "professional" players generally aren't hanging around in public servers. The wanna-be professionals do and those types have been around for quite a long time.

I remember when the wanna-bes would bitch and moan about widescreen being "cheating" and even ran across several that would argue games should be locked to some random low framerate (likely whatever their system could hit) in order to provide an "even playing field" or some such nonsense.
 
I never really understand these "last year's performance, but today!" arguments either. The vast majority of PC users are probably 4+ years behind on CPUs/GPUs anyway, so why does it really matter? Especially if the company that caught up is offering that performance for cheaper, along with increased performance in other areas. I get it, Ryzen ain't for everyone, but some of the mental gymnastics people go through to downplay the whole lineup is staggering.

Pretty much. If you're in the "best performance regardless of cost" crowd and always need to stay on the bleeding edge, then please look away and stop trolling these threads. You knew before the launch that AMD wasn't going to beat Intel outright in everything, so I'm not sure why you're even here.

For everybody else (me included) who's sick of paying Intel's ludicrous prices and was holding out for something better, this is the launch we were looking forward to. Yes, sure, we're "only" getting 7920X performance 2 years later, but we also saved $700 that could be put towards a very nice GPU upgrade. Different strokes for different folks, yada yada. The 9900K still has the gaming crown, so be happy, don't be salty lol
 
At least in my time we were still trying to enjoy the game, took whatever graphical quality we could get, and weren't trying to manipulate and disable whatever graphics settings we could in some misguided attempt to get some ridiculous framerate.

One stupid trend I see kids doing now is playing at a 4:3 resolution stretched to 16:9 in some misguided effort to make it easier to aim and get an advantage.

The "gamer" kids these days are so dumb it just makes me want to repeatedly bash my head against a wall.

I've been playing CS for almost 20 years now. I play at 4:3 stretched just like these dumb gamer kids; it stretches out the head models and makes them easier for me to see, not necessarily hit. The added FOV of 16:9 doesn't help with the narrow angles you hold on most maps, so I prefer the larger models.

Granted, I started doing 4:3 stretched because my first widescreen setup stretched automatically and I never figured out how to stop that, so it could just be what I'm used to and kids copy their elders. I don't consider this one dumb.

And lastly, back to the conversation: low settings and high frame rates have been a thing for as long as online shooters have. I think the issue is most single-player-type gamers are just now noticing it exists. Settings are really game and perception dependent. I like eye candy in Battlefield or Call of Duty, but in CS or PUBG I want 144 fps minimum and can't stand any jerkiness, so I will gladly drop settings to get there (though PUBG really benefits from 1440p, high AA and textures). The 3700X is more than enough for my tastes, but I can understand why the 240 Hz crowd may still want a 9900K until Ryzen gets refreshed.
 
I played the shit out of older twitchy shooters. That being said, older me can't notice the difference in refresh rates once I get to around 90-100 Hz/fps. I realize some people claim that they can perceive/utilize more, but I think most people claiming they can are full of shit.
 
I played the shit out of older twitchy shooters. That being said, older me can't notice the difference in refresh rates once I get to around 90-100 Hz/fps. I realize some people claim that they can perceive/utilize more, but I think most people claiming they can are full of shit.

You definitely can, in some situations, though from 90-100 you'd probably need to hit around 180-200 to get any real perceptible difference in games. You need to effectively double your framerate/hertz each time, with diminishing returns for every jump. 30 to 60 is massive. 60 to 90 is noticeable, but 60 to 120 is the real jump. Then 120 to 240 has a much smaller impact, at least as far as games go.
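For what it's worth, the diminishing returns are easy to see if you look at frame times instead of frame rates. A minimal sketch (plain Python, using the refresh rates discussed above; the numbers are only illustrative):

```python
# Frame time (ms) at each refresh rate, and how many ms each jump saves.
# Shows why 30 -> 60 feels huge while later jumps are far subtler:
# each step shaves time off an already-small frame time.
rates = [30, 60, 90, 120, 240]

for prev, cur in zip(rates, rates[1:]):
    t_prev = 1000.0 / prev   # frame time in milliseconds
    t_cur = 1000.0 / cur
    saved = t_prev - t_cur   # how many ms sooner each frame arrives
    print(f"{prev:3d} -> {cur:3d} Hz: {t_prev:5.1f} ms -> {t_cur:5.1f} ms "
          f"(saves {saved:4.1f} ms per frame)")
```

Going by frame time, 30 to 60 saves about 16.7 ms per frame, while 120 to 240 only saves about 4.2 ms, which lines up with the "diminishing returns for every jump" point.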
 
Funny thing is, some of those arguing about low res and how they need the 9900K to push the fps are the same ones yelling at people that RTX is the next best thing and everyone has to have it. It literally makes no sense at all.
 
Just for shits I benched my now ancient 4930K @ 4.5 GHz in Cinebench R20:

[attachment: Cinebench R20 screenshot]

The single core score is identical to that of a Ryzen 5 1600, while the multicore score is a smidge better (from TechSpot):

So you got me to try my i7 4770K, never used Cinebench before, was not expecting much, but if this is correct (425pts Single core) I'd say I'm a bit surprised. Was not planning on upgrading until maybe Zen 3 7nm+, hopefully better clocks by that refresh.

[screenshot: Cinebench R20 result]
 
We all just have to admit that AMD won't beat Intel until they get their chiplets down to this size.
 

[attached image]

So you got me to try my i7 4770K, never used Cinebench before, was not expecting much, but if this is correct (425pts Single core) I'd say I'm a bit surprised. Was not planning on upgrading until maybe Zen 3 7nm+, hopefully better clocks by that refresh.

[attachment: Cinebench R20 screenshot]

I'm kind of curious where my overclocked 3930K falls. I'm betting it's just old enough now that anything new I'd get would be a rather large advantage.

I haven't tried Cinebench since like R11.5 or something like that.
 
Looks like AnandTech posted their retested results for the 3900X this morning (they are still working on the 3700X).



Seems like the impact was smaller, at least for them, than I had hoped.

Still no mention of whether the BIOS impacted the ability of PBO+AutoOC to get higher clocks.

I'm looking forward to Dan_D's testing.

Working on it. Unfortunately, I don't have a fix for the boost clock on my test setup yet. I've reached out to MSI on that but haven't heard back yet.
 
It looks like the IPC gain is a genuine clock-for-clock improvement rather than one driven by higher clock speeds, which is rare to see. While the max frequency is slightly down on expectations, the performance is genuinely there despite the offset, so that is very impressive from AMD's side.

That power usage... damn, son. You're gonna be able to buy a lot of alcohol with the power savings from not overclocking the 9900K to 12 GHz and pulling a MW at the wall.
 
Interestingly, the difference seems to be more down to Intel's higher clock speeds than IPC now. If AMD could have found a way to increase Zen 2's clocks to near 9900K levels it might have wiped the floor with it in gaming. Or, at the very least, got much closer.

I think you're seeing the clockspeed regression AMD talked about with 7nm. Intel will have the same wall when they transition. The question is will they be able to increase IPC enough to compensate.
 
I think you're seeing the clockspeed regression AMD talked about with 7nm. Intel will have the same wall when they transition. The question is will they be able to increase IPC enough to compensate.

Any link to AMD talking about said regression at 7nm? I missed that. I just assumed they jumped onto a brand new and immature node with a heavily modified new architecture, resulting in the "poor" (less than expected?) clocks. Feels like a "tick and tock" at the same time. Hoping to see refresh chips next year that can crank out a few hundred more MHz; it doesn't have to be 5 GHz, but 4.6-4.7 would be tremendous.
 
Any link to AMD talking about said regression at 7nm? I missed that. I just assumed they jumped onto a brand new and immature node with a heavily modified new architecture, resulting in the "poor" (less than expected?) clocks. Feels like a "tick and tock" at the same time. Hoping to see refresh chips next year that can crank out a few hundred more MHz; it doesn't have to be 5 GHz, but 4.6-4.7 would be tremendous.

I can't speak to any AMD statements, but it is pretty common knowledge that as feature sizes get really small, clock speeds tend to be more difficult to achieve. It is different from, but related to, Dennard scaling and the breakdown thereof, which probably started at about 65nm just over 10 years ago.

Dennard scaling used to state that as transistors get smaller, their power density stays constant, so that the power use stays in proportion with area; both voltage and current scale (downward) with length.

This held widely until 65nm came along, when we started to see the first signs that scaling was no longer linear. Each die shrink started giving us more and more diminishing returns.

This is largely due to current leakage when the features are so small. This same current leakage - I believe - is what makes higher clock speeds more difficult.

I suspect that at some point, smaller process nodes will not provide higher performance, but rather only be useful for more power savings.

We might have a time when older, larger nodes are used for higher power high performance desktop parts, and smaller nodes are only for mobile stuff.
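As a rough illustration of the classical scaling rule mentioned above (ideal Dennard scaling with a linear shrink factor k and dynamic power P = C·V²·f; this is a textbook sketch, not a model of any real node, and it deliberately ignores the leakage that broke the rule):

```python
# Idealized Dennard scaling: shrink linear dimensions by a factor k (< 1).
# Capacitance, voltage and current all scale down with k, frequency scales
# up by 1/k, so dynamic power per transistor falls by ~k^2 while area also
# falls by ~k^2, leaving power density roughly constant. Leakage current
# (ignored here) is what caused this to break down around 65nm.
def dennard_scale(C, V, f, k):
    """Return (C, V, f, dynamic_power, area_factor) after an ideal shrink by k."""
    C2, V2, f2 = C * k, V * k, f / k
    power = C2 * V2**2 * f2          # dynamic power ~ C * V^2 * f
    return C2, V2, f2, power, k**2

C, V, f = 1.0, 1.0, 1.0              # normalized starting values
power0 = C * V**2 * f
_, _, f2, power1, area = dennard_scale(C, V, f, k=0.7)

print(f"frequency: {f:.2f} -> {f2:.2f}")
print(f"power:     {power0:.2f} -> {power1:.2f}")
print(f"power density ratio: {power1 / power0 / area:.2f}  (constant in the ideal case)")
```

Once leakage stops voltage from scaling down with k, that k² power reduction disappears, which is exactly the diminishing-returns picture described above.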
 
Any link to AMD talking about said regression at 7nm? I missed that. I just assumed they jumped onto a brand new and immature node with a heavily modified new architecture, resulting in the "poor" (less than expected?) clocks. Feels like a "tick and tock" at the same time. Hoping to see refresh chips next year that can crank out a few hundred more MHz; it doesn't have to be 5 GHz, but 4.6-4.7 would be tremendous.


"AMD's "Working Beyond Moore's Law" initiative involves developing several technologies to circumvent the diminishing point of returns from new process nodes. Norrod explained that AMD is using "every trick in the book" to circumvent the challenges because the two simple levers of density and frequency improvements have reached a diminishing point of returns. In some cases, frequency is even regressing."

From a tomshardware link: https://www.tomshardware.com/news/amd-3d-memory-stacking-dram,38838.html
 
I played the shit out of older twitchy shooters. That being said, older me can't notice the difference in refresh rates once I get to around 90-100 Hz/fps. I realize some people claim that they can perceive/utilize more, but I think most people claiming they can are full of shit.

You get diminishing returns as you go higher up, but the difference is there. I'm sure older me can't tell the difference as well as younger me either, but that's because my eyesight and reflexes aren't as sharp and I don't get to game 12-16 hours a day, sleep, wake up, and repeat.

It was jarring going from CRTs to the initial crappy LCDs, and I used to mention the "stuttering" (low refresh rate) to people. You might be more likely to notice the difference if you start playing at 240 Hz with a card that can maintain the FPS, and then go back down to 120 Hz or lower.
 
Any link to AMD talking about said regression at 7nm? I missed that. I just assumed they jumped onto a brand new and immature node with a heavily modified new architecture, resulting in the "poor" (less than expected?) clocks. Feels like a "tick and tock" at the same time. Hoping to see refresh chips next year that can crank out a few hundred more MHz; it doesn't have to be 5 GHz, but 4.6-4.7 would be tremendous.
I thought I read something from AMD regarding their expectations of slower (than Zen+) clock speeds for 7nm Zen 2 CPUs, but I cannot find it now.
I did find this interesting, though, and there are a couple of AMD quotes in it: reddit.com r/Amd/comments/a8b5aa/an_analysis_of_expected_7nm_clock_speeds/
I can't seem to link reddit without the forum converting the link into a hidden MEDIA element.
 
This is only at 4.4 GHz; I will try 4.7 GHz later. Either way, it really shows how far ahead Ryzen is now.

Keep in mind that Intel's Core i9 9900K can hit 510 in Cinebench R20's single thread test. Ryzen is massively far ahead in the multithread test with even my baby Threadripper crushing the test, but single thread is still very close between them.
 
So you got me to try my i7 4770K, never used Cinebench before, was not expecting much, but if this is correct (425pts Single core) I'd say I'm a bit surprised. Was not planning on upgrading until maybe Zen 3 7nm+, hopefully better clocks by that refresh.

[attachment: Cinebench R20 screenshot]

What OC is your 4770K at? If it's 4.5 or less it means my chip is severely underperforming.

I'm kind of curious where my overclocked 3930K falls. I'm betting it's just old enough now that anything new I'd get would be a rather large advantage.

I haven't tried Cinebench since like R11.5 or something like that.

Should be within 2-3% of 4930K if at the same frequency. SB to IB was on average 3% IPC improvement IIRC.
 
Any link to AMD talking about said regression at 7nm? I missed that. I just assumed they jumped onto a brand new and immature node with a heavily modified new architecture, resulting in the "poor" (less than expected?) clocks. Feels like a "tick and tock" at the same time. Hoping to see refresh chips next year that can crank out a few hundred more MHz; it doesn't have to be 5 GHz, but 4.6-4.7 would be tremendous.
I wouldn't call it a regression; I would call it a smaller clock speed bump. I think we can expect another 200-300 MHz from clock increases. I believe TSMC estimated the clocks would be higher at first, but it looked like they weren't going to get where they wanted. But the IPC improvement AMD made is almost like Zen running close to 5 GHz.

7nm+ is what 7nm should have been; it looks like that is going to get them to the original 7nm targets. 7nm+ with EUV is going to give 10% more performance or 15% more power savings, so AMD can technically get another 300-400 MHz at the same power. We will see, but I expect them to keep shooting for IPC gains and a few hundred MHz of clock improvement.
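A quick back-of-the-envelope check on that estimate (assuming current Zen 2 boost clocks of roughly 4.2-4.6 GHz and taking the quoted ~10% iso-power frequency uplift at face value; both numbers are assumptions, not anything AMD or TSMC has committed to for specific parts):

```python
# Rough sanity check: a ~10% iso-power frequency uplift on top of current
# Zen 2 boost clocks works out to roughly +400-460 MHz, i.e. in the same
# ballpark as the "another 300-400 MHz" guess above.
uplift = 0.10                        # quoted 7nm+ (EUV) frequency improvement
for boost_ghz in (4.2, 4.4, 4.6):    # assumed range of current boost clocks
    gain_mhz = boost_ghz * uplift * 1000
    print(f"{boost_ghz:.1f} GHz + {uplift:.0%} = +{gain_mhz:.0f} MHz "
          f"-> {boost_ghz * (1 + uplift):.2f} GHz")
```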
 
What OC is your 4770K at? If it's 4.5 or less it means my chip is severely underperforming.



Should be within 2-3% of 4930K if at the same frequency. SB to IB was on average 3% IPC improvement IIRC.

It's 4.5.

[screenshot: Cinebench R20 result]
 
I saw an improvement in normal day to day tasks when I went from my 5820k system to my 2700x build. It shouldn't have been an improvement, but I can tell you that it was. Games ran exactly the same, but other tasks were better. This was with the 5820k clocked at 4.3ghz.
 
Man, the 3900X is a monster in many applications that are not games. It is the better chip in all applications that are not games. It beats Intel in so many tests that are not games. So I see that the 9900K is at $449, 50 dollars less than the 3900X. I see the X570 platform starting at $300, in most cases more than that. I see every review site claiming the 3900X is the greatest, except for games. I see everyone on this forum talking about content creation as if content creators are the majority of PC builders/enthusiasts (really?). This is a joke of a launch. It is more expensive to build an AMD system than an Intel system for less gaming performance. I wanted AMD to dominate, and they did not. Please, AMD, take one performance crown.

Do not come in talking about how the 3900X is the better chip unless you can prove you are a content creator with livestreams or actual content; otherwise you are a shill.

Me, I just do a build every few years with the best available. My first build ever was an Athlon XP and a 9800 Pro. That was a disruption for many years. This is a whiff.
 
Man, the 3900X is a monster in many applications that are not games. It is the better chip in all applications that are not games. It beats Intel in so many tests that are not games. So I see that the 9900K is at $449, 50 dollars less than the 3900X. I see the X570 platform starting at $300, in most cases more than that. I see every review site claiming the 3900X is the greatest, except for games. I see everyone on this forum talking about content creation as if content creators are the majority of PC builders/enthusiasts (really?). This is a joke of a launch. It is more expensive to build an AMD system than an Intel system for less gaming performance. I wanted AMD to dominate, and they did not. Please, AMD, take one performance crown.

Do not come in talking about how the 3900X is the better chip unless you can prove you are a content creator with livestreams or actual content; otherwise you are a shill.

Me, I just do a build every few years with the best available. My first build ever was an Athlon XP and a 9800 Pro. That was a disruption for many years. This is a whiff.
I went ahead and grabbed a 9900KF for $440, as all I care about is getting the best upgrade for games. I was tempted to go 3900X or maybe even 3700X, but none are available anyway and I am ready to build a PC this week. Plus, the 3900X is going to need some BIOS updates and a few weeks to work out the kinks.
 
Man, the 3900X is a monster in many applications that are not games. It is the better chip in all applications that are not games. It beats Intel in so many tests that are not games. So I see that the 9900K is at $449, 50 dollars less than the 3900X. I see the X570 platform starting at $300, in most cases more than that. I see every review site claiming the 3900X is the greatest, except for games. I see everyone on this forum talking about content creation as if content creators are the majority of PC builders/enthusiasts (really?). This is a joke of a launch. It is more expensive to build an AMD system than an Intel system for less gaming performance. I wanted AMD to dominate, and they did not. Please, AMD, take one performance crown.

Do not come in talking about how the 3900X is the better chip unless you can prove you are a content creator with livestreams or actual content; otherwise you are a shill.

Me, I just do a build every few years with the best available. My first build ever was an Athlon XP and a 9800 Pro. That was a disruption for many years. This is a whiff.

X570 does not start at $300. You can use first-gen Ryzen boards for Zen 2 CPUs if you want. If you don't care about X570 features, a $100 B450 board will run the 3900X with absolutely zero problems, even overclocked. Do you seriously believe that most people who buy hardware are gamers? Non-game applications make up far more of the PC market than games do. AAA PC gaming is a big market, but it's pretty small compared to professional fields. There is more to life, and computers, than video games. And no, people are not only talking about those who "livestream," even though that is a quickly growing market. If that is all you see, then you aren't paying attention.
 
X570 does not start at $300. You can use first-gen Ryzen boards for Zen 2 CPUs if you want. If you don't care about X570 features, a $100 B450 board will run the 3900X with absolutely zero problems, even overclocked. Do you seriously believe that most people who buy hardware are gamers? Non-game applications make up far more of the PC market than games do. AAA PC gaming is a big market, but it's pretty small compared to professional fields. There is more to life, and computers, than video games. And no, people are not only talking about those who "livestream," even though that is a quickly growing market. If that is all you see, then you aren't paying attention.

It was at launch, for two days. And first-gen boards? The platform's only upgrade is PCI Express 4.0, which this site's only reviewer said had zero advantage in performance and may offer some speed with NVMe drives experiencing bottlenecks, maybe.

That same reviewer, who used to be at HardOCP, proclaimed Intel to still have the gaming crown. I did not want that; I did not want the 3900X to be beaten by the 9900K, which, as a chip and platform, can be had for cheaper now. It just happened.
 
Yeah, but competitive multiplayer was always a relative niche until the stupid streaming era. Now it is dominant to the point where almost nothing else matters. I'm surprised we even get any single player games anymore.

I wholeheartedly agree. I have been trying to find player comparisons of the Radeon Image Sharpening feature on the 5700s, and all I have found are testimonials about Radeon Anti-Lag being awesome. Even though I like AMD, I have to call BS, because such a small difference has to be imperceptible to most people (it's up to 20 ms at most, usually less than that).

Edit: my example is that I was looking for a new driver feature affecting the image quality of the new GPUs and could only find the almost-imperceptible pro-gaming stuff being tested.
 
It was at launch, for two days. And first-gen boards? The platform's only upgrade is PCI Express 4.0, which this site's only reviewer said had zero advantage in performance and may offer some speed with NVMe drives experiencing bottlenecks, maybe.

That same reviewer, who used to be at HardOCP, proclaimed Intel to still have the gaming crown. I did not want that; I did not want the 3900X to be beaten by the 9900K, which, as a chip and platform, can be had for cheaper now. It just happened.

The 9900K is not cheaper when you include platform cost. As stated, you can use a $100 last-gen board for the 3900X.
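A rough tally along those lines, using the prices that have come up in this thread; the Z390 board price is purely an assumed placeholder (no Z390 price was quoted here), so treat the totals as illustrative rather than definitive:

```python
# Illustrative platform-cost comparison. CPU prices and the $100 B450 figure
# come from this thread; the Z390 board price is an assumption.
builds = {
    "3900X + B450 (last-gen board)": {"cpu": 499, "board": 100},
    "9900K + Z390 (assumed price)":  {"cpu": 449, "board": 180},
}

for name, parts in builds.items():
    total = parts["cpu"] + parts["board"]
    print(f"{name}: ${total}")
```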

I’d like to know what games you play at what resolutions when you claim the 9900k is better for gaming. If it’s anything above 1080p it’s a wash.
 