AMD Ryzen R9 3900X Review Round Up

Which CPU do you have? MSI said their 3700X was working right. I have only tested the Ryzen 9 3900X so far.
3900X. Out of the box I saw single-core boosts of 4.5 GHz+, but after the AB update nowhere near that. It never really jumps above 4,250 MHz in Ryzen Master under a single-core load. I tested before and after several times with the CPU-Z bench.

Looks like AMD_Robert posted about voltages and such on Reddit, and there was another user with the same setup as mine who updated to the latest BIOS on a 3900X, lost the boost speeds with everything on auto, and went from 4.5 GHz+ to a hard wall at 4.3 GHz boost. Almost exactly the same thing as me, and he also saw the max boost on the out-of-the-box BIOS on the Crosshair VIII Hero.
 
I think MSI is probably just buying time. My Asus Crosshair VIII Hero WiFi has the AB BIOS released on 7/5, but out of the box it shipped with an earlier BIOS. Here are my results: out of the box my CPU was boosting close to 4.5 GHz+ on a single core. I wanted to test whether this was indeed an issue with the 1.0.0.3 BIOS, so I updated, knowing going in that I wouldn't be able to downgrade because Asus didn't have the out-of-the-box BIOS on their site. After the update my boost clock, even on a single core, was 200 MHz slower. So I am sure the 1.0.0.3 code needs tweaks; there is definitely something in the code that is messing up the stock boost behavior I had out of the box.

After the update it mostly saw 4,200 MHz single-core boost. I went into the BIOS and set +150, and now it boosts to about 4,275-4,300, so the boost OC is definitely working; I didn't see much from +200. But the new BIOS code most definitely hurt the single-core boost. My all-core boost went down as well, from about 4.2 GHz in AIDA64 to around 4,050. So there looks to be an overall boost-clock reduction of about 200 MHz here. They really need to keep working on the BIOS; the older BIOS most definitely boosted right.
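For anyone comparing boost behavior before and after a BIOS update, here's a rough sketch of how I'd summarize logged clocks. The helper and the sample numbers are made up for illustration; feed it whatever per-core frequency samples your monitoring tool exports:

```python
# Summarize per-core frequency samples (MHz) logged during a single-core load.
# The log data below is made up for illustration, not real measurements.

def summarize_boost(samples):
    """samples: dict of core id -> list of MHz readings.
    Returns (core with the highest observed clock, that clock)."""
    peak = {core: max(readings) for core, readings in samples.items()}
    best_core = max(peak, key=peak.get)
    return best_core, peak[best_core]

# Example: core 2 is the favored core hitting the highest boost.
log = {
    0: [3800, 4050, 4100],
    1: [3800, 3900, 4000],
    2: [4200, 4250, 4275],
}
core, mhz = summarize_boost(log)
print(f"best single-core boost: {mhz} MHz on core {core}")
```

Log for a minute or two under a single-threaded load so you actually catch the peak; a single snapshot will usually miss it.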

I think my original BIOS had AGESA code ending in 0.7.2.
Should've dumped the bios to disk from a linux live cd. No guarantee you could reflash it later, but at least you'd have it.
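If you ever do dump it, it's worth reading the flash twice and comparing, since a flaky read leaves you with a useless backup. A sketch, assuming flashrom from a live environment (the helper function is mine, not part of flashrom):

```shell
# From a Linux live CD, dump the board's flash twice with flashrom, e.g.:
#   sudo flashrom -p internal -r dump1.rom
#   sudo flashrom -p internal -r dump2.rom
# Then compare the two reads byte for byte; a mismatch means don't trust it.
verify_dump() {
  if cmp -s "$1" "$2"; then
    echo "dumps match"
  else
    echo "dumps differ"
  fi
}
```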
 
Just going to say this once: After updating an Asus BIOS, press that Clear CMOS button on the back I/O panel to ensure the BIOS gets reset with the proper HIDDEN configuration settings pertaining to the new BIOS version. Then go into the BIOS and configure as desired.

My C7H wifi would not listen to any DDR4 voltage setting I configured in the BIOS after a BIOS update and was therefore stuck at 2933 MHz maximum due to lack of memory voltage. Resetting to factory defaults/BIOS defaults/F5 or whatever it is called inside the BIOS did NOT work. I was this close > < to an RMA when I pressed the Clear CMOS button, configured memory speeds/voltage again and it finally took. First time I've ever seen this kind of thing, but from now on, I will ALWAYS press the Clear CMOS button after a BIOS update.
 
How would you say a 3600X + RX 5700 compares to a similar Intel/Nvidia pairing? These higher-end parts look good, but I may need to build a system a step or two down from there. Focused on 1080p gaming mostly.

Crushes it, mostly because it crushes Intel. I quite like the RX 5700s; they surprised me way more than Zen 2... that said, I had high hopes for Zen 2 and it met my expectations.

The 3900X runs on a B350 chipset as well. Someone had asked if the X370 was going to be compatible... :D

Yes, writing from an AB350M Pro4 with a 3900X!
 
I never really understand these "last year's performance, but today!" arguments either. The vast majority of PC users are probably 4+ years behind on CPUs/GPUs anyway, so why does it really matter? Especially if the company that caught up is offering that performance for cheaper, along with increased performance in other areas? I get it, Ryzen ain't for everyone, but some of the mental gymnastics people go through to downplay the whole lineup is staggering.

Anyways, I see these CPUs doing great in the future. The majority of PC users don't have a dedicated gaming PC, and sacrificing a few percent in specific gaming loads to save some cash and get better general performance is going to be an obvious choice. I'm already sold on the platform (thanks for the reviews!), but I'm going to hold out until possible holiday sales (also to let the platform mature a little). I have a 1600 at the moment, so maybe a 3600, up to the 3800X max.

On a side note, anyone have any experience with the Aorus Ultra/Master? I planned on sticking with Gigabyte, and those two looked good for the NVMe support.

Well said!!!

I’m looking at those motherboards as well. Waiting for some real reviews.
 
Last year's performance (or technology), today, all at reasonable prices is what makes all the difference.

Nobody codes mainstream software or games for hardware only the <1% can use.

When these 8-core, high-IPC chips become mainstream, only then will the majority of software/game developers have an incentive to find innovative and creative ways to exploit this kind of technology.

Same with any 'standard' for ray tracing. You'll need low-end GPUs to support it before you see massive and widespread adoption across games.
 
I don't have those yet, but I would like to check them out at some point.

Have you run any of the new CPUs on the X470 Gaming Pro Carbon yet? I want to upgrade, but I haven't heard from anyone yet whether there are any issues. I know some of the B450s are having issues.
 
The 9900K is not cheaper when you include platform cost. As stated, you can use a $100 last-gen board for the 3900X.

I’d like to know what games you play, at what resolutions, when you claim the 9900K is better for gaming. If it’s anything above 1080p, it’s a wash.
You can use a $100 "last gen" motherboard with the 9900K, as well.

https://www.amazon.com/dp/B0798C43D5

 
As an Amazon Associate, HardForum may earn from qualifying purchases.
You can use a $100 "last gen" motherboard with the 9900K, as well.

https://www.amazon.com/dp/B0798C43D5


The missing GPU frequency listings on the table make a lot of sense. Back when Intel released that terrible Core i7 7740X, it picked up about 200MHz or more overclocking capability compared to the Core i7 7700K. This is because the lack of GPU (and soldering vs. TIM as I recall) allowed for greater clocks. Mine would do 5.1GHz pretty easily. As I recall, you generally got stuck around 4.8-4.9GHz on the Core i7 7700K. When paired with a motherboard like ASUS' Maximus XI APEX, a 9900KF could be a potent gaming CPU indeed.


Just to add, I never experienced this issue.
 
Yes, writing from an AB350M Pro4 with a 3900X!

Yeah, no reason it shouldn't work provided:

1.) The board has sufficient VRMs to support the chip

2.) Manufacturer releases BIOS update to support it.

There are many B350 boards that lack one or both of the above.
 
Yes, writing from an AB350M Pro4 with a 3900X!

You really should check the list on Reddit, though. Most B350s are marginal with anything above a stock-clocked 3800X due to insufficient power delivery.

At the very least, keep a fan blasting on your VRMs.
 
You really should check the list on Reddit, though. Most B350s are marginal with anything above a stock-clocked 3800X due to insufficient power delivery.

At the very least, keep a fan blasting on your VRMs.

That sheet is highly overly cautious; there are enough tests on the internet now of cheap B350s and the 3900X just... working well without any issues :)

No problems after an hour-long run with all 12 cores loaded at 4.3 GHz! :)
It's toasty at 95-ish C, but who cares... it's not like they're actually burning up.
The caps are at 60 C, so they won't last 15-20 years, but that doesn't matter anyway.

People are overly critical about this stuff IMHO. I tested a B350M-A, no VRM coolers whatsoever, and there it went to 3.8 GHz all-core and was perfectly reasonable, so you did leave some performance on the table versus a tad higher-end board.

The B350M-A is actually the worst B350 board I know of.
 
Only the top line is showing more than a measurable delta; we're going to need better flash and better controllers first.

People who work with large 4K video edits would benefit: transferring videos into projects, backups to another SSD, and so forth.
 
People who work with large 4K video edits would benefit: transferring videos into projects, backups to another SSD, and so forth.

Yes, but fractionally. The next problem you run into is that if you try to use that bandwidth for large transfers, you're going to saturate the cache quickly, and relative to a PCIe 3.0 SSD you're just going to get there slightly quicker.

It's speed that you can benchmark but can't really use. And since it's costly on the SSD side (over twice the cost!), the benefit is quite minimal. You could just get double the capacity instead.
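To put rough numbers on the cache argument, here's a toy model (all speeds and sizes are illustrative, not measured from any real drive) of a large transfer where the first chunk lands in fast cache and the rest falls back to the slower sustained speed:

```python
# Toy model: how long does a big transfer take once the cache fills?
# All figures below are illustrative, not measured from real hardware.

def transfer_seconds(total_gb, cache_gb, cache_gbps, fallback_gbps):
    """Time for a transfer: cached portion at full speed, rest at fallback."""
    cached = min(total_gb, cache_gb)
    rest = total_gb - cached
    return cached / cache_gbps + rest / fallback_gbps

# 100 GB transfer, 20 GB cache; a PCIe 4.0-class cache speed vs. 3.0-class,
# both falling back to the same sustained write speed once the cache fills.
gen4 = transfer_seconds(100, 20, 5.0, 1.5)  # 20/5 + 80/1.5
gen3 = transfer_seconds(100, 20, 3.0, 1.5)  # 20/3 + 80/1.5
print(f"gen4-ish: {gen4:.1f}s  gen3-ish: {gen3:.1f}s")
```

Once the cache saturates, both drives spend most of the transfer at the fallback speed, so the headline sequential numbers mostly stop mattering.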
 
These super-high drive speeds are such a niche. Your somewhat average NVMe drive is already so fast for most users; I just don't get the draw here. Even if you're doing 4K video projects, the main limitation is your compute time, not access/storage times.
 
So you got me to try my i7 4770K. I'd never used Cinebench before and wasn't expecting much, but if this is correct (425 pts single-core) I'd say I'm a bit surprised. Was not planning on upgrading until maybe

I'm kind of curious where my overclocked 3930K falls. I'm betting it's just old enough now that anything new I'd get would be a rather large advantage.

I haven't tried Cinebench since like R11.5 or something like that.

What OC is your 4770K at? If it's 4.5 GHz or less, it means my chip is severely underperforming.

Should be within 2-3% of a 4930K at the same frequency. SB to IB was about a 3% IPC improvement on average, IIRC.
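Back-of-the-envelope, assuming single-thread score scales linearly with clock and a ~3% IPC step (both assumptions from memory, not measurements):

```python
# Rough single-thread score estimate: score scales with clock * IPC.
# The 3% IPC figure (SB-E -> IB-E) is from memory; treat it as approximate.

def estimated_score(base_score, base_ghz, target_ghz, ipc_gain=0.0):
    """Scale a known score to a different clock and relative IPC."""
    return base_score * (target_ghz / base_ghz) * (1 + ipc_gain)

# e.g. if a 4930K scored 100 points at 4.5 GHz, a 3930K at the same clock
# (one IPC step behind) would land around 97 points.
print(round(estimated_score(100, 4.5, 4.5, ipc_gain=-0.03), 1))
```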

Well, here's my overclocked 3930K with DDR3-1866 RAM and Win 10 build 1803 (Windows Update just hasn't offered me anything newer yet, and I haven't bothered to do it manually).

Not terrible for an 8 year old CPU. It stayed near the top for much longer than I expected it to when I bought it, but it is finally showing its age.
 
Just remember everyone, things could always be worse.

 
That sheet is highly overly cautious; there are enough tests on the internet now of cheap B350s and the 3900X just... working well without any issues :)

No problems after an hour-long run with all 12 cores loaded at 4.3 GHz! :)
It's toasty at 95-ish C, but who cares... it's not like they're actually burning up.
The caps are at 60 C, so they won't last 15-20 years, but that doesn't matter anyway.

People are overly critical about this stuff IMHO. I tested a B350M-A, no VRM coolers whatsoever, and there it went to 3.8 GHz all-core and was perfectly reasonable, so you did leave some performance on the table versus a tad higher-end board.

The B350M-A is actually the worst B350 board I know of.

I'd like to point out that those lower-end B350s and B450s running a 3900X may be working today, but that doesn't mean they'll work for long. More than that, VRM quality affects boost clocks. If the VRM is maxed out or running super hot, performance could easily suffer in PB2 and PBO modes. Manually you can force it, but that's not the best idea.
 
I'd like to point out that those lower-end B350s and B450s running a 3900X may be working today, but that doesn't mean they'll work for long. More than that, VRM quality affects boost clocks. If the VRM is maxed out or running super hot, performance could easily suffer in PB2 and PBO modes. Manually you can force it, but that's not the best idea.
Just make sure to keep the VRMs cool. :)
 
So apparently Corsair iCue is the key cause of the high idle-voltage issues. What's interesting is that even on my 2700X system iCue is doing the same thing; for whatever reason people are just noticing the issue now. With iCue running, it basically never allows the CPU cores to enter an idle state and drop voltage. I've tested this on my system with the 2700X dropped in and the 3800X dropped in. Both stay stuck at the normal low-workload voltages of around 1.4 V instead of dropping to below 1 V.

Now, if you have decent cooling this doesn't matter much, but it's still not working the way it should.

I've looked at AMD's posts about other logging software causing the issue, but generally I don't see HWiNFO64 etc. causing any of these problems. I can keep those running, and as soon as I force-quit iCue from running in the background the problem goes away.
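If anyone wants to sanity-check their own system, the logic is just: over a stretch of idle samples, does core voltage ever drop below ~1 V? A hypothetical helper (not from any monitoring tool; feed it whatever your logger exports):

```python
# Did the CPU ever actually idle down? Samples are (seconds, volts) pairs
# as exported from a monitoring tool; the values below are made up.

def idles_properly(samples, threshold_v=1.0):
    """True if any sample shows core voltage below the idle threshold."""
    return any(v < threshold_v for _, v in samples)

with_icue = [(0, 1.40), (1, 1.38), (2, 1.41)]     # stuck at load voltage
without_icue = [(0, 1.40), (1, 0.95), (2, 0.90)]  # drops at idle

print(idles_properly(with_icue), idles_properly(without_icue))
```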
 
So apparently Corsair iCue is the key cause of the high idle-voltage issues. What's interesting is that even on my 2700X system iCue is doing the same thing; for whatever reason people are just noticing the issue now. With iCue running, it basically never allows the CPU cores to enter an idle state and drop voltage. I've tested this on my system with the 2700X dropped in and the 3800X dropped in. Both stay stuck at the normal low-workload voltages of around 1.4 V instead of dropping to below 1 V.

Now, if you have decent cooling this doesn't matter much, but it's still not working the way it should.

I've looked at AMD's posts about other logging software causing the issue, but generally I don't see HWiNFO64 etc. causing any of these problems. I can keep those running, and as soon as I force-quit iCue from running in the background the problem goes away.

iCue is and always has been a piece of shit.
 
iCue is and always has been a piece of shit.

Does anyone have any legitimately good RGB software? iCue is garbage, Synapse is abysmal, ASRock’s barely works, MSI’s just kind of exists. Last time I used ASUS’s it was pretty bad.
 
Does anyone have any legitimately good RGB software? iCue is garbage, Synapse is abysmal, ASRock’s barely works, MSI’s just kind of exists. Last time I used ASUS’s it was pretty bad.

Alienware's stuff is pretty good. The funny part is that it's just MSI stuff, but done better. The problem is that, of course, it's Alienware only. The rest is complete garbage.
 
The good thing about the RGB software is that you can generally set it and forget it.
 
Does anyone have any legitimately good RGB software? iCue is garbage, Synapse is abysmal, ASRock’s barely works, MSI’s just kind of exists. Last time I used ASUS’s it was pretty bad.

The best RGB is no RGB at all :p

Even better, buy a case without a window and hide it away under your desk somewhere.

It's what's on the screen that matters.
 
I bought a B450 I Aorus from someone who flashed it before selling it to me. It works really well, and since PCIe 4.0 is not that important to me, it was an easy tradeoff.
 
The best RGB is no RGB at all :p

Even better, buy a case without a window and hide it away under your desk somewhere.

It's what's on the screen that matters.
I bought my fans with just 1 color. Not that I really look at and admire it. But for the keyboard and mouse, I love it. I can't imagine not having a lighted keyboard.
 
The best RGB is no RGB at all :p

Even better, buy a case without a window and hide it away under your desk somewhere.

It's what's on the screen that matters.

This isn't 1990. We don't all have beige boxes. You can have all the performance you want and/or can afford while still having a sense of style to the machine.
 
This isn't 1990. We don't all have beige boxes. You can have all the performance you want and/or can afford while still having a sense of style to the machine.

I don't think aesthetics have any place in technical hobbies, as long as things don't look like a mess. I hate how this hobby has turned into "pimp my computer" rather than focusing on the basics: the most performance possible with the least noise possible.



I bought my fans with just 1 color. Not that I really look at and admire it. But for the keyboard and mouse, I love it. I can't imagine not having a lighted keyboard.

I consider a lighted keyboard a functional thing, not an aesthetic thing. It's great for those of us who have self-taught "bad" typing methods and need to see the keyboard every now and then in the dark.

I got myself a Ducky One with white LEDs a while back. I'm not crazy about the Cherry MX switches, but I am starting to get used to them. I keep the backlighting at one of the lowest settings so it doesn't reflect off my screen or distract when it's dark, just enough subtle light to see the lettering. During daylight you can barely tell it's lit up.
 
I don't think aesthetics have any place in technical hobbies, as long as things don't look like a mess. I hate how this hobby has turned into "pimp my computer" rather than focusing on the basics: the most performance possible with the least noise possible.




I consider a lighted keyboard a functional thing, not an aesthetic thing. It's great for those of us who have self-taught "bad" typing methods and need to see the keyboard every now and then in the dark.

I got myself a Ducky One with white LEDs a while back. I'm not crazy about the Cherry MX switches, but I am starting to get used to them. I keep the backlighting at one of the lowest settings so it doesn't reflect off my screen or distract when it's dark, just enough subtle light to see the lettering. During daylight you can barely tell it's lit up.
I like it since I keep my room dark for VR. Well......and I am old and don't see as well anymore.....
 
Nothing wrong with Aesthetics. I think most of us (including me) are just put off by the amount of people who spend incredible amounts of money on custom loops, lighting, chassis, etc just to show off their computer and stare at it while the peak of their computing needs is solitaire :p
 