AMD Ryzen R9 3900X Review Round Up

Discussion in 'HardForum Tech News' started by Schro, Jul 7, 2019.

  1. NightReaver [H]Lite
    I mean, most people do play at 1080p, but that same majority is still playing at 60 fps, so it really doesn't matter if one CPU can get 5% more fps when both are above that target.
     
  2. ManofGod [H]ardForum Junkie
    First off, the 3900X is a monster for games. :) Second, your pricing, as pointed out by others, is way off on board cost, and you can use less expensive X470 boards if you want. Claiming that the 9900K has the gaming "crown" does not mean much of anything anymore. Also, you need to overclock the 9900K to get that crown, if I understand correctly.

    Of course, you could also go for the 3700X, 3800X or even the 3600X but hey, ignore those because they do not fit what you are trying to convey. Good thing you are not the only consumer. I saw a younger guy, probably early 20's, buy and walk out with an X570 Gigabyte Aorus motherboard, an R9 3900X and 32 GB of Ballistix memory on Monday, and he appeared happy to me. (If I had to guess, he already had a high end video card of some sort; I did not ask.) Yeah, certain processors were out of stock at Microcenter, and guess what, they were not the 9990whatever.
     
    Last edited: Jul 10, 2019
  3. ManofGod [H]ardForum Junkie
    Well, it helps if you are playing on a freesync or gsync capable monitor.
     
  4. NightReaver [H]Lite

    I thought that stuff only smooths things out when you aren't pegged at max fps the whole time? If I'm playing at 60 fps and I never dip, does it really make a difference?

    Correct me if I'm wrong, it's been a while since I used an AMD card (was during the mining craze), and there's no way I'm shelling out for gsync.
     
  5. Azrak Gawd
    Yes, and the 9900K does not come with a cooler. The 3900X comes with a cooler that by all accounts is decent in performance. You need to take that into account when comparing prices, otherwise you are a shill for Intel by ignoring the facts.

    Define "platform". If you are referring to the motherboard, there are many X570 offerings that start at less than $200. You also have the choice of NOT buying an X570 motherboard and instead choosing one of the many previous generation platforms that are still compatible with the 3000 series for even less money.

    Your laser focus on gaming performance is why you have such an opinion, but keep in mind not everyone has the same requirements from their processor. So, it looks like Intel may be the best choice - for you. Please do not push your needs and desires on others and declare "fail" when you have such a narrow focus of requirements. The 3000 series is a "fail" only to you and others requiring absolute and total gaming performance. And that's fine. Everyone's needs are different. The 3000 series is not a "fail" to others with a wider range of tasks or lesser gaming performance requirements (or higher resolutions where GPU becomes the bottleneck).
     
  6. Rockenrooster Limp Gawd

    Bruh, like others pointed out, the 9900K is only a few FPS (1-3) faster at resolutions greater than 1080p. It's not much of a "crown" when it's that close in gaming and the Ryzen is so much faster in everything else.
    Also, I'm planning on dropping one into the X370 board that I got at launch more than 2 YEARS ago for LESS THAN $150.
    The 9900K is "technically" faster in gaming, but it's so marginal that I don't care; I use my CPU for more than just gaming: VMs, encoding, and game servers, WHILE gaming. Ryzen was all around much better overall than the competition at the time (Intel 7th gen) even though it was "slower" at gaming (a bigger difference back then). I got a 1600 at launch and then upgraded to a 1700 from a friend for $180 about 6 months later.
    Never spent more than $250 on a CPU, ever, but Ryzen 3000 has me reconsidering.
     
    Red Falcon likes this.
  7. ManofGod [H]ardForum Junkie
    I am not 100% certain of that, but I do find that FreeSync monitors smooth things out better than not having it at all. However, I am using 144Hz FreeSync panels, so maybe that has something to do with it.
     
    Maddness likes this.
  8. buttons [H]ard|Gawd
    The 9900K is certainly more expensive for me, and please show me games where a 9900K is actually faster at 4K resolution. It's 2019; I haven't been on 1080p for years.

    9900 + cooler + motherboard = ?
    3900x + bios update + drop into my current X370 board = $499

    edit: i see my point was already made by #246.... whoops.
     
    Red Falcon and Legendary Gamer like this.
  9. Soulstorm brew [H]Lite
    That 9900K @ 5GHz needs a D15 or an AIO to stay under 80C, something a lot of posters aren't talking about.
    200W is a lot of heat to dissipate.
     
  10. drescherjm [H]ardForum Junkie
    Between my department at work and my home, I have seen exactly one monitor that is significantly greater than 1080p, and that was an expensive (tens of thousands of dollars) medical imaging monitor. I loved working on it.

    Maybe it's time for me to shop for some new monitors; in the meantime, 1080p is what I am interested in.
     
  11. Vega [H]ardness Supreme
    People who think you can only be CPU limited at 1080p must have really slow computers...
     
    GoldenTiger, aznpotpie and Shadowed like this.
  12. notarat [H]ard|Gawd
    And from the opposite side of the spectrum, I haven't used a 1080p monitor in over a decade and a half. 1080p is not only the very last resolution I'm interested in, it doesn't even make the list of resolutions I'm interested in. You may as well game at 640x480 if you're going to game at 1080p
     
  13. sirmonkey1985 [H]ard|DCer of the Month - July 2010
    wow that hit me right in the feels.. my 1080p monitors are crying now because of you..

    jk i'm just a cheap bastard otherwise i'd probably grab some 1440p monitors. :p but i agree, once you go 1440p+ there's no going back to 1080p, that's for damn sure.
     
  14. ManofGod [H]ardForum Junkie
    1080p 144hz gaming for the win!
     
    dgz likes this.
  15. ManofGod [H]ardForum Junkie
    I cannot agree at all. I have a 43 inch 4K Samsung TV that I use as a monitor and it works great. However, I also have a 27 inch 1080p 144Hz monitor and a 24 inch 1080p 144Hz monitor, both curved, and they run great as well. It is not one or the other, thankfully. :)
     
  16. notarat [H]ard|Gawd
    It's just my .02

    I'm an odd duck...My perspective on the whole "What's the best res to play at?" issue is: There's 1 item in your computer you interact with no matter what you're doing....surfing, streaming, compiling, gaming, defragging an old spinner, watching teh pr0n, whatever....

    It's your monitor. It should be the primary item in your build since supporting the resolution/settings you want to play at will determine every other component you use in your build.

    When I build a system, I decide what res and features I want to play at, then choose my monitor and, from there, the rest of the parts. When I get to the hardware choices on the system side, I start with the power supply, since it's the one item in a build which impacts everything else in the build...
     
    dub77nj and sirmonkey1985 like this.
  17. ManofGod [H]ardForum Junkie
    I agree with this but, I have a computer with an R5 1600 and a PowerColor RX 580 Red Devil, and another computer with an R7 1700 and an XFX RX 570. Those two computers are best at 1080p, and having a 144Hz FreeSync panel on each works great. :) (4K on my Vega 56, however.) If I had just the one computer with the R7 1700, I would upgrade it to an R7 3700X with a 5700 but, I do not need them. :) ;)
     
  18. wizzi01 2[H]4U
    Lol, I could have got the Gigabyte Aorus Elite (I think the Elite is the lower-end one) for $199 when I bought my 3700X. I ended up with an MSI board that was $259, which a combo deal brought down to $199. There were plenty of boards available for under $300.
     
  19. Nolan7689 [H]ard|Gawd
    Why don’t you go ahead and give that a shot then.
     
    AlphaQup and Maddness like this.
  20. buttons [H]ard|Gawd
    My home computer uses a 32" 2560x1440 @ 75Hz HP Omen display. I've had it for years, but my next display will probably be a Samsung Q6N 55" at 2560x1440 @ 120Hz with FreeSync... unless a 4K @ 120Hz option becomes available sooner than I expect.

    I have a 65" 4K and two 55" 4K TVs that I sometimes game on.
     
  21. notarat [H]ard|Gawd
    Nah. As I said, I haven't used 1080p (or lower) in 15+ years.

    Why don't you try it first and create a thread about it.
     
  22. n=1 2[H]4U
    I game at 1440p and spend ~10% of my time ripping DVDs (encoding, sorta). Explain to me why 9900K is the better purchase please.

    This is the part so many miss. Z390 is a dead end; the 9900K is the last "upgrade" for it. Want anything faster? That's another $200+ for a new mobo.

    If we're going to criticize the cost of X570, then let's at least acknowledge that Zen 3 (Ryzen 4000) is guaranteed to be supported, and possibly even Zen 4 (Ryzen 5000). Good luck trying to do drop-in upgrades with Intel lol.

    Not everyone runs dual RTX Titan setups, does competitive gaming, or insists on 144 fps minimums.
     
    Last edited: Jul 10, 2019
    schmide and Red Falcon like this.
  23. ryan_975 [H]ardForum Junkie
    Where is Zen 3 guaranteed to be supported? AMD has stated that AM4 will be supported until 2020. Zen 3 desktop products will probably be released in late 2020 to early 2021 since the server (Milan) products are slated for mid-2020. There's a good chance that Ryzen 3000 is the end of the road for AM4.
     
    IdiotInCharge likes this.
  24. legcramp [H]ardForum Junkie
    The only scenario where Intel wins is high-refresh / lower-resolution twitch gaming; otherwise it's a wash at 1440p or higher for everyone else because it's so close. And AMD, like you said, is a monster in all other applications.

    Super hilarious that you mention your first ever build was an Athlon XP... this Ryzen launch is very similar to the Athlon XP back then compared to the Pentium 4. LOL :ROFLMAO:
     
    Red Falcon likes this.
  25. n=1 2[H]4U
    Considering Zen 2 desktop released before Rome (Epyc 2), I'm not sure that conclusion holds. To be clear I'm not saying Zen 3 desktop will release before Milan, but at this point I see no evidence pointing to Ryzen 3000 being end of the road for AM4.
     
  26. kamikazi Limp Gawd
    This is true. I kind of feel like an idiot moving up and leaving myself no upgrade path. But, I'm still on Sandy with DDR3, so I obviously keep my stuff for quite a while. If I wait any longer, I'll be skipping an entire memory generation.
     
  27. Nolan7689 [H]ard|Gawd
    Then maybe you shouldn’t be so asinine as to compare it to 640x480. To view them as the same paints you as extremely ignorant.
     
  28. n=1 2[H]4U
    Well, if you went Z390, then you know for sure that's the end of the road. I'm not at all convinced Ryzen 3000 is the end of the road for AM4, and I have seen no convincing evidence (forget that, not even rumors) indicating that's the case.
     
    Red Falcon likes this.
  29. Verado Limp Gawd
    Hi.
    Your truth is a personal truth. Do not mistake it for a universal truth.
    I have a brick wall I want you to use.
     
  30. ryan_975 [H]ardForum Junkie
    I didn't mean to imply a certain order of product releases, but rather that the timing doesn't make a lot of sense. Rome is up for release next. Castle Peak (Threadripper) is supposedly coming late this year or early 2020, and Milan is to be released mid-2020 (May-July). There's just nowhere to fit another product release in unless they're going to do server and desktop simultaneously, which isn't something they usually do.

    As for evidence of AM4's demise, AMD said the socket would be supported until 2020. It's almost 2020. They could extend for one more release, but they had to put a lot of effort into designing the package to work with separate modules without making any changes to the socket. If Ryzen 4000 adds any features, it would almost definitely require a new socket.
     
  31. n=1 2[H]4U
    Fair enough. I guess I read AMD's statement as AM4 will be supported until end of 2020, in which case Ryzen 4000 will most likely make the cut.
     
  32. Dan_D [H]ard as it Gets
    It's possible we might see one more desktop release, perhaps an AM4 refresh sometime next year around this time. I don't know. AMD has always said support would be "through 2020", implying that it would be the desktop socket until the end of that year. However, AMD could just as easily push any planned release outside of 2020 and not break its promise. Honestly, three generations of CPUs on a single socket is pretty good. I think it's reasonable for AMD to move on after that.
     
    Red Falcon likes this.
  33. n=1 2[H]4U
    Fack you very much Dan_D, now what am I supposed to upgrade to? :LOL:

    Z390 is dead, and I absolutely do not want a 14nm++++ chip, so Intel's out until 2021 at least. Meanwhile I'd really hate to be on the tail-end of AM4, so I guess one more year of waiting it is.
     
  34. Mchart 2[H]4U
    I guess I don't see what the problem with AM4 is. If you get X570, you'll have a PCIe 4.0 slot for future GPUs that may benefit from it. It's got enough lanes for most users. You've got a 12 core CPU now, and in a few more months a 16 core CPU. Finally, it's highly likely that at least the next Zen will still be on AM4 and work on any X570 board as well.
     
  35. Dan_D [H]ard as it Gets
    Let me be clear on how this works.

    There is no specific all core boost clock in the traditional sense. There is no guaranteed frequency that these CPUs will hit in multi-threaded applications. The Precision Boost 2 algorithm attempts to boost loaded core clocks as high as possible until power or thermal limits are reached. These limits can come from VRM output, socket power, or CPU / VRM thermals. The way thermals ramp up under load means the CPUs even out their clock speeds across all cores. Where these stop is going to depend on the specific workload and the power and thermal conditions. To some extent, the quality of your specific CPU will come into play as well. Under PBO, you can defer to the motherboard's PPT, EDC and TDC values, or you can adjust them manually. Even if you go ham on the limits and input stupidly high values, the CPU is still at least partially governed by its own internal values. While PPT, EDC and TDC can effectively be overridden, the CPU still has an OEM max clock speed limit programmed into it.
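    The boost behavior described above can be sketched as a simple feedback loop: raise the clock, read telemetry, stop when any limit trips. This is an illustrative model only, not AMD's firmware; the limit names (PPT, TDC, EDC, temperature) match the post, but `settle_boost`, `fake_telemetry`, and all numbers are made up for illustration.

```python
# Illustrative sketch of the Precision Boost 2 behavior described above.
# NOT AMD's actual firmware logic; the telemetry model and numbers are
# hypothetical, only the limit names (PPT/TDC/EDC/temp) are real.

def settle_boost(n_loaded, limits, sample, base=3800, fmax=4600, step=25):
    """Raise the loaded-core clock in small steps until a limit trips."""
    clock = base
    while clock < fmax:
        t = sample(clock, n_loaded)           # read simulated telemetry
        if (t["ppt"] >= limits["ppt"]         # package power (W)
                or t["tdc"] >= limits["tdc"]  # sustained VRM current (A)
                or t["edc"] >= limits["edc"]  # peak VRM current (A)
                or t["temp"] >= limits["temp"]):
            break                             # a limit tripped: stop here
        clock += step
    return clock

def fake_telemetry(clock, n_loaded):
    """Crude made-up model: power grows with core count and ~clock^2."""
    ppt = 20.0 + n_loaded * 9.0 * (clock / 3800.0) ** 2
    return {"ppt": ppt, "edc": ppt / 1.1, "tdc": ppt / 1.5,
            "temp": 40.0 + ppt / 2.5}

# Stock-like limits for a 105W-TDP Ryzen (PPT 142W, TDC 95A, EDC 140A).
limits = {"ppt": 142.0, "tdc": 95.0, "edc": 140.0, "temp": 95.0}

single = settle_boost(1, limits, fake_telemetry)    # light load: hits fmax
allcore = settle_boost(12, limits, fake_telemetry)  # heavy load: limit-bound
```

    In this toy model a single loaded core reaches fmax while twelve loaded cores settle a few hundred MHz lower, which is the shape of the behavior Dan_D describes: the all-core clock lands wherever the limits do, not at a number printed on the box.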

    Here is data provided by AMD from their tests. I can confirm this is pretty much the case, and it's roughly in line with what I saw on a manual all core overclock. I typically saw boost clocks under load that were a bit lower than these, but that can come down to a variety of things: ambient temperature in my office, specific board BIOS settings, and motherboard design. Some reviewers, like Gamers Nexus, reported better clocks than I saw, but the GIGABYTE Aorus Master they used has a better VRM design than the MSI board does, or at least one that's capable of outputting more power with less heat. I believe this is the case but I haven't confirmed the design details; it's something I saw or heard mentioned somewhere.

    (attached image: AMD's boost clock test data)

    More specifically, AGESA 1.0.0.3 patch A isn't as aggressive as 1.0.0.3 patch AB as far as clocks go. The review BIOS revisions were patch A, not AB. MSI has provided me with an unreleased internal copy of its latest BIOS using AGESA 1.0.0.3 AB, which I'm in the process of testing, but the fact of the matter is, I'm not seeing any real improvement in boost clocks. Keep in mind that boost clocks are never guaranteed. According to Ryzen Master, using PB2, I'm hitting the TDC, EDC, and PPT limits as it is; therefore, it isn't going to go beyond the 4.425GHz I'm seeing now. It doesn't hit the advertised clock single-threaded either, but that could easily be my silicon just not being that great. The interesting thing about the new Ryzen Master is that you can actually see what each core is doing. You can even watch the CPU shift to a different core and try to boost it higher. You can also see cores go into sleep mode, or wake up and run at various frequencies.

    In short, my specific 3900X does not run at the advertised 4.6GHz boost clock. That clock is essentially a single core boost clock; no "all core" boost clock is advertised anywhere for these CPUs. That said, expecting around 4.0-4.3GHz on all cores is reasonable. One thing I'll restate from my review: I don't think manual overclocking is the way to go with these chips. Your best bet is to buy the best motherboard and cooling you are willing to pay for and use PBO, or PBO with the + offset, to achieve your best results. I have been able to achieve 4.5GHz in single threaded applications with PBO + 200MHz offset, so I have gotten close.

    That said, I do not think the difference in clock speeds would alter the bottom line on these CPUs. I will verify this, but I don't think it will. The all core overclocking and performance under multi-threaded workloads is unaffected by this issue, and therefore that data is 100% accurate as it stands. Single threaded or lightly threaded applications might end up with a boost but, per MSI on AMD's AGESA code, performance can vary due to other variables in the code, and seeing a higher clock speed in CPU-Z or Ryzen Master does not guarantee better performance. I've only run Cinebench so far, and under PB2, I've seen no impact on the test results.
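    As an aside for anyone wanting to watch per-core clocks without Ryzen Master: on Linux the kernel reports them in /proc/cpuinfo, so no vendor tool is needed. A minimal parser sketch follows; `core_clocks` is a hypothetical helper and the sample text is fabricated in the /proc/cpuinfo layout.

```python
# Hypothetical helper: parse per-core clocks out of /proc/cpuinfo text
# (e.g. open("/proc/cpuinfo").read() on a real Linux box).

def core_clocks(cpuinfo_text):
    """Parse the 'cpu MHz' lines into a list of per-core MHz values."""
    return [float(line.split(":", 1)[1])
            for line in cpuinfo_text.splitlines()
            if line.startswith("cpu MHz")]

# Fabricated two-core sample in the /proc/cpuinfo layout:
sample = "processor : 0\ncpu MHz : 4375.000\nprocessor : 1\ncpu MHz : 3600.000\n"
print(core_clocks(sample))  # [4375.0, 3600.0]
```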
     
    AlphaQup, schmide, blurp and 2 others like this.
  36. Mode13 Gawd
    I find it frustrating how elusive news of the 3800X is four days after launch. All I can find is this review on Newegg:

    https://www.newegg.com/amd-ryzen-7-3800x/p/N82E16819113104

    Come on AMD, send out some 3800X samples and get it over with.
     
  37. Mchart 2[H]4U
    Has anyone run a 3900X on a Gaming Pro Carbon yet? Before I upgrade, I want to make sure the BIOS is stable.
     
  38. Zarathustra[H] Official Forum Curmudgeon

    I get most of this, but the part I am chafing at a little bit is that you suggest that the reason you are not hitting advertised single thread boost speeds might be because of poor silicon.

    Don't they validate each silicon bin so it can at least hit the advertised boost speed? And if they don't, shouldn't they?

    No one in their right mind should expect an overclock, but they should expect the CPU to hit advertised clock speeds, provided cooling and power delivery are acceptable. This kind of seems like the bare minimum.

    If I buy a CPU and it can't hit the advertised speeds no matter what I do, that seems like a defective CPU to me.
     
    buttons and ZeroBarrier like this.
  39. NKD [H]ardness Supreme
    I think MSI is probably just buying time. My Asus Crosshair VIII Hero WiFi has the AB BIOS, released 7/5, but out of the box it shipped with an earlier BIOS. Here are my results: out of the box, my CPU was boosting close to 4.5GHz on a single core. I wanted to test whether this was indeed an issue with the 1.0.0.3 BIOS, so I updated, knowing going in that I wouldn't be able to downgrade because Asus didn't have the out-of-the-box BIOS on its site. After the update, my boost clock even on a single core was 200MHz slower. So I am sure the 1.0.0.3 code needs tweaks; there is definitely something in the code that is messing up the stock boost behavior I had out of the box.

    After the update I mostly saw 4200MHz single core boost. I went into the BIOS and did +150, and now it boosts to about 4275-4300 (didn't see much more with +200), so the boost OC is definitely working. But the new BIOS code most definitely hurt the single core boost. My all core boost went down as well, from about 4.2GHz (measured with AIDA64) to about 4050MHz. So it looks like there is an overall reduction in boost clocks of about 200MHz here. They really need to work on the BIOS; the older BIOS most definitely boosted right.

    I think my original BIOS was AGESA code ending in 0.7.2
     
  40. Dan_D [H]ard as it Gets
    Which CPU do you have? MSI said their 3700X was working right. I have only tested the Ryzen 9 3900X so far.