Zen 2 Review Summary

Discussion in 'AMD Processors' started by DuronBurgerMan, Jul 7, 2019.

  1. drescherjm

    drescherjm [H]ardForum Junkie

    Messages:
    14,421
    Joined:
    Nov 19, 2008
    Can't you use a $20 PCIe slot adapter?
     
  2. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    AMD allegedly had an 8-core CPU on the consumer market even before Zen...
     
    ChadD likes this.
  3. DuronBurgerMan

    DuronBurgerMan [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Mar 13, 2017
    8 cores that, even when all fully utilized, could barely match a 4c/4t 2500K. And in single-threaded work it was about half as fast. As far as I'm concerned, the only CPU in x86 history worse than Bulldozer was the early Pentium with the FDIV bug. Even the P4 at least had some strengths relative to its competition.
     
    drescherjm, IdiotInCharge and XoR_ like this.
  4. juanrga

    juanrga Pro-Intel / Anti-AMD Just FYI

    Messages:
    2,540
    Joined:
    Feb 22, 2017
    1600X: 4.0GHz
    2600X: 4.2GHz
    3600X: 4.4GHz

    1800X: 4.0GHz
    2700X: 4.3GHz
    3800X: 4.5GHz

    It is a bit difficult to believe that the 4600X and 4800X will hit 5GHz.
     
  5. DuronBurgerMan

    DuronBurgerMan [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Mar 13, 2017
    1800X: 4.0 + 100 XFR. 4.1 max.
    2700X: 4.35
    3900X: 4.6 (3950X 4.7 if we want to count that one)

    I don't make any claims as to which SKU/core count would do it, or even say that Zen 3 definitely will, only that it's plausible and follows the formula seen thus far.

    Edit: also worth noting that if AMD comes CLOSE next time around, i.e. 4.8 or 4.9, I can see them pushing the power envelope just to get that nice 5GHz marketing number. Very much an AMD thing to do.
     
    Last edited: Jul 11, 2019
  6. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    Sandy Bridge was a competitor by association. Bulldozer could not beat Nehalem.
    [chart: relative performance summary]

    NetBurst, if anything, was a marketing success, and having two threads (on later/expensive models with Hyper-Threading) could in places make a big difference in system responsiveness, which was a big selling point. With single-core CPUs, running an application at 100% meant everything else was blocked, and any I/O operation that used interrupts blocked the processor (at that time even disk controllers were dumb... at least most of them; some people had fancy SCSI hardware). HT solved these issues more than it provided a multi-threaded performance speedup (in the typical home usage scenario).

    Scaling performance through sheer clock speed was not only possible at the time, it made the Pentium 4 competitive. Unfortunately, Intel broke NetBurst with Prescott, which was a complete failure. Northwood with some L3 cache (as in the P4 Extreme Edition) was a pretty good CPU, and they should have just shrunk it rather than increasing the pipeline depth even further. But generally, beyond selling processors on big clock numbers, it was a stupid idea.

    The FDIV bug was in the original Pentium... later, with the Pentium Pro, there was loadable microcode, and such bugs could be fixed by a BIOS/OS update. Where have we seen that... ;)

    I just finished initial testing of my superior 720p gaming CPU, still without a GPU.
    Tested Borderlands at 720p, and performance in that setup is truly advantageous over Zen 2 because it... ran :)
     
    IdiotInCharge likes this.
  7. juanrga

    juanrga Pro-Intel / Anti-AMD Just FYI

    Messages:
    2,540
    Joined:
    Feb 22, 2017
    720p testing today is a good measurement of 1440p gaming tomorrow with a better GPU.
     
    drescherjm and IdiotInCharge like this.
  8. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    Northwood was the only 'good' Pentium IV; I owned several, literally just to get away from the VIA chipsets used on AMD boards. AMD moving the memory controller onto the CPU with the Athlon 64 was the lynchpin.
     
    N4CR likes this.
  9. RamonGTP

    RamonGTP [H]ardness Supreme

    Messages:
    7,665
    Joined:
    Nov 9, 2005
    But it's not tomorrow. It's today, and when the tomorrow you're referencing comes, the systems we are all talking about now will be as relevant as a Q6600 is today.

    Suffice it to say, we have two vendors to choose from. Both are quite good at everything, with each one excelling at one workload or another.
     
    DuronBurgerMan likes this.
  10. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
    That WAS the way it was, yes. For the most part it no longer is, outside of some very niche stuff using AVX or the like. AMD has a pretty clean sweep with the 3000 series. Intel isn't really better at anything, unless people seriously consider a 1-2% win at 720p in a few games a real win. AMD destroys Intel across the board. That hasn't been true in a long, long time.
     
  11. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    54,224
    Joined:
    Feb 9, 2002
    Whoa there. Not quite. Intel still maintains an advantage at 1080P gaming. The difference was more than 1-2%. None of my testing was done at 720P. As much as it pains me, 1920x1080 is still very much relevant. Granted, the differences will really only be there for people trying to push ultra-high refresh rates, but there is a difference. AMD didn't do a clean sweep on everything. There are a few other single-threaded workloads where AMD and Intel are super close. There are a few games where AMD closed the gap, but many more where it still exists. That said, I would agree that the difference is irrelevant in practical application outside of the aforementioned high refresh rate crowd at 1080P. Where you start to become more GPU limited, these differences become moot.

    [benchmark chart: 1080P gaming test]

    We see more than a 1-2% difference in this test. Now again, this is absolutely academic on anything under a 175Hz monitor. Above that, I think the difference is more significant. Significant enough to buy Intel over AMD? I wouldn't, but that's just me.

    [benchmark chart: second 1080P gaming test]

    In this one we saw a considerably larger gap in performance. Overclocked, the Intel simply walks away from the AMD system.

    I think the AMD Ryzen 9 3900X is still the overall better buy, but I can't deny that a pure gaming rig would be better using Intel right now. With game streaming in the mix, that's not quite the case. Obviously, AMD has an advantage there.
     
    otg, legcramp and drescherjm like this.
  12. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,612
    Joined:
    Jan 31, 2008
    And by the time that "tomorrow" comes, all these CPUs will be about as useful as a C2D is today. It will be a long time before 1440p at 60Hz comes even close to being CPU limited. Even at 144Hz we're still hitting GPU limits, and the small difference between Intel and AMD is negligible. Then there's 4K.
     
  13. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    That's a bit far-fetched; the reality is that 4000-series Intel CPUs are still functionally relevant today. So part of a CPU review needs to be focused on how the CPU might scale. The machine under my desk at work has a Dozer APU in it. It works, for desktop stuff, and a Sandy Bridge quad would be a significant upgrade!

    This is the picture that many people aren't getting. High core-count CPUs are great if you need to, say, edit video or compile or do something else on a work schedule, and the reality is that most people don't need that. Maybe they edit video, but not often enough, or to a production level, for doubling the cores in their processor to make a real difference in their life.

    That's why gaming has been a significant focus for desktop performance. It's the most strenuous task most people actually do where faster components make a difference; in essence, it's the common denominator for consumer performance. Your phone can display web pages and stream videos. A laptop or a desktop just gives you better input and output options. Consoles play games well enough for most.

    Performance gaming is a cut above that and where the real innovation comes for desktops.

    On the desktop ;). Servers too, of course, if talking raw performance and especially if talking virtualization / containerization. But there's still mobile!

    :D
     
    otg, juanrga and DuronBurgerMan like this.
  14. PiEownz

    PiEownz Gawd

    Messages:
    615
    Joined:
    Mar 21, 2010
    I reserved a 3900X and a Gigabyte Aorus Pro WiFi at Microcenter. I’m an idiot who likes to waste money lol. I’ll pick it up this weekend.
     
    N4CR and legcramp like this.
  15. DuronBurgerMan

    DuronBurgerMan [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Mar 13, 2017
    This is the conventional wisdom, but parallelism throws a monkey wrench into it. Will gaming tomorrow take more advantage of parallelism and thus more cores/threads? Probably, though to what extent I don't know. Will that offset (to some unknown, limited degree) a single-thread performance deficiency?

    I don't know and can't quantify this, and neither can you.

    It's much harder to determine future performance with more variables in the mix.

    I don't take the AMD fanboy approach of 'everything will take advantage of more cores/threads,' because parallelizing is usually extra work for developers, and that imposes a cost. The door is open for more parallelism in games. How many devs walk through it, and how far they go, I don't know.

    Neither do I take the Intel fanboy approach of single-threaded performance being king forever. Who is to say that core/thread count won't take on more of a role than raw lightly-threaded performance in games? We've seen an increase in threading as core counts have increased. Will it continue? I don't know.
     
    BrotherMichigan and schmide like this.
  16. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    We can take decent stabs at it with a bit of logic.

    Right now, due to consoles going wider vs. faster, and clockspeeds being limited for almost a decade, we can reasonably assume that anything that can be easily parallelized has been already. We can then reasonably assume that games aren't going to get less single-thread dependent without some major rethink to the level of entirely new paradigms of software development coming into play.

    What we can't really know is whether future additions to game logic are going to be more single-thread dependent, or less. We can expect a bit of both, most likely, but addressing the question means making a solid prediction as to the mix.

    The best prediction that we can make is that single-thread performance will not become less important for gaming in the foreseeable future. That doesn't really put anyone at more of a disadvantage than they are at today.
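
    A back-of-the-envelope way to put numbers on that is Amdahl's law: if a fraction p of a frame's work parallelizes perfectly across n cores, the whole frame only speeds up by 1 / ((1 - p) + p / n). Here's a minimal sketch; the 60% parallel fraction is an assumed figure for illustration, not a measurement from any game:

    Code:
    // Amdahl's law: with parallel fraction p, speedup on n cores
    // is 1 / ((1 - p) + p / n), capped at 1 / (1 - p) as n grows.
    #include <cstdio>

    int main() {
        const double p = 0.60;  // assumed parallel fraction (illustrative only)
        const int cores[] = {1, 2, 4, 8, 16, 1000};
        for (int n : cores) {
            double speedup = 1.0 / ((1.0 - p) + p / n);
            std::printf("%4d cores: %.2fx\n", n, speedup);
        }
        return 0;
    }

    Even with 1000 cores the total never passes 1 / (1 - p) = 2.5x here, which is the math behind single-thread performance staying decisive.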
     
    XoR_ likes this.
  17. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    Core count will matter more in the future, that is for sure.
    Will it be one year? Two? Three?
    In three years' time, I guess single-core performance of new processors will be so much better compared to current ones that an upgrade will be preferable to using the older system anyway.

    The AM4 platform is in the good position that even if there is no new Zen architecture for it, there will still be the 3950X (or the 3900X for those who got a weaker CPU), and it generally supports PCIe 4.0, so an upgrade will be possible.
    So it is simply the more future-proof platform, and there should be no doubt about it...

    Intel is more of a "now" solution for people who want the fastest single-core and gaming performance available, period.
    Besides, I doubt there will be any significant market for custom-built Intel rigs anymore. Most people who build a PC themselves will get AMD; it is the most obvious choice.
    Only die-hard Intel fanboys will get Intel until they bring something considerably better, like 10nm with that ~15% IPC improvement and significantly more cores.

    Intel will still be used in the PCs that companies mass-sell to corporations. They have their funny iGPU thing, and no matter how you slice it, that is an important feature for that kind of computer. And that is besides Intel doing its Intel thing, which makes those companies choose Intel...
     
  18. n=1

    n=1 2[H]4U

    Messages:
    2,388
    Joined:
    Sep 2, 2014
    And also no clockspeed regression, or it costing you 3 fingers for the 10-core. (Luckily, with AMD around, it will no longer cost a kidney, but Intel being Intel, you can still expect to lose some body parts :D)
     
  19. drescherjm

    drescherjm [H]ardForum Junkie

    Messages:
    14,421
    Joined:
    Nov 19, 2008
    I really wish Dell (only approved vendor) would have some real AMD options in the products that I am permitted to purchase at work.
     
  20. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    Maybe not less single-threaded, but it might be that there will be more physics in future games,
    or some games or even GPUs will use CPU processing for denoising (for ray tracing), which will put more CPU demand on those with such cards...

    There are a lot of things additional CPU processing power can be used for, so who knows.
    In any case, I do not think it will be soon. In other words, nothing to worry about.

    Pretty much the last thing the owner of a fast Zen 2 processor needs to worry about is performance. Same goes for i7/i9 Coffee Lake users. Heck, even Zen and 8600K-class processors are still fine. Heck++, even the old 2500K still manages to provide a decent gaming experience, play 4K movies with SW decoding, draw, play, dance, talk, whatever.

    Good enough CPU performance for most daily computer usage (browsing the web, watching video, etc.) is achieved even by a f**king Raspberry Pi. Anything extra is just that, extra. Like the graphs say, 105fps or 132fps... boo hoo...
     
    otg, DuronBurgerMan and IdiotInCharge like this.
  21. DuronBurgerMan

    DuronBurgerMan [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Mar 13, 2017
    It could also go the other way. If single thread performance levels off, and developers want to add more features to their games and need the CPU horsepower... the only way they'll be able to get it is through parallelism.

    I dunno man. That's why I say very clearly I don't know, and I don't take a side in that debate. I can argue it either way. We're at a weird place as we approach the silicon limits.
     
  22. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    I made a deal with myself: "one small meal a day (in the evening, so my body gets used to ketones at night and I am not hungry the next day) until the savings from not buying meals at work cover the difference between what I should have bought (3700X, 3600MHz DDR4) and what I bought (9900K, 4400MHz DDR4)"... and I am already losing body fat as we speak :ROFLMAO:
     
  23. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    This is the most likely direction, but the challenge they'll run into is that game logic is inherently linear and co-dependent. Stuff has to happen in order, and so on. So we should expect a mix of increased single-thread dependence and multi-thread utilization / flexibility.

    We can already see part of the solution in processor specialization: on CPUs, things like AVX and encryption accelerators; on GPUs (including APUs/IGPs), video transcoding hardware; and of course the ever-expanding flexibility of GPUs themselves. An even better example is phone SoCs, where a tremendous number of different types of hardware are smashed together in order to keep clock speeds, and therefore power draw, low.
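
    For a concrete taste of the CPU-side specialization mentioned above, here's a minimal sketch using AVX intrinsics to add eight floats in a single instruction (illustrative only; build with g++ -mavx):

    Code:
    #include <cstdio>
    #include <immintrin.h>  // AVX intrinsics

    int main() {
        // Two vectors of eight packed single-precision floats.
        __m256 a = _mm256_set_ps(8, 7, 6, 5, 4, 3, 2, 1);
        __m256 b = _mm256_set_ps(80, 70, 60, 50, 40, 30, 20, 10);

        // One AVX instruction adds all eight lanes at once; plain
        // scalar code would need eight separate additions.
        __m256 sum = _mm256_add_ps(a, b);

        float out[8];
        _mm256_storeu_ps(out, sum);
        for (float f : out) std::printf("%.0f ", f);  // 11 22 33 ... 88
        std::printf("\n");
        return 0;
    }

    Same idea, bigger scale, with the fixed-function blocks: work that would eat many general-purpose cycles gets done by dedicated silicon instead.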
     
  24. XoR_

    XoR_ Gawd

    Messages:
    814
    Joined:
    Jan 18, 2016
    Do not worry, with TSMC/Samsung process node nomenclature we will get to 1nm in no time :whistle:
     
  25. n=1

    n=1 2[H]4U

    Messages:
    2,388
    Joined:
    Sep 2, 2014
    Gahh should've specified "...lose some body part you don't want to lose" :smuggrin:
     
    XoR_ likes this.
  26. VIC-20

    VIC-20 Gawd

    Messages:
    879
    Joined:
    Mar 24, 2006
    Exactly.
     
  27. VIC-20

    VIC-20 Gawd

    Messages:
    879
    Joined:
    Mar 24, 2006
    H370 + 9700K would be very similar to a 3700X. No price advantage unless content creation is your use case or you can't two-box for streaming.
    3600X vs 9600K: same scenario.

    I would argue that for the first time since the A64 we have a scenario where price isn't the deciding factor. You could save money in all kinds of ways on both platforms. The decision this time is use case: pure gaming vs. general or mixed use.
     
    XoR_ likes this.
  28. DuronBurgerMan

    DuronBurgerMan [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Mar 13, 2017
    Don't know if we even have a parallel for this situation, really.

    Then again, I'm starting to feel old. All the old review sites are disappearing. Old forums, too, in many cases. Damned YouTube reviewers and all that jazz. Even PCs in general... so many people use their phones and don't even bother with a desktop (or in many cases, even a laptop) at home anymore.

    The days of the desktop as standard are over. The desktop is now for people who do boatloads of work, or people who do boatloads of gaming (or both).
     
    N4CR, otg, Algrim and 3 others like this.
  29. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    54,224
    Joined:
    Feb 9, 2002
    Intel and most of the PC parts makers learned years ago that traditional desktops are now all about gaming and content creation. They aren't used much for office-type tasks, casual web browsing, or even email anymore. As you said, most people use a tablet or a phone for that sort of thing.
     
    IdiotInCharge and XoR_ like this.
  30. Mchart

    Mchart 2[H]4U

    Messages:
    3,288
    Joined:
    Aug 7, 2004
    Well, people who bought into AM4 with a mid-range board over the past two generations will be saving quite a bit of money, and I would argue there is still at least another generation that will be compatible with many boards.

    I had the money to go Intel, but chose not to because I knew AM4 would have a long life, and I've been rewarded for that choice. I'm moving to a 3800X now, but can still move to a 12 or 16 core if I need to in the next couple of years. (Next year possibly more, or at least another IPC gain.)

    It's in this regard that Intel is still more expensive. Although I think, moving forward, Intel won't pull this BS now that AMD is at parity with them.
     
    mvmiller12 and VIC-20 like this.
  31. juanrga

    juanrga Pro-Intel / Anti-AMD Just FYI

    Messages:
    2,540
    Joined:
    Feb 22, 2017
    People upgrade GPUs much more often than CPUs.

    Intel is 7-12% ahead, depending on which models are compared.

    "Tomorrow" can mean next year or the one after. And the best current CPUs will continue to be relevant for gaming. If you play games at 4K and don't plan to update your GPU, then you don't need the best gaming CPU; you don't need a $400 Zen 2 CPU either; an older $100 i3 is enough.

    "Tomorrow" doesn't mean a decade away. Much faster GPUs will be available in one or two years, but the number of cores/threads used by game engines will not increase in the same proportion over that period, not even close. Reviewers know this, and that is why they perform "CPU tests" of gaming.

    I have been hearing the myth of parallelism in games since Bulldozer ("soon eight cores will be the minimum required for games!"), and I heard it again with Piledriver, with the PS4 and Xbox, with Mantle, with DX12, with Vulkan... The reality is that an ancient four-core i7 is still valid for a gaming build today.

    [gaming benchmark chart]

    Games are serial algorithms. There is a continuous cycle that updates the game state from user input.

    State --> Action --> State --> Action --> State --> Action --> State ···

    This is the master thread; everything else (rendering, AI, sound processing, ...) can be moved to slave threads executed on additional cores, but those slave threads are launched, controlled, synchronized, and ended by the master thread.

    So the performance of the game will continue to be set by the master thread. If the core running the master thread bottlenecks, then it doesn't matter whether you have eight cores or a hundred cores executing slave threads; the game will run exactly the same, because the extra cores will be waiting on the core that executes the master thread.
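
    To make that concrete, here's a minimal sketch (in C++, with made-up task names; no real engine is structured exactly like this) of the master/slave frame loop just described:

    Code:
    // Master/slave frame loop sketch. Build with: g++ -std=c++11 -pthread
    #include <cstdio>
    #include <functional>
    #include <thread>

    struct GameState { int frame = 0; };

    // Slave tasks: each reads a snapshot of the state on another core.
    void renderFrame(const GameState& s) { std::printf("render frame %d\n", s.frame); }
    void runAI(const GameState& s)       { std::printf("AI     frame %d\n", s.frame); }
    void mixAudio(const GameState& s)    { std::printf("audio  frame %d\n", s.frame); }

    int main() {
        GameState state;
        for (int i = 0; i < 3; ++i) {  // three State -> Action iterations for the demo
            ++state.frame;             // master thread: the serial state update

            // Master launches the slave threads...
            std::thread render(renderFrame, std::cref(state));
            std::thread ai(runAI, std::cref(state));
            std::thread audio(mixAudio, std::cref(state));

            // ...and must synchronize with all of them before the next update.
            render.join();
            ai.join();
            audio.join();
        }
        return 0;
    }

    The join() barrier is the point: the master cannot start the next State --> Action step until every slave finishes, so if the master's own work is the slow part, cores beyond the handful of slave tasks simply sit idle.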

    [chart: Doom (Vulkan) per-core CPU load]
     
    Last edited: Jul 12, 2019
    drescherjm and IdiotInCharge like this.
  32. Lepardi

    Lepardi Limp Gawd

    Messages:
    197
    Joined:
    Nov 8, 2017
    Only an HT four-core. Games quickly evolved into taking advantage of 8 threads after the release of the 8-thread consoles.

    The same evolution will happen over the next few years, as the 2020 consoles will come with 8 cores and 16/24 threads.
     
  33. pandora's box

    pandora's box [H]ardness Supreme

    Messages:
    4,471
    Joined:
    Sep 7, 2004
    Bought a 3900X from Best Buy, upgrading from a 2600X. Will continue to use my ASRock X470 Taichi motherboard. Will be cooled using my Corsair H150 360mm AIO.

    The 3900X is total overkill for my needs (only gaming), but I decided to treat myself after a promotion at work :) The gains in minimum FPS are pretty impressive, even at 1440p. I game at 3440x1440 @ 100Hz.

    In regards to the games and CPU core debate: pretty sure the PS5 and the next Xbox will be using 8-core Ryzen CPUs, and as such, gaming PCs are going to need 8-core CPUs just to keep up, and that's not accounting for the usual overhead of games ported from consoles to PC.
     
  34. juanrga

    juanrga Pro-Intel / Anti-AMD Just FYI

    Messages:
    2,540
    Joined:
    Feb 22, 2017
    One or two cores are reserved for the system in current consoles, so only six or seven cores are accessible to games. I guess future Zen-based consoles will do something similar.
     
  35. Lepardi

    Lepardi Limp Gawd

    Messages:
    197
    Joined:
    Nov 8, 2017
    One is reserved for other stuff, yeah, so 7 cores. I think with the 2020 consoles they are stripping the reservations for useless Kinect stuff etc., so it would be even less of an impact. But 1 thread out of 16 wouldn't be much anyway.
     
  36. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    To a degree; those 'eight-thread' consoles (XB1, PS4) have eight incredibly weak cores, and the games they're running are console games that are limited in both scope and framerate.

    This upcoming generation represents the biggest leap in console performance relative to desktop performance since the original Xbox (which no one knew how to code for). That's less true now, but at the same time, single-core performance on these consoles is still behind desktop CPUs from ten years ago.

    What we expect from games running on the desktop is higher detail settings, higher resolution, and higher framerates in a variety of scenarios.

    Should be eight cores and sixteen threads.
     
  37. TMCM

    TMCM [H]ard|Gawd

    Messages:
    1,368
    Joined:
    Apr 15, 2003
    A little bummed that, for my purposes, there isn't enough of a performance increase to warrant upgrading my 6700K @ 4.8GHz. My wallet is happy though :D Maybe Zen 3 or whatever Intel has out in 2-3 years will justify the expense of upgrading.
     
    DuronBurgerMan likes this.
  38. TheHig

    TheHig Limp Gawd

    Messages:
    502
    Joined:
    Apr 9, 2016
    :dead: It's been said by others here, but restating anyway... :D

    For me: two real choices. The 3600, or go all-in on the 3900X.

    The 3600 is basically the CPU to get if you are a general user/gamer today. Price/performance is insane and it can be used on just about any board out there.

    The 3700X is really tempting for that "future-proofing," but even if you dabble in streaming and light creation tasks, is it worth $130 more than a 3600? I'm not so sure; spend the $130 on a GPU instead.

    The 3900X is tremendous for anyone who can actually put it to work, and seems well worth it if you can. Possibly the most "future-proof," if you care about that sort of thing. A term I don't love, but many people make purchases based on perceived longevity.
     
    sirmonkey1985 likes this.
  39. VIC-20

    VIC-20 Gawd

    Messages:
    879
    Joined:
    Mar 24, 2006
    That is an excellent point.
     
  40. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    54,224
    Joined:
    Feb 9, 2002
    I only partially agree with this sentiment. The only thing you save on is the motherboard, but unfortunately, that comes with a different kind of cost. The fact is, even mid-range AM4 motherboards feature more cost cutting than you often see on similar Intel boards. Hence the whole BIOS upgrade fiasco. Many of them cannot be flashed without a CPU / RAM installed. Any board with a 128Mbit flash ROM may also lose some features to accommodate the larger AGESA code required to support Ryzen 3000 series CPUs. This includes losing support for APUs, the UEFI GUI, and even RAID in some cases.

    Like it or not, Intel switching chipsets and sockets so often prevents issues like these. It does cost more money, but there are far fewer potential headaches than come with supporting broad CPU compatibility over two or three generations. Those same motherboards with 128Mbit BIOSes will be even more problematic if a fourth-generation Ryzen stays on AM4. While X570 has better VRMs across the broader spectrum of models, X370 and X470 motherboards often have cut-rate VRMs incapable of running some of the higher core count CPUs beyond stock speeds. This could even potentially impact boost clocks. We saw similar issues with AM3 / AM3+ motherboards, where some could only handle 95W TDP CPUs while the 130W options required higher-end boards. That's basically where we are headed with AM4.
     
    otg, TheHig and IdiotInCharge like this.