AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

Discussion in 'HardForum Tech News' started by sknight, May 10, 2019.

  1. legcramp

    legcramp [H]ardForum Junkie

    Messages:
    10,822
    Joined:
    Aug 16, 2004
    This is getting juicy.
     
  2. jfreund

    jfreund Gawd

    Messages:
    953
    Joined:
    Sep 3, 2006
    I'll take 8 cores on a single chiplet at 5+ GHz, please.
     
  3. Oldmodder

    Oldmodder Gawd

    Messages:
    707
    Joined:
    Aug 24, 2018
With Ryzen processors like this, I can't wait to see what the next batch of Threadrippers can do for me.
    12 cores @ 4 GHz now; I wouldn't mind 16 cores @ 5 GHz.
     
  4. KarsusTG

    KarsusTG 2[H]4U

    Messages:
    3,012
    Joined:
    Aug 27, 2010
Honestly, I don't feel that gaming should be the benchmark we use for these things. We all hate to admit it to ourselves, but PC gamers make up a small percentage of users. I do feel you on the sentiment, though.

    I think software is starting to change, and as these systems become more widespread I believe we will see the developers follow. I use Pix4D software for example, and it doesn't even start seeing diminishing returns until 56 cores or so (according to the emails I got from them.) Now if we could just get Solidworks to do the same...
     
    N4CR and Keljian like this.
  5. Keljian

    Keljian Gawd

    Messages:
    646
    Joined:
    Nov 7, 2006
Compiling and AutoCAD (Inventor, Eagle, Fusion) are my loads, with a bit of DaVinci on the side. Running on a 9700K/1080 Ti with 32 GB of RAM; it barely touches the sides...
     
  6. Master_shake_

    Master_shake_ [H]ardForum Junkie

    Messages:
    9,462
    Joined:
    Apr 9, 2012
and to think, just two years ago we were sitting through the release of quad core after quad core on the mainstream.
     
  7. Keljian

    Keljian Gawd

    Messages:
    646
    Joined:
    Nov 7, 2006
    To be fair - I'm not saying AMD shouldn't do this. I'm questioning the use case for the average Jane/Joe
     
  8. DooKey

    DooKey [H]ardness Supreme

    Messages:
    8,105
    Joined:
    Apr 25, 2001
    You'll be waiting a while for that. It's not coming anytime soon.
     
    GhostCow likes this.
  9. Teenk9

    Teenk9 Gawd

    Messages:
    974
    Joined:
    Jul 22, 2011
    What was the use case for average consumers when AMD released the first dual-core?
     
    Darth Kyrie likes this.
  10. Keljian

    Keljian Gawd

    Messages:
    646
    Joined:
    Nov 7, 2006
Single to dual and dual to quad made sense: lots of background processes, some multithreaded software.

    Quad to 8 made sense from a logistics point of view. But unless you have embarrassingly parallel loads (like encoding or rendering), getting more out is really hard. And even if you do, a better option is to put that work into a dedicated processor, e.g. a GPU or SIMD unit.

    If the code is 50-75% parallel, going from 8 to 16 cores nets roughly a 6-16% speed-up for twice the number of cores, and that's not even taking into account I/O bottlenecks, because you have to feed the beast.

    https://en.m.wikipedia.org/wiki/Amdahl's_law
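    If you want to check that arithmetic yourself, here's a minimal sketch (my own numbers plugged into the Amdahl's-law formula from that article, not measurements):

    ```python
    # Amdahl's law: speedup on n cores when a fraction p of the work is parallel.
    def speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    for p in (0.50, 0.75):
        gain = speedup(p, 16) / speedup(p, 8) - 1.0
        print(f"{p:.0%} parallel: 8 -> 16 cores gains {gain:.1%}")
    # 50% parallel gains ~5.9%; 75% parallel gains ~15.8%
    ```

    Doubling the cores only pays off when the parallel fraction is very high, which is the whole point.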


    Case in point (without I/O bottlenecks) https://www.anandtech.com/bench/product/2258?vs=2272

This is different from mobile phones: phones typically have low-power cores alongside the performance cores, so the extra cores are there for power saving.
     
    Last edited: May 12, 2019
  11. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
I guess you've never gone to college, because college papers require citing reputable sources.

    Heck, this is something that's even taught in high school.
     
    Nolan7689 and Hakaba like this.
  12. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,157
    Joined:
    Sep 23, 2005
The use case was almost everything other than gaming.

    The first dual core was a staggering increase in PC capability at the time, and funnily enough it was advertised to gamers despite a lack of software support.
     
    GhostCow and Red Falcon like this.
  13. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,352
    Joined:
    Apr 22, 2006
    Exactly, and as I mentioned before, those encoding/rendering tasks are typically (for the home computers) going to be non real time stuff suitable for batching.

    I used to encode a fair bit of video. Sometimes hours/day, but usually I just batched it overnight while I slept. So it didn't matter if it took 4 hours or 1 hour. Buying more CPU to speed up work that happens when I sleep seems pointless to me.

    Speeding up the real time usage matters much more. I'll take faster 6-8 cores, over slower 12-16 cores every time.
     
    GhostCow and Keljian like this.
  14. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,157
    Joined:
    Sep 23, 2005
I don't have any need for 8 cores at home either. I'll take it, don't get me wrong, but I'll take <8 cores @ 5 GHz over >8 cores that barely maintain 4.4 GHz.
     
  15. Keljian

    Keljian Gawd

    Messages:
    646
    Joined:
    Nov 7, 2006
I have an unusual use case: I work and study from home, and use one PC to do it all. The biggest transient load I have is typically compiling, and that is usually very quick, especially with recent processors. Up to 5 minutes of compile time is more than workable; it gives me time to make a cup of tea :) (though I confess, it's usually less than 1).
     
    Last edited: May 11, 2019
  16. Smashing Young Man

    Smashing Young Man [H]ard|Gawd

    Messages:
    1,538
    Joined:
    Sep 11, 2009
    My DAW would love all these cores, especially on larger projects where I'm running a bunch of virtual instruments and guitar and vocal effects.
     
    Sulphademus likes this.
  17. Taldren

    Taldren Gawd

    Messages:
    520
    Joined:
    Nov 28, 2006
I'd rather have the extra frequency boost of the 12-core (5 GHz) than extra cores that I know I have no means to use. I can't even load the 8 I already have to 100% without synthetic benchmarks.
     
    Jim Kim likes this.
  18. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,789
    Joined:
    Jul 29, 2009
That is how the ball gets rolling :) . Once more people have these CPUs, software will follow ;)
     
    TheFlayedMan and Darth Kyrie like this.
  19. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    24,489
    Joined:
    Feb 1, 2005
    You'll be saying the same thing when it's for sale on Amazon and Newegg...very predictable...
     
  20. DrDoU

    DrDoU 2[H]4U

    Messages:
    2,393
    Joined:
    Jun 4, 2007
Bring on the 12 cores.
    Odd core counts? I had a 3-core 720 @ 3.4 GHz.
     
  21. thebufenator

    thebufenator [H]ard|Gawd

    Messages:
    1,209
    Joined:
    Dec 8, 2004
    I am sure glad Mockingbird is here to tell us its all just rumors.

    NO SHIT
     
  22. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,352
    Joined:
    Apr 22, 2006
Well, I would have used "rumor" instead of "hearsay," but he is correct. All of the details so far are nothing but rumor.

As far as how realistic those rumors are, I would bet we aren't going to see the AdoredTV wishful-thinking 5.1 GHz boost on a 16-core happen.

    Almost across the board, Adored's AMD rumors were wishful thinking: super low prices, super high performance, and super low power. It was almost certainly completely made-up BS.

    I'd love for them to be true, but when things sound too good to be true, they usually are.

    You have to be especially skeptical when they are telling you exactly what you want to hear.
     
    BB Gun, Revdarian, DooKey and 2 others like this.
  23. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    24,489
    Joined:
    Feb 1, 2005
Well, let's see where it ends up. Here's hoping we can stop hearing a certain member here crying "wolf" about all the "fakes."
     
    Darth Kyrie likes this.
  24. Brian_B

    Brian_B 2[H]4U

    Messages:
    3,328
    Joined:
    Mar 23, 2012
I would not hold my breath on any overclocking potential.

    If there's potential to hit a clock speed, odds are AMD's turbo will already eke it out for you.

    It has nothing to do with 7nm vs 12nm vs whatever; it just has to do with power management getting that much better over the years. Ryzen hasn't been a great overclocker because it has great power management and already gets great turbo clocks.
     
  25. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,789
    Joined:
    Jul 29, 2009
It is rather simple: there are rumours, and then there are rumours that make sense.
    Given that AMD has a clear advantage pushing cores, because that is where they do better, the key design move is to flood the market with as many cores as they can. This pushes their agenda and also leaves them better positioned for when the competition comes back.

    When the market is flooded with 12 and 16 cores on the desktop, and game APIs like Vulkan and DX12 are pushing more multithreaded software, why would people go back to 8- or 6-core CPUs when the landscape has changed in favour of AMD?

    People denying this are fooling themselves. The desktop market strategy is sound; AMD's only target for higher margins is servers, and even the server parts are cheaper than what the competition offers. The arguments for higher prices tend to come from people projecting Intel's desktop pricing strategy onto AMD, and that has never happened before; there is no good reason why it would happen now.
     
    Last edited: May 12, 2019
  26. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,352
    Joined:
    Apr 22, 2006
I don't think anyone claims 12 and 16 cores aren't coming. That is pretty much confirmed by AMD themselves: they showed the package for Ryzen 3000 with one chiplet, and Lisa Su pointed out the empty chiplet slot and said they were going to fill it. So 12 and 16 cores have been confirmed by the CEO, no less.

    What is nonsense are the clockspeed, pricing, and high-core-count APU claims for the CPUs, and on the GPU side, the massive power reductions and Nvidia-level performance for something like half the price.

    That nonsense from Adored is just a laundry list of wishful thinking that people want to hear.
     
    Last edited: May 12, 2019
  27. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
    Joel Hruska:

    There has been a surfeit of what Alan Greenspan might have called "irrational exuberance" surrounding AMD and 7nm technology for both Ryzen and Navi. It appears to be fed by fanboys with no concept of how over-hyping the technology cycle behind a company can lead to fans being angry and even vengeful when AMD "fails" to deliver on promises they never made. Widespread coverage of these rumors can lead to them being treated as facts or near-facts, despite AMD doing absolutely nothing to confirm them.

    The basic argument is the same, and goes like this:

    1). AMD is about to do something extraordinary.

2). AMD, being run by idiots, will choose to sell their extraordinary new product for roughly half the price of the competition, despite the fact that what AMD needs, more than anything, is stable, long-term profits and strong revenue gains across multiple market segments.

    3). Even though the only way to establish #2 is by investing in one's own products and growing revenue, people expect that AMD will starve itself in the name of gaining market share, even though "Lose money on every product and make it up with volume," is not actually a winning move.

    4). This practical issue will be solved with chiplets, because chiplets are magic, and 7nm wafers are not more expensive, and design costs have not risen, and AMD is not trying to break into markets like AI and deep learning where Nvidia has an enormous institutional advantage. AMD certainly isn't facing an entrenched competitor like Intel, whose quarterly profits dwarf AMD's by orders of magnitude.

    5). The fact that 10nm has slipped so badly is proof that Intel can no longer compete and will slowly be destroyed by ARM and AMD while AMD takes over its market and rules the Earth.

    The most annoying thing about all of this is that you could hit "Rewind" and turn the clock back to early 2006. They're basically the same arguments with updated product names (and, of course, the fact that AMD didn't own ATI in early 2006).

I expect AMD to take advantage of 7nm to build a much more competitive Navi than Vega or Polaris have been. I think they will offer a much higher level of performance per dollar and performance per watt. I have not made specific predictions past that, because the rumor mill has done a lot of churning about Navi and most of it has been stupid. AMD will not launch an RTX 2070 killer at $250, because AMD isn't going to leave all that money on the table when it desperately needs revenue to fuel its own R&D. AMD wants to play in AI and DL. Nvidia owns those markets so completely that AMD is basically fighting to be a footnote. So clearly, the right solution is to make as much money as possible and plow it back into the business as quickly as possible, in order to build more aggressive AI-focused products on 7nm and steal a march on Nvidia.

    Just kidding.

    What I meant was, "The smart thing to do is to sell each GPU for one penny above cost, to make the fanboys happy."

    (To be absolutely clear, I am not annoyed with you or any commenter specifically. I am tired of chasing down and debunking bad rumors based on dumb data).

    I think Navi will be good. I share your concern about how good it will be because AMD has had a hard time securing a straight win against Nvidia in most market segments (the RX 570 is a blowout win against both the GTX 1050 Ti and the GTX 1650, but that's the exception that proves the rule). I think the $330 price tag on an RTX 2070 competitor is probably low, but it's not unbelievably, insanely low. The $250 rumor was.

    The rumor mill all-too-often confuses “AMD will make a very competitive / superior play in terms of performance per dollar” with “AMD will gut its own profit margins in the name of offering an unsustainably good deal.”
     
  28. N4CR

    N4CR 2[H]4U

    Messages:
    3,863
    Joined:
    Oct 17, 2011
I've always wondered if the hype train was largely fuelled by sneaky marketing teams for Intel/Nvidia.
     
    Brian_B likes this.
  29. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,779
    Joined:
    Jan 31, 2008
I doubt it. People just get overly excited and want the "underdog" to beat down their "evil" competitors. People get attached to "their" corporation, which leads to believing every good rumor and ignoring anything that says otherwise, so the rumors keep getting spread around and massively overhyped. In turn, everyone else starts to believe those rumors as they keep getting repeated, so when the products come out and don't live up to those overhyped expectations, they're suddenly "underwhelming" and "disappointing." Intel and Nvidia don't need to do anything; AMD's blind fanboys do it to themselves and have been doing so for years.
     
    Brian_B and DooKey like this.
  30. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,934
    Joined:
    Oct 4, 2007
Nice, but gone are the days when I will upgrade my CPU every year or even every other year. Maybe if it doubled the performance, but still, it's not needed, at least for me. That said, it's good to see that reports of AMD's death were greatly exaggerated.
     
  31. N4CR

    N4CR 2[H]4U

    Messages:
    3,863
    Joined:
    Oct 17, 2011
Oh yeah, the rabid fanboys make for interesting encounters, and that goes for either end of the stick, the nervous non-AMD types and vice versa.
    That said, on the AMD side it's gone on so long that you'd expect they'd have figured it out by now (over a decade, anyway, in my experience), which led me to wonder if the less experienced members of the community are partially being baited by experienced marketing teams and sockpuppets.

    Who knows, but it's one thing that never changes every time something AMD comes out.
     
    Wade88 likes this.
  32. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,934
    Joined:
    Oct 4, 2007
    Personally, I would have thought attitudes like the one here would have gone away long ago. Oh well, just a new gen of bashing AMD.
     
    Darth Kyrie likes this.
  33. gigaxtreme1

    gigaxtreme1 2[H]4U

    Messages:
    3,507
    Joined:
    Oct 1, 2002
    It is what it is. Preconceptions are hard to live down. As they say, proof is in the pudding.
     
    Darth Kyrie and ManofGod like this.
  34. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,934
    Joined:
    Oct 4, 2007
The only ones who typically make this low-price prediction are Nvidia fans wanting the price of Nvidia cards to go down. There will be rumors, some realistic and some not, but it is not your job to run around and prove them one way or the other.
     
    Last edited: May 12, 2019
    Darth Kyrie and Pieter3dnow like this.
  35. c3k

    c3k 2[H]4U

    Messages:
    2,097
    Joined:
    Sep 8, 2007
    Didn't read all of mockingbird's post. Was it something about 32 cores at 6+GHz (and 60w TDP) for $250?

    ;)

    Yeah, he made a few good points. But I still think this upcoming Ryzen release is gonna be great. And Navi? Well, we'll see.
     
  36. dvsman

    dvsman 2[H]4U

    Messages:
    2,776
    Joined:
    Dec 2, 2009
I'm still torn on this. As a 2700X power user I'm all for more cores, but two things split my thinking:

    On the one hand, getting those cores requires multiple 8-core chiplets, which in turn might cause memory bandwidth issues for software that actually uses all the cores (according to some users posting here; since I'm not a software dev, I'll take their word on that).

    On the other hand, software using more cores is and always has been a chicken-and-egg situation. If AMD didn't push the core count, we'd still be stuck with Intel's 4C and 4C/8T chips, which of course meant software devs weren't writing code to take advantage of more cores than that. I'd wager that with 5G, cloud computing/services, and AI infecting everything, more cores will be the answer, especially for normies who buy a computer once and run it until the wheels fall off, not [H]'ers who upgrade all the time or have multiple boxes running simultaneously.

    Personally, if the 3xxx chips can hit the boosts everyone is hoping for, then I'm in for whatever model gets us there; maybe the 12-core, if it works out faster than the 16-core.
     
  37. jfreund

    jfreund Gawd

    Messages:
    953
    Joined:
    Sep 3, 2006
Not much software uses 4 cores, let alone 8. The 16 threads on my 2700X mostly sit idle. I occasionally do some encoding that will use every available thread, but in daily use, clock rate/IPC is much more important than 8 bazillion cores.
     
  38. Keljian

    Keljian Gawd

    Messages:
    646
    Joined:
    Nov 7, 2006
    https://www.pcworld.com/article/329...ng-amds-32-core-threadripper-performance.html 32 cores/4 channels


8 cores/2 channels: if it weren't bandwidth constrained, you wouldn't see a benefit from faster memory speeds. (Yes, I know the uncore runs faster as memory speed increases, but that's not going to make as dramatic a difference.)

    TL;DR: an 8-core/2-channel Ryzen hits its stride at about 3466 MT/s, memory-bandwidth-wise. Now, assuming caching and other nice things help, you could say a 16-core would need about 5000 MT/s on a two-channel memory bus to hit the same "stride."
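    A back-of-envelope sketch of that bandwidth arithmetic (my own assumptions, not from the linked article): each 64-bit DDR4 channel moves 8 bytes per transfer, so peak bandwidth is rate x 8 bytes x channels.

    ```python
    # Peak theoretical DDR4 bandwidth: 8 bytes per transfer per 64-bit channel.
    def peak_gbs(mt_per_s: int, channels: int) -> float:
        return mt_per_s * 8 * channels / 1000  # GB/s

    per_core_8 = peak_gbs(3466, 2) / 8  # ~6.9 GB/s per core at the "stride" point
    # Transfer rate needed to keep that per-core share with 16 cores on 2 channels:
    needed_mt = 3466 * 16 // 8          # 6932 MT/s, if caches absorb nothing
    print(per_core_8, needed_mt)
    ```

    Scaling linearly you'd need nearly 7000 MT/s; the ~5000 MT/s figure implicitly assumes caches soak up a good chunk of that demand.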
     
    Last edited: May 12, 2019
  39. N4CR

    N4CR 2[H]4U

    Messages:
    3,863
    Joined:
    Oct 17, 2011
The other thing to consider, if not a better IMC and higher DDR4 speeds, is the DDR5 wildcard; I haven't seen confirmation either way.
     
  40. Keljian

    Keljian Gawd

    Messages:
    646
    Joined:
    Nov 7, 2006
You didn't take time to look :)

    A better IMC won't make up for gaping bandwidth needs (see the differences between Ryzen 1xxx and Ryzen 2xxx, for example).

    DDR4, on release, was not faster than DDR3; it started at about 2133-2400. In fact, DDR3 was faster.

    I've seen reports of DDR5 being released towards the end of this year at 4800 MT/s, and of this being about 1.87x the effective speed of 3200 MT/s DDR4.

    I've also seen reports of DDR5 being postponed to 2020.

    I would say DDR5 is due on platforms about the middle of next year, based on these reports; it would make sense for AMD to pursue a new socket for it...
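    For reference, the raw transfer-rate math (a quick sketch of my own; the ~1.87x figure in those reports presumably folds in DDR5 efficiency gains on top of the clock bump, which is their claim, not mine):

    ```python
    # Peak bandwidth per 64-bit channel: transfers per second * 8 bytes.
    ddr4 = 3200 * 8 / 1000  # 25.6 GB/s per channel
    ddr5 = 4800 * 8 / 1000  # 38.4 GB/s per channel
    print(f"raw ratio: {ddr5 / ddr4:.2f}x")  # 1.50x from transfer rate alone
    ```

    So the jump from 1.5x raw to ~1.87x effective would have to come from bus-efficiency improvements, if those reports are right.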
     