AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

With Ryzen processors like this, I can't wait to see what the next batch of Threadrippers can do for me.
12 cores @ 4GHz now; I wouldn't mind 16 cores @ 5GHz.
 
As a gamer I've yet to see my quad core left wanting. As an IT guy I've seen very few office workloads that require more than four cores. A few 2D CAD machines and extreme Excel users get hex-core i7-8700 CPUs, and those users are happy.

I'll go 16 cores just because I want to but I laugh at the thought of it. One of my servers runs 16 cores and can support 44 VMs and serve data to 1100+ employees. Oh what glorious excess we have access to now.

Honestly, I don't feel that gaming should be the benchmark we use for these things. We all hate to admit it to ourselves, but PC gamers make up a small percentage of users. I do feel you on the sentiment though.

I think software is starting to change, and as these systems become more widespread I believe we will see the developers follow. I use Pix4D software, for example, and it doesn't even start seeing diminishing returns until 56 cores or so (according to the emails I got from them). Now if we could just get SolidWorks to do the same...
 
Compiling and AutoCAD-type work (Inventor, Eagle, Fusion) are my loads, with a bit of DaVinci on the side. Running on a 9700K/1080 Ti with 32GB of RAM, it barely touches the sides...
 
To be fair - I'm not saying AMD shouldn't do this. I'm questioning the use case for the average Jane/Joe.
 
What was the use case for average consumers when AMD released the first dual-core?

Single to dual and dual to quad made sense- lots of background processes, some multithreaded software.

Quad to 8 made sense from a logistics point of view. But unless you have embarrassingly parallel loads (like encoding or rendering), getting more out is really hard. Even if you do, a better option is to push that work onto a dedicated processor, e.g. a GPU or SIMD unit.

If the code is 50-75% parallel, going from 8 to 16 cores nets roughly a 6-16% speedup for twice the number of cores, and that's not even taking into account I/O bottlenecks, because you have to feed the beast.

https://en.m.wikipedia.org/wiki/Amdahl's_law
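If anyone wants to check that arithmetic, here's a quick back-of-envelope sketch (plain Python; the parallel fractions are just illustrative assumptions):

[code]
def amdahl_speedup(p, n_cores):
    # Amdahl's law: speedup for a workload that is fraction p parallel.
    return 1.0 / ((1.0 - p) + p / n_cores)

for p in (0.50, 0.75, 0.95):
    s8, s16 = amdahl_speedup(p, 8), amdahl_speedup(p, 16)
    gain = (s16 / s8 - 1.0) * 100.0
    print(f"{p:.0%} parallel: 8 cores {s8:.2f}x, 16 cores {s16:.2f}x, 8->16 gain {gain:.0f}%")
[/code]

Even at 95% parallel, doubling from 8 to 16 cores only buys you about a 54% throughput gain.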


Case in point (without I/O bottlenecks) https://www.anandtech.com/bench/product/2258?vs=2272

This is different from mobile phones, where you typically have low-power cores alongside the performance cores, so the extra cores are there for power saving.
 
What was the use case for average consumers when AMD released the first dual-core?

The use case was almost everything, other than gaming.

The first dual core was a staggering increase in PC capability at the time, and, funnily enough, it was advertised to gamers despite a lack of software support.
 

Exactly, and as I mentioned before, those encoding/rendering tasks are typically (for home computers) going to be non-real-time stuff suitable for batching.

I used to encode a fair bit of video. Sometimes hours/day, but usually I just batched it overnight while I slept. So it didn't matter if it took 4 hours or 1 hour. Buying more CPU to speed up work that happens when I sleep seems pointless to me.

Speeding up real-time usage matters much more. I'll take faster 6-8 cores over slower 12-16 cores every time.
 

I have an unusual use case - I work and study from home, and use one PC to do it all. The biggest transient load I have is typically compiling, and that is usually very quick, especially with recent processors. Up to 5 minutes of compile time is more than workable; it gives me time to make a cup of tea. :) (Though I confess, it's usually less than 1.)
 
My DAW would love all these cores, especially on larger projects where I'm running a bunch of virtual instruments and guitar and vocal effects.
 
I'd rather have the extra frequency boost of the 12-core (5GHz) than extra cores that I know I have no means to use. I can't even use the 8 I already have at 100% without synthetic benchmarks.
 
You'll be saying the same thing when it's for sale on Amazon and Newegg...very predictable...

Well I would have used rumor instead of hearsay. But he is correct. Any of the details so far are nothing but rumor.

As far as how realistic those rumors are, I would bet we aren't going to see the AdoredTV wishful-thinking 5.1GHz-boost 16-core happen.

Almost across the board, Adored AMD rumors were wishful thinking with super low prices, super high performance, and super low power. It was almost certainly completely made up BS.

I'd love for them to be true, but really when things sound too good to be true, they usually are.

You have to be especially skeptical when they are telling you exactly what you want to hear.
 

Well, let's see where it ends up. Here's hoping we can stop hearing from a certain member here crying "wolf" about all the "fakes."
 
I would not hold my breath on any overclock potential.

If there's potential to get the clockspeed, odds are AMD's turbo will already eke it out for you.

Has nothing to do with 7nm vs 12nm vs whatever... it just has to do with power management getting that much better over the years. Ryzen hasn't been a great overclocker because it's had great power management and gets great turbo clocks.
 

It is rather simple: there are rumours, and then there are rumours that make sense.
Given that AMD has a clear advantage in pushing cores, because that is where they do better, the design is key to flooding the market with as many cores as they can. That pushes their agenda and leaves them better placed for when the competition comes back.

When the market is flooded with 12 and 16 cores on the desktop, and game APIs like Vulkan or DX12 push more multi-threaded software, why would people then go back to 8- or 6-core CPUs when the landscape has changed in favour of AMD?

People denying this are purposely fooling themselves. The desktop market strategy is sound; AMD's only target for higher margins is server. And the server parts are also cheaper than what the competition offers. The argument for higher prices tends to come from people who project Intel's desktop marketing strategy (pricing) onto AMD, and that has never happened before; there is no good reason why it would happen now.
 

I don't think anyone claims 12 and 16 cores aren't coming. That is pretty much confirmed by AMD themselves. They showed the package for Ryzen 3000 with one chiplet, and Lisa Su pointed out the empty chiplet slot, and said they were going to fill it. So 12 and 16 cores have been confirmed by the CEO, no less.

What is nonsense are the clockspeed, pricing, and high-core-count APU claims for the CPUs, and, on the GPU side, the massive power reductions and Nvidia performance for something like half the price.

That nonsense from Adored is just a laundry list of wishful thinking that people want to hear.
 

Joel Hruska:

There has been a surfeit of what Alan Greenspan might have called "irrational exuberance" surrounding AMD and 7nm technology for both Ryzen and Navi. It appears to be fed by fanboys with no concept of how over-hyping the technology cycle behind a company can lead to fans being angry and even vengeful when AMD "fails" to deliver on promises they never made. Widespread coverage of these rumors can lead to them being treated as facts or near-facts, despite AMD doing absolutely nothing to confirm them.

The basic argument is the same, and goes like this:

1). AMD is about to do something extraordinary.

2). AMD, being run by idiots, will choose to sell their extraordinary new product for roughly half the price as the competition, despite the fact that what AMD needs, more than anything, is stable, long-term profits and strong revenue gain across multiple market shares.

3). Even though the only way to establish #2 is by investing in one's own products and growing revenue, people expect that AMD will starve itself in the name of gaining market share, even though "Lose money on every product and make it up with volume," is not actually a winning move.

4). This practical issue will be solved with chiplets, because chiplets are magic, and 7nm wafers are not more expensive, and design costs have not risen, and AMD is not trying to break into markets like AI and deep learning where Nvidia has an enormous institutional advantage. AMD certainly isn't facing an entrenched competitor like Intel, whose quarterly profits dwarf AMD's by orders of magnitude.

5). The fact that 10nm has slipped so badly is proof that Intel can no longer compete and will slowly be destroyed by ARM and AMD while AMD takes over its market and rules the Earth.

The most annoying thing about all of this is that you could hit "Rewind" and turn the clock back to early 2006. They're basically the same arguments with updated product names (and, of course, the fact that AMD didn't own ATI in early 2006).

I expect AMD to take advantage of 7nm to build a much more competitive Navi than Vega or Polaris have been. I think they will offer a much higher level of performance per dollar and performance per watt. I have not made specific predictions past that because the rumor mill has done a lot of churning about Navi and most of it has been stupid. AMD will not launch an RTX 2070 killer at $250 because AMD isn't going to leave all that money on the table when it desperately needs revenue to fuel its own R&D. AMD wants to play in AI and DL. Nvidia owns those markets so completely, AMD is basically fighting to be a footnote. So clearly, the right solution is to make as much money as possible and plow that back into the business as quickly as possible, in order to build more aggressive AI-focused products on 7nm and steal a march on Nvidia.

Just kidding.

What I meant was, "The smart thing to do is to sell each GPU for one penny above cost, to make the fanboys happy."

(To be absolutely clear, I am not annoyed with you or any commenter specifically. I am tired of chasing down and debunking bad rumors based on dumb data).

I think Navi will be good. I share your concern about how good it will be because AMD has had a hard time securing a straight win against Nvidia in most market segments (the RX 570 is a blowout win against both the GTX 1050 Ti and the GTX 1650, but that's the exception that proves the rule). I think the $330 price tag on an RTX 2070 competitor is probably low, but it's not unbelievably, insanely low. The $250 rumor was.

The rumor mill all-too-often confuses “AMD will make a very competitive / superior play in terms of performance per dollar” with “AMD will gut its own profit margins in the name of offering an unsustainably good deal.”
 

Always wondered if the hype train was largely fuelled by sneaky marketing teams for Intel/Nvidia.
 

I doubt it. People just get overly excited and want the "underdog" to beat down their "evil" competitors. People get attached to "their" corporation, which leads to wanting to believe every good rumor and ignore anything that says otherwise, and that leads to massive overhype as the rumors keep getting spread around. In turn, everyone else starts to believe those rumors as they keep getting repeated, so when the products come out and don't live up to those overhyped expectations, they're suddenly "underwhelming" and "disappointing". Intel and Nvidia don't need to do anything; blind AMD fanboys do it to themselves and have been doing so for years.
 
Nice but, gone are the days when I will upgrade my CPU every year or even every other year. Maybe if it doubled the performance but still, not needed, at least for me. That said, good to see that AMD's death was greatly exaggerated.
 
Oh yeah, the rabid fanboys make for interesting encounters, and that goes for either end of the stick, the nervous non-AMD types and vice versa.
That said, for the AMD side it's gone on so long that you'd expect they'd have figured it out by now - over a decade anyway in my experience - which leads me to wonder if the less experienced members of the community are partially being baited by experienced marketing teams and sockpuppets.

Who knows, but it's one thing that never changes every time something AMD comes out.
 

Personally, I would have thought attitudes like the one here would have gone away long ago. Oh well, just a new gen of bashing AMD.
 

The only ones who typically make this low-price prediction are Nvidia fans wanting the prices on Nvidia cards to go down. There will be rumors, some realistic and some not, but it is not your job to run around and prove them one way or the other.
 
Didn't read all of mockingbird's post. Was it something about 32 cores at 6+GHz (and a 60W TDP) for $250?

;)

Yeah, he made a few good points. But I still think this upcoming Ryzen release is gonna be great. And Navi? Well, we'll see.
 
I'm still torn on this. As a 2700X power user I'm all for more cores. But this splits my thinking:

On the one hand, getting those cores would require multiple 8-core chiplets, which in turn might cause memory bandwidth issues for software that actually uses all the cores (according to some users posting here - since I'm not a software dev, I'll take their word on that).

On the other hand, software using more cores is and always has been a chicken-and-egg situation. If AMD didn't push the core count, we'd still be stuck with Intel 4-core and 4C/8T chips, which of course means software devs wouldn't be writing to take advantage of more cores than that. I'd wager that with 5G, cloud computing/services, and AI infecting everything, more cores will be the answer, especially for normies who buy a computer once and run it until the wheels fall off, not [H]'ers who upgrade all the time or have multiple boxes running simultaneously.

Personally, if the 3xxx chips can hit the boosts that everyone is hoping for, then I'm in for whatever model gets us there - maybe the 12-core if it works out to be faster than the 16-core.
 

Not much software uses 4 cores, let alone 8. The 16 threads on my 2700X mostly sit idle. I occasionally do some encoding that will use every available thread, but in daily use clock rate/IPC is much more important than 8 bazillion cores.
 
On the one hand, getting those cores would require multiple 8-core chiplets, which in turn might cause memory bandwidth issues for software that actually uses all the cores (according to some users posting here - since I'm not a software dev, I'll take their word on that).

https://www.pcworld.com/article/329...ng-amds-32-core-threadripper-performance.html 32 cores/4 channels


8 cores/2 channels - if it wasn't bandwidth constrained, then you wouldn't see a benefit from faster memory speeds. (Yes, I know the uncore runs faster with memory speed increases, but that's not going to make as dramatic a difference.)

TL;DR: a Ryzen 8-core/2-channel setup hits its stride at about 3466MT/s, memory-bandwidth-wise. Now, assuming caching and other nice things, you could say that a 16-core would need about 5000MT/s worth of speed on the memory bus to hit the same "stride" on two channels.
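A rough sketch of that scaling argument (Python; the 3466MT/s figure is the "stride" point above, and the linear per-core scaling is a naive assumption that ignores caching entirely):

[code]
# Naive scaling: keep the same memory bandwidth per core as an
# 8-core/2-channel part that stops being starved around DDR4-3466.
BASELINE_CORES = 8
BASELINE_MTS = 3466              # the "stride" point quoted above
CHANNELS = 2
BYTES_PER_TRANSFER = 8           # 64-bit DDR channel

per_core_gbs = BASELINE_MTS * 1e6 * BYTES_PER_TRANSFER * CHANNELS / BASELINE_CORES / 1e9

for cores in (12, 16):
    needed_mts = BASELINE_MTS * cores / BASELINE_CORES
    print(f"{cores} cores, {CHANNELS} channels: ~{needed_mts:.0f} MT/s "
          f"to hold ~{per_core_gbs:.1f} GB/s per core")
[/code]

The naive linear number for 16 cores is closer to ~6900MT/s, so the ~5000MT/s estimate above is already crediting caches and locality with absorbing a good chunk of the difference.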
 

The other thing to consider, if not a better IMC and faster DDR4, is the DDR5 wildcard - I haven't seen confirmation either way.
 

You didn't take time to look :)

A better IMC won't make up for gaping bandwidth needs (see the differences between Ryzen 1xxx and Ryzen 2xxx, for example).

DDR4, on release, was not faster than DDR3 - it started at about 2133-2400MT/s. In fact, DDR3 was faster.

I've seen reports of DDR5 being released towards the end of this year at 4800MT/s, with that being about 1.87x the effective bandwidth of 3200MT/s DDR4.

I've also seen reports of DDR5 being postponed to 2020.

I would say that DDR5 is due on platforms about mid next year, based on these reports - it would make sense for AMD to pursue a new socket for this...
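For what it's worth, the raw dual-channel peak numbers behind those reports look like this (quick Python sketch; the 1.87x figure is a vendor effective-bandwidth claim, since the raw transfer-rate ratio is only 1.5x):

[code]
def peak_bw_gbs(mts, channels=2, bytes_per_transfer=8):
    # Theoretical peak: MT/s x 8 bytes per 64-bit channel x number of channels.
    return mts * 1e6 * bytes_per_transfer * channels / 1e9

ddr4 = peak_bw_gbs(3200)   # ~51.2 GB/s dual channel
ddr5 = peak_bw_gbs(4800)   # ~76.8 GB/s dual channel
print(f"DDR4-3200: {ddr4:.1f} GB/s, DDR5-4800: {ddr5:.1f} GB/s, raw ratio {ddr5 / ddr4:.2f}x")
[/code]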
 