AMD to Simultaneously Launch 3rd Gen Ryzen and Unveil Radeon "Navi" This June

I admit I'm not exactly happy with the later release compared to the 1000 and 2000 series Ryzen, where March/April was for AM4 and Threadripper came in the summer. I really hope the wait is worth it and that they announce a release date for Zen 2 Threadripper that ideally comes a month or two later, in August, as opposed to a late November or December "We got it in this year as we promised!" launch.

I'm definitely going to build at least one PC with a high end AMD CPU, so I'm eager to see if it's as awesome as all the predictions suggest. Hopefully they'll have a strong set of reasonably priced Threadripper models that offer high clock/per-core performance as well as numerous cores and high end platform features. It would be great to see another 16 core / 32 thread monster that can also reach 4.6-5.0 GHz with AIO-level cooling; ideally all-core, but perhaps even higher if only certain cores are in use. Plus, this will further encourage software development for multicore use.

I think the big difference with the "MOAR CORES" thing today, and why even Intel is on board, is that tech has changed big time since the days of Bulldozer. Even Zen+ cores are almost up to Intel's IPC and clocks, so even in the cases where single/few core performance is necessary they still make a strong showing. However, since it seems we've hit the wall as far as easily increasing frequency or IPC, multi-core is the way to grow, and we've seen that in CPU and GPU design all around. Heck, most if not all ARM-type mobile CPUs have multiple cores. Thus, more software than ever is written to take advantage of having many cores available and will likely continue to be, in gaming and elsewhere. This really differentiates things from the old days, when everything was coded for a single thread and AMD's single-thread performance was hugely below Intel's.
 
I'm definitely going to build at least one PC with a high end AMD CPU, so I'm eager to see if it's as awesome as all the predictions suggest.

There is no way the actual products match the hype.

It will be fun watching AdoredTV’s “credibility” go down the tubes.
 
There is no way the actual products match the hype.

It will be fun watching AdoredTV’s “credibility” go down the tubes.

It depends what the 'hype' is I guess.

If high end means higher core count and single-thread parity with the 9700K/9900K at less than crazy prices, then it seems I'm in for one or two systems.

But yeah, I think most of the hype currently is 5 GHz chips with a 10-15% IPC advantage over Intel's current lineup... :|
 
It depends what the 'hype' is I guess.

If high end means higher core count and single-thread parity with the 9700K/9900K at less than crazy prices, then it seems I'm in for one or two systems.

Everything (including prices)
 
Ahh, too bad then. Still waiting for the release to see what's up. If it means I end up getting a last-generation/Intel system for better price:perf, then so be it.

Already picked up a 2700X anyway for a small dev box.
 
That's called the GeForce RTX 2080

Yes, it might well be for many people, but I am building an all-AMD machine... not that I am a fanboy; my current TR is the first AMD CPU I've had since the Opteron 185, I think it was called, and that machine back then did not even have an ATI/AMD graphics card.
If Navi really is as low/mid range as suggested, then I might even take a small dive on performance, as I can afford to due to my display; I just need 1080p at 144 Hz with FreeSync.
As usual, price and performance are a big deal for me, not least now that I am on a pension and, aside from video work, don't really need all that much processing power.
I will of course take all the processing power I can get, but if I have to grab a bite of food or a cup of coffee while the computer chews on something, so be it, and should I be so lucky that someone makes a game I would consider, well, then I don't really need all the eye candy on MAX.
I am "really old", so I know I can have fun in a game without ray tracing and 4K resolution; both are nice things, but I can do without them if need be.
 
But why? With a few notable exceptions (rendering and encoding workloads) throwing more cores at a problem never helps.

So the cores just sit unutilized, taking up space and generating heat, preventing you from hitting higher clocks on the cores you are actually using.

We have finally gotten to the point where 4C/8T probably isn't enough. But today 6C/12T is enough. Make it 8C/16T to give yourself some room to grow.

The remaining 4 cores are just plain wasted.

Didn't we learn from the last time?

View attachment 152602

Except now Intel is doing it too...
3-5 years ago, I would fully agree with you, but today, 4 cores (Haswell-level performance or above) is barely enough, even for games (not counting older games, of course).
Most software is scaling to at least 8 cores and beyond, not just RTS games, and other software like content creation, video/audio/photo rendering, VMs, etc. will all massively benefit from more cores.

The main limitation, as showcased here on [H], is that beyond 16 cores with current Ryzen processors, the memory channels themselves become a bottleneck at 24 and 32 cores.
Yes, the thermals are also something to concern oneself with, you are right about that, for sure.

I do think the sweet spot at this point for gaming is 8-16 cores (maybe 6 if clocked high enough), and for content creation or professional applications, 16-32 cores.
The remaining cores are definitely not wasted, though, unless single-threaded software or older 4-thread-limited software is used - and while sitting idle, the additional heat generation should be minimal.

It is just nice for us to have the options, now, and not be limited to just 4 cores, or paying extreme amounts for 6-8 cores, and we have AMD to thank for that!
 
Yes, it might well be for many people, but I am building an all-AMD machine... not that I am a fanboy; my current TR is the first AMD CPU I've had since the Opteron 185, I think it was called, and that machine back then did not even have an ATI/AMD graphics card.

If you are buying because of the brand, you are, by definition, a fanboy.
 
by definition, a fanboy.

I agree, but I also don't. A little while back my excuse was that my monitor is FreeSync, but support for that has changed lately and it's no longer an AMD-only thing.
Most of my computers have been Intel. My first DIY build was a 700 MHz Slot A (or was it Socket A?) AMD CPU, then I was on Intel for a little while and Athlon likewise, and then the Opteron 185 (which I still have in a box, as those were codenamed "Denmark", which is where I live).
I also have my Northwood & Prescott CPUs, and the gen-1 Core i7 I just retired sits in its motherboard as a backup system, just in case.

At the moment AMD makes more sense, and in November last year I could get a 12-core / 24-thread one at a decent price. The graphics side of things is not as self-evident to me, but I really would like to go AMD there too and support the little guy; I have always had a thing for underdogs, which is also why I don't do any betting.
But given the new FreeSync support situation, Nvidia is not completely ruled out just yet, though I really would like to go AMD, and TBH I would like FreeSync 2 for when I might update my monitor, though that's not going to happen anytime soon - it's pretty much all down to luck in the lotto, and I never had any luck in gambling or love.

In June/July I can afford a Radeon VII, but I would love to be able to save a little money, as I have other things I need too, for the PC and other stuff I dabble with.
I sure as hell am not going to buy any of the factory Radeon VII cards due to the cooler, and I will have to run it on air for a while until I can afford an H2O loop for it.
 
At the moment AMD makes more sense, and in November last year I could get a 12-core / 24-thread one at a decent price

Let's say that you are buying the $499 12C/24T "Ryzen 9 3900X"*:

How likely would you use the stock cooler?

Or would you rather use an aftermarket cooler?

In your opinion, does the inclusion of a stock cooler add value to the product and make you more likely to purchase it?

* = hypothetical name
 
I have only used water cooling in the past 10-15 years, and since a computer of mine crossed over 1.4 GHz I have not used box coolers, though I have used a few high end heat-pipe coolers.
I have been overclocking since my 700 MHz machine; that CPU, equipped with a so-called GFD device (a set of DIP switches), could be overclocked, but I do recall the horrible cooler on that CPU and the similarly slightly insane cooler on the chipset.
So that in itself has ruled out box coolers for me.

I did use a box cooler for my friend's PC upgrade, but he is also a really cheap bastard and is happy if the computer turns on when he presses the button.

So I would always go for a tray CPU and an aftermarket cooling solution of some sort.
 
I have only used water cooling in the past 10-15 years, and since a computer of mine crossed over 1.4 GHz I have not used box coolers, though I have used a few high end heat-pipe coolers.
I have been overclocking since my 700 MHz machine; that CPU, equipped with a so-called GFD device (a set of DIP switches), could be overclocked, but I do recall the horrible cooler on that CPU and the similarly slightly insane cooler on the chipset.
So that in itself has ruled out box coolers for me.

I did use a box cooler for my friend's PC upgrade, but he is also a really cheap bastard and is happy if the computer turns on when he presses the button.

So I would always go for a tray CPU and an aftermarket cooling solution of some sort.

The only time I ever used a box cooler was on my i3-7100 router build. I have a collection of stock coolers, both AMD and Intel, that came out of the retail box and were set aside to never be looked at again.

To me they are a complete and total waste on anything except low end chips. I'd rather pay a few bucks less than get yet another bundled cooler I'll never use.
 
Let's say that you are buying the $499 12C/24T "Ryzen 9 3900X"*:

How likely would you use the stock cooler?

Or would you rather use an aftermarket cooler?

In your opinion, does the inclusion of a stock cooler add value to the product and make you more likely to purchase it?

* = hypothetical name

I'd use the stock cooler if:

1. The stock cooler wasn't particularly noisy.
-Note: I don't find stock Intel coolers noisy except on the 7700, which kept spinning up. Even then I probably would keep it.

2. It's proven to get more MHz by overclocking.
-I use a Noctua on my 4770K and got 700-800 MHz in the process. If the OC headroom on a chip was only an additional 100-300 MHz out of the box without exotic cooling, I probably wouldn't have bothered.
 
Let's say that you are buying the $499 12C/24T "Ryzen 9 3900X"*:

How likely would you use the stock cooler?

Or would you rather use an aftermarket cooler?

In your opinion, does the inclusion of a stock cooler add value to the product and make you more likely to purchase it?

* = hypothetical name

To be honest, I actually like AMD's stock coolers. I still use the stock one on my 2600X that I use for my flight sim; the CPU runs just fine at its 4.1 GHz boost with it.

For me, though, I feel it does add value because I don't hoard old coolers. I picked my R5 1600 instead of the 1600X at the time specifically because it had the stock cooler, so I could test the entire setup before mounting my AIO, which is a royal pain in the arse to mount and potentially have to demount. It also makes it easier to resell the processor, because not everyone cares about high end cooling but still wants a really nice processor.
 
There is no way the actual products match the hype.

It will be fun watching AdoredTV’s “credibility” go down the tubes.

It only took until the 2nd page for you to bring up AdoredTV...must be a record for you in your constant AMD bashing.
 
The only time I ever used a box cooler was on my i3-7100 router build. I have a collection of stock coolers, both AMD and Intel, that came out of the retail box and were set aside to never be looked at again.

To me they are a complete and total waste on anything except low end chips. I'd rather pay a few bucks less than get yet another bundled cooler I'll never use.

On both companies' 65 W chips, there is little need for anything except the stock cooler under normal boost conditions. Even with MCE enabled on an i5-8400, the stock cooler is fine for 99% of users. On the high end, I'd generally agree with you, though.
 
I'd pay more for an 8-core CPU at 5 GHz than a 12-core CPU at 4.8 GHz.
I agree with this - everything being equal, of course you'd want as many cores as possible, but if it's a trade-off between cores and clock speed (assuming IPC remains the same) then I'd rather have fewer cores that run faster. Of course, this assumes that however many cores you end up with is enough for your use case - for me, personally, 8 cores is enough and I'd not really see a huge benefit from 12 or 16 cores, but I might see a benefit going from 4.8 GHz to 5.0 GHz.

Simply throwing more and more cores at things doesn't always help, although there are undoubtedly situations where the more cores, the merrier.
 
to be honest i actually like AMD's stock coolers

I did too...until removing it bent the pins on my 2700x. Note this is after reading Kyle's disclaimer on his review saying this is a possibility. So even after being very gentle with it, I still had bent pins. Luckily I was able to straighten them out with a precision screwdriver, but never again.
 
On both companies' 65 W chips, there is little need for anything except the stock cooler under normal boost conditions. Even with MCE enabled on an i5-8400, the stock cooler is fine for 99% of users. On the high end, I'd generally agree with you, though.


Maybe things have changed, but box coolers used to cause CPUs to throttle like mad at load even at stock clocks, and be loud as fuck while doing it. I'm not willing to put up with that.

Bigger fans and more fins result in a quieter and more capable experience.
 
Navi 10 is a mid range part. It's only going to match a 2060/2070. What is supposed to be different (in theory) is the price for that performance. If you get an AMD card that's $350 and matches Nvidia's $500 card, it doesn't really matter if it's 6-9 months late (3Q'19 more likely I'd guess).
I'm not entirely convinced that this will be borne out in reality. It'd be great if Navi 10 can undercut Nvidia's offerings on price at the same level of performance, but I'm not sure that it'll happen, or at least to anywhere near the extent you seem to be hoping for. If AMD can produce a card that offers the same performance as a $500 Nvidia card, I don't personally believe that they will sell it for 70% of the price, for a couple of reasons:

Firstly, they want to make as much money as possible from each sale, and as much money as possible overall. If they price their new cards at 70% of Nvidia's equivalent then they need to sell over 40% more cards to make the same amount of money. I think it's much more realistic to assume that if they do undercut Nvidia on price, it'll be by a much more modest margin, perhaps 10% or 15%? The market has shown that it'll support current pricing (even if we all grumble, justifiably, about it). The logic is that lower prices = more sales, but I don't think it's quite that simple, which brings me on to my next point...
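(As a quick sanity check on that arithmetic, using the hypothetical $350 vs. $500 figures above: $500 / $350 ≈ 1.43, so at the lower price AMD would need to sell roughly 43% more cards to bring in the same revenue, before per-unit costs even enter into it.)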

Secondly, pricing your products much lower than the competition can lead to a perception that your offering is somehow of lower quality, because it's much cheaper. AMD already have to fight against the perception that they don't compete with Nvidia on quality (just look at the market share to see that customers overwhelmingly prefer to buy Nvidia, even though they might not be getting the best value by doing so). Being a "value" brand competing in a two-horse race with a "premium" brand will only get you so far.

Thirdly, entering into a price war right out of the gate may actually be harmful for AMD longer-term. If Nvidia respond aggressively, what do AMD do next? If they can't differentiate on price, and already have a much smaller market share, then this could go really badly for them. I'm not convinced that Nvidia would seek price parity with AMD, partly because of the "quality" perception issue above, but they could certainly reduce their own prices to the point where customers would still prefer to buy Nvidia due to the perceived quality difference (i.e. "Nvidia products are better, so it's fair to pay 15% more for them", or whatever numbers may apply).

On the flip side, AMD do lag behind in terms of market share, so one argument would be that you can drive market share by undercutting on price. I guess it just comes down to what you believe a realistic margin (below Nvidia's pricing) actually is. As I've said above, I think it's probably closer to 10%-15% rather than 30%. One other thing to consider is that Nvidia caught a lot of shit for bumping the prices from generation to generation (especially going from the 900 series to the 1000 series, but also for Turing): Navi 10 might be relatively cheap in terms of R&D costs given that it's a die shrink of an existing architecture but at some point AMD will have to recoup some serious R&D costs when they finally do away with GCN, and that means a price hike (if they dropped prices for Navi 10). Again, you could argue that dropping prices now might lead to greater market share which means more sales later, but if customer loyalty is bought by price drops then it's lost by price hikes, so it's swings and roundabouts.

I do think that there's an element of wishful thinking every time AMD line up a new GPU, that somehow it's going to be performance-competitive and significantly cheaper. Past experience tells us that this typically doesn't happen: new AMD cards either slot in between Nvidia's offerings at suitably scaled prices, or in the case of VII they drop in at the same price and performance point. I accept that VII may not be the best example because it's something of a one-off and served an additional purpose in terms of showing that AMD can compete at the high (if not highest) end. Rumours of 2070 performance at $250 or 2060 performance at $200 or whatever it was that AdoredTV came out with in December don't really help anyone.
 
I did too...until removing it bent the pins on my 2700x. Note this is after reading Kyle's disclaimer on his review saying this is a possibility. So even after being very gentle with it, I still had bent pins. Luckily I was able to straighten them out with a precision screwdriver, but never again.

Luckily I have a long history of using AMD processors, so I'm used to the pins. I've always just unplugged the fan and let it sit in the BIOS for a couple of minutes; that usually softens the thermal compound enough to just press down while twisting and pull the heatsink off.
 
Anyway, this is not "news" for me, as I said this months ago:

https://hardforum.com/threads/an-update-for-2019.1975750/

We literally do not care that you think this is not news; in fact, all you have done is derail the thread and continue to rail against your favorite YouTube personality. You have been a broken record lately and just love repeating the same thing over and over again.

Now for the thread, I for one am looking forward to seeing how these new chips do in June, just sucks there will be no [H] review.
 
I'm not saying you're wrong. I'm saying I hope you're wrong. And if it turns out you are wrong I'm going to find this thread... and I'm going to dance upon your posts proclaiming just how wrong you were. On the inverse if you are right I will quietly ignore this thread in the era of what I want to hear is what I should hear. Plus... meh...


I am hopeful as well, but from the rumors thus far it does sound like June will bring us a Navi 10 announcement, which will be a next-gen mid-range product like the RX 580. Navi 20 comes a year after that (so late 2020?) and only barely beats a 2080 Ti, but who knows what Nvidia will have on the market at that point. They aren't very likely to just sit on their hands and wait.
 
<Sigh>...if you insist...

That's called the GeForce RTX 2080

Two separate April Fools posts in Processors and video cards...

The user was asking about a video card with similar performance to the Radeon VII, but more power efficient.

A product that fits that description already exists: the GeForce RTX 2080

Navi Rumors thread...

...you mean the one where I did analysis of memory bandwidth and said that NVIDIA has better delta color compression than AMD does

Hardly anyone would argue with that.

The most cruel joke will be when reality hits those who were gullible enough to believe AdoredTV's fake "leaks"

It will be a big April Fools joke that just happens to fall outside of April 1.

April Fools!

...and?

I wasn't bashing AMD.

What makes you think I was "bashing AMD"?

I'm not going to go crazy, but there's definitely a pattern. But in fairness...it's far more AdoredTV bashing because of that leak video.

...but I thought you said that I was "bashing AMD". You didn't say I was bashing FakeTV.


If (and I realize it's a big if) Zen 2 and Navi perform and cost anywhere near what Adored said, you'll have a lot of crow to eat. I for one hope they perform and cost like that, not to prove you wrong (because at the end of the day, nobody gives a shit), but to bring some sense back to video card pricing and light a fire under Intel before we get more +'s at the end of 14nm.


I don't have to "eat crow" because I already know FakeTV's dubious "leak" is wrong.

Do you remember when AMD released 3rd gen Ryzen and Navi at CES?

I don't think so.
 
3-5 years ago, I would fully agree with you, but today, 4 cores (Haswell-level performance or above) is barely enough, even for games (not counting older games, of course).
Most software is scaling to at least 8 cores and beyond, not just RTS games, and other software like content creation, video/audio/photo rendering, VMs, etc. will all massively benefit from more cores.

Not buying it. Most games are still fine with a fast 4c/8t CPU, and while there is some scaling beyond that, it is minimal in most games. AotS is more benchmark than game, and it should be obvious why it scales to higher core counts: it spends its time controlling thousands of units, so you can just divide the units among the cores to spread the load. Most games aren't like this.
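For what it's worth, the kind of split being described is easy to sketch. Below is a minimal, hypothetical C++ example (the Unit struct and update loop are invented for illustration, not taken from AotS or any real engine) of handing contiguous slices of a large unit array to one worker thread per hardware core:

```cpp
// Hypothetical sketch: advance N independent "units" by splitting the array
// into one contiguous slice per hardware thread. Illustrative only.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

struct Unit { float x = 0.0f, y = 0.0f, vx = 1.0f, vy = 1.0f; };

// Each worker advances only its own slice; no shared writes, so no locks needed.
static void update_slice(std::vector<Unit>& units, std::size_t begin,
                         std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        units[i].x += units[i].vx * dt;
        units[i].y += units[i].vy * dt;
    }
}

int main() {
    std::vector<Unit> units(100000);  // "thousands of units"
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (units.size() + n - 1) / n;

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(units.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(update_slice, std::ref(units), begin, end, 0.016f);
    }
    for (auto& w : workers) w.join();  // one simulation "tick" finished

    std::printf("updated %zu units on %u threads\n", units.size(), n);
}
```

The catch, and why most games don't scale like AotS, is that real game logic usually isn't this independent: units interact with each other and with the world, so you end up paying for synchronization instead of getting a clean per-slice split.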
 
Not buying it. Most games are still fine with a fast 4c/8t CPU, and while there is some scaling beyond that, it is minimal in most games. AotS is more benchmark than game, and it should be obvious why it scales to higher core counts: it spends its time controlling thousands of units, so you can just divide the units among the cores to spread the load. Most games aren't like this.

It has nothing to do with you buying it. Games use more cores these days... it's a fact.
 
It has nothing to do with you buying it. Games use more cores these days... it's a fact.

Show me some tests where a 5GHz 7700K is falling down in a large selection of games.

This video compares an 8700K with varying numbers of cores/threads disabled.

4c/8t is only marginally behind 6c/12t.
 
Not buying it. Most games are still fine with a fast 4c/8t CPU, and while there is some scaling beyond that, it is minimal in most games. AotS is more benchmark than game, and it should be obvious why it scales to higher core counts: it spends its time controlling thousands of units, so you can just divide the units among the cores to spread the load. Most games aren't like this.

Modern games use more than 4 cores.

Just because something is "still fine" doesn't mean that it can't be done better.

It's like saying getting a C is "still fine" because you can still pass the class; that doesn't mean a B isn't better.
 
Modern games use more than 4 cores.

Just because something is "still fine" doesn't mean that it can't be done better.

It's like saying getting a C is "still fine" because you can still pass the class; that doesn't mean a B isn't better.

What games? I'm curious. Please show me a 6c/8t or higher machine at whatever GHz it can comfortably run at, compared to something like my 7700K at 4.6+ GHz. I'd love to see a comparison there with everything else like for like (as close as is reasonable).

I really don't think you'll see a big boost. Maybe if you were on a 2c/4t system... maybe that would be problematic?

On my current system, as detailed below, I have a shit ton of things running. Dual monitor, and I game at 1440p with my 2080, and I have YET to see any sort of CPU bottleneck.
 
Modern games use more than 4 cores.

Just because something is "still fine" doesn't mean that it can't be done better.

It's like saying getting a C is "still fine" because you can still pass the class; that doesn't mean a B isn't better.


But this isn't that kind of difference; see the video I linked above. 50% more cores delivering 5% more performance is a negligible gain for the additional cores. Sure, it's "using" them a bit, but it really isn't a significant difference.
 
I don't have to "eat crow" because I already know FakeTV's dubious "leak" is wrong.

Do you remember when AMD released 3rd gen Ryzen and Navi at CES?

I don't think so.

Once again...just like I told you in the other thread...there's a big difference between the timing of a release and the performance once it is released. You claim there is no validity to the leak because the timing wasn't a full release at CES, only a preview. We'll see if they were wrong about the performance and pricing.
 
Once again...just like I told you in the other thread...there's a big difference between the timing of a release and the performance once it is released. You claim there is no validity to the leak because the timing wasn't a full release at CES, only a preview. We'll see if they were wrong about the performance and pricing.

AMD said Matisse won't have an iGPU and Picasso only has up to 4 cores.

So that's another "hole" in the supposed "leak"
 
AMD said Matisse won't have an iGPU and Picasso only has up to 4 cores.

So that's another "hole" in the supposed "leak"

You're keeping this up like I care about the leak. I don't care about the leak. Nobody but you cares about the damn leak anymore. But the way you discuss it makes it seem like you have a personal mission against AdoredTV. I mean, in the world of Fudzilla, WCCFTech, etc., if you throw info against a wall and half of it sticks, you look like a genius. AdoredTV is just more background noise.
 
But this isn't that kind of difference; see the video I linked above. 50% more cores delivering 5% more performance is a negligible gain for the additional cores. Sure, it's "using" them a bit, but it really isn't a significant difference.

[attached benchmark charts: 91834.png, 91858.png]


Here you can see that despite being clocked much lower, Core i5-8400 is able to easily beat the Core i7-7700K

There are a few outliers in there, but you can see the general trend.
 
What games? I'm curious. Please show me a 6c/8t or higher machine at whatever GHz it can comfortably run at, compared to something like my 7700K at 4.6+ GHz. I'd love to see a comparison there with everything else like for like (as close as is reasonable).

I really don't think you'll see a big boost. Maybe if you were on a 2c/4t system... maybe that would be problematic?

On my current system, as detailed below, I have a shit ton of things running. Dual monitor, and I game at 1440p with my 2080, and I have YET to see any sort of CPU bottleneck.

[attached benchmark chart]
 
View attachment 152708
View attachment 152709

Here you can see that despite being clocked much lower, Core i5-8400 is able to easily beat the Core i7-7700K

There are a few outliers in there, but you can see the general trend.


Geez, according to those slides, the 7700K is equal to the 8700K, despite the 8700K being faster and having more cores/clock speed, and of course the 8400 is also killing the 8700K. You might think there is something wrong with that test setup...
 
Geez, according to those slides, the 7700K is equal to the 8700K, despite the 8700K being faster and having more cores/clock speed, and of course the 8400 is also killing the 8700K. You might think there is something wrong with that test setup...

[attached benchmark chart]
 