The Evolving AMD Master Plan

He should do a podcast. Jesus, that's a long video.
 
There is a couple-year lag before you can tell if the CEO is doing good work. That engineer CEO, Dirk(?), did a great job with CPUs, even though Lisa got the credit. Their GPUs are in such weak shape that it sounds like he overlooked them.

At this point Nvidia is competing with its own cards, older games, and the upgraded consoles. Would sure love to have decent $250/$500 entries from AMD.
 
Would sure love to have decent $250/$500 entries from AMD.
Hey, AMD just came out with this great card, the RX 580, which is actually sub-$250 and is every bit as good as, if not better than, the Nvidia entry in that segment. For about $30 over your $250 mark, they have the RX 590, which absolutely destroys anything Nvidia has in that price segment (at current MSRP for new products). Now, in the $500 price segment, you may have a point.
 
Gotta love the knuckle draggers in here bashing one of the last decent tech journalists left, willing to put his shit on the line to educate and inform you. No wonder Kyle struggles to pay the bills with asshats like you lot filling his forum up.

Would you complain if he posted a detailed and in-depth video of his dealings with nGreedia? Yeah, you probably would... :yuck:
 
[QUOTE="joobjoob, post: 1043963212, member: 83558"

At this point Nvidia is competing with it's own cards, older games, and the upgraded consoles. Would sure love to have decent 250$/500$ entries from AMD.[/QUOTE]

I would too. But I think Nvidia competing with themselves is a good thing; hopefully they will price themselves out of existence.
 
An hour of listening to something that could be summed up in a few paragraphs....I'm guessing he gets paid by the word.
 
I find myself interested, but I can't watch a damned video.

Once again I find myself wishing they had written a proper article instead.

An hour of listening to something that could be summed up in a few paragraphs....I'm guessing he gets paid by the word.

I usually share that opinion, but it was actually some interesting speculation. It would not have been a few paragraphs. Still, it would have taken me only about 1/3 of the time to get through if it wasn't video, though.

It isn't just his speculation, but his justification for the speculation. It'd be cool if his version of the world is coming soon rather than in 5-6 years, though.
 
I find myself interested, but I can't watch a damned video.

Once again I find myself wishing they had written a proper article instead.

The part at the end is where he puts it all together. It's not a tedious video; it just starts out a little slow.
There is a lot of speculation in it about chiplets, Zen 2, Navi, and the 14nm I/O block, and he ties it all together.
 
I will always choose a written article over a video. I hope the YouTube video tech trend dies, but who am I kidding.

What trend? He's pretty much the only one on YouTube with remotely interesting stuff. Not every direction he goes pans out, but this stuff (the master plan series) is pretty decent.
 
What trend? He's pretty much the only one on YouTube with remotely interesting stuff. Not every direction he goes pans out, but this stuff (the master plan series) is pretty decent.
Oh, I just meant in general. I do know Adored loves his videos, haha.
 
Great video. Despite the limited amount of information available, he makes a very strong case for the chiplet PS5. If AMD planned for it, or is even able to pull it off for the PS5, they will have a massive success on their hands. Exciting to see how this will play out.
 
I only skipped the middle 1/3. Almost made me buy more shares.
 
It's easy to see, without even watching the video, that AMD is going back to their early-2000s formula.
 
Yes.


There, saved you an hour-long video.

I genuinely laughed out loud.

There is a couple-year lag before you can tell if the CEO is doing good work. That engineer CEO, Dirk(?), did a great job with CPUs, even though Lisa got the credit. Their GPUs are in such weak shape that it sounds like he overlooked them.

At this point Nvidia is competing with its own cards, older games, and the upgraded consoles. Would sure love to have decent $250/$500 entries from AMD.

I don't think they had any money for real GPU work. It was Zen, Zen, Zen for like 5 years, barely keeping the lights on while they were at it. Also, in the $250 bracket the 580 is the only game in town, so I dunno what you're driving at.

It's easy to see, without even watching the video, that AMD is going back to their early-2000s formula.

Which was?
 
That was quite the trip. Very interesting stuff, now to see how close the analysis and predictions will be when the PS5 actually comes out.
 
The 8-chiplet design looks a lot like what I showed AMD in April of 2017. I didn't have the I/O die, but I did have two quad-core Ryzen interposer chiplets side by side, temperature color-coded, and I mentioned to AMD how they could do something like it and use Precision Boost cleverly to maximize and regulate both performance and temperature in an intelligent manner. I basically color-coded the CPU dies to represent temperature (hot/cold) and suggested that the hotter cores might be clocked higher while the cooler ones were clocked lower, with some staying more or less neutral, and that they could be staggered in such a way as to help regulate hot spots. It looks like they basically did that and added an I/O hub in between what I had shown them a render of, though they split apart the CCX dies as well, which helps even further to reduce hot spots. I think we'll see a network of smaller I/O hubs communicating with CCX cores, meshed and linked together to interact with a bigger central I/O hub. I'm also thinking the CCX dies might not all be the same; we could start to see more of a mixture of features on the CCX chiplet dies themselves.
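
To picture the staggered hot/cold idea, here's a toy sketch in Python; the clock values and the alternating pattern are made up for illustration and have nothing to do with AMD's actual Precision Boost logic:

[CODE]
# Assign "hot" (high-clock), "cool" (low-clock), and neutral roles across
# the package so high-clock dies never sit next to each other, spreading
# out thermal hot spots. All numbers are invented for illustration.
HOT_MHZ, NEUTRAL_MHZ, COOL_MHZ = 4300, 3800, 3400

def staggered_clock_plan(num_chiplets):
    """Alternate roles so adjacent dies run at different clocks."""
    pattern = [HOT_MHZ, COOL_MHZ, NEUTRAL_MHZ, COOL_MHZ]
    return [pattern[i % len(pattern)] for i in range(num_chiplets)]

print(staggered_clock_plan(8))
# [4300, 3400, 3800, 3400, 4300, 3400, 3800, 3400]
[/CODE]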

What's great about that is we might see a CCX chiplet for an APU, for example, with a CPU core and a handful of GPU shaders on it. Then we might see another CCX chiplet that's just CPU cores, though maybe it lacks certain instruction sets to save on die space while being improved in other areas with the space savings. I think we will start seeing more slightly deviated chiplet designs that can be mixed and matched to create more compelling end products, with the I/O hub, or a mesh of I/O hubs, communicating and coordinating what they do via Infinity Fabric. It'll be a bit like a strategy game where we highlight clusters of mobs and tell them to attack a spot; from a hardware standpoint a lot of it might be automatic, but we might also have a bit of fine-tuned control over portions of it as well.
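
To make the mix-and-match point concrete, here's a minimal sketch of building SKUs out of a catalog of chiplet types behind one I/O hub; the part names and specs are entirely hypothetical, not real AMD parts:

[CODE]
from dataclasses import dataclass

@dataclass(frozen=True)
class Chiplet:
    """One die type in a hypothetical chiplet catalog."""
    name: str
    cpu_cores: int = 0
    gpu_cus: int = 0

# Hypothetical catalog entries.
CPU8  = Chiplet("8-core CPU die", cpu_cores=8)
APU4G = Chiplet("4-core CPU + 12 CU die", cpu_cores=4, gpu_cus=12)

def build_package(parts):
    """Sum up whatever dies are placed behind the shared I/O hub."""
    cores = sum(p.cpu_cores for p in parts)
    cus = sum(p.gpu_cus for p in parts)
    return f"{cores} CPU cores / {cus} GPU CUs behind one I/O hub"

print(build_package([CPU8, CPU8]))   # desktop part: 16 cores, no graphics
print(build_package([CPU8, APU4G]))  # APU mix: 12 cores + 12 CUs
[/CODE]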

As we run out of other options to increase performance, it's the logical way to progress. The do-it-all CPU approach, though lazy and not so efficient, is a bit old-fashioned and dated. We still need a bit of it, but we also need more task-specific chip dies that an I/O hub can control to maximize our needs, instead of simply brute-forcing one big stupid design. They've gotten away with that for a long time, but it is becoming more and more difficult to continue, and it's a less-than-ideal way to keep on keeping on; just look at the rut Intel has gotten into as of late from coasting rather than innovating.

I actually believe future AMD GPUs will incorporate a CPU into the GPU itself that the main CPU can communicate with. It will be there essentially to reduce latency, and it won't always be used exclusively by the GPU itself, but will be used intelligently to cut down on frame-rate stutter and smooth it out where possible. There isn't much reason for AMD not to try it; in fact, they could add some cores just for compression/decompression. I wonder if AMD will use the NVMe M.2 storage found on some of their Radeon Pro cards for compressed/uncompressed VRAM data blocks that could be loaded, swapped around, and stored. Much like Nvidia's shader cache, except built right onto the card's NVMe storage, which is both quicker and localized, so data can simply be copied into VRAM.
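
As a rough sketch of what that NVMe spill idea could look like in software, assuming a least-recently-used eviction policy and generic compression (the class and its API are invented for illustration, not a real driver interface):

[CODE]
import zlib
from collections import OrderedDict

class VramSpillCache:
    """Toy model: keep hot blocks in VRAM, spill cold ones (compressed)
    to the card's local NVMe instead of back over PCIe to system RAM."""

    def __init__(self, vram_budget):
        self.vram_budget = vram_budget   # bytes of VRAM we may occupy
        self.resident = OrderedDict()    # block_id -> raw bytes, LRU order
        self.on_nvme = {}                # block_id -> compressed bytes

    def _used(self):
        return sum(len(b) for b in self.resident.values())

    def put(self, block_id, data):
        self.resident[block_id] = data
        self.resident.move_to_end(block_id)
        # Evict least-recently-used blocks to "NVMe" until we fit.
        while self._used() > self.vram_budget and len(self.resident) > 1:
            victim, raw = self.resident.popitem(last=False)
            self.on_nvme[victim] = zlib.compress(raw)

    def get(self, block_id):
        if block_id not in self.resident:  # page back in from "NVMe"
            self.put(block_id, zlib.decompress(self.on_nvme.pop(block_id)))
        self.resident.move_to_end(block_id)
        return self.resident[block_id]
[/CODE]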
 
I don't think they had any money for real GPU work. It was Zen, Zen, Zen for like 5 years, barely keeping the lights on while they were at it. Also, in the $250 bracket the 580 is the only game in town, so I dunno what you're driving at.

No they didn't, which is why they had to nail it with Zen. Which they did. And according to them, that was the game plan: use the R&D money from Zen to fuel their GPU efforts. It makes sense.
 
Hey, AMD just came out with this great card, the RX 580, which is actually sub-$250 and is every bit as good as, if not better than, the Nvidia entry in that segment. For about $30 over your $250 mark, they have the RX 590, which absolutely destroys anything Nvidia has in that price segment (at current MSRP for new products). Now, in the $500 price segment, you may have a point.
I'm contemplating AMD for my next build. I'm going to cap my budget at $300 for a GPU. I have a 970 right now, but since I'm doing a new build, it makes sense for the 970 to stay with this rig. (Not to mention, one of my kids will get it; it's an upgrade for them.)
I don't game as often as I'd like, and when I do, it's typically not a heavy 3D game anyway. If I could get a 2070 for $350, I'd be in (and I realize this isn't going to happen).
 
No they didn't, which is why they had to nail it with Zen. Which they did. And according to them, that was the game plan: use the R&D money from Zen to fuel their GPU efforts. It makes sense.

But can they really do it? The situation they were in before came from needing to compete on both fronts. Ryzen really shows us they need to choose one market, or else they will get beaten on both fronts.
 
But can they really do it? The situation they were in before came from needing to compete on both fronts. Ryzen really shows us they need to choose one market, or else they will get beaten on both fronts.

I think they can make a good product that will give people a reason to choose them. When they got into the GPU market with the acquisition of ATI, they didn't have much to ride on. Now, with Ryzen, EPYC, and the gaming console market, they have much more funding to throw around. I'm not sure they will ever have a GPU that beats Nvidia, but that's OK. They know who their customers are, and will price their tech accordingly.
 
Gotta love the knuckle draggers in here bashing one of the last decent tech journalists left, willing to put his shit on the line to educate and inform you. No wonder Kyle struggles to pay the bills with asshats like you lot filling his forum up.

Would you complain if he posted a detailed and in-depth video of his dealings with nGreedia? Yeah, you probably would... :yuck:

I would prefer an article over a one-hour video... every damn time. Listening to facts for an hour will put me to sleep, when I can digest a properly written article in 20 minutes.
 
The part I don't understand is why they are saying any chiplet-based GPU would have to use Crossfire-type mGPU solutions.

If you are able to do the interconnects between chiplets correctly (and they would have to be able to, if they are putting the memory controller on a separate I/O controller), I don't see why the driver couldn't present two chiplets with, say, 64 cores each as a single GPU with 128 cores.

It will likely make board manufacture a little pricier in order to place the multiple chiplets so the interconnects line up right, but that should be more than offset by the reduction in cost from the yield improvements due to using chiplets...
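
For what it's worth, the "one logical GPU" part reduces to proportional work splitting in the driver, something like this toy sketch (entirely illustrative; real GPU drivers and schedulers are nothing like this simple):

[CODE]
def split_dispatch(total_workgroups, cores_per_chiplet):
    """Divide a dispatch across chiplets in proportion to core count,
    so the application only ever sees a single large GPU."""
    total_cores = sum(cores_per_chiplet)
    shares = [total_workgroups * c // total_cores for c in cores_per_chiplet]
    shares[0] += total_workgroups - sum(shares)  # rounding remainder to chiplet 0
    return shares

# Two 64-core chiplets exposed as one 128-core GPU:
print(split_dispatch(1000, [64, 64]))  # [500, 500]
[/CODE]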
 
An hour of listening to something that could be summed up in a few paragraphs....I'm guessing he gets paid by the word.

Exactly.

Sadly, there is more money in YouTube videos these days than in well-written articles.

It's the downfall of society. People are too lazy to even read...
 
Great video, watched the entire thing.

AMD has a plan, and it's very simple and elegant in its execution if it's close to what Adored has speculated.

I probably won't be buying first-gen Navi, but I will buy the next 3700X and probably Navi 2. Easiest upgrade path I have ever seen.
 
The part I don't understand is why they are saying any chiplet-based GPU would have to use Crossfire-type mGPU solutions.

If you are able to do the interconnects between chiplets correctly (and they would have to be able to, if they are putting the memory controller on a separate I/O controller), I don't see why the driver couldn't present two chiplets with, say, 64 cores each as a single GPU with 128 cores.

It will likely make board manufacture a little pricier in order to place the multiple chiplets so the interconnects line up right, but that should be more than offset by the reduction in cost from the yield improvements due to using chiplets...
Or a hardware scheduler for that part; how long did we live with the IDE interface and its master/slave setup? :) The problem is that AMD hasn't had a good run, and their priorities haven't been on developing this. As soon as that changes, why not 4 to 6 chiplets, each needing about 60 W (240-360 W total)? That would end up as a killer GPU in whichever form it's configured.
That was quite the trip. Very interesting stuff, now to see how close the analysis and predictions will be when the PS5 actually comes out.
Don't forget that he based his speculation on a guy from Reddit :) Some speculation is a house of cards...

Maybe the most interesting bit regarding Navi is that, according to the RTG engineer, it is not a break from GCN.
 
Great video, watched the entire thing.
AMD has a plan, and it's very simple and elegant in its execution if it's close to what Adored has speculated.
I probably won't be buying first-gen Navi, but I will buy the next 3700X and probably Navi 2. Easiest upgrade path I have ever seen.

I was actually thinking the other way around: I would buy 1st-generation Navi and skip Navi 20 in 2020. Unless they make really big steps, I expect the bigger step to come from the GPU AMD releases in 2021.
 
Waiting for 2nd-gen Navi, but I will jump on Zen 2. The V64 will last till then, and the 290X fallback card is no slouch either.
The PS5 will be impressive if they indeed go the MCM route; pretty damn good bang for buck for a console, but I'm not sure what they'll do for the GPU... I somehow don't believe the 4K60 claims.
 
I'm on a GTX 1060 6GB and still on a 2600K OC'd to 4.2GHz. I really wanted to do a Ryzen/AMD build; however, it feels like a side-grade, for gaming at least. I'm OK with 1080p, but I am looking to jump to 4K in the near future. I don't want RTX, and was hoping the new AMD cards would allow that to happen. Not sure what to think after this video, though.
 
Wow, that was long.

Why does it feel like this guy is an AMD apologist? He spends most of the video blaming consumers for AMD sucking for the past 7-10 years (since the 7970/290X in GPUs, and until Zen in CPUs). Anyway, the conclusion seems to be that Navi will be a minor upgrade to Polaris. I believe AMD said they were targeting 25% better clocks. 680 and below in 2019, Vega replacement in 2020, and maybe late 2020 we see the post-GCN stuff. Looks like Nvidia will keep pushing RTX for the next few years since Navi won't be much competition.
 
I think I sprained something in my brain watching that! Wow, what an infodump. Well done video, and it held my attention all the way through...which is pretty amazing, given the subject material! Will be very interesting watching how all of this plays out as these products come to market.
 