The Evolving AMD Master Plan

I would rather read, but this guy's accent is entertaining enough for me to watch the video. =D
 
Exactly.

Sadly there is more money in YouTube videos these days than in well written articles.

It's the downfall of society. People are too lazy to even read...

People used to think books would be the downfall of society.
 
Well, my take on the video has nothing to do with "Navi was designed for PS5". That was there, of course, but there's a lot more to it.

His speculation about the planning and applicability of the chiplet design sounds feasible, and amazing. If true, it means AMD took basically everything that was working against them and found a way to turn it into a competitive advantage: foundry contracts, tech implementation, and so on.

For someone who always wanted to see AMD compete, and was sad to have to buy an Intel processor for an upgrade after many, many years of having AMD as the supplier of the chips I could afford, the way they came out of the certain doom they seemed to be facing is just amazing. They may not be 100% bleeding edge, but they're sure as hell not only committed to providing the best bang for the buck, they're also making sure they get the best bang for their buck as well. That's awesome.
 

In the middle of moving but this video was the best I could find on short notice:

It mostly deals with the violence and video games comparison, but it gets some of the idea across of how the older generation of the time felt about the younger folks reading books.
 
Wow that was long.

Why does it feel like this guy is an AMD apologist? He spends most of the video blaming consumers for AMD sucking for the past 7-10 years (since the 7970/290X in GPUs, and until Zen in CPUs). Anyway, his conclusion seems to be that Navi will be a minor upgrade over Polaris. I believe AMD said they were targeting 25% better clocks. 680-class and below in 2019, a Vega replacement in 2020, and maybe late 2020 we see the post-GCN stuff. Looks like Nvidia will keep pushing RTX for the next few years since Navi won't be much competition.
Don't think you got the good bits from the video. It shows that AMD might go the same route with GPUs as they went with the Zen server parts: smaller die sizes that still allow greater performance, without being stuck on single large-die designs that can cost a fortune.
GCN is more a specification of how components are placed on the die than, as people assume, one fixed architecture.
You can see this with Vega, which is GCN as well, but whose drivers cannot translate the raw compute power into gaming performance.

Don't think that people are waiting for anything RTX. It tanks framerates in ray-traced titles so badly that people aren't going to enjoy gaming at sub-60 fps or at much lower resolutions, and the price is not something people are happy with either...
 
Don't think that people are waiting for anything RTX. It tanks framerates in ray-traced titles so badly that people aren't going to enjoy gaming at sub-60 fps or at much lower resolutions, and the price is not something people are happy with either...

Given that our mainline example is currently BFV, which apparently has a DX12 implementation that's been broken since it debuted several games ago...

I'd honestly thought they'd have gotten DX12 working well in Frostbite by now.

We'd seen other demos that used ray tracing on RTX (via DXR) that ran well at 4K, so really, people who bought the cards are enjoying more performance, and the rest of us (myself included) are waiting either for the need for that performance or for the 'killer app' that would prompt us individually to step up.
 
Given that our mainline example is currently BFV, which apparently has a DX12 implementation that's been broken since it debuted several games ago...

I'd honestly thought they'd have gotten DX12 working well in Frostbite by now.

We'd seen other demos that used ray tracing on RTX (via DXR) that ran well at 4K, so really, people who bought the cards are enjoying more performance, and the rest of us (myself included) are waiting either for the need for that performance or for the 'killer app' that would prompt us individually to step up.

I can only laugh at this, since DICE knows how to optimize for hardware better than most, if not all, of the developers out there. The notion that they suck on purpose is so far-fetched that it has to be you who doesn't have a clue what you're talking about.

What's next, DICE extorting money from Nvidia to do better programming?
 
It's long... but it's so darned interesting and educational that you can listen to the whole thing enraptured.
Tech porn at its best!
/a web producer actually worth supporting on Patreon
 
I can only laugh at this, since DICE knows how to optimize for hardware better than most, if not all, of the developers out there. The notion that they suck on purpose is so far-fetched that it has to be you who doesn't have a clue what you're talking about.

I can only laugh at that response. Really. Not really sure about the need for a personal attack though.

You can look at the latest Gears game- it works extremely well in DX12. You can look at Doom 2016- it works extremely well in Vulkan. I can't say programmatically how close these are to Frostbite, but really, those games work better than I'd expect if they were written in DX11.

Just a few examples, but it's clear that DICE's poor desktop DX12 performance relative to their own DX11 performance is a bit of an outlier. Granted, few games do this very well, but you'd think DICE- with all the credit you give them and EA's resources- should be able to get this straight.

And if you need DX12 for DXR, and thus for RTX, as Frostbite does, then you can't really use their ray tracing performance- which would apparently be crippled by their DX12 performance- as a definitive metric.

Again, I wish it wasn't the case- I'd love for BFV to run well in DX12!- but it is. We need more examples.
 
I think he is correct on all fronts; it just makes sense. The one thing I hope for is transparent multi-GPU with Navi. If that is possible, they will be competitive across the board, probably for a long time. Later they might do ray tracing chiplets- that's the sort of thing I'm thinking of. It will be very interesting.
 