AMD to reveal next-gen ‘Nvidia killer’ graphics card at CES?

Shocking that Lisa Su doesn't 100% endorse RT. Shocking.

Dean Takahashi, VentureBeat: Is real-time ray tracing in graphics going to be as big as NVIDIA says it is?
LS: I’ve said in the past that ray tracing is important, and I still believe that, but if you look at where we are today it is still very early. We are investing heavily in ray tracing and investing heavily in the ecosystem around it – both of our console partners have also said that they are using ray tracing. You should expect that our discrete graphics as we go through 2020 will also have ray tracing. I do believe though it is still very early, and the ecosystem needs to develop. We need more games and more software and more applications to take advantage of it. At AMD, we feel very good about our position on ray tracing.

This is what I take from her statement:
  • AMD goal for RT is to have a discrete card in 2020
    • discrete does not mean console
  • AMD RT is still at early crap stage
  • AMD crap stage RT is way better than Nvidia RT excrement
 
Noko gets it.

What she is doing is what every manufacturer does when they are behind: spin.

This line in particular is kind of lame spin: "We need more games and more software and more applications to take advantage of it."

You don't get more games without shipping HW, and you don't really draw more developers in without some kind of installed base.

A more realistic Lisa Su spin translation:

"We are behind, working at getting product out this year, while NVidia does all the heavy lifting of getting Ray Tracing off the ground".

Nothing wrong with that, but it isn't some kind of brilliant strategy. It is just playing catch up.
 
I am trying to look at this unbiased here... Can most agree with me that people vote with their wallet more critically on an AMD product than an Nvidia one? Meaning there is skepticism on many fronts with Radeon cards: performance inconsistency, drivers, power draw, etc.

My point is, if AMD continues the pricing trend of the 5500 XT and 5600 XT, we can expect these RT SKUs to command the same price as the RTX 2080 Ti. Say Navi is maybe 10% faster at the same power consumption as a 2080 Ti (just throwing out a random number here), with similar RT performance, and they go ahead and charge $1000 for the high-end SKU.

$1000 for the reference blower design
$1100+ probably for AIB models following launch.


In this situation (which I think is likely to happen based on trends at this point), only AMD fans with disposable income are going to purchase this. I will personally consider it a hard pass, knowing that Nvidia will have something at the same price right around the corner (or they just drop a 2080 Ti Super, as rumors suggested last year).


Another thing to consider is that Nvidia will probably be locking down their own RT features, as the PhysX 5.0 announcement not too long ago suggests. So AMD will have to rely on Microsoft's DXR implementation unless they create something of their own, which in my opinion will not have the same adoption rate.
 

Problem with that theory is PhysX was hardware based and ultimately was moved to software, as Nvidia pretty much killed it. Software developers have to actually develop for it in a way that makes good use of the tech, not lipstick on a pig, and the hardware vendor has to make sure the hardware is utilized properly. If neither happens, you end up with dead tech and no consumer demand for it. Building the hardware first is no guarantee that a technology will take off; it takes close cooperation between software and hardware developers, and so far Nvidia has shown itself inept at working with software developers. Yet oddly they had no issue working with them when it came to CUDA.
 
Another thing to consider is that Nvidia will probably be locking down their own RT features, as the PhysX 5.0 announcement not too long ago suggests. So AMD will have to rely on Microsoft's DXR implementation unless they create something of their own, which in my opinion will not have the same adoption rate.

I think your post is generally well thought through; there is just one other point of contention, and I think it's the battleground for AMD:
Vulkan. AMD was the original developer of what became the Vulkan API (it grew out of AMD's Mantle), as more or less an open source, low-level API alternative to DirectX 12. Then they donated the code to the Khronos Group, the group responsible for OpenGL, and more or less overnight Vulkan became the successor to OpenGL.
This is relevant because AMD is now the CPU/GPU/APU of choice for gaming consoles. The PS5 is essentially going to be a Vulkan API console. The Switch, as an example, has also supported OpenGL and Vulkan from day 1.

Having Vulkan as the low-level API of choice gives console devs a massive leg up on being able to monetize their games, as porting from a console in the next generation will become even easier. The wrinkle is that the battle between the XSX and PS5 will have an artificial barrier, as the XSX will of course run a modified version of DX12. However, Vulkan is becoming the open source API of choice. It can be transparently ported to Metal, used on iOS and macOS. It's used on Linux. It's used on Switch and the upcoming PS5. The more console hardware is sold, the more programming and optimization in general will favor AMD. And this is despite the Switch essentially being an nVidia Shield.

Anyway, that last bit is a side note. The point is that AMD's APIs are getting baked into what most consumers use: gaming consoles. "Console port", often a dirty term on hardware forums, will start to mean something better (hopefully) and will also mean that PC games ported from consoles will de facto run better on AMD hardware. The APIs and the hardware listed above were just for context. Consoles are the wedge that breaks nVidia's proprietary coding and pushes devs to open source alternatives. Something that at least I am personally thankful for.
 
So AMD will have to rely on Microsoft's DXR implementation unless they create something of their own, which in my opinion will not have the same adoption rate.

Microsoft is just supplying the API, not the implementation.

Windows games so far are using that Microsoft API.

AMD won't be creating an alternative; that would require separate support/adoption.

Both NVidia (which already does) and AMD need to write drivers to support that DXR API.

In a perfect world, AMD would release their HW RT card with Windows DXR drivers and all the "RTX" games would just work because they both use the same API.

In reality untested things break, or have bottlenecks and hitches that are new to the specific implementation.

So AMD will send early HW to dev shops to get them testing/bug fixing for their HW ASAP.

Much like Intel is sending out DG1 cards to dev shops to get more testing and fixes before the product lands.

But the bottom line is that AMD is not creating a separate standard from NVidia RTX.

They just have to write the drivers for their implementation, just like they need to write DX12 drivers for their shader hardware, which is implemented differently from NVidia's.
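To make that point concrete, here is a toy Python sketch of "one API, per-vendor drivers". Every class and method name here is made up for illustration; this is not the real D3D12/DXR interface, just the shape of the arrangement: the game codes against one shared API, and each vendor plugs its own driver in behind it.

```python
# Toy model of "one API, per-vendor drivers" (all names hypothetical,
# not the real D3D12/DXR interfaces).
from abc import ABC, abstractmethod

class RaytracingDriver(ABC):
    """Stand-in for the vendor driver sitting behind the common DXR API."""
    @abstractmethod
    def dispatch_rays(self, width: int, height: int) -> str:
        ...

class NvidiaDriver(RaytracingDriver):
    def dispatch_rays(self, width: int, height: int) -> str:
        return f"RT cores traced {width * height} rays"

class AmdDriver(RaytracingDriver):
    def dispatch_rays(self, width: int, height: int) -> str:
        return f"Ray accelerators traced {width * height} rays"

class DxrApi:
    """Stand-in for the API layer; the game only ever calls this."""
    def __init__(self, driver: RaytracingDriver):
        self._driver = driver

    def dispatch_rays(self, width: int, height: int) -> str:
        return self._driver.dispatch_rays(width, height)

def render_frame(api: DxrApi) -> str:
    # "Game" code: identical regardless of which vendor's driver is loaded.
    return api.dispatch_rays(1920, 1080)

print(render_frame(DxrApi(NvidiaDriver())))
print(render_frame(DxrApi(AmdDriver())))
```

The game-side call never changes; only the driver object swapped in underneath does, which is why an AMD DXR driver would let existing DXR titles run without a separate standard.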
 
Say Navi is maybe 10% faster at the same power consumption as a 2080 Ti (just throwing out a random number here), with similar RT performance...

The challenge is, as AMD has shown with their 7nm products so far, that they'll either be the same power usage for a slower product or higher power usage for the same performance -- with slower RT, given how far behind they are.

Problem with that theory is PhysX was hardware based and ultimately was moved to software, as Nvidia pretty much killed it.

They killed the hardware, the PhysX software is still in wide use. Further, using physics more 'extravagantly' in games is a real chicken-and-egg problem, one Nvidia wasn't able to crack, unlike RT. Developers simply aren't going to build games that are so physics-intense that they require dedicated hardware, thus PhysX implementations beyond the basics were relegated to effects that could be toggled.

Of course, Nvidia can run PhysX on their GPUs, so if the demand returns, there is certainly hardware support.

Another thing to consider is that Nvidia will probably be locking down their own RT features, as the PhysX 5.0 announcement not too long ago suggests.

These two things aren't related, really at all.

So AMD will have to rely on Microsoft's DXR implementation unless they create something of their own, which in my opinion will not have the same adoption rate.

AMD doesn't have access to RTX -- that's an Nvidia implementation of RT that further includes denoising and other optimizations done in Nvidia hardware. However, since RTX is accessed through the DX12 and Vulkan APIs, that's not a big deal -- AMD just has to produce a competitive product with competent drivers and software for the developer side.

"Console port", often a dirty term on hardware forums, will start to mean something better (hopefully) and will also mean that PC games that were ported from consoles will defacto run better on AMD hardware.

We've had two full generations of AMD hardware in Xbox consoles and one on the Playstation, and well, the very opposite has remained true. AMD hasn't gained ground, and they remain a minority player regularly peddling two-year-old technology that doesn't challenge the top end, usually not even closely.

DX and Vulkan on consoles are different enough from desktop implementations that porting is never going to be straightforward, for a number of reasons.
 
We've had two full generations of AMD hardware in Xbox consoles and one on the Playstation, and well, the very opposite has remained true. AMD hasn't gained ground and they remain a minority player regularly peddling two year-old technology that doesn't challenge the top end, usually not even closely.
AMD hasn't been on the top end; that can't be debated. But my point was about optimization and whether or not AMD would be hampered by nVidia proprietary coding. My response was no.

DX and Vulkan on consoles are different enough from desktop implementations that porting is never going to be straightforward, for a number of reasons.
While this has been true in the past, this console generation is closer to PCs than ever before. Microsoft, Nintendo, and Sony have definitely figured out it's beneficial for them to use APIs that can easily be ported to and programmed for. The PS3 with its Cell processor never could be 100% utilized, even over its incredibly long console generation. With each successive generation, consoles have moved more and more toward standard parts and standard APIs. This also helps them be forward thinking, as it allows for easier backwards compatibility in the future without cumbersome emulation layers or the need to include previous-gen processors, raising cost.
The big difference is that for PC games there is a general PC hardware target vs. a specific one for consoles. But because Vulkan and DX12 are both designed to be low-level APIs, there will be more in common than has previously been possible.

While we could debate this out, I'm content to have a "wait and see" attitude about it, like I do with more or less anything before it launches. I'm confident, though, that this upcoming console gen will be more disruptive, in a good way, in terms of programming and porting than any that came before.
 
The bigger thing will be that Vulkan and DX12 exist now on the desktop, as they didn't during the last release cycle -- and more importantly the development build-up that preceded it. That should be a win for low-level APIs.

The other will be ray tracing. This is even bigger, IMO, because the hardware base and the development targets are going to be there for developers of all types to leverage.
 
While AMD hardware may be Vulkan crushers it remains to be seen whether or not Microsoft will allow their flagship console to be Vulkan native as opposed to being a modified DirectX as at present. I believe it to be highly unlikely.
 
The Xbox will be Windows / DX native.

Except, it'll be customized for the hardware platform, so you're not going to see those optimizations carry over directly.
 
The Xbox will be Windows / DX native.

Except, it'll be customized for the hardware platform, so you're not going to see those optimizations carry over directly.

Got a source on that, or is that just a guess on your part?
 
The Xbox is all about DirectX with low-level extensions, not about portability to other APIs and other platforms. Vulkan won't happen soon, if ever, on the Xbox.

It's less about what the developers want and more about what MS wants to do and give them. Vulkan could disappear; DX won't, as far as MS is concerned.
 
Got a source on that, or is that just a guess on your part?

It's a guess. But Microsoft is highly unlikely to abandon its DirectX stack (which they have historically used as cudgels to force upgrades to newer operating systems). I just don't see Vulkan native happening on XBox. It may very well happen on Playstation (Sony uses a custom API anyway) so maybe there can be Vulkan traction there but then you have to realize that developers are unlikely to want to support two APIs for essentially the same hardware unless the developers are making a platform-exclusive title.
 
It's a guess. But Microsoft is highly unlikely to abandon its DirectX stack (which they have historically used as cudgels to force upgrades to newer operating systems). I just don't see Vulkan native happening on XBox. It may very well happen on Playstation (Sony uses a custom API anyway) so maybe there can be Vulkan traction there but then you have to realize that developers are unlikely to want to support two APIs for essentially the same hardware unless the developers are making a platform-exclusive title.

I don't see it as always up to them; if software developers want Vulkan and the chip can use it, then that will be the standard. Especially if Sony uses Vulkan, it simplifies everything for developers. Sony is far bigger than Microsoft in the console world and may force their hand if it means a loss of developers for Xbox. One never knows, but I don't see it as certain they will use DX12 exclusively.
 
XBox DirectX is not DX12. It's a customized version of DirectX specifically designed for the hardware that will be in the XBox.

In regard to why developers would choose Vulkan over DirectX, you are aware that developers have to use the API that Microsoft exposes to them, yes? If Microsoft chooses to not expose the Vulkan API to the customized OS running on the XBox, developers can't use it. Microsoft won't certify their software for sale for the platform; such certification is required to sell XBox software.

You're also assuming that Sony would expose the Vulkan API to the developers. This may be more likely but Sony has no real incentive to do so.
 
It's a guess. But Microsoft is highly unlikely to abandon its DirectX stack (which they have historically used as cudgels to force upgrades to newer operating systems). I just don't see Vulkan native happening on XBox. It may very well happen on Playstation (Sony uses a custom API anyway) so maybe there can be Vulkan traction there but then you have to realize that developers are unlikely to want to support two APIs for essentially the same hardware unless the developers are making a platform-exclusive title.

For the super long and short: I agree. XSX will be some type of DX12. Sony's API is in question, but I think it's grossly to their benefit for it to be Vulkan/OpenGL, as it grossly lowers development costs. Essentially every game engine becomes available to Sony if they do this. Devs will basically have the ability to just use Unreal, as an example, without further complications.
I don't see it as always up to them; if software developers want Vulkan and the chip can use it, then that will be the standard. Especially if Sony uses Vulkan, it simplifies everything for developers. Sony is far bigger than Microsoft in the console world and may force their hand if it means a loss of developers for Xbox. One never knows, but I don't see it as certain they will use DX12 exclusively.
This will never happen. Not that Vulkan will never come to a Microsoft product, but Microsoft will not get pushed around by devs on a platform they create. If anything, users of end products have a greater say than devs, but Microsoft was still willing to forcibly get everyone off of Windows 7 by artificially making DX12 Windows 10 exclusive (and then pushing Windows 7 toward EOL as quickly as possible). I would say being a manipulative dick is way more Microsoft's style than anything else.
 
We will find out one way or the other soon, but pressure is felt both ways, and Sony and Microsoft will both want their console to perform better than the other's. It will be interesting to see how it turns out.
 
We will find out one way or the other soon, but pressure is felt both ways, and Sony and Microsoft will both want their console to perform better than the other's. It will be interesting to see how it turns out.

Most desktop DX12 implementations are poor (some are spectacular), and Vulkan implementations are rare -- but the performance difference between the two when properly implemented (which hasn't happened yet) should be nil. They are very similar.


Microsoft isn't going to gain any performance by supporting Vulkan. Sony might; but Sony does whatever they want, so odds on them supporting Vulkan are also basically nil.

Further, by supporting DX12, developers hit both a major console and the Windows desktop -- that's a win / win for potential markets.
 
Most desktop DX12 implementations are poor (some are spectacular), and Vulkan implementations are rare -- but the performance difference between the two when properly implemented (which hasn't happened yet) should be nil. They are very similar.


Microsoft isn't going to gain any performance by supporting Vulkan. Sony might; but Sony does whatever they want, so odds on them supporting Vulkan are also basically nil.

Further, by supporting DX12, developers hit both a major console and the Windows desktop -- that's a win / win for potential markets.
One of the only games I know of that supports both Vulkan and DX12 is Red Dead Redemption 2. Both work about the same; I would give maybe a slight edge to DX12, but I'm not sure yet. For me, using HDR, the Vulkan API crashes instantly while DX12 does not. Except the HDR implementation, with every adjustment I've made so far, sucks. Anyway, that game was on the consoles first, both PS4 and Xbox One, so it looks like it was developed solely with those two APIs. Probably the best looking game I've seen, or close to it.
 
so far Nvidia has shown itself inept at working with software developers. Yet oddly they had no issue working with them when it came to CUDA.

Is this a serious comment? The adoption rate of RT in games has been extremely fast -- well, to anyone who understands how software development works.
 
For the super long and short: I agree. XSX will be some type of DX12. Sony's API is in question, but I think it's grossly to their benefit for it to be Vulkan/OpenGL, as it grossly lowers development costs. Essentially every game engine becomes available to Sony if they do this. Devs will basically have the ability to just use Unreal, as an example, without further complications.

Sony already has a very good API (GNM) that they can optimize for their ecosystem. Any engine worth mentioning already supports it fully. They have no incentive to prioritize Vulkan on PS5.
 
Sony already has a very good API (GNM) that they can optimize for their ecosystem. Any engine worth mentioning already supports it fully. They have no incentive to prioritize Vulkan on PS5.

Exactly. It may be forgotten, but Sony used to support OpenGL on at least one of its consoles, and no one used it; they stuck with the Sony API.
 
I think your post is generally well thought through; there is just one other point of contention, and I think it's the battleground for AMD:
Vulkan. AMD was the original developer of what became the Vulkan API (it grew out of AMD's Mantle), as more or less an open source, low-level API alternative to DirectX 12. Then they donated the code to the Khronos Group, the group responsible for OpenGL, and more or less overnight Vulkan became the successor to OpenGL.
This is relevant because AMD is now the CPU/GPU/APU of choice for gaming consoles. The PS5 is essentially going to be a Vulkan API console. The Switch, as an example, has also supported OpenGL and Vulkan from day 1.

Having Vulkan as the low-level API of choice gives console devs a massive leg up on being able to monetize their games, as porting from a console in the next generation will become even easier. The wrinkle is that the battle between the XSX and PS5 will have an artificial barrier, as the XSX will of course run a modified version of DX12. However, Vulkan is becoming the open source API of choice. It can be transparently ported to Metal, used on iOS and macOS. It's used on Linux. It's used on Switch and the upcoming PS5. The more console hardware is sold, the more programming and optimization in general will favor AMD. And this is despite the Switch essentially being an nVidia Shield.

Anyway, that last bit is a side note. The point is that AMD's APIs are getting baked into what most consumers use: gaming consoles. "Console port", often a dirty term on hardware forums, will start to mean something better (hopefully) and will also mean that PC games ported from consoles will de facto run better on AMD hardware. The APIs and the hardware listed above were just for context. Consoles are the wedge that breaks nVidia's proprietary coding and pushes devs to open source alternatives. Something that at least I am personally thankful for.

Some points.

Vulkan is NOT AMD's API. AMD donated the code, and after that they lost control of it.

Relating to it NOT being AMD's API: the ray tracing code in Vulkan was 100% written by NVidia, and really only supports NVidia RTX. Khronos is planning to make that code more generic, but they are using the NVidia code as the basis.

It remains to be seen how much Vulkan gets used on PS5. Don't assume support means replacing Sony APIs as the tool of choice.

Finally, don't assume Vulkan is a big AMD advantage. Initially that might have been the case, but NVidia's Vulkan drivers have improved significantly.
 
Is this a serious comment? The adoption rate of RT in games has been extremely fast -- well, to anyone who understands how software development works.

lol, extremely fast to not add it to games, maybe. The only thing used even less is DLSS; a lot has to do with the fact that current hardware just is not up to the task of using it. At best you get a poorly implemented, tacked-on version of RT. Only a handful of games have actually spent some time on RT or DLSS and actually made it work where people want to use it.
 
lol, extremely fast to not add it to games, maybe. The only thing used even less is DLSS; a lot has to do with the fact that current hardware just is not up to the task of using it. At best you get a poorly implemented, tacked-on version of RT. Only a handful of games have actually spent some time on RT or DLSS and actually made it work where people want to use it.

Ok, so to be clear you’re confirming that you don’t know how software development works?

DXR dropped in Q4 2018. In less than a year we had AAA games on store shelves supporting RT. That is simply amazing, given those games were in development long before DXR was publicly available. Not to mention ray tracing is a completely different rendering method from what devs have been doing for the past 30 years.

You can hate on Nvidia all you like but their developer relations are on point.
 
Can't wait to play Wolfenstein Youngblood with AMD RT . . .:D

Have not looked at reviews yet for the recently released RTX upgrade; it looks good here, but no idea about the performance impact. Doom Eternal is supposed to have the best RTX ever; we will see.



As a quick note, especially to any developer: go to any reflective glass window in real life, and you will notice your eye will either focus on the reflection in the glass plane, or focus on the background or interior and blur out the reflection on the window. Really, the brain will disregard the reflection unless the objects behind the glass are almost up against it. So far, all RTX window reflections where you can see inside the room have been less than accurate, with both the reflection and the interior rendered clearly. Better would be: if you look straight through the window, focus the interior; looking up, down, or to the sides past some threshold, blur the interior and focus the reflection on the glass plane. Then it would look more natural. The same goes for other reflective surfaces where you can see some distance behind them, such as clear water. That would be a first-person view; camera views are similar, but also differ from how eyes focus on objects.
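The focus mismatch described in this post is real optics: the reflected image sits optically farther away than the glass itself (eye-to-glass distance plus glass-to-object distance), so an eye or camera focused on one cannot simultaneously be focused on the other. A rough Python sketch using the classic thin-lens blur-circle formula; all distances and lens numbers below are made up for illustration, not taken from any engine or game:

```python
# Thin-lens circle of confusion: how blurred is an object at distance s2
# when the lens is focused at distance s1? All values are illustrative.

def blur_circle_mm(aperture_mm: float, focal_mm: float,
                   focus_dist_mm: float, object_dist_mm: float) -> float:
    """Blur-circle diameter c = A * f * |s2 - s1| / (s2 * (s1 - f))."""
    s1, s2 = focus_dist_mm, object_dist_mm
    return aperture_mm * focal_mm * abs(s2 - s1) / (s2 * (s1 - focal_mm))

glass = 2_000.0      # window pane 2 m from the eye/camera
reflected = 6_000.0  # reflected object: 2 m to the glass + 4 m back to the object

# Focused on the glass plane: the reflection shows a visible blur circle.
print(blur_circle_mm(4.0, 50.0, glass, reflected))
# Focused on the reflection: now the glass surface itself blurs instead.
print(blur_circle_mm(4.0, 50.0, reflected, glass))
# An object pressed right up against the glass stays sharp (s2 ~= s1),
# matching the "objects almost up to the glass" exception noted above.
print(blur_circle_mm(4.0, 50.0, glass, glass))
```

Rendering both the reflection and the interior perfectly sharp, as current RTX titles do, corresponds to an infinite depth of field, which is why it reads as artificial.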
 
You'd be able to if AMD enabled it in the drivers. I myself would love to try it out and compare.
 
As a quick note, especially to any developer: go to any reflective glass window in real life, and you will notice your eye will either focus on the reflection in the glass plane, or focus on the background or interior and blur out the reflection on the window. Really, the brain will disregard the reflection unless the objects behind the glass are almost up against it. So far, all RTX window reflections where you can see inside the room have been less than accurate, with both the reflection and the interior rendered clearly. Better would be: if you look straight through the window, focus the interior; looking up, down, or to the sides past some threshold, blur the interior and focus the reflection on the glass plane. Then it would look more natural. The same goes for other reflective surfaces where you can see some distance behind them, such as clear water. That would be a first-person view; camera views are similar, but also differ from how eyes focus on objects.

Yeah they haven’t quite figured out that most reflections are subtle. Some are really sharp in real life but devs are currently going overboard.
 
Yeah they haven’t quite figured out that most reflections are subtle. Some are really sharp in real life but devs are currently going overboard.

I'd say it has more to do with showing off the new features. I expect that in the future more of the effects will be toned down to a more natural look.
 
I'd say it has more to do with showing off the new features. I expect that in the future more of the effects will be toned down to a more natural look.
I hope so. Watching RTX videos, the RT reflections are kind of jarring and just stick out way beyond normal, making everything look even more artificial or contrived. Subtlety has a more impactful realism.
 