AMD ray tracing tech demo

Anyhow, one benefit the Xbox will have is that devs will be able to bypass DXR and get closer to the metal when utilizing RT, per DF.

Microsoft is making a huge deal about DX12 alignment between Xbox and PC with DX12 Ultimate, and basically shipping the same games on Xbox and PC, so I wouldn't bet on them getting any closer to the metal on Xbox than they do on PC.
 
Microsoft is making a huge deal about DX12 alignment between Xbox and PC with DX12 Ultimate, and basically shipping the same games on Xbox and PC, so I wouldn't bet on them getting any closer to the metal on Xbox than they do on PC.

And not the other way around either, is my guess, unfortunately.
 
None of what you said makes any sense. How you can use this promotional video to determine performance, let alone draw final conclusions, is beyond me. Garbage post is garbage. Maybe it's time to ignore you again.

If they don't want negative reactions then they should not be putting out low fps trash demos.
 
If they don't want negative reactions then they should not be putting out low fps trash demos.
I was pretty critical of Nvidia demos as well: noisy, low resolution, low frame rates. Still, they were better than this AMD demo 1.5 years later. AMD would have been much better off showing a current RT game. Now the question will be: can it work with current titles?
 
Microsoft is making a huge deal about DX12 alignment between Xbox and PC with DX12 Ultimate, and basically shipping the same games on Xbox and PC, so I wouldn't bet on them getting any closer to the metal on Xbox than they do on PC.

I’m just regurgitating what DF said. What actually happens, I guess we’ll have to wait and see.

I also don’t think anyone should expect RDNA2 to have amazing RT performance. If I was a betting man, I’d say its RT performance is worse than Turing’s.
 
I was pretty critical of Nvidia demos as well: noisy, low resolution, low frame rates. Still, they were better than this AMD demo 1.5 years later. AMD would have been much better off showing a current RT game. Now the question will be: can it work with current titles?

RDNA 2 is obviously close to finalized as Xbox will be shipping soon along with the PC discrete gpu (assuming coronavirus doesn't push things back) so at this point they should have had an impressive high fps demo ready. Maybe the dxr hardware just isn't that great?
 
I’m just regurgitating what DF said. What actually happens, I guess we’ll have to wait and see.

I also don’t think anyone should expect RDNA2 to have amazing RT performance. If I was a betting man, I’d say its RT performance is worse than Turing’s.

No one should expect any GPU to have amazing RT performance.

RT effects utilize something like 40% RT cores and 60% traditional cores on Turing, for the frame-time hit. Note that I am not talking about the mix in hybrid games. The actual pure RT cores still hit shaders hard, and then on top of that you hit shaders more for traditional raster effects. So overall in modern hybrid RT games, traditional cores are vastly more important than RT cores, and you don't make big moves in overall performance without a big boost in shader performance.

So you need big-time raster, and tilting massively in favor of RT cores has diminishing returns, because even RT effects are traditional-shader bound. If you want to double RT performance, you don't just double RT cores; you need to double shader cores as well.
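The scaling argument above can be sketched as a toy frame-time model (all the millisecond numbers here are illustrative assumptions, not measured data): if the RT effect itself splits its cost roughly 40/60 between RT cores and shaders, then doubling only the RT cores barely moves total frame time, while scaling both does.

```python
# Toy frame-time model for hybrid RT rendering (illustrative numbers only).
# rt_core_ms: RT-effect work running on dedicated RT cores
# rt_shader_ms: RT-effect work that still runs on ordinary shaders
# raster_ms: traditional raster work, also on shaders

def frame_time(rt_core_ms, rt_shader_ms, raster_ms,
               rt_scale=1.0, shader_scale=1.0):
    """Total frame time when RT-core and shader throughput scale independently."""
    return rt_core_ms / rt_scale + (rt_shader_ms + raster_ms) / shader_scale

base = frame_time(4.0, 6.0, 10.0)                                   # 20.0 ms
rt_only = frame_time(4.0, 6.0, 10.0, rt_scale=2.0)                  # 18.0 ms
both = frame_time(4.0, 6.0, 10.0, rt_scale=2.0, shader_scale=2.0)   # 10.0 ms
print(base, rt_only, both)
```

Doubling RT cores alone gains only 10% here, because the frame is shader-bound; doubling both halves the frame time.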

If AMD has a Big Navi part that is clearly faster than the 2080 Ti in raster games, I expect it will be faster in RT games as well, because it is simply a case of them balancing RT cores to match their known Nvidia targets.
 
With the release of DX12 ultimate, AMD released a tech demo.

No info on what it's running on, but Big Navi comes to mind.

I hope it's an early alpha, because it kind of sucks. Not impressed at all. Also seems to be running at a low framerate.

Looks like it's barely hitting 30 FPS, and often lower. But it's early hardware, and like Nvidia, it will take time for AMD to get the software right. I cut NV a lot of slack on DLSS and now it's fantastic.
 
Looks like it's barely hitting 30 FPS, and often lower. But it's early hardware, and like Nvidia, it will take time for AMD to get the software right. I cut NV a lot of slack on DLSS and now it's fantastic.
Exactly -- ray tracing is currently an exercise in figuring out how to do as little of it as possible per frame. In this respect, Nvidia is years ahead of AMD; not that that's any different than normal for AMD, but given that they're supplying hardware for these new consoles, yeah, it's going to take developers quite some time to figure out how to leverage the benefits of the new technology without making excess sacrifices elsewhere that would negatively affect the gaming experience.
 
Well it can't be too early. I mean, the consoles are launching this year, meaning the hardware is probably locked (or near final) since they have to start production in advance.

I still hold out hope that AMD is going to come through. Obviously it's possible they make some big strides at the end, we'll see.
 
I still hold out hope that AMD is going to come through. Obviously it's possible they make some big strides at the end, we'll see.
I've been hoping the same thing since they bought ATi... ;)

Really though, ray tracing itself should be pretty straightforward. It's far simpler than rasterization. Here, AMD needed to develop (it's set in stone hardware-wise; it's just too late to change) something that can do ray tracing well enough within the confines of a console, and given that their technology is second only to Nvidia's and that they were developing their hardware in concert with Microsoft, Sony, and game developers concurrently, I honestly believe that they'll pull through for the consoles themselves.

As for the desktop... that's never been their sandbox. Hopefully, they'll exceed Turing well enough to compete with Ampere and provide solutions that are desirable for enthusiasts, but given the disconnect between console and desktop development, it's simply not a sure thing even if they knock it out of the park for the consoles, which I'm betting they already have.
 
I think them beating Nvidia's upcoming cards is out of the question, but getting reasonable RT performance in the mid-range (which is what I assume they have done on consoles) seems possible.

For mainstream audiences, which are the bread and butter of both companies' business, playing with ray tracing at 4K and high refresh (or whatever people here want) is not the expectation.

Most gamers are at 1080p and probably play on 60Hz monitors. *IF* AMD can produce a card that doesn't cost a fortune and can do some form of ray tracing at 1080p, then I think it will sell.

I have an RTX 2060 somewhere; I tested it with the Star Wars reflections demo and it was sub-10 fps. I didn't even bother with any RT games, but that was at launch; the situation may be better now.

The point is, Nvidia got a lot of flak (justifiably) because RT was not viable outside the 2080 Ti at launch. Hell, even today I have to run Control on medium settings (RT High) w/ DLSS to "only" get 90 fps at 1080p.

Even if AMD found some shortcut or trick where the RT was faster but lesser quality, that would be huge. Judging from the demo, they have not, but maybe something can be done with the drivers even if that is final hardware.
 
Even if AMD found some shortcut or trick where the RT was faster but lesser quality, that would be huge. Judging from the demo, they have not, but maybe something can be done with the drivers even if that is final hardware.
The trick really is for game developers to develop for RT first. For 'real' RT, nothing could be done, even the top Ampere SKU will fall far short; but we're not doing that, we're doing 'hybrid' RT, and there's plenty of room to cut RT hardware requirements through game engine flexibility. They're just not designed to do that, yet.

To imagine one possibility: instead of looking for light rays to 'light' each pixel, imagine them 'informing' the raster-based lighting model. Far lower RT resolution and far more potential for artifacts if done crudely, but if done smartly (and likely dynamically), also more accessible from a hardware standpoint.
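The idea above can be sketched roughly like this (a minimal illustration with made-up names and shapes, nothing like a real engine pipeline): trace lighting at a fraction of the output resolution, upsample it, and use it to modulate the raster-lit frame instead of tracing rays for every pixel.

```python
import numpy as np

def upsample_nearest(buf, factor):
    """Nearest-neighbour upsample of a low-res lighting buffer."""
    return np.repeat(np.repeat(buf, factor, axis=0), factor, axis=1)

H, W, F = 8, 8, 4                       # full res 8x8, traced at 2x2 (factor 4)
rt_lighting = np.random.rand(H // F, W // F)  # stand-in for ray-traced irradiance
raster_frame = np.ones((H, W))                # stand-in for the raster-lit frame

# Low-res RT result 'informs' the raster lighting rather than replacing it,
# so only 1/16th of the pixels ever needed a ray.
informed = raster_frame * upsample_nearest(rt_lighting, F)
print(informed.shape)   # (8, 8)
```

A real implementation would upsample far more cleverly (bilateral filtering, temporal accumulation) to hide the artifacts the post mentions, but the hardware saving comes from the same place: fewer rays than pixels.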

Given the tight coupling between hardware, OS, and games on consoles, I feel that AMD will have no problem pulling this off in concert with vendors and developers.


The problem I'm seeing is the application of this approach outside of the upcoming consoles. While a shift in developer approach to RT (and low-latency APIs and a host of other things) is going to make a difference, it's not going to be something that shows more than a margin of error between GPU vendors or CPU vendors etc., if history is any judge.
 
On second thought, I think AMD dropped the ball big time with the DXR demo.

When Nvidia/ILM/Epic released the SW reflections demo, they first showed it running on a quad Titan V at 24 fps, IIRC, with no RTX. Later they showed it with a single RTX 2080 Ti at over 30 fps @ 4K. That was the first glimpse of what we could expect from RTX. Today the only game that comes close is probably Control.

2 years later AMD shows a raytracing demo that looks like old Intel raytracing demos back when Larrabee was still a thing.
 
Exactly. The demo should be at least as good as current games, ideally much better.

Star Wars reflections still blows people away when I show them on my system, they can't believe it's running in real time.

AMD needed something like that (but they still have time to correct).
 
On second thought, I think AMD dropped the ball big time with the DXR demo.

When Nvidia/ILM/Epic released the SW reflections demo, they first showed it running on a quad Titan V at 24 fps, IIRC, with no RTX. Later they showed it with a single RTX 2080 Ti at over 30 fps @ 4K. That was the first glimpse of what we could expect from RTX. Today the only game that comes close is probably Control.

2 years later AMD shows a raytracing demo that looks like old Intel raytracing demos back when Larrabee was still a thing.

One thing I think might hamper AMD is that I suspect NVIDIA has been aiming for this ever since the G80.
Not just the hardware...but also the software/ecosystem.
So they have planned this for a long time...and put the resources into developing it.

Whereas AMD seems to have been caught off guard and is playing catch-up.
I think NVIDIA moved 2-3 generations earlier than AMD suspected...and it shows.
 
One thing I think might hamper AMD is that I suspect NVIDIA has been aiming for this ever since the G80.
Not just the hardware...but also the software/ecosystem.
So they have planned this for a long time...and put the resources into developing it.

Whereas AMD seems to have been caught off guard and is playing catch-up.
I think NVIDIA moved 2-3 generations earlier than AMD suspected...and it shows.
Supposedly nvidia started working on Raytracing about the same time as Intel was working on Larrabee. Probably about the time the G80 or G92 came out. Nvidia had been pushing Raytracing acceleration via CUDA for years, and AMD followed suit much later using OpenCL but lagged behind performance/quality wise. I guess the same could happen with RTX vs AMD RT implementation.
Time will tell.
 
Supposedly nvidia started working on Raytracing about the same time as Intel was working on Larrabee. Probably about the time the G80 or G92 came out. Nvidia had been pushing Raytracing acceleration via CUDA for years, and AMD followed suit much later using OpenCL but lagged behind performance/quality wise. I guess the same could happen with RTX vs AMD RT implementation.
Time will tell.

Didn't Jensen say it took them a decade? I doubt he'd lie about that.
 
None of what you said makes any sense. How you can use this promotional video to determine performance, let alone draw final conclusions, is beyond me. Garbage post is garbage. Maybe it's time to ignore you again.

I'll explain:

This promotional video determines performance in similar workloads. If AMD could have rendered it at a higher resolution, they would have. If AMD could have rendered it at a higher framerate, they would have. AMD is arriving at the ray tracing party two years late; they need to show that not only can they offer the feature, but they can compete with other solutions. This demo was supposed to show that. It didn't; it showed the opposite: that AMD was not maintaining 30 FPS with full-screen glossy reflections at sub-1080p resolution. To give a comparison, a 2080 Ti can render full-scene path tracing at 1440p at over 60 FPS.

So in similar workloads, in a demo that AMD made from the ground up, rendered on yet-unreleased silicon that AMD has complete control over, it could not match the competing solution. Do you think AMD showed themselves in this position by choice?

Or to put it another way: the tried-and-true graphics demos, such as Nvidia Dawn, Ruby, Toyshop, etc.: all those cool demos showed off ridiculously high-detail (for the time) graphics with amazing image quality, to show how graphics would look when the architecture and design were the focus. They looked better than anything at the time. This is true of Nvidia's ray tracing demos: no games look as good as Nvidia's RT demos, and they run smoothly.

Meanwhile, there is this from AMD. If you think it looks good, that's opinion; I can't argue. But if you think it's performing well or at a high resolution, I can argue: it's not, and you'd be wrong. Why would AMD show off something that is performing poorly and at low resolution, unless A: it's the only thing they could show off, or B: they are for some reason holding back (something Radeon has historically never done)?
 
I just got to watch the AMD demo for the first time on my PC and it's even more underwhelming.

AMD puts out that turd; remember Nvidia's? Something you could mistake for a movie:

 
I just got to watch the AMD demo for the first time on my PC and it's even more underwhelming.

AMD puts out that turd; remember Nvidia's? Something you could mistake for a movie:


It's a bummer, I really dig what can be done with RT and was hoping that by now it would have come much further from AMD and others than just Nvidia.
 
I watched the demo again just to be sure and it looks like maybe 30fps, if even that.

I can understand if they can't afford to hire ILM to use movie graphics, but they have to do better than that.

And, even excusing the bad art direction, 1080p60 should be a given. If that is the best they've got, then Nvidia has nothing to worry about.
 
Let me throw AMD a bone in this discussion -- my assumption from the beginning has not been that this demo is meant to show off performance, but simply that the hardware is doing the work.

Given AMD's focus on producing Ryzen cores to cover Intel's shortfalls, as well as SoCs for these new consoles, it would be understandable that they don't have a high-end part ready, and that on their own, a high-fidelity, high-performance demo akin to Nvidia's wouldn't be a priority. They just want to show working hardware at this point.

In this case, it seems AMD has less work to do, because Nvidia has already sold the development world and much of the enthusiast world on ray tracing. AMD doesn't need to do the hard part; they just have to show that they can do it as well.

Now, let's assume that they built this demo simply as a tech demo, and that the purpose was a public 'function check' more than a PR point for consumers. Perhaps it's underwhelming for gamers, and perhaps it'd run better on a 2080Ti, and perhaps it was run on an Xbox SoC. We don't really know the details and we can't check them because AMD RT hardware isn't in public hands.

Maybe it's just all some 5-dimensional chess sandbagging from Herk (Jebaited) :p
As much as AMD has earned an element of caution with respect to launching new GPU technology on the desktop, primarily due to software and partner integration issues, they've acquitted themselves well on this last console generation and refresh. They do pump out products that perform and with respect to the tighter integration and support of the console ecosystem, we should expect them to do very well.

Perhaps ray tracing will remain more limited on consoles relative to overall performance vs. PC, perhaps not; but we should at least expect AMD to make it work.
 
Let me throw AMD a bone in this discussion -- my assumption from the beginning has not been that this demo is meant to show off performance, but simply that the hardware is doing the work.

Given AMD's focus on producing Ryzen cores to cover Intel's shortfalls, as well as SoCs for these new consoles, it would be understandable that they don't have a high-end part ready, and that on their own, a high-fidelity, high-performance demo akin to Nvidia's wouldn't be a priority. They just want to show working hardware at this point.

In this case, it seems AMD has less work to do, because Nvidia has already sold the development world and much of the enthusiast world on ray tracing. AMD doesn't need to do the hard part; they just have to show that they can do it as well.

Now, let's assume that they built this demo simply as a tech demo, and that the purpose was a public 'function check' more than a PR point for consumers. Perhaps it's underwhelming for gamers, and perhaps it'd run better on a 2080Ti, and perhaps it was run on an Xbox SoC. We don't really know the details and we can't check them because AMD RT hardware isn't in public hands.


As much as AMD has earned an element of caution with respect to launching new GPU technology on the desktop, primarily due to software and partner integration issues, they've acquitted themselves well on this last console generation and refresh. They do pump out products that perform and with respect to the tighter integration and support of the console ecosystem, we should expect them to do very well.

Perhaps ray tracing will remain more limited on consoles relative to overall performance vs. PC, perhaps not; but we should at least expect AMD to make it work.

Stop being reasonable and open-minded on the internet!
 
Yeah, I think they can still come through.

I mean, if ray tracing works on consoles (and there is every indication it does) then that does bode well for the high performance PC parts.

The video did accomplish something, in that it proves that it "works", not so much that the performance or art is there.

But I do think they need a proper demo for gamers, and hopefully they can do so before launch.
 
If the Xbox SoC can show off Minecraft RT, then there's no excuse for that shitty demo. I can't give AMD a pass for that. At the very least they could have also used Minecraft and just said it's on unreleased hardware and alpha drivers as a caveat.

I do think they will obviously pull off RT with reasonable performance for simple effects, akin to Turing or maybe even slightly worse, but I don't have any expectation of AMD matching Ampere. If they do, I'll be very, very surprised. They also need a DLSS 2.0 equivalent on the PC.
 
I'll explain:

This promotional video determines performance in similar workloads. If AMD could have rendered it at a higher resolution, they would have. If AMD could have rendered it at a higher framerate, they would have. AMD is arriving at the ray tracing party two years late; they need to show that not only can they offer the feature, but they can compete with other solutions. This demo was supposed to show that. It didn't; it showed the opposite: that AMD was not maintaining 30 FPS with full-screen glossy reflections at sub-1080p resolution. To give a comparison, a 2080 Ti can render full-scene path tracing at 1440p at over 60 FPS.

So in similar workloads, in a demo that AMD made from the ground up, rendered on yet-unreleased silicon that AMD has complete control over, it could not match the competing solution. Do you think AMD showed themselves in this position by choice?

Or to put it another way: the tried-and-true graphics demos, such as Nvidia Dawn, Ruby, Toyshop, etc.: all those cool demos showed off ridiculously high-detail (for the time) graphics with amazing image quality, to show how graphics would look when the architecture and design were the focus. They looked better than anything at the time. This is true of Nvidia's ray tracing demos: no games look as good as Nvidia's RT demos, and they run smoothly.

Meanwhile, there is this from AMD. If you think it looks good, that's opinion; I can't argue. But if you think it's performing well or at a high resolution, I can argue: it's not, and you'd be wrong. Why would AMD show off something that is performing poorly and at low resolution, unless A: it's the only thing they could show off, or B: they are for some reason holding back (something Radeon has historically never done)?

Investing more in a fake promo equals a better product. Got it. Thanks for your insight.
 
Investing more in a fake promo equals a better product. Got it. Thanks for your insight.

Investing in proper marketing and PR pays huge dividends. A poor first showing leaves a lasting impression. Just ask Microsoft about the poor impression they left on gamers during the Xbox One launch. I want AMD to make me money, and these kinds of demos do the opposite for mindshare.
 
Looks like AMD needs to hire some good demo guys, artists, and programmers and start producing marketing content. That demo shows that AMD totally dissolved that department long ago; the Ruby demos were very cool and many looked forward to them. There were many unique ones that ATi used to do that made one drool with anticipation. At least hire a team or subcontract out; otherwise use some current games, like Control or Youngblood, better yet Cyberpunk 2077.
 
If I remember correctly, the Ruby demo wasn't done in-house; they contracted it out, but that is a totally reasonable thing to do (just like Nvidia did with ILM).
 
You know, it's funny looking over comments here about how this demo is underwhelming and how Nvidia's demos of ray tracing looked great. Then we got a working game with ray tracing to use on Nvidia hardware, and well, performance sucked and the world got a fresh coat of wax on everything. You would think by now you would learn not to put much into a tech demo other than that it shows they have working ray tracing tech. Not only is it a demo, but we don't even know what hardware it was running on.
 
Probably something like four Quadro RTX 8000 GPUs, who knows...?
 
The demo looks like shit, but they did re-release the video recently and it's a bit smoother. Apparently it was a 60 fps capture of 24 or 25 fps content, as people on plebbit frame-by-framed it and could see the doubling going on. Either way, not a good look. Bring back Ruby, yo.
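That frame-by-frame check is easy to reproduce in principle: in a 60 fps capture of ~24/25 fps content, many adjacent frames are pixel-identical, so diffing consecutive frames exposes the doubling. Here's a rough sketch using synthetic arrays as frames (a real check would decode the actual video first):

```python
import numpy as np

def duplicated_frames(frames, tol=1e-6):
    """Count adjacent frame pairs that are (near-)identical."""
    return sum(
        np.abs(a - b).max() <= tol
        for a, b in zip(frames, frames[1:])
    )

unique = [np.full((4, 4), i, dtype=float) for i in range(5)]  # 5 distinct frames
doubled = [f for f in unique for _ in (0, 1)]                 # each shown twice

print(duplicated_frames(unique), duplicated_frames(doubled))  # 0 and 5
```

A capture with no frame doubling reports zero near-identical pairs; a 2x-duplicated stream reports one per source frame.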
 
You know, it's funny looking over comments here about how this demo is underwhelming and how Nvidia's demos of ray tracing looked great. Then we got a working game with ray tracing to use on Nvidia hardware, and well, performance sucked
Thankfully AMD doesn't have to carry the torch of being 'first' here ;)
 
You know, it's funny looking over comments here about how this demo is underwhelming and how Nvidia's demos of ray tracing looked great. Then we got a working game with ray tracing to use on Nvidia hardware, and well, performance sucked and the world got a fresh coat of wax on everything. You would think by now you would learn not to put much into a tech demo other than that it shows they have working ray tracing tech. Not only is it a demo, but we don't even know what hardware it was running on.

It had better be running on an early alpha midrange card. :D :D :rolleyes::rolleyes:
 