cageymaru

During an interview with Mark Hachman of PCWorld, AMD president and CEO Dr. Lisa Su acknowledged AMD is "deep in development" of a GPU with real-time ray tracing (DXR) capabilities. She explained that building a development ecosystem was important as "development is concurrent between hardware and software." "The consumer doesn't see a lot of benefit today because the other parts of the ecosystem are not ready," she said.

On the subject of the upcoming 3rd generation Ryzen processor, she explained that the demo at CES 2019 was done with an 8C/16T AMD Ryzen part to create parity in core count with the Intel Core i9-9900K. She elaborated with this statement: "If you look at the evolution of Ryzen, we've always had an advantage in core count." While acknowledging that some in the room had noticed the extra space on the package, she said, "There is some extra room on that package and I think you might expect we will have more than eight cores."

"I think ray tracing is an important technology, and it's something we're working on as well, both from a hardware and software standpoint," Su said. "The most important thing, and that's why we talk so much about the development community, is technology for technology's sake is okay, but technology done together with partners who are fully engaged is really important."
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.
My prediction is that we will see mainstream AMD DXR GPUs roll out when the PlayStation 5 releases with DXR support. Gran Turismo?
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.
How many times did ATI make this mistake? Hardware features before software implementation is usually not a great business plan. Sidenote: Dr. Lisa Su is THE pillar of current AMD. The recent difference in professionalism vs Jensen Huang is astounding.
 
We'll see if Navi has tensor and/or RT equivalents, and if the PS5/next Xbox have any capability that way. Pretty sure ray tracing won't get broad adoption unless both new consoles have it. Also, will Intel and AMD's integrated graphics support it to some degree in the future? Otherwise game developers still have to write code paths that don't use it and other paths that do.
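For anyone wondering what that dual code path looks like on the PC side, here is a rough sketch, purely my own illustration and not anything AMD or Nvidia has published: the RenderPath enum and SelectRenderPath function are made-up names, but the CheckFeatureSupport query is the actual D3D12 way an engine can detect DXR at startup and fall back to raster when it isn't there.

#include <windows.h>
#include <d3d12.h>

enum class RenderPath { RasterOnly, HybridRaytraced };

// Decide which lighting path to use, assuming the caller has already created
// a D3D12 device. On hardware or drivers without DXR, CheckFeatureSupport
// reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED and we fall back to raster.
RenderPath SelectRenderPath(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        return RenderPath::HybridRaytraced;  // layer DXR effects on top of raster
    }
    return RenderPath::RasterOnly;           // classic rasterized lighting only
}

Everything built on top of that one check, the separate shaders, denoising, and fallbacks, is where the real per-title cost comes in, which is why console support matters so much.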
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.
Nvidia has been doing this for years.

RIVA TNT and 32 bit 3D color
Geforce 256 and T&L
Geforce 3 and Shaders
Geforce FX and Shader Model 2.0

All of the above were overkill for games at the time of release, yet each was in wide use by the time the more popular next-gen card arrived.

Nvidia always plays a long game when it comes to features. By the time the 3080 Ti comes out, most AAA games will be using the tech. AMD is betting that their implementation will be out at the same time, and that their game console lead will keep developers from widely adopting the tech, since the PS4 and Xbox One can't do ray tracing either.
 
How many times did ATI make this mistake? Hardware features before software implementation is usually not a great business plan. Sidenote: Dr. Lisa Su is THE pillar of current AMD. The recent difference in professionalism vs Jensen Huang is astounding.

Nvidia has been doing this for years.

RIVA TNT and 32 bit 3D color
Geforce 256 and T&L
Geforce 3 and Shaders
Geforce FX and Shader Model 2.0

All of the above were overkill for games at the time of release, yet each was in wide use by the time the more popular next-gen card arrived.

Nvidia always plays a long game when it comes to features. By the time the 3080 Ti comes out, most AAA games will be using the tech. AMD is betting that their implementation will be out at the same time, and that their game console lead will keep developers from widely adopting the tech, since the PS4 and Xbox One can't do ray tracing either.


Well, there is kind of a chicken and egg problem. If hardware vendors are waiting for software availability to develop the features, and software vendors are waiting for hardware vendors to make a capability available before developing for it, then we will never have anything new.
 
We'll see if Navi has tensor and/or RT equivalents, and if the PS5/next Xbox have any capability that way. Pretty sure ray tracing won't get broad adoption unless both new consoles have it. Also, will Intel and AMD's integrated graphics support it to some degree in the future? Otherwise game developers still have to write code paths that don't use it and other paths that do.
Navi is Polaris' replacement. Seriously doubt tensor cores on a mid-range product.
 
Navi is Polaris' replacement. Seriously doubt tensor cores on a mid-range product.

Then the PS5 and the next Xbone are both without it, and the feature is dead as a doornail anyway.

For RTX to be a thing going forward, AMD has to make it go. lol Seriously though, they have Sony, MS, and now Google all locked to AMD hardware. Features they don't support will be also-rans, if they are supported at all.
 
Then the PS5 and the next Xbone are both without it, and the feature is dead as a doornail anyway.

For RTX to be a thing going forward, AMD has to make it go. lol Seriously though, they have Sony, MS, and now Google all locked to AMD hardware. Features they don't support will be also-rans, if they are supported at all.

The question is not 'if' RT will be everywhere, the question is 'when' RT will be everywhere. AMD dragging their feet might make it take a little bit longer, but that's it.

Fortunately the companies making game engines have already jumped on board, regardless of the uncertainty surrounding AMD's ability to support the standard.
 
The question is not 'if' RT will be everywhere, the question is 'when' RT will be everywhere. AMD dragging their feet might make it take a little bit longer, but that's it.

Fortunately the companies making game engines have already jumped on board, regardless of the uncertainty surrounding AMD's ability to support the standard.

There is zero point in talking up hardware features if you know software won't expose them for over a year. The DX ray tracing spec is not an NV spec; it's an MS one. AMD has been behind the scenes getting things ready just as much, and likely to more of a degree than NV. NV simply figured they would beat AMD to the punch and start talking about it to be first. Good on them; it's still a useless feature until game developers start really pushing it (not tacked-on crap implementations in 1 or 2 games). That real push is obviously coming with the next-generation console launches.

AMD's cards will run the ray tracing standard just fine... and almost without a doubt the best. The real question is how much Sony is going to dictate the standard. I'm sure their PS5 is going to continue to use some form of GNMX. Are they going to just mostly lift tracing stuff from Vulkan? Or are they going to do their own thing? MS is going to be forced to pretty much do what Sony is doing... and there is, imo, a very real chance that real-time ray tracing is going to look better on the PS5 than on the PC, because yes, software matters... and Sony's GNMX + tracing extensions are going to be, no doubt, the lowest level of them all. I foresee a situation very much like when the PS4 and Xbone launched and developers made clear the Sony hardware, although almost identical, was able to do a lot of things in a cycle vs 100s of cycles on the MS hardware due to their crap API.
 
How many times did ATI make this mistake? Hardware features before software implementation is usually not a great business plan.

It's a difficult business because they started planning and design 5 years ago, and all they could do was predict where things would be now.

It's like trying to predict what new games and technology will be out 5 years from now.
 
Then the PS5 and the next Xbone are both without it, and the feature is dead as a doornail anyway.

For RTX to be a thing going forward, AMD has to make it go. lol Seriously though, they have Sony, MS, and now Google all locked to AMD hardware. Features they don't support will be also-rans, if they are supported at all.

At this point real-time ray tracing is not viable on current hardware or software at a reasonable price. That leaves out consoles for now. If you need bastardized, sampled ray tracing so badly, you need to pay Jensen gobs of cash, get a 2080 Ti, and use RTX. AMD is working on it, but don't expect anything for a bit. I bet we will hear something by next year. Just a guess. RTX will probably dead-end. DXR is the future. It appears that AMD is waiting on the hardware to catch up in the mid-to-low end to implement DXR.

https://wccftech.com/amds-david-wan...-dxr-until-its-offered-in-all-product-ranges/
 
At this point real-time ray tracing is not viable on current hardware or software at a reasonable price. That leaves out consoles for now. If you need bastardized, sampled ray tracing so badly, you need to pay Jensen gobs of cash, get a 2080 Ti, and use RTX. AMD is working on it, but don't expect anything for a bit. I bet we will hear something by next year. Just a guess. RTX will probably dead-end. DXR is the future.

https://wccftech.com/amds-david-wan...-dxr-until-its-offered-in-all-product-ranges/

It's possible it will dead-end. It's sort of like VR, in that a really early launch may have hurt it more than it helped.

Still, I do believe AMD has some insanity coming down the road with chiplet-based APUs... It is very possible that the PS5 chip (and MS's) they are working on may well look a lot more like the Cell. The Cell was way ahead of its time, and Su played a big role in the Cell chip. The next-gen console chip could well feature the Ryzen I/O chip + 1 Ryzen 6/12 or 8/16 + 1 Navi + 1 tensor/tracing ASIC. There is actually nothing stopping them from developing the tensor ASIC on its own now... and selling it for server add-ins and dropping the cast-offs into APUs / console chips. Just like they removed the memory controller from Ryzen... their tensor hardware doesn't have to be part of Navi at all. So Navi could still be a low-ball, simplified GPU design... high-end versions could feature 2 Navi cores + a tensor core, low-end ones just one Navi core and no tensor.

As Su says, there is a lot of empty space on that package she showed off. :)
 
It's a difficult business because they started planning and design 5 years ago, and all they could do was predict where things would be now.

It's like trying to predict what new games and technology will be out 5 years from now.

This is where consoles are huge for the market. If not for consoles, we would likely be stuck in a Mexican stand-off between hardware and software for a long time. Consoles will always be a double-edged sword: we see a boom of new advances in the gaming industry when the next generation is released, but things stagnate afterwards as it's all about optimizing for those systems.
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.
I'd much rather have seen a dedicated real-time post-process GPU that does stuff like ESRGAN/mCable/ReShade/emulator upscaler effects. Actually, I'd love to see AMD make its APUs capable of such a feature separate from discrete graphics. Though a dedicated PCIe card for such a thing wouldn't be terrible either. Hell, truth be told, RTX might not have been so terrible had Nvidia gone the discrete accelerator card route, much like PhysX began. The reason I say that is the tensor cores could be more robust in such a scenario. I know one thing: real-time ESRGAN upscaling would be amazing. Think of all the old games that could benefit from such hardware.
 
It's possible it will dead-end. It's sort of like VR, in that a really early launch may have hurt it more than it helped.

I see that too; a lack of standards hurts the technology.

Still, I do believe AMD has some insanity coming down the road with chiplet-based APUs... It is very possible that the PS5 chip (and MS's) they are working on may well look a lot more like the Cell. The Cell was way ahead of its time, and Su played a big role in the Cell chip. The next-gen console chip could well feature the Ryzen I/O chip + 1 Ryzen 6/12 or 8/16 + 1 Navi + 1 tensor/tracing ASIC. There is actually nothing stopping them from developing the tensor ASIC on its own now... and selling it for server add-ins and dropping the cast-offs into APUs / console chips. Just like they removed the memory controller from Ryzen... their tensor hardware doesn't have to be part of Navi at all. So Navi could still be a low-ball, simplified GPU design... high-end versions could feature 2 Navi cores + a tensor core, low-end ones just one Navi core and no tensor.

As Su says, there is a lot of empty space on that package she showed off. :)

Ryzen 7 6/12 + Navi + ASIC would appear viable. Chiplet solution?
 
How many times did ATI make this mistake? Hardware features before software implementation is usually not a great business plan. Sidenote: Dr. Lisa Su is THE pillar of current AMD. The recent difference in professionalism vs Jensen Huang is astounding.

Too many times!

Nvidia has been doing this for years.

RIVA TNT and 32 bit 3D color
Geforce 256 and T&L
Geforce 3 and Shaders
Geforce FX and Shader Model 2.0

All of the above were overkill for games at the time of release, yet each was in wide use by the time the more popular next-gen card arrived.

Nvidia always plays a long game when it comes to features. By the time the 3080 Ti comes out, most AAA games will be using the tech. AMD is betting that their implementation will be out at the same time, and that their game console lead will keep developers from widely adopting the tech, since the PS4 and Xbox One can't do ray tracing either.

I have the opposite experience: Nvidia USED to be out in front with all the others but then just relaxed.
I can't complain recently though; they have been making killer hardware!
 
It's possible it will dead-end. It's sort of like VR, in that a really early launch may have hurt it more than it helped.

Still, I do believe AMD has some insanity coming down the road with chiplet-based APUs... It is very possible that the PS5 chip (and MS's) they are working on may well look a lot more like the Cell. The Cell was way ahead of its time, and Su played a big role in the Cell chip. The next-gen console chip could well feature the Ryzen I/O chip + 1 Ryzen 6/12 or 8/16 + 1 Navi + 1 tensor/tracing ASIC. There is actually nothing stopping them from developing the tensor ASIC on its own now... and selling it for server add-ins and dropping the cast-offs into APUs / console chips. Just like they removed the memory controller from Ryzen... their tensor hardware doesn't have to be part of Navi at all. So Navi could still be a low-ball, simplified GPU design... high-end versions could feature 2 Navi cores + a tensor core, low-end ones just one Navi core and no tensor.

As Su says, there is a lot of empty space on that package she showed off. :)

I think it'll all hinge on whether Navi has DXR support or not. If it doesn't, then DXR is dead before it ever gets off the ground; if it does, then console support will bring DXR to the forefront. So Nvidia better hope that's what happens, otherwise they put all their eggs in the wrong basket.
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.
But somebody has to do it first. AMD is a company of firsts, but I am sure they have learned that doing it first is not always as great as it sounds.
 
But somebody has to do it first. AMD is a company of firsts, but I am sure they have learned that doing it first is not always as great as it sounds.

You have to be first with the new technology, but you can't compromise the old: Nvidia learned this with the FX 5800, and AMD learned it with the 2900 XT, and there are plenty of smaller prior examples. Nvidia knocked the 8800 GTX out of the park not by making it stupidly fast at DX10, but by making it stupidly fast at DX9, despite it being their introductory DX10 part.

At least with RTX, they've made the cards a bit faster.

Further, they've gotten working hardware to the masses. They're on the losing end of the chicken-and-egg problem, and they're doing it anyway.
 
There is zero point in talking up hardware features if you know software won't expose them for over a year.

The features have already been exposed.

The important part is that the hardware actually exists!

The DX ray tracing spec is not an NV spec; it's an MS one.

No shit?

AMD has been behind the scenes getting things ready just as much, and likely to more of a degree than NV. NV simply figured they would beat AMD to the punch and start talking about it to be first. Good on them; it's still a useless feature until game developers start really pushing it (not tacked-on crap implementations in 1 or 2 games).

No doubt: they were involved with the DX10 spec too. They still haven't caught up ;)

And Nvidia beat AMD to the punch not by talking about DXR, but by shipping working products that are also faster than AMD's new non-DXR parts.

AMD's cards will run the ray tracing standard just fine... and almost without a doubt the best.

This statement stands at odds with AMD's history on such things. They may (and should!) run ray tracing properly, since ray tracing is in reality a simplification; the results achieved by using ray tracing are what rasterizers have been attempting to simulate, after all. But I don't expect AMD to compete any better than usual, as raster performance is still a major priority even with workloads shifting to ray tracing, and, well, AMD is always years behind when it comes to raster performance.

The real question is how much Sony is going to dictate the standard. I'm sure their PS5 is going to continue to use some form of GNMX. Are they going to just mostly lift tracing stuff from Vulkan? Or are they going to do their own thing?

Sony doesn't get to dictate the standard for the hardware; it's already set. The PS3 used a DX9-era Nvidia GPU, and the PS4 uses and the PS5 will use DX12-era AMD GPUs. Sony likely will, as usual, deploy their own semi-custom APIs for the standardized hardware. They can lift as much or as little from Vulkan as they like, and they might lift more than usual simply because DX12 and ray tracing are themselves simplifications, leaving fewer routes available for implementation.

and there is, imo, a very real chance that real-time ray tracing is going to look better on the PS5 than on the PC, because yes, software matters... and Sony's GNMX + tracing extensions are going to be, no doubt, the lowest level of them all. I foresee a situation very much like when the PS4 and Xbone launched and developers made clear the Sony hardware, although almost identical, was able to do a lot of things in a cycle vs 100s of cycles on the MS hardware due to their crap API.

Not really?

We're already using low-level APIs, and throughput isn't an issue on the PC. The PS5 will be nice when it releases, as will the next Xbox, but neither will compare to what PCs are capable of. Perhaps the PS5 will be the better of the two consoles. No way to know that right now.
 
Well, there is kind of a chicken and egg problem. If hardware vendors are waiting for software availability to develop the features, and software vendors are waiting for hardware vendors to make a capability available before developing for it, then we will never have anything new.

It's more like a slow crawl towards a better experience. The competition between product manufacturers slowly pushes the bar forward; when new tech comes out, it needs to provide a competitive edge. The issue with things like PhysX and potentially ray tracing is that the competitive edge isn't immediately apparent, as it requires third-party products (software) to be specifically designed to utilize it.

Now, if ray tracing is indeed as easy to implement as some like to say, and ultimately causes less work for software developers, that is a significant competitive edge and will force AMD to come up with a compatible solution. If RT technology requires more work on the software end, things get a lot murkier.
 
Well, there is kind of a chicken and egg problem. If hardware vendors are waiting for software availability to develop the features, and software vendors are waiting for hardware vendors to make a capability available before developing for it, then we will never have anything new.

Couldn't agree more. It's crazy how worked up people get when there aren't games available to make use of the new tech yet. I personally can't wait for the day when I can afford an OLED screen or VR headset along with a ray-tracing capable card and a creepy horror game in the dark that really makes this new tech shine.
 
A DXR-capable GPU will be invisible mGPU chiplet Navi, you will see.
If they went with a Threadripper-like approach, 1 I/O hub controlling a V6 chiplet engine with 7nm Vega 32 chiplets, they'd have really impressive results, I'd think. Power would probably be an issue, but then again Vega 32 chiplets would run cooler and require less juice, plus heat is easier to dissipate across a wider area, not to mention power gating becomes easier (Precision Boost FTW); lastly, yields would be greatly improved. The thing to do would be to make them like DAC chips, drop-in/out replacements. Just sell a V6 PCB with an I/O hub in place and 1-2 chiplets, and allow you to add more of them down the road, much like memory with single channel/dual channel/quad channel.

Imagine an old arcade board with all the ICs; now imagine those being GPU chiplets rather than ROMs, controlled by an I/O hub. That's sort of where I think we are headed, but shrunk down in size. Still, what's stopping AMD from using that old packaging with chiplets at 7nm? Everything old is new again is all I'm saying; why reinvent the wheel?
 
If they went with a Threadripper-like approach, 1 I/O hub controlling a V6 chiplet engine with 7nm Vega 32 chiplets, they'd have really impressive results, I'd think. Power would probably be an issue, but then again Vega 32 chiplets would run cooler and require less juice, plus heat is easier to dissipate across a wider area, not to mention power gating becomes easier (Precision Boost FTW); lastly, yields would be greatly improved. The thing to do would be to make them like DAC chips, drop-in/out replacements. Just sell a V6 PCB with an I/O hub in place and 1-2 chiplets, and allow you to add more of them down the road, much like memory with single channel/dual channel/quad channel.

Imagine an old arcade board with all the ICs; now imagine those being GPU chiplets rather than ROMs, controlled by an I/O hub. That's sort of where I think we are headed, but shrunk down in size. Still, what's stopping AMD from using that old packaging with chiplets at 7nm? Everything old is new again is all I'm saying; why reinvent the wheel?

I personally think that this is where AMD is going right now... Zen, Navi, etc. I wouldn't be surprised if they make a dedicated, separate chiplet just for RT, and I can imagine a big interposer (a la Epyc) with multiple chiplets for different applications... the Infinity Fabric thing is starting to pay off!
 
Geforce FX and Shader Model 2.0

All of the above were overkill for games at the time of release, yet each was in wide use by the time the more popular next-gen card arrived.

Nvidia always plays a long game when it comes to features.

hahahaha what?

hahahahaahaha
 
She's not wrong. It's hard to make the case for a $1,000 video card when there are no games yet available to take advantage of its hyped technology. This is where NVIDIA stumbled big time, as we still have to deal with the worst of the chicken and egg problem.

Well, they are pushing console sales with those terrible prices of over $1000.
 
This is a reasonable approach for a company that lacks money to invest in serious R&D: play catch-up to the competitor. It's safe, and it mostly guarantees a satisfactory level of ROI. However, you're at the whim of the competitor as to market direction, as well as marketing direction.

Just as Nvidia is the de facto choice for VR gaming, Nvidia will probably stay the de facto choice for ray-traced gaming.
 
This is a reasonable approach for a company that lacks money to invest in serious R&D

Any R&D, really.

They could have doubled Polaris, hooked it up to two stacks of HBM, and had a more competitive raster-only product. Meanwhile Nvidia is pushing out high-end gaming-focused GPUs generation after generation.
 
What?!! AMD is developing their own implementation of ray tracing!!!

Holy crap...so it's actually an important technology coming to more than one game?

Good lord, who would have guessed!!

Whew, thank goodness I chose a 2080 over a 1080 Ti, contrary to what the RTX haters recommended.

But hang on [H], I don't get it.... now that we know AMD is developing their own raytracing implementation,

Does this mean that the RTX Isn't a Lie Anymore?
 
What?!! AMD is developing their own implementation of ray tracing!!!

Holy crap...so it's actually an important technology coming to more than one game?

Good lord, who would have guessed!!

Whew, thank goodness I chose a 2080 over a 1080 Ti, contrary to what the RTX haters recommended.

But hang on [H], I don't get it.... now that we know AMD is developing their own raytracing implementation,

Does this mean that the RTX Isn't a Lie Anymore?

Silly argument. RTX is a DX12 feature, and how many generations did it take for DX12 to be adopted... uh wait, it still isn't... Using that as an argument to validate a 2080 purchase is just crazy; you would be better off referencing DLSS, which is probably more of a key feature at this point.

RTX will be very, very slow in adoption. From a game studio's perspective it's not free (it still needs the optimization effort), and they are still required to do the current lighting work anyway, so there's no way to look at it other than as an additional cost for at least 4-5 more generations, until ray tracing is mainstream. You will see RTX support in some of the bigger showcase titles, probably as an add-on patch, and that's something we will have to be satisfied with. Heck, if you've got the money for a 2080 Ti (the only viable RTX card at the moment), then why would you care anyway...
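To make that "additional cost" point concrete, here is a bare-bones sketch, purely illustrative with made-up pass names (RenderShadowMaps, RenderRaytracedShadows, and so on, not any real engine's API), of how a hybrid renderer ends up carrying both lighting paths: the raster passes still have to ship and be tuned for the non-DXR majority, and the DXR passes only stack on top when the hardware allows it.

#include <cstdio>

struct FrameContext { bool dxrSupported = false; };  // hypothetical per-frame state

// Stub passes; in a real engine each of these is a substantial amount of work
// to write, optimize, and QA, which is exactly where the extra cost lives.
void RenderGBuffer(FrameContext&)                { std::puts("g-buffer"); }
void RenderShadowMaps(FrameContext&)             { std::puts("shadow maps (raster)"); }
void RenderScreenSpaceReflections(FrameContext&) { std::puts("SSR (raster)"); }
void RenderRaytracedShadows(FrameContext&)       { std::puts("shadows (DXR)"); }
void RenderRaytracedReflections(FrameContext&)   { std::puts("reflections (DXR)"); }
void CompositeLighting(FrameContext&)            { std::puts("composite"); }

void RenderFrame(FrameContext& ctx)
{
    RenderGBuffer(ctx);                  // shared by both paths

    // Baseline raster path: must always exist, because most of the installed
    // base (and the current consoles) cannot run DXR at all.
    RenderShadowMaps(ctx);
    RenderScreenSpaceReflections(ctx);

    if (ctx.dxrSupported)
    {
        // Optional DXR passes refine or replace the raster approximations,
        // but they do not remove the need to maintain the code above.
        RenderRaytracedShadows(ctx);
        RenderRaytracedReflections(ctx);
    }

    CompositeLighting(ctx);
}

int main()
{
    FrameContext ctx;
    ctx.dxrSupported = true;  // pretend we are on an RTX card for the demo
    RenderFrame(ctx);
}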
 
Silly argument. RTX is a DX12 feature, and how many generations did it take for DX12 to be adopted... uh wait, it still isn't... Using that as an argument to validate a 2080 purchase is just crazy; you would be better off referencing DLSS, which is probably more of a key feature at this point.

RTX will be very, very slow in adoption. From a game studio's perspective it's not free (it still needs the optimization effort), and they are still required to do the current lighting work anyway, so there's no way to look at it other than as an additional cost for at least 4-5 more generations, until ray tracing is mainstream. You will see RTX support in some of the bigger showcase titles, probably as an add-on patch, and that's something we will have to be satisfied with. Heck, if you've got the money for a 2080 Ti (the only viable RTX card at the moment), then why would you care anyway...
:)
 