Crytek Ray Tracing Demo NOIR out, hardware agnostic!

(Now there are certain effects that are really hard to do with typical rasterization, like indirect shadow interaction and indirect light reflection - and more that I can't think of at the moment.)

With respect, this is what we're looking forward to with RT of any type: direct and indirect global lighting, including in-game reflections and so on. I get that right now the only choice for a performant hardware implementation is to use RTX through DXR, and that the portability of this work to other architectures remains an open question, but the hard work of actually implementing RT in games is progressing nicely. I do not really believe that AMD or Intel RT hardware, if released with competent drivers and developer support, will have any real issue gaining market acceptance. Hell, I expect their entries to go significantly more smoothly than Nvidia's.
 

While they may not be easy to make for a green fellow (I'm a newbie in computer graphics, not affiliated with NV), these effects can be made easy enough; you just have to spend an additional couple of minutes, which benefits the player anyway, since he won't need to crunch 3-4 path traces per light path every frame and fill in the missing pixels with approximations. The player will get a better experience playing the game. At the moment, RT is just a lazy "one-click" solution, leaving players to crunch out the money for it.

Most RTX implementations (the ones I saw in Battlefield and Metro) do not offer anything special, and the price is very high (in performance and in USD). (That's why I wrote that RTX is taking people for a ride - and a damn expensive one.) It almost feels like a scam. But I do have to ask: what kind of effects do people really, really want to see with RTX?
 

You are asking the wrong question.
The proper question is:
Why do developers want this? ;)
 
I'm fascinated that some people are having such a hard time understanding this very simple concept. Why, I wonder?

DXR - ray tracing via DirectX 12. Hardware agnostic; GPUs must support DX12.
RTX - Nvidia's way of accelerating DXR. Proprietary.

Right now, the only way to accelerate DXR via hardware on a GPU is RTX. When AMD releases DXR hardware-capable cards, we'll have another proprietary path to accelerate raytracing DX12 code, that is, DXR.
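To make the "hardware agnostic" part concrete: an application doesn't ask "is this an RTX card?", it asks D3D12 whether the driver exposes a DXR tier at all. Here's a minimal C++ sketch of that query - the API names are standard D3D12, but the snippet itself is just my illustration (assuming a Windows 10 SDK recent enough to include the DXR additions), not something from the demo or this thread. The same call runs unchanged on AMD, Intel, or Nvidia hardware; only the driver decides how the rays actually get accelerated.

```cpp
// Minimal sketch: ask D3D12 whether the current adapter/driver exposes DXR.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // nullptr = let the runtime pick the default adapter, whatever the vendor.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No DX12-capable device found.\n");
        return 1;
    }

    // OPTIONS5 carries the RaytracingTier field that was added for DXR.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))))
    {
        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            std::printf("DXR exposed by the driver (tier %d).\n",
                        static_cast<int>(opts5.RaytracingTier));
        else
            std::printf("DXR not exposed by this driver.\n");
    }
    return 0;
}
```

If that tier comes back as not supported, DXR code can still run through Microsoft's compute-based fallback layer or a vendor's shader-based driver path - slower, but exactly the "process it without dedicated hardware" situation discussed later in the thread.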

Why are some people so confused? It's literally how GPUs have always worked: AMD and NV both supported, say, DX11, but the way each architecture accelerated DX11 code has always followed proprietary paths. What is so hard about this?

Because NVIDIA is evil, AMD doesn't support DXR yet, and facts don't matter to a small minority that is far too vocal...
 
You are asking the wrong question.
The proper question is:
Why do developers want this? ;)
I don't think devs have much say in this matter - they are usually just told "use this, don't do that" by their marketing teams and managers. They (certain managers and CEOs/CFOs) want to spend as little time as possible on the product and maximize profits. Creating optimized, good, too-nice-looking titles (Crysis 1) is literally shooting themselves in their potential profits, and much of the time even the hardware manufacturers will pay the game companies to be less optimized so they can sell newer GPUs, CPUs, etc.

(It is also one of many reasons behind graphical downgrades: basically, if you can make better graphics, don't sell it right away... sell it bit by bit, and people can always say this engine brought something new, and now it looks better in the second title they release 2-3 years down the road.) Not because it was done to optimize performance :) but to optimize the $ income from consumers.
 

Look at it, e.g., this way:
How much time ($$$) does it take to pre-bake RT lighting? ;)
 
Because NVIDIA is evil, AMD doesn't support DXR yet, and facts don't matter to a small minority that is far too vocal...

You act like you use facts, but you've been wrong before. E.g., the original Titan X Pascal launch being the largest chip they could make, and anyone who thought otherwise was an idiot. Then they launched a larger die.

At the end of the day we need wide adoption of this tech, which is all most are saying, so it needs to run on a potato of a GPU so devs don't need to create a normal rasterized game and then do a substantial amount of additional work to add RT. An actually agnostic tech where it "just works" easily is also key. So an engine baking it in would be fantastic.

The vast majority of 2080ti owners take high fps over RT anyways.
 

Agreed so far as raytracing needing to work on the crappiest GPUs so devs can fully move to it. I'm guessing we're 3/4 years away from that happening. As for your other points, I'll point out again:

1) DXR is already hardware agnostic. It's DX12. AMD and Nvidia will always need their own way to accelerate DXR code because their hardware is completely different; each has to translate DXR ray tracing code so that it's processed most efficiently by its respective architecture.

2) DXR is already integrated into UE4 and Unity (latter on preview for now).
 
I'm guessing we're 3/4 years away from that happening.

For the performance to be there, sure- but I expect we'll see hardware support pushed down to entry-level with each company's next architecture (which isn't Navi for AMD, unfortunately). After that we'll likely see it in IGPs/APUs.
 

Well, if all you want is for GPUs to process DXR, that'll happen in 2019 at pretty much all levels. Remember Nvidia's 16 series can already process it (can the 1650 too? I can't recall... but the 1660 definitely can). We know Navi will support DXR - it will on consoles, and that tells me that next month AMD will announce DXR acceleration on Navi cards, maybe even on the RX 580 series. It'll probably be garbage performance there, and meh performance on Navi, because it'll be just like Nvidia's 16 series - some half-assed acceleration without actual dedicated hardware. That'll come in 2020 for AMD. Nvidia is already there in 2019.

So, if all you want is for GPUs to be able to process that code... that's available now in quite cheap cards. We'll see if AMD follows suit next month - most likely yes; their DX12 cards are already capable of processing the code, however slowly. They'll probably add support on Navi/RX 5xx this Q3 at least, so devs have a base to work on AMD's RTX-equivalent path to accelerate DXR (which will probably be announced at Computex, I'm guessing), despite not having dedicated hardware yet.

So in 2019, most GPUs already run DXR, but it's a joke. In 2020 performance will mostly not completely suck. In 2021 it'll be decent. In 2022 it'll be pretty good. By 2023, it should be very good. By 2024, I'm guessing many devs will feel comfortable enough to focus most of their budget on DXR development. More or less the 4/5 year gap I mentioned.
 
Well, if all you want is for GPUs to process DXR, that'll happen in 2019 at pretty much all levels.

I actually meant having dedicated RT hardware.

The doubt about Navi comes from AMD's refusal to really update GCN, though admittedly, on the other hand, basic RT hardware is extremely straightforward.

For Intel, their upcoming IGPs are highly unlikely to have RT hardware as they've had those spec'd out for half a decade now. 10nm has really bitten them in the ass. However, whatever replaces their current IGP IP, which is likely being accelerated, very likely will have RT hardware, as will their upcoming discrete project.

For Nvidia, we can expect their next generation to push RT to all hardware levels except perhaps their MX150/MX250-class 'minimal' GPUs, and with Intel upping their IGP game, those are probably not going to be carried forward.
 
You act like you use facts, but you've been wrong before. E.g., the original Titan X Pascal launch being the largest chip they could make, and anyone who thought otherwise was an idiot. Then they launched a larger die.

At the end of the day we need wide adoption of this tech, which is all most are saying, so it needs to run on a potato of a GPU so devs don't need to create a normal rasterized game and then do a substantial amount of additional work to add RT. An actually agnostic tech where it "just works" easily is also key. So an engine baking it in would be fantastic.

The vast majority of 2080ti owners take high fps over RT anyways.

You are so mad at me it's almost funny...get over it :kiss:

But if this makes you so mad:
[image attachment]

Then this must really piss you off:
[image attachment]


Remember, when talking down RT... you are talking down AMD too... because RT is coming with the backing of:
Microsoft: DXR API (part of DX12)
Intel: GPU hardware to support RT (Xe incoming)
AMD: GPU hardware to support RT (Navi incoming)
NVIDIA: GPU hardware to support RT (RTX hardware released, Pascal shader RT enabled, future GPUs incoming)
PowerVR: GPU hardware to support RT (Wizard GPU incoming)

Devs want it too (you should listen to what they say, hint hint)... so you will be funny to observe over the next few years... going bonkers over RT everywhere.

Or will you, predictably, change your tune when it's no longer just NVIDIA that has hardware in place...
Fanboys... so boring, so angry, so predictable... soooooo boring.

Hint: Being angry at me will do nothing...besides waste your life ;)
 

I have a 2080ti and 2x 1080tis... obviously an AMD fanboy.

This discussion had nothing to do with AMD vs nVidia. You’re the one that keeps throwing it in there as a personal attack.
 

Who started mentioning whom?
Again, your anger is hilarious.
 

You're the one that made it into some kind of nVidia vs AMD nonsense. I am not angry. Why would I get angry over some kind of vendor bias that doesn't exist? You keep going off topic.

Basically, when someone has a terrible argument, they go off topic or resort to personal attacks, like you do. Which you have a vast history of.

I am done with this thread. I don't feel like doing circular arguments today.
 

Don't be done with the thread. I like the different viewpoints everyone brings here. Just skip the disagreement with Factum.

But if this makes you so mad:
[image attachment]

Then this must really piss you off:
[image attachment]

Frankly I don't see the point of you including this... nobody was saying anything AMD vs Nvidia; all Dayaks was saying was that we need RT to spread through most of the hardware lineup so devs prioritize its inclusion in game development. I didn't get the impression that they were mad at you or at the topic, but it does seem like you are trying to make this an AMD vs Nvidia discussion, and that's far from what this is. AMD will most likely have DXR support this year, whether via shaders or some hardware acceleration (I doubt the latter; I'd expect that in their next architecture), and Intel will bring the same in 2020. I don't think anyone's "mad" about it. Quite the opposite, in fact.
 