Mimicking RTX Global Illumination with Traditional Graphics Techniques

cageymaru

Gamers Nexus created a demonstration that illustrates the positives and negatives of RTX usage by utilizing the DXR implementation in Unreal Engine. They create environments that fake global illumination with traditional graphics techniques, then recreate them with the accuracy of real-time ray tracing. Viewers can see the ups and downs of using DXR as noise rises and performance declines. This is the first video in a series that will delve into the topic.

Today's content fakes global illumination to illustrate the ups and downs of using RTX. The upside is obvious -- it's easier, allows better dynamic movement of objects while retaining GI, and is theoretically more accurate. The downsides, clearly, are performance and noise. Our 100% ray-traced... "game" provides some early demos of DXR's implementation in Unreal Engine, which is presently (at writing) on its third preview build. Epic Games still has a lot of work to do on this front, but we can take an advance look at RTX shadows, reflections, global illumination, and more.
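To make the noise point concrete, here is a minimal sketch (my own toy example, not GN's or Epic's code) of why ray-traced GI gets noisy at the ray budgets real-time hardware can afford: a one-bounce diffuse estimate is a Monte Carlo integral, and its error only shrinks with the square root of the ray count.

```python
# Minimal sketch: why ray-traced GI is noisy at low sample counts.
# All numbers are illustrative, not from any real engine.
import math
import random

def sky_radiance(direction):
    """Toy environment light: a bright patch high in an otherwise dim sky."""
    x, y, z = direction
    return 5.0 if (z > 0.9 and x > 0.0) else 0.2

def sample_cosine_hemisphere():
    """Cosine-weighted random direction about the +z surface normal."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def estimate_irradiance(rays_per_pixel):
    """Monte Carlo estimate of diffuse irradiance at one shading point.
    With cosine-weighted sampling the cosine term cancels the pdf, leaving pi."""
    total = sum(sky_radiance(sample_cosine_hemisphere())
                for _ in range(rays_per_pixel))
    return math.pi * total / rays_per_pixel

def noise(rays_per_pixel, trials=2000):
    """Standard deviation of the estimate across many independent 'pixels'."""
    estimates = [estimate_irradiance(rays_per_pixel) for _ in range(trials)]
    mean = sum(estimates) / trials
    return math.sqrt(sum((e - mean) ** 2 for e in estimates) / trials)

for spp in (1, 4, 16, 64):
    print(f"{spp:3d} rays/pixel -> noise (std dev) ~ {noise(spp):.3f}")
# Quadrupling the ray count only halves the noise, which is why real-time
# DXR leans so heavily on denoisers instead of brute-force ray counts.
```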
 
On a different note, Vlado from Chaosgroup posted a screenshot from their latest RTX test for V-Ray. Anyone in the rendering industry will love the huge performance boosts we're currently seeing on these cards, and from a professional standpoint this is a good reason to upgrade to a 2080 Ti down the road if you can't afford their Quadros.

 
On a different note, Vlado from Chaosgroup posted a screenshot from their latest RTX test for V-Ray. Anyone in the rendering industry will love the huge performance boosts we're currently seeing on these cards, and from a professional standpoint this is a good reason to upgrade to a 2080 Ti down the road if you can't afford their Quadros.

Looks like this may be the case of “the more GPUs you buy the more you save” assuming this is your livelihood and has positive IRR.
 
Looks like this may be the case of “the more GPUs you buy the more you save” assuming this is your livelihood and has positive IRR.

Unfortunately, in my industry time is money, and nothing says deadline like a rendering project that still needs 45 more hours to complete. The reward is that you're faster and more efficient, and it gives you the breathing room to be more creative on your projects. Not to mention, the performance per dollar you're getting with two RTX 2080 Tis in NVLink compared to a $9,000 GV100 is a no-brainer.
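To put a rough number on that "time is money" argument, here is a back-of-the-envelope sketch; every figure in it (card prices, speedup, render hours, billing rate) is a made-up placeholder, not a Chaosgroup or Nvidia benchmark.

```python
# Back-of-the-envelope sketch of the "time is money" argument.
# All figures are hypothetical placeholders, not vendor benchmarks.
gpu_cost = 2 * 1200.0          # two hypothetical RTX 2080 Ti cards, USD
speedup = 2.0                  # assumed render-time reduction vs. the old setup
render_hours_per_week = 40.0   # hours the old setup spends rendering
artist_rate = 50.0             # assumed billable rate, USD/hour

hours_saved_per_week = render_hours_per_week * (1.0 - 1.0 / speedup)
value_per_week = hours_saved_per_week * artist_rate
weeks_to_break_even = gpu_cost / value_per_week

print(f"Hours saved per week : {hours_saved_per_week:.1f}")
print(f"Value of saved time  : ${value_per_week:.0f}/week")
print(f"Break-even           : {weeks_to_break_even:.1f} weeks")
```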
 
On a different note, Vlado from Chaosgroup posted a screenshot from their latest RTX test for V-Ray. Anyone in the rendering industry will love the huge performance boosts we're currently seeing on these cards, and from a professional standpoint this is a good reason to upgrade to a 2080 Ti down the road if you can't afford their Quadros.

Realtime raytracing has always been the holy grail. And I am 100% sure that once RTX and GPU based raytracing overall gains its footing and iterates from crawl to walk to run over the next 3-5 years, then even the naysayers will get selective amnesia and swear they were totally on board from the beginning.

I think 95% of the outrage about RTX is simply misplaced anger over 2080 Ti MSRP not being $699. Simple as that.
 
On a different note, Vlado from Chaosgroup posted a screenshot from their latest RTX test for V-Ray. Anyone in the rendering industry will love the huge performance boosts we're currently seeing on these cards, and from a professional standpoint this is a good reason to upgrade to a 2080 Ti down the road if you can't afford their Quadros.


This is the same thing I called very early on, and it's the main reason we would pay the artificial price increase for RTX cards. People argued like hell with me, but the value was in content creation. Some dude argued Nvidia wasted the die space on RTX; I called him a moron, because the tensor FP16 is well worth the cost as games are starting to use it more. Vega actually had a leg up here since those were more compute-oriented cards, and games like Far Cry and Wolfenstein II: The New Colossus were examples. Either way, it's great to see my predictions were on the nose.
 
Realtime raytracing has always been the holy grail. And I am 100% sure that once RTX and GPU based raytracing overall gains its footing and iterates from crawl to walk to run over the next 3-5 years, then even the naysayers will get selective amnesia and swear they were totally on board from the beginning.

I think 95% of the outrage about RTX is simply misplaced anger over 2080 Ti MSRP not being $699. Simple as that.

Yep, but see, they can't see past the tip of their nose. Nvidia isn't going to offer a high-compute card with workstation-level performance without overcharging gamers somewhat to close the gap on the profits they'd lose from people who need workstation performance at a more reasonable cost. What used to take weeks now happens in real time, or in hours if not a day; that's still massive when time is money.
 
I think 95% of the outrage about RTX is simply misplaced anger over 2080 Ti MSRP not being $699. Simple as that.

It's that it's not even close to the old $699 standard. The average gamer doesn't care how long it takes the high-compute crowd to render anything. It went from a top-10% of gamers card to a top-1% of gamers card in one generation. You can call it misplaced anger, but nobody short of the highest-end gamer even considers it anymore, simply because of the price.

I'm pro-business. You can charge whatever you think the market will bear, but it went from a "must buy" to a "no chance in hell" for me just from looking at the MSRP.
 
That was good info buried in a really sloppy-assed presentation...

It was more of a live stream than a ready-to-consume content edit. I don't mind seeing people actually work through their presentation and come to decisions on the spot. We're just so used to seeing concise snips of data that actually watching humans reason through their methods seems off now.
 
Let’s judge the first generational implementation of real time ray tracing against the decades of development with rasterized lighting techniques. Makes sense...

I dunno, part of me loves the idea that ray-traced global illumination is a great representation of the real world, but the other part of me realizes that lighting has to be faked to achieve a certain look/mood. The latter is something that TV and movies do all the time to great effect. I don't know if games will ever utilize 100% ray-traced effects, whether for GI or anything else, no matter what the performance cost.
 
Realtime raytracing has always been the holy grail. And I am 100% sure that once RTX and GPU based raytracing overall gains its footing and iterates from crawl to walk to run over the next 3-5 years, then even the naysayers will get selective amnesia and swear they were totally on board from the beginning.

I think 95% of the outrage about RTX is simply misplaced anger over 2080 Ti MSRP not being $699. Simple as that.

At HardOCP, AMD is the fan favorite, so anything from Nvidia tends to get stigmatized as more devious acts from the "evil empire."
 
Let’s judge the first generational implementation of real time ray tracing against the decades of development with rasterized lighting techniques. Makes sense...

It makes more sense than the RTX promo videos which show an all-or-nothing, RTX on vs off scenario rather than RTX vs the best non-RTX techniques. Probably because it would be hard to tell the difference, except for the frame rate tanking with RTX.

That said, in time I'm sure games will be all RT, all the time. TV and movies do feature stylised lighting, but it's generally all produced with fully RT renderers these days.
 
Realtime raytracing has always been the holy grail. And I am 100% sure that once RTX and GPU based raytracing overall gains its footing and iterates from crawl to walk to run over the next 3-5 years, then even the naysayers will get selective amnesia and swear they were totally on board from the beginning.

I think 95% of the outrage about RTX is simply misplaced anger over 2080 Ti MSRP not being $699. Simple as that.
To me, the 2080 Ti at $800-$900 would have been okay. Let me have a 2080 for $600-$700 ($700 being a nice Asus card) and I'm alright. As it is, a nice Asus 2080 on sale will be $830, and we don't even have to talk about the Tis.
Now you can call that misplaced, but that is the exact sentiment I've read a lot of, all over the place.
NV's earnings pretty much confirmed it.

Perhaps they will do better with their 7nm parts. Let's hope.
 
Gotta admit, I couldn't make it through 27 minutes of this, but I am very interested in the topic, especially after dropping ~$1,500 on a GPU with the tech.

That being said, I skimmed through some parts. I can also say that, much like every other audio/visual upgrade I've done over the last 30 years, you start to get used to it and then notice when it's not present. Not happy about the $$ or the performance cost, but I am on the train that RT is needed for the ongoing growth of realism in games. Here's a snap of one place I stopped at, and to me the differences are pretty dramatic.
 
Is this really unique to ray-traced global illumination vs. raster GI? The basis of this argument can be applied to a lot of other areas, because visual aesthetics end up being rather subjective to the individual.

Many people are very focused on the idea of "max settings" in games, but for many of those settings I wonder what the actual critique would be in a blind test against max-1, done in normal play (no zooming in or carefully combing through still shots) and without telling the user which setting is actually higher. Do they actually prefer HBAO over SSAO without knowing which is which, for instance, even if they can tell the difference?

Or even if we look at something like monitors: while you can objectively measure color accuracy, I wouldn't be surprised if the majority of people in a blind test prefer inaccurate but highly saturated color, the so-called "color pop."
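For reference, the objective measurement mentioned above usually boils down to a delta-E distance in a perceptual color space. Below is a minimal sketch of the simplest variant, CIE76; the Lab values are invented examples, not measurements of any real monitor.

```python
# Minimal sketch of an objective color-accuracy measure: CIE76 delta-E,
# the Euclidean distance between two colors in CIELAB.
# The Lab triples below are made-up examples, not real display data.
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: sqrt(dL^2 + da^2 + db^2)."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

reference = (60.0, 20.0, -15.0)   # target patch in L*a*b*
accurate  = (61.0, 21.0, -14.0)   # well-calibrated display
saturated = (60.0, 32.0, -24.0)   # "color pop" display, oversaturated a*/b*

print(f"accurate display  dE = {delta_e_cie76(reference, accurate):.1f}")
print(f"saturated display dE = {delta_e_cie76(reference, saturated):.1f}")
# A delta-E around 1 is barely perceptible; the oversaturated display measures
# far worse even if many viewers prefer its look in a blind comparison.
```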
 
Yes, we get it: the current gen takes a massive hit for little eye-candy return, but don't we have to start somewhere? If the Quake demo didn't convince people that the real-world quality improvements are worth pursuing, not much else will. GN is trying to stay relevant, and kudos to them for trying, but faking RT is like eating soya chunks. Yeah, you can eat it and make it taste okay, but it's no prime rib.
 
Yep. It baffles me how ignorant people are about the whole thing and raytracing in general.
 
I feel for the programmers. They can't abandon older rendering techniques, and now they have to learn to implement RTX as well. I can see why many developers will stick to traditional only for a while. Two rendering techniques (one being low single digit %) can ruin a budget fast.
 
I feel for the programmers. They can't abandon older rendering techniques, and now they have to learn to implement RTX as well. I can see why many developers will stick to traditional only for a while. Two rendering techniques (one being low single digit %) can ruin a budget fast.
Well, the programmers don't care; it's a business and financial decision that happens above them. It's the same reason publishers still design around DX11 instead of DX12: the power of the still-massive Win7 buying base. Same reason SLI is dead. Budgets are bigger than ever, and there's no financial tolerance for catering to a niche.

So Nvidia has to subsidize RTX to make it financially rational for the publisher to displace dev time.
 
Well, the programmers don't care; it's a business and financial decision that happens above them. It's the same reason publishers still design around DX11 instead of DX12: the power of the still-massive Win7 buying base. Same reason SLI is dead. Budgets are bigger than ever, and there's no financial tolerance for catering to a niche.

So Nvidia has to subsidize RTX to make it financially rational for the publisher to displace dev time.

What's the excuse for not using Vulkan? It works on Windows 7.
 
Well, the programmers don't care; it's a business and financial decision that happens above them. It's the same reason publishers still design around DX11 instead of DX12: the power of the still-massive Win7 buying base. Same reason SLI is dead. Budgets are bigger than ever, and there's no financial tolerance for catering to a niche.

So Nvidia has to subsidize RTX to make it financially rational for the publisher to displace dev time.
The programmers care when they are expected to bring results that take 48 hours to render in 10 hours time, and if you don't produce you get fired or next in line...to get fired. MAKE it happen or make your way out the door.
 
What's the excuse for not using Vulkan? It works on Windows 7.

Inertia. Video game developers' established toolsets, processes, and familiarity are with DX11, and I reckon the idea of moving to Vulkan, or split-developing on it such that both DX and VLK render paths would be available in a game, is viewed as too costly in terms of added development time. The exceptions to this seem to do DX11/DX12 split development rather than DX11/VLK.

It's particularly problematic the bigger a developer or publisher is, because any kind of deviation from "the way we've always done it" is measured in dollars and cents. Why games like DOOM 2016 ended up with Vulkan had, I think, more to do with personal passion among a few of the principal devs than with any forward-thinking corporate decision.

Believe me, there's nothing I'd like more than to see DX fade away and Vulkan become the de facto standard. I've also wished AMD would have leaned into Vulkan more, taken it back up as their cause, and created efficiencies in their hardware that would synergize well with it, since after all it has their Mantle DNA.
 
Realtime raytracing has always been the holy grail. And I am 100% sure that once RTX and GPU based raytracing overall gains its footing and iterates from crawl to walk to run over the next 3-5 years, then even the naysayers will get selective amnesia and swear they were totally on board from the beginning.

I think 95% of the outrage about RTX is simply misplaced anger over 2080 Ti MSRP not being $699. Simple as that.

I'm on board, but is it necessary, a reason, or a value-add for gamers today?
Nope, not at all.
Is it the future? Absolutely, but in-game ray tracing is not where the value of the RTX 2xxx cards is today; devs and content creators will eat it up.
 
Judging from what we have today, nVidia will have to simply double DXR performance to make it to an acceptable performance level in their next gen RT engine. Then, in the next architecture, it will have to double in speed again, so that you can process double the rays possible today and begin to improve rendering quality, which is an awful noisy mess when you examine it in movement. At this rate, DXR will be pretty good in the same kind of hybrid rasterized and RT games we see today in about two generations' time, but that puts fully RT games (ones that offer high visual fidelity and relatively life-like light physics, without noise artifacts and blurred movement) about 15 years away, assuming nVidia can at least double RT performance with each new generation, every two years or so. I'm not sure this is possible if we really are seeing the end of scaling in silicon manufacturing in the next 12 years. I guess it's fully possible if we start seeing chips with dies over 2000 mm², because that's kind of the only way to get more transistors after that point.
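Taking the doubling argument above at face value, the generational arithmetic looks roughly like this; the ray budgets are illustrative assumptions, not measured figures from any shipping title.

```python
# Sketch of the doubling argument above. The ray budgets are assumptions
# chosen for illustration, not measured figures from any shipping game.
current_rays_per_pixel = 1.0   # rough budget of today's hybrid RT titles (assumed)
target_rays_per_pixel = 256.0  # assumed budget for clean, fully ray-traced frames
years_per_generation = 2       # one new GPU architecture roughly every two years

rays = current_rays_per_pixel
generations = 0
while rays < target_rays_per_pixel:
    rays *= 2.0                # "at least double RT performance each generation"
    generations += 1

print(f"Doublings needed : {generations}")
print(f"Estimated years  : {generations * years_per_generation}")
# With these assumptions that is 8 doublings, or about 16 years, which lands
# in the same ballpark as the 15-year figure in the post above.
```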
 
Judging from what we have today, nVidia will have to simply double DXR performance to make it to an acceptable performance level in their next gen RT engine.

We're already seeing 'acceptable' performance, and while this is a matter of perspective, we're only about one generation away from 4k120 performance with RT.

In reality, the problem is that we keep boosting our expectations for performance. We scoff at 1080p60, and we're already seeing 1440p120 working pretty well.

You do make a good point about some of the 'shortcuts' being taken- rasterization, lower-resolution ray-tracing, noise reduction- and I do agree that reducing the reliance on them to the point that at least the artifacts they produce can be eliminated will take more than a single generation.

Really just wanted to point out that Nvidia has gotten pretty far with their first hardware implementation, to the point that at the high end it's actually effective for the end user.
 
Judging from what we have today, nVidia will have to simply double DXR performance to make it to an acceptable performance level in their next gen RT engine. Then, in the next architecture, it will have to double in speed again, so that you can process double the rays possible today and begin to improve rendering quality, which is an awful noisy mess when you examine it in movement. At this rate, DXR will be pretty good in the same kind of hybrid rasterized and RT games we see today in about two generations' time, but that puts fully RT games (ones that offer high visual fidelity and relatively life-like light physics, without noise artifacts and blurred movement) about 15 years away, assuming nVidia can at least double RT performance with each new generation, every two years or so. I'm not sure this is possible if we really are seeing the end of scaling in silicon manufacturing in the next 12 years. I guess it's fully possible if we start seeing chips with dies over 2000 mm², because that's kind of the only way to get more transistors after that point.

Look up quantum ray tracing. Yes, that's a thing and not science fiction, and the governments of both the United States and China are investing billions to get this technology mainstream. Or whoever gets it first just might classify it, hack the shit out of the other's mainframes, and steal all their technology and monays!

But don't take my word for it; if you can understand the quantum superposition technique they used to sample a scene, it'll give you an idea of the future of ray tracing.
 
As much as I love Steve and his work on hardware, which is awesome, I think they should have left game development alone... I don't know anything about it, and he obviously doesn't either (the other guy does, but it's one guy...).

Let's go back in time to when tessellation was released on PC and remember how much performance was lost then versus today... Oh yes, it was unplayable...
 
The programmers care when they are expected to bring results that take 48 hours to render in 10 hours time, and if you don't produce you get fired or next in line...to get fired. MAKE it happen or make your way out the door.

Or thank them and proactively leave for a job that has realistic expectations. That environment rarely brings out the best in people or gets their best work.

I work in ml/ai so the tensor cores in a consumer card are more interesting to me. I think ray tracing is the future and nvidia made the right call to introduce it now while amd is so far behind on the high end. They are essentially without competition so it's their best shot to start the move to ray tracing without affecting their market share. I don't think they anticipated the low sales, but it's affecting amd as well and they've properly priced the 1660ti and 2060 to keep moving them, so not a failure if not an outright success.
 
I think for programmers RT is easier... simplifies a lot of things, everything is more unified as opposed to trying to jerry-rig a bunch of different techniques together...... I could be wrong (I've only programmed simple quake-quality 3d engines) but I'd also say there wouldn't be as much enthusiasm for this if programmers weren't on board. Reading their book and blogs about this stuff they all seem very keen on it, like going from doing laundry across town to finally having your own washer/dryer in house sorta thing
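That "unified" point can be shown with a toy example. The sketch below (my own illustration, not engine or DXR API code) answers "is this point in shadow?" with a single occlusion ray against analytic spheres; reflections and GI reduce to the same trace-a-ray primitive, which is the simplification being described.

```python
# Toy illustration of the "unified" argument: with ray tracing, shadows,
# reflections, and GI all reduce to the same primitive ("trace a ray, see
# what it hits") instead of separate shadow-map / cube-map / probe systems.
# Analytic spheres only; nothing here is engine- or API-specific.
import math

def ray_hits_sphere(origin, direction, center, radius, max_t):
    """True if origin + t*direction (0 < t < max_t) intersects the sphere.
    direction is assumed normalized, so the quadratic's 'a' term is 1."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_t   # ignore self-hits and occluders beyond the light

def in_shadow(point, light_pos, occluders):
    """One occlusion ray toward the light answers the visibility question."""
    d = [light_pos[i] - point[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in d))
    direction = [x / length for x in d]
    return any(ray_hits_sphere(point, direction, c, r, length) for c, r in occluders)

light = (0.0, 10.0, 0.0)
occluders = [((0.0, 5.0, 0.0), 1.0)]                  # sphere between point and light
print(in_shadow((0.0, 0.0, 0.0), light, occluders))   # True: the sphere blocks the light
print(in_shadow((5.0, 0.0, 0.0), light, occluders))   # False: clear path to the light
```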
 
Yep. It baffles me how ignorant people are about the whole thing and raytracing in general.

It's not ignorance; they launched the technology too early. They should have either kept their card pricing in line with the prior gen (or at least kept the hike low enough that it doesn't look like every card is moving up a price bracket), or waited until the hardware and tech were strong enough to run at 1440p@120Hz or 4K@60Hz.

All of the RTX hate is of their own making.
 
We're already seeing 'acceptable' performance, and while this is a matter of perspective, we're only about one generation away from 4k120 performance with RT.

In reality, the problem is that we keep boosting our expectations for performance. We scoff at 1080p60, and we're already seeing 1440p120 working pretty well.

You do make a good point about some of the 'shortcuts' being taken- rasterization, lower-resolution ray-tracing, noise reduction- and I do agree that reducing the reliance on them to the point that at least the artifacts they produce can be eliminated will take more than a single generation.

Really just wanted to point out that Nvidia has gotten pretty far with their first hardware implementation, to the point that at the high end it's actually effective for the end user.

I'd like to know where an example of 1440p@120Hz with RT exists. A link would be nice.
 
I think for programmers RT is easier... simplifies a lot of things, everything is more unified as opposed to trying to jerry-rig a bunch of different techniques together...... I could be wrong (I've only programmed simple quake-quality 3d engines) but I'd also say there wouldn't be as much enthusiasm for this if programmers weren't on board. Reading their book and blogs about this stuff they all seem very keen on it, like going from doing laundry across town to finally having your own washer/dryer in house sorta thing

Yep. Developers over the years have had to invent a lot of different ways to somewhat correctly illuminate an object. They're more tricks than techniques. Pixar has a great paper on why they used ray tracing for the movie Cars, which can be found here: https://graphics.pixar.com/library/RayTracingCars/paper.pdf

For those of you who think ray tracing is just some lame feature Nvidia is trying to push down your throat, I'd highly recommend you read this paper.
 
The stupid thing is, everyone just needs to get over it.

It's new, it's slow, it hasn't gone mainstream, and the devs need to learn how to use it properly.

Eventually it will work fast and it will work correctly. Can't fault any one group right now; well, except maybe the consumers for bitching so loudly about being early adopters.
 
The way I see it, RTX is generation 1 technology. It's the same as movie 3D or VR were in their first generation. Both of those just passed gen 2 before dying. It's a niche market, a gimmick, and nowhere near ready for general use. I'd bet both go through at least two more generations before getting it right.
Ray tracing and GI are a little different, but I still think it will take four or five iterations before most games use them. I'm not saying there's no point in them, just that as they are now, and for the foreseeable future, they're a gimmick. We need them to evolve and improve to the point where they are usable. Without these early iterations of VR we won't get Sword Art Online-style full-immersion VR.
To be honest, it wouldn't surprise me if we eventually get an SLI/PhysX-style setup with one card for raster rendering and a separate GPU die dedicated to RT.
 
It was more of a live stream than a ready-to-consume content edit. I don't mind seeing people actually work through their presentation and come to decisions on the spot. We're just so used to seeing concise snips of data that actually watching humans reason through their methods seems off now.

Yeah, I gotta be honest... I'm not the biggest fan of Steve's presentations, so this livestream type of scenario really doesn't work for me. I didn't really want the "over the shoulder in real time" thing, but I'll give it another shot this weekend.
 
I imagine devs will be using it for their tools as well. Imagine not having to wait hours for all your lighting to bake and instead getting a good idea of what it'll look like in seconds.
 
Yeah, I gotta be honest... I'm not the biggest fan of Steve's presentations, so this livestream type of scenario really doesn't work for me. I didn't really want the "over the shoulder in real time" thing, but I'll give it another shot this weekend.

Dig it or not, he's one of a select few hardware journalists still creating original, knowledgeable content. His site isn't littered with "stories" that are essentially links to other people's hard work, and he has engineers and programmers delving into hardware in a way that is, for some reason, unique in this field of journalism. For droves of people a benchmark is all they want or need, but for many of us who have made this a lifestyle since the 90s, he's really taking up the torch that Kyle and a few others still carry. Power to him.
 
It's that it's not even close to the old $699 standard. The average gamer doesn't care how long it takes the high-compute crowd to render anything. It went from a top-10% of gamers card to a top-1% of gamers card in one generation. You can call it misplaced anger, but nobody short of the highest-end gamer even considers it anymore, simply because of the price.

I'm pro-business. You can charge whatever you think the market will bear, but it went from a "must buy" to a "no chance in hell" for me just from looking at the MSRP.

The average gamer spends less than $300 on a video card, so anything above that is considered enthusiast, at whatever level you can afford. I understand what you are saying, but realize that in the '80s a Tandy 1000 would sell for $8-12k, so if you have been around the block with PCs in general, you'd realize $1,200 for a GPU is actually cheap in the greater scheme of things. Yes, you didn't ask for workstation-level compute, but then there is the 1660 Ti for you; you are not entitled to dictate a company's product lines, it's their product, not yours.

Or you could go buy a Vega 7 for $800 and let AMD know all you care about is meaninglessly high frame rates at a level I doubt most human eyes can handle. 8K monitors aren't a thing yet, and on top of that, when they are, you wouldn't buy one anyway since it wouldn't cater to your budget.

At least with Nvidia you have a choice: RT at 1080p or 1440p, or just go 4K. Yes, it's a high price, but at the enthusiast level people tend to pay for options and higher performance.
 