Ray Tracing Deep Dive: Is it worth it?

vjhawk

Limp Gawd
Joined
Sep 2, 2016
Messages
451
Here we have a video examining how RTX ray tracing is implemented in Battlefield V.

The most significant drawback, besides the massive performance hit, is the noise.

In particular, check out the scene where they examine the ray-traced waves: they look noisy as hell instead of smooth. In fact I'd say I preferred the look of the smoother non-ray-traced waves because of that noise.
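That noise is inherent to how real-time ray tracing works: with only a ray or two per pixel, each pixel is a rough Monte Carlo estimate, and a denoiser has to clean it up. Here is a toy Python sketch (purely illustrative numbers, nothing from the actual BFV renderer) of why low sample counts look grainy:

```python
import random

# Toy illustration: estimate a pixel's brightness by averaging random ray
# samples. Few samples per pixel -> wildly varying estimates, which is the
# grain you see when real-time ray tracing uses 1-2 rays per pixel.
random.seed(0)

def render_pixel(samples):
    # Each "ray" returns a random radiance in [0, 1] with mean 0.5.
    return sum(random.random() for _ in range(samples)) / samples

def pixel_spread(samples, pixels=1000):
    # Spread (max - min) of estimates across many pixels = visible noise.
    values = [render_pixel(samples) for _ in range(pixels)]
    return max(values) - min(values)

noisy = pixel_spread(1)    # 1 ray per pixel, like real-time ray tracing
clean = pixel_spread(256)  # 256 rays per pixel, like offline rendering
print(f"1 spp spread: {noisy:.2f}, 256 spp spread: {clean:.2f}")
```

The spread shrinks roughly with the square root of the sample count, which is why real-time budgets lean so heavily on denoising instead of more rays.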

Having now seen actual RTX ray tracing in an AAA game like Battlefield V, do you feel like it's worth it?
 
Let’s just say I generally have a high end or cutting edge video card. I’m not sold on rtx. I’m also not sold on the current pricing scheme.

I wasn’t sold on hair works either. Not physx either.

Once bit...
 
This is only one game. Way too early to make a call like that. Ever since I read about it years ago, I have been waiting for it to come to gaming PCs. Whilst this launch has been far from great, to say the least, I believe it's a step in a good direction for gaming overall. Game devs are very keen on it. From what I have read there is less work involved in it compared to rasterized graphics. We just need to give it time.
 
Ray Tracing won't go anywhere unless:

1. The cost is dramatically reduced AND
2. AMD has it as well.

As long as it's locked to one manufacturer's line, it will die like all the other things: Betamax, MiniDisc, HairWorks, etc.

Either everyone has it, or no one will really make things for it.
 
The problem is not whether it is worth it; the problem is that even when you run very optimized code, the frame rate takes such a dive for so-so graphical effects.
DICE tends to be very good at optimizing for specific hardware features, and if this is the best they could do, then when other developers implement more ray tracing in their games they will find it very hard to do well; they'll fake it and hope no one notices.

Everyone knew that ray tracing on first-generation hardware was never going to work well; the same happened with transform and lighting.
But who knows where it is going?
One thing that is pretty sad is that you will hear it is the future, and that might be true, but then you read this:
https://fudzilla.com/news/graphics/47636-turing-rt-cores-are-made-for-ai

According to Nvidia, the Turing Tensor Core has 114 TFlops FP16 performance, 228 TOPS of INT 8 and 455 TOPS INT4. It's recognized that INT8 is something that is heavily used in AI today and INT4 and even INT2 research is happening as we speak. Nvidia preaches that Turing Tensor cores are being leveraged for deep learning inference for gaming, which is part of the truth. More importantly, these tensor cores will do really well in AI and so will the Ray Traced cores.
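For what it's worth, the INT8 inference the quote mentions boils down to aggressive quantization: map floating-point weights onto small integers and accept a tiny round-trip error in exchange for much faster integer math. A toy sketch (the symmetric scheme and names here are illustrative, not Nvidia's actual implementation):

```python
# Toy sketch of symmetric INT8 quantization: floats in [-max, max] are
# mapped onto integers in [-127, 127] via a single scale factor, which is
# why INT8 (and INT4) hardware can replace FP16/FP32 math for inference.

def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.82, -0.41, 0.05, -0.97, 0.33]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
error = max(abs(a - b) for a, b in zip(weights, restored))
print(q)  # small integers instead of floats
print(f"max round-trip error: {error:.4f}")
```

The error stays below half the scale factor, which is usually negligible for neural-network inference; that is the trade the tensor-core TOPS numbers are built on.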

Well, it seems that someone is covering their behind, and that "the way you are meant to be played" ray tracing by Nvidia might become a thing.

The article ends on a cliffhanger tho :) .
Despite Nvidia's pioneering efforts in Ray Tracing in real time gaming we are hearing that the big console guys and other players are thinking about a different Ray Tracing approach. More soon.

I always find it a bit hard to swallow the very, very mixed message Fudzilla sends.
 
ray-tracing is worth it in the long run...but not in its current form...will take 2-3 generations for it to be used to its potential...the people that bought 2080Ti cards for ray-tracing are people who don't know how technology works...anyone who thought that 1st gen ray-tracing cards would deliver something groundbreaking are insane
 
I think it will be pretty cool in a few years when the cards can really keep up with fast moving games.

I would like to see some puzzle solving games using RTX. It should look really good and you would be able to look around better without bullets grazing your booty.
 
RTX won't be really useful for 2-3 GPU generations at 1080p for high FPS game-play. 4K is 4-5 GPU generations away at high FPS game-play. I am basically writing it off for now.
 
Ray tracing will -always- be a unicorn. When it gets to the point where we have cards that can ray trace at 1080p 60FPS, top-end cards will be running normal graphics at 4K 100FPS. When we get ray tracing working at 4K 100FPS, top-end cards will be running 8K at 200FPS. The cycle will continue, and real-time ray tracing will ALWAYS be a unicorn.
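The moving-target argument is easy to put numbers on: the required rays per second scale with resolution × framerate × rays per pixel. A quick back-of-envelope in Python (the 2 rays/pixel figure is an assumption for illustration, not a measured figure from any game):

```python
# Back-of-envelope math behind the "unicorn" argument: the ray budget
# grows multiplicatively with resolution and framerate, so each new
# resolution target multiplies the work all over again.

def rays_per_second(width, height, fps, rays_per_pixel):
    return width * height * fps * rays_per_pixel

# Assume a modest 2 rays per pixel (e.g. primary ray + one shadow ray).
rt_1080p60 = rays_per_second(1920, 1080, 60, 2)
rt_4k100 = rays_per_second(3840, 2160, 100, 2)

print(f"1080p60 needs ~{rt_1080p60 / 1e9:.2f} Grays/s")
print(f"4K100 needs ~{rt_4k100 / 1e9:.2f} Grays/s")
print(f"4K100 is {rt_4k100 / rt_1080p60:.1f}x harder")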
 
It's a scam. I'm not a hater. I bought 2 2080 Ti cards. For rasterized games, the performance is great and I'm happy there.

But I spent the last 2 days testing DXR in BFV, and it's a bust. I can barely get 60 fps with Ultra settings and DXR on Low. In intense scenes, it drops into the 40s or 30s. This is at 1080p ultrawide, not a high resolution.

With DXR off, and some tweaks to the settings (mostly Ultra, with some High), I can get in the 135 - 145 fps range, much more playable. Can't see why I would want to drop to 1/3rd of the performance for slightly nicer puddles. Fail.
 

Give it a bit more time. If you're playing single player it will probably be acceptable. However, for a multi-player shooter it's too much of a performance hog. Maybe DICE can squeeze out some more performance as they play around with it.
 

Yeah, I can see it benefiting something like Civ over a multiplayer FPS (in its current form), although, to be quite frank... a talented art team can "fake" it with pretty convincing results, especially with things like physically based rendering/shaders/materials in modern game engines. (Although one can argue that limiting rays to a few bounces isn't really an accurate representation of light anyway, so why even focus on RTRT?)

Real-time ray tracing is absolutely the crown jewel, but at this point it's still many, many years away IMO. The efficiency and performance of current tech is just so much better. The BFV demo is impressive at a technical level, but not very impressive at an actual graphical-fidelity level (again, IMO).

With that said, I can see this tech being very beneficial to computer animation studios and the like. Being able to get a semi-accurate preview of your work, without waiting for a complete scene render, would be a huge time saver.

:edited for stuff:
 
I think the $600, $800, and $1200 price points are clearly out of line compared to the $379, $549, and $699 of the prior generation. So if you're going to sell me a bill of goods that this new technology is worth the upcharge, then it had better work. If RTX features were just a bonus at the previous price points, no one would care that performance is cut in half when you enable them; it's just a "bonus." But when you upcharge $500 on the top end, you had better put out a compelling product, and quite frankly they haven't.
 
As many have stated, ray tracing is likely the future, but as a first gen it's not so great. A common problem with many things. I remember T&L and per-pixel lighting (DX8) taking a while to really be worthwhile.

One thing concerns me though: ray tracing (at the moment) is a completely different way of rendering. We're all going to want to play our old games down the road that use 'traditional' rendering. Is this going to be a problem in the future with respect to core design complexity? Are we going to be holding back ray tracing performance potential? Hope not.
 

Way back when, I used Imagine software on the Amiga to ray trace stuff, and all of the demo bits were amazing until you actually needed to render; then scanline rendering was better, because it finished a lot sooner. What is needed is hardware that solves this problem.
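For anyone who never played with one of those renderers: the core per-pixel operation in a ray tracer is an intersection test against the scene's objects, repeated for every pixel and every bounce, which is exactly the cost scanline renderers avoid. A minimal ray-sphere intersection sketch in Python (illustrative only, not Imagine's actual code):

```python
import math

# Minimal ray-sphere intersection test: solve the quadratic
# |origin + t*direction - center|^2 = radius^2 for the nearest t >= 0.
# A ray tracer does this per pixel against the scene; a scanline renderer
# instead walks triangles once per frame, which is why it finished sooner.

def ray_sphere_hit(origin, direction, center, radius):
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection distance
    return t if t >= 0 else None

# A ray shot down the z-axis at a unit sphere centered 5 units away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
miss = ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0)
print(hit)   # 4.0 (the near surface sits at z = 4)
print(miss)  # None
```

Multiply that test by every object, pixel, and bounce and the appeal of dedicated intersection hardware like the RT cores becomes obvious.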

Nvidia created a problem for itself when it started this (the lofty ray tracing goal): suddenly everyone needed ray tracing. Like I said, on the Amiga it looked really cool, and you can sell people on it being better, but what Nvidia forgot is that the problem with ray tracing is in hardware, not software.

Since all of this is on Nvidia, they should fix it. And that part I know will likely not happen, for the simple reason that the hardware real estate (the amount of mm² of the die) is already so large that making it larger and eating more shader performance in favour of Turing AI/RTX cores makes no sense. That leaves Nvidia making better use of their hardware: a smarter way to implement the current shaders, through either a redesign or other options.

And that will take time; if it were easy, it would have been done already. So don't expect anything soon: it will take 2 or 3 generations for this to be fixed.

By the way, that does not mean Nvidia won't start making 800+ mm² dies for the next generation.
 
When video cards start to do 144fps w/RT on I'll start to care about it, so not worth it to me right now.
 
Ray tracing is not worth it. The only reason I own an RTX 2080 is because I upgraded my whole system, adding a little money, so it was worth it from my perspective.
I am more hoping for DLSS support in games than ray tracing.
As for ray tracing, my guess is that it's more an optimisation issue than the hardware itself, as devs had very little time to work with it and implement it in their games.
But I might be wrong about that last point.
 
hell no it's not worth it today...in 2-3 years hopefully yes but not now with the 2070/2080 cards
 
This is coming from someone who works professionally in the 3d modeling and rendering industry, so to be brief I'll summarize it like this:

• Short term: Not worth it at all because the prices are over-inflated, the technology is in its infancy, and the only real beneficial improvements will be with professional users for rendering.

• Long term: Raytracing is the holy grail of computer graphics, and when the tech matures it will be like watching real cgi movies.
 
RT in the 20xx series is really a conundrum. On the one hand it's cool that it has new technology; on the other hand it comes at such a price premium. Now, if RT were established and every game had it and it was optimized, then the premium might be worth it. BUT since you're forced to pay a premium for 1st-gen hardware with little software support on top of the extra $$$ you just paid, you're basically paying extra to beta test for them. In the few years it will take for RT to gain more ubiquitous support among developers, newer-generation hardware will be out (from both red and green) which will likely be cheaper and perform better as well.

Right now it's basically an early-adopter tax all around - as has been said, a la PhysX / HairWorks.
 
The BFV patch is supposed to increase RT performance by like 50%. Still a hit, but should be playable.

I haven't tried it yet, let me see if I can take a look tonight.
 

They removed some of the rays in ray tracing ;) on certain objects. It should be fine now from what I saw on youtube...
 
ray-tracing is worth it in the long run...but not in its current form...will take 2-3 generations for it to be used to its potential...the people that bought 2080Ti cards for ray-tracing are people who don't know how technology works...anyone who thought that 1st gen ray-tracing cards would deliver something groundbreaking are insane

His name is Factum.. but he absolutely adored PhysX as well :)
 
I imagine that within 2 years ray tracing will be de facto on all cards.

It's not only the hardware that needs to be worked on, but the drivers and game coding as well.

I'm betting by the end of this summer we will see at least one game company have a solid ray tracing implementation in games.

Wouldn't it be great to see a separate ray tracing add-on card instead of it being part of the card? Like the old Ageia PhysX cards.
 

I'm betting that no one will have a "solid" ray-tracing implementation by summer..

and it would quite plainly suck if we got "add-on" cards - it would fragment the userbase even more..

The only way this gets supported is if Nvidia pays the developers.. as no one in their right mind would develop for a measly percentage of the market.. developers want to sell copies.. and with so small a userbase.. why bother.. unless NV will hand over some sweet cash??

So the best scenario here is actually the same direction as PhysX, which never improved games much; sure, you had some extra particle effects.. but nothing game-changing.

And in most titles.. the performance hit sucked..
 
Ray tracing has been the dream of card makers and game engine devs for years now. There have been articles in Popular Science, Forbes, several military magazines, online and TV shows for 20 years about it not being feasible but when it does drop into the public realm it'll explode.

The military helps fund certain developments in technology. Remember America's Army? Kind of a bad example, but they helped in the development of AI, which we now see commonplace. Flight sims will be huge with ray tracing for the training of pilots, not to mention the shooting simulation programs currently being used.

Think of it as a new console. The dev kits and programs have been in the hands of big companies for a year or more. The first iteration or 2 are crap, not the best to showcase what can be done. The second generation of games is when it starts to really shine. Driver tweaks push it along as well. Just watching current implementation shows that they've brute forced it into BF instead of using finesse.

I stand by my comment. By summer, start or end, Ray tracing will be a much more palatable option for gamers.
 
I don't see any games supporting full ray tracing at this stage, only ray tracing assisting with something. BF5's reflections are an example. Not sure if other games will use aspects of ray tracing other than reflections; Shadow of the Tomb Raider looked like it was using ray bounces for color bleed, but I'm not sure. In any case, RTX cards are only partially ray tracing and mostly rasterizing.
 
The only way this gets supported is if Nvidia pays the developers.. as no one in their right mind would develop for a measly percentage of the market.. developers want to sell copies.. and with so small a userbase.. why bother.. unless NV will hand over some sweet cash??
They need to "bother" in order to learn how to program such features.
And given the nature of DXR, which is fully programmable, the possible effects and the optimizations to them that will be used have not even been invented yet.
The developers who experiment with this tech gain experience from playing with it, and will be an asset to their companies along with what they create and add to game engines.
 
Maybe next-gen consoles will be able to do ray tracing.
maybe... or maybe not....

let me consult my ray-traced crystal ball...
 
Ray tracing will -always- be a unicorn. When it gets to the point where we have cards that can ray trace at 1080p 60FPS, top-end cards will be running normal graphics at 4K 100FPS. When we get ray tracing working at 4K 100FPS, top-end cards will be running 8K at 200FPS. The cycle will continue, and real-time ray tracing will ALWAYS be a unicorn.
This thinking about the ray-tracing addition to the graphics pipeline is grounded in wrong assumptions.

And terrible marketing is to blame here. NV could have done this so much better, and it is really sad they completely screwed it up.
RTX tech is supposed to accelerate things, not slow them down. NV failed to show just this.
They neither released any demos which show RTX tech and can run on non-RTX cards, nor showed any ray-tracing effects that are possible on non-RTX cards and get a boost on RTX cards, nor prepared plugins for productivity software to take advantage of the RT cores.

Hell, DLSS tech need not be game-specific, so they could just make a universal upscaler for all games. Sure, it would not work as well as game-specific implementations, but an AI upscaler is much better than any other type of upscaler, and given that such tech needs to be trained in a universal way anyway, it could just be done. It would increase the value of their cards to the point where people would think twice about recommending non-RTX cards.
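For reference, the "any other type of upscaler" being compared against is basically nearest-neighbour or (bi)linear interpolation. A minimal 1D sketch of those two baselines (real upscalers work in 2D; this just shows the kind of simple filter an AI upscaler competes with):

```python
# Two classic upscaling baselines on a 1D row of pixel values:
# nearest-neighbour (repeat each pixel, blocky) and linear interpolation
# (blend between neighbours, smoother). An AI upscaler instead predicts
# plausible detail that neither of these can reconstruct.

def upscale_nearest(row, factor=2):
    # Repeat each pixel: cheap but blocky.
    return [p for p in row for _ in range(factor)]

def upscale_linear(row, factor=2):
    # Insert interpolated values between neighbours: smoother edges.
    out = []
    for a, b in zip(row, row[1:]):
        for i in range(factor):
            out.append(a + (b - a) * i / factor)
    out.extend([row[-1]] * factor)
    return out

row = [0, 100, 100, 0]
print(upscale_nearest(row))  # [0, 0, 100, 100, 100, 100, 0, 0]
print(upscale_linear(row))
```

Both baselines only reshuffle the pixels they are given, which is why a trained upscaler has room to do visibly better.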
 

Well, marketing does not have to implement ray tracing into game engines ;).
Until the end of next year we can see what Nvidia is doing; so far ray tracing is not going anywhere fast :)
 
Ray tracing has been the dream of card makers and game engine devs for years now. There have been articles in Popular Science, Forbes, several military magazines, online and TV shows for 20 years about it not being feasible but when it does drop into the public realm it'll explode.

The military helps fund certain developments in technology. Remember America's Army? Kind of a bad example, but they helped in the development of AI, which we now see commonplace. Flight sims will be huge with ray tracing for the training of pilots, not to mention the shooting simulation programs currently being used.

Think of it as a new console. The dev kits and programs have been in the hands of big companies for a year or more. The first iteration or 2 are crap, not the best to showcase what can be done. The second generation of games is when it starts to really shine. Driver tweaks push it along as well. Just watching current implementation shows that they've brute forced it into BF instead of using finesse.

I stand by my comment. By summer, start or end, Ray tracing will be a much more palatable option for gamers.

Good argument - there might be something there.. I'll believe it when I see it..
 
This might be weird, but some folks have Battlefield V ray tracing running on a Titan V
https://translate.google.com/translate?hl=en&sl=auto&tl=en&u=https://www.forum-3dcenter.org/vbulletin/showthread.php?t=590648&page=159

Getting decent framerates as well...
You mean... they even have it with a hack on AMD Vega cards. But the big problem is that the RTX API renders the complementary, DXR-limited scene to what you see, so it won't render well on any non-Nvidia card. I wasn't expecting such huge support from developers, but here it is, and there is no way back, or any side route the way Mantle was for AMD. AMD sales are poor, and only on the lower-end cards, from people playing cheap games, mostly freebies or oldies. So AMD is not in a position to come close to real ray tracing the way RTX cards do. AMD has lost all interest from developers. Future games will become more and more only partially compatible with non-RTX hardware, rendering poorly on other kinds of GPU. AMD cards like Navi and Vega II will be good at 4K in non-ray-tracing games... but who cares.
My bet is that AMD will be back when Intel comes to the GPU market, trailing Intel standards. AMD and Intel will come together with the same standards, hoping to take share from Nvidia's near-100% gaming market. So it will happen in 2020 or 2021.
 
I don't think AMD will do anything with ray tracing until they figure out a way where it costs no die space, or minimal die space.
 

I hope you are wrong there. We have already seen how powerful the new series from Nvidia is, and the huge performance cost of enabling ray tracing. If what you say is true, it could literally take them 10 years or more before that comes to pass. I'm very hyped about it and want AMD to get on board.
 