
How many of you actually use ray tracing in the games you play?

Do you use Ray Tracing in the games you play?

  • I only play games with Ray Tracing options.
    Votes: 46 (17.9%)
  • I will sometimes enable Ray Tracing in games.
    Votes: 92 (35.8%)
  • I will enable Ray Tracing momentarily to check it out, then turn it off for better resolution/performance.
    Votes: 50 (19.5%)
  • I prefer not to use Ray Tracing.
    Votes: 47 (18.3%)
  • My GPU doesn’t even support it. 🤷🏻‍♂️
    Votes: 22 (8.6%)

  Total voters: 257
I always use ray tracing, except maybe in Fortnite, where Lumen's global illumination is enough. RT improves reflections, and as nice as that can look, it's less useful in this type of game than slightly more frames per second, even if the performance impact isn't big here.
 
I always play games how they were designed/meant to be played... at max settings (at least as far as my PC build lets me). Even in multiplayer games like Battlefield, where most people turn down the settings to remove foliage, etc., to get that extra advantage. If the game was designed to be graphical and has the settings, then I turn them up as high as I can!
 
I always play games how they were designed/meant to be played... at max settings (at least as far as my PC build lets me). Even in multiplayer games like Battlefield, where most people turn down the settings to remove foliage, etc., to get that extra advantage. If the game was designed to be graphical and has the settings, then I turn them up as high as I can!
Ditto to this. Go big or go home..lol
 
I always turn it on. If I have to use DLSS or lower my resolution to get acceptable performance then I will.
 
I have a 3080 and a 4K LG TV as a monitor, and I have to say that even if my card could handle it, I wouldn't really be bothered with it. I find the whole thing "meh" and barely noticeable compared to traditional lighting methods, and I would much prefer the higher framerate without it.
Same goes for HDR.
Maybe I haven't seen a proper implementation of it, but so far it seems very overhyped to me.
 
Only when combined with DLSS, because the native-resolution performance hit is too steep.
 
Any narrative game that doesn't need competitive twitch reactions, I play with it on. I find it adds an extra level of polish to most scenes. If the implementation is broken, I'll shut it off as well.
 
If it's a single player game I use it. If it's multiplayer it depends on how hard it hits performance.

If a single player game doesn't have ray tracing it's a pretty big turn off at this point. I like good graphics.
 
Why does the game (Assassin's Creed Shadows) ship with two GI solutions rather than being RT-only, given the likes of Indiana Jones and Avatar: Frontiers of Pandora?

The simple reason is that we believe forcing RT on players today, notably on PC but also in performance modes on consoles, compels us to make qualitative sacrifices that we were simply not willing to make. We're proud of the baked GI system that we've continuously improved since AC Unity - with time of day in AC Syndicate, sparse GI in AC Origins, multi-state GI in AC Valhalla and seasons in AC Shadows.

It's a system that has a cheap run-time cost but produces great results, at the expense of build complexity. This allows us to use our limited GPU budget in 60fps scenarios on console on other features, like procedurally simulated trees and vegetation. We know performance modes are important to players and it will be a popular mode.

On the other hand, RTGI and RT reflections have undeniable advantages, and it was natural for us to develop this for Shadows and future titles. We believe that the game offers a real dilemma in choosing between quality and performance modes. It would have been easier for us to develop a single GI solution, but we want to maximise the reach of the game, including even Steam Deck and portable PC hardware.

https://www.eurogamer.net/digitalfo...-tech-qa-rtgi-shader-compilation-taa-and-more
 
^ +1 to ubischeiss on that. Forcing settings in PC games is stupid; the whole point of graphical settings in the first place is to accommodate different hardware configs. Imagine if devs forced very high/ultra textures: people with less than 16 GB cards would be screwed.
 
^ +1 to ubischeiss on that. Forcing settings in PC games is stupid; the whole point of graphical settings in the first place is to accommodate different hardware configs. Imagine if devs forced very high/ultra textures: people with less than 16 GB cards would be screwed.
On the flip side, some PC games have way, way too many graphical settings and options. I don't know what half of them are most of the time. I miss not needing a computer engineering degree to figure out the six types of artificial frames I can use, the 13 different post-processing effects, and temporal this and that...

Some games have whole guides online that will show you which settings do what and the fps hit you will get with any one setting.

I just want Low to Very High options/settings or whatever.

I want UT2K4 graphical settings. Once you max everything out in the settings, you hear "HOLY SHIT..." in the menus.
 
Short answer is yes, if I can get the features to run well I turn it on and leave it on.

Long answer is, with video card pricing and availability the way it is, we ARE going to see a near-complete stagnation of RT feature adoption for a year or two. If you have to buy a video card at $1,000 and up to really be able to turn the high-quality RT options on for a meaningful experience, no notable number of games will be released that require high-end ray-tracing hardware capabilities for their "as-intended" experience.

We're seeing massive demand from gamers for cards from $250 to $700. At the moment, and for the foreseeable future, you can't buy a card with really usable high-end RT performance for that money.

Even 1 million 5080s and 5090s sold over the next 12 months won't put a dent in the numbers required for real heavy duty RT adoption. We aren't going to see a whole bunch of Indiana Jones and Cyberpunk RT Overdrive "required" games over the next 1-2 years. You'll be able to count the games that really push the full suite of RT features on one hand.

I wish I could predict faster adoption than this, but Nvidia has killed the current high-end market with low availability and astronomical pricing. Nothing below a 5080 (and even it is questionable) can handle anything close to full path tracing with all the RT features on, even with all of Nvidia's latest tricks to accelerate it. So we're still AT LEAST a gen or two away from something we could call a "real" RT conversion/revolution.

I suppose all that is to say I don't consider anything on the market to really be "RT" yet. I'll concede that Indiana Jones at high RT settings and Cyberpunk RT Overdrive arguably use enough RT features for me to consider them RT-centric.
 
I do and am enjoying the hell out of it now that I can do it with reasonable fps at the $700 price point.
 
Devs are doing a good job with hybrid models in many recent releases. I think this is how it will go for the foreseeable future, realistically.
 
I'll try it once my new card arrives, assuming any games in my library support it.
 
All of them. It makes a huge difference and once you're used to it, playing without it makes the game look lackluster.
 
All of them. It makes a huge difference and once you're used to it, playing without it makes the game look lackluster.
I would say the same thing about HDR. Another great example is Ratchet & Clank: Rift Apart. Aside from the cool reflection here and there, I hardly noticed the difference with everything turned on vs. off. HDR, on the other hand, is the real star of the show in that game; it's transformative. Without HDR, everything seems lackluster to me.
 
Turned RT on in Control with the 9070 XT, surprised how well it ran. Then noticed the ghosting behind the NPCs, the slow updates to lighting, the slight lag in reflections. Yeah, RT is just peachy. Just think: we went from baked lighting with thousands of ray-traced samples per pixel to 1-2 samples per pixel, sorta real time. For Control, with its dynamically changing environments and destruction, RT fits well, minus the issues.
 
I have a 3080 and a 4K LG TV as a monitor, and I have to say that even if my card could handle it, I wouldn't really be bothered with it. I find the whole thing "meh" and barely noticeable compared to traditional lighting methods, and I would much prefer the higher framerate without it.
Same goes for HDR.
Maybe I haven't seen a proper implementation of it, but so far it seems very overhyped to me.
I felt the same about HDR until I upgraded to an OLED and Windows 11. Windows does a terrible job of managing HDR but 11 noticeably improved that. Once I saw it working correctly on a massive OLED it was truly impressive and changed my whole view on it for games.
 
I felt the same about HDR until I upgraded to an OLED and Windows 11. Windows does a terrible job of managing HDR but 11 noticeably improved that. Once I saw it working correctly on a massive OLED it was truly impressive and changed my whole view on it for games.
The HDR implementation in games also makes a huge difference. Some games really make an effort and look amazing, while others have mediocre implementations where HDR is better left off. It feels like game developers generally make a much better effort with HDR on consoles than on PC.
 
The basic idea behind RT is really nice, but the execution, and the overuse in some cases, is bad.
It is transformative in the few games that do it right, but a lot of games only add it as a marketing gimmick. CP2077, Control, Metro Exodus Enhanced, Alan Wake 2 and Assassin's Creed Shadows are examples where RT makes a huge difference, to the point where turning RT off makes the game look like medium graphics. Unfortunately, you need a lot of GPU power to run those: basically a 3080 with upscaling is around the minimum for the first three, and you will need a 9070 XT or 4070 Ti Super for the latter two with upscaling at QHD (1440p) resolution.

The DLSS transformer (TF) model at Quality with Ray Reconstruction is very close to native without any DLSS/DLAA, and in some places it is superior, but the DLSS CNN model and FSR4 lose out a bit on fine detail, motion stability, etc. Having to run the CNN model or FSR makes it a tradeoff between much more realistic, much more dynamic lighting and more fine detail with better image stability. Comparing max non-RT to max RT on an Nvidia GPU with DLAA and Ray Reconstruction is a big win for max RT, IMO, when you ignore the framerate.
 
If it offers global illumination and brighter maps, I'll use it, as long as the stutters aren't bad at 1440p. Atomic Heart uses it best.
 
I just got a 5080 and it seems like ray tracing is still unusable? Admittedly, I haven't tried lots of games, but I expected Portal RTX not to run so terribly...
 
I just got a 5080 and it seems like ray tracing is still unusable? Admittedly, I haven't tried lots of games, but I expected Portal RTX not to run so terribly...
Yeah, admittedly the RT performance in R&C:RA is all over the place with my 9070 XT. Most maps run fine locked to 60, but there are a few where the performance just goes to shit. It kinda feels like a "the honeymoon is over" type of thing, where it starts off great and then it's just like, wtf. I think this game could've used another performance patch or two, tbh. I also get crashes while transitioning into cutscenes every now and then; it's kinda killing the experience for me.
 