RTX is a joke

Raster vs. ray tracing is just a choice of tools to meet an end goal. But ray tracing doesn't necessarily have to mean "realism" in any way, shape, or form either. That all comes down to artistic intent.
I once saw someone make a similar diatribe about not wanting PhysX cards because they wanted there to be games that could have "cartoon physics," not understanding that the PhysX card just handles the calculations. You could make any type of physics interaction you want: real, unreal, or otherwise.

RT in that sense is the same. There isn't anything dictating how "hyper-real" anything has to be. It's more than feasible to make a 2D, "flat-lit" RT game if you wanted to. It's just that no one has made that application yet, because currently it's not easier to do and it carries unnecessary overhead that plain 2D vector rendering doesn't. (Similarly, it's possible to make an anime that looks "2D" out of 3D using cel shading.)

Will people still want raster games for the same reason they want pixel art? Sure. But that isn't necessarily tied to a renderer. And to go back to your analogy at the beginning, I guarantee the average American has watched thousands of hours of YouTube and probably can't name 5 Renaissance painters. Death of raster? Maybe not. But raster will eventually become the new "pixel art" that is used minimally for stylistic reasons and not for any other reason.

When you're talking about two entirely computer-generated mediums that are totally based around technology, the most high-end version will always be the standard that is strived for. If you want to see that track record: 2D games still exist, but we're a long way from Atari-level graphics and the things found in the 8-bit era, and even new "16-bit" games are using far more advanced tech than ever existed in the '80s and '90s, and they're at an extreme minimum. Said another way: there are zero AAA pixel art games. Raster will eventually meet the same fate of being niche.

You are correct, my argument is surface level. I guess my point is that all the raster-style shadows we have now allow for a lot of artistic expression. More realistically modeling a reflection or some global lighting... you're right, that alone doesn't trample artistic expression; it's just a tool. It's just insane overkill for the most part unless you're going for ultra realism. I do think things like Lumen will continue to improve and basically replace the silly idea of using AI-designed Tensor cores to create faux real-time ray tracing. I agree there are artistic benefits to better lighting and reflections. I was responding to someone who said the long-term goal should be 100% ray-traced games. I just don't agree that is a goal anyone should really have. The only advantage of REAL 100% ray-traced games (not the faux RTX style we have now) would be ultra realism, imo... and that is no different than trying to replace Picasso with photos. Both have their place, but one is never going to replace the other. That is all I was really saying. Real-time ray tracing may happen one day, but it's not going to replace raster, imo.
 
There is no reason ray tracing can't look a certain way. There's no requirement to be physically based.

It's not that it couldn't... it's that raster can already do such styles at 1/100th the silicon cost. You don't build an 8K TV to play 1080p content. I agree that software methods of more accurate raster light calculation, such as Lumen, have their place. I just don't see the point of, say, a decade from now, building hardware designed to truly accelerate 100% real ray tracing... why commit 100x the silicon to achieve the same result?
 
You are correct, my argument is surface level. I guess my point is that all the raster-style shadows we have now allow for a lot of artistic expression. [...] Real-time ray tracing may happen one day, but it's not going to replace raster, imo.
There is artistic expression, but then developers need to hire artists, designers, and engineers; they need to spend expensive man-hours; and they need to test the crap out of it. Those huge frame rate dips and generally bad performance in certain sections or levels come from those humans making mistakes and missing things, which people do. The real value RTX brings to the job is consistency and time. You don't need a team of artists, testers, and engineers to build a level: you build a level, you define your properties and light sources, you press some buttons, then go for coffee. One person does in a week what took 5 people more than a month, and it will be more consistent and generally what people would call "optimized" (I really hate that term).
What RTX is, is the new "we could spend 2 million on making it use half the RAM, or we could tell them to buy more RAM".
 
Isn’t erek the one with the massive multi monitor dream builds in the mid 2000s? Did you run out of doge coins and now you’re jelly?
 
I think by now it is through DX12 (DirectX Raytracing, DXR) or the Vulkan API. I do not think there are many RT games that do not run on non-Nvidia cards (even Portal RTX does run on AMD cards, I think).

Maybe some Nvidia-only extensions persist.

You are probably right. Come to think of it, I ran the Quake 2 RTX tech demo on my RX 6900 when I had it.
 
Isn’t erek the one with the massive multi monitor dream builds in the mid 2000s? Did you run out of doge coins and now you’re jelly?
I was considering Atomic Heart, but now I'm not. It was supposed to be an RTX-enabled game through and through.

"Super Flower Leadex Titanium 1600W | Western Digital Black 6TB | NVIDIA GeForce RTX 4090 FE | LG 27GN950-B UltraGear 144 Hz 4K IPS | Corsair H115i RGB AIO | Inland 1TB NVMe M.2 (Phison E12 Controller) | G.SKILL Trident Z Neo (For AMD Ryzen) Series 32GB (2 x 16GB) | Fractal Define R6 | Asus ROG X570 Crosshair VIII Hero | Ryzen 9 5950X"

https://valid.x86.fr/w62tde

 
I was considering Atomic Heart, but now I'm not. It was supposed to be an RTX-enabled game through and through.

RT hasn't been cancelled for Atomic Heart... it's just coming later in a post-launch patch. Lots of games have done this: Doom Eternal, Elden Ring, A Plague Tale: Requiem, etc.
 
I was considering Atomic Heart, but now I'm not. It was supposed to be an RTX-enabled game through and through.

"Super Flower Leadex Titanium 1600W | Western Digital Black 6TB | NVIDIA GeForce RTX 4090 FE | LG 27GN950-B UltraGear 144 Hz 4K IPS | Corsair H115i RGB AIO | Inland 1TB NVMe M.2 (Phison E12 Controller) | G.SKILL Trident Z Neo (For AMD Ryzen) Series 32GB (2 x 16GB) | Fractal Define R6 | Asus ROG X570 Crosshair VIII Hero | Ryzen 9 5950X"

https://valid.x86.fr/w62tde

ok you win
 
The X amusingly stands for Xtreme or some dumb shit.

DXR existed before Ultimate.
Ray Tracing Texel Extreme... yeah, I should have seen that coming.
DXR was released back in 2018 for non-accelerated hardware, and it is still supported. Nvidia translates DXR calls to RTX, but not all RTX calls have a DXR equivalent.

AMD does straight DXR, and so do most developers; outside of specific software packages for drafting and such, very few developers bother with the RTX-specific stuff.

But it is one hell of a recognizable brand at this point.
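
For anyone curious what "straight DXR" looks like from the application side, here is a minimal sketch, assuming the standard Windows SDK headers and d3d12.lib (error handling trimmed). It just asks D3D12 which ray tracing tier the adapter reports; the same query runs unchanged on AMD, Intel, or Nvidia hardware, because it's the DXR feature tier being checked, not anything RTX-branded:

```cpp
// Minimal DXR capability probe (hypothetical standalone example).
// Build on Windows with a recent Windows SDK and link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter; vendor doesn't matter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // Ask the runtime which DXR tier this adapter reports.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
    {
        std::printf("OPTIONS5 query failed (runtime too old?).\n");
        return 1;
    }

    if (options5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED)
        std::printf("DXR: not supported on this adapter.\n");
    else if (options5.RaytracingTier == D3D12_RAYTRACING_TIER_1_0)
        std::printf("DXR: Tier 1.0 supported.\n");
    else
        std::printf("DXR: Tier 1.1 or newer supported.\n");

    return 0;
}
```

Anything beyond what DXR itself exposes is where the vendor-specific libraries come into play.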
 
Ray Tracing Texel Extreme... yeah, I should have seen that coming.
DXR was released back in 2018 for non-accelerated hardware, and it is still supported. Nvidia translates DXR calls to RTX, but not all RTX calls have a DXR equivalent.

AMD does straight DXR, and so do most developers; outside of specific software packages for drafting and such, very few developers bother with the RTX-specific stuff.

But it is one hell of a recognizable brand at this point.
I thought voxels were also the next big thing, aside from RTX?
 
I thought voxels were also the next big thing, aside from RTX?
They still could be; there are just no development tools out there that support voxel development. Should that change, maybe it does take off, but at this stage it's probably the tech that time forgot.
 
I have a 4090, and I have played a ton of titles with RT on and off for comparison, and the differences are marginal. And I have yet to see even a single title with RT on that wowed me. In 100% of the titles I have seen, I have seen other raster-only games that achieved similar lighting effects, leading me to believe that nothing, zip, zilch, nada, not Cyberpunk, not Metro Exodus: Enhanced Edition, not Dying Light 2, not anything else, couldn't have been made to look just as good without RT at a fraction of the GPU load.

I am open to the idea that maybe some day in the future, with full scene RT we may see things that are a drastic visual improvement over raster, but it hasn't happened yet, not even in tech demos.

And if that is the case, maybe wait until you can actually build hardware that can do full scene RT before implementing it?

In its current state, I am totally sticking with it being mostly a marketing scam as there is little to no benefit, at exorbitant cost, and the little benefit there is could easily be done without it if they really wanted to.

Some day maybe it will no longer just be a gimmick. But right now that is all it is. They gimped the raster version in order to sell the RT version, and then charged ridiculous money for the RT hardware, or you get stuck on the gimped raster version. Classic market manipulation.
Same here, and I bet you the RTX effects could be simulated cleverly enough without all the BS libraries and proprietary nonsense. It's all bloatware. The software could be engineered to be orders of magnitude more efficient and optimal. Look at Windows and its UI over time, for example. They could definitely write some awesomely efficient code if they chose to.
 
Same here, and I bet you the RTX effects could be simulated cleverly enough without all the BS libraries and proprietary nonsense. It's all bloatware. The software could be engineered to be orders of magnitude more efficient and optimal. Look at Windows and its UI over time, for example. They could definitely write some awesomely efficient code if they chose to.
[screenshots]
 
Ray Tracing Texel Extreme... yeah, I should have seen that coming.
DXR was released back in 2018 for non-accelerated hardware, and it is still supported. Nvidia translates DXR calls to RTX, but not all RTX calls have a DXR equivalent.

AMD does straight DXR, and so do most developers; outside of specific software packages for drafting and such, very few developers bother with the RTX-specific stuff.

But it is one hell of a recognizable brand at this point.

There is no RTX API. You get DXR or the Vulkan interfaces. Whatever the driver does behind the scenes is mostly none of your business. The API can change during pre-release versions, and if hardware doesn't exist yet (though it's very possible big devs get very, very, very early access), you can use the software renderer to test your code.

I wrote an application around the day the DX12 SDK was released, and I think I remember having to use WARP because the drivers were so scuffed. Or they would update the SDK and it'd start crashing because the drivers hadn't caught up, so back to WARP you go.

There's no real extension system in DX, so there are libraries like NVAPI and AGS instead. They expose certain vendor-specific fuckery.

In Vulkan land, Nvidia took point and put out an extension as a proposal. Quake 2 RTX used it originally. It was eventually ratified as a standard. These are generally collaborations. Anyone can submit a proposal.
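
To illustrate that Vulkan side: below is a rough sketch, assuming a machine with the Vulkan loader and headers installed (link against vulkan-1; error handling mostly omitted), that enumerates the first GPU's device extensions and reports whether it exposes the original vendor proposal (VK_NV_ray_tracing, which Quake 2 RTX originally used) and/or the ratified cross-vendor extension (VK_KHR_ray_tracing_pipeline):

```cpp
// Sketch: list the first physical device's extensions and check whether it
// exposes the vendor proposal (VK_NV_ray_tracing) and/or the ratified
// cross-vendor extension (VK_KHR_ray_tracing_pipeline).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main()
{
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS)
        return 1;

    // Grab the first physical device (a real app would pick more carefully).
    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    if (gpuCount == 0) { vkDestroyInstance(instance, nullptr); return 1; }
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    // Enumerate its device extensions.
    uint32_t extCount = 0;
    vkEnumerateDeviceExtensionProperties(gpus[0], nullptr, &extCount, nullptr);
    std::vector<VkExtensionProperties> exts(extCount);
    vkEnumerateDeviceExtensionProperties(gpus[0], nullptr, &extCount, exts.data());

    bool hasNV = false, hasKHR = false;
    for (const auto& e : exts)
    {
        if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)           hasNV = true;
        if (std::strcmp(e.extensionName, "VK_KHR_ray_tracing_pipeline") == 0) hasKHR = true;
    }

    std::printf("VK_NV_ray_tracing (vendor proposal):    %s\n", hasNV ? "yes" : "no");
    std::printf("VK_KHR_ray_tracing_pipeline (ratified): %s\n", hasKHR ? "yes" : "no");

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

On a current RT-capable card from either vendor you'd generally expect the KHR extension; the NV one typically shows up only on Nvidia, kept around for older titles.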

I thought voxels were also the next big thing, aside from RTX?
This is actually covered by one of the Epic engineers during a Nanite talk. For geometry, it's definitely dead.
 
The real value RTX brings to the job is consistency and time. You don't need a team of artists, testers, and engineers to build a level: you build a level, you define your properties and light sources, you press some buttons, then go for coffee. One person does in a week what took 5 people more than a month. [...]

Well, that is the promise made by Nvidia, which hasn't panned out as of yet. So far, ray tracing has involved a shit ton of hours.
Epic has perhaps provided something that might start delivering on that promise... but it's not going to require Nvidia hardware to accelerate it. From what I have seen, in Lumen-enabled games you're probably going to want to leave hardware RT off and use software-based RT instead: 90%+ of the visual impact, with fairly sizable performance increases. That is Nvidia's problem right now. It looks like Epic is going to show them up pretty hard, delivering the visuals Nvidia has been marketing for 5 years now... without the RTX hardware. Like I said earlier, it's probably time for Nvidia, Intel, and AMD to start listening and paying attention to what developers like Epic are doing. GPU hardware folks back in the day used to design their pipes to accelerate what people like Carmack were inventing... for the last 5-7 years, Nvidia has been taking hardware designed for non-game markets and trying to build tools to push game developers to engage hardware that would otherwise be useless to gamers.

If I were a company like Intel right now, I would be paying Epic to embed a small team of Intel engineers, engineers there to pay attention to how they could build hardware that specifically pushes Epic's engine. I'm sure all the GPU engineers working for all 3 companies are well versed. If a company like Intel wants to upend things, though, that would be how to do it: drop an arch that is built to push Epic's lighting engine efficiently, so they could deliver such visuals on even inexpensive parts and perhaps even iGPUs.
 
I have a 4090, and I have played a ton of titles with RT on and off for comparison, and the differences are marginal. And I have yet to see even a single title with RT on that wowed me. [...]
You can't use your personal lack of observation skills as proof. That you were not wowed by it just means you were not wowed by it, not that there is literally no difference.
I could, however, see a meaningful difference even between the Ultra and Psycho settings in Cyberpunk 2077, let alone between RT off and on. You are arguing here from a position of willful blindness.
I am open to the idea that maybe some day in the future, with full scene RT we may see things that are a drastic visual improvement over raster, but it hasn't happened yet, not even in tech demos.
It's already there; you just choose to deny it, I don't know for what possible reason.* This is literally the same as if someone in 2003 had said that per-pixel shaders do not offer drastic visual improvements.
And if that is the case, maybe wait until you can actually build hardware that can do full scene RT before implementing it?
You have to start somewhere; you can't go from zero to the final iteration in one step. That never happens. At first, pixel shaders were only used on a few surfaces, and bump mapping was used sparingly before it got better. This is the way for every new graphics feature.
In its current state, I am totally sticking with it being mostly a marketing scam as there is little to no benefit, at exorbitant cost, and the little benefit there is could easily be done without it if they really wanted to.
You do know what a scam is, right? Because it is not this, so please don't be ridiculous.
Some day maybe it will no longer just be a gimmick. But right now that is all it is. They gimped the raster version in order to sell the RT version, and then charged ridiculous money for the RT hardware, or you get stuck on the gimped raster version. Classic market manipulation.
Yeah, I'm sure they gimped the raster version of games that only added RT as a patch months or years after the game's release.
Also, can you imagine any developer agreeing to gimp their own game for nVIDIA? Maybe some indie devs bleeding money, but even then, if it came out, it would be a huge controversy that isn't good for anyone.

Your argument here sounds like: why build houses from brick and mortar when it can be done from mud plaster almost as well?

*Actually, I have my suspicions that it is just people refusing to accept that their $2000+ graphics card is incapable of running games with RT on at a high enough frame rate. So they'd rather tell themselves that it is worthless anyway. In other words, pure cope. I've seen it in the Hogwarts Legacy thread as well: people claiming there is little or no difference, seeking validation from others. And they are getting it. Still, I'd rather believe my eyes. I can't run the game with RT at a playable FPS on my RTX 2080 Ti, and that's a bummer, but it's no reason for me to deny the impressive visual advantages.
 
If they implemented it and it needed more work, then that's more money they have to spend. If Nvidia didn't give them money to implement the feature, then I could see them pulling it. If ray tracing isn't going to be used on consoles, then they likely expected Nvidia to pony up the money and pay them to implement it. That's usually how this works anyway.
Either way, they promised the community that it will be a feature, so they are obliged to do it. If it is harder to implement than they had anticipated, that's on them. It's not as if they only found out on launch day that it won't make it into the release version. So I still think they were being disingenuous. They deliberately waited until the last moment to announce the cut of the feature. And the "we might add it later" line can be a complete lie made up to soften the blow.
I still think it's hilarious that people are calling ray tracing "RTX."
I think what's hilarious is that people are offended by this.
Everybody knows RTX is just the nVIDIA brand of implementing ray tracing, but it is universally understood that almost nobody actually means just nVIDIA when they say RTX. That's the benefit of being first. Plenty of new consumer products were known originally by the branding of the first major manufacturer that made them widely accessible.
 
*Actually, I have my suspicions that it is just people refusing to accept that their $2000+ graphics card is incapable of running games with RT on at a high enough frame rate. So they'd rather tell themselves that it is worthless anyway. In other words, pure cope. [...]
The more likely scenario is that people with the super overpriced video cards scream that RT is the best thing since sliced bread because they don't want to feel stupid for massively overpaying for cards with features which do little in some cases and are completely unused and useless in the vast majority of others.

nVidia has had the RT hype train in overdrive since the release of the 2000 series. Many people have bought into the hype, and yet the actual experience of most people is that the hype is massively overblown and the difference is far from night and day.

What you claim is that people bitching about RT are simply disillusioned. I have no doubt many are disillusioned because they haven't seen the huge differences which were hyped and some of what you're seeing is natural blowback. Hyping up something which even after three generations of hardware still isn't remotely ready for prime time is going to cause issues. Most are people who have been skeptical or holding their opinion back from the beginning but are now voicing their opinion. RT has not panned out from what was promised and doesn't look like it will even be possible for a very long time.

What I've been seeing lately is fewer RT zealots speaking up, so they're no longer drowning out those who live in reality.
 
I'd rather have good games than RTX.
I'd rather have good games with RTX. It is not a one or the other proposition.
My theory is it's all crap and horribly unoptimized software. I figure you can come up with clever ways to calculate ray tracing more legitimately than this garbage, and faster too.
There are no free meals; software is "horribly unoptimized" because it is infinitely more complex than it was in the '90s.
Back in the day all sorts of interesting and clever techniques had to be applied to speed up stuff...
Yes, because games were simple; it is not feasible to do the same level of optimization on a current AAA title. And I'd rather have the AAA title than 8-bit-looking games trying to sell themselves as nostalgia. And mind you, Oblivion and New Vegas were already unoptimized, buggy, and highly complex games.
but now we get 1.25GB driver installs monthly... riddled with hidden Critical and High CVEs, etc etc
There is more optimization going on now for games in those drivers, but you are comparing a Model T to a 2023 car in complexity. Games might have been the most optimized versions of themselves in 1990, but they were still tiny and simple.
The Legendary Fast Inverse Square Root
It's "easy" to find giant leaps at the start; as technology progresses, improvements become incremental.
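
Since the fast inverse square root got name-dropped: for anyone who hasn't seen it, below is roughly the famous Quake III Arena routine, rewritten here to use std::memcpy for the float/int bit reinterpretation (the original pointer-cast punning is undefined behavior in modern C++). It approximates 1/sqrt(x) with a bit-twiddled initial guess plus one Newton-Raphson step:

```cpp
// The classic "fast inverse square root": approximates 1.0f / sqrtf(x).
// Same idea as the Quake III Arena routine, with std::memcpy used for the
// float <-> int bit reinterpretation instead of the original pointer cast.
#include <cstdint>
#include <cstring>
#include <cstdio>
#include <cmath>

float fast_rsqrt(float x)
{
    const float half = 0.5f * x;

    std::uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);   // view the float's bits as an integer
    bits = 0x5f3759df - (bits >> 1);       // magic constant gives a rough first guess
    float y;
    std::memcpy(&y, &bits, sizeof y);      // back to float

    y = y * (1.5f - half * y * y);         // one Newton-Raphson step to refine it
    return y;
}

int main()
{
    const float tests[] = {1.0f, 2.0f, 10.0f, 123.456f};
    for (float x : tests)
        std::printf("x = %8.3f  fast = %.6f  exact = %.6f\n",
                    x, fast_rsqrt(x), 1.0f / std::sqrt(x));
    return 0;
}
```

These days a hardware reciprocal-square-root instruction, or plain 1.0f / std::sqrt(x), makes the trick mostly a museum piece, which is kind of the point: the easy giant leaps have already been taken.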
 
The more likely scenario is that people with the super overpriced video cards scream that RT is the best thing since sliced bread because they don't want to feel stupid for massively overpaying for cards with features which do little in some cases and are completely unused and useless in the vast majority of others.
As if you could buy a new gaming video card without RT. It wasn't a choice. When I bought my RTX 2080 Ti, I did not care about RT, but there was no GTX 2080 Ti that I could've gotten instead.
nVidia has had the RT hype train in overdrive since the release of the 2000 series. Many people have bought into the hype, and yet the actual experience of most people is that the hype is massively overblown and the difference is far from night and day.
At first I could take it or leave it, but there is a night-and-day difference in CP2077, and especially in Hogwarts Legacy. It's not hype; it is a fact. Why would I want to hype up something that I myself cannot utilize?
What you claim is that people bitching about RT are simply disillusioned. I have no doubt many are disillusioned because they haven't seen the huge differences which were hyped and some of what you're seeing is natural blowback.
Natural blowback? More like people having grown pampered and expecting everything to run perfectly because they paid big bucks. Yes, graphics cards are overpriced, but I'm not going to deny the advantages of RT just because Nvidia is greedy.
I used to switch graphics cards every 6 months in the late '90s and early 2000s, and I still "enjoyed" a ton of slideshows. I'm not going to burn down the house because my 3-year-old GPU has finally met a game that I can't crank the graphics up to maximum in.
Hyping up something which even after three generations of hardware still isn't remotely ready for prime time is going to cause issues. Most are people who have been skeptical or holding their opinion back from the beginning but are now voicing their opinion. RT has not panned out from what was promised and doesn't look like it will even be possible for a very long time.
Reality not meeting someone's expectations doesn't mean reality is in the wrong. I never expected RT to suddenly be available with no performance hit. Anyone who did was deluding themselves.
What I've been seeing lately is fewer RT zealots speaking up, so they're no longer drowning out those who live in reality.
LOL, if living in reality means denying reality to make yourself feel better about your buyer's remorse, then sure. Meanwhile, RT is still fully optional in games. So I ask again: what is the issue? You can choose to ignore it if you think it offers no visual improvements or the performance hit is too much. It seems to me that the zealots are the RT haters who want everyone to conform to their thinking. I'm not trying to force anyone to turn RT on, but for crying out loud, stop denying that it offers improved visuals. Choosing performance over graphics fidelity is a valid position. Trying to deny that the visual improvement exists at all is not.
 
I'd rather have good games with RTX. It is not a one or the other proposition.

[...]

It's "easy" to find giant leaps at the start; as technology progresses, improvements become incremental.
RTX is at the start, and there's no giant leap in perceived visual fidelity or anything else beyond a gimmick 🙁. Others in this thread besides me have stated this. The so-called ray tracing I've seen can be simulated faster with pure rasterization. Even GTA Online when it rains looks very similar to RTX-on effects in other games like Cyberpunk. I own both games.
 
I agree that games can look great without RT...but RT can push visuals to another level...take a look at Metro Exodus: Enhanced Edition, Cyberpunk 2077, Dying Light 2
If people didn't point out what was ray traced and what wasn't, I'm 100% sure you wouldn't know unless someone told you. Furthermore, it's still not that realistic, because it's not full-scene RT. It's a reflection here, a puddle there. Then, to get it to perform well, we send it through DLSS/FSR to reduce the impact of having it. It's just not that useful, all in all, and most people really wouldn't know the difference.
 
RT just crushes too many of the RTX-enabled cards, even with gimmicks like DLSS.

What's worse from Nvidia's perspective is that they've been pimping this tech since the 2XXX series when clearly it wasn't up to the task and virtually no software supported it. This didn't stop Jensen from charging an arm and a leg for it.
 
The more likely scenario is that people with the super overpriced video cards scream that RT is the best thing since sliced bread because they don't want to feel stupid for massively overpaying for cards with features which do little in some cases and are completely unused and useless in the vast majority of others. [...]
Every new video card released in the past 3 years supports hardware ray tracing.
 
I wonder how the gaming world survived without cards with shaders? I wonder how the gaming world survived without 3D cards? I wonder how the gaming world survived without SVGA?

The answer is, with less graphical fidelity. I don't understand why gamers, particularly on a site called [H]ardforum, are opposed to progress. The idea behind RT is to push gaming graphics to a new level, why is that a bad thing?

Like all technologies, it is difficult at first and of limited use. Shaders were the same way. When the GeForce 3 came out, we finally got cards with programmable pipelines (sort of, not fully). However, they were pretty weak. You could use them in a game, but it had to be done sparingly: maybe for some shiny water, or Dot3 products on a bumpy surface. But you know what happened? Each generation they improved, the cards got more powerful, and what you could do with shaders got better. As time went on, it got to the point where you couldn't find a game not using them. Even graphically simpler games used them, because they are useful and the hardware is powerful and present even on low-end cards.

Ah, the good ole "whataboutism" excuse, where even 4K Ultra quality with AA isn't good enough anymore. As if those effects can't be done any other way without RT, or no other graphical enhancements have been made? Nvidia thanks you for your patronage, and AMD is still second class. Those peasant $700-and-under cards are worthless now, only good for Word or YouTube. "If we don't fall in line, we aren't hardforum worthy" is quite a stretch.
 
I have played all of those.

I played Cyberpunk back when I had a 6900 XT (which turned RT modes into a 20 fps slideshow), I played Metro Exodus on my old Pascal Titan, and I am playing through Dying Light now.

I now have a 4090, and I have gone back and played those titles with and without RT, and honestly, the differences are marginal as all hell. I wouldn't even be able to tell RT is on unless I compared them side by side.

To me - for instance - Cyberpunk just looked slightly less dark with RT enabled. The reflections and lighting didn't really appear very different outside of that.

And that marginal improvement came at a HUGE performance hit.

And I'd even argue that you could accomplish that small improvement in visuals using raster graphics as well, but they just don't, because they want to give you a reason to turn RT on.

It's more that raster modes are being gimped to make RT look better than anything else, and even then the difference is minor.
The best game for showcasing ray tracing I've seen so far is The Witcher 3. The ray tracing really does add to the experience. It kills the frame rates, but it generally looks significantly better.
 
The best game for showcasing ray tracing I've seen so far is The Witcher 3. The ray tracing really does add to the experience. It kills the frame rates, but it generally looks significantly better.
Does it add to gameplay in any way? Does it change the way the game is played in any way?

When the answer to those questions is "yes" then raytracing will be a must-have. If it just makes things prettier... meh.
 
Does it add to gameplay in any way? Does it change the way the game is played in any way?

When the answer to those questions is "yes" then raytracing will be a must-have. If it just makes things prettier... meh.

What is even the logic here?

I mean, why even have graphics settings, am I right? They just make the game look better, after all.
 
What's worse from Nvidia's perspective is that they've been pimping this tech since the 2XXX series when clearly it wasn't up to the task and virtually no software supported it. This didn't stop Jensen from charging an arm and a leg for it.

From a "marketing" perspective, though... Nvidia has "won" (even if it is a joke).
The amount of shilling of "you need Nvidia RTX" on Reddit and YouTube has increased tenfold...
 