RTX is a joke

The more I read this thread, the more confused I am about what is going on here... some like RT, some don't... cool story, but the evolution of graphics in games is built on hardware pushing the edge every generation, and you can thank a lot of that development for your 144+ FPS in games.

Don't like RT, fine, save your money and get a mid-range or low-end card and enjoy your games with features turned off.
Some wonder if the resources invested in real-time path tracing are ultimately wasted.
Others think real-time path tracing is the future, but right now we're stuck in this in-between limbo land where we are supporting two different technologies with split resources for raster and path tracing.
Others think real-time path tracing is already here, and everything is great!
 
It's a little bit of a scam if you ask me.

Developers don't need RT to make visually stunning games. A well designed raster game can look every bit as good as the best RT title.

But RT has become a marketing need due to the hype. As we learned in this story, people will complain when they don't get RT, even if it isn't necessary to make a game look good. Sometimes (especially in the RTX 2000 days) there was also pressure and/or coercion from Nvidia.

There is also an argument to be made that it might be easier to develop levels etc. with RT than it is to create light maps etc. like you have to with raster. Even so, we are talking about something that helps Nvidia or the dev, not something that benefits the user.

(This is also more of a future state, as in today's games you can't just make them 100% RT, so you are still going to need light maps, etc.)

Once the developers use RT in the game, however, GPU requirements go through the roof and Nvidia profits, as kids feel they NEED an RTX GPU to get the most out of their games.

So yeah, it's a bit of Nvidia RDF utilized in full swing to sell kids thousand-dollar GPUs.
If you ask me, there's a huge difference in some games with RT.

Wolfenstein Youngblood was a huge surprise to me in how much better it is with all the RT features turned on, and in how well optimized that game is... a game from the RTX 2000 days.

 
Some wonder if the resources invested in real-time path tracing are ultimately wasted.
Others think real-time path tracing is the future, but right now we're stuck in this in-between limbo land where we are supporting two different technologies with split resources for raster and path tracing.
Others think real-time path tracing is already here, and everything is great!
It's not here until the consoles have hardware acceleration for it and you can buy a sub $300 GPU capable of delivering 60fps for it. So probably 4 years and 2 generations out at this rate.

Epic is doing great work with UE5 and Nanite/Lumen; many studios who were working with Unity or in-house engines are actively switching for projects still in early stages, or looking to switch for future projects, because what they are delivering there is an accountant's wet dream in terms of cost-effectiveness.
The tool sets and asset libraries look to cut so many costs and speed up deliverables so much that not using them when they are an option is just burning money. It's not going to be AMD or Nvidia who make ray tracing actually a thing, it's going to be Epic, and that is just funny.
 
And the moment you turn on RT with most of those cards the framerate drops to shit and the game is unplayable. "Support" means jack shit when the hardware can't actually perform well enough to be used.
Unusable? I mean, if you're a twitch gamer, sure, but for single-player games like CP2077 or the revamped Witcher 3 or Metro Exodus the frame rates are totally fine.
 
if you ask me, there's a huge difference in some games with RT.

Wolfenstein youngblood was a huge surprise to me on how better it is with all the RT features turned on and how well optimized that game is.. a game from the RTX 2000 days.


Aren't they just sandbagging the rasterization to make the RTX glorification and selling point seem all that much grander?

Just like with anything such as statistics and charts, anybody can force an apparent argument by making A look like garbage and B a polished diamond.
Willing to bet there are rasterized games that look better than RTX-on Wolfenstein Youngblood.
 
What will not be a gimmick is reprojection being ported to PC games, as a latency-reducing frame generation technology.



Especially retroactive 6DOF reprojection that rewinds frame-render-time latency down to poll-time latency by using a dual GPU pipeline (an original-frames pipeline, never visible on screen) + (a concurrent reprojection pipeline, perpetually visible on screen).

This is the route to 4K 1000fps 1000Hz at UE5 detail levels. It will help RTX ray tracing greatly.
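
To make that a bit more concrete, here is a minimal sketch of the core warp step (my own illustration, not any shipping implementation): take the last fully rendered frame plus its depth buffer, unproject each pixel to world space using the camera pose it was rendered at, then reproject it with the freshly polled pose. The matrix names and the naive forward scatter are assumptions for clarity; a real pipeline would add depth testing, hole filling, and the dual-pipeline scheduling described above.

import numpy as np

def reproject(color, depth, inv_view_proj_old, view_proj_new):
    # Warp a finished frame from the camera pose it was rendered at ("old")
    # to the most recently polled pose ("new"). `depth` is assumed to hold
    # the NDC depth written by the old projection.
    h, w, _ = color.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Pixel coordinates -> NDC of the rendered frame.
    ndc_x = (xs + 0.5) / w * 2.0 - 1.0
    ndc_y = 1.0 - (ys + 0.5) / h * 2.0
    ndc = np.stack([ndc_x, ndc_y, depth, np.ones_like(depth)], axis=-1)

    # NDC -> world space with the old camera, then world -> clip with the new one.
    world = ndc @ inv_view_proj_old.T
    world /= world[..., 3:4]
    clip_new = world @ view_proj_new.T
    ndc_new = clip_new[..., :3] / clip_new[..., 3:4]

    # New NDC -> pixel coordinates in the reprojected frame.
    px = ((ndc_new[..., 0] + 1.0) * 0.5 * w).astype(int)
    py = ((1.0 - ndc_new[..., 1]) * 0.5 * h).astype(int)

    out = np.zeros_like(color)
    ok = (px >= 0) & (px < w) & (py >= 0) & (py < h)
    out[py[ok], px[ok]] = color[ok]  # naive scatter: no depth test, holes stay black
    return out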
 
Aren't they just sandbagging the rasterization to make the RTX glorification and selling point seem all that much grander?

Just like with anything such as statistics and charts, anybody can force an apparent argument by making A look like garbage and B a polished diamond.
Willing to bet there are rasterized games that look better than RTX-on Wolfenstein Youngblood.
Exactly. RT has been used in games for a very long time, with potentially more accurate lighting results. Baking lightmaps just takes more work for each scene and is not as dynamic.

Digital Foundry's Best Game Graphics of 2022 -> a non-real-time-RT title, Horizon Forbidden West. Imagine that. Reflections were done with cubemaps updated with camera movement, and it looks outstanding without the RT hit to performance. Motion, animations, lighting, and environmental behavior all add to the visuals. There is more to visuals than just RT, in other words; it is the whole visual experience, which I think they broke down into its components well.



The whole video is worth watching.
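
As a footnote on how camera-updated cubemap reflections typically get away without tracing any rays: in general, the shader just mirrors the view direction about the surface normal and uses the result to index a pre-rendered environment cubemap, roughly one texture fetch per pixel. The sketch below is purely illustrative with assumed names and simplified face conventions, not Guerrilla's actual code.

import numpy as np

def reflect(view_dir, normal):
    # Standard mirror reflection of the view direction about the surface normal.
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

def sample_cubemap(cubemap, direction):
    # `cubemap` is assumed to be a dict of six face images keyed "x+", "x-", "y+", etc.
    # Pick the cube face the reflected ray points into, then project onto that face
    # (per-face orientation conventions are simplified for this sketch).
    axis = int(np.argmax(np.abs(direction)))  # 0 = x, 1 = y, 2 = z
    face = cubemap["xyz"[axis] + ("+" if direction[axis] > 0 else "-")]
    u, v = [direction[i] / abs(direction[axis]) for i in range(3) if i != axis]
    h, w, _ = face.shape
    px = int((u * 0.5 + 0.5) * (w - 1))
    py = int((v * 0.5 + 0.5) * (h - 1))
    return face[py, px]

The trade-off is that the reflection is only as fresh as the last cubemap update, which is exactly why it is so much cheaper than per-pixel ray-traced reflections.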
 
CamFrameX on Twitter was blown away by the RTX until he realized it wasn't RTX, lmao. I think they did such a good job with the lighting that people were fooled even though it didn't have RTX. Maybe they low-key wanted to see it, lmao. I've always said this: if a game is well done, 9 out of 10 people won't know the difference between RTX on and RTX off in a blind test.

Given that I have a 4090, I do still think that at the current rate it's more marketing than anything, because in most games people fail the eye test.
 
It's like the guys at Best Buy turning up the contrast and brightness to make the more expensive TVs look better than they really are. Come on, crank up the gamma on the raster images to match RTX.
 
When did you ever expect to max out graphics on low end video cards?

I have never seen so many panties in a bunch over an optional graphical feature included in some games.
I think you should look in the mirror when you talk about that goalpost shifting. I responded to the comment that every card in the past three years has had ray tracing support. I didn't move the goalposts when I said that; I stated a fact. You obviously don't like the fact, but it's still a fact.

Especially the point that a 1080p card dies the moment RT is enabled. Or that many 1440p cards also die when RT is enabled, or you're required to drop to 1080p on a 1440p card to make RT usable.

That means the vast majority of cards out there can't do raytracing without massive performance losses, many times making the performance too low to play.

We are already on the third generation of hardware for ray tracing and it's still not worth a shit outside of the top end cards. At what point is ray tracing going to be usable from top to bottom of the product stack? Until that happens and a few years go by, ray tracing is a gimmick.

At the pace ray tracing in gaming is going, a snail could outrun it. It's in hardly any games. The application in most games is extremely limited even to the point that someone has to bring out some side by side comparisons and tell people which is which. All of this for some tiny effects that without someone telling you which is which they likely wouldn't know the difference.
 
We are already on the third generation of hardware for ray tracing and it's still not worth a shit outside of the top end cards. At what point is ray tracing going to be usable from top to bottom of the product stack? Until that happens and a few years go by, ray tracing is a gimmick

It's not specific to ray tracing... all new graphics tech will have the same effect... when Crysis was first released on PC it didn't have any RT, and it took multiple generations for GPUs to catch up... it always starts out with the high-end cards and filters down over multiple generations.
 
It's not specific to ray tracing... all new graphics tech will have the same effect... when Crysis was first released on PC it didn't have any RT, and it took multiple generations for GPUs to catch up... it always starts out with the high-end cards and filters down over multiple generations.
And no one has been stupid enough to make another Crysis since Crysis, because you don't make money by making games that only the top 0.5% of the regular gaming population can play. Besides, Crysis is a terrible comparison since a huge part of the issue is that it's held back by single-core CPU performance. By the way, as I already pointed out, we're three generations in already and RT is still basically crap on anything but the top end cards.
 
Look how good we have it though TBH

We're arguing how much to accurately predict light in real time

What a problem to have IMO
“RTX” junk is actually stagnating real hw accelerated pure ray tracing imho. To me there’s a huge difference between RTX layered on top of Rasterization and native Path Tracing through and through.

There's more to it, though: ultra-photorealistic games don't tend to age well compared to clever, distinctive aesthetics. I still think a nice rasterized art style and good execution of a story line or other game mechanics is going to destroy and outweigh the value of RTX any day of the week. Many posts above are proving that rasterized games can do an awesome simulation of beautifully rendered lighting.

It's like the guys at Best Buy turning up the contrast and brightness to make the more expensive TVs look better than they really are. Come on, crank up the gamma on the raster images to match RTX.
But in this case, with raster you're getting improved input latency and better overall fps in exchange for that potential match in visuals.
 
Spending an extra 500 bucks or more on a card whose framerate drops in half with RTX eye candy sprinkles is never going to happen with me. :)
I support this, kudos and good on you
 
Games like The Witcher 3, Hogwarts, etc. look way better with RT ambient occlusion and shadows. Even just RT low.

Yennefer with RT 🔥🔥

You also can’t compare stills. You need to play it.

I played Hogwarts Legacy with it turned on, and I really couldn't tell a difference while playing. I just don't see this massive difference you keep going on about. I tried high to low just to see, and other than eating the frame rate, I could only make out very subtle differences that hardly made it any more spectacular. It looks great without any ray tracing, in my opinion.
 
Not sure why everyone acts like the only setting is Ultra…. Low occlusion / shading still looks amazing and has minimal frame rate hits. That’s really what you want RT for anyways.

I played Hogwarts Legacy with it turned on, and I really couldn't tell a difference while playing. I just don't see this massive difference you keep going on about. I tried high to low just to see, and other than eating the frame rate, I could only make out very subtle differences that hardly made it any more spectacular. It looks great without any ray tracing, in my opinion.

Everything has more depth and is less bland with it on. I also get off to the spells reflecting on the objects around them.

They probably could have done something similar without RT, tbh, as discussed earlier in the thread, but I do like the real thing. I am a shadow/lighting whore, especially for outdoor scenes.
 
I played Hogwarts Legacy with it turned on, and I really couldn't tell a difference while playing. I just don't see this massive difference you keep going on about. I tried high to low just to see, and other than eating the frame rate, I could only make out very subtle differences that hardly made it any more spectacular. It looks great without any ray tracing, in my opinion.
What about the performance hit? Did you have to enable DLSS?
 
If you ask me, there's a huge difference in some games with RT.

Wolfenstein Youngblood was a huge surprise to me in how much better it is with all the RT features turned on, and in how well optimized that game is... a game from the RTX 2000 days.


This video is basically standing super close to a reflective surface, filling half the frame with it. Sometimes zooming in on it.

RT reflections are very cool. But for the couple of shots where they actually take in a whole room or scene, a couple of reflections are literally the only difference. Now, that game does indeed run pretty well. And that's because RT reflections by themselves aren't too heavy to run. It was also one of the first games with DLSS 2.0 (maybe even the second?).

But I would not be put out at all, if I had to run that game on a 1660ti, 1080ti, etc. It looks great. The RT reflections are sprinkles.
I played Hogwarts Legacy with it turned on, and I really couldn't tell a difference while playing. I just don't see this massive difference you keep going on about. I tried high to low just to see, and other than eating the frame rate, I could only make out very subtle differences that hardly made it any more spectacular. It looks great without any ray tracing, in my opinion.

Not sure why everyone acts like the only setting is Ultra…. Low occlusion / shading still looks amazing and has minimal frame rate hits. That’s really what you want RT for anyways.



Everything has more depth and is less bland with it on. I also get off to the spells reflecting on the objects around them.

They probably could have done something similar without RT, tbh, as discussed earlier in the thread, but I do like the real thing. I am a shadow/lighting whore, especially for outdoor scenes.
Is the RT in Hogwarts still busted/rendering improperly until you edit a config file? It looked pretty damn good after the config file edit, but I wasn't paying close attention to whether a patch fixed it.
 
I think you should look in the mirror when you talk about that goalpost shifting. I responded to the comment that every card in the past three years has had ray tracing support. I didn't move the goalposts when I said that; I stated a fact. You obviously don't like the fact, but it's still a fact.
You were claiming that people bought overpriced cards because of RT, except every card they sold had RT support; nobody bought a card because of RT support. So switching to talking about performance now is a literal moving of the goalposts; I couldn't come up with a better textbook example of it.
Especially the point that a 1080p card dies the moment RT is enabled. Or that many 1440p cards also die when RT is enabled, or you're required to drop to 1080p on a 1440p card to make RT usable.
If you can't deny the visuals, you pivot to talking about performance. I still see no valid reason for being vehemently against it. All you look like is a luddite. When 3D graphics started out, my VGA often ran at sub-5 FPS with 3D acceleration enabled in some games. Still, I wasn't bitching about how it was useless or a waste of resources. It is totally uncharacteristic of a supposedly hardcore tech forum to be against new things this badly, especially optional ones.
That means the vast majority of cards out there can't do raytracing without massive performance losses, many times making the performance too low to play.
Let people decide what performance is too low to play at. I don't understand this need to force your opinion on others. Everybody can choose whether to play with RT or without it. When Havok physics was first introduced in Max Payne 2, the performance hit was massive too. Was it a huge change? No, it merely affected the behavior of objects in the level. But it was a cool innovation that became an industry standard long ago.

We are already on the third generation of hardware for ray tracing and it's still not worth a shit outside of the top end cards.
That's simply not true. The barrier to entry into RT has lowered with each generation. If I can play with RT in most games on my 2080 Ti, then so can someone with a 3060 Ti or 3070. And for the current gen, probably on a 4060, maybe even a 4050 when that comes around.

At what point is ray tracing going to be usable from top to bottom of the product stack? Until that happens and a few years go by, ray tracing is a gimmick.
This is so wrong. You literally advocate that if we can't have it running at 60fps on lower end cards, then nobody should have it.
At the pace ray tracing in gaming is going, a snail could outrun it. It's in hardly any games. The application in most games is extremely limited even to the point that someone has to bring out some side by side comparisons and tell people which is which. All of this for some tiny effects that without someone telling you which is which they likely wouldn't know the difference.
If you wouldn't know the difference, then it is definitely not for you. You know what's the good thing? Nobody is forcing you to use it. So stop telling us that it is useless, which is as good as calling us stupid for liking it.
 
Not sure why everyone acts like the only setting is Ultra…. Low occlusion / shading still looks amazing and has minimal frame rate hits. That’s really what you want RT for anyways.



Everything has more depth and is less bland with it on. I also get off to the spells reflecting on the objects around them.

They probably could have done something similar without RT, tbh, as discussed earlier in the thread, but I do like the real thing. I am a shadow/lighting whore, especially for outdoor scenes.
I have the game as well. There was nothing there that blew my mind, especially considering the performance impact. RT'ing the Lumos spell was utter BS because every time you used it the FPS would drop like a rock. Once they patched it, the differences between RT on and off often came down to the contrast of the light source and the shadows, but in no way did it vault the game into some superior state that made the impact worth it.
 
You were claiming that people bought overpriced cards because of RT, except every card they sold had RT support; nobody bought a card because of RT support. So switching to talking about performance now is a literal moving of the goalposts; I couldn't come up with a better textbook example of it.
Literally, there are reviewers justifying a $200 premium, pricing brackets be damned, because of RT. So we can't spend an extra $50 to make sure your motherboard has PCIe 4.0, but we can spend an extra $200 because... ray tracing (and actually more if you want a decent experience). It's bonkers. I could easily provide video after video of some pretty famous reviewers justifying insane premiums because of RT alone.
That's simply not true. The barrier to entry into RT has lowered with each generation. If I can play with RT in most games on my 2080 Ti, then so can someone with a 3060 Ti or 3070. And for the current gen, probably on a 4060, maybe even a 4050 when that comes around.
Your top-of-the-line 2080 Ti is playing this game at no more than 54 FPS at 1080p!!!! The lows are worse. You're gonna feel every RT hit while playing and it's going to be obvious. The 3060 Ti? LOL. Where you top out at 34 FPS at 1080p with RT on?

 
I played Hogwarts Legacy with it turned on, and I really couldn't tell a difference while playing. I just don't see this massive difference you keep going on about. I tried high to low just to see, and other than eating the frame rate, I could only make out very subtle differences that hardly made it any more spectacular. It looks great without any ray tracing, in my opinion.

Depends on the game. Hogwarts apparently doesn't look much different. In other games it is worth using if your hardware can handle it.

You need to evaluate on a game-by-game basis.
 
Depends on the game. Hogwarts apparently doesn't look much different. In other games it is worth using if your hardware can handle it.

You need to evaluate on a game-by-game basis.
It obviously doesn’t help that the game is buggy as hell with terrible memory management and known VRAM memory leaks.
 
Literally, there are reviewers justifying a $200 premium, pricing brackets be damned, because of RT. So we can't spend an extra $50 to make sure your motherboard has PCIe 4.0, but we can spend an extra $200 because... ray tracing (and actually more if you want a decent experience). It's bonkers. I could easily provide video after video of some pretty famous reviewers justifying insane premiums because of RT alone.

Your top-of-the-line 2080 Ti is playing this game at no more than 54 FPS at 1080p!!!! The lows are worse. You're gonna feel every RT hit while playing and it's going to be obvious. The 3060 Ti? LOL. Where you top out at 34 FPS at 1080p with RT on?

Except PCIe 4.0 has literally no meaningful real-world benefits. Also, top of the line... four years ago. That's ancient by computer hardware standards; I was just pondering recently that I'm surprised it has lasted this long and still lets me play most games at decent settings.

Hogwarts is the only game so far I could not play with RT, because it doesn't work well under 16GB of video memory on Nvidia cards. Even at 1080p it uses something like 14GB of memory with RT.

How far can you carry that goalpost? I mean, literally demanding that brand new games run perfectly on four-year-old hardware now?
 
You were claiming that people bought overpriced cards because of RT, except every card they sold had RT support; nobody bought a card because of RT support. So switching to talking about performance now is a literal moving of the goalposts; I couldn't come up with a better textbook example of it.

If you can't deny the visuals, you pivot to talking about performance. I still see no valid reason for being vehemently against it. All you look like is a luddite. When 3D graphics started out, my VGA often ran at sub-5 FPS with 3D acceleration enabled in some games. Still, I wasn't bitching about how it was useless or a waste of resources. It is totally uncharacteristic of a supposedly hardcore tech forum to be against new things this badly, especially optional ones.

Let people decide what performance is too low to play at. I don't understand this need to force your opinion on others. Everybody can choose whether to play with RT or without it. When Havok physics was first introduced in Max Payne 2, the performance hit was massive too. Was it a huge change? No, it merely affected the behavior of objects in the level. But it was a cool innovation that became an industry standard long ago.


That's simply not true. The barrier to entry into RT has lowered with each generation. If I can play with RT in most games on my 2080 Ti, then so can someone with a 3060 Ti or 3070. And for the current gen, probably on a 4060, maybe even a 4050 when that comes around.


This is so wrong. You literally advocate that if we can't have it running at 60fps on lower end cards, then nobody should have it.

If you wouldn't know the difference, then it is definitely not for you. You know what's the good thing? Nobody is forcing you to use it. So stop telling us that it is useless, which is as good as calling us stupid for liking it.
You have a lot of lies in this post and you need to go back and correct them. You're also putting a lot of words in my mouth that I never said; you need to correct that as well.

It's also hilarious that you consider pointing out that a card has RT hardware but can't actually run games with RT on to somehow be moving the goalposts. If the card isn't capable of turning the feature on with playable performance, does it really matter if it technically has the feature? No, it doesn't. That's not goalpost moving, that's common sense.

The last time I checked, 60fps was considered the minimum for PC gaming, and for good reason. Generally, anything below that ends up being a stuttering mess. I've played games with sub-60fps framerates and it's not a pleasant experience. For you to say that the universally regarded minimum fps is irrelevant is telling of your lack of a real argument.

By the way, the barrier to entry on RT hasn't gone down. Judging from more recently released games, it has gone up, to the point that even high-end cards are struggling again, and it's only going to get worse. We're not at the pinnacle of RT implementation in games, we're at the very bottom, and performance is anything but good for the vast majority of people. Implementations will get heavier and will need even more hardware than what we have now, which will leave almost all the current cards in the dust. That's a fact. Well, it's a fact unless developers put a hold on RT features or simply stop adding them. It was the same with the other graphics technologies you like to keep bringing up. They were either used even more or they were abandoned.

I'm not against RT like you keep saying. I'm against the hype machine which has been going full force since the announcement/release of the nVidia 2000 series. I'm also against the recommendations people keep making that RT should be the determining factor in purchases even if it means a considerable increase in purchase prices. From the beginning I adopted a wait and see attitude. I've waited and after three generations of cards I'm still not seeing much value. Prices are stupid and performance is still pathetic. I'll repeat something else I've said: from everything we've seen so far it will be a minimum of two if not three more generations before RT is even halfway usable on anything but top end cards. It's possible some other breakthroughs may come along to change that but I seriously doubt it.

No amount of hype or hope is going to change reality and the reality is ray tracing isn't ready.
 
Depends on the game. Hogwarts apparently doesn't look much different. In other games it is worth using if your hardware can handle it.

You need to evaluate on a game-by-game basis.
LOL, Hogwarts looks massively different with RT. I wish I could play with RT, but it is not possible on the 2080 Ti. You need a 16GB card to do it.
 
You have a lot of lies in this post and you need to go back and correct them. You're also putting a lot of words in my mouth that I never said; you need to correct that as well.
If you accuse me of lies you'd better come with receipts.
It's also hilarious that you consider pointing out that a card has RT hardware but can't actually run games with RT on to somehow be moving the goalposts.
"Can't run" does not equal "can't run at your arbitrary standard", a standard you only mention after it was pointed out that all cards have RT support.
If the card isn't capable of turning the feature on with playable performance, does it really matter if it technically has the feature? No, it doesn't. That's not goalpost moving, that's common sense.
But my card could run the feature at performance acceptable to me in every game until Hogwarts Legacy (which seems to be an issue specific to the game having a memory leak, not with RT as a whole).
Yeah, adding extra conditions is literally moving the goalposts. It was never declared that you only consider 60+ fps acceptable. Still, I wouldn't care, as for single-player games I don't need 60fps; that's only your arbitrary standard.
The last time I checked, 60fps was considered the minimum for PC gaming, and for good reason.
Checked with who? The official authority on PC gaming? LOL. So if my FPS drops below 60 for any reason I'm not a real PC gamer, am I? As I said, why do you want to force your own opinion on others? Let me decide what is satisfactory performance for me, OK?
Generally, anything below that ends up being a stuttering mess. I've played games with sub-60fps framerates and it's not a pleasant experience. For you to say that the universally regarded minimum fps is irrelevant is telling of your lack of a real argument.
Generally, generalizations are stupid. Stuttering is not a direct function of FPS. A game can be a stuttering mess at 100fps average and be smooth as butter at 30. The two are not directly linked.
By the way, the barrier to entry on RT hasn't gone down. Judging from more recently released games, it has gone up, to the point that even high-end cards are struggling again, and it's only going to get worse. We're not at the pinnacle of RT implementation in games, we're at the very bottom, and performance is anything but good for the vast majority of people. Implementations will get heavier and will need even more hardware than what we have now, which will leave almost all the current cards in the dust. That's a fact. Well, it's a fact unless developers put a hold on RT features or simply stop adding them. It was the same with the other graphics technologies you like to keep bringing up. They were either used even more or they were abandoned.
Newer games need better HW, imagine that, what has PC gaming become, amirite?
So if it was the same with all other graphics features why is it a problem now? Whether RT is here to stay or will be phased out IDK, but I like how it looks, and I'll use it whenever I can.
I'm not against RT like you keep saying. I'm against the hype machine which has been going full force since the announcement/release of the nVidia 2000 series.
What machine? Is the mere mention of support for RT in games a hype machine? I'm not for RT because of any outside pressure or marketing, I'm for it because I've tried it in games and seen what it offers.
I'm also against the recommendations people keep making that RT should be the determining factor in purchases even if it means a considerable increase in purchase prices.
I always advocated for pure rasterizer performance, without RT and FSR/DLSS, to be the determining factor. I even called out Nvidia for their lies about 4xxx performance.
From the beginning I adopted a wait and see attitude. I've waited and after three generations of cards I'm still not seeing much value. Prices are stupid and performance is still pathetic.
I've adopted the see-it-for-yourself attitude toward just about everything, and I've seen RT, and I think it is great. Not so great that I should buy a €2000 video card just for it, but if I'm already spending on an upgrade I might spend a little extra. I'm also not going to deny the graphical benefits of RT just because it runs poorly on mid-range and low-end cards. I'll try to enjoy RT whenever possible on whatever HW I have.
I'll repeat something else I've said: from everything we've seen so far it will be a minimum of two if not three more generations before RT is even halfway usable on anything but top end cards. It's possible some other breakthroughs may come along to change that but I seriously doubt it.
You can repeat it however many times you want; it will still be your arbitrary standard and not a general truth. Meanwhile, I've been enjoying multiple games with RT over the past few years.
No amount of hype or hope is going to change reality and the reality is ray tracing isn't ready.
When is it ready, when it runs on the lowest end card that supports it at 60fps? That'll never happen, and even if it did, then you'd probably say something like "Yea, but it only runs at 1080p"
 
Except PCIe 4.0 has literally no meaningful real-world benefits. Also, top of the line... four years ago. That's ancient by computer hardware standards; I was just pondering recently that I'm surprised it has lasted this long and still lets me play most games at decent settings.

Hogwarts is the only game so far I could not play with RT, because it doesn't work well under 16GB of video memory on Nvidia cards. Even at 1080p it uses something like 14GB of memory with RT.

How far can you carry that goalpost? I mean, literally demanding that brand new games run perfectly on four-year-old hardware now?
Literally, Hardware Unboxed did an entire video on it. It does. No one is moving goalposts here. You mentioned the 2080 Ti; I looked at its performance. It's not great. That's about it.
 
Someone's bitter about buying a 7900 XTX and getting 4070 Ti levels of performance with RT on. /s

On a serious note, I agreed with your sentiments for years, but unfortunately I'm seeing RT featured in more and more games. RT, when done right, is fantastic; yes, an argument can be made that a shader-based solution can be comparable, but it's not exactly the same, and in the best use cases of RT, shader-based methods just don't compare. RTX is just Nvidia's branding, but Nvidia being as big as they are, it might as well be the de facto name at this point; a lot of people refer to ray tracing as "RTX."

Either way, it's still sort of a niche feature, but the 40xx series seems to have made enough of a push to get it running well without DLSS in some cases. We're not exactly at the point yet where RT's performance hit is the same as, say, ambient occlusion or texture quality; when we get to that point, I'm sure Nvidia will be well ahead of AMD. It hurts to say that as an avid AMD guy, but Nvidia is just very much ahead in everything GPU-related, comparatively speaking. It took me three generations of AMD GPUs, and bashing on anything with the name "RTX" and/or "Nvidia", to finally come to the realization that Nvidia is busy pushing things forward while maintaining a performance advantage in a lot of cases, while AMD is literally just copying them without bringing anything really new to the table outside of chiplet designs.

I have a 4070 Ti in my PC, and honestly it's fun to look at RT from time to time. Yes, I have done side-by-sides, and Frame Generation, when available, makes playing with it on a non-issue. Hell, in CP2077 with the highest in-game settings, RT Ultra, and Psycho RT lighting, with DLSS Quality and FG I'm pulling over 120 FPS. All the eye candy, with a minimal hit in IQ and double the frame rate, without the latency? Yes please!

There is no future for raytracing until low and midrange cards can run it without unacceptable performance losses and that's not happening anytime soon. Two generations at the very minimum before that might happen and I'd say closer to three more generations.

If you don't believe me, look at the current landscape. People go nuts over the 4090 despite the price. They call it the future of RT. The very same people were saying the same thing about the nVidia 3000 series. The Radeon RX 7000 series has the same RT performance as the nVidia 3000 series, but it's somehow complete crap since it's AMD.

Well, factor in that Frame Generation in some games can be turned on without DLSS, and there you go. My 4070 Ti with FG in CP2077, without DLSS and with everything else set to the highest, pulls 85-90 FPS. Considering the jumps Nvidia has made in two generations, saying 2-3 generations until low-to-mid-range cards can run it is embellishing. I say by next gen you'll see xx60/xx70-class cards running it without any issue whatsoever. Just take a look at the performance jump from the 3xxx to the 4xxx. When have we ever seen an xx70-level card that's on par with last gen's Titan-class cards? Let alone the 4080 and 4090.

Now, take the common theme with each new Nvidia GPU release: xx70 chips are as fast as the xx80 chips from last gen, and xx80 chips are on par with, if not better than, the xx90 chips, and that's being modest considering the 4070 Ti competes with the 3090 Ti. Imagine, if you will, newer RT tech and newer Tensor tech; going by Nvidia's RT gains per generation, where do you think that will put the next gen's xx60/xx70 cards? Most likely at 4080/4090 RT performance levels, with raster performance almost equal. I'd wager the xx50-level card might be on par with this gen's 4060, or possibly 4070, in terms of RT performance. So two to three generations is stretching it; I say one gen before we start seeing RT used on lower-end hardware without issues.

Also, the 7900 XTX is on par with a 4070 Ti with RT on; that's not acceptable in any capacity. A $1000-$1100 GPU should not have the same performance as an $800-ish GPU. It's not "because it's AMD", it's because they're charging a small fortune for their hardware while offering mid-range RT performance. The 7900 XT barely performs better than a 3080 with RT; what do you think that says about the 7700/7800 cards? 3060/3070 levels of performance? It honestly paints AMD in somewhat of a negative light, considering the 4xxx series has better features overall that, to me at least, push the value in Nvidia's favor despite the cost differences.
 
I'm not against RT like you keep saying. I'm against the hype machine which has been going full force since the announcement/release of the nVidia 2000 series. I'm also against the recommendations people keep making that RT should be the determining factor in purchases even if it means a considerable increase in purchase prices.
All of this. We go from "$20-$50 is a completely different price bracket" to "if we are talking about video cards, a $200 difference is no big deal." It's mental. It's partly responsible for the insane GPU prices.
 
Literally, there are reviewers justifying a $200 premium, pricing brackets be damned, because of RT. So we can't spend an extra $50 to make sure your motherboard has PCIe 4.0, but we can spend an extra $200 because... ray tracing (and actually more if you want a decent experience). It's bonkers. I could easily provide video after video of some pretty famous reviewers justifying insane premiums because of RT alone.

I haven't seen many reviewers recommend the NV cards solely because of RT. There are a number of reasons to spend the extra $100-$200 on a 4080 over a 7900 XTX: better video encoding, lower power consumption while gaming, 5x less power consumption when not gaming with multiple monitors, AMD still struggling with drivers, FSR currently being far behind DLSS 3, etc. are all bonuses on top of the RT.

As for RT, I have a 2080 Ti and turn it on or off on a game-by-game basis. I like it on, but it doesn't bother me much if I have to turn it off. It's no different than the performance regression when moving from high to ultra in most cases, while usually giving a noticeable bump in fidelity.
 

Not "RTX", but apparently Real Ray Tracing... see what NVIDIA did here? created all this confusion about what real Ray Tracing is about

not this Gimmick on top of Rasterization

AFAIK "RTX" is proprietary to NV hardware, but this HL-1 Ray Tracing works on AMD hw too, but there's some issues being ironed out

the video even states "path tracing" in real-time... so this is not RTX from NV that is being referenced in the title/subject of the thread ... this Path Tracing does look Amazing though
 
Spending an extra 500 bucks or more on a card whose framerate drops in half with RTX eye candy sprinkles is never going to happen with me. :)
I remember when MSAA came out; anything above 2x MSAA made a lot of games unplayable. As cards get faster, experiences improve. MSAA runs smoothly on any card today.
There's a cycle to this and acceptance of RT will be here soon once consoles make use of it.

Give it time.
 
Aren't they just sandbagging the rasterization to make the RTX glorification and selling point seem all that much grander?

Just like with anything such as statistics and charts, anybody can force an apparent argument by making A look like garbage and B a polished diamond.
Willing to bet there are rasterized games that look better than RTX-on Wolfenstein Youngblood.
I love this.

Ray tracing doesn't make a difference.

vs.

Ray tracing looks so much better than rasterization that they must be nerfing rasterization to give RT a boost.
“RTX” junk is actually stagnating real hw accelerated pure ray tracing imho. To me there’s a huge difference between RTX layered on top of Rasterization and native Path Tracing through and through.
You're right. We need to ditch rasterization completely. The sooner, the better.
 