DukenukemX
[H]F Junkie
- Joined: Jan 30, 2005
- Messages: 9,128
Real-time 3D was a thing before 1994; just look at the Super Nintendo. It was Sony who flipped the gaming industry on its head by showing that you could do it better. Ray tracing was something everyone wanted, but it required hardware that wasn't viable. Here are three PS3s ray tracing a car at what looks like 3 fps.

If we go back to 1994, telling people that we'd get close to real-time rendering with OptiX and a 5090 in a Blender viewport would sound like quite something.
View: https://youtu.be/oLte5f34ya8?si=Q3T_JAOck_5zRID3
This is why people are noticing that 9-year-old games look better than today's games: we're seeing developers give up on image quality.

In other words, for AAA games in the last 5 years or so, if you are not using RT you are getting a subpar product.
View: https://youtu.be/mam6by5JdS8?si=VsMqwXT3yj71HOM4
This is why, when the RTX 20 series was released, I said we would all need to buy graphics cards with ray tracing at some point, because that's the direction the industry is going. Developers aren't going to keep spending time on a toggle that lets you turn ray tracing on and off, because that's a lot of work. They'll just say you need a GPU with ray tracing.

And we have even seen a move toward titles that require RT in the last few months. Indiana Jones and the Great Circle is one of them. There was one more too, but I forget what it was now.
These are games where there is no option to run without RT, and they do not run on GPUs that do not support RT.
This IMHO is the future. Not because I want it to be, mind you, but because development is easier and cheaper for game developers. Raster graphics require so many manual workarounds to get things looking good, things that RT accomplishes by just dropping in light sources and cameras.
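To make the "manual workarounds" point concrete, here is a rough, engine-agnostic sketch in plain Python. The scene, the Sphere class, and the traced_shadow/bake_shadow_grid helpers are made up for illustration, not any real engine or graphics API: the raster-era approach bakes visibility into a lookup built offline and hopes its resolution and placement still match the scene, while the ray-traced approach just casts a shadow ray toward the light for any point at runtime and skips the bake entirely.

```python
# Hypothetical illustration only: contrasting a baked-visibility workaround
# (raster era) with an on-demand shadow ray (RT era). Not a real engine API.
import math
from dataclasses import dataclass


@dataclass
class Sphere:
    center: tuple
    radius: float


def ray_hits_sphere(origin, direction, sphere):
    """True if a ray from `origin` along the normalized `direction` hits the sphere."""
    ox, oy, oz = (origin[i] - sphere.center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - sphere.radius ** 2
    disc = b * b - 4.0 * c  # quadratic discriminant; `a` is 1 for a unit direction
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-4  # nearest hit must lie in front of the ray origin


def traced_shadow(point, light_pos, occluders):
    """RT-style: cast a shadow ray toward the light, on demand, for any point."""
    to_light = tuple(lc - pc for lc, pc in zip(light_pos, point))
    length = math.sqrt(sum(c * c for c in to_light))
    direction = tuple(c / length for c in to_light)
    return 0.0 if any(ray_hits_sphere(point, direction, s) for s in occluders) else 1.0


def bake_shadow_grid(occluders, light_pos, xs, zs, y=0.0):
    """Raster-era workaround: precompute visibility on a grid chosen ahead of time.
    Resolution, placement, and staleness after scene edits are all manual work."""
    return {(x, z): traced_shadow((x, y, z), light_pos, occluders) for x in xs for z in zs}


if __name__ == "__main__":
    occluders = [Sphere(center=(0.0, 2.0, 0.0), radius=1.0)]
    light = (0.0, 5.0, 0.0)

    # Baked path: coarse lookup decided offline, queried at runtime.
    lightmap = bake_shadow_grid(occluders, light, xs=[-2.0, 0.0, 2.0], zs=[-2.0, 0.0, 2.0])
    print("baked sample at (0, 0):", lightmap[(0.0, 0.0)])  # 0.0 -> in shadow

    # Traced path: same answer, computed directly for any point, no bake step.
    print("traced sample at (0.1, 0, 0.1):", traced_shadow((0.1, 0.0, 0.1), light, occluders))
```

The point of the sketch: whenever anything moves, the baked table has to be rebuilt and re-tuned by hand, whereas the traced version keeps answering the same visibility question correctly with no extra authoring work.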
The almighty Nvidia has spoken, and you must use ray tracing. Your games may look worse, but that's a risk they're willing to take.

I think it sucks, because rasterization can look 99% as good as RT at a fraction of the GPU render load if a dev really wants it to, but the truth is that whatever is easier for developers usually wins.
I think Nvidia pushed ray tracing because, when the crypto bubble popped in 2017, Nvidia didn't otherwise have a reason for people to buy its latest GPUs. They brought out hardware ray tracing to push gamers into buying them.

Let's be real about RT. When Nvidia dropped it with the 20 series, it was never a feature for the gamer. They pushed RT because:
1.) They knew it would make developers' lives easier, and they could get huge adoption relatively quickly.
2.) They knew they had an advantage over the competition with RT, and that if game devs started using it, even reluctant gamers would feel forced to buy in or get a subpar experience.
If you look at Steam's Hardware Survey, you can see that GTX cards dominated it until the past couple of years. Now it's dominated by RTX, mainly low-end cards: the RTX 3060, 4060, 2060, and their laptop variants. The 3070 and 4070 are on the list, but further down. If you want to know why people are turning on DLSS, it's because they own an RTX xx60-class GPU and need DLSS to get a playable frame rate with ray tracing. Because everyone who bought an RTX card is certainly turning on ray tracing in games, just to see what the meme is all about.

So it was a way to make devs happy, and a way to stick it to AMD. And that has pretty much come true. All reports are that AMD has mostly caught up with the new 90 series Radeons, at least in the mid range, and going forward, devs that were titillated by the massive cost and time savings they could get out of RT, but feared implementing it because they didn't want to alienate mainstream gamers, are going to have less and less reason not to push RT and make their titles "mandatory RT".
