Are we going to see current graphical effects held back by RTX?

Mut1ny ([H]ard|Gawd, joined Apr 4, 2013, 1,854 messages)
https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070/

Case in point.

If you scroll down just a bit you can see a comparison of "RTX" on and off... notice that with it on you get the usual ray tracing stuff they're promoting: light bounce, ambient lighting, etc.

My question pertains to this comparison, really. Seeing as "RTX off" basically eliminates these effects, I find it disturbing, considering that we ALREADY have these effects now using traditional methods, and it's really irritating to see them not in the game AT ALL.

So, instead of non-RTX users getting a less accurate global illumination technique, we now apparently get NONE.

Will this be the trend going forward? Are we all going to be punished now because we don't have the absolute latest and unnecessary GPU for effects we already have?

Oh you want ambient occlusion? Sorry, you don't have ray tracing AO so you get none.
 

It's certainly a possibility. If you want the best graphical experience, the price point just got much higher.
 
Maybe. I guess that's what I don't like. Previous generations had exclusive features, but mainly due to architectural changes... but with shit like this we already have systems and methods in place to do this stuff, so to just completely eliminate them over a single feature from a single line of GPUs seems stupid and cheap as fuck.
 
Well we just don't know enough yet. Nvidia could be doing deals with these developers under the table so they can have a leg up on the competition.
 
How can anybody answer this? It's completely up to the developer how they want to structure their game.

Yup. We're going to see a transition period, and we don't know how smooth it will be.

I'm just glad that they got it working this well at all.

Also, I don't see the future being so expensive. Hardware will start to trend toward ray tracing, but that's easier than all of the little raster tricks that have to be accelerated in hardware. That means that it's possibly going to get easier for AMD and even Intel to catch up.
 
How can anybody answer this? It's completely up to the developer how they want to structure their game.

True, but considering how graphics have always been a major selling point of their engines and games, I highly doubt they wouldn't have some low-precision global illumination for a game like this and would just skip it altogether in favor of a feature that almost no one will have access to. Doesn't make sense based on their history.
 
They did that nonsense with PhysX. They made it look like a flag couldn't blow around, or sparks couldn't be produced, or other stuff, unless PhysX was enabled. I have no doubt that they'll hold back effects for regular users just to make ray tracing look better. And I'm worried that their optimizations will be just as piss-poor as they were with PhysX. Borderlands: The Pre-Sequel can't even maintain 50 FPS in certain PhysX-heavy areas, even on a 1080 Ti at 1080p. Yet in those spots GPU usage actually plummets, showing just how shitty the implementation is.
 
True, but considering how graphics have always been a major selling point of their engines and games, I highly doubt they wouldn't have some low-precision global illumination for a game like this and would just skip it altogether in favor of a feature that almost no one will have access to. Doesn't make sense based on their history.


My point is they have full control of that on a developer level.
 
https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070/

Case in point.

If you scroll down just a bit you can see a comparison of "RTX" on and off... notice that with it on you get the usual ray tracing stuff they're promoting: light bounce, ambient lighting, etc.

I seriously had to scroll down to see which was which. I thought I knew which one had RTX on, but had to make sure. I wouldn't have put money on my guess!
 
Developers and artists love ray tracing because it's much less of a headache: no more baking lightmaps, no more tweaking a million parameters for every scene to make it look just right. So in addition to nVidia marketing money, they may selfishly drop support for fallback methods out of sheer laziness and cost savings.
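
To put the workflow difference in toy form, here's a purely illustrative sketch: a made-up 2D scene with one circular occluder, and an 8x8 grid standing in for a lightmap (nothing here is from a real engine; every name and number is invented for the example). It contrasts bake-then-lookup with tracing the query against the live scene:

```cpp
#include <array>
#include <cstdio>

struct Vec2 { float x, y; };

// Stand-in "shadow ray": is the straight line from p to the light blocked by
// a circular occluder at (0.5, 0.5)? Crude segment sampling, fine for a toy.
bool LightVisible(Vec2 p, Vec2 light) {
    const float cx = 0.5f, cy = 0.5f, r = 0.2f;
    for (int i = 1; i < 64; ++i) {
        float t = i / 64.0f;
        float x = p.x + (light.x - p.x) * t;
        float y = p.y + (light.y - p.y) * t;
        if ((x - cx) * (x - cx) + (y - cy) * (y - cy) < r * r)
            return false;  // sample point sits inside the occluder: blocked
    }
    return true;
}

int main() {
    const Vec2 light{0.9f, 0.9f};
    constexpr int N = 8;                          // tiny "lightmap" resolution
    std::array<std::array<int, N>, N> lightmap{}; // 1 = lit, 0 = in shadow

    // "Bake" step: raster-style pipelines run this offline, per level, and
    // must redo it whenever an artist nudges the light or moves an object.
    for (int j = 0; j < N; ++j)
        for (int i = 0; i < N; ++i)
            lightmap[j][i] = LightVisible({(i + 0.5f) / N, (j + 0.5f) / N}, light);

    Vec2 q{2.5f / N, 2.5f / N};  // one sample point, queried both ways

    // Raster style at runtime: cheap lookup, but stale if the scene changed.
    std::printf("baked lookup: %d\n", lightmap[2][2]);

    // Ray-traced style at runtime: ask the live scene; nothing to re-bake.
    std::printf("traced query: %d\n", LightVisible(q, light));
}
```

The baked grid is fast to read but has to be regenerated whenever anything moves; the traced answer never goes stale. Scale that up to real bakes that take minutes or hours per iteration and the appeal to artists is obvious.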

Having said that, a lot of games today support multiple types of AA and AO, etc., so I don't see why that would change now just because a few cards support RT.
 
PC devs will probably spend less time polishing non-ray-traced lighting options, but most big games are also on consoles that don't have RTX.

So yeah, I think non-ray-traced lighting will start to be less polished, and as RTX becomes mainstream it will probably look much worse than it does now.

But really, it's the same way medium graphics settings don't look as good as they could. They don't tweak the mid- and low-resolution textures, lower-polygon models, etc. nearly as much as they do at the highest settings.
 
So yeah, I think non-ray-traced lighting will start to be less polished, and as RTX becomes mainstream it will probably look much worse than it does now.

I want to acknowledge the possibility of this, while at the same time I want to point out that we're nearly at a 'dead end' when it comes to raster graphics. Note that successive hardware generations have pushed resolutions and framerates up while hardware up to five years old can handle 1080p60; game detail and hardware requirements just haven't kept up with the pace of hardware innovation.

And generally speaking current game engines are fully-featured and their implementations are well known.

So while 'less polish' is perhaps probable, I think that on average it likely won't be that bad. Much of the work needed is still in terms of textures and geometry, the 'art', and all of that is still needed for ray tracing.
 
We are more held back by art direction and console hardware. I think RTX features will be a niche only applied by developers who mainly develop for PC.
 
How can anybody answer this? It's completely up to the developer how they want to structure their game.

Well, there are two development paths:
1. Carefully crafted scenes in which you are limited in where you can go for the sake of appearance (traditional techniques). The illusion falls apart if you try to do something like move a light source, or move yourself out of the intended frame.

2. Implement RTX, which will allow you to go anywhere and render it correctly.



Do you think they will do both? It raises some interesting questions. This is akin to PhysX on CPUs vs. NVIDIA GPUs: no one bothered running it on the CPU because it was too slow, thanks to the crippling NVIDIA applied to PhysX on the CPU (intentionally unoptimized code on a single thread).

NVIDIA claims PhysX is still at the heart of a number of engines, but looking at those games, NVIDIA really doesn't hold a performance benefit over AMD with a properly matched card (1080 vs Vega 64).
 
I want to acknowledge the possibility of this, while at the same time I want to point out that we're nearly at a 'dead end' when it comes to raster graphics. Note that successive hardware generations have pushed resolutions and framerates up while hardware up to five years old can handle 1080p60; game detail and hardware requirements just haven't kept up with the pace of hardware innovation.

And generally speaking current game engines are fully-featured and their implementations are well known.

So while 'less polish' is perhaps probable, I think that on average it likely won't be that bad. Much of the work needed is still in terms of textures and geometry, the 'art', and all of that is still needed for ray tracing.

There's actually a lot of tweaking artists do with lighting. They adjust position, color, brightness, etc. to make it look nicer, add hidden light sources so areas aren't completely black, and do other things just so the game is playable. All of that won't matter if you're using ray tracing, which will have its own fine-tuning.

That's actually a lot of work that developers will be trying to cut down on.
 
The big question is how well the non-accelerated ray tracing in MS's DXR API (NVIDIA's using a DirectX extension this time, not creating a proprietary API) looks and performs vs. traditional lighting tech. Ideally it'd give at least comparable performance, so that games targeting the API could both deliver current levels of lighting on AMD and older NVIDIA cards at low/medium lighting settings and better-quality lighting via the RTX cores at high/ultra.
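
For what it's worth, DXR exposes hardware support as a plain capability query, so branching between the two paths is at least mechanically simple. A minimal sketch (assuming a valid ID3D12Device; the engine hooks in the usage comment are hypothetical):

```cpp
// Query D3D12 for DXR (raytracing) support and report whether the
// hardware/driver path exists at all.
#include <windows.h>
#include <d3d12.h>

bool SupportsHardwareRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // D3D12_RAYTRACING_TIER_NOT_SUPPORTED means no DXR path on this GPU/driver.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Usage sketch (hypothetical engine functions):
//   if (SupportsHardwareRaytracing(device))
//       EnableRayTracedLighting();   // high/ultra path
//   else
//       EnableRasterizedLighting();  // traditional low/medium path
```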
 
That's actually a lot of work that developers will be trying to cut down on.

I do see that, and that is pretty exciting in and of itself!

I also wonder if ray tracing and a bit of machine learning will help them 'back port' a lot of this stuff for raster engines.

Ideally it'd give at least comparable performance, so that games targeting the API could both deliver current levels of lighting on AMD and older NVIDIA cards at low/medium lighting settings and better-quality lighting via the RTX cores at high/ultra.

You need the RT hardware. Period.

Running it 'in software' so to speak, on AMD hardware or otherwise, will look the same while running like a turd in tar. Think seconds per frame instead of frames per second.

No RT hardware, no realtime RT, just turn it off.
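
A rough back-of-envelope shows the gap (assuming one primary plus one shadow ray per pixel at 1080p60, and a generous 10^7 rays/s for a software path tracer; both are round illustrative numbers, not benchmarks):

\[
1920 \times 1080 \times 2\ \text{rays} \times 60\ \text{fps} \approx 2.5 \times 10^{8}\ \text{rays/s needed}
\]
\[
\text{in software: } \frac{1920 \times 1080 \times 2\ \text{rays}}{10^{7}\ \text{rays/s}} \approx 0.4\ \text{s per frame} \approx 2.4\ \text{fps}
\]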
 
No, graphics and by extension their effects are held back by console hardware, not by desktops.
 
Current graphical effects should just go and die.
I would rather they concentrated on ray tracing, as I hate fake reflections that hurt my visual cortex.
 
At some point, technology needs to change and move on. The goal is good-looking games and high performance; if Nvidia can revolutionize how that gets accomplished, then good for them. Just pounding out better and better rasterization may be the best approach for old games, but it isn't the future.

Eventually new tech like ray tracing will be the new normal. Games using current lighting technologies will be retro.

DLSS has the opportunity to provide anti-aliasing-like effects without the massive performance penalty.

People used to think cordless tools were the devil as well... Change is hard for people.
 
Well, current effects won't go back in time. Those tools are available to any game that wants to use them.

That said, if you mean several years from now I suppose it will depend on how quickly ray tracing becomes the standard. That's why "the quicker the better." Make backward compatibility stop right about where we are now without having to continue pushing more performance in rasterization tricks in new hardware. Put all that R&D into faster ray tracing.

We can still have, for the next few years at least, cards that will play all existing games like a beast. By the time video cards start becoming focused on ray tracing only, emulation of the previous rasterization pipeline effects should be more than possible with faster newer hardware made at some point in the future.

The problem will be if we get stuck right HERE, where video cards continually need to be the best of both worlds at a hardware level. The good gaming cards can't remain $800-1200 forever or the market will shrink and fail.

My prediction is a rather quick and brutal shift to ray tracing over the next 2 gens of video cards. We will look back 3 years from now and wonder at how rasterization died so quickly for EVERY new AAA title.

Once decent hybrid ray tracing can be done on a $200-250 video card, the change will happen almost overnight. And at that point, yes, non-RT graphics modes will not be worked on as heavily. But then again, once everyone can buy an RT-capable card at $250 or less, it won't matter anymore. Because we will all be happy.
 
With generation-1 flagship hardware struggling to do partial ray tracing at 1080p60 despite much larger than normal GPU die sizes, I think it's going to take a lot more than just 2 generations to go ray-tracing-first. That's going to require flagship cards to be able to ray trace 4K60 if not 4K120, which'll require 4x to 8x as much silicon dedicated to ray tracing just to do ray-traced lighting with everything else still being rasterized. That in turn means we're probably looking at no sooner than 5nm and quite possibly 3nm. Ray tracing everything instead of rasterizing will take even longer.
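
(The 4x to 8x figure is just pixel-rate arithmetic, assuming ray-tracing cost scales roughly linearly with pixels drawn per second:)

\[
\frac{3840 \times 2160 \times 60}{1920 \times 1080 \times 60} = 4,
\qquad
\frac{3840 \times 2160 \times 120}{1920 \times 1080 \times 60} = 8
\]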
 
But then again, once everyone can buy an RT-capable card at $250 or less, it won't matter anymore. Because we will all be happy.

I guess I should keep this 1080Ti for posterity then, like the Voodoo2 I should have kept....

:asshat:
 
They did that nonsense with PhysX. They made it look like a flag couldn't blow around, or sparks couldn't be produced, or other stuff, unless PhysX was enabled. I have no doubt that they'll hold back effects for regular users just to make ray tracing look better. And I'm worried that their optimizations will be just as piss-poor as they were with PhysX. Borderlands: The Pre-Sequel can't even maintain 50 FPS in certain PhysX-heavy areas, even on a 1080 Ti at 1080p. Yet in those spots GPU usage actually plummets, showing just how shitty the implementation is.

PhysX wasn't advertised as "flags can blow around and sparks can be produced". It was "flags and cloth react realistically with the environment and can be accurately damage-modeled", followed by "a bajillion more sparks". Both of these were actually possible but weren't seen that often because of the small number of PhysX games released.
 
I guess I should keep this 1080Ti for posterity then, like the Voodoo2 I should have kept....
The 1080 Ti will never reach the level of timeless nostalgia the Voodoo2 has.
Keeping a GTX 1080 Ti is like keeping a GTX 580 :ROFLMAO:
 
It's the new HairWorks: it will even lower performance on the new cards, but as long as the other cards take a bigger hit, it's worth it.
 
...not at all?
Why not?

Each time a genuinely new generation of cards comes out (with new effects, let alone completely new image-rendering methods...) the old cards become outdated.

The 1080Ti is not outdated today... but in a few days... :ROFLMAO:

There is nothing about the 1080Ti that is significant from a historical point of view. It is as remarkable as the 980Ti or 780Ti...
 
I guess you don't either then?

Whole article series about how cards hold up over time, and the jump in performance between generations?

:ROFLMAO:


So what makes you think your fishy lure will work now if it didn't on the last try? Keep casting.

:asshat:
 