Will Ray Tracing become another PhysX?

Will there be enough support and game titles carrying the ray tracing feature, or will it become another PhysX and die off?
 
No. Nvidia just has dedicated ray tracing cores, and it's still using Microsoft's DXR tech from what I read. So these games shouldn't have any problem running on AMD hardware as long as the cards are capable, whether they do it via dedicated ray tracing cores or use other parts of the GPU to do the ray tracing. I think AMD's GCN architecture had a lot of throughput that just wasn't being fully utilized for gaming, so let's see what they do. I'm fairly confident the next-gen architecture they started working on a few years back, which is supposed to launch in 2020, should have some nice improvements in rasterization. GCN hasn't had any major improvements in ages; that should be resolved in 2020.
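For what it's worth, DXR capability is advertised through the standard D3D12 feature-support query, so any vendor's driver can expose it. A minimal sketch of the check (this uses the real D3D12_FEATURE_D3D12_OPTIONS5 query; error handling trimmed):

```cpp
#include <d3d12.h>

// Ask the D3D12 runtime whether this device exposes DXR at all.
// The tier enum is vendor-neutral: an AMD or Intel driver that
// implements DXR reports it the same way Turing does.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```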
 
So here's the thing.

DICE is not enabling non-RTX owners to use RT in BFV. Even though DXR is completely fine being used on any DX12-compatible GPU, DICE is using an RTX-specific codepath. This means if you own a 1080Ti or Titan V, you will not be able to use RT, even if you have the power to run it (albeit much slower than RTX cards). This also means no direct comparisons are possible between RTX cards and existing GTX cards. I'm not sure if this is the case with ALL the RTX games demo'd so far, but DICE has come out and said this specifically.

So Nvidia is treating this like PhysX. It's exclusive to specific products, even though it's more than capable of running on other hardware.

Source
 
So here's the thing.

DICE is not enabling non-RTX owners to use RT in BFV. Even though DXR is completely fine being used on any DX12-compatible GPU, DICE is using an RTX-specific codepath. This means if you own a 1080Ti or Titan V, you will not be able to use RT, even if you have the power to run it (albeit much slower than RTX cards). This also means no direct comparisons are possible between RTX cards and existing GTX cards. I'm not sure if this is the case with ALL the RTX games demo'd so far, but DICE has come out and said this specifically.

So Nvidia is treating this like PhysX. It's exclusive to specific products, even though it's more than capable of running on other hardware.

Source

It would have been nice to have it on... even if it ran at 5 FPS, lol. This really matters if AMD or Intel ever has an actual product that can run RT over 5-10 FPS. This is functionally no different, but possibly concerning. I actually figured they went RT/DLSS because it's basically proprietary and no one has a chance in hell to compete. Kinda like Mantle and AMD: technically open, but if only AMD's hardware is optimized... well...
 
Business. The selling point of the new cards is RT, hence why it's gimped for older cards.
 
No, it won't... but ray tracing, in my eyes, won't really take off for another 2-3 generations of cards. Remember, developers have to add it to the game. It will eventually be in a lot of games; it will just take time.

IMO you are buying the RTX for performance and not ray tracing. You are basically becoming an early adopter of a technology that isn't 100% ready yet. But IMO it will be one day.
 
No, it won't... but ray tracing, in my eyes, won't really take off for another 2-3 generations of cards. Remember, developers have to add it to the game. It will eventually be in a lot of games; it will just take time.

IMO you are buying the RTX for performance and not ray tracing. You are basically becoming an early adopter of a technology that isn't 100% ready yet. But IMO it will be one day.

Yep! I agree. I've thought this since day one. If you are buying this card for ray tracing, it probably won't be worth the money. It's like beta testing at 1080p, lol. Yeah, in a few generations we will see it take off, in both performance and image quality.
 
Yep! I agree. I've thought this since day one. If you are buying this card for ray tracing, it probably won't be worth the money. It's like beta testing at 1080p, lol. Yeah, in a few generations we will see it take off, in both performance and image quality.

DICE is working on rendering RT at a different resolution than everything else, which would fix any issues like that. I've also heard they have it up to 40-50 fps at 1440p.
 
DICE is working on rendering RT at a different resolution than everything else, which would fix any issues like that. I've also heard they have it up to 40-50 fps at 1440p.

That's my whole point. When you have to find workarounds to make it run at a decent speed, it means the hardware isn't fast enough, but it's a good beginning. No matter how you look at it, that performance isn't acceptable if you are buying it just for RTX. 1200 bucks to play at 60 fps or less at 1080p is not worth the money. Maybe second- or third-gen hardware will do the trick. But if you are solely buying the card to jump on the RTX bandwagon, it's not worth the performance you are going to get. I'm not telling anyone else how to spend their money, but if you are not buying a 1200-dollar card to game at high fps and high resolution, then obviously something is wrong there, lol.
 
That's my whole point. When you have to find workarounds to make it run at a decent speed, it means the hardware isn't fast enough, but it's a good beginning. No matter how you look at it, that performance isn't acceptable if you are buying it just for RTX. 1200 bucks to play at 60 fps or less at 1080p is not worth the money. Maybe second- or third-gen hardware will do the trick. But if you are solely buying the card to jump on the RTX bandwagon, it's not worth the performance you are going to get. I'm not telling anyone else how to spend their money, but if you are not buying a 1200-dollar card to game at high fps and high resolution, then obviously something is wrong there, lol.

If 1440p is hitting 40-50 fps, 1080p should be around 70-85ish, and they think they can get it 30% better.

If you can do 4K with 1080p RT, it's still a fuck ton better than the alternative.

Hell, if you don't like RT, there's still the 30-60% higher FPS, and DLSS adds another 50%, as far as we know anyway.
 
If 1440p is hitting 40-50 fps, 1080p should be around 70-85ish, and they think they can get it 30% better.

You can do 4K with 1080p RT and it's still a fuck ton better than the alternative.

Hell, if you don't like RT, there's still the 30-60% higher FPS, and DLSS adds another 50%, as far as we know anyway.

Only time will tell!
 
Yep, full-on ray tracing in real time has been sort of the holy grail of graphics rendering. I remember using POV-Ray back in the good old days (early '90s) to render SVGA scenes where it would take the better part of a day or longer just to render a single frame. One of the issues with ray tracing is that the more complex the scene, the longer it takes to render at a given resolution... and upping the resolution dramatically increases the rendering time. The really big draw, though, is ultra-realistic graphics without having to rely on any tricks to get there, as ray tracing is literally modeling the light itself and letting physics do its thing as photons bounce all over the place. I'd be shocked if we see 4K full-on real-time ray tracing in consumer-grade GPUs within the next 8 years. What Nvidia is offering in the RTX is sort of supplemental ray tracing support glued onto traditional raster rendering. It's a start though.
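To make that scaling concrete, here's a toy sketch of the core loop, nothing from POV-Ray itself, with a hypothetical Vec3/HitsSphere pair: one primary ray per pixel, each tested against the scene, and every reflective bounce multiplies the work from there.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray/sphere test: substitute o + t*d into |p - c|^2 = r^2 and
// check the discriminant of the resulting quadratic in t.
static bool HitsSphere(Vec3 o, Vec3 d, Vec3 c, float r)
{
    Vec3 oc = { o.x - c.x, o.y - c.y, o.z - c.z };
    float A = Dot(d, d);
    float B = 2.0f * Dot(oc, d);
    float C = Dot(oc, oc) - r * r;
    return B * B - 4.0f * A * C >= 0.0f;
}

int main()
{
    const int w = 640, h = 480;                    // SVGA-era framebuffer
    const Vec3 eye = {0, 0, 0}, sphere = {0, 0, -5};
    long hits = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            // Pinhole camera: map the pixel to a view direction.
            Vec3 dir = { (x - w / 2.0f) / w, (y - h / 2.0f) / h, -1.0f };
            if (HitsSphere(eye, dir, sphere, 1.0f))
                ++hits;
            // A real tracer would now spawn shadow, reflection, and
            // refraction rays per hit -- that's where days-per-frame
            // came from in the '90s.
        }
    std::printf("%d x %d = %d primary rays, %ld hits\n", w, h, w * h, hits);
}
```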
 
Well PhysX itself has been pretty successful. Both Unity and Unreal use PhysX running on the CPU, and it's been used in hundreds, maybe thousands, of games.

GPU PhysX, on the other hand, was locked to Nvidia and many developers were not keen on catering only to a portion of the market (even if a majority).

If real-time ray-tracing is forever locked to certain high-end Nvidia cards, then I would agree it won't take off. But if AMD (and Intel) can implement their version of RTX, and it's likely they will, then this can become a really big development.
 
It's ironic because I think the closest comparison we can draw between RTX tech and the past is when Nvidia released the 8800GTX. It brought DX10 capability, and (once Nvidia bought Ageia and ported PhysX to CUDA) it was the first generation to run hardware PhysX. It also was the biggest architecture change in years, moving to CUDA cores. That card later established itself as legendary for its performance and capability. Will the RTX 2080Ti do the same?
 
It's ironic because I think the closest comparison we can draw between RTX tech and the past is when Nvidia released the 8800GTX. It brought DX10 capability, and (once Nvidia bought Ageia and ported PhysX to CUDA) it was the first generation to run hardware PhysX. It also was the biggest architecture change in years, moving to CUDA cores. That card later established itself as legendary for its performance and capability. Will the RTX 2080Ti do the same?

It sure will do the same when it comes to the price tag. Legendary indeed!
 
I'm still confused as to how people can't immediately see the results of this. Whether or not THIS generation of cards can do it well enough, it's absolutely staring us in the face like an oncoming freight train.

It KILLS me when people say "I can't see the difference." Holy !@#$!@#$. It's plain as day when virtually the whole scene has lighting, shadows and reflections that are proper.

Anyone who hasn't seen it really needs to watch this video, not the BF5 half-assed early implementation:



That's where we are really headed with this when an entire game engine leverages everything the hybrid ray tracing can do.

The demo they did for the game Control is much more impressive than most of them, but I don't care for a lot of their art, which holds back much of the wow factor; still, it's a better example than either BF5 or Tomb Raider.

One way or another, we NEED to push the ray tracing in games now. It's time. It's PAST time. The graphics cards are becoming insanely powerful but the "bag of tricks" to mimic real lighting has grown to obscene proportions. We have kind of pushed rasterization way past where anyone thought it would go. A better and simpler architecture is needed.

Frankly, if there is ONE silver lining to Nvidia being a near-monopoly monster right now it's that they CAN afford to throw away an entire generation of cards on price and finally force the shift to some form of ray tracing.

I'm kind of begging everyone at this point. Support the shift now while it's possible. Separate your dislike of their prices from the GOOD that pushing ray tracing will do for all of us in the immediate future. Let's make this switch now rather than later.
 
I'm still confused as to how people can't immediately see the results of this. Whether or not THIS generation of cards can do it well enough, it's absolutely staring us in the face like an oncoming freight train.

It KILLS me when people say "I can't see the difference." Holy !@#$!@#$. It's plain as day when virtually the whole scene has lighting, shadows and reflections that are proper.

Anyone who hasn't seen it really needs to watch this video, not the BF5 half-assed early implementation:



That's where we are really headed with this when an entire game engine leverages everything the hybrid ray tracing can do.

The demo they did for the game Control is much more impressive than most of them, but I don't care for a lot of their art, which holds back much of the wow factor; still, it's a better example than either BF5 or Tomb Raider.

One way or another, we NEED to push the ray tracing in games now. It's time. It's PAST time. The graphics cards are becoming insanely powerful but the "bag of tricks" to mimic real lighting has grown to obscene proportions. We have kind of pushed rasterization way past where anyone thought it would go. A better and simpler architecture is needed.

Frankly, if there is ONE silver lining to Nvidia being a near-monopoly monster right now it's that they CAN afford to throw away an entire generation of cards on price and finally force the shift to some form of ray tracing.

I'm kind of begging everyone at this point. Support the shift now while it's possible. Separate your dislike of their prices from the GOOD that pushing ray tracing will do for all of us in the immediate future. Let's make this switch now rather than later.


Nah, sorry broseph. That Epic video was showing worst-case-scenario raster graphics versus best-case-scenario RT. You'd be surprised how well devs can 'fake' RT using techniques that require 1/100th of the render time. That's why RT has always been avoided: the 'faked' raster graphics can render 100 frames that look 90% as good as a single RT frame. Which means you can pack in 90x more raster detail and resolution, showing a much more complex and objectively better-looking scene, and STILL render faster than RT. If Nvidia decided to get rid of the Tensor cores and the RT cores and fill that die space with JUST CUDA cores, you would have a GPU that could be 2-3x faster than a 1080Ti. Which means you could pack in more AA, more geometry detail, deeper, more complex shaders, better, more effective post-processing, more detailed voxel GI and AO, higher-detail cubemaps, and STILL be faster than a 1080Ti.

But no, instead we get accurate reflections. Accurate reflections running slower than a 1080Ti.
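Worth noting that the raster "fake" and the traced version start from the same mirror-reflection math, r = d - 2(d.n)n; the difference is what happens with that vector afterward. A sketch with hypothetical helper names, just to illustrate the split:

```cpp
struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mirror reflection of direction d about unit normal n: r = d - 2(d.n)n.
// Both the cheap cubemap path and real ray tracing compute this vector.
Vec3 Reflect(Vec3 d, Vec3 n)
{
    float k = 2.0f * Dot(d, n);
    return { d.x - k * n.x, d.y - k * n.y, d.z - k * n.z };
}

// Raster path: one texture fetch into a prebaked cubemap. Nearly free,
// but the cubemap was captured from a single point, so anything that
// moves (including the player) is missing from the reflection.
//   color = SampleCubemap(envMap, Reflect(viewDir, normal));     // hypothetical
//
// RT path: trace the same vector into the live scene. Correct for
// dynamic objects, but costs a full BVH traversal per reflective pixel.
//   color = TraceRay(scene, hitPoint, Reflect(viewDir, normal)); // hypothetical
```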
 
Yeah, the 8800GTX was the game changer when it brought CUDA cores. I know that PhysX was released with high expectations, but sadly it just kinda phased out. Seems like ray tracing will be the same to me as PhysX, because most FPS players will turn it off in game to get higher fps. But if all next-gen games are released with RT, it will become a staple in the industry: if you didn't have an RTX or RT-capable GPU, your frames would be lower and your settings worse. Then I believe it would change the world of PC gaming.
 
Yeah, the 8800GTX was the game changer when it brought CUDA cores. I know that PhysX was released with high expectations, but sadly it just kinda phased out. Seems like ray tracing will be the same to me as PhysX, because most FPS players will turn it off in game to get higher fps. But if all next-gen games are released with RT, it will become a staple in the industry: if you didn't have an RTX or RT-capable GPU, your frames would be lower and your settings worse. Then I believe it would change the world of PC gaming.
Yeah, that is the logical approach to the whole debate. But the question is whether this is another feature that will be abused by Nvidia. If tomorrow AMD had better ray tracing cores, how would Nvidia deal with it? The same way they did with their black-box GameWorks and PhysX solutions, where non-Nvidia hardware was artificially forced down the slowest code paths.

Since AMD does not have the mindshare, there is virtually no incentive for AMD to provide special hardware ray tracing features.
 
Will there be enough support and game titles carrying the ray tracing feature, or will it become another PhysX and die off?
Considering PhysX is more common in games these days than Havok, I would say you are confused.
So here's the thing.

DICE is not enabling non-RTX owners to use RT in BFV. Even though DXR is completely fine being used on any DX12-compatible GPU, DICE is using an RTX-specific codepath. This means if you own a 1080Ti or Titan V, you will not be able to use RT, even if you have the power to run it (albeit much slower than RTX cards). This also means no direct comparisons are possible between RTX cards and existing GTX cards. I'm not sure if this is the case with ALL the RTX games demo'd so far, but DICE has come out and said this specifically.

So Nvidia is treating this like PhysX. It's exclusive to specific products, even though it's more than capable of running on other hardware.

Source
That's not what they said. Here are the exact quotes:

GeForce RTX owners should get the option to turn ray tracing off. However, there is no DXR (DirectX Ray Tracing) fallback path for emulating the technology in software on non-RTX graphics cards. And when AMD comes up with its own DXR-capable GPU, DICE will need to go back and re-tune Battlefield V to support it.

Holmquist clarifies, “…we only talk with DXR. Because we have been running only Nvidia hardware, we know that we have optimized for that hardware. We’re also using certain features in the compiler with intrinsics, so there is a dependency. That can be resolved as we get hardware from another potential manufacturer. But as we tune for a specific piece of hardware, dependencies do start to go in, and we’d need another piece of hardware in order to re-tune.”


This is no different than standard DX12. You can have a generic code path, but it is going to run like shit for everyone if they do that. I'm sure they also don't include a software path, since it would run at 1-10 FPS on video cards lacking dedicated ray tracing hardware. (For what it's worth, Microsoft did publish an experimental compute-shader fallback layer for DXR, but it was aimed at developer experimentation, not at shipping in games.)
 
DICE is working on rendering RT at a different resolution than everything else, which would fix any issues like that. I've also heard they have it up to 40-50 fps at 1440p.

Maybe on the 2080 Ti, but what about the other two cards they're trying to pass off as "RTX"?
 
Business. The selling point of the new cards is RT, hence why it's gimped for older cards.
In this case it's more like capability/performance. AFAIK only Volta would be capable of running RT, since that's what was used in the early demos. And even then you needed four of them, and it was still slower than an RTX card.

So there was really no point in running RTX on anything but Turing unless you wanted a slideshow.
 
Will there be enough support and game titles carrying the ray tracing feature, or will it become another PhysX and die off?
PhysX is probably the most popular physics engine. It's used in consoles, mobile devices, and PCs. So it isn't dead.
 
PhysX is probably the most popular physics engine. It's used in consoles, mobile devices, and PCs. So it isn't dead.
I think they mean the old standalone PhysX card, which was "short-lived" due to being incorporated into the GPU. And isn't there already a thread about basically this...
 
Nah, sorry broseph. That Epic video was showing worst-case-scenario raster graphics versus best-case-scenario RT. You'd be surprised how well devs can 'fake' RT using techniques that require 1/100th of the render time. That's why RT has always been avoided: the 'faked' raster graphics can render 100 frames that look 90% as good as a single RT frame. Which means you can pack in 90x more raster detail and resolution, showing a much more complex and objectively better-looking scene, and STILL render faster than RT. If Nvidia decided to get rid of the Tensor cores and the RT cores and fill that die space with JUST CUDA cores, you would have a GPU that could be 2-3x faster than a 1080Ti. Which means you could pack in more AA, more geometry detail, deeper, more complex shaders, better, more effective post-processing, more detailed voxel GI and AO, higher-detail cubemaps, and STILL be faster than a 1080Ti.

But no, instead we get accurate reflections. Accurate reflections running slower than a 1080Ti.

No, I've got to disagree. Rasterization isn't anything like 90% of these hybrid RT scenes. There's no such thing as a rasterized game that calculates light anything even SORT of like what is happening in those RT demo scenes. Not even in the same universe. Multiple lights falling across objects, light color accurately mixing as it falls across objects AND reflects, light that mixes from sources as it passes objects, and AO that is real rather than oddly disturbing-looking. Those scenes are an order of magnitude more impressive than anything I've seen in real time in any engine.

There is nothing like that going on in a rasterized game. Yes, it's GOOD, but it's nothing like that.

The biggest problem right this second really is that the "launch" games of BF5 and Tomb Raider are really not making good use of the tech at all. It's quite bad use of it, really, which isn't surprising. Neither of those games was designed from the ground up intending to leverage RT. It's an add-on feature and not a core design element.

We pretty desperately need a couple of quickie arcade games that use the whole gamut of RT features so people can see it in better use.

It's surprising as hell that NVIDIA didn't code a couple of basic games to showcase it better.

EDIT - and for the record, I couldn't care less about Nvidia's "proprietary" implementation. I don't care if it's a generic DX code path or Nvidia's RTX... as long as we get hardware-accelerated RT somehow, NOW.
 
But no, instead we get accurate reflections. Accurate reflections running slower than a 1080Ti.
Technically accurate reflections, but not realistically accurate reflections. I don't know of anywhere I've been where everything looks that wax-glossed.

There are things I have seen in the NVIDIA demos and the BFV demos that do look very good, but the over-polished reflectiveness of everything distracts severely.
 
Maybe on the 2080 Ti, but what about the other two cards they're trying to pass off as "RTX"?

The 2080 is -20% perf and the 2070 -40% perf. So if DICE does add the ability to render RT separately from the scene, you either take a performance hit or render the ray tracing at a lower resolution. The "pass off as "RTX"" bit is a little dramatic, lol.

To clarify, DICE said they are working on being able to render the scene at one resolution and the ray tracing at a separate resolution. So you can do 4K with RT at just 1080p and get your 90 Hz. It's way better than what we have now and is a nice compromise.
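The back-of-envelope math on that compromise: primary-ray cost scales with the number of pixels you trace, and 1920x1080 is exactly a quarter of 3840x2160, so tracing at 1080p under a 4K raster cuts the ray workload roughly 4x. A trivial sketch:

```cpp
#include <cstdio>

// Ray budget for decoupled RT resolution: rays scale with pixel count.
int main()
{
    const long native4k = 3840L * 2160;  // 8,294,400 pixels traced at native 4K
    const long rt1080p  = 1920L * 1080;  // 2,073,600 pixels traced at 1080p
    std::printf("1080p RT casts %.0f%% of the rays of native 4K RT\n",
                100.0 * rt1080p / native4k);  // prints 25%
}
```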
 
News flash: PhysX is in almost every game, just implemented on the CPU instead of the GPU. Thank you, lock thread.
 
This thread is based on bad information and littered with posts containing more wrong information based on emotionally charged assumptions.
 
Ray tracing doesn't belong to Nvidia; PhysX does. Therefore there should be no further discussion.
This
PhysX is probably the most popular physics engine. It's used in consoles, mobile devices, and PCs. So it isn't dead.
this
I think they mean the old standalone PhysX card, which was "short-lived" due to being incorporated into the GPU. And isn't there already a thread about basically this...
this
News flash: PhysX is in almost every game, just implemented on the CPU instead of the GPU. Thank you, lock thread.
this
This thread is based on bad information and littered with posts containing more wrong information based on emotionally charged assumptions.
and this.

Please lock/end this thread.
 
I think they mean the old standalone PhysX card, which was "short-lived" due to being incorporated into the GPU. And isn't there already a thread about basically this...
GPU PhysX is alive and well. It's just only supported on Nvidia hardware.
 
I'm still confused as to how people can't immediately see the results of this. Whether or not THIS generation of cards can do it well enough, it's absolutely staring us in the face like an oncoming freight train.

It KILLS me when people say "I can't see the difference." Holy !@#$!@#$. It's plain as day when virtually the whole scene has lighting, shadows and reflections that are proper.

Anyone who hasn't seen it really needs to watch this video, not the BF5 half-assed early implementation:



That's where we are really headed with this when an entire game engine leverages everything the hybrid ray tracing can do.

The demo they did for the game Control is much more impressive than most of them, but I don't care for a lot of their art, which holds back much of the wow factor; still, it's a better example than either BF5 or Tomb Raider.

One way or another, we NEED to push the ray tracing in games now. It's time. It's PAST time. The graphics cards are becoming insanely powerful but the "bag of tricks" to mimic real lighting has grown to obscene proportions. We have kind of pushed rasterization way past where anyone thought it would go. A better and simpler architecture is needed.

Frankly, if there is ONE silver lining to Nvidia being a near-monopoly monster right now it's that they CAN afford to throw away an entire generation of cards on price and finally force the shift to some form of ray tracing.

I'm kind of begging everyone at this point. Support the shift now while it's possible. Separate your dislike of their prices from the GOOD that pushing ray tracing will do for all of us in the immediate future. Let's make this switch now rather than later.


Thanks, Advil. Ray tracing is a huge development, and it kills me as well when people pan it without all the information. I think that's a good video for seeing the real difference, and we will have to wait for games to take full advantage of it (not tacked on at the last minute).
 
Yeah, that is the logical approach to the whole debate. But the question is whether this is another feature that will be abused by Nvidia. If tomorrow AMD had better ray tracing cores, how would Nvidia deal with it? The same way they did with their black-box GameWorks and PhysX solutions, where non-Nvidia hardware was artificially forced down the slowest code paths.

Since AMD does not have the mindshare, there is virtually no incentive for AMD to provide special hardware ray tracing features.
Can you document that claim?
 
I'll acknowledge that PhysX is still alive and kicking, but the GPU side is definitely worthless. It will probably become a collector's item in the future. I guess I should have asked, "will it end up like PhysX and get implemented as another CPU task?"
 