"Ray Tracing Just Works"

noko

Supreme [H]ardness
First time I've seen exactly what Jensen Huang indicated at the launch of the Turing GPUs. Very impressive lighting, the best I've ever seen for real-time graphics. DLSS is working as well to get acceptable frame rates with ray tracing on. Very cool video; could be great for RTX owners. I'm pretty sure I could get pleasantly lost in this.

 
Lipstick on a pig.

Only if you're not into the aesthetic. As someone who was playing a lot of voxel-based games in the 90s, I kind of like the Minecraft aesthetic. This makes it look real/surreal, which is kind of cool. I wish I was into Minecraft. :D My kids are. I'm more into Terraria.
 
I've never been into games like Minecraft but the RTX update makes it tempting. Really cool stuff!
 
First time I've seen exactly what Jensen Huang indicated at the launch of the Turing GPUs. Very impressive lighting, the best I've ever seen for real-time graphics. DLSS is working as well to get acceptable frame rates with ray tracing on. Very cool video; could be great for RTX owners. I'm pretty sure I could get pleasantly lost in this.



You mean you missed all the other DXR titles?
 
I watched the video from LTT and it seems a bit out of reach at the moment.
It looks amazing, but if something like Minecraft is pushing these cards this hard, the tech still seems a bit out of reach.
Maybe something to revisit when the 3080 Ti comes out.


 
Not at all, played a little bit on my two 1080 Tis using mGPU. No RT or Tensor cores needed.

~20 FPS at Raytracing light?
[attached screenshot]


If that is your "baseline"...I understand why you sound a bit off.
 
let's go out and drop $600+ for an RTX card just so we can study that shadow thing during gameplay :oops:
If you're spending that much for a gaming card today, you're getting RTX regardless 😉


RTX may/will one day have its place in PC gaming, but sorry, there's no way I'd pay NVIDIA for RTX right now
Lighting, including color, is the biggest gap between current real-time graphics and photorealism.

No, I wouldn't advocate buying just for RT, but I also wouldn't advocate upgrading without it.
 
No, I wouldn't advocate buying just for RT, but I also wouldn't advocate upgrading without it.

Really? LOL, sorry ... just had to do that. Do you honestly (keyword "honestly") think a ray tracing card from 2019 will still be a go-to card in a few years? If that turns out to be the case, it won't say much for ray tracing development, right? Or am I not right?
 
Really? LOL, sorry ... just had to do that. Do you honestly (keyword "honestly") think a ray tracing card from 2019 will still be a go-to card in a few years? If that turns out to be the case, it won't say much for ray tracing development, right? Or am I not right?
That's why I split my reply :)

I don't think they're worth it today, just for that feature alone. However, I also wouldn't buy a card (or recommend one) without ray tracing.


One thing I consider is that current RT GPUs will likely stay decent at RT with the settings dialed back. A large part of the cost of RT right now is that developers have to maintain split rendering paths, one with RT and one without.
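Roughly what I mean by split paths, as a purely hypothetical sketch (none of these names come from a real engine):

```
// Hypothetical engine structure (made-up names, not any real engine's API)
// showing the two lighting paths studios maintain while hardware RT is optional.
struct Scene {};
struct Frame {};

// Traditional raster path: shadow maps, SSAO, light probes, other approximations.
Frame renderRaster(const Scene&) {
    Frame f;
    // buildShadowMaps(); runSSAO(); shadeWithProbes();  // placeholder steps
    return f;
}

// DXR path: trace rays for shadows/GI instead of the approximations above.
Frame renderRayTraced(const Scene&) {
    Frame f;
    // buildAccelerationStructure(); traceShadowRays(); traceGIRays();  // placeholder steps
    return f;
}

Frame renderFrame(const Scene& s, bool hardwareRT, bool userEnabledRT) {
    // Every effect RT replaces still needs a raster fallback, so both
    // paths have to be written, tested, and tuned separately.
    return (hardwareRT && userEnabledRT) ? renderRayTraced(s) : renderRaster(s);
}

int main() {
    Scene scene;
    Frame f = renderFrame(scene, true, true);
    (void)f;
}
```

Point being: once engines can assume RT hardware, that second raster-approximation path can eventually go away.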
 
RTX may/will one day have its place in PC gaming, but sorry, there's no way I'd pay NVIDIA for RTX right now
You've paid for DXR already... AMD is going there too (i.e., R&D funds are being spent on it).

And get your ass into 2020... it's DXR, not RTX... sour grapes much?
 
If you're spending that much for a gaming card today, you're getting RTX regardless 😉



Lighting, including color, is the biggest gap between current real-time graphics and photorealism.

No, I wouldn't advocate buying just for RT, but I also wouldn't advocate upgrading without it.
He is just annoyed that AMD is late to the party...
 
Really? LOL, sorry ... just had to do that. Do you honestly (keyword "honestly") think a ray tracing card from 2019 will still be a go-to card in a few years? If that turns out to be the case, it won't say much for ray tracing development, right? Or am I not right?

That's a ridiculous argument. You can say the same about any GPU. Is the 980 Ti a go-to card today?
 
That's a ridiculous argument. You can say the same about any GPU. Is the 980 Ti a go-to card today?

Yep, that's just how this stuff goes. A go-to card right now is the 20xx line, and you're getting RTX anyway if you buy one. Just like you got T&L on the GeForce 256 even though the tech wasn't ripe until the GeForce 2 series. The NV/AMD thing is another can of worms. Not getting into that :D
 
Turing with RTX was just a toddler taking its first baby steps. Ampere is supposedly the big boy that can take off with it.

 
He's no engineer, and there's a lot of bias... YT "reviewers": how the bar keeps dropping.

TL;DR:
Waste-of-time video

Looked like a good video and opinion piece to me. I'm guessing it's clear why you didn't think it was good, but hey, I'll let those who watch the entire video determine that for themselves.
 
Could also be AMD showing how it should be done? :D

Until the newer DXR cards hit, it is just speculation for most of us.

Oh, it's entirely speculation and opinion, but this is probably as close to accurate as we've gotten so far.
 
Is Minecraft still only single-threaded? If the answer is yes, then somebody needs to be tarred and feathered. The stupid game will still slow to an appalling 1 fps or less when there's a bunch of animation on the screen.
 
Fun story: I've had the 2080 Ti for almost 6 months now, and I've yet to try my first RTX game.


I was in a similar boat. I just bought it for the 4K performance. When I tried the extremely limited ray tracing applications in the games that supported it, I could see why so many people were frothing at the mouth about it. Minecraft looks promising, but again, it's Minecraft, a game that runs on (old) cellphones, so seeing this level of ray tracing applied to a game like Cyberpunk may be a bit far off from where we realistically are.

However, when games are built from the ground up with ray tracing hardware in mind, they may be able to do things we haven't yet seen, so I'm tentatively holding out hope for that. It may be 2 or 3 years until the next-gen consoles get that type of game to market, but these technologies always take time for adoption; just look at multi-core CPU games. We're just now starting to get more games scaling with higher core counts.
 
Maybe it would look better if they showcased slower-paced games.
 
Fun story: I've had the 2080 Ti for almost 6 months now, and I've yet to try my first RTX game.

I'll let Jensen know, I'm sure all future RTX plans will be immediately cancelled.
 
I'll let Jensen know, I'm sure all future RTX plans will be immediately cancelled.
If you have his ear, I'd rather you tell him to spend more cash on making it more widely supported in games. Thanks.
 
If you have his ear, I'd rather you tell him to spend more cash on making it more widely supported in games. Thanks.
Nah, AMD will make it happen and get it going in way more games; Nvidia will work on bringing RTX to more old games like Quake III (that would be cool). Buy Ampere so you can play old games with it :D at 1080p.

Sarcasm
 
I was in a similar boat. I just bought it for the 4K performance. When I tried the extremely limited ray tracing applications in the games that supported it, I could see why so many people were frothing at the mouth about it. Minecraft looks promising, but again, it's Minecraft, a game that runs on (old) cellphones, so seeing this level of ray tracing applied to a game like Cyberpunk may be a bit far off from where we realistically are.

However, when games are built from the ground up with ray tracing hardware in mind, they may be able to do things we haven't yet seen, so I'm tentatively holding out hope for that. It may be 2 or 3 years until the next-gen consoles get that type of game to market, but these technologies always take time for adoption; just look at multi-core CPU games. We're just now starting to get more games scaling with higher core counts.

Also can take time for the hardware to be less shit. I mean, it seems like the RTX series doesn't have a lot of ray tracing power. No surprise, they need to be good at rasterization since that's how games do things, even the ones that have some RT bolted on, so the first-gen stuff is pretty weak. However, they may be able to scale it up as time goes on. A good example from the past is pixel (and vertex) shaders. When programmable shaders first happened, they were a bit shit. GeForce 3 cards had them, but games couldn't do a lot with them because they were pretty weak. So they'd get used for a little simple bump mapping or making water shiny, but that's it. Now it's how everything is done: every pixel is handled through a shader program, often a complex one with many layers (texture, light, displacement, specular, occlusion, etc.). The power of the shaders has grown to the point that most of the silicon on a card is now the shaders; the ROPs, TMUs, and all that jazz are just a tiny part.
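
To make the "layers" point concrete, here's a rough CPU-side sketch (hypothetical code, not real shader source) of the kind of per-pixel math a fragment shader ends up doing once you stack texture, lighting, specular, and occlusion terms:

```
// CPU-side illustration of layered per-pixel shading: base texture (albedo),
// diffuse lighting, a specular highlight, and ambient occlusion, combined.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Inputs a real shader would sample from textures are just passed in directly here.
Vec3 shadePixel(Vec3 albedo, Vec3 normal, Vec3 lightDir, Vec3 viewDir,
                float occlusion, float specPower) {
    float diffuse = std::max(0.0f, dot(normal, lightDir));                // lighting layer
    Vec3 h = { lightDir.x + viewDir.x, lightDir.y + viewDir.y,
               lightDir.z + viewDir.z };                                  // half vector
    float len = std::sqrt(dot(h, h));
    h = { h.x / len, h.y / len, h.z / len };
    float specular = std::pow(std::max(0.0f, dot(normal, h)), specPower); // shiny layer
    float light = occlusion * (0.1f + diffuse);                           // AO + ambient + diffuse
    return { albedo.x * light + specular,                                 // texture * light + highlight
             albedo.y * light + specular,
             albedo.z * light + specular };
}

int main() {
    Vec3 c = shadePixel({0.8f, 0.3f, 0.2f}, {0.0f, 1.0f, 0.0f},
                        {0.0f, 0.707f, 0.707f}, {0.0f, 0.0f, 1.0f}, 0.9f, 32.0f);
    std::printf("shaded pixel: %.3f %.3f %.3f\n", c.x, c.y, c.z);
}
```

A real game shader does that (and a lot more) for millions of pixels every frame, which is why the shader cores ended up eating most of the die.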

So we could see future generations of cards with much more ray tracing muscle, able to do more than some kinda "meh" shadow effects or slowly ray tracing extremely old/simple games. It also may be the case that RT takes up too much silicon to really do right, and we're getting to the point where node shrinks are not what they used to be, so maybe in the long run it'll be too costly to add enough logic to do it properly, and it'll forever be a gimmick kind of add-on, or even something that fades away. Really, nobody knows at this point, including the GPU makers. We'll just have to see. They certainly think it's worth pursuing though.
 
Tried it and it works well on a 2060; it sits around 30-45 fps if you don't max out all the settings. Cranking everything to 11 results in a 1 FPS slideshow. As soon as my boys saw it, they asked when it was coming to the Switch...

Either way, funny coming from someone who touched Java Minecraft 10 something years ago.
 
Well, as predicted, DLSS 2.0 in Cyberpunk. AMD had better have a DLSS 2.0 equivalent for RDNA 2 on day 1 of release, or else there'd be zero reason to buy AMD over NVIDIA. Nvidia's software ecosystem is already light years ahead with NVENC for streaming/encoding, filters, ShadowPlay, CUDA, etc., and now DLSS 2.0, which is a huge game changer.

I’ve got $ ready and waiting for the 3090 on day one release, can’t wait.
 