cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
22,074
Eurogamer has conducted an interview with DICE rendering engineer Yasin Uludag, in which he explains the numbers and computer science behind the real-time ray tracing effects that PC gamers will experience when they turn NVIDIA RTX on. He discusses the bottlenecks and the NVIDIA intrinsics used to analyze and optimize the game for better performance, and alludes to features that might be incorporated into the game in the future, such as a hybrid ray tracer/ray marcher system, variable rate shading, and more.

We were initially negatively affected in our QA testing and distributed performance testing due to the RS5 Windows update being delayed. But we have received a custom shader compiler from Nvidia that allows us to inject a "counter" into the shader that tracks cycles spent inside a TraceRay call per pixel. This lets us narrow down where the performance drops are coming from: we can switch to a primary ray mode instead of reflection rays to see which objects are "bright". We map high cycle counts to bright and low cycle counts to dark, and then go in and fix those geometries. Having these metrics by default in D3D12 would be a great benefit, as they are currently not available.
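
For a sense of how that per-pixel counter turns into a debug view, here is a rough CUDA analog of the trick. This is not DICE's actual tooling (theirs relies on a custom NVIDIA shader compiler injecting the counter into the HLSL); the kernel, the stand-in trace function, and the scale factor below are all invented for illustration. The idea is just: time the expensive call per pixel, then map cycles to brightness so hot geometry shows up white.

[CODE]
// Rough CUDA analog of the cycle-counter heatmap described above; not DICE's
// actual tooling, just the idea: time the expensive call per pixel and map
// cycles to brightness so hot geometry shows up white in a debug view.
#include <cstdio>
#include <cuda_runtime.h>

// Stand-in for a TraceRay-style call whose cost varies across the screen.
__device__ float fakeTrace(int x, int y)
{
    float acc = 0.0f;
    int iters = 16 + ((x / 64) % 4) * 256;  // some pixels are far pricier
    for (int i = 0; i < iters; ++i)
        acc += sinf(acc + x * 0.001f + y * 0.002f);
    return acc;
}

__global__ void traceWithHeatmap(float* heat, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    long long t0 = clock64();            // cycle counter before the trace
    volatile float result = fakeTrace(x, y);
    long long cycles = clock64() - t0;   // cycles spent "inside TraceRay"
    (void)result;

    // High cycle counts map to bright, low counts to dark (scale is a knob).
    heat[y * width + x] = fminf((float)cycles / 200000.0f, 1.0f);
}

int main()
{
    const int w = 256, h = 256;
    float* heat;
    cudaMallocManaged(&heat, w * h * sizeof(float));
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    traceWithHeatmap<<<grid, block>>>(heat, w, h);
    cudaDeviceSynchronize();
    printf("center pixel heat: %f\n", heat[(h / 2) * w + w / 2]);
    cudaFree(heat);
    return 0;
}
[/CODE]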
 
RTX is useless for now.

Here's an idea that has the potential to improve performance: https://techreport.com/discussion/3...-shading-with-wolfenstein-ii-the-new-colossus
I saw that earlier today. It didn't seem to make a big difference on the cards. :(

At least in the case of Wolfenstein II—already an incredibly well-optimized and swift-running game—content-adaptive shading provides anywhere from two to five more FPS on average for the RTX 2070, three to six more FPS on average for the RTX 2080, and one to six more FPS on average for the RTX 2080 Ti, depending on whether one chooses the quality, balanced, or performance preset in the game's advanced options.
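
The core decision content-adaptive shading makes is simple enough to sketch. Below is a hedged CUDA sketch, not id Software's or NVIDIA's actual implementation (the 16x16 tile size, the contrast threshold, and all names are invented): score how much each tile's luminance varies, then request a coarse 2x2 shading rate for flat tiles and the full 1x1 rate for detailed ones. A shipping VRS title would write rates like these into a shading-rate image for the hardware to consume.

[CODE]
// Hedged sketch of content-adaptive shading's core decision: flat tiles get
// a coarse shading rate, detailed tiles keep the full rate. Thresholds and
// layout are invented; real drivers also weigh motion and previous frames.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void pickShadingRates(const float* luma, int width, int height,
                                 unsigned char* tileRate, int tilesX)
{
    __shared__ float tileMin[256], tileMax[256];
    int x = blockIdx.x * 16 + threadIdx.x;   // one block per 16x16 tile
    int y = blockIdx.y * 16 + threadIdx.y;
    int t = threadIdx.y * 16 + threadIdx.x;

    float v = (x < width && y < height) ? luma[y * width + x] : 0.5f;
    tileMin[t] = v;
    tileMax[t] = v;
    __syncthreads();

    // Parallel min/max reduction over the tile's 256 samples.
    for (int s = 128; s > 0; s >>= 1) {
        if (t < s) {
            tileMin[t] = fminf(tileMin[t], tileMin[t + s]);
            tileMax[t] = fmaxf(tileMax[t], tileMax[t + s]);
        }
        __syncthreads();
    }

    if (t == 0) {
        // Low-contrast tile => coarse 2x2 shading; detailed tile => full 1x1.
        float contrast = tileMax[0] - tileMin[0];
        tileRate[blockIdx.y * tilesX + blockIdx.x] = (contrast < 0.05f) ? 2 : 1;
    }
}

int main()
{
    const int w = 64, h = 64, tilesX = w / 16, tilesY = h / 16;
    float* luma; unsigned char* rates;
    cudaMallocManaged(&luma, w * h * sizeof(float));
    cudaMallocManaged(&rates, tilesX * tilesY);
    for (int i = 0; i < w * h; ++i)
        luma[i] = (i % w < 32) ? 0.5f : (float)(i % 7) / 7.0f; // flat | busy
    pickShadingRates<<<dim3(tilesX, tilesY), dim3(16, 16)>>>(luma, w, h,
                                                             rates, tilesX);
    cudaDeviceSynchronize();
    printf("tile(0,0) rate=%d  tile(3,0) rate=%d\n", rates[0], rates[3]);
    cudaFree(luma); cudaFree(rates);
    return 0;
}
[/CODE]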
 
RTX: underwhelming performance. DLSS: poor quality. CAS: marginal gains. What's left to justify the price doubling?
 
So that begs the question: what counts as "real time" when people talk about "Real Time Raytracing"? You have a chip that can output over 100 FPS at 3840x2160, but it can only render 40-50 FPS at 1080p when displaying the same exact scene with only 30% of the surfaces exhibiting single- or double-sample raytracing (when a professionally rendered scene from a special effects team would use hundreds or thousands of samples).

Can you really call that 'Real Time'?
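
For a rough sense of the budget in play, here is some back-of-envelope arithmetic (plain host-side C++; the 10 Gigarays/s number is NVIDIA's marketing claim for Turing, and every other figure is a round assumption, not a measurement):

[CODE]
// Back-of-envelope ray budget; every number here is a public claim or a
// round assumption, not a measurement.
#include <cstdio>

int main()
{
    const double pixels   = 1920.0 * 1080.0; // ~2.07M pixels at 1080p
    const double fps      = 60.0;
    const double spp      = 2.0;             // 1-2 reflection samples/pixel
    const double coverage = 0.30;            // ~30% of surfaces reflective

    const double game_rays = pixels * fps * spp * coverage;  // ~7.5e7 rays/s
    const double film_rays = pixels * 1000.0;                // per frame!

    printf("game budget : %.2e rays/s\n", game_rays);
    printf("film frame  : %.2e rays at 1000 spp, before bounces\n", film_rays);
    printf("Turing claim: 1.0e10 rays/s (marketing best case)\n");
    // The claimed throughput dwarfs the game budget, but each game ray also
    // pays for BVH traversal divergence, hit shading, and denoising inside a
    // 16 ms frame. That is why shipping titles settle for 1-2 samples on a
    // fraction of the screen, versus hundreds or thousands of samples per
    // pixel in offline film rendering.
    return 0;
}
[/CODE]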
 
I'm dumb, I read "A DICE Rendering Engineer" and was confused for a sec, thought it meant a dice simulator game was coming or something lol. To be fair, I saw something about a shower simulator today, so I guess that was still on my mind lol
 
Ray tracing is new, all new tech is a bit of a cluster f, but in time it might mature. There just needs to be that reason to mature, such as market penetration.
 
I love it when they go from low to ultra DXR settings and you can barely see a difference at all. o_O

It'll be a good tech in the future IF it takes off; right now it's a checkbox gimmick used to drastically inflate the price of the GPU.
 
So that begs the question: what counts as "real time" when people talk about "Real Time Raytracing"? You have a chip that can output over 100 FPS at 3840x2160, but it can only render 40-50 FPS at 1080p when displaying the same exact scene with only 30% of the surfaces exhibiting single- or double-sample raytracing (when a professionally rendered scene from a special effects team would use hundreds or thousands of samples).

Can you really call that 'Real Time'?
Yes. It is still being rendered in real time. Even if it was running at 1fps at 240p it would still be real time rendering. Performance being shit does not change whether or not the effects are being rendered with the game.
 
Yes. It is still being rendered in real time. Even if it was running at 1fps at 240p it would still be real time rendering. Performance being shit does not change whether or not the effects are being rendered with the game.

...but by that standard, there are a whole lot of cards that can do 'Real Time' raytracing...
 
...but by that standard, there are a whole lot of cards that can do 'Real Time' raytracing...

Sure. There's a difference between doing it and doing it remotely close to playable. Intel onboard graphics can run a game at 4K with the settings maxed out; it would be at like 1 frame per minute in most cases, but it can run. You're expecting a new, very advanced and very demanding feature to run maxed out at 4K at high frame rates. That never happens with new tech in games.
 
My disappointment is immeasurable and my day is ruined.
Really?
The article basically says "bugs, bugs, RTX bugs everywhere costing us a ton of fps"

And the driver is slowing us down, and DXR needs some work.

But fixes are coming shortly.

Not sure why your day would be ruined by this article, if anything it's good news.
 
So lots of bugs and no justifiable improvement is now good news? Lol, can't wait to hear the bad stuff.
 
Really?
The article basically says "bugs, bugs, RTX bugs everywhere costing us a ton of fps"

And the driver is slowing us down, and DXR needs some work.

But fixes are coming shortly.

Not sure why your day would be ruined by this article, if anything it's good news.
It is a meme. I was joking.
 
Really?
The article basically says "bugs, bugs, RTX bugs everywhere costing us a ton of fps"

And the driver is slowing us down, and DXR needs some work.

But fixes are coming shortly.

Not sure why your day would be ruined by this article, if anything it's good news.
I think he was being facetious.

edit: beat me to it...
 
Did anyone else notice the DICE Engineer said that some of their guys straight up didn't read the API instruction manual on Ray Tracing?

Following the API specification, if you collapse them to (NaN, NaN, NaN) instead of (0, 0, 0), the triangle will be omitted because it's "not a number". This is what we did and it gave a lot of perf. This bug has been fixed and will be shipping soon

Not dogging on them for it, I just find it funny that he mentioned it. It's actually quite refreshing to not see PR spin on some things for once.
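
For anyone curious what the trick looks like in practice, here is a hedged CUDA sketch of the idea (the kernel, the Vertex struct, and the buffer layout are invented for illustration): culled triangles get NaN positions instead of (0, 0, 0), so the acceleration structure build drops them entirely rather than leaving point-sized degenerate triangles for every ray to test.

[CODE]
// Collapse culled triangles to (NaN, NaN, NaN) so the BVH build omits them,
// as the DXR spec allows; (0, 0, 0) would leave a degenerate triangle that
// traversal still has to consider. Names and layout invented for this sketch.
#include <cmath>
#include <cstdio>
#include <cuda_runtime.h>

struct Vertex { float x, y, z; };

__global__ void collapseCulledTriangles(Vertex* verts, const int* culled,
                                        int triCount)
{
    int tri = blockIdx.x * blockDim.x + threadIdx.x;
    if (tri >= triCount || !culled[tri]) return;

    const float kill = nanf("");   // any NaN component omits the triangle
    for (int i = 0; i < 3; ++i)
        verts[tri * 3 + i] = Vertex{kill, kill, kill};
}

int main()
{
    const int triCount = 4;
    Vertex* verts; int* culled;
    cudaMallocManaged(&verts, triCount * 3 * sizeof(Vertex));
    cudaMallocManaged(&culled, triCount * sizeof(int));
    for (int i = 0; i < triCount * 3; ++i) verts[i] = Vertex{0.f, 0.f, 0.f};
    for (int i = 0; i < triCount; ++i) culled[i] = (i == 1); // cull tri 1
    collapseCulledTriangles<<<1, 32>>>(verts, culled, triCount);
    cudaDeviceSynchronize();
    printf("tri 1, vert 0, x is NaN: %d\n", (int)std::isnan(verts[3].x));
    cudaFree(verts); cudaFree(culled);
    return 0;
}
[/CODE]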
 
I'm dumb, I read "A DICE Rendering Engineer" and was confused for a sec, thought it meant a dice simulator game was coming or something lol. To be fair, I saw something about a shower simulator today, so I guess that was still on my mind lol

Heck, I thought it was a job opening for a Rendering Engineer.
 
I once bought a shiny new Matrox card because of Bump-mapping. Lesson learned!
To be fair, bump mapping did work as advertised; it was just a tacked-on feature in a few games. In time every card had it.
G400 Max? ;) I remember Slave Zero and Rollcage Stage II. The list of games is larger than I remember. https://ancientelectronics.wordpres...endable-with-environment-mapped-bump-mapping/

You are right though; we now have at least four different mapping methods, all standardized and widely supported. First-generation adoption is fun!
 
I think the G400 Max was the last non-ATI/Nvidia/3dfx branded card where you could walk into a LAN party and be BAD-ASS!
(Maybe a Parhelia, if you were willing to lug 3 CRTs, LoL!)
 