CD Projekt RED and NVIDIA Talk About Upcoming Cyberpunk 2077 Path Tracing

erek

Cyberpunk Overdrive

"NVIDIA's Senior Developer Technology Engineer, Pawel Kozlowski, gave more technical details about the Cyberpunk 2077 Path Tracing preview. Pawel notes that it is still using the NVIDIA RTXDI (RTX Direct Illumination) for computing all the direct illumination and using Path Tracing to compute the indirect part of the render, in a manner of speaking.

Pawel also adds that you will definitely need a GeForce RTX 40 series GPU in order to use it, mainly, as he says, because they are the most powerful, and they support DLSS 3 with Frame Generation as well as Shader Execution Reordering (SER), which helps to execute incoherent workloads like Path Tracing.

While Jakub suggested that you will need very high-end hardware for the upcoming Cyberpunk 2077 Path Tracing technical preview, Pawel also adds that NVIDIA is aiming for a good experience on 40 series GPUs, so it remains to be seen what kind of frame rates we will see from the GeForce RTX 4070 Ti.

Here is the full interview from PCWorld, and all we have to do is wait for the update to come out on April 11th."



Source: https://www.techpowerup.com/306589/...lk-about-upcoming-cyberpunk-2077-path-tracing
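For anyone wondering what that direct vs. indirect split actually means, here's a tiny toy sketch in plain C++ (my own illustration, not CDPR or NVIDIA code, and the function names are made up): a dedicated direct-lighting estimate, which is the job RTXDI does across the game's thousands of lights, and a path-traced indirect estimate are computed separately and simply summed per pixel before the denoiser and DLSS get involved.

Code:
#include <cstdio>
#include <random>

struct Color { float r, g, b; };
Color operator+(Color a, Color b) { return {a.r + b.r, a.g + b.g, a.b + b.b}; }
Color operator*(Color a, float s) { return {a.r * s, a.g * s, a.b * s}; }

// Hypothetical stand-ins; the real passes run per pixel on the GPU.
Color sampleDirectLighting() {
    // direct light from explicit light sampling (the job an RTXDI-style pass does)
    return {0.60f, 0.50f, 0.40f};
}

Color traceIndirectLighting(std::mt19937& rng) {
    // average a few random "bounce" samples, a very loose stand-in for the
    // Monte Carlo estimate a path tracer accumulates for indirect light
    std::uniform_real_distribution<float> bounce(0.0f, 0.2f);
    Color sum{0.0f, 0.0f, 0.0f};
    const int samples = 4;
    for (int i = 0; i < samples; ++i)
        sum = sum + Color{bounce(rng), bounce(rng), bounce(rng)};
    return sum * (1.0f / samples);
}

int main() {
    std::mt19937 rng(7);
    // final radiance is simply direct + indirect; a denoiser and DLSS then clean it up
    Color pixel = sampleDirectLighting() + traceIndirectLighting(rng);
    std::printf("combined radiance: %.2f %.2f %.2f\n", pixel.r, pixel.g, pixel.b);
    return 0;
}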
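And on the SER point: Shader Execution Reordering is an Ada hardware/driver feature, so you can't show the real thing outside a GPU, but the basic idea, regrouping incoherent work by the shader it's about to run so that similar work executes back to back, can be sketched on the CPU in a few lines. Again, just a toy example of mine, nothing from NVIDIA's actual API:

Code:
#include <algorithm>
#include <cstdio>
#include <random>
#include <vector>

// Each traced ray ends up needing some material's shader; in a path tracer the
// rays in a batch hit wildly different materials, which is the "incoherent
// workload" problem. Grouping hits by material lets identical shading code run
// back to back instead of interleaved.
struct Hit {
    int materialId;  // which shader this hit would invoke
    float t;         // hit distance, just along for the ride
};

int main() {
    // fake a batch of hits with random materials, like rays scattered around a scene
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> material(0, 3);
    std::vector<Hit> batch(16);
    for (auto& h : batch) h = {material(rng), 1.0f};

    // the "reordering" step: sort so hits needing the same shader sit next to
    // each other; SER regroups GPU threads in a similar spirit, in hardware
    std::sort(batch.begin(), batch.end(),
              [](const Hit& a, const Hit& b) { return a.materialId < b.materialId; });

    for (const auto& h : batch) std::printf("%d ", h.materialId);
    std::printf("\n");
    return 0;
}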
 
I don't know what computational power would be required, but path tracing is what VR needs on top of a generation or two of miniaturization.
 
Who doesn't like playing games at 18 fps?
It's only slow until it's fast. That's just the nature of all technology improvements. Frankly, we've gone from RT mostly showing up in gimmicky things to real full-blown path tracing with a significant impact on visuals in just a couple of generations. And it's playable with decently high framerates, albeit using upscaling techniques on top-end hardware. We'll have to find out just how much of a slideshow it is on lower-end hardware and previous-gen hardware.

Even after saying all that, RT could supplant raster in just a few more hardware generations. Next-gen consoles (i.e. the PS6) could very easily be "RT first" with photorealistic visuals (UE5).
 

Cyberpunk 2077 Patch v1.62 With RT: Overdrive is Now Live


New update out now
 
Played it. It's junk. The lighting is so overdone it looks like you're in a dreamscape. 4K with or without RT looks far better to me. Here's a shot from Gamers Nexus.
(attached: Gamers Nexus comparison screenshot)
 
Played it. It's junk. The lighting is so overdone it looks like you're in a dreamscape. 4K with or without RT looks far better to me. Here's a shot from Gamers Nexus.
Odd, all the forums I browse including this one have users saying it's great. What does "it's junk" mean exactly? What's your system, settings, resolution, dlss setting, frame gen on or off, etc?
 
Played it. It's junk. The lighting is so overdone it looks like you're in a dreamscape. 4K with or without RT looks far better to me. Here's a shot from Gamers Nexus.
I notice the new lighting seems to betray the tone set by the lighting in the original scenes. Many areas of the game that were once relatively well lit are now near pitch-black.
 
It will be for the foreseeable future. This is just the new "ultra" mode, but they absolutely need to make the games work well on much less capable hardware.
No they don't. They are selling a hamburger today that you get to eat tomorrow.
Nvidia started the con with the 2000 series, and by the 4000 series we are apparently not much closer. Enjoy currently designed games at the amazing performance loss the top cards can produce. Software producers who manage this stupidness and add some of it, good on them for the try. So far the only titles that get full RT are low-poly games no one cares about. Yeah, Minecraft and other mega-low-poly stuff of yesteryear.
This has become a buy-now, use-later feature that most of the hardware being sold can't actually play. Super pissed this hardware-taxing BS took the spotlight away from VR or other tech that demanded higher resolution and higher framerates. How many useless cards have been sold because the box said RT ready?
 
Odd, all the forums I browse including this one have users saying it's great. What does "it's junk" mean exactly? What's your system, settings, resolution, dlss setting, frame gen on or off, etc?
You can literally see the issue in the screenshot. A section of the image goes from being not illuminated to being illuminated to being oversaturated. It's not realistic at all.
 
No they don't. They are selling a hamburger today that you get to eat tomorrow.
Nvidia started the con with the 2000 series, and by the 4000 series we are apparently not much closer. Enjoy currently designed games at the amazing performance loss the top cards can produce. Software producers who manage this stupidness and add some of it, good on them for the try. So far the only titles that get full RT are low-poly games no one cares about. Yeah, Minecraft and other mega-low-poly stuff of yesteryear.
I've been pretty vocal about not liking nVidia at all as a company. But I'll tell you straight away that this is the way we get better tech. I've been watching the computer graphics scene from essentially its "modern" inception, with the advent of Voodoo cards made by 3Dfx, which were developed by former SGI employees.

I say that to say that I've seen what new software does to new cards. When Quake II launched, there wasn't a single graphics card that could play the game above 25 fps at 1024x768. If you had spent the money for two Voodoo 2 12MB cards in SLI, you could break 30 fps in that game. This repeated time after time. Meanwhile, John Carmack continued to push graphics tech meaningfully using every hardware advancement that was new at the time.

And every step of the way there were complaints similar to the ones you're making. There were complaints about APIs like Glide and DirectX. Complaints about every new hardware tech (bump mapping, transform and lighting, anti-aliasing, even AGP!). Some even wondered if having 3D graphics in a home PC made sense at all. Or it was too expensive. On and on.

RT is something novel, and I think it is incredibly beneficial to the future development of games and to things like IQ. The way we get to that place is by experiencing a slideshow now so that we can eventually get hardware where it won't be a slideshow. And I'm glad this target has come out now, because it will be painfully clear to both AMD and nVidia that this game is a target to optimize and build better hardware for. A real, practical example had to exist, beyond the other examples you mentioned such as Minecraft or even (ironically) Quake II RTX. This, combined with CDPR's new Witcher game that will be built in Unreal 5 and will surely also feature RT, will be a showcase for a while. Targets for hardware while we wait for it to catch up.

Don't want to spend the money to buy a shiny new card to dive into RT yet because the bang for the buck simply isn't there? That's more than fair and I for the most part agree. We're waiting for 1080p hardware (and by that I mean stuff that costs $300-$450 new) that can run RT at 120+ FPS. It's going to be a while. But it's also worth the wait. Until then, it's more than fine to simply turn off RT and go about living your life.

This has become a buy-now, use-later feature that most of the hardware being sold can't actually play. Super pissed this hardware-taxing BS took the spotlight away from VR or other tech that demanded higher resolution and higher framerates. How many useless cards have been sold because the box said RT ready?
I can't comment on VR, other than to say there is some level of irony in wanting your favored tech to flourish while not wanting another favored tech to flourish.
I say that tongue in cheek. It's more than fair to say that both RT and VR are niche for the time being, until eventually the tech becomes good enough that they aren't. It's early days for both, and the share of people who can actually use either is incredibly limited. Both are still fully for early adopters.

It is 4k, just reconstructed.
Upscaling is upscaling. You can try to make new data, but it's all "just a guess". It cannot and never will be the same as having actual data to begin with. I would never call something that is upscaled as if it is truly that resolution. It's not. It's deceptive at best.
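For what it's worth, here's what the dumbest possible upscaler, plain bilinear (nothing like the temporal/ML reconstruction DLSS does), actually produces: every pixel that doesn't land on a source sample is just a weighted blend of its neighbours, so the "extra" detail is interpolated rather than rendered. A toy example of mine:

Code:
#include <algorithm>
#include <cmath>
#include <cstdio>

int main() {
    // 2x2 source image blown up to 4x4 with plain bilinear interpolation:
    // only the corner outputs are real source values, everything else is
    // a weighted average of neighbours, a guess rather than new information
    const int W = 2, H = 2;    // source resolution
    const int W2 = 4, H2 = 4;  // target resolution
    float src[H][W] = {{0.0f, 1.0f},
                       {1.0f, 0.0f}};

    for (int y = 0; y < H2; ++y) {
        for (int x = 0; x < W2; ++x) {
            // map the output pixel back into source space ("align corners" mapping)
            float sx = x * float(W - 1) / float(W2 - 1);
            float sy = y * float(H - 1) / float(H2 - 1);
            int x0 = int(std::floor(sx)), y0 = int(std::floor(sy));
            int x1 = std::min(x0 + 1, W - 1), y1 = std::min(y0 + 1, H - 1);
            float fx = sx - x0, fy = sy - y0;
            float top    = src[y0][x0] * (1 - fx) + src[y0][x1] * fx;
            float bottom = src[y1][x0] * (1 - fx) + src[y1][x1] * fx;
            std::printf("%.2f ", top * (1 - fy) + bottom * fy);
        }
        std::printf("\n");
    }
    return 0;
}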
 
Upscaling is upscaling. You can try to make new data, but it's all "just a guess". It cannot and never will be the same as having actual data to begin with. I would never call something that is upscaled as if it is truly that resolution. It's not. It's deceptive at best.
It's 4K DLSS. However, the distinction he was making was semantic. Everyone knows 4K DLSS means it's reconstructed.
 
It will be for the foreseeable future. This is just the new "ultra" mode, but they absolutely need to make the games work well on much less capable hardware.

RT will take off when the console generation of the day can do it easily. Until then it's not ready for prime time, just as with 4K. The rate things are going, though, that's going to be the PS6 era, which is good.
 
Software producers who manage this stupidness and add some of it, good on them for the try. So far the only titles that get full RT are low-poly games no one cares about. Yeah, Minecraft and other mega-low-poly stuff of yesteryear.
Not sure I understand this; you are in a thread about Cyberpunk, of all games, getting the "full Monte Carlo RT a la DreamWorks render farm" treatment.

With the two "really easy" upgrades, framerate and resolution (for which I imagine you can simply add more of the same cores, memory and bandwidth), both well past the point of diminishing returns for almost all gamers (be it 1080p or 60-120 fps, varying from person to person; the line with more is a bit nicer, but not by much), there could be some fear of GPUs becoming like tablets, TVs and consoles: something you don't really need to upgrade much anymore, maybe once every 7 to 10 years (the 1060 6GB will soon be 7 years old). They could be trying to find a way to keep the four-year upgrade cycle going, rather than just focusing on the best gaming experience possible for the price; they are certainly big enough in that market for that mindset to be plausible.

demanded higher resolution and higher framerates
That could be nice for a tiny minority of people (VR users, eyes that actually notice high framerates, competitive gamers), but for the vast majority, movies on an old 768p plasma tend to have better graphics than video games. As someone who doesn't notice much of a difference past 75 Hz and wouldn't mind 720p, I feel we lost a lot of time and focus on resolution and framerate. Consoles historically give a good idea of how the budget should be split for most people, and most of their adaptive-resolution and shader tricks, plus foveated rendering where possible like in VR, should become more common on the PC side.

There's a world (harder to pull off, I imagine) where the focus goes to making the best-looking game possible at just 60 fps and 1080p on the latest hardware. I'm not sure that would be any worse for most people, and it's close to what consoles tend to do.
 
Who doesn't like playing games at 18 fps?

My teenaged self would have killed for that FPS in Duke 3D. I think I was able to get around 10 FPS standing against a wall; moving around it was 4 to 8. It was only "playable" because the game sampled input much faster than it drew the screen, so I could turn for a third of a frame and shoot when I thought an enemy would be passing in front of me, and hit about half the time.
 
Tried it on my 10GB 3080; at 1080p with no DLSS it is actually fairly playable with everything else mostly maxed out.
 