Cyberpunk 2077 to Support NVIDIA RTX Raytracing

FrgMstr
Long-time HardOCPer TaintedSquirrel came across this leaked screenshot of the Cyberpunk 2077 graphics menu yesterday, which indicates that the game will in fact support NVIDIA's new RTX raytracing technology. Of course this is not official, so we will chalk it up as a rumor. NVIDIA HairWorks is also mentioned. Yesterday during the stream it was reported that the game was running on a GTX 1080 Ti at 1080p, so we would have been seeing none of the RTX goodness, should it actually exist. There is still no official launch date for the game. Cyberpunk 2077 was announced back in 2012, and the HardForum thread is still alive and well.

Menu Pic.
 
Get ready for extra tessellation-esque issues to cripple AMD. I really hope NVIDIA is using open DX/Vulkan standards, but they don't deserve the benefit of the doubt.

Speaking of which, I hope Cyberpunk is written with Vulkan since the game will be cross-platform.
 
Get ready for extra tessellation-esque issues to cripple AMD. I really hope NVIDIA is using open DX/Vulkan standards, but they don't deserve the benefit of the doubt.

Speaking of which, I hope Cyberpunk is written with Vulkan since the game will be cross-platform.

NVIDIA's RTX is just a wrapper around the ray tracing that's already part of DX12/Vulkan. The issue is that since AMD doesn't have specialized hardware to perform ray tracing, performance will be a lot lower.

It's unlikely the game is written in Vulkan, since it entered pre-production before the API existed. It's likely DX11 with some DX12 functionality tacked on.
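
To make the vendor-agnostic part concrete, here is a minimal sketch (assuming a Windows SDK recent enough to include the DXR additions to d3d12.h): a game asks D3D12 itself whether the device can do raytracing at all, with no NVIDIA-specific calls anywhere.

    // Minimal sketch: query D3D12 for raytracing support. The tier is
    // reported for whatever adapter backs the device; nothing here is
    // vendor-specific. Error handling trimmed for brevity.
    #include <windows.h>
    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        ID3D12Device* device = nullptr;
        // nullptr = default adapter, whichever vendor made it.
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            return 1;

        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                    &opts, sizeof(opts));

        if (opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            printf("DXR supported (tier %d)\n", (int)opts.RaytracingTier);
        else
            printf("No DXR support on this device\n");

        device->Release();
        return 0;
    }

On hardware without dedicated raytracing units the tier simply comes back as not supported (or the work falls to a slower compute path), which is the performance gap described above.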
 
I was at the PAX panel last year for The Witcher/CDPR, and I was amused at how tight-lipped the co-owner of CDPR was about Cyberpunk. They're the only developer that deserves a pre-order.

More to the point, my 970 is probably not going to cut it.
 
It's not just graphics or anything that makes the game look so good, it's the animations and the presence you feel from the actions of the people around you, etc. I found TW3 had some incredibly lifelike moments that, in any other RPG, would have pulled me out of the experience. This demo puts Bethesda to shame.
 
It's not just graphics or anything that makes the game look so good, it's the animations and the presence you feel from the actions of the people around you, etc. I found TW3 had some incredibly lifelike moments that, in any other RPG, would have pulled me out of the experience. This demo puts Bethesda to shame.
People lose their shit over just an announcement of a new crappy ES game and cause a 20-page thread. This game releases actual footage and hasn't even broken the first page yet. The Witcher 3 alone is better than all ES games combined.
 
Get ready for extra tessellation-esque issues to cripple AMD. I really hope NVIDIA is using open DX/Vulkan standards, but they don't deserve the benefit of the doubt.

Speaking of which, I hope Cyberpunk is written with Vulkan since the game will be cross-platform.
HairWorks is optional, and it is using the built-in features of DirectX anyway. RTX ray tracing is the same: it uses DirectX Raytracing (DXR), which is built into the DirectX 12 API. Vulkan also has real-time ray tracing features and support. The only real difference between NVIDIA and AMD on the development front is that NVIDIA's middleware makes it easy to implement.

It doesn't matter whether or not it uses Vulkan. Memory management and other high-level features are easily ported to both the version of DirectX used on the Xbox One and Sony's proprietary GNMX API. Low-level is where it gets sketchy, because GNM on the PS4 has no relation at all to either OpenGL or Vulkan, and the Xbox One's flavor of DirectX has likewise diverged from PC DirectX 12 (although I think there was an update in the past year or so that brought parity between PC DX12 and XB1 DX12).
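
For what the high-level porting story looks like in practice, here is a toy sketch of the usual abstraction-layer pattern (a generic illustration, not CDPR's engine; every name below is made up). Engine code talks to one interface, and each platform's graphics API gets its own backend behind it:

    // Toy render-hardware-interface (RHI) sketch: generic pattern only.
    // High-level engine code never names a platform API; porting means
    // writing a new backend, which is why low-level is where it gets hard.
    #include <cstdio>
    #include <memory>

    struct GpuBuffer { size_t bytes; };      // hypothetical handle type

    class RHI {                              // the platform boundary
    public:
        virtual ~RHI() = default;
        virtual GpuBuffer allocate(size_t bytes) = 0;  // high-level memory mgmt
    };

    class D3D12Backend : public RHI {        // PC / Xbox One flavor
    public:
        GpuBuffer allocate(size_t bytes) override {
            printf("D3D12: committed-resource-style alloc, %zu bytes\n", bytes);
            return {bytes};
        }
    };

    class GnmBackend : public RHI {          // PS4 flavor (real API is NDA'd, stubbed)
    public:
        GpuBuffer allocate(size_t bytes) override {
            printf("GNM: platform-specific alloc, %zu bytes\n", bytes);
            return {bytes};
        }
    };

    int main() {
        // Swap the backend per platform; the call site never changes.
        std::unique_ptr<RHI> rhi = std::make_unique<D3D12Backend>();
        rhi->allocate(64 * 1024);
        return 0;
    }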
 
NVIDIA's RTX is just a wrapper around the ray tracing that's already part of DX12/Vulkan. The issue is that since AMD doesn't have specialized hardware to perform ray tracing, performance will be a lot lower.

More like the other way around: ray tracing in DX12 is a high-level wrapper for the low-level RTX functions (and AMD's equivalent, when they get around to supporting it).

NVIDIA has been the one pushing ray tracing, not Microsoft. Ray tracing in DX12 was first announced at the first major NVIDIA RTX showcase at GDC back in March, and Microsoft's ray tracing dev kit only recommended NVIDIA products.

The API for DX12 ray tracing will have been developed/tweaked while working with NVIDIA hardware, so it will probably work very well with NVIDIA's low-level APIs.
 
More like the other way around: ray tracing in DX12 is a high-level wrapper for the low-level RTX functions (and AMD's equivalent, when they get around to supporting it).

NVIDIA has been the one pushing ray tracing, not Microsoft. Ray tracing in DX12 was first announced at the first major NVIDIA RTX showcase at GDC back in March, and Microsoft's ray tracing dev kit only recommended NVIDIA products.

The API for DX12 ray tracing will have been developed/tweaked while working with NVIDIA hardware, so it will probably work very well with NVIDIA's low-level APIs.
Eh, no, RTX is definitely working on top of DXR. DXR has been around since March of 2018, while NVIDIA hasn't even released the hardware yet. We've seen this before many times, where APIs work on top of other APIs. Look at Creative's EAX API, which worked on top of DirectSound; but back in 2006 when Vista was released, the hardware audio support in DirectSound was removed, and thus Creative had to migrate over to OpenAL. So instead of working on top of DirectSound, it now works on top of OpenAL.

It makes a lot of sense to do this, as not many developers would be big on the idea of writing code that was 100% NVIDIA-only. Since it's based on DXR, the code can be used on non-NVIDIA hardware as well.
 
I just watched that gameplay footage with my wife last night. It looks pretty amazing, but I won't be able to enjoy the Tray Racing, unfortunately. My 1080 Ti will be on duty for another generation or two, I think.
 
Can't wait to try it out. I think this is going to be an amazing game.
 
So what code will Nvidia slip into this game that cripples AMD... presuming AMD even has a mid-range or better card that would be capable of playing the game under otherwise normal circumstances.
 
It's not just graphics or anything that makes the game look so good, it's the animations and the presence you feel from the actions of the people around you, etc. I found TW3 had some incredibly lifelike moments that, in any other RPG, would have pulled me out of the experience. This demo puts Bethesda to shame.
I thought the crowd animations looked janky as hell at times
 
Not many will be interested in playing it at 1080p or at 30fps.

Don't you usually wait to see what it ACTUALLY plays at? The Witcher seemed smooth enough to me. This game isn't due out for a while. Who knows, we might even have the next version of RTX by the time it ships, and a usable version in the mid-range sector. So far the game looks pretty nice on current hardware. There's only room for improvement from here.

There are several factors that make me say this. One is that this dev deserves the benefit of the doubt. Next is that the game is already playable from end to end (though they're obviously still working on it). The engine already runs smoothly, and has from what I've heard for some time. They wouldn't have had RTX hardware to run on for the earlier press previews. And it's still a ways off. To me that more or less combines to equal them not fucking it up. Of course, as I said above (and it works both ways) we have to actually wait and see. I'm just predicting good things.

These guys make a point of knowing their audience. They don't typically screw over PC users for console performance levels either. They're one of the few devs that I believe aim to impress people just to show how cool they are. :D
 
So what code will Nvidia slip into this game that cripples AMD... presuming AMD even has a mid-range or better card that would be capable of playing the game under otherwise normal circumstances.
"Raytracing is rigged against us!"

You guys are so tragic. CDPR isn't the kind of developer that engages in petty, tribal GPU politics, or lets anyone "cripple" anything in their game. It's a single toggle that won't even be on by default.

Get over it and start demanding more of your GPU company.
 
I play at 1080, though I do demand 60 or 120 frames per second. I also like vsync on, and eye candy maxed (within those synced frame rates). This has come up a lot lately though. I'm all for 4K, but only when I can comfortably run it at the settings I'm accustomed to. That'll be a while.
 
Huh? Almost everyone plays at 1080p. Even many here at the [H] play at 1080p. I'm a UW user myself, but I won't overlook the fact that the gaming world for the most part is still 1080p.
I still find it amazing how quickly I got spoiled on 144Hz, and then 1440p. Even more amazing is how sneakily my change in expectations followed: "Whaddya mean this game is capped at 60FPS by the developer!? THE NERVE! CATER TO ME, A REASONABLE AND NORMAL REPRESENTATION OF ALL GAMERS!"
 
I still find it amazing how quickly I got spoiled on 144Hz, and then 1440p. Even more amazing is how sneakily my change in expectations followed: "Whaddya mean this game is capped at 60FPS by the developer!? THE NERVE! CATER TO ME, A REASONABLE AND NORMAL REPRESENTATION OF ALL GAMERS!"

I'm actually all for my expectations getting to that point. I'm just not prepared to be disappointed though, so I'll wait until I'm reasonably sure I don't need to spend a couple grand just to ensure that. :D I'd rather stay at peasant-level resolution with god-like settings, at a middle-class budget. hehehe
 
Don't you usually wait to see what it ACTUALLY plays at? The Witcher seemed smooth enough to me. This game isn't due out for a while. Who knows, we might even have the next version of RTX by the time it ships, and a usable version in the mid-range sector. So far the game looks pretty nice on current hardware. There's only room for improvement from here.

There are several factors that make me say this. One is that this dev deserves the benefit of the doubt. Next is that the game is already playable from end to end (though they're obviously still working on it). The engine already runs smoothly, and has from what I've heard for some time. They wouldn't have had RTX hardware to run on for the earlier press previews. And it's still a ways off. To me that more or less combines to equal them not fucking it up. Of course, as I said above (and it works both ways) we have to actually wait and see. I'm just predicting good things.

These guys make a point of knowing their audience. They don't typically screw over PC users for console performance levels either. They're one of the few devs that I believe aim to impress people just to show how cool they are. :D
Isn't that exactly what they did with Witcher 3? The E3 footage was superior and was dumbed down later for consoles, IIRC.
 
Huh? Almost everyone plays at 1080p. Even many here at the [H] play at 1080p. I'm a UW user myself, but I won't overlook the fact that the gaming world for the most part is still 1080p.
Who the fuck here plays at 1080p with a GTX 1080 Ti, though? So maybe actually pay attention to the context next time, as this was about a 1080 Ti only getting 30 FPS at 1080p.
 
Who the fuck here plays at 1080p with a GTX 1080 Ti, though?
I'd think it would be a reasonable choice for users of 240Hz panels. My guess is that would be a niche within a niche, but as alluded to previously, my guesses are suspect. XD
 
I'd think it would be a reasonable choice for users of 240Hz panels. My guess is that would be a niche within a niche, but as alluded to previously, my guesses are suspect. XD
Even with the fastest CPU out there, many games would be CPU-limited at that resolution with a GTX 1080 Ti, and you are just as likely to get the same playable performance at 1440p.
 
I'd rather play at 1080p with maximum eye-candy at 60 fps... than a higher resolution with lower quality settings at 60 fps.

Ultra-wides... now that might be worth dialing back a few things.

What are the odds this game will still be DX11?

Does DX11 allow for ray tracing?
 
I'd rather play at 1080p with maximum eye-candy at 60 fps... than a higher resolution with lower quality settings at 60 fps.

Ultra-wides... now that might be worth dialing back a few things.

What are the odds this game will still be DX11?

Does DX11 allow for ray tracing?
Okay, let's use some basic common sense here. The type of person that's going to have something like a 2080 Ti, which is 1,200 freaking dollars, is sure as hell not going to be using a 1080p monitor. They will have a 1440p at the very least and probably a 4K, so using 1080p will look like blurry dogshit compared to native resolution. If you can't tell how shitty it looks to play below native resolution, then you are laughably insane to talk about needing the best visuals.
 
I'd rather play at 1080p with maximum eye-candy at 60 fps... than a higher resolution with lower quality settings at 60 fps.

Ultra-wides... now that might be worth dialing back a few things.

What are the odds this game will still be DX11?

Does DX11 allow for ray tracing?
Well, technically any API can do ray tracing. You just have to create the functions manually. I don't think developers are going to put in all the work to support an older API when it is built into the current one, especially when there is hardware that leverages it, so chances are zero that there will be an implementation in DX11.
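
To make "create the functions manually" concrete, here is a toy sketch: the core of a ray tracer is just intersection math, runnable on the CPU with no graphics API at all. Names and values are purely illustrative.

    // Toy CPU ray/sphere intersection -- the math any API (or none) can do.
    // Solves |o + t*d - c|^2 = r^2 for the nearest hit distance t, or -1.
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3 sub(Vec3 a, Vec3 b)  { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

    static float ray_sphere(Vec3 o, Vec3 d, Vec3 c, float r) {
        Vec3 oc = sub(o, c);                 // d is assumed normalized (a == 1)
        float b = 2.0f * dot(oc, d);
        float cterm = dot(oc, oc) - r * r;
        float disc = b * b - 4.0f * cterm;
        if (disc < 0.0f) return -1.0f;       // ray misses the sphere
        float t = (-b - std::sqrt(disc)) / 2.0f;
        return t >= 0.0f ? t : -1.0f;
    }

    int main() {
        Vec3 origin{0, 0, 0}, dir{0, 0, 1}, center{0, 0, 5};
        printf("hit at t = %.2f\n", ray_sphere(origin, dir, center, 1.0f)); // 4.00
        return 0;
    }

The catch has never been whether you can write this; it's doing it per pixel, per bounce, in real time that is brutally expensive without hardware help, which is the whole pitch for RTX.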
 
Huh? Almost everyone plays at 1080p. Even many here at the [H] play at 1080p. I'm a UW user myself, but I won't overlook the fact that the gaming world for the most part is still 1080p.

If you are limited to 1080p and getting less than 60fps with RTX engaged on a 2080 Ti, then the feature will be disabled by almost everyone.
 
Don't you usually wait to see what it ACTUALLY plays at? The Witcher seemed smooth enough to me. This game isn't due out for a while. Who knows, we might even have the next version of RTX by the time it ships, and a usable version in the mid-range sector. So far the game looks pretty nice on current hardware. There's only room for improvement from here.

There are several factors that make me say this. One is that this dev deserves the benefit of the doubt. Next is that the game is already playable from end to end (though they're obviously still working on it). The engine already runs smoothly, and has from what I've heard for some time. They wouldn't have had RTX hardware to run on for the earlier press previews. And it's still a ways off. To me that more or less combines to equal them not fucking it up. Of course, as I said above (and it works both ways) we have to actually wait and see. I'm just predicting good things.

These guys make a point of knowing their audience. They don't typically screw over PC users for console performance levels either. They're one of the few devs that I believe aim to impress people just to show how cool they are. :D

Wish in one hand, crap in the other, see which hand fills up first...
 
If you are limited to 1080p and getting less than 60fps with RTX engaged on a 2080 Ti, then the feature will be disabled by almost everyone.

Maybe so; however, current devs are shooting for 60fps at 1080p with RTX. Let's wait until we see official benchmarks before declaring it dead.
 
Who the fuck here plays at 1080p with a GTX 1080 Ti, though? So maybe actually pay attention to the context next time, as this was about a 1080 Ti only getting 30 FPS at 1080p.

I would totally. I have a GTX 1070 that does just fine, but I've occasionally had it dipping below 60 with maximum settings in certain games. I'd totally bump it up to a 1080 Ti or even an RTX card (which I intend to once reviews hit) and still play at 1080p.
 
Maybe so; however, current devs are shooting for 60fps at 1080p with RTX. Let's wait until we see official benchmarks before declaring it dead.
As I understand it, even though it may not be very impressive to enthusiasts, isn't 1080p 60Hz real-time raytracing kind of a phenomenal thing? I mean... I remember not long ago there was a video floating around of someone who'd managed to raytrace Quake at a playable FPS, and that was amazing at the time.
 
If you are limited to 1080p and getting less than 60fps with RTX engaged on a 2080 Ti, then the feature will be disabled by almost everyone.

Right, but who's telling you that will be the case?

Until there's proper reviews using proper games, nobody knows. Also, and this has been discussed in other threads, but many of us are hypothesizing that ray tracing or other RTX features could be used in part, or in trickier ways implemented by skilled developers that could potentially increase visuals and performance to some degree. It's what-iffery at the moment, but so is making 1080 and sub-60fps statements.

Not everyone needs to push all aspects 100% either. Here's an example I used from another thread. What about a really fancy new Tron game? The scenes would be geometrically simplistic (relative to many other types of game), but the way the shadows and surfaces would be drawn would be incredible, and probably wouldn't use every ounce of power to do it.

I think people just automatically assume "ray traced Crysis" for every game and every situation. Then, sure, you might see some slower frame rates.

I don't think it's black or white. It will depend heavily on exactly which features of which tech are implemented in a given game and engine.
 
"Raytracing is rigged against us!"

You guys are so tragic. CDPR isn't the kind of developer that engages in petty, tribal GPU politics, or lets anyone "cripple" anything in their game. It's a single toggle that won't even be on by default.

Get over it and start demanding more of your GPU company.

I dunno, didn't Witcher 3 use absurdly high x64 tessellation for HairWorks, when dropping it down to x8 or x16 minimally affected the visuals and greatly increased performance (on all hardware)?

Not saying it was on purpose or if it was just a shitty bit of programming. But I also recall that CDPR allegedly didn't have full access to the HairWorks code that was implemented.

But fuck it, it was 3 years ago.
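
For a rough sense of why that factor mattered: triangle output per tessellated patch grows roughly with the square of the factor, so x64 versus x16 is on the order of 16x the geometry for hair that looks nearly the same. A back-of-envelope sketch (exact counts depend on the partitioning mode; this is illustrative only):

    // Back-of-envelope only: triangle output per patch scales roughly with
    // the square of the tessellation factor (exact counts depend on the
    // partitioning mode chosen by the hull shader).
    #include <cstdio>

    int main() {
        const int factors[] = {8, 16, 64};
        for (int f : factors) {
            long tris = 2L * f * f;   // ~2 triangles per subdivided grid cell
            printf("tess factor x%-2d -> ~%ld triangles per patch\n", f, tris);
        }
        return 0;
    }

By that rough math, x64 does about sixteen times the work of x16 per patch, which lines up with why capping the factor recovered so much performance.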
 
With how much I enjoyed The Witcher 3... and how much I love the whole cyberpunk theme... this is a must buy.

Perhaps it's time to build a new PC as well... just to make sure... ;)
 
Right, but who's telling you that will be the case?

Until there's proper reviews using proper games, nobody knows. Also, and this has been discussed in other threads, but many of us are hypothesizing that ray tracing or other RTX features could be used in part, or in trickier ways implemented by skilled developers that could potentially increase visuals and performance to some degree. It's what-iffery at the moment, but so is making 1080 and sub-60fps statements.

Not everyone needs to push all aspects 100% either. Here's an example I used from another thread. What about a really fancy new Tron game? The scenes would be geometrically simplistic (relative to many other types of game), but the way the shadows and surfaces would be drawn would be incredible, and probably wouldn't use every ounce of power to do it.

I think people just automatically assume "ray traced Crysis" for every game and every situation. Then, sure, you might see some slower frame rates.

I don't think it's black or white. It will depend heavily on exactly which features of which tech are implemented in a given game and engine.

Have you seen what Cyberpunk looks like? It's not going to be a simplistic-looking game... We will see when the cards get released, but I would bet money that RTX raytracing cripples resolution and framerates in EVERY game it's used in.
 