Cyberpunk 2077 to Support NVIDIA RTX Raytracing

Right, but who's telling you that will be the case?

Until there are proper reviews using real games, nobody knows. Also, as has been discussed in other threads, many of us are hypothesizing that ray tracing and other RTX features could be used selectively, or implemented in cleverer ways by skilled developers, potentially improving visuals and performance to some degree. It's what-iffery at the moment, but so are the 1080p and sub-60fps claims.

Not everyone needs to push all aspects 100% either. Here's an example I used in another thread: what about a really fancy new Tron game? The scenes would be geometrically simple (relative to many other types of game), but the way the shadows and surfaces would be drawn would be incredible, and it probably wouldn't use every ounce of power to do it.

I think people just automatically assume "ray traced Crysis" for every game and every situation. Then, sure, you might see some slower frame rates.

I don't think it's black or white. It will depend heavily on exactly which features of which tech are implemented in a given game and engine.

Have you seen what Cyberpunk looks like? It's not going to be a simplistic-looking game... we'll see when the cards get released, but I would bet money that RTX raytracing cripples resolution and framerates in EVERY game it's used in.
 
Maybe so; however, current devs are shooting for 60fps at 1080p with RTX. Let's wait until we see official benchmarks before declaring it dead.

Nah, for $1200, they can keep their BS. I won't buy into it.
 
Iunno, didn't Witcher 3 use an absurdly high x64 tessellation factor for HairWorks? Dropping it down to x8 or x16 minimally affected the visuals and greatly increased performance (on all hardware).

Not saying it was on purpose or just a shitty bit of programming, but I also recall that CDPR allegedly didn't have full access to the HairWorks code that was implemented.

But fuck it, it was 3 years ago

Yes, but the NV cards ran it just fine at x64, so why wouldn't that be the default from NV's point of view? Here, we helped implement this feature; here's what we recommend you run it at on our hardware. I'm sure someone knew it would run like crap on AMD hardware, but whose responsibility is it to make sure a game runs well on other GPUs? The dev's, and that particular GPU company's. Maybe AMD should have worked more closely with CDPR toward the end of the development cycle to make sure the game's settings all worked on their hardware. That's playing devil's advocate a bit, but I don't really go in for all this your-GPU-company-is-slimier-than-mine crap.
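For a sense of why that factor matters: tessellation work grows roughly with the square of the factor, so x64 is on the order of sixteen times the triangles of x16 for a barely visible difference. A back-of-the-envelope sketch (the quadratic scaling is a simplification; real amplification depends on the patch type and partitioning mode):

#include <cstdio>
#include <initializer_list>

// Rough triangle amplification per patch at different tessellation factors.
// Assumes ~factor^2 triangles per patch, an order-of-magnitude model only.
int main() {
    const int baseline = 16;
    for (int factor : {8, 16, 64}) {
        int tris = factor * factor;
        printf("x%-2d -> ~%4d triangles/patch (%.2fx the work of x%d)\n",
               factor, tris, (double)tris / (baseline * baseline), baseline);
    }
    return 0;
}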
 
Have you seen what Cyberpunk looks like? It's not going to be a simplistic-looking game... we'll see when the cards get released, but I would bet money that RTX raytracing cripples resolution and framerates in EVERY game it's used in.

You do understand that raytracing won't even be on by default, and can be toggled on or off with a single click, yes?

All of this weeping into pillows over one optional graphical feature is ponderous.
 
Have you seen what Cyberpunk looks like? It's not going to be a simplistic-looking game... we'll see when the cards get released, but I would bet money that RTX raytracing cripples resolution and framerates in EVERY game it's used in.

Well, yes actually. I think it's gorgeous, and it would probably push this tech to its limits. You may even be 100% correct in the end. However, with so much time left in development, the possibility that one more RTX generation could surface by release, and the fact that these are talented developers with plenty of resources available, I think there's a good chance they'll find a good tradeoff between features and performance.

All I'm saying is that I'm not too worried about it. I'll play at the highest settings I can without sacrificing a good frame rate, and if that's not possible with what I own (or doesn't look sufficiently good to me), I'll buy some more hardware.
 
Well, yes actually. I think it's gorgeous, and it would probably push this tech to its limits. You may even be 100% correct in the end. However, with so much time left in development, the possibility that one more RTX generation could surface by release, and the fact that these are talented developers with plenty of resources available, I think there's a good chance they'll find a good tradeoff between features and performance.

If anything, it'll probably be something you can turn on or off: either you want fidelity or you want performance.
 
If anything, it'll probably be something you can turn on or off: either you want fidelity or you want performance.

Exactly. I think his point was that most people would turn it off due to the performance hit. However, until we all know what that looks like on RTX hardware, it's all just guessing. IMO if anyone is going to push it and make it work well, this would be one of the devs that might be able to do it. Another would be id Software. Still another might be anyone left from Starbreeze. Possibly Epic too, but you know, all they care about now is raking in Fortnite money. Fortnite RT? :p
 
Exactly. I think his point was that most people would turn it off due to the performance hit. However, until we all know what that looks like on RTX hardware, it's all just guessing. IMO if anyone is going to push it and make it work well, this would be one of the devs that might be able to do it. Another would be id Software. Still another might be anyone left from Starbreeze. Possibly Epic too, but you know, all they care about now is raking in Fortnite money. Fortnite RT? :p

While I do believe CDPR's devs are extremely competent, I just feel the performance won't be there to run it properly. Kind of reminds me of HairWorks in Witcher 3: the only hardware at the time that could run it properly was the Titan X (Maxwell).
 
The secret to enjoying ray tracing is to make sure you never display your FPS =P

Honestly though, I was really hoping that with a 2080 I'd be able to play at 4K/60fps with decent settings. I know it's only alpha footage, but it's kind of worrisome that a 1080 Ti could only pull 1080p at 60fps.
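One reason the jump matters: 4K pushes four times the pixels of 1080p, and with roughly one ray per pixel per effect, the ray budget scales about the same way. A quick sanity check (this ignores denoising, BVH builds, and everything else that doesn't scale with resolution):

#include <cstdio>

// Pixel counts: with ~1 ray per pixel per effect, ray cost scales with these.
int main() {
    const long long p1080 = 1920LL * 1080;  // 2,073,600 pixels
    const long long p4k   = 3840LL * 2160;  // 8,294,400 pixels
    printf("1080p: %lld pixels\n4K:    %lld pixels (%.1fx)\n",
           p1080, p4k, (double)p4k / p1080);
    return 0;
}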
 
The secret to enjoying ray tracing is to make sure you never display your FPS =P

Honestly though, I was really hoping that with a 2080 I'd be able to play at 4K/60fps with decent settings. I know it's only alpha footage, but it's kind of worrisome that a 1080 Ti could only pull 1080p at 60fps.

I thought it said they were only doing 30fps.
 
Eh, I'm not really into the whole sci-fi robotesque thing, at least not enough to be pulled onto the hype train for this.

I'd rather CDPR focus on something somewhat similar to TW3.

But I'm happy for their success, and I envy the completely different project they're taking on.

Gameplay looks fun, and the graphics are good too.
 
Alpha code, non-release drivers, unreleased hardware, possibly six months to two years of development left on the game... I wouldn't set anything in stone yet. It's totally possible that performance could suck. I just tend to think that by the time the game is done, the hardware is out, the drivers are mature, and the APIs have possibly been updated, things could look very different. Who knows, by the time it releases AMD could blow the doors off everything with a ray tracing monster they've managed to keep under wraps all this time... It could... happen...
 
Might be more viable on a 3080 by the time the game is released
 
A lot of games ran like shit when anti-aliasing first came out. I loved going back and replaying those games, or trying games I'd missed, with new hardware and AA maxed out for an even better experience. I think that's how we're going to enjoy this new technology.
 
I'd rather play at 1080p with maximum eye-candy at 60 fps... than a higher resolution with lower quality settings at 60 fps.

Ultra-wides... now that might be worth dialing back a few things.

What are the odds this game will still be DX11?

Does DX11 allow for ray tracing?

The game will certainly have a DX11 option, and I'm fairly certain the majority of the backend is written against DX11.

You *can* do ray tracing in DX11, but it's very unoptimized; there's a reason nobody really does it. (DXR itself is exposed only through DX12, so a DX11 path would have to roll its own compute-shader tracing.) Even with DX12 I can only see a handful of use cases, as the hardware isn't there yet.
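For reference, hardware ray tracing support is queried through DX12's feature-support API. A minimal sketch of the check, Windows-only and assuming an SDK new enough to define D3D12_FEATURE_D3D12_OPTIONS5:

#include <windows.h>
#include <d3d12.h>

// Ask the DX12 device whether it exposes DXR at all.
bool SupportsHardwareRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;  // older runtime/driver: the query itself fails
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}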
 
As I understand it, even though it may not be very impressive to enthusiasts, isn't 1080p/60fps real-time raytracing kind of a phenomenal thing? I mean, I remember not long ago there was a video floating around of someone who'd managed to raytrace Quake at a playable FPS, and that was amazing at the time.

The difference is that the Quake version was entirely raytraced; RTX is hybrid raytracing. The whole scene isn't raytraced, just the somewhat easier effects that have the biggest visual impact. It is impressive, but it was only a matter of time before this became possible.
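To make "hybrid" concrete, here's a conceptual frame loop with every stage stubbed out. The function names are illustrative only, not from RTX, any engine, or the Quake port:

// Conceptual hybrid frame: rasterization handles primary visibility,
// and rays are spent only on the effects where they buy the most.
// All stages are empty stubs; the names are illustrative, not real APIs.
void RasterizeGBuffer()    {}  // depth/normals/materials via rasterization
void TraceShadowRays()     {}  // a ray or two per pixel toward key lights
void TraceReflectionRays() {}  // rays only for sufficiently glossy pixels
void DenoiseRayOutputs()   {}  // filter the sparse, noisy ray-traced signals
void ShadeAndComposite()   {}  // combine raster lighting with ray-traced terms

void RenderFrame()
{
    RasterizeGBuffer();
    TraceShadowRays();
    TraceReflectionRays();
    DenoiseRayOutputs();
    ShadeAndComposite();
}

int main() { RenderFrame(); return 0; }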
 
Sweet lawd, are you guys really whining about how a game with no release date will perform on a graphics card that isn't out yet, using new GPU tech that hasn't shipped yet either? :confused:

Then we wonder why Nvidia has no real competition and most game devs are looking for a MP cash cow to sit on. "The only way to win is not to play the game"
 
It doesn't matter whether or not it uses Vulkan. Memory management and other high-level features are easily ported to both the version of DirectX used on the Xbox One and Sony's proprietary GNMX API. Low-level is where it gets sketchy, because GNM on the PS4 has no relation at all to either OpenGL or Vulkan, and Xbox DirectX doesn't map cleanly to DirectX 12 either (although I think there was an update in the past year or so that brought parity between PC DX12 and XB1 DX12).

I don't think Cyberpunk will be out till the PS5/XB2 are on shelves. Probably too graphically demanding for current-gen consoles. Just my guess.
 
Eh, I'm not really into the whole sci-fi robotesque thing, at least not enough to be pulled onto the hype train for this.

I'd rather CDPR focus on something somewhat similar to TW3.

But I'm happy for their success, and I envy the completely different project they're taking on.

Gameplay looks fun, and the graphics are good too.

Yeah same here. Something about walking around a shithole fails to draw me in.
 
Yeah same here. Something about walking around a shithole fails to draw me in.

You have to assume it's not all a shithole, and that not being in the shitter would be part of the game; the TW series was a bit similar.
 
While the idea of ray tracing is cool to me, visually I'm not the biggest fan. If they said ray tracing tech would get you more frames per second / speed all your shit up, I'd be all for it. BUT it looks like it's just going to be another type of rendering that slows down your FPS yet again.

When Nvidia announced HairWorks in Witcher 3 (I was on my Titan X Maxwell at the time), I thought, cool. Cranked everything up and looked at it for a few minutes. It looked cool, but I lost a bit of "felt" performance (I didn't benchmark actual FPS numbers). A second later I turned that shit right back off and went back to normal graphics (everything cranked except HairWorks). I don't know about most people, but I don't spend hours staring at a wall or the side of a car looking for reflections of stuff that happens to be nearby. But maybe that's just me.

Maybe once 2nd- or 3rd-gen ray tracing hits and it becomes more ubiquitous among game developers, I'll circle back to it, but for now I think I'll hold onto my Pascal setups a while longer.
 
The developers said that the screenshot is a fake, so you can all calm down now :rolleyes:
 