Cyberpunk is Available - Lets share how your GPU/CPU is performing

So this is interesting: I played a little with HDR last night (latest drivers) on my Sony X900 in game mode with FALD, at 1440p120 on an RTX 3080, and I was very glad I knew about the elevated black levels in advance. Long story short, it's clear that both SDR and HDR are a little broken right now, but that the game was really designed around HDR.
That is to say, it's not an afterthought: the lighting really pops in HDR, and many scripted sequences make explicit use of HDR techniques.
When I first played the game, things looked a little flat on my 27" QHD G-Sync monitor, and I thought I didn't have all the effects turned on, or that DLSS needed a sharpening filter (it does in this implementation). But the flatness is partly a tone mapping issue in SDR, I think, as others have noted after seeing the HDR mode on a high-quality HDR device.
The Nvidia driver issue is a pain because it stays broken after you turn off HDR, so I hope they fix it soon; I don't want to roll back drivers for other reasons.
 
The issue is fixed, look at the most recent hotfix driver.
 
G-Sync saves every game, except for ultra-competitive esports games.
I would be inclined to agree with you, until Cyberpunk. My 1080 Ti at 3440x1440 is struggling, and not even G-Sync can save it, as I'm finally dipping below the G-Sync range on this title; it's the first to do that to me. In fact, everything else (except Flight Simulator 2020) has run at 60 FPS average or above so far, and as long as it didn't dip below the low 40s, it felt smooth as butter on my Alienware AW3418DW with G-Sync.
 
Part of the issue at lower average frame rates is that you tend to get pretty major hitching and massive frame pacing spikes. This was well covered in GN's recent video on the CPU's influence on Cyberpunk's game experience.
If you look at a full-game frame pacing graph, you can see that even when the average frame rate is 40, there are frequent frame time spikes in the hundreds of milliseconds, which feel awful. That's not something G-Sync can fix. If, on the other hand, you can get your 0.1% lows to be reasonable (i.e., no big spikes), even a "locked" 30 FPS won't feel as bad.
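The gap between an okay average and awful worst-case frames is easy to see if you compute both from a frame-time log yourself. A rough sketch (the frame times below are made-up numbers, not measured data):

```python
# Hypothetical frame-time capture (ms): mostly ~25 ms frames (40 FPS),
# plus a few 100+ ms hitches of the kind frame pacing graphs reveal.
frametimes = [25.0] * 997 + [120.0, 150.0, 180.0]

# Average FPS looks fine...
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))

# ...but the "0.1% low" (the slowest 0.1% of frames) tells the real story.
n_worst = max(1, len(frametimes) // 1000)
worst = sorted(frametimes)[-n_worst:]
low_01_fps = 1000.0 / (sum(worst) / len(worst))

print(f"avg: {avg_fps:.1f} FPS, 0.1% low: {low_01_fps:.1f} FPS")
```

Here the average stays near 40 FPS while the 0.1% low collapses into single digits, which is why the game can feel terrible even when the average looks acceptable.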
 
It doesn't 100% "fix" it, but it absolutely makes for a smoother experience where those sub-40 FPS stretches are far easier to tolerate.
 
I don't have G-Sync or even working* FreeSync on any of my monitors, and Cyberpunk is the second game I can recall having to play at 1440p under 60 fps (the other being MSFS 2020).

Even without G-Sync, the fact that my monitor has a 165 Hz refresh rate makes a constant ~45 fps look surprisingly smooth; screen tearing isn't anywhere near as much of a problem on a high-refresh monitor, since each frame spans 3+ full refreshes. By contrast, 45 fps on a 60 Hz monitor normally looks awful, because each frame only spans about 1.3 refreshes, and it's very noticeable when it tears or refreshes non-uniformly.
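The refresh arithmetic behind that is simple to check. A toy calculation using the refresh rates mentioned above (nothing game-specific):

```python
# How many display refreshes each rendered frame spans without VRR.
def refreshes_per_frame(refresh_hz: float, fps: float) -> float:
    return refresh_hz / fps

# 45 FPS on a 165 Hz panel: each frame spans ~3.7 refreshes, so any tear
# line exists for only a small fraction of the frame's time on screen.
high = refreshes_per_frame(165, 45)

# 45 FPS on a 60 Hz panel: only ~1.3 refreshes per frame, so the cadence
# alternates between 1 and 2 refreshes per frame and judder is obvious.
low = refreshes_per_frame(60, 45)

print(f"165 Hz: {high:.2f} refreshes/frame, 60 Hz: {low:.2f}")
```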

Someday I'll try the magic of G-Sync, but probably not till my Korean no-name* monitors die on me, which will likely be after a new GPU. They've done well for the cost.

*They claim FreeSync like they also claim HDR; both were lies. Which is fine, I knew that going in.
-----
Haven't played since the 1.05 Hotfix yet. Curious how well it works.
 
G-Sync is the only thing letting me enjoy this game at 4K with RT on right now. I got so used to so many games at 90+ FPS that I forgot how awesome G-Sync can be when it's needed.
 
After spending several days with this game, and going from max settings to medium, and various levels of ray tracing and DLSS, I finally found a good mix.

I used the settings from this article, with RT on, RT lighting set to medium, and RT shadows disabled. https://www.pcgamer.com/the-best-settings-for-cyberpunk-2077/

Finally, I can get to 60 fps with ray tracing and the game looks absolutely phenomenal. Better than maybe anything.
 
I personally did not like RT shadows off, mainly because it was obvious in daylight. RT shadows produce a nice smooth transition; without them, for me anyway, it looked like just a line... lol.

Oddly, disabling it only gained me 2 or 3 FPS; my biggest hit is by far the RT lighting.
 
Hmm, maybe I'll try enabling it again now that it's performing well. It didn't seem to make a huge picture quality difference compared to the other RT settings, but it definitely looks better on.
 
I originally went balls to the wall with ultra and full ray tracing. It did look nice, and I was surprised the 2080 Ti could still keep up at around 60 fps (3440x1440).

However, once I got to the open-world sections, populated city areas, etc., it was dropping to around 40 fps, which for me is unacceptable.

I've been on high refresh for a while, and anything under 90 fps starts to feel choppy. I could accept a locked 60 fps, but dropping to 45 fps (probably in a fight scene, exactly when you need the performance) ruined the game.
I don't know, man. I've had high refresh rate monitors since they came out, and I've never shared that sense of needing a super high framerate, especially given there are other things to worry about, like resolution and graphical fidelity.


It would be one thing if RT on and off were a subtle difference. But it's not: just look at that comparison in the thumbnail. Would you really rather look at the image on the right just to have a slightly smoother game experience? Is that what's unacceptable, and not the huge graphical downgrade?
 
Except that one scene is the only one that really stood out to me as looking noticeably different; it does look way better there with RT on. Overall, RT on does look better, and I don't think there's any arguing that, but almost every other example in the video was pretty subtle. In motion, rather than in still pictures, I can rarely tell the difference outside of some of the RT reflections and the occasional more realistic shadow. And using DLSS to make things playable at times blurs textures way too much, making it obvious the image is being upscaled, even with the balanced and quality settings. Example: DLSS example
 
After doing more testing, I decided to go with full ray tracing and mostly max settings, aside from a couple (see my previous posts).

Now with a 2080 Ti I'm getting close to 60 fps, dropping to maybe 45 in heavy city scenes, but mostly staying in the mid 50s and 60s, which is playable.

The difference in picture quality is too great, and I spent all this money on the 2080 Ti; might as well see what it can do.
 
Settings are dialed in, but I'm already over the game (for now at least). I had this feeling everything was about to fall apart at any moment, then realized it already had; I'm not going to force it.

Some really good ideas in there, and boy does it look good. But it's also hilariously bad so often (the AI, for instance, or the ABSOLUTE LACK of it).

I guess it was a demo. Glad I tried it for cheap.
 
Tried the image quality fix, and it makes balanced mode look a bit better than before; nothing too amazing, but enough of an improvement for me to use it. The game runs in the min 70s in heavily populated areas in the daytime on balanced, where the high 60s were the norm for the same areas in quality mode. Performance mode seemed to blur things enough to be noticeable. TBH, I almost prefer playing with some DLSS on, as at native resolution finer thin lines (like trees and vegetation) look all jagged and stick out more; DLSS cleans them up and blurs them a bit so they don't stick out like a sore thumb. Ultra performance and performance modes have a flickering-lights effect from time to time and don't really do much for FPS for me.
 
It's really interesting to hear your experience with the game regarding HDR/SDR and tone mapping. I'm playing on a non-HDR computer monitor, and often when I'm just looking around, or in a shaded area, the tones are completely flat, as if there's virtually no lighting going on. I couldn't imagine this was what the developer intended, because so many areas just come across as flat and completely unlit. Every time I came across one, I would open my graphics settings and click through the different sliders to see if a setting was bugged or something wasn't displaying correctly, but time and time again nothing changed the look of the scene, whether maxed out or dropped to low. It's really unfortunate, because a living, breathing neon city of the night should pop around every corner, not leave me scratching my head.
 
I am limited to 1080p, and this is Ultra, but I have dropped back to High settings for outdoor areas, trying to find that happy 60 fps minimum. 3700X / RX 5700 (flashed).

 
The game is behaving much better between the Nvidia hotfix driver and the 1.05 patch from CDPR. HDR is more than "good enough" to play now, for sure. You can obviously tweak it more, but at least the whole experience is far more polished "as intended" now, rather than a buggy, washed-out mess. Unfortunately, while I was waiting for the patches I got hooked on Gunfire Reborn, so I haven't had the desire to dive completely back into Cyberpunk yet. I will over the long weekend, though.
 
I upgraded from a 2060 to a 2080 super for this game; I didn't shoot nearly high enough.
Yeah, I have a 2080 Ti, and that is basically the minimum spec if you want to play with everything and ray tracing turned up.

It took me a while to accept that I wasn't going to get more than 60 fps (I usually play at 160 Hz), but I'm happy with how it looks.

It drops to around 45 (or maybe lower) in heavy city driving, but on most missions or indoors I'm at around 55-60 fps.
 
I played the game at 30 fps on a 2060 Super because RT was that good. I wanted an AMD card, but not anymore.
 
Nice. RT is really worth it in this game. I mean, I thought Control and Metro were good, but this blows them away.
High textures but medium settings, RT on but with RT lighting on low, DLSS on Auto (balanced degraded the image). Fantastic image.
I'm willing to buy a slightly marked-up 3080 to push the settings higher at 40+ fps, which is good enough for G-Sync to smooth out so the game feels like 60.

I think this is the game Nvidia needed when the 2000 series launched, but now they can assert their dominance once again.
Everyone is saying it runs better on PC, which means more sales to console gamers who are now willing to upgrade or build a PC to play this game.

I built my first PC to play Crysis.
 
The first rig I built was when Unreal 1 came out; that was crazy back then. That machine (with several upgrades) got me through college.

Then I built a new machine for Crysis when it was new. I played it on a 720p TV and had to use a DX9 mod, but the graphics were so good.

Would have probably got a new machine for Cyberpunk, but of course you can't buy anything now. Maybe for the best.

By the time GPU inventory comes back, maybe they will fix all the bugs and I can play the Game of the Year edition or something with better framerate.
 
5900x/3080 FE 100% stock power target and clocks.

1440p ultrawide, high settings (medium cascaded shadows), medium RT, DLSS balanced: lows in the mid 50s, but it typically hovered between 60-75 FPS. *I had capped FPS in-game to 75 at the time, so it could have been even higher.
1440p ultrawide, high settings (medium cascaded shadows), RT off, DLSS quality: lows in the mid 50s as before, but average FPS around 100-115. *I have a global FPS cap of 117 because of my 120 Hz refresh rate and G-Sync.
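That 117 FPS cap on a 120 Hz panel follows the common rule of thumb of capping a few FPS below the refresh ceiling so the frame rate stays inside the G-Sync range instead of hitting V-Sync behavior. A minimal sketch, assuming the widely cited "about 3 FPS below refresh" guideline (not an official NVIDIA formula):

```python
# Rule-of-thumb FPS cap to stay inside the VRR window.
# Assumption: the common "~3 FPS below refresh" community guideline.
def vrr_fps_cap(refresh_hz: int, margin: int = 3) -> int:
    return refresh_hz - margin

print(vrr_fps_cap(120))  # 117, the cap used above
print(vrr_fps_cap(165))  # 162
```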

One thing I did notice after turning RT off and setting DLSS to quality is that there seemed to be a lot more NPCs in the city while driving around. I'm not sure if it's just something I'd never noticed until today. Coincidentally, it was exactly while I was driving around and seeing all these NPCs that my FPS dropped into the 50s.
 
Probably not related to settings. I've been tweaking the game for about a week and haven't noticed that. However, NPC density varies widely; some areas are dense, and then it will get deserted for no reason.
 
Correct me if I'm wrong, but I think it's the first game to have RT reflections, shadows, and lighting, all at the same time.
 
I've done a bit of testing on this. I'm still gathering data.

Ryzen 9 3950X (PB2, AutoOC) w/ RTX 2080 Ti @ 3440x1440 runs at about 45 FPS using the Ray Tracing Ultra preset with DLSS set to quality.
Ryzen 9 3950X (PB2, AutoOC) w/ RTX 3090 FE @ 3840x2160 runs at about 45 FPS using the Ray Tracing Ultra preset with DLSS set to quality.
Ryzen 9 3950X (PB2, AutoOC) w/ RTX 3090 FE @ 3840x2160 runs at about 55 FPS using the Ray Tracing Ultra preset with DLSS set to performance.
Intel Core i9 10900K (stock clocks) w/ RTX 3090 FE @ 3840x2160 runs at about 58 FPS using the Ray Tracing Ultra preset with DLSS set to quality.
Intel Core i9 10900K (stock clocks) w/ RTX 2080 Super @ 1920x1080 averages 124 FPS. When overclocked to 5.1 GHz, it averaged 119. This was using the low preset.
Intel Core i9 10900K (stock clocks) w/ RTX 2080 Super @ 3840x2160 using the Ray Tracing Ultra preset with DLSS set to quality wasn't playable at all. It got low single-digit FPS.

I am unsure why the Core i9 10900K is so much faster in this game; it shouldn't be, especially at 4K, where I should be almost entirely GPU bound. I know there is a slight hit for HDR, and that's the main difference between the 10900K and the 3950X test setups, as the GPU is the same; the Intel system was connected to a display without HDR support.
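For context on the DLSS modes in those results: each mode renders internally at a fixed fraction of the output resolution and upscales from there. A quick sketch using the per-axis scale factors commonly cited for DLSS 2.x (approximate assumptions; the game's exact scales may differ):

```python
# Commonly cited DLSS 2.x per-axis render scales (assumed, not game-verified).
SCALES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_res(width: int, height: int, mode: str) -> tuple:
    """Approximate internal render resolution before DLSS upscaling."""
    s = SCALES[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "quality"))      # (2560, 1440)
print(internal_res(3840, 2160, "performance"))  # (1920, 1080)
```

So the 4K DLSS quality runs above were rendering at roughly 1440p internally before upscaling, which is a big part of why ray tracing stays playable at all at "4K."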
 
Dan_D, what kind of gaming clocks is the 3950X pushing like that, as in all cores?

I sold my early-batch Ryzen 5 3600, as that thing liked auto overclock +200 MHz, which let it boost to 4.4 GHz in gaming.
 