Cyberpunk is Available - Let's share how your GPU/CPU is performing

I moved from a 1080ti to a 3080 and from high details to Ultra. With raytracing and DLSS. No other hardware changes. Display is 3440x1440.

I was surprised to see CPU use go from averaging in the 40s-50s to averaging in the high 60s to 70s. My CPU now hits the 90s when driving through the city, and occasionally, though rarely, spikes briefly to 100%. Before the 3080, it peaked around 60%.

i7-6950X: 10 cores/20 threads at 4.1 GHz across all cores.

Framerate hovers around 60 FPS with DLSS and everything maxed out on the 3080, but was in the high 30s to low 40s on the 1080 Ti with mostly high settings (no ray tracing, of course).
 
I turned Afterburner back on to see how much VRAM the game is using at 1080p, since I have Image Sharpening set to 80% and Surface Format Optimization enabled on the AMD driver side, plus the in-game settings. Over 6 GB used, and not a lot of headroom left on an 8 GB card.

 
Well, it's a good thing for you that the game scales across more cores, because your CPU is actually quite slow for a modern CPU-limited game; most games still really only care about fast cores, not having more than six of them. I went all the way down to 720p and still only saw about 60% CPU usage in the most CPU-limited areas, with occasional spikes closer to 70%. My GPU usage was still at 96 to 97% in those CPU-heavy areas. I'm using a 2060 Super and a stock 9900K.
 

Not at 3440x1440. My 6950X hasn't been an issue in any game I've played yet.

And CPU usage doesn't seem to change much between 1280x720 and 3440x1440. But GPU usage sure does.

720p -- pretty much everything at max, with ray tracing on Ultra and DLSS on Auto. (I turned motion blur to low and turned off chromatic aberration and film grain because I don't like them; everything else is at max settings.)
At 720p the CPU is still around 80% and the GPU is just idling.

[Screenshot: in-game overlay at 720p]



Same scene at 3440x1440 -- no changes other than resolution. Same street corner. CPU still at ~80% as a normal high in the city; the 3080 is just working harder now.

[Screenshot: same scene at 3440x1440]



CPU doesn't seem to matter much at 4K, so I expect it wouldn't matter much at 3440x1440 either.
Here's some testing from Tom's Hardware with a 3090 and various CPUs.
[Chart: Tom's Hardware CPU scaling results with an RTX 3090]
 
CPU makes a difference at high refresh rates. The game sort of caps out around 90 FPS, regardless of GPU workload.

Especially in the downtown area, I see drops to 40 FPS even though the GPU is sitting comfortably (at, for example, 60% utilization).
 
I've played a little Cyberpunk with my 6950X and 3090 at 3440x1440 and can confirm there are a few places in the game where I had noticeable slowdowns due to high CPU utilization (99% CPU and 60-80% GPU). So there are definitely areas in the game that benefit from a more modern, beefy CPU, but most of the game is still GPU-bottlenecked at higher resolutions.
 
So I'm not saying you are wrong, but I do want to hear where those places are. I've never seen more than an exceptionally brief spike to 100%. Have you seen sustained 100%?
For comparison, if it matters, my 6950X is clocked to 4.1 GHz on all cores, and I'm using 32 GB of 3200 MHz RAM. I've had the Windows Game Bar overlay up for almost the entire 50 hours I've played Cyberpunk. I'm not through the main story yet, but after 50 hours I think I've seen a lot of the game. I've not seen it hit sustained 100%, and even in the very rare spike peaks, my FPS doesn't seem to drop below the mid 50s with my 3080. Where are you seeing the sustained 100% CPU use, so I can check whether the same occurs for me there?

Edit — I just reread and realized you said 3090. That extra GPU processing headroom is perhaps the delta. My CPU ran in the 60% range as a normal high in the city with medium-high settings at 40 FPS on my 1080 Ti. When I went to the 3080 with max settings, it was 80% as a normal high in the city.
 
I don’t recall the exact locations but they were accessible within the first couple hours of the game or so. I only installed the game to test out the 3090, didn’t go very deep into it yet. Waiting to experience it on my Ryzen 5950X/RTX 3090 custom loop rig; just need the GPU waterblock to come in.
 
Here's what I get with the rig in my sig. EDIT: This is at 3440x1440.

DLSS Quality
Psycho: mid-to-high 40s
Ultra: mid-to-high 50s

DLSS Performance
Psycho: mid-to-high 60s
Ultra: mid-to-high 70s

DLSS Ultra Performance
Psycho: mid-to-high 70s
Ultra: 70-80s (mid-to-high 70s, low 80s)
 
On a CPU with Hyper-Threading it's almost impossible for a real-world game (not a benchmark) to hit 100%, because Windows reports each thread as a core with equal weighting. Bring up Resource Monitor on your second monitor and switch to the CPU tab (not Task Manager).
Look at the individual core graphs: if some of them are hitting 100%, there's at least a single-thread bottleneck, and if half of them (the "real" cores) are spiking close to 100%, you have a broader CPU bottleneck.
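The check described above can be sketched in a few lines. This is illustrative only: the per-core sample below is invented (a 10-core/20-thread CPU with two pegged logical cores), and the 90% cutoff is an arbitrary choice, not anything Windows itself uses.

```python
# Sketch: flag pegged logical cores in one sample of per-core load,
# the same thing you'd read off Resource Monitor's per-core graphs.
# The 90% threshold is an arbitrary illustrative cutoff.

def diagnose(per_core_loads, threshold=90.0):
    """Return (average load, indices of cores at or above threshold)."""
    pegged = [i for i, load in enumerate(per_core_loads) if load >= threshold]
    average = sum(per_core_loads) / len(per_core_loads)
    return average, pegged

# Invented sample for a 10-core/20-thread CPU: two logical cores are
# maxed out, yet the overall average reads only ~31% -- the kind of
# single-thread bottleneck a total-usage figure hides.
sample = [100, 98, 40, 35, 30, 25, 20, 20, 15, 15,
          30, 25, 20, 20, 15, 15, 30, 25, 20, 20]
avg, pegged = diagnose(sample)
print(f"average {avg:.0f}%, pegged cores: {pegged}")
```

If `pegged` is non-empty while the average stays modest, the game is limited by one or two threads rather than by overall CPU throughput, which matches the behavior described above.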
 
Loaded up this mod with great results: https://www.nexusmods.com/cyberpunk2077/mods/107

A bit apprehensive, since the last "AMD fixes" and memory-pool CSV tweaks turned out to not actually do anything. For this mod, I loaded a fresh save in Chinatown and walked up and down the street a few times. This is at 1440p: RT all on, RT lighting medium, DLSS Balanced, SSR low, AO low, chromatic aberration off, color precision medium, volumetric fog medium, everything else high.
Min: 29
Max: 44
Median: 35

With the mod, I loaded right into the same Chinatown save:
Min: 49
Max: 65
Median: 56

Can anyone else corroborate?
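For what it's worth, those before/after figures work out to a 48-69% uplift depending on the metric. A quick sketch of the arithmetic, using only the numbers posted above:

```python
# Percentage uplift implied by the posted min/max/median FPS.
baseline = {"min": 29, "max": 44, "median": 35}
modded = {"min": 49, "max": 65, "median": 56}

uplift = {k: round((modded[k] - baseline[k]) / baseline[k] * 100)
          for k in baseline}
for k in baseline:
    print(f"{k}: {baseline[k]} -> {modded[k]} FPS (+{uplift[k]}%)")
# median uplift: +60%
```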
 
That's the exact same mod as the GitHub one I use. It definitely helps minimum frames on higher-core-count systems.

https://github.com/yamashi/CyberEngineTweaks
 
Now that I have a 3090, my numbers are different. An improvement for sure, and damn, the game looks way more crisp for me, but it's also insane how punishing it is...

Comparison:

On my 2080 Ti - I ran at 4K, everything maxed out, Balanced DLSS (-3 LOD bias as well for better textures) and used the Cinematic_RTX command line - averaged around 35-40 FPS (sometimes dipping into the low 30s in heavy city areas).

On my 3090 - I run at 4K, everything maxed out, Quality DLSS (-3 LOD bias as well for better textures) and still use the Cinematic_RTX command line - averaging around 45-50 FPS now (sometimes dipping into the low 40s in heavy city areas).

All settings being close to the same, the clarity at 4K with Quality DLSS over Balanced DLSS is huge for me (not to mention smoother gameplay now at a better setting). I know for some, Balanced and Performance DLSS are fine with some image sharpening, but for me, the blur and loss of detail in a game that thrives on it were too much to bear.
 
Not seeing a significant difference on my computer. I loaded the same save and drove around on a bike for a bit to compare. FPS were very similar (I did not record min, max, and average). I used Afterburner to display FPS. I looked at the log file afterward and it seemed to load mostly successfully; the only part that failed was the virtual input patch ("[warning] Virtual Input Patch: failed"). I am only running at 1440p though. My settings on the ones you listed: RT lighting Ultra, DLSS Quality, AO High, chromatic aberration off, color precision High, volumetric fog Ultra, everything else at the highest setting (except Psycho). I have a 5600X and 3060 Ti.
 
It likely won't do anything with a 5600X; I think it really only helps CPUs with more than 8 cores/16 threads.
 
Discovered 2 neat little things that not only get rid of the "stutter" in the city but also stop a lot of the texture "pop-in" you get when using DLSS.

Important Note: These are the settings I use to Game at 4K; everything is maxed and I am using Psycho RT (as the Global Illumination/Indirect Diffuse Lighting is awesome in this game).

To Fix Stutter: Even if you have an SSD or NVMe drive, make sure you ENABLE "Slow HDD Mode". This made a HUGE difference for me. I guess this forces the game to store more in the video card's VRAM. Seriously, it removed all stutter/FPS dips for me (and I have an NVMe drive). The setting can be accessed through the game's GUI.

To Fix Texture Pop-In/Details when using DLSS: Add the following lines to your created user.ini file (or General.ini if you did not create a user.ini):

[Rendering/AsyncCompute]
DynamicTexture = False

NOTE: This is in ADDITION to the -3.00 LOD bias for DX fix within Nvidia Inspector.

Hope this helps everyone out like it did me. I am cruising this game now at 4K with Psycho RT using Balanced DLSS, maintaining 50-55 FPS most of the time. The worst parts of the city are about 45 FPS, but way better than before, as there are no major dips and DLSS Balanced looks really nice now. :)
 
My EVGA 3070 XC3 Ultra paired with a 5800X has been unstoppable driving 1440p at ray tracing Ultra so far. I cannot remember a single frame drop since I've been playing. My monitor is only a 60 Hz refresh rate, though, so your mileage may vary.
 
Finally got around to checking out 2077!

On my desktop (FX 8230 & Vega 64, both AIO cooled) I'm getting between 30-40 FPS on High/Ultra quality at 1920x1200 with a bunch of ReShade FX, including ray-traced screen-space GI/AO. As predicted, my CPU is straight up not having a good time, and I had to turn crowd density down to medium to keep CPU utilization around 70-80% and the framerate above 30 in busy areas.

On my laptop (i5 8300H & GTX 1050) I'm getting 30-40 FPS at Low/Medium quality with 60% render scaling (from 1080p native) and ReShade FX, but no RTGI. No issues with crowd density thanks to the stronger CPU.

I haven't tried any INI tweaks yet; maybe I can squeeze a bit more performance out.
 