Cyberpunk is Available - Let's share how your GPU/CPU is performing

A bit off topic - but has anyone played the Xbox One version on Xbox Series X? I know the "next-gen" update isn't due until next year - but is it decent? I will likely play on Xbox in my office versus getting to use my gaming PC (damn pandemic has kids all over doing school) so I want to know if it's a viable option or if it is just trash.
Anecdotally, I know a lot of next-gen console people are up in arms because the game is crashing their consoles. Not sure about performance, but they're calling for blood right now.
 
A bit off topic - but has anyone played the Xbox One version on Xbox Series X? I know the "next-gen" update isn't due until next year - but is it decent? I will likely play on Xbox in my office versus getting to use my gaming PC (damn pandemic has kids all over doing school) so I want to know if it's a viable option or if it is just trash.
One of the reviews I read (forgot which) played the PS4 version on PS5 and the game apparently ran smoothly (much more so than on PS4). I'd guess it's the same/better on XBX since you can dial in some settings.
 
So I tried with DLSS ultra performance, disabled RT shadows, and set other settings to medium, and I'm getting around 80 - 90 fps.

It still looks great and the performance boost is amazing. Played about an hour like this, very happy with it.
 
Man, at 4K, I'm struggling with how Ultra Performance looks... Performance I can get to look acceptable with Nvidia Freestyle sharpening. But for me, it seems like Ultra Performance loses too much detail to enjoy and adds blur I can't fix with Freestyle.
 
Recommendations for a card for 1440@144? I don't need full bore, maxed everything, pegged. Something that looks decent and runs 60+ FPS is fine.
 
I play with everything maxed and DLSS on Quality at 1440p. Seems to be between 50-60 FPS most of the time. I don't notice much slowdown. I tried turning down Cascaded Shadows to Medium and there may have been a slight difference, but nothing massive. Maybe I'll play with the volumetric fog settings.
 
I play with everything maxed and DLSS on Quality at 1440p. Seems to be between 50-60 FPS most of the time. I don't notice much slowdown. I tried turning down Cascaded Shadows to Medium and there may have been a slight difference, but nothing massive. Maybe I'll play with the volumetric fog settings.
3080, right? That's looking to be my target card then. +1
 
I would suggest waiting on the 3080 if you can. I'm seeing VRAM allocation on my 2080 Ti nearing 10GB with the RT Medium preset @ 4K. Not saying 10GB of VRAM won't be enough, but that's getting awfully close. It's swayed me off getting a 3080; I'm waiting for a 3080 Ti or a 3090 price drop instead.
 
I've noticed that Cyberpunk is REALLY good at RAM management. The game itself rarely goes above 3GB, and I haven't seen it use more than 8GB of VRAM on my 1080 Ti. This is at 1440p.
 
I've noticed that Cyberpunk is REALLY good at RAM management. The game itself rarely goes above 3GB, and I haven't seen it use more than 8GB of VRAM on my 1080 Ti. This is at 1440p.
4K and RT features adds more, fast.
 
I've noticed that Cyberpunk is REALLY good at RAM management. The game itself rarely goes above 3GB, and I haven't seen it use more than 8GB of VRAM on my 1080 Ti. This is at 1440p.
Same.
My machine has 16GB, and 4GB is unavailable from the get-go (basic memory use plus a 1.5GB RAM cache), yet it barely exceeds 8GB used with the game running, with a 3090.
 
Anecdotally, I know a lot of next-gen console people are up in arms because the game is crashing their consoles. Not sure about performance, but they're calling for blood right now.
Tell em to find the most powerful hosed vacuum and put it on mega suck where the console vents.
Put a matchstick in the fan to prevent explosion lol.
 
GSync and DLSS are saving my ass here trying to run it at 3440x1440 on a 2080 Super.

Regularly below 60 in some outdoor areas but I don't really notice enough to care. Enjoying the game too much.
I'm on 5120x1440 on a 3090 with psycho RT, they are much needed even then!
 
Specs in sig.

457.30 drivers, opting out of the latest ones until the power usage at idle is fixed

Chromatic aberration/film grain/motion blur off
High/Ultra everything
RT Ultra
DLSS Quality
Getting 70s indoor/outdoor
 
Specs in sig.

457.30 drivers, opting out of the latest ones until the power usage at idle is fixed

Chromatic aberration/film grain/motion blur off
High/Ultra everything
RT Ultra
DLSS Quality
Getting 70s indoor/outdoor
Does DLSS even work with the older drivers? Didn't think it did.
 
Can confirm DLSS works with older drivers. In the training area with Quality I was getting 100 fps; with DLSS off, I was getting 60 fps.
 
If anyone has suggestions for 1080p using the rig in my signature, "That'd be greattttt"..... (y)

I was thinking High Preset with both Cascade Shadow settings set to Medium? (and obviously motion blur, film grain, chromatic - turned off)

- Got a buddy coming over so I don't wanna spend an hour tinkering if it can be avoided etc.
 
Honestly the pure 1440p visuals are a little crisper owing to the higher resolution, but a Zen 3 and 5700 XT are really only going to see FPS in the 40s in the heavy scenes. With Nvidia offering something like 3x the performance with all the juice they've pumped into their extras, it's a hard truth to swallow for Cyberpunk right now.
Oh it's definitely built from, for, and of Nvidia












AMD is way down here
 
I can't take cred. I crossposted from another sub on [H]; here is the source thread:

Oh it's definitely built from, for, and of Nvidia












AMD is way down here

I bet if you turn off all the bullshit, like chromatic aberration, film grain, depth of field, vignette, and whatever other "cinematic" nonsense they bloated the game with, people would get better frames. Just give me good lighting effects, decent shadows, and crisp textures. I don't need any of that film haze.
 
I bet if you turn off all the bullshit, like chromatic aberration, film grain, depth of field, vignette, and whatever other "cinematic" nonsense they bloated the game with, people would get better frames. Just give me good lighting effects, decent shadows, and crisp textures. I don't need any of that film haze.
That reply of mine you quoted was all f'd up. It quoted something from another thread that I never actually posted. The part about taking cred and all, just ignore it; it has nothing to do with this thread.
 
That reply of mine you quoted was all f'd up. It quoted something from another thread that I never actually posted. The part about taking cred and all, just ignore it; it has nothing to do with this thread.
No worries, as long as you didn't see any black cats.
 
So... Umm... I haven't played it yet, but a buddy from work called me this morning telling me he tried it and now his computer won't boot. Asked if I had any old hardware he could buy from me.

So... I have some older stuff from the Haswell era and asked him what he was running... He had a Phenom II 1100T with 8GB of memory and a 7970 (I gave him the card 4 months ago when his other card went bad).

I have no idea how the hell he planned on playing this on that.
 
Did a quick run on the latest drivers to see the performance difference. The training area went up 5-6 fps, and in-game outdoor areas saw the 80s, so overall about a 5 fps boost. The missing settings in the control panel bug me a lot, though. Power management is either Normal or Max Performance; Adaptive seemed to work great before.
 
Soooooo... I can get 60 FPS at 4K with EVERYTHING maxed, including Psycho ray tracing for lighting (specs in sig)... but I have to use "Ultra Performance" DLSS. No clue what this is rendered at, but if I add 50% sharpening with Nvidia Freestyle, it's actually not half bad... probably better to be able to use ALL features fully maxed and deal with a few details missing here and there? It's the lighting that makes it feel real, not the signpost halfway down the road....

However, I'm somewhat torn, as the lighting with Psycho looks pretty sweet once you start to notice it. That said, Ultra Performance DLSS does create artifacts in the city quite a bit, mainly with the crosswalks and lights in the distance. They tend to shimmer pretty badly, probably because the AI can't recreate the small details as well with so little information. In tight spots it looks great, but looking far into the distance you start to notice blur (even with sharpening) and weird glitches. Still not 100% sure if I want to play with RT lighting off and enjoy a higher DLSS setting (like Performance at 4K), or play with RT lighting on Psycho and deal with a little bit of blur and glitches in the distance...

Damn, I wish I could get my hands on a damn 3090 already!!!
 
I've done some testing with my 2080 Ti, and this game is one of the few where upping the power limit helps a lot. It seems the RT functions reduce the power available to the raster side, causing a clock decrease. Upping the power limit brings the clocks back to the raster-only range I normally see, and at around a 112% power limit I'm hitting the stock voltage limit again, as I should be. Each % increase in power limit correlates almost one-to-one with a % increase in FPS.

In any other normal game, I found the power limit wasn't helping much.
 
A bit off topic - but has anyone played the Xbox One version on Xbox Series X? I know the "next-gen" update isn't due until next year - but is it decent? I will likely play on Xbox in my office versus getting to use my gaming PC (damn pandemic has kids all over doing school) so I want to know if it's a viable option or if it is just trash.
By all accounts the game is pretty awful on consoles right now. Performance is garbage and glitches galore. I’d hold off until they have had time to get some patches out so you have a good experience.
 
So, apparently they specifically excluded AMD code paths? Check this out:

https://www.reddit.com/r/Amd/comments/kbp0np/cyberpunk_2077_seems_to_ignore_smt_and_mostly/

IMPORTANT: This is not Intel's fault, and the game does not use ICC as its compiler; more below.

Open the EXE with HxD (a hex editor).

Look for

75 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08

and change it to

EB 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08

Proof and sources:

https://github.com/jimenezrick/patch-AuthenticAMD

If you apply that change, it nets considerable performance gains on Ryzen CPUs, according to those who have tried it so far.

Apologies if this has been posted here somewhere already; I haven't seen it if so.
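For anyone who'd rather not hand-edit the binary, note the patch only changes the first byte (0x75 is a short conditional jump, 0xEB a short unconditional jump, so the edit makes the branch unconditional). A hedged sketch of scripting the same edit in Python; back up your save/EXE regardless, and the path handling here is just an illustration:

```python
# Sketch: apply the one-byte patch described above without a hex editor.
# The byte patterns come from the post; the patching logic is our own.
from pathlib import Path

OLD = bytes.fromhex("75 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08")
NEW = bytes.fromhex("EB 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08")

def patch_exe(path: str) -> bool:
    """Replace the first occurrence of OLD with NEW.

    Returns True if the file was patched, False if the pattern was not
    found (already patched, or a different game build).
    """
    exe = Path(path)
    data = exe.read_bytes()
    idx = data.find(OLD)
    if idx == -1:
        return False
    # Keep an untouched copy next to the original before writing.
    Path(str(exe) + ".bak").write_bytes(data)
    exe.write_bytes(data[:idx] + NEW + data[idx + len(NEW):])
    return True
```

Running it a second time returns False, since the patched bytes no longer match the search pattern.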
 
I mean, yeah, that sucks, but not many people on PC will get any benefit from this, as they're gaming at 1440p+ resolutions where this change is maybe worth 1 frame.
 
I mean, yeah, that sucks, but not many people on PC will get any benefit from this, as they're gaming at 1440p+ resolutions where this change is maybe worth 1 frame.
From what I've been reading from users who applied this "fix", the biggest impact isn't in average frame rates; rather, the minimum frame rate significantly increases. That's also a very important metric, arguably more so than average frame rate.
 
1920x1200 on a Sony GDM-FW900 with everything maxed out, RT on Ultra because I did a comparison and don't see any difference with Psycho except in fps, DLSS on Performance. Runs like butter on an RTX 2070 + OC'd i9 9900K.
I cannot imagine how disappointed AMD users (or the "dude, keep the 1080 Ti, RTX is a scam" crowd...) must feel. The FidelityFX CAS setting at 50% resolution scale (which amounts to the same rendering resolution as DLSS Performance) gives noticeably worse image quality for a similar performance hit (it's like 70 fps with DLSS and 75 with CAS), while with DLSS there's pretty much no point in not enabling it. Basically free performance. Well... not quite, but it looks good and it acts like AA.
 
Can you take some photos? I'd be interested to see what it looks like on a CRT.

Also, I tested on my backup rig (GTX 1060) and it was almost impossible to reach 60 fps. Even with lowest settings and 50% render scale, it was still around 50 fps. Plus, it looked like shit.

Don't want to turn this into a DLSS vs AMD thread, but I will say the DLSS implementation on Cyberpunk is second to none. I'm using ultra perf mode and the performance boost is insane and it still looks acceptable. Can't say the same for FidelityFX.
 
From what I've been reading from users who applied this "fix", the biggest impact isn't in average frame rates; rather, the minimum frame rate significantly increases. That's also a very important metric, arguably more so than average frame rate.
Yeah, eliminating stutter in the 60 fps 1% lows contributes more to the gaming experience than chasing a 100+ FPS average in a single-player RPG.
 
I only have FidelityFX CAS at 90% scaling, because dynamic and anything lower just looked awful.

Which is weird, because it works extremely well in Monster Hunter: World and Horizon Zero Dawn.
 
Can you take some photos? I'd be interested to see what it looks like on a CRT.

Also, I tested on my backup rig (GTX 1060) and it was almost impossible to reach 60 fps. Even with lowest settings and 50% render scale, it was still around 50 fps. Plus, it looked like shit.

Don't want to turn this into a DLSS vs AMD thread, but I will say the DLSS implementation on Cyberpunk is second to none. I'm using ultra perf mode and the performance boost is insane and it still looks acceptable. Can't say the same for FidelityFX.
Man, why does Ultra Performance look like shit to me??? I'm on 4K; it would run nicely, but I can't get over the yuck of that specific setting.

I can't be the only one who notices the loss of detail and slight blur?

Performance is acceptable, but Balanced seems perfect to me, and it's somewhat playable with RT lighting off.

From what I gathered, at 4K:
Quality = 1440p
Balanced = something between 2K and 1080p
Performance = 1080p
Ultra Performance = 720p
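Those numbers line up with the commonly cited per-axis DLSS scale factors (roughly 2/3 for Quality, 0.58 for Balanced, 1/2 for Performance, 1/3 for Ultra Performance; these factors are an assumption, and the exact values can vary by title). A quick sketch to check them:

```python
# Approximate internal render resolution for each DLSS mode, assuming the
# commonly cited per-axis scale factors (not confirmed for this game).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Scale each output axis by the mode's factor and round to pixels."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)
```

At 3840x2160 this gives 2560x1440 for Quality, 1920x1080 for Performance, and 1280x720 for Ultra Performance, matching the list above; Balanced lands around 2227x1253, i.e. between 1440p and 1080p.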
 