> Update 1.5: is it worth checking the FPS again after that, or will it be the same?

Don't worry about the FPS number. Just notice if the game feels slower or choppier and adjust the game settings as necessary.
> Don't worry about the FPS number. Just notice if the game feels slower or choppier and adjust the game settings as necessary.

But there is now a benchmark built in, so you can check your settings a little. I'm averaging 45 fps with a 1080 Ti at 3440x1440. I might turn a couple of things down later to smooth it out, but I didn't have time to play during lunch. It looks like there are more options now, but I could very well be wrong there; it's been a bit.
> Deleting those files didn't change anything. I'm not sure what's going on. For all I know, they broke the game on Alder Lake-S. I have absolutely no idea what's going on with it at this point.

Seeing this, I gave it a try; it runs fine on my Alder Lake so far, though I'm still on Win10.
> I can't believe how good this patch is; I was a doubter until now. I'll probably play CP2077 again after ER.

Can you clarify what makes the game so much better with this patch?
> Unfortunately, the timing on this update is horrendous, being between DL2 and Elden Ring.

Fortunately, I have absolutely no interest in either of those games.
> I heard this new update has support for AMD FSR; has anybody tried it?

Nope. I've got an AMD card on the test bench rig, but I haven't had a chance to give it a go.
> I heard this new update has support for AMD FSR; has anybody tried it?

Tried it, and while it definitely increased my FPS on the old 1080, the image suffered even at Ultra Quality, so I turned it back off.
> Cyberpunk 2077 patch 1.5 gets Ray Traced Local Light Shadows on PC Thanks to Partnership with Nvidia
> NVIDIA revealed to have partnered with CD Projekt RED post-launch to improve the image quality of ray tracing in the game...the result is the addition of ray traced local light shadows, whereas previously, Cyberpunk 2077 only resolved ray traced sun shadows...
> https://twitter.com/Dachsjaeger/status/1493657623836733441

Those screenshots are impressive; I can't wait to see it in person. Local directional shadows were a weak point graphically in older versions, but if those results are typical, this is a huge improvement.
> Nvidia did this on purpose; now I want a new GPU.

The shadow update is vendor-agnostic: you'll get the local shadows on any AMD card that supports ray tracing. They were added on both gen 9 consoles; I guess they just forgot to add them to the PC version.
> The shadow update is vendor-agnostic: you'll get the local shadows on any AMD card that supports ray tracing. They were added on both gen 9 consoles; I guess they just forgot to add them to the PC version.

I don't want to go too far off topic, but I think you misinterpreted my post. I'm aware that AMD cards support ray tracing; I suppose my post was written to be moderately humorous but missed the mark. Anyway, Nvidia has 74% of the PC gaming GPU market, so realistically most of the GPU sales over the next cycle will continue to be in Nvidia's favor regardless of what AMD releases, even though the margins of market share have tightened over the last year. So Nvidia increasing the fidelity of the RT, while simultaneously increasing the system requirements for those features, will sell more GPUs (to consumers who prefer higher fidelity).
> I don't want to go too far off topic, but I think you misinterpreted my post. …

Could you double-check that DLSS version when you have a chance? The DLL that came with my GOG 1.5 install is 2.3.4. It would be odd if they're shipping different revisions, but the update to DLSS 2.3.x is welcome regardless.
Back on topic: I think the new enhanced graphics features are awesome. In my limited testing, the new DLSS 2.3.7 implementation is a pretty significant improvement as well; "Performance" mode looks a bit better than I remembered, and the sharpening slider is most definitely welcome.
> Could you double-check that DLSS version when you have a chance? The DLL that came with my GOG 1.5 install is 2.3.4.

You are correct; sorry, I was at work when I typed that and just got home. The Steam version came with 2.3.4.
> Just tried this new update with AMD FSR enabled on my PC. So far I'm impressed with the performance and quality, getting at least 80 FPS average at max settings with FSR set to "Quality" at 4K, without too much noticeable loss of quality. Not using ray tracing though; I'm not bothered, because the game doesn't even look much better with it enabled, and it comes with a huge performance loss.

Yeah, ray tracing basically needs a 3080 or 3090 with DLSS to run OK at 1440p and above at meaningful settings. It does make a huge difference in the game, though, especially at night. It will probably take one or two more generations of GPUs before the high end can run ray tracing smoothly at 1440p and above without DLSS or FSR. The few games that have gone all in on ray tracing do look impressive, but they are very few. It will probably take 2-3 generations of GPUs, and maybe a mid-gen update of the consoles, for ray tracing to become mainstream. At the moment it can only be run on a few of the current-gen GPUs, which causes most developers to only do minor features for marketing (e.g. ray tracing in Far Cry 6 made almost no difference).
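For context on the "FSR Quality at 4K" numbers above: each FSR 1.0 quality mode corresponds to a fixed per-axis render scale (per AMD's published ratios), so "Quality" at 4K is really rendering internally at 1440p before upscaling. A quick sketch of the arithmetic:

```python
# Internal (pre-upscale) render resolution implied by each FSR 1.0 quality mode.
# Scale factors are AMD's published per-axis ratios for FSR 1.0:
# Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x.

FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(width, height, mode):
    """Return the internal render resolution for a given output size and FSR mode."""
    scale = FSR_SCALE[mode]
    return round(width / scale), round(height / scale)

print(fsr_render_resolution(3840, 2160, "Quality"))      # 4K output, Quality mode
print(fsr_render_resolution(3840, 2160, "Performance"))  # 4K output, Performance mode
```

So an "80 fps at 4K with FSR Quality" result reflects a 2560x1440 internal render, which is why the performance jump over native 4K is so large.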
> Yeah, ray tracing basically needs a 3080 or 3090 with DLSS to run OK at 1440p and above at meaningful settings. …

I played the game when it first came out on a 2080 Ti at 4K with DLSS and a lot of the ray tracing enabled. It ran just fine.
It is nice that they added the DLSS sharpening slider, though, as the game looked a bit too soft with DLSS Quality in the previous version.
> I played the game when it first came out on a 2080 Ti at 4K with DLSS and a lot of the ray tracing enabled. It ran just fine.

I did too, but I recall having to keep ray tracing at medium and turning down a couple of other things from very high to high, etc. That, or play with DLSS Balanced, which I can't stand; I can only tolerate DLSS Quality.
"Meaningful" is subjective. Global illumination makes the most impact on this game as it should. Reflections are usually one of the first things I turn down if I need to look for performance.I did too, but I recall having to keep raytracing to medium and turning down a couple of other things from very high to high, etc. That or play with DLSS balanced which I can't stand. I can only tolerate DLSS quality.
It wasn't until my 3080 Ti that I could play it maxed out without worry.
Although, to be honest, with this game I really only found the reflections truly worthwhile at the time. Same goes for Control.
> I played the game when it first came out on a 2080 Ti at 4K with DLSS and a lot of the ray tracing enabled. It ran just fine.

My 3080 needed DLSS Quality to run OK at 1440p. Without DLSS it would run at around 45 fps after tweaking settings so they didn't detract too much from image quality. Cyberpunk needs 55-60 fps or more to not feel laggy. Of course, some are happy to play at 30-40 fps, but I don't consider that running OK.
> My 3080 needed DLSS Quality to run OK at 1440p. Without DLSS it would run at around 45 fps after tweaking settings so they didn't detract too much from image quality. Cyberpunk needs 55-60 fps or more to not feel laggy.

Generally, high 30s minimum is fine if you've got hardware G-Sync. You can still tell the FPS is low, but it keeps things smooth. Without G-Sync I can't tolerate much below 60 fps; with it, you can go a lot lower.
> Generally, high 30s minimum is fine if you've got hardware G-Sync. You can still tell the FPS is low, but it keeps things smooth. Without G-Sync I can't tolerate much below 60 fps; with it, you can go a lot lower.

Laggy as in there is a large delay between mouse movement and in-game movement. It doesn't really matter whether G-Sync is on or off, since the render time becomes twice as large at 30 fps compared to 60 fps; G-Sync mainly affects tearing unless you are on a low-refresh-rate monitor. My display runs at 240 Hz with around 7 ms of input lag, so it becomes especially noticeable if a game has high input lag. If I were on a slow 60 Hz or 120 Hz display with high input lag, it would probably be much less noticeable. People who mostly play SP games at 40-60 fps probably won't notice it that much either, because they are used to slow input response.
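The "twice as large" point is just the reciprocal relationship between frame rate and frame time, which sets a floor on how quickly input can show up on screen regardless of G-Sync. A minimal illustration:

```python
# Frame time is the reciprocal of frame rate, so halving the frame rate
# doubles how long each frame takes to render (and to reflect your input).

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# e.g. 30 fps -> 33.3 ms per frame, while 60 fps -> 16.7 ms per frame
```

That ~33 ms per frame at 30 fps (versus ~4 ms at 240 Hz) is why high input lag is far more noticeable on a fast display.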