The Witcher 3 Wild Hunt: Official Thread

I want to congratulate you guys on being a guinea pig for the rest of us who just couldn't muster up the courage to be a guinea pig. :D
 
I downloaded it at literally midnight with a bowl of popcorn ready to go. DL only took 20 mins.

It was an hour before I even got on Roach :(
 
I think my performance issue may have been the game running in windowed fullscreen with vsync; sometimes I forget about the "Display" options. It seemed to perform better in real fullscreen without vsync, which is how I typically run games.
 
Guess I'm glad my PC is sitting in a box in my hotel room, and my OLED is several hundred miles away.

Hopefully by the time I'm set up in my apartment next year they'll have fixed this up. I didn't expect this to go too well once I learned it was a free upgrade. Witcher 3 is the GOAT; this is one remaster I'd have gladly paid $60+ for to be done right and more extensively upgraded.
 
I reinstalled from scratch using the update & DX12.

Ehhhhh... at DX12 1080p FSR2 Quality with dynamic resolution scaling ON (and obv no RTX) - Ultra+ looks better but the fps is lingering in the high 50's around Novigrad.

And whether G-Sync is on and vsync is on or off, there are still some weird stutters.

- DX11 client bumps the fps to approx 75 in the same scenarios but without FSR available and still... random stuttering :(

I don't expect this to be a Cyberpunk-esque situation where it'll take a year, but I'd wager 2-3 months "in the oven" will smooth things out.

.... Shame, cuz 1.32 runs like greased butter w/ mods (I used a 90 fps cap but it def went wayyyyy higher)

It really should've been a separate installation/client on Steam, à la Skyrim/BioShock, etc.
 
Did you guys update your Nvidia drivers?

 
Ehhhhh... at DX12 1080p FSR2 Quality with dynamic resolution scaling ON (and obv no RTX) - Ultra+ looks better but the fps is lingering in the high 50's around Novigrad.
Why are you using FSR when you have an Nvidia card? It supports DLSS.

Game runs perfectly for me. HDR looks great. Max settings with DLSS Quality on a 4080.
 
Let me ask: where in the menu is the HDR toggle? I'm guessing it should be under Display, but I don't have it. I think it might be because I have two monitors: one is an old non-HDR display that is technically monitor 1, while my HDR 4K monitor is monitor 2 and is set as the primary display in Windows. Even though the old one is off, Windows still sees it.

I tried turning the other monitor on, and now I get nothing but crashes when I try to load a save game if RT is turned on. I turned the second monitor off again, and it's still crashing. Not sure what happened. Weirdly, if I set it to Ultra so RT is off, load my save, then go to options in-game and turn on RT or RT Ultra, it runs fine. I might try physically disconnecting the other monitor to see if that fixes the HDR issue, but I just don't feel like doing it at the moment.
 
Updated AMD drivers to 22.11.2 and I get a solid 85 FPS on full Ultra settings without RT. RT doesn't add that much IMO. 5950X/6900XT here, just for reference. I still think it can be optimized more, since Radeon software says I am utilizing 6-8 GB out of 16 GB.
 
Game is a stuttering mess with RT on and everything on ultra. Need to wait for patch or drivers or both. Wtf. Don't people test their shit? I don't think I have a bad system.
 
Updated AMD drivers to 22.11.2 and I get a solid 85 FPS on full Ultra settings without RT. RT doesn't add that much IMO. 5950X/6900XT here, just for reference. I still think it can be optimized more, since Radeon software says I am utilizing 6-8 GB out of 16 GB.
4k?
 
It's playable on my 12700K with a 3080 at 4K resolution when I set DLSS to Balanced and everything else to Ultra. No, it's not 60 FPS, but it looks really, really good on my LG C2 42" TV/monitor. Make sure dynamic resolution is off. I turned off motion blur, blur, and chromatic aberration. Everything else is maxed out. I tried lowering the resolution, but it affects the picture quality too negatively and softens everything up. Dynamic resolution being on looks horrible.
I'm toying with turning Ray Tracing off and on for great performance vs. passable performance. I think you give up more than I want to with Ray Tracing off. There's some special lighting happening, I think, with Ray Tracing. Since the game doesn't need 120Hz motion to be playable, I'm leaning towards keeping the eye candy as high as possible while still having an above-30 FPS experience.

This is the first game I've played that makes me want to buy a 4090.
 
I have a 4090; I get 100 fps in general play in caves, fields, etc. with RT off. I tried enabling RT this morning and it crashed. I do not see this as an upgrade besides the ray tracing, but RT is not worth losing 70% of my FPS.

Every Witcher 3 PC player knows Novigrad is the true benchmark. So with everything maxed, standing in Hierarch Square: 34 FPS with RT on, 55 FPS with RT off, 62 FPS with Hairworks off. I am just turning down graphics left and right to get back to where I was.

Used to be 180 FPS pre-patch.
 
The game crashes if you try to turn ray tracing on when you are already in the game. You need to change the option before loading a save game or starting a new game.
 
Glad it's not just me
https://www.rockpapershotgun.com/th...ed-worse-performance-even-without-ray-tracing

"I found she’d previously averaged 102fps from the RTX 3070 when running the game’s Ultra preset back in 2020. 27fps therefore represents a 74% ray tracing tax, or 67% when using Balanced DLSS to make it just about playable. But it’s even worse than that, because I re-ran the Ultra preset with this next-gen update and averaged only 90fps. "
 
A lot of in-game assets have been upgraded and the draw distance has been substantially increased.
That does not come for free, yet it seems the upgrades surprise a lot of people by requiring more computational power.
 
Yeah, people seem to be forgetting that there is a lot more in this update besides ray tracing.
Right. Just look at the foliage levels. It's a pretty significant difference! And it looked good before, but it's pretty amazing now considering this was a 2015 game. I'm definitely just enjoying exploring and taking in the game world. Last night while playing, a thunderstorm happened in the game and a few villagers ran towards the cover of a small clump of trees. They had a torch initially, but the rain put the torch out. The lightning flashing and the storm in HDR with ray tracing was a sight to behold.

A few moments later as I explored, a huge bear walked by in the distance. It didn’t see me or interact with me. A minute later out of sight it killed a dog or wolf and I saw a carcass appear on the mini map.

Another moment, I was on a road and a Nilfgaardian soldier saw me. He was mounted on horseback and actually took the time to ride around me in a circle before moving on, peering intently and inspecting me the whole time as he circled.

The world interacts and feels alive so much more so than any other game I’ve ever played.

I've just never played another game this good. Not even close. A couple of hours in, starting New Game+, and I'm reminded just how good it is!

The picture looks bad unfortunately, because the LG C2 OLED is in HDR and my phone camera wasn't, but that was the thunderstorm scene I mentioned. In person it looked good enough for me to snap a quick photo.
 

Attachment: C4BA7347-432A-4DB1-ACC0-5F90EF7F58ED.jpeg
Unplugged my second monitor, and it fixed the crash-on-load bug I had with RT on yesterday. Looks like there is no switch to turn HDR on/off in game; if you have an HDR monitor, it is on automatically when Windows has it enabled. So if you are having crash issues with the game and have two monitors, try physically disconnecting your second one from the card. I have two monitors; one is off and not used most of the time, but Windows still sees it, as does the Nvidia Control Panel. I think the game is getting confused somewhere when RT is on and multiple monitors are detected, even if the monitor is off.
 
I suspect my system has a problem related to the same issue.

It was working ok until I turned off RTX and enabled it again.
Now, while loading the autosave (by clicking Continue), it crashes every time. I haven't even gotten out of the introduction yet!
My system feeds audio to an AVR over HDMI for surround sound, so the AVR becomes a second screen that cannot be disabled without losing audio.

I had some faith this release would be pretty polished after what they learned from CP2077.
It's shockingly bad in many ways, and performance is terrible.
It doesn't matter how good it looks when it plays so badly.
Sad.
If I have to disable surround sound to play this game it will be put on the shelf until they fix it.

PS: I'm on the latest driver with a 3090.
 
Some RT on vs off comparisons. Ultra+ settings, 3440x1440, DLSS Performance, Sharpness High.

(DLSS Quality is 45-50fps, DLSS Balanced is 50-60fps, DLSS Performance appears to be 60fps baseline).

The nice thing about DLSS Performance mode in a game like this is that, because there are no electric lights (no straight-line light sources), you don't get the light-beam artifacts from upscaling that you see in games like Cyberpunk.
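For anyone wondering what those DLSS modes actually render internally at 3440x1440, here is a minimal sketch; the per-axis scale factors are the commonly published DLSS 2 ratios (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5) and are an assumption on my part, not something confirmed for this particular build.

```python
# Approximate internal render resolutions for DLSS modes at 3440x1440.
# Scale factors are the commonly cited per-axis DLSS 2 ratios (assumed).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3440, 1440, mode)
    print(f"{mode:>11}: {w}x{h}")
# Quality ~2293x960, Balanced ~1995x835, Performance 1720x720 - which is
# roughly why Performance buys the extra 10-15 fps over Quality here.
```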

RT on:
RT-on_witcher3_2022_12_15_19_18_02_995.jpg

RT off:
RT-off_witcher3_2022_12_15_19_18_15_605.jpg



RT on:
RT-on_witcher3_2022_12_15_19_12_29_655.jpg

RT off:
RT-off_witcher3_2022_12_15_19_12_50_879.jpg



RT on:
RT-on_witcher3_2022_12_15_19_14_13_374.jpg

RT off:
RT-off_witcher3_2022_12_15_19_14_27_540.jpg




And a couple random ones that looked cool lol:

witcher3_2022_12_15_19_15_44_705.jpg


witcher3_2022_12_15_19_08_02_255.jpg
 
I completely agree that some are forgetting it. I am not. I am still disappointed to see how inefficiently it improves the graphics.
How is it inefficient? What metric are you using to judge that? This is one of the most demanding ray tracing titles yet. Obviously, hardware that can't deal with this much ray tracing well (pretty much everything outside of a 4080/4090) is going to struggle.

To me, all of this complaining about ray tracing performance issues would be like complaining that your Voodoo 2 couldn't run Quake 3 decently back in the early GPU days, or that your Ti 4200 couldn't run Doom 3/HL2 well. We're at the point with ray tracing where every new generation of GPUs, and the latest games that go with them, will make the prior generation seem like ancient technology. This isn't like the period we had for the past number of years where we just got minor raster performance gains.
 
I just installed it and loaded a previous save not too far into the game.

7700X, 6800 XT, sig system.

Global High preset
RTX off
AA set to FSR 2 Quality
Dynamic res scaling off
SSAA on, SSR High
Motion blur, blur, chromatic aberration, camera lens effects all OFF
HairWorks on Geralt, High preset, at 4x AA
Settings below that all High
Borderless window, vsync on, 1440p, 120 fps cap.

In DX12 I'm getting the reported stutters, but I'm hitting the 120 fps cap a ton. I haven't played in ages and it looks pretty great to me. Do we need a new start to see all the goodies? Seems like my performance is better than most?
 
How is it inefficient? What metric are you using to judge that? This is one of the most demanding ray tracing titles yet. Obviously, hardware that can't deal with this much ray tracing well (pretty much everything outside of a 4080/4090) is going to struggle.
My man, I'm on a 4090 running a 3 GHz core clock and getting 35 fps in Novigrad at 3440x1440 with everything turned up. Turning off RT gets me to 60 fps. I'm willing to accept I might be trading off some performance with my lowly 5900X, but I'm not even at full 4K, I'm running the latest drivers, and my system is optimized/stable.

As to inefficiency, that's obvious: the game looks somewhat better than before and costs 75% of my frames. Not efficient.

Look at the photos above and let me know if your impression is also that RT just looks darker.
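Another way to frame the cost, rather than percentages, is frame time: converting the fps figures quoted in this thread into milliseconds per frame shows what RT actually adds. A minimal sketch, using this thread's anecdotal numbers (35 fps with RT, 60 fps without, ~180 fps pre-patch), not controlled benchmarks:

```python
# Convert fps figures quoted in this thread into per-frame cost in milliseconds.
# These fps numbers are forum anecdotes, not controlled benchmarks.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering one frame at a given average fps."""
    return 1000.0 / fps

scenarios = {
    "Next-gen, RT on (reported)": 35,
    "Next-gen, RT off (reported)": 60,
    "Pre-patch 1.32 (reported)": 180,
}

for name, fps in scenarios.items():
    print(f"{name}: {fps} fps = {frame_time_ms(fps):.1f} ms/frame")

# RT adds roughly 28.6 - 16.7 = ~12 ms per frame, on top of a base pass that
# is already ~3x heavier than pre-patch (16.7 ms vs ~5.6 ms).
print(f"Approx. RT cost: {frame_time_ms(35) - frame_time_ms(60):.1f} ms/frame")
```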
 
Currently traipsing through Ghost of Tsushima, but thinking I might have to take a quick break to check out the Witcher 3 update. I'm hearing great things about the new combat changes to Witcher 3 that make it feel more modern.
 
4K, Ultra settings (no motion blur, blur, or chromatic aberration), full RT, 3090, DLSS Balanced -> 50 fps average. Game is smooth, no stutters, played over 3 hours. With Quality DLSS, fps was around 40. I keep it on Balanced, which still looks really good.

Ahmm, very impressed with the RT quality in this game; a very noticeable difference on the C2 OLED 42". I would prefer higher fps, but G-Sync is working and doubling the Hz for the monitor pretty much all the time. I tried to use a 3440x1440 custom resolution in the control panel (centered, not scaled) -> it just gave black screens. The C2 has a 21:9 aspect ratio option, but it makes it 2560x1080 instead of 3440x1440. My frame rate is actually really consistent; it does not vary too much from scene to scene or during a fight. This is on a 5800X3D.

What do you know, I may actually play through this game now.
 
My man, I'm on a 4090 running a 3 GHz core clock and getting 35 fps in Novigrad at 3440x1440 with everything turned up. Turning off RT gets me to 60 fps. I'm willing to accept I might be trading off some performance with my lowly 5900X, but I'm not even at full 4K, I'm running the latest drivers, and my system is optimized/stable.
I'm on a stock 4080 and a 5800X3D. At 1440p with max settings and DLSS Quality (DLSS only, not frame generation) I'm seeing around 80 FPS. Perfectly smooth and playable at that FPS with G-Sync.

What's your GPU usage? At that resolution, even without DLSS, I'd think a 4090 would easily handle this game on maxed-out settings. The 5900X might be holding you back, substantially, at that resolution.

I get way over that 60 FPS on a 4080 if I disable ray tracing.
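For what it's worth, one way to answer the "what's your GPU usage?" question is to log CPU and GPU utilization while playing and see which one tops out first. A minimal sketch, assuming the third-party psutil and pynvml packages are installed (pip install psutil nvidia-ml-py) and an Nvidia GPU; it's a rough monitoring loop, not a profiler:

```python
# Rough CPU-vs-GPU bottleneck check: print utilization once per second.
# Requires: pip install psutil nvidia-ml-py   (Nvidia GPUs only)
import psutil
from pynvml import nvmlInit, nvmlDeviceGetHandleByIndex, nvmlDeviceGetUtilizationRates

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        # cpu_percent with interval=1 blocks for one second and samples usage.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        gpu_util = nvmlDeviceGetUtilizationRates(gpu).gpu
        # GPU well below ~95% while one CPU core is pegged usually means CPU-bound.
        print(f"GPU {gpu_util:3d}% | busiest CPU core {max(per_core):5.1f}%")
except KeyboardInterrupt:
    pass
```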
 
I'm on a stock 4080 and a 5800X3D. At 1440p with max settings and DLSS Quality (DLSS only, not frame generation) I'm seeing around 80 FPS. Perfectly smooth and playable at that FPS with G-Sync.
Running RT as well?
 