The Witcher 3 Wild Hunt: Official Thread

I never thought that game would ever be used as an example of smart optimization for performance:

Maybe the patch changed things, but at launch at least:
https://www.dsogaming.com/pc-performance-analyses/the-callisto-protocol-pc-performance-analysis/
Add to this the fact that the game cannot take full advantage of modern-day CPUs, and you got yourself an unoptimized mess with RT enabled. Therefore, and at this point, we suggest avoiding Ray Tracing like the plague, even if you own high-end CPUs and GPUs.

Going from Ultra to Max RT according to this:
https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/4.html

It goes from 73 fps to 41 (a 44% drop) and from 62 to 38 (a 38% drop) in the two scenes they tried. Is that significantly different from Metro, Control, or Cyberpunk RT?
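Just to sanity-check those percentages against the TechPowerUp numbers (a minimal sketch of the arithmetic, nothing more):

```python
# Percent drop going from Ultra to the Max RT preset, per the two TechPowerUp scenes above.
scenes = {"scene 1": (73, 41), "scene 2": (62, 38)}  # (Ultra fps, Max RT fps)

for name, (ultra, max_rt) in scenes.items():
    drop = (ultra - max_rt) / ultra * 100
    print(f"{name}: {ultra} -> {max_rt} fps ({drop:.1f}% drop)")
# scene 1: 73 -> 41 fps (43.8% drop)
# scene 2: 62 -> 38 fps (38.7% drop)
```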
 
Which begs the question: who is this PC update aimed at? Someone with a 3090 or above would more likely be pushing 4K; below a 3090, more like 1440p or even 1080p. What GPU is needed to run this game well with RT on at each of 1080p, 1440p, and 4K using all the upsampling tricks? Someone buying a 4090 for a 240 Hz or 360 Hz monitor for gameplay at 1440p or even 1080p may not like sub-100 fps gameplay, is another thought. What it does do well, intentionally or not, is show DLSS 3's benefits on a 7-year-old and now very CPU-restrictive title. Turning RT off in this title does not even come close to the fps that the older version delivers, which is another oddity I find.

BUT does it run the same at the same visual quality?

Ultra in Quake 2 is not the same as Ultra in Cyberpunk 2077. Just because the old 'Ultra' ran at 200 FPS does not mean the new 'Ultra' is supposed to perform exactly the same while looking substantially better and more detailed. We do realize that there are options other than 'ultra', right? Like... maybe try 'high'? Or even *dramatic music* 'medium'!?

does the new 'medium' look better than the old 'ultra'?

following this, do you then expect it to run better than the old 'ultra'?
 
Which begs the question: who is this PC update aimed at?
I guess that's because we are in the Witcher thread, but I was responding to a message talking about Callisto Protocol, saying:
Callisto to me is just smart RT, where it enhances the IQ while using other techniques that work just as well, or almost as well, without the performance hit
While all the reviews I saw seem to describe a terrible implementation performance-wise, to the point of recommending disabling it altogether even with high-end GPUs.
 
I really think that some of the initial absolutely awful performance is due to the game building its shader cache. After you play for about 30 minutes and work yourself in and out of buildings, things tend to go way, way smoother. You just gotta stop fucking with settings long enough for it to build up.

Removing that DX11onDX12.dll also seemed to improve performance, but there is a really good chance this was 100% coincidence because of the previously mentioned shader cache.

I managed to get RTX on and almost every setting on Ultra with my 3080 and maintain over 60fps at 1440p, which puts it just a little bit behind Cyberpunk performance. DLSS on "quality" because I hate how it looks at anything lower.

Also, RT on looks amazing in this game, at least in Toussaint. I haven't been to the islands or back to the mainland yet.
 
I guess that's because we are in the Witcher thread, but I was responding to a message talking about Callisto Protocol, saying:

While all the reviews I saw seem to describe a terrible implementation performance-wise, to the point of recommending disabling it altogether even with high-end GPUs.
Smart RT was in reference to the consoles, which use it. Actually, only the PS5 version worked well; Xbox did not have reflections, which I think has since been fixed. PC was a mess from the start.
 
I'm on a stock 4080 and a 5800X3D. At 1440p with max settings and DLSS Quality (DLSS only, not frame generation) I'm seeing around 80 FPS. Perfectly smooth and playable at that FPS with G-Sync.

What's your GPU usage? At that resolution, even without DLSS, I'd think a 4090 would easily handle this game on maxed-out settings. The 5900X might be holding you back, substantially, at that resolution.

I get way over 60 FPS on a 4080 if I disable ray tracing.
The 5800X3D and 5900X aren't too far from each other in gaming, 1-2% at this resolution. If you are thinking 3440 x 1440 = 2560 x 1440, I can see the confusion regarding the performance. It's between 1440p and 4K.
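To put numbers on that, a quick pixel-count comparison (minimal sketch):

```python
# Pixel counts: 3440 x 1440 ultrawide sits roughly between 16:9 1440p and 4K.
resolutions = {
    "2560 x 1440 (1440p)": 2560 * 1440,
    "3440 x 1440 (ultrawide)": 3440 * 1440,
    "3840 x 2160 (4K)": 3840 * 2160,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP")
# 2560 x 1440 (1440p): 3.7 MP
# 3440 x 1440 (ultrawide): 5.0 MP
# 3840 x 2160 (4K): 8.3 MP
```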

Glad you asked about the GPU utilization. With everything/RT on and all settings maxed, it's like 35-36 fps in Novigrad and 48% utilization. GPU util is generally super low in some scenes (50%) and in others it's 98%. Notably, frames are way higher in those scenes where it's 98%.

Turns out low GPU utilization is a common problem and explains the performance issues. I am running the DX11 version with everything at Ultra+ and getting 110-200 FPS in most scenes, even busy Novigrad, because my util is consistently more like 98%+.
 
Are you guys not having any hitching in this game? It is unplayable.
Honestly, no, outside of what seemed like shader-cache-style one-off hitching when I first got into certain areas. The game runs surprisingly well on my system. There is a definite FPS difference between wilderness areas and the cities, but even so, it runs hitch-free after a first pass through the area. (Read: hitch-free; there are still micro-stutters, but hardware G-Sync smooths that out some.)
 
https://nitter.1d4.us/witchergame/status/1604841010307411968
 

So far, so good. Performance increased by about 10 fps at the same settings, and there seems to be no more stuttering.

RTSS OSD also shows up now.

I tested DLSS 3 Frame Generation and I'm pleasantly surprised that it works really well in this game after my negative experience with it in Portal with RTX. Getting 100+ fps with max settings. Frame pacing seems fine. I didn't notice artifacts or increased latency, and it works properly with G-Sync and V-Sync on in NVCP.
 
Are you playing maxed out with all RT options?
Yes. The only options I disable are motion blur/blur. I keep HairWorks disabled as well because I don't like how stringy it looks (it looks even worse with ray tracing, because in some scenes the stringy HairWorks hair is lit differently than the rest of his head, and it looks extra shitty), but with it on I didn't notice any performance impact like years back.

I don't bother with frame generation as my FPS is generally above 60 with DLSS quality, and frame generation adds too much input latency for my taste. Might be useful if you're sitting at like 30FPS though.
 
For those installing the GOG patch with the standalone version: if you renamed the .dll files mentioned in an earlier post (dx12on11 and dxil), make sure you rename them back or put them back before you patch, or the patch will fail. I had to reinstall the game before patching because I did that. Apparently the standalone patcher patches files as it goes, and when it fails partway through, it doesn't revert any changes. So when I renamed the .dll files back and re-ran the patch, it failed again with different errors.
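Something like this (a rough sketch only; the install path and the ".bak" suffix are just examples, use whatever location and names you actually renamed them to) will put the originals back before you run the patcher:

```python
# Restore renamed DLLs to their original names before running the GOG patcher.
# GAME_DIR and the ".bak" suffix are hypothetical -- adjust to your install and
# to whatever you actually renamed the files to.
from pathlib import Path

GAME_DIR = Path(r"C:\Games\The Witcher 3\bin\x64_dx12")  # example location only
ORIGINAL_NAMES = ["dx12on11.dll", "dxil.dll"]

for name in ORIGINAL_NAMES:
    original = GAME_DIR / name
    backup = GAME_DIR / (name + ".bak")
    if backup.exists() and not original.exists():
        backup.rename(original)  # put the original filename back
        print(f"Restored {original.name}")
    else:
        print(f"Nothing to restore for {name}")
```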

Interestingly, they deleted the dx12on11.dll file. :) Maybe it was left there by mistake. Not really much of a difference on my system other than the OSD being usable with no renaming of .dll files. I can't really say if it still had the crash bug on load since my 2nd monitor is disconnected. I'll plug it in later and see if the game starts crashing again.
 
On the gameplay side of things, I noticed it seems to be much easier to punch above your level than before. Enemies stagger much more easily, and the full effects of your alchemy and signs seem to trigger more often. On Death March I was able to take down level 40 elementals with ease at level 20. It took a long time, but this was much more difficult to do before.
For those installing the GOG patch with the standalone version: if you renamed the .dll files mentioned in an earlier post (dx12on11 and dxil), make sure you rename them back or put them back before you patch, or the patch will fail. I had to reinstall the game before patching because I did that. Apparently the standalone patcher patches files as it goes, and when it fails partway through, it doesn't revert any changes. So when I renamed the .dll files back and re-ran the patch, it failed again with different errors.

Interestingly, they deleted the dx12on11.dll file. :) Maybe it was left there by mistake. Not really much of a difference on my system other than the OSD being usable with no renaming of .dll files. I can't really say if it still had the crash bug on load since my 2nd monitor is disconnected. I'll plug it in later and see if the game starts crashing again.
Might have been an oversight they forgot to remove for the shipping version of the game. It's not unusual for that to happen these days.
 
My GOG install didn't have the dx12on11 .dll, before or after the update.
The dxil .dll is there in both, though.

I compared CPU and GPU use, power use, and framerate for each version; they are all very similar.
But with dxil.dll renamed, it now uses 7.7 GB of 'system' RAM vs 8.2 GB; that's the only standout difference I can see. Commit charge is the same for both.
Latest Win10 version, latest video driver, unclocked 10700K, 450W 3090 with a 350W profile. Game set to max with RTX on, a few settings off like blur...
 
After the patch, it's still crashing on load with any RT enabled.

Going to try a clean install of the Complete Edition just in case there's something to that.

ETA: Clean installing worked to get it running. I'm just at Kaer Morhen, but it's better than the black-screen CTD on load.
 
Still no crashes here. RT is a no-go at 4K on Ultra settings, only 30 FPS. I kinda prefer it all on Ultra+ at 4K HDR anyway. That is min 70, max 90 FPS. Stock 5950X/6900XT.

Edit: Still not utilizing more than 20% CPU, 6-7 GB VRAM, 13 GB RAM
 
That should sort out my missing Steam FPS display. But at least I figured out why it felt so damned clunky to me. View distance defaulted to "close". It feels much better with view distance set to "normal", though I wish I could zoom out more. 43" 4k monitor so Geralt looks big like he ate an orange and red mushroom and the edges of the screen are "over there." I still need to play with graphics settings some more. I'm on a 3090 but at 4k so something has to give, plus I'm not sure how many FPS I really need. Maybe I'll wait a little longer to actually get started on playing and pick this up in January. I only have a few days left before I'm out of town and away from my computer for a while.
 
OK, after the hotfix and latest GPU driver, I'm getting playable mid-50s to high-70s fps at 1440p with just RT global illumination, FSR 2 Balanced, and Ultra+. Blur/HairWorks off. Enabling all three RT settings drops it to the 30s and 40s, and it's rough. Standing around looking at all the effects is pretty nice, though. I haven't decided whether to play with the one RT setting and FSR or go back to 100+ fps all the time by turning it off.
 
I found the other RT settings didn't impact fps much at all; it's Global Illumination that has the big impact. HairWorks is off, naturally, as it looks like crap, especially in cutscenes where his hair is dark lol.
 
Latest update does seem to have improved performance some. I'm seeing more utilization & power usage out of my hardware in the same areas now.
 
That should sort out my missing Steam FPS display. But at least I figured out why it felt so damned clunky to me. View distance defaulted to "close". It feels much better with view distance set to "normal", though I wish I could zoom out more. 43" 4k monitor so Geralt looks big like he ate an orange and red mushroom and the edges of the screen are "over there." I still need to play with graphics settings some more. I'm on a 3090 but at 4k so something has to give, plus I'm not sure how many FPS I really need. Maybe I'll wait a little longer to actually get started on playing and pick this up in January. I only have a few days left before I'm out of town and away from my computer for a while.
I agree that the new close camera angle it defaults to is terrible. Who thought that was a good idea, anyway?
 
Game runs much better after the new patch. Hell, it even shows me my fps now. I am doing about 105 fps at 4K DLDSR in Velen.
 
I agree that the new close camera angle it defaults to is terrible. Who thought that was a good idea, anyway?
The new camera position looks great; I can't stand it when the character is six miles away, as I like to see those little details on the clothing and stuff. The good thing here is that you can change the camera position back to the old style for horse, exploration, combat, etc. in the settings if you want.
 
The Witcher 3 PC- Next-Gen- Game-Changing Visuals But What About Performance?

Running through the entire gamut of upgrades, including the game-changing ray tracing effects, the patch modernises The Witcher 3 with top-tier RT visuals...unfortunately, there are substantial problems with performance that cannot be overlooked - and perhaps surprisingly, it's the CPU that is taxed the most...these problems are not addressed in the hotfix patch...

 
So even the fastest CPUs out there cannot hold 60 FPS in the really CPU-limited areas. I'm sorry, but that is an absolute freaking joke. It's one thing when you can just keep throwing GPU power at a situation and get some improvement, but it's quite another when there's absolutely no CPU made that can maintain 60 FPS. I believe this is the third or fourth game where the fastest CPUs made cannot always hold 60 FPS in CPU-limited situations because of ray tracing. It's even more ridiculous when it's an older game like The Witcher 3 that's just being spruced up to look more modern and ends up being much more demanding than modern games built from the ground up with these features.
 
It's because the engine is running DX12 through a wrapper on DX11; it's CPU-limited to single-threaded use and not properly multi-threaded. It needs a rework, which is unlikely to ever happen due to the roundabout way they went about getting DX12 onto a DX11 engine.
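A toy model of why no CPU fixes that (a minimal sketch with made-up per-frame costs, not measurements from the game): if most of the per-frame submission work is stuck on one thread, adding cores barely moves the frame time.

```python
# Toy Amdahl-style model: frame time when most render submission is single-threaded.
# serial_ms / parallel_ms are illustrative guesses, not profiled from The Witcher 3.
def frame_time_ms(serial_ms, parallel_ms, cores):
    """The serial part runs on one thread; only the rest scales with core count."""
    return serial_ms + parallel_ms / cores

serial_ms, parallel_ms = 18.0, 8.0  # hypothetical per-frame CPU work in milliseconds
for cores in (4, 8, 16):
    t = frame_time_ms(serial_ms, parallel_ms, cores)
    print(f"{cores:>2} cores: {t:.1f} ms/frame (~{1000 / t:.0f} fps)")
#  4 cores: 20.0 ms/frame (~50 fps)
#  8 cores: 19.0 ms/frame (~53 fps)
# 16 cores: 18.5 ms/frame (~54 fps)
```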
 
It's because the engine is running DX12 through a wrapper on DX11; it's CPU-limited to single-threaded use and not properly multi-threaded. It needs a rework, which is unlikely to ever happen due to the roundabout way they went about getting DX12 onto a DX11 engine.
He's just angry in almost every thread.

They implemented a ton of upgrades, crammed all the latest things in, and released it for free; of course it's going to have age-related issues, single-threading being a perfect example. No way would they rebuild the game for DX12 and have it be free. Some people should just be ignored.

I doubt there would be a market for a rebuilt Witcher 3 in DX12; too many mechanics (like combat) have not aged well.
 
He's just angry in almost every thread.

They implemented a ton of upgrades, crammed all the latest things in, and released it for free; of course it's going to have age-related issues, single-threading being a perfect example. No way would they rebuild the game for DX12 and have it be free. Some people should just be ignored.
Yeah how dare I get angry and mad about shitty running games and shitty implementation of features. I should just be happy and enjoy crappy experiences like most of you seem to have no problem with. FFS just imagine if people like me didn't speak the hell up and there weren't YouTube channels out there calling out these issues to help draw attention to get developers to actually fix their games. I swear to God some of you just have bottom of the barrel standards.
 
Yeah how dare I get angry and mad about shitty running games and shitty implementation of features. I should just be happy and enjoy crappy experiences like most of you seem to have no problem with. FFS just imagine if people like me didn't speak the hell up and there weren't YouTube channels out there calling out these issues to help draw attention to get developers to actually fix their games. I swear to God some of you just have bottom of the barrel standards.
It's a free update on a 7 year old game. There really aren't any standards.
 
Just because it's a free update doesn't mean that the quality doesn't deserve to be better... with that excuse, every game that gets a free patch would be allowed to be subpar... the Witcher 3 upgrade sounds like a mixed bag that I'm not sure was really necessary, but it's still an improvement despite the performance issues.
 
The Witcher 3 PC- Next-Gen- Game-Changing Visuals But What About Performance?

Running through the entire gamut of upgrades, including the game-changing ray tracing effects, the patch modernises The Witcher 3 with top-tier RT visuals...unfortunately, there are substantial problems with performance that cannot be overlooked - and perhaps surprisingly, it's the CPU that is taxed the most...these problems are not addressed in the hotfix patch...



Wow, that's trash. DX12-wrapped or not, this is embarrassing for PC gamers who spent $400 on a CPU when the 2020 consoles seem to play the game better.
 
It's a free update on a 7 year old game. There really aren't any standards.
What I'm complaining about is not just this game; that's why I said games, with an s. This has become the norm for the industry now: just release broken, stuttering pieces of crap that take weeks or months to fix, if they ever get fully fixed at all. This is just getting ridiculous, and it makes it more annoying to think about how much I spend on hardware to try to get a good experience. More people need to start speaking up and quit accepting this trash.
 
Wow, that's trash. DX12-wrapped or not, this is embarrassing for PC gamers who spent $400 on a CPU when the 2020 consoles seem to play the game better.
The consoles don't. If you put a PC at the same settings as the consoles, I'm confident it performs the same if not better. The console versions aren't running the full suite of ray tracing features, and they look like they are being rendered through a hefty smear of Vaseline with how blurry they are from the scaling.
 
I know you can do a better job of paying attention to the context there. What I'm complaining about is not just this game; that's why I said games, you know, plural, with an s. This has become the norm for the industry now: just release broken, stuttering pieces of crap that take weeks or months to fix, if they ever get fully fixed. This is just getting ridiculous, and it makes it more annoying to think about how much I spend on hardware to try to get a good experience.

I share your sentiment. People need to stop accepting trash given to them as treasure. The reason so many games are releasing broken, unstable, unfinished, etc. is that people keep defending them, saying "it's fine, it'll be fixed". The more voices speak out, the more it affects the buying decisions of others, the more it impacts the bottom line of those releasing garbage, and the more it encourages them to do better.
 
Wow, that's trash. DX12-wrapped or not, this is embarrassing for PC gamers who spent $400 on a CPU when the 2020 consoles seem to play the game better.
What?

No one is saying the console plays better than a modern PC. The console settings are more like low or medium relative to Ultra+ on PC. The 2020 consoles have basically an AMD 2700-ish equivalent processor in them. They are pretty much just PCs at this point. There is no magic sauce. The console players aren't getting a better experience than the PC players.
 
I share your sentiment. People need to stop accepting trash given to them as treasure. The reason so many games are releasing broken, unstable, unfinished, etc. is that people keep defending them, saying "it's fine, it'll be fixed". The more voices speak out, the more it affects the buying decisions of others, the more it impacts the bottom line of those releasing garbage, and the more it encourages them to do better.
No, sorry. I was the first to take a shit all over Callisto Protocol because that cost $60 and runs like shit.

I'm not going to complain much over a free update to a 7-year-old game though. Is it disappointing? Sure. But free is free.
 
What?

No one is saying the console plays better than a modern PC. The console settings are more like low or medium relative to Ultra+ on PC. The 2020 consoles have basically an AMD 2700-ish equivalent processor in them. They are pretty much just PCs at this point. There is no magic sauce. The console players aren't getting a better experience than the PC players.
I think they are: no shader comp stutters and no DX12 wrapping.
 