The Witcher 3 Wild Hunt Gameplay Performance Review @ [H]

FrgMstr

The Witcher 3 Wild Hunt Gameplay Performance Review - We take The Witcher 3: Wild Hunt, using the 1.08.2 patch and latest drivers, find the highest playable settings, and examine apples-to-apples performance with and without GameWorks features across 10 video cards. We put a focus on NVIDIA HairWorks and how it impacts performance, and find out which video cards provide the best gaming value.
 
Best damn tech article I have ever read! Very eloquently written, easy to understand, and the conclusion was spot on. The way you tackled HairWorks was perfect. You're good, Brent. Damn good! Where did you find the time to do this with the Nano coming soon?

Thank you.
 
Nice article, as always, and an interesting analysis of HairWorks... I think your logic and conclusion are fine, though I do have one question (bear in mind I haven't played Witcher 3).

Is it possible Lara's basic hair rendering was just poorer to begin with, and that makes TressFX look like more of an improvement by comparison? Not that I'm trying to defend HairWorks, mind you; shoot, I'm running CF R9 290s.

The mental cross-reference required to figure out where my R9 290s would land in relation to recent cards kinda throws me off too, heh. My guess is between the 390X and 390, closer to the former if OC'd (and not VRAM-limited)?

That is, assuming AMD's driver improvements have benefited their pre-rebadge cards equally...
 
Nice article, as always, and an interesting analysis of HairWorks... I think your logic and conclusion are fine, though I do have one question (bear in mind I haven't played Witcher 3).

Is it possible Lara's basic hair rendering was just poorer to begin with, and that makes TressFX look like more of an improvement by comparison? Not that I'm trying to defend HairWorks, mind you; shoot, I'm running CF R9 290s.

The mental cross-reference required to figure out where my R9 290s would land in relation to recent cards kinda throws me off too, heh. My guess is between the 390X and 390, closer to the former if OC'd (and not VRAM-limited)?

That is, assuming AMD's driver improvements have benefited their pre-rebadge cards equally...

HairWorks in Witcher 3 is just a lame mess. The hair bounces and flies around even when indoors, and it looks horrible to boot. There is pretty much no reason to even enable HairWorks.
 
HairWorks in Witcher 3 is just a lame mess. The hair bounces and flies around even when indoors, and it looks horrible to boot. There is pretty much no reason to even enable HairWorks.

There is a mod that enables HairWorks for monsters/animals only. Looks great on them!

I do agree, though, that except for a couple of beards, HairWorks looks pretty bad on characters.
 
Typo fixed, thanks for the extra eyes. Please PM me next time. Thanks. - Kyle

Thanks for the analysis though. Always love reading these. Although it makes me sad that a card like the 970 is regarded as "mid range".
 
I just read the whole damn article....Damn good job [H] I honestly have not read ANY review of performance on this game as detailed as this one.

I will be honest. I am shocked the 390x/390 are doing that damn good in witcher 3. I honestly did not expect that at all.

I honestly expected another Nvidia beat down....Well Fury X was, but that was to be expected at least to me.

Keep up the good reviews! Maybe you guys can do a performance review on that new FFXIV expansion; that game has some darn demanding spots for sure.
 
Nice review, and exactly my experience playing TW3 on a 980Ti.

HairWorks is not worth it in this game due to the performance hit. Unless you're running dual GPUs, I wouldn't even bother with it.
 
I just read the whole damn article....Damn good job [H] I honestly have not read ANY review of performance on this game as detailed as this one.

I will be honest. I am shocked the 390x/390 are doing that damn good in witcher 3. I honestly did not expect that at all.

I honestly expected another Nvidia beat down....Well Fury X was, but that was to be expected at least to me.

Keep up the good reviews! Maybe you guys can do a performance review on that new FFXIV expansion; that game has some darn demanding spots for sure.

Since when is a 4 fps difference a beat down?

Does that mean the 980 vs 390X is a slaughter, since it's the same difference yet the 390 is $50-ish cheaper?
 
That is, assuming AMD's driver improvements have benefited their pre-rebadge cards equally...

Something has been. I finally picked up a 3440x1440 monitor but have still been on the fence when it came to powering it. As it stands, I have a 7870 driving it right now. Initially I wanted to see the train wreck of how badly it would do at that resolution, and was surprised to see it push 55 fps in Tomb Raider on High, ~38 fps on Ultra.

When I dropped it down to regular 2560x1440 and 1920x1080, the 7870 is benching higher now than the 7970 was when the game launched. Driver improvements have seriously given the old cards legs. I'm still leaning towards a 390 for a new card, but the 390X may warrant a second look.

Also, I second the suggestion about the FF14 expansion review. Some very nice visuals in that game, and while the old 7870 valiantly struggles with Tomb Raider, FF14 crushes it when maxed out at 3440x1440 (around 17 fps average).
 
Great review. I really appreciated the "is it worth it" perspective on enabling Hairworks. Good to see a solidly competitive landscape all around.
 
I will be honest. I am shocked the 390x/390 are doing that damn good in witcher 3. I honestly did not expect that at all.

I honestly expected another Nvidia beat down....Well Fury X was, but that was to be expected at least to me.

I know, right! Craziness out of a site that's hellbent on destroying AMD! ;)

In all seriousness, we just give you the data and our opinion. We leave it up to the forum goers to make up crazy conspiracy theories. ;)

No Tessellation Override tests for AMD GPUs? Why?

Is this a Witcher 3 feature that we have left out?
 
One topic I'm not sure I made clear enough in the conclusion, but want to make sure I do now, is how unbiased performance turned out to be in this game despite the use of GameWorks features.

NVIDIA and AMD are trading blows in performance.

HairWorks causes the exact same performance demand on AMD and NVIDIA GPUs. HairWorks shows no signs of being biased toward one brand of GPU.

Many want to criticize GameWorks for making a game outright biased toward NVIDIA. I saw no signs of that in my testing of this game.

This issue is always brought up when AMD or NVIDIA has a hand in a game's technology: some people instantly think there is a performance bias toward that GPU, and that for that reason we should not use the game for testing. Well, here is some real-world data that proves it is not automatically so.
 
One topic I'm not sure I made clear enough in the conclusion, but want to make sure I do now, is how unbiased performance turned out to be in this game despite the use of GameWorks features.

NVIDIA and AMD are trading blows in performance.

HairWorks causes the exact same performance demand on AMD and NVIDIA GPUs. HairWorks shows no signs of being biased toward one brand of GPU.

Many want to criticize GameWorks for making a game outright biased toward NVIDIA. I saw no signs of that in my testing of this game.

This issue is always brought up when AMD or NVIDIA has a hand in a game's technology: some people instantly think there is a performance bias toward that GPU, and that for that reason we should not use the game for testing. Well, here is some real-world data that proves it is not automatically so.

This should be a sticky post, certainly.
 
That AMD article was also written before the changes in 1.07.

As it is now, you can use the in-game options to manipulate the quality and performance of HairWorks, and in our testing, as you saw, I'd just recommend disabling HairWorks in order to run at "Ultra" graphics settings and "High" postprocessing with HBAO+. Perhaps dual-GPU setups have the performance to run with HairWorks enabled.

But thank you all for the feedback, I will take it under advisement if we do any more Witcher 3 specific articles.
 
So having you guys compare it would be nice, to see if you find it's similar or different.

Yes, it would be nice, but we are not going to spend the resources to make that happen, and that is my final answer.
 
Excellent article!

...HairWorks causes the exact same performance demand on AMD and NVIDIA GPUs. HairWorks shows no signs of being biased toward one brand of GPU.

Many want to criticize GameWorks for making a game outright biased toward NVIDIA. I saw no signs of that in my testing of this game.

A great reason to do such a thorough technology analysis on these graphics features. It's nice to be able to put that to rest.

We can also call bullshit the next time someone complains about <insert any GameWorks game name here> being used in testing.

Since the NVIDIA GPUs of this generation are much better overclockers, it seems a no-brainer at this point for someone in the market for a video card.
 
Great review.

I do wish we had a similar review at launch. Then we could see how much 3 months of patches and driver work improved the game on all the various hardware.
 
This thread is about what is IN the review. Should you want to discuss something NOT in the review, please do it in your own thread.
 
Now this was a nice read. I still think that the Fury X makes up that 10% in performance compared to the 980 Ti with its water cooler. Look at the Asus Poseidon 980 Ti: it costs $725, and you need to buy a radiator/pump for it.
So, I still think that the R9 Fury has the right price, IMHO.
 
And about the test methods (this concerns all reviewers, not just HardOCP): everything is done in ideal conditions. 21-22 degrees room temperature, open benches... Real-world scenarios are different. While this way you can maximize performance, in a closed case at 23-25 degrees room temperature you maybe can't achieve those OC numbers on the CPU or the GPU (I know, this was done at stock settings for the GPUs, but the CPU was at 4.7).
Just saying.
Anyway, my conclusion is that even with a $999 card you can't play The Witcher 3 at max settings decently. That is a bit sad.
 
I like the analysis of HairWorks from a performance-loss and image-quality-gain perspective. I agree strongly that HairWorks in its current form uses unnecessarily high levels of tessellation, which cause too much loss of performance. The performance tradeoff for the gain in image quality and gameplay immersiveness is not worth it.

TressFX continues to be the benchmark and model for how games should implement hair effects. I am very excited about TressFX 3.0 and Deus Ex: Mankind Divided. Deus Ex: Mankind Divided is also a DX12 game, so that makes it even more interesting. This is absolutely the game to watch out for this holiday season, along with Star Wars Battlefront, which uses the Frostbite engine and will also support DX12.

Good work guys. :)
 
That AMD article was also written before the changes in 1.07.

As it is now, you can use the in-game options to manipulate the quality and performance of HairWorks, and in our testing, as you saw, I'd just recommend disabling HairWorks in order to run at "Ultra" graphics settings and "High" postprocessing with HBAO+. Perhaps dual-GPU setups have the performance to run with HairWorks enabled.

But thank you all for the feedback, I will take it under advisement if we do any more Witcher 3 specific articles.

Have you guys checked dropping shadow quality? From what I've seen, dropping it to High or Medium has a tremendous impact on performance (not as big as HairWorks, but close), while it was hard to find any serious difference in image quality.

Also, the conclusion only holds true for reference-clocked GPUs, when almost all 960/970/980 cards on the market are overclocked variants with significant frequency gains.
 
Don't understand why comparisons are to stock cards, as most NVIDIA buyers get custom-cooled cards from AIB partners, many of which are much faster out of the box.
 
Can't install the new patch. I get an error that a bin file can't be found. Using GOG.
 
Don't understand why comparisons are to stock cards, as most NVIDIA buyers get custom-cooled cards from AIB partners, many of which are much faster out of the box.

Most likely it wouldn't be worth it to compare EVERY AIB card and variant. Knowing the factory performance of this card should give you an idea of what an overclocked/custom-cooled card can do; at the very least, you know what your minimums will be.
 
Now this was a nice read. I still think that the Fury X makes up that 10% in performance compared to the 980 Ti with its water cooler. Look at the Asus Poseidon 980 Ti: it costs $725, and you need to buy a radiator/pump for it.
So, I still think that the R9 Fury has the right price, IMHO.

Not really a good way to look at it. That's like saying the Subaru WRX makes up its performance vs the Mustang GT by having a turbo installed, then pointing out how much more a ROUSH Stage 3 Mustang goes for. The Fury X is overpriced; it should have been air-cooled (overclocked Furys show that it is possible) and retailed for the price of the vanilla Fury.

That said, this is a great comparison, and I really appreciate it. It's nice to see that AMD gave their old chips some legs (Tonga/Hawaii) to go toe to toe with much newer cards. I can't wait to see DX12 games show up. It'll be hilarious to see the 390 blowing the GTX 980 Ti out of the water.
 
Most likely it wouldn't be worth it to compare EVERY AIB card and variant. Knowing the factory performance of this card should give you an idea of what an overclocked/custom-cooled card can do; at the very least, you know what your minimums will be.

I understand. It's just that the custom 980 Tis are SO much faster, even than the Titan, some with little difference in price from the reference card, so an article concluding that the Titan provides the best available experience in today's marketplace might not be entirely accurate. The Fury X has been shown to be a very limited overclocker, and there are no custom variants at this time.
 
Don't understand why comparisons are to stock cards, as most NVIDIA buyers get custom-cooled cards from AIB partners, many of which are much faster out of the box.

Sorry that a 10-card comparison was not enough for you. We are damned if we do, damned if we don't when it comes to overclocked vs. non-overclocked cards in these game-specific articles. People complain about it no matter how we do it. The fact of the matter is that we have to end the workload somewhere to get the story published in a timely manner.

Like you mention above, you already know that a heavily overclocked 980 Ti is faster, so I think you can come to your own conclusions there.

Now, as mentioned previously in BIG BOLD TEXT: can we discuss the article rather than what is NOT in the article? You people are like herding cats sometimes.
 
Thanks for a well-written article, [H] team. What a sad bunch of whining pussies some of you guys are.
Grow up and come back when you do. No modern game has been developed for a single GPU, considering all the high-resolution options we have now.
 
You guys realize you can use simple math to convert to similar cards, e.g., if an AIB card is 5% faster, multiply the result by 1.05.

I'll give you a spoiler: the results don't change much.
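The napkin math described above can be sketched out in a few lines; note the 5% figure and the 48 FPS starting point are purely illustrative assumptions, not numbers measured in the article:

```python
# Back-of-envelope estimate: scale a reference-clocked card's result
# by an AIB model's factory overclock percentage.
# The inputs below are illustrative assumptions, not measured values.

def scale_fps(reference_fps, oc_percent):
    """Estimate an overclocked AIB card's frame rate from a
    reference-clocked result, assuming performance scales roughly
    linearly with core clock (an optimistic upper bound)."""
    return reference_fps * (1 + oc_percent / 100)

# e.g. a reference card averaging 48 FPS, and an AIB variant
# clocked ~5% higher out of the box:
print(round(scale_fps(48, 5), 1))  # → 50.4
```

In practice scaling is rarely perfectly linear (memory bandwidth and power limits intervene), so treat the result as an upper-bound estimate rather than a prediction.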

I have to finish this game... only in the second area. I'll give it another few weeks to see which mods pop up.

You guys sliced this article a hundred different ways. Very interesting. Especially the equal hit against the 980 Ti and Fury X for HairWorks.
 
Great article.

Question for you Kyle:

Since you guys didn't do any multi card configs (no blame, I totally get the time constraints), based on everything else you've seen with this game and whatnot, would you guess that a pair of 970's would be about equal to a 980ti in this game performance wise?
 
[L]imey said:
Great article.

Question for you Kyle:

Since you guys didn't do any multi card configs (no blame, I totally get the time constraints), based on everything else you've seen with this game and whatnot, would you guess that a pair of 970's would be about equal to a 980ti in this game performance wise?

No clue. SLI and CF scaling is VERY specific to the game title (and which driver version is used). I would not even want to guess. I would suggest someone here in our forums has a more informed opinion than I could give you on this question.
 
One topic I'm not sure I made clear enough in the conclusion, but want to make sure I do now, is how unbiased performance turned out to be in this game despite the use of GameWorks features.

NVIDIA and AMD are trading blows in performance.

HairWorks causes the exact same performance demand on AMD and NVIDIA GPUs. HairWorks shows no signs of being biased toward one brand of GPU.

Many want to criticize GameWorks for making a game outright biased toward NVIDIA. I saw no signs of that in my testing of this game.

This issue is always brought up when AMD or NVIDIA has a hand in a game's technology: some people instantly think there is a performance bias toward that GPU, and that for that reason we should not use the game for testing. Well, here is some real-world data that proves it is not automatically so.

Very well said. With this conclusion, along with the evidence provided, I hope we can all finally put this issue to rest.
 
I would like to think this would end the complaints about Gameworks, but I know better.

The other point I have taken from this, is I really need to get this game.
 
1.) Thanks for the article. Very well laid out and thorough.

2.) Just out of curiosity, if you were to speculate: how much of the field leveling (if any at all) is due to the patches to Witcher 3, and how much is due to the newest driver set? "No clue" is a totally acceptable answer. :)
 
Very easy-to-read and detailed article, great job. NVIDIA and AMD were very close at all levels, especially at the 390/GTX 970 level. I was surprised how well the 380 did against the GTX 960. Might be worth picking up a second 7970 GHz ($100 used) since 3GB gets the job done. For those that have $650 to spend and bleed red, a pair of 390s would be the way to go.
 
Kinda wish they would widen the card selection. The lowest-priced card is $200, the highest is like $650, unless you count the Titan X.

I'm all for reading a good article on the latest game and how this year's cards stack up, but look at the viewers:

http://store.steampowered.com/hwsurvey/videocard/

The most popular config on Steam is 1 GB of VRAM.

Thank you for including the 970 and 960; that accounts for around 6% of the community. I'm still amazed at the popularity of the 970.

Still, throw us a bone and tell us what a $100 750 or 360 gets us.
 