The Witcher 3 Wild Hunt Gameplay Performance Review @ [H]

Most of that garbage there is mobile. The GTX 760, GTX 660, and 7900 series top the list for desktop. If you have only $100 to spare, get a used HD 7900 or GTX 770. Otherwise, you will not even have the muscle for 1080p.
 
Most of that garbage there is mobile. The GTX 760, GTX 660, and 7900 series top the list for desktop. If you have only $100 to spare, get a used HD 7900 or GTX 770. Otherwise, you will not even have the muscle for 1080p.

That said, I would love to see a performance evaluation with the Radeon 4870, 5870, 6970, 7970 GHz Edition, R9 290X, and Fury X, and see how it correlates with the GTX 280, GTX 480, GTX 580, GTX 680, GTX 780 Ti, and GTX 980 Ti. It would be great to see which generation saw the biggest jump from the last.
 
Most of that garbage there is mobile. The GTX 760, GTX 660, and 7900 series top the list for desktop. If you have only $100 to spare, get a used HD 7900 or GTX 770. Otherwise, you will not even have the muscle for 1080p.

I don't know why, but 770s are going for over $160 on eBay for 2GB, and 4GB is an insane $275.
 
I'd be curious to see what the HairWorks performance looks like when the cards are all overclocked.
 
How come you guys don't knock on the $999 price tag on the Titan X?

You get a whole 2 FPS more than a $650 980 TI as pointed out in your latest review:

http://www.hardocp.com/images/articl...sJrD_6_1_l.gif

$349 / 2 ≈ $175 per fps.

Yes, I know the Titan came out first, but people are looking to buy now, when they read the review, not months ago.

How can you recommend it at all?


The only disappointing card on performance in The Witcher 3: Wild Hunt was the Radeon R9 Fury X. For a $649 video card it was nowhere on the level of where it should be. The GeForce GTX 980 Ti offers a much better experience, close to TITAN X level performance. For AMD video cards, compared to the competition, the Radeon R9 Fury, Radeon R9 390X, Radeon R9 390 and Radeon R9 380 are worth a second look for Witcher 3 gameplay experience if you are shopping around.

On the NVIDIA side, nothing at all beats the TITAN X for performance in The Witcher 3: Wild Hunt. Yeah, it costs $999, and it should do the best, and it proves that it does. AMD has nothing that can compete with it in this game. It stands in a league of its own, if you simply want the fastest single-GPU experience in this game go with the TITAN X.

The Fury X gets 6 fps less at the highest settings you tested, and is $350 cheaper and watercooled.

Somehow the $350 for 2 fps from 980 TI -> Titan is worth it, but the 4 fps difference between 980 TI and the watercooled Fury X makes it unworthy?
 
How come you guys don't knock on the $999 price tag on the Titan X?

You get a whole 2 FPS more than a $650 980 TI as pointed out in your latest review:

http://www.hardocp.com/images/articl...sJrD_6_1_l.gif

$349 / 2 ≈ $175 per fps.

Yes, I know the Titan came out first, but people are looking to buy now, when they read the review, not months ago.

How can you recommend it at all?

The Fury X gets 6 fps less at the highest settings you tested, and is $350 cheaper and watercooled.

Somehow the $350 for 2 fps from 980 TI -> Titan is worth it, but the 4 fps difference between 980 TI and the watercooled Fury X makes it unworthy?

We never "recommended" that anyone purchase a Titan X in this article. Where did we ever calculate "worth" as you are stating above?

I am fairly sure we went to extremes to point out the cost of each video card and specifically stated what was held up by the data. All the data is laid out there for the reader to digest and decide what is the best value for them.
 
Seems kinda obvious to me, and they point out how close the Ti is within the very same quote... Sometimes people just like to get butthurt over what they perceive rather than what's actually there. I'm running two R9 290s, btw, and ran a 6950 before those; I couldn't be less biased, IMO (though I'd probably buy NV if I were buying today).
 
The conclusion is what I'm basing that on:

You called the Fury X a disappointing card on performance while it was 6 fps lower than the $350-more-expensive Titan X, which you said "proves that it does" the best. You then stated, "If you simply want the fastest single-GPU experience in this game go with the TITAN X."

I know you have SLI Titan Xs in your home machine, so enjoy using them, but they are a bad buy for the money. And if you can call the Fury X a bad buy, why can't you call the Titan X one too, when it's objectively much worse performance per dollar? A third-party 980 TI with a stock OC is faster and cheaper.

So what you are telling me, is that a reader disagrees with us? I knew it was bound to happen sooner or later. Mark this day on your calendar!

On a serious note... We give you all the data we collect and talk about our gaming experiences in GPU reviews. I never expect all our readers to agree with our conclusions. We supply that data and subjective experience evaluation so our readers can come to their own conclusions as well. You have come to a different conclusion than we have based on the data.

And just for the record, this is what we said about "disappointing," not taken out of context as you have done previously.

The only disappointing card on performance in The Witcher 3: Wild Hunt was the Radeon R9 Fury X. For a $649 video card it was nowhere on the level of where it should be. The GeForce GTX 980 Ti offers a much better experience, close to TITAN X level performance. For AMD video cards, compared to the competition, the Radeon R9 Fury, Radeon R9 390X, Radeon R9 390 and Radeon R9 380 are worth a second look for Witcher 3 gameplay experience if you are shopping around.

Second, the GTX 980 Ti offers amazing performance and experience for $649. Not a cheap price, but it offers near TITAN X performance at a much lower price. Its value is greater than the TITAN X because of this. It does beat the Radeon R9 Fury X and allows a better The Witcher 3: Wild Hunt gameplay experience.

The thoughts on it being disappointing are very much a value-oriented statement, hence our spelling out the prices.

You will also notice that we gave props to the 390X, 390, and 380 above in the paragraph you quoted from. So, contrary to what you were implying in the other thread, we are not in a "feud" with AMD, and we never publish anything but our thoughts on the hardware and the gaming experience it provides. Again, as in the other thread, would we be giving Gold and Silver awards to AMD video cards if we were in a "feud" and carried that over into our writing? No.

You are someone that has a different opinion on the hardware than us. I can accept that you do, hopefully you will be able to accept that someone has a different opinion than your own.

And you keep referring to the 6 fps. We really do not put that much value into frames per second when we write these conclusions, but more into the quality of the actual gameplay experience.

We think the Fury X is overpriced for the gaming experience that it offers. Your mileage may vary. I don't expect you to come see it our way. But suggesting that we are giving bad reviews to Fury X cards because we have some sort of disagreement with AMD is ludicrous. For the record, we have given Gold awards to companies that I hate. But when it comes to it, we let the hardware stand on its own.
 
Seems kinda obvious to me, and they point out how close the Ti is within the very same quote... Sometimes people just like to get butthurt over what they perceive rather than what's actually there. I'm running two R9 290s, btw, and ran a 6950 before those; I couldn't be less biased, IMO (though I'd probably buy NV if I were buying today).

So then point out that a water cooled 980 TI would be $100 more.

Titan X - $999 for 46 fps -> ~$21.72 per fps

980 TI - $650 for 44 fps -> ~$14.77 per fps

980 TI (WC) - $750 for 44 fps -> ~$17.05 per fps

Fury X (WC) - $650 for 40 fps -> ~$16.25 per fps

Yes the 980 TI OCs well and would push it to even better value over the Titan X.

I'm not trying to say that AMD handled the Fury launch well or anything. I'm guessing small batches are why they went with a WC-only version instead of also offering a cheaper air-cooled one, but we have what we have.
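The per-frame arithmetic above is easy to get wrong when rounding, so here is a quick sketch that computes the exact dollars-per-fps figures. The prices and average frame rates are the numbers quoted in this thread, not fresh benchmarks, and the $750 water-cooled 980 Ti price is the thread's assumed ~$100 premium:

```python
# Dollars-per-frame comparison using the prices and average frame rates
# quoted in this thread (the poster's numbers, not fresh benchmarks).
cards = {
    "Titan X": (999, 46),
    "980 Ti": (650, 44),
    "980 Ti (WC)": (750, 44),  # assumes a ~$100 premium for the water cooler
    "Fury X (WC)": (650, 40),
}

for name, (price, fps) in cards.items():
    # cost per average frame: lower is better value
    print(f"{name:<12} ${price} / {fps} fps = ${price / fps:.2f} per fps")
```

By this metric the 980 Ti is the clear value leader, the two water-cooled cards land around $16-17 per frame, and the Titan X is the most expensive frame you can buy.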
 
Am I doing something different in my setup? I run an i5-4690K at stock and an MSI Gaming 4G GTX 980 (1444MHz core / 7.52GHz memory) and get consistent performance around 50fps at 1440p, even with lots of action onscreen.

I'm running everything in Ultra under Graphics except I've manually set Foliage Visibility to High; HairWorks is off. Under Post-Processing, HBAO+ and everything else are on except Chromatic Aberration.

Is my video card overclock really making up that much of a performance gap? Or is Foliage Visibility a high-cost setting?
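As a rough sanity check on that question: if we assume fps scales linearly with core clock (a crude assumption; memory bandwidth, scene load, and CPU limits all break it in practice), the overclock alone could account for a fair chunk of the gap. The 1216MHz figure is the reference GTX 980 boost clock; the other numbers come from the post above:

```python
# Back-of-the-envelope: estimate how much of the fps gap the overclock
# could explain, assuming fps scales linearly with core clock (crude!).
reference_boost_mhz = 1216  # stock GTX 980 boost clock
overclocked_mhz = 1444      # the MSI Gaming 4G clock quoted above
observed_fps = 50           # the frame rate reported above

scaling = overclocked_mhz / reference_boost_mhz
estimated_stock_fps = observed_fps / scaling
print(f"clock uplift: {100 * (scaling - 1):.0f}%")              # prints 19%
print(f"estimated stock-clock fps: {estimated_stock_fps:.1f}")  # prints 42.1
```

So under that naive scaling model, a ~19% core overclock could plausibly turn a low-40s stock result into the observed ~50fps; the rest of any difference would come down to settings like Foliage Visibility.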
 
I never said anything about a feud. Though I do find it odd that the $50-100 cheaper 390X only got a Silver award when compared to the 980, even though it was just as fast.



Again, I said nothing about a disagreement with AMD. You aren't giving the Fury X any credit for having a water cooler built in; that would be an extra $100 on a 980 TI.

I see a 2 FPS difference between 980 TI and Titan X with hairworks and everything else enabled in apples to apples:

[image snipped]

I'm going off the actual gameplay you tested. I still can't fathom how $350 is worth the difference. Why not just OC the 980 TI and save $350?

You have the exact same quality settings on the cards, but one gets 2 fps more for $350 and it's not called the super overpriced luxury edition? Would the Fury X have to be free to not be a bad deal, since it's the same price as the 980 TI but 4 fps less ($350 / 2 × 4 = $700)?

Look, for some people money isn't an issue. I'm one of those guys, and I can just say to you that in the same way the 980 Ti can be overclocked, the Titan X can also be overclocked and will still offer a better experience than a 980 Ti, especially in multi-GPU configurations.

Also, it's just 2 FPS in this game at this moment of the test, but in other games the difference is bigger...
 
I never said anything about a feud. Though I do find it odd that the $50-100 cheaper 390X only got a Silver award when compared to the 980, even though it was just as fast.



Again, I said nothing about a disagreement with AMD. You aren't giving the Fury X any credit for having a water cooler built in; that would be an extra $100 on a 980 TI.

I see a 2 FPS difference between 980 TI and Titan X with hairworks and everything else enabled in apples to apples:


I'm going off the actual gameplay you tested. I still can't fathom how $350 is worth the difference. Why not just OC the 980 TI and save $350?

You have the exact same quality settings on the cards, but one gets 2 fps more for $350 and it's not called the super overpriced luxury edition? Would the Fury X have to be free to not be a bad deal, since it's the same price as the 980 TI but 4 fps less ($350 / 2 × 4 = $700)?

Why do they deserve credit for the AIO? It's part of the cooling system that AMD obviously felt they needed for the card. What a 980 Ti with a block already on it costs is irrelevant. The ONLY situation in which the AIO is a "bonus" is if a potential buyer already planned to use a 120mm AIO on the card. If someone has a custom loop, the cooler is entirely useless. If someone never changes the cooler that comes with the card (the VAST majority of people buying GPUs), then it's just a thing that's there. If for one reason or another someone doesn't like water coolers, then they have to remove it and find an aftermarket solution, increasing their upfront cost. AMD can have credit for the card not being a fucking oven with what sounds like a damn Delta fan cooling it, like the 290x. Beyond being a good stock cooler, the AIO doesn't matter one bit. Going back to the 980 Ti with a block on it for a moment: using water cooling in a price/performance argument is really stupid. You don't build a custom loop to save money or to get good price-to-performance.

PS: Why do you keep pointing to that one single image? Playable settings are the big metric for [H]. The TX allows higher settings than any other card tested, that's the important thing. By the way, none of the three cards in your image are running at playable frame rates.
 
I never said anything about a feud.

My apologies.

Your thoughts are noted. I am not going to answer the same questions and accusations over and over simply because you do not like our opinions.
 
I read the review and, as usual, very well written and informative, but I have to question your "Highest Playable Settings" results. If I want to max out all settings, what resolution should I play at? I care more about having all the bells and whistles on and maybe play at 1080p vs. playing at a higher resolution, but have to turn stuff down/off.

So can you run the Witcher 3 at 1080p with everything max and Hairworks and keep acceptable frame rates?
 
I read the review and, as usual, very well written and informative, but I have to question your "Highest Playable Settings" results. If I want to max out all settings, what resolution should I play at? I care more about having all the bells and whistles on and maybe play at 1080p vs. playing at a higher resolution, but have to turn stuff down/off.

So can you run the Witcher 3 at 1080p with everything max and Hairworks and keep acceptable frame rates?

You should have no problem getting the info you need, as there's a metric shit ton of performance analysis done on The Witcher 3 available online. This game has been out almost 6 months now.

As much as I like the performance review here on [H], this feels about 5 months too late, IMO. I think most people have moved on from The Witcher 3 or know exactly what performance to expect from their setups, as the internet is flooded with these analyses.
 
We do not experience any differences on Geralt. When you played Tomb Raider you immediately recognized the difference with and without TressFX, it was night and day. This is not the case as far as Geralt goes.
However, it is the case as far as animal fur goes. There is a big difference visually in the horses' manes and on creatures that have fur, such as wolves. You can tell the difference instantly, and it does look great on creatures. The problem is the performance drain needed to run HairWorks on both Geralt and creatures at the same time. The performance tanks, and you have to lower in-game settings.

OK, so HW on Geralt is meh, no difference. HW on animals is more noticeable. Having both on at the same time cripples performance. But you can only have either "Geralt HW" or "Geralt + Animals". It's a pity that the devs did not figure out that it might make a shitload more sense to allow setting HW just on animals, without Geralt at the same time. Give us 3 options!
a) only geralt
b) only animals
c) a+b

Is it really so hard?
 
Excellent review, as usual. Further evidence AMD needs to drop the price on Fury X to be competitive. Especially good point you threw in on TressFX being supported in upcoming games which should somewhat level the argument vs. Gameworks. Still a divisive issue nonetheless...

As a side note, in line with the experience with driver improvements in this game: Crossfire support at release was, in a word, spotty. It's now absolutely terrific: my 295x2 tears through this game at 1440p without any of the prior issues.

The other point worth mentioning about this game is the absolutely stellar support CD Projekt has given it, not just with driver improvements but also with free and continual DLCs.
 
I never said anything about a feud. Though I do find it odd that the $50-100 cheaper 390X only got a Silver award when compared to the 980, even though it was just as fast.



Again, I said nothing about a disagreement with AMD. You aren't giving the Fury X any credit for having a water cooler built in; that would be an extra $100 on a 980 TI.

I see a 2 FPS difference between 980 TI and Titan X with hairworks and everything else enabled in apples to apples:

[image snipped]


I'm going off the actual gameplay you tested. I still can't fathom how $350 is worth the difference. Why not just OC the 980 TI and save $350?

You have the exact same quality settings on the cards, but one gets 2 fps more for $350 and it's not called the super overpriced luxury edition? Would the Fury X have to be free to not be a bad deal, since it's the same price as the 980 TI but 4 fps less ($350 / 2 × 4 = $700)?

This is the false equivalence upon which your whole pointless gripe is based. [H] includes apples-to-apples as a point of interest and reference for those who are so inclined, but their conclusions are based primarily on highest playable settings.

Their comments at the end are based on the fact that the Titan X is able to play with a setting enabled that the 980ti and Fury X can't match. A 2 FPS difference at settings where all three cards are not smoothly playable is meaningless. The ability to turn on more eye-candy while maintaining playable FPS is meaningFUL. So the review points that out.

Then you drag in the BS about "Oooooh, but the Fury X comes with a 'free' watercooler!" Really? You want to go there? Okay, let's go there. It has one because it would be unstable without it. The 980ti (and the Titan X) are stable on air cooling and highly overclockable on it to boot. The Fury X, with "free" watercooler, can't overclock at all. If this test had been done at max overclocks on every card, it would have been total carnage for the Fury X, and maybe for several of the other AMD cards. Be thankful that they chose stock clocks. Not needing a watercooler makes the NVidia value proposition better, not worse.
 
This is the false equivalence upon which your whole pointless gripe is based. [H] includes apples-to-apples as a point of interest and reference for those who are so inclined, but their conclusions are based primarily on highest playable settings.

Their comments at the end are based on the fact that the Titan X is able to play with a setting enabled that the 980ti and Fury X can't match. A 2 FPS difference at settings where all three cards are not smoothly playable is meaningless. The ability to turn on more eye-candy while maintaining playable FPS is meaningFUL. So the review points that out.

Well really, that should be the setting they use for Apples to Apples then, not a higher setting if it's "unplayable". We can't compare what the highest playable on the Titan X is versus the 980 TI because they aren't showing us those results; we have the Apples to Apples we can use, though.

If anything, with lowered AA/IQ the 980 TI / Fury X should be closer to the Titan's Playable performance settings.

Then you drag in the BS about "Oooooh, but the Fury X comes with a 'free' watercooler!" Really? You want to go there? Okay, let's go there. It has one because it would be unstable without it. The 980ti (and the Titan X) are stable on air cooling and highly overclockable on it to boot. The Fury X, with "free" watercooler, can't overclock at all. If this test had been done at max overclocks on every card, it would have been total carnage for the Fury X, and maybe for several of the other AMD cards. Be thankful that they chose stock clocks. Not needing a watercooler makes the NVidia value proposition better, not worse.

Fury X doesn't require a water cooler. Someone in the forums here unlocked their Fury to a Fury X and ran it air-cooled just fine.

I'm guessing the reason they went with the waterblock was:

A) Added Value (980 TI WC is $100 more).

B) Tons of people hated the stock 290x cooler (third party variants were much better) and they got lots of bad press about loud coolers. Now they get knocked for having a WC... Just can't catch a break.

They obviously have a limited supply of chips, since they are constantly sold out, so I don't see why they'd make so many versions of the card. Also, it was originally pitched to compete against the $999 Titan X. Nvidia released the 980 TI right before the Fury X to try to kill its sales.

Also are you going to tell me people should honestly buy a Titan X over a Factory OC'd / Custom Cooled 980 TI for $300-350 less? Because there are some very good custom coolers for the 980 TI that make it faster, quieter, and cooler than the Titan X for a fraction of the price.
 
Let me preface my response by saying I appreciate your objective tone. My original comment to you was way more flame-baity than it needed to be. Apologies.

If anything, with lowered AA/IQ the 980 TI / Fury X should be closer to the Titan's Playable performance settings.

And they are. Look at the frame rates in the highest playable table. The 980ti and the Fury X have higher frame rates than the Titan X because they don't have Hairworks turned on. And yet, the Titan still pulls 50's on average, which must have been considered the smooth-feeling range in [H]'s professional opinion. That's the very definition of "highest playable settings."

The point, in my view, is that you're using the apples-to-apples graphs to compare apples-to-oranges in your objection to their characterization of the Titan X. You say the Titan has no value based on the apples-to-apples. [H] sees a value (the extent of which is up to the observer) based on the fact that a demanding feature, Hairworks, is playable on the Titan but not on the 980ti or Fury X. Your arguments about the relative importance of 2 FPS over the Ti or 6 FPS over the X are all based on the wrong part of the review.

Fury X doesn't require a water cooler. Someone in the forums here unlocked their Fury to a Fury X and ran it air-cooled just fine.

I was unaware of this--I stand corrected.

I'm guessing the reason they went with the waterblock was:

A) Added Value (980 TI WC is $100 more).

If you run a watercooled system, this is a potential advantage. I would think, though, that many such people would already have a loop for the graphics card, and would just take the air cooler off a regular Ti and put their own water block on it. Don't know, but watercooling is a niche of a niche (high-end enthusiasts).

B) Tons of people hated the stock 290x cooler (third party variants were much better) and they got lots of bad press about loud coolers. Now they get knocked for having a WC... Just can't catch a break.

Didn't know about the hated 290x cooler, fair point. But I don't see anybody around here knocking the Fury X for having a watercooler. I only see people calling BS when someone claims that the watercooler somehow excuses the performance deficit in its price range.

Also it was originally pitched to compete against the $999 Titan X. Nvidia released the 980 TI right before Fury X to try to kill its sales.

Against the Titan X, the Fury X would be a bargain. Against the 980 non-Ti, it would be a performance leader. But the Ti does exist, because apparently nVidia knows what they are doing. Pity AMD doesn't.

This really isn't about the Titan at all. nVidia's top-tier mass-market card was the 980. They saw the Fury X coming, made the 980 better, and had it ready in time to win. So they win. QED.

Also are you going to tell me people should honestly buy a Titan X over a Factory OC'd / Custom Cooled 980 TI for $300-350 less? Because there are some very good custom coolers for the 980 TI that make it faster, quieter, and cooler than the Titan X for a fraction of the price.

I'm not making claims of any kind--you are. You are misconstruing the intent and basis of [H]'s comments about the Titan in this article, and refusing to accept correction. If it makes you feel better, enjoy that, but it doesn't make you right.
 
Very interesting article. What I gather is that for single-GPU use, HairWorks just does not add much to the gameplay when comparing quality settings with and without the HairWorks enhancements.

Going dual-GPU would be very interesting for use of HairWorks and overall performance; the resolution could then maybe be pushed up to 4K with the same results we see here :D. Still, with AMD's generally better scaling in multi-GPU configurations, a dual+ setup may be a match at the high end and surpass at the mid to lower end.

Bottom line is if Hairworks does not add much to the game play then it is pointless to spend more time testing it in the end.

One note:
  • If one has a 1440p monitor (not 4K), the highest-playable settings sometimes do not help in determining an upgrade.
  • Knowing the highest playable settings and then the next lower ones (particularly if maxed out) would, I think, help give a better feel for the experience and a better review overall.
  • Maybe max-game-quality resolution testing: in other words, what resolution can you max out all of the game's quality features at?
 
We found no video card in our testing that could maintain HairWorks on both Geralt and creatures at the same time with "Ultra" settings in The Witcher 3: Wild Hunt. It just isn't possible. Because of the sheer performance drain, we have to sacrifice the quality of fur on creatures. That leaves us with just Geralt, and the differences don't really benefit us there.

But what if you're running at something like 1080p? The only cards that were tested at that resolution were the 960 and the equivalent AMD which are both already barely adequate for that resolution (I'd want at least a 970/390 for 1080p). It'd be interesting to see if the 970/980/390 can run with everything turned on.
 

Their comments at the end are based on the fact that the Titan X is able to play with a setting enabled that the 980ti and Fury X can't match. A 2 FPS difference at settings where all three cards are not smoothly playable is meaningless. The ability to turn on more eye-candy while maintaining playable FPS is meaningFUL. So the review points that out.

This pretty much sums up all of [H]'s video card reviews. We have 20 other sites that review the cards, all with the same settings on each. If you want an apples-to-apples match, go to one of those sites. If you want to see which video card is the most playable at your monitor's resolution and price point, come here :)
 
OK, so HW on Geralt is meh, no difference. HW on animals is more noticeable. Having both on at the same time cripples performance. But you can only have either "Geralt HW" or "Geralt + Animals". It's a pity that the devs did not figure out that it might make a shitload more sense to allow setting HW just on animals, without Geralt at the same time. Give us 3 options!
a) only geralt
b) only animals
c) a+b

Is it really so hard?

There is a mod for it.
http://www.nexusmods.com/witcher3/mods/165/?
Though I bet CD Projekt will add this option themselves eventually.


Although it's quite well known, I would have also liked HardOCP to take a look at the Shadows setting. You can save quite a few FPS by dropping it with barely any real drop in visual quality. Even Low shadows look damn nice!
 
This pretty much sums up all of [H]'s video card reviews. We have 20 other sites that review the cards, all with the same settings on each. If you want an apples-to-apples match, go to one of those sites. If you want to see which video card is the most playable at your monitor's resolution and price point, come here :)

Except on higher settings there was very little difference between the cards, so why not test those lower quality Titan X "Max" settings for all cards and let us decide if the few fps difference is worth it? Why are the Apples to Apples using higher settings than the max playable? If those tests tell us nothing and are worthless for comparison, why do them?
 
Except on higher settings there was very little difference between the cards, so why not test those lower quality Titan X "Max" settings for all cards and let us decide if the few fps difference is worth it? Why are the Apples to Apples using higher settings than the max playable? If those tests tell us nothing and are worthless for comparison, why do them?

View counts?

I think a fair trade off would be to find "max playable settings" for the fastest card, and set that for the "100%" benchmark like some sites do. Then you could choose to either get "the best card" or decide for yourself if the weaker card is worth it based on the frame drops.

Still as the other guy said, what is the point of testing at settings that are unplayable?

Can I bust out the car analogy and have a towing test where the load is so heavy no truck can maintain highway speeds? Which truck should I buy? Well, the Chevy got to 50 mph first with 10,000 lbs in the back, so I guess I'll get it since it is "the best."
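The "100% benchmark" normalization suggested above is simple to sketch: take the fastest card's result at its best playable settings as the baseline, and express every other card as a percentage of it. The frame rates here are purely illustrative placeholders, not [H]'s measurements:

```python
# Normalize each card's fps to the fastest card ("100% benchmark" style).
# These frame rates are made-up placeholders for illustration only.
fps = {"Titan X": 52, "980 Ti": 46, "Fury X": 40}

baseline = max(fps.values())  # fastest card defines 100%
for card, value in sorted(fps.items(), key=lambda kv: -kv[1]):
    print(f"{card:<8} {value} fps ({100 * value / baseline:.0f}% of fastest)")
```

The reader can then decide whether, say, a card at 88% of the leader's frame rate is worth the price gap, which is exactly the trade-off being argued about in this thread.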
 
View counts?

I think a fair trade off would be to find "max playable settings" for the fastest card, and set that for the "100%" benchmark like some sites do. Then you could choose to either get "the best card" or decide for yourself if the weaker card is worth it based on the frame drops.

Still as the other guy said, what is the point of testing at settings that are unplayable?

Can I bust out the car analogy and have a towing test where the load is so heavy no truck can maintain highway speeds? Which truck should I buy? Well, the Chevy got to 50 mph first with 10,000 lbs in the back, so I guess I'll get it since it is "the best."

I'm agreeing with you.

I don't understand why they tested the Apples to Apples at higher settings than their "Max" Titan X settings.

What I'm guessing is that the actual playable settings were almost identical between the 980 TI and Titan X. I mean they raised the IQ and the FPS was still within 2 FPS. So I'd expect at lower IQ those differences would be closer. We don't know because they didn't test that, they instead raised the IQ to "unplayable" settings and benched that instead.
 
So since the OC'd 980 TI can run Witcher 3 better than a Titan X and at higher IQ, would you still call the Titan X a good value for its price, or is it an overpriced underperformer?

Nobody ever said the Titan X is actually a price/performance value card, especially since the launch of the 980 Ti, and that was pointed out several times by [H]... However, before the 980 Ti, the Titan X had a market; even at its price, it offered a massive performance increase over the vanilla 980. So what's your point? Will you say that the Fury X is a price/performance value card? Your own screenshot shows how the Fury X is smashed badly by the competition's card at the same price. Now, that's what I call a BAD value for its price.

For multi-GPU configurations at 4K I would still go with OC'd Titan Xs, as that card still scales better with an OC... So yes, even that card has a better market than the Fury.
 
I'm agreeing with you.

I don't understand why they tested the Apples to Apples at higher settings than their "Max" Titan X settings.

What I'm guessing is that the actual playable settings were almost identical between the 980 TI and Titan X. I mean they raised the IQ and the FPS was still within 2 FPS. So I'd expect at lower IQ those differences would be closer. We don't know because they didn't test that, they instead raised the IQ to "unplayable" settings and benched that instead.

A better guess based on available evidence would be:

Lowered to 980Ti best playable settings, the Titan X scores low-to-mid-60s FPS to the 980Ti's mid-50s.

Raised to Titan X best playable settings, the 980ti scores mid-40s FPS to the Titan X's low 50s.

As to the apples-to-apples settings, they have typically maxed them out on the high-end cards as a kind of worst-case-scenario torture test. It has been this way for quite some time. I do not recall if they explained their rationale for doing so in the past, but they have been consistent. It's not really the focus of their testing methodology anyway.
 
A better guess based on available evidence would be:

Lowered to 980Ti best playable settings, the Titan X scores low-to-mid-60s FPS to the 980Ti's mid-50s.

Raised to Titan X best playable settings, the 980ti scores mid-40s FPS to the Titan X's low 50s.

As to the apples-to-apples settings, they have typically maxed them out on the high-end cards as a kind of worst-case-scenario torture test. It has been this way for quite some time. I do not recall if they explained their rationale for doing so in the past, but they have been consistent. It's not really the focus of their testing methodology anyway.

Well, Kyle deleted my post, but you can see in their latest review that an OC'd 980 Ti can run Witcher 3 better than the Titan did: higher IQ and higher FPS (full HairWorks, 4xAA).
 
Well, Kyle deleted my post, but you can see in their latest review that an OC'd 980 Ti can run Witcher 3 better than the Titan did: higher IQ and higher FPS (full HairWorks, 4xAA).

But then, the Titan X OCs as well. But you knew that. And it's been pointed out to you multiple times in both threads even if you didn't know it. And so around the mulberry bush we go.
 
But then, the Titan X OCs as well. But you knew that. And it's been pointed out to you multiple times in both threads even if you didn't know it. And so around the mulberry bush we go.

And I've only seen benchmarks showing that 980 Tis OC better than Titans (better coolers help), and the factory settings were faster too, not just the extra OC.
 
Well, Kyle deleted my post, but you can see in their latest review that an OC'd 980 Ti can run Witcher 3 better than the Titan did: higher IQ and higher FPS (full HairWorks, 4xAA).

Also remember these were different game versions, with a couple of optimizations, especially in the graphics settings department.
 
Also remember these were different game versions, with a couple of optimizations, especially in the graphics settings department.

Between this week and last? OK, but I doubt it makes a big difference. I mean, there are tons of examples of a custom 980 Ti with a factory OC being faster than a Titan X. I don't get why so many people are trying to say it's not.
 
And I've only seen benchmarks showing that 980 Tis OC better than Titans (better coolers help), and the factory settings were faster too, not just the extra OC.

So, wait, are you saying that a non-reference 980Ti is a better value than a reference Titan X? I am shocked, sir, shocked! [H]ardOCP has claimed the exact opposite all along! Let them be executed forthwith!

Of course, the Fury X is still a rotten value, and the Fury Nano is a joke in poor taste. But let us not cast our gaze into the darkness of that abyss! No, we must anon to yonder pile of FUD, for all our FUD has not yet been chewed sufficiently to digest!
 
So, wait, are you saying that a non-reference 980Ti is a better value than a reference Titan X? I am shocked, sir, shocked! [H]ardOCP has claimed the exact opposite all along! Let them be executed forthwith!

Of course, the Fury X is still a rotten value, and the Fury Nano is a joke in poor taste. But let us not cast our gaze into the darkness of that abyss! No, we must anon to yonder pile of FUD, for all our FUD has not yet been chewed sufficiently to digest!

Oh, I'm sorry, I forgot you can't think rationally about anything Nvidia-related, nor accept that the pricing on the Titan X is completely overblown.

I figured we could talk about hardware here, but apparently that's not the case for you.

Also, I was pointing out that it wasn't a better value; that's been known all along. But it actually performs better as well.
 
Back on topic please. Go make your own thread about your perceived Titan X value.
 
As to the apples-to-apples settings, they have typically maxed them out on the high-end cards as a kind of worst-case-scenario torture test. It has been this way for quite some time. I do not recall if they explained their rationale for doing so in the past, but they have been consistent. It's not really the focus of their testing methodology anyway.


First and foremost this is the way it has been done for a good while. Nothing new to see here in terms of how we test.

We have discussed this testing scenario from time to time and have come back to the higher-settings criteria for apples-to-apples. (Actually, we had a meeting about it just a couple of months ago.) If you want real-world gaming results, that is what our GPU reviews focus on. No matter how the cookie crumbles in apples-to-apples, one GPU is going to be over-stressed and another under-stressed depending on the settings. As it stands now, we wanted to show the data under very stressful GPU settings.
 
I don't understand why comparisons are made to stock cards, when most Nvidia buyers get custom-cooled cards from AIB partners, many of which are much faster out of the box.

There are so many variants of cards from AIBs with factory overclocks; which one do you use? How do you keep things fair? How do you compare fairly on price when the price will be a premium over MSRP and vary so much? It just introduces too many variables. You have to draw the line somewhere. Using stock frequencies and stock MSRP pricing keeps everything on the same page and makes it easy to form a baseline. You can extrapolate from that baseline and add performance, frequency, and price if you wish. There isn't enough time to include every variable.
 