Most of that garbage there is mobile. The GTX 760, GTX 660, and 7900 series top the list for desktop. If you only have $100 to spare, get a used HD 7900 or GTX 770. Otherwise, you won't even have the muscle for 1080p.
The only disappointing card on performance in The Witcher 3: Wild Hunt was the Radeon R9 Fury X. For a $649 video card it was nowhere near the level it should be. The GeForce GTX 980 Ti offers a much better experience, close to TITAN X level performance. Among AMD video cards, compared to the competition, the Radeon R9 Fury, Radeon R9 390X, Radeon R9 390 and Radeon R9 380 are worth a second look for the Witcher 3 gameplay experience if you are shopping around.
On the NVIDIA side, nothing at all beats the TITAN X for performance in The Witcher 3: Wild Hunt. Yeah, it costs $999, and at that price it should do the best, and it proves that it does. AMD has nothing that can compete with it in this game. It stands in a league of its own; if you simply want the fastest single-GPU experience in this game, go with the TITAN X.
How come you guys don't knock the $999 price tag on the Titan X?
You get a whole 2 FPS more than the $650 980 Ti, as pointed out in your latest review:
http://www.hardocp.com/images/articl...sJrD_6_1_l.gif
$349 / 2 ≈ $175 per FPS.
Yes, I know the Titan came out first, but people reading the review are looking to buy now, not months ago.
How can you recommend it at all?
The Fury X gets 6 FPS less at the highest settings you tested, and it is $350 cheaper and watercooled.
Somehow the $350 for 2 FPS going from the 980 Ti to the Titan X is worth it, but the 4 FPS difference between the 980 Ti and the watercooled Fury X makes it unworthy?
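Spelled out, here's the math I'm doing, as a rough sketch using the prices and FPS deltas quoted in this thread, nothing independently measured:
[code]
# Back-of-the-envelope marginal cost per frame, using the thread's figures.
titan_x_price = 999       # USD, as quoted above
gtx_980_ti_price = 649    # USD, as quoted above
titan_x_fps_lead = 2      # FPS over the 980 Ti in the linked graph

premium = titan_x_price - gtx_980_ti_price
print(f"${premium} premium / {titan_x_fps_lead} FPS = "
      f"${premium / titan_x_fps_lead:.0f} per extra FPS")
# -> $350 premium / 2 FPS = $175 per extra FPS
[/code]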
The conclusion is what I'm basing that on:
You called the Fury X a disappointing card on performance while it was 6 FPS lower than the $350 more expensive Titan X, which "proves that it was the best." You then stated, "If you simply want the fastest single-GPU experience in this game go with the TITAN X."
I know you have SLI Titan Xs in your home machine, so enjoy using them, but they are a bad buy for the money. And if you can call the Fury X a bad buy, why can't you call the Titan X one too, when it is objectively much worse in performance per dollar? A third-party 980 Ti with a stock OC is faster and cheaper.
The only disappointing card on performance in The Witcher 3: Wild Hunt was the Radeon R9 Fury X. For a $649 video card it was nowhere near the level it should be. The GeForce GTX 980 Ti offers a much better experience, close to TITAN X level performance. Among AMD video cards, compared to the competition, the Radeon R9 Fury, Radeon R9 390X, Radeon R9 390 and Radeon R9 380 are worth a second look for the Witcher 3 gameplay experience if you are shopping around.
Second, the GTX 980 Ti offers amazing performance and experience for $649. Not a cheap price, but it offers near-TITAN X performance at a much lower price. Its value is greater than the TITAN X's because of this. It does beat the Radeon R9 Fury X and allows a better Witcher 3: Wild Hunt gameplay experience.
Seems kinda obvious to me, and they point out how close the Ti is within the very same quote... Sometimes people just like to get butthurt over what they perceive rather than what's actually there. I'm running two R9 290s, btw, and ran a 6950 before those; I couldn't be less biased, IMO (though I'd probably buy NV if I were buying today).
I never said anything about a feud. Though I do find it odd that the $50-100 cheaper 390X got a silver award compared to the 980, even though it was just as fast.
Again, I said nothing about a disagreement with AMD. You aren't giving the Fury X any credit for having a watercooler built in; that would be an extra $100 on a 980 Ti.
I see a 2 FPS difference between the 980 Ti and the Titan X with HairWorks and everything else enabled in apples-to-apples:
I'm going off the actual gameplay you tested. I still can't fathom how $350 is worth the difference. Why not just OC the 980 Ti and save $350?
You have the exact same quality settings on the cards, but one gets 2 FPS more for $350 and it's not called a super overpriced luxury edition? Would the Fury X have to be free to not be a bad deal, since it's the same price as the 980 Ti but 4 FPS less ($350 / 2 × 4 = $700)?
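That $700 figure is just the Titan X's own implied per-FPS rate turned around on the Fury X; sketched out with the same quoted numbers:
[code]
# Pricing the Fury X's deficit at the rate implied by the Titan X premium.
dollars_per_fps = 350 / 2     # $175/FPS, from the Titan X vs. 980 Ti math above
fury_x_fps_deficit = 4        # FPS behind the 980 Ti at the tested settings

implied_discount = dollars_per_fps * fury_x_fps_deficit
print(f"{fury_x_fps_deficit} FPS x ${dollars_per_fps:.0f}/FPS = "
      f"${implied_discount:.0f} implied discount")
# -> 4 FPS x $175/FPS = $700 implied discount
[/code]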
I read the review and, as usual, it was very well written and informative, but I have to question your "Highest Playable Settings" results. If I want to max out all settings, what resolution should I play at? I care more about having all the bells and whistles on, and would rather play at 1080p than play at a higher resolution where I have to turn stuff down or off.
So can you run The Witcher 3 at 1080p with everything maxed, including HairWorks, and keep acceptable frame rates?
We did not experience any difference on Geralt. When you played Tomb Raider you immediately recognized the difference with and without TressFX; it was night and day. That is not the case as far as Geralt goes.
However, it is the case as far as animal fur goes. There is a big visual difference on horses' manes and on creatures that have fur, such as wolves. You can tell the difference instantly on creatures, and it does look great. The problem is the performance drain needed to run HairWorks on both Geralt and creatures at the same time. Performance tanks, and you have to lower in-game settings.
I never said anything about a feud. Though I do find it odd that the $50-100 cheaper 390X got a silver award compared to the 980, even though it was just as fast.
Again, I said nothing about a disagreement with AMD. You aren't giving the Fury X any credit for having a watercooler built in; that would be an extra $100 on a 980 Ti.
I see a 2 FPS difference between the 980 Ti and the Titan X with HairWorks and everything else enabled in apples-to-apples:
I'm going off the actual gameplay you tested. I still can't fathom how $350 is worth the difference. Why not just OC the 980 Ti and save $350?
You have the exact same quality settings on the cards, but one gets 2 FPS more for $350 and it's not called a super overpriced luxury edition? Would the Fury X have to be free to not be a bad deal, since it's the same price as the 980 Ti but 4 FPS less ($350 / 2 × 4 = $700)?
This is the false equivalence upon which your whole pointless gripe is based. [H] includes apples-to-apples as a point of interest and reference for those who are so inclined, but their conclusions are based primarily on highest playable settings.
Their comments at the end are based on the fact that the Titan X is able to play with a setting enabled that the 980 Ti and Fury X can't match. A 2 FPS difference at settings where all three cards are not smoothly playable is meaningless. The ability to turn on more eye candy while maintaining playable FPS is meaningFUL. So the review points that out.
Then you drag in the BS about "Oooooh, but the Fury X comes with a 'free' watercooler!" Really? You want to go there? Okay, let's go there. It has one because it would be unstable without it. The 980 Ti (and the Titan X) are stable on air cooling and highly overclockable on it to boot. The Fury X, with its "free" watercooler, can't overclock at all. If this test had been done at max overclocks on every card, it would have been total carnage for the Fury X, and maybe for several of the other AMD cards. Be thankful that they chose stock clocks. Not needing a watercooler makes the NVIDIA value proposition better, not worse.
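For anyone who still finds "highest playable settings" fuzzy, here is a minimal sketch of the idea. The preset names, FPS numbers, and 30 FPS floor are made-up placeholders, not [H]'s actual data or method:
[code]
# Sketch: walk presets from heaviest to lightest and keep the first one
# that clears a playability floor. All values below are hypothetical.
PLAYABLE_FLOOR = 30  # assumed minimum acceptable FPS

runs = [  # (preset, measured FPS), heaviest settings first
    ("Ultra + HairWorks All", 24),
    ("Ultra + HairWorks Geralt", 33),
    ("Ultra, HairWorks Off", 41),
]

def highest_playable(runs, floor=PLAYABLE_FLOOR):
    for preset, fps in runs:
        if fps >= floor:
            return preset, fps
    return None  # nothing playable at any preset

print(highest_playable(runs))
# -> ('Ultra + HairWorks Geralt', 33)
# A card that holds the floor at a heavier preset "wins", even when the raw
# FPS gap at some fixed (unplayable) setting is only 2 FPS.
[/code]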
If anything, with lowered AA/IQ the 980 Ti and Fury X should be closer to the Titan X's playable-settings performance.
The Fury X doesn't require a watercooler. Someone in the forums here unlocked their Fury to a Fury X and ran it air-cooled just fine.
I'm guessing the reasons they went with the waterblock were:
A) Added value (a watercooled 980 Ti is $100 more).
B) Tons of people hated the stock 290X cooler (third-party variants were much better) and they got lots of bad press about loud coolers. Now they get knocked for having a WC... They just can't catch a break.
Also, it was originally pitched to compete against the $999 Titan X; Nvidia released the 980 Ti right before the Fury X to try to kill its sales.
Also, are you going to tell me people should honestly buy a Titan X over a factory-OC'd, custom-cooled 980 Ti that costs $300-350 less? Because there are some very good custom coolers for the 980 Ti that make it faster, quieter, and cooler than the Titan X for a fraction of the price.
We found no video card in our testing that could maintain HairWorks on both Geralt and creatures at the same time with "Ultra" settings in The Witcher 3: Wild Hunt. It just isn't possible. Because of the sheer performance drain, we have to sacrifice the quality of fur on creatures. That leaves us with just Geralt, and the difference doesn't really benefit us there.
OK, so HairWorks on Geralt is meh, no difference. HairWorks on animals is more noticeable. Having both on at the same time cripples performance. But you can only have either "Geralt" or "Geralt + animals." It's a pity the devs didn't figure out that it would make a shitload more sense to allow setting HairWorks on just the animals, without Geralt at the same time. Give us three options:
a) only Geralt
b) only animals
c) a+b
Is it really so hard?
This pretty much sums up all of [H]'s video card reviews. We have 20 other sites that review the cards, all with the same settings on each. If you want an apples-to-apples match, go to one of those sites. If you want to see which video card is the most playable at your monitor's resolution and price point, come here.
Except on higher settings there was very little difference between the cards, so why not test those lower-quality Titan X "Max" settings on all the cards and let us decide if the few FPS of difference is worth it? Why does the apples-to-apples testing use higher settings than the max playable? If those tests tell us nothing and are worthless for comparison, why do them?
View counts?
I think a fair trade-off would be to find "max playable settings" for the fastest card and set that as the "100%" benchmark, like some sites do (see the sketch after this post). Then you could either choose "the best card" or decide for yourself whether the weaker card is worth it based on the frame drops.
Still, as the other guy said, what is the point of testing at settings that are unplayable?
Can I bust out the car analogy and have a towing test where the load is so heavy that no truck can maintain highway speeds? Which truck should I buy? Well, the Chevy got to 50 mph first with 10,000 lbs in the back, so I guess I'll get it since it is "the best."
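Here's roughly what that "fastest card = 100%" normalization would look like (card names aside, the FPS values are invented for illustration, not review data):
[code]
# Sketch of normalizing every card to the fastest one at a shared preset.
# FPS values are invented placeholders, not review numbers.
fps = {
    "TITAN X": 52,
    "GTX 980 Ti": 50,
    "R9 Fury X": 46,
}

baseline = max(fps.values())  # the fastest card defines 100%
for card, value in fps.items():
    print(f"{card}: {value / baseline:.0%} of the fastest card")
# -> TITAN X: 100%, GTX 980 Ti: 96%, R9 Fury X: 88%
[/code]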
So, since the OC'd 980 Ti can run Witcher 3 better than a Titan X, and at higher IQ, would you still call the Titan X good value for its price, or is it an overpriced underperformer?
I'm agreeing with you.
I don't understand why they tested the apples-to-apples at higher settings than their "Max" Titan X settings.
What I'm guessing is that the actual playable settings were almost identical between the 980 Ti and the Titan X. I mean, they raised the IQ and the FPS was still within 2 FPS, so I'd expect those differences to be even smaller at lower IQ. We don't know, because they didn't test that; instead they raised the IQ to "unplayable" settings and benched that.
A better guess based on the available evidence would be:
Lowered to the 980 Ti's best playable settings, the Titan X scores low-to-mid-60s FPS to the 980 Ti's mid-50s.
Raised to the Titan X's best playable settings, the 980 Ti scores mid-40s FPS to the Titan X's low 50s.
As to the apples-to-apples settings, they have typically maxed them out on the high-end cards as a kind of worst-case-scenario torture test. It has been this way for quite some time. I do not recall if they explained their rationale for doing so in the past, but they have been consistent. It's not really the focus of their testing methodology anyway.
Well, Kyle deleted my post, but you can see in their latest review that the OC'd 980 Ti ran Witcher 3 better than the Titan X did: higher IQ and higher FPS (full HairWorks, 4x AA).
But then, the Titan X OCs as well. But you knew that. And it's been pointed out to you multiple times in both threads even if you didn't know it. And so around the mulberry bush we go.
Well, Kyle deleted my post, but you can see in their latest review that the OC'd 980 Ti ran Witcher 3 better than the Titan X did: higher IQ and higher FPS (full HairWorks, 4x AA).
Also remember that those were different game versions, with a couple of optimizations, especially in the graphics settings department.
And I've only seen benchmarks showing that 980 Tis OC better than Titans (better coolers help), and the factory settings were faster too, not just the extra OC.
So, wait, are you saying that a non-reference 980Ti is a better value than a reference Titan X? I am shocked, sir, shocked! [H]ardOCP has claimed the exact opposite all along! Let them be executed forthwith!
Of course, the Fury X is still a rotten value, and the Fury Nano is a joke in poor taste. But let us not cast our gaze into the darkness of that abyss! No, we must anon to yonder pile of FUD, for all our FUD has not yet been chewed sufficiently to digest!
As to the apples-to-apples settings, they have typically maxed them out on the high-end cards as a kind of worst-case-scenario torture test. It has been this way for quite some time. I do not recall if they explained their rationale for doing so in the past, but they have been consistent. It's not really the focus of their testing methodology anyway.
I don't understand why comparisons are made to stock cards, as most Nvidia buyers get custom-cooled cards from AIB partners, many of which are much faster out of the box.