Far Cry 4 Video Card Performance Review @ [H]

Far Cry 4 Video Card Performance Review - We play Far Cry 4 on no less than twelve different GPUs for this in-depth look at which graphics settings are playable in Far Cry 4. We will talk about playable settings and show apples-to-apples comparisons so you know what to expect in this game and what upgrading your video card may do for you.
 
Good review. Sad to see my GTX 780 showing its age already. I haven't bought the game yet; perhaps I will hold off until I upgrade.
 
Good review. Sad to see my GTX 780 showing its age already. I haven't bought the game yet; perhaps I will hold off until I upgrade.

If I had to guess, the main reason the game performs so badly on the 780 is VRAM. I know GPU-Z reports 3.75GB used on my 290.

[H], any plans to look at CPU and RAM usage?
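
In case anyone wants to sanity-check the VRAM numbers themselves: below is a minimal C++ sketch, assuming Windows 10+ and DXGI 1.4, that prints the calling process's dedicated-VRAM usage via IDXGIAdapter3::QueryVideoMemoryInfo. Note that GPU-Z reads the adapter-wide total through vendor APIs, so this per-process figure is only a rough proxy for what it shows.

// Minimal sketch: per-process dedicated-VRAM usage via DXGI 1.4 (Windows 10+).
// This is an illustration, not how GPU-Z itself works.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter> adapter;
    if (FAILED(factory->EnumAdapters(0, &adapter))) return 1;   // first GPU only

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;                // needs DXGI 1.4

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    // LOCAL = dedicated VRAM; NON_LOCAL would be shared system memory.
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL,
                                              &info))) return 1;

    printf("VRAM in use by this process: %.2f GB (budget %.2f GB)\n",
           info.CurrentUsage / (1024.0 * 1024.0 * 1024.0),
           info.Budget / (1024.0 * 1024.0 * 1024.0));
    return 0;
}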
 
I wonder how the 960 will perform. Seeing the 780's results, I don't think it will be able to run the game fully maxed out at 1080p.
 
I'm pretty sure I read a post on Beyond3D that NVIDIA uses a form of tessellation in their GameWorks Godrays that performs six times worse on AMD hardware, which would explain the huge drop.
 
I'm pretty sure I read a post on Beyond3D that NVIDIA uses a form of tessellation in their GameWorks Godrays that performs six times worse on AMD hardware, which would explain the huge drop.

Yup, and I'm pretty sure GameWorks is behind the broken CrossFire support. Far Cry 3, which was a Gaming Evolved title, performed and scaled really well on both AMD and Nvidia hardware. Switch sponsors and suddenly the game performs horribly for AMD, even though it uses the exact same engine (Dunia 2); the only difference is GameWorks.
 
Any chance of running older high-end cards such as a 7970? Some of us still have those. It would be nice to see what improvement there is between that and the latest and greatest. I know there will be an improvement, but will it be enough to justify buying the new stuff?
 
Any chance of running older high-end cards such as a 7970? Some of us still have those. It would be nice to see what improvement there is between that and the latest and greatest. I know there will be an improvement, but will it be enough to justify buying the new stuff?

The 280X is a rebadged 7970 GHz Edition.
 
Any chance of running older high-end cards such as a 7970? Some of us still have those. It would be nice to see what improvement there is between that and the latest and greatest. I know there will be an improvement, but will it be enough to justify buying the new stuff?

You know the 280X is just a rebranded 7970, right?
 
Far Cry 3 is still a crashy mess.

Looks like I will be waiting until FC4 hits the bargain bin like I did with FC3.

If they would start releasing games that were actually finished (minimal bugs), I would be more likely to buy them soon after they come out.

That being said, I already bought Star Citizen, which is in alpha/beta... but it is not being put out by one of these lame publishing companies that rush games out before they are finished and then have to patch them multiple times before they are really even playable.
 
I found this game to be extremely picky with my overclocked Strix 980; it would crash and dump to the desktop when I had no issues with any of my other games. Even after I lowered the GPU clock by 100MHz, the game refused to run stable unless I went back to stock clocks. I don't run the Afterburner overlay and I don't use GeForce Experience either. The game also flat out refuses to give 16:10 monitor users their full screen resolution; it puts on widescreen bars unless you run stretched 16:9, which looks utterly stupid.

I've got so many other games vying for my time that, even though this game is fun when it actually runs, I've given up on Ubisoft making stable games.

At least I got it for free, but what a waste still...
 
I'm pretty sure I read a post on Beyond3D that NVIDIA uses a form of tessellation in their GameWorks Godrays that performs six times worse on AMD hardware, which would explain the huge drop.

Yup, and I'm pretty sure GameWorks is behind the broken CrossFire support. Far Cry 3, which was a Gaming Evolved title, performed and scaled really well on both AMD and Nvidia hardware. Switch sponsors and suddenly the game performs horribly for AMD, even though it uses the exact same engine (Dunia 2); the only difference is GameWorks.

And can you explain how on earth a 280X can perform better than a GTX 780?
 
And can you explain how on earth a 280X can perform better than a GTX 780?

All Kepler GPUs did pretty poorly in this game in my testing, perhaps underperforming their true potential, for whatever buggy reason.

For the features performance article I am going to compare and see whether the 780 takes a bigger performance hit from any graphics option than Maxwell does.
 
And can you explain how on earth a 280X can perform better than a GTX 780?

Hmm... this is interesting:

Same engine, with generic optimizations done with AMD's help back for Far Cry 3. Now patch in GameWorks, with Nvidia making sure the game works better on their new Maxwell cards for their bundle (and CrossFire support disabled, because they can).

Now... let's say this may have taken a lot longer than originally planned, and little or no Kepler code could be optimized in time for release. If we take the original [H] review covering Far Cry 3 and the GTX 780 here:

http://www.hardocp.com/article/2013/11/07/nvidia_geforce_gtx_780_ti_video_card_review/4#.VK35r3uWY84

we can see the 780 was always a relatively slower performer. Sure, some options have changed between the two games, but the underlying engine is the same and we can see some similarities. Actually, it may not be that the 780 is worse than the 280X; it looks like it's about the same, performance-wise, as it was in Far Cry 3.

Now take the [H] review of Far Cry 3 on the 7970 (aka R9 280X) here:

http://www.hardocp.com/article/2012/12/17/far_cry_3_video_card_performance_iq_review/4#.VK3-p3uWY84

The 7970 was always a beast on the Dunia 2 engine, considering the rather large hardware difference with the 780.

Of course this is all conjecture based on performance figures I have gathered; still, more closely controlled testing (i.e., same screen resolution) between the two games could be very interesting indeed.
 
You know... I have never decided to outright refuse to buy anything from a game developer before, but Ubisoft is not doing themselves any favors by releasing buggy games left and right. Where is CrossFire support in both Far Cry 4 and AC: Unity?
 
I think the comments at the end are spot on: no excuses for buggy games that don't work right on day one, no excuses for SLI and CrossFire not working from day one, and gamers are getting wise, not pre-buying or even buying for weeks or months after release.
Hitting these companies in the wallet is the only answer.
 
Looks like my 680s won't cut it for this game at 1440p. Not too surprising, though. Since the game is buggy I guess it's a good thing.
 
Far Cry 3 is still a crashy mess.

Looks like I will be waiting until FC4 hits the bargain bin like I did with FC3.

If they would start releasing games that were actually finished (minimal bugs), I would be more likely to buy them soon after they come out.

That being said, I already bought Star Citizen, which is in alpha/beta... but it is not being put out by one of these lame publishing companies that rush games out before they are finished and then have to patch them multiple times before they are really even playable.
The only crashing problem I had was the gliders. They would be upside down as soon as I got on and crash into the ground. lol

It did suck though. It was Far Cry 3 with different characters. Oh, and riding elephants. :rolleyes:
 
Any game like this that has broken SLI and CrossFire should be called out and chastised.

As a PC gamer, I honestly do not have an issue with waiting for a better game truly built for the PC gamer, rather than some shitty console port that identifies the publisher as a half-assed thief.

This is why I keep coming back to the [H] - good work guys.

I really have to wonder what's going on inside that godray feature. The 290X has a good bit more raw shader power than the 980; there's no excuse for it to dogleg the game like that on AMD hardware.
 
So, would PS4 owners like me be better off with the console version? I was unsatisfied with Watch Dogs on PC and would have preferred to play it on PS4. The only reason I had the PC version was that it came free with my GTX 780 purchase.
 
This is a case where taking the bribe from Nvidia actually hurts the dev. There is no way in hell I'm going to buy this game when it performs so poorly on AMD hardware.
 
Don’t be a jackass and go for the carrot-and-stick treatment with pre-buy "extras." Let’s keep our money in our pockets till we know the product is worth the price.

Right on. I learned my lesson with the Battlefield 4 debacle. I didn't play that damn game for 7-8 months after launch, until the new netcode was released. I am really hoping DICE learned its lesson on this as well (looks like it, with them pushing Hardline back), because if DICE botches Star Wars: Battlefront the way they did BF4, we riot! :mad:
 
Wow. A huge amount of work there, thank you.

I did pick up a typo:

The Bottom Line

Far Cry 4 has far more issues than Far Cry 3 had in terms of bugs, game mechanic issues, visual quality issues, hardware support, and performance issues. It really doesn't make much sense since this game is an evolution of the Far Cry 3, and the game itself is very similar in terms of mechanics. Many gamer's are calling this "Far Cry 3.5" because it is so similar in gameplay.

That should be 'Many gamers are calling...'

I was slightly surprised that when you tested SLI, you didn't test 3-way or 4-way SLI as well, but I guess you didn't have time. Maybe in a follow-up article?
 
Yeah, GameWorks for ya: a black box that's very anti-consumer. I have an nVidia card, but I didn't buy this game, or apparently any game from Ubisoft these days, in protest of such shady business practices.
 
Yeah, GameWorks for ya: a black box that's very anti-consumer. I have an nVidia card, but I didn't buy this game, or apparently any game from Ubisoft these days, in protest of such shady business practices.

You must have a very short list of games. Most big releases will be using GameWorks, and nVidia made it very open; most of it works great on AMD/PS4/Xbone.

You can say the same thing about AMD IP.
 
You must have a very short list of games. Most big releases will be using GameWorks, and nVidia made it very open; most of it works great on AMD/PS4/Xbone.

You can say the same thing about AMD IP.

My Steam list is in the hundreds; there's no shortage of games to play.

I'm not sure what AMD IP you're talking about that can break the competition's products like this. Their market strategy is more about keeping the gaming market from segmenting too much, so they don't shit their own bed; they generally go for open standards like OpenCL, etc.

I don't like their current product lines, but I respect that market approach. Then again, they're the smaller piece of the pie (30-40%), so who knows what they would do in nVidia's shoes.

Mantle may be the closest I can think of, and it accelerated DX12 plans for removing CPU overhead due to API calls, so I'm glad it's a niche that triggered a reaction from big, slow MS.

Some other things get on my nerves when nVidia sponsors games, things that are even to the detriment of its own customers, just to look good on a benchmark:
http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Why did Crytek decide to tessellate the heck out of this object that has no apparent need for it?
Yet the flat interior surfaces of this concrete slab, which could be represented with just a handful of large triangles, are instead subdivided into thousands of tiny polygons.
That's right. The tessellated water mesh remains in the scene, apparently ebbing and flowing beneath the land throughout, even though it's not visible.
Obviously, that's quite a bit of needless GPU geometry processing load. We'd have expected the game engine to include a simple optimization that would set a boundary for the water at or near the coastline, so the GPU isn't doing this tessellation work unnecessarily.

I like my video card, but I don't support this sort of business practice.
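
Reading that last point, the coastline optimization the article hints at could be sketched in a few lines. Everything below is a hypothetical illustration (the tile struct, the names, and the precomputed minimum terrain height are all made up, not anything from CryEngine or Dunia): skip the tessellated draw for any water tile whose terrain never dips below sea level.

// Hypothetical sketch of the suggested coastline culling: only submit water
// tiles that could actually be visible, i.e. tiles where the terrain drops
// below sea level somewhere. Illustrative code, not engine source.
#include <vector>

struct WaterTile {
    float minTerrainHeight;  // lowest terrain sample over this tile, precomputed
    // ... GPU buffer handles for the tile's tessellated mesh would live here
};

// Returns the subset of tiles worth tessellating and drawing this frame.
std::vector<const WaterTile*> CullSubmergedTiles(
        const std::vector<WaterTile>& tiles, float seaLevel) {
    std::vector<const WaterTile*> visible;
    for (const WaterTile& t : tiles) {
        // If even the lowest terrain point over the tile sits above sea level,
        // the water surface is buried under land and never shades a pixel;
        // skipping the draw avoids all of its tessellation work.
        if (t.minTerrainHeight < seaLevel)
            visible.push_back(&t);
    }
    return visible;
}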
 
This review is so inconsistent. It says the highest playable settings were,
for the R9 290X:
Quality - Ultra, Shadow - Ultra, AO - SSBC, Godrays - OFF and Fur - ON
and the R9 290:
Quality - Very High, Shadow - Very High, AO - SSBC, Godrays - OFF and Fur - ON

But if you look at apples-to-apples with Quality - Ultra, Shadow - Ultra, AO - SSBC, Godrays - Volumetric and Fur - ON, both the R9 290X (avg 51.7 fps, min 41 fps) and the R9 290 (avg 51.7 fps, min 41 fps) perform close to the GTX 980 (avg 54.3 fps, min 44 fps) and GTX 970 (avg 50.6 fps, min 42 fps), with the R9 290X edging out the GTX 970. It defies logic to call the GTX 970 better performing than the R9 290X when your own charts show otherwise.

BTW, this stuttering issue you talk about: did it vanish miraculously the minute you went down to Godrays - OFF on the R9 290X and R9 290? :rolleyes:

I like my video card, but I don't support this sort of business practice.

Well said. Nvidia GameWorks is a black box, and the source code is available to licensees only under strict clauses not to share that code with AMD. AMD tech like TressFX is open and available in source code with no restrictions. Anybody who says GameWorks is open needs to look up the definition of open in the dictionary.

http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd
http://www.extremetech.com/gaming/1...opers-weigh-in-on-the-gameworks-controversy/1

http://community.amd.com/community/...014/09/23/tressfx-hair-cross-platform-and-v20
http://www.goldfries.com/downloadables/omega/AMD_Catalyst_Omega_Media_Presentation.pdf (see slide 42)

As they say, the proof is in the games. Almost every GameWorks game sees AMD cards suffer when GameWorks-exclusive features are turned on. In TressFX games there is no such discrepancy, with Nvidia cards continuing to perform very well.
 
I am playing it fine on a 295x1 (god damn it, fix CrossFire already) with the Ultra preset and HBAO+, hitting 45-60 fps. It surely could use some fine-tuning, though. I am just hoping CrossFire is working by the time I finish it.
 
I like my video card, but I don't support this sort of business practice.

Well, you are using their video card, so Nvidia's business practice worked on you, and by extension you support them.

Not sure how else to read that.
 
By open I just meant it works on multiple platforms.

Well, I am boycotting AMD until they give nVidia their XDMA tech. They have a shadowy, unfair grip on a system with less stutter. I'd use Mantle as an example, but I wouldn't take that for free.

Realize anything on the Unreal Engine uses GameWorks. Also games like The Witcher 3, Star Citizen, CoD, Batman, Assassin's Creed, etc. Sure, I'd rather everyone shared tech and it were a jolly happy world, but all companies are guilty of not doing it.

AMD isn't even optimized for Mantle in BF4; people had to turn it off. Their own tech! Blaming nVidia for AMD not running well with Godrays is like me blaming AMD for nVidia not running well on Mantle.
 
I don't have any doubts about the testing that was done, but I can tell you that my GTX 760 gets better performance than shown here. Not by a huge margin, but definitely more consistently at or above 50 fps than the graphs here show, and my game settings are set to either Very High or Ultra, nothing below that. In-game Vsync is ON.

I had the stuttering problem with the initial release, but after the 1.4 patch it went away completely. Also, after the game fully loads, if I ALT-TAB to the desktop and then back into the game, I get a significant boost in fps for the rest of the gaming session. Not sure why, but I used to have to do the same thing with Assassin's Creed 4: Black Flag.

For the record, I put a buttload of hours into Watch Dogs and never had any stuttering issues with that game at all. Max details at 50+ fps all the time.

Machine specs (no overclocking):

Core i7 3770
nVidia GTX 760 with latest WHQL drivers
16GB DDR3
Samsung 840 Pro 500GB SSD
Windows 7 64-bit
 
Am I the only one who wonders if major hardware vendors work with major game developers to ensure that previous-gen graphics hardware suffers disproportionately more than it should? (Like rendering water under levels?) Surprised not to see the 780 Ti or Titan represented. Also, my new tin foil hat is very comfortable!
 
Am I the only one who wonders if major hardware vendors work with major game developers to ensure that previous-gen graphics hardware suffers disproportionately more than it should? (Like rendering water under levels?) Surprised not to see the 780 Ti or Titan represented. Also, my new tin foil hat is very comfortable!

Wonder? Well, not so much gimping older hardware as the competition's hardware... For instance, my previous-gen 280X (7970) CF cards run DA:I beautifully, but couldn't run any of these Ubi$oft titles acceptably.
 
It is kind of ironic that Watch_Dogs runs more or less flawlessly on 900-series hardware but tanks on anything older. Far Cry 4 and AC: U didn't even seem to get the new-hardware support right, though.
They run better on new hardware, but a long way from flawless.
 
Good review, guys.

Everyone on here pointing at GameWorks and saying it is the cause of poor AMD performance: I think that's a load of bullshit. Where's your proof? I think this game's issues are Ubisoft's fault. It launched with issues on both sides, with Nvidia's issues getting fixed a little faster, probably because it's a GameWorks title and they are likely in there trying to help get this mess straightened out. That is not proof that Nvidia is sabotaging AMD performance.

If all the devs on GameWorks titles can see the source code that Nvidia closely guards, surely it would have come to light if there was anything in it that singled out AMD cards for shitty performance. The performance appears to fall in line with the cards' real performance: R9 290 > 780, as expected, since the 780 launched 5-2013 and the R9 290 launched 11-2013. If GameWorks could sabotage AMD performance, it would not be R9 290 > 780; it would be closer to R9 290 = 780. Given that there's roughly a 10% performance difference between each successive GPU family, it's probably about right. Even the 280X beats the 780... all of your arguments (baseless speculations) are null and void.

I'm not saying Nvidia is a bunch of angels, and I sure as hell wouldn't say that about AMD either.

Put the blame for this game's mess where it belongs: a greedy developer that had to get it out for the Christmas shopping holiday.

At least they seem to be trying to get all of the issues fixed. Myself, I have FC4 installed but haven't even launched it once yet. My six-year-old X58 really is due for a nice big upgrade =)
 
Good review. Sad to see my GTX 780 showing its age already. I haven't bought the game yet; perhaps I will hold off until I upgrade.

If I had to guess, the main reason the game performs so badly on the 780 is VRAM. I know GPU-Z reports 3.75GB used on my 290.

[H], any plans to look at CPU and RAM usage?

It's not, though. Even running the game at low resolution and textures, it stutters like a mess.

Plus, plenty of people on Maxwell hardware are reporting the same stuttering issue.

I even tried loading the game's assets onto a RAM disk and turning down the details. It still stutters like mad on my GTX 780 SLI setup.

At this point I'm holding off on playing a game I paid $60 for, because it's nowhere close to a fun experience. It's really disappointing and reminds me of some older THQ games in terms of how buggy it is.
 
Ubisoft: console first, PC LAST

Still, even though I am about four years behind the curve in hardware with my most up-to-date system (I know, I'm oft not [H]ard like some of you gents can afford to be, but I'm an enthusiast in spirit at least) and won't be able to get a very good experience, I still keep tabs on things, and it's good to know there are places like [H] and DF that dig deep into the technical details so I don't get burned by greedy publishers. Cheers, guys.
 