Far Cry 4 Video Card Performance Review @ [H]

Well, you are using their video card; so nvidia's business practice worked on you, and by extension you support them.

Not sure how else to read that.

 
Good review guys.

Everyone on here pointing at GameWorks and saying that it is the cause of poor AMD performance: I think that's a load of bullshit. Where's your proof? I think this game's issues are Ubisoft's fault. It launched with issues on both sides. Nvidia's issues are getting fixed a little faster, probably because it's a GameWorks title and they are likely in there trying to help get this mess straightened out. That is not proof that Nvidia is sabotaging AMD performance. If all the devs on GameWorks titles can see the source code that Nvidia closely guards, surely it would have come to light if there was anything in it that singled out AMD cards for shitty performance. The performance appears to be falling in line with the cards' real performance... R9 290 > 780. To be expected: the 780 launched 5-2013, the R9 290 launched 11-2013. If GameWorks could sabotage AMD performance, it would not be R9 290 > 780; it would be closer to R9 290 = 780. Given there's roughly a 10% performance difference between each successive GPU family, the result is probably about right. Even the 280X beats the 780... all of your arguments (baseless speculation) are null and void.

I'm not saying Nvidia is a bunch of angels. And I sure as hell wouldn't say that about amd either.

Put the blame for this game's mess where it belongs: a greedy developer that had to get it out in time for the Christmas shopping season.

At least they seem to be trying to get all of the issues fixed. Myself, I have FC4 installed but haven't even launched it once yet. My six-year-old X58 rig really is due for a nice big upgrade =)

There is no " proof", but there is a few things to point out. Fc4 uses the same engine as fc3. How could fc3 have no problems with cf, but fc4 can't run cf?

I think the reason the 780 tanks is because of the lack of vram. I would like to see a 780 with 6gb tested.
 
There is no " proof", but there is a few things to point out. Fc4 uses the same engine as fc3. How could fc3 have no problems with cf, but fc4 can't run cf?

I think the reason the 780 tanks is because of the lack of vram. I would like to see a 780 with 6gb tested.

I had trifire 7970s after the microstutter fix and FC3 played worse with three GPUs than one. That was probably around a year after FC3 launch. Did they really fix that?

Not to mention SLI was broken for months... I don't buy the conspiracy crap. I think Ubi just needs to clean up their act. Hell, I was excited for Furworks and that still isn't complete.

FC4 has been running flawlessly for me for about a month. Kids love riding the elephants around :)
 
I had trifire 7970s after the microstutter fix and FC3 played worse with three GPUs than one. That was probably around a year after FC3 launch. Did they really fix that?

Not to mention SLI was broken for months... I don't buy the conspiracy crap. I think Ubi just needs to clean up their act. Hell, I was excited for Furworks and that still isn't complete.

FC4 has been running flawlessly for me for about a month. Kids love riding the elephants around :)

Microstutter was a problem with the 7xxx series cards; that had nothing to do with FC3. Even given that, it did work.

AMD says Ubisoft needs to change something in the game. Anyone have any idea what needs to be changed?
 
Well, you are using their video card; so nvidia's business practice worked on you, and by extension you support them.

Not sure how else to read that.

Yeah, it's a tough situation. I was building a silent gaming PC and power consumption was the number one concern, and at the time nVidia was the way to go for that goal. Hopefully next time I'm in the market the choices won't be so limited.
 
After this latest iteration of GameWorks for Ubisoft games, I'm convinced that Nvidia should terminate its relationship with this company. This is what, the third big AAA game from them that's been a swing and a miss with this supposed vendor-specific optimization program? Given how PC gamer public opinion has swung so hard against Ubi, maybe it's for the best if Nvidia just cut their losses on these guys.

Oh well, at least my 290 doesn't have much billing here so it'd likely meet my bottomed out performance expectations.
 
There is no " proof", but there is a few things to point out. Fc4 uses the same engine as fc3. How could fc3 have no problems with cf, but fc4 can't run cf?

I think the reason the 780 tanks is because of the lack of vram. I would like to see a 780 with 6gb tested.

Why would you expect the 780 to tank due to lack of VRAM when the 280X has the exact same amount and performs better?
 
You must have a very short list of games. Most big releases will be using GameWorks, and nVidia made it very open, so most of it works great on AMD/PS4/Xbone.

You can say the same thing about AMD IP.



This. Both companies have their tech. GameWorks is just a dev toolkit that implements standard effects and enhancements. You don't need source access because all it really is is a bunch of prepackaged effects that save dev time instead of reinventing the wheel. Mantle, for example, is a completely different animal and truly is closed, only running on AMD cards and requiring a whole new renderer to support. It's not even comparable. AMD or nVidia performing poorly with GameWorks-type stuff is on their own shoulders, and their own individual responsibility.
 
You guys can't be shitting us...

After the entire article ranting against the horrible Ubisoft games of late, what terrible console ports they are, broken on the PC, how vendor lock-in with GameWorks destroys AMD performance and disables CF (requiring Ubi to actually patch the game to support it!!)...

You are gonna add this lame title to your suite of FEW games as a benchmark for users to decide on hardware performance or value? Are you kidding me?!

Does that make any sense? You are rewarding such a shit title and dev (Ubi) with unjust street cred by being featured on [H]. Think about it.

Watch Dogs is enough. Now you have TWO UBISHIT games in your limited deck of titles. /GG [H]
 
Yes, we used FC3, we will use FC4. It will also allow us to show over time if the game improves via patches and driver updates. It is a graphically demanding game. Despite it having issues, it is a popular and well-played game. The game is actually pretty fun. It certainly isn't the worst release of 2014.

There are many games I am interested in evaluating for adding to our gaming suite as well: Grand Theft Auto V, Witcher 3, Battlefield Hardline, Project Cars, Batman Arkham Knight, Dead Island 2, and more. I will look at each one and we will determine if we add it. I have an itch to get some old games out and some new games in.
 
I like that idea, Brent. Of particular interest to me will be GTA V and Battlefield Hardline (perhaps Project Cars, too, since it looks interesting and some early testing I've seen is showing it to be fairly demanding). It's important to highlight titles that a lot of people are playing, in my opinion. I mean, it's all well and good to find a niche title that is insanely demanding, but testing the actual games we're all playing or about to play is much more relevant :D.
 
You guys can't be shitting us...

After the entire article ranting against the horrible Ubisoft games of late, what terrible console ports they are, broken on the PC, how vendor lock-in with GameWorks destroys AMD performance and disables CF (requiring Ubi to actually patch the game to support it!!)...

You are gonna add this lame title to your suite of FEW games as a benchmark for users to decide on hardware performance or value? Are you kidding me?!

Does that make any sense? You are rewarding such a shit title and dev (Ubi) with unjust street cred by being featured on [H]. Think about it.

Watch Dogs is enough. Now you have TWO UBISHIT games in your limited deck of titles. /GG [H]

I disagree. This game, despite its bugs, is an amazing game. I and loads of people are having fun. Some of the bugs are hilarious and make it more fun, if that makes sense. Like riding an elephant across a river, hehe. Proper jokes.

Love it, but my only concern is that it only uses half of the power of my GPUs.
 
I disagree. This game, despite its bugs, is an amazing game. I and loads of people are having fun. Some of the bugs are hilarious and make it more fun, if that makes sense. Like riding an elephant across a river, hehe. Proper jokes.

Love it, but my only concern is that it only uses half of the power of my GPUs.

I haven't bought it yet (waiting for a good sale as it's not a must-play but definitely a "want to play" game for me), but it's a very popular game and rated well overall. Looking forward to it...

Re: AMD blaming Ubisoft for lack of crossfire functioning, who knows what the truth is? The blame game is played by most every company in most every industry. My two cents would be that since nvidia has SLI working in Far Cry 4 even with engine revisions between FC3 and its successor, Ubisoft probably doesn't have anything blocking standard AFR techniques like both AMD/nvidia use for their multi-GPU rendering, and therefore it would fall on AMD to make their cards compatible. However, you never really know if you're not on the inside there, so while I wouldn't say it's unlikely that it is a Far Cry 4 issue due to Ubisoft, I also wouldn't say it's unlikely it's an AMD issue either.
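
For anyone unfamiliar with AFR, here's a rough conceptual sketch in Python of how alternate-frame rendering splits work between two cards (just a toy illustration of the idea, not either vendor's actual driver code):

# Toy illustration of alternate-frame rendering (AFR): with two GPUs,
# even-numbered frames go to GPU 0 and odd-numbered frames to GPU 1,
# which is why AFR is largely transparent to the game as long as one
# frame doesn't depend on the previous frame's GPU-side results.
NUM_GPUS = 2  # hypothetical two-card setup

def afr_schedule(frame_count):
    """Return which frames each GPU would render under simple AFR."""
    schedule = {gpu: [] for gpu in range(NUM_GPUS)}
    for frame in range(frame_count):
        schedule[frame % NUM_GPUS].append(frame)  # alternate frames per GPU
    return schedule

for gpu, frames in afr_schedule(8).items():
    print("GPU %d renders frames %s" % (gpu, frames))
# GPU 0 renders frames [0, 2, 4, 6]
# GPU 1 renders frames [1, 3, 5, 7]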
 
Re: AMD blaming Ubisoft for lack of crossfire functioning, who knows what the truth is? The blame game is played by most every company in most every industry. My two cents would be that since nvidia has SLI working in Far Cry 4 even with engine revisions between FC3 and its successor, Ubisoft probably doesn't have anything blocking standard AFR techniques like both AMD/nvidia use for their multi-GPU rendering, and therefore it would fall on AMD to make their cards compatible. However, you never really know if you're not on the inside there, so while I wouldn't say it's unlikely that it is a Far Cry 4 issue due to Ubisoft, I also wouldn't say it's unlikely it's an AMD issue either.

That's totally backwards thinking. So you say it's the video card manufacturer's responsibility to make sure every shitty game released works on their hardware? That's crazy. It's exactly the opposite: it's the developer's responsibility to make damn sure their game runs properly on available hardware at the time of release, especially high-end hardware.
 
That's totally backwards thinking. So you say it's the video card manufacturer's responsibility to make sure every shitty game released works on their hardware? That's crazy. It's exactly the opposite: it's the developer's responsibility to make damn sure their game runs properly on available hardware at the time of release, especially high-end hardware.

Wouldn't it be Ubi's responsibility to make sure it works on DX11? If AMD/nVidia wants to implement proprietary tech that works outside of the normal DX11 routines it's on AMD/nVidia to make sure it works at least in a basic way. That would be my assumption. Note I don't work in that industry. To me I can extend this to being pissed they are not using GPU Physics. The quality could be so much better!! I am being robbed!

Proprietary hardware.. IMO totally on the manufacturer to make sure it's supported. They should have a mode (which could be AFR) that will replicate a single card's stability with at least some gains.
 
I haven't been able to get this game to even launch for me since I got it a few weeks ago - I've tried it on an NVIDIA machine and my AMD machine. Ubisoft is terrible. At least it was "free" with my SSD.
 
Wouldn't it be Ubi's responsibility to make sure it works on DX11? If AMD/nVidia wants to implement proprietary tech that works outside of the normal DX11 routines it's on AMD/nVidia to make sure it works at least in a basic way. That would be my assumption. Note I don't work in that industry. To me I can extend this to being pissed they are not using GPU Physics. The quality could be so much better!! I am being robbed!

Proprietary hardware.. IMO totally on the manufacturer to make sure it's supported. They should have a mode (which could be AFR) that will replicate a single card's stability with at least some gains.


Umm, you know GameWorks is proprietary tech? So the game runs on DX11 using proprietary tech.
 
Umm, you know GameWorks is proprietary tech? So the game runs on DX11 using proprietary tech.

We weren't talking about Gameworks. I do realize Gameworks is proprietary and it looks like nVidia paid and worked with Ubisoft to have it implemented. If it doesn't work I'd complain to nVidia since it's their tech. The ONLY time I'd be pissed at the software company is if they promised a feature as a selling point and didn't deliver.

But holding them accountable for crossfire/sli working? I'd blame the developers of that tech for not making it compatible enough. UNLESS Ubi marketed "buy FC4! 90% SLI/Xfire scaling!!". SLI/Xfire is somewhat of a niche IMO when compared to the whole of gaming (PS3/XB360/PS4/XBONE/PC), and many seasoned gamers stay the hell away from it because it's hit or miss.
 
Guys, they are going to benchmark whatever people are currently playing, with the most recently available drivers, whether the game is flawless or a pile of bugs. [H] waited for 7 patches (including the hotfix).
 
The last few months have had me so busy with RL that I haven't been able to play any games, much less keep up with the hardware scene. Hell, I've gone from checking the [H] 3 times a day to a couple times a week for only about 10 minutes each.

Good to know about the 280. Thanks.

edit:

sweet. This tells me that until I upgrade my monitor my card is still good enough:

stick with nvidia, mate, as it's easier to follow blindly.

just pick the highest number of the nvidia GPU series and you're good to go.

AMD are more confusing
 
Umm, you know GameWorks is proprietary tech? So the game runs on DX11 using proprietary tech.

Proprietary is the wrong word; licensed is more accurate. The source code (to the nvidia-provided DLLs) is considered intellectual property, and that's why they do not want it being shared. It's their way of trying to keep the "work" they have done from being copied and used for free by a competitor. (My guess.)

GameWorks runs on DX11; GameWorks runs on AMD.

For more information on those, read
http://www.extremetech.com/gaming/1...elopers-weigh-in-on-the-gameworks-controversy

and

http://www.forbes.com/sites/jasonev...ut-gameworks-amd-optimization-and-watch-dogs/
 
I like that idea, Brent. Of particular interest to me will be GTA V and Battlefield Hardline (perhaps Project Cars, too, since it looks interesting and some early testing I've seen is showing it to be fairly demanding). It's important to highlight titles that a lot of people are playing, in my opinion. I mean, it's all well and good to find a niche title that is insanely demanding, but testing the actual games we're all playing or about to play is much more relevant :D.

If you want to be relevant based on the more popular games that we're all playing, please shift your attention here:

http://steamcharts.com/top

Even Shadow of Mordor has more players than FC4.

Don't just automatically jump onboard because it carries the Far Cry name. Certainly not after the fiasco of 4. I see it as unfairly rewarding Ubi (do they deserve it?) to consider two of their games as a benchmark on a site that only reviews a handful of games to form the basis of judging hardware.
 
As discussed in other threads, this game (as FC3 was) is hard on overclocks; if you're having stability issues, I recommend dropping the GPU clock and trying to bump the RAM clock. This game seems to respond well to increased RAM bandwidth.
 
If you want to be relevant based on the more popular games that we're all playing, please shift your attention here:

http://steamcharts.com/top

Even Shadow of Mordor has more players than FC4.

Don't just automatically jump onboard because it carries the Far Cry name. Certainly not after the fiasco of 4. I see it as unfairly rewarding Ubi (do they deserve it?) to consider two of their games as a benchmark on a site that only reviews a handful of games to form the basis of judging hardware.

Isn't Steam Charts only Steam?

FC4 is a UPlay game, and none of the codes that came with GPUs work on Steam, only UPlay. Those numbers are going to be pretty far off for FC4 if that web site is only Steam.

Shadow of Mordor is a Steamworks game, so those numbers are accurate for it as all copies must be activated on Steam for it to run.
 
Isn't Steam Charts only Steam?

FC4 is a UPlay game, and none of the codes that came with GPUs work on Steam, only UPlay. Those numbers are going to be pretty far off for FC4 if that web site is only Steam.

Shadow of Mordor is a Steamworks game, so those numbers are accurate for it as all copies must be activated on Steam for it to run.

My FC4 code didn't activate on Steam.
 
Re: Editor's GTA V comment on final page of this article:

Thing is, basically every GTA game on PC is like that since they started releasing for consoles first. I feel like it's more about milking consoles for sales than fixing the bugs in the game for PC.

Hey, it's a business and they are free to do that. I enjoy the GTA games. I just wish they would have released RDR on PC.
 
By open I just meant it works on multiple platforms.

Well I am boycotting AMD until they give nVidia XDMA tech. They have a shadowy unfair grip on a system with less stutter. I'd use Mantle as an example but I wouldn't take that for free.

Realize anything on the Unreal Engine uses GameWorks. Also games like The Witcher 3, Star Citizen, COD, Batman, Assassin's Creed, etc. Sure, I'd rather everyone share tech and it be a jolly happy world, but all companies are guilty of not doing it.

AMD isn't even optimized for Mantle in BF4; people had to turn it off. Their own tech! Blaming nVidia for AMD not running well with Godrays is like me blaming AMD for nVidia not running well on Mantle.

This is hilarious; XDMA is a terrible comparison to use for GameWorks. A library of technologies that can be implemented by game developers, compared to a technology for communication between GPUs that is transparent to the game. They are not relevant to each other in any way at all, even for the purposes of your analogy.

Nobody (sane) is advocating Nvidia should just give away/share their proprietary technologies to AMD but if there is a more "active" disruption of AMD performance in these titles going on (and I have no proof that there is) then I don't see that as a good thing for gamers/gaming.

You say you won't use Mantle as an example then go on to do exactly that. Comparing Mantle, an alternative API, with the poor performance of one in-game setting, in one game, makes about as much sense as your first comment.
 
This is hilarious; XDMA is a terrible comparison to use for GameWorks. A library of technologies that can be implemented by game developers, compared to a technology for communication between GPUs that is transparent to the game. They are not relevant to each other in any way at all, even for the purposes of your analogy.

Nobody (sane) is advocating Nvidia should just give away/share their proprietary technologies to AMD but if there is a more "active" disruption of AMD performance in these titles going on (and I have no proof that there is) then I don't see that as a good thing for gamers/gaming.

You say you won't use Mantle as an example then go on to do exactly that. Comparing Mantle, an alternative API, with the poor performance of one in-game setting, in one game, makes about as much sense as your first comment.

That post was responding to someone else's that I should have quoted or, in hindsight, never responded to. It was responding to someone boycotting nVidia GameWorks games. In reality Ubisoft should have waited a few more months and fixed the stuttering, finished the fur, the shadows, etc. There are tons and tons of GameWorks games that work great and people don't even realize GameWorks is in them. It's the crappy publisher, not nVidia.

Just to clarify: I meant I wouldn't mention Mantle in a positive light for an example of something I would want. I didn't mean I wouldn't use it as a comparison or analogy.

People HAVE said they want GameWorks to be "open", which is basically giving it away. I agree they are not sane. nVidia has approximately 300 engineers on GameWorks (~$90M a year burdened). IMO overall they seem to be "playing nice" with supporting consoles/AMD with most features. I believe they are even trying to make PhysX run better off the CPU. The only feature I was looking forward to in FC4 was Furworks, which works on both systems, and was a bit underwhelming anyways. IF nVidia is intentionally sabotaging AMD's performance outside of nVidia features, I wouldn't buy nVidia products anymore.

The last line was just pointing out AMD doesn't run right on major AMD-sponsored games that don't have nVidia tech, yet people are blaming nVidia for non-GameWorks issues with no proof (xfire). I realize it's an API vs an in-game feature (Mantle vs. Godrays). You can nitpick if you want, but the point is I don't expect my nVidia hardware to run AMD-funded tech perfectly and vice versa. If it does - fantastic - and I'll take note of it in my mind that x company has improved the gaming world for the better and factor it into my decision the next time I buy cards. To me AMD/nVidia are about equal here. I don't see AMD deprived of anything substantial.

I recognize why people could be concerned nVidia could abuse their tech; I just haven't seen it yet. If AMD did piss-poor across the entire GameWorks package and nVidia wasn't riddled with problems as well... sure, maybe I'd bite.
 
Yes, we used FC3, we will use FC4. It will also allow us to show over time if the game improves via patches and driver updates. It is a graphically demanding game. Despite it having issues, it is a popular and well-played game. The game is actually pretty fun. It certainly isn't the worst release of 2014.

There are many games I am interested in evaluating for adding to our gaming suite as well: Grand Theft Auto V, Witcher 3, Battlefield Hardline, Project Cars, Batman Arkham Knight, Dead Island 2, and more. I will look at each one and we will determine if we add it. I have an itch to get some old games out and some new games in.

When you say Far Cry 4 was not the worst, it also certainly was not one of the best games of 2014, both gameplay- and graphics-wise. Graphically demanding and well-reviewed games like Middle-earth: Shadow of Mordor and Dragon Age: Inquisition are good choices to add to your suite. Ryse: Son of Rome was good graphically, but the game was boring and poorly reviewed. Also, your testing suite is one-dimensional, with only first-person and third-person shooters / action games; other genres need representation. Your test suite game count can be pushed to 6. Far Cry 4 and older games like Tomb Raider can be replaced by these. Performance in Ubisoft's horribly buggy and unoptimized games is no basis on which to recommend video cards to consumers. One Ubisoft game in your test suite is itself a favour to them given their pathetic attitude.

http://www.metacritic.com/game/pc/middle-earth-shadow-of-mordor
http://www.metacritic.com/game/pc/dragon-age-inquisition
 
Due to the 280X results at 1920x1080, I'm guessing my R9 290 @ 1680x1050 should be significantly better... I'm waiting until this drops into the $5 bin, as that is my usual price point :)
 
When you say Far Cry 4 was not the worst, it also certainly was not one of the best games of 2014, both gameplay- and graphics-wise. Graphically demanding and well-reviewed games like Middle-earth: Shadow of Mordor and Dragon Age: Inquisition are good choices to add to your suite. Ryse: Son of Rome was good graphically, but the game was boring and poorly reviewed. Also, your testing suite is one-dimensional, with only first-person and third-person shooters / action games; other genres need representation. Your test suite game count can be pushed to 6. Far Cry 4 and older games like Tomb Raider can be replaced by these. Performance in Ubisoft's horribly buggy and unoptimized games is no basis on which to recommend video cards to consumers. One Ubisoft game in your test suite is itself a favour to them given their pathetic attitude.

http://www.metacritic.com/game/pc/middle-earth-shadow-of-mordor
http://www.metacritic.com/game/pc/dragon-age-inquisition

Dragon Age: Inquisition should be a staple for any benchmark; it's a very popular game & received great critical acclaim. It's the current "Skyrim".

At least some people see the poor logic in rewarding Ubifail with prominence they do not deserve. Likewise, I fully agree with this sentiment:

"Performance in Ubisoft's horribly buggy and unoptimized games is no basis on which to recommend video cards to consumers."

3 of the buggiest & most unoptimized AAA releases in recent times all belong to Ubi: Watch Dogs, ACU and FC4. If [H] fails to see the flaw in rewarding them, then you guys may as well include ACU in your list of games to benchmark. :/
 
Thanks for the review guys!

I pre-purchased this game and needless to say I'm very disappointed that there is no support for crossfire. I have decided to stop playing the game until crossfire is supported, and I will never pre-purchase a game again!
 
I'd be really interested in seeing the performance of the 290X 8GB version if Xfire weren't so botched with Ubi's latest 'releases'.

Is the Xfire thing entirely Ubi's doing, or does GameWorks have something to do with it?
 
Well, after reading one of the articles posted, I am shocked at this official comment from nVidia:

"According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes."

So in essence AMD can't do jack even if they wanted to; forget driver optimisation, nVidia will not allow AMD to do it. nVidia's anti-competitive practices at work.
 
I'm playing Far Cry 4 just fine with a GTX 780 on ultra settings, though still at v1.03. I read that Fur and some other features are not really on until v1.04. My graphics card is an ASUS Strix GTX 780 with 6GB of Hynix GDDR5. Performance is close to HardOCP's GTX 970 result at a lower resolution and the same settings: GTX 970 vs GTX 780, 1440 vs 1200, ultra vs ultra, avg 50fps vs avg 53fps. VRAM used was almost at 3GB according to GPU-Z and OHM. Here are the GPU's max values: Core 1045MHz, Mem 3005MHz, Shader 2090MHz.
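
If anyone wants to log that VRAM number themselves instead of watching GPU-Z, here's a quick sketch using NVIDIA's NVML Python bindings (pynvml; I'm assuming the card of interest is device index 0):

# Poll VRAM usage once a second for ten seconds via NVML.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the GPU of interest is device 0
try:
    for _ in range(10):
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used / .total are in bytes
        print("VRAM used: %.0f MiB of %.0f MiB" % (mem.used / 2**20, mem.total / 2**20))
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()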

Played Far Cry 3 with this card too; both games are smooth and enjoyable, and I never experienced the stuttering everybody talks about. No crashes, no annoying bugs, no bad luck so far.



Cheers.
 