Far Cry 5 Video Card Performance Review @ [H]

FrgMstr

Far Cry 5 Video Card Performance Review

We test the performance of eleven GPUs, from the low end to the top end, in Far Cry 5. Our goal is to find what settings are playable on each video card and to compare performance between them. We will also look at AA and Volumetric Fog performance. If you want to find the best-value GPU for this game, this is for you.
 
I guess with such high performance across all cards, no SLI/CrossFire testing?
 
However, at 1440p we feel the AMD Radeon RX Vega 56 is your best deal for an enjoyable Far Cry 5 gameplay experience without breaking the bank.

I assume that applies to retail prices, as I've seen the Vega 56 cost as much as a GTX 1070 Ti. Haven't checked in a while, though...
 
So, this game is just about the best possible scenario for Vega, and still Vega 64 = GTX 1080 and Vega 56 = GTX 1070. lol, so much for "Vega 64 = 1080 Ti if all the features are ever enabled."
 
So, this game is just about the best possible scenario for Vega, and still Vega 64 = GTX 1080 and Vega 56 = GTX 1070. lol, so much for "Vega 64 = 1080 Ti if all the features are ever enabled."
But AMD graphics are like fine wine! Give it a year... or five.
 
So, this game is just about the best possible scenario for Vega, and still Vega 64 = GTX 1080 and Vega 56 = GTX 1070. lol, so much for "Vega 64 = 1080 Ti if all the features are ever enabled."

Well, initial tests showed Vega being very close to the GTX 1080 Ti, but NVIDIA countered with drivers.
 
Does anyone else find themselves more [H]ard than [H]ardOCP when you read "These video cards are overkill for this game at 1440p" and feel that the benchmark numbers for the 1080 Ti don't even meet your standard? When the FPS constantly dips below 100 I notice it, and it drives me crazy. I need as close to 200 FPS as I can get. Two 1080 Tis barely do the job at 1080p at max settings in many games, and even then I regularly lower some of the settings. My Alienware 25 demands 200 FPS!
 
Does anyone else find themselves more [H]ard than [H]ardOCP when you read "These video cards are overkill for this game at 1440p" and feel that the benchmark numbers for the 1080 Ti don't even meet your standard? When the FPS constantly dips below 100 I notice it, and it drives me crazy. I need as close to 200 FPS as I can get. Two 1080 Tis barely do the job at 1080p at max settings in many games, and even then I regularly lower some of the settings. My Alienware 25 demands 200 FPS!

I'm with you; lately things here are more soft than the typical [H]ard. Everything has turned "good enough, even for overclocking." However, that's off topic.
 
I like the comparison images between the AA settings; AA in motion I will have to experience for myself. Those crawling lines drive me crazy.

I may have to pick up this game; it looks very fun. I can do some CFX Vega and 1080 Ti comparisons at 4K and 3440x1440, though limited to a rather lower standard than Brent's fantastic analysis; I am just not able to replicate Brent's methods accurately.

It seems like IQ went down from Far Cry 4 to Far Cry 5; you had way more good options in FC4, with HairWorks (whose performance sucked) probably the only less useful one. Also, Rapid Packed Math does not really seem to pull the Vega 64 away from the 1080 once real gameplay is used, which is way more accurate than a canned benchmark when done right.

Thanks, Brent, for some of the best analysis of new games on the net.
 
Nice article.

I will agree an overclocked RX 480/580 has playable 1440p ultra performance.
 
I like TAA; it doesn't seem to blur textures like FXAA/TXAA, it antialiases better, and it's fast. Finally, an MSAA killer?
 
I like TAA; it doesn't seem to blur textures like FXAA/TXAA, it antialiases better, and it's fast. Finally, an MSAA killer?

TAA blurs terribly in motion. Not the same blur as FXAA, more like Vaseline smeared across your whole screen... at least that was my experience with Fallout 4 (it was unbearable for me). I have not played Far Cry 5 yet.
 
Does anyone else find themselves more [H]ard than [H]ardOCP when you read "These video cards are overkill for this game at 1440p" and feel that the benchmark numbers for the 1080 Ti don't even meet your standard? When the FPS constantly dips below 100 I notice it, and it drives me crazy. I need as close to 200 FPS as I can get. Two 1080 Tis barely do the job at 1080p at max settings in many games, and even then I regularly lower some of the settings. My Alienware 25 demands 200 FPS!

Back to the age-old debate: “FPS Isn’t Everything.”


If you’ve been reading our site for a long time, you will note that we have adopted a “What is Playable” stance on gameplay performance, rather than using FPS to purely dictate performance. We actually put this at the top of every Test Setup page, in every GPU review - https://www.hardocp.com/article/2018/04/17/far_cry_5_video_card_performance_review/2


We will of course show you the min/max/avg FPS, so you have all the data to make whatever determination you want. However, we always have a baseline for every game in terms of “What is Playable.” You will note the red and green lines in each graph; these indicators of what is playable have been a part of our graphs for many years.


It is true that for most gamers there is a “Good Enough” performance line in games, and we recognize everyone may have a different level of what is good enough. That’s why we show you the entire performance run in graph form, second by second, plus the min/max/avg. It’s not about the level of performance in a game, but the consistency of performance. If FPS deviates wildly, of course you’ll notice that change while gaming. However, if the FPS level is consistent, it may be hard to tell the difference between 80 FPS and 100 FPS.
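
To put rough numbers on that consistency point (a quick back-of-the-envelope sketch, not part of our review methodology): frame time is just 1000 ms divided by FPS, so steady 80 FPS and steady 100 FPS differ by only 2.5 ms per frame, while a run that merely averages 100 FPS with wild swings can vary by over 10 ms frame to frame, and that variation is what you feel.

```python
# Sketch: why consistent FPS feels smoother than a higher-but-erratic average.
# Frame time in milliseconds is just 1000 / FPS.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

steady = [80] * 6                      # consistent 80 FPS
erratic = [130, 60, 140, 55, 150, 65]  # averages 100 FPS but swings wildly

for label, samples in (("steady 80", steady), ("erratic ~100", erratic)):
    times = [frame_time_ms(f) for f in samples]
    avg_fps = sum(samples) / len(samples)
    spread = max(times) - min(times)
    print(f"{label}: {avg_fps:.0f} FPS avg, frame times "
          f"{min(times):.1f}-{max(times):.1f} ms (spread {spread:.1f} ms)")

# steady 80:    80 FPS avg, frame times 12.5-12.5 ms (spread 0.0 ms)
# erratic ~100: 100 FPS avg, frame times 6.7-18.2 ms (spread 11.5 ms)
```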


As long as the FPS is at 60 or above, it’s a wash, UNLESS you must attain a certain FPS level for refresh rate reasons; that is a different scenario altogether. Also, we test with VSYNC disabled to show you the entire FPS scaling. However, most people play games with VSYNC enabled, or use FreeSync or G-SYNC, so FPS will be more consistent anyway.


The point and goal of testing performance in new games is to find the best video card for the money, the best “value” as it were. Will a GTX 1080 Ti offer you better performance at 1440p? Of course. But not everyone can afford one. Therefore, the best value for the dollar (according to our testing) at 1440p is the Radeon RX Vega 56; it provides a baseline of performance that is acceptable for most gamers. While we state this in these reviews, we go ahead and show you the GTX 1080 Ti at 1440p anyway, so we can appeal to everyone and you get to see all levels of performance.


We put all the data out there for you to make your own decisions. We feel you are smart enough to take the data we provide and make your own educated, informed decisions based on the results. Think of us as demonstrating performance at different levels and giving our opinion of the best value; with all the surrounding data out there, you can look at it and make up your own mind. It takes a lot of work. Not everyone has the same level of “What is Playable,” so we put it all on the table. YOU decide what works best for YOUR level of “What is Playable.”
 
So, this game is just about the best possible scenario for Vega, and still Vega 64 = GTX 1080 and Vega 56 = GTX 1070. lol, so much for "Vega 64 = 1080 Ti if all the features are ever enabled."

Yet unless you're at 4K you would never notice the difference. I don't think you and a couple of others realize almost no one games at that resolution except a tiny fraction of the community; most of the gaming community is at 1080p, running a 580 or 1060 video card or worse. Your hatred of Vega and AMD has been noted, though, in multiple threads. Personally, I think it's great to see a new game that plays great on pretty much any hardware out there; that brings more people to gaming on PCs instead of consoles, and we get better games that way.
 
Yet unless you're at 4K you would never notice the difference. I don't think you and a couple of others realize almost no one games at that resolution except a tiny fraction of the community; most of the gaming community is at 1080p, running a 580 or 1060 video card or worse. Your hatred of Vega and AMD has been noted, though, in multiple threads. Personally, I think it's great to see a new game that plays great on pretty much any hardware out there; that brings more people to gaming on PCs instead of consoles, and we get better games that way.

Lol, I love AMD as much as I love NVIDIA, as much as I love Intel, and as much as I love Samsung or Apple. I'm brand agnostic, man; I buy PC hardware for FUN. I have as many AMD GPUs as NVIDIA GPUs, and I always buy AMD GPUs even when I don't need to, just because I like to have personal experience with the hardware and to test for myself. For a time I was even a full-time AMD user. I have four personal machines at home right now, and three, read it, three are built with AMD GPUs: GTX 1080 Ti, R9 390X, R9 280X, and RX 580. That last one, BTW, is a full AMD build paired with a Ryzen 7. And I hate AMD? LOL. I don't hate AMD, I HATE VEGA, and I consider it a failed GPU, but typically I pick no sides when building or recommending a machine.

About the performance: yes, I normally notice the difference between those GPUs at 2K. I gamed for years at 1080p@120Hz just because I love the fluidity and responsiveness of high-refresh panels; my last card at 1080p was a GTX 980 Ti for that fact alone. I only upgraded to 2K@144Hz when there was enough GPU power to push recent games above 100 FPS at the highest settings possible (with a couple of exceptions such as motion blur, chromatic aberration, AA levels, etc.).

Currently, even at 2K, the only card able to push that kind of refresh rate is the 1080 Ti, nothing else.

Ahh, you reminded me of building my hated Ryzen machine in December... yeah, funny times having a full AMD machine again. ;)

[attached photo of the Ryzen build]
 
At least for me, gaming with the 1080 Tis, gameplay is better (much better, at that) using adaptive sync and holding 60 FPS (3440x1440) than at some 100-plus frame rates. Same with the Vegas, but using FreeSync. The monitor comes into play in a big way with playability and smoothness; better FPS, in other words, may not equal better gameplay on a particular monitor or resolution. I like how Brent separates this out, and my experience pretty much matches his findings, except he is more [H]ard than I would be.

My playability bar for frame rates is lower than [H]ardOCP's with FreeSync; basically anything over 40 FPS is good for me. Maybe I am not [H]ard enough. While I can notice differences in smoothness, it is utterly lost on me when playing games. As for CFX/SLI testing, it is not applicable as far as I am concerned; once DX12 multi-GPU is used more, if ever, I think it would become more applicable, especially if VR titles really start using it and it makes a big difference with the higher-resolution headsets in the next generation.
 
A lot of Ngreedia fans are triggered by this review. The same people advocating for competition can't handle the fact that Vega is panning out to be a great card, with availability becoming a reality.
What a bunch of fucking hypocrites.

Great in-depth review.
Thank you, [H] team.
 
One of the best ways to keep your FPS where you want it is the Resolution Scale option. Performance reviews rarely mention it, but it's in tons of games these days.
If you can't pull 60 FPS at 4K, try setting your resolution scale to 80% or 90%. It's the best of both worlds: it lets you crank visual settings like shadows and water detail without completely compromising performance or resolution. It looks way better than settling for 1080p (even with a >100% render scale) and it's not that far from "true" 4K.
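
To illustrate why this works so well (my own rough sketch, not from the review; real engines may round the internal resolution differently): the rendered pixel count scales with the square of the slider value, so 90% scale only shades about 81% of the pixels, and 80% scale only 64%.

```python
# Sketch: what the Resolution Scale slider does to the number of rendered
# pixels. Pixel count (and thus rough GPU shading load) scales with the
# SQUARE of the slider value.

def scaled_pixels(width: int, height: int, scale: float) -> int:
    return round(width * scale) * round(height * scale)

native = scaled_pixels(3840, 2160, 1.0)  # 8,294,400 px at native 4K
for scale in (1.0, 0.9, 0.8):
    px = scaled_pixels(3840, 2160, scale)
    print(f"4K @ {scale:.0%}: {px:,} px ({px / native:.0%} of native)")

# 4K @ 100%: 8,294,400 px (100% of native)
# 4K @ 90%:  6,718,464 px (81% of native)
# 4K @ 80%:  5,308,416 px (64% of native)
```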
 
One question: I have a 1080 Ti, but a much older-generation i5-2500K clocked at 4.2 GHz. Is this going to be a bottleneck?
 
One question: I have a 1080 Ti, but a much older-generation i5-2500K clocked at 4.2 GHz. Is this going to be a bottleneck?

At 4K, and even 2K, you're unlikely to see much difference with a newer processor. At 1080p, yeah, a newer processor would help, but only if you need 100 FPS or better all the time.
 
At 4K, and even 2K, you're unlikely to see much difference with a newer processor. At 1080p, yeah, a newer processor would help, but only if you need 100 FPS or better all the time.

Thanks. No, I don't care beyond 70 FPS. I'm not a heavy gamer; the only reason I have that card is that I need it to build some AI models.
 
Their SMAA implementation seems half-baked. I'm willing to bet the SMAA in ReShade would do a better job.

A situation such as this should be cake for SMAA to dramatically improve. This is barely changed from no AA:
[screenshot: scene with SMAA enabled, barely changed from no AA]
 
One of the best ways to keep your FPS where you want it is the Resolution Scale option. Performance reviews rarely mention it, but it's in tons of games these days.
If you can't pull 60 FPS at 4K, try setting your resolution scale to 80% or 90%. It's the best of both worlds: it lets you crank visual settings like shadows and water detail without completely compromising performance or resolution. It looks way better than settling for 1080p (even with a >100% render scale) and it's not that far from "true" 4K.

This helped me maintain 75 FPS with FreeSync on my 580, running a mixture of high and ultra settings, by bumping the resolution scale at 2560x1440 down from 1.0 to 0.9. It's a great option, and it gave me a 10 FPS boost without any noticeable visual reduction.
 
This helped me maintain 75 FPS with FreeSync on my 580, running a mixture of high and ultra settings, by bumping the resolution scale at 2560x1440 down from 1.0 to 0.9. It's a great option, and it gave me a 10 FPS boost without any noticeable visual reduction.

I use it for almost everything, from Street Fighter V to Tekken 7, Destiny 2, and Gears 4. GTA V is the first place I can recall seeing it, but it has become a widespread option. Even running 4K at a render scale of 0.6 is still a solid visual bump over 1080p.
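
For anyone wondering why 0.6 of 4K still beats 1080p, a quick check using the same square-of-the-scale math (a rough sketch; actual engines may round differently):

```python
# 4K at 0.6 render scale vs. native 1080p.
w, h = round(3840 * 0.6), round(2160 * 0.6)   # internal render: 2304 x 1296
print(f"{w}x{h} = {w * h:,} px vs 1920x1080 = {1920 * 1080:,} px "
      f"({w * h / (1920 * 1080):.2f}x)")       # ~1.44x the pixels of 1080p
```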
 
One question: I have a 1080 Ti, but a much older-generation i5-2500K clocked at 4.2 GHz. Is this going to be a bottleneck?

Games are primarily GPU-dependent these days; you should be fine. GPU, VRAM, and system RAM amount are probably all more important than the CPU, as long as you have an i5/i7-family chip or a Ryzen (I'm not too familiar with the 5-ish-year-old AMD CPUs). CPUs have been making only small incremental improvements from the i7-2600 on; no big jumps.

I played for 8 years on an i7-920 @ 3.6 GHz. I remember playing Far Cry 3 and BioShock on it without issue.
 
I get almost 25% better performance on NVIDIA's 391.01 drivers (February?) than I do on the "Far Cry 5" 391.35 drivers. Go check it out; it's a night-and-day difference performance-wise. Holy cat.

I had upgraded to 391.35, reran the benchmark, and to my surprise got really awful numbers. I thought maybe the settings had changed or something, but I double-checked everything and it was all the same. The new drivers were laggy and choppy, to say the least. I had played 10 hours of the game on the older drivers and everything was completely smooth, without any hitches or issues, so after a cup of coffee I decided to try the older drivers again to see if I was crazy, and GOD DAMN they were fast. HUGE difference in the benchmark and in-game. Almost inconceivable how much better it played on the older drivers. Go check them out if you have Far Cry 5; I think you'll be blown away.

TL;DR:
391.35 "Far Cry 5" drivers: complete shit.
391.01 drivers: holy cat fast.

UPDATE: pretty sure the shadows are different in this driver, which would account for the substantial FPS gain.
 
Great article. By the way, am I the only one who noticed a huge jump from 1080p to 1440p, and much less of one from 1440p to 4K? I have a 4K screen, but obviously the GPU needed to run games at the desired framerate (in many cases) isn't out yet. That said, I don't really notice that much of a difference.
 
I use it for almost everything, from Street Fighter V to Tekken 7, Destiny 2, and Gears 4. GTA V is the first place I can recall seeing it, but it has become a widespread option. Even running 4K at a render scale of 0.6 is still a solid visual bump over 1080p.

So many console games use variable resolution and resolution scaling these days that it's pretty much built into almost every AAA game to some extent. We should continue to see this "feature" in more games, since console devs need to create a game that runs smoothly on the PS4, Xbox One, PS4 Pro, and Xbox One X: four dramatically different specs and one game to rule them all, clearly a scenario for resolution trickery. Supersampling has, for my money, always been the best way to increase visual fidelity, all else being equal, and a poor man's supersampling method never hurt anyone. It can also be one of the only ways to completely eliminate jaggies in certain games/engines whose AA solutions just don't come close to making enough of a difference.
 
At 4K, and even 2K, you're unlikely to see much difference with a newer processor. At 1080p, yeah, a newer processor would help, but only if you need 100 FPS or better all the time.

I am pretty sure 1080p is 2K (1920x1080).

1440p could be called 2.5K, I guess (2560x1440).
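
For reference, the "K" shorthand loosely tracks the horizontal pixel count, and it's informal for consumer panels (DCI 2K/4K are strictly 2048/4096 pixels wide). A quick mapping sketch:

```python
# Common shorthand vs. actual pixel dimensions. The "K" names are informal
# for consumer panels; DCI 2K/4K are strictly 2048/4096 pixels wide.
resolutions = {
    "1080p (~2K)":    (1920, 1080),
    "1440p (~2.5K)":  (2560, 1440),
    "2160p (4K UHD)": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} px")
```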
 
I guess with such high performance across all cards, no SLI/CrossFire testing?

Check my posts; I posted what I got on launch day with 2 x 1080s. Spoiler alert: scaling at 1440p is garbage, scaling at 4K is much better. I ended up getting better performance running at 4K with the resolution slider down a bit.
 
TAA blurs terribly in motion. Not the same blur as FXAA, more like Vaseline smeared across your whole screen... at least that was my experience with Fallout 4 (it was unbearable for me). I have not played Far Cry 5 yet.

That's kind of annoying. I find it crazy that there isn't an end-all, be-all AA solution yet. The TAA in those stills seems to get rid of aliasing on a lot of things that would normally be missed, but I definitely agree that even in the pictures it's clear things start to become soft. The number of times you see a fence or a road where the edge is crazy jaggy, yet no AA solution seems to have an effect on it, is annoying. But it would be equally annoying if it smoothed things to the point where there is no longer an edge at all.
 
There are better shader AA methods, but developers seem to take the easy way out.

Heck, Crysis 3 had more, and better, AA options with SMAA.

At the very least it wouldn't be too hard to add MSAA support in these games; by default, all games should at least have traditional MSAA support, IMO.
 
There are better shader AA methods, but developers seem to take the easy way out.

Heck, Crysis 3 had more, and better, AA options with SMAA.

At the very least it wouldn't be too hard to add MSAA support in these games; by default, all games should at least have traditional MSAA support, IMO.

Does MSAA actually get rid of ALL the jaggies? My biggest pet peeve has always been running AA and still finding places no AA method will fix. I'd rather things be a bit soft, or take the performance hit, than have immersion-destroying aliasing going on.

[screenshot from Test Drive Unlimited: jagged, broken road markings]



This is a really old game, but it's something that always crosses my mind. The yellow lines in the middle of the road, and even the white line on the shoulder, always have broken jaggies in TDU. At the time there wasn't an AA method that would fix this in-game, and no brute-force AA method outside the game would even touch it. I agree that I don't expect developers to waste too much time supporting AA methods, but it seems like an area where AMD or NVIDIA could have come up with something by now that would clean that up without completely destroying the textures.

To tie that back to FC5: is there any combination of in-game and driver control panel options that would provide the best IQ? Or is TAA the best thing available for FC5?
 
No, MSAA cannot remove aliasing on alpha textures, but at least it can on all objects and polygons, without any meddling with texture quality.

For everything else you want SSAA, which is tremendously intense on performance, but I'd at least like the option. There have been attempts to alleviate that performance cost with hybrid forms of SSAA.
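
To give a rough sense of why SSAA is so intense (a simplified sketch of my own, assuming shading cost scales linearly with shaded samples): 4x SSAA renders twice the width and twice the height, quadrupling the shaded pixels, whereas MSAA shades most pixels once and only multiplies the depth/coverage samples.

```python
# Simplified sketch: SSAA shades every sub-sample, so cost grows with the
# sample factor; MSAA shades each pixel roughly once and only multiplies
# depth/coverage samples, which is why it's far cheaper (and also why it
# can't fix aliasing inside alpha-tested textures).
width, height = 2560, 1440
base = width * height  # 3,686,400 shaded pixels at native 1440p

for factor in (2, 4, 8):
    print(f"{factor}x SSAA: {base * factor:,} shaded samples "
          f"({factor}x the shading work of no AA)")
```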

There really are a ton of different methods out there for removing aliasing; when we only see SMAA or TAA included, it shows a bit of laziness, IMO.
 
I appreciate the response, Brent! So basically SSAA is still king, even though no one wants to bother implementing it because they don't think anyone will use it. Is the "Antialiasing - Transparency" option in the NVIDIA control panel able to provide SSAA for everything in the game? So if you had Titans in SLI, could you turn on 2x supersampling and get better results than what TAA gives you?
 
Does anyone else find themselves more [H]ard than [H]ardOCP when you read "These video cards are overkill for this game at 1440p" and feel that the benchmark numbers for the 1080 Ti don't even meet your standard? When the FPS constantly dips below 100 I notice it, and it drives me crazy. I need as close to 200 FPS as I can get. Two 1080 Tis barely do the job at 1080p at max settings in many games, and even then I regularly lower some of the settings. My Alienware 25 demands 200 FPS!

I would definitely like to echo this opinion.

I always thought PC gaming was all about no-holds-barred performance, the pinnacle of performance where possible. Compromise is the land of consoles, not PCs.

Yet the review seems content to satisfy itself at 60 FPS. While I get that buying, say, a Vega 64 or a 1080 is for 1440p gaming, what about gamers who don't want 1440p but want 1080p at a 144 Hz refresh rate or better? Or a 1080 Ti at 1440p, to see whether it can attain more than 100 FPS for 100 Hz monitors? When I look at the competitive gaming scene, I see gamers still chasing the highest FPS they can.

I get that not every gamer will chase the highest FPS, and with FreeSync/G-Sync the need drops drastically. But aren't we basically settling for a performance compromise there? It's like doing 50 burpees and quitting, instead of pushing for 100 if you can.
 