Grand Theft Auto V Video Card Performance Preview @ [H]

For a Rockstar PC game, this one is highly optimized. I have everything on Very High @ 2K res with AA and DOF disabled (visual preference), and it runs smooth as silk. This is in stark contrast to GTA IV, which still runs like spoiled milk.
 
All I know is, this game runs extremely well on my old rig. Everything is set to High or Very High @ 1080p. I wouldn't necessarily say you need a top-of-the-line card to enjoy GTA V, but if you do have the hardware, this game will push it, and that is what matters.
IMO, this is a huge leap forward from GTA IV, and it will probably be my favorite game for a long time. Hats off to RS, and to AMD for the drivers. One happy gamer right here; give credit where credit is due.
 
Kyle,
I was using AMD CHS shadows, changed to the softest Rockstar option, and noticed a large drop in performance on my stock-clocked 290X running the 15.4 driver.
(This was before today's game update.)

The results are below.
AMD CHS:
Frames Per Second (higher is better): Min, Max, Avg
Pass 0, 41.595043, 92.774544, 65.192513
Pass 1, 11.956508, 106.121651, 72.797028
Pass 2, 41.618195, 129.500992, 66.436279
Pass 3, 47.774693, 110.236565, 86.334145
Pass 4, 28.028557, 119.079323, 67.433258

Rockstar softest:
Frames Per Second (higher is better): Min, Max, Avg
Pass 0, 42.560192, 82.668213, 55.512993
Pass 1, 43.290787, 103.272202, 64.929726
Pass 2, 41.445988, 90.860741, 55.163109
Pass 3, 40.700497, 111.258820, 80.742699
Pass 4, 26.106491, 125.131538, 61.566952
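
To make the comparison easier to eyeball, here is the average of the per-pass averages (just an illustrative Python snippet; the numbers are copied straight from the runs above):

```python
# Average the per-pass average FPS figures logged above.
amd_chs    = [65.192513, 72.797028, 66.436279, 86.334145, 67.433258]
rs_softest = [55.512993, 64.929726, 55.163109, 80.742699, 61.566952]

def mean(xs):
    return sum(xs) / len(xs)

print(f"AMD CHS:          {mean(amd_chs):.1f} fps")                    # ~71.6
print(f"Rockstar Softest: {mean(rs_softest):.1f} fps")                 # ~63.6
print(f"Difference:       {mean(amd_chs) - mean(rs_softest):.1f} fps") # ~8.1 in CHS's favor
```

So on this card and driver, AMD CHS comes out roughly 8 fps ahead on average.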

The setup sections of the log files are identical apart from:
AMD: Shadow_SoftShadows: 4
Rockstar: Shadow_SoftShadows: 3

Thought you might like to know as it goes against the preview...
"When it comes to performance, if you need every little bit of performance savings for smoother performance, Rockstar's Soft Shadow settings are better to use in GTA V."
 
**EDIT** I have not tested with the latest game patch. I will update again when I've had some time to play with it.

I run TITAN X SLI, and I agree SLI scaling needs some serious work. If you have the hardware, this game doesn't make efficient use of it. GPU usage often hovers between 80 and 90% on both GPUs...

First, the settings. I run the game at TRUE maximum settings. Every single option is set to its highest available value (motion blur at ~25%... no difference in framerate, but it's atrocious when set to full), including the advanced graphics options, 8x MSAA (including reflection MSAA), and NVIDIA PCSS.

The vast majority of the time the game runs above 70fps in the open world. The benchmark shows very few frames dropping below 60fps, except during the flight portion of the benchmark. I assume this is HD streaming during flight taking its toll more than anything, which surprises me... the game is installed on a 512GB RAID0 SSD array, and my OS is installed on a different 256GB SSD RAID0 array. IO shouldn't be a bottleneck.
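
If anyone wants to actually rule IO in or out during the flight section, watching disk read throughput while flying would settle it. A rough sketch, illustrative only and using the third-party psutil Python package, run alongside the game:

```python
# Print disk read throughput once per second. If reads stay far below what a
# SATA SSD (let alone RAID0) can deliver, storage isn't the bottleneck.
import time
import psutil

prev = psutil.disk_io_counters()
while True:
    time.sleep(1)
    cur = psutil.disk_io_counters()
    mb_per_s = (cur.read_bytes - prev.read_bytes) / 1_048_576
    print(f"disk reads: {mb_per_s:7.1f} MB/s")
    prev = cur
```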

I also occasionally see framerates drop BELOW 30fps at truly maximum in-game settings. In the northern areas of the map - around Mount Chiliad - I saw a minimum framerate of 28.7fps during gameplay. I thought it might be a processor bottleneck, but CPU usage was only ~70% across the 4 cores at the time. I have only seen this in a couple of spots - at night - on the map, so it's likely based on some sort of weird interaction between geometry and post processing, but it happens and needs to be mentioned.
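
One caveat on reading that ~70% figure: an aggregate number can hide a single pegged core, so a per-core poll (again just an illustrative sketch with psutil) would show whether one thread is the real limiter:

```python
# Sample per-core CPU usage for ~10 seconds. One core sitting near 100% while
# the others idle points to a single-thread limit rather than raw CPU capacity.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_core))
```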

VRAM usage with maximum in-game settings (true maximum) sits at ~5.73GB at 1440p (5869MB allocated).

I'd like to see if enabling/disabling hyperthreading has any impact on the frame rates, especially minimums; if there is an appreciable impact it may prompt me to upgrade from the i5. I'm also interested in seeing how much of an impact system memory bandwidth has on streaming the HD textures during flight; I have a sneaking suspicion that my CAS9 DDR3-1600 is slowing that down quite a bit.
 
Thought you might like to know as it goes against the preview...

This is why we totally re-test when new drivers and patches come out and do not use previous test data.
 
They have some of it enabled; why are you surprised they got a slightly worse framerate?

Then they should turn that shit off, because it's misleading.

No PC gamer is going to want to play with motion blur, depth of field, or chromatic aberration. That trash was designed for console peasants to cover up the flaws of their games by making everything all blurry. Motion blur was designed to make 30 FPS feel like less of a slideshow; it should hold no relevance in a game running on PC at 60 FPS.

These are effects which cost performance whilst DECREASING image quality. It is completely ridiculous.

There's a reason why devs always include the option to disable motion blur/DoF. I don't know any serious PC gamer who actually likes these effects. I always thought the standard for calling a game "maxed out" was everything set to its maximum values EXCEPT for the shitty post-process effects like motion blur, chromatic aberration and DoF, as well as the garbage post-process forms of AA like FXAA or TXAA. The only forms of AA I consider acceptable are SSAA, SGSSAA, downsampling, and MSAA.

I don't consider any of these settings as counting towards 'maxing out' a game because they detract from the image quality instead of enhancing it.
 
Then they should turn that shit off, because it's misleading.

Now that's a first: turning on in-game graphics settings that the developer included is "misleading" to performance.

You hear a new one every day.
 
No PC gamer is going to want to play with motion blur, depth of field, or chromatic aberration...

Totally agree for the most part, except depth of field and/or effective focus range, which I consider add realism to games, especially in cinematics. In games like Hitman: Absolution it's a great addition, and also in Watch Dogs or Shadow of Mordor, as other examples.
 
No PC gamer is going to want to play with motion blur, depth of field, or chromatic aberration...

100% agree. FYI, the 770 4GB uses 3GB of memory; I can only imagine how much it might suck with only 2GB.
 
Then they should turn that shit off, because it's misleading...


It's a review of the game's graphical capabilities and of the GPU/CPU's ability to process those settings. Why not show us how the game can cripple a system with everything turned on? Obviously you can use your own custom settings to achieve the FPS you want.

[H] is just showing us what happens when everything is turned on, not "optimal" settings for gameplay.
 
I agree that motion blur is absolutely terrible; turning it on just makes games look like crap. "Let's blur the image and lower the performance!" Yeah... no.


Depth of field depends on the game; most of the time I turn it off, though.
 
Yikes, judging from that image comparison, PCSS is by far the worst and also the most expensive performance-wise. Don't see any reason to use it currently.

I think AMD CHS looks the worst, but regardless, judging by the images, 'Rockstar Softest' both looks the best AND performs the best. I'll be checking it out when I get home tonight.

Yeah, different settings. We have EVERYTHING enabled/turned on.

Does that include the settings in the ADVANCED Graphics Settings menu? Like HD streaming in-flight and stuff like that? Because I found that those options had a huge effect on framerate.
 
No PC gamer is going to want to play with motion blur, depth of field...

Whoa now... speak for yourself. I certainly think I qualify as a "PC gamer", and I enjoy many of those post process effects in game.
 
Great preview. Can't wait for the main article to come out. I actually recently bought a Titan X based on the review of it here at HardOCP. I have everything maxed out except AA which is sitting at 2x at the moment.
 
Definitely want to see the full review. As it sits right now with the beta drivers from AMD, this game slays my 295X2 at 2560x1440 on max settings--lots of stuttering. If I use the recommended settings the game is perfect. I really expect more out of this GPU, so I hope and pray they get the driver revision right so this game plays as it was meant to be played. :rolleyes:
 
This is actually one of the few games I've left DoF on. Motion blur is a guaranteed disable for any game that has the option, though.

I'm quite happy with the performance so far. I've been running max textures, pop variety, tessellation, softest shadows, etc and even with SweetFX it's been smooth as hell on stock clocks.
 
This is actually one of the few games I've left DoF on. Motion blur is a guaranteed disable for any game that has the option, though.

I'm with you. I usually disable Depth of Field because of the performance boost and because most games go overboard with it...but it just looks so nice in GTA 5, and the performance hit is minimal.
 
Now that's a first, turning on graphics settings in-game that the developer has included, is "misleading" to performance.

You hear a new one everyday.

There's a lot of games that include a 30 FPS framerate cap option.

Are you going to start turning that on for your benchmarks as well?

It is a graphics setting in-game that the developer has included, after all! If you don't turn on everything, you're not maxing the game out!
 
There's a lot of games that include a 30 FPS framerate cap option.

Are you going to start turning that on for your benchmarks as well?

It is a graphics setting in-game that the developer has included, after all! If you don't turn on everything, you're not maxing the game out!

One is a 3D graphics effect rendered in-game.

The other is a frame rate cap.

You are really reaching with your debate, and making the most ridiculous argument.

You want us to turn off graphics features based on your opinion of what looks good or bad graphically. That is completely subjective and arbitrary. Everyone has different opinions on what "looks good"; your opinion is yours, and someone may disagree with you.

It is relevant to turn everything on and test performance with all of the in-game 3D graphics effects the developer created, to see how the game performs with and without those features enabled. In this way we can find out how settings affect performance, which ones are more demanding than others, and how AMD and NVIDIA compare. We have to start somewhere, and that is with everything on.

In our full evaluation we will find out what the highest playable settings are, i.e. adjusting graphics settings to find the best ratio of image quality to smooth, playable performance per card.

What you saw in this article is called a "preview" of performance. These are apples-to-apples settings using the highest in-game settings the game supports, to show how graphically demanding the game is across video cards and how they perform.
 
I think AMD CHS looks the worst, but regardless, judging by the images, 'Rockstar Softest' both looks the best AND performs the best. I'll be checking it out when I get home tonight.



Rockstar Softest is also a kind of trade-off. While it objectively looks better, it's the least realistic option. Both AMD's and NVIDIA's contact-hardening shadows in theory behave like real-life light and shadows: the farther a shadow falls from the object casting it, the less focused it is. Though to be fair, the technique still has a long way to go before it looks like real life; when the caster is really far away the shadow becomes a jaggy mess. Rockstar's implementation apparently uses normal soft shadows which, while always nice looking, have a completely static softness. To be honest, I don't know which I prefer.
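
For anyone curious why the contact-hardening options behave like that: the usual technique (NVIDIA's PCSS, for example) scales the penumbra with the blocker-to-receiver distance, so shadows sharpen near the caster and blur farther away. A rough sketch of the relation, with made-up numbers:

```python
# Contact-hardening shadows in a nutshell: the farther the receiving surface is
# from the object casting the shadow, the wider (softer) the penumbra becomes.
# Light size and distances below are arbitrary example values.
def penumbra_width(light_size, blocker_dist, receiver_dist):
    # Standard PCSS similar-triangles estimate.
    return light_size * (receiver_dist - blocker_dist) / blocker_dist

print(penumbra_width(light_size=1.0, blocker_dist=10.0, receiver_dist=10.5))  # 0.05 -> sharp, near contact
print(penumbra_width(light_size=1.0, blocker_dist=10.0, receiver_dist=40.0))  # 3.0  -> wide, soft
```

A fixed-softness option like Rockstar Softest effectively skips that distance term, which would explain why it looks smooth everywhere but never hardens at contact points.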
 
If anyone is looking for a decent breakdown while we wait for the [H] review, the GeForce guide for GTA 5 was posted a couple of days ago.

In my earlier post I mentioned some areas of the map slowing below 30fps with maximum in-game settings on TITAN X SLI. It turns out the main culprit is Grass Quality, as indicated in that article. Dropping from Ultra to Very High raised my minimum framerates above 33fps in those same areas.
 
If anyone is looking for a decent breakdown while we wait for the [H] review the GeForce guide for GTA 5 was posted a couple days ago.

In my earlier post I mentioned some areas of the map slowing below 30fps with maximum in-game settings on TITAN X SLI. It turns out most of the culprit is Grass Quality, as indicated within that article. Dropping from Ultra to Very High increased my minimum framerates above 33fps in those same areas.

Because it's a big read, I made a summary of the framerate results in the other thread.
http://hardforum.com/showpost.php?p=1041562158&postcount=1785
 
I know it's not the [H]ardest of [H]ardware, but I was really hoping to see some 970 benchmarks.

Right now I have a 3570K at 4.4GHz, 8GB of RAM, and a Radeon 7850, built around three years ago. I am saving for a 970 and another 8GB of RAM; what kind of performance could I expect at 1080p?
 
A 7850 will hold you back even at High at 1080p.
I'm at 4.7GHz with a 7870 @ 1250MHz and I don't like the minimum fps.
 
7850 will hold you back even at high +1080P
I'm at 4.7 and 7870 @1250mhz and i don't like the minimum fps

Yeah, I'm already having a not-so-enjoyable time playing Guild Wars 2, which is why I'm saving for a GTX 970 and wanted to see some benchmark numbers for it.
 
I know its not the [H]ardest of [H]ardware, but I was really hoping to see some 970 benchmarks.

Right now I have a 3570k at 4.4Ghz, 8gb RAM, and a Radeon 7850. Built around three years ago, I am saving for a 970 and another 8gb RAM, what kind of performance could I expect at 1080p?
Another review has the 970 and 980 neck and neck, with about a 1 to 2 FPS difference between the two cards.


http://www.forbes.com/sites/jasonev...rked-across-14-nvidia-and-amd-graphics-cards/
 
Forbes is actually one of the few respectable sources of news about video games since most other sites have been hijacked by SJWs.

Forbes is a magazine aimed at the wealthy and is more conservative so they don't give a fuck about pissing off lefties.
 
How are people finding DSR vs MFAA in GTA V?

I am still not sure. I think even slight DSR works better, but I can't decide.
 
Playing pretty smooth and gorgeous at 1440p on my 780s, not to mention heat is not even a factor (stays below 80C).

Love DoF, and would really love an FOV slider, though the way it is is fine. (3440x1440 + proper FOV slider + 980 SLI... drool)

I usually don't do blur, ever, but since you can dial in just a little it works for me. Reality has blur; it's just that most games will not let you do just a little.

Gonna get me some 980s ASAP when the B-stock and refurbs start showing up.

Pretty much everything turned up; blown away at how smooth and great this game looks and plays on PC. (I have the X1 and 360 versions also.)

So glad they held off, though bugs on release are normal and expected now.
 