Grand Theft Auto V Video Card Performance Preview @ [H]

FrgMstr · Just Plain Mean · Staff member · Joined May 18, 1997 · Messages: 55,626
Grand Theft Auto V Video Card Performance Preview - Grand Theft Auto V has finally been released on the PC. In this preview we compare video card performance with maximized graphics settings at 1440p and 4K. We will briefly test AMD CHS and NVIDIA PCSS soft shadows and talk about them. We will even see if SLI and CrossFire work.
 
Titan X ftw you mean. Looking at the frame rate charts, the Titan X has the least variance in frame rate; you get a more consistent frame rate than with the 980 SLI setup.


Kyle: Any plans to add 3440x1440 results in future reviews? I think it's a growing segment in gaming, worth taking a look at imo.


Edit: What was the average boost clock for the Titan X?
 
I'm glad you guys mentioned the texture quality and foliage not looking all that great for a game with such steep requirements...looks like my GTX 970 will be fine at 1920x1200...looking forward to the full performance review, you guys always do a great job going in-depth with the graphics settings/comparisons
 
Titan X ftw you mean. Looking at the frame rate charts, the Titan X has the least variance in frame rate; you get a more consistent frame rate than with the 980 SLI setup.


Kyle: Any plans to add 3440x1440 results in future reviews? I think it's a growing segment in gaming, worth taking a look at imo.


Edit: What was the average boost clock for the Titan X?

Nope. I game at 1440p on a ROG Swift, highest min, max, and avg. 980 SLI FTW!!!
 
Nope. I game at 1440p on a ROG Swift, highest min, max, and avg. 980 SLI FTW!!!



Eh, those massive dips in frame rate with the SLI and CrossFire rigs take away from a smooth gaming experience. Several times the frame rate drops 40-50 fps; a drop that big will be noticeable. SLI is certainly better than CrossFire when it comes to this.
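The point about dips is easy to show with numbers: two setups can share the same average fps while one feels far rougher. A minimal sketch, using made-up per-second frame-rate samples (none of these numbers come from the review):

```python
import statistics

# Hypothetical per-second frame-rate samples (fps) for two setups with
# the same average but very different consistency.
single_card = [52, 55, 53, 54, 56, 55, 53, 54]
multi_gpu = [75, 30, 78, 28, 76, 32, 77, 36]

for name, fps in [("single card", single_card), ("multi-GPU", multi_gpu)]:
    print(f"{name}: avg {statistics.mean(fps):.0f} fps, "
          f"min {min(fps)} fps, stdev {statistics.stdev(fps):.1f} fps")
```

Both lists average 54 fps, but the multi-GPU samples swing between the high 70s and low 30s, which is exactly the kind of 40-50 fps dip you would notice in play.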
 
Looking at the graphs... Was the game CPU bottlenecked?

Because with such poor scaling, neither multi-card setup's performance looks acceptable; I would call them "functional" at best.

Any plans to benchmark dual 8GB 290X cards?
 
I've been surprised by how well this game runs. I guess Rockstar wasn't lying when they said they wanted to take some extra time polishing and optimizing the PC version.
 
Interesting. I found that at 1440p, 4GB of VRAM wasn't quite enough on my 980. Didn't play for very long though.
 
Eh, those massive dips in frame rate with the SLI and CrossFire rigs take away from a smooth gaming experience. Several times the frame rate drops 40-50 fps; a drop that big will be noticeable. SLI is certainly better than CrossFire when it comes to this.

I agree. After having both SLI and then CrossFire setups, I'm not going to consider either for my next build. The lack of frame rate stability is really distracting, and in some games it's unbearable. Maybe Adaptive V-Sync would help with that, I'm not sure.
 
I don't understand why HardOCP's SLI test rig didn't have very good scaling. I have a similar setup with a 3770K at 4.5 GHz and two original Titan cards. I have everything set to max except MSAA at 4x, and I have V-Sync on. I play at 2560x1600 and, for the most part, stay pegged in the low 50s to 60 fps range, unless of course I am in a chase and there is a lot being rendered at once. Whether I keep my cards at stock or overclock with Afterburner, the in-game OSD shows SLI scaling on both cards always above 95%.
 
I don't understand why HardOCP's SLI test rig didn't have very good scaling.

What your tool says about your SLI performance and the way we test multi-GPU scaling are a bit different. We compare single-card performance to SLI/CrossFire performance. That usage meter you are referring to is basically worthless for finding true scaling percentages.
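The comparison described above boils down to one ratio. A minimal sketch, with hypothetical frame rates (not figures from the preview):

```python
# Hypothetical average frame rates (fps) from a single-card run and an
# SLI run of the same benchmark; the numbers are made up for illustration.
single_gpu_avg = 28.0   # one card
sli_avg = 34.0          # two cards in SLI

# True scaling: the speedup the second card actually buys you, as a
# percentage over the single-card baseline (100% would be perfect scaling).
scaling_pct = (sli_avg / single_gpu_avg - 1) * 100
print(f"SLI scaling: {scaling_pct:.0f}%")
```

With these sample numbers the second card adds only about 21%, even though a per-GPU usage meter could happily report both cards "above 95%" the whole time.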
 
What was GPU usage maxing out at on the cards?

On my SLI EVGA 970 SSCs, my GPUs don't max out: 50 to 80% at most.

Whereas with a single GPU, it's pegged.
Also, my 4690K stays pegged 95% of the time.
 
I REALLY wanted to see some 8GB 290Xs in CrossFire :( This is the game that would determine whether the 8GB is useful on that card...
 
What was GPU usage maxing out at on the cards?

On my SLI EVGA 970 SSCs, my GPUs don't max out: 50 to 80% at most.

Whereas with a single GPU, it's pegged.
Also, my 4690K stays pegged 95% of the time.

Question: is this in online or single player? I have heard a lot of people talking about high CPU use. I'm running at 1080p on a single 970 on the rig in my sig, and I am not seeing that kind of CPU impact.

Perhaps the CPUs are actually struggling to maintain throughput now that the I/O controller is on-board?
 
I'm glad you guys mentioned the texture quality and foliage not looking all that great for a game with such steep requirements...looks like my GTX 970 will be fine at 1920x1200...looking forward to the full performance review, you guys always do a great job going in-depth with the graphics settings/comparisons

Which is why you always wait on the [H] review/preview (on blockbusters such as this, of course); then I'll read the other sites.:cool:
 
Soft Shadow image quality comparison screenshots have been added to the conclusion page. None of those screenshots are zoomed in.
 
Yikes, judging from that image comparison, PCSS is by far the worst and also the most expensive performance-wise. Don't see any reason to use it currently.
 
On my sig rig, I thought I was CPU bottlenecking, seeing 90-95% GPU usage and not max frames.

Though I have been monitoring CPU usage on a second screen (using RealTemp) and seeing all 4 physical cores at around 70% at all times, so I'm not sure why it's not pegging 99% GPU usage if it isn't frame rate capped.
 
Brent Justice said:
This game is going to need the latest architectures to achieve the best performance, and an enjoyable one at that. It is also going to need high-end GPUs to allow the gameplay experience to be fun.
Well that's just not true. I have the game running just fine on GTX 780 SLI, and the framerate almost never hitches at 1440p, despite being "gimped" by 3gb. My husband has it running on a single 7970 with a few of the settings turned down, and the gameplay experience in both cases is absolutely fun.

I'm sure the game benefits from newer generation hardware, but it seems disingenuous to suggest that the latest tech is required for a good gameplay experience, and lazy considering how [H] typically does gameplay reviews on a wide range of hardware.

Don't get me wrong, I'm itching for a reason to upgrade my GPU setup, but this game isn't it.
 
Yikes, judging from that image comparison, PCSS is by far the worst and also the most expensive performance-wise. Don't see any reason to use it currently.

IMO it looks the best in terms of what shadows cast by leaves would look like in sunlight. They won't be perfectly defined like AMD's solution.
 
IMO it looks the best in terms of what shadows cast by leaves would look like in sunlight. They won't be perfectly defined like AMD's solution.

I live 10 feet from a forest and I have never seen a leaf's shadow look like the NVIDIA image. They need to work on that a bit!
 
"GTA V uses on an overhauled game engine called RAGE, which stands for Rockstar Advanced Game Engine used in the previous version of the game."

The word "on" in that sentence should be removed.
 
I need to know: buy a $200 290X now, or wait? What do you guys think?
 
Something is really wrong with [H]'s setup. 34 FPS average framerate at 4K with SLI 980s? I'm calling bullshit on that one.

I've got over 35 hours logged in the game so far (already beat all 70+ story mode missions) on my SLI 980 setup and the game definitely stays in the 40-60 FPS range most of the time. It would be pretty obvious to me if it spent most of its time chugging along at console peasant framerates or if it constantly had huge jumps between 20 and 60 FPS. Yes, I have been running the game on max settings, except for motion blur/depth of field/post-processing garbage I have disabled as always, and no AA enabled.

This is actually one of the best performing recent games at 4K. Watch Dogs is a similar game but it runs like shit on my SLI 980s. Also played Dying Light which ran... okay at 4K on SLI 980s, but not as good as GTA V. Evolve is the other AAA 2015 release I've played on this system, and it runs at a decent framerate but does suffer stutter lag spikes from running out of VRAM.

It is important to note that VRAM doesn't appear to be as serious a limitation here as the review implies, because I often encounter noticeable, sharp stutters when my 980's VRAM is being fully utilized at 4K; curiously enough, not in this game. GTA V has impressive texture streaming, so even when you are hitting your VRAM limit you won't see huge stutters like you do in certain other games *cough* Assassin's Creed Unity. It is the first post-2013 AAA game I've played at 4K on my 980s where I felt the VRAM wasn't introducing a bottleneck, because the telltale drastic stuttering just hasn't been happening. I suspect Rockstar tested this game pretty well at 4K, and that they used 4GB GPUs to do it, so the game was built around running at 4K with 4GB of VRAM to work with.
 
except for motion blur/depth of field/post-processing garbage I have disabled as always, and no AA enabled.

They have some of it enabled; why are you surprised they got a slightly worse framerate?
 
GTA isn't my cup of tea but I enjoyed the review all the same!

I did spot one error: on page three you write:

Even at 1440p the TITAN X exceeded 4GB slightly.

According to the table on that page, the game uses 4015 MB at 1440p, which is slightly less than 4 GB (4096 MB).
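The arithmetic behind that correction is simple once you remember VRAM tools report binary megabytes. A quick sketch using only the figures already stated above:

```python
# VRAM is reported in binary units: 1 GB = 1024 MB.
four_gb_in_mb = 4 * 1024      # 4096 MB
reported_usage = 4015         # MB at 1440p, per the table in the review

headroom = four_gb_in_mb - reported_usage
print(f"{headroom} MB under the 4 GB mark")  # 81 MB under, not over
```

So at 1440p the TITAN X came within roughly 81 MB of the 4 GB line without actually exceeding it.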
 
So much for being VRAM heavy :D (according to some people) and no wonder the gtx 970 handles it well if you don't push the AA too far.
 
Something is really wrong with [H]'s setup. 34 FPS average framerate at 4K with SLI 980s? I'm calling bullshit on that one.

You are calling "bullshit on that one" but are using different/lesser settings than us. Uh, OK.
 