PCIE 3.0 vs 2.0 GTX Titan

BatJoe
I was curious to see whether enabling PCIe 3.0 on my Titan would have any benefits. I didn't really expect to see any, but was surprised. Full specs in sig.

Futuremark "Fire Strike":

PCIE 3.0

Test 1: 53.8 FPS
Test 2: 43.1 FPS

PCIE 2.0

Test 1: 53.1 FPS
Test 2: 42.7 FPS

Not very large gains, so I tried 3DMark11, which had more interesting results.

Futuremark 3DMark11

PCIE 3.0

Test 1: 73.4 FPS
Test 2: 71.4 FPS
Test 3: 92.5 FPS
Test 4: 42.8 FPS

PCIE 2.0

Test 1: 70.8 FPS
Test 2: 68.7 FPS
Test 3: 88.9 FPS
Test 4: 41.3 FPS

I would say the major benefits of PCIe 3.0 will be for multi-GPU setups, but it clearly still has some benefit for single GPUs.
 
Here are some results from the Just Cause 2 "Concrete Jungle" benchmark:

PCIE 3.0

Average FPS: 70.07

PCIE 2.0

Average FPS: 67.92
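
Putting numbers on those runs: the deltas above work out to gains of roughly 1 to 4 percent. A quick Python sketch using the FPS figures posted above (my own illustration, not part of the original runs):

```python
# Per-test FPS pairs from the runs posted above: (PCIe 3.0, PCIe 2.0).
results = {
    "Fire Strike Test 1": (53.8, 53.1),
    "Fire Strike Test 2": (43.1, 42.7),
    "3DMark11 Test 1": (73.4, 70.8),
    "3DMark11 Test 2": (71.4, 68.7),
    "3DMark11 Test 3": (92.5, 88.9),
    "3DMark11 Test 4": (42.8, 41.3),
    "JC2 Concrete Jungle": (70.07, 67.92),
}

for test, (gen3, gen2) in results.items():
    gain = (gen3 - gen2) / gen2 * 100  # percent gain of 3.0 over 2.0
    print(f"{test}: +{gain:.1f}%")
```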
 
PCIe 3.0 more or less came about with the need for faster SSDs and multiple GPUs. We haven't really seen much of a difference in the single-GPU arena going from PCIe 1.0 x16 to PCIe 2.0/3.0. Still interesting to see, but well within the margin of error. It'll be a long time, possibly a decade, before PCIe 2.0 gets fully milked by video cards alone.
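
For a sense of the raw headroom involved: per lane, PCIe 2.0 signals at 5 GT/s with 8b/10b encoding, while 3.0 signals at 8 GT/s with the much leaner 128b/130b encoding. A back-of-the-envelope sketch of usable x16 bandwidth:

```python
# Theoretical per-direction bandwidth of a PCIe x16 slot.
# (signaling rate in GT/s, encoding efficiency) per generation.
GENS = {
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding: 20% overhead
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding: ~1.5% overhead
}
LANES = 16

for gen, (gt_per_s, efficiency) in GENS.items():
    # Each transfer moves one bit per lane; divide by 8 for bytes.
    gb_per_s = gt_per_s * efficiency * LANES / 8
    print(f"{gen} x{LANES}: {gb_per_s:.2f} GB/s per direction")
```

So 3.0 roughly doubles the pipe (about 8 GB/s vs. about 15.75 GB/s per direction), which only shows up in FPS when the bus, rather than the GPU itself, is the bottleneck.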
 
Not that much of a difference, then, when comparing the two from what I can see.

But the fact appears to be that there is now a difference, where there wasn't before. Okay, it's not much in this generation, but what about the next generation? And the generation after that?

I wonder if other [H]ers can show similar results?
 
Vega did some testing on this a while ago. Basically, the results showed that the performance gains between PCIe 2.0 and 3.0 at low resolutions (1080p) with a single card were minor at most. When gaming at high/extreme resolutions with two or more cards, the gains were pretty nice (around 20 FPS on average, if I remember).
 
Without knowing what your inter-test variance is and controlling for it, this isn't really a useful comparison. Those differences may not be reliable (i.e., they may be noise).
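
To illustrate: before trusting a sub-1 FPS delta, characterize the run-to-run spread. A minimal sketch with made-up FPS numbers (not the OP's data):

```python
import statistics

# Hypothetical FPS from five repeated runs at each setting;
# substitute real measurements.
gen3_runs = [73.4, 72.9, 73.8, 73.1, 73.5]
gen2_runs = [70.8, 71.5, 70.2, 71.1, 70.6]

for label, runs in (("PCIe 3.0", gen3_runs), ("PCIe 2.0", gen2_runs)):
    print(f"{label}: mean {statistics.mean(runs):.1f} FPS, "
          f"stdev {statistics.stdev(runs):.2f} FPS")

delta = statistics.mean(gen3_runs) - statistics.mean(gen2_runs)
print(f"delta: {delta:.2f} FPS")
# Rule of thumb: a delta well inside the run-to-run stdev is probably noise.
```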
 
If you really want to get down to the nitty gritty, a lot of the people here are enthusiasts (gee, really) and I know that they (myself included) would go through a lot of trouble to make PCI-E 3.0 work (or purchase a board that supported it) if there was even a hint of a performance increase.

Some would go so far as to do so just to see a flag flip to PCI-E 3.0 in GPU-Z, and then they could sleep at night.
 

Well, if you're going to do it, you should at least do it based on results from multiple runs of the same set of tests, averaged, so you know what the variance is and how reliable the results from a single pass are. When the results are as extreme as the deltas in his quad-SLI test setup, nothing more than a single pass and a second sanity check is really necessary. When you're talking about a delta of one FPS or so, you really should rerun the tests multiple times to ensure it's a real result and not just normal test variation producing the difference.

But if just seeing "PCIe 3.0" in gpu-z makes you happy, go for it. Nothing like faith-based performance tweaks. :)
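
For anyone who wants to go beyond eyeballing, something like a Welch's t-test on the repeated runs would settle it (hypothetical run data again; assumes SciPy is installed):

```python
from scipy import stats

# Hypothetical FPS from repeated runs at each setting.
gen3_runs = [53.8, 53.6, 54.0, 53.7, 53.9]
gen2_runs = [53.1, 53.4, 52.9, 53.2, 53.0]

# Welch's t-test: is the mean difference larger than run-to-run noise?
t_stat, p_value = stats.ttest_ind(gen3_runs, gen2_runs, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (say < 0.05) suggests the delta is real, not noise.
# A single pass per setting can't support that conclusion either way.
```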
 
Well, if you're going to do it, you should at least do it based on results from multiple runs of the same set of tests

I should have noted that the results I posted were from second runs. Beyond that, I didn't do anything special.

Some would go so far as to do so just to see a flag flip to PCI-E 3.0 in GPU-Z, and then they could sleep at night.

[image: spinaltap-11.jpg]


:D
 
Yep, I noticed this with an OC'd GTX 660. It gained 1 FPS in tests 3 and 4 of 3DMark11 on PCIe 3.0 vs. 2.0, but only fractions of a frame in the first two tests.
 
Likewise, the results mostly show, even in single-GPU situations, that PCIe 3.0 does give you gains. How much depends on the application.

Actually, that's not what it shows. If you read the entire article, pretty much every case of gains (and they were very small) was attributed to the improved Ivy Bridge IPC, not to PCIe 3.0.

And even with the improved IPC and PCIe 3.0, the actual improvements -- and this is explicitly stated in the article -- aren't noticeable in games.
 
Seems like the only difference is that when the game isn't taxing the GPU that much, PCIe 3.0 lets it stretch its legs a bit further.
 
My hypothesis is that you won't see a noticeable performance increase unless transfer time across the bus becomes a limiting factor. Since the GPU is perfectly capable of overlapping compute and transfer, I wouldn't expect to see much of an advantage from PCIe 3.0 until more powerful cards come out.
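
As a rough illustration of that hypothesis (the per-frame transfer size below is a made-up figure, purely for the arithmetic):

```python
# How much of a 60 FPS frame budget could bus transfers eat?
bytes_per_frame = 50e6   # assumed 50 MB/frame of uploads -- hypothetical
frame_budget_s = 1 / 60  # ~16.7 ms per frame at 60 FPS

for gen, gb_per_s in (("PCIe 2.0 x16", 8.0), ("PCIe 3.0 x16", 15.75)):
    transfer_s = bytes_per_frame / (gb_per_s * 1e9)
    share = transfer_s / frame_budget_s * 100
    print(f"{gen}: {transfer_s * 1e3:.2f} ms/frame ({share:.0f}% of budget)")
# Only when this share is big, and can't be hidden by overlapping copies
# with compute, would the wider 3.0 link translate into FPS.
```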
 