NVIDIA GPU Generational Performance Part 1 @ [H]

Discussion in 'Video Cards' started by Kyle_Bennett, Jul 25, 2018.

  1. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    52,729
    Joined:
    May 18, 1997
    NVIDIA GPU Generational Performance Part 1

    Ever wonder how much performance you are really getting from a GPU-to-GPU upgrade in games? What if we took GPUs from NVIDIA and AMD and compared the performance gained from 2013 to 2018? We are going to start that process today in Part 1, focusing on the GeForce GTX 780, GeForce GTX 980, and GeForce GTX 1080 in 14 games.

    If you like our content, please support HardOCP on Patreon.
     
    MATRIXSHARK, dvsman, c3k and 19 others like this.
  2. lucidrenegade

    lucidrenegade Limp Gawd

    Messages:
    321
    Joined:
    Nov 3, 2011
    Nice performance jump from the 980 to 1080, compared to the 780 to 980.
     
    DrezKill, AlphaQup and Armenius like this.
  3. MavericK

    MavericK Zero Cool

    Messages:
    30,910
    Joined:
    Sep 2, 2004
    Could be other factors at work, but I would guess that much of that is due to the 1080 having double the VRAM of the 980, versus the 980 only having 1 GB more than the 780.
     
  4. auntjemima

    auntjemima Hand Jobs Legend

    Messages:
    4,050
    Joined:
    Mar 1, 2014
    Not sure if it's just me, but I don't see any images, such as the charts.
     


    Last edited by a moderator: Jul 25, 2018
  5. scojer

    scojer 2[H]4U

    Messages:
    3,511
    Joined:
    Jun 13, 2009
    You have a voicemail.

    The charts work on desktop; I haven't tried mobile myself. Maybe hold the phone sideways? That shows more information when you're on the forum homepage.
     
  6. Comixbooks

    Comixbooks Ignore Me

    Messages:
    11,103
    Joined:
    Jun 7, 2008
    That was a good read; to think the GTX 780 met its maker in Kingdom Come. Buying the newest and best card isn't a fool's game after all.
     
  7. Master_shake_

    Master_shake_ Little Bitch

    Messages:
    6,793
    Joined:
    Apr 9, 2012
    damn sam that 1080 is just spanking those 28nm cards.
     
    Sulphademus and DrezKill like this.
  8. auntjemima

    auntjemima Hand Jobs Legend

    Messages:
    4,050
    Joined:
    Mar 1, 2014
    I never check voicemail lol..

    I tried sideways and requested the desktop page. No good.

    I'll just read it when I'm home.
     
  9. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    14,316
    Joined:
    Jan 28, 2014
    But I heard the GTX 1080 was just a revised version of Maxwell (Paxwell) and that NVIDIA only gives us 30% improvements.
     
    Bankie, renz496, Gideon and 3 others like this.
  10. DooLocsta

    DooLocsta [H]ard|Gawd

    Messages:
    1,371
    Joined:
    Jan 26, 2005
    This was a good read, thanks for doing all the work required. I am looking forward to seeing how bad my 980Ti will be destroyed :unsure:
     
    Red Falcon, AlphaQup and Armenius like this.
  11. DigitalGriffin

    DigitalGriffin [H]ardness Supreme

    Messages:
    4,513
    Joined:
    Oct 14, 2004
    While I really enjoyed this article, there's some data left out I would have liked to see.

    It would have been nice to see the stock clocks scaled so it was an apples-to-apples comparison for IPC purposes, based on CU performance. (I know it's not entirely apples to apples because of memory advances.)

    I think we will be stuck at around 2 GHz for the immediate future, so if there aren't IPC gains, it could be a good indicator of where things are headed design-wise.
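    Something along the lines of the rough sketch below is what I mean. To be clear, the boost clocks and CUDA core counts are just the published reference specs, and the fps values are placeholders you'd fill in from the review's charts; it's only an illustration of the normalization, not real data.

    Code:
    # Rough per-clock, per-core throughput comparison (illustrative only).
    # Boost clocks (MHz) and CUDA core counts are the published reference specs;
    # the fps values are placeholders to fill in from the review's charts.
    cards = {
        "GTX 780":  {"boost_mhz": 900,  "cuda_cores": 2304, "fps": None},
        "GTX 980":  {"boost_mhz": 1216, "cuda_cores": 2048, "fps": None},
        "GTX 1080": {"boost_mhz": 1733, "cuda_cores": 2560, "fps": None},
    }

    for name, c in cards.items():
        if c["fps"] is None:
            continue  # skip until real benchmark numbers are plugged in
        # Normalize fps by clock (GHz) and core count for a crude "IPC-like" figure.
        per_ghz_per_core = c["fps"] / (c["boost_mhz"] / 1000.0) / c["cuda_cores"]
        print(f"{name}: {per_ghz_per_core * 1000:.2f} fps per GHz per 1000 cores")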
     
  12. Parja

    Parja [H]ardForum Junkie

    Messages:
    12,589
    Joined:
    Oct 4, 2002
    Who cares? If the generational improvement is a 40% increase in clock speed with a 10% decrease in IPC, that's still a 26% increase in performance.
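    Quick sanity check on that arithmetic, since the two factors multiply rather than add:

    Code:
    # +40% clock with -10% IPC still nets roughly +26% performance,
    # because the two scaling factors multiply.
    clock_gain = 1.40   # +40% clock speed
    ipc_change = 0.90   # -10% IPC
    print(f"net change: {clock_gain * ipc_change - 1:+.0%}")  # net change: +26%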
     
    Red Falcon, Fleat, Kwaz and 3 others like this.
  13. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    14,316
    Joined:
    Jan 28, 2014
    No GPU out there runs at 2 GHz out of the box right now, so I don't know what you mean about being "stuck" with 2 GHz. I also don't understand the obsession with figuring out IPC on a GPU when it isn't remotely the most important metric of its performance.
     
    Nima84 likes this.
  14. DigitalGriffin

    DigitalGriffin [H]ardness Supreme

    Messages:
    4,513
    Joined:
    Oct 14, 2004
    Quite simply this:

    Look where Intel has been stuck from the i-2XXX to the i-6XXX generations. They hit a MHz ceiling. I think the same is coming for graphics cards.

    Adding more CUDA cores has its limits as a source of improvement.
     
    Last edited: Jul 25, 2018
    thenapalm likes this.
  15. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    19,044
    Joined:
    Apr 15, 2005
    Holy shit...makes me really want to upgrade to a 1080Ti, now!

    Great review, [H]!
     
    neural0, DrezKill and Armenius like this.
  16. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,738
    Joined:
    Apr 17, 2000
    Here's a car analogy. You are a car reviewer, and you get 5 different cars to test for off-road performance. Sure, you can put your own custom off-road tires on each car. However, by doing this you are modifying the default stock performance to your will. The only way to keep things fair is to test everything at default stock, as the manufacturer intended its performance profile to be.

    I get what you are saying about IPC, but we will never, ever, be able to create an apples-to-apples situation, because at the end of the day they all have different CUDA core counts, ROPs, texture units, and other dissimilar specs that keep us from achieving that, even if we matched clock speeds. We still wouldn't really be testing anything useful.

    The only way to keep things fair is to evaluate the default performance, as the manufacturer intended at launch. That's the performance that really matters because that's the performance you'll get in-game when you buy the card.
     
  17. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,738
    Joined:
    Apr 17, 2000
    I think a lot more is at work, because at the lower game settings the VRAM was not bottlenecked and the 1080 was still that much faster.
     
  18. Rockenrooster

    Rockenrooster n00bie

    Messages:
    54
    Joined:
    Apr 11, 2017
    dang. Kingdom Come is a beast
     
  19. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,738
    Joined:
    Apr 17, 2000
    and consider there is also a high resolution texture pack for the game

    definitely would not have worked well on the 780 or 980, or at all, heh
     
    neural0, Armenius and DrezKill like this.
  20. gan7114

    gan7114 [H]Lite

    Messages:
    123
    Joined:
    Dec 14, 2012
    Nice review. The only thing I'd have liked to see in addition would have been the 70 series, but that's only because I'm using a 970, so bias. :) I get why the focus is on 80 and ti.

    I somewhat agree with the question posed near the end of conclusion 3, that except for a handful of games the 1080 hasn't really been taxed much by anything -- at 1440p. Of course, 4K is becoming the new hotness, so it might behoove [H] to conduct the performance review of the 1080 Ti at both 1440p and 4K, since it's just going to stomp all over 1440p. The 4K results will give a better picture of where we may be heading with the 1180 and 1180 Ti.
     
    Sulphademus likes this.
  21. Zarathustra[H]

    Zarathustra[H] Pick your own.....you deserve it.

    Messages:
    24,864
    Joined:
    Oct 29, 2000
    Kyle_Bennett If you plan on doing one on Titans, I still have my old original 2013 Kepler Titan humming along in my stepson's rig that I could swap out for something else.
     
  22. CombatChrisNC

    CombatChrisNC Gawd

    Messages:
    958
    Joined:
    Apr 3, 2013
    Man... 980 to 1080 was an i7 920 to i7 2700K kind of bump. 780 to 980 is more like what we've seen from Intel since then.

    Taking bets: 1080 to 1180 will be (as a %) a gain closer to 780 -> 980 than to 980 -> 1080. Call it a hunch that I don't see lightning striking twice, back to back.
     
  23. Zarathustra[H]

    Zarathustra[H] Pick your own.....you deserve it.

    Messages:
    24,864
    Joined:
    Oct 29, 2000
    Neat article.

    While I knew this would be the case intellectually, it is still rather stark to see the once-mighty 780 get less than 20 FPS in many modern titles.
     
  24. Maddness

    Maddness Gawd

    Messages:
    804
    Joined:
    Oct 24, 2014
    Wow, this just goes to show how Nvidia hit it out of the park with Pascal. I'm interested to see what the results would be on the AMD side.
     
    Sulphademus, Armenius and DrezKill like this.
  25. CuriousGeorge

    CuriousGeorge n00bie

    Messages:
    35
    Joined:
    Apr 8, 2012
    The larger jump from the 980 to the 1080 is largely attributable to the huge process node jump (780 = 980 = 28nm, 1080 = 16nm), combined with the fact that the 780 is a cut-down 780 Ti (big chip) and not the fully enabled mid-sized chip like the 980 and 1080 (which the 770 would correspond to). The fact that the 980 is significantly faster than the 780 at all, while still being 28nm, is partly due to the clock speed advantage of refined 28nm, but most significantly due to the huge efficiency jump of the Maxwell microarchitecture. That efficiency jump is the main reason AMD is behind to the degree it is - Maxwell really was a kind of "quantum leap", similar to Conroe or Sandy Bridge in desktop CPUs.
     
    Nima84 and Armenius like this.
  26. xrror

    xrror n00bie

    Messages:
    16
    Joined:
    Oct 21, 2008
    In regard to the jump from Maxwell to Pascal:

    It used to be a bit of a joke with Maxwell: just add +300 MHz to the core and you would likely be able to stabilize that clock. With Pascal nVidia really pushed their TDP management to the next level, so you basically get that "extra" margin now on a "stock" card with the boost modes.

    Just 2 cents in there.

    That said, in my experience even with a +250 MHz core clock boost, the 980 Ti I had could "only" equal my 1070 (non-Ti). The 1070 would bench higher, but the 980 Ti would keep a higher minimum framerate. I just give these as comparison points; I also could have been CPU-bound.

    Just some thoughts on Pascal vs Maxwell.
     
  27. Kranium

    Kranium Limp Gawd

    Messages:
    429
    Joined:
    May 27, 2011
    I refer to my purchase of a GTX 780 as the worst PC component purchase I've ever made. As you point out, it was eclipsed in short order by games as well as by NVIDIA, it seems to have had poor driver optimization support, and it cost more than the next two generations' cards.
     
    Brackle likes this.
  28. _l_

    _l_ [H]ard|Gawd

    Messages:
    1,104
    Joined:
    Nov 27, 2016
    enjoyed the Part 1 ... thanks.

    Also noticed this:

    "as we anxiously await what NVIDIA has next up its sleeve for 2018"

    hmmmm ...
     
    Last edited: Jul 25, 2018
  29. IKV1476

    IKV1476 Lurker

    Messages:
    200
    Joined:
    Dec 26, 2005
    Wonderful article. And more companion articles coming soon. Can't wait for more.
     
    Armenius likes this.
  30. DrezKill

    DrezKill Limp Gawd

    Messages:
    370
    Joined:
    Mar 11, 2007
    Well gawd daayaaaaaamn, I never would have expected to see so much bigger a jump in performance going from Maxwell v2 to Pascal than from Kepler to Maxwell v2. Geeeezus. The drop to the smaller node and the refinements and improvements made to the Maxwell architecture that turned it into Pascal really counted for a whole helluva lot, didn't they? As always, thanks to tha [H] for the work it took to present us with this info. Looking forward to reading the next parts.
     
  31. GoodBoy

    GoodBoy Gawd

    Messages:
    956
    Joined:
    Nov 29, 2004
    Love these kinds of articles, can't wait for the all in one comparison when you get done.
     
    Armenius likes this.
  32. stephen2002

    stephen2002 [H]Lite

    Messages:
    125
    Joined:
    Dec 23, 2005
    Cool comparison! Here's hoping the 1180 adds another 50%. I went from a 760 to a 1080, and at 4K plenty of games push the 1080 to the limit.
     
  33. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,227
    Joined:
    Oct 13, 2016
    Thanks Brent & Kyle for all your hard work on this. I enjoyed it, as I've lived it. I also really liked how you used 1440p as the standard resolution. For the time being, 4K isn't a realistic target for most. I'm pretty sure Part 2 will show some interesting results for 1440p. Got a giggle out of you using KCD for benching; this game makes me grumpy with the tricks I have to do to get to a playable 4K setting. Totally reminds me of Witcher 2 Uber at 1080p back in the day. So many settings, such a PIA to get playable. Nowadays just about any x80 will do the trick.

    The rigs in my profile have had many GPUs, most of which are in this article. Went from a 780 > G1 970 SLI w/ 780 PhysX > G1 1080 SLI. The 780 I had was an EVGA SC 780. Such a beast for the time, and it had replaced a pair of PNY 560 Tis in SLI. Loved it. Only regret was not remembering about the Tis coming down the road. Was really happy with my G1 970s in SLI too; thanks to the SLI support of the time, they had no trouble trouncing a single 980 and even kept up with a 980 Ti for most things in 2016. Never had a 980, but I do have an MSI Titan with OC'd 980M SLI. Sad part is they're roughly the same performance as the desktop 970s, with 8GB of VRAM. Ironically the VRAM size is pretty much useless at 1080p, which is pretty much the limit of what the clocks can deal with in ultra for most games. The G1 1080 SLI I have now still outpaces most things in 4K compared to my single Strix 1080 Ti, but VRAM, as you noted with the 780, has continued to be an issue. MEA and ROTR both will hit the ceiling for them with AA-type settings maxed, while the 1080 Ti has 'just enough' to get by but lacks the raw processing power to hold 4K/60Hz.

    From the 780s to 980s I used to say that if they'd just make it so you could upgrade the VRAM, then most of those gens would last years more. Obviously that would defeat any point of anyone upgrading to a new card. No surprise either that SLI support fell off a cliff after the 970s, since so many were happy with even the 3.5GB ceiling for most things. The performance/price ratio was amazing then.

    I really want to jump ship to AMD in the next round, but honestly whoever makes something a bit better than a 1080 Ti will get my money for what is probably the last card I put in my 4930K rig. For the next round I just want a single 4K/60Hz, above-11GB solution. Assuming that mobo survives, I'll probably upgrade to PCIe NVMe drives at that point since I'll have the slots/lanes to spare. The only thing that will hold me back is $$$. If the rumors are true about it being ~$1200-1500, then I'll stop here. It's just not worth that much to me anymore. A thousand is my limit.
     
    IdiotInCharge and Brent_Justice like this.
  34. Trepidati0n

    Trepidati0n [H]ardForum Junkie

    Messages:
    12,144
    Joined:
    Oct 26, 2004
    I think this also explains why nVidia doesn't need to push the next-gen card out so fast... they overleaped the step from the 980 and gave a little "too much". If the 1180 has as big a delta from the 1080 as the 1080 has from the 980... I would probably buy now and realize this is as good as it gets for a while.
     
  35. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,227
    Joined:
    Oct 13, 2016
    Do some searching here at [H]ard and you'll find Kyle has given great insight on how to make this happen. My air-cooled rigs, 1080 Ti and 1080 SLI, are both running at over 2 GHz stable at 50-60C. It's really not that difficult: give them the right amount of power, keep 'em under 60C, and it's honestly not usually a problem. Even the 1080 SLI will hold 2012 MHz upwards of 65C. I only run less when I want the room as close to silence as possible.
     
    Kyle_Bennett likes this.
  36. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,227
    Joined:
    Oct 13, 2016
    To be even more accurate about the 2 GHz limit, we could throw in liquid. I've seen some 'order-able' solutions, non-LN2, that can keep things around 40-50C, but even then the ceiling for most is ~2.1-2.2 GHz. Current Pascal just can't seem to go past 2.2 GHz stable, but 2.0 is pretty easy.

    edit: Again I have to say that Kyle/Brent have done extensive reviews on these solutions as well. The 2.0 GHz limit is really just a generalization as to what many cards will do with minor tweaking 'out of the box'.
     
    IdiotInCharge and Kyle_Bennett like this.
  37. Kor

    Kor 2[H]4U

    Messages:
    2,082
    Joined:
    Mar 31, 2010
    Someone's prepping for Turing comparisons :D

    Here's hoping for at least a 980-to-1080-sized jump.
     
    Maddness and Kyle_Bennett like this.
  38. Kor

    Kor 2[H]4U

    Messages:
    2,082
    Joined:
    Mar 31, 2010
    Checks own signature... damn, I must be lying.
     
    lostin3d and Kyle_Bennett like this.
  39. Supercharged_Z06

    Supercharged_Z06 [H]ard|Gawd

    Messages:
    2,034
    Joined:
    Nov 13, 2006
    Heh. My GPU has been running at just over 2 GHz since August 2016.
     
    lostin3d and Kyle_Bennett like this.
  40. A Little Teapot

    A Little Teapot [H]Lite

    Messages:
    85
    Joined:
    Dec 9, 2016
    Very cool idea, but the information isn't going to be useful to most people without mid-tier cards being used as well. A 970/1070 tier comparison would likely engage a lot more readers.
     
    deruberhanyok and amenx like this.