Last Gen Games - Max IQ and Perf on Today's GPUs @ [H]

Discussion in 'Video Cards' started by Kyle_Bennett, May 24, 2018.

  1. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    52,981
    Joined:
    May 18, 1997
    Last Gen Games - Max IQ and Perf on Today's GPUs

    Have you ever wanted to pull out a game from years past and play it on your new video card, to see if the gameplay experience improves with all the highest graphics options turned on, or at a higher resolution like 4K? We sure did! We test some older games to see if we can "max out" the gameplay experience in 2018.
     
    Fleat, DF-1, lostin3d and 6 others like this.
  2. pcgeekesq

    pcgeekesq [H]ard|Gawd

    Messages:
    1,135
    Joined:
    Apr 23, 2012
    I was glad to see this article; it's more relevant to me (and a lot of other people, I suspect) than the usual latest-games benchmarking.
    Good to see that when the next generation of cards comes out, my wife can look forward to a better 4K Witcher experience -- if she hasn't finished the final expansion by then.
     
  3. Xyvotha

    Xyvotha Limp Gawd

    Messages:
    348
    Joined:
    Aug 22, 2005
    Crysis not tested, unsubbed.
     
    maclem8223, Armenius, Aioeyu and 12 others like this.
  4. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    6,427
    Joined:
    Jun 13, 2003
    I love going back and playing older games with newer hardware -- nice to see some numbers put to it!
     
    Manny Calavera and DrezKill like this.
  5. Teenyman45

    Teenyman45 [H]ard|Gawd

    Messages:
    2,011
    Joined:
    Nov 29, 2010
    Remember when SLI and CrossFire used to work fairly well, and you could use two, three, or maybe even four GPUs and get the full effect of all the eye candy at the best resolution available?
     
    Armenius likes this.
  6. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,744
    Joined:
    Apr 17, 2000
    AceGoober, Armenius, lostin3d and 2 others like this.
  7. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,744
    Joined:
    Apr 17, 2000
    Not always. Games like GTA V or Watch Dogs 2 never did well with multi-GPU due to the nature of the game or the implementation. Efficiency was often hard to get above 80%, and with certain games I remember the gains being very small, having hardly any effect. A lot came down to developer implementation as well as vendor support. It was inconsistent at times; certain games worked better with it than others.
     
    AceGoober likes this.
  8. Xyvotha

    Xyvotha Limp Gawd

    Messages:
    348
    Joined:
    Aug 22, 2005
    AceGoober likes this.
  9. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    5,914
    Joined:
    Feb 22, 2012
    What moved me away from multi-GPU was getting a game to work, then one day, bam, it just stops working. When I had AMD this happened more than once with BF4, and I sold my cards the second time. Nothing is more irritating than having time for one match and not being able to play because of mGPU. Not worth it!
     
  10. Stoly

    Stoly [H]ardness Supreme

    Messages:
    5,742
    Joined:
    Jul 26, 2005
    I liked this. I have a GTX 1070 Ti myself and play older games at 4K. I wish it had been included in the test bed, but I guess it should perform close to the Vega 64 anyway.

    An IQ comparison between the highest/lowest settings would be nice.

    My only rant is NO UNREAL GOLD tested? How could you?!!! :D:D
     
    Manny Calavera likes this.
  11. Grimlaking

    Grimlaking 2[H]4U

    Messages:
    2,140
    Joined:
    May 9, 2006
    Enjoyed the article. I'm an old dog, so games like Watch Dogs 2 are still "new" to me. Actually, I saw rather few "old" games in the mix; I don't even see GTA V as really being OLD. Maybe I've been spoiled by World of Warcraft. (THE MOST EXPENSIVE GAME IN THE WORLD.) <I can math it for you if you want, but $15 a month for the past 13 years (about $2,340) pretty much nails that coffin shut.>

    I was also expecting to see Crysis thrown in just because. A little sad that it wasn't. lol. I'm pretty sure that's going to be on GOG soon.
     
    Armenius and noko like this.
  12. FlawleZ

    FlawleZ Gawd

    Messages:
    544
    Joined:
    Oct 20, 2010
    This is true. There were some high notes for CrossFire specifically (maybe SLI at the time too, but I don't remember) where scaling was at or very near 100%, at least in a two-card configuration. I believe this was back with the HD 4000/5000/6000 cards.
     
  13. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,744
    Joined:
    Apr 17, 2000
    I wanted to use games that were old but still relevant, so for this initial article I didn't want to go too far back; I want to see the feedback we get from this. Naturally there is a large library of even older games going way back. Who knows, we may do this again.
     
  14. Kwaz

    Kwaz Whine & Cheezy

    Messages:
    3,240
    Joined:
    Sep 3, 2014
    I have to say that I was expecting games like Half-Life 2, Battlefield 1942, WoW, and so forth.

    The games listed aren't recent, but they certainly are not old, and graphically speaking, nothing better has come out since, for the most part.

    Although the 1080 Ti wasn't released at the time of these games, the 1080/1070 was, and the games mostly line up with the same architecture that existed upon release. I was expecting a gap of 5+ generations between the release of the title and the release of the architecture.

    Another interesting test would be to take a game like Half-Life 2 and benchmark it for FPS and frame pacing/frame times generation by generation, five back: 570 > 670 > 770 > 970 > 1070 :)

    Thanks for putting this together, though. It's good to see my 980 Ti is probably good for another few years!

    Overall, this really hammered home for me how little has changed in the past two years.
     
    Armenius, Aioeyu and Tactlesss like this.
  15. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    6,427
    Joined:
    Jun 13, 2003
    In terms of reported average FPS, yes; in terms of actual improvement, not even close. Before AMD fixed their frame-pacing, many/most games exhibited a multi-GPU experience that was worse than single-GPU.
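
    For what it's worth, this is exactly the kind of thing average-FPS charts hide. A minimal sketch of the idea, assuming frame times logged in milliseconds (e.g., from a frame-time capture tool); the numbers are invented purely to illustrate the effect:

    ```python
    # Sketch: why average FPS can hide AFR microstutter.
    # Assumes frame times in milliseconds from a frame-time logger.
    import statistics

    def pacing_report(frame_times_ms):
        avg_fps = 1000 / statistics.mean(frame_times_ms)
        # 99th-percentile frame time: what the worst 1% of frames feel like.
        worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
        return avg_fps, 1000 / worst

    single = [33.3] * 100   # evenly paced single GPU, ~30 fps
    afr = [5.0, 28.3] * 50  # AFR delivering frames in uneven pairs

    for label, data in (("single GPU", single), ("AFR pair", afr)):
        avg, low = pacing_report(data)
        print(f"{label}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
    ```

    The AFR pair reports roughly double the single card's average, but the frames arrive bunched in pairs, which is the microstutter the frame-pacing fix addressed.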
     
  16. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    24,906
    Joined:
    Oct 29, 2000
    I find the experience is mixed when doing this, but I also run at 4k and try to use max settings.

    Generally older games run much easier than newer ones, but there are exceptions. The opening scene of Metro 2033 from 2010 still drops my system well below the 60fps where I'd like to be.

    I recently replayed Deus Ex Human Revolution, and that game was comically easy on the GPU, even at 4k.

    Much older titles are all over the place. The original Deus Ex from 2000 pins a single CPU core no matter what you do. The NOLF games behave much better.
     
  17. shad0w4life

    shad0w4life Gawd

    Messages:
    636
    Joined:
    Jun 30, 2008
    Was waiting on old games... these are not old!

    Figured it would be some Crysis, Skyrim, etc.
     
    AceGoober, Armenius, Aioeyu and 2 others like this.
  18. Teenyman45

    Teenyman45 [H]ard|Gawd

    Messages:
    2,011
    Joined:
    Nov 29, 2010
    That was the point. Games like GTA V came near the beginning of the period when GPU companies and software devs began supporting multi-GPU configurations less and less.

    Of course, even 60% scaling would be a big help in many games. A moderately playable 35fps becomes a fairly smooth 56fps with 60% scaling. If a third GPU gave even an additional 30% after that, then 35 + 21 + 10.5 ≈ 66fps, or no need for VSync.
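
    To make that arithmetic concrete, here it is as a tiny sketch; the 60% and 30% scaling figures are the post's own assumptions, not measured numbers:

    ```python
    # Each added card contributes a diminishing fraction of the
    # single-card frame rate (60% for the second card, 30% for the third).
    def multi_gpu_fps(base_fps, per_card_gains):
        return base_fps * (1 + sum(per_card_gains))

    base = 35.0
    print(multi_gpu_fps(base, [0.60]))        # 2 cards -> 56.0 fps
    print(multi_gpu_fps(base, [0.60, 0.30]))  # 3 cards -> 66.5 fps
    ```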
     
    Armenius likes this.
  19. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    24,906
    Joined:
    Oct 29, 2000
    These aren't old either :p
     
  20. Xyvotha

    Xyvotha Limp Gawd

    Messages:
    348
    Joined:
    Aug 22, 2005
    I'm pretty sure those games would easily run maxed out with modern enthusiast cards.
    But what about current mainstream/budget graphics? Perhaps even modern IGPs, all of them running those very old titles at 1080p or 1440p. I'd love that.
     
    otg and Sulphademus like this.
  21. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    24,906
    Joined:
    Oct 29, 2000

    I liked multi-GPU better back when we had options for split-frame rendering and SLI AA modes. These days it's AFR or nothing, and IMHO, AFR is the worst possible way to do multi-GPU.
     
    Armenius likes this.
  22. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    52,981
    Joined:
    May 18, 1997
    As always, we very much appreciate you talking about what the article is NOT about. Many thanks.
     
  23. T4rd

    T4rd [H]ardForum Junkie

    Messages:
    15,896
    Joined:
    Apr 8, 2009
    Digital Foundry did it just a few weeks ago.

     
    Xyvotha and Nolan7689 like this.
  24. alxlwson

    alxlwson You Know Where I Live

    Messages:
    4,986
    Joined:
    Aug 25, 2013

    I am curious... what ended up being the deciding factors for these games? WD2, for example... was Brent_Justice running low on VRAM? I ask comparatively... with KCD, I run out of VRAM before I start running into throughput issues. I have to turn texture quality down so I'm not constantly swapping assets.
     
  25. MacLeod

    MacLeod [H]ardness Supreme

    Messages:
    7,996
    Joined:
    Jul 28, 2009
    Very cool and interesting article. Lately all I seem to play are older games, so this was a cool new take on game benching.

    I would like to see slightly older games too, though... I know you can't go back 10 years, but I would like to see Crysis 1 or even 2 in there. Some of the older Assassin's Creed games were demanding as well.
     
  26. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,744
    Joined:
    Apr 17, 2000
    I did not encounter any noteworthy VRAM issues. I'm sure WD2 at 4K with max settings was hitting the limits of the 8GB Vega 64, but the performance was so low to begin with that it didn't matter.
     
    Armenius, AceGoober and alxlwson like this.
  27. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    19,077
    Joined:
    Apr 15, 2005
    What a great review - thank you, [H]!
     
  28. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,111
    Joined:
    Feb 11, 2013
    Great article as always, and thank you for it, but I was really expecting more older games: things like Dragon Age: Inquisition, Hitman: Absolution, Metro 2033/Last Light/Redux, Far Cry 3, Tomb Raider (the 2013 reboot, not Rise of the Tomb Raider), Max Payne 3, Sleeping Dogs: Definitive Edition, BioShock Infinite, Assassin's Creed titles, even some Skyrim? I mean, there are a lot of games that would have been better suited than the selected suite for testing at 4K, IMHO.

    I think it was an extremely well made and detailed review, but with the wrong games. Those are the games tested and reviewed the MOST out there; they never miss a single review on any site (even this one). So even though the information was good and detailed at different settings, it was mostly info that people here already know, and reading it all honestly left a "deja vu" feeling.

    But to stay on topic and avoid a scolding from Kyle: what I most appreciated about the article was the feature performance testing. It may shut A LOT of fanboys' mouths regarding GameWorks features, and this will be a nice article to use for that very reason.
     
    AceGoober, Armenius, Meeho and 2 others like this.
  29. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,744
    Joined:
    Apr 17, 2000
    Thank you for the game suggestions; those games interest me as well. I had to start somewhere, and I didn't want to go too far back initially. The beauty of this type of review is that there is a lot of potential for more articles covering more games of the past. I also believe testing the game features is important: it lets us know what causes a bigger burden on performance than other things, and by how much. It is amazing that some graphics options can bring performance down by close to 30% just by turning on one feature.
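
    To put a number like that in concrete terms, a feature's cost is just the relative FPS drop from toggling it; a trivial sketch with hypothetical figures:

    ```python
    # Relative cost of one graphics feature: the FPS drop from turning it on.
    def feature_cost_pct(fps_off, fps_on):
        return (1 - fps_on / fps_off) * 100

    # Hypothetical: a single option that takes 60 fps down to 42 fps
    # costs 30% of your performance.
    print(f"{feature_cost_pct(60, 42):.0f}% slower")
    ```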
     
  30. magoo

    magoo [H]ardForum Junkie

    Messages:
    14,501
    Joined:
    Oct 21, 2004
    This is something I try to do every once in a while, to see what an "older" game was meant to look like in the imagination of the developer.

    There are always issues, though, that make it harder... drivers, patches, sometimes even the OS has evolved.

    I think what you fail to credit here is your platform. The motherboard, memory, and CPU have no doubt improved since these games were released.
     
  31. travm

    travm Limp Gawd

    Messages:
    184
    Joined:
    Feb 26, 2016
    I'm struggling to call these old games. To me this was like a copy/paste from the reviews of these cards? I didn't actually go look, so feel free to make me look stupid. But older games, I think, should be 5+ years old; anything inside the last 3 is still "current". Even if people aren't playing them, the tech hasn't moved.

    I do support revisiting cards 1-2 years after release, however. HUGELY. I don't early-adopt. The crap I have to endure on YouTube trying to figure out what the landscape looks like 1-2 years later is painful at best.

    This is the first time I've looked at an [H] article and went, WTF?
    Otherwise it is an excellently done article and you guys are awesome. I'm going to go hide in a beer can now.
     
    Meeho likes this.
  32. funkydmunky

    funkydmunky [H]ard|Gawd

    Messages:
    1,937
    Joined:
    Aug 28, 2008
    I know it was initially said as a joke, but that game looks amazing. And considering it's 10 years old, and the hardware that was out then... wow!
     
    Armenius and Xyvotha like this.
  33. GoldenTiger

    GoldenTiger 3.5GB GTX 970 Slayer

    Messages:
    17,946
    Joined:
    Dec 2, 2004
    Yep. Crysis is and was no joke!
     
  34. Skylinestar

    Skylinestar Limp Gawd

    Messages:
    418
    Joined:
    Jun 14, 2011
    More than a decade isn't old :p

    Nice article, [H]. I guess I'm one of the minority who buy today's mid-range hardware and play yesteryear's games at max quality. This saves a lot of money when you play a game the way it's meant to be played -- ULTRA quality.

    I'll play Battlefield 1 in 2022.
     
    Xyvotha likes this.
  35. harmattan

    harmattan 2[H]4U

    Messages:
    4,030
    Joined:
    Feb 11, 2008
    Great article. I have a backlog from here to Cleveland (including GTA V) so good to see what these "older" games demand. It's a shame more reviews don't look back at semi-recent high-profile games like this since they're new to many gamers.
     
    GoldenTiger likes this.
  36. Vegas P11

    Vegas P11 Gawd

    Messages:
    986
    Joined:
    Dec 16, 2005
    Great article as always, Dan and Kyle. If you do plan on trying some older games for another article, I will gladly send you my GTX 780 to use as a baseline for a card that was on the market when these games were considered new.
     
  37. knowom

    knowom Limp Gawd

    Messages:
    265
    Joined:
    Aug 15, 2008
    I'd be really curious to know how The Division runs at 4K DSR 4.00x with FXAA set to off, Temporal AA set to Stabilization, AF set to 4x, resolution scale at 100%, shadow quality high, and then all the other shadow and reflection settings, along with parallax mapping, set to low. Here's a great video on DSR comparisons.


    I did some quick and dirty DSR testing of my own to get a general idea of the image quality results to expect. Overall I'm quite impressed. 4K DSR 4.00x versus 4K native has definite image quality improvements in various areas, if not all of them: aliasing is lessened and shimmers less in motion, shadows and lighting both improve a slight bit, parallax bump mapping becomes a bit more detailed, and lastly the reflections seem drastically more noticeable.

    Now I'm curious how image quality compares at native resolution without DSR and higher quality settings, versus 4K DSR 4.00x with IQ settings maximized as best the compromises allow while maintaining the same frame rate target. Which looks better overall, or are there very real pros and cons to each depending on what you value more? For example, DSR really helps minimize shimmer and aliasing, among other things, while down to a point a lot of people won't necessarily mind lower shadow/lighting/bump mapping quality.

    I went from about 25 FPS to 19 FPS within the same scene, testing 4K versus 4K DSR 4.00x at the same overall settings, so the impact is there, but lower than I expected just the same. This is testing on a GTX 960 4GB, mind you; a 1080 Ti would fare substantially better, I'm sure, being both faster and having much more VRAM at its disposal for higher resolutions.
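
    For anyone unfamiliar with how the DSR factor maps to render resolution: the factor is a pixel-count multiplier, so 4.00x at 4K means rendering at 8K and downsampling. A quick sketch of the math, which also shows why that 25-to-19 FPS drop (about 24%) is surprisingly mild for 4x the pixels:

    ```python
    # DSR factor is total pixel count, so the per-axis scale is its square root.
    def dsr_render_res(display_w, display_h, dsr_factor):
        scale = dsr_factor ** 0.5
        return int(display_w * scale), int(display_h * scale)

    print(dsr_render_res(3840, 2160, 4.00))  # -> (7680, 4320): 8K rendered for a 4K panel
    ```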
     
    Last edited: May 25, 2018
    Meeho and Kyle_Bennett like this.
  38. Xyvotha

    Xyvotha Limp Gawd

    Messages:
    348
    Joined:
    Aug 22, 2005
    Agree with you. Between the little time I have to play games, and as a consequence the $ I can justify putting into my gaming PC, my Steam library has many "unopened" games in the queue. I'm probably getting Far Cry 5 in a couple of years.
     
  39. knowom

    knowom Limp Gawd

    Messages:
    265
    Joined:
    Aug 15, 2008
    So, after playing around with The Division and testing the impact DSR has at the same image quality settings: what I found was that it was actually more blurred and less sharp than native resolution. It's rather disappointing to see how badly implemented it is. Perhaps that's a by-product of the downsampling, but it's a bit shoddy of Nvidia. On top of that, the DSR settings below 4.00x get progressively more blurred relative to native resolution, from what I noticed. Talk about a worst-case scenario. As if Nvidia hadn't added enough blur to every image quality setting going already. They made LOD bias worse, then added more blur to everything under the sun. I mean really, wtf, enough already.

    Before anyone mentions DSR smoothing: it was set to 0 for optimal sharpness, and the result at DSR 4.00x, or DSR as a whole, is still less sharp than native resolution, which is just saddening. Comical, though, that Nvidia made it easy as pie to pile on additional blur with DSR smoothing from 0 up to +100 for that awful positive-LOD-inspired muddy/soap-opera look. On the flip side of the coin, the image is already more blurry than native resolution, with no option to sharpen it further.

    Nvidia's slogan should probably be changed to something a bit more appropriate to its image, like "inferior image quality: the green pill, the way you're meant to just shut up and swallow." They just keep shoving more and more image quality degradation down consumers' throats, it would seem, around every twist and turn. I swear it feels as if they're more focused on adding features of that nature than ones that *gasp* improve image quality for the better.

    The thing is, sharpness adds tons of clarity and depth perception, bringing forth all the details much more than nearly any other setting, and blur just craps all over it. There is a point where you can make an image too sharp and run into halo effects or shimmering, but that's another topic, and in those situations you might actually warrant a bit of selective, minimal blur mixed in to address it. That, or don't go overboard on the sharpening. I'd rather deal with that than with blur, personally.

    To give you an example of how important sharpness is to overall image detail, here's a zip comparing 4K DSR 4.00x against the same image with IrfanView's sharpen/unsharp mask effects applied: 50 sharpness added, 100 sharpness added, and finally unsharp mask applied to the 4K DSR 4.00x image. I don't know about the rest of you, but I'll take the unsharp mask results in a heartbeat. My aliasing settings suck, btw, because GTX 960 reasons; it was never intended for testing DSR 4.00x, but curiosity killed a cat or two.
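
    The same unsharp-mask comparison is easy to reproduce outside IrfanView. A minimal sketch with Pillow, where the file names are hypothetical and the radius/threshold values are just common defaults:

    ```python
    # Apply unsharp masking at two strengths to a captured screenshot.
    from PIL import Image, ImageFilter

    src = Image.open("division_4k_dsr4x.png")  # hypothetical capture
    for percent in (50, 100):
        out = src.filter(ImageFilter.UnsharpMask(radius=2, percent=percent, threshold=3))
        out.save(f"division_4k_dsr4x_unsharp{percent}.png")
    ```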
     
    Last edited: May 26, 2018
    Meeho likes this.
  40. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,245
    Joined:
    Oct 13, 2016
    Thanks, guys, for the in-depth testing and review. I've got 3 of these games: GTA V, Witcher 3, and ROTR. I've had nearly identical results with my rigs. Something to note: all three support SLI. Using 1080 SLI I get roughly 10-20% more FPS at 4K vs. my 1080 Ti using the same settings as in your tests (couldn't count how many times I've run these same tests through the years). Basically, if you have the money and the desire, 1080 SLI will give 4K/50-60fps for most of these. The only real exception is ROTR, which at max everything just punishes any GPU combination at the moment. Best to play at 1440p if you want maxed settings and happen to have a Ti or Vega 64.

    Did I miss it, or did you mention VRAM usage in any of these tests? I've found VRAM can be very relevant when using maxed settings at 4K. I've seen ROTR eat nearly all 11GB on my Ti at 4096x2160 maxed. May have been a memory leak, but FPS dropped ridiculously low then (the upper Russian military installations). GTA V can get pretty hungry too, but I forget the numbers. Witcher 3 was using ~4-7GB.
    Any chance you could add VRAM comparisons, at least for 4K?
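
    If VRAM numbers do get added, they are also easy to spot-check at home by polling nvidia-smi while the game runs; a minimal sketch (the query flags are standard nvidia-smi options; stop it with Ctrl+C):

    ```python
    # Log GPU memory use once a second via nvidia-smi.
    import subprocess, time

    while True:
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ]).decode().strip()
        used, total = (int(x) for x in out.split(", "))
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(1)
    ```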

    edit: I'm sure some lucky person out there has 1080 Ti SLI and can probably kick ROTR's butt with it at 4K.