Last Gen Games - Max IQ and Perf on Today's GPUs @ [H]

FrgMstr

Last Gen Games - Max IQ and Perf on Today's GPUs

Have you ever wanted to pull out an older game from years past, and play it on your new video card to see if the gameplay experience is improved by turning on all the highest graphics options, or a higher resolution like 4K? We sure did! We test some older games and see if we can "max out" the gameplay experience in 2018.
 
I was glad to see this article; it's more relevant to me (and a lot of other people, I suspect) than the usual latest-games benchmarking.
Good to see that when the next generation of cards comes out, my wife can look forward to a better 4K Witcher experience -- if she hasn't finished the final expansion by then.
 
Remember when SLI and Crossfire used to work fairly well and you could use two or three or maybe even four GPUs and get the full effect of all the eye candy with the best resolution available?
 
Remember when SLI and Crossfire used to work fairly well and you could use two or three or maybe even four GPUs and get the full effect of all the eye candy with the best resolution available?

Not always. Games like GTA V or Watch Dogs 2 never did well with multi-GPU due to the nature of the game or the implementation. Scaling efficiency was often hard to get over 80%, and I remember it being very small with certain games, having hardly any effect. A lot came down to developer implementation as well as vendor support, so it was inconsistent at times; certain games worked better with it than others.
 
Not always. Games like GTA V or Watch Dogs 2 never did well with multi-GPU due to the nature of the game or the implementation. Scaling efficiency was often hard to get over 80%, and I remember it being very small with certain games, having hardly any effect. A lot came down to developer implementation as well as vendor support, so it was inconsistent at times; certain games worked better with it than others.

What moved me away from multi-GPU was getting a game to work, then one day, bam, it just stops working. When I had AMD this happened more than once with BF4, and I sold my cards the second time. Nothing is more irritating than having time for one match and not being able to play because of mGPU. Not worth it!
 
I liked this. I have a GTX 1070 Ti myself and play older games at 4K. I wish it had been included in the test bed, but I guess it should perform close to the Vega 64 anyway.

An IQ comparison between the highest/lowest settings would be nice.

My only rant is NO UNREAL GOLD tested? How could you?!!! :D:D
 
Enjoyed the article. I'm an old dog, so games like Watch Dogs 2 are still "new" to me. Actually, I saw rather few "old" games in the mix. I don't even see GTA V as really being OLD. Maybe I've been spoiled by World of Warcraft. (THE MOST EXPENSIVE GAME IN THE WORLD.) <I can math it for you if you want, but 15 dollars a month for the past 13 years pretty much nails that coffin shut.>

I was also expecting to see Crysis thrown in just because. A little sad that it wasn't. lol. I'm pretty sure that's going to be on GOG soon.
 
Not always. Games like GTA V or Watch Dogs 2 never did well with multi-GPU due to the nature of the game or the implementation. Scaling efficiency was often hard to get over 80%, and I remember it being very small with certain games, having hardly any effect. A lot came down to developer implementation as well as vendor support, so it was inconsistent at times; certain games worked better with it than others.
This is true. There had been some high notes for Crossfire specifically (maybe SLI at the time too, but I don't remember) where scaling was at or very near 100%, at least in a two-card configuration. Believe this was back with the HD 4000/5000/6000 cards.
 
Enjoyed the article. I'm an old dog, so games like Watch Dogs 2 are still "new" to me. Actually, I saw rather few "old" games in the mix. I don't even see GTA V as really being OLD. Maybe I've been spoiled by World of Warcraft. (THE MOST EXPENSIVE GAME IN THE WORLD.) <I can math it for you if you want, but 15 dollars a month for the past 13 years pretty much nails that coffin shut.>

I was also expecting to see Crysis thrown in just because. A little sad that it wasn't. lol. I'm pretty sure that's going to be on GOG soon.

I wanted to use games that were old but still relevant, so for this initial article I didn't want to go too far back; I want to see the feedback we get from this. Naturally there is a large library of even older games going way back, so who knows, we may do this again.
 
I wanted to use games that were old but still relevant, so for this initial article I didn't want to go too far back; I want to see the feedback we get from this. Naturally there is a large library of even older games going way back, so who knows, we may do this again.

I have to say that I was expecting games like Half Life 2, Battlefield 1942, Wow and so forth.

The games listed aren't recent, but certainly are not old, and graphically speaking, nothing better has come out since for the most part.

Although the 1080 Ti wasn't released at the time of these games, the 1080 / 1070 was, and the games mostly line up with the same architecture that existed upon release. I was expecting a gap of 5+ generations between the release of the title and the release of the architecture.

Another interesting test would be to take a game like Half Life 2 and benchmark its FPS and frame pacing / frame times generation by generation, five back: 570 > 670 > 770 > 970 > 1070 :)

Thanks for putting this together though. It's good to see my 980 Ti is probably good for another few years!

Overall this really hammered home for me how little has changed in the past two years.
 
Believe this was back with the HD 4000/5000/6000 cards.

In terms of reported average FPS, yes; in terms of actual improvement, not even close. Before AMD fixed their frame-pacing, many/most games exhibited a multi-GPU experience that was worse than single-GPU.
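To make that point concrete, here's a minimal sketch (with made-up frame-time numbers, not measurements from any of these games) showing how a poorly paced AFR setup can report a higher average FPS while the worst-case frame times get much uglier:

```python
# Rough sketch of why average FPS hides frame-pacing problems.
# The frame-time logs below are invented for illustration.

def pacing_summary(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    p99_ms = sorted(frame_times_ms)[int(n * 0.99)]   # 99th-percentile frame time
    return avg_fps, p99_ms

single_gpu = [20.0] * 100          # a steady 50 FPS
bad_afr = [8.0, 28.0] * 50         # alternating fast/slow frames (micro-stutter)

for name, log in [("single GPU", single_gpu), ("poorly paced AFR", bad_afr)]:
    fps, p99 = pacing_summary(log)
    print(f"{name}: avg {fps:.0f} FPS, 99th-percentile frame time {p99:.0f} ms")
```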
 
I find the experience is mixed when doing this, but I also run at 4k and try to use max settings.

Generally older games run much easier than newer ones, but there are exceptions. The opening scene of Metro 2033 from 2010 still drops my system well below the 60fps where I'd like to be.

I recently replayed Deus Ex Human Revolution, and that game was comically easy on the GPU, even at 4k.

Much older titles are all over the place. The original Deus Ex from 2000 pins a single CPU core no matter what you do. The NOLF games behave much better.
 
Not always, games like GTAV or Watch Dogs 2 never did well with multi-GPU due to the nature of the game, or implementation. Often efficiency was hard to get over 80%, sometimes I remember it being very small with certain games, having hardly any affect. A lot came down to developer implementation as well as vendor support. It was inconsistent at times. Certain games work better with it than others.

That was the point. Games like GTA V were near the beginning of when GPU companies and software devs began supporting multi-gpu configurations less and less.

Of course, even 60% scaling would be a big help in many games. A moderately playable 35fps is now a fairly smooth 56fps with 60% scaling. If a third GPU gave even an additional 30% after that, then 35 + 21 + 10 = 66fps or no need for VSync.
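That back-of-the-envelope math is easy to sanity check; a tiny sketch using the same illustrative numbers:

```python
# Each extra GPU adds a diminishing fraction of the single-card frame rate.
# Scaling percentages here are illustrative, not measured.
base_fps = 35.0
extra_card_scaling = [0.60, 0.30]    # 2nd card adds 60%, 3rd card adds 30%

fps = base_fps
print(f"1 GPU:  ~{fps:.0f} FPS")
for i, s in enumerate(extra_card_scaling, start=2):
    fps += base_fps * s
    print(f"{i} GPUs: ~{fps:.0f} FPS")
```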
 
I have to say that I was expecting games like Half Life 2, Battlefield 1942, Wow and so forth.

I'm pretty sure those games would easily run maxed out with modern enthusiast cards.
But what about current mainstream/budget graphics? Perhaps even a modern IGP, all of them running those very old titles at 1080p or 1440p. I'd love that.
 
Not always. Games like GTA V or Watch Dogs 2 never did well with multi-GPU due to the nature of the game or the implementation. Scaling efficiency was often hard to get over 80%, and I remember it being very small with certain games, having hardly any effect. A lot came down to developer implementation as well as vendor support, so it was inconsistent at times; certain games worked better with it than others.


I liked multi-GPU better back when we had options for split-frame rendering and SLI AA modes. These days it's AFR or nothing, and IMHO, AFR is the worst possible way to do multi-GPU.
 
Very cool and interesting article. Lately all I seem to play are older games so this was a cool new take on game benching.

I would like to see slightly older games too, though... I know you can't go back 10 years, but I would like to see Crysis 1 or even 2 in there. Some of the older Assassin's Creed games were demanding as well.
 
I am curious... What ended up being some deciding factors on these games? WD2 for example... Was Brent_Justice running low on VRAM? I ask comparatively... With KCD, I run out of VRAM before I start running into issues with throughput. I have to turn texture quality down so I'm not asset swapping constantly.

I did not encounter any VRAM issues that were noteworthy. I'm sure WD2 at 4K with max settings was hitting the limits of the 8GB Vega 64, but the performance was so low to begin with that it didn't matter.
 
Great article as always, and thank you for it, but I was really expecting more older games: things like Dragon Age: Inquisition, Hitman: Absolution, Metro 2033/Last Light/Redux, Far Cry 3, Tomb Raider (the 2013 reboot, not Rise of the Tomb Raider), Max Payne 3, Sleeping Dogs: Definitive Edition, BioShock Infinite, Assassin's Creed titles, even some Skyrim. I mean, there are a lot of games that would have been better than the selected suite to be tested at 4K, IMHO.

I think it was an extremely well made and detailed review, but with the wrong games; those are the most-tested and most-reviewed games out there, they never miss a single review on any site (even this one). So the information, even being good and detailed at different settings, was something most people here already know, and reading it all just left a "deja vu" feeling, honestly.

But, to stay on topic and avoid a scolding from Kyle, what I most appreciated in the article was the feature performance testing... it may shut A LOT of fanboys' mouths regarding GameWorks features; this will be a nice article to use for that very reason.
 
Great article as always, and thank you for it, but I was really expecting more older games: things like Dragon Age: Inquisition, Hitman: Absolution, Metro 2033/Last Light/Redux, Far Cry 3, Tomb Raider (the 2013 reboot, not Rise of the Tomb Raider), Max Payne 3, Sleeping Dogs: Definitive Edition, BioShock Infinite, Assassin's Creed titles, even some Skyrim. I mean, there are a lot of games that would have been better than the selected suite to be tested at 4K, IMHO.

I think it was an extremely well made and detailed review, but with the wrong games; those are the most-tested and most-reviewed games out there, they never miss a single review on any site (even this one). So the information, even being good and detailed at different settings, was something most people here already know, and reading it all just left a "deja vu" feeling, honestly.

But, to stay on topic and avoid a scolding from Kyle, what I most appreciated in the article was the feature performance testing... it may shut A LOT of fanboys' mouths regarding GameWorks features; this will be a nice article to use for that very reason.

Thank you for the game suggestions; those games interest me as well. I had to start somewhere and I didn't want to go too far back initially; the beauty of this type of review is that there is a lot of potential for more articles covering more games of the past. I also believe testing the game features is important: it lets us know what causes a bigger burden on performance than other things, and by how much. It is amazing that some graphics options can bring performance down close to 30% just by turning on one feature.
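For anyone wanting to reproduce that kind of per-feature number, the math is just the relative FPS drop with the option enabled versus disabled; the figures below are made-up examples, not measurements from the article:

```python
# Relative cost of a single graphics option, as a percentage FPS drop.
# The FPS values and feature names are hypothetical examples.
def feature_cost(fps_off, fps_on):
    return (fps_off - fps_on) / fps_off * 100.0

print(f"Feature A: {feature_cost(60.0, 52.0):.0f}% slower when enabled")
print(f"Feature B: {feature_cost(60.0, 43.0):.0f}% slower when enabled")
```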
 
This is something I try to do every once in a while, to see what an "older" game was meant to look like in the imagination of the developer.

There are always issues, though, that make it harder: drivers, patches, sometimes even the OS has evolved.

I think what you fail to credit here is your platform. The motherboard, memory, and CPU have no doubt improved since these games were released.
 
I'm struggling to call these old games. To me this was like a copy/paste from the reviews of these cards? I didn't actually go look, so feel free to make me look stupid. But older games, I think, should be 5+ years old; anything inside the last 3 is still "current" even if people aren't playing them, since the tech hasn't moved.

I support revisiting cards 1-2 years after release, however. HUGELY. I don't early adopt. The crap I have to endure on YouTube to try and figure out what the landscape is like 1-2 years later is painful at best.

This is the first time I've looked at an [H] article and went, WTF?
Otherwise it is an excellently done article and you guys are awesome. I'm going to go hide in a beer can now.
 
These aren't old either :p
More than a decade isn't old :p

Nice article [H]. I guess I'm one of the minority who buy today's mid range hardware and play yesteryear games at max quality. This saves a lot of money when you play a game the way it's meant to be played - ULTRA quality.

I'll play Battlefield 1 in year 2022.
 
Great article. I have a backlog from here to Cleveland (including GTA V) so good to see what these "older" games demand. It's a shame more reviews don't look back at semi-recent high-profile games like this since they're new to many gamers.
 
Great article as always Dan and Kyle. If you do plan on trying some older games for another article, I will gladly send you my GTX 780 to use as a baseline for a card that was on the market when these games were considered new.
 
I'd be really curious to know how The Division runs at 4K DSR 4.00x with FXAA set to off, temporal AA set to stabilization, AF set to 4x, resolution scale 100%, shadow quality high, and then all the other shadow settings and reflection settings along with parallax mapping set to low. Here's a great video on DSR comparisons.


I did some quick and dirty DSR testing of my own to get a general idea of the image quality results to expect. Overall I'm quite impressed. 4K DSR 4.00x versus 4K native shows definite image quality improvements in various areas, if not all of them. Aliasing is lessened and shimmers less in motion, shadows and lighting both improve a slight bit, parallax bump mapping becomes a bit more detailed, and lastly the reflections seem drastically more noticeable.

Now I'm curious: how does image quality compare at native resolution without DSR and higher quality settings versus 4K DSR 4.00x with IQ settings maximized as far as the compromises allow, while maintaining the same frame rate target? Which looks better overall, or are there very real pros and cons to either depending on what you value more? For example, DSR really helps minimize shimmer and aliasing, among other things, while a lot of people, down to a point, won't necessarily mind lower shadow/lighting/bump mapping quality.

I went from about 25 FPS to 19 FPS within the same scene testing between 4K and 4K DSR 4.00x at the same overall settings, so the impact is there, but lower than I expected just the same. This is testing on a GTX 960 4GB, mind you; a 1080 Ti would fare substantially better, I'm sure, between being both faster and having much more VRAM at its disposal for higher resolutions.
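For reference, the DSR factor applies to the total pixel count, so 4.00x doubles the resolution on each axis; a quick sketch of what each factor renders internally (the pixel-count ratio is only a naive cost estimate, since the real hit depends on where the game is bound):

```python
# DSR renders at (factor x native pixel count) and downsamples to native.
import math

def dsr_render_target(native_w, native_h, factor):
    scale = math.sqrt(factor)          # the factor applies to total pixels
    return round(native_w * scale), round(native_h * scale)

native = (3840, 2160)                  # the 4K case discussed above
for f in (1.78, 2.25, 4.00):
    w, h = dsr_render_target(*native, f)
    print(f"DSR {f:.2f}x of {native[0]}x{native[1]} -> {w}x{h} "
          f"(~{f:.1f}x the pixels to shade)")
```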
 
Nice article [H]. I guess I'm one of the minority who buy today's mid range hardware and play yesteryear games at max quality. This saves a lot of money when you play a game the way it's meant to be played - ULTRA quality.
I'll play Battlefield 1 in year 2022.
Agree with you. Between the little time I have to play games, and as a consequence the money I can justify putting into my gaming PC, my Steam library has many "unopened" games in the queue. I'm probably getting Far Cry 5 in a couple of years.
 
So, after playing around with The Division and testing the impact DSR has at the same image quality settings, what I found was that it was actually more blurred and less sharp than the original native resolution. It's rather disappointing to see how badly implemented it is. Perhaps that's a by-product of downsampling, but it's a bit shoddy by Nvidia. On top of that, the DSR settings below 4.00x get progressively more blurred relative to the native resolution, from what I noticed. Talk about a worst-case scenario. As if Nvidia hasn't already added blur to every image quality setting going. They made LOD bias worse, then added more blur to everything under the sun; I mean, really, wtf, enough already.

Before anyone mentions DSR smoothing: it was set to 0 for optimal sharpness, and even then DSR 4.00x, and DSR as a whole, was less sharp than native resolution, which is just saddening. Comically, though, Nvidia made it easy as pie to pile on additional blur with DSR smoothing from 0 upwards to +100 for that awful positive-LOD-bias-inspired muddy/soap opera look. On the flip side of the coin, the image is already more blurry than native resolution, and there's no option to sharpen it back.

Nvidia's slogan should probably be changed to something a bit more appropriate to its image, like: inferior image quality, the green pill, the way you're meant to just shut up and swallow. They just keep shoving more and more image quality degradation down consumers' throats, it would seem, around every twist and turn. I swear it feels as if they are more focused on adding features of that nature than on ones that *gasp* improve image quality for the better.

The thing is, sharpness adds tons of clarity and depth perception, bringing forth all the details much more so than nearly any other setting, and blur just craps all over it. There is a point where you can make an image too sharp and run into that halo effect or shimmering, but that's another topic, and in those situations you might actually want a bit of selective, minimal blur mixed in to address it. That, or don't go overboard on the sharpening. I'd rather deal with that than blur, personally.

To give you an example of how important sharpness is to overall image detail, here's a zip comparing 4K DSR 4.00x against the same image with IrfanView's sharpen/unsharp mask effects applied: 50 sharpness added, 100 sharpness added, and finally unsharp mask 4, all on the 4K DSR 4.00x image. I don't know about the rest of you, but I'll take the unsharp mask results in a heartbeat. My aliasing settings suck, btw, because GTX 960 reasons; it was never intended for testing DSR 4.00x, but curiosity killed a cat or two.
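For anyone who wants to reproduce that kind of comparison without IrfanView, here's a small sketch using Pillow's unsharp mask filter (the screenshot file name is just a placeholder):

```python
# Apply increasing amounts of unsharp mask to a screenshot for comparison.
# Requires Pillow; the input file name is a placeholder, not a real asset.
from PIL import Image, ImageFilter

src = Image.open("division_4k_dsr4x.png")

# 'percent' controls sharpening strength, 'radius' the size of detail emphasised.
for percent in (50, 100, 150):
    out = src.filter(ImageFilter.UnsharpMask(radius=2, percent=percent, threshold=3))
    out.save(f"division_4k_dsr4x_usm{percent}.png")
```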
 
Thanks guys for the in-depth testing and review. I've got 3 of these games: GTA V, Witcher 3, and ROTR, and I've had nearly identical results with my rigs. Something to note: all three support SLI. Using 1080 SLI I get roughly 10-20% more FPS at 4K vs. my 1080 Ti using the same settings as in your tests (couldn't count how many times I've done these same tests through the years). Basically, if you have the money and desire, 1080 SLI will give 4K at 50-60 fps for most of these. The only real exception is ROTR; at max everything it just punishes any GPU combination at the moment. Best just to play at 1440p if you want maxed settings and happen to have a Ti or Vega 64.

Did I miss it, or did you mention VRAM usage in any of these tests? I've found VRAM can be very relevant when using maxed settings at 4K. I've seen ROTR eat nearly all 11GB on my Ti at 4096x2160 maxed. It may've been a memory leak, but FPS dropped ridiculously low then (the upper Russian military installations). GTA V can get pretty hungry too, but I forget the numbers. Witcher 3 was using ~4-7GB.
Any chance you could add VRAM comparisons, at least for 4K?

edit: I'm sure some lucky person out there has 1080 Ti SLI and could probably kick ROTR's butt with it at 4K.
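In the meantime, a rough sketch of how to grab those VRAM numbers yourself: poll nvidia-smi once a second while the game runs and keep the peak (this assumes an NVIDIA card with nvidia-smi on the PATH, and only reads the first GPU):

```python
# Log current and peak VRAM usage until interrupted with Ctrl+C.
import subprocess
import time

def vram_used_mib():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.splitlines()[0])   # first GPU only

peak = 0
try:
    while True:
        used = vram_used_mib()
        peak = max(peak, used)
        print(f"current: {used} MiB, peak: {peak} MiB")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"peak VRAM used this session: {peak} MiB")
```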
 
Great article as always, and thank you for it, but I was really expecting more older games: things like Dragon Age: Inquisition, Hitman: Absolution, Metro 2033/Last Light/Redux, Far Cry 3, Tomb Raider (the 2013 reboot, not Rise of the Tomb Raider), Max Payne 3, Sleeping Dogs: Definitive Edition, BioShock Infinite, Assassin's Creed titles, even some Skyrim. I mean, there are a lot of games that would have been better than the selected suite to be tested at 4K, IMHO.

I think it was an extremely well made and detailed review, but with the wrong games; those are the most-tested and most-reviewed games out there, they never miss a single review on any site (even this one). So the information, even being good and detailed at different settings, was something most people here already know, and reading it all just left a "deja vu" feeling, honestly.

But, to stay on topic and avoid a scolding from Kyle, what I most appreciated in the article was the feature performance testing... it may shut A LOT of fanboys' mouths regarding GameWorks features; this will be a nice article to use for that very reason.

With my 1080 Ti rig, the Metros will still bring it to a crawl if I crank AA to its fullest, even at 1440p. Keep it at 2x or 0.5x and I can happily get 70-100+ fps at 1440p; 0.5x at 4K can do ~60 fps, with all other settings maxed. The odd thing is that the remasters seem to render unevenly. Maybe the originals were like that too, but I can't remember; basically I'm constantly watching things being drawn in the background in a number of spots. I love these games for what they were, and technologically they were amazing: SLI, 3D, and PhysX support. At one point I had two G1 970s in SLI and an EVGA SC 780, and that combination could actually play at 50-60 fps in 4K even with 4x AA. That's why I got the unusual X79 mobo that I'm still using. At first I loved the 3D too, but whether it's drivers or something else, there's a lot of ghosting in the renders now.

1080TI just crushes BioShock Infinite in 1440p. Really barely breaks a sweat. Haven't tried it in 4k.

Tomb Raider 2013 with the Ti can hold 70-100 fps fully maxed on the built-in bench at 1440p, but we all know those built-in benches aren't that accurate. Another game that was really advanced for its time; once again, SLI and 3D support. The game still looks great in 3D too. My 1080 SLI rocks it at 4K at 60 fps on the same test.

Assassin's Creed IV at max can also hold its own with the Ti fully maxed at 1440p: 30-50 fps! Another game that had PhysX. Once again, my 970s and 780 actually did better because of the dedicated card.

I can't wait for the next-gen cards so I can put an 1180 Ti or whatever in my SLI rig and use the 1080s as dedicated PhysX cards for these old rigs. Ought to be a lot of fun in 4K when the time comes.
 