Last Gen Games - Max IQ and Perf on Today's GPUs @ [H]

I do have to agree with you guys about how SLI scaling has really dropped off the edge of the world. Using two cards for at most maybe a 25% gain, with a realistic number being closer to 10-20%, isn't cost effective in any sense. On the other hand, a single 1080 Ti is just so close, so often, to that sweet 60 fps/4K spot with everything cranked that 1080 SLI setups actually do get you there. Not cost effective, but they do help.

I really have to give you guys props for the actual testing and metrics. I just tinker for the heck of it, so by no means are my numbers as credible as yours. Thanks for all the hard work and time.
 
I find the experience is mixed when doing this, but I also run at 4k and try to use max settings.

Generally, older games run much more easily than newer ones, but there are exceptions. The opening scene of Metro 2033 from 2010 still drops my system well below the 60 fps I'd like to hold.

I recently replayed Deus Ex Human Revolution, and that game was comically easy on the GPU, even at 4k.

Much older titles are all over the place. The original Deus Ex from 2000 pins a single CPU core no matter what you do. The NOLF games behave much better.
The original Deus Ex runs on the first-generation Unreal Engine, which is notorious for being CPU-limited and wants at least a 1 GHz CPU to run smoothly. Between that game and Unreal Tournament '99, I've noted that the engine runs like crap on K6-2 350s and G3 350-400s, and is barely adequate on a Celeron 533 Mendocino at 66 MHz FSB (I would've preferred a 300A or 366 that I could then jack up to similar clocks with a 100 MHz FSB), but it will fly if given an Athlon XP 1800+ or a G4 1.42 GHz and a decent GPU, meaning anything from the Voodoo5/GeForce/Radeon era onward.

Memory bandwidth is also critical; people benchmarking the Mac ports noted that interleaved memory provides a significant boost on the pre-SDRAM models, while the SDRAM-based models also gained a huge jump to a 100 MHz bus clock (up from 40-66 MHz on older models), which really helps shuttle data between the CPU and RAM. Even some modern engines are apparently held back by this on today's systems more than by anything else, like ArmA III's (which apparently gets a big FPS boost from fast DDR4 over older DDR3 platforms).

However, part of the problem with the early Unreal Engine isn't just that it didn't support SMP like id Tech 3 does (hence why Q3A runs way better on my dual-Celeron BP6 than UT'99 does), but also that it doesn't support hardware T&L at all, because it was heavily tuned around 3dfx Voodoo cards that never had it. I presume that adds to the CPU load significantly.
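
Just to make that concrete, here's a rough Python sketch (my own toy illustration, not anything from the actual engine) of the kind of per-vertex transform-and-lighting work that lands on the CPU when the GPU can't do T&L for you:

Code:
# Minimal sketch of what software transform & lighting costs the CPU.
# This is NOT Unreal Engine code, just an illustration of the per-vertex
# work a CPU takes on when the GPU has no hardware T&L.
import math

def make_rotation_y(angle):
    c, s = math.cos(angle), math.sin(angle)
    # 3x3 rotation matrix about the Y axis
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def transform_and_light(vertices, normals, matrix, light_dir):
    """Transform every vertex and compute simple diffuse lighting on the CPU."""
    out = []
    for (vx, vy, vz), (nx, ny, nz) in zip(vertices, normals):
        # rotate the vertex (9 multiplies, 6 adds per vertex)
        tx = matrix[0][0]*vx + matrix[0][1]*vy + matrix[0][2]*vz
        ty = matrix[1][0]*vx + matrix[1][1]*vy + matrix[1][2]*vz
        tz = matrix[2][0]*vx + matrix[2][1]*vy + matrix[2][2]*vz
        # simple Lambertian term for the lighting half of "T&L"
        diffuse = max(0.0, nx*light_dir[0] + ny*light_dir[1] + nz*light_dir[2])
        out.append((tx, ty, tz, diffuse))
    return out

# A few thousand visible vertices per frame (made-up count for illustration)
# means this loop runs thousands of times, 60 times a second, on one core.
verts = [(1.0, 0.0, 0.0)] * 5000
norms = [(0.0, 0.0, 1.0)] * 5000
frame = transform_and_light(verts, norms, make_rotation_y(0.1), (0.0, 0.0, 1.0))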

Where was Quake 3 Arena :)
I was totally expecting some Q3A and UT'99 in 4K 60+ FPS here when talking about "old games to the max", not current-gen (by PC standards) titles! Surely, those games can run at that resolution on today's hardware and still hum along smoothly, right?

I would've also mentioned Crysis, but that one's just way too obvious. Maybe I should bring up Jurassic Park: Trespasser instead, because that game was to 1998 what Crysis was to 2007: ridiculously ahead of its time, but also ridiculously system-intensive in the days when 300-500 MHz CPUs were current. (Remember, they had a reasonably realistic physics engine running on CPUs that slow, in real time!)
 
Not always; games like GTA V or Watch Dogs 2 never did well with multi-GPU due to the nature of the game or the implementation. Efficiency was often hard to get above 80%, and I remember the gains being very small with certain games, having hardly any effect. A lot came down to developer implementation as well as vendor support. It was inconsistent at times; certain games work better with it than others.

Depends on the resolution and graphics settings; at 1440p and up, SLI starts to make a difference in GTA V.
 
Very interesting article. It would be great to see another article like this targeting 2012-2014:

Middle Earth: Shadow of Mordor
FarCry 4
Watchdogs
Dragon Age: Inquisition
Assassin's Creed: Unity
Thief
Bioshock Infinite
Tomb Raider
Total War: Rome II
ArmA III (with 4x 38 man PLTs engaging minimum)
Crysis 3
Dishonored
Max Payne 3
FarCry 3
Hitman Absolution
 
Ugh, the main problem with this is that my computer is older than the 'oldest' games presented :cry:
 
I'm currently playing Mafia with mods @ 4K on my GTX 1070 Ti; really nice.
Next stop is the Witcher series, Doom (2005), AC 1-4, and the Max Payne series.

Damn, I just realized I have dozens of older games I'd like to replay at 4K.
 
Speaking of IQ, I wonder if real-time ray tracing would be possible on today's cards right now if it were rendered in the same manner as Maze War in 1973, out of hardware necessity at the time. They restricted rendering to 90-degree turns in any direction. A lot of early games were done that way; pretty much all the early 3D-looking dungeon crawlers were made as such. A lot of them were pretty fun in terms of game mechanics despite the restricted camera and rendering limitations, and today I'm sure they could be even better and more imaginative. I think it's interesting to ponder a modern Wizardry game with real-time ray tracing, but retaining the old-school 90-degree turn movement restrictions so the hardware can handle it.
 
Speaking of IQ, I wonder if real-time ray tracing would be possible on today's cards right now if it were rendered in the same manner as Maze War in 1973, out of hardware necessity at the time. They restricted rendering to 90-degree turns in any direction. A lot of early games were done that way; pretty much all the early 3D-looking dungeon crawlers were made as such. A lot of them were pretty fun in terms of game mechanics despite the restricted camera and rendering limitations, and today I'm sure they could be even better and more imaginative. I think it's interesting to ponder a modern Wizardry game with real-time ray tracing, but retaining the old-school 90-degree turn movement restrictions so the hardware can handle it.
ATI/AMD already had some real-time raytracing tech demos done in the HD Radeon 4x00 days, if I'm not mistaken. Old news; we've technically been capable of it with current GPGPU designs for a long time.

It's just that nobody wants to buckle down and write a renderer optimized for it. If anything, I'd like to see a raytracing renderer that deliberately evokes the look of '90s pre-rendered FMV in games like MegaRace, System Shock, MechWarrior 2, etc., complete with low resolution and possibly compression artifacts, but optimized to run on today's hardware efficiently in real-time.

This would go against people's expectations of raytracing looking as good or better than today's rasterized renderers, but it's meant to be a nostalgic look to begin with.
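
For what it's worth, at those deliberately low resolutions the basic math is dirt cheap. Here's a toy Python sketch of the idea (my own illustration, one sphere at 320x200 written out as a grayscale image, nothing like a shipping renderer):

Code:
# Toy ray tracer at "FMV era" resolution: one sphere, one light, 320x200.
# Purely illustrative; a real renderer would add bounces, materials, etc.
import math

WIDTH, HEIGHT = 320, 200              # deliberately low, early-'90s style
SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = (-0.577, 0.577, 0.577)    # roughly normalized directional light

def trace(ox, oy, oz, dx, dy, dz):
    # ray/sphere intersection via the quadratic formula
    cx, cy, cz = SPHERE_C
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    b = 2.0 * (dx*lx + dy*ly + dz*lz)
    c = lx*lx + ly*ly + lz*lz - SPHERE_R*SPHERE_R
    disc = b*b - 4.0*c
    if disc < 0.0:
        return 30                      # background shade
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0.0:
        return 30
    # hit-point normal -> simple diffuse shading
    hx, hy, hz = ox + dx*t - cx, oy + dy*t - cy, oz + dz*t - cz
    inv = 1.0 / math.sqrt(hx*hx + hy*hy + hz*hz)
    ndotl = max(0.0, (hx*LIGHT_DIR[0] + hy*LIGHT_DIR[1] + hz*LIGHT_DIR[2]) * inv)
    return int(40 + 215 * ndotl)

with open("frame.pgm", "w") as f:      # grayscale PGM, easy to inspect
    f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # camera rays through a simple pinhole at the origin
            dx = (x - WIDTH / 2) / WIDTH
            dy = -(y - HEIGHT / 2) / WIDTH
            dz = -1.0
            inv = 1.0 / math.sqrt(dx*dx + dy*dy + dz*dz)
            f.write(str(trace(0.0, 0.0, 0.0, dx*inv, dy*inv, dz*inv)) + " ")
        f.write("\n")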
 
So I've been playing around with The Division, testing the impact DSR has at the same image quality settings. What I found was that the result was actually more blurred and less sharp than native resolution. It's rather disappointing to see how badly implemented it is. Perhaps that's a byproduct of downsampling, but it's a bit shoddy by Nvidia. On top of that, I noticed the DSR settings below 4.00x get progressively more blurred relative to native resolution. Talk about a worst-case scenario. As if Nvidia hadn't already added blur to every image quality setting going: they made LOD bias worse, then added more blur to everything under the sun. I mean really, wtf, enough already.

Before anyone mentions DSR smoothing: it was set to 0 for optimal sharpness, and DSR 4.00x (or DSR as a whole) is still less sharp than native resolution, which is just saddening. Comically, though, Nvidia made it easy as pie to pile on additional blur with DSR smoothing from 0 up to +100, for that awful positive-LOD-bias-inspired muddy/soap-opera look. On the flip side of the coin, the image is already blurrier than native resolution, and there's nothing to let you sharpen it back up.

Nvidia's slogan should probably be changed to something a bit more appropriate to its image, like "inferior image quality: the green pill, the way you're meant to just shut up and swallow." They just keep shoving more and more image-quality compromises down consumers' throats around every twist and turn. I swear it feels as if they're more focused on adding features of that nature than ones that *gasp* improve image quality for the better.

The thing is, sharpness adds tons of clarity and depth perception and brings out all the details more than nearly any other setting, and blur just craps all over it. There is a point where you can make an image too sharp and run into halo effects or shimmering, but that's another topic, and in those situations a bit of selective, minimal blur mixed in might actually be warranted to address it. That, or don't go overboard on the sharpening. Personally, I'd rather deal with that than with blur.

To give you an example of how important sharpness is to overall image detail, here's a zip comparing 4K DSR 4.00x against the same image with IrfanView's sharpen/unsharp mask effect applied: 50 sharpness added, 100 sharpness added, and finally unsharp mask 4 on top of the 4K DSR 4.00x image. I don't know about the rest of you, but I'll take the unsharp mask results in a heartbeat. My aliasing settings suck, btw, for GTX 960 reasons; it was never intended for testing DSR 4.00x, but curiosity killed a cat or two.
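
If anyone wants to reproduce that kind of comparison without IrfanView, Pillow's built-in unsharp mask gets you in the same ballpark. A quick sketch (the filename and the radius/percent values are just placeholders I picked, not IrfanView's exact parameters):

Code:
# Quick-and-dirty unsharp mask comparison using Pillow (pip install Pillow).
# "dsr_4x.png" is a placeholder filename for your DSR 4.00x screenshot.
from PIL import Image, ImageFilter

src = Image.open("dsr_4x.png")

# UnsharpMask: radius = blur size, percent = strength, threshold = minimum contrast
for percent in (50, 100, 150):
    sharpened = src.filter(ImageFilter.UnsharpMask(radius=2, percent=percent, threshold=3))
    sharpened.save(f"dsr_4x_unsharp_{percent}.png")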

What you're saying is we need madVR for games?
 
ATI/AMD already had some real-time raytracing tech demos done in the HD Radeon 4x00 days, if I'm not mistaken. Old news; we've technically been capable of it with current GPGPU designs for a long time.

Have to be specific about 'capable of it'. Ray-tracing is fairly simple, and simple tech demos have been done in real-time for decades; our benchmark needs to be modern games, and we don't know if those could be rendered in real-time.

The next step, already seen in Battlefront II with a shiny helmet, appears to be putting ray-traced elements in games as an additional special effect. The technology can be introduced into the field with all interested parties getting a chance to work with it.

Then, if the technology is viable from an implementation perspective, i.e. the image quality benefits are there while the cost of entry is not too high and performance is acceptable, we might see more penetration into the market.
 
This is an awesome idea that can be applied all the way back to the beginning of PC gaming. Games that worked an 8088 PC hard flew on an 80286, so much so that the 'turbo' button was added so you could slow the PC down again. Then VGA graphics and the 386 gave developers more options that finally all looked great on the 486 and Pentium. And then came games taking advantage of GPUs, which later on could be emulated even faster by faster machines. Older software on newer hardware has always been a way to gain speed.

And the same applies to just regular computing. DOS batch files fly on the modern command prompt, even those that lean heavily on search algorithms across large numbers of files. Win 3.1 flew on our Cyrix 6x86 build back in the late 1990s, and 95 and 98SE flew on later P3-era Pentiums. Even today, on motherboards that supported XP, XP absolutely flew: Excel calculated quicker, PDFs popped up quicker, files copied faster, and it often did these things faster than the same hardware running the equivalent 'modern' OS. I actually purposely try to stay in this sweet spot of performance. Makes me want to pull out Need for Speed Underground and play it on our dual Xeon 5620 server with a GPU; I could probably turn on all the pretty settings there too. :)
 
I use an undervolted XFX 390X + FreeSync combo for 4K and it is fine.

My gaming target is 50 FPS minimum; anything lower and FreeSync drops out unless you mod the drivers. I use V-sync or frame rate target control to avoid tearing on some titles if I'm above the refresh rate.
I also turn off AA at 4K, personal preference.

I game on ultra on Verdun @ 75 FPS,
multiplayer Crysis 3 @ 60 FPS on very high settings,
Star Wars Battlefront on ultra @ 60 FPS steady (higher if V-sync is disabled),
Need for Speed on ultra, medium AO, @ 60 FPS, gorgeous in 4K,
Metro LL on very high, no PhysX, no SSAA, @ 50 FPS,
COD BO3 maxed out @ 90 FPS,
Alien: Isolation on ultra @ 60 FPS minimum,
BioShock Infinite on ultra @ 60 FPS minimum,
Titanfall on ultra @ 60 FPS with AO turned down a bit,
and Dirt Rally on ultra @ 60 FPS steady.

I see no need to upgrade until I leave the 60 Hz world.
Most modern titles are fine with the render resolution reduced to 80% of 4K, if the game allows you to change that setting.
If you put two identical systems side by side, one running at 80% render scale and one at 100%, you cannot tell the difference.
 
I don't really consider these titles to be last-gen games; they're still running on the same technology base used by everything else at the moment. I'd wind the clock back to stuff that was coming out on the 360/PS3 and fire up some UE3-based games.
 
Have to be specific about 'capable of it'. Ray-tracing is fairly simple, and simple tech demos have been done in real-time for decades; our benchmark needs to be modern games, and we don't know if those could be rendered in real-time.

The next step, already seen in Battlefront II with a shiny helmet, appears to be putting ray-traced elements in games as an additional special effect. The technology can be introduced into the field with all interested parties getting a chance to work with it.

Then, if the technology is viable from an implementation perspective, i.e. the image quality benefits are there while the cost of entry is not too high and performance is acceptable, we might see more penetration into the market.
If you're expecting Crysis or Star Citizen levels of quality in a ray-traced engine, that's a bit much, even for today's hardware. It's just not inherently efficient, and video game engines have historically been about getting the most efficiency out of limited hardware. (Well, assuming they're not coded by idiot programmers banking on Moore's Law to solve their performance problems for them.)

That's why I proposed a game that deliberately limits itself to 1980s/early 1990s-style ray-tracing optimized for today's processors, as a sort of deliberate stylization evoking the games people imagined we'd be getting during that whole CD-induced FMV MULTIMEDIA!! wave of the early-to-mid '90s, but without the inherent limitations of pre-rendering everything.

It'd be a nice change of pace from the usual sprite-based 2D graphics (which I like, but are hard to pull off correctly) and the faux-voxel trend set forth by Minecraft (when "voxel graphics" makes me instead think of early NovaLogic games, certain Shadow Warrior/Blood item rendering options, or Outcast).

Expecting a modern AAA game to go fully ray-traced now is unrealistic; it's a good way of getting yourself stuck in the same hole Ultima Underworld, System Shock 1, Trespasser, and Crysis all fell into, with system requirements so far ahead of their time that people call your game ridiculously unoptimized and don't even give it a second look.
 
That's why I proposed a game that deliberately limits itself to 1980s/early 1990s-style ray-tracing optimized for today's processors

I kinda got what you were saying before, but now I'm tracking :).

I introduced a previously proposed idea that ray-tracing be used alongside raster graphics as a form of special effects, just to get it in the engines and get development teams used to using it. You propose instead that games could be made with 'retro' fully ray-traced graphics.

Honestly, I think we're both right, and I'd love to see your idea come to fruition- if it's in a good game, graphics won't matter so much. Hell, I still play League of Legends on my sig system, and that could probably be rendered with real-time ray-tracing!
 
What I was suggesting when I brought up ray tracing is intentionally limiting the game to the design mechanics of older titles, from roughly Maze War up to around (but probably before) Ultima Underworld/Wolfenstein 3D, since that was the point where games really broke from the mold as the tech matured enough to let them. I'm glad NamelessPFG mentions being too far ahead of the curve. Wolfenstein 3D actually came out about two months after Ultima Underworld, but it ran better and was more well received at the time because of that, even though Ultima Underworld is arguably the better game on today's hardware. Ultima Underworld ran a lot worse partly because of the extra overhead of sloped surfaces and an unlocked view, though lighting was a factor as well. Go back to Maze War and it basically only had four-way turns and a locked view, purely due to 1973 hardware limitations.

The best approach would be to closely mimic the 3D progression of older games, from Maze War up to today. If the extra overhead ray tracing requires stops us at the rendering and game mechanics of a 1980s or early-1990s title, so be it; work within that until the hardware matures further. At the very least, if we can make rasterization inferior in limited scenarios, we could eliminate rasterization from those scenarios entirely and replace it with real-time ray tracing.

Technically, I don't see a reason why we couldn't have a game with rasterized open-world environments and ray-traced dungeon environments. A remake of an early Ultima game would probably be perfect for that, though the open world today would be a hell of a lot cooler and closer to Divinity: Original Sin or Grim Dawn-type graphics. You could even offer two contrasting sets of mechanics, real-time for the open world versus turn-based for the dungeon crawl, depending on where you're currently fighting. A rough sketch of the Maze War-style rendering constraint follows below.
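
To make that constraint concrete, the whole renderer boils down to marching rays through a tile grid with the camera locked to four facings. Here's a rough toy sketch in Python (my own illustration, not code from any actual game):

Code:
# Toy grid ray-caster with Maze War-style constraints: the camera sits on a
# tile and can only face north/east/south/west. One ray per screen column.
import math

MAP = [                     # 1 = wall, 0 = open floor
    "11111111",
    "10000001",
    "10110001",
    "10000101",
    "10000001",
    "11111111",
]
FACING_ANGLES = {"N": -math.pi / 2, "E": 0.0, "S": math.pi / 2, "W": math.pi}
FOV = math.pi / 2           # 90-degree field of view fits the 90-degree turns
COLUMNS = 40                # "screen" width in columns

def cast(px, py, facing):
    """Return one wall distance per screen column via simple ray marching."""
    base = FACING_ANGLES[facing]
    distances = []
    for col in range(COLUMNS):
        angle = base + (col / (COLUMNS - 1) - 0.5) * FOV
        dx, dy = math.cos(angle), math.sin(angle)
        dist = 0.0
        x, y = px + 0.5, py + 0.5      # start from the tile centre
        while dist < 16.0:             # small fixed step; a DDA would be exact
            x += dx * 0.05
            y += dy * 0.05
            dist += 0.05
            if MAP[int(y)][int(x)] == "1":
                break
        distances.append(dist)
    return distances

# Crude ASCII "render": nearer walls print as longer runs of '#'.
for d in cast(px=1, py=1, facing="E"):
    print("#" * max(1, int(8 / d)))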
 
Very interesting article. It would be great to see another article like this targeting 2012-2014:

Bioshock Infinite

Bioshock Infinite was really well optimized. Even a GTX 1060 gets about 40 fps average at 4k completely maxed.
 
Where was Quake 3 Arena :)
 
I didn't realize Rise of the Tomb Raider was considered an older game ... just goes to show what I don't know
 
ATI/AMD already had some real-time raytracing tech demos done in the HD Radeon 4x00 days, if I'm not mistaken. Old news; we've technically been capable of it with current GPGPU designs for a long time.

It's just that nobody wants to buckle down and write a renderer optimized for it. If anything, I'd like to see a raytracing renderer that deliberately evokes the look of '90s pre-rendered FMV in games like MegaRace, System Shock, MechWarrior 2, etc., complete with low resolution and possibly compression artifacts, but optimized to run on today's hardware efficiently in real-time.

This would go against people's expectations of raytracing looking as good or better than today's rasterized renderers, but it's meant to be a nostalgic look to begin with.

I remember just a few years ago when everyone and their grandma, their cousins, and the dog were doing ray tracing. Intel kicked things off with Larrabee, then quickly Nvidia, AMD, and even the PS3 and PowerVR had demos with it.
Unfortunately it quickly faded.

Hopefully it's making a comeback with Turing. The recent Star Wars demos were awesome (I think they were running on Volta).
 