What 'Optimization' Really Means In Games

HardOCP News

It's funny, you hear people talk about it all the time but do you really know what goes into optimizing a PC game? This is a very informative article and the author has even split it into three separate pages to make it easier to digest.

Complaints about bad optimization are often shorthand for 'this game doesn’t run well on my PC.' But is that performance really the fault of the game’s programming not being written as efficiently as possible? How do you really know if a game is poorly optimized, or simply pushing your PC to its limits? This article will try to clarify what optimization actually is, give an overview of some technical features like lighting and anti-aliasing which are inherently computationally expensive, and shed some light on how developers actually determine graphics presets and system requirements.
 
This article needs to be mandatory reading for all gamers who feel like sharing thoughts on the internet. It feels like 90% of game criticism on Steam and Reddit is just children saying "hurr durr it isn't optimized enough, why are these devs so bad at optimizing!!!". I always feel like these people took that late night game school ad on TechTV seriously and now believe "all they have to do is push a button to tighten up the graphics on level 3".
 
Looking at the graphics fidelity of Planetside 2 before and after its OMFG optimization, optimization means dumbing down graphics for everyone, even people who put the graphics settings on Ultra. They should have left the old graphics available for those who wanted it.
 
To me, I consider FEAR a badly optimized game. When that game released, I had WAY over the recommended specs. I even had the 7800 GTX 512MB that had just been released at the time. That game ran like utter shit. Games that looked far better ran better than that POS.

I later spoke to one of the people that worked on that game, and my assumptions were mostly correct. So, I don't feel bad for shitting on that game.

I usually don't use "optimization" as an excuse. Sometimes I do. Mostly, it's when a game runs worse than it should while not doing anything other games don't. Now, I use that complaint to explain bad PC ports. Like, I know that Rise of the Tomb Raider is an intense game. My hardware is responsible for the performance, not really the game. I feel it's been well optimized and patched for support. So, I don't talk shit about that game. Bioshock Remaster? Badly handled game release.
 
Optimization, like many words, has lost its meaning over the past decade. I always think that people who throw this word around all the time in the context of video games should get a job as a developer to see what it really means and see how "easy" it is.
 
The entire argument of the article is based on the flawed definition of "good optimization" given on the first page. Optimization is an extremely difficult task, more so than simple bug squashing in code but similar in discipline. The author is trying to pass the buck onto the consumers with such stark condescension, "I'm going to use 3 pages to tell you to lower your graphics settings." The arrogance displayed in believing all gamers are mouth breathers is staggering. This oversimplified explanation doesn't at all describe the process developers use to optimize their game against the engine they're using. It's a complete waste of an article and doesn't even delve deep enough into the points he is trying to make to pull any value from it.


I feel like I should send a bill to Peter Thoman for the time it took me to read this article.
 
Looking at the graphics fidelity of Planetside 2 before and after its OMFG optimization, optimization means dumbing down graphics for everyone, even people who put the graphics settings on Ultra. They should have left the old graphics available for those who wanted it.

THIS OMG SO FUCKING MUCH JESUS CHRIST IT MAKES ME ANGRY FUCK! Too many PC gamers are just entitled fucking crybabies when their "super rig" doesn't run at Ultra quality. Then what happens? The developers patch it by making High the new Ultra, and then people praise the developers for fixing it. No, they just fucking set the graphics ceiling lower because you idiots don't know how to do it yourself. I can't believe Planetside 2 did that; I was wondering why the game looked like shit when I played it a couple of years after release. Not many developers have the balls to let people stress the fuck out of their system these days. Crysis pushed boundaries, god damnit, and now players are trying to put them back up.
 
Optimization, like many words, has lost its meaning over the past decade. I always think that people who throw this word around all the time in the context of video games should get a job as a developer to see what it really means and see how "easy" it is.

It's like p0rn. You may not be able to define it but you know it when you see it.
 
From the article: "But is that performance really the fault of the game’s programming not being written as efficiently as possible? How do you really know if a game is poorly optimized, or simply pushing your PC to its limits?"

Presuming the computer has a reasonably updated OS and doesn't have tons of bloatware (or the effectively detrimental equivalent, such as running a dozen virtual machines off the computer), if the specs of the complainer's computer greatly exceed the specs of the console, and yet the console delivers higher framerates and better graphics candy for a given resolution, then the PC port's optimization is likely suspect.
 
Real optimization would be some balance of several things. It would be writing good, well-documented, tight code that doesn't use roundabout or bloated means to achieve something; code that doesn't waste CPU cycles doing a task that could be written to do it in fewer cycles (early on this meant using more assembly code). I would also say that setting realistic boundaries for what machines you're targeting is important, so that A) the game actually runs well on your specified platform and B) people/customers don't have false expectations of it running well if they don't meet the target requirements. Then, yes, I do believe that some scaling back would typically be in order from possibly over-ambitious initial ideas. Adding in a ton of options is also a good way, because it allows the end user to do a bit of optimization themselves (within the program's parameters).
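To make the "fewer cycles" point concrete, here is a minimal sketch (made-up names, not any particular engine's code) of the kind of redundant work tight code avoids: the same loop with and without a loop-invariant value hoisted out.

```cpp
#include <algorithm>
#include <vector>

struct Entity { float x, y, brightness; };

// Wasteful: the inverse-radius factor is recomputed for every entity,
// even though it only depends on per-frame state.
void shadeWasteful(std::vector<Entity>& entities, float lightRadius) {
    for (auto& e : entities) {
        float invRadiusSq = 1.0f / (lightRadius * lightRadius);  // same value every iteration
        e.brightness = 1.0f - std::min(1.0f, (e.x * e.x + e.y * e.y) * invRadiusSq);
    }
}

// Tighter: the loop-invariant part is computed once per frame.
void shadeTight(std::vector<Entity>& entities, float lightRadius) {
    const float invRadiusSq = 1.0f / (lightRadius * lightRadius);  // hoisted out of the loop
    for (auto& e : entities) {
        e.brightness = 1.0f - std::min(1.0f, (e.x * e.x + e.y * e.y) * invRadiusSq);
    }
}
```

A compiler will often catch something this trivial on its own; the point is the habit, since the real-world versions of this (repeated lookups, redundant allocations, recomputed matrices) are not always optimized away for you.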

To me, if any scaling back is done, it should be as invisible to the user as possible. For example, if done properly, texture compression doesn't really give much loss in overall scene detail, while reducing the load on graphics memory considerably.
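For a rough sense of the savings (standard block-compression numbers, not tied to any one engine): a 4096x4096 texture stored as uncompressed RGBA8 takes 64 MB of video memory, while the same texture as BC1/DXT1 at 4 bits per pixel takes 8 MB, or 16 MB as BC3/DXT5 at 8 bits per pixel, before mipmaps are added.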

I would also suggest that properly balancing loads between CPU cores, the GPU, precision levels, and disk access would be another place where some optimization could be done.

There is quite a bit more too.

I'd say look at id software, former Starbreeze, maybe Epic, etc. if you want to know how to do this correctly.
 
My favorites are the games that shortcuts were taken, so when you change OS or hardware the game breaks.
 
My favorites are the games that shortcuts were taken, so when you change OS or hardware the game breaks.

Let me introduce you to some games from the 90s that had 16 bit code in them. :D

(though to be honest, most of them work in VMs now. :D )

Another favorite is games that have hard-coded hardware checks and won't even ATTEMPT to work if they don't see ancient hardware in the system. :p We're sorry, we can't find your Direct3D 6 hardware. (Or Voodoo hardware, though that case is slightly more understandable API-wise...)
 
The games that break from too much VRAM, or too much system RAM, or break if there are more than 4 cores, or break because something changes in DirectX.

Just glad there are DOSBox and nGlide to help save us with the really old ones.
 
>Bringing Planetside 2 into it and saying all the performance increases were just from lowering textures

Planetside 2 on launch ran on one, repeat, ONE thread, and when people complained about it, Smedley went "You don't know how to program; trust me, this is as good as it gets in terms of performance."

Then SONY wanted PS2 on the PlayStation 4, which required, GASP, optimizing it to run on inferior hardware. So they had to go back and lower some textures, but the massive boost in performance came from rewriting the entire game engine: not only from x86 to x64 code, but also from a single thread to up to four threads.

This was the same game Smedley said "couldn't be optimized any further" until his bosses higher up made him fix it.

And THAT is why Planetside 2 runs way better today, not because they lowered some texture map counts.
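For anyone curious what going from "one thread" to "up to quad thread support" actually looks like in code, here is a rough sketch of the general shape of that kind of change. Purely illustrative, with made-up names; it is not Forgelight code.

```cpp
#include <algorithm>
#include <thread>
#include <vector>

struct Entity { float x = 0, vx = 0; };  // stand-in for real simulation state

void updateRange(std::vector<Entity>& entities, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i)
        entities[i].x += entities[i].vx * dt;  // stand-in for real per-entity work
}

// Old style: the whole world updated on one thread.
void updateSingleThreaded(std::vector<Entity>& entities, float dt) {
    updateRange(entities, 0, entities.size(), dt);
}

// New style: the same work split across up to four worker threads.
void updateMultiThreaded(std::vector<Entity>& entities, float dt) {
    const size_t workers =
        std::max<size_t>(1, std::min<size_t>(4, std::thread::hardware_concurrency()));
    const size_t chunk = (entities.size() + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (size_t w = 0; w < workers; ++w) {
        const size_t begin = w * chunk;
        const size_t end   = std::min(begin + chunk, entities.size());
        if (begin < end)
            pool.emplace_back(updateRange, std::ref(entities), begin, end, dt);
    }
    for (auto& t : pool) t.join();
}
```

The x64 move and the texture work matter too, of course; this only shows the threading piece.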
 
As a game dev, I look at it this way:

  • you have 2 games, Game A and Game B.
  • Game A and Game B look similar in terms of graphics fidelity and on-screen action etc.
  • Game A does not run as smoothly as Game B.
This leads me to the conclusion that:
  • Game A is not optimised as well as Game B.
Essentially, if it looks as good as DOOM, but runs worse, it isn't as well optimised as DOOM. Therefore it is (by comparison) poorly optimised.

If your first instinct is to crank settings to 11 and then complain that your Wal-Mart special can't run the game smoothly, you have no idea what optimisation is.
 
"Good optimization means that the game works at the same framerate across a wide range of hardware specs, including low-end configurations."

That must be where the console ports' "30 fps is enough" mentality comes from.

As a game dev, I look at it this way:

  • you have 2 games, Game A and Game B.
  • Game A and Game B look similar in terms of graphics fidelity and on-screen action etc.
  • Game A does not run as smoothly as Game B.
This leads me to the conclusion that:
  • Game A is not optimised as well as Game B.
Essentially, if it looks as good as DOOM, but runs worse, it isn't as well optimised as DOOM. Therefore it is (by comparison) poorly optimised.

If your first instinct is to crank settings to 11 and then complain that your Wal-Mart special can't run the game smoothly, you have no idea what optimisation is.

I agree.
Optimizing is about getting the best end result with the least amount of resource usage.
 
Let me introduce you to some games from the 90s that had 16 bit code in them. :D

(though to be honest, most of them work in VMs now. :D )

Another favorite is games that have hard-coded hardware checks and won't even ATTEMPT to work if they don't see ancient hardware in the system. :p We're sorry, we can't find your Direct3D 6 hardware. (Or Voodoo hardware, though that case is slightly more understandable API-wise...)

Requiem, an old shooter, would only run under DX6. It would not install under any newer version, and there was no way to bypass the check in the installer. :banghead:
 
To me, good optimization means good hardware scaling. If I have four cores, utilize them with tangible benefit, like better visuals or near-linear FPS improvements. If I have hyperthreading, use it. If I have SLI, give me as near to 100% utilization on both cards as possible for the thing I am attempting to run.

And then give me the option of what I want more of, visuals or frames.
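A rough sketch of what that kind of hardware scaling could look like at startup. Every name and number here is invented for illustration; it is not any real engine's configuration code.

```cpp
#include <thread>

struct GraphicsSettings {
    unsigned workerThreads;  // simulation / render-prep threads
    float    renderScale;    // 1.0 = native resolution
    int      shadowMapSize;  // pixels per side
};

GraphicsSettings autoConfigure(bool preferFrameRate) {
    GraphicsSettings s{};

    // Use the cores (and hyperthreads) that are actually there,
    // leaving one for the main/render thread.
    const unsigned hw = std::thread::hardware_concurrency();  // may report 0 if unknown
    s.workerThreads = hw > 1 ? hw - 1 : 1;

    // Let the player choose where the headroom goes: visuals or frames.
    s.renderScale   = preferFrameRate ? 0.85f : 1.0f;
    s.shadowMapSize = preferFrameRate ? 1024  : 2048;
    return s;
}
```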
 
As a game dev, I look at it this way:

  • you have 2 games, Game A and Game B.
  • Game A and Game B look similar in terms of graphics fidelity and on-screen action etc.
  • Game A does not run as smoothly as Game B.

The problem comes with the second step, because how do you quantify "similar"? As a game dev you're in a better position to make that evaluation, but back when Crysis was crushing people's machines a lot of people moaned that CoD4 looked "as good" but ran way better. Of course CryEngine was far more sophisticated than CoD's ageing, id-tech-derived engine, but CoD had some excellent art direction and much more linear design. And when Crytek released Warhead to appease all the people who complained, it was very obvious (if you looked for it) that they achieved their "optimisations" at the expense of image quality.
 
Without having read the article or the thread, the word "optimization" would, under ideal conditions, have the dual meaning to me of:
"to reduce the workload on the system as much as possible while still getting effectively the same result"
or
"to increase quality as much as possible while keeping the workload on the system the same"
but conditions are never ideal and the devs need to choose how much more workload they are willing to allow in order to increase the quality as much as they need to.
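As a concrete example of the first definition (same perceived result, less work), this is roughly what distance-based level-of-detail selection looks like. The thresholds and names are invented for illustration only.

```cpp
struct Mesh { int triangleCount; };

struct LodSet { Mesh high, medium, low; };

// Pick a cheaper mesh the farther the object is from the camera;
// up close the detail is visible, far away the difference is hard to see.
const Mesh& selectLod(const LodSet& lods, float distanceToCamera) {
    if (distanceToCamera < 20.0f) return lods.high;
    if (distanceToCamera < 80.0f) return lods.medium;
    return lods.low;
}
```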
 
To me, there is no 'optimised'; it's more a question of whether a game is simply inefficient or actually badly coded.

Deus Ex: MD might be just plain inefficient, as in if I had a single GPU twice as powerful as a Titan X, I'd be able to play it smoothly.

Badly coded means it runs like crap no matter what hardware you run it on, e.g. GTA 4.

Both fall under bad optimisation, as per current trend, but I have far less of an issue with the former than I have with the latter.
 
>Bringing Planetside 2 into it and saying all the performance increases were just from lowering textures

Planetside 2 on launch ran on one, repeat, ONE thread, and when people complained about it, Smedley went "You don't know how to program; trust me, this is as good as it gets in terms of performance."

Then SONY wanted PS2 on the PlayStation 4, which required, GASP, optimizing it to run on inferior hardware. So they had to go back and lower some textures, but the massive boost in performance came from rewriting the entire game engine: not only from x86 to x64 code, but also from a single thread to up to four threads.

This was the same game Smedley said "couldn't be optimized any further" until his bosses higher up made him fix it.

And THAT is why Planetside 2 runs way better today, not because they lowered some texture map counts.

It's true they had a lot of optimization to do, and they did do it in Planetside 2 with O:MFG. But they also permanently made the game look worse, even on max settings, and hoped no one would notice.
 
To me, I consider FEAR a badly optimized game. When that game released, I had WAY over the recommended specs. I even had the 7800 GTX 512MB that had just been released at the time. That game ran like utter shit. Games that looked far better ran better than that POS.

I later spoke to one of the people that worked on that game, and my assumptions were mostly correct. So, I don't feel bad for shitting on that game.

I usually don't use "optimization" as an excuse. Sometimes I do. Mostly, it's when a game runs worse than it should while not doing anything other games don't. Now, I use that complaint to explain bad PC ports. Like, I know that Rise of the Tomb Raider is an intense game. My hardware is responsible for the performance, not really the game. I feel it's been well optimized and patched for support. So, I don't talk shit about that game. Bioshock Remaster? Badly handled game release.

FEAR had issues with 512 MB of RAM and below; when I put in 512 MB more for 1 GB total, FEAR ran like butter (40+ FPS minimums, which in those days was very good).
I even have videos of gameplay somewhere. I think your memory is terribad...or you had a bugged system.
 
This is why consoles get AAA titles first and exclusives. Static platform vs. open platform.
One of the things that bothers me: why don't devs collaborate on a standard, or some sort of tiered performance levels? I mean, VR has it, why not PC games?
Gamers could run the test and see where their PC fits. Tier 1, 2, or 3.
When you buy a game, you would know whether you can play it or not. Sure, there are tools, but not one standard.
DX9, 10, 11, 12... OpenGL, Vulkan. Then throw in GCN, Pascal, PhysX, GameWorks, CUDA, TressFX and countless other tech-oriented buzzwords, and it's no wonder consoles are the preferred choice for the younger generation. Don't get me wrong, PC gaming has come a long way since the days of editing config files to load SoundBlaster and mouse drivers while trying to free up 600KB or more of conventional memory, but PC gamers deal with bad drivers, good drivers, patches, OS updates, troubleshooting conflicts, tweak guides, closing apps, overclocking, better cooling, more RAM, faster RAM, faster GPUs, faster CPUs... all so we can play a stupid game that millions of console owners just put in the disc, sit on the couch, and play, soon to be in 4K. Yeah, PC gamers are a special breed.
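The tier idea could be as simple as timing a fixed workload and bucketing the result. Here is a very rough, CPU-only sketch with arbitrary thresholds; a real standard would obviously have to exercise the GPU and be agreed on by vendors, so treat this as hypothetical.

```cpp
#include <chrono>
#include <cmath>
#include <cstdio>

int benchmarkTier() {
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();

    volatile double sink = 0.0;  // keep the loop from being optimized away
    for (int i = 1; i <= 20000000; ++i)
        sink = sink + std::sqrt(static_cast<double>(i));

    const double ms =
        std::chrono::duration<double, std::milli>(clock::now() - start).count();

    if (ms < 50.0)  return 1;  // fast machine
    if (ms < 150.0) return 2;  // mid-range
    return 3;                  // budget box, plan on low settings
}

int main() {
    std::printf("This PC falls into Tier %d\n", benchmarkTier());
}
```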
 
From the article: "But is that performance really the fault of the game’s programming not being written as efficiently as possible? How do you really know if a game is poorly optimized, or simply pushing your PC to its limits?"

Presuming the computer has a reasonably updated OS and doesn't have tons of bloatware (or the effectively detrimental equivalent, such as running a dozen virtual machines off the computer), if the specs of the complainer's computer greatly exceed the specs of the console, and yet the console delivers higher framerates and better graphics candy for a given resolution, then the PC port's optimization is likely suspect.

That's the entire POINT of consoles: better performance at the expense of portability. Consoles should CERTAINLY perform better and look better on lower-end hardware than a PC; that's the entire POINT.
 
I plan to read the article a bit later, but generally "optimization" means disabling or cutting things.

My last boss: make the game (WebGL) run faster!

Then I did, and about a week later he started complaining that he could hear his laptop fans doing what they're supposed to do.

I knew this was going to happen, so I just limited the thing to 25 fps.

90% of the people are just clueless morons.
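The post above was about WebGL, but the frame-cap trick is the same in any render loop: measure how long the frame took and sleep away the rest of the 1/25 s budget. A minimal sketch, not the poster's actual code:

```cpp
#include <chrono>
#include <thread>

void runLoopCappedAt25Fps(bool& running) {
    using namespace std::chrono;
    const auto frameBudget = duration<double>(1.0 / 25.0);  // 40 ms per frame

    while (running) {
        const auto frameStart = steady_clock::now();

        // updateAndRender();  // whatever the frame actually does

        const auto elapsed = steady_clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
    }
}
```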
 
To me, I consider FEAR a badly optimized game. When that game released, I had WAY over the recommended specs. I even had the 7800 GTX 512MB that had just been released at the time. That game ran like utter shit. Games that looked far better ran better than that POS.

I later spoke to one of the people that worked on that game, and my assumptions were mostly correct. So, I don't feel bad for shitting on that game.

What the hell? I had a 7800gtx back in the day and FEAR ran perfectly fine for me. I set everything to max settings and never fiddled with them ever again. In fact, that was one of my favorite games that year and I played it a shit ton.
 
Sins of a Solar Empire is a good example of an un-optimized pile of code. Single-threaded design, so late game? Everything stutters. No multi-core coding involved. It's a fun game until it becomes unplayable.
 
There is never such a thing as apples to apples when it comes to one game's graphical quality and performance vs. any other game.

Yes games use common standard APIs like DX11, DX12, OpenGL, Vulkan, etc... but ultimately every game engine is unique. There are infinite ways to code an engine to render a spinning cube with dynamic lighting and just because Game A uses the same API as Game B doesn't mean performance should be identical.

This ultimately comes down to the simple fact that PC gamers can argue about bad performance since the hardware and software ecosystem of PCs is so diverse. "Blame the player, not the game".
 
What the hell? I had a 7800gtx back in the day and FEAR ran perfectly fine for me. I set everything to max settings and never fiddled with them ever again. In fact, that was one of my favorite games that year and I played it a shit ton.

Ran pretty well on 7600GTs as well.
 
What the hell? I had a 7800gtx back in the day and FEAR ran perfectly fine for me. I set everything to max settings and never fiddled with them ever again. In fact, that was one of my favorite games that year and I played it a shit ton.

It ran great for me too. I can't remember exactly what I had then, but I think it was a pair of 6800GTs or something like that in SLI. The game ran really nicely for me.
 
>Bringing Planetside 2 into it and saying all the performance increases were just from lowering textures

Planetside 2 on launch ran on one, repeat, ONE thread, and when people complained about it, Smedley went "You don't know how to program; trust me, this is as good as it gets in terms of performance."

Then SONY wanted PS2 on the PlayStation 4, which required, GASP, optimizing it to run on inferior hardware. So they had to go back and lower some textures, but the massive boost in performance came from rewriting the entire game engine: not only from x86 to x64 code, but also from a single thread to up to four threads.

This was the same game Smedley said "couldn't be optimized any further" until his bosses higher up made him fix it.

And THAT is why Planetside 2 runs way better today, not because they lowered some texture map counts.

It is very likely that the single-threaded version was not only easier to code but also EXACTLY what the PC version needed to be playable. Maybe they didn't have the man-hours budgeted to make a multi-threaded PC version.

Sony coming in and bankrolling the game for the PS4 has nothing to do with the game being poorly optimized for PC. At that point they had more money and more incentive to start from the ground up for that specific platform. Again, the single-threaded 'unoptimized' PC version was actually optimized exactly for their target PCs at the time, and from what I can gather, many people were able to max it out without much issue.

Just because gamer A has an 8-core CPU and gamer B has a 4-core CPU does not entitle gamer A to complain about cores going unused in any given application. Parallel programming is not only difficult but can sometimes even produce worse performance.
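To put a number on that last point, here is a tiny self-contained example where the "parallel" version can easily lose to the serial one, because starting four threads costs more than summing 10,000 integers in the first place. Timings vary by machine; this is only a sketch.

```cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

long long sumRange(const std::vector<int>& v, size_t begin, size_t end) {
    return std::accumulate(v.begin() + begin, v.begin() + end, 0LL);
}

int main() {
    std::vector<int> data(10000, 1);  // deliberately tiny workload

    auto t0 = std::chrono::steady_clock::now();
    long long serial = sumRange(data, 0, data.size());
    auto t1 = std::chrono::steady_clock::now();

    // "Parallel" version: four threads, each summing a quarter of the data.
    long long partial[4] = {};
    std::vector<std::thread> pool;
    const size_t chunk = data.size() / 4;
    for (int w = 0; w < 4; ++w)
        pool.emplace_back([&, w] { partial[w] = sumRange(data, w * chunk, (w + 1) * chunk); });
    for (auto& t : pool) t.join();
    long long parallel = partial[0] + partial[1] + partial[2] + partial[3];
    auto t2 = std::chrono::steady_clock::now();

    using us = std::chrono::microseconds;
    std::printf("serial   = %lld in %lld us\n", serial,
                (long long)std::chrono::duration_cast<us>(t1 - t0).count());
    std::printf("parallel = %lld in %lld us\n", parallel,
                (long long)std::chrono::duration_cast<us>(t2 - t1).count());
}
```

Give each thread enough real work and the parallel version starts winning, which is why it's a judgment call per workload, not a checkbox.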
 
What the hell? I had a 7800gtx back in the day and FEAR ran perfectly fine for me. I set everything to max settings and never fiddled with them ever again. In fact, that was one of my favorite games that year and I played it a shit ton.
Had a 7800 GS AGP in a system with 2GB of RAM and an AMD Athlon XP 2000+, and the game ran perfectly fine for me as well at 1600x1200.
 
FEAR had issues with 512 MB of RAM and below; when I put in 512 MB more for 1 GB total, FEAR ran like butter (40+ FPS minimums, which in those days was very good).
I even have videos of gameplay somewhere. I think your memory is terribad...or you had a bugged system.

I had way more than 512MB of RAM. Unless you're talking about VRAM; 512MB was the max you could get at the time of FEAR. Again, I had purchased that 7800 GTX 512MB edition when it came out.
 
That's the entire POINT of consoles: better performance at the expense of portability. Consoles should CERTAINLY perform better and look better on lower-end hardware than a PC; that's the entire POINT.

Yay for additional software layers making stuff run slower.

However, things should be getting better with DX12, Vulkan, etc.

Get some of the useless crap out of the way and let the games access the hardware itself instead of having to go through a bunch of hoops.
 
Optimization to me is about spending development cycles to improve performance. If you just crank out code to be used for multiple platforms and don't spend any time tweaking things for those platforms, it's not optimized.
 
As stated above, the article is missing the point. I think people usually mean "badly optimized" as "running much worse on a PC than on a console with comparable graphics". The idea of comparing it with other PC games comes up when the person making the comparison is desperately trying to make the point that their PC has enough to run games well. It originates from times before consoles had anything to do with hardware performance.

Why not include a proper benchmark option in PC games?
GTA 4 is a good example of the badly optimized games mentioned above, even though there are many settings to tweak. The problem is that you don't know what exactly you are changing resource-usage-wise, even with the predicted VRAM usage (and in the end you find out that the cake is a lie anyway).

The basis for improvement could be any benchmark like the one in Mafia 2, which is a pretty basic one. The problem with those is that they seem to be meant for after you've already figured out what settings you want; how could you do that before running the game engine?
The upgrade would be to be able to change settings quickly and easily between runs and get feedback on resource usage after each run. Nothing more than what the logger in MSI Afterburner, for example, can record. Framerate is something you can see anyway, but it won't make it easy to determine which settings should be modified.
More and more people could use something like this, because there are many who won't upgrade their CPUs for years, for the little gains, but who do have a new(er) GPU or two.
For the ones who would complain if the above was used (and was also suggested by customer support as a solution to fend off complainers):
If you don't want to change individual settings, then go with a preset that runs well enough for your liking, or get a console. If you do want to fiddle around but don't want to deal with managing resources: tough... buy a console and maybe it will suit you.

As a side note, there are games like F1 2014 which can run out of VRAM without any warning. There is probably hardware out since its release that has the resources to handle anything it can generate, but anything less can be pushed over its limits, and that results in a crash. That one crashed more times than all the games I've ever played put together.
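Along the lines of the benchmark-with-feedback idea above, the logging half of it really isn't much code. A minimal sketch, illustrative only and not from any real engine:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Collects frame times during a fixed benchmark run and reports the numbers
// a player would want when comparing settings between runs.
struct BenchmarkLog {
    std::vector<double> frameTimesMs;

    void onFrame(double frameTimeMs) { frameTimesMs.push_back(frameTimeMs); }

    void report() const {
        if (frameTimesMs.empty()) return;
        double total = 0.0, worst = 0.0;
        for (double ms : frameTimesMs) {
            total += ms;
            worst = std::max(worst, ms);
        }
        const double avgMs = total / frameTimesMs.size();
        std::printf("avg %.1f FPS, worst frame %.1f ms (%.1f FPS)\n",
                    1000.0 / avgMs, worst, 1000.0 / worst);
    }
};
```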
 
I had way more than 512MB of RAM. Unless you're talking about VRAM; 512MB was the max you could get at the time of FEAR. Again, I had purchased that 7800 GTX 512MB edition when it came out.

Then you had a borked system...FEAR screamed on my rig of the time (and it was even AGP, with a Gainward 7800GS+, a 7800 GTX 512MB on AGP)...and we are talking a single-core Pentium with RDRAM.

I am also not the only one questioning your "experience"...just saying...
 
The problem comes with the second step, because how do you quantify "similar"? As a game dev you're in a better position to make that evaluation, but back when Crysis was crushing people's machines a lot of people moaned that CoD4 looked "as good" but ran way better. Of course CryEngine was far more sophisticated than CoD's ageing, id-tech-derived engine, but CoD had some excellent art direction and much more linear design. And when Crytek released Warhead to appease all the people who complained, it was very obvious (if you looked for it) that they achieved their "optimisations" at the expense of image quality.

Not a game dev, but I made some amateur maps for COD4, which as you noted uses an id-derived engine. I learned in doing so that optimizing is not just about code; at least with FPS shooters it can also be about map design. With Radiant, the id map design tool Activision devs use for COD, there are map-making techniques, such as brushes (called "portals") that dictate what gets rendered in the FOV and what doesn't, that help optimization for the end user. They are time-intensive to set up properly. You could see that maps designed by real game devs made heavy, heavy use of these tools all over the map, and they ran great, while amateur maps that failed to use these techniques ran like shit. So it's not just "coding," it's (potentially) every element of a game's design.
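For anyone who hasn't run into portals before, this is roughly the concept being described: rooms (sectors) are linked by portals, and only rooms reachable through portals from the player's room get submitted for rendering. A simplified sketch of the idea, not Radiant's or the engine's actual implementation; a real engine also clips the view frustum against each portal polygon.

```cpp
#include <unordered_set>
#include <vector>

struct Room;

struct Portal {
    Room* target;  // the room on the other side of this portal
};

struct Room {
    std::vector<Portal> portals;
    // ...geometry to render if this room is visible
};

// Walk outward from the player's room through portals, collecting every room
// that could be visible. Everything not collected is skipped entirely.
void collectVisibleRooms(Room* room, std::unordered_set<Room*>& visible, int depth = 0) {
    if (!room || depth > 8 || !visible.insert(room).second)
        return;  // null, too deep, or already visited
    for (const Portal& p : room->portals)
        collectVisibleRooms(p.target, visible, depth + 1);
}
```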
 
Then you had a borked system...FEAR screamed on my rig of the time (and it was even AGP, with a Gainward 7800GS+, a 7800 GTX 512MB on AGP)...and we are talking a single-core Pentium with RDRAM.

I am also not the only one questioning your "experience"...just saying...

Agreed, he had a borked system or an unstable overclock. I easily managed a 100+ minimum frame rate on low end hardware at the time of F.E.A.R.'s release. Another common culprit is folks who used the horrible Logitech drivers at the time which had a conflict with the game.
 