10 Things To Do When Your PC Can’t Run A New Game

HardOCP News

Now I know none of you have ever had this problem, but this list is still pretty entertaining. My favorites on the list are the one about obsessing over benchmarks and the one about overclocking. :)

No matter how close your PC is to the specs listed in a given publication’s benchmark, you will not be able to run the game nearly as well as their benchmark suggests. This person at TechTimesPro is averaging 80fps with your card! This person at MegaBitz is getting 75fps. You’re getting, like, 32fps at best. Clearly this is your fault.
 
Now I know none of you have ever had this problem, but this list is still mildly entertaining. My favorites on the list are the one about obsessing over benchmarks and the one about overclocking. :)

No matter how close your PC is to the specs listed in a given publication’s benchmark, you will not be able to run the game nearly as well as their benchmark suggests. This person at TechTimesPro is averaging 80fps with your card! This person at MegaBitz is getting 75fps. You’re getting, like, 32fps at best. Clearly this is your fault.
I don't know about that. We have several users in this very forum who believe that just because they paid $600+ for their shiny new GTX 1080, they should be able to run all the games at 60+ FPS in 4K.

Believe? Check that: they feel entitled.
 
Hey, this game comes with an option to auto-detect optimal settings! Maybe that’ll make it run decently. You auto-detect your settings and, hmm.

You couldn’t get a steady frame-rate with MSAA turned on, but now it’s set to MSAA x8… and the shadows have been bumped up to something called “Nvidia GCSS RealWorldLyfe,”
wtf. I have a 1080 and have never seen Nvidia GCSS RealWorldLyfe...
 
11. Be pissed at the company that created the game because they didn't include more low-quality video settings. When the modernized version of XCOM came out, it ran slow on my i5 2400 laptop with all settings on low. "Well, it's a demanding engine," I thought. Later I learned that they had released an iPad version of the exact same game that ran smooth as butter (basically all the graphics settings were tuned even lower than is possible in the PC version). WHY DO YOU DO THIS TO ME?????
 
Really, I've never had any of these problems, even up to a few months ago when I was still running a GeForce 580 at 2560x1600. I just lived with frame rates in the 20s.
 
They forgot one: install, test, and then uninstall every GPU driver and beta release from the past two years to discover which one is "best". Then argue on the internet with others who've done the same but claim a different driver is best.
 
12. Mull over the benchmarks of game X. (None found.) Purchase the next generation up, find out that it's not powerful enough for the quality and performance you want. (Less than 60 FPS on max settings all the time!) Return the video card. Purchase the newest video card. GPU runs too hot. Look into watercooling. Look into watercooling everything, even the memory. Hell, what about oil cooling? (Might as well go balls deep.) Consider Koolance systems, or a Corsair system, or just air cooling because I'm a lazy PC gamer who doesn't mind dust bunnies... Maybe better case fans would do? Research the best case fans, weigh the options of LED bling, silence, high airflow, or static pressure.


I'm going through this stage now. Damn you XCOM2. Damn. You.
 
I'm not having any issues running any games atm.... :p

Granted, I'm sure there are some that would still run like crap just from poor coding; I just haven't seen 'em personally.
 
I don't know about that. We have several users in this very forum who believe that just because they paid $600+ for their shiny new GTX 1080, they should be able to run all the games at 60+ FPS in 4K.

Believe? Check that: they feel entitled.

I completely agree. If for whatever reason some have to turn down a setting to get a good experience, it's the end of the world. Of course, having just done a big upgrade 5 months ago with the latest and greatest at the time, it would be nice to make it a year and still be able to run things at least on the upper end of settings, and so far that hasn't been a problem. But 60+ FPS with everything maxed out in every game at 5760x1080? Not happening.

I think point #9 is one that gets missed a lot, especially with techies: just play the damned game without one eye on an FPS overlay.
 
So, you're saying bad optimization is something we should accept? And I know what the fucking word means.
 
Sometimes a game is good enough to play even with low frames. I played Torchlight 2 at 12fps at 1080p on low settings hooked up to a TV for days. It almost added to the experience, having mobs of enemies slow everything down. But nowadays I notice dips into the 70s and wonder what broke...
 
So, you're saying bad optimization is something we should accept? And I know what the fucking word means.

Optimization is a rather broad term, though. Usually it means cutting corners. Sometimes people can see the rough edges, others can't.

Take the new DOOM. Great performance, but IQ-wise there's much to be desired in places. Too much dynamic resolution shenanigans. Love me some 120+ FPS single player, though. It makes a huge difference with all the action going on. They've made the right choice.
 
Believe? Check that: they feel entitled.
It's been like that since GTA IV and the first Crysis came out. They all bashed those games for running slow on their computers, or because they couldn't even max out the settings since the games wouldn't let them. It didn't matter that the games looked better than anything else available at the time, even on medium or low settings.

There must be a distinction, though. There are games that run slow because they're coded badly and inefficiently, and they look like crap even maxed out. And there are games that can't be maxed out because they're so forward-looking. I personally always welcome the latter; it means the game is future-proof and has the potential to look even better down the line.


But because of clueless gamers, developers don't dare enable settings that would cripple actual high-end hardware.
 
For the past 25 years I've come across only one game that was really unreasonable in its hardware demands, and unfortunately I can't remember the name.
It must have been 1997. The game ran in DOS, and my friend who bought it was unable to make it even start in Win95.
My computer was running DOS 6.22 with Win 3.11, so right off the bat I was able to get it started... only to be greeted by a "too little RAM in the lower region" message (*).
By creating a special boot file that loaded only the bare essentials and placed as much of it as possible in the upper region, I was able to start the game and see the whole intro. Then came the next message: "Too little memory in the upper region."
I think I had 8MB of RAM in total, which generally wasn't too bad at the time, but this game required at least 12MB!
The best analogy I can come up with today would be releasing a game written for the 64-bit version of Win XP that won't start in any later OS while also demanding at least 14GB of RAM installed.

Doom 3, and later Metro 2033, were so graphics-intensive when they were released that even a bleeding-edge computer at the time couldn't reach playable frame rates with the IQ maxed out at a "normal" resolution.

(*) For those of you who aren't familiar with DOS: it was designed to run with up to 640kB of RAM. As RAM became cheaper and having more than 1MB started to seem like a good idea, a driver called HIMEM.SYS was introduced that let most data and drivers be moved out of the "lower" 640kB into the memory above it. Some code still had to run within the "lower" 640kB, though.
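
For the curious, that "special boot file" was really just a stripped-down CONFIG.SYS/AUTOEXEC.BAT pair, something roughly like this (from memory, so the exact lines and driver paths varied from machine to machine):

REM CONFIG.SYS - load only the bare essentials, pushed out of the lower 640kB
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
FILES=30
BUFFERS=20

REM AUTOEXEC.BAT - LH (LOADHIGH) keeps drivers out of conventional memory
LH C:\DOS\MOUSE.COM
PROMPT $P$G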
 
I can't play Minecraft or the Google Earth "flight" "simulator" on my work laptops :(
I feel entitled because I have top-of-the-line Intel integrated 3D graphics!

My home computer is now sporting a 7100GS (:D) which can technically render 4-5 triangles in 20 minutes.
BUT. I don't complain, because I run both games windowed. I'm not a hardcore gamer, so windowed mode is a really cool side feature. Both games are perfectly playable on my shit equipment when resized.
 
I don't know about that. We have several users in this very forum who believe that just because they paid $600+ for their shiny new GTX 1080, they should be able to run all the games at 60+ FPS in 4K.

Believe? Check that: they feel entitled.

Well, it should! I mean, my PS4 Pro clearly has no issues playing games at 4K 60fps; apparently we're still waiting on PC gamers to play catch-up!

/sarcasm for those who couldn't tell :p
 
It's been like that since GTA IV and the first Crysis came out. They all bashed those games for running slow on their computers, or because they couldn't even max out the settings since the games wouldn't let them. It didn't matter that the games looked better than anything else available at the time, even on medium or low settings.

There must be a distinction, though. There are games that run slow because they're coded badly and inefficiently, and they look like crap even maxed out. And there are games that can't be maxed out because they're so forward-looking. I personally always welcome the latter; it means the game is future-proof and has the potential to look even better down the line.


But because of clueless gamers, developers don't dare enable settings that would cripple actual high-end hardware.

I completely agree on both fronts. Going from MS-DOS to now, I've had many games that still require substantial upgrades to enjoy at their best, and the best ones get better with new tech. After my most recent upgrades I've had a lot of fun playing Crysis 2 & 3 in 4K and in 2K 3D at 120Hz. They just keep getting better. The Metro games with AA fully cranked still weigh 'em down, though. Doom 2016 will rock on forever! Witcher 2 & 3 are games I love, and they've helped me set my sights for hardware upgrades for the last 6 years now. Sure, CD Projekt Red usually needs about a year of patches, but once that's done the games tend to be very flexible with settings. GTA V, I can't make up my mind; it's really demanding on the highest settings, but I'm not sure if it's really worth it.

On the other hand, there are some that only make it to the polished-turd phase and not much else. Batman AK... looks nice, but I'm still not impressed. NMS, still hoping, but also not impressed.

"1. See just how egregiously wrong “auto-detect settings” is", yeah that and Geforce experience have both been about equally useless for me. Only a couple of times have I seen something slightly useful on any of my systems. I also learned about 10 years to stop believing the minimum/recommended specs the manufacturers list, they don't usually give the benchmarks for those specs for a reason.

"What if you moved to a new apartment? In a new state? What if you changed your name, then reinstalled Windows under that name?", I'm still laughing.

Sure, I've encountered my share of useless forums and posts, but overall they tend to be the best place to find the things the devs don't tell you. In the past year I've really been loving the forums for SLI bits. As we're becoming more abandoned, it's a nice community for getting our own fixes going; the latest is that there are ways to get Planet Coaster working with SLI.



 
I'm suffering with you. Same game, different stage.

I'm at the step: realise the game runs like lumpy shit and just grit your teeth and bear it.
Strangely, XCOM2 is one of only two games (the other is Elite) that have made my 1080 artifact at stock settings. I had to ship the damn card off for repairs and they still have it :/ Out of the last 4 expensive video cards I've purchased, 3 (8800, 280, 1080) have had to be repaired or replaced under warranty. Fucking ridiculous.

For the record, the 1080 runs XCOM2 beautifully when it isn't artifacting :D
 
I'm not having any issues running any games atm.... :p

Granted, I'm sure there are some that would still run like crap just from poor coding; I just haven't seen 'em personally.
Signature checks out.

...

*High Five!*
 
I also learned about 10 years ago to stop believing the minimum/recommended specs the manufacturers list; they don't usually give the benchmarks for those specs for a reason.
At this point, my processor is the minimum requirement for a lot of games, but it plays them beautifully and at high/highest quality settings with a good video card.
 
For the past 25 years I've come across only one game that was really unreasonable in its hardware demands, and unfortunately I can't remember the name.
It must have been 1997. The game ran in DOS, and my friend who bought it was unable to make it even start in Win95.
My computer was running DOS 6.22 with Win 3.11, so right off the bat I was able to get it started... only to be greeted by a "too little RAM in the lower region" message (*).
By creating a special boot file that loaded only the bare essentials and placed as much of it as possible in the upper region, I was able to start the game and see the whole intro. Then came the next message: "Too little memory in the upper region."
I think I had 8MB of RAM in total, which generally wasn't too bad at the time, but this game required at least 12MB!
The best analogy I can come up with today would be releasing a game written for the 64-bit version of Win XP that won't start in any later OS while also demanding at least 14GB of RAM installed.

Doom 3, and later Metro 2033, were so graphics-intensive when they were released that even a bleeding-edge computer at the time couldn't reach playable frame rates with the IQ maxed out at a "normal" resolution.

(*) For those of you who aren't familiar with DOS: it was designed to run with up to 640kB of RAM. As RAM became cheaper and having more than 1MB started to seem like a good idea, a driver called HIMEM.SYS was introduced that let most data and drivers be moved out of the "lower" 640kB into the memory above it. Some code still had to run within the "lower" 640kB, though.

I think your memory is a little foggy. There were loads of games requiring a lot of free memory out of the 640K; some wanted a little less, some more. But generally any game requiring more than 580K free was a pain to run, and there were a lot of them.
And 8MB of RAM in 1997 was on the low end. I had 8MB in 1995; in 1997 I had 32MB, and in 1998, 64MB.
 
Optimization is a rather broad term, though. Usually it means cutting corners. Sometimes people can see the rough edges, others can't.

Take the new DOOM. Great performance, but IQ-wise there's much to be desired in places. Too much dynamic resolution shenanigans. Love me some 120+ FPS single player, though. It makes a huge difference with all the action going on. They've made the right choice.

To me, optimization is not cutting corners. It has a whole lot more to do with actually optimizing the code so it takes less time to complete each task.

I've taken C/C++ code that I initially wrote and sped it up dramatically.

I've also taken other people's code and sped it up dramatically.

Besides work and personal stuff, I was also working on a speedier version of the old CPU emulation core for DOSBox. This was right before they changed over to the assembler core.

I rewrote the core to use a direct array lookup for the CPU instructions instead of a huge case statement. That one change alone made it require about 25% fewer CPU cycles than the version that was already out.
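
To give a feel for the change, here's a toy sketch of the dispatch idea (made-up opcodes and handler names, nothing like the real DOSBox core):

#include <cstdint>
#include <cstdio>

struct Cpu { uint32_t a = 0; };
using OpHandler = void (*)(Cpu&);

static void op_nop(Cpu&) {}
static void op_inc_a(Cpu& c) { c.a += 1; }
static void op_dec_a(Cpu& c) { c.a -= 1; }

// One slot per opcode byte; a switch/case would have to branch through the cases instead.
static OpHandler dispatch[256];

int main()
{
    for (auto& h : dispatch) h = op_nop;   // unknown opcodes fall back to a no-op
    dispatch[0x40] = op_inc_a;
    dispatch[0x48] = op_dec_a;

    const uint8_t program[] = { 0x40, 0x40, 0x48, 0x90 };
    Cpu cpu;
    for (uint8_t opcode : program)
        dispatch[opcode](cpu);             // direct array lookup, no switch
    std::printf("a = %u\n", cpu.a);        // prints a = 1
}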

I tried to get ahold of the DOSBox team to submit my code, but they wouldn't even respond to me.

So yeah, optimization is NOT ABOUT CUTTING CORNERS.

It is about actually optimizing the code to be more efficient.

A huge amount of the code I have looked at over the years has been horribly written and not optimized at all.

The general theme I see is: "If it works, it is good enough. Who cares if the same could be achieved in 10% of the time".
 
You go into the options menu and make the usual tweaks. Turn on MSAA. Turn up shadow quality. Turn up reflections. Turn on any feature with a weird name and Nvidia branding. All of it does much to help. This game simply runs well on your PC.


fixed
 
"Wait for new drivers": this one is actually kinda true, right? I remember trying to play Star Wars: KOTOR on my Radeon 8500LE and a P4, which at the time was within the recommended specs if I'm not mistaken. Well, the drivers were not working at all, and it wasn't until an update came out that I was able to play it. The same happened with NOLF 2.
 
In light of all the zombie and apocalyptic premises in movies/TV, it'd be funny to see something based on this.

"In a world lost to the masses..." "One man stands alone in search of the perfect gaming rig!" Our hero travels from city to city, meeting fellow gamers and scrounging every bit of info on the elusive setup for the perfect game, all while being tracked by the mega-corp, whose updates block his every attempt.
 
To give you an idea how long I've been doing this: I learned from people modding TRS-80s when the Apple II had just come out. I started with an Atari 400 (GTIA) that I upgraded with a 48K RAM kit, which involved soldering 16K onto a 32K board. I also added one of the first aftermarket keyboards to it, and went from tape decks to single-sided, single-density floppies to double-sided, double-density, and so on.
 
To me, optimization is not cutting corners. It has a whole lot more to do with actually optimizing the code so it takes less time to complete each task.

I've taken C/C++ code that I initially wrote and sped it up dramatically.

I've also taken other people's code and sped it up dramatically.

Besides work and personal stuff, I was also working on a speedier version of the old CPU emulation core for DOSBox. This was right before they changed over to the assembler core.

I rewrote the core to use a direct array lookup for the CPU instructions instead of a huge case statement. That one change alone made it require about 25% fewer CPU cycles than the version that was already out.

I tried to get ahold of the DOSBox team to submit my code, but they wouldn't even respond to me.

So yeah, optimization is NOT ABOUT CUTTING CORNERS.

It is about actually optimizing the code to be more efficient.

A huge amount of the code I have looked at over the years has been horribly written and not optimized at all.

The general theme I see is: "If it works, it is good enough. Who cares if the same could be achieved in 10% of the time".

I was hoping for a response like that. Then again, I did say "Usually it means cutting corners". Say, even with your skills, at some point the low-hanging fruit will be gone. What would you do when the math and algorithms are alright, and everything you could think of has already been squeezed, and yet things are not being rendered fast enough? What then?
 
I was hoping for a response like that. Then again, I did say "Usually it means cutting corners". Say, even with your skills, at some point the low-hanging fruit will be gone. What would you do when the math and algorithms are alright, and everything you could think of has already been squeezed, and yet things are not being rendered fast enough? What then?

Get a second person, or even a few more people, to look at it. I've come back to code months later and found things that I had missed, or new ideas came to mind. I basically obsess over optimizing something I'm working on... even a long time after I have already "finished" working on it.

I've gone as far as rewriting a large section of a program from scratch to try a new implementation of an algorithm, just to make it 1-2% faster after figuring it out by hand on paper.

But, yeah, there comes a point where you can't really do much more as far as optimization goes.

At that point, in the case of a game that still can't run acceptably on the hardware you want it to run well on, you need to take something out or figure out how to split the workload into smaller chunks.

If you can keep the data set being worked on small enough that you don't have to go out to main system RAM, it can speed up the calculations quite a bit. But that's really tricky when you're not programming for a specific CPU, and I highly doubt a single game engine in existence even tries to do that.

They would need to detect the CPU and its cache sizes: an easy lookup table for existing CPUs, but if somebody tries to run the game on a CPU that wasn't coded into the detection function, it might get kinda tricky. In that case, an editable .INI file that the user could fill in with the correct values would be a good idea for future users.
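
Something along these lines is what I have in mind; just a rough sketch (the cache query is Linux/glibc-specific, and the fallback constant is exactly the kind of value that .INI file would override):

#include <unistd.h>
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <vector>

// Ask the OS for the L2 cache size; unknown CPUs get a conservative fallback.
static std::size_t l2_cache_bytes()
{
    long sz = sysconf(_SC_LEVEL2_CACHE_SIZE);   // glibc extension; may be 0 or -1
    return sz > 0 ? static_cast<std::size_t>(sz) : 256 * 1024;  // fallback / .INI value
}

int main()
{
    std::vector<float> data(1 << 22, 1.0f);                         // ~16 MB, far bigger than cache
    const std::size_t block = l2_cache_bytes() / sizeof(float) / 2; // leave room for other data

    // Run both passes over one cache-sized block before moving on,
    // instead of streaming the whole array from main RAM twice.
    for (std::size_t start = 0; start < data.size(); start += block) {
        const std::size_t end = std::min(start + block, data.size());
        for (std::size_t i = start; i < end; ++i) data[i] *= 1.5f;  // pass 1
        for (std::size_t i = start; i < end; ++i) data[i] += 0.5f;  // pass 2
    }
    std::printf("block = %zu floats\n", block);
    return 0;
}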

I haven't had the chance to do any real game programming yet, so I can't really comment on rendering. I'm just starting to learn a game engine that I'm planning to make a game with.
 