Are games starting to out-pace GPU development?

No AAA games where performance matters are even written in C# or the .NET Framework. It's not like the performance in some 2D Unity turd even matters. Sure, it might require a computer 18 times more powerful than it should, but it'll still hit 60 fps on almost anything. All the real games are still C or C++ under the hood.

It's actually these crappy visual programming languages that are the cause of 99% of Unreal Engine games running like horse shit.

https://blueprintsfromhell.tumblr.com/

Look at those dog turds. Some of those atrocities are actually from shipping AAA games. And people wonder why every Unreal engine game stutters.
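
To be fair, the usual fix is boring: keep the hot per-frame logic in native code and leave Blueprint for glue. A hypothetical Unreal C++ actor (class and property names made up for illustration) doing its per-tick work in an overridden Tick() instead of an Event Tick graph:

```cpp
// MyChaseActor.h -- hypothetical example, not from any shipped game.
// Hot per-frame work lives in native Tick() rather than a Blueprint graph.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyChaseActor.generated.h"

UCLASS()
class AMyChaseActor : public AActor
{
    GENERATED_BODY()

public:
    AMyChaseActor()
    {
        // Opt in to per-frame ticking.
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Move toward a target at a fixed speed -- the kind of per-frame
        // logic that often ends up in a sprawling Event Tick graph instead.
        const FVector ToTarget = TargetLocation - GetActorLocation();
        if (!ToTarget.IsNearlyZero())
        {
            SetActorLocation(GetActorLocation() +
                             ToTarget.GetSafeNormal() * Speed * DeltaSeconds);
        }
    }

    UPROPERTY(EditAnywhere)
    FVector TargetLocation = FVector::ZeroVector;

    UPROPERTY(EditAnywhere)
    float Speed = 300.f;
};
```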

What the FUCK? I haven't kept up on programming for like 15 years, but is that the shit "programmers" are pumping out? For AAA games? And these people have degrees in programming? Is C and C++ really that hard? (I sure didn't think so!)

Of course, that's easy for me to say, never having been in the industry or had to deal with deadlines and whatnot, but damn, man. I honestly wish I liked coding. I feel like I could have been retired by now, as I picked it up very easily with literally no training before college. I just hated (and still hate) doing it, lol.

I'm weirdly cursed by being good at very profitable things I can't stand doing for a living.
 
99% of it is in Assembly... as I understand it, he also had custom libraries that he made back when he ported Amiga/ST games. I don't think boiling that level of expertise down to 'he just used the DirectX SDK' is reasonable. I guess if you mean that he didn't literally write the entire thing in assembly, graphics and all, then... well yeah, that's getting into a much harder problem at that point...
The way some people are reading it is a bit of a stretch; it's not just 'he used the DirectX SDK'. But I really doubt that 99% of the machine code running on the CPU/GPU when you played that game never went through a compiler at some point.

I mean, I'm agreeing with you either way: expecting that type of person to be behind every video game is just unreasonable. If game development were still like that, some people would have the willpower and the ideas, but not the means. I'd rather have badly optimized games with new ideas than Assassin's Creed 395 because nobody had time to come up with a new series...
Not just that type of person. Expecting that giant teams on giant timelines could make an OS, drivers, game engines, or an actual giant game like a Cyberpunk sequel in hand-written assembly without using a compiler (or that it would end up better optimized than what a modern C++ compiler can do on modern hardware) just sounds like a bizarre statement.
 
Performance requirements are also going up. Remember when we had CRTs? You weren't tied to a specific native resolution, plus the pixel response time was instant. So even 30 fps at a lower resolution looked just fine.

Now you need 1440p+, and people want 120 fps or more just to get some semblance of smooth motion.
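
Quick back-of-the-envelope on the raw pixel throughput, just to put a number on it (the 1024x768 @ 30 fps baseline is an assumed CRT-era example, not anything from the thread):

```cpp
// Compare raw pixels-per-second between an assumed CRT-era target and a
// modern 1440p/120 target. Ignores per-pixel cost growth (shaders, RT, etc.),
// which only makes the real gap bigger.
#include <cstdint>
#include <cstdio>

int main()
{
    const std::uint64_t old_pps = 1024ull * 768 * 30;    // ~23.6 Mpix/s
    const std::uint64_t new_pps = 2560ull * 1440 * 120;  // ~442.4 Mpix/s

    std::printf("old: %llu pix/s, new: %llu pix/s, ratio: %.1fx\n",
                (unsigned long long)old_pps, (unsigned long long)new_pps,
                (double)new_pps / (double)old_pps);
    return 0;
}
```

That's almost a 19x jump in pixels pushed per second before a single new rendering feature is even turned on.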
 
Just sell your 2080 and put it toward a newer card. I would always opt for that rather than getting a second card to go SLI, if just for the simplicity and the more predictable performance upgrade across all games.

SLI never made much sense to me outside of flagship cards for the absolute best performance available on the market. Otherwise, just go for a faster single-GPU solution: more linear scaling (you rarely saw a 90%+ performance increase from the 2nd GPU), better compatibility (some games just straight up didn't support SLI), and of course less heat and lower power draw.
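
Rough sketch of why the math rarely favored the second card (the 90% figure is the best case mentioned above, and it assumes the game supports SLI at all):

```cpp
// Toy model of SLI scaling: the second GPU adds at most `scaling` of the
// first card's throughput, so you pay ~2x the cost, heat, and power for
// well under 2x the frames. Numbers are illustrative, not benchmarks.
#include <cstdio>

int main()
{
    const double single_gpu_fps = 60.0;  // hypothetical single-card result
    const double scaling        = 0.9;   // ~90% uplift from the 2nd GPU, best case

    const double sli_fps   = single_gpu_fps * (1.0 + scaling);  // 114 fps
    const double ideal_fps = single_gpu_fps * 2.0;              // 120 fps

    std::printf("SLI: %.0f fps vs ideal %.0f fps (%.0f%% of perfect scaling)\n",
                sli_fps, ideal_fps, 100.0 * sli_fps / ideal_fps);
    return 0;
}
```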
SLI and CrossFire always made the most sense on flagship models when you needed the absolute best performance. The drawbacks of multi-GPU would start to outweigh the benefits on lower-tier cards.

The only time SLI or CrossFire made sense on mid-tier cards was in the laptop space, where frames were at a premium. For example, 670Ms in SLI might get you a playable framerate in a game of that era (e.g. Deus Ex: Mankind Divided) where a single 680M would be insufficient and dual 680Ms were cost-prohibitive.

As for the OP's original premise: (a) yes, games are getting more demanding, and (b) this is due to new consoles and new features, e.g. ray tracing.
 
Performance requirements are also going up. Remember when we had CRTs? You weren't tied to a specific native resolution, plus the pixel response time was instant. So even 30 fps at a lower resolution looked just fine.

Now you need 1440p+, and people want 120 fps or more just to get some semblance of smooth motion.

Yeah, and at that time a 30” TV was considered gigantic. Now I have a 55” TV and I feel like I can barely see the thing.
 
What the FUCK? I haven't kept up on programming for like 15 years, but is that the shit "programmers" are pumping out? For AAA games? And these people have degrees in programming? Is C and C++ really that hard? (I sure didn't think so!)

Of course, that's easy for me to say, never having been in the industry or had to deal with deadlines and whatnot, but damn, man. I honestly wish I liked coding. I feel like I could have been retired by now, as I picked it up very easily with literally no training before college. I just hated (and still hate) doing it, lol.

I'm weirdly cursed by being good at very profitable things I can't stand doing for a living.
I saw that this was where things were going when I was learning game development, myself. I learned the majority of things in C++, but the obsession with my classmates over the then-new Unity Engine was frankly frightening. I had a class where we delved into UE3 and the only thing I got out of it was despising the workflow. This is why I always have to be the "downer" whenever some news relating to Unreal Engine comes out.
 
I saw that this was where things were going when I was learning game development, myself. I learned the majority of things in C++, but the obsession with my classmates over the then-new Unity Engine was frankly frightening. I had a class where we delved into UE3 and the only thing I got out of it was despising the workflow. This is why I always have to be the "downer" whenever some news relating to Unreal Engine comes out.
I learned Fortran :)
 
I’m old enough to remember when anti-aliasing would absolutely tank performance.

Seeing FSAA in-person for the first time was absolutely mind-blowing!

Anisotropic Filtering just kinda weaseled its way into enough games that I learned to appreciate the IQ benefit.

Then, when I finally invested in a G-Sync monitor 3 years ago... that + high refresh was the last big "Ohhhhhhhh!!" moment.

(Discounting SSDs, obviously; that upgrade was HUGE. I still don't regret paying over $1 per GB for the privilege in 2013.)
 
Seeing FSAA in-person for the first time was absolutely mind-blowing!

Anisotropic Filtering just kinda weaseled its way into enough games that I learned to appreciate the IQ benefit.

Then, when I finally invested in a G-Sync monitor 3 years ago... that + high refresh was the last big "Ohhhhhhhh!!" moment.

(Discounting SSDs, obviously; that upgrade was HUGE. I still don't regret paying over $1 per GB for the privilege in 2013.)
I used to calculate “the real cost” of games I purchased based on their price and install size 😂
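
Fair enough, honestly; the math adds up quickly. A toy version of that calculation, assuming a $60 game, a 50 GB install, and the ~$1/GB SSD pricing mentioned above (all numbers are examples, not from the thread):

```cpp
// "Real cost" of a game = sticker price + the SSD space the install eats.
// All figures are assumed examples for illustration.
#include <cstdio>

int main()
{
    const double game_price_usd  = 60.0;
    const double install_size_gb = 50.0;
    const double ssd_usd_per_gb  = 1.0;   // roughly 2013-era SSD pricing

    const double storage_cost = install_size_gb * ssd_usd_per_gb;
    std::printf("real cost: $%.2f ($%.2f of that is SSD space)\n",
                game_price_usd + storage_cost, storage_cost);
    return 0;
}
```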
 
Wasn't DX12 supposed to usher in a new era of SLI, in a way where devs didn't have to write entire engines from the ground up just to take advantage of it?
 
Wasn't DX12 supposed to usher in a new era of SLI, in a way where devs didn't have to write entire engines from the ground up just to take advantage of it?
From my understanding it was the exact opposite: DX12/Vulkan shift control over multi-GPU out of the driver's SLI/CrossFire profiles and into the game devs' hands. It's not that Vulkan/DX12 hide the multiple GPUs from the devs and present them as a single, easier-to-use, more performant GPU.

And with nearly full-screen (if not temporal) effects everywhere now, I'd imagine the task has only gotten harder over time. Back in the day, in some games one GPU could simply rasterize the top half of the screen while the other handled the bottom half, and on some other setups the GPUs would alternate, each rendering every other frame.
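
For anyone curious what shifting that control to the devs looks like in practice, here's a minimal Vulkan 1.1 sketch that just enumerates linked-GPU device groups. The API reports the groups, but deciding how to split the frame (the old split-frame or alternate-frame tricks) is entirely the engine's job. Error handling is trimmed; this is an illustration, not engine code:

```cpp
// Enumerate linked-GPU device groups with Vulkan 1.1. A DX12/Vulkan-era
// engine has to do this (or the DX12 equivalent) and then explicitly
// distribute work across the GPUs itself.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main()
{
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in 1.1

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(count);
    for (auto& g : groups)
        g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

    for (uint32_t i = 0; i < count; ++i)
        std::printf("group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```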
 
Seeing FSAA in-person for the first time was absolutely mind-blowing!

Anisotropic Filtering just kinda weaseled its way into enough games that I learned to appreciate the IQ benefit.

Then, when I finally invested in a G-Sync monitor 3 years ago... that + high refresh was the last big "Ohhhhhhhh!!" moment.

(Discounting SSDs, obviously; that upgrade was HUGE. I still don't regret paying over $1 per GB for the privilege in 2013.)
Force off anisotropic filtering in any game and you'll see why it "weaseled" its way into games. I literally do not play a game anymore if it doesn't have anisotropic filtering and it can't be forced through the NVCP.
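
For reference, forcing AF in the control panel just overrides something the game could set itself. A minimal D3D11 sketch that creates a 16x anisotropic sampler (device setup included so it compiles; texture and shader plumbing omitted):

```cpp
// Create a 16x anisotropic sampler in D3D11. "Forcing 16x AF" in the driver
// control panel amounts to substituting a sampler state like this one.
// Build (MSVC): cl /EHsc af_sampler.cpp d3d11.lib
#include <d3d11.h>
#include <cstdio>

int main()
{
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &context)))
        return 1;

    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = 16;  // the "16x" in 16x AF
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    const HRESULT hr = device->CreateSamplerState(&desc, &sampler);
    std::printf("CreateSamplerState: %s\n", SUCCEEDED(hr) ? "ok" : "failed");

    if (sampler) sampler->Release();
    context->Release();
    device->Release();
    return 0;
}
```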
 
A typical game only has a handful of real programmers. The rest are making the art assets, designing the gameplay, and such.
 