John Carmack Says He’s Better at Optimizing than GPU Driver Programmers

I believe him.

Regardless of whether anyone else does, it doesn't matter to me, because for the last 5-10 years I cringe and wait after every NV driver update. I seriously never know anymore what might get better and what will get worse or break outright. Sometimes it takes weeks or months, until I play something I haven't tried in a while, to find out, and then I have to backtrack to whatever driver worked last.

It took nearly six months past 378.92 before I got my correct color depths back at 4K. Back when Witcher 3 came out, and SLI support was still OK, there were occasions when a driver would take me from 90-120 fps at 1080p on my 970s down to 50-60 fps. I remember having to reinstall older drivers then too.
 
Not really. You needed a 66 MHz Pentium to run Doom at a smooth 35 FPS with the screen at max size.

Carmack did not code the engine that Doom (2016) runs on. Tiago Sousa is the genius behind that one, though admittedly id Tech 6 still uses features developed by Carmack for id Tech 5.
Ahhh. I stand corrected. I thought Carmack was still the guy behind the new Doom engine.
 
I thought DX12 and Vulkan were made so devs could do the optimizations? As for Carmack... I would say any programmer who is at the level in their craft that he is should be somewhat arrogant. Just not too much... nobody likes an asshole except other assholes. :)
Carmack can back up his shit though... many cannot.
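
That was the pitch, yeah: the explicit APIs shift the bookkeeping the driver used to guess at over to the engine. Here's a rough, hypothetical sketch (my own illustration, not from any post here; assumes a Vulkan SDK is installed) of how explicit it gets just to bring the API up:

```cpp
// Minimal Vulkan startup sketch: nothing happens behind the app's back,
// every step the old drivers inferred is now spelled out by the engine.
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "explicit-api-demo";  // name is made up
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "no Vulkan driver available\n");
        return 1;
    }
    // Device selection, queues, memory allocation, synchronization: all
    // explicit, all the app's job from here on.
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```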
 
Drivers these days accommodate games. But with limited time and money you can't accommodate everyone for every device. The most they could do is expose more of their API to developers.

You can take different code paths. You could even have different DLLs for different situations, as in separate DLLs specifically optimized for particular game engines.

It really shouldn't be all that difficult to do. Sure, it would lead to people making driver packs and driver switchers again, but it would end up a lot better optimization-wise.
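
For illustration, a rough sketch of how that runtime selection could look. The module names and the InitEnginePath export are invented; only the Win32 calls are real, and the exe-name lookup is the same trick per-app driver profiles already use:

```cpp
// Hypothetical loader that picks an engine-tuned driver module at runtime.
// "gpu_*.dll" and "InitEnginePath" are made up for illustration.
#include <windows.h>
#include <cstdio>
#include <cstring>

typedef int (*InitEnginePathFn)(void);

int main() {
    char exePath[MAX_PATH] = {};
    GetModuleFileNameA(nullptr, exePath, MAX_PATH);  // which game are we in?

    // Pick an engine-specific module the way drivers pick per-app profiles.
    const char* module = "gpu_generic.dll";
    if (std::strstr(exePath, "DOOM"))         module = "gpu_idtech.dll";
    else if (std::strstr(exePath, "Witcher")) module = "gpu_redengine.dll";

    HMODULE h = LoadLibraryA(module);
    if (!h) {
        std::fprintf(stderr, "couldn't load %s, staying generic\n", module);
        return 1;
    }
    if (auto init = (InitEnginePathFn)GetProcAddress(h, "InitEnginePath"))
        init();  // hand off to the engine-specific fast path
    FreeLibrary(h);
    return 0;
}
```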

Having one huge frankendriver is going to complicate things and make it harder to optimize for specific games without hurting the performance of others.

Exposing the API better to developers would help, but without the actual driver source code to check that the API documentation is actually correct, it isn't going to make a huge difference, especially when developers are generally on a very tight schedule and don't have time to screw with more than they already have to deal with.
 
Not really. You needed a 66 MHz Pentium to run Doom at a smooth 35 FPS with the screen at max size.

Carmack did not code the engine that Doom (2016) runs on. Tiago Sousa is the genius behind that one, though admittedly id Tech 6 still uses features developed by Carmack for id Tech 5.

Not to take away from his credit, but Sousa joined id Software less than two years before the game came out.
 
The problem in this industry is that most game engine programmers are pretty awful at their job. So awful that GPU driver developers frequently take it upon themselves to inject performance fixes at runtime. This is why new games frequently need a new driver before they run well. I lost the link, but there was a really well-written article a few months ago by a GPU driver developer about this issue.

Mixing GPU drivers that silently "fix" things behind the scenes in undocumented ways with programmers that actually know what they're doing is a recipe for disaster.

Edit: Found it: https://www.gamedev.net/forums/topi...2vulkanmantle/?do=findComment&comment=5215019

The first lesson is: Nearly every game ships broken. We're talking major AAA titles from vendors who are everyday names in the industry. In some cases, we're talking about blatant violations of API rules - one D3D9 game never even called BeginFrame/EndFrame. Some are mistakes or oversights - one shipped bad shaders that heavily impacted performance on NV drivers. These things were day to day occurrences that went into a bug tracker. Then somebody would go in, find out what the game screwed up, and patch the driver to deal with it. There are lots of optional patches already in the driver that are simply toggled on or off as per-game settings, and then hacks that are more specific to games - up to and including total replacement of the shipping shaders with custom versions by the driver team. Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go.

The second lesson: The driver is gigantic. Think 1-2 million lines of code dealing with the hardware abstraction layers, plus another million per API supported. The backing function for Clear in D3D 9 was close to a thousand lines of just logic dealing with how exactly to respond to the command. It'd then call out to the correct function to actually modify the buffer in question. The level of complexity internally is enormous and winding, and even inside the driver code it can be tricky to work out how exactly you get to the fast-path behaviors. Additionally the APIs don't do a great job of matching the hardware, which means that even in the best cases the driver is covering up for a LOT of things you don't know about. There are many, many shadow operations and shadow copies of things down there.

...

Why are games broken? Because the APIs are complex, and validation varies from decent (D3D 11) to poor (D3D 9) to catastrophic (OpenGL). There are lots of ways to hit slow paths without knowing anything has gone awry, and often the driver writers already know what mistakes you're going to make and are dynamically patching in workarounds for the common cases.

...

Part of the goal is simply to stop hiding what's actually going on in the software from game programmers. Debugging drivers has never been possible for us, which meant a lot of poking and prodding and experimenting to figure out exactly what it is that is making the render pipeline of a game slow. The IHVs certainly weren't willing to disclose these things publicly either, as they were considered critical to competitive advantage. (Sure they are guys. Sure they are.) So the game is guessing what the driver is doing, the driver is guessing what the game is doing, and the whole mess could be avoided if the drivers just wouldn't work so hard trying to protect us.
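
To make the "optional patches toggled on or off as per-game settings" bit concrete, here's a hypothetical sketch of what such a workaround table might look like. Every name in it is invented; real drivers keep these profiles in opaque binary blobs:

```cpp
// Invented illustration of per-game driver workarounds; needs C++14.
#include <cstdio>
#include <string>
#include <unordered_map>

// Hypothetical workaround flags a driver might toggle per title.
struct Workarounds {
    bool ignoreMissingBeginEnd = false; // game never brackets its frames
    bool replaceShippedShaders = false; // swap in hand-optimized shaders
    bool avoidKnownSlowPath    = false; // reroute a call pattern the game abuses
};

// Keyed by executable name, like the per-app profiles in shipping drivers.
static const std::unordered_map<std::string, Workarounds> kProfiles = {
    {"brokenGame.exe",  {true,  false, false}},
    {"slowShaders.exe", {false, true,  false}},
};

Workarounds lookupProfile(const std::string& exeName) {
    auto it = kProfiles.find(exeName);
    return it != kProfiles.end() ? it->second : Workarounds{};
}

int main() {
    Workarounds w = lookupProfile("slowShaders.exe");
    std::printf("shader replacement: %s\n",
                w.replaceShippedShaders ? "on" : "off");
    return 0;
}
```

Every entry in a table like that is the driver team papering over a bug somebody else shipped.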
 
Tell him to work with the GPU driver teams. Is he a good team player or is he just the prodigy that goes on a one man team thinking he's the greatest thing ever?

I like the guy's work, but he sure is an arrogant bastard who just rubs me the wrong way. I just don't like the guy.
 
Tell him to work with the GPU driver teams. Is he a good team player or is he just the prodigy that goes on a one man team thinking he's the greatest thing ever?

I like the guy's work, but he sure is an arrogant bastard who just rubs me the wrong way. I just don't like the guy.

GPU driver teams don't want to work with you; they're deathly afraid of letting their competitive secrets out. Their solution is to embed someone in your team to handle everything.
 
If someone can name anyone who has been programming games for longer than Carmack, I'd love to hear it. I find it pretty believable that a guy with 30 years of programming experience beats a kid who might have 10 to 15 years of experience.
The guys at Impeller Studios (https://impellerstudios.com/) wrote the original X-Wing games. I bet they could give him a run.
 
No one should question Carmack. He is The Man. Anyone who disagrees is under 30 and doesn't realize this dude invented the FPS.

Listen to some of his tech talks... maybe a little dry, but you cannot deny this guy's brain is on another level.
 
just another example of mediocre minds shunning great minds
 
Just had to downgrade to the August NVIDIA driver so Guild Wars 2 would stop shitting the G-Sync bed on every map change. I always clean install drivers. It absolutely is a driver issue. Carmack, work your magic!
 
Aside from the texture issue, which was solved shortly after launch, it did run impressively smooth. Too bad the game just wasn't good.

I certainly couldn't get into it, though I don't get into most id games.

But man is it smooth, and texture pop-in, where witnessed, is an artifact of prioritizing smoothness over image quality, and it worked. And id continues to improve on it, with Doom 2016 as an impressive result. DICE and whoever is making the CoD engines could only hope to produce something so effective.
 
I personally think he is rotting at Oculus with this VR fad.
 
Not really. You needed a 66 MHz Pentium to run Doom at a smooth 35 FPS with the screen at max size.

Carmack did not code the engine that Doom (2016) runs on. Tiago Sousa is the genius behind that one, though admittedly id Tech 6 still uses features developed by Carmack for id Tech 5.

Don't you mean Quake? Doom ran fine on my 486 DX2-66. Of course, I was running 32-bit VLB and SCSI.
 
Every game for the past 10 years:
make it work on Xbox and PlayStation = PC optimizations :(
 
I personally think he is rotting at Oculus with this VR fad.

I think the reason he went was that he felt like he was rotting doing "regular" games. VR is something completely new, so maybe he's enjoying the challenge and the change. I've only been in my career for 12 years and I'm already bored as hell and ready to do something different. I'm sure after 30 he was ready for some kind of shake-up.
 