John Carmack Says He’s Better at Optimizing than GPU Driver Programmers

Megalith

Carmack made an interesting comment at Oculus Connect today, saying he's better at optimizing code than dedicated graphics card driver teams: Carmack believes that GPU driver teams often make mistakes in driver development that ultimately break optimizations. This is something Carmack feels he can do better than them at the moment. "They often make mistakes," he said, referring to driver updates that can break things in games. "I tell them to let me do the optimizations, I can do it better."

Carmack wasn't throwing shade at AMD or Nvidia; his comment was made in passing as he discussed Oculus Go and various VR technologies. However, it caught our attention, especially given the challenges that driver teams at AMD and Nvidia have faced at certain points in time. AMD in particular struggled in the past, at least before it brought Raja Koduri on board to lead the dedicated Radeon Technologies Group formed in the lead-up to Vega.
 
With the exception of RAGE, he's probably right. I can't think of a game that he has been involved with that was poorly optimized. Then again, he's a programming wizard, a talent most development teams lack. :)
 
With the exception of RAGE, he's probably right. I can't think of a game that he has been involved with that was poorly optimized. Then again, he's a programming wizard, a talent most development teams lack. :)
Rage wasn't poorly optimized, AMD's drivers were the culprit there. Sort of proves his point I guess.
 
Yeah, he is probably right. How many years of experience does he have in the industry? 30? He knows basically the entire history of 3D graphics development and probably understands why the hardware and software are what they are today better than most people who work at AMD or Nvidia.

The average driver development engineer probably has way less experience in the industry than Carmack. Most driver programmers are no doubt really smart people, but he has actually written game engines from scratch and knows the ins and outs of each part of the technology.
 
Optimizations, blah!
Nothing deep learning can't fix: write random optimization code, run it, and measure FPS. Rinse and repeat...
Remember! It's not brute force.

Joking aside, I am betting some of this driver stuff is probably AI, right?
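
Tongue in cheek, but that loop really is the core of how autotuners work: try knob settings, benchmark, keep the winner. A minimal C sketch, where the Config knobs are hypothetical and benchmark_fps() is a stand-in for rendering a test scene and reporting average FPS (faked here):

Code:
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical tunables an engine/driver autotuner might search over. */
typedef struct {
    int tile_size;   /* e.g. shader tile size   */
    int unroll;      /* loop unroll factor      */
    int prefetch;    /* prefetching on/off      */
} Config;

/* Stand-in for a real measurement: a deterministic fake FPS per config.
   In reality you would render a fixed test scene and time it. */
static double benchmark_fps(Config c) {
    unsigned h = c.tile_size * 2654435761u ^ c.unroll * 40503u
               ^ c.prefetch * 97u;
    return 30.0 + (h % 1000) / 1000.0 * 114.0;  /* 30..144 "FPS" */
}

int main(void) {
    static const int tiles[] = {8, 16, 32, 64};
    static const int unrolls[] = {1, 2, 4, 8};
    Config best = {0};
    double best_fps = 0.0;

    srand(1234);
    for (int i = 0; i < 100; i++) {  /* rinse and repeat */
        Config c = {tiles[rand() % 4], unrolls[rand() % 4], rand() % 2};
        double fps = benchmark_fps(c);
        if (fps > best_fps) { best_fps = fps; best = c; }
    }
    printf("best: tile=%d unroll=%d prefetch=%d -> %.1f FPS\n",
           best.tile_size, best.unroll, best.prefetch, best_fps);
    return 0;
}

Swap the fake benchmark for a real render-and-time loop and you have the joke, minus the deep learning.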
 
I'm waiting for the day when someone discovers that video card drivers have been used to mine Bitcoins for years now but only when a 3D application kicks in and it sucks up some GPU time in the background.

Conspiracy theory, maybe, but I wouldn't put it past any GPU maker and since there's basically just two of 'em (with Intel laughably producing integrated stuff) well, it only remains to be proven I suppose.

I've seen stranger shit happen. :)
 
I've worked with some programmers that are like Yoda - they are just in a different realm. When I was a programmer, I was OK and could do certain things fine. I got to work with our "Yoda" and felt like I didn't even know how to add two ints together. This guy wasn't condescending or anything - just brilliant. Carmack is on the brilliant level as well. (I used to watch all of his speeches and read his blogs - fascinating, but I realized a lot of it was over my head and I kind of lost interest because I wasn't following his thoughts.)
 
Rage wasn't poorly optimized, AMD's drivers were the culprit there. Sort of proves his point I guess.

It had some major issues with texture pop-in and morphing/stutter, so I can't agree 100% there - and that was on Nvidia as well.
 
It had some major issues with texture pop-in and morphing/stutter, so I can't agree 100% there - and that was on Nvidia as well.

Aside from the texture issue, which was solved shortly after launch, it did run impressively smoothly. Too bad the game just wasn't good.
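
Part of why it stayed smooth is that id Tech 5's much-maligned "automatic quality" was a feedback loop trading detail for a locked 60 FPS. This isn't id's actual code, just a minimal sketch of that kind of loop, with a hypothetical frame timer and the measurements faked:

Code:
#include <stdio.h>

#define TARGET_MS 16.6          /* 60 FPS frame budget */

static double quality = 1.0;    /* 1.0 = full resolution/detail */

/* Nudge quality down hard when a frame blows the budget, and creep
   it back up slowly when there's headroom, to avoid visible flicker. */
static void adapt_quality(double frame_ms) {
    if (frame_ms > TARGET_MS * 1.05)
        quality -= 0.05;
    else if (frame_ms < TARGET_MS * 0.85)
        quality += 0.01;
    if (quality < 0.5) quality = 0.5;
    if (quality > 1.0) quality = 1.0;
}

int main(void) {
    /* Fake frame times (ms), as if measured around the renderer. */
    double samples[] = {14.0, 18.0, 21.0, 17.0, 15.0, 12.0, 13.0};
    for (int i = 0; i < 7; i++) {
        adapt_quality(samples[i]);
        printf("frame %d: %.1f ms -> quality %.2f\n",
               i, samples[i], quality);
    }
    return 0;
}

The asymmetry (drop fast, recover slowly) is the standard trick in dynamic-quality schemes, since oscillating detail is more noticeable than slightly lower detail.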
 
I believe him... and I doubt the driver packages would be 250 MB either. (I know there's a lot of other shit in there, but just sayin'.)

In Nvidia's case, it seems to include a lot of shit to make sure that stuff doesn't work.

For example, if a user is trying to do GPU pass-through, say via ESXi, they block GeForces because they want you to buy Quadros.

If a user is running an AMD card as primary and an Nvidia card just for PhysX, guess what? Yep, blocked.

Pretty sure there's plenty more crap like this, and it's all in their driver package.
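
For context, the pass-through blocking people reported boiled down to the driver noticing it was running inside a VM. This obviously isn't Nvidia's actual code, just a minimal x86 sketch of how any driver could make that check, using the architectural CPUID hypervisor-present bit (leaf 1, ECX bit 31) and the vendor signature at leaf 0x40000000:

Code:
#include <cpuid.h>   /* GCC/Clang CPUID helpers, x86 only */
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;
    int in_vm = (ecx >> 31) & 1;  /* "hypervisor present" flag */
    printf("hypervisor present: %s\n", in_vm ? "yes" : "no");

    if (in_vm) {
        /* Leaf 0x40000000 returns the hypervisor vendor signature in
           EBX/ECX/EDX, e.g. "VMwareVMware" on ESXi or "KVMKVMKVM". */
        char sig[13] = {0};
        __cpuid(0x40000000, eax, ebx, ecx, edx);
        memcpy(sig + 0, &ebx, 4);
        memcpy(sig + 4, &ecx, 4);
        memcpy(sig + 8, &edx, 4);
        printf("hypervisor signature: %s\n", sig);
    }
    return 0;
}

A driver that wants to gate a feature just has to branch on that bit (or the signature) and refuse to initialize, which is why the workarounds all revolve around hiding the hypervisor from the guest.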


I'm waiting for the day when someone discovers that video card drivers have been used to mine Bitcoins for years now but only when a 3D application kicks in and it sucks up some GPU time in the background.

Conspiracy theory, maybe, but I wouldn't put it past any GPU maker and since there's basically just two of 'em (with Intel laughably producing integrated stuff) well, it only remains to be proven I suppose.

I've seen stranger shit happen. :)

You could be on to something there, especially given the crap that Nvidia has already pulled, as stated above, including bundling telemetry in every driver release without saying shit to the user or letting you opt in or out.
 
Didn't he rewrite some OpenGL or Glide stuff back in the early Quake days because the performance was subpar or something? It's been a long time.
 
I'm waiting for the day when someone discovers that video card drivers have been used to mine Bitcoins for years now but only when a 3D application kicks in and it sucks up some GPU time in the background.

Conspiracy theory, maybe, but I wouldn't put it past any GPU maker and since there's basically just two of 'em (with Intel laughably producing integrated stuff) well, it only remains to be proven I suppose.

I've seen stranger shit happen. :)
I'd put it past Intel with their GPU drivers...lol
 
Didn't he rewrite some OpenGL or Glide stuff back in the early Quake days because the performance was subpar or something? It's been a long time.

Yeah, I think it was MiniGL for the Voodoo cards with Quake.

He also figured out smooth scrolling on PCs (like Mario Bros. on the NES) early on, which he used in Commander Keen. Then the Wolfenstein 3D engine, then the Doom engine, and so forth. A lot of it's covered in the Masters of Doom book, or the audiobook read by Wil Wheaton.

Nvidia and AMD would probably be scared if he did that. He was a fan of open source back in the day. Imagine open-source drivers: no more locked SLI/CFX, FreeSync/G-Sync, Nvidia GameWorks, etc. If only. :sadface:
 
Drivers these days accommodate games, but with limited time and money you can't accommodate every game on every device. The most they could do is expose more of their API to developers.
 
With the exception of RAGE, he's probably right. I can't think of a game that he has been involved with that was poorly optimized. Then again, he's a programming wizard, a talent most development teams lack. :)

Doom 3 was proof of that. The leaked E3 demo of Doom 3 ran like shit; the retail version ran significantly better on the exact same hardware.
 
LOL, I'd hope so, since he makes about three orders of magnitude more money than any of those driver guys.

He says it like AMD/Nvidia should hire him and he doesn't understand why they don't. Haha! I'm sure they'd love to, if he'd work for a driver programmer's salary, or even 3 driver programmers' salaries.
 
This from the guy who thought that automatic quality bullshit in the Rage engine was a good idea?



Who cares what this has-been thinks?

In the 90's he was relevant. Not today.
 
On the one hand, I think maybe...

On the other hand, I think probably bullsh*t.

It would be like Bezos saying he can work an Excel spreadsheet better than a project manager... bro, you haven't touched a spreadsheet since Office 2000. You're way too high up the totem pole for that nonsense.
 
Carmack loves to toot his own horn... he sure loves himself... maybe he's looking for another job.
 
Dude is full of shit.
Sounds like he needs to work with the GPU programmers more closely instead of doing his own thing. I don't have near his experience, but I have enough to know that type of developer. And yes, I get that he doesn't work for them, but he should be working with them to drive the optimizations he wants.
 
Drivers these days accommodate games, but with limited time and money you can't accommodate every game on every device. The most they could do is expose more of their API to developers.

I think this is the main problem. Optimization is just not the key focus. Better hardware will always be around the corner, and companies prefer to focus on what brings profit (new cards) or on fixing old bugs over speeding things up.
 
First off, they updated their article with corrections:
Correction: A previous version of this story included the quotes "They often make mistakes" and "I tell them to let me do the optimizations, I can do it better" which were misheard. We've updated this story with accurate quotations.

The accurate quotes are: According to Carmack, GPU driver teams often make mistakes in driver development that ultimately break optimizations. This is something Carmack feels he can do better than them at the moment. "They often make mistakes," he said, referring to driver updates that can break things. "'Let me do the specific low-level things, I know what I'm doing, I'll take care of it, you're going to make decisions that are not going to be optimal for me in various ways,'" Carmack said.



Slightly annoyed at PC Gamer: they divided their coverage of his talk into two separate articles on their site and, to me, put his comments out of context, having misheard them in the first place. The second article is this: http://www.pcgamer.com/john-carmack-the-power-of-the-pc-will-never-get-to-mobile/

Here's a portion of it:
John Carmack took the stage today at Oculus Connect to talk about how developers are not doing a good job of developing for VR. Carmack told the audience of developers that the content industry is generally lazy, waiting for powerful hardware, but the mainstream doesn't rely on powerful, expensive CPUs.

"I would rather have magic software over magic hardware," said Carmack, referring to how many game developers always look toward the next generation GPU from AMD or Nvidia, but in fact should work on tuning their software and games for lower-end hardware.

To push virtual reality into the mainstream, and then have the high end put more resources into the platform, Carmack indicated that the industry needs to focus more on mainstream hardware, specifically mobile. This strategy, said Carmack, allows the entire industry to move forward, since mobile is where the rest of the tech industry is focused.
 
Dude is full of shit.
I disagree. He has very interesting posts on Usenet and elsewhere where he talks in great detail about optimization. The source code from the id Software titles that were open-sourced also shows an impressive mindset. This man knows his stuff, absolutely.
 
He's not a has-been.

You can't argue with the way Doom ran. Beautiful game, and it would even run well on a potato.
 
He's not a has-been.

You can't argue with the way Doom ran. Beautiful game, and it would even run well on a potato.
Not really. You needed a 66 MHz Pentium to run the original Doom at a smooth 35 FPS (the engine's hard cap, half the 70 Hz VGA refresh) with the screen at max size.

Carmack did not code the engine that Doom (2016) runs on. Tiago Sousa is the genius behind that one, though admittedly id Tech 6 still uses features developed by Carmack for id Tech 5.
 