NVIDIA Might Incorporate More AI-Optimizations In Its Graphics Drivers For RTX GPUs

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,891
Neat little operation here; hopefully the performance improvements don’t come at the cost of any visual or temporal fidelity.

“Nvidia is working on AI optimized drivers. Release maybe this year (Q1).
Up to 30% more performance.
Average improvement ~10%.
No info about specific gen.
Take this with a grain of salt. If true, Nvidia drivers will be real "fine wine".”

https://wccftech.com/nvidia-might-i...zations-in-its-graphics-drivers-for-rtx-gpus/
 
They used AI to help develop the layout of the Hopper chip too IIRC?
 
I can certainly see the potential for improved DLSS and RT if drivers become optimized for game-specific data (I am not sure, maybe they already are). For example, a colorful, bright game vs. a dark, sparsely lit game could certainly benefit from neural nets trained on very different data sets. If Nvidia were to get developers involved, I can see how even level-by-level optimization for these features could yield considerable improvements over the base performance.
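Purely to illustrate what I mean, here is a toy Python sketch of that kind of per-scene switching. The profile names and the selection rule are completely made up and have nothing to do with how DLSS actually works internally:

```python
# Toy sketch only: pick a hypothetical upscaler profile from simple scene
# statistics, the way a driver might swap in a network trained on brighter
# vs. darker content. Not a real NVIDIA API.
import numpy as np

# Invented profile names; a real driver would map these to model weights.
PROFILES = {"bright": "upscaler_bright.bin", "dark": "upscaler_dark.bin"}

def select_upscaler_profile(frame_rgb: np.ndarray, threshold: float = 0.35) -> str:
    """Return a profile key based on the mean luminance of a rendered frame."""
    # Rec. 709 luma weights; frame_rgb is HxWx3 with values in [0, 1].
    luma = frame_rgb @ np.array([0.2126, 0.7152, 0.0722])
    return "bright" if luma.mean() >= threshold else "dark"

# A dim, sparsely lit frame picks the "dark" profile.
dark_frame = np.full((1080, 1920, 3), 0.08)
print(PROFILES[select_upscaler_profile(dark_frame)])  # -> upscaler_dark.bin
```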

Of course there is limited benefit for developers if the majority of the gaming audience is still buying 1060's and 1650's. Hopefully Nvidia discontinues these older cards and releases some entry-level RTX options and gets everybody on the DLSS + RT bandwagon. At this point, those features are clearly the future.
 
This will not be the last place AI "inserts" itself more and more.
Healthcare, banking, financial services and insurance (BFSI), retail and e-commerce, and gaming and entertainment are all areas that will see more and more AI injected.
 
Assuming that this isn't just unsubstantiated click-bait, I'd be interested to see which cards this ends up applying to. Nvidia has a bad habit of making things only apply to the very latest cards, even if some older cards are technically still capable of it also. I was using an RTX 2080 until recently and it was more than obvious that the card was already off the radar of Nvidia's driver team.
 
Assuming that this isn't just unsubstantiated click-bait, I'd be interested to see which cards this ends up applying to. Nvidia has a bad habit of making things only apply to the very latest cards, even if some older cards are technically still capable of it also. I was using an RTX 2080 until recently and it was more than obvious that the card was already off the radar of Nvidia's driver team.
Turing is second-gen Tensor cores (FP32, FP16, INT8, INT4)

Ada is fourth-gen Tensor Cores (FP64, FP32, FP16, BF16, INT8, FP8, INT4) and dedicated Optical Flow Accelerator hardware (Turing and Ampere run the Optical Flow Accelerator algorithm on the compute engine)

Hardly the same hardware, hence a differentiation in features/performance, and also why Ada is ~2 times faster than Ampere in this regard.
 
If a midtier card pumps out the performance of a 4090, is it really midtier though? 🤔
That's not how mid-tier is determined. It usually has to do with the pricing and configuration of the GPU. Also, doesn't this sound like Nvidia plans to turn on DLSS permanently? My guess is that Nvidia thinks they have a version of DLSS with no image quality loss.
 
If a midtier card pumps out the performance of a 4090, is it really midtier though? 🤔

If it comes out a year later.... ya. :)
Nvidia lately has tried to keep the performance-to-price ratio frozen 2 or 3 years in the past at all times. Oh, this card performs at 1.6x the previous flagship, let's add 60% to the price tag. Oh, this performs like a 3090 Ti in at least a few tests, let's price it the same. I suspect the 4060 will be priced like a 3080 because same performance, right?

To be fair, their competition isn't helping by comparing everything they make to a 3060 either. lol
 
Turing is second-gen Tensor cores (FP32, FP16, INT8, INT4)

Ada is fourth-gen Tensor Cores (FP64, FP32, FP16, BF16, INT8, FP8, INT4) and dedicated Optical Flow Accelerator hardware (Turing and Ampere run the Optical Flow Accelerator algorithm on the compute engine)

Hardly the same hardware, hence a differentiation in features/performance, and also why Ada is ~2 times faster than Ampere in this regard.
The real question is, why do we keep calling this generation "Ada?" Every prior generation named after a famous scientist used the last name of the person. We didn't call the 20-series "Alan" or the 30-series "André-Marie."
 
Turing is second-gen Tensor cores (FP32, FP16, INT8, INT4)

Ada is fourth-gen Tensor Cores (FP64, FP32, FP16, BF16, INT8, FP8, INT4) and dedicated Optical Flow Accelerator hardware (Turing and Ampere run the Optical Flow Accelerator algorithm on the compute engine)

Hardly the same hardware, hence a differentiation in features/performance, and also why Ada is ~2 times faster than Ampere in this regard.

They are not referring to generational changes. Of course generational change = performance. They are talking about driver optimization. Nvidia tends to stop optimizing games for cards that are over a generation old. We are not talking about RT performance or DLSS 3 vs. 2. We are talking about drivers optimized for specific games. Nvidia's history is to sell something like a 2080... and when the 3080 comes out they don't bother optimizing the 2080 for new games day-and-date anymore. All those arch changes are nice... but at the end of the day 99% of all FPS comes from standard old raster.

For a real-world obvious example... the 1080 Ti (which, to be fair, is a pretty old card at this point) no longer gets a ton of software optimization. As a result, the old AMD competition cards which struggled to match the 1080 at launch outperform it in most scenarios currently. I think you were responding to someone who was wondering if Nvidia's new wonder optimizations would apply to cards like the 2080 Super, etc., that a lot of owners are probably hoping they can stretch out for another few years. (It really helps if driver releases for new games still help out your card, and don't just increase the gap between it and newer parts.)
 
Getting back on topic, I am confused. "Nvidia is working on AI optimized drivers." Is that drivers being optimized for AI, or drivers optimized by AI? The former doesn't interest me, the latter does.
 
Getting back on topic, I am confused. "Nvidia is working on AI optimized drivers." Is that drivers being optimized for AI, or drivers optimized by AI? The former doesn't interest me, the latter does.

The latter. Nvidia has mentioned their drivers have more lines of code than Windows XP. I imagine they'll use the AI to optimize/streamline the codebase and reduce the overhead of having all that code, in all the ways (big and small) that making things smaller/more efficient is beneficial (performance, maintenance, compile times, storage, distribution).
 
Getting back on topic, I am confused. "Nvidia is working on AI optimized drivers." Is that drivers being optimized for AI, or drivers optimized by AI? The former doesn't interest me, the latter does.
Drivers optimized by AI, by the AI they optimized the drivers for.
 
If that is the case, then any performance improvements found in the driver stack would apply to any supported GPU. Not sure how far back support goes, but this sounds like a good thing for everyone. If it works out well, maybe Intel can find an AI to write their drivers for them.
 
The latter. Nvidia has mentioned their drivers have more lines of code than Windows XP. I imagine they'll use the AI to optimize/streamline the codebase and reduce the overhead of having all that code, in all the ways (big and small) that making things smaller/more efficient is beneficial (performance, maintenance, compile times, storage, distribution).
I imagine the driver would break if they tried to clean the code base up, which is why we're quickly approaching a 1GB driver despite NVIDIA dropping support for Kepler and older products.
If that is the case, then any performance improvements found in the driver stack would apply to any supported GPU. Not sure how far back support goes, but this sounds like a good thing for everyone. If it works out well, maybe Intel can find an AI to write their drivers for them.
Maxwell is as far back as support goes. Kepler gets security updates only until the middle of next year.
 
Getting back on topic, I am confused. "Nvidia is working on AI optimized drivers." Is that drivers being optimized for AI, or drivers optimized by AI? The former doesn't interest me, the latter does.
Yeah, aren't DLSS / drivers already AI optimizations?
 
Drivers optimized by AI, by the AI they optimized the drivers for.

Makes you wonder what the next "learn to code" slam is going to be.... when AI starts displacing coders. lol

In all seriousness, we are probably 4-5 years away from self-updating drivers. New game... Nvidia BOT adds optimizations to the monthly iteration. We will get better drivers, sure... but the driver teams at all the majors will get slashed down to a couple of people.
Heck, in less than a decade we might be looking at the first generation of GPUs fully designed by AI... hardware to software stack. No jobs are safe from automation. lol
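Something like this is what I picture the bot maintaining (a hypothetical Python sketch; none of these flags are real Nvidia driver settings, they just illustrate the shape of such a system):

```python
# Hypothetical sketch of the "driver bot" idea: a per-game profile table that
# a monthly automated pass could append to. Flag names are invented.
from dataclasses import dataclass, field

@dataclass
class GameProfile:
    shader_cache_mb: int = 512
    rebar_enabled: bool = True
    extra_flags: dict = field(default_factory=dict)

DEFAULT = GameProfile()
PROFILES: dict[str, GameProfile] = {
    "newgame.exe": GameProfile(shader_cache_mb=1024,
                               extra_flags={"prefer_async_compute": True}),
}

def profile_for(exe_name: str) -> GameProfile:
    """Fall back to defaults for titles the bot has not profiled yet."""
    return PROFILES.get(exe_name.lower(), DEFAULT)

# A monthly automated pass would simply insert new entries:
PROFILES["another_game.exe"] = GameProfile(rebar_enabled=False)
print(profile_for("NewGame.exe").shader_cache_mb)  # -> 1024
```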
 
Yeah, aren't DLSS / drivers already AI optimizations?

What we are talking about here is next level. This is... here is 100,000 lines of driver code, Nvidia bot, optimize this... now we have 92,000 lines of code. The AI is going to see redundant bits, routines that can be reused, etc.

The downside may well be trying to keep logical track of all its potential optimizations. I would have to assume it's going to want to cut out some tried-and-true things coders employ to ensure the code remains understandable to other humans. I mean, if you don't care whether another employee down the road 100% understands what you're doing, you can take a few shortcuts and not waste a bunch of time noting them. lol

I can see real upside to the tech in general for massive projects that have gotten a bit unwieldy. Imagine having an AI smart enough to sift through a larger project like an OS and unfuck a bunch of stuff people might not even know is broken.
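To make the "redundant bits" part concrete, here's a toy Python stand-in for that kind of pass: hash normalized function bodies to spot copy-pasted routines. Obviously nothing like Nvidia's actual tooling, just the flavor of it:

```python
# Rough sketch: fingerprint each function body so trivially duplicated
# routines collide and can be flagged for reuse.
import ast
import hashlib
from collections import defaultdict

def function_fingerprints(source: str) -> dict[str, list[str]]:
    """Map a hash of each function body to the function names that share it."""
    tree = ast.parse(source)
    buckets: dict[str, list[str]] = defaultdict(list)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            # ast.dump ignores formatting/comments, so re-typed copies collide.
            body_dump = ast.dump(ast.Module(body=node.body, type_ignores=[]))
            digest = hashlib.sha1(body_dump.encode()).hexdigest()
            buckets[digest].append(node.name)
    return {h: names for h, names in buckets.items() if len(names) > 1}

code = """
def scale_a(x): return x * 2 + 1
def scale_b(x): return x * 2 + 1
def other(x):   return x - 1
"""
print(function_fingerprints(code))  # scale_a and scale_b share a fingerprint
```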
 
No jobs are safe from automation. lol
Some plumbing, nursing and other real-world, physical, always-changing jobs are safer; anything you can do from a computer is certainly on the line, and soon, as in the next 25 years (depending on how much power the profession has to artificially protect itself).

Everyone putting their code on GitHub and other open-source places for machines to learn from kind of knew this was coming. For a while it will simply be an impressive optimizer and autocomplete for more and more complex affairs, but I would imagine that soon it will either reduce the number of coders needed or, a bit like what happened to bank clerks with the ATM or accountants with Excel, explode the profession with how much they can do.
 
Makes you wonder what the next "learn to code" slam is going to be.... when AI starts displacing coders. lol

In all seriousness, we are probably 4-5 years away from self-updating drivers. New game... Nvidia BOT adds optimizations to the monthly iteration. We will get better drivers, sure... but the driver teams at all the majors will get slashed down to a couple of people.
Heck, in less than a decade we might be looking at the first generation of GPUs fully designed by AI... hardware to software stack. No jobs are safe from automation. lol
The trick is to be the one who automates. The kids who want to learn to code to make games, I encourage of course, but I do tell them to seriously look at and study AI.
 
Some plumbing, nursing and other real-world, physical, always-changing jobs are safer; anything you can do from a computer is certainly on the line, and soon, as in the next 25 years (depending on how much power the profession has to artificially protect itself).

Everyone putting their code on GitHub and other open-source places for machines to learn from kind of knew this was coming. For a while it will simply be an impressive optimizer and autocomplete for more and more complex affairs, but I would imagine that soon it will either reduce the number of coders needed or, a bit like what happened to bank clerks with the ATM or accountants with Excel, explode the profession with how much they can do.

I agree we are probably heading to a much better software future, at least in the short term. Having minor bugs in simple routines propagate and cause issues in large packages should become nonexistent. Imagine a compiler that doesn't just spit out Error in X... but simply corrects it and adds a log note. I can even foresee a more human-language-friendly AI programming language emerging (which might not be great in the long term). I mean, why code in C when you can do 99.9% of the same stuff by just instructing the AI? The next decade is going to be interesting software-wise.
Oh, and you're right on plumbers.... they are safe until we give the AIs bodies. lol :)
 
What we are talking about here is next level. This is... here is 100,000 lines of driver code, Nvidia bot, optimize this... now we have 92,000 lines of code. The AI is going to see redundant bits, routines that can be reused, etc.

The downside may well be trying to keep logical track of all its potential optimizations. I would have to assume it's going to want to cut out some tried-and-true things coders employ to ensure the code remains understandable to other humans. I mean, if you don't care whether another employee down the road 100% understands what you're doing, you can take a few shortcuts and not waste a bunch of time noting them. lol

I can see real upside to the tech in general for massive projects that have gotten a bit unwieldy. Imagine having an AI smart enough to sift through a larger project like an OS and unfuck a bunch of stuff people might not even know is broken.
This right here!
10,000 eyes on a problem building a solution is great, until you try to make sense of the whole thing. No one person has a grasp of the full scope; there will be overlap and redundancy and all sorts of things caused by simple human communication errors. Even if the AI doesn’t change a single line of code but simply acts as a massive crawler poring through every line in every function, finding those redundant lines to nowhere and flagging them in a large report for the team to analyze, that’s a huge deal. An AI simply producing a large report that can visually break down the code into something high-level a small team can follow could take a project that would be inconceivable due to the man-hours involved and the rapidly changing nature of the problem, and bang out the work in a weekend.
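As a toy example of the kind of report I mean, here is a quick Python sketch that walks a source tree and flags functions nothing else seems to reference. Real dead-code analysis needs far more care (dynamic dispatch, exports, reflection, etc.), and the "src" path is just a placeholder:

```python
# Minimal "crawler + report" sketch: list functions that are defined but
# never referenced anywhere else in the tree.
import ast
from pathlib import Path

def dead_function_report(root: str) -> list[str]:
    defined, referenced = {}, set()
    for path in Path(root).rglob("*.py"):
        tree = ast.parse(path.read_text())
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                defined.setdefault(node.name, str(path))
            elif isinstance(node, ast.Name):
                referenced.add(node.id)
            elif isinstance(node, ast.Attribute):
                referenced.add(node.attr)
    return [f"{name} ({where}) is defined but never referenced"
            for name, where in sorted(defined.items()) if name not in referenced]

for line in dead_function_report("src"):  # "src" is a placeholder path
    print(line)
```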
 
Imagine a compiler that doesn't just spit out Error in X... but simply corrects it and adds a log note.
Compilers that point at the error with a "did you mean X instead?" message and the correct solution (and IDEs that let you correct them, like in Word) do start to get more and more impressive.
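Even a dumb fuzzy match over known identifiers gets you most of the way to that "did you mean X?" experience. A quick Python sketch (real compilers like rustc, clang and modern Python do a smarter version of this):

```python
# Suggest the closest known identifier when a lookup fails.
import difflib

KNOWN_NAMES = ["render_frame", "present_frame", "resize_swapchain"]

def suggest(unknown_name: str) -> str:
    matches = difflib.get_close_matches(unknown_name, KNOWN_NAMES, n=1, cutoff=0.6)
    if matches:
        return f"error: '{unknown_name}' not found. Did you mean '{matches[0]}'?"
    return f"error: '{unknown_name}' not found."

print(suggest("rendr_frame"))  # -> suggests 'render_frame'
```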
 
The trick is to be the one who automates. The kids who want to learn to code to make games, I encourage of course, but I do tell them to seriously look at and study AI.

AI is going to have a massive impact on game development, IMHO.
We already have AI asset generation. We have basic non-programming scripting languages added to engines like Unreal. I have no doubt companies like Epic are already toying with AI-powered scripting. I bet we are 1-2 Unreal Engine versions away from natural-language scripting. It's not hard to imagine a ChatGPT-type script engine for something like the Unreal Engine. Between AI assets and AI-assisted scripting... game development is probably going to get a serious reduction in labour requirements.

In the next few years, to develop a very high-quality game all you're really going to need is a few artist-designers with an eye for picking out good AI work... and a small team who is able to tell the scripting system what they want it to do. Create a 120-item inventory bag system, give all my items a weight identifier, and so on. For that matter, even things like level design could be tied into a natural-language scripting system... if the AI can whip you up a level design based on a prompt that is 90% of what you want, all you need to do is some editing.
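For reference, here's roughly what that example prompt would have to spit out, written by hand as a small Python sketch (the class and field names are just invented for illustration):

```python
# Hand-written stand-in for "a 120-item inventory bag, every item gets a weight".
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    weight: float  # the "weight identifier" from the prompt

class InventoryBag:
    def __init__(self, capacity: int = 120, max_weight: float = 300.0):
        self.capacity = capacity
        self.max_weight = max_weight
        self.items: list[Item] = []

    def total_weight(self) -> float:
        return sum(item.weight for item in self.items)

    def add(self, item: Item) -> bool:
        """Reject the item if the bag is full or would become too heavy."""
        if len(self.items) >= self.capacity:
            return False
        if self.total_weight() + item.weight > self.max_weight:
            return False
        self.items.append(item)
        return True

bag = InventoryBag()
print(bag.add(Item("health potion", 0.5)))  # -> True
```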
 
This right here!
10,000 eyes on a problem building a solution is great, until you try to make sense of the whole thing. No one person has a grasp of the full scope; there will be overlap and redundancy and all sorts of things caused by simple human communication errors. Even if the AI doesn’t change a single line of code but simply acts as a massive crawler poring through every line in every function, finding those redundant lines to nowhere and flagging them in a large report for the team to analyze, that’s a huge deal. An AI simply producing a large report that can visually break down the code into something high-level a small team can follow could take a project that would be inconceivable due to the man-hours involved and the rapidly changing nature of the problem, and bang out the work in a weekend.

That is a good point... no one says we have to let the AI just change the code. AI generated optimization reports could be huge.
 
That is a good point... no one says we have to let the AI just change the code. AI generated optimization reports could be huge.
Having an AI generate code is currently problematic. If you let it do things automatically, there would be no oversight and no real accountability when things go wrong; if you flag it for manual review, you would need somebody knowledgeable with that specific area of change, and capable of knowing how it could/would impact other areas of the code. You would spend more time in code review checking over the AI's work than it would likely take to have people do the actual work.

Now, if you are using the AI to rewrite functions that are then added to the various development libraries, that is another thing entirely. I am sure there are a number of canned DX11 and DX12 functions that were written by a human years ago that could be done far better with an AI. I mean, the developer really only cares about the function call and the I/O parameters; they know it works and that somebody built it long before they started working there. But when was the last time somebody seriously sat down with those thousands of functions and said, hey, I bet we can speed this up by 15% if we modernize it and recompile it for the new version of this .dll? But that would still need a team behind it checking the work and passing on how well it did back to the AI team so they can make tweaks to the algorithm.
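The checking step could be fairly mechanical, though. Something like this Python sketch (the functions are placeholders, not real library code): replay a corpus of inputs through the old function and the AI-rewritten candidate and only accept the rewrite if the outputs match:

```python
# Equivalence-check harness for an "AI-rewritten" drop-in replacement.
import math

def old_inv_sqrt(x: float) -> float:
    return 1.0 / math.sqrt(x)

def candidate_inv_sqrt(x: float) -> float:   # pretend this was AI-rewritten
    return x ** -0.5

def outputs_match(old, new, corpus, rel_tol=1e-9) -> bool:
    """Accept the candidate only if it agrees with the original on every input."""
    return all(math.isclose(old(x), new(x), rel_tol=rel_tol) for x in corpus)

corpus = [0.25, 1.0, 2.0, 9.0, 1e6]
print(outputs_match(old_inv_sqrt, candidate_inv_sqrt, corpus))  # -> True
```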
 
What we are talking about here is next level. This is... here is 100,000 lines of driver code, Nvidia bot, optimize this... now we have 92,000 lines of code. The AI is going to see redundant bits, routines that can be reused, etc.

The downside may well be trying to keep logical track of all its potential optimizations. I would have to assume it's going to want to cut out some tried-and-true things coders employ to ensure the code remains understandable to other humans. I mean, if you don't care whether another employee down the road 100% understands what you're doing, you can take a few shortcuts and not waste a bunch of time noting them. lol

I can see real upside to the tech in general for massive projects that have gotten a bit unwieldy. Imagine having an AI smart enough to sift through a larger project like an OS and unfuck a bunch of stuff people might not even know is broken.
Won't this also help improve AI, ML, DL and GraphAI acceleration too? Isn't this a circular situation of improvements feeding back into each other?
 
Won't this also help improve AI, ML, DL and GraphAI acceleration too? Isn't this a circular situation of improvements feeding back into each other?
That’s sort of the point. One improves the other in an endless circle jerk of minor improvements.
 
That’s sort of the point. One improves the other in an endless circle jerk of minor improvements.
Couldn't we potentially enter a level of stagnation once everything is based upon prior model trainings, without that novel development from human-level engineers? Try to grasp what I mean, or I can work on re-articulation...
It just seems that at some point we approach diminishing returns or some inflection point.

"AI" still lacks that human-level reasoning, and even then there's so much variation in actual human intelligence that even the lowest levels may not be that useful. genius level human intelligence is something else to consider
 
Couldn't we potentially enter a level of stagnation once everything is based upon prior model trainings, without that novel development from human-level engineers? Try to grasp what I mean, or I can work on re-articulation...
It just seems that at some point we approach diminishing returns or some inflection point.

"AI" still lacks that human-level reasoning, and even then there's so much variation in actual human intelligence that even the lowest levels may not be that useful. genius level human intelligence is something else to consider
There’s literally human-level reasoning that’s below animals. I’ve seen humans who were even questionable at passing a Turing test.

New from TechTechPotato 🥔

 
Couldn't we potentially enter a level of stagnation once everything is based upon prior model trainings, without that novel development from human-level engineers? Try to grasp what I mean, or I can work on re-articulation...
It just seems that at some point we approach diminishing returns or some inflection point.

"AI" still lacks that human-level reasoning, and even then there's so much variation in actual human intelligence that even the lowest levels may not be that useful. genius level human intelligence is something else to consider
Obviously, it reaches a point where you can't do more with what you have and you need to change something. At this stage I think they are just using it to take some of the most human elements out of their code (aka the mistakes). If their driver packages are indeed hundreds of thousands of lines of code, there is far too much going on there for any one person, or even a reasonably sized team, to properly coordinate a full top-down review. Simply using "AI" as a tool for identifying and flagging sections that it deems problematic or redundant, and then bringing those to the attention of the necessary parties to correct, would be a massive breakthrough.
If Nvidia were able to package and sell that AI, or lease time on it, I can imagine that they would have a backlog of clients wanting time on there for all sorts of requests. The possibilities for Microsoft alone, with the multitude of error logs they collect, could potentially speed up patching and performance updates immensely, not to mention its ability to intentionally try to break the code by finding what inputs cause things to fail in weird ways.
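That last part is basically fuzzing. Here's a toy Python version of the idea (real fuzzers like AFL, libFuzzer or Hypothesis are far more sophisticated, and the parser here is just an invented example):

```python
# Toy fuzzer: throw random inputs at a function and record the ones that raise.
import random

def parse_resolution(text: str) -> tuple[int, int]:
    w, h = text.split("x")          # breaks on missing 'x', empty parts, etc.
    return int(w), int(h)

def fuzz(fn, trials: int = 1000) -> list[str]:
    alphabet = "0123456789x "
    failures = []
    for _ in range(trials):
        candidate = "".join(random.choices(alphabet, k=random.randint(0, 8)))
        try:
            fn(candidate)
        except Exception as exc:
            failures.append(f"{candidate!r} -> {type(exc).__name__}")
    return failures

print(fuzz(parse_resolution)[:5])   # sample of inputs that crash the parser
```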
 
Obviously, it reaches a point where you can't do more with what you have and you need to change something. At this stage I think they are just using it to take some of the most human elements out of their code (aka the mistakes). If their driver packages are indeed hundreds of thousands of lines of code, there is far too much going on there for any one person, or even a reasonably sized team, to properly coordinate a full top-down review. Simply using "AI" as a tool for identifying and flagging sections that it deems problematic or redundant, and then bringing those to the attention of the necessary parties to correct, would be a massive breakthrough.
If Nvidia were able to package and sell that AI, or lease time on it, I can imagine that they would have a backlog of clients wanting time on there for all sorts of requests. The possibilities for Microsoft alone, with the multitude of error logs they collect, could potentially speed up patching and performance updates immensely, not to mention its ability to intentionally try to break the code by finding what inputs cause things to fail in weird ways.

Wonder if a connection to graph-based AIs can further mitigate this lack of reasoning, towards developing those deep correlations and hopefully that eventual moment of eureka and/or fluency.
 