Nvidia CEO Jensen Huang's big bet on A.I. is paying off

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,894
They played their cards right enough this time

“In the car market, Nvidia is making autonomous-driving technology for Mercedes-Benz and others. Its systems are also used to power robots in Amazon warehouses, and to run simulations to optimize the flow of millions of packages each day.

Huang describes it as the "omniverse."

"We have 700-plus customers who are trying it now, from [the] car industry to logistics warehouses to wind turbine plants," Huang said. "It represents probably the single greatest container of all of Nvidia's technology: computer graphics, artificial intelligence, robotics and physics simulation, all into one. And I have great hopes for it."”

Source: https://www.cnbc.com/2023/03/07/nvidia-grew-from-gaming-to-ai-giant-and-now-powering-chatgpt.html
 
The reality is that AI won't save Nvidia. They already sold their AI hardware and they were still down 21% in revenue.

 
The reality is that AI won't save Nvidia. They already sold their AI hardware and they were still down 21% in revenue.
That's a strange thing to say. Like Home Depot or Levi's during the gold rush, or Nvidia and AMD during the crypto boom, the current AI rush will certainly be a big win for Nvidia and AMD.

What do you mean by "save"?

Their revenue is still leaps and bounds above the pre-2020-2021 crypto boom:
[chart: NVIDIA quarterly revenue history]


And their latest quarter was already higher than the previous one. Do you think Nvidia will cease to exist this decade, or won't be a major player in the massive upcoming AI spending?
 
Whenever Nvidia succeeds, I worry that openness loses. This is a company that has never met a proprietary implementation it doesn't like, and it will throw money, time, and effort into marketing it into being THE STANDARD. AI is another area where this is a major concern. Aside from Stable Diffusion being FOSS, runnable locally, and having a useful training dataset available - a pleasing exception - AI is generally proprietary and only exposed to the public to do the free labor of training the model; then they start charging and locking things down while pointing at alternatives (ones that haven't been constantly bombarded by kids trying to get ChatGPT to write their essays, CharacterAI making waifus, or ElevenLabs making famous figures read various scripts deepfake-style) as insufficient. Nvidia doing well on the professional side and focusing on things like CUDA means fewer platform- and hardware-independent standards get used or well supported, to say nothing of the preference for NV cards during the mining era and the like. I'm not too pleased at the thought of a similar dynamic around AI.

I have no problem with Nvidia making good hardware in a world with open drivers, platform and application support, but I'm not naive enough to think that's where things go if we leave them to their own devices. Same with AI. There's a real chance of a world where a handful of proprietary models trained on massive amounts of data are the "real" AI solutions for those who can pay, with owners who jealously guard their black boxes (even worse when applied to things that affect people's lives based on broken inferences, bias, or other bad samples - like the AI trained to diagnose skin cancers that decided a ruler in the photo made a lesion more likely to be judged malignant! If all the training data, methodology and the like are considered proprietary company secrets, it gets much harder to detect how these things happened). Against those sit hobbyist and "step down, scraps tossed to the plebs in the form of a partial component or neutered variant from the big guys" models that can't really compete. So outside of those ideologically interested in openness, it will be yet another setup that proliferates needless inequality, with the bulk of the investment going into an increasingly small group of proprietary models, run on very specific hardware combinations, by those with the money, hardware, and access to a ton of resources (users and otherwise) to train them.
 
Same things with AI - the chance of a world where a handful of proprietary models trained on massive amounts of data are the "real" AI solutions for those who can pay and jealously guard their blackboxes
The alternative to this would be a strange one; a lot of the data used by the Facebooks and Googles of the world (let alone the cancer-scan affair) will be private.

Would we want people's search histories, private messages, emails and so on that were used for training to be released?

It will be a battle over who gets the best dataset to learn on, and the best feedback loop once the model is running, as it's quite probable the actual tech behind it becomes a commodity.
"step down, scraps tossed to the plebs in the form of a partial component or neutered variant from the big guys" models that can't really compete
We'll see. The current impression given by Microsoft's API pricing is that they're trying to offer giant value at a really good price so that the competition never happens:
https://martech.org/openai-unveils-chatgpt-api-at-very-low-prices/

If you give a good enough version at a good enough price, you can make the hobbyist or small-company home version pointless: commodity-level margins, but ridiculously giant volume.

Apparently, demos of products using the Azure API for their AI are quite impressive.
 
The alternative to this would be a strange one; a lot of the data used by the Facebooks and Googles of the world (let alone the cancer-scan affair) will be private.

Would we want people's search histories, private messages, emails and so on that were used for training to be released?

It will be a battle over who gets the best dataset to learn on, and the best feedback loop once the model is running, as it's quite probable the actual tech behind it becomes a commodity.
This actually touches on two problems. The totally separate issue of big tech exploiting a lack of regulation to pull users into giving away data and metadata for free (often without really knowing what is being used currently, what can be saved or used differently in the future, and what this will mean years from now when built upon, to say nothing of being vastly under-compensated), plus a whole litany of privacy and data-sovereignty issues, needs to be dealt with independently. We need new privacy laws built from the ground up for the digital and post-social-media era, and to rectify the damage done by NOT having them the past 20+ years. The parameters thereof are a whole separate discussion, but it needs to happen soon or things will only get worse.

As far as the AI-specific element, this can be handled by a couple of things. First, the training dataset itself can and should be released when possible. The LAION sets behind Stable Diffusion mean that anyone running a local version on their PC will be just about as effective as an officially hosted one. When there is properly collected private information that you don't want individually picked through, you can use machine-readable hashed/processed versions that teach the same thing without being directly and easily transformed back into examples of private information.

This can be combined with reporting conventions: even if a given AI model is lucky enough to be open/libre, being able to track which datasets went into it will be important to avoid a future where you're tracing outputs and worrying about additional obfuscated/proprietary inputs. It's hard enough when working with AI to assess how the model comes up with any particular thing, because it's non-deterministic (i.e. feeding it the same prompt will not produce the same image, answer, or story every time), but it's twice as hard to figure out something like the ruler issue if you can't inspect the dataset and realize "Hey, a mark on the skin with a ruler next to it in the data set is usually already suspected or confirmed to be cancerous, so we're teaching it that rulers are a common factor in diagnosis, and thereby to weight any skin lesion with a ruler next to it as more likely to be cancer." Pairing this with people who willingly submit their data for inclusion in a specific dataset, with a specified understanding of which parameters will be included and under what circumstances, would be helpful.
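As a rough illustration of the "hashed/processed versions" idea, here is a minimal Python sketch. The field names and salt are hypothetical, and a real pipeline would need proper key management and checks that the remaining fields can't re-identify anyone on their own:

```python
import hashlib

# Fields considered directly identifying; names are made up for illustration.
PRIVATE_FIELDS = {"patient_id", "email"}

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace identifying fields with salted hashes so records can still be
    linked and audited inside a dataset without exposing raw identities."""
    out = {}
    for key, value in record.items():
        if key in PRIVATE_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            out[key] = digest[:16]  # stable token; same input + salt -> same token
        else:
            out[key] = value  # non-identifying features stay usable for training
    return out

record = {"patient_id": "A-1042", "email": "jane@example.com", "lesion_mm": 6.5}
print(pseudonymize(record, salt="per-dataset-secret"))
```

The same record always maps to the same token under a given salt, so researchers can still de-duplicate samples and trace which dataset an output came from without reading the underlying identity.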

However, I have zero sympathy for Facebook or anyone else with a "what if we can't just gain free and perpetual troves of data from random users/sources, plus all the metadata collected or extrapolated from it, and spin it into exclusive billion-dollar industries for our benefit" so-called business model. That exploitation needs to be over, and the many negatives that continue to pile up toward the "cyberpunk dystopia" will only get worse until we do something about it.

We'll see. The current impression given by Microsoft's API pricing is that they're trying to offer giant value at a really good price so that the competition never happens:
https://martech.org/openai-unveils-chatgpt-api-at-very-low-prices/

If you give a good enough version at a good enough price, you can make the hobbyist or small-company home version pointless: commodity-level margins, but ridiculously giant volume.

Apparently, demos of products using the Azure API for their AI are quite impressive.
This is one of those wolf-in-sheep's-clothing issues. As you say, giving free or cheap access to their service benefits them by ensuring that continued use bolsters the continued refinement of the model. However, it is not good for AI or for users as a whole. Why? For the same reason the "how is zero rating bad for the consumer, bro, it's free services?" attempts to slime around net neutrality are harmful - or the way a near monopoly on donations/purchases through PayPal or Visa/Mastercard is all well and good until you're controversial and those companies refuse to let you accept donations/payments. A future where a viable computing service/feature only exists in a useful form as a proprietary, software-as-a-service monopoly is absolutely a concern for anyone who's not an AI-focused big tech company.

Just with ChatGPT itself, they recently started locking down what free users could do and limiting the output to make sure it doesn't say anything offensive or tell someone how to make a bomb or whatever the heck. Will it give accurate information on certain controversial topics to the best of its ability? Will it sanitize "misinformation", and who will decide on that? (Note that increasingly intricate "jailbreak" or "developer mode output" tricks are needed to get something closer to the model's full capabilities, and it's almost guaranteed these will be locked down in time.) All of this is understandable for a company worrying about PR from clickbait outrage articles saying its model didn't come down hard enough on INSERT_HISTORICAL_FIGURE and thereby backed INSERT_IDEOLOGY, but it isn't great for end users. This is a problem of a usable monopoly: if everyone could run a trained ChatGPT-class model on their own PC, it wouldn't be such a big problem for the "official" instance to have safeguards (provided it was public knowledge they were in place), because someone could always spin up another one without them. With an effective monopoly on proprietary tech and training data, however, that cannot happen.
 
That's a strange thing to say. Like Home Depot or Levi's during the gold rush, or Nvidia and AMD during the crypto boom, the current AI rush will certainly be a big win for Nvidia and AMD.

What do you mean by "save"?

Their revenue is still leaps and bounds above the pre-2020-2021 crypto boom:
[chart: NVIDIA quarterly revenue history]

And their latest quarter was already higher than the previous one. Do you think Nvidia will cease to exist this decade, or won't be a major player in the massive upcoming AI spending?
Fact is, Nvidia's position in the market is entirely based on the crypto boom that began in 2016. The only reason people are investing is that Nvidia and Tesla were the big winners during the pandemic, with nobody knowing why; everyone's collective consciousness just agreed to invest in these two companies.

https://www.investorsobserver.com/news/qm-pr/4711477151361492
Revenue down 21%
Expenses up 27%
Operating Income down 58%
Net Income down 53%
FCF down 53%
Trading at 120x earnings



As for the AI nonsense, Nvidia already sold the hardware for it. They said so on their website. They already sold the hardware, and the largest cloud providers already invested in 2022. Which is probably why Nvidia's revenue isn't as piss-poor as it could be for 2022, when GPU sales fell.

https://nvidianews.nvidia.com/news/...al-results-for-fourth-quarter-and-fiscal-2023
"Using their browser, they will be able to engage an NVIDIA DGX™ AI supercomputer through the NVIDIA DGX Cloud, which is already offered on Oracle Cloud Infrastructure, with Microsoft Azure, Google Cloud Platform and others expected soon. "
 
As for the AI nonsense, Nvidia already sold the hardware for it. They said so on their website. They already sold the hardware, and the largest cloud providers already invested in 2022. Which is probably why Nvidia's revenue isn't as piss-poor as it could be for 2022, when GPU sales fell.
Yes, AMD, Nvidia and others in the AI space were already selling hardware for it; that was true for crypto before the latest crypto boom as well. The idea is that with a boom they end up selling more - with mining they made more money during the boom than before, even though mining-capable hardware existed earlier. I am really not sure what point you are making.

The only reason people are investing is that Nvidia and Tesla were the big winners during the pandemic, with nobody knowing why; everyone's collective consciousness just agreed to invest in these two companies.

But they are making way more money than before the pandemic.
Nvidia annual revenues (in $ millions):
2022: $26,914
2021: $16,675
2020: $10,918
2019: $11,716
2018: $9,714
2017: $6,910
2016: $5,010

Quarterly results:
1/31/2023: $6,051
1/31/2022: $7,643
1/31/2021: $5,003
1/31/2020: $3,105
1/31/2019: $2,205
Almost double the pre-pandemic result. Yes, the mining bubble popping affected them (how could it not?), but their general growth was real and is still here, no? The fact that their growth since 2019-2020 is still giant seems to show that it was more than just mining.
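For reference, the "almost double" ratio follows directly from the quarterly figures above (revenue in $ millions, quarters ending Jan 31):

```python
# Quarterly revenue in $ millions, from the table above.
q_jan_2020 = 3105   # last pre-pandemic quarter
q_jan_2023 = 6051   # latest quarter
ratio = q_jan_2023 / q_jan_2020
print(f"{ratio:.2f}x")  # -> 1.95x
```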

It is because AMD and Nvidia are already so well placed in that world that they are perceived as possible big winners of the wave (and very likely will be), not because people think it is new for them.
 
Yes, AMD, Nvidia and others in the AI space were already selling hardware for it; that was true for crypto before the latest crypto boom as well. The idea is that with a boom they end up selling more - with mining they made more money during the boom than before, even though mining-capable hardware existed earlier. I am really not sure what point you are making.

The only reason people are investing is that Nvidia and Tesla were the big winners during the pandemic, with nobody knowing why; everyone's collective consciousness just agreed to invest in these two companies.

But they are making way more money than before the pandemic.
Nvidia annual revenues (in $ millions):
2022: $26,914
2021: $16,675
2020: $10,918
2019: $11,716
2018: $9,714
2017: $6,910
2016: $5,010

Quarterly results:
1/31/2023: $6,051
1/31/2022: $7,643
1/31/2021: $5,003
1/31/2020: $3,105
1/31/2019: $2,205
Almost double the pre-pandemic result. Yes, the mining bubble popping affected them (how could it not?), but their general growth was real and is still here, no? The fact that their growth since 2019-2020 is still giant seems to show that it was more than just mining.

It is because AMD and Nvidia are already so well placed in that world that they are perceived as possible big winners of the wave (and very likely will be), not because people think it is new for them.

ChatGPT (AI) bubble, plus part of a whole tech-sector recovery.
 
ChatGPT (AI) bubble, plus part of a whole tech-sector recovery.
I really doubt it will be a small bubble. Like the '90s/early-2000s web, the PC before that, or 2007-2012 mobile, many things will fail and it will be overinvested (which is all good for AMD or Nvidia), but the tech is already incredibly impressive today and already delivers giant value.

I pay $100 for GitHub Copilot and it seems like a steal; the 2027-2028 version could, I would imagine, be sold for five figures per seat per year and still find customers.
 
I really doubt it will be a small bubble. Like the '90s/early-2000s web, the PC before that, or 2007-2012 mobile, many things will fail and it will be overinvested (which is all good for AMD or Nvidia), but the tech is already incredibly impressive today and already delivers giant value.

I pay $100 for GitHub Copilot and it seems like a steal; the 2027-2028 version could, I would imagine, be sold for five figures per seat per year and still find customers.
I mean it quite literally is a steal - ignoring well-established and lawyer-hardened free/open source software licenses and in some cases even spitting back licensed/copyrighted code verbatim [0]. Quite the steal indeed.

[0]: https://mobile.twitter.com/mitsuhiko/status/1410886329924194309
 
I mean it quite literally is a steal - ignoring well-established and lawyer-hardened free/open source software licenses and in some cases even spitting back licensed/copyrighted code verbatim [0]. Quite the steal indeed.
If it is code hosted on GitHub, it would be quite hard for it to be a literal steal. For stuff that is public but not findable on GitHub (or hosted by Microsoft but not on free public GitHub) I am sure there is a big issue, but it will be hard to argue that the tech is not transformative, and by the time a case is made and won it will be too late.

Like other past cases, the tech is so strong that it will steamroll the legal issues, would be my guess; the law will have to adapt (like with the music industry, the DVD industry, and so on).
 
I mean it quite literally is a steal - ignoring well-established and lawyer-hardened free/open source software licenses and in some cases even spitting back licensed/copyrighted code verbatim [0]. Quite the steal indeed.

[0]: https://mobile.twitter.com/mitsuhiko/status/1410886329924194309
Hahahah you are hilarious.

Anyways, you're also absolutely right. Although I do think GPL/MIT software is wonderful to have around, it's important to understand that either we as a "Technology Community" decide these licenses don't matter now that code can be written by AI, or we sort models by license; that would make things a bit more difficult, but perhaps more friendly to human ideology, or greed.

But if we want to step into the future wisely, it might be best to dismantle the idea of software code licenses, and determine that all code that is published publicly is now the property of AI. Many companies already make it a priority to keep their code absolutely secret. If it's on the public internet, then the rules might have to change due to the simple fact that this sort of AI cannot be stopped.

If we were to give code license authority over to AI, with the dark assumption that within ten years, it will no longer be feasible for mere humans to compete with AI/Human programmers, then that would immediately allow us to get to work without playing games of naivety.

I've read my Asimov. There's no time to bother moaning and groaning. Let's just embrace AI.

This also would demand that Co-Pilot, ChatGPT and other coding tools and AI models be made MIT/GPL or otherwise within the public domain, in terms of code and model. It would be the story of John Henry all over again if we tried to ignore this. You think the first year of AI is how we should judge it? This is going to turn into a firestorm soon enough and I sure as heck am not going to stand in the way of it.

A haunting suggestion: Biological beings never should have tried to build AI in the first place.
 
Yes, AMD, Nvidia and others in the AI space were already selling hardware for it; that was true for crypto before the latest crypto boom as well.
The first crypto boom happened in 2014, but it was nowhere near as big as 2016's, and nowhere near as big as 2020's. That's how I acquired my Radeon HD 7850: shortly after 2014 it dropped to $100. It's also how I got my RX 470 and Vega 56 after the 2016 boom collapsed. The dust from the pandemic crypto crash hasn't fully settled yet.
The idea is that with a boom they end up selling more - with mining they made more money during the boom than before, even though mining-capable hardware existed earlier. I am really not sure what point you are making.
Firstly, the point I'm making is that it's all downhill from here. Second, you can't compare the crypto boom with the AI boom: Nvidia couldn't pump out enough hardware for miners, while AI currently has limited need for hardware.
But they are making way more money than before the pandemic.
Nvidia annual revenues:
The crypto crash didn't happen 100% at the start of 2022. Lots of people held onto the infinite-money dream until the bitter end, which, judging by the Bitcoin price throughout 2022, came around June, when their hopes and dreams finally collapsed. Even then, Ethereum didn't go proof-of-stake until September. Of course Nvidia did indeed sell hardware for AI, as they pointed out. You know who else sold GPUs and made more money than ever in 2022?
[chart: AMD revenue]


You know who didn't make that much money in 2022, and doesn't sell GPUs?
[chart: Intel net revenue since 1999]

Almost double the pre-pandemic result. Yes, the mining bubble popping affected them (how could it not?), but their general growth was real and is still here, no? The fact that their growth since 2019-2020 is still giant seems to show that it was more than just mining.
The GPU market hasn't fully crashed yet. Nvidia claims compute and networking was $15 billion while graphics was $10 billion. Looking at their revenue by source also tells you the automotive market is what's keeping them afloat. Might be from the Mercedes deal, but something similar happened with Nvidia and Tesla, and look how that turned out.
[chart: Nvidia revenue by market]

It is because AMD and Nvidia are already so well placed in that world that they are perceived as possible big winners of the wave (and very likely will be), not because people think it is new for them.
People also thought GameStop was a good stock, until it wasn't. Don't be surprised if their Q1 2023 results are disappointing.
 
Firstly, the point I'm making is that it's all downhill from here. Second, you can't compare the crypto boom with the AI boom: Nvidia couldn't pump out enough hardware for miners, while AI currently has limited need for hardware.
Ok, but that cannot be based on the fact that Nvidia was already selling hardware used for AI in 2015, 2016, 2017, 2018... 2022, etc. That does not mean it will not sell more in 2023-2024 than in the past.

And yes, the AI boom can be compared with the crypto boom (if someone says this AI boom will be just 20% of the crypto boom, that's a very direct comparison being made); in both cases AMD and Nvidia are well positioned in a domain that will receive tons of capital, and everyone will feel the need to sprinkle at least a little bit of it around:
https://www.economist.com/business/...-nuts-for-chatgpt-ish-artificial-intelligence
https://www.forbes.com/sites/robtoe...nguage-ai-startups-is-coming/?sh=4bc102512b14

There is a lot of money sleeping and waiting to find a place; there is a need for the next mobile, cloud, or crypto to keep a lot of things going, and it will be AI for the next few years.
 
There is a lot of money sleeping and waiting to find a place; there is a need for the next mobile, cloud, or crypto to keep a lot of things going, and it will be AI for the next few years.
My concern is: what business sees value yet in AI or ChatGPT? I can see why the car thing is happening, because of self-driving. Mercedes did reach Level 3 autonomous driving before anyone else, but I'm not sure how involved Nvidia is with that. ChatGPT, which is being heralded as the AI king, makes a lot of mistakes. Too many mistakes to depend on it for anything but a giggle and a laugh. Also, right now, for some reason IBM is the top dog in selling AI hardware. Intel, Google, and AMD are developing AI hardware as well. It's not like Nvidia is alone in this AI boom, and I'm still not sure how AI will apply to real-world applications. It's not that I can't think of any; I just can't think of any that wouldn't result in ChatGPT transferring millions of dollars to some woman's bank account in Albuquerque. Or being used for tech support that doesn't respond with racist comments.
 
My concern is: what business sees value yet in AI or ChatGPT?

ML is pervasive in any kind of high tech industry. It's everywhere and here to stay, and Nvidia is dominating it from a hardware standpoint. For example, I just sat through a presentation at a defense trade show by 3M about how almost all of their newer advanced materials are partially engineered by AI. AI is used in chip development, wireless signal discrimination, filtering applications, etc. On the lower-tech side of things, a surprising amount of the content you read online is generated by custom AI bots. For example, CNBC got busted for using a customized natural language engine to write most of their homepage content. There is a local bar owner I saw on Reddit who uses DALL-E to create the advertisements for his bar, replacing the need to hire a graphics designer. I could go on and on and on, but machine learning is not just going to change how businesses operate - it already has. Whoever doesn't see that is just going to get left behind.
 
ML is pervasive in any kind of high tech industry. It's everywhere and here to stay, and Nvidia is dominating it from a hardware standpoint. For example, I just sat through a presentation at a defense trade show by 3M about how almost all of their newer advanced materials are partially engineered by AI. AI is used in chip development, wireless signal discrimination, filtering applications, etc.
Basically, it's a tool you can use. It's pretty clear everyone wants to rent this out via the cloud, but I can't see this exploding.
On the lower-tech side of things, a surprising amount of the content you read online is generated by custom AI bots.
Yea thanks for that one.
https://futurism.com/cnet-ai-errors
For example, CNBC got busted for using a customized natural language engine to write most of their homepage content. There is a local bar owner I saw on Reddit who uses DALL-E to create the advertisements for his bar, replacing the need to hire a graphics designer. I could go on and on and on, but machine learning is not just going to change how businesses operate - it already has. Whoever doesn't see that is just going to get left behind.
And my point is that these "AI" systems do create errors, because they're just giant pattern-recognition software. Until it's 100% error-proof, it's a nice toy.
 
My concern is: what business sees value yet in AI or ChatGPT? I can see why the car thing is happening, because of self-driving. Mercedes did reach Level 3 autonomous driving before anyone else, but I'm not sure how involved Nvidia is with that. ChatGPT, which is being heralded as the AI king, makes a lot of mistakes. Too many mistakes to depend on it for anything but a giggle and a laugh.
I would assume you do not mean AI at large, but LLMs like the GPT-3 type of system?

Some are obvious and already used in the field: software dev, customer service.

Customer service is already heavily automated and relies a lot on people googling, YouTube videos, etc., and this will change things quite a bit: it makes a first line of defence, so a company with 2 people in customer service can do the job of 10. You can usually charge around 30% of what a company saves with your tech for it to be a nice success (cheaper and it's a no-brainer for the company to buy; more expensive and a competitor will undercut you; but still high enough to make good money).
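The "charge ~30% of what you save the customer" rule above can be sketched with made-up numbers (headcount and cost figures are purely hypothetical):

```python
# Hypothetical example of pricing an automation tool at ~30% of customer savings.
agents_before = 10          # support staff before automation
agents_after = 2            # staff still needed with the AI first line of defence
cost_per_agent = 50_000     # fully loaded annual cost per agent (made up)

savings = (agents_before - agents_after) * cost_per_agent
vendor_price = 0.30 * savings   # cheap enough to be a no-brainer, high enough to profit

print(f"customer saves ${savings:,} per year; vendor charges ${vendor_price:,.0f}")
```

At these numbers the customer nets $280,000 a year even after paying the vendor, which is what makes the purchase a no-brainer.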

There are a lot of domains where a mistake does not matter. If I am a designer and ask an AI to give me 10 possible logos for a campaign/company/etc., I do not have to trust or rely on it; I am still looking at the logos before picking one. If it is software, the compiler, the tests, and looking at it tell us whether the code works. Same if you write fiction, or do Excel-type work.

A lot of the issues will be greatly diminished in less than 10 years (i.e. tomorrow), I feel.

All the domains that already used lesser versions - translation, customer service, search - will in part move to the more advanced (but more hardware-hungry) versions.

Everything that has a nice expert database - industrial, chemical, healthcare - will be huge, as will everything heavily text-based like law and science. Unlike, say, crypto, the value seems obvious, direct, and already there.

Until it's 100% error-proof, it's a nice toy.
It just needs to be way less error-prone than humans - it already is in some cases - or so much faster that, added to human help, the combo beats humans alone.
 
Ok, but that cannot be based on the fact that Nvidia was already selling hardware used for AI in 2015, 2016, 2017, 2018... 2022, etc. That does not mean it will not sell more in 2023-2024 than in the past.

And yes, the AI boom can be compared with the crypto boom; in both cases AMD and Nvidia are well positioned in a domain that will receive tons of capital, and everyone will feel the need to sprinkle at least a little bit of it around:
https://www.economist.com/business/...-nuts-for-chatgpt-ish-artificial-intelligence
https://www.forbes.com/sites/robtoe...nguage-ai-startups-is-coming/?sh=4bc102512b14

There is a lot of money sleeping and waiting to find a place; there is a need for the next mobile, cloud, or crypto to keep a lot of things going, and it will be AI for the next few years.
And now those datacenters that bought and taught their AIs on the older RTX 6000s and 8000s need to upgrade to the RTX A6000s or the new RTX 6000As (fuck, I really do hate their naming conventions for their server cards) to stay competitive. When the AI stuff was just being researched, developers and companies could hold onto their hardware longer; now that the race is on, they have to make sure they stay ahead or try to take the lead, and they need the new, faster hardware to do that.
Before, all this AI stuff was just research fun times, with the hope of creating a viable product. Now they have something viable, with a clear means of generating cash flow and aggressive business plans; you'd better believe corporations are spending here.
 
And now those datacenters that bought and taught their AIs on the older RTX 6000s and 8000s need to upgrade to the RTX A6000s or the new RTX 6000As (fuck, I really do hate their naming conventions for their server cards) to stay competitive. When the AI stuff was just being researched, developers and companies could hold onto their hardware longer; now that the race is on, they have to make sure they stay ahead or try to take the lead, and they need the new, faster hardware to do that.
Before, all this AI stuff was just research fun times, with the hope of creating a viable product. Now they have something viable, with a clear means of generating cash flow and aggressive business plans; you'd better believe corporations are spending here.

Shipments of AI Servers Will Climb at CAGR of 10.8% from 2022 to 2026

PRESS RELEASE by AleksandarK
According to TrendForce's latest survey of the server market, many cloud service providers (CSPs) have begun large-scale investments in the kinds of equipment that support artificial intelligence (AI) technologies. This development is in response to the emergence of new applications such as self-driving cars, artificial intelligence of things (AIoT), and edge computing since 2018. TrendForce estimates that in 2022, AI servers that are equipped with general-purpose GPUs (GPGPUs) accounted for almost 1% of annual global server shipments. Moving into 2023, shipments of AI servers are projected to grow by 8% YoY thanks to ChatBot and similar applications generating demand across AI-related fields. Furthermore, shipments of AI servers are forecasted to increase at a CAGR of 10.8% from 2022 to 2026.
TrendForce has also found that the four major North American CSPs (i.e., Google, AWS, Meta, and Microsoft) together held the largest share of the annual total AI server demand in 2022, accounting for 66.2% of the annual global procurement quantity. Turning to China, localization of manufacturing and self-sufficiency in critical technologies have been gaining momentum in recent years, so the build-out of the infrastructure for AI technologies has also accelerated in the country. Among Chinese CSPs, ByteDance was the leader in the procurement of AI servers in 2022. Its share in the annual global procurement quantity came to 6.2%. Following ByteDance were Tencent, Alibaba, and Baidu that comprised around 2.3%, 1.5%, and 1.5% respectively.

AI-Based Optimization of Search Engines Is Driving Demand for HBM
Seeing a bright future in the development of AI technologies, Microsoft has invested a considerable sum in the well-known research laboratory OpenAI. Furthermore, Microsoft launched an improved version of its search engine Bing this February. The new Bing has incorporated a large-scale language model named Prometheus and the technology that underlays ChatGPT. Prometheus, in particular, is a collaboration between Microsoft and OpenAI. Not to be left out, Baidu launched ERNIE Bot this February as well. Initially operating as a standalone software, ERNIE Bot will be integrated into Baidu's own search engine at a later time.

Regarding the models and specifications of the computing chips used in the aforementioned projects, ChatGPT has mainly adopted NVIDIA's A100 and exclusively utilizes the cloud-based resources and services of Microsoft Azure. If the demand from ChatGPT and Microsoft's other applications are combined together, then Microsoft's demand for AI servers is projected to total around 25,000 units for 2023. Turning to Baidu's ERNIE Bot, it originally adopted NVIDIA's A100. However, due to the export control restrictions implemented by the US Commerce Department, ERNIE Bot has now switched to the A800. If the demand from ERNIE Bot and Baidu's other applications are combined together, then Baidu's demand for AI servers is projected to total around 2,000 units for 2023. TrendForce's survey has revealed that in the market for server GPUs used in AI-related computing, the mainstream products include the H100, A100, and A800 from NVIDIA and the MI250 and MI250X series from AMD. It should be noted that the A800 is designed specifically for the Chinese market under the context of the latest export restrictions. In terms of the market share for server GPUs, NVIDIA now controls about 80%, whereas AMD controls about 20%.
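As a quick sanity check on the shipment forecast quoted above, compounding the stated 10.8% CAGR over the four years from 2022 to 2026 gives the total growth implied by the projection:

```python
# Compound the stated CAGR over 2022 -> 2026 (four compounding years).
cagr = 0.108
years = 4
total_growth = (1 + cagr) ** years - 1
print(f"{total_growth * 100:.1f}% total shipment growth")  # -> 50.7% total shipment growth
```

So the forecast amounts to AI-server shipments growing by roughly half over the window, on top of the 8% YoY figure projected for 2023 alone.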
 
You should see what my large Healthcare customers are doing. It is pervasive, top of mind, and always in the top 3 CTO/CIO projects going on. It is massive, and only going to continue to grow exponentially.
 
You should see what my large Healthcare customers are doing. It is pervasive, top of mind, and always in the top 3 CTO/CIO projects going on. It is massive, and only going to continue to grow exponentially.
Hmm, spend some $50 million on hardware to potentially be first to what could be billions in pharmaceutical sales...
That sounds like a really hard call when it's less than a one-year bonus for their CEOs.
 