AI Isn’t Making Much Money

I'll let someone else clue you in to the multiple hilarious things in your above post since you'll discount it if I do. Are you drinking tonight, by chance?
 
Lots of devices with NPUs, which creates an opportunity to run AI software on the client, for example: auto-correct.

But if the ML models require lots of memory, then you may need 16 GB/32 GB of RAM, etc. Could send DRAM prices soaring.
 
I see 2 popular use-cases for AI so far, but neither seems ready for real-time use yet.

In search, we seem to be in the era before Google came up with PageRank. For example, if you gave "Muslim, president, USA" as input, the search returns Barack Obama as the closest result. So just like page ranking, there needs to be an analysis of the search output that adds weighting/ranking/sentiment analysis, etc. Basically this seems like a whole new layer that is missing right now. A maths/stats whiz can correct me here if I am wrong.
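To make that "missing layer" idea concrete, here is a toy Python sketch of a re-ranking pass that blends a model's raw relevance score with an external signal (authority, sentiment, whatever). The candidates, scores, and weights are entirely made up for illustration.

```python
# Toy sketch of a second ranking layer on top of raw retrieval output.
# Candidate names, scores, and weights are made up for illustration only.
candidates = [
    {"result": "result_a", "raw_score": 0.91, "authority": 0.20},
    {"result": "result_b", "raw_score": 0.84, "authority": 0.95},
    {"result": "result_c", "raw_score": 0.88, "authority": 0.40},
]

def rerank(items, w_raw=0.5, w_authority=0.5):
    """Blend the model's raw relevance score with an external authority signal."""
    return sorted(
        items,
        key=lambda c: w_raw * c["raw_score"] + w_authority * c["authority"],
        reverse=True,
    )

for c in rerank(candidates):
    print(c["result"], round(0.5 * c["raw_score"] + 0.5 * c["authority"], 3))
```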

The second use-case is alerts. Think of the movie Minority Report. There are 2 problems here: false positives and the early nature of the alert. Both require more human involvement and monitoring bandwidth, the supply of which is limited. The machine just does some calculations and spits out data, but any decision-making on top of this mathematical output has to be done by humans.
To illustrate with an example, there is a new blood test, Galleri, that can screen for 50 types of cancer. For one person it gave a positive for testicular cancer. The natural reaction of doctors is surgery to prevent metastasis, but only a biopsy is true confirmation. On biopsy it turned out to be a false positive.
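The false-positive problem is largely a base-rate issue, and a quick Bayes calculation shows it. The sensitivity, specificity, and prevalence below are made-up round numbers for illustration, not Galleri's published figures.

```python
# Positive predictive value of a screen for a rare cancer (illustrative numbers only).
sensitivity = 0.80    # P(test positive | cancer)
specificity = 0.995   # P(test negative | no cancer)
prevalence  = 0.0005  # 5 in 10,000 people actually have this cancer

p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_positive
print(f"P(cancer | positive test) = {ppv:.1%}")  # roughly 7% with these numbers
```

Even with a very specific test, a rare condition means most positives are false, which is why the human follow-up (biopsy, monitoring) can't be automated away.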

So in such cases of early warning (which doesn't exist now), you may need a new paradigm (a non-interventionist methodology that doesn't jump straight to surgery) to better manage cancer as a lifestyle disease, maybe 🤔
 
Lots of devices with NPUs, which creates an opportunity to run AI software on the client, for example: auto-correct.

But if the ML models require lots of memory, then you may need 16 GB/32 GB of RAM, etc. Could send DRAM prices soaring.
16 GB of RAM has been quite standard for more than 10 years now, but a lot of this stuff can get by with quite modest resources. On a phone it would push you toward smaller models (I think only some rare Android phones go that far in RAM).

Microsoft's Phi-3 model family is a big effort in that direction, and with 4-bit and smaller models, memory usage gets cut a lot versus the previous 16-bit generation.

A 4-bit Phi-3 can do a lot today with 1.8 GB of memory:

https://azizbelaweid.substack.com/p/phi-3-your-pocket-llm

Benchmark: (see the comparison chart in the linked post)
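As a rough sanity check on that 1.8 GB figure, here is a back-of-the-envelope Python calculation. The ~3.8 B parameter count is Phi-3-mini's advertised size; everything else is just bits-per-parameter arithmetic, ignoring activation and KV-cache overhead.

```python
# Rough weight-only memory for a ~3.8B-parameter model at different precisions.
params = 3.8e9
for name, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    gbytes = params * bits / 8 / 1e9
    print(f"{name}: ~{gbytes:.1f} GB of weights")
# INT4 lands around 1.9 GB, in the same ballpark as the 1.8 GB quoted above.
```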

People get good performance on small iGPUs with 512 MB of shared RAM, on phones, etc.

Even something like Phi-3-Vision is down to only 4 billion parameters and runs quite well on an RTX 3080. With how much money is going into this (and the raw competency of the companies involved: AMD, Microsoft, Apple, Tesla, Nvidia, Amazon, Google, etc.), plus the speed of progress, a 4-bit or even ternary model that runs in under 1 GB of memory, with the training and model versions we will have in 3 years, could do what needed 16 GB to run well 6 months ago.
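For anyone who wants to try this locally, a minimal sketch with Hugging Face transformers and bitsandbytes 4-bit loading might look like the following; the model id, prompt, and generation settings are my own assumptions, not something from the post.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "microsoft/Phi-3-mini-4k-instruct"   # assumed Hugging Face repo id

# Quantize the weights to 4-bit on load; compute still happens in FP16.
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb,
    device_map="auto",          # spill layers to CPU if the GPU is small
)

prompt = "In one sentence, why does 4-bit quantization shrink model memory?"
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))
```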
 
Lots of devices with NPUs, which creates an opportunity to run AI software on the client, for example: auto-correct.

But if the ML models require lots of memory, then you may need 16 GB/32 GB of RAM, etc. Could send DRAM prices soaring.
That's where Nvidia's work with INT4 becomes huge: CUDA and its INT4 inference models use 70% less memory than the same model in OpenML FP16, while being multiple times faster.
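To show where the memory saving comes from, here is a minimal NumPy sketch of symmetric 4-bit weight quantization; it is a conceptual illustration, not Nvidia's actual INT4 kernels or calibration scheme.

```python
import numpy as np

def quantize_int4(w: np.ndarray):
    """Symmetric per-tensor 4-bit quantization: map floats to integers in [-7, 7]."""
    scale = np.abs(w).max() / 7.0
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)  # packed into 4 bits on real hardware
    return q, scale

def dequantize_int4(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)
q, scale = quantize_int4(w)
w_hat = dequantize_int4(q, scale)
print("max abs error:", float(np.abs(w - w_hat).max()))

# 4 bits/weight vs 16 bits/weight is a 75% cut for the weights alone; activations,
# KV cache, and bookkeeping are why quoted end-to-end savings land nearer 70%.
```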
 
So far we have spent roughly 400k in licensing and labor trying to automate about 15% of my job with AI.

Now, I am expensive… but the work (coding up clinical trial inclusion and exclusion criteria from free-text clinical notes into claims and codified clinical terminology) is not rote, boring work. It's involved and complicated.
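For flavor, a stripped-down version of that kind of task with a generic LLM API might look like the sketch below; the prompt, model choice, and output schema are illustrative placeholders of mine, not the poster's actual pipeline or vendor.

```python
import json
from openai import OpenAI   # any licensed LLM endpoint would do; this SDK is just one example

client = OpenAI()
criterion = "Exclude patients with a prior myocardial infarction within the last 6 months."

resp = client.chat.completions.create(
    model="gpt-4o-mini",                      # illustrative model choice
    response_format={"type": "json_object"},  # ask for machine-readable output
    messages=[{
        "role": "user",
        "content": "Return JSON with fields type (inclusion/exclusion), concept, "
                   "code_system, and code for this clinical trial criterion: " + criterion,
    }],
)
print(json.loads(resp.choices[0].message.content))
```

The hard part, of course, is that real criteria are long, nested, and ambiguous, which is exactly why even 15% automation is expensive.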

But hey, it's not my money to set on fire because of a buzzword.
 
Our species does seem to have a really good track record of finding innovative ways to get rid of jobs, yet ending up creating more as a result. We've been innovating people's jobs out of existence for thousands of years, yet we still all have jobs. Maybe this time will be different.
Jevons paradox. The only thing I have seen beat it was LEDs.
 