AI Accelerator Cards in Desktops?

That's an "over my dead body" proposition. I will literally die fighting the system before I ever submit to living under that kind of dystopia.



This is why it all needs to be set on fire. By any means necessary, this shit needs to be stopped. Absolutely nothing is off the table.
I may be overly cynical, but I think AI will displace a lot of white-collar jobs just like automation displaced a lot of blue-collar jobs. Homelessness, mental health problems, and drug use will increase as a result. I saw this happen to my home town, which was traditionally a blue-collar town. Now there's a bunch of middle-aged men with nothing to do, suffering from the above, living in homeless camps in parks throughout the city.

All those data-crunching, accounting and forecasting, front-line service, driving-related, radiology, programming, etc. jobs will eventually go the way of the dodo.

It may take a few years, but this will displace a lot of people. It's easy to see that this is where we're heading.
 

Yeah. I don't think it would be permanent. The economy will exit on the other end with a new normal and different employment norms that take into account the higher productivity AI can result in.

Again, I don't think AI will completely replace people in most jobs; it will just make people more productive and move actual people to the places where they are actually needed.

But there will be what economists call "transitory effects": people, especially those in mid-to-late careers, will lose their jobs because their skills don't line up with the new needs.

But dismissing so-called "transitory effects" is wrong. Those transitory effects are very real for the people caught up in them. Yes, the economy will likely exit on the other end in a new (and possibly better, if we are lucky) state, but many of these people will never be able to work again, and will live out their old age in poverty as a result.


People make fun of the Luddites, but to a certain extent they were right. The industrial revolution may have had hugely beneficial long-term impacts on society, but in the short to medium term it also ruined a lot of lives.

We shouldn't forget that.
 
I would love to have an AI assistant. I've wasted way too many hours doing menial tasks on a computer that I could have my AI slave do instead. It would be a massive improvement.
 
I know I love the AI that runs on both of my BlueIris servers; self-hosted object detection is great. If I can get away from having to use a GPU for inference, that would be great. These little dongles, cards, etc. have been around for a long time. To people who do not want AI in their lives: I understand. However, they have already been using AI and probably do not even know it.
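For anyone curious what the self-hosted detection actually involves under the hood, it's basically just an HTTP POST of a camera snapshot to the detection server. Here's a minimal sketch of that kind of call, assuming a DeepStack / CodeProject.AI-style server on localhost (the port, endpoint path, field names, and snapshot filename are all assumptions; check your own install):

```python
# Minimal sketch: send a snapshot to a self-hosted detection server.
# Assumes a DeepStack / CodeProject.AI-style API on localhost:5000; the port,
# endpoint, and field names may differ on your install.
import requests

def detect_objects(image_path: str, server: str = "http://localhost:5000"):
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{server}/v1/vision/detection",
            files={"image": f},   # raw JPEG/PNG bytes
            timeout=10,
        )
    resp.raise_for_status()
    # Typical response shape: {"predictions": [{"label": "person",
    # "confidence": 0.93, "x_min": ..., "y_min": ..., ...}]}
    return resp.json().get("predictions", [])

if __name__ == "__main__":
    for p in detect_objects("driveway_snapshot.jpg"):   # example filename
        print(p["label"], round(p["confidence"], 2))
```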

There is nothing wrong with not wanting to use the tech, though; I still keep chickens, cows, grow a garden, etc. Tech can be a helpful but dangerous thing. If the power grid goes down, no more AI. If the furnace goes out in your house and you don't have a wood-burning stove when it is -40F, you might die. If the grocery store runs out of food and you do not raise animals or grow a garden, you are shit out of luck. Hanging on to the old ways is never bad, but totally embracing the new can be fatal in the right conditions.

The Luddites did in fact have it right: the short- and medium-term effects of the industrial revolution did change lives, and not for the better. In the long run, though, it has benefited far more people than they could have ever imagined.
 
I just want Jensen to increase the number of tensor cores and VRAM in his cards, so I can game AND run LLMs locally without paying extra for 2 cards.

A 64 GB VRAM 5070 is what I want.
 
Yeah, unless they force it on me by integrating it into the motherboard or CPU, I'll pass.

I want none of this shit.

I've made it to 2024 without using anything AI; I plan on making it to my deathbed without ever doing so.

Fuck all things AI. Fuck it long. Fuck it hard.
As a dev, I'd love these. I'd love to be able to run whatever I want, at high speed, at low wattage, locally on my box. Both training and inference. It's the type of stuff in the sci-fi novels I grew up with, running my own local AI (or rather, "advanced applied statistical model" :) ).
 
Either go all in or do nothing. Let's not play silly little games where all of a sudden science has gone too far, just starting here in 2024, rather than in 2011, or 1999, or even the dawn of the industrial era. Let's just keep walking forward into the future bravely, and focus carefully on using this technology wisely, rather than fear-mongering and practicing advanced hypocrisy.
I don't think either "extreme" approach is healthy.
 
Seems DOA to me. Remember PhysX cards? Now it's integrated into the GPU. These don't even have the luxury of having that head start. AI is already integrated into GPUs.
 
AI accelerator cards are gonna be as viable and long-lasting as PhysX cards. Not long at all!
 
Seems DOA to me. Remember PhysX cards? Now it's integrated into the GPU. These don't even have the luxury of having that head start. AI is already integrated into GPUs.
And CPUs now too. Microsoft is demanding a certain number of TOPS (currently 40) for AI certification in Windows. So basically any processor, whether from AMD, Intel, or Apple, will contain AI accelerators.
 
And CPUs now too
And if I am not mistaken, a lot of what killed the discrete physics-simulation card, much like the sound card, was that the CPU and the physics-engine software running on it got good enough, more than the GPU being used for physics.
 
AI accelerator cards are gonna be as viable and long-lasting as PhysX cards. Not long at all!

I guess I don't have a good understanding as to what actually makes a good "AI accelerator".

With Nvidia already putting their "tensor cores" in GPUs, are we already there, or is there something I am missing beyond that?

As far as me and my usage are concerned, however, it doesn't matter. I use my PC like it is 2005 (just faster).

I pretty much don't use any features (outside of newer DX revisions) that have been launched since.

I don't use sync/cloud, I don't use AI, I don't use crypto, heck I barely even use any kind of CUDA/OpenCL compute. I've also never used VR/AR and have no intent to do so, and want absolutely nothing to do with "the metaverse." Outside of mobile devices (which I try to use as little as possible, and when I use them I use them like I would a computer, spending most of my time on desktop-rendered web pages in the browser, and never once having tried an "assistant" of any brand) I don't even use WiFi, and I try to avoid anything and everything that forces me to install an app or create an account. I don't even have a Microsoft (or Apple) account, and I never will.

So y'all can play with your AI accelerators, but for me I guess it will just be another thing forced on me that I don't want, that I'll have to disable in BIOS/OS.

AI may come, but I am going to continue my blissful ignorance and not use it at all, and continue like it is 2005 until the day I die.

I don't want it, and I resent that it is being forced on me.
 
I guess I don't have a good understanding as to what actually makes a good "AI accelerator".

GPUs are good at it because they're obnoxiously parallel, and tend to support the type of math, but it's still more efficient to have hardware specifically designed to tear through it. Same as hardware decode when you're watching a video.

You _can_ decode in GPGPU, or even the CPU, but dedicated hardware is still faster and more efficient.

Much of the common AI voodoo can be distilled down to basically low precision matrix math and convolutions. There's not a tremendous amount of magic behind it.

So y'all can play with your AI accelerators, but for me I guess it will just be another thing forced on me that I don't want, that I'll have to disable in BIOS/OS.

In the nutshelliest of nutshells, all you'll be accomplishing really is making some numbers get multiplied slower.

Nothing's stopping you from running a model all in software. It'll just comparatively run like shit.
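If you want to see what that "low precision matrix math" actually boils down to, here's a toy sketch: quantize to int8, multiply, accumulate in int32, rescale. That's essentially the inner loop a tensor core or NPU hard-wires; doing it in plain NumPy on the CPU is the "run like shit" software path. The sizes and scale factors are made up purely for illustration:

```python
# Toy illustration of the low-precision matrix math at the heart of most
# "AI" workloads: quantize float32 values to int8, multiply, accumulate in
# int32, then rescale. Dedicated accelerators hard-wire exactly this pattern.
import numpy as np

def quantize(x: np.ndarray, scale: float) -> np.ndarray:
    """Map float32 values to int8 using a (made-up) per-tensor scale."""
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

rng = np.random.default_rng(0)
activations = rng.standard_normal((64, 128)).astype(np.float32)
weights = rng.standard_normal((128, 256)).astype(np.float32)

a_scale, w_scale = 0.05, 0.02          # illustrative scale factors
a_q = quantize(activations, a_scale)
w_q = quantize(weights, w_scale)

# int8 x int8 -> int32 accumulate, then rescale back to float.
acc = a_q.astype(np.int32) @ w_q.astype(np.int32)
approx = acc.astype(np.float32) * (a_scale * w_scale)

exact = activations @ weights
print("max abs error vs float32 matmul:", np.abs(approx - exact).max())
```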
 
Stupid question: will the NPU eventually get merged into the GPU? So everyone, everyone has to buy a GPU. Or merged into the CPU?

Us old-timers, you know, guys who are now collecting Social Security (US government pensions, for peeps living outside the USA), us old-timers remember the Intel 8087 math co-processor. https://en.wikipedia.org/wiki/Intel_8087
 

I remember the 8087, and I'm over 20 years away from social security :p
 
The NPU already is merged in: Intel calls it the NPU, Apple calls it the NE (Neural Engine), AMD calls it Ryzen AI, and Nvidia just does it in their Tensor Cores. Intel also sells their NPU as an M.2 card, and has for a while, and until recently as a USB stick too, so that's interesting at least.
 
It's been in NVIDIA GPUs since 2018 with tensor cores, and in smartphone ARM SoCs since at least 2016.
The NPU is hardly new tech; it's just a new feature for x86-64 CPUs in laptops/Surfaces and desktops.

NPU TOPS pale in comparison to GPU tensor-core TOPS.
 
I don't use AI, I don't use... [everything]

I guess it will just be another thing forced on me that I don't want, that I'll have to disable in BIOS/OS.
Dude, how do you compute with just a single core CPU?

I, for one, will accelerate all the things in a boat.
And I would accelerate them with a goat...
And I will accelerate them in the rain.
And in the dark. And on a train.
And in a car. And in a tree.
They are so good, so good, you see!

I remember the 8087, and I'm over 20 years away from social security :p
Me too. (OK, so you're about my age plus half a decade or so. YOU'RE SO OOOOOLLLLLDDDD!!!!! ;-)
 
Dude, how do you compute with just a single core CPU?

Dual-core Athlon 64 X2s were available in 2005 :p

And some of us did the whole dual Celeron on an Abit BP6 as far back as 1999 :p

But yeah, I'm fine with more cores, faster architectures, etc. on both the CPU and GPU. I also like faster storage, and networking.

I just don't want any new features from a user experience perspective that didn't already exist in 2005.

I'm not opposed to them in theory. If something good comes along, I'll embrace it. I'm just not sure I've seen one of those yet.

I'm fine as long as I can disable them, but when they get pushed on me without giving me a choice I get really really cranky.

I feel like there was a point when tech increasingly got better and better every year, and then it hit an inflection point and just started getting stupid instead of better. I place that in about 2007. That's when lots of shit changed. (I should probably have said 2007 instead of 2005 in my above post)

- Apple changed the mobile side of things with the iPhone, which, while it was pretty cool in and of itself, resulted in the whole internet wholesale moving away from the desktop experience to the mobile experience with shitty UI design, which was a huge negative.

- At about that same time Google moved away from the "don't be evil" mantra and wholesale embraced data collection, cloud and other such nonsense, and once they opened the floodgates, everyone else followed.

And since then we have just gotten dumber and dumber to where we now have completely moronic shit like internet connected lightbulbs and toasters and a dystopia where you more and more just subscribe to everything and own nothing. And then along comes AI which will undoubtedly just make things worse with stuff like mass scale surveillance with face recognition etc. etc.

I really dislike almost everything about where the tech world has gone in the last 10-15 years.
 
Let's see how it plays Global Thermonuclear War and then I'll decide. I only want it to reach one conclusion.

I can see it working for gaming. A lot of gaming AI isn't that great. Some of it that is fairly good is still just decent, not great. Going up in difficulty doesn't make the AI better, it just makes it easier for the AI to kill you (accuracy, more damage, etc.). I'm just not liking a dedicated module for it. I'd love to play some good strategy games with a great AI. Simulations would benefit from them.

I'd love to see some real world applications where this would be an added value. Anything resembling Cortana, Alexa, Google, Siri, etc. can just get out. I'm sad those are considered "AI" as they are all very, very dumb. If/then machines that take speech input and even then don't do a very good job at it half the time.

So far, poor meme generation, AI art (some is excellent, but still not that impressed), AI assistants, etc. just aren't doing it for me. Even guessing what I'm going to do and working on it before I ask isn't that impressive. It's nice, but it's not something I'd really call a usable feature. Sell me on a good solution for a problem that would utilize an AI accelerator. And I don't want a new problem with a new solution (solution without a problem).

I love AI. I'd love to see it implemented in cool things. I'm just not feeling it for the desktop.
 
Like when HD hit the market, everything was HD. HD glasses, HD bed sheets, HD HD.........
Like when 3D hit everything, 3D this and 3D that.....
Now it's AI everything: this new toilet paper was designed by AI and is now so HD it looks 3D.

Ohh and btw desktop PCs are going away!!!!!!one!
 
HD (1080p) evolved into 2K/4K/8K and hasn't gone away - we can still find simple 1080p displays to this day.
3D TVs all died around 2017; they were a generational gimmick brought back from nostalgia for the 1980s and 1950s, though it does live on in IMAX theatres.

AI isn't a simple cash grab gimmick that is going away anytime soon, much to the chagrin of many forum members on here.
Globally speaking, megacorps are investing billions to trillions of their national currencies into AI, and it all comes down to cost and control.

The AI bubble may one day burst, but by the time it does humanity will have long since suffered the consequences of it, at least in the short and medium terms.

I'm pretty sure I heard about it in this:
https://www.youtube.com/watch?v=CksvYbhU-IU

Thanks for providing the video.
The latency mentioned comes from the AI tasks themselves being performed on the CPU, GPU, or NPU/tensor cores (the time it takes to process), not from where the NPU is physically located.
 
The NPU already is merged in: Intel calls it the NPU, Apple calls it the NE (Neural Engine), AMD calls it Ryzen AI, and Nvidia just does it in their Tensor Cores. Intel also sells their NPU as an M.2 card, and has for a while, and until recently as a USB stick too, so that's interesting at least.
So what is the big friggin' deal with NPUs, like they are Intel's gift to mankind?
 
Dual-core Athlon 64 X2s were available in 2005
I had dual core Athlon 32s, on an AMD (something) 266-D board.

-to the mobile experience with shitty UI design was a huge negative.
Amen brother.
- At about that same time Google moved away from the "don't be evil" mantra and wholesale embraced data collection, cloud and other such nonsense, and once they opened the floodgates, everyone else followed.
And now Google is being very evil with all their layoffs. And I despise them as much as Facebook. Google makes Microsoft look good (cough, cough, medic, I need that treatment so I don't choke to death).

completely moronic shit like internet connected lightbulbs and toasters

and people actually buy this stuff. What's next, an internet-connected condom?
 
I, for one, will accelerate all the things in a boat.
And I would accelerate them with a goat...
And I will accelerate them in the rain.
And in the dark. And on a train.
And in a car. And in a tree.
They are so good, so good, you see!
Dr. Seuss? Cat in the Hat?
 
In chat we were talking about how DirectStorage working like a massive rendered-asset cache for GPUs could be used to speed up frame generation, but we kind of all agreed that there would be a huge problem trying to predict which assets to keep and which to dump, and you'd just end up with a bunch of useless data.

If the AI can predict your play style, in conjunction with DirectStorage, it could predict which assets to keep and which to dump, theoretically improving frame rates by just holding on to old assets and frames.
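Very roughly, I'm picturing something like the sketch below: score each cached asset with a model that predicts how likely you are to need it again soon given your recent play, and evict the lowest-scoring assets first instead of plain LRU. Everything here (the `predict_reuse` stand-in, the asset names, sizes, and VRAM budget) is made up purely for illustration, not any real engine API:

```python
# Rough sketch: an asset cache that evicts based on a predicted probability
# of reuse rather than plain LRU. `predict_reuse` stands in for a hypothetical
# model trained on play patterns.
from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    size_mb: float

def predict_reuse(asset: Asset, recent_assets: list[str]) -> float:
    """Placeholder for a learned model: returns P(asset is needed again soon)."""
    # Stand-in heuristic: recently used assets are assumed likely to recur.
    return 0.9 if asset.asset_id in recent_assets else 0.1

class PredictiveCache:
    def __init__(self, budget_mb: float):
        self.budget_mb = budget_mb
        self.assets: dict[str, Asset] = {}

    def used_mb(self) -> float:
        return sum(a.size_mb for a in self.assets.values())

    def insert(self, asset: Asset, recent_assets: list[str]) -> None:
        self.assets[asset.asset_id] = asset
        # Evict the assets the model thinks are least likely to be reused
        # until we fit back inside the budget.
        while self.used_mb() > self.budget_mb:
            victim = min(self.assets.values(),
                         key=lambda a: predict_reuse(a, recent_assets))
            del self.assets[victim.asset_id]

cache = PredictiveCache(budget_mb=2048)
cache.insert(Asset("castle_lod0", 512.0), recent_assets=["castle_lod0"])
cache.insert(Asset("forest_lod0", 1024.0), recent_assets=["castle_lod0"])
cache.insert(Asset("boss_arena", 1024.0), recent_assets=["castle_lod0"])
print(sorted(cache.assets))   # forest_lod0 gets evicted first here
```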
 
So what is the big friggin' deal with NPUs, like they are Intel's gift to mankind?
It's because of how scarce AI acceleration actually is in the PC platform. AMD only has it on the high end, and Nvidia GPUs are relatively scarce. Realistically, Intel makes up something like 80% of the consumer PC space, so Intel having something that is uniform across the whole chip lineup, and that can easily be added to existing systems via M.2 or USB, is what the market needs for this to become a consumer staple across the Windows platform. And it's relatively cheap as an upgrade, around $100, so it's easy to retrofit into existing systems where it's needed.
 

...and that's exactly the problem.

It shouldn't be encouraged in the consumer space. It should remain an "experts with multiple advanced science and technical degrees" niche. Those are the only folks that can be trusted to use it properly, and not misplace trust in it.

AI is not the problem. It never has been. It is the fools who use it, and how they are going to make my life miserable, that I am concerned about.

Going consumer with AI is the most foolish and irresponsible thing I have seen the industry do, pretty much ever.
 
You are thinking of consumers using LLMs, but what about already-done ones? Take some old photos and have the machine know how to upscale and enhance them correctly while sipping juice, not guzzling it. Not some fancy suite, but the "free" Lightroom express.

Or how about streaming upscaling? Nvidia does it with Google, but what if it was just something the Windows platform could do?

Cellphones and tablets have AI algorithms designed to optimize WiFi signal strength in a crowded environment that PCs lack; why can't they have it too?

I'm not talking about the average consumer working with LLMs directly, but just utilizing existing models that are pre-packaged and out there to do common things.

Look at what they do for camera phones; the amount of upscaling and AI enhancement there for just taking a picture is insane. It's not the lenses they have in there that have brought picture quality to where it is now.

Same for background noise cancellation on audio input: a $50 cellphone running a canned AI model does a better job at cleaning and processing audio than the best of the Windows platform options.
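To be concrete about what "utilizing existing, pre-packaged models" can look like on the desktop today, here's a minimal sketch: load some already-trained upscaler or denoiser exported to ONNX and run it through ONNX Runtime, preferring a DirectML-backed accelerator and falling back to the CPU. The model file name, input name, and input shape are placeholders, not a real product:

```python
# Sketch: run a pre-packaged ONNX model (e.g. a photo upscaler or audio
# denoiser someone else trained) through ONNX Runtime, preferring a
# DirectML-backed accelerator and falling back to CPU.
# "model.onnx" and the dummy input shape are placeholders.
import numpy as np
import onnxruntime as ort

preferred = ["DmlExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)

input_meta = session.get_inputs()[0]
print("running on:", session.get_providers()[0], "| input:", input_meta.name)

# Dummy NCHW image tensor; a real app would feed the decoded photo or audio here.
dummy = np.random.rand(1, 3, 256, 256).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print("output shape:", outputs[0].shape)
```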
 
AI is not the problem. It never has been. It is the fools who use it, and how they are going to make my life miserable, that I am concerned about.


Although it really will be both... :borg: 🤖

 