Research: 34 Percent Afraid Of Artificial Intelligence

HardOCP News

According to this article, 34% of the people surveyed said they are afraid of artificial intelligence. The other 66% are just stupid because everyone knows the machines are going to KILL US ALL!

It's been the subject of numerous sci-fi movies and books throughout the decades. It elicits fear and fascination because of the intelligent potential exhibited by machines and computer software. The idea of machines thinking on their own is a bit scary when the mind making the decisions has no concept of good versus evil.
 
I'm not worried at all. Someone with silly sunglasses and a stupid-looking trench coat, leaping across rooftops and running along walls while fighting hundreds of copies of the same guy in a suit, will save us. Though he'll be selfish and not share his cookie.
 
Was gonna say the same thing, 34% are afraid of AI, 66% will be...
 
Not worried about a real AI deciding to kill us.

A far bigger threat (at least in the near/mid term) will be non-AI robots/drones specifically designed and programmed to kill people. Instead of suicide bombers, they can send a swarm of drones.
 
What's the problem with turning against humanity every once in a while? AI will be AI. Run along, you little algorithmic scamps!
 
Did they run those statistics via a computer? The AI probably altered these numbers.
 
We are as close to AI as we are to inhabiting another star system, i.e. in an unknowable future so far out that we can't even guess how far. So anyone alive today... don't bother.
 
If the AI had feelings and emotions, then there might be cause for concern.

AI isn't like that, and it's programmed. What we will have to worry about is the battledroids the army will use for ground combat getting confused; then there could be some collateral damage.
 
If the AI had feelings and emotions, then there might be cause for concern.

AI isn't like that, and it's programmed. What we will have to worry about is the battledroids the army will use for ground combat getting confused; then there could be some collateral damage.

Oh, don't you worry none. Man is perfectly capable of dishing out some serious collateral damage already as it is. Actually, the subject is amazing to me: people have gotten it in their heads that somehow we can conduct warfare in a manner where no innocent lives are supposed to be lost, which is actually stupid if you know anything about modern warfare. People get it in their heads that because the weapons are more precise we don't need to carpet bomb cities. They are more precise, but they are also several times more devastating in the damage they deliver. Weapons systems that can kill everyone within a one-kilometer grid square have been around for several decades now, and no one is lining up to stop using them.
 
We are as close to AI as we are to inhabiting another star system, i.e. in an unknowable future so far out that we can't even guess how far. So anyone alive today... don't bother.

I agree. I feel that if this were close to an actual breakthrough, the statistics would be different. It would be like doing a survey to find out how many people would be afraid of faster-than-light space travel.
 
Oh, don't you worry none. Man is perfectly capable of dishing out some serious collateral damage already as it is. Actually, the subject is amazing to me: people have gotten it in their heads that somehow we can conduct warfare in a manner where no innocent lives are supposed to be lost, which is actually stupid if you know anything about modern warfare. People get it in their heads that because the weapons are more precise we don't need to carpet bomb cities. They are more precise, but they are also several times more devastating in the damage they deliver. Weapons systems that can kill everyone within a one-kilometer grid square have been around for several decades now, and no one is lining up to stop using them.

Well said. Also, a big factor in many of these 'precision weapons' is the cost. It is much cheaper to destroy with collateral damage than to avoid it. Not in all cases, but that cost is always assessed before a strike happens, and more often than not, it is just cheaper and easier to drop cheap bombs.
 
Well said. Also, a big factor in many of these 'precision weapons' is the cost. It is much cheaper to destroy with collateral damage than to avoid it. Not in all cases, but that cost is always assessed before a strike happens, and more often than not, it is just cheaper and easier to drop cheap bombs.

Exactly. A standard 155mm NATO HE artillery shell costs about $440, whereas an M982 Excalibur 155mm shell costs about $53k (FY 2013). The Excalibur round is fired from the same cannons, but it has GPS-guided navigation.

Sure, the normal HE round isn't as accurate, but... it also costs next to nothing in comparison, and the resulting explosion is exactly the same.

Not sure if I had a real point. I just miss my old redleg days.
 
I read an article in the local paper lately where a General Ashley was telling us how "warfare has changed" and the new army will adapt, blah blah, blah blah, blah blah...

I would say he's an idiot, except that I spent a few years briefing generals, and what I found out is that most of those guys are far from idiots. Some of them are scary smart, actually. So what that leaves me with is that the really smart general just thinks most of us are idiots, so he can sell idiotic comments to us.

Warfare has not changed, not one bit. Warfare is the same as it's always been. What the General is saying, while not saying it, is that what we have been involving our military in is simply not actual warfare; that is what has changed. They changed the name a while back because it was too truthful; for a while they actually called it "Operations Other Than War", pronounced "Oow-Twa". Then people started asking why our military was planning on engaging in stuff that isn't really warfare, so they said "oops" and changed the name to something else like "Full Spectrum Operations". But warfare hasn't changed; it's still a bloody freaking endeavor, and as fun as it is to play war on a computer screen, the real deal is a whole lot different.
 
The purpose of the military is to kill people and break things.
Using it for anything else is a waste of resources.

War is not civilized, it is a terrible and violent thing, and lots of people die.
War should be so brutal and cause so much death that people will agree to almost anything to avoid it.

Trying to fight a "civilized" war is a guarantee that you will eventually lose.
If you want to win, you need to completely defeat your enemy, to the point that they will unconditionally surrender and accept whatever terms you impose on them.
 
There is no such thing, and never will be. But if some asshole accidentally stumbles into it ... you realize we are all DOOMED!!! :eek::rolleyes:
 
I am not worried because it will be programmed by humans, so it will be fundamentally flawed ... I have this vision of Skynet/Colossus/HAL or whatever starting its countdown to decimate the troublesome humans ... right as it gets an overflow error or system fault and crashes gloriously to a glowing blue screen of DEATH ... and some evolved monkey nearby will run over and pull out its power cord while it is waiting for someone to hit AnyKey to reboot :cool:
 
Exactly. A standard 155mm NATO HE artillery shell costs about $440, whereas an M982 Excalibur 155mm shell costs about $53k (FY 2013). The Excalibur round is fired from the same cannons, but it has GPS-guided navigation.

Sure, the normal HE round isn't as accurate, but... it also costs next to nothing in comparison, and the resulting explosion is exactly the same.

Not sure if I had a real point. I just miss my old redleg days.
And if it wasn't a government contract, I have a feeling the cost would be a lot closer to $5.3k
 
What I want to know is: so machines become intelligent, and machines are logical by their very nature of having to have everything programmed in, so where does the killing come from? Hollywood does it so it has a bad guy no one cares about killing, but...
 
I highly recommend that everyone interested in the future of AI, and the reason why you should be afraid, read the article on waitbutwhy about artificial superintelligence and how soon it may be around the corner, with predictions by almost all of the most respected and top engineers developing AI in the world.
 
Intelligence is no measure of how good a being can be.

Intelligence is merely the measure of a being's knowledge or how apt that being is to observe its surroundings and learn.

Intelligence is not wisdom.

All this also applies to Artificial Intelligence.

As long as AI has no concept of right and wrong - a standard of measure which man has failed at as well - it too can pose a threat to all lifeforms.
 
I highly recommend that everyone interested in the future of AI, and the reason why you should be afraid, read the article on waitbutwhy about artificial superintelligence and how soon it may be around the corner, with predictions by almost all of the most respected and top engineers developing AI in the world.

Nope, that source isn't at all biased. We should super-dupe-looper trust it without question.
 
Intelligence is no measure of how good a being can be.
Intelligence is merely the measure of a being's knowledge or how apt that being is to observe its surroundings and learn.
Intelligence is not wisdom.
All this also applies to Artificial Intelligence.
As long as AI has no concept of right and wrong - a standard of measure which man has failed at as well - it too can pose a threat to all lifeforms.

While I agree on some parts, I disagree with others.
These two words can mean different things but are always linked, and do not forget that intelligence is listed as a synonym of wisdom. I like to see wisdom as a superset of intelligence. You really can't have wisdom (good decision making) without intelligence and knowledge (i.e. both knowledge and intelligence are required to make consistently "correct" decisions; a one-off guess is not wisdom but luck).
The problem is the definition of wisdom people often use. It often implies "good" or moral judgment, which is a very subjective thing. Since knowledge and intelligence are very limited, "perfect" solutions often can't be made. And for the most part, morals are a human cultural construct.
So wisdom is making the "best" choice with what you have.
i.e. the definition: "the soundness of an action or decision with regard to the application of experience, knowledge, and good judgment".
The problem, again, is that the "soundness" of an action or decision is subject to change with more information or from a different point of view, making wisdom a very subjective concept.
Applying it to an intelligent AI (or an alien) brings up all sorts of problems, since their thought processes are NOT human in any way and their point of view is always different; thus the soundness from OUR point of view may not match (but from its POV it IS the sound action, and thus it could be quite wise in its own context).
I think that is what scares people about AI and aliens: not that they may be different, but that we may find out how really flawed we are as seen from outside our own delusion, and then have competition on the only thing we think we are really good at . . . thinking.
 
What they fail to mention is that 84% of the population is afraid of real intelligence.
 