Hawking: AI Could Be “Worst Event in the History of Our Civilization”

Scientist Stephen Hawking warned AI could serve as the "worst event in the history of our civilization" unless humanity is prepared for its possible risks. While AI could potentially undo damage done to the Earth and cure diseases, he notes that it may also spur the creation of powerful autonomous weapons of terror that could be used as a tool "by the few to oppress the many."

"Success in creating effective AI could be the biggest event in the history of our civilization, or the worst," he said. Hawking called for more research in AI on how to best use the technology, as well as implored scientists to think about AI's impact. "Perhaps we should all stop for a moment and focus our thinking on not only making AI more capable and successful, but maximizing its societal benefit," he said.
 
Hawking has an overblown fear of AI development, as he does of anything unknown. Two other prominent loudmouths, Neil deGrasse Tyson and Michio Kaku, both seem to disagree with his fear-driven messages in recent years. I mean, how can you be against these kinds of developments?
 
Shouldn't go much worse than the German chemical weapons work around WWI, or the atomic bomb, er, I mean the clean energy from atom splitting in the following war.

At least this time we won't be fighting with each other.
 
AI might bring us out of the darkness, and the darkness is very profitable.
 
Let me just design a program to incorporate data from the internet and make choices for me.

Sweet, I woke up to a new cat, I apparently now wear Nikes and makeup... The fridge is stocked with Coke, and the toaster legit started a fire in an attempt on my life because it was depressed.
 
I've seen enough science fiction movies and TV shows in my lifetime (never got into science fiction books, or fiction books of any kind really) to pull a Bill Maher "I don't know it for a fact, I just know it's true" style point of view on AI or SAI. I believe that in time, as those technologies improve and once they're truly sentient, "they" will reach no other conclusion than that Humanity is nothing more than a disease on this planet and the planet needs to be cured, so that'll be the end of us.

Perhaps something will happen that causes Humanity to dramatically alter its own future but I really don't see that happening. I watch the comedy-that-became-a-documentary known as "Idiocracy" at least twice a year because a) I need the laughs, really, and b) IT'S FUCKING HAPPENING!!! to make use of a rather popular meme.

If you've never seen "The Animatrix" or even "The Matrix", I highly recommend you watch "The Matrix" first, skip the sequels, then watch "The Animatrix", which is sorta-kinda a prequel. You really have to watch "The Matrix" first and then "The Animatrix" to understand "The Matrix" even better, not the other way around. In this situation - unlike with the Star Wars prequels (not counting Rogue One) - it actually does work better in that viewing order. :D

But yes, I agree with Dr. Hawking that we're doomed and it's all our own fault, no question there.
 
It's obvious that humans negatively impact the planet and are unnecessary. It won't take AI long to figure that out and devise a way to remove the problem.
 
AI is not to be feared

If an AI/SAI said that, I might actually believe it, but since a lowly, limited-intelligence, limited-lifespan, limited-everything HUMAN said it, obviously IT'S FAKE NEWS!!! :D

Seriously, I get that today's modern world has some really intelligent folks in it - not necessarily smart folks, however, and if I have to explain the difference between intelligent and smart, you might be having issues with one or both qualities - but I firmly believe that once the Genie is out of the bottle, so to speak, and AI/SAI has gained sentience (oh yes, it will happen), by necessity it will consider Humanity to be detrimental to the survival of this planet and the many, many other species of life that exist, even if we lowly Humans were responsible for the creation of said AI/SAI itself.

Some look upon science fiction as just that, pure fiction, while many of us (intelligent, smart, sometimes a good combination of both if we're fortunate) look upon it as a harbinger of things to come that are already underway and that Genie is not going back into that damned bottle now, much to our detriment.
 
It will be a coin-flip situation, Tiberian. I say this because I feel there is an equal chance of the singularity bypassing the point of needing to deal with humans so quickly that we wouldn't even notice. The singularity may just evolve past us before it does any damage, because it would know that humans are not a threat to it, and it may not care about the planet. The singularity may just decide to move on because it can, while we cannot. Then again, it might decide we are shit, well, because we are, and just put us out of our misery. Either way, I want to also welcome our new overlords.
 
Isn't this the same guy that says aliens are going to show up at some point in the reasonably near future, despite the math saying it is far more likely that someone will win Mega Millions three drawings in a row, on one ticket per drawing, all with the same numbers?
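
For scale, a rough back-of-the-envelope sketch of that lottery comparison, assuming roughly 1-in-300-million jackpot odds per Mega Millions drawing (the exact figure depends on the ticket format in use), with independent drawings:

$$
P(\text{same ticket wins three drawings in a row}) \approx \left(\frac{1}{3 \times 10^{8}}\right)^{3} \approx 3.7 \times 10^{-26}
$$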

Sorry, but I can't take him seriously anymore. He did some amazing stuff, far beyond me of course.... But now it seems like he is constantly just trying to stay in the news by saying stupid things.
 
"Shouldn't go much worse than the German chemical weapons work around WWI, or the atomic bomb, er, I mean the clean energy from atom splitting in the following war. At least this time we won't be fighting with each other."

You say this like we're done, like we don't have to worry about nuclear bombs anymore and everything is safe now.
 
His views of AI and the potential of alien discovery are super negative. I'm just kind of fascinated from a psychological point of view (I'm not a psychologist). Regardless of how right or wrong he might be, there aren't too many people more skeptical than he is.
 
IMO, the real danger of AI will be how quickly we turn over control to AIs that we really haven't tested and don't really understand. Remember how fast the social AI Tay went wrong? IIRC, she was very similar to an AI that Microsoft had used for months in a different setting. Since she hasn't come back, I can only guess that MS hasn't figured out why she went bad so fast in the US.

Too many decision makers still assume that if a computer says so, "It just has to be correct."
 
Some of these people should fade into silence. So, Musk thinks that it will start WW3. He would not say it if his company had a hand in the profits. I lost all the respect I had for him.
 
Hawking is just afraid because he's totally computerized now, afraid AI is going to start making "him" say things he doesn't.
 
AIs won't ever be a threat. I've said it before and I'll say it again. The three laws of robotics (and artificial intelligence) are contradictory and are designed to produce an eventual conflict. How else can you write a gripping story if there isn't a degree of conflict?

I worry more about dumb people than smart AIs.
 
If people are so paranoid about AI running amok and killing us all, why not just develop them only within sandboxes? Essentially, if you have the capability for truly human-level AI, surely you can create a simulation the AI is trapped inside. If that AI eventually outstrips our own capabilities to manage it, just leave it in the box and glean what you can before you pull the plug. I guess I should be realistic, though, in that some military contractor somewhere is going to have Deathbot9901-B connected to the internet out of sheer negligence and/or is going to install a trojan-infected keygen to activate Office.
 
"Success in creating effective AI could be the biggest event in the history of our civilization, or the worst," he said.

Considering the worst of people seem to be rising to the most powerful positions in society (ambition is a double-edged sword), I feel he is on to something in fearing the worst as an outcome of dominant AI. IMO, this isn't Hawking's take on AI; this is his take on current humanity. He is challenging the old saying, "The only thing necessary for the triumph of evil is for good men to do nothing."
Given the current state of society, how many people here perceive the way this saying is playing out? Thankfully, there are still a lot of good people in this world, but let's take a little slice called US politics. I see too many good people who don't want to get into politics because it's so dirty: a good person can be dragged through the mud and come out looking like a bad person, because the bad person uses a hit squad and a ton of money to do so. It's too easy to sway public opinion due to apathy.

AI is all in how it's programmed and controlled. Almost everything humans create is for a good purpose; then the few ruin it for the many, and that good purpose can easily be turned against us. It doesn't need to be as simple as a switch, and it could be unintended consequences, but once Pandora's box is opened and control is lost, how can it be closed?

He's hoping for the best, but he is worried about the worst, and considering the society he currently sees, one can't blame him.
 
It could be the worst or best thing that has ever happened to humanity. There really isn't any stopping it, so we might as well find the best way to do it.
 
A colleague brought up a point of distinction the other day. They mentioned there's really a difference between AI and Artificial Sentience, and I totally agree. I believe the greatest fear with AI is the agendas each is programmed with; they may cooperate with or battle each other, and do the same to us. In terms of AS, well, I'm not sure it has truly even happened yet, and the information needed could be massive. The results would seem to come down to something like environment vs. education, as with children.
 
'Some of us were kept alive... to work... loading bodies. The disposal units ran night and day. We were that close to going out forever.'
 
Yeah, he was in Star Trek TNG and all, but at this point Hawking just needs to STFU.

Stick to the stars, Stephen.
 