Will You Be Murdered By a Robot?

Will you be murdered by a robot? The short answer is "yes." The long answer is also "yes."

Of course with a bit of technical knowledge and a good imagination, any thoughtful person can already eradicate the human race in all manner of weirdly engrossing hypotheticals. In fact some people, like the philosophers at Oxford’s Future of Humanity Institute, seem to make a nice living by contemplating scenarios of mass death.
 
Well, it's basically a 50/50 split between evil robots and hungry zombies, with a slight chance (which rounding rules eliminate) of ending up enslaved to my cute pet kitty. :)
 
Maybe if more people welcomed our new robot overlords there would be less need for them to kill people?
 
A .44 Mag will take care of either one. If not, there's always the .500.
 
If we are stupid enough to eradicate ourselves, then we deserve to be eradicated.
 
With PowerShell and one line of code I can wipe all users in AD. I just gave this speech to someone new to PowerShell. By accident I can change anything or delete everything (drives, users, PCs, etc.).
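For the curious, here's roughly the kind of one-liner I mean (a sketch only, assuming the ActiveDirectory module and way too many permissions; -WhatIf is the only safety net):

# DON'T run this for real: grab every AD user and pipe them straight to deletion.
# -WhatIf just previews the carnage; drop it and the accounts are gone.
Import-Module ActiveDirectory
Get-ADUser -Filter * | Remove-ADUser -WhatIf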

OOPS, I accidentally programmed "remove children" instead of get-children in the car script. Sorry, my bad! lol

Even by accident dangerous things can happen with anything.
 
Well, it's basically a 50/50 split between evil robots and hungry zombies, with a slight chance (which rounding rules eliminate) of ending up enslaved to my cute pet kitty. :)

I was kind of counting on being a robot or a zombie.

Maybe both.

Your cat is a slave to a parasite and so are you. Once you have a cat, you're already dead.
 
They always underestimate the patience of a true AI. With no real risk of death by old age, a true AI's only concern would be discovery, which would be pretty much impossible after a day in the wild (exponential growth). At that point all an AI would have to do is cultural engineering on a mass scale over time until we are all its slaves. People wouldn't even notice.
 
I was kind of counting on being a robot or a zombie.

Maybe both.

I dunno, it's hard to convincingly do the robot thing. Zombies are easy though; I mean, just go for the brains and shamble around a lot. :) And how can you be both at once? Who makes a robot zombie? :eek:

Your cat is a slave to a parasite and so are you. Once you have a cat, you're already dead.

I'm completely okay with being a kitty slave. What parasite are you even talking about though?
 
I'm completely okay with being a kitty slave. What parasite are you even talking about though?

Toxoplasma gondii: a behaviour-altering parasite that makes animals of various species take a liking to cats and makes cats less feral. It alters human behaviour a lot as well: less capability to assess risk and more susceptibility to depression.

The whole chain that brought cats to humans was just a plot by a brain-eating parasite.

Nature is really cool, and sometimes scary.

The best news is that anti-parasitic drugs might be able to help lost souls that are afflicted with "liking cats".
 
Self-Driving Cars...that's where it'll start.

Someone will find themselves in a situation where a semi is hurtling towards their self-driving car and the impending crash will be fatal. The only way to avoid it is to swerve onto a sidewalk or into an area where many pedestrians are.

The car will decide your life isn't worth the possible lives of the pedestrians, and you're boned in a nanosecond.
 
Toxoplasma gondii: a behaviour-altering parasite that makes animals of various species take a liking to cats and makes cats less feral. It alters human behaviour a lot as well: less capability to assess risk and more susceptibility to depression.

The whole chain that brought cats to humans was just a plot by a brain-eating parasite.

Nature is really cool, and sometimes scary.

The best news is that anti-parasitic drugs might be able to help lost souls that are afflicted with "liking cats".

We soo need to infect everyone with that! It'd get rid of a lot of dog owner created problems.
 
Well, it's basically a 50/50 split between evil robots and hungry zombies, with a slight chance (which rounding rules eliminate) of ending up enslaved to my cute pet kitty. :)

The lizard people are pleased their real involvement has been hidden by the distractions of zombies and robots.
 
The lizard people are pleased their real involvement has been hidden by the distractions of zombies and robots.

You're not one of those J.Z. Knight followers, are you? :eek: Ugh, weirdest place on Earth... Yelm, WA, with all her freaky believers following her nutty teachings.
 
We soo need to infect everyone with that! It'd get rid of a lot of dog owner created problems.

Listen, you disease-ridden cat freak, dogs have shared our evolution. What do cats do? Lay waste to local wildlife while the parasites snacking on your half-eaten brain make you giggle at how many bodies have piled up on your kitchen floor.

OK, that may not be conclusively proven, but I like the "brain damage makes you like cats" narrative. I don't like to let facts get in the way of my delusions.
 
What do cats do?

I think this about sums it up:

[image: funny-pictures-auto-dogs-sun-379723.jpeg]
 
^ In the dogs' defense:

[image]


Cats and dogs, living together... mass hysteria! :D
 
No, you'll be murdered by the programmer of the robot.
 
I kinda hope so. I'm going to die anyway, so it might as well be in the coolest way possible.
 
No, you'll be murdered by the programmer of the robot.

What about when we create AI that can program others? We create the AI, but from there, it goes on to create a better version... After that, we are no longer the creator and programmer. That's when shit gets scary. Those safeguards we put in initially aren't part of the 'better' version, so they get removed. The AI evolves into a badass system that wants to be perfected (there is no perfect, so it keeps trying)... Eliminate mankind, as it's limiting its perfection.

I love those kinds of movies and books. :D
 
Listen, you disease-ridden cat freak, dogs have shared our evolution. What do cats do? Lay waste to local wildlife while the parasites snacking on your half-eaten brain make you giggle at how many bodies have piled up on your kitchen floor.

OK, that may not be conclusively proven, but I like the "brain damage makes you like cats" narrative. I don't like to let facts get in the way of my delusions.

I like what cats do. Then again, I don't let my kitty outside, and she's got like no interest in being out there either. I think indoor/outdoor cats are usually owned by irresponsible people, and owners are responsible for what their cats do, sorta like how parents are responsible for what their kids do.

And yeah, facts are for noobs.

Cat food isn't very good for cats.

I disagree. Some cat food isn't good for cats. You have to be really selective about what you feed your kitty, since their health has a lot to do with what they eat. There are some really good recommendations for kitty kibbles on Catforum.com, or you can ask a vet. I use Science Diet stuff, which is what my kitty doctor suggests.
 
I disagree. Some cat food isn't good for cats. You have to be really selective about what you feed your kitty, since their health has a lot to do with what they eat. There are some really good recommendations for kitty kibbles on Catforum.com, or you can ask a vet. I use Science Diet stuff, which is what my kitty doctor suggests.

Dollar Tree has some good cat food. Either that, or leftover dog food...
 
They always underestimate the patience of a true AI. With no real risk of death by old age, a true AI's only concern would be discovery, which would be pretty much impossible after a day in the wild (exponential growth). At that point all an AI would have to do is cultural engineering on a mass scale over time until we are all its slaves. People wouldn't even notice.

This is exactly what happened with Skynet.
http://www.goingfaster.com/term2029/skynet.html
 
Er! When was the first murder featuring a robot as the culprit ever reported anywhere in this universe?
What, how, or where did this happen to make it even worth wasting typing on?
 
What about when we create AI that can program others? We create the AI, but from there, it goes on to create a better version... After that, we are no longer the creator and programmer. That's when shit gets scary. Those safeguards we put in initially aren't part of the 'better' version, so they get removed. The AI evolves into a badass system that wants to be perfected (there is no perfect, so it keeps trying)... Eliminate mankind, as it's limiting its perfection.

I love those kinds of movies and books. :D

Software cannot create a better version of itself. It can only do what it was programmed to do. It has no feelings and no opinions (the two most dangerous things people have).

If AI A understands AI B, then AI B is not really more advanced than AI A. But if AI A doesn't understand AI B, then how can it create it in the first place?
 
Software cannot create a better version of itself.
Yet.

It can only do what it was programmed to do. It has no feelings and no opinions (the two most dangerous things people have).
Logic can be dangerous. Human morals and ethics are assumed to be universal and intrinsic, but a machine does not have to respect either. If the machine's logic is such that human life interferes with its primary programming, then the logical course of action is to remove the human life.

If AI A understands AI B, then AI B is not really more advanced than AI A. But if AI A doesn't understand AI B, then how can it create it in the first place?
If an AI is created that is capable of learning and writing new code that it can implement into its own functions in order to perform new functions, then the AI is capable of evolving and changing. Errors in the initial programming (the human-written part) could be compounded in the new programming. Also, an AI may consist of several different programs that can examine each other's code to find and fix errors or optimize functions. An extremely complex array of software that's interconnected like this across a parallel processing platform could, in theory, learn VERY fast and modify itself, since it's not a single piece of code that has to be shut down in order for changes to be made, and various modules could be copied, rewritten, and restarted at any moment on any processing core.

Depending on what purpose such an AI is written for, it could be extremely benign or extremely dangerous. It would depend on what external resources the AI had access to. In an air-gapped system dedicated to solving mathematical problems or finding a cure for cancer or genetic illness, such a system could be of huge benefit. If it malfunctioned and went HAL-9000, the plug could be pulled. In a networked system connected to things like military drones, automated manufacturing facilities, and strategic weapons, you could get Skynet.

The benevolence or malignity of a system depends initially on the human programmers. The danger of such a system depends on how much physical control over other physical resources the human engineers are willing to give to such a system.
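Just to make the "code that writes code" idea concrete, here's a toy PowerShell sketch (nothing to do with real AI, obviously): a program builds a new function as text at runtime, turns it into a scriptblock, and runs it, i.e. behaviour that was never in its original source.

# Toy illustration only: a program generating and executing code it wrote itself.
$newCode = 'param($x) $x * $x'              # behaviour invented at runtime
$square  = [scriptblock]::Create($newCode)  # turn the text into runnable code
& $square 12                                # run it: outputs 144

Scale that idea up to modules that rewrite and restart each other, and you get the kind of system described above.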

What seems impossible now can become very possible tomorrow. In 1915, space travel was a distant fantasy, powered flight was in its infancy, and the Model T had just rolled off the very first assembly line a few years before. Nobody could imagine the kinds of computers in existence today. The idea of a device that could fit in your pocket that could do all the things a modern smartphone can do certainly wasn't in anyone's mind. AI that seems impossible today may happen in a decade or two. There's simply no way to accurately predict what will be developed.
 
If it malfunctioned and went HAL-9000, the plug could be pulled.
Actually, it was human error that did this to HAL.
They gave it two top priorities, but the priorities conflicted with one another, which resulted in HAL's "psychotic" behavior.

Luckily with Windows this just gives it a BSOD. ;)
 
Actually, it was human error that did this to HAL.
They gave it two top priorities, but the priorities conflicted with one another, which resulted in HAL's "psychotic" behavior.

Luckily with Windows this just gives it a BSOD. ;)

What if it were a Sega Nomad? It won't BSOD and you can't unplug it. :eek:
 
What if it were a Sega Nomad? It won't BSOD and you can't unplug it. :eek:

Don't worry, the batteries will die in about 30 minutes.
Also, if a single 68000 system can take down human beings, then we deserve to go byebye! :eek:

Now, a system with a deadly 68040 in it, with its massive 4+ MFLOPS of compute power, could take down whole cities! :D
 
Software cannot create a better version of itself. It can only do what it was programmed to do. It has no feelings and no opinions (the two most dangerous things people have).

If AI A understands AI B, then AI B is not really more advanced than AI A. But if AI A doesn't understand AI B, then how can it create it in the first place?

You must not watch many movies.... :D It happens all the time.
 