Why Self-Driving Cars Must Be Programmed to Kill

I think there should be a button on the dash that lets you select the "killer" mode you want. What!?! They do it for performance levels, why not body count? :D

And that raises some difficult issues. How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?
 
My car should take care of me first, everyone else second by default. Unless I tell it otherwise.
 
Scenario:
I'm sitting in the back of my autonomous car surfing the internet and some dumbfuck steps out in front of my car.

My car had better not choose to swerve around them and into an oncoming semi. It's probably okay to spill my drink but that's about it.
 
Shoot the hostage!

I think it would be very complex to program an autonomous car to minimize loss of life. It would have to predict the actions of everyone else on the road, including both human drivers and other autonomous cars programmed in ways it can't know (the algorithms would probably be closed source). I would think it easier to predict the behavior of others under the assumption that each will try to minimize the impact to themselves (a rough sketch of that idea is below).

Plus The Three Laws of Robotics. Or is it the Golden Rule of Robotics - he who makes the robot makes the gold?
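
A toy illustration of that "assume self-interest" prediction idea; the function, action names, and harm numbers are all invented for the example, not anything from a real driving stack:

```python
# Predict another road user's move by assuming they pick whatever minimizes
# harm to themselves. All values below are made up for illustration.

def predict_other_agent(actions, harm_to_self):
    """Return the action the other agent is most likely to take under a
    pure self-preservation assumption."""
    return min(actions, key=lambda a: harm_to_self[a])

# Hypothetical oncoming human driver choosing between maneuvers.
actions = ["brake", "swerve_left", "swerve_right"]
harm_to_self = {"brake": 0.2, "swerve_left": 0.7, "swerve_right": 0.5}

print(predict_other_agent(actions, harm_to_self))  # -> brake
```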
 
This is a no-win situation, so these types of scenarios will never be part of the autonomous driving program.

To illustrate the extent of the pointlessness of these ethical discussions, replace all occupants with children under the age of 10. Little Jimmys and little Emmas. No adults.

Which kids do you want to save? An unavoidable crash is about to happen between the children of a billionaire and poor ghetto kids with arrest records.
 

Indeed. At first I imagine the basic human-freak-out reaction. Hit the brakes and hope for the best. Maybe alert the passengers to grab the "Oh shit!" handle.
 
This is all a bit moot, as how will it even reliably tell the difference between a dog and a child, or a deer and an adult in a big puffy trench coat? And if it's by facial recognition... hope they don't use the HP stuff that couldn't recognize black people because the camera lacked contrast for facial features on darker complexions. ;)
 
Hopefully the software is open source so I can change it to self-preservation.
 
The lawyers are going to have a field day with these stupid cars, and human life will be further devalued by the legal system.
 


Haha the 4th episode of "Better Off Ted" showed how this could be a real problem. :D
 
The car will be equipped with a D&D 20-sided die. Roll a random number to determine who the car kills. Your LUCK stat will influence the outcome. 19-20 is a critical hit on the pedestrian.

But in theory, if all cars are self-driving there should be no accidents, since the human is taken out of the equation. All cars will communicate with each other and make sure they do not crash. But in cities we will still have pedestrians, so I guess the only solution is flying cars.
 
I would never buy a self driving car, unless I was physically unable to drive. (paralyzed, too old, other medical condition, etc.) I enjoy driving, even on long trips.
 

Yes, because a self-driving car prohibits you from driving yourself... Instead of, say, only using it when you don't feel like driving, or when you want to look something up on your phone.

This is not a black-and-white decision. It would be super handy, for example, to turn on autopilot while driving along, look through your music and decide on something to listen to, and then take back control.

Or on a long trip to take a quick nap.

People saying they'll never buy a self-driving car are like our parents saying they'll never get a computer or smartphone. Granted, some never will, but most eventually do.
 
The way an autonomous car should drive, if an accident occurs it should be close to 100% the other object's "fault", barring mechanical/engineering failure of the autonomous car. In that event it should protect its occupants, who have done nothing wrong.

The problem is, we have a society that makes you responsible for everyone but yourself.
 


It's OK, because everyone else is responsible for you. ;) lol


I prolly would not mind having an automated car, but I certainly would love an automated RV. Climb in the RV, set the destination and waypoints for fuel, then chill in the back doing whatever while heading towards my destination.
 
The car will go into Christine Mode and kill all the survivors and the witnesses.


But seriously, the car will prioritize in order of who has the better lawsuit. And since the driver most likely agreed to a EULA, they are last.
 
Well, I would say that it is the driver's responsibility by law to yield to pedestrians. I would suggest that the logic of these laws be followed. It is still not a great outcome, but what can you do other than make improvements to minimize the no-win scenario?
 

I fully understand the "I enjoy driving" position. For myself, I sure didn't spend $45K on a new Challenger so a computer could drive it. But the movie I, Robot put the whole question in a different light. Remember that at one point Will Smith takes manual control of his vehicle and people are a little surprised he would do this. My understanding is that this is because traffic moves at higher speeds than most humans can safely handle; it's not so much the sheer control aspect as the fact that the vehicle navigates at high speed in congested traffic, so it's much safer when the car does the driving.

So now the question becomes: if the car can get you from point to point in congested traffic at 120 MPH, would you still be so inclined to try to manage it yourself? For me, I am not so sure I would be willing to try that one, not in tight traffic. Maybe it wouldn't be so bad with all those computer-driven vehicles around; everyone is driving faster together, right? But still, at 120 MPH you had better be on top of your game, because a fuck-up is really going to hurt.
 
But I also find this a flawed concept. A driver who finds himself hurtling toward a crowd of pedestrians doesn't go through a moral reasoning process; he just reacts, and then maybe he lives and maybe he doesn't. He doesn't consciously have time to think through everything and come to some sort of "hero" decision to sacrifice himself for the greater good.

These guys need to deal with the more immediate problem: a small animal runs out in front of your car in town. Do you swerve, hold course and speed, or hold course and slow down? A deer runs out in front of your car on the highway: what does the car do? An elk? A moose? A bull?

Is your car a Miata or is it an F250?

All of this matters, and it's all part of an equation a driver has to run in his head, under stress, in just a second's time (a toy version of it is sketched below).

Let these engineers figure this out first, because if they can't get that part right, then the stuff they are trying to deal with is moot.

They have the cart before the horse.
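
Just to make that "equation in the driver's head" concrete, here is a toy sketch; every threshold, mass, and speed is invented purely for illustration and is not how any real system decides:

```python
# Toy heuristic: given obstacle size, vehicle mass, and speed, pick a maneuver.
# All numbers are made up for illustration only.

def choose_maneuver(obstacle_kg: float, vehicle_kg: float, speed_kph: float) -> str:
    # Small animal: hard braking or swerving at speed is riskier than the impact.
    if obstacle_kg < 20:
        return "hold course, brake gently"
    # Large animal vs. a light vehicle at speed: the impact itself may be lethal,
    # so an avoidance maneuver starts to look worth the risk.
    if obstacle_kg > 0.2 * vehicle_kg and speed_kph > 80:
        return "brake hard, swerve if the adjacent lane is clear"
    return "brake hard, hold course"

print(choose_maneuver(obstacle_kg=5,   vehicle_kg=1100, speed_kph=50))   # cat in town, Miata
print(choose_maneuver(obstacle_kg=450, vehicle_kg=1100, speed_kph=110))  # moose on highway, Miata
print(choose_maneuver(obstacle_kg=450, vehicle_kg=3300, speed_kph=110))  # moose on highway, F250
```

The same obstacle produces different answers for the Miata and the F250, which is exactly the point.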
 
Accelerate and turn on the windshield wipers? :D

I see no reason for the car to consider itself or its driver a priority over anyone else, even if that someone else is a drunk biker on a direct collision course.

I mean, sure, not everyone will want a car that sacrifices its driver, but at the same time - wouldn't you feel uneasy in traffic or on foot with a bunch of robots around you that have less regard for your life than for that of their owners?

The very algorithm for such a solution would be illegal in itself.

At most it should merely attempt to avoid the obstacle without creating an additional problem through excess maneuvering on a busy road. Decelerate, warn the driver, dial 911, bluescreen.
 
Almost no one would want a car that sacrifices the driver. It really is going to have to be, "occupants come first, those outside the car second", if they hope to sell them.

I would not buy one programmed to sacrifice me based on numbers. I would prefer the car minimized the carnage, but the occupants of the car come first (something like the weighting sketched below).
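
A minimal sketch of that "occupants first, minimize carnage second" preference; the weights and harm estimates are invented for the example:

```python
# Score each candidate maneuver; occupant harm dominates, outsider harm breaks ties.
# All weights and harm values are made up for illustration.

OCCUPANT_WEIGHT = 1000
OUTSIDER_WEIGHT = 1

def score(maneuver):
    return (OCCUPANT_WEIGHT * maneuver["occupant_harm"]
            + OUTSIDER_WEIGHT * maneuver["outsider_harm"])

maneuvers = [
    {"name": "brake hard, hold course",   "occupant_harm": 0.1, "outsider_harm": 0.6},
    {"name": "swerve into oncoming lane", "occupant_harm": 0.9, "outsider_harm": 0.1},
    {"name": "swerve onto the shoulder",  "occupant_harm": 0.1, "outsider_harm": 0.2},
]

print(min(maneuvers, key=score)["name"])  # -> swerve onto the shoulder
```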
 
Okay, how about cases where it's the robot's actions that have led to such a horrible decision point?
Including indirect ones, such as performing a series of overly aggressive maneuvers due to a "glitch"? Like something as trivial as a sensor failing?

Man, this choice should at least be user-configurable :D I have heard of cases where drivers did indeed choose to stop against a tree just to avoid a person.
This happens over here especially, in rural areas and villages, where children have to walk to and from school on the side of the road - roads that are usually lined with trees on both sides.
 
I want my car to look out for me, not the other person. If it has to run over someone to save me, then so be it.
 