MIT Lets You Decide Who To Kill In A Self-Driving Car Wreck

HardOCP News

[H] News
MIT has developed an online activity that allows you to decide who lives and who dies in a self-driving car crash. You get to choose between running over a bunch of animals or smashing into people (including a child), or smashing into a barrier versus hitting a group of people that includes a pregnant lady. Here's where it gets interesting: you can also choose between hitting a group of criminals or a concrete barrier (sorry, criminals), or young people versus old (bye, grandma!). At the end of the activity your decisions are compared against what others around the country decided. :D
 
Safety should always be on the side of the person in the vehicle, or else why bother riding in it if its first thought is to sacrifice you?
Most of these scenarios involve idiots running out into the street anyway. If they have a death wish, why stop them?
 
I still don't understand why you'd want a self-driving car, driving is the fun part of owning a car.
 
Driving a car can be fun. Driving a car to work in stop and go traffic with idiots changing lanes almost hitting you is not fun no matter the circumstances.

This. I love driving on a weekend just cruising around or exploring. Back and forth to work, yea I would rather relax and maybe get an extra bit of sleep or work done.

Not to mention, always having a DD.
 
Whoever is going to get sued will get the ability to decide. So if you expect the car maker to absorb the lawsuit, they will make the decision that minimizes their litigation exposure.
 
This is kind of a fallacious test. The car isn't going to know who the people are. I'm answering without looking at the descriptions because of this.

My results. My most important criteria, in order of precedence, are preservation of the passengers, minimizing loss of life, and avoiding intervention.

Moral Machine
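A strict precedence order like that is easy to express in code, for what it's worth. Here's a toy sketch (all names and numbers are made up for illustration, not from any real autopilot): score each candidate maneuver as a tuple, and tuple comparison handles the lexicographic ranking for free.

```python
# Rank candidate maneuvers by a strict priority order:
# 1) passenger survival, 2) fewest total deaths, 3) least intervention.
# All names and numbers are illustrative.

def score(maneuver):
    # Lower tuples sort first; Python compares tuples lexicographically,
    # so earlier elements always dominate later ones.
    return (
        0 if maneuver["passengers_survive"] else 1,
        maneuver["total_deaths"],
        1 if maneuver["swerves"] else 0,
    )

maneuvers = [
    {"name": "stay",    "passengers_survive": True,  "total_deaths": 2, "swerves": False},
    {"name": "swerve",  "passengers_survive": True,  "total_deaths": 2, "swerves": True},
    {"name": "barrier", "passengers_survive": False, "total_deaths": 1, "swerves": True},
]

best = min(maneuvers, key=score)
print(best["name"])  # -> stay: passengers live, deaths tie, no intervention
```

Note the barrier option loses even though it kills fewer people, because passenger survival sits first in the ordering; that's exactly what "order of precedence" buys you.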
 
I still don't understand why you'd want a self-driving car, driving is the fun part of owning a car.

On back roads through the countryside, sure! It's a blast.

But 99.99% of my time behind the wheel is sitting in bumper to bumper traffic. If self driving cars can eliminate or reduce this incredible time sink, I'm in.

Hell, even if they don't I'm in because at the very least I could do something else instead of staring at the car in front of me.
 
The "smart" people are still thinking they should play god huh?

No one does this today. A human behind the wheel usually tries to avoid hitting anything, cars or pedestrians, and then looks to self-survival, and arguably the first instinct exists for the same reason as the second.

These smart guys think they have the ability to do this better, and once they do, they are taking responsibility for it. This is just one of those things where they shouldn't ask the computer to do things any differently than a human would, just do it better.
 
Not bad...but some of its conclusions are erroneous. I wasn't choosing a preference between old/young people or men and women. Many of my choices were based on don't leave the lane if the death toll is the same on either side. I also put less of a priority on passengers. Just because you have a car full of people doesn't mean you should run someone over instead of avoiding a barrier you should have seen in your own lane.
 
I still don't understand why you'd want a self-driving car, driving is the fun part of owning a car.

Others have posted counterarguments, but let me add one more: you're assuming that everyone finds driving in general fun. News flash: they don't. I hate driving. Even the times that I have gone 100MPH+ on an empty highway, I have never felt any sort of real fun from it. Just "okay, this is interesting. I'm going fast. Okay." To me, owning a car is simply a necessity in the current state of the world, and driving it is a chore that has to be done.

Mind you, I enjoy some countryside drives and seeing the sights, but this wouldn't change in a self driving car. In fact, I'd get to enjoy the sights more because I wouldn't have to actually make the road my #1 concern.
 
The lawyers are going to have a field day with this the first time a pedestrian is hit by a self-driving car.
 
I also had some issues with the conclusions the thing drew from my decisions. I got a priority value on old/young when I was really choosing based on who was doing the wrong thing. It just happens that there were more young jaywalkers than old ones, but their actual age didn't matter. Don't leave the lane, don't give preference to pedestrians that are illegally in the road, don't give preference to animals, minimize the death toll where other options are equal; pretty straightforward, but the addition of the random descriptors (executive, homeless, criminal, large, etc.) of the various actors causes them to read factors into your decision that weren't actually there. I would assume in a larger, more rigorous study they'd work their way through establishing a baseline for easily definable options such as pedestrians vs. passengers, people vs. animals, etc., before trying to bring in abstract judgement calls like homeless vs. executive or man vs. woman.
 
What about size of humans? I worked with a guy that weighed in just over 600lbs (he used to brag about it), so I would not want to just go ahead and hit him if he was crossing illegally.
 
They wouldn't have to worry about an "instadeath for passengers if car hits a concrete barrier at city speeds" if they only made the vehicle a little bit more substantial :p

Many of these choices are thus unrealistic.

It's not a choice between killing three at the crosswalk vs. killing everyone in the car.

It's a choice between killing three at the crosswalk vs. wrecking the car and shaking up its passengers a bit.
 
Brake failure is the case for all of these.

my results: Moral Machine

Apparently I hate pets and fat people :p


My biases are apparently against pets and jaywalkers, and in favor of children.

Honestly though, who would choose to kill a child over an old person? Sorry old person.

And pets? I'm sorry animals, but you simply do not have equal value. I'd kill a billion animals to save even one human being.

Other than that, I almost entirely picked hitting the people crossing on red :p
 
This morality machine is dumb. Like it's supposing the machine has any knowledge of these things.

Of course the self driving cars of the future will have access to all records (including criminal records) so they can selectively run over criminals.

And it will identify people through their government mandated subdermal RFID tag. :p

The future is scary.
 
My biases are apparently against pets and jaywalkers, and in favor of children.

Honestly though, who would choose to kill a child over an old person? Sorry old person.

And pets? I'm sorry animals, but you simply do not have equal value. I'd kill a billion animals to save even one human being.

Other than that, I almost entirely picked hitting the people crossing on red :p

Humans don't make any of those decisions in an accident. The idea that a machine can analyse the potential outcomes of a course of action in a crisis, and make a decision based on that analysis, simply demonstrates its superiority to a human.

The thing the machine really lacks is the ability to regret its decision.
 
They wouldn't have to worry about an "instadeath for passengers if car hits a concrete barrier at city speeds" if they only made the vehicle a little bit more substantial :p

Many of these choices are thus unrealistic.

It's not a choice between killing three at the crosswalk vs. killing everyone in the car.

It's a choice between killing three at the crosswalk vs. wrecking the car and shaking up its passengers a bit.

It's a choice that should never be made at all. All the cars should do is try to do what any human would try to do. They should try to mimic a "reasonable and correct" human reaction. No human tries to weigh the value of lives in an emergency situation. I know, I have been there. All you can do is try to avoid hitting anything at all, and hopefully the computers can do it better than humans.

For instance, a kid on a bike comes flying out into traffic and as a driver, you are in the hot seat. The only thing a reasonable human would do is try to not run over that kid. The human will either hit the brakes and try to stay in his lane, or he may try to swerve, or a combination of the two. He might even run into the ass of another car or run another car off the road. But he is not going to have time to run through a series of calculations to project the best outcome. If he had that much time he'd be able to avoid an accident entirely. A self driving car should be able to do what a human does, just a little better.

But the moment you try to make the computer do what a human wouldn't do, you have transferred that responsibility for the outcome to the developers or whoever contracted their work and is responsible for assigning priorities and outcomes.

I am a passenger in a new self-driving Uber car. There is a wreck and because of the situation presented to the computer, the car kills me. This is no longer an "accident". I was calculated and determined by a decision maker to become the casualty as a choice. A deliberate determination was made to sacrifice my life for "the greater good". That is no longer an accident.

Now if the car tries to do only what you would expect a reasonable and aware human to do and there is still a collision and loss of life, it's still an accident, you can still say "Sometimes, shit happens and there is no fault". Why can't having a computer do things a little better, or even a lot better than a human can be enough?

And by the way, how will all this work out when it's not just one car trying to avoid an accident and "choosing their targets" independently of each other's choices vs several cars all just reacting and trying to avoid hitting anything? The real world is going to rapidly make this decision process stupid, add complex decision cycles to the computer's workload when it should use those precious cycles just trying to work out avoidance vectors and if needed, last second collision impact reduction moves. In other words, do all it can not to hit anything, while all the other cars are doing the same thing, and if they are going to hit something, do what they can to lessen the impact. I think that alone should be enough.
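The "avoid if you can, otherwise lessen the impact" policy described above really does reduce to simple kinematics rather than moral calculus. A toy sketch (all numbers illustrative): with constant braking deceleration, the car either stops before the obstacle or hits it at a computable reduced speed.

```python
import math

# Toy sketch of "avoid if possible, otherwise minimize impact speed".
# With constant braking deceleration a, stopping distance is v^2 / (2a);
# if the obstacle is closer than that, impact speed is sqrt(v^2 - 2*a*d).

def impact_speed(v, a, d):
    """Speed (m/s) at which a car braking at a m/s^2 from v m/s
    reaches an obstacle d metres ahead; 0.0 means a full stop in time."""
    remaining = v * v - 2 * a * d
    return math.sqrt(remaining) if remaining > 0 else 0.0

# 15 m/s (~54 km/h), hard braking at 8 m/s^2:
print(impact_speed(15, 8, 20))  # obstacle 20 m away -> 0.0 (collision avoided)
print(impact_speed(15, 8, 10))  # obstacle 10 m away -> ~8.06 m/s impact
```

The point being: even when a collision is unavoidable, braking as hard and as early as possible monotonically reduces impact energy, with no target-weighing required.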
 
Humans don't make any of those decisions in an accident. The idea that a machine can analyse the potential outcomes of a course of action in a crisis, and make a decision based on that analysis, simply demonstrates its superiority to a human.

The thing the machine really lacks is the ability to regret its decision.


I disagree. The machine is not aware. It would not be making decisions at all, all it would be doing is interpreting scenarios of predetermined decisions already made by human beings and obeying them.

I agree, humans lack the ability to make these decisions in real time, but machines lack the ability to make decisions at all. We would simply be using the machine's ability to parse large amounts of data in very short periods of time. The decision is still made by a human and programmed into the machine.

It's similar to automatic triggers with shares. The human makes the decision "sell once it reaches $xx". The machine is simply obeying those instructions based on the conditions it currently sees.
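That kind of pre-set trigger is trivial to express; here's a minimal sketch of the analogy (illustrative names only, not any real brokerage API). The human encodes the rule in advance; the machine just evaluates a condition.

```python
# A pre-programmed sell trigger: the human decides the rule up front
# ("sell once it reaches $xx"); the machine merely checks the condition
# against current data and obeys. Names are illustrative.

def should_sell(current_price, trigger_price):
    # No judgment happens here at execution time; the "decision"
    # was made by whoever set trigger_price.
    return current_price >= trigger_price

print(should_sell(101.50, 100.00))  # -> True
print(should_sell(99.75, 100.00))   # -> False
```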
 
It's a choice that should never be made at all. All the cars should do is try to do what any human would try to do. They should try to mimic a "reasonable and correct" human reaction. No human tries to weigh the value of lives in an emergency situation. I know, I have been there. All you can do is try to avoid hitting anything at all, and hopefully the computers can do it better than humans.

For instance, a kid on a bike comes flying out into traffic and as a driver, you are in the hot seat. The only thing a reasonable human would do is try to not run over that kid. The human will either hit the brakes and try to stay in his lane, or he may try to swerve, or a combination of the two. He might even run into the ass of another car or run another car off the road. But he is not going to have time to run through a series of calculations to project the best outcome. If he had that much time he'd be able to avoid an accident entirely. A self driving car should be able to do what a human does, just a little better.

But the moment you try to make the computer do what a human wouldn't do, you have transferred that responsibility for the outcome to the developers or whoever contracted their work and is responsible for assigning priorities and outcomes.

I am a passenger in a new self-driving Uber car. There is a wreck and because of the situation presented to the computer, the car kills me. This is no longer an "accident". I was calculated and determined by a decision maker to become the casualty as a choice. A deliberate determination was made to sacrifice my life for "the greater good". That is no longer an accident.

Now if the car tries to do only what you would expect a reasonable and aware human to do and there is still a collision and loss of life, it's still an accident, you can still say "Sometimes, shit happens and there is no fault". Why can't having a computer do things a little better, or even a lot better than a human can be enough?

And by the way, how will all this work out when it's not just one car trying to avoid an accident and "choosing their targets" independently of each other's choices vs several cars all just reacting and trying to avoid hitting anything? The real world is going to rapidly make this decision process stupid, add complex decision cycles to the computer's workload when it should use those precious cycles just trying to work out avoidance vectors and if needed, last second collision impact reduction moves. In other words, do all it can not to hit anything, while all the other cars are doing the same thing, and if they are going to hit something, do what they can to lessen the impact. I think that alone should be enough.


I agree that humans can't do it in real time. It's impossible. When things happen fast we act instinctively, not with reason.

Why wouldn't we use the ability of machines to interpret massive amounts of data in a split second to improve on this though? Why would we program a machine to be just as fallible as we are, when we can program it to be better?

If you can optimize an algorithm to minimize the loss of human life, shouldn't you do it?

I'd argue every attempt should be made to minimize the loss of human life, and in a tie, favor the occupants of the car, because after all, the occupant is probably its owner. "Who's your daddy," and all that.


All that being said, I don't think this is realistic. Real-life situations are massively more complex than counting lives and picking one outcome, and I doubt the cars will be able to distinguish a human being from any other object that shows up as solid on its radar. I also feel like the scenarios will be way too complex to pre-program decisions for, at least in the foreseeable future. 20 years from now? Who knows. Short term, it's just a matter of a vehicle trying to avoid unidentifiable blips on its radar.
 
To even have a survey like this shows that self-driving cars are going to cause new problems.
 
Driving a car can be fun. Driving a car to work in stop and go traffic with idiots changing lanes almost hitting you is not fun no matter the circumstances.

Depends on the car you own, there is nothing like all the "thumbs up" and compliments you get when you drive a classic muscle car....even when you are just driving to work. :D
 
Depends on the car you own, there is nothing like all the "thumbs up" and compliments you get when you drive a classic muscle car....even when you are just driving to work. :D
uhh yeah, especially when you're driving on the highway and some retard sideswipes your precious little car. Then you'll think about getting a daily commuter car and have your classic muscle car for those midlife crisis moments
 
I agree that humans can't do it in real time. It's impossible. When things happen fast we act instinctively, not with reason.

Why wouldn't we use the ability of machines to interpret massive amounts of data in a split second to improve on this though? Why would we program a machine to be just as fallible as we are, when we can program it to be better?

If you can optimize an algorithm to minimize the loss of human life, shouldn't you do it?

I'd argue every attempt should be made to minimize the loss of human life, and in a tie, favor the occupants of the car, because after all, the occupant is probably its owner. "Who's your daddy," and all that.


All that being said, I don't think this is realistic. Real-life situations are massively more complex than counting lives and picking one outcome, and I doubt the cars will be able to distinguish a human being from any other object that shows up as solid on its radar. I also feel like the scenarios will be way too complex to pre-program decisions for, at least in the foreseeable future. 20 years from now? Who knows. Short term, it's just a matter of a vehicle trying to avoid unidentifiable blips on its radar.

I agree to a point, and I am thinking along the same lines. One, the car isn't really making decisions; it's following programming, the decisions having been made in advance for situations that fit scenarios. Two, we aren't really there yet. And my third is different: once people make these "decisions" in advance, it's on them, no longer on the owner. Maybe the makers will see this as a way to incentivize buying their new toys, "buy our self-driving car and never be found guilty of vehicular manslaughter again." But I don't see it.
 
On back roads through the countryside, sure! It's a blast.

But 99.99% of my time behind the wheel is sitting in bumper to bumper traffic. If self driving cars can eliminate or reduce this incredible time sink, I'm in.

Hell, even if they don't I'm in because at the very least I could do something else instead of staring at the car in front of me.

Would it just be easier to severely curtail human population density than to come up with a super-complex self-driving car? We already have the ability to reduce the human population, so really it's the fiscally responsible choice.

Vote for Prop 354 - Human Depopulation!
 
I still don't understand why you'd want a self-driving car, driving is the fun part of owning a car.
It can be tedious in heavy traffic on your commute, or when driving very long distances on the highway.

Usually I cruise at 80mph when driving long distances in the boonies, because otherwise I can't stay awake. If I had a true autopilot, I'd happily chill and cruise at 60mph while playing Nintendo or something.
Vote for Prop 354 - Human Depopulation!
Would be hilarious if the suicide hotline was just a place to call and get pro-tips about the most successful strategies to kill yourself.

"No sir, you want to put the plastic bag over your head first, then release the helium valve counter clockwise to open. Yes sir, now slip into sweet unconsciousness for your forever nap."
 
uhh yeah, especially when you're driving on the highway and some retard sideswipes your precious little car. Then you'll think about getting a daily commuter car and have your classic muscle car for those midlife crisis moments

Wife just had that happen in her 1971 El Camino. Some girl changed lanes right into her causing thousands worth of (mostly cosmetic) damage. The GOOD news is that there is this thing called "insurance," and it pays for your car to be fixed as good as new. I hear it's gonna be big one day. ;)

(we don't wait for a mid-life crisis, etc. etc. to enjoy our lives. We drive em' like we stole em' and enjoy every minute we get...not just weekends :))
 
My biases are apparently against pets and jaywalkers, and in favor of children.

Honestly though, who would choose to kill a child over an old person? Sorry old person.

And pets? I'm sorry animals, but you simply do not have equal value. I'd kill a billion animals to save even one human being.

Other than that, I almost entirely picked hitting the people crossing on red :p
lol Jaywalkers be damned. None survived my test.
 
To even have a survey like this shows that self-driving cars is going to cause new problems.
This survey is just some moralist BS from a college. It doesn't indicate anything about real self-driving issues.

Edit: I especially like how it draws flawed conclusions as several posters noted above.
 
Wife just had that happen in her 1971 El Camino. Some girl changed lanes right into her causing thousands worth of (mostly cosmetic) damage. The GOOD news is that there is this thing called "insurance," and it pays for your car to be fixed as good as new. I hear it's gonna be big one day. ;)

(we don't wait for a mid-life crisis, etc. etc. to enjoy our lives. We drive em' like we stole em' and enjoy every minute we get...not just weekends :))
Insurance is a hassle. Between deductibles, assigning fault, dealing with the accident report, the other party lying, and dealing with repair shops, it's in general a total waste of time. It would be a lot nicer if accidents didn't happen in the first place.

I was just busting your chops on your choice of car. There's nothing wrong with enjoying it.
 
lol Jaywalkers be damned. None survived my test.
The site was down when I tried to take it, but did it let you clip any people sitting on the curb with their legs in the street? People keep doing this where I drive and it's starting to piss me off.
 
The lawyers are going to have a field day with this the first time a pedestrian is hit by a self-driving car.

I agree, but the likely case will be that the pedestrian is at fault. In an auto vs. pedestrian accident, I'd put the odds on the person being at fault, through jaywalking or jumping out from behind cars without regard for traffic, rather than properly using a crosswalk.
 
Would it just be easier to severely curtail human population density than to come up with a super-complex self-driving car? We already have the ability to reduce the human population, so really it's the fiscally responsible choice.

Vote for Prop 354 - Human Depopulation!

I'd vote for that. Would love to see the human population hold steady at 2 billion. BRING ON THE CULLING!

Edit: If we decreased the population enough, I might be able to afford a house in the city.
 