MIT Publishes Self Driving Morality Survey Results

AlphaAtlas, [H]ard|Gawd, Staff member
Joined Mar 3, 2018. Messages: 1,713
MIT opened up their "Moral Machine" project over 2 years ago, asking participants to rate various accident scenarios self driving cars could run into. For those who want to take the survey, the site is still up. MIT published an analysis of their current data in Nature, and the results are interesting. For example, most people tend to spare females over males, the young over the old, and pedestrians over passengers. Some preferences were universal, while others, like sparing those with a higher social status, were strongly influenced by cultural background.

Our data helped us to identify three strong preferences that can serve as building blocks for discussions of universal machine ethics, even if they are not ultimately endorsed by policymakers: the preference for sparing human lives, the preference for sparing more lives, and the preference for sparing young lives. Some preferences based on gender or social status vary considerably across countries, and appear to reflect underlying societal-level preferences for egalitarianism.
 
they all chose not to spare the cat, 100% agree with this poll now even though i don't really care, lol.
 
Lol this whole autonomous driver morality thing is so beyond stupid. The fact is there will be so many fewer accidents that when one occurs it will be a tragedy, unlike today, where accidents are so common they're a statistic.
 
I only looked at the pretty pictures, but I would like to know how the autonomous vehicle is supposed to differentiate between a doctor/athlete/criminal/etc. I understand the interest for the sake of the 'morality' study, but if this is intended to be of benefit to the AI in autonomous vehicles, I don't see the point in separating careers.
 
I forgot about this, wasn't there a discussion on how AI will tell the difference on who is homeless and a criminal?
 
The correct answer is spare MY life and the lives of those in my car. That is the closest we can come to what a normal driver would do. Anything else is social engineering. SCREW "women and children first." That is sexist.
Bingo, no one is going to buy a car that won't protect them in case some idiot wanders onto the road.
 
Huh. I thought the title said “mortality” study. That would’ve been much more interesting...
 
This is funny, a self driving car SHALL be held to a higher standard than a human driver.

The correct answer (for a human) to all of those cases is stomp on the brake and hope the car stops before it hits anything!

You are actually describing how a human reacts. In essence we don't have this problem, since we lack the ability to evaluate, consider, and take avoiding action before we have actually crashed. Computers can, but the ironic twist is that another human, not actually involved, must program it into the software before it has ever happened.

These sorts of legal issues are far more likely to delay autonomous cars than the technology to do it.
 
I'm very dubious, to say the least, of their data interpretation. I took the quiz quite a while ago, and they presented my "decision summary"; it had nothing to do with why I made the decisions. Maybe with a significant number of participants they can tease out underlying patterns in the data, but their analysis of me as an individual was completely wrong. For example, why are cats the most targeted participant? Is it because people as a group don't like cats, or is it because they happened to appear mostly in scenarios where they got balanced against some number of human lives? Did they set up scenarios where the options are kill people, kill a dog, or kill a cat? If you don't have a direct comparison between the overall groupings, then you can't assign a relative weight to them that is meaningful.

They really needed this to be an interview process where they asked people to describe their decision criteria, not try to divine those criteria from the results. Even a simple yes/no (and explain) at the end of the quiz would have made this far more robust.
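To illustrate the confound being described (this is a toy sketch of my own, not MIT's actual analysis): if one group only ever appears opposite a strongly preferred group, a naive "spared rate" makes it look maximally disliked, even when the chooser is genuinely indifferent to that group. The scenario mixes and the chooser's rule below are made up for demonstration.

```python
import random

random.seed(0)

# Hypothetical preference: humans outrank animals, animals tie with each other.
RANK = {"human": 1, "dog": 0, "cat": 0}

def choose(a, b):
    # The chooser always spares the higher-ranked side,
    # and picks at random when the ranks tie (true indifference).
    if RANK[a] != RANK[b]:
        return a if RANK[a] > RANK[b] else b
    return random.choice([a, b])

def spared_rate(trials):
    # Fraction of appearances in which each group was spared.
    spared, seen = {}, {}
    for a, b in trials:
        w = choose(a, b)
        for s in (a, b):
            seen[s] = seen.get(s, 0) + 1
        spared[w] = spared.get(w, 0) + 1
    return {k: spared.get(k, 0) / seen[k] for k in seen}

# Mix 1: cats only ever appear opposite humans.
# The naive spared rate makes cats look maximally "targeted".
print(spared_rate([("human", "cat")] * 1000))  # cat: 0.0, human: 1.0

# Mix 2: direct dog-vs-cat matchups reveal the chooser
# never actually cared about cats vs. dogs at all.
print(spared_rate([("dog", "cat")] * 1000))    # both near 0.5
```

The same chooser produces a 0% spared rate for cats in one scenario mix and roughly 50% in the other, which is exactly why a relative weight is only meaningful when the groups are directly compared.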
 
I think I remember this. If I recall, I chose for the car to kill all the old people, rape the men, cat call the women, and back over the rich people. Kinda like GTAV.
 
Lol this whole autonomous driver morality thing is so beyond stupid. The fact is there will be so many fewer accidents that when one occurs it will be a tragedy, unlike today, where accidents are so common they're a statistic.


I was right with you for the first sentence. The day might come when the rest of what you said will be true. In the meantime, we're just going to have to get used to this.

 
I think I remember this. If I recall, I chose for the car to kill all the old people, rape the men, cat call the women, and back over the rich people. Kinda like GTAV.


So I get killed, raped and backed over ..... feeling a little singled out here :ROFLMAO:


Could I buy some good will by starting a Go-Fund-Me in your name?

I'll tell them you're poor ... and pregnant ... and need to turn your life around.

Ben Shapiro has helped you see the world as it really is ... you'll wear a yarmulke :whistle::sneaky::D


 
In many European countries you couldn't choose whom to kill in an accident based on anything to do with age, gender, etc., since it would technically break discrimination laws.
 
In many European countries you couldn't choose whom to kill in an accident based on anything to do with age, gender, etc., since it would technically break discrimination laws.


I should be breaking something. The entire thing is morally arrogant beyond belief.

Look at it this way. If you do anything other than try to avoid the accident entirely or minimize the severity of the collision for the vehicle, thereby hopefully minimizing risk for others, then you are basically choosing who to save, and by choosing who to save, you are in effect, choosing who to kill.

That's the very definition of premeditated homicide. These people, in their arrogance, are contemplating writing code that chooses who will live and who will die, and that is fundamentally wrong. And you cannot argue against this if you are assigning weights and values to lives based on any criteria.

In the US (I can't speak for other countries), if you have an accident and it results in a fatality, you will go through a process to determine whether someone did something wrong that resulted in a death. If a driver is found at fault, they will likely be charged with a criminal offense, typically something like involuntary or negligent manslaughter. But if it were shown that someone chose to hit someone, the charge could be murder. By extension, for someone who programmed a vehicle to make that choice, premeditation would not be off the table.

These arrogant fools are so close to the math that they can't see the picture.
 
So MIT basically played a game of "would you rather": would you rather run into one group or another, would you rather hit a group of people or run straight into a wall. Why "just fucking stop" isn't an answer seems silly. Just a study to see if the axiom of "women and children first" still holds true, or "fuck that, let's make sure I am safe".
 
So MIT basically played a game of "would you rather": would you rather run into one group or another, would you rather hit a group of people or run straight into a wall. Why "just fucking stop" isn't an answer seems silly. Just a study to see if the axiom of "women and children first" still holds true, or "fuck that, let's make sure I am safe".

If you were stopped on the train tracks, about to be hit by a train, and the only thing stopping you was going forward over a stroller, or backwards over an old lady, which would you choose?
 
If you were stopped on the train tracks, about to be hit by a train, and the only thing stopping you was going forward over a stroller, or backwards over an old lady, which would you choose?

I'd choose not to be an idiot to get into such a situation in the first place. I'd also like to think the mother / father of the baby would be arrested for child endangerment.
 
If you were stopped on the train tracks, about to be hit by a train, and the only thing stopping you was going forward over a stroller, or backwards over an old lady, which would you choose?
Why are people crossing there? Why did I stop exactly when the train was coming? Why can't I just get out of the car? Do I know if there is a baby in the stroller or some homeless dude's shit?
 
The correct answer is spare MY life and the lives of those in my car. That is the closest we can come to what a normal driver would do. Anything else is social engineering. SCREW "women and children first." That is sexist.

Pretty much this. If 5 kids jump out in front of my car and I'm following all of the traffic laws, I'll slam on the brakes, but I'm not going to kill myself because someone else was negligent.
 
If you were stopped on the train tracks, about to be hit by a train, and the only thing stopping you was going forward over a stroller, or backwards over an old lady, which would you choose?

Forwards. Both I (as in walking/running) and my vehicle move faster in that direction. Moving out of the way faster has a better chance of a positive outcome for me. The stroller / old lady don't come into my decision, i.e. you could swap them and I'd still say forwards.
 
Forwards. Both I (as in walking/running) and my vehicle move faster in that direction. Moving out of the way faster has a better chance of a positive outcome for me. The stroller / old lady don't come into my decision, i.e. you could swap them and I'd still say forwards.
Plus, if you want to be technical, going toward the stroller is more likely to push it out of the way; the old lady is more likely to fall over and scream and make you second-guess as you roll over her limp body.

The true answer is going forward, because a car coming off the tracks has the right of way, even over pedestrians, specifically because of what-if scenarios like this.
 
I was right with you for the first sentence. The day might come when the rest of what you said will be true. In the meantime, we're just going to have to get used to this.



That's the next year or two. Fully autonomous is coming very fast
 
You are not supposed to brake for a cat here, but a dog, on the other hand, you are supposed to try not to run over. Of course, most people just stomp the brake and pray, especially now with ABS, which I dislike with a passion since I am old school.
In 1993 or so, going home from work at 5 in the morning, I was doing 120 or so on an 80 highway as usual, and this big-ass badger ran out in front of my car, and all I could do was not crash as the thing compacted under my car and probably launched the little 3-door Opel too.
Sum bitch ruined my front spoiler (factory spoiler, of course; only idiots put upgrade plastics on their car), and I had to throw 3 times to get the bigger parts of the badger off the road.

What infuriates me a whole lot is people running over an animal and not even stopping. Twice now I have had to stop to make sure / clean up, and one time I even had to finish off the poor cat.
 
You are not supposed to brake for a cat here, but a dog, on the other hand, you are supposed to try not to run over. Of course, most people just stomp the brake and pray, especially now with ABS, which I dislike with a passion since I am old school.
In 1993 or so, going home from work at 5 in the morning, I was doing 120 or so on an 80 highway as usual, and this big-ass badger ran out in front of my car, and all I could do was not crash as the thing compacted under my car and probably launched the little 3-door Opel too.
Sum bitch ruined my front spoiler (factory spoiler, of course; only idiots put upgrade plastics on their car), and I had to throw 3 times to get the bigger parts of the badger off the road.

What infuriates me a whole lot is people running over an animal and not even stopping. Twice now I have had to stop to make sure / clean up, and one time I even had to finish off the poor cat.


fuck that, I don't stop for cats.. either the cat's smart and runs away, or it's dumb and deserves to die.. dogs are dumb and don't give a shit. they also tend to be larger and more likely to cause damage to my car, so I stop for them.. ;)
 
Oh... Morality survey. Read the title as 'Self Driving Mortality survey results' and wondered about all the ethical conundrum chatter.

Waiting for the definitive Runaway Self-Driving Trolley results and whether these AI Morally Superior processors can willfully avoid an untimely bug splat without dinging the chrome.
 
Apparently brakes on self driving cars suck and fail all the time, have a zero star crash rating, and they drive way too fast. /s

For most of these scenarios how about option 3: steer towards the side barriers in the pictures to slow the car while flashing the lights and sounding the horn to warn of a malfunction and danger? Let those who can save themselves do so.

Also, I like how the animals crossed on the signal and the people didn't. Apparently someone thought legal vs. illegal crossing might sway someone more than animal vs. person. Also, can the AI differentiate between an animal and, say, a baby crawling?
 
Apparently brakes on self driving cars suck and fail all the time, have a zero star crash rating, and they drive way too fast. /s
Don't forget they always drive in lanes where concrete barricades block the road.

Forget morality, this is the "why would anyone get in a self driving car if it drove like that" test.
 
This is funny, a self driving car SHALL be held to a higher standard than a human driver.

The correct answer (for a human) to all of those cases is stomp on the brake and hope the car stops before it hits anything!

Hoping is for deluded humans. So is assuming the car can always brake in time.

Braking hard is the easy answer but not always the best answer. I've avoided accidents by running up on lawns and shoulders where braking would not have saved me.
 