If A Self-Driving Car Kills A Pedestrian, Who Is At Fault?

HardOCP News

I predict that this is THE issue that is going to keep self-driving cars off the road for a long time. Who is at fault in a minor accident? If I wasn't driving, or even in the driver's seat, how can I be at fault if my car ran someone over? Is my insurance going to cover an accident caused by a bad sensor? And so on and so forth.

And, if we’re talking about fully autonomous vehicles, we’re also talking about some infrastructure that allows them to operate. It’s not going to just be the car, it’s also going to be all the broader systems within the city that allow the car to navigate. So [we could] see accidents where the blame is really on the public infrastructure that did not allow the car to make a correct decision.
 
It is so damn funny how close the movie "Demolition Man" is coming to reality, in more ways than one.

Michael Bloomberg IS the Evil Dr. Cocteau, after all:


Evil Dr. Cocteau ........ Michael Bloomberg

Be Well
 
Well, it would seem obvious that the "driver" in charge is the developer of the self-driving AI system / the car manufacturer. That would also give them an incentive to make it as close to perfect as possible, with as many backup sensors and independent systems as they can. Then again, if a vital component that is crucial for the AI breaks, that would be no different than a fatal design flaw in, say, the brakes of a car a human is driving.
 
I can see scammers targeting these cars. The scams will fail if the makers do their job right, leaving only very dangerous situations viable for scamming, and even then it's risky versus the reward (100 km/h+ crashes, etc.).

Thing is, they'll just go the way of Russia, which has already dealt with this, just with non-autonomous vehicles: dashcams become mandatory.
 
This is just IMO, but if you're in the driver's seat, you are responsible, unless there is a manufacturer defect, like a runaway vehicle that was out of your control (the Toyota Prius issue).
I bet you (the human) would always have override control.

Now, if it gets to self-driving where you are in the back seat with no input possible (I don't see this happening anytime soon), then I guess it would fall back on the manufacturer, or just your insurance if the odds of it happening were really low.
If self-driving really does help safety, insurance technically "should" go down.
 
I can see scammers targeting these cars. The scams will fail if the makers do their job right, leaving only very dangerous situations viable for scamming, and even then it's risky versus the reward (100 km/h+ crashes, etc.).

Thing is, they'll just go the way of Russia, which has already dealt with this, just with non-autonomous vehicles: dashcams become mandatory.

and make a sport out of it/gambling like the running man but with cars
 
General insurance should cover accidents in principle, i.e., a general taxation pool or an industry insurance scheme paid for by, e.g., a tax on car purchases or road licences.
All participants should then be subject to regulatory oversight and control. The (smaller number of) accidents should be investigated by an industry or government body, and the causes dealt with appropriately. E.g., did components fail because they didn't meet industry quality standards in the first place? Was that installation permitted by the internal culture at the company (e.g., the emissions scandal)? Then fines / new laws / new tests, etc. A little bit like how air accidents are treated at the moment.

Taking the current model and trying to force it on the new world is mindbogglingly backward thinking.
 
Google and the other tech/car companies will spend enough money in Washington to make sure they aren't held responsible for anything their cars do.
 
The self driving pedestrian?

Seriously why is this a question? I assume there will be an investigation by the police just as there would be with a regular driver. And with all the data from the car it would be pretty easy to determine the blame.
 
This isn't even a real problem; there isn't a "yours, mine, and the truth" when one side of the story has a million cameras and sensors running constantly.

It will be just like it is now, the parties in the accident will have to use their own insurance unless there is gross negligence or unlawful intent.

Of course the car will have recorded every millisecond of the "accident" from at least a half-dozen angles so any attempted fraud will be easily caught.

Honestly, there isn't a single motor-vehicle situation or circumstance that isn't improved by self-driving cars. With the exception of the joy of driving a great car, of course.
 
This isn't even a real problem; there isn't a "yours, mine, and the truth" when one side of the story has a million cameras and sensors running constantly.

It will be just like it is now, the parties in the accident will have to use their own insurance unless there is gross negligence or unlawful intent.

Of course the car will have recorded every millisecond of the "accident" from at least a half-dozen angles so any attempted fraud will be easily caught.

If you own the car, you will have to have a "self-driving car" insurance policy.
The cost of the policy will be based mostly on the driving record of the brand/model instead of the owner's driving record.
If the crash was due to a software error, then the insurance company might go after the car manufacturer to be reimbursed.

However, if the accident is due to poor maintenance or illegal modifications (software changes, lowering or raising the car so the sensors no longer work correctly, etc.), then the insurance company will likely have a clause that nullifies the insurance, and the owner would be liable.

If the car is owned by a company like Uber, and you just call when you need a ride, then you don't have to worry about insurance.
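The pricing idea above can be sketched as a toy calculation, where the premium follows the model's fleet record instead of the owner's, and an illegal modification voids the policy. All numbers, rates, and field names here are made up purely for illustration, not anything a real insurer uses:

```python
def premium(base_rate, model_accidents_per_million_km, modified=False):
    """Hypothetical annual premium: scales with the model's fleet accident
    rate; illegal modifications void the policy (returned as None)."""
    if modified:
        return None  # policy clause nullified, so the owner is liable
    # made-up surcharge: 2% of the base per accident per million fleet-km
    return round(base_rate * (1 + 0.02 * model_accidents_per_million_km), 2)

print(premium(500.0, 3))                  # clean model record -> 530.0
print(premium(500.0, 10))                 # worse fleet record -> 600.0
print(premium(500.0, 3, modified=True))   # lowered car, bad sensors -> None
```

The point of the sketch is just that the risk input is the model's fleet-wide record, so every owner of the same car pays a similar rate until they void the policy themselves.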
 
If the blame goes on the developer of the software, then holy shit, you would not want to touch the development of these things ever.

It should be standard insurance; the owner / "driver" is to blame. That covers the fact that no one is in the car, too. But I see these having to be black-boxed: get sensor readings / fault data from them and see if it's a sensor, a poor road, or even an atmospheric fault. Or the fact that the pedestrian was too busy looking at their phone to watch what was around them. Some hits are just unavoidable.
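The black-box idea above amounts to a fixed-size ring buffer that continuously records sensor frames, so the moments before any impact are always available afterwards. A minimal sketch, assuming made-up frame fields and capacity (no real manufacturer's format):

```python
from collections import deque

class BlackBox:
    """Toy event recorder: keeps only the most recent N sensor frames."""

    def __init__(self, capacity=600):  # e.g. 60 s at 10 frames/s
        self.frames = deque(maxlen=capacity)  # old frames drop automatically

    def record(self, timestamp, speed_kmh, brake_applied, sensor_ok):
        self.frames.append({
            "t": timestamp,
            "speed_kmh": speed_kmh,
            "brake": brake_applied,
            "sensor_ok": sensor_ok,
        })

    def dump_last(self, n):
        """Return the n most recent frames, oldest first, for investigators."""
        return list(self.frames)[-n:]

box = BlackBox(capacity=5)
for t in range(8):  # frames older than the capacity fall off the front
    box.record(t, speed_kmh=50 - t, brake_applied=t > 5, sensor_ok=True)
print([f["t"] for f in box.dump_last(3)])  # -> [5, 6, 7]
```

Constant memory, always-on recording is the design choice: after a crash you dump the tail of the buffer instead of sifting through hours of footage.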
 
Of course the car will have recorded every millisecond of the "accident" from at least a half-dozen angles so any attempted fraud will be easily caught.
Yes, criminals will never find a way to subvert these new technologies for their own personal gain. It's foolproof! What could possibly go wrong?
 
The company...
This isn't high end philosophy, if a company advertises that the car can drive itself without issue, then it's responsible for the issue.
 
If you own the car, you will have to have a "self-driving car" insurance policy.
The cost of the policy will be based mostly on the driving record of the brand/model instead of the owner's driving record.
If the crash was due to a software error, then the insurance company might go after the car manufacturer to be reimbursed.

However, if the accident is due to poor maintenance or illegal modifications (software changes, lowering or raising the car so the sensors no longer work correctly, etc.), then the insurance company will likely have a clause that nullifies the insurance, and the owner would be liable.

If the car is owned by a company like Uber, and you just call when you need a ride, then you don't have to worry about insurance.
Exactly...

Ultimately, I suspect it will be the owner of the car. Early adopters will have chosen to buy a self-driving car as opposed to a manually controlled car. By making that choice, they will assume responsibility. I can guarantee you that lawyers will write the purchase contract in such a way as to protect the manufacturer.

I don't expect self-driving cars to be viable for a long, long time...
 
The company...
This isn't high end philosophy, if a company advertises that the car can drive itself without issue, then it's responsible for the issue.

No car company is going to sell you a car that says anything close to "without issue". No current car maker will guarantee their cars are without defects, and self-driving cars won't be any different. People will sue anyone for anything if they think there's money in it for them.

Until this is actually tried in court (Supreme), I suspect we'll all still be required to insure our own vehicles... at least until we get to the point where there aren't manual controls available in vehicles anymore. By then, how the system works may be different.
 
We can't even get Bluetooth or HDMI to work 100% with all advertised features. Self-driving cars are going to be a shit show for the next 20 years at minimum.
 
If someone hasn't already said this then here it is.

We already have no-fault resolution of accidents and even traffic-related deaths.

Hell, sometimes the cops don't even show up.

Fault is not the issue, and autonomous cars are going to be the end of fault as a consideration. The question will no longer be resolved by determining fault, it will simply be a matter for the insurance companies to resolve and that is all it's going to be. If there is a problem with AI code for cars or for information and communications systems that fail to deliver timely data to the cars, then it will simply be a civil court case between companies.

Now if you think you have a situation where someone was hurt or killed involving an autonomous car, a wrongful death case is how it will be resolved, if the TOS you agreed to allows it (y)
 
The self driving pedestrian?

Seriously why is this a question? I assume there will be an investigation by the police just as there would be with a regular driver. And with all the data from the car it would be pretty easy to determine the blame.

You are missing the point. The question is if the car is 100% at fault (not the pedestrian), who is responsible if the "driver" is sitting in the back seat of a hypothetical 100% self driving car.

Google and the other tech/car companies will spend enough money in Washington to make sure they aren't held responsible for anything their cars do.

Partially agree. The logical end result will be nationalized auto insurance we all pay into; at a micro level, who's "at fault" won't matter anymore. At a macro level there will (should) be data kept on incidents and penalties handed out to auto manufacturers with an accident rate exceeding what is considered acceptable. Lobbying will likely help push this along, but it's the only way this would work.
 
If it's your car parked on someone, it's your problem. When you engage autonomous driving, that states that you know what you are doing and that there are inherent risks involved. Read the fine print of ownership and use.
 
You are missing the point. The question is if the car is 100% at fault (not the pedestrian), who is responsible if the "driver" is sitting in the back seat of a hypothetical 100% self driving car.



Partially agree. The logical end result will be nationalized auto insurance we all pay into; at a micro level, who's "at fault" won't matter anymore. At a macro level there will (should) be data kept on incidents and penalties handed out to auto manufacturers with an accident rate exceeding what is considered acceptable. Lobbying will likely help push this along, but it's the only way this would work.

Pretty much what I was saying. I mean it's already working this way in some regions and that's with humans behind the wheel.
 
I don't see an issue. If the pedestrian is killed on the road, it's the fault of the pedestrian, no matter the age; if the pedestrian is killed on the sidewalk, then it's the fault of the car.
 
Yes, criminals will never find a way to subvert these new technologies for their own personal gain. It's foolproof! What could possibly go wrong?

This falls under "fabricating conundrums because I fear the thing". It will be much more difficult to alter the machine record, and altering it WILL leave evidence.

It's far easier to run a dude down and just lie about it.

Nefarious autonomous vehicle hacking syndicates are going to have a much tougher time of it than a driver that lies about running down Judy Jogger in the park.

Every aspect of this situation is improved by an autonomous vehicle over a driver.
 
Unfortunately, autonomous cars are going to come to a (figurative) screeching halt as soon as "little Timmy" gets killed by one. Never mind that the alternative, keeping human drivers on the road, will mean that little Billy, little Jimbo, and little Willie *all* die.

Gotta love emotion over logic.
 
An embedded dashcam that flicks on when a potential collision is detected to record the incident, a judgment made from there, and then the insurance takes over; not exactly rocket science. You're basically just moving from a platform of individual liability on behalf of the driver to something a little more generic.
 
Pedestrians are always at fault. I'd sue them for damaging my car, too.
 
With the data recording these cars should have (I expect 360-degree video recording, sensor input recording, and computer action recording), fault should be relatively easy to determine.
 
the dumbass looking at his phone when he should be watching the road

shit will happen, but it still will be better than what happens today
 
It'll probably be handled the same way a vehicle malfunction is handled now. If the car bugs out and it's a design flaw, recall and sue the manufacturer; if it's because some dumbass covered a sensor with a Beanie Baby, no one, because by the time everyone figures it out, no one cares anymore.
 
I've seen how people drive nowadays. The less American motorists are actively involved in the process the safer the streets will be as far as I'm concerned.
Well, nowadays, people in America are less and less American. Having lived all over, I'd say Americans really aren't that bad as drivers. But you have a lot of people coming over from places that have no real driving culture and jumping behind the wheel here, and it's scary.

Try crossing a crosswalk in Thailand, India, Malaysia, or China. Heck, I remember in rural Malaysia I was asking, "Who has right of way when this street isn't big enough for two cars?" "Biggah cah," he said, and sure enough he hastily pulled over as a huge dump truck came hurtling by at 60 mph straight at us down a dirt road, and couldn't have stopped or swerved if he wanted to, lol! A big contrast from rural Texas, where everyone is super friendly and pulls over onto the shoulder to make it easier for you to pass, and waves. And then we get to the city, and everything is a mere suggestion, and everywhere we went, whenever there was traffic, the shoulder was always another dedicated lane in their opinion.

The problem with self-driving cars, though, is that typically our poorest residents (low income brackets from broken families that don't teach altruism, or our elderly on fixed incomes who were good drivers back in 1942, but not so much today) are also, by and large, the worst drivers in the US, and vice versa. So you'd only end up replacing our best drivers with automation, UNLESS you implement something like Uber was considering, where the majority of people no longer own a car and just use a self-driving taxi or bus to get around.

That way, you can make driving a true privilege and REALLY police it hardcore, and require people to take really extensive $4K three-weekend driving safety and education courses, the same as getting, say, a pilot's license.
 
I can see scammers targeting these cars.
The sophisticated cameras and sensors would capture the scammer's intentions. But yeah, we forgot to say "the pedestrian is at fault". There are a lot of idiots out there who try their hardest to win a Darwin Award, and their families shouldn't collect a penny if they were hit due to their own stupidity.
 
It is actually simple: would you get into an elevator knowing that it may go into free fall if you don't intervene? If any government is serious about autonomous driving, then they should first fine car manufacturers millions for each false advertisement and bring third-degree murder charges against those who advertise autonomous driving as safe / better than a human. If it is better than a human, then why allow the human to intervene?

What we have now is a glorified cruise control that is much safer than the old cruise control. It can get better, but until warranties include at-fault accidents, it is just cruise control.
 
It is not "better than a human"; it is statistically better than the average human. There is a gap of understanding there.
 
It is not "better than a human"; it is statistically better than the average human. There is a gap of understanding there.
In some ways it can be better than a human, by vehicle-to-vehicle communication for example.

Even without it, this Tesla noticed the cars braking ahead and predicted the accident before the driver even saw it, since his forward view was obscured by the vehicle in front of him:
 
Is a train at fault for plowing through a stopped car on the tracks?
 