Four Self-Driving Cars Have Already Been In Accidents

HardOCP News

According to this story, four of Google's self-driving cars have been in accidents since September.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads.
 
Define "getting into accidents"? I was rearended by an uninsured Mexican while sitting at a stop sign, and his defense was that he didn't know I was going to come to a full stop... at a stop sign. Does that count as "getting into an accident"?

1) In two of the four incidents, the car was being driven by the human at the time. So no, we can't count those.
2) In the other two, the speeds were less than 10 mph, and in both cases the Google cars were not at fault; they were hit by human-driven vehicles.

So as Google pointed out, the accidents were caused by human error and inattention on the part of the other drivers, which is exactly the kind of problem autonomous vehicles are designed to address.

When an autonomous vehicle crashes into something itself, or is in an at-fault accident, then critics would have a leg to stand on. But as of right now, it just proves the point that the average person is pretty stupid, and 50% of people are even dumber than that.
 
Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.
and
Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Seems not very newsworthy, but then what is nowadays?
 
Four accidents, two of which occurred under human control, since September. That's quite a good record for a testing phase, in my opinion.
 
Four accidents, two of which occurred under human control, since September. That's quite a good record for a testing phase, in my opinion.
Especially since in both incidents a human driver hit the self-driving car and was at fault.
 
2) In the other two, the speeds were less than 10 mph, and in both cases the Google cars were not at fault; they were hit by human-driven vehicles.

So as Google pointed out, the accidents were caused by human error and inattention on the part of the other drivers, which is exactly the kind of problem autonomous vehicles are designed to address.

This is something everyone seems to forget: accidents are usually avoided by the other human(s) reacting, and the proper reaction varies case by case, despite a lot of people thinking slamming on the brakes is the solution to everything.
 
This fucking fear-mongering headline implies fault with the technology; all 4 accidents were human-caused.

I hate moronic headlines like this, they spread misinformation.
 
As someone who works in sales and logs thousands of kilometres each year, I'll go on the record and say that it's probably a far better idea to let the cars drive themselves than a lot of the human drivers that I see on the road.
 
Actually, this just goes to show how far we still have to go. Two of the four accidents happened while the cars were in autonomous mode, across a fleet of 48 cars. That's a rate of about 4% just since September, on specific roads, under specific conditions, in good weather, at low speeds. That rate would only get higher over a full year, and if these cars were tested in a wider range of driving conditions and locations.

There are approximately 240 million cars on the road in the US, and about 5.1 million accidents are reported in a full year. That's a rate of roughly 2% across all weather, roads, and conditions in the country. Even if you double that rate to account for unreported incidents, you are still looking at statistically fewer accidents given the wider range of conditions and a full year of records.
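
A rough back-of-the-envelope sketch of those two rates in Python, using the figures from this post (the 48-car fleet, the 2 autonomous-mode incidents, and the national totals are this post's numbers, not an official dataset):

# Compare the autonomous-fleet incident rate with the national per-car accident rate.
autonomous_incidents = 2           # accidents while in autonomous mode (since September)
test_fleet = 48                    # self-driving cars on California roads
us_reported_accidents = 5_100_000  # accidents reported in the US per year
us_cars = 240_000_000              # cars on US roads

fleet_rate = autonomous_incidents / test_fleet     # ~0.042, i.e. about 4% in ~8 months
national_rate = us_reported_accidents / us_cars    # ~0.021, i.e. about 2% per year
print(f"fleet: {fleet_rate:.1%}, national: {national_rate:.1%}, "
      f"national doubled for unreported: {2 * national_rate:.1%}")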

Yes, these accidents may have occurred with other non-autonomous vehicles, but that is exactly the scenario self-driving cars will have to operate in. For a very long time, self-driving cars will be sharing the roads with human drivers.

Don't think I am just bashing self-driving cars. I'm really not. But this is a good example of how we aren't really as close to them as people would like to think. Especially the Google-style car with no steering wheel or pedals.

I think we are closer to cars that can autopilot under specific conditions. Maybe 5 to 10 years. Fully autonomous cars, 15 to 20. Maybe....
 
This fucking fear-mongering headline implies fault with the technology; all 4 accidents were human-caused.

I hate moronic headlines like this, they spread misinformation.

This is the same as the video of that moron who accidentally ordered an Xbox online using an Apple Watch. And of course it was the watch's fault. :rolleyes:
 
This fucking fear-mongering headline implies fault with the technology; all 4 accidents were human-caused.

I hate moronic headlines like this, they spread misinformation.

QFT.

It really is just a clickbait article. Even if the cars were at fault, that's still a better track record than most, lol.
 
Autonomous vehicles address most of the issues that cause human accidents ... They follow speed limits, they obey traffic rules, they don't take chances, they are always paying attention, and they are never impaired, etc ... We will ultimately be better off once they are more commonplace.
 
As someone who works in sales and logs thousands of kilometres each year, I'll go on the record and say that it's probably a far better idea to let the cars drive themselves than a lot of the human drivers that I see on the road.

I agree, and that's why this will be our future. It makes me sad as an automotive enthusiast, but there will be no stopping the nanny state... Not that they're wrong in this case; the general public is really, really, really bad at driving.
 
I'm okay with embracing automated cars in the future once they reach maturity. There will be gearheads who refuse to adopt, and that's fine with me, as long as they accept liability for being the cause of any accident they get involved in. Automated cars will probably come with recording cameras by default to provide evidence of who is at fault.

I'd rather put up with an automated car going 60 MPH in non-stop traffic than fluctuating anywhere between 20 and 75 MPH in heavy traffic.
 
I'm okay with embracing automated cars in the future once they reach maturity. There will be gearheads who refuse to adopt, and that's fine with me, as long as they accept liability for being the cause of any accident they get involved in. Automated cars will probably come with recording cameras by default to provide evidence of who is at fault.

I'd rather put up with an automated car going 60 MPH in non-stop traffic than fluctuating anywhere between 20 and 75 MPH in heavy traffic.

You can be liable to some degree for accidents that are not your fault. If someone runs a red light and you don't brake as much as you can before impact, you can be partially liable.

Or, hypothetically, if someone stalls in the middle of an intersection and you had time to stop but didn't, you will be close to, if not actually, 100% liable even though the original turn of events was not your fault.
 
I have no issue with self driving cars. When humans are no longer allowed to drive is when I have a problem with it.
 
The Google cars have been rear-ended seven times, often when stopped "but also on the freeway," Urmson wrote. In other collisions, the cars were side-swiped or "hit by a car rolling through a stop sign." Eight of the 11 collisions were on city streets.
Google must have found the worst place to test their cars.
 
Actually, this just goes to show how far we still have to go. Two of the four accidents happened while the cars were in autonomous mode, across a fleet of 48 cars. That's a rate of about 4% just since September, on specific roads, under specific conditions, in good weather, at low speeds. That rate would only get higher over a full year, and if these cars were tested in a wider range of driving conditions and locations.

There are approximately 240 million cars on the road in the US, and about 5.1 million accidents are reported in a full year. That's a rate of roughly 2% across all weather, roads, and conditions in the country. Even if you double that rate to account for unreported incidents, you are still looking at statistically fewer accidents given the wider range of conditions and a full year of records.

Yes, these accidents may have occurred with other non-autonomous vehicles, but that is exactly the scenario self-driving cars will have to operate in. For a very long time, self-driving cars will be sharing the roads with human drivers.

Don't think I am just bashing self-driving cars. I'm really not. But this is a good example of how we aren't really as close to them as people would like to think. Especially the Google-style car with no steering wheel or pedals.

I think we are closer to cars that can autopilot under specific conditions. Maybe 5 to 10 years. Fully autonomous cars, 15 to 20. Maybe....

That's a lot of writing to say you didn't RTFA. The two accidents when the cars were in autonomous mode were caused by humans. In other words, all 4 accidents can be blamed on humans. That's a 0% at-fault accident rate for self-driving cars in this case.
 
I can't wait for the day when humans can no longer fuck with my commute. I don't want to see an end to "personal" conveyances, but I pine for a mandatory freeway automation system. Humans as a whole are too stupid to drive correctly amongst others.
 
That's a lot of writing to say you didn't RTFA. The two accidents when the cars were in autonomous mode were caused by humans. In other words, all 4 accidents can be blamed on humans. That's a 0% at-fault accident rate for self-driving cars in this case.

I was thinking the exact same thing. I can't believe he wrote that whole thing with no basis at all for what he said. There's a lot of pent-up anger or something there, methinks. He makes a strong case that no one should be allowed to drive because they don't think!
 
You can be liable to some degree for accidents that are not your fault. If someone runs a red light and you don't brake as much as you can before impact, you can be partially liable.

Or, hypothetically, if someone stalls in the middle of an intersection and you had time to stop but didn't, you will be close to, if not actually, 100% liable even though the original turn of events was not your fault.
So in both those situations, it's 100% the human's fault, and the automated car will kick in to avoid the collision or come to a full stop. Like I said, recording will probably come standard with automated cars, so the liability will shift completely over to them. Unlike humans, automated cars don't go "Hmm, I just want to go faster for the fuck of it."
 