Human Error Caused Self-Driving Car Crash

HardOCP News

We've said it a thousand times: the biggest problem with self-driving cars is that they still have to share the road with humans. Or, at least in this case, that they can still be controlled by humans.

A Nissan LEAF electric vehicle fitted with an autonomous driving system owned and operated by Cruise Automation, the company that produces a $10,000 aftermarket self-driving system, crashed into a parked Prius on Jan. 8 in San Francisco after the human driver took driving control away from the vehicle.
 
So it wasn't a self-driving car when the crash occurred. I smell clickbait.
 
The accident occurred when the self-driving LEAF began drifting from left to right within its lane. The driver then reclaimed driving control from the car. At that point, "the operator did not change the path of the vehicle and it collided with an unoccupied Toyota Prius,"
To be more accurate, the self-driving car failed and needed its human driver to take over and save it, which unfortunately they could not do.
 
I always wonder how well an autonomous vehicle would detect worn/unpainted lanes, or find its way in the snow.
 
I always wonder how well an autonomous vehicle would detect worn/unpainted lanes, or find its way in the snow.

Currently, they don't. GPS isn't accurate enough; accuracy is "within 3.5 meters." The car needs to rely on its local sensors for the pinpoint stuff. They hope to work past these issues, but there is still a long way to go.
 
Currently, they don't. GPS isn't accurate enough; accuracy is "within 3.5 meters." The car needs to rely on its local sensors for the pinpoint stuff. They hope to work past these issues, but there is still a long way to go.

Survey-grade GPS receivers are actually capable of much better than that right now (with a real-time correction signal). But self-driving cars could never rely solely on GPS to find their way: you need at least 4 satellites with good line of sight to get any kind of position from a GPS system, and at least 7-8 to get a reliable position. That's never gonna happen in a wooded area, or on narrow city streets between tall buildings.
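
(Not from the article, just to make the point above concrete: a minimal sketch of the kind of sanity check a localization stack might run before trusting GPS over local sensors. The GpsFix fields, thresholds, and function names here are all made up for illustration, not any real car's API.)

# Hypothetical sketch: decide whether a GPS fix is good enough to trust,
# or whether localization should fall back to local sensors (lidar/camera).
from dataclasses import dataclass

@dataclass
class GpsFix:
    satellites: int        # satellites in view with a usable signal
    h_accuracy_m: float    # estimated horizontal error, in meters

# Illustrative thresholds, echoing the numbers in the posts above:
# 4 satellites is the bare minimum for a fix, 7-8 for a reliable one,
# and lane-keeping needs error well under a lane width (~3.5 m).
MIN_SATS_RELIABLE = 7
MAX_ERROR_FOR_LANE_KEEPING_M = 0.3

def gps_good_enough(fix: GpsFix) -> bool:
    return (fix.satellites >= MIN_SATS_RELIABLE
            and fix.h_accuracy_m <= MAX_ERROR_FOR_LANE_KEEPING_M)

if __name__ == "__main__":
    city_canyon = GpsFix(satellites=5, h_accuracy_m=3.5)    # typical uncorrected GPS
    rtk_open_sky = GpsFix(satellites=9, h_accuracy_m=0.02)  # survey-grade with correction
    for fix in (city_canyon, rtk_open_sky):
        source = "GPS" if gps_good_enough(fix) else "local sensors (lidar/camera)"
        print(f"{fix}: localize with {source}")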
 
To be more accurate, the self-driving car failed and needed its human driver to take over and save it, which unfortunately they could not do.

So, the computer-controlled car put the driver in a situation they couldn't handle, then blamed it on the person? I can see companies using this when they are defendants in lawsuits.
 
This sort of thing has cropped up before, and it has always been due to human error.
-HAL 9000
 
Lol.. Why does it not surprise me that both a LEAF and a Prius were involved in this crash? Hard to find blame here.. being a Prius, it was probably just 'parked'..... in the middle of the freeway.
 
I remember reading about the Nissan system; they're behind Google and Tesla on self-driving tech if I recall, by a few years perhaps.

http://arstechnica.com/cars/2016/01...ver-data-on-glitches-and-failures-to-the-dmv/

Google: 341 disengagements (labelled significant) - 424,000 miles of driving.
Nissan: 106 disengagements - 1,485 miles on public roads in California.
Delphi: 405 disengagements - 16,662 miles.
Tesla: 0 disengagements - ? miles.

Interesting breakdowns and details in article as well.
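
(Quick back-of-the-envelope math on those figures, not from the article: normalizing to disengagements per 1,000 miles makes the gap easier to see. Tesla is left out because its mileage isn't given.)

# Rough normalization of the reported disengagement figures to a common
# per-1,000-mile rate. Tesla is omitted: 0 disengagements over an unknown
# number of miles can't be turned into a rate.
reports = {
    "Google": (341, 424_000),
    "Nissan": (106, 1_485),
    "Delphi": (405, 16_662),
}

for company, (disengagements, miles) in reports.items():
    rate = disengagements / miles * 1_000
    print(f"{company}: {rate:.1f} disengagements per 1,000 miles")

# Roughly: Google ~0.8, Delphi ~24, Nissan ~71 per 1,000 miles.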
 
Why is it that when a human-driven car crashes, 'something' went wrong with the car or signal light, but when a computer-driven car crashes, 'something' went wrong with the human or signal light?
 
Why is it that when a human-driven car crashes, 'something' went wrong with the car or signal light, but when a computer-driven car crashes, 'something' went wrong with the human or signal light?


Haha, true. Trying to spread the blame, I guess.
 
I remember reading about the Nissan system; they're behind Google and Tesla on self-driving tech if I recall, by a few years perhaps.

http://arstechnica.com/cars/2016/01...ver-data-on-glitches-and-failures-to-the-dmv/



Interesting breakdowns and details in article as well.

Trusting self-assessments reminds me of this
[image: lhpUDCY.jpg]
 
When these become more commonplace, I'm going to build multiple spinning LIDAR turrets on my car and spam it out in all 360 degrees. One on each corner of the car, then 4 more on the roof at the corners. Maybe it'll confuse the crap out of these autonomous cars.
 
Clearly, I don't know what was going on at the time of the crash.
...
Generally, however, the incident illustrates that humans are really not very good at driving.

Top kek.
Archive because Mashable. https://archive.is/d0Slp
 