Uber's Software Failed In Fatal Arizona Crash According to NTSB

Discussion in '[H]ard|OCP Front Page News' started by Montu, May 24, 2018.

  1. Montu

    Montu [H]ard DCOTM x4

    Messages:
    7,640
    Joined:
    Apr 25, 2001
    A couple of months ago we began reporting on a fatal crash in Arizona involving an Uber self-driving car and a pedestrian. Dashcam video was released, and it looked like the area where it happened was dark. However, the NTSB has released a preliminary report that lays the fault for the accident on the software, not the sensors in the system. They say the sensors detected the pedestrian in sufficient time to stop, but the software became confused. The bottom line is Uber screwed the pooch. I guess they knew what really happened, because they settled with the family very quickly. The preliminary report can be found here.

    The problem was that Uber's software became confused, according to the NTSB. "As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path," the report says.
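    To see what that flip-flopping does to path prediction, here is a minimal sketch (hypothetical names and structure, not Uber's actual code) of a tracker whose motion estimate resets every time the object's class changes:

        from dataclasses import dataclass, field

        @dataclass
        class Track:
            label: str                                    # "unknown", "vehicle", "bicycle", ...
            history: list = field(default_factory=list)   # past (x, y) positions under this label

        def update(track: Track, label: str, position: tuple) -> Track:
            if label != track.label:
                # Reclassified: the old motion history is discarded, so the
                # predicted travel path starts over from a single point.
                return Track(label=label, history=[position])
            track.history.append(position)
            return track

        def predicted_velocity(track: Track):
            # Fewer than two observations under the current label means no
            # velocity estimate, and hence no predicted future path.
            if len(track.history) < 2:
                return None
            (x0, y0), (xn, yn) = track.history[0], track.history[-1]
            steps = len(track.history) - 1
            return ((xn - x0) / steps, (yn - y0) / steps)

    Feed that tracker a detection whose class changes every frame and predicted_velocity never returns anything useful, which matches the report's "varying expectations of future travel path."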
     
  2. blade52x

    blade52x 2[H]4U

    Messages:
    2,918
    Joined:
    Sep 9, 2006
    Ignoring the careless driver falling asleep at the wheel, I guess the big question here is: would this have happened anyway without the autonomous software? The video feed may not really represent what a person actually sees, but given the darkness, the potential glare, and the pedestrian showing up ~1 second before impact, it looks like this was an accident either way. The sensors and software will only get better and prevent things like this from almost ever happening.
     
  3. vegeta535

    vegeta535 2[H]4U

    Messages:
    2,111
    Joined:
    Jul 19, 2013
    She was playing on her phone just before she struck the pedestrian. Still, I don't think it is possible to determine if the driver could even have seen the pedestrian in those conditions. People also need to be held accountable, but kids these days are raised to believe it is never their fault. That is what will hold back self-driving tech for years. People don't see past their own little world. I'm honestly surprised I haven't hit anyone in my 20 years of driving. People just like to randomly walk into the middle of the street. I once almost creamed a girl who was standing on a median strip looking at her phone. If I hadn't been watching and taken some precaution, I would have killed her, because she stepped off the strip into the street without ever taking her eyes off her phone.
     
  4. Spidey329

    Spidey329 [H]ardForum Junkie

    Messages:
    8,728
    Joined:
    Dec 15, 2003
    So it still classified it as an object (alternating between various things) and decided the best path was through that object?

    Yeah, I'd say that's a software failure. They have a fault in their logic tree somewhere if the car decides the best course of action is to hit an unknown object.
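    A conservative fallback would look something like this sketch (purely hypothetical, assuming a simple classify-then-plan loop; the report doesn't describe Uber's planner in this detail):

        from dataclasses import dataclass

        @dataclass
        class Detection:
            label: str            # classifier output
            confidence: float     # 0.0 .. 1.0
            in_travel_path: bool  # object intersects the planned path

        def plan_action(d: Detection) -> str:
            # Anything in the travel path that is unknown or uncertain is
            # treated as an obstacle: brake first, classify later.
            if d.in_travel_path:
                if d.label == "unknown" or d.confidence < 0.5:
                    return "BRAKE"
                if d.label in ("pedestrian", "bicycle", "vehicle"):
                    return "BRAKE"
            return "CONTINUE"

    The exact threshold doesn't matter; the point is that "unknown" should never fall through to CONTINUE.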

    Doesn't surprise me though; isn't this based off the stolen stuff from Waymo? The problem with stealing IP is that you don't have the collective knowledge that went into its design, so flaws aren't easily found until they become painfully obvious (see: the Russian Concorde crash, the Russian B-29).
     
    Dayaks likes this.
  5. Drakeniir

    Drakeniir [H]Lite

    Messages:
    101
    Joined:
    May 27, 2011
    I still find that the most surprising thing to come out of this case was that the driver was female. (Yes, I'm a horrible person.)
     
  6. BoogerBomb

    BoogerBomb [H]ardness Supreme

    Messages:
    6,836
    Joined:
    Jan 10, 2003
    The report doesn't cite fault, only the facts it has gathered so far, so don't report that it specified fault. The software was not faulty according to the report. It detected the person and determined how long it would need to brake before hitting her (1 or 2 seconds before impact, I forget the exact figure). The problem is that the automatic braking system is disabled by Uber in autonomous mode. Determining fault is more complex than people realize. Ultimately a percentage of the fault will lie with Uber and a percentage will lie with the woman. She had meth and marijuana in her system, crossed a dark street where there wasn't a crosswalk or lighting, and had no side reflectors on the bike.

    So people reading the report are drawing different conclusions based solely on their interpretation. It's best to wait for the final report, after the investigation is complete, and for people who know how to interpret what the report says correctly.

    Here's what the AP said about the report:
    https://www.msn.com/en-us/news/us/f...an-did-not-brake/ar-AAxKfls?OCID=ansmsnnews11
     
    Last edited: May 24, 2018
  7. 777

    777 [H]Lite

    Messages:
    114
    Joined:
    Nov 8, 2015
    lol
     
  8. BSmith

    BSmith [H]ard|Gawd

    Messages:
    1,037
    Joined:
    Nov 9, 2017
    Yes, let's put thousands of these on the roads right away! It will take care of the rabid bicycle population! WOOT!
     
  9. likeman

    likeman Limp Gawd

    Messages:
    440
    Joined:
    Aug 17, 2011
    It still should have braked. Uber disabled the car's own built-in system that would have prevented the impact, or at least limited its speed, if that standard feature had been left enabled (which is just radar-based, I believe).

    They will fix this issue in the software, I guess. When it gets confused about what an object is, the failsafe should be to brake, not to ignore the object and just plow into it as some Tesla cars do (they lack lidar).
     
  10. amddragonpc

    amddragonpc [H]ard|Gawd

    Messages:
    1,997
    Joined:
    Sep 20, 2012
    From the article:

    "At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator."

    Uh, let me guess, the notification to the operator is a feature to be added in the next version?
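    The behavior the report describes boils down to a gate like this sketch (the behavior is from the NTSB quote above; the names and structure are hypothetical):

        from dataclasses import dataclass

        @dataclass
        class Vehicle:
            under_computer_control: bool

            def apply_emergency_brakes(self):
                print("emergency braking")

        def on_emergency_braking_needed(vehicle: Vehicle):
            if vehicle.under_computer_control:
                # Per the report: emergency braking is not enabled under
                # computer control "to reduce the potential for erratic
                # vehicle behavior," and the system is not designed to
                # alert the operator. It just returns, relying on the
                # human to notice and intervene.
                return
            vehicle.apply_emergency_brakes()

    With braking disabled and no alert, the only remaining safety mechanism is the operator's attention.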
     
  11. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    1,585
    Joined:
    Aug 27, 2010
    Good. Uber took shortcuts in utilizing the technology and didn't want to spend money on the more advanced (read: safer) solutions available (NVIDIA). Glad that blame is being correctly placed.
     
  12. Chaos Machine

    Chaos Machine Limp Gawd

    Messages:
    498
    Joined:
    Apr 13, 2012
    Are you talking about the Tu-144? If I recall correctly, that plane project started in 1962 and it actually flew before the Concorde; it took longer to get into service, and had safety issues, because of budget restrictions.
     
  13. Ocellaris

    Ocellaris Ginger @le, an alcoholic's best friend.

    Messages:
    18,923
    Joined:
    Jan 1, 2008
    OK, to anyone saying “maybe the accident still would have happened...”: never mind the dreadful $5 dash cam Uber was using, which made it look like this person was run over in a black fog. The accident happened in a decently lit area. Plus, you can see way more than 1 second ahead of you at night, and the victim didn’t jump into the road or anything.

    Uber designed a car to drive itself and detect emergencies, then decided not to apply the brakes AND NOT EVEN WARN THE DRIVER. This is severe negligence.
     
  14. Jim Kim

    Jim Kim 2[H]4U

    Messages:
    2,576
    Joined:
    May 24, 2012
    You had me at Uber's Software Failed.
     
  15. BoogerBomb

    BoogerBomb [H]ardness Supreme

    Messages:
    6,836
    Joined:
    Jan 10, 2003
    The software was crippled by Uber, since by design it couldn't control the brakes or notify the driver.
     
  16. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    13,503
    Joined:
    Apr 29, 2005
    "According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator".

    Then what's the bloody point of a self-driving vehicle?!
     
  17. mnewxcv

    mnewxcv [H]ardness Supreme

    Messages:
    5,921
    Joined:
    Mar 4, 2007
    Don't be stupid. It's self driving. Not self stopping. :rolleyes:
     
    mord likes this.
  18. gxp500

    gxp500 Gawd

    Messages:
    694
    Joined:
    Mar 4, 2015
    Seems the driver was doing her job, monitoring the diagnostic data. The problem is that other self-driving programs put two operators in each car; Uber had one. You can't do two things at once.
     
  19. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    5,914
    Joined:
    Feb 22, 2012
    I feel like braking should be the last thing to disable... MAYBE limit it to 80% or something. Crazy.
     
  20. sfsuphysics

    sfsuphysics I don't get it

    Messages:
    12,970
    Joined:
    Jan 14, 2007
    OK, all things aside, this is fairly cool: the fact that it predicts where objects should be going.
     
  21. aokman

    aokman Limp Gawd

    Messages:
    500
    Joined:
    Jan 3, 2012
    Happy to blame Uber, but seriously, is everyone giving the cyclist a free pass for crossing the road in a random location with a car bearing down on her and taking no evasive action? Self-preservation...? Anyone?
     
  22. Flatrock19

    Flatrock19 n00bie

    Messages:
    30
    Joined:
    Jun 6, 2017
    It's a bit more complicated than "the sensors detected her, therefore the car should have stopped."
    The sensors don't provide a boolean: true if a pedestrian is detected, false if one is not. They provide feedback that has to be interpreted. False positives are a very significant problem. You don't want the car slamming on the brakes in an emergency stop for a pedestrian that isn't there. Not only would that make it so no consumer wants to buy or ride in your autonomous vehicle, it could cause an accident.
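    To make that trade-off concrete, the decision ultimately reduces to thresholding a confidence score, something like this sketch (illustrative numbers, not values from the report):

        BRAKE_THRESHOLD = 0.8   # illustrative value only

        def should_brake(pedestrian_confidence: float) -> bool:
            # Too low a threshold and the car phantom-brakes for shadows;
            # too high and a real pedestrian is dismissed as a false positive.
            return pedestrian_confidence >= BRAKE_THRESHOLD

        print(should_brake(0.85))  # True: brakes (a real hazard, or a phantom stop)
        print(should_brake(0.70))  # False: ignored (a shadow, or a missed pedestrian)

    No threshold eliminates both failure modes; tuning it only trades one for the other.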
    The NTSB determined that in this case Uber's software had a clear enough signal that it should have been able to determine the pedestrian was there; instead it incorrectly dismissed her as a false positive.
    It is a flaw in Uber's system, and it did result in the pedestrian's death. I'm only trying to point out that it's not a simple bug that should have been easily caught with basic testing. Uber is at fault, but you have a very limited ability to come up with a wide enough variety of tests to verify such software in a closed environment. Thus the need for real-world testing, with a human driver to override the software when it fails to properly identify something, after which the data can be gathered to find the flaw and fix it.
    The weak point in Uber's testing strategy is the human driver. Most human beings aren't well suited to maintaining sharp focus when their attention is only needed occasionally. I certainly couldn't do it. They need some technological solution that helps ensure the driver is being attentive. Of course, that system too would need to be thoroughly tested and might suffer from some of the same problems. There is no perfect way to completely remove risk.
     
  23. Gavv

    Gavv PM Kyle for the TItle You Pick.

    Messages:
    18,637
    Joined:
    Dec 4, 2005
    It’s unfortunate someone was killed but it will also be beneficial in the end.

    We can speculate and assume what should have taken place, but they can evaluate what did happen and build it better. Whether it’s Uber, the sensors, or the driver, it will all make self-driving better in the future.

    Ultimately it was the operator of the car at fault. I mean, she was testing unproven software. Her checking the diagnostics is no different than texting or falling asleep at the wheel. This is one of the concerning things with self-driving vehicles: where to place the blame? However, if the vehicle has to have a watcher in the car, it’s not fully self-driving. Secondly, it’s concerning that the person responsible was comfortable enough not to be watching the road.

    Either way it’s still good data: good data to help fix the issue in the future, and good data to support having emergency braking maneuvers enabled, given that overseers don’t pay attention. Diagnostic data be damned; she should have been watching the road.

    Glad Uber is paying up and taking responsibility, as it should. But I think the driver bears a lot of the blame.

    As to the cyclist: it’s no different than kids who run into the road and get hit. They shoulder some blame as well.
     
  24. sfsuphysics

    sfsuphysics I don't get it

    Messages:
    12,970
    Joined:
    Jan 14, 2007
    I wouldn't say everyone; in the bazillion-page thread when this happened, there were quite a few of us who really put down the jaywalker.

    I mean, say all you want about the Uber car failing because something was switched off; the lights weren't switched off, so that homeless gal should have seen the car coming.