Uber Self-Driving Car Crash May Be Due to LIDAR Blind Spot

rgMekanic

[H]ard|News
In more news surrounding the death of a pedestrian struck by a self-driving car operated by Uber, The Register is reporting that when Uber switched from the Ford Fusion to the Volvo XC90 in 2016, the number of LIDAR sensors on the car was reduced from five to one. Instead of having LIDAR sensors mounted on the roof, front and rear bumpers, and sides, Uber switched to a single 360-degree LIDAR sensor mounted on the roof, which results in a blind spot all the way around the car.

The article also states that other self-driving programs have not adopted the cost-saving single-sensor option, with Waymo using six on its cars and GM using five. Thanks to cageymaru for the story.

Velodyne makes the LIDAR sensors for Uber, and Velodyne's Marta Hall told Reuters:

"If you're going to avoid pedestrians, you're going to need to have a side LIDAR to see those pedestrians and avoid them, especially at night."
 
Why didn't the forward radar see the pedestrian though?

Nate
 
Yet, in an article posted 1.5hrs before this one....
Bloomberg is reporting that Uber disabled the standard collision-avoidance technology in the self-driving Volvo SUV that hit and killed a woman in Arizona last week. Aptiv Plc, the company that supplies the radar and camera for the Volvo system, states: "We don’t want people to be confused or think it was a failure of the technology that we supply for Volvo, because that’s not the case."

Intel Corp.'s Mobileye, which provides sensors and chips for Aptiv's collision-avoidance system, stated that despite the poor second-hand quality of the accident video, a test of its software run on that footage was able to detect the pedestrian one second before impact.

... which rabbit hole are we to follow?:geek::LOL:
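
As an aside on what one second is worth: assuming the widely reported ~40 mph (neither article states the speed, so treat these numbers as illustrative only) and a hard-braking deceleration of about 7 m/s², a full stop is out of reach, but braking in that second still takes a big bite out of impact speed.

```python
# Illustrative arithmetic only; the speed and deceleration are
# assumptions, not figures from the investigation.
v0 = 17.9   # m/s, roughly 40 mph
a = 7.0     # m/s^2, assumed dry-asphalt emergency deceleration
t = 1.0     # s, Mobileye's claimed detection lead time

stopping_distance = v0**2 / (2 * a)   # ~22.9 m needed for a full stop
distance_remaining = v0 * t           # ~17.9 m left at detection time
v_impact = max(v0 - a * t, 0.0)       # ~10.9 m/s (~24 mph) if brakes fire
print(f"need {stopping_distance:.1f} m to stop, have {distance_remaining:.1f} m; "
      f"impact at ~{v_impact:.1f} m/s with braking")
```

Since injury severity scales roughly with the square of impact speed, even a partial brake application in that window matters.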
 
Huh.
Would love to have been in earshot of the room where that discussion took place.
It basically amounts to weighing the value of human life and the cost of the fallout from such an event against the risk of failure and the cost of the equipment.

So what now? "Oops! Thoughts & prayers, we'll do better..."?

It should go without saying that the risk should be 0%, requiring whatever hardware is necessary to achieve that, including backups should primary systems fail.

One sensor, really Uber?
This is how you start the drumbeat for regulation.

Yet, in an article posted 1.5hrs before this one....


... which rabbit hole are we to follow?:geek::LOL:

Both rabbit holes are most likely true.
Uber probably disabled the Aptiv programming so there was less overhead and not two brains trying to control the same functions.

And Aptiv is saying its programming would have ID'd the pedestrian and acted accordingly, based on the test it ran where the software took the low-quality camera feed and ID'd the pedestrian one second before impact.
 
Let the finger pointing begin!

I've watched the video though, and while I feel bad for the victim, she failed as much as the car and backup driver.
 
Yet, in an article posted 1.5hrs before this one....


... which rabbit hole are we to follow?:geek::LOL:

both.gif



Uber disabled the Volvo stuff so as not to interfere with the testing of its own hardware, which it didn't have enough of.
 
For LIDAR, that's one huge and seriously flawed blind spot: not seeing something in the middle of the road directly in front of it. That sensor or system failed; that's all there is to it. Like anything technological, these things do goofy stuff sometimes, like not work, and in this case it failed to function. Blind spot? Yeah, right. If those sensors have that big of a blind spot, they had better pull every self-driving car that uses them off the road, because that's a serious danger to every pedestrian and motorist around those cars.
 
Guess how surprised I am that it turned out that Uber cheaped out on the sensors.
Hint: Guess a very very small number and bet on the under.
 
The driver is at fault.

Sure, autopilot may have been engaged and it may have been a shitty and inadequate design, but that doesn't mean the driver is blameless.

Sure, the pedestrian was jaywalking, and that's against the law, and she may legally share some (or all) of the liability, depending on state traffic laws.

But I don't think that removes the driver from all responsibility, or puts any of the responsibility at the feet of Uber. We don't prosecute Ford or GM or VW if someone gets in an accident while cruise control is on. I don't see self-driving (WITH a driver in the vehicle) as any different from any other assistance technology. The driver is still in control, and the driver is still responsible. If the tech sucks, the driver shouldn't rely on it and take their eyes off the road. If the driver does decide to start text messaging and eating tacos while autopilot is engaged, they are still responsible for anything the vehicle does while it's under their control.

Now an empty, driverless, fully autonomous vehicle - different story entirely. I don't think we are there yet, technically, legally, or culturally. And I don't think that's what Uber was doing since they very clearly still employed a "Safety Driver" (although it is their ultimate goal, obviously).
 
The thing that gets me about all this is why Uber was doing this in the first place. If they want self-driving cars, then wait for the car companies to make them; don't do it yourself... or do they plan on making their own cars too?
 
And as a society we need driverless cars why? Wouldn't it be better to focus on driverless trains and ships, and once that is perfected, then focus on aircraft? I don't need a driverless car. I like driving.
 
And as a society we need driverless cars why? Wouldn't it be better to focus on driverless trains and ships, and once that is perfected, then focus on aircraft? I don't need a driverless car. I like driving.
Lots of people could benefit. One accident doesn't mean the tech won't work. But hey, yeah, you'll never get too old to drive, or become disabled somehow, nope, never, so we definitely shouldn't do driverless cars. As for ships, from what I've read a lot of the larger ones do pilot themselves for the most part, kind of like commercial aircraft; they still use a pilot for takeoff and landing, but even that could be automated if we wanted (it would mean updating some equipment).
 
The driver is at fault. […]

How is a driver to react to the computer not reacting? It's impossible to do so until it's too late.
 
The thing that gets me about all this is why Uber was doing this in the first place. If they want self-driving cars, then wait for the car companies to make them; don't do it yourself... or do they plan on making their own cars too?

Uber's end goal is to make its own AV cars to use in its rideshare business. The fear is that the entire business will be killed by GM Cruise or Waymo once they have working cars.
 
The thing that gets me about all this is why Uber was doing this in the first place. If they want self-driving cars, then wait for the car companies to make them; don't do it yourself... or do they plan on making their own cars too?
Uber wants to cut out the middleman; any dollar spent on someone else's equipment is a dollar of profit lost. Uber worships the almighty dollar... nothing more.
Why have six LIDAR sensors when you can have one? Why have two safety drivers when you can have one? Any sense of ethics or care is lost at Uber; it's rotten to the core.
 
How is a driver to react to the computer not reacting? It's impossible to do so until it's too late.
How would a driver react in that situation if there was no autopilot?

That’s what should have happened.
 
The driver is at fault. […]

Other people have already covered this, although it should be completely and 100% obvious to anyone who is a decent driver. Without the constant input and output of controlling the car, a "driver" of an autonomous car cannot stay vigilant over any period of time. The "driver" is not controlling the car and has been taken out of the decision-making process of the second-to-second operation of the car. There is a very simple test for this: get someone to drive your car, ride along as a passenger, and see how long you can consciously continue to "fake" the control of the car. I'd be surprised if you could do this for more than a few minutes, and even then your reactions would be slow and sluggish, simply because your "output" to the car means nothing. You're also not receiving much in the way of input from the car, because you're not using the steering wheel, the pedals, or anything else.

As a driver, I "feel" orders of magnitude more than any passenger in the car does. I get vibration from the steering wheel, I feel if the wheels are pulling one way or another and automatically correct for it, and the input I receive from a bump in the road through the steering wheel is much greater than what any passenger gets. The passengers don't feel the vibration in the steering wheel, and they don't feel or know that the wheels are pulling one way or another, because as the driver I'm correcting it.

These autonomous car "drivers" are literally nothing more than passengers in the car. They aren't supposed to affect anything to do with the second-to-second operation of the vehicle; to do so would ruin any test of the autonomous system. They can't stay vigilant as a driver because it's not possible to do so. This is the very reason why I think having any of these autonomous cars on the road is folly. These systems are nowhere close to ready to be on the road. They haven't been able to function in anything other than close-to-ideal circumstances. There are many other issues which no one seems to want to talk about, such as bad road conditions, dirty or malfunctioning sensors, and what is probably one of my biggest concerns: no one can say how these things will have any chance of operating when there are tons of other autonomous cars out there. How are these vehicles going to know what's going on when 100 different ones are throwing radar, LIDAR, and everything else around everywhere?
 
360° LIDAR, except for the front 180°. The whole point of the sensor is to detect an object, track it, and let the AI decide what to do. Even honking the horn might have let the pedestrian dodge; humans reacting in instinctive, life-threatened lizard-brain mode can move pretty damn quick.

This incident also highlights the stupidity of expecting a human to take over when the car's AI goes "Oh shit! You've got it, sucker!" By the time the 'driver' in the Uber car figured out something was wrong, it was way too late.
 
So they blame the tech... when it was installed to the bare minimum and the safety software was switched off... hmmm.

Seems like the driver should have been made aware?

I would say 20/30/50.
20% the pedestrian, for walking blindly into the street.

30% the idiot driver, for not driving and not paying attention.

50% Uber, for using a Volvo equipped with 360° blind-spot technology and disabling the collision-avoidance software.

I hate it when someone cuts costs on life-saving engineering and it costs lives.
 
Other people have already covered this, although it should be completely and 100% obvious to anyone who is a decent driver. […]

Airline pilot, or captain of a ship at sea. They can use auto-navigation just fine. They don't even drive the vessel most of the time. Vessels at sea don't shut down and stop even when the captain goes to sleep in his/her bunkroom. But if anything happens, they are still 100% responsible. Not the manufacturer of the boat or plane. Not the manufacturer of the auto-navigation system.

You may counter that those aren't nearly as busy travel lanes, or that the circumstances are different, but they aren't. Airliners fly into busy airports all the time. Same with ships and busy seaways/harbors. If it's not safe to use the system, you don't use the system. If it's OK, then OK. But if something fucks up, it's still the captain's ass on the line.

My feelings are the same way about a car.
 
If you want to find who is at fault for this death, follow the great big shlong that is currently pointed at Uber's ass. But don't follow too close, 'cause that would look gay.

The best thing that Uber's management can do right now is change their name, sell the company, and then check in anonymously at a Motel 6 for about 4 years. Even a good caning won't help these guys. These guys are gonna be so unpopular they're gonna have to eat at Denny's for at least a decade, because even the people at Kentucky Fried Chicken will be smart enough to recognize them and say, "Hey, you're the assholes from that company called Uber."
 
Airline pilot, or captain of a ship at sea. […]

That's actually false. If the auto-nav of either a boat or a plane screws up, the NTSB will blame the manufacturer.
 
Let the finger pointing begin!

I've watched the video though, and while I feel bad for the victim, she failed as much as the car and backup driver.

Sorry, but that's completely irrelevant. If it had been a 10-year-old chasing a ball from a yard, would you even be considering that argument? Like I have said before, if these systems are going to be roaming the streets, they had better be better at avoiding pedestrians than a human driver, or there isn't any point in using the damn things. And that includes when it's pitch black or in driving rain. The sensors exist to do it; if you are going to try to get by on the cheap with people's lives, you had better find another line of work.
 
Airline pilot, or captain of a ship at sea. […]

If you actually care about some sort of discussion, address the points made and actually argue to back up your points. Trying to use completely irrelevant situations which don't even remotely relate to what is being discussed does not help you. Aircraft do not take off or land using any type of automatic system, nor do pilots have to make split-second decisions about someone who ran out, or in this case slowly walked across two lanes of traffic in front of them. The sky doesn't have anywhere near the traffic, the density of traffic, or the obstacles of driving a vehicle. Airplanes are strictly controlled on where they can go and when they can be there, and they take off and land only in accordance with air traffic control. At no point does it even remotely resemble being behind the wheel of a car. I'm not even going to get into your nautical example, since many of the same points apply there as to air travel.

Try responding to my post. Address the points I made if you can, though I doubt you'll be able to do it logically.
 
"I blame the pedestrian" - completely irrelevant

"the video is dark" - completely irrelevant

"a person would have hit her" - completely irrelevant

"you're holding AV to a higher standard than humans" - NO SHIT MORON

if it's no better than people, what's the goddamn point? why is anyone defending uber here? do you have such a boner for new tech that you can't see this is obviously not ready for prime time?

FUCK.
 
Try responding to my post. Address the points I made if you can, though I doubt you'll be able to do it logically.

No, I think you misunderstand. I stated my opinion and why I think as much. You stated yours, and that’s fine; I think it’s wrong, but it’s an opinion, you have good reasons, and you’re entitled to it. I don’t give enough shits to respond past this to your issues with my opinion.

It would take a very good, thorough case to shift my opinion away from the notion that the driver is responsible for the actions of a vehicle they are in control of. No one has presented a case to me that can do that yet.

If you want to speak about a fully autonomous, driverless vehicle - that’s an entirely different discussion. I do believe that’s what most people think when they think “self-driving”, but I also think most people are deluding themselves about the tech and their willingness and ability to relinquish that much control.
 
if it's no better than people, what's the goddamn point?

All your points are correct, but they are meaningless to your final questions.

1. No one is defending Uber. The more I learn about Uber's setup, the more I think that Uber is fucktardedness frosted with ineptitude.

2. Uber aside, no one has proven that autonomous cars are worse than human drivers. You're worried about the companies that are building driverless cars, but the regulators are clearing the way forward as fast as the developers are designing. The NHTSA and the DOT are not slowing down.

3. The 'point' should be obvious - driverless cars free us from the need to drive the car. People do stupid shit in cars. Mary is putting on her makeup on the way to work, Jimmy is texting his girlfriend, Fat Angus just pulled out of the drive-through at McDonald's and has his vision obscured by a large order of fries, and Uncle Bob is drunk and wants to get home.

Guns kill people, but cars kill just as many people, and they injure 70 times more. You're bitching about the accident rate of AI-driven cars, but what makes you think the accident rate for human-driven cars is so good? We SUCK at driving cars.
 
For LIDAR, that's one huge and seriously flawed blind spot: not seeing something in the middle of the road directly in front of it. […]

As they said, other systems use a minimum of five LIDAR sensors; Uber used one, presumably roof-mounted. I imagine that would create a blind spot for small objects (anything smaller than a car) in the near vicinity of the car, and, depending on rotation speed, blind spots immediately behind the sweep of the sensor. There's also the fact that it was traveling at speed and probably tuned to look into the distance rather than at nearby objects. Get into the right place at the right time, and it will literally never see you. That's just my theory, though.
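
To put a rough number on the "small objects at a distance" part of that theory, here's a sketch using assumed, roughly HDL-64E-like specs (64 beams over ~26.8° of vertical field of view; not a confirmed description of Uber's configuration) of how few scan lines a pedestrian returns as range grows:

```python
import math

# How many laser rings a 64-beam spinning LIDAR paints onto a
# pedestrian at a given range. Specs are assumed (roughly HDL-64E):
# 64 beams spread over ~26.8 degrees of vertical field of view.
beams = 64
vertical_fov_deg = 26.8
beam_spacing_rad = math.radians(vertical_fov_deg / (beams - 1))

pedestrian_height_m = 1.7
for range_m in (10, 20, 40, 60):
    ring_gap_m = range_m * math.tan(beam_spacing_rad)  # gap between rings
    hits = pedestrian_height_m / ring_gap_m            # rings on target
    print(f"{range_m:3d} m: ~{hits:4.1f} scan lines on a pedestrian")
```

At 40 m that works out to only five or six scan lines, and it shrinks from there, which is a lot less for a classifier to chew on than five overlapping sensors would provide.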
 
Why not test with drivers fully engaged? You know, like when you give a controller to your kid brother, but with parameters where the system is overridden by the user: too much acceleration, too much braking, too much steering at certain points. Then you would have points where the automation hands the system off to what should be a fully engaged driver. You then study these handoffs and improve the system, and you do this for years. This shit ain't ready. It's just not. Lights on normal cars also need to be updated so they emit IR signals, so autonomous cars can identify other cars in a more definitive manner and not just run into stopped vehicles. Companies need to come up with standards, period. It's not ready, no question.
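
A minimal sketch of that override-and-handoff idea, with every threshold and name made up purely for illustration:

```python
from dataclasses import dataclass

# Hypothetical limits on what the automation may command before a
# human takeover is forced; the numbers are illustrative only.
@dataclass
class Limits:
    max_accel_mps2: float = 3.0   # "too much acceleration"
    max_brake_mps2: float = 6.0   # "too much braking"
    max_steer_deg: float = 15.0   # "too much steering" at speed

def should_hand_off(accel_mps2: float, brake_mps2: float,
                    steer_deg: float, limits: Limits = Limits()) -> bool:
    """Return True when the automation's command crosses a limit and
    control should pass to the (fully engaged) safety driver, with the
    event logged for the years of study proposed above."""
    return (accel_mps2 > limits.max_accel_mps2
            or brake_mps2 > limits.max_brake_mps2
            or abs(steer_deg) > limits.max_steer_deg)

# A sudden hard-brake command would trigger a logged handoff:
print(should_hand_off(accel_mps2=0.5, brake_mps2=7.2, steer_deg=2.0))  # True
```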
 
Sorry, but that's completely irrelevant. If it had been a 10-year-old chasing a ball from a yard, would you even be considering that argument? […]

Who said that safety isn't important? Of course they need to take it as seriously as possible, but that doesn't absolve jaywalkers of personal responsibility.

Get off your moral high horse. This grown-ass adult just won a Darwin Award, and for good reason. If I crossed the street in such a recklessly oblivious manner, I'd expect to die.
 