Video Released of Uber Self-Driving Car Accident

How do reflections work again when using lasers with narrow FOV?
Are you just throwing terms out there? FOV has no relation to being soft-bodied either. If it's a FOV issue, then you won't see a solid target either. If I remember correctly, Uber uses a Velodyne lidar with a 360° FOV, which has a very high pulse repetition rate but relatively slow rotation.
 
Under those circumstances, a human driver would have hit her too.

She was impossible to see until up close, and she was crossing a road with high-speed traffic without any sort of crosswalk. This was not a fault in the software. The only way a computer program could have seen her was if it was using infrared to see further down the road than a human could; a self-driving car with infrared would have been the only way to avoid her.


Wait, where is all that millimeter-wave RADAR that people have said would allow self-driving cars to usher in a new era of safety?

I'm not saying these are your words, dgingeri, but it wasn't that long ago that this was exactly what people were claiming: that there was no way a self-driving car would have hit and killed this woman.

And I will reiterate from my previous post that the camera footage may not be representative of the actual lighting conditions a human would have seen; there are street lights on both sides of that roadway. The car seemed to make no reaction at all; it just plowed her down.

That is Phoenix at 10 PM, it should be very easy to go right back and expect to see the same lighting conditions again.

It's obvious to me that this failed:
https://techcrunch.com/2018/03/19/h...ving-cars-are-supposed-to-detect-pedestrians/

[Image: uber-atg-volvo.jpg]
 
The AI system must have data recorded for this incident; the forensics will be interesting: did the system even register an object crossing the road? If not, then Uber is not ready to drive in dark conditions, if ever.
 
The initial systems were, but that proved too expensive. These days they all use cheap USB web cams. They don't even have IR. So, no, it wouldn't have been able to see her any better than a human.

No competent AV system is using cheap USB web cams for its sensing. And if that is all Uber is using, they'll end up paying millions upon millions. Literally, off-the-shelf driver-assist systems can handle this, even the driver-assist systems installed in Volvo cars off the lot.
 
I'm not saying he did this, but how long does it take to glance down at your speedometer and then back up?
 
The guy in the car is not watching the road the way a normal driver would; isn't that negligence? What he did is equivalent to texting while driving.
Even though it's an autonomous car, he is still the operator. By law, the blame can be placed on the victim for jaywalking, but that still doesn't refute that Uber's autonomous cars are no better than human drivers.

If AI is no better than human drivers but is still used to replace them, it's a ploy to maximize corporate profits at the risk of the public.
I would say this depends on how automated the system is supposed to be.
If the backup driver is only supposed to take over when the system tells them to, or when/if the system fails, then no, I don't think they're at fault.

If the backup driver's job is to watch the road all the time as if they were driving and correct the system, then yes, they're at fault.

Assuming state laws haven't changed very much, since they were sitting on the driver's side I would assume they were the driver whether there was an automated system or not. At that point, one has to determine whether the driver was at fault, and I believe it was determined that the driver was not. So going back to whether they were distracted or not, I don't think it would matter at that point.
 
How do reflections work again when using lasers with narrow FOV?

Fine, have it your way (as incorrect as it is), but it should've seen the bicycle. That's not soft-bodied, and it's reflective metal. Just look at those rims in the video; they are super bright white.

Even if you assume it wasn't visible until the 3-second mark, that's still about 60 feet (assuming AZ still has the standard 20-foot-long markers from when I was there). Even if it was 30 feet from detection, the speed would be reduced, greatly increasing her chances of survival. Even slowing from 40 mph to 35 mph would cut the impact energy by roughly a quarter, since kinetic energy scales with the square of speed.

So your "it couldn't see her in time to totally stop" argument is just BS. It damn well had enough time to start slowing down and improving the odds.
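The speed-versus-impact-energy point is easy to sanity-check. A minimal sketch (the only physics used is that kinetic energy scales with the square of speed; the speeds are the ones discussed above):

```python
# Kinetic energy is KE = 0.5 * m * v^2, so for the same mass the
# energy ratio between two speeds is just the squared speed ratio.
def ke_ratio(v_after_mph: float, v_before_mph: float) -> float:
    """Fraction of the original kinetic energy remaining after slowing."""
    return (v_after_mph / v_before_mph) ** 2

print(f"35 vs 40 mph: {ke_ratio(35, 40):.0%} of the energy remains")
print(f"30 vs 40 mph: {ke_ratio(30, 40):.0%} of the energy remains")
```

Slowing from 40 to 35 mph leaves about 77% of the energy, so the reduction is closer to a quarter than a half; to halve the impact energy the car would need to get down to roughly 28 mph.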
 
I'm not saying these are your words, dgingeri, but it wasn't that long ago that this was exactly what people were claiming: that there was no way a self-driving car would have hit and killed this woman.
Whoever said that they were pretty wrong. Technology can always be improved, it's just a question of feasibility.
 
Are you just throwing terms out there? FOV has no relation to being soft-bodied either. If it's a FOV issue, then you won't see a solid target either. If I remember correctly, Uber uses a Velodyne lidar with a 360° FOV, which has a very high pulse repetition rate but relatively slow rotation.

I was told many moons ago that when you wanted high res you had to narrow the FOV and move closer to the target to get a good enough return to work with. My assumption is that a laser won't reflect enough light off a person in rumpled clothing to generate a usable return, especially at a distance. The table I posted showing neoprene at 5% would seem to bear this out.
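The reflectivity point can be illustrated with a simplified lidar link budget: for a diffuse surface, the received power falls off roughly as reflectivity over range squared. The reflectivity values and range below are illustrative assumptions, not actual sensor specs:

```python
# Simplified lidar return model for a diffuse (Lambertian) target:
# received power ~ reflectivity / range^2 (constants dropped since
# we only compare relative returns at the same range).
def relative_return(reflectivity: float, range_m: float) -> float:
    return reflectivity / range_m ** 2

dark_clothing = relative_return(0.05, 50)  # ~5% reflective target at 50 m
road_sign     = relative_return(0.80, 50)  # ~80% reflective target at 50 m
print(f"The sign returns {road_sign / dark_clothing:.0f}x "
      f"the signal of the dark clothing")
```

A 16x weaker return means the low-reflectivity target must be about 4x closer to produce the same signal, all else being equal, which is the "narrow the FOV and move closer" intuition above.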
 
I would say this depends on how automated the system is supposed to be.
If the backup driver is only supposed to take over when the system tells them to, or when/if the system fails, then no, I don't think they're at fault.

If the backup driver's job is to watch the road all the time as if they were driving and correct the system, then yes, they're at fault.

Assuming state laws haven't changed very much, since they were sitting on the driver's side I would assume they were the driver whether there was an automated system or not. At that point, one has to determine whether the driver was at fault, and I believe it was determined that the driver was not. So going back to whether they were distracted or not, I don't think it would matter at that point.

What if the driver was not expected by Uber to keep watch all the time, and was hired and trained as such?

Then it goes to the question of why there are items in the car which can distract the driver.
 
Another stupid fuck behind the wheel doesn't know how to do their job. You're behind the wheel for a reason: to scan the road for anything the auto system misses, not to look down at your fucking phone and look up every few seconds.

Irrelevant. A human-driven car would have schmucked her all the same.

The autonomous system should have easily detected the pedestrian. That's the issue here.
 
I'm not saying he did this, but how long does it take to glance down at your speedometer and then back up?
There is a difference between a normal action, such as glancing at your speedometer for a fraction of a second, ending with you hitting someone who did something completely wrong, and a system that can never prevent something like the above. One has a chance to avoid the circumstance; the other may never be able to. I do believe a lawsuit will be in order; if she was homeless, people may come out of the woodwork (who did not help her) to try to cash in.
 
but it should've seen the bicycle


And that is what a lot of people seem to be missing. OK, let's say her black clothing wasn't as reflective as it should have been; the cross section on that bike is WOW huge.

Unless they are also saying, yeah, we can't see bikes on the road. Which might explain why the AV car in San Francisco the other month swerved back into a lane with a bike in it and knocked the rider over.
 
Then they don't belong on the highway because a normally and reasonably attentive driver does.

That being said, by the video, I do not think a human driver would have avoided killing that woman, and I don't think it would have been the driver's fault, as long as the imagery from the camera is representative of how a human would have seen it. Keep in mind, it might not be. It's entirely possible that an attentive human would have detected the woman earlier, since the camera may misrepresent the lighting conditions. Yes, it's 10 PM, but there are lights right at the "Y" intersection, one on either side.

I would challenge how well the dash cam represents the actual lighting conditions a human would have experienced. I am also wondering why the car isn't taking greater advantage of its high beams. It's a fucking autonomous car; it should be able to make better use of the high beams, and that could have made a real difference depending on the car's other sensors.

I think this is a failure of its collision avoidance system; hell, the car didn't seem to react at all.

As an FYI, there is a video on YouTube of someone driving this path at night after the accident, using a cell phone as the camera, and it is all perfectly well lit; cell phone cameras are generally shitty without a lot of light.

 
I can only assume a few things about the logic in a self driving car.
Should object detection exist for objects outside the path of the vehicle? Should objects smaller than cars be detected and avoided when you're going past a certain speed? At what point do you estimate an object's relative motion and predict you might get into an accident?

I'm just going to assume a few things, which I'm sure someone will tell me are wrong. I would assume that at a high enough speed you would turn off pedestrian detection and only detect cars; why waste CPU cycles on things that shouldn't be there? I would also assume that maybe you would turn it off in high-speed areas with no pedestrian crossings.

I would also assume that traveling at 40-75 mph is different than traveling at 0-30. The rules that apply to both manual drivers and automated software are fairly self-apparent.

I'm not saying it couldn't be made to be more effective. I'm just throwing out my ideas on why this happened in the first place. The logic should be changed so that if an object is detected and you're traveling at speed and the estimated path might intersect with the vehicle, that the vehicle should probably drop down in speed so there can be more reaction time. However you have to balance that with false alarms.
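The "slow down when an object's estimated path might intersect the vehicle" logic above can be sketched as a toy time-to-collision check. This is a constant-velocity illustration under assumed geometry (lane width, walking speed), not any vendor's actual logic:

```python
# Toy constant-velocity model: how long until a laterally moving object
# (e.g. a pedestrian) enters the vehicle's lane span?
def time_to_conflict(obj_x, obj_vx, lane_left, lane_right):
    """Seconds until an object at lateral position obj_x (meters),
    moving at obj_vx m/s, enters [lane_left, lane_right]; None if never."""
    if lane_left <= obj_x <= lane_right:
        return 0.0                      # already in the lane
    if obj_vx == 0:
        return None                     # stationary outside the lane
    edge = lane_left if obj_x < lane_left else lane_right
    t = (edge - obj_x) / obj_vx
    return t if t > 0 else None         # negative = moving away

# Pedestrian 6 m left of a 3.5 m lane, walking toward it at 1.4 m/s:
t = time_to_conflict(obj_x=-6.0, obj_vx=1.4, lane_left=0.0, lane_right=3.5)
print(f"Enters lane in {t:.1f} s")
```

At a typical walking pace that gives over four seconds of warning, which is the kind of margin where pre-emptively shedding speed is cheap; the false-alarm balancing the post mentions is then a matter of how small a time-to-conflict triggers braking.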

I'll take a shot at the answers.
Should object detection exist for objects outside the path of the vehicle?
Absolutely! The AI in a car is not nearly advanced enough to make any determination as to what path the object might or could take. When we drive, if you see a child on a bike heading toward possibly crossing in front of you, do you slow down? Of course you do. The car should do it better.

Should objects smaller than cars be detected and avoided when you're going past a certain speed?
Absolutely! The higher your vehicle's velocity, the more dangerous any object becomes. A piece of ice tossed casually at your car will do far less damage than one hurtling at your car at 175 MPH.

At what point do you estimate an object's relative motion and predict you might get into an accident?
Anything which has the potential to cause you to be involved in an accident should be evaluated. If it comes into the sphere of influence, it should be monitored. That can change based on the surrounding environment: the algorithm could switch to a lower granularity of observation if the object is not moving, but it cannot assume any object will never move. Hell, a wall can fall over.

The car failed in this scenario. The accident may not have been avoidable, but the car did absolutely nothing to suggest it even sensed the lady was anywhere near it.
 
At that rate of speed, no, a human could not have stopped the car. A human might have been able to swerve to avoid a direct impact, but with less than 2 seconds of visibility before impact, I don't see how this is a major software failure.


Like so many others, you are making a huge assumption. You are failing to consider that the video representation might not accurately represent human vision in the given lighting conditions.
 
As an FYI, there is a video on YouTube of someone driving this path at night after the accident, using a cell phone as the camera, and it is all perfectly well lit; cell phone cameras are generally shitty without a lot of light.


Yep, very different perspective, which is more in line with what I see when driving at night, in my experience.
 
I can only assume a few things about the logic in a self driving car.
Should object detection exist for objects outside the path of the vehicle? Should objects smaller than cars be detected and avoided when you're going past a certain speed? At what point do you estimate an object's relative motion and predict you might get into an accident?

Any competent AV tech has full 360° sensors, a wide field of view in front of the car, and long-distance sensors in front of the car. All objects should be detected regardless of speed, as this is needed for even simple collision avoidance with the many different types of vehicles that will be encountered. You are constantly predicting motions and paths.

I'm just going to assume a few things, which I'm sure someone will tell me are wrong. I would assume that at a high enough speed you would turn off pedestrian detection and only detect cars; why waste CPU cycles on things that shouldn't be there? I would also assume that maybe you would turn it off in high-speed areas with no pedestrian crossings.

You never turn it off. You might not weight it the same way in the decision matrices, but you would never disregard it. You have to be able to handle things like a cop in the road.

I'm not saying it couldn't be made to be more effective. I'm just throwing out my ideas on why this happened in the first place. The logic should be changed so that if an object is detected and you're traveling at speed and the estimated path might intersect with the vehicle, that the vehicle should probably drop down in speed so there can be more reaction time. However you have to balance that with false alarms.

As previously said, this situation is a demo case for off-the-shelf (OTS), already-for-sale driver-assist systems from a variety of auto manufacturers. Any AV that can't at least perform at the level of an OTS driver-assist system doesn't belong on the road.
 
I think this was the perfect storm (unaware driver, unaware pedestrian, jaywalking), but it also raises more questions.
The pedestrian was jaywalking across three lanes, blissfully unaware of her surroundings. What happened to looking both ways? Especially at night, how does the biker not see lights approaching?

It'd be another thing altogether if this was at a crosswalk, but it wasn't.

Why would it be any different at a crosswalk?

The car does not seem to have reacted at all so why do you think a different location would have made any difference?
 
Wait, where is all that millimeter-wave RADAR that people have said would allow self-driving cars to usher in a new era of safety?

I'm not saying these are your words, dgingeri, but it wasn't that long ago that this was exactly what people were claiming: that there was no way a self-driving car would have hit and killed this woman.

The original reports said she came out of nowhere. The latest information is that she was crossing the road from the opposite side, and not only crossed through the other lane but had almost crossed through the lane the Uber AV was traveling in. The impact was on the front right bumper. Even a minimal amount of driver-assist-level braking would have prevented the accident.
 
Why would it be any different at a crosswalk?

The car does not seem to have reacted at all so why do you think a different location would have made any difference?
If the car's system was malfunctioning or self-diagnosing, would there be an alarm? Would it go into a safe mode, slowing down and pulling over, and alert the passenger? From the video it appears the car saw zero hazards: working according to design, all the different sensors and data points failed to register an event, and the system failed.
 
From the perspective of software and hardware self driving cars are a primitive and immature technology being promoted by people with misplaced optimism. One cannot blissfully ignore the wisdom of Murphy with devastating consequences. Arrogance killed this woman; woe upon those who fail to learn the lesson of this tragedy...

You're right: Arrogance did kill this woman. Her own arrogance.
 
I place the blame squarely on the woman. I've seen dogs look both ways before crossing and stop when they see a car coming.
 
If it isn't programmed for, or is incapable of, detecting obstructions on the highway, then it is quite literally too immature to be on the road. The failure here is GLARING. It doesn't matter in this case what a human could see; AVs don't work like humans. They use LIDAR and RADAR in combination with monochromatic cameras to detect objects, which for any competent system makes the distinction between midnight and noon basically immaterial. Pretty much every vehicle with driver assist in production could have handled this situation and would have activated emergency braking. If you are trying to do autonomous driving and you cannot handle situations that production automobiles can handle and use as their basic demonstrations, then you have failed incredibly hard.

Ah, no. What a human could see is absolutely relevant, because "as good as or better than human" is absolutely a reasonable standard.
 
A simple threshold of the frame at 3s shows enough of a blob that you would expect the car to have done something, even if it was ineffective.

[Image: threshold.jpg]
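The kind of thresholding described above takes only a few lines to reproduce. This is a minimal sketch on a synthetic stand-in frame (the actual video frame isn't available here), showing how even a dim blob separates cleanly from a dark background:

```python
import numpy as np

# Synthetic stand-in for a dark video frame: near-black noise with a
# slightly brighter rectangular "blob" where a pedestrian would be.
rng = np.random.default_rng(0)
frame = rng.integers(0, 30, size=(120, 160)).astype(np.uint8)  # background 0..29
frame[60:90, 70:85] = 55                                       # dim blob, above noise

# Simple global threshold, as in the post above.
mask = frame > 40
ys, xs = np.nonzero(mask)
print(f"Blob pixels: {mask.sum()}, bounding box: "
      f"({ys.min()},{xs.min()})-({ys.max()},{xs.max()})")
```

A real pipeline would threshold per-frame and track the blob over time, but even this crude global cut is enough to flag "something is there", which is the point being made.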
 
Then you're an idiot who has no experience driving. It's a straight road, no obstructions, car traveling at 40 MPH. If the headlights were working properly, a human driver paying attention would have had plenty of time to see the woman entering the road and react appropriately.

I live in Tempe. I know this environment. This accident shouldn't have happened, period.

Uber will fry for this and they deserve to. They are going to get hammered in court (or settle for 7+ figures) and their self-driving tech is going into the garbage can.

There's a good chance this is related to Uber's alleged theft of the self-driving technology they are using.
If they'd developed it themselves, they might actually understand it, but if they just stole it ... not so much.

It's also related to Uber's corporate culture of breaking the law and putting people at risk (like by hiring felons as drivers) in the name of corporate valuation.

I believe you are too hard on him.

I do think he is incorrect in his assessment and I believe yours may have merit.

Gigus Fire is not an idiot; he's simply a victim of being human. The saying "seeing is believing" is around for a reason. We place great faith in what our eyes show us, sometimes even overriding what our brain tells us is different.

You should reconsider your comment.

Additionally, I think the concept of "safety drivers" or relying on a human for anything related to an emergency is a bad idea. It's just too much to ask people to stay aware and to rely on them when the machine is doing such a "seemingly great job".

Look, if the car encounters a navigation situation where maneuvering is going to be really tricky (leaving a concert or ball game parking lot, or getting through some really tricky construction or an accident site), the car needs to alert the driver to take over. But to expect a driver to remain alert and attentive in order to avoid an accident, no. If the car can't do this, then the car isn't ready for the road.
 
A simple threshold of the frame at 3s shows enough of a blob that you would expect the car to have done something, even if it was ineffective.

[Image: threshold.jpg]

That's a long time for the car to slow down. Like I said, even if it's only 5 mph, you drastically reduce the energy involved in the impact. I am not saying she would've lived, but she would've had a better chance at, say, 35 vs 40.
 
Didn't someone already do the math that even if the car locked them up the millisecond it saw the person, it would have stopped 5 yards short? And someone else did the math and concluded that a human driver would have had the reaction time to start braking 0.5 seconds before the hit.

In both cases, the human braking would have reduced the fatality risk by 33%, and the AI's instant braking would have resulted in zero broken bones.
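The stopping-distance arithmetic being referenced is standard. A minimal sketch, assuming 40 mph, roughly 0.8 g of braking, and a 1.5 s human reaction time; these are textbook ballpark figures, not the forensic numbers from this case:

```python
# Total stopping distance = reaction distance + braking distance:
#   d = v * t_react + v^2 / (2 * a)
MPH_TO_MS = 0.44704   # meters per second per mph
G = 9.81              # m/s^2

def stopping_distance_m(v_mph, t_react_s, decel_g=0.8):
    v = v_mph * MPH_TO_MS
    return v * t_react_s + v ** 2 / (2 * decel_g * G)

print(f"Instant braking (t_react = 0):  {stopping_distance_m(40, 0.0):.1f} m")
print(f"Human driver (t_react = 1.5 s): {stopping_distance_m(40, 1.5):.1f} m")
```

With zero reaction time the car stops in about 20 m; the human's reaction time adds roughly another 27 m traveled before braking even begins, which is where the difference between the two claims above comes from.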
 
Don't most marked crosswalks have some kind of signal near them if they're not at an intersection?

Where I live they usually have a sign post preceding it (at the minimum) and newer areas have additional lighting, either overhead or under.
 
I agree; even a fully alert driver would have hit this poor person, no ifs or buts about it. They couldn't have avoided it in a 2-second time frame at night; that might have been a different story had it been daytime.
I blame the poor lady: not only was she jaywalking in the first place, but the bike had no lights of any kind, not even nighttime reflectors.
 
Let's see... a pedestrian with a bike not even looking to see if there's any traffic before moving into the street, wearing half black, at night. The car should have detected the pedestrian and reacted, and didn't; the driver was fucking with her phone instead of paying attention... yeah, I'm hoping this kills autonomous anything. I hear people talk about self-driving ANYTHING, and all I hear is "We are lazy, stupid, and want bad things to happen..."
 
People are too focused on what is shown in the video. The video represents a small fraction of what the car can actually see, IF the sensors are all working.

The car should have sensed the lady was moving long before she was ever in the frame of the video and it should have reacted. It did not react at all and that is the failing of the systems.

It does not matter whether she was in a crosswalk or not. She was a moving object on a collision vector with the car. The car failed to react to that.

Now, the safety driver is another issue. It has been proven, time and time again, that people cannot react to a changing situation quickly unless they have been active in whatever led up to those changes.

However, the safety driver should have been looking at where the car was going and not away from it.

Multiple failures led to the death of a person. Blaming the dead lady is wrong. Sorry, but the car had plenty of time to react and did not do so. The safety driver may not have been able to safely avoid her, but we will never know.

All I see here is autonomous cars are not ready for prime time. At least, Uber's aren't.
 
A 480p video?
It appears there is no time to react, but the visual quality of that footage is so poor I am not swayed one way or the other.
I can't say for certain I could do better, but I can say my vision is far more refined and detailed than the footage in that video.
 

interesting

Tempe police have identified the driver as 44-year-old Rafael Vasquez. Court records show someone with the same name and birthdate as Vasquez spent more than four years in prison for two felony convictions — for making false statements when obtaining unemployment benefits and attempted armed robbery — before starting work as an Uber driver.
 