Video Released of Uber Self-Driving Car Accident

I propose we take all the cheerleaders of this tech and let them beta test these cars on a private course for the next 10-20 years, and if nothing happens to them or their families, then we can talk about allowing them on the roads next to our families. Make sense?
 
Um. Look both ways before you cross the street? Being in the fire service, I have seen multiple people struck by human drivers in all kinds of circumstances. No one can say 100% for sure that a human driver would have avoided her. She wasn't even crossing in an area where you would expect to see someone, hence you're not normally searching for someone there.

Doesn't this also bring up the question: should the car slam into the guard rail, harming or killing the passengers, or hit the person? I'm pretty sure if a human driver hit this person it would be labeled an accident.
 
One less stupid person in the world. Darwinism working as intended.

Woman jaywalking was at fault based on Arizona law. Sorry all you bleeding-heart types. This is why responsible parents teach their kids to look both ways and cross at an intersection... so they don't grow up to be this idiot...
 
[Attached image: LIDAR.jpg]
 
And you know this how? Is that conclusion based on the four+ decades you knew I'd been driving? On your knowledge of the job I once had that required a lot of driving around NYC? Or the other job I had that involved a three hour commute five days a week for two years?

Oh wait, you don't know about any of that, do you? You're just talking out your ass, aren't you?
Well it's because you seem to be a royal dick. I only assume that spills over on your driving skills as well.
Only an idiot would assume that the video camera that recorded that video can "see" as well as a human being can in those lighting conditions.
It clearly cannot, as the lack of any details in most of the image makes clear.

Tempe is a land-locked low-rise city. There's hardly a square meter of it that isn't well-illuminated by streetlights.
You're never driving down a dark tunnel like that video makes it appear.
And headlights have a fairly sharp drop-off for illuminating things that aren't directly in front of the car. She started crossing the road only seconds before the crash and wasn't directly in front of the car either. If the car had passed that location 3 seconds later, the crash wouldn't have happened.

The beam dropoff point is fairly clear in the video. The homeless woman wasn't crossing in an area illuminated by a streetlight. This is fairly clear from the video as well.
 
A lot of cars differentiate between automatic collision avoidance for other cars and for pedestrians. Thus leading up to this hilarious video:

The point being? Current high-end cars have both, and Volvo first made it with pedestrians in mind.

A lot of people in this thread seem to think high-res, high-frequency sensors are off-the-shelf items. They are not. They are restricted, and I would imagine they aren't going to allow Joe Blow to buy a self-driving car and harvest the parts needed to make an ad-hoc missile.
I don't assume, I know. Well, not exactly off the shelf, but pretty easy to get. And we're not talking about Joe Blow, we're talking about Uber, a huge multinational company; if anyone can get access to any technology, it's them. What is not easy, and usually gets you a background check, is high-end MEMS and fiber-optic gyro (FOG) navigation systems. But for Uber? No problem.
 
Annnnnddd there it is. I'm pretty anti-self driving cars until the software can mature but this is very much a human failure. You had ONE JOB, lady, and you blew it.

The safety driver isn't there to respond to instantaneous accidents. There is no way they can maintain the alertness required for that. That is the reason that, during critical flight phases, an airline pilot is directly in the loop and not an auxiliary, even though we have systems that are generally perfectly capable of doing autonomous landings and takeoffs. It takes time to go from being an auxiliary to being directly in control.

Safety drivers are there to monitor the car and take control when the AV systems have a significant fault (crash, sensor freeze, etc). Think of them like your mom or dad riding shotgun while you learn to drive. They are there to monitor you and tell you to stop when you are screwing up.
 
One less stupid person in the world. Darwinism working as intended.

Woman jaywalking was at fault based on Arizona law. Sorry all you bleeding-heart types. This is why responsible parents teach their kids to look both ways and cross at an intersection... so they don't grow up to be this idiot...

Sorry, but in most parts of the world you can cross where you want on most types of roads. So unless Uber is only going to market this car in the USA, it had better be able to cope with pedestrians crossing the road.
 
If you start thinking that autonomous vehicles should be programmed for every instance or be able to come to a complete stop/avoid hitting people no matter what, then you're asking too much.

Can we take autonomy to jail?

The goal of AI is to be better than humans in these situations, not just like or worse than humans.
 
Under those circumstances, a human driver would have hit her too.

She was impossible to see until up close, and she was crossing a road with high-speed traffic without any sort of crosswalk. This was not a fault with the software. The only way a computer program would have been able to see that was if it was using infrared to see further down the road than a human could. A self-driving car with infrared would have been the only way she would have been avoided.

All AV systems use a combination of LIDAR/RADAR and IR/monochrome cameras. The AV system should have been able to see her like it was noon in the summer without a cloud in the sky. This was 100% the fault of the AV system.

Let's put it this way: this exact scenario is basically the demo case for the various production driver-assist systems out there, INCLUDING the one by Volvo, whose vehicles Uber is using for their test beds. Let that sink in: an off-the-lot Volvo with their current production driver-assist technology would have prevented the accident! The Uber AV system was strictly worse in this scenario than the unmodified underlying vehicle.
 
Splitting hairs. Using cameras only vs an actual system designed to detect obstructions.

LIDAR is a system for finding actual obstructions like buildings. It cannot handle soft bodies, especially when using eye-friendly frequencies. Same as fiberglass boats and RADAR; you have to mount reflectors to be explicitly noticed or risk a collision. This was a worst-case scenario for any driver, but especially for an autonomous vehicle.
 
All AV systems use a combination of LIDAR/RADAR and IR/monochrome cameras. The AV system should have been able to see her like it was noon in the summer without a cloud in the sky. This was 100% the fault of the AV system.

Let's put it this way: this exact scenario is basically the demo case for the various production driver-assist systems out there, INCLUDING the one by Volvo, whose vehicles Uber is using for their test beds. Let that sink in: an off-the-lot Volvo with their current production driver-assist technology would have prevented the accident! The Uber AV system was strictly worse in this scenario than the unmodified underlying vehicle.

The initial systems were, but that proved too expensive. These days they all use cheap USB web cams. They don't even have IR. So, no, it wouldn't have been able to see her any better than a human.
 
Woman jaywalking was at fault based on Arizona law.
Not just her. Maybe not even mainly her.

See https://www.all-about-car-accidents.com/resources/auto-accident/auto-accident-causes/can-pedestrian-be-at-fault-in-car-accident:
"A driver or pedestrian who fails to exercise [reasonable] care will be considered negligent if their action (or inaction) causes a traffic accident."
See also http://corporate.findlaw.com/litigation-disputes/comparative-negligence-in-arizona.html/

You're not a lawyer, are you?
 
Can we take autonomy to jail?

The goal of AI is to be better than humans in these situations, not just like or worse than humans.
First you'd have to prove that it wasn't the jaywalker's fault or the driver's fault. That doesn't seem to be the case here.

Second you'd have to show this was a criminal offense. I don't think that's the case here either. The only other issue is insurance for cases like this, and Uber would be moronic not to have that covered somehow.
 
The point being? Current high-end cars have both, and Volvo first made it with pedestrians in mind.


I don't assume, I know. Well, not exactly off the shelf, but pretty easy to get. And we're not talking about Joe Blow, we're talking about Uber, a huge multinational company; if anyone can get access to any technology, it's them. What is not easy, and usually gets you a background check, is high-end MEMS and fiber-optic gyro (FOG) navigation systems. But for Uber? No problem.

A company can but they cannot turn around and re-sell them, hence the forms. The point of these projects is to produce a car that doesn't trigger those regulations.
 
At that rate of speed, no, a human could not have stopped the car. A human might have been able to swerve to avoid a direct impact, but with less than 2 seconds of visibility before impact, I don't see how this is a major software failure.

Imagine it was noon and the human could see clearly 100+ ft in front of them. Because that's what every other AV or driver-assist system on the market sees.
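For anyone who wants the rough numbers behind the "2 seconds of visibility" argument, here is a back-of-the-envelope sketch. The 1.5 s reaction time and 0.7 g braking figures are my own assumptions, not anything from the video or an official report:

```python
# Rough stopping-distance math at 40 mph (all figures assumed/illustrative).
MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed = 40 * MPH_TO_MPS              # ~17.9 m/s (~58.7 ft/s)
reaction_time = 1.5                  # s, assumed human reaction time
decel = 0.7 * G                      # ~6.9 m/s^2, assumed braking on dry asphalt

reaction_dist = speed * reaction_time        # ~26.8 m (~88 ft)
braking_dist = speed ** 2 / (2 * decel)      # ~23.3 m (~76 ft)
total = reaction_dist + braking_dist         # ~50 m (~164 ft)

# With ~2 s of visibility the car covers ~36 m (~117 ft) before the impact
# point, so a human full stop is marginal at best; a sensor that picks her
# up 50+ m out has room to stop or at least shed a lot of speed.
print(f"total stopping distance ~ {total:.0f} m ({total * 3.28:.0f} ft)")
```

Which cuts both ways: it backs the "a human couldn't have stopped in 2 seconds" point and the "a system that sees farther than headlights should have" point.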
 
If you can't see well enough to react in the conditions at hand, you should not be traveling at that speed. Going in and out of overhead lighting shadows while you're trying to be all sly looking down at your text messages... Ze/Zir was the last safety interlock and was screwing around.

You must not drive that often. People do this ALL THE TIME. I'm ~15mi northwest of Birmingham, AL, where old US78/AL5 runs towards Jasper; it's absolutely fucking dark along several sections of this highway, and people routinely drive 10+mph over the speed limit. Every so often, there's either someone walking across/along the highway, or deer/boars crossing, or a vehicle turning into a side street; they get hit.

There are sections of the same road closer to B'ham that are similar to this road in Tempe, and people have been hit while trying to cross the road at night. In fact, in the city, where there's far more light, people are somehow hit on a regular basis.
 
It depends on the power and wavelength of the emitter. And the application. Mapping sensors see asphalt quite well, while getting almost no reflections from snow. At least not from snow already on the ground, but who cares about mid-air snowflakes? That's just noise for lidar.

LIDAR is a system for finding actual obstructions like buildings. It cannot handle soft bodies, especially when using eye-friendly frequencies. Same as fiberglass boats and RADAR; you have to mount reflectors to be explicitly noticed or risk a collision. This was a worst-case scenario for any driver, but especially for an autonomous vehicle.
You're pulling that out from thin air. There is no correlation between being "soft bodied" and being reflective to lidar or not.
 
The eye has tremendous dynamic range, way beyond that camera's. I know headlights show a lot more to the eyes than that camera shows. Pretty sure everyone here drives, at least at night sometimes. As for it being avoidable: for a human, yes, if they spot her at the tip of the headlights and aren't looking in the mirror, at the other side of the road, playing with the radio, feeling for that lost french fry between the seats, etc.

If I had a self-driving car and I'm supposed to be the backup system that, in a split second, after hours, days, or months of driving, avoids anything that might pop up, I would just throw the system out the window, not look back, and drive myself. No way would I be able to just stare and stare and stare and do nothing else. Give me the wheel; off you go.
 
Then you're an idiot who has no experience driving. It's a straight road, no obstructions, car traveling at 40 MPH. If the headlights were working properly, a human driver paying attention would have plenty of time to see the woman entering the road and react appropriately.

I live in Tempe. I know this environment. This accident shouldn't have happened, period.

Uber will fry for this and they deserve to. They are going to get hammered in court (or settle for 7+ figures) and their self-driving tech is going into the garbage can.

There's a good chance this is related to Uber's alleged theft of the self-driving technology they are using.
If they'd developed it themselves, they might actually understand it, but if they just stole it ... not so much.

It's also related to Uber's corporate culture of breaking the law and putting people at risk (like by hiring felons as drivers) in the name of corporate valuation.
I live near Tempe, and she passed between the light poles, which is the darkest spot (as far as I can tell).
Reaction times are different between people and cars. Most likely she would have been hit (IMO).
 
It depends on the power and wavelength of the emitter. And the application. Mapping sensors see asphalt quite well, while getting almost no reflections from snow. At least not from snow already on the ground, but who cares about mid-air snowflakes? That's just noise for lidar.


You're pulling that out from thin air. There is no correlation between being "soft bodied" and being reflective to lidar or not.

How do reflections work again when using lasers with narrow FOV?
 
I blame it on poor jaywalking skills. Never assume the car will slow down. Never run the full distance if the car is less than a block away.
 
Would a non-self-driving car strike the person in the video under these circumstances? I'm willing to bet yes, since it was dark and they were on a 2-lane road with no traffic, going at around 40 mph. The woman doesn't seem to be crossing anywhere she should be and wasn't wearing reflective clothing, at least none that can be seen in the video.

If you start thinking that autonomous vehicles should be programmed for every instance or be able to come to a complete stop/avoid hitting people no matter what, then you're asking too much. I'm willing to bet that the car isn't programmed to look out for people when on a highway.

I have one problem with your logic. Sensors can "see" better than the human eye can.

Not saying this particular incident could have been avoided. Just addressing your specific logic.

And you would be wrong about autonomous cars not looking for "people" on a highway. They do not look for anything. They sense everything that could cause a collision or be collided with. After that, all bets are off as to how they determine what they can safely collide with and what they cannot collide with.

At least, it had better work that way or the entire autonomous industry is in for a very short run with all the lawsuits they could be hit with.

Remember the Google/Waymo car which hit a dog? It was a bug in the software which caused it.

The car failed. It did not make any attempt, at all, to avoid or slow down once an object was tossed in front of it. The object being the pedestrian. Sensors should have picked her up leaving the curb, crossing one lane, and then stepping in front of the car. The car should have slowed long before the woman was in front of the car. It did nothing. This is a failure of the system.
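To make the "should have slowed long before she was in front of the car" point concrete, here is a minimal toy sketch of the kind of check a tracking/planning loop can make for an object drifting across lanes. This is my own illustration, not Uber's (or anyone's) actual code; the names and thresholds are made up:

```python
from dataclasses import dataclass

@dataclass
class Track:
    ahead_m: float          # distance in front of the car
    lateral_m: float        # offset from the car's path centerline (positive = left)
    lateral_vel_mps: float  # lateral speed (negative = moving toward our path)

def predicted_in_path(track: Track, ego_speed_mps: float,
                      path_half_width_m: float = 1.5) -> bool:
    """True if the object is predicted to be inside our path when we reach it."""
    if ego_speed_mps <= 0 or track.ahead_m <= 0:
        return False
    time_to_reach = track.ahead_m / ego_speed_mps
    predicted_lateral = track.lateral_m + track.lateral_vel_mps * time_to_reach
    return abs(predicted_lateral) < path_half_width_m

# A pedestrian 50 m ahead, one lane width (3.5 m) to the left, walking toward us
# at 1.4 m/s while we do 40 mph (~17.9 m/s): she reaches our lane about when we do.
ped = Track(ahead_m=50.0, lateral_m=3.5, lateral_vel_mps=-1.4)
print(predicted_in_path(ped, ego_speed_mps=17.9))  # True -> start slowing now
```

A constant walking pace across an open lane is about the easiest motion there is to extrapolate, which is why "it did nothing" is the damning part.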
 
Would a non-self-driving car strike the person in the video under these circumstances? I'm willing to bet yes, since it was dark and they were on a 2-lane road with no traffic, going at around 40 mph. The woman doesn't seem to be crossing anywhere she should be and wasn't wearing reflective clothing, at least none that can be seen in the video.

If you start thinking that autonomous vehicles should be programmed for every instance or be able to come to a complete stop/avoid hitting people no matter what, then you're asking too much. I'm willing to bet that the car isn't programmed to look out for people when on a highway.

Then they don't belong on the highway because a normally and reasonably attentive driver does.

That being said, going by the video, I do not think a human driver would have avoided killing that woman, and I don't think it would have been the driver's fault, as long as the imagery from the camera is representative of how a human would have seen it. Keep in mind, it might not be the same. It's entirely possible that an attentive human would have detected the woman earlier, since the camera may misrepresent the lighting conditions. Yes, it's 10 PM at night, but there are lights right at the "Y" intersection, one on either side.

I would challenge how well the dash cam represents the actual lighting conditions a human would have experienced. I am also wondering why the car isn't taking greater advantage of its high beams. It's a fucking autonomous car; it should be able to make better use of the high beams, and that could have made a real difference depending on the car's other sensors.

I think this is a failure of its collision avoidance system; hell, the car didn't seem to react at all.
 
Another stupid fuck behind the wheel who doesn't know how to do their job. You're behind the wheel for a reason: to scan the road for anything the auto system misses, not to look down at your fucking phone and look up every few seconds.

Also the cyclist was a stupid fuck for not checking before crossing the road.
 
Well it's because you seem to be a royal dick. I only assume that spills over on your driving skills as well.
You have the logic skills of a gerbil. Congrats on still managing to breathe without being reminded to.

She started crossing the road only seconds before the crash
She crossed the other lane before entering the lane the Uber car was in. A human being would have seen her there.
And you're still latched on to the idiot notion that a human driver could only see what that shitty video shows, aren't you?
A human driver would have been able to read the sign on the right, see the outline of the buildings (we got plenty of skyglow here) and so on.

What, you have Uber stock options or something? Too bad for you.
 

Stop relying on the video. That video is poorer than human vision. And that video CANNOT be the primary feed for controlling the automobile.
If it is, this car will never get approved.
 
Under those circumstances, a human driver would have hit her too.

She was impossible to see until up close, and she was crossing a road with high-speed traffic without any sort of crosswalk. This was not a fault with the software. The only way a computer program would have been able to see that was if it was using infrared to see further down the road than a human could. A self-driving car with infrared would have been the only way she would have been avoided.

Agreed. I am wondering why there are no infrared sensors on this car. That would have prevented this entirely.
 
First you'd have to prove that it wasn't the jaywalker's fault or the driver's fault. That doesn't seem to be the case here.

Second you'd have to show this was a criminal offense. I don't think that's the case here either. The only other issue is insurance for cases like this, and Uber would be moronic not to have that covered somehow.

The guy in the car is not watching the road the way a normal driver would; isn't that negligence? What he did is equivalent to texting while driving.
Even though it's an autonomous car, he is still the operator. By law, the blame can be placed on the victim for jaywalking, but that still doesn't refute that Uber's autonomous cars are no better than human drivers.

If AI is no better than human drivers but is still used to replace human drivers, it's a ploy to maximize corporate profits at the risk of the public.
 
I have one problem with your logic. Sensors can "see" better than the human eye can.

Not saying this particular incident could have been avoided. Just addressing your specific logic.

And you would be wrong about autonomous cars not looking for "people" on a highway. They do not look for anything. They sense everything that could cause a collision or be collided with. After that, all bets are off as to how they determine what they can safely collide with and what they cannot collide with.

At least, it had better work that way or the entire autonomous industry is in for a very short run with all the lawsuits they could be hit with.

Remember the Google/Waymo car which hit a dog? It was a bug in the software which caused it.
I can only assume a few things about the logic in a self-driving car.
Should object detection exist for objects outside the path of the vehicle? Should objects smaller than cars be detected and avoided when you're going past a certain speed? At what point do you estimate an object's relative motion and predict you might get into an accident?

I'm just going to assume a few things, which I'm sure someone will tell me are wrong. I would assume that at a high enough speed you would turn off pedestrian detection and only detect cars. Why waste CPU cycles on things that shouldn't be there? I would also assume that maybe you would turn it off in high-speed areas with no pedestrian crossings.

I would also assume that traveling at 40-75 mph is different than traveling at 0-30. The rules that apply to both manual drivers and automated software are fairly self-apparent.

I'm not saying it couldn't be made more effective. I'm just throwing out my ideas on why this happened in the first place. The logic should be changed so that if an object is detected while you're traveling at speed and its estimated path might intersect with the vehicle, the vehicle should probably drop its speed so there can be more reaction time. However, you have to balance that with false alarms.
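Here is a toy sketch of that "slow down when a predicted intersection looks plausible, without slamming the brakes on every false alarm" idea. The function name, confidence input, and thresholds are all made up for illustration; no real system is this simple:

```python
def speed_response(dist_to_conflict_m: float, ego_speed_mps: float,
                   confidence: float,
                   comfort_decel: float = 3.0,   # m/s^2, assumed comfortable braking
                   max_decel: float = 7.0,       # m/s^2, assumed hard-braking limit
                   min_confidence: float = 0.3) -> str:
    """Pick a graded response based on how hard we'd have to brake to stop in time.

    confidence: how sure the tracker is that the object's path crosses ours (0..1).
    """
    if confidence < min_confidence or dist_to_conflict_m <= 0:
        return "maintain speed"                   # likely a false alarm
    required_decel = ego_speed_mps ** 2 / (2 * dist_to_conflict_m)
    if required_decel <= comfort_decel:
        return "ease off / gentle slow-down"      # cheap way to buy reaction time
    if required_decel <= max_decel:
        return "brake firmly"
    return "emergency brake, steer if clear"

# Predicted conflict 60 m ahead at 40 mph (~17.9 m/s): only ~2.7 m/s^2 needed,
# so shedding some speed early is enough.
print(speed_response(60.0, 17.9, confidence=0.6))
```

Grading the response like this is the usual way to handle the false-alarm trade-off: low-confidence or far-away predictions only cost a little speed, not a panic stop.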
 