Video Released of Uber Self-Driving Car Accident

We can blame the lighting issue on NHTSA and the fact that they haven't allowed Volvo (or any other manufacturer) to use their active high beams. Had they been in use, there wouldn't have been an unlit area to the left front of the car, since the active high beam would light up that area until an oncoming car approached and the beam adapted around it. Instead, the antiquated headlight laws still have vehicles dipping their beams down and to the right so they don't dazzle oncoming traffic. That's no longer an issue with the advanced lighting systems implemented in vehicles, but they still haven't been approved for use in the US.

It still doesn't excuse where this woman decided to cross the street, but it certainly could have illuminated her earlier.


There was no unlit area, it's just bad video displayed in the clip. Those streets are very well lit at night.
 
So sorry for your anger at the inadequacies of self-driving cars being publicly outed... I'm guessing this will considerably slow the development of "killer" cars. Does this mean you will have to drive your own car for the next 20 or 30 years? FYI: yes...
I care nothing about self-driving cars and honestly I think the whole damn idea is quite stupid. The point that you clearly did not comprehend was that it was asinine to assume the driver paying attention would have been able to avoid the accident.
 
That's the $64,000 question. It depends on how fast you are going, how far you want to see, and what kind of processing you are going to do with it. Sensor pitch and interpolation can mean missing perfectly horizontal or vertical edges, especially on small arrays. How fast those arrays can be reliably dumped matters too. None of which ultimately matters, since the Uber system seems to mount only visible-light cameras.
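To put rough numbers on the pitch-and-distance tradeoff, here's a back-of-envelope sketch. The focal length and pixel pitch are illustrative guesses for a typical automotive camera, not Uber's actual hardware:

```python
# How many vertical pixels does a pedestrian occupy at a given distance?
# All numbers are illustrative assumptions, not the specs of any real system.
FOCAL_LENGTH_MM = 6.0    # assumed lens focal length
PIXEL_PITCH_UM = 3.75    # assumed sensor pixel pitch
TARGET_HEIGHT_M = 1.7    # approximate pedestrian height

def pixels_on_target(distance_m):
    """Approximate vertical pixel count a pedestrian spans at distance_m."""
    # Pinhole projection: image height = focal length * object height / distance
    image_height_mm = FOCAL_LENGTH_MM * TARGET_HEIGHT_M / distance_m
    return image_height_mm * 1000.0 / PIXEL_PITCH_UM

for d in (20, 40, 80):
    print(f"{d} m: ~{pixels_on_target(d):.0f} px tall")
```

With these assumed numbers, the pixel count falls off linearly with distance, which is why small, cheap arrays run out of detail well before the range a fast-moving car needs.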

So, basically, Uber fucked up by using sensors that are not all-weather/all-lighting capable.

Any system that is going to be on the road driving autonomously damn well better be able to see at night and in the rain, and be able to react to someone pushing a damn bicycle with reflectors on it across an empty street while the car is doing 40 mph.

These cars have no business on the street if someone is going to be struck in what should have been a scenario highlighting the strengths of the technology. Conditions that would be challenging to a human driver should be trivial to a properly designed sensor suite on a self-driving car. This obviously isn't the case.
 
What video were you watching? Because the one posted shows a walking victim, not a running one. And what does a crosswalk have to do with it? Do Uber's sensors detect faded crosswalk paint better than they detect a bicycle and an ambling human?
She walked in front of the car, yes, not ran. That is worse, like she didn't care or was on drugs. A crosswalk is where you cross if you are crossing streets. They are way better lit and have signals and such. Much safer than crossing in the middle.
 
She walked in front of the car, yes, not ran. That is worse, like she didn't care or was on drugs. A crosswalk is where you cross if you are crossing streets. They are way better lit and have signals and such. Much safer than crossing in the middle.

Well, in the UK you can cross wherever you like on most roads. I assume Uber's systems are supposed to work in every country.
 
She walked in front of the car, yes, not ran. That is worse, like she didn't care or was on drugs. A crosswalk is where you cross if you are crossing streets. They are way better lit and have signals and such. Much safer than crossing in the middle.
Stop overlooking the fact that the sensors missed an object. The car in the video didn't even attempt to stop, which means it didn't see the person it was designed to see.
We get it, the lady was stupid for not crossing at the crosswalk; hell, it infuriates me when dumbass college students in the town I live in walk out into the crosswalk when a car is 25 ft away doing 30 mph. It doesn't change the fact that the sensors missed an object at night. This shouldn't even happen. I really would like to see the data on what the car saw before the crash.
 
I care nothing about self-driving cars and honestly I think the whole damn idea is quite stupid. The point that you clearly did not comprehend was that it was asinine to assume the driver paying attention would have been able to avoid the accident.


Ummm, those streets are very well lit, as several of us have said, several times. If the driver had been paying attention, it's entirely possible he would have seen the pedestrian and been able to react. But that's not always going to be the case, and I agree that it's unwise to put people into this situation, expecting them to remain alert with no manual interaction keeping their attention. Luckily for the driver, my understanding of the laws on these vehicles suggests that they are not required to be in the seat for this kind of thing and may not be held liable at all. Uber, on the other hand, is very likely to get hit with criminal charges.

That bad video shows a dark, unlit road when it's obvious to people from the area that those roads are not unlit like that, and that raises questions.

But here are some things to keep in mind.

The camera that produced that video might not be from one of the Nav Cameras and might not be representative of "what the car actually saw".
Maybe the video was doctored, it's not impossible.
Maybe the government doesn't want this to blow up on Ducey; it was Arizona that chose to allow all this testing.
I'm sure there are other maybes as well.

But what I see as likely is that the cops initially posted this video and made statements that make it sound like they bought it: that the streets were unlit and the accident unavoidable. Though I did see one comment that left such a ruling in a less certain light. I can't say if they truly bought it and are re-examining it, or if they already suspected. Maybe they are investigating whether the video was doctored, and that's why not much is really being said right now.

What the government does have to do is decide if this accident was unavoidable or if negligence was involved. I believe the accident was avoidable, that the car systems failed, and that Uber should face criminal charges. I do not see the driver as responsible nor do I believe the victim was so negligent in creating the situation that it should free Uber of their own failure in this.
 
There was no unlit area, it's just bad video displayed in the clip. Those streets are very well lit at night.

Video is part of how the vehicle is making its decisions while driving. Had Volvo's active high beams been on, that area would have been illuminated to the point that the video would have shown it that way.

You can clearly see the headlight beam's cutoff at 3 seconds into the video, where the bottom of the wheel of her bike starts to show; that entire lane would have been lit instead of the low-beam cutoff keeping the woman in the shadow.
 
I think the real issue here is not so much that Uber's software failed... I mean, yeah, that's tragic and all, but if Uber purposefully doctored that video, that says loads more, and that should have Uber engineers, CEOs, etc. facing criminal charges.

Given the amount of corporate ass-hattery and legal line toeing/crossing that Uber has shown thus far, I wouldn't put it past them to have actually done just that.
 
Video is part of how the vehicle is making its decisions while driving. Had Volvo's active high beams been on, that area would have been illuminated to the point that the video would have shown it that way.

You can clearly see the headlight beam's cutoff at 3 seconds into the video, where the bottom of the wheel of her bike starts to show; that entire lane would have been lit instead of the low-beam cutoff keeping the woman in the shadow.

You can drive those roads at night with no lights on at all; they are that well lit.

No, go back and read my entire post again, because you missed something.

There was no unlit area, it's just bad video displayed in the clip. Those streets are very well lit at night.

And my next post;

The camera that produced that video might not be from one of the Nav cameras, and might not be representative of "what the car actually saw".

If you take a moment and look at some of the videos of what those roads actually look like at night, then you will get it. Now, if you think that the camera that took this video is one of the cameras used for navigation, and that these cameras are simply not suitable for the purpose and this is why the vehicle failed... maybe. It's possible, and I don't have definitive information one way or the other, so I have to say that it's possible.

But that doesn't explain why the LIDAR didn't pick up the woman and her bicycle.
 
Video is part of how the vehicle is making its decisions while driving. Had Volvo's active high beams been on, that area would have been illuminated to the point that the video would have shown it that way.

You can clearly see the headlight beam's cutoff at 3 seconds into the video, where the bottom of the wheel of her bike starts to show; that entire lane would have been lit instead of the low-beam cutoff keeping the woman in the shadow.
The light pole on the street casts enough light to clearly illuminate the area. You could still see a silhouette of a person against the background. Looks to me like the video was doctored to reduce the light... IMO.
Secondly, the video on YouTube is nothing more than 480p... the video quality is extremely poor, and at about 40 feet you can see the woman's shoes light up, which means the camera could have seen her but the car didn't. The video is purposely bad for a reason.
 
The light pole on the street casts enough light to clearly illuminate the area. You could still see a silhouette of a person against the background. Looks to me like the video was doctored to reduce the light... IMO.
Secondly, the video on YouTube is nothing more than 480p... the video quality is extremely poor, and at about 40 feet you can see the woman's shoes light up, which means the camera could have seen her but the car didn't. The video is purposely bad for a reason.

It would be nice if they showed more of the video, with the vehicle coming over the bridge, so we had a better idea of the video's lighting prior to right when the accident took place. My biggest issue is this IIHS diagram of the Volvo's lighting.

Screen-Shot-2018-03-23-at-9.24.48-AM.png


Because the low beams are dipped to the right to prevent blinding oncoming traffic, the cutoff for the low beam is only 110 ft for everything to the left of the front of the vehicle, exactly where this woman was walking from. If this vehicle had been fitted with Volvo's Active High Beams, they could have been on the entire time, illuminating 400 ft in front of the left side of the vehicle, and because they are active, only dipping part of the beam for oncoming vehicles.
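For a rough sense of why that 110 ft left-side cutoff matters, here's a back-of-envelope stopping-distance sketch. The reaction time and friction coefficient are generic textbook assumptions, not values measured from this crash:

```python
# Compare total stopping distance at 40 mph against a 110 ft low-beam cutoff.
# Reaction time and tire/road friction are illustrative textbook assumptions.
MPH_TO_MPS = 0.44704
M_TO_FT = 3.28084
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_ft(speed_mph, reaction_s=1.5, friction=0.7):
    """Reaction distance plus braking distance, returned in feet."""
    v = speed_mph * MPH_TO_MPS
    reaction_m = v * reaction_s                 # distance covered before braking
    braking_m = v ** 2 / (2 * friction * G)     # kinematics: v^2 / (2*mu*g)
    return (reaction_m + braking_m) * M_TO_FT

print(f"Total stopping distance at 40 mph: {stopping_distance_ft(40):.0f} ft")
```

With these assumptions the total comes out to roughly 164 ft, comfortably beyond the 110 ft the dipped low beams cover to the left, which is the whole point about the pedestrian emerging from that side.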
 
It would be nice if they showed more of the video, with the vehicle coming over the bridge, so we had a better idea of the video's lighting prior to right when the accident took place. My biggest issue is this IIHS diagram of the Volvo's lighting.

View attachment 61241

Because the low beams are dipped to the right to prevent blinding oncoming traffic, the cutoff for the low beam is only 110 ft for everything to the left of the front of the vehicle, exactly where this woman was walking from. If this vehicle had been fitted with Volvo's Active High Beams, they could have been on the entire time, illuminating 400 ft in front of the left side of the vehicle, and because they are active, only dipping part of the beam for oncoming vehicles.


Not going to accept people telling you that you could have had your lights off and seen that woman?

The street lights would have had her lit just fine, even in her black top. In the videos people have posted for you to look at, you can see the branches of the trees at the side of the road next to the light pole, and that's not because of any car lights.

You know, people take these things for granted; they don't think about them all the time, even cops. The cops want the video, Uber pulls it. I'm not going out on a limb and saying Uber doctored the video; it could just be a shitty dash cam and not the actual navigation cams, and there may be no navigation-cam footage because it might not be recorded data. So I am not going to go out on that branch. But I can see some cops waiting on the footage, Uber delivers, and they watch it and start making comments, forgetting that those roads aren't actually dark at all. I bet they have remembered it by now, though.

EDIT: I do want to add something. A workmate has the opinion that no one will be charged with anything, because it's pretty obvious that no matter the failures of driver and car, the woman was crossing illegally and was required to yield right of way to vehicles. We've had others killed similarly by normal vehicles, and as long as nobody is driving under the influence, it's a done deal. So as much as I may feel that the woman should not have died in this instance, I think my friend is probably correct and no charges related to the death of the woman may come from it.

That doesn't mean that this isn't evidence of another problem, specific to the vehicle's failure to react in any way that we would expect it to. As a technical aspect of autonomous vehicle testing in Arizona, I believe there is a big issue here that must be addressed, even if it's not about who's at fault for this fatality.
 
Not going to accept people telling you that you could have had your lights off and seen that woman?

I'm going to go based off the evidence that we have from the in car video. If you'd like to get in an XC90 and do the same drive in the same conditions, by all means, go for it. I'm sure the Tempe police are doing a full investigation and won't be covering anything up for Uber.

The street lights would have had her lit just fine, even in her black top. In the videos people have posted for you to look at, you can see the branches of the trees at the side of the road next to the light pole, and that's not because of any car lights.

Even in the videos posted, you can see right after he comes off the bridge and under the overpass, it's really dark until another car overtakes him in the left hand lane. Please go do the same drive in an XC90 with no other vehicles around and post the video.

You know, people take these things for granted; they don't think about them all the time, even cops. The cops want the video, Uber pulls it. I'm not going out on a limb and saying Uber doctored the video; it could just be a shitty dash cam and not the actual navigation cams, and there may be no navigation-cam footage because it might not be recorded data. So I am not going to go out on that branch. But I can see some cops waiting on the footage, Uber delivers, and they watch it and start making comments, forgetting that those roads aren't actually dark at all. I bet they have remembered it by now, though.

If that's the case, then once the full report comes out, we'll read more about it.

My entire point was that the XC90 has a 110 ft cutoff on its low beams for anything to the left of the vehicle. If NHTSA weren't blocking car manufacturers from using modern vehicle lighting systems, that road could have been illuminated by the vehicle's headlights as well as any streetlights. Even though, apparently, it's like daytime there at night because the streetlights are amazing.
 
I'm going to go based off the evidence that we have from the in car video. If you'd like to get in an XC90 and do the same drive in the same conditions, by all means, go for it. I'm sure the Tempe police are doing a full investigation and won't be covering anything up for Uber.



Even in the videos posted, you can see right after he comes off the bridge and under the overpass, it's really dark until another car overtakes him in the left hand lane. Please go do the same drive in an XC90 with no other vehicles around and post the video.



If that's the case, then once the full report comes out, we'll read more about it.

My entire point was that the XC90 has a 110 ft cutoff on its low beams for anything to the left of the vehicle. If NHTSA weren't blocking car manufacturers from using modern vehicle lighting systems, that road could have been illuminated by the vehicle's headlights as well as any streetlights. Even though, apparently, it's like daytime there at night because the streetlights are amazing.


So I have been looking harder at the videos and some things people have been posting online about where the woman was hit.

First, I need to say that I was under the false impression that the left lane was an exit lane cutting away to a new roadway. It's not; it's just the beginning of a normal turn lane leading up to an intersection.

Some photos show the bricked "crossing" areas that aren't actually pedestrian crossings, but these are just before the tree and the streetlight, so she was crossing at the top of the turn lane, then the left lane, and into the right lane. What's more, there are actually two left-turn lanes at that intersection.

So as I pointed out above, the legal aspect, I'd say that if history is any guide, neither Uber nor the driver will be charged.

And ND40oz, do you have an XC90 you'd sell me? Or rent for a day?
 
Video is part of how the vehicle is making its decisions while driving. Had Volvo's active high beams been on, that area would have been illuminated to the point that the video would have shown it that way.

You can clearly see the headlight beam's cutoff at 3 seconds into the video, where the bottom of the wheel of her bike starts to show; that entire lane would have been lit instead of the low-beam cutoff keeping the woman in the shadow.

I'm going to go based off the evidence that we have from the in car video. If you'd like to get in an XC90 and do the same drive in the same conditions, by all means, go for it. I'm sure the Tempe police are doing a full investigation and won't be covering anything up for Uber.



Even in the videos posted, you can see right after he comes off the bridge and under the overpass, it's really dark until another car overtakes him in the left hand lane. Please go do the same drive in an XC90 with no other vehicles around and post the video.



If that's the case, then once the full report comes out, we'll read more about it.

My entire point was that the XC90 has a 110 ft cutoff on its low beams for anything to the left of the vehicle. If NHTSA weren't blocking car manufacturers from using modern vehicle lighting systems, that road could have been illuminated by the vehicle's headlights as well as any streetlights. Even though, apparently, it's like daytime there at night because the streetlights are amazing.

From one of the YouTube posters, regarding the video comparing Uber's clip with the user-submitted footage:

Matt Elliott 21 hours ago
No one drives with high beams on AZ roads within the city. I did not record this video, but I live here, and this is what it really looks like at night.



In the Uber video, the gamma or brightness is definitely turned down. You can see deep down the road: the buildings, the street lights, the parking garage beyond, other cars, all clearly. But the Uber video is a mess of darkness.

Taken on its own, the darkness could prove Uber's negligence if that's what they consider safe for the car to see. Geeze lol.
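For anyone wondering what "gamma turned down" actually does to footage, here's a minimal sketch of a gamma curve applied to 8-bit pixel values. The curve and the sample values are purely illustrative, not anything recovered from the Uber clip:

```python
# Demonstrate how lowering gamma crushes shadow detail: dim values collapse
# toward black while bright values barely move. Pixels are 8-bit (0-255).
def apply_gamma(value, gamma):
    """Apply a power-law gamma curve to one 8-bit pixel value."""
    return round(255 * (value / 255) ** (1 / gamma))

# With gamma = 0.5 the exponent is 2, so the curve darkens the image:
# a dim pedestrian at value 60 drops to 14, while a streetlight at 230
# only falls to 207.
for v in (30, 60, 128, 230):
    print(v, "->", apply_gamma(v, 0.5))
```

That asymmetry is exactly the look people are describing: streetlights and distant buildings stay visible while anything in shadow turns into "a mess of darkness".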
 
It looks like, from the first point where the bicycle is visible, that the person is just past the street light. It seems to me that a test driver who was paying attention should have been able to see the person, despite what the video shows.

Regardless, video of the test drive does show that the person was not paying attention as if they were actually driving. And that's one of the biggest issues I have. The self driving car needs to be operated under the assumption that something can go wrong, and the test driver is there to correct for it.

If the test driver were paying attention properly, and still couldn't avoid the collision, then I would say it's just an unfortunate accident. Should the car have corrected itself? Well, that's where we're trying to get to.
 
I think if you hire a person to watch paint dry, they won't be able to do it 5 days a week, 4 weeks a month, without their thoughts wandering off. Imagine how much worse it is for someone not being paid.
 
It looks like, from the first point where the bicycle is visible, that the person is just past the street light. It seems to me that a test driver who was paying attention should have been able to see the person, despite what the video shows.

Regardless, video of the test drive does show that the person was not paying attention as if they were actually driving. And that's one of the biggest issues I have. The self driving car needs to be operated under the assumption that something can go wrong, and the test driver is there to correct for it.

If the test driver were paying attention properly, and still couldn't avoid the collision, then I would say it's just an unfortunate accident. Should the car have corrected itself? Well, that's where we're trying to get to.

Except that this is not a requirement under the law as Arizona has defined it. Under current Arizona law, these drivers are not even required. What is required is that the vehicles have a mechanism to move themselves off the roadway and shut down in the event of a failure. For the Uber trucks that are making interstate deliveries, they don't even call them drivers; they are only in the vehicle to monitor the equipment and systems.

I know that a "Safety Driver" is a requirement in some states, but it's not in Arizona.
 
You can drive those roads at night with no lights on at all; they are that well lit.

No, go back and read my entire post again, because you missed something.



And my next post;



If you take a moment and look at some of the videos of what those roads actually look like at night, then you will get it. Now, if you think that the camera that took this video is one of the cameras used for navigation, and that these cameras are simply not suitable for the purpose and this is why the vehicle failed... maybe. It's possible, and I don't have definitive information one way or the other, so I have to say that it's possible.

But that doesn't explain why the LIDAR didn't pick up the woman and her bicycle.
Since you don't know, all you can do is assume. Like the rest of us.

I say most blame is on the lady walking (jaywalking) across the street (maybe 75%), 20% on the camera system, and 5% on the driver (even if paying attention, I doubt they could have reacted in time).
 
Uber should be liable, and here is why: making a person innocent when hitting someone jaywalking makes sense... it's about protecting people from undeserved criminal charges, and the law works on the assumption that nearly everyone does not want to run over random people. The law, I think, was not intended to protect malfunctioning machines barreling down the road. So which Uber executive should go to jail, anyway?
 
Since you don't know, all you can do is assume. Like the rest of us.

I say most blame is on the lady walking (jaywalking) across the street (maybe 75%), 20% on the camera system, and 5% on the driver (even if paying attention, I doubt they could have reacted in time).


Since I don't know what?

The only thing I alluded to that I said I don't know is this:
It's possible, and I don't have definitive information one way or the other, so I have to say that it's possible.

Hagrid, I didn't assume anything at all, I left the possibility open. Do you really have a problem with that? With me saying that it's possible?

But if you were paying any attention at all to my part in this discussion, you'd get that I have come to the conclusion that, legally, no one is going to be charged, because the woman was crossing the road illegally and failed to yield to traffic, which is the law in Arizona.


28-793. Crossing at other than crosswalk
A. A pedestrian crossing a roadway at any point other than within a marked crosswalk or within an unmarked crosswalk at an intersection shall yield the right-of-way to all vehicles on the roadway.
B. A pedestrian crossing a roadway at a point where a pedestrian tunnel or overhead pedestrian crossing has been provided shall yield the right-of-way to all vehicles on the roadway.
C. Between adjacent intersections at which traffic control signals are in operation, pedestrians shall not cross at any place except in a marked crosswalk.

And although it doesn't apply in this case, because the accident was outside of the defined area:
Jaywalking in Tempe
Chapter 19:

Sec. 19-1(2) Central business district means all streets and portions of streets within the area described as follows. All that area bounded by the salt river on the north, to 10th Street on the south and from Myrtle Avenue on the east to Maple Avenue on the west. [note: the ASU campus is, mostly, not in the central business district. Rather it is east of the CBD.]

Sec. 19-151. Crossing a roadway.
(a) No pedestrian shall cross the roadway within the central business district other than within a marked or unmarked crosswalk.
(b) Every pedestrian crossing a roadway outside of the central business district at any point other than within a marked or unmarked crosswalk shall yield the right-of-way to all vehicles upon the roadway.
(c) No pedestrian shall cross a roadway where signs or traffic control signals prohibit such crossing.

I suppose I am just having a hard time deciding what it is you think I am making assumptions about.
 
The video is pretty hard to watch; there's seemingly no reaction by the car to the obstruction. Just a fail by everyone all around: the pedestrian paying absolutely no attention to oncoming traffic while jaywalking, the car driving in a dimly lit area with no high beams, the backup driver not giving a damn, etc.

In the Uber video, the gamma or brightness is definitely turned down. You can see deep down the road: the buildings, the street lights, the parking garage beyond, other cars, all clearly. But the Uber video is a mess of darkness.

Taken on its own, the darkness could prove Uber's negligence if that's what they consider safe for the car to see. Geeze lol.

Damn, wouldn't discount that being the case with Uber. They don't have the best record with ethics, and re-looking at the video, she's passing right between two lights but is shrouded in darkness.
 
At that rate of speed, no, a human could not have stopped the car. A human might have been able to swerve to avoid a direct impact, but with less than 2 seconds of visibility before impact, I don't see how this is a major software failure.

Well, you could say it's a software failure if the sensors are not light-dependent and therefore would have had the same non-reaction on a perfectly sunny day.
 
There was a time when it was a big discussion online, about how autonomous cars need to be able to make judgement calls that would "choose the lesser of two evils" so to speak.

As I said at the time, I find this terribly arrogant and morally wrong.

I can tell you all from terrible first-hand experience: in these situations, just a few inches can mean life or death, the difference between someone being injured as they roll along down the side of your vehicle or having their head crushed against your windshield as their body is propelled up into the air, ending in certain death.

Inches

A human has no time to make wild judgement calls weighing the value of this life or that. In most cases, all a human can do is try not to hit anything, one thing at a time, until they can get stopped or past the danger. This is all we should try to make a car do in our stead: try to miss, then try to live. Nobody in an accident "chooses to die". I want to caveat this: sometimes a person will choose not to hit someone whom they could not stand to hit. "I'll do anything as long as I don't hit this kid," and sometimes that might mean death, it might even be certain death, but this kind of choice isn't a split-second surprise choice; it's the thinking that flashes through a person's mind while the world turns into stop-frame time. Usually they may not even have time to act on what they thought, even though it seemed to them that they did; reality and physics have already dictated the outcome.

For people to try and decide in advance who should live and who should die in a situation like this, well I am thinking we just saw a perfect example of it. They are going to tear that code apart.

It is so obvious to most of us that in simple terms, something went wrong. So now we are looking for what that was. I don't think it was crappy cameras or incapable sensors or computers doing a reboot or diagnostic at the wrong time.

I think it was simple human arrogance and, as usual, the computer just did what it was programmed to do.

What is that saying about the most obvious explanation usually being the right one?

Again, the problem with your logic is that while a human can only see what is right in front of them, the computer will inevitably be able to instantly process 5 possible scenarios in the time it takes a human to realize there is a problem, if the technology isn't there already. Artificially restricting a computer's capabilities to match a human's is unjustifiable no matter how you put it.

For example, if the car was capable of seeing that taking actions xyz would save the lives of all involved, but did abc because that's what a human would do, and loss of life occurred, wouldn't you consider that a flaw in the programming?
 
.............................. Artificially restricting a computer's capabilities to match a human is unjustifiable no matter how you put it......

Do you mind if I put you into someone else's shoes for a moment?

I want you to be a programmer, and you are working on this "accident avoidance and damage mitigation decision tree algorithm", for lack of a better word :ROFLMAO:

Anyway, you are the guy writing the code, or maybe you are the person at the company who has spoken with legal. Either way, you are deciding the values, the weighting, who's going to live and who's going to die in each of dozens of potential scenarios, as you called them.

In this instance, one old man dies because there are three passengers, and the only way to save the old man is to put the passengers into a collision so terrible that the programming says the car's structural integrity will certainly be compromised.

In another instance it's a child, or a couple of teenagers.

The point being, you are programming the machine to make a conscious decision, weighing life, making a call, choosing.

Now for humans, it's rarely the case. Really, rarely does a driver have an opportunity, when faced with an impending accident, to make such a call. Frequently they don't even see it coming, or they would have missed it. And most times, even when they do see it coming, they don't have the time or the reactions to do anything about it. But in the cases when they do, most react, and think while they are reacting, and think that they are choosing and changing the outcome when in fact it's too late and they are just along for the ride, no matter how much they think they are making a difference.

Now in the first place, you misunderstand what I am saying. I never said it should match a human's capabilities. I said that it should do what a human would try to do, only better if it can. There is a huge difference in these two statements.

Typically, a human faced with an accident is going to key in on what they see as the greatest threat and steer or brake to avoid it. Sometimes they incorrectly identify the greatest threat; sometimes they choose the wrong way to react, swerving when they should have braked, etc.

You are correct that a computer should be able to process information faster, and perhaps with better accuracy, for a better outcome than a human might. Therefore a computer should be able to correctly identify the greatest risks and select the most appropriate course of action to avoid that risk or minimize damage from an unavoidable accident. It should be able to "do what a human would do, only better". So at this point, I'll ask you: why isn't this good enough?

Remember, when you write that program you will be choosing who's going to live and who's going to die. If you do it my way, then when a car can't avoid killing someone, everyone can say that it worked as well as it could work, it couldn't be avoided, accidents do still happen, but hey, they are way down and fatalities are greatly reduced. But when you start programming in decision making based on other factors, deciding this life is more valuable than that one, you are taking responsibility for those decisions and you have to live with them. Remember the whole I, Robot thing with the little girl: someone has to live with that programming choice.

Furthermore, I will again argue that inches matter. They can literally mean who lives and who dies: inches. Compute cycles spent making judgment calls instead of reacting can mean inches; hell, it can mean feet, and it can be life or death for people.
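To make the contrast concrete, here is a hypothetical Python sketch of the two approaches being argued about. Every function name, value, and threshold here is invented for illustration; it is not anything any manufacturer actually runs:

```python
def react_policy(obstacle_in_path: bool) -> str:
    """'Do what a human would do, only better': just react, immediately."""
    return "BRAKE" if obstacle_in_path else "CONTINUE"


def value_weighted_policy(obstacle_in_path: bool,
                          occupant_risk: float,
                          pedestrian_risk: float) -> str:
    """Weighs lives against each other; a programmer chose these weights.

    Every extra comparison here is compute time (and therefore distance
    travelled) spent before the brakes are ever applied.
    """
    if not obstacle_in_path:
        return "CONTINUE"
    # Hypothetical value judgment: spare whichever party is at greater risk.
    return "BRAKE" if pedestrian_risk >= occupant_risk else "SWERVE"
```

The point of the sketch is the second function's extra branching: someone had to author the comparison, and the vehicle spends cycles evaluating it instead of already braking.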

Technology is great, but humans can overdo it. We have a self-driving car that just a few days ago ran down a woman and killed her, and the damned thing acted like it never even saw her. You can talk with wishful thinking about the day when these cars will be so great and how we should make them better than a human, but we are nowhere near that day yet, and as it stands, these engineers need to get their heads out of the clouds and their feet back on the ground with this thing.

I hope I have made myself clearer because I think you took my comments too simply.
 

Or maybe it DID see her.......

And was just waiting for the chance to kill one of the Humans.
 
The video is pretty hard to watch; there's seemingly no reaction by the car to the obstruction. Just a fail by everyone all around: the pedestrian paying absolutely no attention to oncoming traffic while jaywalking, the car driving in a dimly lit area with no high beams, the backup driver not giving a damn, etc.

Damn, I wouldn't discount that being the case with Uber. They don't have the best record with ethics, and re-watching the video, she's passing right between two lights but is shrouded in darkness.

The thing is, I bet the pedestrian saw the car and was doing that thing where you cross a street, make eye contact, and decide whether the car sees you or not. And I sure as heck bet it was bright enough to see, so the ped assumed the car would slow or move and not hit her. Regardless, that Uber car should have slowed. There's no plausible reason for it not to slow. It should be able to see through pitch blackness; hell, it's an autonomous car, right??
 
In no particular order:

The pedestrian was crossing the street, at night, in a black jacket, not at an intersection or crosswalk, without yielding to oncoming traffic.
She was either incapable of taking responsibility for her own safety (mental/vision/spatial judgment issues) or she was requiring other people to take on that responsibility for her (which is becoming a more and more disturbing trend in our society).

Uber was operating an autonomous vehicle on public roads with either inadequate sensors, systems, or both.
A basic task for autonomous vehicle operation should include low-light and poor-visibility conditions (night time, sun low on the horizon ahead of the vehicle, rain, etc.). Whether the video provided is actual sensor data or just a dashcam is not clear (its output quality/light level is awful for the task at hand). If Uber adjusted/doctored that video before release, then that may be seen as tampering with evidence.

The "safety driver" was not. The person in the driver's seat was functioning as a system monitor, a.k.a. "a passenger".
Whether they were looking at their phone, the radio, or data output on some other screen, they were not driving. No matter how Uber spins it, that person was not driving; they were doing even less than a driving instructor would with a 15-year-old permit holder behind the wheel.

As far as OODA/decision tree debate I would try and keep it simple for obstacles in path:
1. Object in/entering vehicle path? Yes/No
2. If 1 = Yes: does the object pose a risk of collision? Yes/No/Unsure
3. If 2 = Yes/Unsure: APPLY VEHICLE BRAKES
4. Is there a clear path to the left/right of the object? Left/Right/None
and so on...

Teaching/programming of steps 2 and 4, and the overall operation and function of the vehicle, may not be anything like simple. Once you can get an AV through this decision tree, maybe you can start adding other data points, such as "will acceleration avoid collision and maintain control?" Any complicated math about 2 passengers, the old man in the crosswalk, a baby carriage, or puppies is irrelevant. If you want to get crazy with this type of math, don't forget the parameter that should ask "How do I keep my owner/operator from getting sued?"
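The steps above can be sketched as code. This is a minimal illustration only; the function name and the Yes/No/Unsure encodings are my own, assuming the tree exactly as written:

```python
def obstacle_response(in_path: bool, collision_risk: str, clear_side: str) -> list:
    """Walk the four-step obstacle decision tree.

    collision_risk: "yes" | "no" | "unsure"
    clear_side:     "left" | "right" | "none"
    Returns the ordered list of actions to take.
    """
    actions = []
    if not in_path:                          # Step 1: object in/entering path?
        return actions                       # No -> nothing to do
    if collision_risk in ("yes", "unsure"):  # Steps 2-3: brake on risk OR doubt
        actions.append("APPLY_BRAKES")
        if clear_side in ("left", "right"):  # Step 4: steer only if a side is clear
            actions.append(f"STEER_{clear_side.upper()}")
    return actions
```

Note that braking on "Unsure" is the conservative default: the tree never requires the vehicle to classify the object before slowing down.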
 
We've seen the release of terrible quality video and we know the name of the driver (who we can now dox) but we still don't know what the car's sensors saw? If I didn't know any better, I would think this was an Uber PR campaign to paint the others as unsympathetic parties.
 
http://ideas.4brad.com/it-certainly-looks-bad-uber

1. On this empty road, the LIDAR is very capable of detecting her. If it was operating, there is no way that it did not detect her 3 to 4 seconds before the impact, if not earlier. She would have come into range just over 5 seconds before impact.
2. On the dash-cam style video, we only see her 1.5 seconds before impact. However, the human eye and quality cameras have a much better dynamic range than this video, and should have been able to see her even before 5 seconds. From just the dash-cam video, no human could brake in time with just 1.5 seconds' warning. The best humans react in just under a second; many take 1.5 to 2.5 seconds.
3. The human safety driver did not see her because she was not looking at the road. She seems to spend most of the time before the accident looking down to her right, in a style that suggests looking at a phone.
4. While a basic radar which filters out objects not moving towards the car would not necessarily see her, a more advanced radar should also have detected her and her bicycle (though triggered no braking) as soon as she entered the lane to the left, probably at least 4 seconds before impact. Braking could trigger 2 seconds before, in theory enough time.
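As a rough sanity check on the timing claims in points 1 and 2, basic stopping-distance kinematics support them. The numbers below are assumptions, not measurements: ~18 m/s for the reported ~40 mph, a 1.0 s reaction time, and 7 m/s² of dry-pavement braking:

```python
def can_stop(speed_mps: float, warning_s: float,
             reaction_s: float = 1.0, decel_mps2: float = 7.0) -> bool:
    """Can the vehicle stop within the distance available at first warning?"""
    available = speed_mps * warning_s            # distance to the pedestrian
    reaction = speed_mps * reaction_s            # distance covered before braking
    braking = speed_mps ** 2 / (2 * decel_mps2)  # kinematics: v^2 / (2a)
    return reaction + braking <= available

# ~40 mph is about 18 m/s (assumed values throughout)
can_stop(18.0, 1.5)   # 1.5 s of warning (the video): cannot stop in time
can_stop(18.0, 5.0)   # 5 s of warning (LIDAR range): stops with room to spare
```

With only the 1.5 seconds visible on the released video, reaction distance plus braking distance exceeds the gap; with the ~5 seconds a working LIDAR should have provided, stopping is easy.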

To be clear, while the car had the right-of-way and the victim was clearly unwise to cross there, especially without checking regularly in the direction of traffic, this is a situation where any properly operating robocar following "good practices," let alone "best practices," should have avoided the accident regardless of pedestrian error. That would not be true if the pedestrian were crossing the other way, moving immediately into the right lane from the right sidewalk. In that case no technique could have avoided the event.
 
This is a ton of response!

Why is it my personal car can see into the dark better than this gadget-filled whizamajig-mobile? If my 2-year-old Audi could have prevented this, there is no excuse for a driverless car not catching it. My night vision option has saved me from two deer; both were easily seen on my dash display, I'd say 300 feet away, and the warning popped up on my window. Now, I paid a premium for the extras, but if you want to put a driverless car on the road with me, it had better have every single option from every single high-end car available today. I don't care what the cost is.

And that person in the car is clearly either looking down at a phone or having a snooze, and I hope there are big repercussions whether she could have prevented this or not. And when I'm in the boonies and it's dark, and even my HIDs make it not perfect, you know what, I slow down. Does the driverless car just drive at posted speeds regardless of the conditions?
 
The company that makes the LIDAR says it's not their fault, their equipment can see an obstacle like this day or night. They blame Uber's computer.
 

Now it's getting real.

"Our Lidar can see perfectly well in the dark, as well as it sees in daylight, producing millions of points of information.

"However, it is up to the rest of the system to interpret and use the data to make decisions. We do not know how the Uber system of decision-making works."

Ms Hall added: "We are very sad, sorry, and worried for the future of a project which is intended to save lives."
 
You gotta be kidding me. For real.

ANY DRIVER THAT WAS PAYING ATTENTION WOULD HAVE AVOIDED THIS. 100% of the time. The reasons are obvious:

(I say "object" because the fact that it was a human being isn't relevant to my point, which is about hitting something that is moving slowly, in good weather conditions, coming from across a free lane.)

1) Black holes do not exist, and neither do shadows like that at night. The video is the product of the (shitty) camera, not the real light conditions. Has anybody here driven at night?!

2) The object was moving at below 2 mph (roughly 1 m/s). She doesn't appear to be running, and she is pulling a bike.

3) The object started its trajectory a lane away, to the left of the vehicle. Per Wikipedia, a typical US lane is around 3.5 m wide, and she was hit by the right side of the vehicle, so she travelled around 6 meters before the collision.

4) Given point 3, she was crossing the lane for at least 6 seconds.

5) Visibility was perfect. There was no traffic. No elevation changes or anything else that might obstruct the view.

6) Are you telling me that you can't avoid a slow-moving object when you have 6 SECONDS of watching it move into your trajectory?! The car was moving at around 20 m/s, more or less, which puts it 120 meters away from the object the moment it enters the lane.
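A quick check of those numbers, using the post's own estimates rather than measured values:

```python
# All figures are the post's rough estimates, not measurements.
distance_crossed_m = 6.0     # roughly a lane and a half to the point of impact
walking_speed_mps = 1.0      # ~2 mph, walking while pulling a bike
car_speed_mps = 20.0         # the post's estimate of the car's speed

crossing_time_s = distance_crossed_m / walking_speed_mps  # time she was in view
car_distance_m = car_speed_mps * crossing_time_s          # car's distance when she entered the lane

print(crossing_time_s)  # 6.0 seconds of visibility
print(car_distance_m)   # 120.0 meters of separation
```

The arithmetic is internally consistent: a 1 m/s walk across 6 m takes 6 s, during which a 20 m/s car covers 120 m.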

What if it had been a rock? Dead driver

What if it had been a deer (that somehow was crossing very slowly)? Dead driver

What if...?

Yes, the pedestrian crossed where she shouldn't have. But stop talking bullshit like "the driver had 0.0005 seconds of reaction time" when that's not true. The woman didn't teleport herself in front of the car; there was no fog, nor anything else that would have impaired vision. The camera recording can only be used for context (weather, traffic, point of impact, initial trajectory of the object), not for claims like "there were shadows on the street". C'mon. You know better. At night, with your headlamps on, you are EXTREMELY AWARE of what is going on around you. Especially things that appear in the lane for no reason; they simply pop out. And a human being pulling a bike is not a small object. A deer enters a lane and you don't see it? Well, that woman with the bike wasn't that much smaller...

And let's not talk about the massive failure of Uber's autonomous systems as a whole. The car didn't even attempt to brake. For God's sake.
 
I am going to say that if the guy in the driver seat was paying attention, he would not have hit her...
 