Video Released of Uber Self-Driving Car Accident

Nobody should still be arguing about visibility being a factor.

You all need to accept that this was not the case. I have seen more than enough evidence and witness information to say that it played no part in this accident.

Those roads are very well lit, even at 10 PM. Hell, you could drive them with your lights out; some do.

I can make all kinds of guesses about why that video looked so dark, but it wasn't dark enough to have stopped the driver, and presumably the car, from seeing the victim.

It's time you caught on to this fact, listened to people who know this city, and looked for yourself at the videos of what it actually looks like.
 
I know what you're saying, but isn't this accident a combination of a failure of Uber's autonomous system, distracted driving and jaywalking? It's like a perfect storm.

Red herrings. Even the jaywalking away from a crosswalk is a red herring.

What's to have stopped the car from plowing into someone with right of way at a crosswalk if the AV failed like this? Would crosswalks marked by lines instead of zebra markings break the system? What if the markings have been worn away by vehicles and haven't been repainted yet? What happens if there's a power outage and the traffic signals and street lights are out? What if there's a bit of snow on the ground? Is the geo-positioning system supposed to be accurate enough to save us then? How would the AV know if it had right of way or not?
 
You're right: Arrogance did kill this woman. Her own arrogance.
I guess you missed the irony of labeling a little old lady who rides a bike arrogant... An old lady is dead because a self-driving car hit her in a situation where a human driver paying attention would have seen and avoided her. Blaming the victim is a sure way to lose the national debate in a very big way.
 
Unfortunate incident. Many things went wrong with the car, the driver, and the pedestrian. The car's sensors definitely need improvement in detecting and reacting to obstacles at higher speeds. The driver needed to be more attentive. The pedestrian needed to be more observant of oncoming traffic and obey the law, especially at night.

Half-jokingly, I believe we do need more self-driving cars. Maybe then, fearing for their lives, we will have fewer jaywalkers, or at least more cautious pedestrians, particularly at night.
 
The logic is not entirely incorrect. When you are presented with a) maintaining course and definitely killing one person, or b) swerving into oncoming traffic and potentially killing all four people in that vehicle, what do you do? A human driver would probably take a chance on swerving, but as a programmer, you go with the option that has a known outcome.
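That "known outcome" reasoning can be sketched as a crude expected-harm comparison. To be clear, this is purely illustrative: nobody outside Uber knows how, or even whether, their planner weighs anything like this, and every number below is invented.

```python
# Crude expected-harm comparison between two avoidance options.
# All probabilities are made up for illustration; a real motion
# planner is far more complicated, and Uber's logic is not public.

def expected_harm(people_at_risk: int, p_collision: float) -> float:
    """Expected number of people harmed if this option is taken."""
    return people_at_risk * p_collision

# Option a: maintain course -- one pedestrian, collision is certain.
maintain = expected_harm(people_at_risk=1, p_collision=1.0)   # = 1.0

# Option b: swerve into oncoming traffic -- four occupants,
# collision only possible, say 30% likely (a pure guess).
swerve = expected_harm(people_at_risk=4, p_collision=0.3)     # = 1.2

# With these made-up numbers, the "known outcome" also happens to
# be the lower expected harm, which is the choice the post
# attributes to a programmer.
best = min(("maintain course", maintain), ("swerve", swerve),
           key=lambda option: option[1])
print(best)
```

Change the invented probability and the ranking flips, which is exactly why pre-deciding these trade-offs is so contentious.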

There was a time when this was a big discussion online: how autonomous cars would need to be able to make judgement calls that "choose the lesser of two evils," so to speak.

As I said at the time, I find this terribly arrogant and morally wrong.

I can tell you all from terrible first-hand experience that in these situations, just a few inches can mean life or death: the difference between someone being injured as they roll along the side of your vehicle, or having their head crushed against your windshield as their body is propelled up into the air, ending in certain death.

Inches

A human has no time to make wild judgement calls weighing the value of this life or that. In most cases, all a human can do is try not to hit anything, one thing at a time, until they can get stopped or past the danger. This is all we should try to make a car do in our stead: try to miss, then try to live. Nobody in an accident "chooses to die."

I want to caveat this: sometimes a person will choose not to hit someone whom they could not stand to hit. "I'll do anything as long as I don't hit this kid." Sometimes that might mean death, it might even be certain death, but this kind of choice isn't a split-second surprise choice; it's the thinking that flashes through a person's mind while the world turns into stop-frame time. Usually they may not even have time to act on what they thought, even though it seemed to them that they did; reality and physics have already dictated the outcome.

For people to try to decide in advance who should live and who should die in a situation like this, well, I think we just saw a perfect example of where that leads. They are going to tear that code apart.

It is so obvious to most of us that in simple terms, something went wrong. So now we are looking for what that was. I don't think it was crappy cameras or incapable sensors or computers doing a reboot or diagnostic at the wrong time.

I think it was simple human arrogance, and as usual, the computer just did what it was programmed to do.

What is that saying about the most obvious explanation usually being the right one?
 
The woman crossing is clearly at massive fault for being that inattentive and that impossible to see at night. It doesn't matter why. Maybe she wanted to die, maybe she doesn't think well all the time, maybe she was on drugs. Who knows. But her actions made a situation where she was very hard to see and borderline impossible to avoid.

That said, there is something really, really wrong with that video. The human eye has a lot more stops of dynamic range and a greater ability to handle multiple areas of contrast than a camera does, but that video footage is odd. The black shadow area remains a black hole that shows nothing at all, not even her white shoes. It doesn't appear to be raining. It looks like the cameras are tuned in such a way that anything outside of what is directly illuminated by the headlights is virtually invisible. Probably a noise reduction measure.
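To put that dynamic-range gap in rough numbers (ballpark figures only; the eye's effective range varies enormously with adaptation, and this dashcam's actual specs are unknown):

```python
# Dynamic range compared in photographic "stops": each stop is one
# doubling of the brightest-to-darkest ratio that can be captured.
# Both contrast ratios below are ballpark assumptions, not specs.
import math

def stops(contrast_ratio: float) -> float:
    """Convert a brightest:darkest contrast ratio to stops."""
    return math.log2(contrast_ratio)

eye_adapted = stops(1_000_000)   # ~20 stops for an eye given time to adapt
cheap_dashcam = stops(4_000)     # ~12 stops, generous for a dashcam

# Each stop of difference doubles how much darker a shadow detail
# can be and still register, so the gap compounds quickly.
gap = eye_adapted - cheap_dashcam
print(f"gap: {gap:.1f} stops (~{2 ** gap:.0f}x deeper into shadow)")
```

Roughly an 8-stop gap under these assumptions, which would be consistent with a driver seeing shadow detail that the released footage renders as pure black.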

I'm finding it pretty hard to believe she wasn't visible to a human eye in those conditions.

I'm not saying "ninja driver." I'm saying, the video concerns me as it doesn't seem like with all the street lamps and headlights that pedestrian should have been quite that invisible up to the last second.

I live in a rainy state. I know what it's like to drive on really black-hole roads with pedestrians at night, where you can hardly see the lane, let alone people. But this doesn't look like one of those nights. So... is the system capable of adequately seeing at night or not? Kind of looks like 'not' based on that video.
 
Nobody should still be arguing about visibility being a factor.

You all need to accept that this was not the case. I have seen more than enough evidence and witness information to say that it played no part in this accident.

Those roads are very well lit, even at 10 PM. Hell, you could drive them with your lights out; some do.

I can make all kinds of guesses about why that video looked so dark, but it wasn't dark enough to have stopped the driver, and presumably the car, from seeing the victim.

It's time you caught on to this fact, listened to people who know this city, and looked for yourself at the videos of what it actually looks like.
You can make guesses, but you were not there.
Did she have lights on her bike? She should have seen the car coming. The lights from the car are bright. She should have used the crosswalk.
Was the biker on drugs or anything? Lots of questions, but all people are doing is giving their two cents. All we can do is guess.
 
My '17 Ford Explorer Sport detects my plastic garbage bin behind and to the right when I back up, along with vehicles approaching perpendicular to it, the telephone pole in my alley, the neighbor walking his dog, the fence, vehicles ahead that slow down too fast, etc., using ultrasonic sensors.

It doesn't have the autobrake feature, but it sure as hell knows what's around me without cameras or lidar. The Uber car flat out failed to do what self-driving cars are intended to do: outperform humans. Period.
 
Nobody should still be arguing about visibility being a factor.

You all need to accept that this was not the case. I have seen more than enough evidence and witness information to say that it played no part in this accident.

Those roads are very well lit, even at 10 PM. Hell, you could drive them with your lights out; some do.

I can make all kinds of guesses about why that video looked so dark, but it wasn't dark enough to have stopped the driver, and presumably the car, from seeing the victim.

It's time you caught on to this fact, listened to people who know this city, and looked for yourself at the videos of what it actually looks like.

I stepped through this video frame by frame, and while her feet were visible from 0:02 frame 12 (0:02.5), in terms of actually recognizing there was anything in the road beyond a couple of white points, she was only visible from 0:03 frame 14 through 0:04 frame 3. That is 13 frames, just over half a second. The roads may very well have been lit, but she was crossing at a shadowed area, on a right curve, where the headlights didn't even line up on her until half a second before impact. Visibility was absolutely a factor in this.

Now, perhaps Uber went cheap on their detection devices, and this would not have happened if they'd used better cameras, or possibly overlapped the cameras with IR cameras for night use. From what I've read, Uber's system only uses visible-light cameras. From the look of this footage, crappy cameras at that. Perhaps that was the deciding factor here: cameras that could not see her because they were too cheap. It was still a matter of visibility.
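For anyone who wants to check that frame arithmetic: the released clip's true frame rate was never published, so 24 fps is an assumption here, but at that rate the numbers above line up.

```python
# Convert "M:SS frame F" positions in the dashcam clip to seconds,
# assuming 24 frames per second (an assumption -- the clip's real
# frame rate was never published).

FPS = 24

def to_seconds(seconds: int, frame: int, fps: int = FPS) -> float:
    """Absolute time of a given frame within a given second of video."""
    return seconds + frame / fps

first_recognizable = to_seconds(3, 14)   # 0:03 frame 14
impact = to_seconds(4, 3)                # 0:04 frame 3

visible_frames = (4 - 3) * FPS + (3 - 14)    # 13 frames
visible_time = visible_frames / FPS          # ~0.54 s

print(f"{visible_frames} frames = {visible_time:.2f} s of visibility")
```

At 30 fps the same 13-frame window would be ~0.43 s, so "just over half a second" holds only under the 24 fps assumption; either way it is well under a second.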
 
I guess you missed the irony of labeling a little old lady who rides a bike arrogant... An old lady is dead because a self-driving car hit her in a situation where a human driver paying attention would have seen and avoided her. Blaming the victim is a sure way to lose the national debate in a very big way.
This old lady is dead because she thought she could "beat the train across the tracks." Now, I'm not going to defend Uber any more on this, but let's be honest: she made a conscious decision to cross the street at night, either a) ignoring the possibility of an oncoming car, b) assuming the driver (there's no way she knew it was autonomous) would stop, or c) misjudging how fast the car was traveling. Why would any competent person do that? Wait until it's safe to cross; your life may be at stake, and whoops, it very much was.

a) You cross the street at night, in low visibility (or hell, even in broad daylight): you make sure no cars are coming first.
b) You see a car coming: most people will wait for some indication of the car slowing down or seeing them, even in a crosswalk, much less in the middle of a relatively fast road where people are not supposed to be.
c) You do see lights and think you can beat the car: you had better be right, or the laws of physics will show you how wrong you are in more ways than one.

Uber does not get a pass on this, to be certain, under any circumstance, but let's not paint this woman as a "victim," as a series of really DUMB choices that she made set this whole thing in motion and more likely than not would have led to her death regardless of what was driving the car.
 
You can make guesses, but you were not there.
Did she have lights on her bike? She should have seen the car coming. The lights from the car are bright. She should have used the crosswalk.
Was the biker on drugs or anything? Lots of questions, but all people are doing is giving their two cents. All we can do is guess.


Wait, I've driven those roads; I have been there. Others have been there, and still others live there, people from this forum.

There are videos posted showing those who haven't been there just how well lit that road is.

Now you can talk about all the "should haves" and "shouldn't haves" but it really does all come down to one very simple set of facts, and they are facts.

The car was not traveling at an excessive speed and the woman was not invisible, nor would she even have been hard for an alert driver to detect.

An alert human driver was not at the wheel.

This accident and the woman's death was avoidable.

Because an alert human driver was not at the wheel, the woman died.

And there was no crosswalk because that wasn't an intersection; it's a "Y". That section of road is divided roadway with no opposing-lane traffic; the left lane simply turns off at that point. Technically, had the woman been riding down that road, she would have been traveling the wrong way, against traffic. There's no indication she had been riding along that roadway; perhaps she was walking her bike all along.

As usual, this is not a single issue. Some people are specifically looking at legal "fault" from a position of a courtroom judgement.

Others are concerned with the technology and what looks like a failure of that technology.

I am one of the latter and am far less interested in trying to play judge and jury from behind a keyboard. I am more interested in the technical aspects of why this technology seems to have failed in what should have been a very simple accident avoidance scenario.

Those cars are running around Tucson and autonomous trucks are making deliveries in Arizona, this is my State, I have a personal stake in this. I frequently drive to Tucson and Phoenix, these vehicles are on my roads.

So don't make guesses about me making guesses. You are being told how things are by people who know how things are. Proof is here:
https://hardforum.com/threads/video...g-car-accident.1956859/page-3#post-1043546351

Jump to 00:30 to see the location in an entirely different light, pun intended.
 
Would a non-self-driving car strike the person in the video under these circumstances? I'm willing to bet yes, since it was dark and they were on a two-lane road with no traffic, going around 40 mph. The woman doesn't seem to be crossing anywhere she should be, and she wasn't wearing reflective clothing, at least none that can be seen in the video.

If you start thinking that autonomous vehicles should be programmed for every instance, or be able to come to a complete stop and avoid hitting people no matter what, then you're asking too much. I'm willing to bet that the car isn't programmed to look out for people when on a highway.
In all likelihood, yes, a human driver would probably have avoided the accident. The human eye is much more capable than a standard camera. That said, supposedly Uber's AVs use lidar/radar sensors as well, which should have picked this up. This is supposed to be one of those situations that AVs handle better than humans. Heck, I've seen car ads showing automatic braking on normal cars avoiding a child darting out from between parked cars.

From a legal standpoint, the safety driver should have caught it but was clearly distracted. From a technical standpoint, this was a major failure of the AV. The whole point of AVs is not needing a driver of any kind at all.
 
In all likelihood, yes, a human driver would probably have avoided the accident. The human eye is much more capable than a standard camera. That said, supposedly Uber's AVs use lidar/radar sensors as well, which should have picked this up. This is supposed to be one of those situations that AVs handle better than humans. Heck, I've seen car ads showing automatic braking on normal cars avoiding a child darting out from between parked cars.

From a legal standpoint, the safety driver should have caught it but was clearly distracted. From a technical standpoint, this was a major failure of the AV. The whole point of AVs is not needing a driver of any kind at all.


I agree right down the line.
 
I stepped through this video frame by frame, and while her feet were visible from 0:02 frame 12 (0:02.5), in terms of actually recognizing there was anything in the road beyond a couple of white points, she was only visible from 0:03 frame 14 through 0:04 frame 3. That is 13 frames, just over half a second. The roads may very well have been lit, but she was crossing at a shadowed area, on a right curve, where the headlights didn't even line up on her until half a second before impact. Visibility was absolutely a factor in this.

Now, perhaps Uber went cheap on their detection devices, and this would not have happened if they'd used better cameras, or possibly overlapped the cameras with IR cameras for night use. From what I've read, Uber's system only uses visible-light cameras. From the look of this footage, crappy cameras at that. Perhaps that was the deciding factor here: cameras that could not see her because they were too cheap. It was still a matter of visibility.

You're looking at the wrong video, my friend.

You need to look at this post and its video:

https://hardforum.com/threads/video...g-car-accident.1956859/page-3#post-1043546351

Look close starting at 00:30, that's the location of the accident.
 
I stepped through this video frame by frame, and while her feet were visible from 0:02 frame 12 (0:02.5), in terms of actually recognizing there was anything in the road beyond a couple of white points, she was only visible from 0:03 frame 14 through 0:04 frame 3. That is 13 frames, just over half a second. The roads may very well have been lit, but she was crossing at a shadowed area, on a right curve, where the headlights didn't even line up on her until half a second before impact. Visibility was absolutely a factor in this.

Now, perhaps Uber went cheap on their detection devices, and this would not have happened if they'd used better cameras, or possibly overlapped the cameras with IR cameras for night use. From what I've read, Uber's system only uses visible-light cameras. From the look of this footage, crappy cameras at that. Perhaps that was the deciding factor here: cameras that could not see her because they were too cheap. It was still a matter of visibility.


I don't think it's crappy cameras. I think it was bad programming design. I think all systems were working as intended; someone just didn't realize what the actual outcomes could be.

EDITED: I have to retract this idea. I've been talking this over with other people and I have found a problem with my idea that the car just decided it was best to hit the woman. The car never seems to have reacted at all, not even warning the human driver. I can't imagine that the car would see a risk and not try to wake up and warn the driver, so I think I need to throw "works as designed" out the window.

Currently I have to say that it looks like a straight up detection and/or reaction failure.
 
Without knowing how the camera was set up, it's hard to know if an average human would have seen the cyclist or not. Yes, it looks dark in the video, but a lot of cameras set their exposure based on the brightest thing they see. Human eyes often have a wider range of sensitivity. It's quite possible a human would have seen bits of brighter stuff moving into the lane before the bright part of the headlights fully lit the cyclist up, or the invisible shape of the cyclist eclipsing distant lights. Whether it would have been enough warning to allow a human to avoid her is hard to tell.

The failure to detect seems to be a common failure mode with this tech. Two of the Tesla crashes involved cars that failed to detect large trucks in the road, one of which was broadside to the car. Here, the car failed to detect the pedestrian and a sideways bicycle. A pretty large target, and a complete failure to detect.
 
If you pay attention to the vehicle's trajectory, it appears that the driver, or the automated system, attempts to swerve slightly right just before impact. Unfortunately, in this case, swerving to the left may have been the better choice. Even with a human driver behind the wheel, there's a very good chance the pedestrian would have been struck given the conditions. I'm sure the lawyers are lining up to get a piece of the Uber pie (billions), but unless you expect a perfect driving system, which is technically impossible given the variables, there is always going to be an expected rate of accidents with automated driving systems. If that rate is significantly lower than the human error/accident rate, then automated driving is statistically safer.
 
I vote for Uber failure.

Although the headlights don't illuminate the pedestrian until too late, I thought these cars were equipped with night vision?
 
Some pedestrians are really stupid or have a death wish. One time I was out, having just turned onto a large road, traveling maybe 10 or 15 mph, and this guy just walks right out in front of me. And this is a two-lane road on each side, so cars to my right could have hit him. And he's just walking slowly across. This was in the daytime.

All are at fault, IMO: the pedestrian for not using common sense, and the driver and the self-driving system together.
 
Under those circumstances, a human driver would have hit her too.

She was impossible to see until up close, and she was crossing a road with high-speed traffic without any sort of crosswalk. This was not a fault of the software. The only way a computer program would have been able to see her in time is if it were using infrared to see further down the road than a human could; a self-driving car with infrared is the only way she would have been avoided.

Impossible to tell. It depends on the recording camera and its dynamic range. If it's a cheap recording camera, then yes, it wouldn't have caught her until the last second. Also, radar should have picked her up.

As I have always said, self driving cars will never become a reality as there are too many dynamic situations only the human brain can handle.

The driver obviously wasn't paying attention and will likely be brought up on manslaughter charges.
 
I guess you missed the irony of labeling a little old lady who rides a bike arrogant... An old lady is dead because a self-driving car hit here in a situation that a human driver paying attention would have seen and avoided her. Blaming the victim is a sure way to lost the national debate in a very big way.
A driver paying attention might have been able to brake a bit, but to say they could have completely avoided her is just you talking out of your ass. That dumb bitch bears the major responsibility here, crossing the road in the dark like that where she shouldn't be crossing.
 
Regardless of people's armchair driving, the system should have detected the pedestrian, and the backup driver should have been paying attention. Uber and the driver should be sued and penalized.

In light of Uber's existing skullduggery, this was really only a matter of time.
 
There have been a few videos posted of the accident area showing the lighting to be quite good, or at least nowhere near as bad as the Uber dashcam footage:



 
There is something seriously fucked up with the released video. As someone stated earlier, there's somehow a complete black hole absorbing all light in the area of the jaywalker. There's simply no way, in those weather conditions, that the headlights would not have illuminated the person. My 15-year-old car with scratched, pitted, and foggy headlights would have illuminated the person, and especially the bike, just fine. The "black hole" doesn't even remotely match what the shadows would be. As far as I'm concerned, there's a good chance that video was doctored. The streetlights would have illuminated a completely different area than what the "black hole" shows. Also, the headlights on the car don't appear to illuminate the left lane at all, which is simply not possible unless they weren't on.

This video is complete and utter crap and in no way represents reality. The person and the bike would have been easily seen from much farther away. To take the video as gospel would mean that the headlights on cars don't shine for more than about 10 feet in front of the vehicle.
 
I probably have a lot more driving experience than you. Feel free to take a look at the video; the person wasn't visible until the last two seconds, from 0:03 to 0:05 when the crash occurred. That's not enough time to come to a complete stop. If the driver had slammed on the brakes, the accident would still have happened.
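A rough stopping-distance check supports that. The reaction time and deceleration below are generic textbook-style assumptions, not measurements from this crash:

```python
# Back-of-the-envelope: can a car doing ~40 mph stop within the
# ~2 s window the video gives? Reaction time and deceleration are
# textbook-style assumptions, not measured values from this crash.

MPH_TO_MS = 0.44704

speed = 40 * MPH_TO_MS        # ~17.9 m/s
reaction_time = 1.5           # s, typical alert-driver perception-reaction
decel = 7.0                   # m/s^2, hard braking on dry asphalt

# Distance covered before the brakes even engage:
reaction_dist = speed * reaction_time            # ~26.8 m
# Distance to brake from full speed to zero: v^2 / (2a)
braking_dist = speed ** 2 / (2 * decel)          # ~22.8 m

total_stop = reaction_dist + braking_dist        # ~49.7 m
window = speed * 2.0                             # ~35.8 m covered in 2 s

print(f"need {total_stop:.0f} m to stop, {window:.0f} m of warning")
```

Under these assumptions a full stop needs roughly 50 m, while 2 seconds of warning buys only about 36 m, so the car would still reach her, though hard braking from the first moment of visibility would have shed a lot of speed.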

You can't judge the scene by that video. A dashcam does not have the dynamic range of a human eye. High-end cameras don't either; the whole point of HDR in cameras is to address this. The human eye can see more detail in the brightest and darkest areas than any camera can. Consider this the next time you drive at night: you can still see what is not directly in front of your headlights. And if you look at the example footage people have posted, the lighting on this road in particular is quite good. I have no doubt that the driver could have reacted in time in a normal car. I'm not convinced that, even if he had seen the pedestrian, he would have realized quickly enough that the car wasn't going to avoid the accident on its own.

The human story here is that someone died due to a distracted driver.

The technical story is that this incident has possibly exposed a fundamental flaw in the sensor system this car used. In the end, AVs are supposed to avoid these types of accidents. It could be something wrong with that specific vehicle, or a weakness in the sensor system.
 
There is something seriously fucked up with the released video. As someone stated earlier, there's somehow a complete black hole absorbing all light in the area of the jaywalker. There's simply no way, in those weather conditions, that the headlights would not have illuminated the person. My 15-year-old car with scratched, pitted, and foggy headlights would have illuminated the person, and especially the bike, just fine. The "black hole" doesn't even remotely match what the shadows would be. As far as I'm concerned, there's a good chance that video was doctored. The streetlights would have illuminated a completely different area than what the "black hole" shows. Also, the headlights on the car don't appear to illuminate the left lane at all, which is simply not possible unless they weren't on.

This video is complete and utter crap and in no way represents reality. The person and the bike would have been easily seen from much farther away. To take the video as gospel would mean that the headlights on cars don't shine for more than about 10 feet in front of the vehicle.


Technology today, you never know.
 
Impossible to tell. It depends on the recording camera and its dynamic range. If it's a cheap recording camera, then yes, it wouldn't have caught her until the last second. Also, radar should have picked her up.

As I have always said, self driving cars will never become a reality as there are too many dynamic situations only the human brain can handle.

The driver obviously wasn't paying attention and will likely be brought up on manslaughter charges.


I would not find the driver guilty. I don't believe there should be a driver in the seat of an autonomous car to begin with. I believe the entire concept is built on the false premise that a person with nothing to do can remain alert and ready to respond at any given moment. People fall asleep while actually driving a car; just sitting in the seat with nothing manual to do is completely unrealistic.

Now if they had something to do, like drive the car, then an autonomously driven vehicle could rely upon its human safety driver.

If I were on a jury, I would not convict this man. It may have been his choice to accept this job, but it's a setup doomed to fail, and this driver is just a patsy. It's a feel-good measure that lacks any realistic chance of actually succeeding, as we saw demonstrated most recently. And the guy really was trying to pay attention; he was looking up occasionally. Hell, I can be driving on I-10 headed home from Tucson at night, asking my wife to talk to me to help keep me awake, and that woman just goes to sleep herself instead.

The only way you could keep the safety driver attentive is to give him something he must constantly perform, just not the driving. Then he would be fully mentally engaged in some tasks related to the operation of the vehicle. For instance, give him a stylus and a driving map and make him "drive" by using the stylus to mark each new segment of the road or the car will stop. The driver has to pay attention to where he is and where he is going and keep poking the digital map every quarter mile or the ride stops. Then he might have a chance to react properly.

I am not so sure they will charge the driver at all.

There is conflicting information on this.

On the one hand, Phoenix LE says that if a driver is present, any citations will go to the driver. But Gov. Ducey has stated that Arizona would hold the company responsible even in the event of criminal charges.
 
Honestly, I've been in situations like this, where a person of low intelligence walks straight out into a dark road at night, and I was barely able to swerve around them using my full attention at peak alert levels. It can be almost impossible to avoid individuals in this scenario. However, I am not a computer with radar, scanners, and laser data collection able to perceive threats in this manner. There is no excuse for the car being unequipped to handle this sort of situation.

The driver clearly wasn't doing their duty and should also be held accountable, however, I see this as more of a red stain on what is most likely the typical 'autonomous driver pilot' sitting behind the wheel of a vehicle in auto-mode. It's just human nature to become complacent and lose focus, especially when you are not required to perform any menial tasks that would otherwise keep you dialed into your duties.

Unlike a computer printer or a monitor, autonomous vehicle systems have a 0% acceptable error margin at this point. With death as the resulting failure mode, until these systems can be more thoroughly vetted, more incidents like this will continue. Fortunately, many AI programmers will be able to learn from this incident and hopefully improve their systems to handle similar situations in the future.
 
BTW, this is the executive order Arizona Governor Ducey signed just a month or so ago. The advertised intent was to tighten up regulation of autonomous vehicle testing in Arizona, as the rules were deemed a little "Wild West."

https://azgovernor.gov/executive-orders

It has some interesting language, but what it does do is specifically require a safety driver. It does, however, set requirements, and some of the companies involved may, or may not, depend on a human "safety driver" to meet some of those requirements.

I also believe that anyone (companies included) who wants to put an autonomous vehicle on Arizona roads must submit a letter detailing how their vehicle will meet the requirements. Chief among the requirements is, basically, that in the event of a failure of the autonomous driving system, the vehicle has to be able to move to a safe location out of the road and then be turned off. This may in fact be the sole purpose of the "safety driver."

If this is the case, then should the State find the vehicle negligent, it would be Uber and not the driver facing charges. One or the other may attempt to produce documentation outlining driver responsibilities in an attempt to avoid the target being placed on their back.

In the meantime, you can bet that Gov. Ducey is damned glad he took action to tighten up self-driving vehicle testing in the State when he did. He'll still catch some hell for it being non-existent before, but I doubt it will hurt him too badly as it is.
 
That sucks. What a shitty place and time to cross the road.
The lady walked out into moving traffic at night...pretty sure she would have been hit no matter who was driving.

Given that poor dashcam feed, absolutely. Given the much higher sensitivity of human vision, perhaps a super close call, but even still a good chance of a hit in that situation. However, chances are the lighting is not that bad. Of course, given texting (or whatever that driver was doing), revert back to "absolutely."

Regardless, this car should have been seeing beyond the visible spectrum -- ultraviolet, infrared, and a LIDAR projection map -- and should have detected an obstruction regardless of the conditions. (Unless both sides of the road were littered with millions of spinning laser pointers.) I bet Uber will do everything they can to hide what the car actually saw from the public eye.
 
Hell, seeing this video now, I don't blame the self-driving car at all. The woman crossing the street in the damn dark, not at a crosswalk, deserves the blame. Any of us would have plowed right over her as well.
 
I would not find the driver guilty. I don't believe there should be a driver in the seat of an autonomous car to begin with. The entire concept rests on the false premise that a person can remain alert and ready to respond at any given moment. People fall asleep even while actively driving a car; just sitting in the seat with nothing manual to do is completely unrealistic.

Now if they had something to do, like drive the car, then an autonomously driven vehicle could rely upon its human safety driver.

If I were on a jury I would not convict this man. It may have been his choice to accept this job, but it's a setup doomed to fail, and this driver is just a patsy. It's a feel-good measure that lacks any realistic chance of actually succeeding, as we saw demonstrated most recently. And the guy really was trying to pay attention; he was looking up occasionally. Hell, I can be driving on I-10 headed home from Tucson at night, asking my wife to talk to me to help keep me awake, and she just goes to sleep herself instead.

The only way you could keep the safety driver attentive is to give him a task he must constantly perform, just not the driving itself, so that he is fully mentally engaged in something related to the operation of the vehicle. For instance, give him a stylus and a driving map and make him "drive" by using the stylus to mark each new segment of the road, or the car will stop. The driver has to pay attention to where he is and where he is going, and keep poking the digital map every quarter mile, or the ride stops. Then he might have a chance to react properly.
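The stylus-and-map idea above amounts to a distance-based dead man's switch. A minimal sketch of how such a mechanism could work (all names and numbers here are hypothetical, not from any actual vehicle system):

```python
# Hypothetical sketch of the "poke the map every quarter mile" idea:
# a distance-based dead man's switch. If the safety driver fails to
# acknowledge within the interval, the ride stops.

QUARTER_MILE_M = 402.3  # ~0.25 mile in meters

class AttentivenessWatchdog:
    def __init__(self, interval_m=QUARTER_MILE_M):
        self.interval_m = interval_m
        self.distance_since_ack = 0.0
        self.stopped = False

    def on_distance_travelled(self, meters):
        """Called by the vehicle's odometry on each update tick."""
        if self.stopped:
            return
        self.distance_since_ack += meters
        if self.distance_since_ack > self.interval_m:
            self.stopped = True  # ride halts until the driver re-engages

    def on_driver_ack(self):
        """Driver marks the next road segment on the digital map."""
        self.distance_since_ack = 0.0

# Example: driver acks once, then stops paying attention.
wd = AttentivenessWatchdog()
wd.on_distance_travelled(300)
wd.on_driver_ack()             # attentive: counter resets
wd.on_distance_travelled(500)  # no ack for more than ~402 m
print(wd.stopped)              # True: the car pulls over
```

The point of tying the acknowledgment to distance rather than time is that the required action tracks the actual driving task, which is what keeps the driver's attention on the road.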

I am not so sure they will charge the driver at all.

There is conflicting information on this.

On the one hand, Phoenix LE says that if a driver is present, any citations will go to the driver. But Gov. Ducey has stated that Arizona would hold the company responsible even in the event of criminal charges.

The driver is a backup for the system. That is why they are there, and it is MANDATED by law in most states. And if you actually WATCHED the video, you would have realized it was a woman who was doing nothing but playing with some gadget, with her eyes off the road half the time.
 
From what we can see, it would take a very quick reaction from a human driver to avoid the accident. The video makes it look like only a couple of seconds to react; even if that is not the actual case, it is still very little time. What I wonder is this: in a situation like this, the driver would have needed to swerve to avoid hitting the person, which can lead to the car running off the road into an obstacle that could do severe damage to the car. Some people driving would sacrifice themselves to protect another person from harm, but would a self-driving car? If that was a small child standing in the road, I would take any action necessary to avoid hitting them, but would a self-driving car make the decision to drive into a tree if that is what it took to avoid hitting a child? Or would it choose hitting the child over possibly killing the people in the car? These are the types of decisions that AI is a long way from being able to make. Some things take more than pure logic. Self-driving cars are definitely not as near to being ready for service as most optimists think they are.
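To put rough numbers on "very little time": a back-of-the-envelope stopping-distance calculation, assuming a speed of about 40 mph and textbook values for reaction time and hard braking (all figures here are assumptions for illustration, not facts established about this crash):

```python
# Rough stopping-distance math behind "very little time to react".
# Assumed speed of ~40 mph; reaction time and deceleration are
# generic textbook values, not measurements from this incident.
speed_mph = 40
speed_ms = speed_mph * 0.44704          # ~17.9 m/s

reaction_time_s = 1.5                    # typical alert-driver reaction time
reaction_dist = speed_ms * reaction_time_s

decel = 7.0                              # m/s^2, hard braking on dry asphalt
braking_dist = speed_ms ** 2 / (2 * decel)

total = reaction_dist + braking_dist
print(f"reaction: {reaction_dist:.0f} m, braking: {braking_dist:.0f} m, "
      f"total: {total:.0f} m")           # reaction: 27 m, braking: 23 m, total: 50 m
```

Under these assumptions the car covers roughly 50 meters between the driver noticing a hazard and coming to a stop, so a pedestrian appearing only a couple of seconds ahead is already inside the reaction-distance window.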
 
This is why we have juries. There are several great points blaming the car, several blaming the pedestrian, and several blaming the driver. One user said he would never convict the driver... well, I would. Hear me out...

That "driver" had one job to do: babysit the autonomous car and make sure it didn't hit anything. Clearly the driver was negligent in doing that job. Furthermore, I would argue that staring into a cellphone at night rendered the driver night-blind, due to the eyes' inability to adjust quickly from a cellphone's bright screen to a dark street. Yes, we have established that the street was well lit, but go outside at night, look at your cellphone, and then try to look across the street. It will take your eyes several seconds to adjust. This is why, in the video, the driver clearly had no idea the pedestrian was there until it was too late. I believe the driver had plenty of time to react, had he/she not been staring into a cellphone instead of doing a job they were being PAID to do. Think about that: you and I are not paid to drive to work, which implicates the driver even more as criminally negligent at a minimum.

Also, several people have called the pedestrian an idiot/moron for crossing like that. This was a homeless person, who may not have had the mental capacity to comprehend the situation. Yes, the pedestrian bears some of the blame here. But that's not a free pass for the driver.
 
Convict the driver? How long would it take to take control of the car, even if she had actually been paying attention? I bet not enough time to do anything.
 
Having looked a few more times at the footage, and at the risk of armchair driving, I believe a person paying attention would have noticed the pedestrian.

She was absolutely negligent in operating the vehicle, autonomous or not, and should be 100% liable and potentially face manslaughter charges.
 
There have been a few videos posted of the accident area showing the lighting to be quite good, or at least nowhere near as bad as the Uber dashcam footage:
Well, just looking at the videos: these come from people's phones, which tend to have MUCH better low-light sensors than a dashcam in a car. I'm not saying Uber didn't "doctor" the video, but the cops are the ones who posted it, so you'd think they would know the conditions of the area and as a result could tell if the video had been doctored in some way.
 