Uber Car’s “Safety” Driver Streamed TV Show before Fatal Crash

The incident in which a self-driving Uber car hit a pedestrian in Arizona is in the news again, as police have revealed that the “safety” driver, Rafaela Vasquez, had been watching “The Voice” on Hulu when the crash occurred. Law enforcement said that if she had actually been paying attention, the incident would have been “entirely avoidable.”

Police obtained records from Hulu, an online service for streaming TV shows and movies, which showed Vasquez’s account was playing the TV talent show “The Voice” for about 42 minutes on the night of the crash, ending at 9:59 p.m., which “coincides with the approximate time of the collision,” the report said.
 
Ya know... I've asked this multiple times now. If you can't trust the vehicle to drive itself, and you have to be paying so much attention to what's going on and be ready to grab the controls at any moment... wtf is the point of the vehicle driving itself? Sitting there doing nothing except waiting to grab the wheel seems like it would be far more stressful than just driving myself.
 
Negligent homicide? It's one thing if they were paying attention and this happened, but this just takes the cake.
 
Ya know... I've asked this multiple times now. If you can't trust the vehicle to drive itself, and you have to be paying so much attention to what's going on and be ready to grab the controls at any moment... wtf is the point of the vehicle driving itself? Sitting there doing nothing except waiting to grab the wheel seems like it would be far more stressful than just driving myself.
Extreme laziness, and yes, there really are people that fucking lazy.

And stupidity of course.
 
The saddest thing about this story, besides it being preventable, was that the attendant was watching "The Voice" when she could have been streaming something good like "The Hotwives of Orlando". :p
 
It's possible the human safety attendant was simply listening to "The Voice" instead of watching it. The dash cam should show what they were actually doing in the moments before the collision.

To echo what others have said, what's the point of having an autonomous vehicle if a human is required to babysit it? I get the whole 'gathering data' angle of it, but it still raises the question of safety if a human is required to be present, paying attention, and ready to intervene with an expected response time of less than 2 seconds.

Lastly, I know it's a dead horse, but a few autonomous fatalities across all the miles that have been driven is still a significantly better miles-to-deaths ratio than human drivers manage.
 
Negligent homicide? It's one thing if they were paying attention and this happened, but this just takes the cake.

I mean, maybe. It's easy to bash on this lady, but how many miles of autonomous driving had that car completed before the crash?


Have you ever been paid to just sit and stare off into the distance for hours? It's hard. I'd rather be homeless than have a job like that; I would zone out and the car would run over someone, just like that. Or I'd just drive the thing myself to stay engaged.
 
It's possible the human safety attendant was simply listening to "The Voice" instead of watching it. The dash cam should show what they were actually doing in the moments before the collision.
The dash cam footage from inside the car was released a while ago. She wasn't watching the road but a device in her hands.
 
Ya know... I've asked this multiple times now. If you can't trust the vehicle to drive itself, and you have to be paying so much attention to what's going on and be ready to grab the controls at any moment... wtf is the point of the vehicle driving itself? Sitting there doing nothing except waiting to grab the wheel seems like it would be far more stressful than just driving myself.

It's possible the human safety attendant was simply listening to "The Voice" instead of watching it. The dash cam should show what they were actually doing in the moments before the collision.

To echo what others have said, what's the point of having an autonomous vehicle if a human is required to babysit it? I get the whole 'gathering data' angle of it, but it still raises the question of safety if a human is required to be present, paying attention, and ready to intervene with an expected response time of less than 2 seconds.

Lastly, I know it's a dead horse, but a few autonomous fatalities across all the miles that have been driven is still a significantly better miles-to-deaths ratio than human drivers manage.

For a commercially available product your points would stand. However, this is a system under testing and development, and the safety driver is being paid to monitor the system and intervene in case something goes wrong. The fact that she didn't, and was doing other things instead, is the equivalent of sleeping on the job or watching YouTube on the job in any other line of work. It's like a security guard watching YouTube instead of alerting the police when bank robbers appear on the security cameras.
 
Ya know... I've asked this multiple times now. If you can't trust the vehicle to drive itself, and you have to be paying so much attention to what's going on and be ready to grab the controls at any moment... wtf is the point of the vehicle driving itself? Sitting there doing nothing except waiting to grab the wheel seems like it would be far more stressful than just driving myself.

Agreed

For a commercially available product your points would stand. However, this is a system under testing and development, and the safety driver is being paid to monitor the system and intervene in case something goes wrong. The fact that she didn't, and was doing other things instead, is the equivalent of sleeping on the job or watching YouTube on the job in any other line of work. It's like a security guard watching YouTube instead of alerting the police when bank robbers appear on the security cameras.

Also, this approach may not be humanly possible. Even if a "safety driver" is conscientious and really trying to pay attention, when the car handles itself 99.99% of the time, focus will drift; eventually even an attentive monitor will start daydreaming. And the less conscientious will start doing other things to stave off boredom: texting, Facebook, forums, or streaming shows. Our monkey brains are very poorly suited to tasks like this.

It's easy to pay attention when you absolutely have to (like when driving the car manually), but as soon as you don't have to, you'll get people to focus for a short period at most; as time goes on and we get more used to the automation, we will drift off, or worse.
 
Negligent homicide? It's one thing if they were paying attention and this happened, but this just takes the cake.

I still blame Uber :)

The NTSB report found that Uber's software "determined that an emergency braking maneuver was needed" 1.3 seconds before the crash. Unfortunately, the vehicle wasn't programmed to actually perform emergency braking procedures—nor was it programmed to alert the safety driver.
https://arstechnica.com/cars/2018/0...ulu-just-before-fatal-self-driving-car-crash/
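
To make that failure concrete, here's a minimal sketch of the decision flow the NTSB describes. Everything here (the names, the 1.4 s threshold, the config flags) is my own illustration of the reported behavior, not Uber's actual code:

```python
# Minimal sketch of the failure mode in the NTSB preliminary report: the
# software concluded emergency braking was needed ~1.3 s before impact, but
# the vehicle was configured neither to brake on its own nor to alert the
# operator. All names and thresholds are illustrative, not Uber's code.

EMERGENCY_BRAKING_ENABLED = False  # reportedly disabled during testing
OPERATOR_ALERT_ENABLED = False     # per the report, no alert path existed

def apply_full_braking() -> None:
    print("BRAKE: full emergency stop commanded")

def alert_safety_driver() -> None:
    print("ALERT: operator takeover requested")

def on_collision_prediction(time_to_collision_s: float) -> None:
    """Handle the planner deciding an emergency maneuver is required."""
    if time_to_collision_s > 1.4:      # not yet classified as an emergency
        return
    if EMERGENCY_BRAKING_ENABLED:
        apply_full_braking()
    elif OPERATOR_ALERT_ENABLED:
        alert_safety_driver()
    # With both flags off, the "braking needed" decision is simply dropped:
    # the car keeps driving and the only backstop is the human's attention.

on_collision_prediction(1.3)  # prints nothing, which is exactly the problem
```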
 
Agreed



Also, this approach may not be humanly possible. Even if a "safety driver" is conscientious and really trying to pay attention, when the car handles itself 99.99% of the time, focus will drift; eventually even an attentive monitor will start daydreaming. And the less conscientious will start doing other things to stave off boredom: texting, Facebook, forums, or streaming shows. Our monkey brains are very poorly suited to tasks like this.

It's easy to pay attention when you absolutely have to (like when driving the car manually), but as soon as you don't have to, you'll get people to focus for a short period at most; as time goes on and we get more used to the automation, we will drift off, or worse.

Put in a regulation that it can only run 45-60 minutes before requiring a 15-minute break. If that is what is required to safely develop these systems, so be it.
 
Have you ever been paid to just sit and stare off into the distance for hours? It's hard.
I don't think that's the proper job description. Well, it's Uber, so maybe it is. Other companies use two drivers, just in case one dozes off behind the wheel.
 
Was she actually watching The Voice or just listening to it? I stream TV shows and YouTube stuff all the time and just listen to them when I'm driving.

Regardless, what reaction time would the driver have needed to realize the car had malfunctioned and respond reasonably? Was it even remotely plausible that she could have prevented the accident? That's the only question that matters as far as I'm concerned.
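
For what it's worth, here's a back-of-the-envelope sketch of that question in Python. The speed, deceleration, and reaction-time figures are my own assumptions (reports put the car around 40 mph, and ~1.5 s is a commonly cited alert-driver perception-reaction time), not numbers from the investigation:

```python
# Back-of-the-envelope numbers for the takeover question. All inputs are
# assumptions: ~40 mph from press reports, a 1.3 s detection-to-impact
# window from the NTSB report, ~1.5 s as a commonly cited alert-driver
# perception-reaction time, and ~0.7 g hard braking on dry asphalt.

MPH_TO_MS = 0.44704

speed = 40 * MPH_TO_MS              # ~17.9 m/s
window_s = 1.3                      # time between "braking needed" and impact
reaction_s = 1.5                    # human perception-reaction time
decel = 0.7 * 9.81                  # hard braking, m/s^2

available = speed * window_s        # distance to impact: ~23 m
stopping = speed**2 / (2 * decel)   # distance to stop with instant braking: ~23 m
lost_reacting = speed * reaction_s  # distance covered just reacting: ~27 m

print(f"distance available:            {available:.1f} m")
print(f"needed to stop (no reaction):  {stopping:.1f} m")
print(f"covered during reaction alone: {lost_reacting:.1f} m")
# A typical reaction time alone eats the entire 1.3 s window, so at best an
# attentive driver shaves off some speed; full avoidance looks implausible.
```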
 
The Safety Driver got so bored at work that they were reduced to watching The Voice.

Oh and what do airline pilots do to stay engaged when autopilot is on?
 
Ya know... I've asked this multiple times now. If you can't trust the vehicle to drive itself, and you have to be paying so much attention to what's going on and be ready to grab the controls at any moment... wtf is the point of the vehicle driving itself? Sitting there doing nothing except waiting to grab the wheel seems like it would be far more stressful than just driving myself.
Too bad you haven't heard of the concept of "trial runs". You know, where systems are tested under observation and/or controlled conditions to iron out the issues.

That's why you can't buy a "self-driving" car yet: it is not market-ready technology.

Sometimes the scientific and engineering illiteracy here can be embarrassing for a tech forum.
 
In the end, the main cause of this accident was the idiot pedestrian who crossed the highway on foot. Darwin in action.
 
The Safety Driver got so bored at work that they were reduced to watching The Voice.

Oh and what do airline pilots do to stay engaged when autopilot is on?

Talk and do paperwork/radio work
 
Lastly, I know it's a dead horse, but a few autonomous fatalities across all the miles that have been driven is still a significantly better miles-to-deaths ratio than human drivers manage.

While I would guess this is true, what are the numbers on public streets? Do we know?

While there are a lot of deaths... there's also a crap ton of hours put into vehicles each day.
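
Here's a rough sketch of how the comparison shakes out. The human baseline is NHTSA's ballpark figure; the autonomous mileage below is purely an assumed number for illustration, not a disclosed total:

```python
# Rough sketch of the miles:deaths comparison. The human baseline of ~1.2
# deaths per 100 million vehicle-miles is NHTSA's ballpark for US drivers;
# the autonomous mileage below is an assumed figure for illustration only.

HUNDRED_MILLION = 100_000_000

human_rate = 1.2        # deaths per 100M vehicle-miles (NHTSA ballpark)

av_miles = 3_000_000    # assumed; swap in real disclosures if known
av_deaths = 1           # the Tempe crash

av_rate = av_deaths / av_miles * HUNDRED_MILLION
print(f"autonomous: {av_rate:.0f} deaths per 100M miles")
print(f"human:      {human_rate} deaths per 100M miles")
# With these assumed inputs the autonomous rate comes out far worse, not
# better; the "better ratio" claim needs vastly more clean miles to hold up.
```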
 
I have very little sympathy for either side. The beached whale of a driver is a dumbass, and the lady walking her bike across a road with oncoming traffic, at night, was just as dumb.
 
I have very little sympathy for either side. The beached whale of a driver is a dumbass, and the lady walking her bike across a road with oncoming traffic, at night, was just as dumb.

Sadly that about sums it up. Camera footage clearly shows the idiot driver staring at her phone for several seconds (the article said five seconds, I believe, which, as we all know, might as well be a lifetime in terms of what can happen while driving). Then at the last half second she looks up, just in time to see the cyclist, and there is nothing she could have done. (Maybe, just maybe, had her hands been on the steering wheel rather than a phone, she could have swerved and avoided or lessened the impact.)

And at night like that, the cyclist looked equally oblivious to her surroundings, just meandering across the street (not at a crosswalk, either), so even with several seconds of warning she would have been hard to avoid, especially in the dark.
 
Too bad you haven't heard of the concept of "trial runs". You know, where systems are tested under observation and/or controlled conditions to iron out the issues.

That's why you can't buy a "self-driving" car yet: it is not market-ready technology.

Sometimes the scientific and engineering illiteracy here can be embarrassing for a tech forum.
Yes, it can. Apparently you've never seen automated systems that need to be continuously monitored long after they've gone into production and are no longer in "trial runs".
 
Yes, it can. Apparently you've never seen automated systems that need to be continuously monitored long after they've gone into production and are no longer in "trial runs".
The systems I assume you are referring to are assembly-line automation, where the point is not doing the job unmonitored; the point is optimizing efficiency and increasing production volume. Obviously you cannot achieve that in driving (well, at least not in that sense).
So if they fail to produce a driving system that can work without constant supervision, it will never become a viable product. So you have nothing to worry about either way.
 
The systems I assume you are referring to are assembly-line automation, where the point is not doing the job unmonitored; the point is optimizing efficiency and increasing production volume. Obviously you cannot achieve that in driving (well, at least not in that sense).
So if they fail to produce a driving system that can work without constant supervision, it will never become a viable product. So you have nothing to worry about either way.
Yet there are more and more idiots wedging water bottles or buying $200 magnets to attach to their steering wheels every day.
 
"Uber + Volvo." Every single news article on the subject.
It's about time Volvo took Uber to court. This accident was entirely Uber's fault; no doubt the lawyers would happily come up with something to make it stick.
 
Yet there are more and more idiots wedging water bottles or buying $200 magnets to attach to their steering wheels every day.
Stupid people will always find ways to be stupid. I don't want to halt progress for their benefit.
 
I mean, maybe. It's easy to bash on this lady, but how many miles of autonomous driving had that car completed before the crash?


Have you ever been paid to just sit and stare off into the distance for hours? It's hard. I'd rather be homeless than have a job like that; I would zone out and the car would run over someone, just like that. Or I'd just drive the thing myself to stay engaged.


I used to get paid to do that; guard duty in the military is a similar thing: staring off into the distance, watching for that rare instance when you actually have to do something. And like the safety driver, if you happen to not be paying attention when something does happen, it doesn't end well.
 
I am more pissed that her data plan lets her watch entire TV shows online.

Mine gets me through the theme song and opening credits, then the overages start.
 
Stupid people will always find ways to be stupid. I don't want to halt progress for their benefit.
There's a slight difference between an idiot managing to kill themselves in their own home, and an idiot sending a 5,000-pound missile down the street at 45 mph into someone else.
 
There's a slight difference between an idiot managing to kill themselves in their own home, and an idiot sending a 5,000-pound missile down the street at 45 mph into someone else.

There are two problems for me with self-driving cars. Accidents seem to happen even when the system is working properly, and on top of that, technological devices fail.

Also, the new smart cars can be hacked and controlled remotely, as some news reports have shown. What would happen if someone managed to hack an autonomous vehicle, or a fleet of them?

Fully autonomous, driverless cars should always have a human backup driver, one who is just as responsible as in normal driving. Remember, Isaac Asimov postulated certain protocols and protections with regard to robots. The marketplace, which is often driven by greed, should not be the final determinant of how this new technology is used.
 
There are two problems for me with self-driving cars. Accidents seem to happen even when the system is working properly, and on top of that, technological devices fail.
I must've missed the last decade and it's already 2028; when did self-driving cars become commonplace? No, the Tesla is not a self-driving car, it's a glorified lane-assist feature with misleading marketing.

Also, the new smart cars can be hacked and controlled remotely, as some news reports have shown. What would happen if someone managed to hack an autonomous vehicle, or a fleet of them?
Everything in cars is already electronically controlled. Adding a self-driving feature won't make a car any easier to hack.
Fully autonomous, driverless cars should always have a human backup driver, one who is just as responsible as in normal driving. Remember, Isaac Asimov postulated certain protocols and protections with regard to robots. The marketplace, which is often driven by greed, should not be the final determinant of how this new technology is used.
With self-driving cars the responsibility must lie with the manufacturer (the owner's only obligation being to keep up with maintenance and software updates). Manufacturers cannot afford to market dangerous vehicles if they're held responsible for every accident the vehicle causes. That's the only way I'd allow self-driving cars to be sold and used without supervision. If they try to somehow make the attending passenger responsible, fuck that, no way.
 
I must've missed the last decade and it's already 2028; when did self-driving cars become commonplace? No, the Tesla is not a self-driving car, it's a glorified lane-assist feature with misleading marketing.


Everything in cars is already electronically controlled. Adding a self-driving feature won't make a car any easier to hack.

With self-driving cars the responsibility must lie with the manufacturer (the owner's only obligation being to keep up with maintenance and software updates). Manufacturers cannot afford to market dangerous vehicles if they're held responsible for every accident the vehicle causes. That's the only way I'd allow self-driving cars to be sold and used without supervision. If they try to somehow make the attending passenger responsible, fuck that, no way.

It's not about them being commonplace, but rather that this is an emerging technology that demands a socially driven conversation. And you can relax, you didn't miss the alarm clock... it's not 2028.

The news reports of hacked smart cars showed limited remote-control capability. I don't see how you cannot see the danger in autonomous cars being hacked.

Let's have this conversation again in 2028.
 