Tesla Driver in Fatal “Autopilot” Crash Got Numerous Warnings

Megalith

According to a report by the National Transportation Safety Board, Joshua Brown’s “death by autopilot” was his own fault: his Tesla Model S repeatedly warned him to get his hands back on the wheel, but he chose to ignore those alerts. Brown, a former SEAL, was killed in Florida when his vehicle collided with a truck: during a 37-minute period of the trip when he was required to have his hands on the wheel, he apparently did so for just 25 seconds.

The National Transportation Safety Board (NTSB) released 500 pages of findings into the May 2016 death of Joshua Brown, a former Navy SEAL, near Williston, Florida. Brown's Model S collided with a truck while Autopilot was engaged, and he was killed. Tesla Inc. spokeswoman Keely Sulprizio declined to comment on the NTSB report. In 2016, however, the company said Autopilot "does not allow the driver to abdicate responsibility." Brown family lawyer Jack Landskroner said in an email that the NTSB's findings should put to rest previous media reports that Brown was watching a movie at the time of the crash, which he called "unequivocally false."
 
Pretty sure most of us knew that. Granted, I remember the thread from back then, and there were a number of people wanting to burn Tesla at the stake for this, even though at the time all the data pointed to driver error and not the car or any system failing.

Just goes to show shitty drivers are shitty drivers.
 
Ignore warnings, pay the price. Sad that it was a former SEAL, but it just goes to show you that common sense pays when you use it.
 
1) What does being a Navy SEAL have to do with any of this?
2) Apparently the software very much allows the driver to abdicate responsibility. He demonstrably did so.
3) I'm not sure how hands on the wheel for 26 seconds out of 37 minutes lays to rest the movie-watching issue. He could have been doing that, been rubbing one out, whatever. We just know he wasn't actively driving and the car bitched about it, but it did not do something like slow down and bring the vehicle to a halt when his hands weren't on the wheel.
 
Pretty sure most of us knew that. Granted, I remember the thread from back then, and there were a number of people wanting to burn Tesla at the stake for this, even though at the time all the data pointed to driver error and not the car or any system failing.

Just goes to show shitty drivers are shitty drivers.

My only gripe with Tesla on this is calling the system "Autopilot" in both marketing and documentation. That term makes people think of a fully automated system. Yes I know the term comes from aviation and the plane doesn't actually fly itself, but it doesn't matter. People are dumb enough to believe what they want to believe.
 
1) What does being a Navy SEAL have to do with any of this?
2) Apparently the software very much allows the driver to abdicate responsibility. He demonstrably did so.
3) I'm not sure how hands on the wheel for 26 seconds out of 37 minutes lays to rest the movie-watching issue. He could have been doing that, been rubbing one out, whatever. We just know he wasn't actively driving and the car bitched about it, but it did not do something like slow down and bring the vehicle to a halt when his hands weren't on the wheel.
On #3, perhaps he stroked out, no pun intended, or had a similar medical emergency and was incapacitated? In either case, the result would have been the same regardless of the vehicle.
 
On #3, perhaps he stroked out, no pun intended, or had a similar medical emergency and was incapacitated? In either case, the result would have been the same regardless of the vehicle.
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design; expect a civil trial.
 
1) What does being a Navy SEAL have to do with any of this?
2) Apparently the software very much allows the driver to abdicate responsibility. He demonstrably did so.
3) I'm not sure how hands on the wheel for 26 seconds out of 37 minutes lays to rest the movie-watching issue. He could have been doing that, been rubbing one out, whatever. We just know he wasn't actively driving and the car bitched about it, but it did not do something like slow down and bring the vehicle to a halt when his hands weren't on the wheel.
Say your Tesla is on the freeway doing 65 mph in the middle lane; how does the Tesla do any more than bitch at the driver to take the wheel? It's not like slowing the car to a halt is any safer in such a situation. It doesn't abdicate any legal responsibility either, since you, the driver, still turned Autopilot on and are sitting behind the wheel. Ultimately it's a fuzzy situation for the AI: any method of aggravating the driver into taking the wheel is iffy, since it could be considered a distraction, and flat-out refusing to continue can create just as much of a safety hazard as carrying on, with the added spice of going against the driver's will, which adds complexity around whether Autopilot is on or off.
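Just to make that tradeoff concrete, here's a rough sketch, purely hypothetical, of the kind of hands-off escalation policy being argued about. The thresholds and action names are made up; nothing here is Tesla's actual firmware logic.

# Hypothetical hands-off-wheel escalation policy. Thresholds and action
# names are invented to illustrate the "keep nagging vs. force a stop"
# tradeoff; this is not how Tesla's Autopilot actually behaves.
from dataclasses import dataclass

@dataclass
class DriverState:
    hands_on_wheel: bool
    seconds_hands_off: float

def escalation_action(state: DriverState) -> str:
    if state.hands_on_wheel:
        return "none"                          # driver engaged, nothing to do
    if state.seconds_hands_off < 15:
        return "visual_warning"                # dashboard nag
    if state.seconds_hands_off < 30:
        return "audible_chime"                 # louder nag
    if state.seconds_hands_off < 60:
        return "lock_out_autopilot_next_stop"  # punish, but keep driving
    # Last resort: hazards on and a gradual in-lane slowdown. This is the
    # step people disagree about, since stopping in a live lane carries
    # its own risks.
    return "hazards_and_controlled_slowdown"

print(escalation_action(DriverState(hands_on_wheel=False, seconds_hands_off=45)))

Even in toy form you can see the problem: every branch past the chime is either annoying or potentially dangerous in its own right, which is exactly why it's fuzzy.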
 
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design; expect a civil trial.
Good luck with that. I've got my popcorn ready for how abruptly that fails. Hint: it won't be an issue for Tesla; they are on perfectly solid legal ground.
 
I don't know why it is necessary to mention that he was a SEAL.
Is that supposed to elicit sympathy, or is it supposed to mean that he had an excuse for being stupid?
I don't get it. It's completely irrelevant to the case.
 
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design; expect a civil trial.
Just because your house is fitted with a fire alarm doesn't mean you can play with fire. And having your hands on the wheel for less than 1% of the time is the equivalent of playing with fire.
 
Just because your house is fitted with a fire alarm doesn't mean you can play with fire. And having your hands on the wheel for less than 1% of the time is the equivalent of playing with fire.
Ehh.... it's more like having a Roomba with a warning on it that says you should watch it while it does its job, then getting complacent, failing to heed the warning, and having it burn down the house.
 
The car can change lanes, but it can't slow down, turn on the hazards, and stop on the shoulder when it knows people aren't paying attention?
 
My only gripe with Tesla on this is calling the system "Autopilot" in both marketing and documentation. That term makes people think of a fully automated system. Yes I know the term comes from aviation and the plane doesn't actually fly itself, but it doesn't matter. People are dumb enough to believe what they want to believe.

That is their problem. Ignorance is not an excuse when the prompt comes up EVERY time you use the system and you have to click accept in the car, which STATES this is not fully autonomous and explains that, just like a plane's autopilot, you are required to remain in control.

The people who have this happen are the same people who crash without assisted driving; it probably just would have happened sooner.
 
Ehh.... it's more like having a Roomba with a warning on it that says you should watch it while it does its job, then getting complacent, failing to heed the warning, and having it burn down the house.
How do you figure?

The collision avoidance systems aren't there to indulge your stupidity but to act as a last resort in case of an unexpected situation. This wouldn't have been an unexpected situation if the driver had been paying attention to the road as he was supposed to. Just as the fire suppression system isn't there so you can leave fires unattended, but to act as a last-ditch effort to save your sorry ass.

The Roomba was designed to do its job without human interaction and guidance. The Autopilot feature in the Tesla wasn't.
 
How do you figure?

The collision avoidance systems aren't there to indulge your stupidity but to act as a last resort in case of an unexpected situation. Just as the fire suppression system isn't there so you can leave fires unattended, but to act as a last-ditch effort to save your sorry ass.

The Roomba was designed to do its job without human interaction and guidance. The Autopilot feature in the Tesla wasn't.
OK. Imagine a world in which you're supposed to watch the Roomba. Just imagine it for a second.
Too hard? OK, let's change it to, I dunno, an automated train system and you're the operator who's supposed to watch it. Change it to autopilot (same name, wow) on an airplane that you're supposed to watch and keep track of. Change it to autopilot on a cruise ship that you're still supposed to watch and keep track of.
Instead of watching it, you decide to take a nap. Now there's a chance nothing bad will happen, but there's also a chance that catastrophe will strike.
 
The car can change lanes, but it can't slow down, turn on the hazards, and stop on the shoulder when it knows people aren't paying attention?
Obviously it could, but you have to wonder what a driver might do if the car began to veer off course. Does the Tesla change lanes to pass? Will it move over to get around other cars as part of regular driving? Or does it just stay in whatever lane you put it in forever? I could easily see some tilted driver trying to correct the car if it began to move onto the shoulder, fighting the AI, and then what? Now you have a Tesla that's veering back and forth across the highway like some drunk texting idiot.

It might just be one of those things whereby this is the new status quo, which is certainly better than the old one. Sure, a few people might die with Autopilot who ordinarily would not have died without it; however, who knows how many near-death experiences they would have had all along if it weren't for Autopilot being there. It's like complaining that an airbag can break your neck if deployed too close. But it also saves your life 99% of the time, better than you would fare without it. Same with Autopilot: it will save more lives than it will cost.
 
So a professional murderer has died. dgz ain't sorry one bit
Is that how you look at everyone in the armed forces? You realize the SEALs aren't just assassins (and even when they are, the point of an assassination is to save lives, usually American lives).
 
"Tesla now in court for allegedly killing Seals!!!" Click to find out more!

That's why they mention he was a seal, for clickbait hyped up bs.
 
That is their problem. Ignorance is not an excuse when the prompt comes up EVERY time you use the system and you have to click accept in the car, which STATES this is not fully autonomous and explains that, just like a plane's autopilot, you are required to remain in control.

The people who have this happen are the same people who crash without assisted driving; it probably just would have happened sooner.

Or the people that complain that they can't figure out how to use something or assemble something when they refuse to read instructions.

Can't fix stupid.
 
The software is doing what it's told. Again, it comes down to human stupidity. And some here blame the software... um, OK. I can see it now: driver dies because the software has overridden human instructions and has to pull over no matter what, because of morons.
 
Yes, this is how I look at everyone in any fucking army in the civilized world. No one joins an army because they are a nice and caring person. Why does it matter that this guy was in the US military? Well, I don't know. But since it's been mentioned, I thought I'd share my opinion.

Your opinion is flat-out wrong and reeks of terrible ignorance. Have you interviewed every member of the armed forces and watched their lifestyle? You have no grounds on which to say that. You are no better than trash.

I am not trolling. My disdain is genuine.

And unfounded. Get off your white horse; it's actually a mud-covered pig.
 
Really... so you would rather there be no military, so that whoever wants to can come and take over the country where you live and wantonly rape, kill, and/or torture you without anybody to stop them?

May your wildest dreams about this come to pass as long as you don't live in my country.

No, having an army is a necessary evil. Doesn't make the people who join such structures immune to evaluation. They are professional murderers.
 
I am not trolling. My disdain is genuine.
Troll:
-a person who makes a deliberately offensive or provocative online post.

Whether your feelings are genuine or not does not factor into the definition. Your choice of words, "professional murderer," makes it deliberately provocative and offensive. Military personnel killing while on duty do not qualify for the "unlawful" part of the definition of murder. Therefore, you are deliberately trying to piss people off, especially those who are in the military or have family/friends in the military. Knock it off. :punch:
 
lol this after the "auto-pilot water bottle "hack"" on the front page!
and yeah, the topic is not about the army...
 
Good luck with that. I've got my popcorn ready for how abruptly that fails. Hint: it won't be an issue for Tesla; they are on perfectly solid legal ground.
I kinda think that the car should know better than to drive under a semi-trailer.
How much do you wanna bet that software Rev 2.0 knows the height of the vehicle and takes into account the height of the opening it is pointed at, because not every opening is a tunnel or overpass? :eek:
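Something like this, presumably. A totally made-up check, numbers included, just to show the kind of clearance test being suggested; nothing here comes from Tesla's software.

# Hypothetical vertical-clearance check: only treat an opening as drivable
# if it is taller than the car plus a safety margin. All numbers are rough
# guesses for illustration, not real specs.
VEHICLE_HEIGHT_M = 1.45   # ballpark Model S height
SAFETY_MARGIN_M = 0.50    # invented buffer

def opening_is_drivable(opening_height_m: float) -> bool:
    return opening_height_m >= VEHICLE_HEIGHT_M + SAFETY_MARGIN_M

# The gap under a typical trailer is only around a metre or so, which
# fails the test, while a real overpass clears it easily.
print(opening_is_drivable(1.0))   # False: gap under a trailer
print(opening_is_drivable(4.5))   # True: overpass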
 
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design, expect a Civil Trial.

I don't understand how you can look at the same evidence and feel that Tesla is on the hook. "Objects may be closer than they appear", "Batteries not included", "This is not a step", "Do not put in ear", "Do not use near fire, flame, or sparks", "Not for ingestion"..........they warned folks. Folks have such a habit of ignoring warnings that companies have to place labels on things to protect themselves. Common sense isn't always common. Some Halloween costumes have actually had to be labeled "Costume does not allow you to fly"! Tesla's warnings were ignored. People like to do something stupid and then blame someone else. Unfortunately, the driver was killed.
 
3) I'm not sure how hands on the wheel for 26 seconds out of 37 minutes lays to rest the movie-watching issue. He could have been doing that, been rubbing one out, whatever. We just know he wasn't actively driving and the car bitched about it, but it did not do something like slow down and bring the vehicle to a halt when his hands weren't on the wheel.
This is the only part I have an issue with. Can Tesla prove with zero doubt that his hands were on the wheel for only that time frame? Can they prove the sensors that detect his hands on the wheel were functioning 100%?

Granted, if it was warning him and he did in fact have his hands on the wheel, then it brings up a whole different level of stupidity on his part (like not pulling over, not attempting to shut down the vehicle, not calling 911 if he couldn't, etc.).
 
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design; expect a civil trial.

If you believe that the Tesla has a full autopilot and can perform all those functions without error, you're delusional. They have disclaimers that pretty much state the driver needs to be fully attentive and in control at all times. That's why they have you agree to their EULA and TOS and all that BS. If you didn't read it, that's on you. If you didn't fully comply with their constant notifications and stay in control, that's on you. This is why companies put so many warnings on everything: stupidity. And these asshats sue the companies. You're contributing to the problem.

Civil trial? For what? Being stupid and negligent? Let me get my popcorn.
 
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design; expect a civil trial.
My point was he wouldn't have been any better off in a traditional vehicle, and those freaking out should realize the alternative isn't better.
 
Civil trial? For what? Being stupid and negligent? Let me get my popcorn.
You must not be from the good ol' USA, because if you were you would know that we are a sue-happy nation. The Brown family does not have a lawyer looking into this because they like lawyers. They will sue, Tesla will settle out of court, and life will go on.

PS: I hope your crow-flavored popcorn is low sodium.
 
My point was he wouldn't have been any better off in a traditional vehicle, and those freaking out should realize the alternative isn't better.

Even though it is equipped with Traffic-Aware Cruise Control (TACC) and the Autosteer lane-keeping system, the most damning fact is that "The car was also equipped with automatic emergency braking that is designed to automatically apply the brakes to reduce the severity of or assist in avoiding frontal collisions."
Telemetry shows the vehicle's speed before and up to the crash was "74 mph, where it remained for approximately two minutes up to and just after the crash."

The investigation proves that this vehicle did not "see" a tractor-trailer rig in front of it.

The combination of software and sensors was negligent, resulting in the death of this man. Lawyers everywhere are chomping at the bit.
 
My only gripe with Tesla on this is calling the system "Autopilot" in both marketing and documentation. That term makes people think of a fully automated system. Yes I know the term comes from aviation and the plane doesn't actually fly itself, but it doesn't matter. People are dumb enough to believe what they want to believe.

An autopilot in an aircraft is not full automation; the pilot is still required to be alert and at the controls, as the autopilot system can kick off at any time.
 