U.S. Opens Investigation Into Fatal Tesla Crash

HardOCP News

It was only a matter of time before something like this happened. Even though you should know better than to use the beta "autopilot" feature on the Tesla, someone didn't listen and now they are dead and the safety of the vehicles is being called into question. :(

The agency said the crash came in a 2015 Model S operating with automated driving systems engaged, and "calls for an examination of the design and performance of any driving aids in use at the time of the crash." It is the first step before the agency could seek to order a recall if it believed the vehicles were unsafe.
 
"Tesla said that "Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

This is almost Microsoft-level spin doctoring. Same lawyers?
 
I love the tech, but "people are stupid". They will abuse your tech in ways most engineers can never imagine. Because of that, the tech needs to be all or nothing: either the car drives itself, or the human does. It needs to be 100% clear who, or what, is supposed to be driving at all times. It's that, or suffer a higher-than-usual number of state/federal investigations and lawsuits.
 
It was only a matter of time before something like this happened. Even though you should know better than to use the beta "autopilot" feature on the Tesla, someone didn't listen and now they are dead and the safety of the vehicles is being called into question. :(

The agency said the crash came in a 2015 Model S operating with automated driving systems engaged, and "calls for an examination of the design and performance of any driving aids in use at the time of the crash." It is the first step before the agency could seek to order a recall if it believed the vehicles were unsafe.


Which is why I'm thinking beta features should probably be deactivated until they are no longer beta.
 
To me, it seems like a weird position for a driver to be only "partially" engaged in driving. Fully autonomous self-driving makes sense, but this weird state where I'm keeping an eye on the road at all times but not actually doing anything... maybe that's just because I've never had to do it.
 
Everyone wants to blame the car, but even if the car had not been in autopilot, that doesn't mean this would have been avoided. Sounds like the trailer is at fault.
 
From the videos I've seen recently, it seems like automatic emergency braking is just not working, or it's all a lie. I actually haven't even heard of anyone seeing it in action; it tends to only beep really loudly at you.
 
To me, it seems like a weird position for a driver to be only "partially" engaged in driving. Fully autonomous self-driving makes sense, but this weird state where I'm keeping an eye on the road at all times but not actually doing anything... maybe that's just because I've never had to do it.


It's pretty cool, actually. My uncle has a Model S 75 or whatever, and I got to squirt around town with it and test the auto drive. In reality it obeys all posted limits, so it's a really boring drive after you get over the "oh wow" factor. An older relative tested it too, and he was freaking the heck out when he engaged the auto drive; I think that had to do with his age, lol. That said, I can see how some could abuse it by completely ignoring the car and zoning out or sleeping. I was imagining how easy the drive would be from LA to Vegas, baby!
 
Everyone wants to blame the car, but even if the car had not been in autopilot, that doesn't mean this would have been avoided. Sounds like the trailer is at fault.

Huh? There's not enough information to even make that claim. What if the car was in autopilot, ran a red light, and smacked into the trailer when it made a left turn on a green light? If it wasn't in autopilot and the driver was properly driving, then they would have stopped at the red light. Or what if the truck was turning onto the highway, the driver saw the Tesla was far enough away for him to pull out into the road, and did, but the Tesla never slowed down and just plowed into the trailer? Both situations would be the fault of the Tesla.

Is this what happened? I don't know. There's not enough information. Just a lot of speculation right now.
 
Huh? There's not enough information to even make that claim. What if the car was in autopilot, ran a red light, and smacked into the trailer when it made a left turn on a green light? If it wasn't in autopilot and the driver was properly driving, then they would have stopped at the red light. Or what if the truck was turning onto the highway, the driver saw the Tesla was far enough away for him to pull out into the road, and did, but the Tesla never slowed down and just plowed into the trailer? Both situations would be the fault of the Tesla.

Is this what happened? I don't know. There's not enough information. Just a lot of speculation right now.


It wouldn't. If the Tesla sees an obstacle, it will slow to avoid the collision, not swerve. The biggest issue is other cars crashing into the Tesla. The incident in question stems from a hole in the recognition software/sensors, something that I presume made it "look" like the truck wasn't there. But that's where the driver is supposed to catch the system, and the driver was probably sleeping. Read the story.
 
After reading the article, it seems this problem is similar to the other problems I've read about on here. So the car doesn't detect things if they're above the ground at just the right height? I take it this detection is not like what's in self-driving cars?
 
Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him."

"It was still playing when he died and snapped a telephone pole a quarter mile down the road," Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida.
 
Well, that sucks for the guy, but I'm curious: he was watching Harry Potter, which sounds a lot like he was treating it as a self-driving car, and it makes me wonder why this is legal while Google's self-driving cars are limited to streets of 25 MPH or less.
 
It wouldn't. If the Tesla sees an obstacle, it will slow to avoid the collision, not swerve. The biggest issue is other cars crashing into the Tesla. The incident in question stems from a hole in the recognition software/sensors, something that I presume made it "look" like the truck wasn't there. But that's where the driver is supposed to catch the system, and the driver was probably sleeping. Read the story.

FYI, Autopilot isn't designed to deal with traffic lights; it blows through them. I think the feature has been slowly getting rolled out, but it's still not smart enough to deal with complex traffic lights at those 5- or 6-point intersections. I doubt it ever will be without getting more sensors.


After reading the article, it seems this problem is similar to the other problems I've read about on here. So the car doesn't detect things if they're above the ground at just the right height? I take it this detection is not like what's in self-driving cars?

Nah. Autonomous cars have the same sensors as the Tesla, but they also have a bunch of other sensors the Tesla doesn't have, so they can see objects well above ground level.

This wasn't so much a problem with height detection as with detecting what's in front of it in general. That's the problem with using only one or two types of sensors: each sensor has limitations, so you need different types to make up for them. A camera has the limitation of not always being able to properly see what's in front of it. Sonar would help with that, but sonar has trouble seeing far. That's where Google uses lidar, but Musk seems to not want to use it on Tesla's cars. Understandable; it'd make the car ugly to have a lidar unit sitting on top of it.
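To make the fusion point concrete, here's a minimal Python sketch of a two-sensor confirmation rule, the kind of cross-check the post is arguing for. Everything in it (the Detection class, the 60 m range, the two-sensor threshold) is invented for illustration; it is not Tesla's actual detection logic.

```python
# Hypothetical illustration of multi-sensor confirmation; names and
# thresholds are made up, not Tesla's real logic.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    distance_m: float  # range to the object
    height_m: float    # height of the object's lower edge above the road

def is_braking_target(detections: list[Detection]) -> bool:
    """Brake only when at least two independent sensor types report an
    obstacle in range, so one sensor's blind spot can't trigger (or, as
    in this crash, suppress) a braking decision on its own."""
    confirming = {d.sensor for d in detections if d.distance_m < 60.0}
    return len(confirming) >= 2

# A high trailer: radar looks under the raised body, the camera misreads
# the bright side as sky, so only a (hypothetical) lidar reports it.
trailer = [Detection("lidar", 45.0, 1.2)]
print(is_braking_target(trailer))  # False -> no braking without agreement
```

The trade-off is exactly what the post describes: requiring agreement suppresses false alarms, but it also means a real obstacle seen by only one sensor type gets ignored.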
 
From the videos I've seen recently, it seems like automatic emergency braking is just not working, or it's all a lie. I actually haven't even heard of anyone seeing it in action; it tends to only beep really loudly at you.
I've seen a video where an oncoming car making a left turn across traffic cut off the Tesla, which would have resulted in a head-on collision. The Tesla braked incredibly fast and probably saved the driver from a crash, certainly faster than most people could have reacted given the circumstances (I think it was raining, at night, in a construction zone with lots of cones and reflective shit everywhere).
 
The thing about autopilot is that it is not really as situationally aware as people think. It is just using GPS to maintain road position and the basic car radar used in adaptive cruise control systems to maintain speed. It's not like an AI with cameras that can recognize obstacles and differentiate between threats the way a human can. To me this is pretty unnerving; I want something to have human-level awareness of its surroundings before trusting it.

However, the data suggests nearly all traffic accidents are caused by human incompetence in precisely the areas where autopilot shines. Being able to see a pile of bricks fall off a bridge or an airplane making an emergency landing on the freeway is not a legitimate requirement; if such a thing were to occur the Tesla would be toast, but it's unlikely. How people actually get into wrecks is by merging lanes without checking blind spots, making idiotic last-second decisions to take an exit ramp, texting while driving and slamming into someone from behind, or being a piece of shit and going for a turn across multiple lanes of traffic because they don't want to wait for the next light, etc. All of these things are rather trivial for autopilot systems, so in that regard you should be safer, statistically speaking. But I still don't feel comfortable behind the wheel of something that is the equivalent of a blind man holding his hands out in front of him.
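For what it's worth, the "basic car radar" adaptive-cruise behavior described above boils down to something like this toy time-gap controller. It's a minimal sketch with made-up gains and a made-up 2-second gap, not Tesla's actual control code:

```python
# Toy time-gap controller for radar adaptive cruise control.
# Gains and the 2 s following gap are illustrative values only.

def acc_speed_command(own_speed, lead_distance_m, lead_speed,
                      set_speed, time_gap_s=2.0, kp=0.5):
    """Return a commanded speed in m/s: follow a detected lead vehicle at
    a fixed time gap, otherwise cruise at the driver's set speed. Note
    what's missing: no cameras, no object classification, no judgment --
    just a distance and a relative speed from the radar."""
    if lead_distance_m is None:            # radar reports nothing ahead
        return set_speed
    desired_gap = own_speed * time_gap_s   # e.g. 2 seconds of travel
    gap_error = lead_distance_m - desired_gap
    command = lead_speed + kp * gap_error  # converge on the lead's speed
    return max(0.0, min(command, set_speed))

# Cruising at 30 m/s with a lead car 50 m ahead doing only 25 m/s:
print(acc_speed_command(30.0, 50.0, 25.0, 31.0))  # -> 20.0 (back off)
```

If the radar never registers the trailer as a lead vehicle, `lead_distance_m` stays `None` and the car simply holds its set speed, which is the blind-man analogy in a nutshell.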
 
I've seen a video where an oncoming car making a left turn across traffic cut off the Tesla, which would have resulted in a head-on collision. The Tesla braked incredibly fast and probably saved the driver from a crash, certainly faster than most people could have reacted given the circumstances (I think it was raining, at night, in a construction zone with lots of cones and reflective shit everywhere).
Hundreds of models have auto-braking options, and have for quite some time. It's not a Tesla thing.
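For context, most of those auto-braking systems reduce to a time-to-collision check along these lines; the 1.0 s and 2.5 s thresholds below are invented for illustration, not any manufacturer's actual tuning:

```python
# Back-of-the-envelope time-to-collision (TTC) heuristic of the kind
# behind most automatic emergency braking systems. Thresholds invented.

def ttc_seconds(gap_m: float, closing_speed_ms: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_ms <= 0:
        return float("inf")     # not closing, so no collision course
    return gap_m / closing_speed_ms

def aeb_action(gap_m: float, closing_speed_ms: float) -> str:
    ttc = ttc_seconds(gap_m, closing_speed_ms)
    if ttc < 1.0:
        return "full braking"
    if ttc < 2.5:
        return "warning chime"  # the loud beep posters describe
    return "no action"

# Cut off 20 m ahead while closing at 15 m/s (~54 km/h difference):
print(aeb_action(20.0, 15.0))   # TTC ~1.3 s -> "warning chime"
```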
 
After reading the article. It seems this problem is similar to the other problems I read on here. So the car doesn't detect things if they're above the ground just right. I take it this detection is not like whats in self driving cars?


It's very similar to that Tesla Model S owner who summoned the car to the sidewalk and it ran into the back of a truck trailer that was higher off the ground than a normal vehicle.


Seems like a relatively simple thing to adjust for in the future: just expand the sensors to take higher truck trailers into account, and if the sensors can be blinded by sunlight, fix that as well.
 
I've seen a video where an oncoming car making a left turn across traffic cut off the Tesla, which would have resulted in a head-on collision. The Tesla braked incredibly fast and probably saved the driver from a crash, certainly faster than most people could have reacted given the circumstances (I think it was raining, at night, in a construction zone with lots of cones and reflective shit everywhere).

I just hope Tesla improves the emergency braking feature. My wife nearly rear-ended someone at night once, and I did not notice the car slowing down until I screamed at her to slam on the brakes. Though to be fair, she was not using autopilot.

I don't know what to think of this tragedy. He seemed like a cool guy.
 
Cue the hysterical technophobe overreactions that will set back the industry, to the benefit of large US car companies that don't have self-driving tech yet...
 
I still think this autonomous driving push is a crock of shit, as I have said since the beginning. It will never leave the sunny southern states.

Self-Driving Cars Hit a Roadblock in the Snow

"However, when human drivers can’t see the lines they generally create their own paths. Eustice says teaching driverless cars how to follow those patterns is “really hard.” They’ll have to undergo experience-based learning using artificial intelligence. But even if these issues are resolved, senior director of automotive at Nvidia Danny Shapiro says, “I don’t think that we should expect that in a blinding snowstorm the autonomous vehicle will be fine.”"

As a health care worker - even in a state of emergency I have to get to work. Why would I buy an additional $10k+ worth of tech for something I can only use 2/3 of the year?
 
No matter how many "Danger, Use at Own Risk, Beta Test Feature" warnings Tesla puts on this, the blunt truth is that no one else on the street that day knew about or accepted the risk of that driver using a test feature Tesla installed in the vehicle. If Tesla keeps using public roads as test tracks for beta features, with poorly prepared owners as test drivers, it will eventually lose that lottery and get sued into bankruptcy.
 
"Tesla said that "Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

This is almost Microsoft-level spin doctoring. Same lawyers?
That statement is red meat to a lawyer. The first part more or less makes it clear that it would be confusing to the driver how much involvement is required of him. They just confessed that its capability is a moving target.

That said, fundamentally, a half measure like this, where the driver can withdraw his attention 90% of the time, will tempt the driver to withdraw his attention 100% of the time. You just made driving boring; 90% of the time it will be like watching grass grow, and the driver's mind will want to wander. Automation has to stay in the 10-30% range, then go straight to full automation. 30-99% is begging for trouble.
 
It was only a matter of time before something like this happened. Even though you should know better than to use the beta "autopilot" feature on the Tesla, someone didn't listen and now they are dead and the safety of the vehicles is being called into question. :(

The agency said the crash came in a 2015 Model S operating with automated driving systems engaged, and "calls for an examination of the design and performance of any driving aids in use at the time of the crash." It is the first step before the agency could seek to order a recall if it believed the vehicles were unsafe.
It's a system designed to work under human supervision, not as an autonomous vehicle. The blame falls entirely on the driver. Case closed.
 
I still think this autonomous driving push is a crock of shit, as I have said since the beginning. It will never leave the sunny southern states.

Self-Driving Cars Hit a Roadblock in the Snow

"However, when human drivers can’t see the lines they generally create their own paths. Eustice says teaching driverless cars how to follow those patterns is “really hard.” They’ll have to undergo experience-based learning using artificial intelligence. But even if these issues are resolved, senior director of automotive at Nvidia Danny Shapiro says, “I don’t think that we should expect that in a blinding snowstorm the autonomous vehicle will be fine.”"

As a health care worker - even in a state of emergency I have to get to work. Why would I buy an additional $10k+ worth of tech for something I can only use 2/3 of the year?
The Tesla is not an autonomous car, for crying out loud. It's a regular car equipped with driving aids, much like automatic transmission, adaptive cruise control, ESP, ESR, TCS, and so on. The Tesla has a lane-assist system, nothing more. Using the example of an idiot who thought otherwise to keep real autonomous vehicles down is the actual crock of shit.

Remember the case where the RV driver went back into the living area while driving on the highway to make coffee, because he thought cruise control would take care of the driving on an empty stretch of road? Yet no one advocated against cruise control in cars; everyone understood that he was an idiot, even if he won the court case because of the absurd US legal system.

If we banned everything that was ever misused by some idiot, we wouldn't have many nice things.

As for the ridiculous example: in a blinding snowstorm, no human is fine either. By that logic we shouldn't even have cars, because they're useless in 5 feet of snow, and sometimes we have 5 feet of snow. Why do people buy motorbikes when they can only really use them during the summer months? Why do you buy a rubber dinghy when you can't use it if the pond is frozen? Hell, I purchased a camera I only use about five times a year.
Does your car have cruise control? Or did you refuse to pay for that too, because you can't use it all the time?
 
Did you guys read the bottom of the article?
A YouTube account belonging to a Joshua Brown whose personal details, including the company where he worked, match those of the accident victim, includes a video posted on April 5 titled “Autopilot Saves Model S.” In the video, a bucket truck, the type used by people working on utility poles, cuts off a Model S.

The written description of the 40-second video states, “The truck tried to get to the exit ramp on the right and never saw my Tesla. I actually wasn't watching that direction and Tessy (the name of my car) was on duty with autopilot engaged. I became aware of the danger when Tessy alerted me with the "immediately take over" warning chime and the car swerving to the right to avoid the side collision."

As of Thursday afternoon, the video had 1.7 million views.

A video posted to YouTube last October by the same user showed scenarios in which autopilot “might not do so well,” according to the commentary, which added that drivers need to be “very aware of what the car is doing.”
It's possible he got so many YouTube views that he would put himself in dangerous situations just to post them.
 
Great. As a motorcycle rider, now I get to worry about vehicles on autopilot. Lots of Teslas running around where I live. Ugh.
 
Did any of you knuckleheads read the fucking article?

"The NHTSA said preliminary reports indicate the crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection."

The driver didn't hit the brakes, which means he either a) wasn't paying attention or b) didn't see the truck either.

Stop blaming tech for shitty drivers.
 
I don't think Tesla's marketing of the self-drive feature is very responsible. They should be calling it advanced cruise control rather than autopilot, because people hear "autopilot" and think, "Now I can text people and not pay attention." There isn't anything wrong with the technology; it's the promises made by the company that are out of step with what the tech actually is. I have been told by many Tesla owners that the car can drive itself, which is not strictly true, and Tesla is definitely to blame there, if not strictly to blame for this or any other crash.
 
Cue the hysterical technophobe overreactions that will set back the industry, to the benefit of large US car companies that don't have self-driving tech yet...
How about in the opposite direction?

This is further proof we can't have PEOPLE driving on the roads anymore. Let's take a tally of how many PEOPLE-caused deaths there are vs. auto-drive deaths, and let's not skew the values by using some silly deaths-per-1000-drivers ratio!
 