Another Tesla Autopilot Crash Reported

HardOCP News

I firmly believe that, if you are a Tesla owner with internet access or a TV, and you continue to use your car's Autopilot feature, you have no one to blame except yourself when something like this happens.

The accident happened in Whitehall, Montana. Based on the pictures and the information relayed by his friend, the driver appears to claim that he was driving on Autopilot, set between 56 and 60 mph, on a road without a center divider, when the vehicle drove off the road and hit a guardrail made of wooden posts.
 
I'm wondering what the benefit/point of auto-pilot on the roads is if the human driver will still be at fault.
 
I'm wondering what the benefit/point of auto-pilot on the roads is if the human driver will still be at fault.
I like to think of auto-pilot as cruise control with steering. With cruise control the driver is still responsible for slowing down. Even with auto braking systems the computer is only good to a point. We have to remember in these situations that auto-pilot is not the same as autonomous.
 
I'm wondering what the benefit/point of auto-pilot on the roads is if the human driver will still be at fault.

Given the lack of actual details about this crash, I would imagine the human driver had enough time to tap the brake to disengage autopilot if they were paying attention. Hell, the driver could have even grabbed the wheel to bring it back into the lane, which also disengages autopilot.

I can't picture a scenario in which this would have occurred so suddenly that there would be no time to react. The human should be at fault for dismissing the GIANT WALL OF TEXT agreement to enable the feature.
 
If both hands are on the wheel and you are looking straight ahead, I fail to see how you can ever lose control of the car with auto pilot turned on. If both hands are not on the wheel and/or you are not looking straight ahead, but rather doing something on your phone or watching DVDs, I can easily see how the car can lose control.
 
I can't picture a scenario in which this would have occurred so suddenly that there would be no time to react. The human should be at fault for dismissing the GIANT WALL OF TEXT agreement to enable the feature.

Dark road, probably long drive considering it is Montana, driver dozes off, jolted awake as vehicle is smashing into posts, claims no idea what went wrong to cover own ass.
 
The problem with the autopilot system, or any autopilot system (aircraft, helos, etc.), is garbage in, garbage out.

Tesla's system relies on so many standardized cues from the environment that missing one or more can result in a crash. Most back-country roads are missing several or most of the standardized markings found in more urban environments.

Just like with aircraft, the operator is responsible for the safety of the vehicle and for making sure the system is working correctly at all times.

I don't know where people get this fucked up idea that they can just jump in and it's the car's responsibility to get them from point A to B with no interaction or supervision from the driver. If you get in a Tesla and sleep, you should not only get a hefty fine but lose your license.
 
Every Tesla Crash, ever, from now on, will be caused by autopilot!

It will be interesting to see what happens when they find people lying about it. I'm pretty sure it's trivial for them to find out, based on the amount of data this car records.
 
Auto-driving will never be as reliable as a human. There is just too much subtlety going on that sensors would not pick up on.

A sensor might see the car adjacent to you driving straight and think there's no problem. The human might see the driver in said vehicle is texting and doesn't have their eyes on the road. The human would slow down in advance anticipating the distracted driver might drift out of their lane, or have to make a sudden move.

Auto-driving cannot predict... it can only react after the fact.
 
Auto-driving will never be as reliable as a human. There is just too much subtlety going on that sensors would not pick up on.

A sensor might see the car adjacent to you driving straight and think there's no problem. The human might see the driver in said vehicle is texting and doesn't have their eyes on the road. The human would slow down in advance anticipating the distracted driver might drift out of their lane, or have to make a sudden move.

Auto-driving cannot predict... it can only react after the fact.

Must not drive in a major city... I would argue that autopilot in its current state is probably better than at least 50% of the human drivers on the road. People in general are shitty drivers that are typically focused on anything but driving (usually their phone).
 
I'd rather be in control when an accident happens than die because of some computer.
Only people who don't care about their life will take their hands off the wheel.
 
I'm wondering what the benefit/point of auto-pilot on the roads is if the human driver will still be at fault.

This is a driver aid, like cruise control, ABS, parking sensors, etc., not a responsibility waiver.

Autopilot is meant to be a lane assist, meaning it helps you stay in the lane without too much actual hand input, thus reducing strain on the driver, much the same way cruise control reduces the strain of having to modulate the pedal at all times. You are supposed to keep your hands on the wheel at all times and are still driving; people, however, are using it as if it's a totally autonomous vehicle.
 
I'm wondering what the benefit/point of auto-pilot on the roads is if the human driver will still be at fault.

Well, that's actually a question we can't answer yet, since there is no system on the planet currently marketed, sold, or allowed to be used as your autonomous driver!

But when we do have legally approved autonomous systems in use, there will be laws and insurance surrounding these yet-to-exist services.

Since they don't exist, asking a question like you did might imply you think what Tesla has in their system is something to be used as your driver, without your supervision! (it's not)
They have been pretty clear about this as well (or so I thought).
 
Again, eye-tracking camera + software to make sure they're looking at the windscreen.

At the very least, Tesla should consider disabling the feature until users agree to have an inward facing camera for use in accident situations. Guaranteed these fucks aren't watching the road.
 
Dark road, probably long drive considering it is Montana, driver dozes off, jolted awake as vehicle is smashing into posts, claims no idea what went wrong to cover own ass.
Every Tesla Crash, ever, from now on, will be caused by autopilot!
It will be interesting to see what happens when they find people lying about it. I'm pretty sure it's trivial for them to find out, based on the amount of data this car records.

You can't fake an autopilot crash with a Tesla. They collect so much data that they know how many times you opened the glove box in the last 6 months. Manual brake application, steering input and accelerator application during the 10 minutes before and after a crash are a given.
 
According to the article, this was on a road without a center divider, and Tesla recommends not using the auto feature on a road without one. So who's to blame here?
 
According to the article, this was on a road without a center divider, and Tesla recommends not using the auto feature on a road without one. So who's to blame here?

Like always, the human driver is responsible.
 
130 million miles and 3 crashes in a few months... and it will just continue as time passes and more people own Tesla vehicles.

I mean, I've been a beta tester for a wide variety of products for decades now, not just computer hardware and software, but I wouldn't be a beta tester for autonomous vehicles if you paid me cash, gave me the car free of charge, and paid for all the gas and the maintenance for 10 solid years. ;)
 
Depends on the person.

Based on some drivers I've seen, even Tesla is getting pretty close :p
If they had tracking built into the road, and every single vehicle were automated, there would be no need for traffic stops and there would be zero accidents, or very few, if any.
 
I remember before Tesla's autopilot was launched, a Tesla shareholder was so excited, saying how she could take a nap during her commute soon.

130 million miles and 3 crashes in a few months... and it will just continue as time passes and more people own Tesla vehicles.

I mean, I've been a beta tester for a wide variety of products for decades now, not just computer hardware and software, but I wouldn't be a beta tester for autonomous vehicles if you paid me cash, gave me the car free of charge, and paid for all the gas and the maintenance for 10 solid years. ;)
I would. They just wouldn't get much data from me, as I would always be driving it manually.
 
If both hands are on the wheel and you are looking straight ahead, I fail to see how you can ever lose control of the car with auto pilot turned on. If both hands are not on the wheel and/or you are not looking straight ahead, but rather doing something on your phone or watching DVDs, I can easily see how the car can lose control.

What's the point of autocontrol if you have to have hands on the wheel and pay attention anyway? I just knew this shit was going to be a fail.
 
The only thing about the Tesla AutoPilot that compels me whatsoever is the car's ability to come to ME instead of me going to it. To walk out of Sears, press a button on my phone, and then have the freakin' car pull up to the curb and pick me up is science fiction shit, and it actually works.
 
You can't fake an autopilot crash with a Tesla. They collect so much data that they know how many times you opened the glove box in the last 6 months. Manual brake application, steering input and accelerator application during the 10 minutes before and after a crash are a given.

I believe that's what I was saying :)
 
What's the point of autocontrol if you have to have hands on the wheel and pay attention anyway? I just knew this shit was going to be a fail.
I don't really know what the point is. I suppose it would be nice to just keep one hand on the wheel and let it steer around minor turns and such, but you cannot let go and do whatever you want in the car.
 
Auto-driving will never be as reliable as a human. There is just too much subtlety going on that sensors would not pick up on.

A sensor might see the car adjacent to you driving straight and think there's no problem. The human might see the driver in said vehicle is texting and doesn't have their eyes on the road. The human would slow down in advance anticipating the distracted driver might drift out of their lane, or have to make a sudden move.

Auto-driving cannot predict... it can only react after the fact.

Yes, auto pilot will never work... ever... in history. There will be no advancements in self-driving technology, and as the human race we will only need gasoline-powered, human-controlled vehicles.
 
130 million miles and 3 crashes in a few months... and it will just continue as time passes and more people own Tesla vehicles.

I mean, I've been a beta tester for a wide variety of products for decades now, not just computer hardware and software, but I wouldn't be a beta tester for autonomous vehicles if you paid me cash, gave me the car free of charge, and paid for all the gas and the maintenance for 10 solid years. ;)

For context, the US averages ~75 accidents and ~1 death per 100 million miles driven.
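
For what it's worth, here is that comparison worked out in a few lines of Python, taking those rates at face value (they are the figures above, not an official source):

```python
# Worked comparison, taking the rates above at face value (~75 accidents
# and ~1 death per 100 million miles; these are the post's figures, not an
# official source).

accidents_per_mile = 75 / 100_000_000
deaths_per_mile = 1 / 100_000_000
autopilot_miles = 130_000_000  # "130 million miles", per the post

print(f"Expected accidents at the US rate: {autopilot_miles * accidents_per_mile:.1f}")
print(f"Expected deaths at the US rate:    {autopilot_miles * deaths_per_mile:.1f}")
# -> about 97.5 accidents and 1.3 deaths over 130M miles, vs. the 3 crashes cited
```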
 
The common denominator here is stupid people who actually think autopilot means "do anything and everything except pay attention to the road". Not sure about the legal aspect of it, but if I were Tesla, I'd put in some sort of microcam that activates anytime autopilot is turned on. And Tesla is smart enough to program it such that sticking black tape over it will not fool the system.

Don't want to be recorded? Then no autopilot for you.

At the end of the day, Tesla has done more miles with fewer accidents/injuries, but nobody seems to understand this. Until you remove the human element on both sides of the equation (you as a driver, and other drivers on the road), you can't account for 100% of every possible situation.
 
Reference material? Or just made up shit?

From this article talking about driverless cars -- the accident rate in the US is 1 accident for every 165,000 miles driven: Google's Driverless Car Is Now Safer Than the Average Driver

Tesla has logged about 47 million miles' worth of autopilot driving, according to this: Google’s self-driving car vs Tesla Autopilot: 1.5M miles in 6 years vs 47M miles in 6 months

To be *as good as* the average human driver, we should have seen something around 300 accidents involving Tesla autopilot so far. We are up to what? 5? 8? Hell, round up and add a zero for fun; still way better than the average human.

edit: duplicate pasted links.
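
For anyone who wants to check that arithmetic, here is a quick back-of-the-envelope version in Python, using only the figures quoted above (the 1-in-165,000 rate and the 47M-mile total; neither is independently verified):

```python
# Rough check of the math above. The 1-in-165,000 accident rate and the
# 47M-mile figure are simply the numbers quoted from the linked articles,
# not independently verified data.

human_accident_rate = 1 / 165_000   # accidents per mile, US average per the article
autopilot_miles = 47_000_000        # Autopilot miles logged, per the article

expected_accidents = autopilot_miles * human_accident_rate
print(f"Expected accidents at the human rate: {expected_accidents:.0f}")
# -> about 285, i.e. the "around 300" figure above
```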
 
From this article talking about driverless cars -- the accident rate in the US is 1 accident for every 165,000 miles driven: Google's Driverless Car Is Now Safer Than the Average Driver

Tesla has logged about 47 million miles' worth of autopilot driving, according to this: Google’s self-driving car vs Tesla Autopilot: 1.5M miles in 6 years vs 47M miles in 6 months

To be *as good as* the average human driver, we should have seen something around 300 accidents involving Tesla autopilot so far. We are up to what? 5? 8? Hell, round up and add a zero for fun; still way better than the average human.

edit: duplicate pasted links.

Unless you decide to look at only highway miles, type of traffic, type of weather, road conditions, etc., which is about 2-3 accidents per 1 million vehicle miles, depending on how many lanes are on the highway. That's for the state of NY.

https://www.dot.ny.gov/divisions/operating/osss/highway/accident-rates

Really, nothing you link will provide any reasonable evidence that Autopilot is better than an average human, except when you decide to use Autopilot in perfect conditions, on specific types of roads. So is it better? Yes, but only when it's nice outside, on specific types of roads, and you still need to pay attention, because the car isn't all that good at doing it on its own.

Bet you Tesla won't put out numbers on how many accidents were avoided because the human driver intervened.
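
Here is the same back-of-the-envelope calculation redone against the NY DOT highway figure, assuming 2-3 accidents per million vehicle miles and the 47M-mile total quoted earlier; both numbers are just what's cited in this thread:

```python
# Rough sketch using the NY DOT highway figure above (2-3 accidents per
# million vehicle miles) against the 47M Autopilot miles quoted earlier.
# Both numbers are just the figures cited in this thread.

autopilot_miles = 47_000_000
rate_low = 2 / 1_000_000   # accidents per mile, low end
rate_high = 3 / 1_000_000  # accidents per mile, high end

print(f"Expected highway accidents over {autopilot_miles:,} miles: "
      f"{autopilot_miles * rate_low:.0f}-{autopilot_miles * rate_high:.0f}")
# -> roughly 94-141, a much tougher bar than the all-roads 1-in-165,000 rate
```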
 
Autopilot is not a replacement for a human, so to say it's better or worse is erroneous. Autopilot helps you keep the car straight and avoid accidents if you don't notice someone braking or swerving into you; that's it.
 
Can we start putting a reference number in the title of these threads? I think this is #3 now? Helps me keep track of whether I have read about this one or not.
 
From this article talking about driverless cars -- the accident rate in the US is 1 accident for every 165,000 miles driven: Google's Driverless Car Is Now Safer Than the Average Driver

Tesla has logged about 47 million miles' worth of autopilot driving, according to this: Google’s self-driving car vs Tesla Autopilot: 1.5M miles in 6 years vs 47M miles in 6 months

To be *as good as* the average human driver, we should have seen something around 300 accidents involving Tesla autopilot so far. We are up to what? 5? 8? Hell, round up and add a zero for fun; still way better than the average human.

edit: duplicate pasted links.

I am just wondering about the claim of 47 million miles in 6 months. I see that technically all 70,000 Model S cars are capable of this feature, right? So that would be about 266,000 miles per day, divided by 70,000 cars, which is only about 3.8 miles per car every day, which seems pretty reasonable. But the "patch" for this option only came about last October. Do all cars automatically get this upgrade? Do you have to pay for it? Do we really know the penetration of the fleet? The article also makes it sound like Tesla is being liberal with the values, in that ALL driving is learning driving, not just when the autopilot is actually doing the driving.

47 million miles in 6 months just sounds ambitious, so I am raising questions that many of us probably have.
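
Here is that fleet math spelled out, under the same assumptions as above (~70,000 Autopilot-capable cars and a roughly six-month window; both are the figures in this post, not Tesla's):

```python
# The fleet math spelled out. Assumes ~70,000 Autopilot-capable cars and a
# roughly 180-day (6-month) window, as in the post above; both are the
# poster's figures, not Tesla's.

total_miles = 47_000_000
fleet_size = 70_000
days = 180

miles_per_day = total_miles / days                  # fleet-wide miles per day
miles_per_car_per_day = miles_per_day / fleet_size  # per-car average

print(f"{miles_per_day:,.0f} miles/day fleet-wide, "
      f"{miles_per_car_per_day:.1f} miles per car per day")
# -> about 261,000 miles/day, or ~3.7 miles per car per day
```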
 