Tesla Autopilot Nearly Recreates Fatal Crash

rgMekanic

[H]ard|News
A Tesla owner in Indiana decided to recreate the fatal crash from last month, where the car's Autopilot mode drove straight into a K-rail, according to an article from Electrek. The video below shows a Model S with the latest Autopilot hardware mistakenly treating a right-side lane line as a left-side lane line and pointing the vehicle squarely at a barrier.

How is it that the radar system in neither of these cars could see a solid, immobile concrete barrier? Fortunately, the Indiana driver in the video managed to stop just in time.
 
And on a personal note, if that faint pulsing around the gauge cluster that starts at :22 is what's supposed to alert the driver that their input is needed, then that is a massive fucking problem. People do not look at their gauges nearly enough; just look at how many drivers you see at night with no headlights on, when a quick glance down would tell them they can't see their gauges.

The warning system should be a fucking klaxon and a taint-taser in the seat.
 
Yeah, the lack of a white line where it should be seems to be what confused it. I saw this somewhere else earlier and was wondering why he let it get that close.
 
And on a personal note, if that faint pulsing around the gauge cluster that starts at :22 is what's supposed to alert the driver that their input is needed, then that is a massive fucking problem. People do not look at their gauges nearly enough; just look at how many drivers you see at night with no headlights on, when a quick glance down would tell them they can't see their gauges.

The warning system should be a fucking klaxon and a taint-taser in the seat.
Taint-taser you say? Yeouch. You volunteering to test such a thing and give input on how strong to make it? :p
 
This is why I am not investing my money into self driving cars. Mixing autonomous vehicles in with human-controlled vehicles on roadways that cater to humans is a really bad idea. Humans are conditioned through experience to deal with the millions of unexpected situations that can happen when driving their car. Autonomous cars are nowhere near that sophisticated and even with high-end neural net learning can't deal with all the possibilities as well as a veteran human driver can.
 
And on a personal note, if that faint pulsing around the gauge cluster that starts at :22 is what's supposed to alert the driver that their input is needed, then that is a massive fucking problem. People do not look at their gauges nearly enough; just look at how many drivers you see at night with no headlights on, when a quick glance down would tell them they can't see their gauges.

The warning system should be a fucking klaxon and a taint-taser in the seat.

A self-driving car or auto-pilot or whatever that needs your attention should make ALL THE NOISE.

All of it, all at once.
 
And on a personal note, if that faint pulsing around the gauge cluster that starts at :22 is what's supposed to alert the driver that their input is needed, then that is a massive fucking problem. People do not look at their gauges nearly enough; just look at how many drivers you see at night with no headlights on, when a quick glance down would tell them they can't see their gauges.

The warning system should be a fucking klaxon and a taint-taser in the seat.

Except these days so many cars have gauges that are as lit up as this one, so you can't actually tell the difference between headlights on and off. Hell, in my RX-8 the only difference is the gauges go from white to red, but I can see them either way, and that car is 12 years old.

In my wife's Mazda6 the visuals aren't particularly noticeable, but the audio alerts will definitely grab your attention.
 
I see that they are recreating the scenario by keeping their hands off the wheel for far longer than Tesla recommends. You can see the HUD is flashing, which is one of the first warnings that your hands have been off the wheel for too long. It follows up with beeping and, finally, with the deactivation of Autopilot. It takes, IIRC, around 4-5 minutes before the first warnings go off, so this person had their hands off the wheel for a number of minutes before they even started recording. Doesn't that go against what Tesla recommends to begin with?

I'm not disputing that 'Autopilot' made a mistake here. But I am questioning the improper use of 'Autopilot' by drivers who may not be using it the way it was intended (or even instructed) in its current state of capability.

And to blame the name 'Autopilot', to me, is just silly: 1) it's just a name, and 2) the term as used in other industries is understood not to imply that the driver/pilot can stop paying attention.

On that note, while I find the video interesting and it provides good information on the behavior of the Autopilot system, I think the way they carried it out was both careless and dangerous.
 
I see that they are recreating the scenario by keeping their hands off the wheel for far longer than Tesla recommends. You can see the HUD is flashing, which is one of the first warnings that your hands have been off the wheel for too long. It follows up with beeping and, finally, with the deactivation of Autopilot. It takes, IIRC, around 4-5 minutes before the first warnings go off, so this person had their hands off the wheel for a number of minutes before they even started recording. Doesn't that go against what Tesla recommends to begin with?

I'm not disputing that 'Autopilot' made a mistake here. But I am questioning the improper use of 'Autopilot' by drivers who may not be using it the way it was intended (or even instructed) in its current state of capability.

And to blame the name 'Autopilot', to me, is just silly: 1) it's just a name, and 2) the term as used in other industries is understood not to imply that the driver/pilot can stop paying attention.

On that note, while I find the video interesting and it provides good information on the behavior of the Autopilot system, I think the way they carried it out was both careless and dangerous.
If you have to maintain control of the vehicle at all times, what the fuck is the point of autopilot.
 
We have to endure and go through these situations, accidents, and deaths in order to continue on the path to autonomy.
There will be sacrifices on the way to a reliable, trustworthy end product, but these situations will happen and have to happen.
 
This technology has been out for what, 2-3 years? We've had smartphones for about 15 years now and we still have problems with them, such as phones blowing up. Self-driving cars are the future, and hopefully will be mandatory eventually, but they're far more complicated than a phone. It's going to be a while until we can safely switch to them.
 
Looks like Tesla Autopilot has some way to go before it can be considered totally safe!!
 
We have to endure and go through these situations, accidents, and deaths in order to continue on the path to autonomy.
There will be sacrifices on the way to a reliable, trustworthy end product, but these situations will happen and have to happen.

Exactly. Deaths occurred before seat belts were created; this is no different.
 
This is why I am not investing my money into self driving cars. Mixing autonomous vehicles in with human-controlled vehicles on roadways that cater to humans is a really bad idea. Humans are conditioned through experience to deal with the millions of unexpected situations that can happen when driving their car. Autonomous cars are nowhere near that sophisticated and even with high-end neural net learning can't deal with all the possibilities as well as a veteran human driver can.

Waymo's system has driven over 5 million miles to date. I just did some back-of-the-napkin math, and assuming that I drive until I'm 80, I'll have only racked up 750,000 miles personally. A high-end neural net is going to aggregate all of the learning of all of the devices, and the experience it gains by sheer numbers is impossible for a human to duplicate. When these things are ready, they won't completely end traffic fatalities and accidents (that clearly would be impossible), but they are going to be so far ahead of human drivers it will be reckless not to use the self-driving technology.
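If it helps to see that back-of-the-napkin math laid out, here is a minimal sketch of it. The age range and the ~12,000 miles/year figure are my own illustrative assumptions, not numbers from the post:

```python
# Back-of-the-napkin comparison of one person's lifetime mileage versus a
# fleet's aggregate mileage. The age range and the ~12,000 miles/year figure
# are illustrative assumptions, not numbers from the post.
YEARS_DRIVING = 80 - 18     # assume driving from age 18 until 80
MILES_PER_YEAR = 12_000     # rough average annual mileage

personal_lifetime_miles = YEARS_DRIVING * MILES_PER_YEAR    # ~744,000
waymo_fleet_miles = 5_000_000                               # figure cited above

print(f"One driver, lifetime: {personal_lifetime_miles:,} miles")
print(f"Waymo fleet, to date: {waymo_fleet_miles:,} miles")
print(f"Fleet advantage:      {waymo_fleet_miles / personal_lifetime_miles:.1f}x")
```

And that fleet number keeps growing every day, while an individual driver's mileage only accumulates at human speed.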

If you have to maintain control of the vehicle at all times, what the fuck is the point of autopilot.





Keep your hands on the wheel, eyes on the road, and treat Autopilot the same way you would a passenger in the car who says "Oh shit, watch out!" when they see something you might have missed. It's a safety system that you hopefully wouldn't ever need, but is an awesome tool that might save your skin if you're in an unexpected situation.
 
I see that they are recreating the scenario by keeping their hands off the wheel for far longer than Tesla recommends. You can see the HUD is flashing, which is one of the first warnings that your hands have been off the wheel for too long. It follows up with beeping and, finally, with the deactivation of Autopilot. It takes, IIRC, around 4-5 minutes before the first warnings go off, so this person had their hands off the wheel for a number of minutes before they even started recording. Doesn't that go against what Tesla recommends to begin with?

I'm not disputing that 'Autopilot' made a mistake here. But I am questioning the improper use of 'Autopilot' by drivers who may not be using it the way it was intended (or even instructed) in its current state of capability.

And to blame the name 'Autopilot', to me, is just silly: 1) it's just a name, and 2) the term as used in other industries is understood not to imply that the driver/pilot can stop paying attention.

On that note, while I find the video interesting and it provides good information on the behavior of the Autopilot system, I think the way they carried it out was both careless and dangerous.

You're supposed to have a hand on the wheel the entire time you have the system engaged. That is clearly spelled out in the instructions and the warning you have to go through to activate the system. It seems they may need to ramp up the checks and warnings because people are dumb. I also bet they will need to make the disengage coast to a stop and/or change into the rightmost lane. I personally wouldn't mind a rename, but it's likely too late at this point. Still love my Model S.
 
Waymo's system has driven over 5 million miles to date. I just did some back-of-the-napkin math, and assuming that I drive until I'm 80, I'll have only racked up 750,000 miles personally. A high-end neural net is going to aggregate all of the learning of all of the devices, and the experience it gains by sheer numbers is impossible for a human to duplicate. When these things are ready, they won't completely end traffic fatalities and accidents (that clearly would be impossible), but they are going to be so far ahead of human drivers it will be reckless not to use the self-driving technology.

Keep your hands on the wheel, eyes on the road, and treat Autopilot the same way you would a passenger in the car who says "Oh shit, watch out!" when they see something you might have missed. It's a safety system that you hopefully wouldn't ever need, but is an awesome tool that might save your skin if you're in an unexpected situation.

I agree 100%. At some point not only will they react faster, they will react better. A human can steer, go, stop. That's it. A car can stop one wheel, accelerate another wheel, turn exactly x degrees. It could even turn on the windshield wipers if that would help for some fucking reason (it wouldn't, but just saying a computer can do more in a millisecond than any human can).
 
This is a poorly marketed system. The very name is the start of the problem, followed by the users who made idiotic videos exuding such "confidence" in a self-driving car. Tesla's reluctance to call it what it is, an improved cruise control, continues the problem and leads to self-driving car technology being lumped in with this cruise-control tech.
 
Waymo's system has driven over 5 million miles to date. I just did some back-of-the-napkin math, and assuming that I drive until I'm 80, I'll have only racked up 750,000 miles personally.
But the Waymo system suffers a significant issue roughly every 5,000 miles that requires human intervention, including being saved from an accident every 50,000 miles or so.

A typical beginner human driver in North America needs only a few hundred to a few thousand miles of driving to get as good as the best self-driving systems today.
 
what the fuck is the point of autopilot.

.....It's supposed to be an assistant rather than complete self-driving without assistance. Tesla should be held accountable for making it seem like it can do it all, because fully self-driving is how many people are using it. It's only supposed to be an assistant, but since it's named "autopilot," the belief that there is a CPU driver in the car is expected. It's premature tech, but folks are being used as triage until they reach version <something other than today's>. :cool:
 
If you have to maintain control of the vehicle at all times, what the fuck is the point of autopilot.
As others have mentioned, it is an assistant/aid to your driving. It isn't intended to free you of all the duties and responsibilities of being a responsible driver. Emphasis on "driver". You are not a passenger when Autopilot is engaged, just as a pilot in a plane is not, nor is the captain of a boat.
 
Exactly. Deaths occurred before seat belts were created, this is no different.

I don't have the same callous position on people dying because of the misuse of technology. I guess you also realize that these things are usually settled in court, and then we get confirmation that the manufacturer got ahead of itself by putting something out that wasn't ready for mass consumption. This isn't the same as placing a sticker on the paint/tool shelf of a step ladder stating "This is not a step," although it's somewhat related. When there is a flaw in a product because of improper placement, the manufacturer is held accountable.

Unfortunately, many times the penalty seems to go to the government instead of the folks affected, but it prevents further deaths.

Defective airbags are a perfect example. A device created to save lives instead caused personal injury while possibly saving a life. The manufacturer was held accountable.

I don't see it as the same, since it's possible that one "driverless" car could injure many people. The manufacturer needs to make sure it's properly sold as assistance rather than automation. The name helps.
 
If you have to maintain control of the vehicle at all times, what the fuck is the point of autopilot.

To reduce fatigue and make small adjustments to keep on course, the same thing aircraft autopilot is for. Keep in mind that aircraft autopilot has far less to deal with since it's in the air, and it does not, despite what people (including yourself) seem to think, fly the plane without any input, control, or attention from a pilot; it frees the pilot to better monitor other functions, from the many systems on the aircraft to the weather. So the name is fitting, and when using the system (in a Tesla) it warns you about what it is for, what it is not for, and that you need to keep control at all times. I believe they still require you to sign an agreement when getting a car with this function confirming that you understand it. But, people being people, Tesla also added in hand-placement detection, audible warnings, and visual warnings for when someone chooses to ignore the paperwork they read and signed and the warning prompt they have to click when engaging the system.

 
sophisticated, high-end neural net

Half of this stuff is just buzzwords, and people are riding the hype train to oblivion. The tech industry needs another 'must have' to sell. TVs, white goods, mobiles, PCs, cars, etc. They've reached a threshold where everyone has at least most of the items required for modern-day living, at good enough quality, with all the bells and whistles. It's not good for industry when the consumer has nothing left to buy.

Everyone just needs to re-buy everything that's 'Smart' and then again when it's 'Automated-Smart'. It never ends.
 
What I don't get is how it made the decision that the correct action was to drive into a solid object.

I assume (perhaps incorrectly?) it was the driver who slammed on the brakes.
 
...I also read/saw that it's possible the confusion was caused because "the barrier was previously crashed into."

When the barrier hasn't been crashed into, it's at least 10 ft longer. Once a car crashes into it, it gets reduced to just a barrier, not a *deceleration* barrier. It may be that the software is smart enough to detect a deceleration barrier but not after it's been used. That might mean the software has an expectation of the road it's on programmed into it and missed the barrier because it was unexpected.

One theory.
 
And on a personal note, if that faint pulsing around the gauge cluster that starts at :22 is what's supposed to alert the driver that their input is needed, then that is a massive fucking problem. People do not look at their gauges nearly enough; just look at how many drivers you see at night with no headlights on, when a quick glance down would tell them they can't see their gauges.

The warning system should be a fucking klaxon and a taint-taser in the seat.

It would still get ignored. Plenty of statistics show that warning buzzers get ignored all the time...especially if they go off frequently. Ask any pilot who ever unintentionally landed gear-up if they heard the warning buzzer go off, and damn near universally they will say that it didn't. Tests nearly universally show it was functioning perfectly ;).

Ask yourself how many sysadmins ignored the "warnings" in the logs or alert mechanisms.

Most people just don't pay attention well enough...let alone to some "bitchin' betty" system.
 
...I also read/saw that it's possible the confusion was caused because "the barrier was previously crashed into."

When the barrier hasn't been crashed into, it's at least 10 ft longer. Once a car crashes into it, it gets reduced to just a barrier, not a *deceleration* barrier. It may be that the software is smart enough to detect a deceleration barrier but not after it's been used. That might mean the software has an expectation of the road it's on programmed into it and missed the barrier because it was unexpected.

One theory.
The software needs to detect objects in the road, period.
It does not matter if the object is crashed, un-crashed, a pedestrian, a pedestrian walking a bicycle, a pedestrian walking a bicycle at night, a pedestrian walking a bicycle at night while wearing all black, or, in Tesla's case, a fire truck or a 53-foot-long white rectangle four feet off the ground.
 
And on a personal note, if that faint pulsing around the gauge cluster that starts at :22 is what's supposed to alert the driver that their input is needed, then that is a massive fucking problem. People do not look at their gauges nearly enough; just look at how many drivers you see at night with no headlights on, when a quick glance down would tell them they can't see their gauges.

The 'hold steering wheel' message and accompanying flashing border appear within the first 10 seconds of hands-free driving. After 15 seconds you get a loud warning sound, and at around 30 seconds the warning sound plays continuously until, if you still keep your hands off, the vehicle begins to decelerate to a stop at about 40 seconds.

So basically they give you about 45 seconds of wanking off behind the wheel before it deactivates, and you are unable to re-engage it until a couple of hours later.
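If it helps to see that escalation laid out, here is a rough sketch of the warning tiers as described above. The thresholds are the ones from this post, not anything from Tesla's actual code, and the real system obviously isn't implemented like this:

```python
# Rough sketch of the hands-off warning escalation described above. The
# thresholds come from this post, not from Tesla documentation.
ESCALATION = [
    (10, "visual: 'hold steering wheel' message and flashing border"),
    (15, "audible: loud warning sound"),
    (30, "audible: warning sound plays continuously"),
    (40, "action: decelerate to a stop, lock out Autopilot"),
]

def warning_stage(seconds_hands_off: float) -> str:
    """Return the most severe warning reached after this many hands-off seconds."""
    stage = "none"
    for threshold, action in ESCALATION:
        if seconds_hands_off >= threshold:
            stage = action
    return stage

for t in (5, 12, 20, 35, 45):
    print(f"{t:>2}s hands off -> {warning_stage(t)}")
```

The point of the escalation is that the warnings get progressively harder to ignore before the system gives up and hands control back.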
 
We have to endure and go through these situations, accidents, and deaths in order to continue on the path to autonomy.
There will be sacrifices on the way to a reliable, trustworthy end product, but these situations will happen and have to happen.

No, we fucking don't. Jesus, you make it sound like some noble quest in a video game.

Keep this shit off the road until it stops killing people.
 
You know what system of transportation has great autopilot? MASS TRANSIT.

We need more light rail and other similar options. Or sure, continue to figure out how to pack an ever-growing population into one car each and build the roads/infrastructure to support it.
 
You know what system of transportation has great autopilot? MASS TRANSIT.

We need more light rail and other similar options. Or sure, continue to figure out how to pack an ever-growing population into one car each and build the roads/infrastructure to support it.
LOL, you're in a Tesla thread, buddy... Mass transit sucks, and if you had the option to summon a car, get in, and be somewhere in 10 minutes rather than an hour, you would be all over that, just like everyone else. This is the future, along with tunnels and lifts. Think of these machines as "personal mass transit" if you must.
 