Tesla Accelerated Into Barrier

rgMekanic

[H]ard|News
The National Transportation Safety Board has released a preliminary report on the fatal March 23 crash in which a Tesla Model X on Autopilot drove into a barrier, killing 38-year-old Walter Huang. The data shows that the Tesla made no attempt to brake or to steer around the barrier. The SUV was operating with traffic-aware cruise control and Autopilot engaged, and during the 60 seconds before the crash, Huang's hands were detected on the steering wheel for a total of 34 seconds, but for the final 6 seconds his hands were not on the wheel, per a report from the Associated Press.

Further, eight seconds before the crash, the Tesla was following a vehicle at 65 mph, and a second later performed a "steering left movement" while still following. Four seconds before the crash, Autopilot no longer detected a vehicle in front of it, so it accelerated from 62 to 70.8 mph before impact.
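To put those numbers in perspective, here is a quick back-of-the-envelope reconstruction in Python. The speeds and the four-second window come from the report above; the assumption of a roughly linear speed ramp between those two samples is mine, purely for illustration:

# Sanity check on the NTSB figures, assuming a roughly linear
# speed ramp over the final four seconds (my assumption).
MPH_TO_FTS = 5280 / 3600          # 1 mph = ~1.467 ft/s

v0 = 62.0 * MPH_TO_FTS            # speed when the lead car was lost, ~4 s out
v1 = 70.8 * MPH_TO_FTS            # speed at impact
t = 4.0                           # seconds between those two data points

accel = (v1 - v0) / t             # ~3.2 ft/s^2, a gentle cruise-control ramp
dist = (v0 + v1) / 2 * t          # ~390 ft covered while accelerating

print(f"acceleration: {accel:.1f} ft/s^2, distance: {dist:.0f} ft")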

Not much to add to this after all the other posts. Just graphic proof that self-driving cars are not fit to be on public roadways as of yet in my opinion. Check here for a near recreation of the fatal crash from another Tesla driver. At least the car didn't kill anyone else when it caught fire twice, days after the crash.

It likely will take more than a year to determine what caused the crash, NTSB spokesman Christopher O’Neil said Thursday. Among other factors, investigators are trying to determine how the car’s camera, radar and ultrasonic sensors were working and what they were tracking.
 
There was an incident recently with a lady crossing the road that was something a human driver also would not have avoided, but after digging into this one it's pretty clear that Tesla's self-driving technology is flawed and as a result killed the driver involved. If the tech can't avoid crashing into a stationary barrier, it's not ready to be used on the road. Time for Tesla to go back to the drawing board, and this time there needs to be way more testing. The good news is that truck drivers get to keep their jobs a bit longer.
 
There was an incident recently with a lady crossing the road that was something a human driver also would not have avoided, but after digging into this one it's pretty clear that Tesla's self-driving technology is flawed and as a result killed the driver involved. If the tech can't avoid crashing into a stationary barrier, it's not ready to be used on the road. Time for Tesla to go back to the drawing board, and this time there needs to be way more testing. The good news is that truck drivers get to keep their jobs a bit longer.

Tesla is not self-driving; it's a driving aid that requires you to stay in control of the car at all times.
 
I understand what this tech is trying to do, but its implementation is ultimately flawed, not because the tech isn't any good but because people are too easily distracted. When actively driving, even when zoned out, you are still subconsciously watching your environment; when passively driving you are not nearly as alert, and your reflexes and response times are not close to what they would be while actively driving. Driver assist should be limited to systems that pull over and stop on the side of the road when they detect the driver has fallen asleep, or other emergencies. If the car isn't 100% autonomous, then you should always be 100% engaged.
 
graphic proof that self-driving cars are not fit to be on public roadways as of yet in my opinion

That's not "proof", it's evidence towards a claim, but it's a claim that is based on poor evaluation of relative risk factors. Watching people panic/worry/complain about self-driving or assisted-driving cars is a great demonstration of the spotlight fallacy. It demonstrates that shocking incidents stand out in our minds and that we're really, _really_ bad at evaluating risk, especially comparative/relative risk.
 
This is consistent with it following California's shit lane lines out of bounds while thinking it was in-bounds. They need to put hatch marks over those spaces.
 
Those cars are not self-driving; it's aided driving, much like advanced cruise control. Heck, I had to sign a paper when I bought my car stating that it isn't self-driving and that I need to be alert at all times. If Ford can do this, I hope Tesla does it too and tells people how stupid it would be to assume they're safe to sleep.
 
I understand what this tech is trying to do, but its implementation is ultimately flawed, not because the tech isn't any good but because people are too easily distracted. When actively driving, even when zoned out, you are still subconsciously watching your environment; when passively driving you are not nearly as alert, and your reflexes and response times are not close to what they would be while actively driving. Driver assist should be limited to systems that pull over and stop on the side of the road when they detect the driver has fallen asleep, or other emergencies. If the car isn't 100% autonomous, then you should always be 100% engaged.

Proof? Because all of the data and research I have seen says otherwise.

A full paper was put out by the last crash investigation that showed autopilot use significantly reduced crashes.

Hopefully they can learn from this incident and improve the system.

We can: we learned that the driver was not paying attention and failed to follow instructions, and in the minute before the crash had his hands on the wheel only half of the time.

Those cars are not self-driving, it's aid driving much like advance cruise control. Heck I had to sign a paper that stated that my car isn't self-driving when I bought it and that I need to be alert at all time. If Ford can do this, I hope Tesla does it too and tell people how stupid it would be assume they're same to sleep.

They do, and go a bit further as well: to turn on and use Autosteer, a feature of Autopilot, you have to agree to a screen prompt in the car as well.

[Screenshot: the in-car Autosteer agreement prompt]
 
Tesla is not self-driving; it's a driving aid that requires you to stay in control of the car at all times.

1. Tesla calls it an Autopilot function, so yeah, it's intended to be self-driving. 2. The driver had his hands on the wheel up until the last 6 seconds. If you can't take your hands off the wheel safely for 6 seconds, then the tech is useless and not ready for prime time. Hence I stand by my statement.
 
1. Tesla calls it an Autopilot function, so yeah, it's intended to be self-driving. 2. The driver had his hands on the wheel up until the last 6 seconds. If you can't take your hands off the wheel safely for 6 seconds, then the tech is useless and not ready for prime time. Hence I stand by my statement.

The word "autopilot" has been around for quite a while in the aviation world, and it's never actually meant "self-flying." You are confusing what you WANT Tesla's system to be with what it actually is. You can easily take your hands off the wheel for more than 6 seconds at a time; countless YouTube videos prove this. The problem is when you are a stupid/lazy human and don't actually take control in the situations that warrant it. That's like having an automated robot helper at work that does 90% of the heavy lifting for you, provided you throw in your 10%. You want to scrap the whole thing and go back to the stone age because you can't even be bothered to stay on top of your 10%.


You are also making a mistake in assuming software can account for every mile and every possible scenario and situation; it can't. Even a normal human driver can't be 100% every time, every mile, in every possible scenario. It was documented that the driver complained multiple times that the Autopilot didn't work correctly in the area where the crash occurred... so why then would you rely on it even more in an area he KNEW wasn't reliable? That's like me, a nerdy white guy, going down to a crack house at 3am and expecting everything to be just fine and dandy.

Imperfect software mixed with perpetually imperfect human decisions means you will get results like this. Per mile driven I'd still rather have a mostly predictable AI/software driving than the usual assortment of drunk/high/selfish/asshole people on the road we have now.

There were almost 38,000 deaths on US highways in 2016, all caused by stupid/careless humans. Do you prefer that situation over what Tesla is trying to do?
 
Am I the only person who thinks this whole incident sounds like a suicide? You don't normally take your hands off the wheel while accelerating down the highway and he had supposedly told people the car would veer along that stretch.
 
Am I the only person who thinks this whole incident sounds like a suicide? You don't normally take your hands off the wheel while accelerating down the highway and he had supposedly told people the car would veer along that stretch.
Either that or he was testing to see what actually was triggering the issue, and didn't expect it to accelerate toward the wall, so he panicked and missed the brake pedal. Or it could have been a malfunction in the steering or braking system, though I find that coincidence less likely.
 
It's really unfortunate that the system did something unsafe. You want driving aids to help and not hinder your driving. I have tried Autopilot and it's pretty amazing, and I'm looking forward to possibly getting it should my Model 3 configuration day ever arrive. You don't see people blaming cruise control when they get into a wreck using it. Autopilot is really just advanced cruise control; however, it does steer your car, making it more blamable. I hope they get this sorted out.

In fact, if your cruise control just took off on you and caused you to panic and wreck, I do see how it could be blamed.
 
Proof? Because all of the data and research I have seen says otherwise.

A full paper was put out by the last crash investigation that showed autopilot use significantly reduced crashes.



We can: we learned that the driver was not paying attention and failed to follow instructions, and in the minute before the crash had his hands on the wheel only half of the time.



They do, and go a bit further as well: to turn on and use Autosteer, a feature of Autopilot, you have to agree to a screen prompt in the car as well.

[Screenshot: the in-car Autosteer agreement prompt]
While it does cut down on fender benders because it can detect stopped cars and bad merges, all the major accidents occurred when the driver was completely off doing other things. Perhaps I worded my statement earlier unclearly: the tech is good, but people need to catch up to it. 90% of the time it's a great thing, but you have to train yourself to stay alert for the other 10%; you can't get complacent, because that's when accidents happen. All of these accidents have happened because the drivers got complacent with the tech doing all the heavy lifting and decided they were OK doing their own thing instead of driving. So yes, they did bad things, zero argument on that, but is it possible lazy people get lulled into a false sense of security when they see things work for them 99% of the time?
 
it's pretty clear that tesala's self driving technology is flawed and as a result killed the driver involved.
The technology might be driving-correction technology (and argue how flawed it may be if you wish), but it is not self-driving.

During the 60 seconds before the crash, Huang's hands were detected on the steering wheel for a total of 34 seconds, but for the final 6 seconds his hands were not on the wheel
I would say this is what killed the driver. Add to this the fact that there are reports the driver said the auto-drive (or whatever it's called) was acting wonky multiple times at that spot, so the driver KNEW it was going to act wonky and decided to still use it.
 
This is consistent with it following California's shit lane lines out of bounds while thinking it was in-bounds. They need to put hatch marks over those spaces.

This is as much to blame as the driver not paying attention.

Current self-driving technology is only able to respond to what it is programmed for.
Bad or improper road markings, or people in dark clothes stepping into the road from the shadows, are not easy to program for.
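As a toy illustration of the lane-marking point (my own sketch, not anyone's production code): a naive lane-keeping loop steers toward the midpoint of the two lines it detects, and if the diverging edge lines of a gore area get picked up as lane lines, that midpoint aims straight into the gore and whatever sits at its tip.

# Naive lane centering: steer toward the midpoint of the detected lines.
# Purely illustrative; real systems are far more sophisticated.
def lane_center(left_x: float, right_x: float) -> float:
    """Lateral steering target, in feet from the left line."""
    return (left_x + right_x) / 2.0

# Normal 12 ft lane: lines at 0 and 12 -> aim for x = 6. Fine.
print(lane_center(0.0, 12.0))   # 6.0

# Gore point: the diverging edge lines look like a widening "lane".
# Lines at 0 and 18 -> aim for x = 9, i.e. into the gore itself.
print(lane_center(0.0, 18.0))   # 9.0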
 
Did anyone notice the car swerved while he had his hands on the steering wheel?

8 seconds before the crash:

"Further, eight seconds before the crash, the Tesla was following a vehicle at 65 mph, and a second later performed a 'steering left movement' while still following." Only in the last 6 seconds were his hands OFF the steering wheel.

So excuse me, but this is not a Tesla issue so much as the driver's for not paying attention.

More likely the driver, in removing his hands, caused the steering wheel to move, causing the crash. Maybe he fell asleep, maybe he did it intentionally to see what the car would do, maybe he wanted to bite the big one.

Moreover, he knew the car had an issue with that spot from previous instances and still proceeded to let the car drive, when in the past he had to take control and avoid said spot in the road.

Further, I remember the original story also said the car told him to take control prior to the accident.
 
As someone who's been using Autopilot for a good two months now, this is a whole lot of "drivers still have to monitor at all times."

Autopilot is good 90% of the time. I've seen it improve in the updates I'm getting. Is it perfect? No. Is it getting better? Yes. Will it be a better driver than any human in the very near future? Yes.
 
All of the pop-up screens with piles of legalese before software will install have trained most of us to ignore the text and just click the OK/Proceed/Yes button. I don't see the Tesla Autosteer notice as being any different: mostly useless, and maybe not legally valid. Adding terms of use after the money has changed hands is questionable. If Autosteer was part of the advertised package when the car was purchased, forcing an additional agreement after the fact seems poor practice.

These repeating crashes pretty well point out the futility of expecting humans to pay 100% attention when using something called "AUTOsteer" or "AUTOpilot" or whatever the marketing term is this week. We suck at paying 100% attention in a normal car. Just look at all of the crashes with primary/contributing causes of "distracted driving". What makes anyone think we will do better when using something that pretends to automate some or all aspects of driving?
 
Did anyone notice the car swerved while he had his hands on the steering wheel?

Further, I remember the original story also said the car told him to take control prior to the accident.

Turns out Tesla lied about that part, and the NTSB report says it was some time before (I think I recall it was 3 minutes before).

Also, those 34 seconds of time his hands were on the wheel is a summary, so it appears he was moving his hands on and off the wheel. Also, the NTSB never said whether his hands were on the wheel 7 seconds before impact; they only detailed the last 6 seconds.
 
Did anyone notice the car swerved while he had his hands on the steering wheel?

8 seconds before the crash:

"Further, eight seconds before the crash, the Tesla was following a vehicle at 65 mph, and a second later performed a 'steering left movement' while still following." Only in the last 6 seconds were his hands OFF the steering wheel.

So excuse me, but this is not a Tesla issue so much as the driver's for not paying attention.

More likely the driver, in removing his hands, caused the steering wheel to move, causing the crash. Maybe he fell asleep, maybe he did it intentionally to see what the car would do, maybe he wanted to bite the big one.

Moreover, he knew the car had an issue with that spot from previous instances and still proceeded to let the car drive, when in the past he had to take control and avoid said spot in the road.

Further, I remember the original story also said the car told him to take control prior to the accident.

He may have removed his hands to cover his face?
 
I say drop all this autopilot stuff and return to manual driving and paying full attention to the road.
 
1. Tesla calls it an Autopilot function, so yeah, it's intended to be self-driving. 2. The driver had his hands on the wheel up until the last 6 seconds. If you can't take your hands off the wheel safely for 6 seconds, then the tech is useless and not ready for prime time. Hence I stand by my statement.

So, you have no idea what autopilot is, because autopilot even in planes does NOT allow you to let go of the controls and let the plane fly itself; it requires full-time control. Tesla's Autopilot is actually more capable and advanced than many aircraft systems. Autopilot and autonomous are NOT the same thing. Autopilot in a plane helps the pilot stay on a course, allowing them to look at and pay attention to other instruments, weather, etc.; a number of airliners have crashed due to autopilot and the crew not taking back full control soon enough. Only in the last few years has autopilot in aircraft advanced; however, they are still not level 5 autonomous systems. They are in most cases, even the best systems, a level 3; Tesla is a level 2. And aircraft have much less to deal with, being that they are in the air with minimal other traffic, restricted flight areas, known flight paths, well-marked runways with built-in sensors to help guide the system, and they don't have to deal with many obstacles, traffic laws, traffic signals, thousands of other drivers only feet away, pedestrians, etc.

Nowhere does autopilot mean autonomous, not even in aircraft. Paperwork is signed stating that you understand the car is not self-driving; before turning on the Autosteer feature you have to agree to a prompt in the car that AGAIN states this, and states that the feature is in beta and that you have to remain in control of the car at all times. Because it is a very advanced lane-keeping feature, and because some people will always do stupid things, some people ignore all of the instructions given and do what they want. That is in NO WAY the fault of the system, but the fault of the driver. People often fault Tesla for these crashes because of their incorrect understanding of the word autopilot.

While it does cut down on fender benders because it can detect stopped cars and bad merges, all the major accidents occurred when the driver was completely off doing other things. Perhaps I worded my statement earlier unclearly: the tech is good, but people need to catch up to it. 90% of the time it's a great thing, but you have to train yourself to stay alert for the other 10%; you can't get complacent, because that's when accidents happen. All of these accidents have happened because the drivers got complacent with the tech doing all the heavy lifting and decided they were OK doing their own thing instead of driving. So yes, they did bad things, zero argument on that, but is it possible lazy people get lulled into a false sense of security when they see things work for them 99% of the time?

Again, do you have any research for that? I am serious here; I am always open to seeing new research and changing my view on a new tech, but everything, including the research papers from the NTSB on the Florida crash, showed that crashes as a whole went down, in many cases cut in half, not just fender benders. And you are correct in that all of the major crashes have been when the driver was ignoring the system, meaning that they were not operating it as it should be and as they were instructed, making it the driver's fault. And yes, lazy and stupid people can and will, but they do that now: steering with their knees, texting on the phone, putting on makeup, not wearing a seat belt, etc., because "nothing has ever happened to them." This minority, however, is no reason to ignore the far bigger and positive impact the system makes. The number of crashes that happen with these systems on is small; however, unlike the systems in many other cars, ANY time a Tesla is in a crash, it makes the news. People like to focus on the one crash every few months, but no one hears about the thousands that were prevented by the system. Tesla collects all of that data, btw, which is part of what the NTSB went over in their investigation of the Florida crash.

For a tech forum, I would expect people to read up on the tech more before demonizing it. Tesla systems are a level 2, meaning the driver is responsible for all control of the vehicle. Once they reach a level 4 or 5 and something like this happens, they would deserve all of the flaming they could get. This is coming from someone who is into racing, cars and driving. But even I will admit that for commuting to work, I would love to have a level 4 or 5 system, because no one enjoys sitting and driving in traffic.

All of the pop-up screens with piles of legalese before software will install have trained most of us to ignore the text and just click the OK/Proceed/Yes button. I don't see the Tesla Autosteer notice as being any different: mostly useless, and maybe not legally valid. Adding terms of use after the money has changed hands is questionable. If Autosteer was part of the advertised package when the car was purchased, forcing an additional agreement after the fact seems poor practice.

These repeating crashes pretty well point out the futility of expecting humans to pay 100% attention when using something called "AUTOsteer" or "AUTOpilot" or whatever the marketing term is this week. We suck at paying 100% attention in a normal car. Just look at all of the crashes with primary/contributing causes of "distracted driving". What makes anyone think we will do better when using something that pretends to automate some or all aspects of driving?

You actually sign paperwork when you get the car AND get the prompt when turning it on. They sit there and explain the system in pretty good detail, making sure you understand it will not drive the car. If you ignore that sit-down and the signing of paperwork, ignore the prompt, then ignore the chimes in the car when you take your hands off the wheel and the visual warnings as well... I consider that suicide, because it's beyond stupid.
 
So, you have no idea what autopilot is, because autopilot even in planes does NOT allow you to let go of the controls and let the plane fly itself; it requires full-time control. Tesla's Autopilot is actually more capable and advanced than many aircraft systems. Autopilot and autonomous are NOT the same thing. Autopilot in a plane helps the pilot stay on a course, allowing them to look at and pay attention to other instruments, weather, etc.; a number of airliners have crashed due to autopilot and the crew not taking back full control soon enough. Only in the last few years has autopilot in aircraft advanced; however, they are still not level 5 autonomous systems. They are in most cases, even the best systems, a level 3; Tesla is a level 2. And aircraft have much less to deal with, being that they are in the air with minimal other traffic, restricted flight areas, known flight paths, well-marked runways with built-in sensors to help guide the system, and they don't have to deal with many obstacles, traffic laws, traffic signals, thousands of other drivers only feet away, pedestrians, etc.

Nowhere does autopilot mean autonomous, not even in aircraft. Paperwork is signed stating that you understand the car is not self-driving; before turning on the Autosteer feature you have to agree to a prompt in the car that AGAIN states this, and states that the feature is in beta and that you have to remain in control of the car at all times. Because it is a very advanced lane-keeping feature, and because some people will always do stupid things, some people ignore all of the instructions given and do what they want. That is in NO WAY the fault of the system, but the fault of the driver. People often fault Tesla for these crashes because of their incorrect understanding of the word autopilot.



Again, do you have any research for that? I am serious here; I am always open to seeing new research and changing my view on a new tech, but everything, including the research papers from the NTSB on the Florida crash, showed that crashes as a whole went down, in many cases cut in half, not just fender benders. And you are correct in that all of the major crashes have been when the driver was ignoring the system, meaning that they were not operating it as it should be and as they were instructed, making it the driver's fault. And yes, lazy and stupid people can and will, but they do that now: steering with their knees, texting on the phone, putting on makeup, not wearing a seat belt, etc., because "nothing has ever happened to them." This minority, however, is no reason to ignore the far bigger and positive impact the system makes. The number of crashes that happen with these systems on is small; however, unlike the systems in many other cars, ANY time a Tesla is in a crash, it makes the news. People like to focus on the one crash every few months, but no one hears about the thousands that were prevented by the system. Tesla collects all of that data, btw, which is part of what the NTSB went over in their investigation of the Florida crash.

For a tech forum, I would expect people to read up on the tech more before demonizing it. Tesla systems are a level 2, meaning the driver is responsible for all control of the vehicle. Once they reach a level 4 or 5 and something like this happens, they would deserve all of the flaming they could get. This is coming from someone who is into racing, cars and driving. But even I will admit that for commuting to work, I would love to have a level 4 or 5 system, because no one enjoys sitting and driving in traffic.

Maybe they are blinded by jealousy?
 
All part of the learning curve and growing pains. The day of Minority Report-type automobiles will happen, not in my lifetime, but it will happen.
 
All of the pop-up screens with piles of legalese before software will install have trained most of us to ignore the text and just click the OK/Proceed/Yes button. I don't see the Tesla Autosteer notice as being any different: mostly useless, and maybe not legally valid. Adding terms of use after the money has changed hands is questionable. If Autosteer was part of the advertised package when the car was purchased, forcing an additional agreement after the fact seems poor practice.

These repeating crashes pretty well point out the futility of expecting humans to pay 100% attention when using something called "AUTOsteer" or "AUTOpilot" or whatever the marketing term is this week. We suck at paying 100% attention in a normal car. Just look at all of the crashes with primary/contributing causes of "distracted driving". What makes anyone think we will do better when using something that pretends to automate some or all aspects of driving?
You summed up the general consensus. Well done.
 
So, you have no idea what autopilot is, because autopilot even in planes does NOT allow you to let go of the controls and let the plane fly itself; it requires full-time control. Tesla's Autopilot is actually more capable and advanced than many aircraft systems. Autopilot and autonomous are NOT the same thing. Autopilot in a plane helps the pilot stay on a course, allowing them to look at and pay attention to other instruments, weather, etc.; a number of airliners have crashed due to autopilot and the crew not taking back full control soon enough. Only in the last few years has autopilot in aircraft advanced; however, they are still not level 5 autonomous systems. They are in most cases, even the best systems, a level 3; Tesla is a level 2. And aircraft have much less to deal with, being that they are in the air with minimal other traffic, restricted flight areas, known flight paths, well-marked runways with built-in sensors to help guide the system, and they don't have to deal with many obstacles, traffic laws, traffic signals, thousands of other drivers only feet away, pedestrians, etc.

Nowhere does autopilot mean autonomous, not even in aircraft. Paperwork is signed stating that you understand the car is not self-driving; before turning on the Autosteer feature you have to agree to a prompt in the car that AGAIN states this, and states that the feature is in beta and that you have to remain in control of the car at all times. Because it is a very advanced lane-keeping feature, and because some people will always do stupid things, some people ignore all of the instructions given and do what they want. That is in NO WAY the fault of the system, but the fault of the driver. People often fault Tesla for these crashes because of their incorrect understanding of the word autopilot.



Again, do you have any research for that? I am serious here; I am always open to seeing new research and changing my view on a new tech, but everything, including the research papers from the NTSB on the Florida crash, showed that crashes as a whole went down, in many cases cut in half, not just fender benders. And you are correct in that all of the major crashes have been when the driver was ignoring the system, meaning that they were not operating it as it should be and as they were instructed, making it the driver's fault. And yes, lazy and stupid people can and will, but they do that now: steering with their knees, texting on the phone, putting on makeup, not wearing a seat belt, etc., because "nothing has ever happened to them." This minority, however, is no reason to ignore the far bigger and positive impact the system makes. The number of crashes that happen with these systems on is small; however, unlike the systems in many other cars, ANY time a Tesla is in a crash, it makes the news. People like to focus on the one crash every few months, but no one hears about the thousands that were prevented by the system. Tesla collects all of that data, btw, which is part of what the NTSB went over in their investigation of the Florida crash.

For a tech forum, I would expect people to read up on the tech more before demonizing it. Tesla systems are a level 2, meaning the driver is responsible for all control of the vehicle. Once they reach a level 4 or 5 and something like this happens, they would deserve all of the flaming they could get. This is coming from someone who is into racing, cars and driving. But even I will admit that for commuting to work, I would love to have a level 4 or 5 system, because no one enjoys sitting and driving in traffic.



You actually sign paperwork when you get the car AND get the prompt when turning it on. They sit there and explain the system in pretty good detail, making sure you understand it will not drive the car. If you ignore that sit-down and the signing of paperwork, ignore the prompt, then ignore the chimes in the car when you take your hands off the wheel and the visual warnings as well... I consider that suicide, because it's beyond stupid.
I don’t know what research you are asking me for, but every major accident involving Autopilot or a self-driving car (Tesla and Uber) was deemed to be avoidable if the humans at the wheel had been paying attention. For the guard rail incident, the driver was looking away with no hands on the wheel for more than 7 seconds before the accident. For the accident with the tractor trailer, they were watching a movie. There was one in the UK where the driver was in the passenger seat napping; they lived but lost their license. For the Uber crash, while the tech did fail, the driver was clearly shown to be distracted, and in extended footage they spent very little time actively engaged with the car and more time on their phone. In each of these cases the driver was behaving like a passenger, not a driver, around the time of the accident. In regard to my comments on complacency, it is well regarded as the number-one cause of workplace accidents.
 
Stupid/lazy people are still going to be stupid and lazy, even when we have robots or AI doing 99.9% of the work for them and all they have to do is the remaining 0.1%. It's human nature. Hell, most people don't even know how to cross the street anymore (at least here in DC).
 
All part of the learning curve and growing pains. The day of Minority Report-type automobiles will happen, not in my lifetime, but it will happen.
Audi already has Level 3, and they're rolling out Level 4 in 2020. Full Level 5 automation is a lot closer than you think. Tesla's Autopilot is Level 2, at best.
 
1. Tesla calls it an Autopilot function, so yeah, it's intended to be self-driving. 2. The driver had his hands on the wheel up until the last 6 seconds. If you can't take your hands off the wheel safely for 6 seconds, then the tech is useless and not ready for prime time. Hence I stand by my statement.
6 seconds with your hands off the wheel is a huge distance when going 60+ mph. That's ~530 feet. That's almost two football fields. Probably not a safe idea.
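For anyone who wants to check that figure, a quick sanity check in Python (the 60 mph and 6 seconds are from the post above; the 70 mph line is just the upper end of the speed range the report describes):

# Distance covered with hands off the wheel at highway speed.
def feet_covered(mph: float, seconds: float) -> float:
    return mph * 5280 / 3600 * seconds

print(feet_covered(60, 6))   # 528.0 ft -- roughly the ~530 ft quoted
print(feet_covered(70, 6))   # 616.0 ft at the speeds the Model X reached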
 
Audi already has Level 3, and they're rolling out Level 4 in 2020. Full Level 5 automation is a lot closer than you think. Tesla's Autopilot is Level 2, at best.


And how many combined hours/miles does the Audi L3 have so far? I'm guessing it's not even a tiny percentage of what Tesla has under its belt. You don't think Audi is going to have similar problems until the system matures?
 
This is consistent with it following California's shit lane lines out of bounds while thinking it was in-bounds. They need to put hatch marks over those spaces.

Hatch marks would be nice, but the car needs to work in the real world, not in some fantasy land where Caltrans does their job. :)

Did anyone notice the car swerved while he had his hands on the steering wheel?

8 seconds before the crash:

"Further, eight seconds before the crash, the Tesla was following a vehicle at 65 mph, and a second later performed a 'steering left movement' while still following." Only in the last 6 seconds were his hands OFF the steering wheel.

So excuse me, but this is not a Tesla issue so much as the driver's for not paying attention.

I've seen many reports that the hand detection has a lot of false readings.

Regardless, do you think anybody would pay for Autopilot if they told people the truth? Autopilot is designed to ignore stationary objects; if there's not a car moving in front of you to cheat off of, it will run into every kind of wall, or parked emergency vehicle, or what have you. And if the car in front switches lanes to avoid the hazard, Autopilot will accelerate (aggressively) to your set speed while ignoring the stationary object.
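That matches a well-known limitation of radar-based adaptive cruise control in general; the following is an illustrative toy heuristic, not a claim about Tesla's actual code. Returns whose computed ground speed is near zero are commonly discarded, because overpasses, signs, and barriers all produce the same stationary signature:

# Toy sketch of a common radar-ACC heuristic (illustrative only):
# drop returns whose ground speed is ~zero, since stationary clutter
# (overpasses, signs, barriers) would otherwise trigger constant braking.
def ground_speed(ego_mps: float, closing_mps: float) -> float:
    """The object's own speed, from ego speed and radar closing rate."""
    return ego_mps - closing_mps

def is_trackable(obj_mps: float, threshold: float = 2.0) -> bool:
    """Only track returns that are actually moving (m/s)."""
    return abs(obj_mps) > threshold

ego = 28.0  # ~62 mph, in m/s
# A lead car at the same speed: closing rate 0 -> ground speed 28 -> tracked.
print(is_trackable(ground_speed(ego, 0.0)))    # True
# A barrier dead ahead: closing rate equals ego speed -> ground speed 0,
# the same signature as an overpass -> filtered out, no braking target.
print(is_trackable(ground_speed(ego, 28.0)))   # False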
 
What I find sad is how many reports of people dying you have to read before you notice it's not an autopilot and you shouldn't be using it as such.

And before the first fool comes here and says "well, they advertise it as autopilot":

If I made an advertisement saying Tide Pods are safe to eat, would you eat them? Probably not, because common sense!
 