Tesla Accelerated Into Barrier

Tesla's Autopilot is not self-driving; it's a driving aid that requires you to remain in control of the car at all times.

But it's advertised as "Autopilot". Brilliant. "We'll drive the car, except when you have to."
 
That's not "proof", it's evidence towards a claim, but it's a claim that is based on poor evaluation of relative risk factors. Watching people panic/worry/complain about self-driving or assisted-driving cars is a great demonstration of the spotlight fallacy. It demonstrates that shocking incidents stand out in our minds and that we're really, _really_ bad at evaluating risk, especially comparative/relative risk.

"Our automated Pill dispensing machine just made a mistake and recognized cyanide as aspirin. We're sorry. It only happened once. Don't worry about it happening in the future"
 
I mean, people can't even walk without being distracted and walking into (or off) something. Yes, there are some people who are amazing drivers, but that is the exception, not the rule. Would self-driving cars be flawless and not kill anyone? Never. But they would reduce accidents dramatically. We just have a hard time handling the idea of a death caused by a computer driving a machine, as opposed to a human not paying attention while operating that machine.
 
The data supposedly shows that the driver had his hands on and off the wheel at very specific times.

I find that data hard to believe.

-How is "hands on" status determined?
-How is "hands off" status determined?

I see three technologies which could determine this status (a rough sketch of the third follows the list).
1. A camera which has an unobstructed line of sight, at all times, to the entire periphery of the wheel.
2a. Electronic sensors, such as those on treadmills, detecting a grip. (This would either need to be large enough to surround the circumference, or be limited to sensing a specific grip zone.)
2b. A squeeze-o-meter built into the steering wheel which can sense any grip. There would be a limit to its sensitivity, and it would need a discrimination function. Is that my knee on the wheel? Is that my fingertip?
3. A dynamic coupling between the steering wheel and the steering rack (or whatever mechanism is used) which determines if there is a difference between the actual wheel angle and the commanded wheel angle. E.g., is the driver trying to turn? (What if the driver is commanding exactly what the computer steering is commanding? There would be no difference.)
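
For illustration, here is a minimal sketch of how option 3 might work in practice, combining the angle-difference idea with a torque sensor. Every name, threshold, and the debounce logic here is my own assumption, not anything Tesla has documented:

```python
# Hypothetical sketch of option 3: infer "hands on" from disagreement
# between the commanded and measured wheel angle, plus a torque sensor.
# All thresholds are invented for illustration.

HANDS_ON_TORQUE_NM = 0.3   # assumed minimum driver torque that counts as a grip
ANGLE_TOLERANCE_DEG = 0.5  # assumed tracking error of the steering servo
DEBOUNCE_SAMPLES = 5       # consecutive samples required to change state

class HandsOnDetector:
    def __init__(self):
        self.streak = 0
        self.hands_on = False

    def update(self, commanded_deg, measured_deg, torque_nm):
        # If the driver mirrors the computer's command exactly, the angle
        # difference is zero; only the torque reading can then distinguish
        # "hands resting on the wheel" from "no hands at all".
        disagreeing = abs(commanded_deg - measured_deg) > ANGLE_TOLERANCE_DEG
        gripping = abs(torque_nm) > HANDS_ON_TORQUE_NM
        detected = disagreeing or gripping

        # Debounce: one noisy sample should not flip the reported state.
        self.streak = self.streak + 1 if detected != self.hands_on else 0
        if self.streak >= DEBOUNCE_SAMPLES:
            self.hands_on = detected
            self.streak = 0
        return self.hands_on
```

Even in this toy version, the logged "hands on"/"hands off" timestamps would only be as trustworthy as the thresholds and the debounce window, which is exactly why I question the precision of such statements.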

I am doubtful of such statements of "fact" that the driver had his hands on or off the wheel at such and such a time.
 
I mean, people can't even walk without being distracted and walking into (or off) something. Yes, there are some people who are amazing drivers, but that is the exception, not the rule. Would self-driving cars be flawless and not kill anyone? Never. But they would reduce accidents dramatically. We just have a hard time handling the idea of a death caused by a computer driving a machine, as opposed to a human not paying attention while operating that machine.

There's certainly potential for self-driving cars, but it needs to be developed safely. Waymo (Google)'s cars have had tons of accidents, and many of them happened because their cars did stupid shit, but they've all managed to be low speed with no or minor injuries, because they're being careful and their cars are always prepared to stop. OTOH, Tesla and Uber don't understand that if the car can't figure out what to do, it should stop: they have both said that false-positive stops were inconvenient and uncomfortable, so they ignore signals they can't understand. I haven't heard much from the automakers' self-driving programs, which probably means they're being careful, too.

Also, I fully believe that computer safety-assistance programs can be safe and effective: let the human drive the car, and the computer watch for unsafe situations and help the human. Alert and activate the brakes when the cars in front do stupid shit, help with lane keeping (without swerving into barriers), notice speed limits, etc., but let the human use their own judgement to override when the computer doesn't understand.
 
That's not "proof", it's evidence towards a claim, but it's a claim that is based on poor evaluation of relative risk factors. Watching people panic/worry/complain about self-driving or assisted-driving cars is a great demonstration of the spotlight fallacy. It demonstrates that shocking incidents stand out in our minds and that we're really, _really_ bad at evaluating risk, especially comparative/relative risk.
So true. The real test of a new technology's maturity is its overall record, not its occasional failures.
 
Hopefully they can learn from this incident and improve the system.
Hopefully more manufacturers will join the brave and innovative Tesla, and more people will die as soon as possible to improve the technology more quickly. Progress is currently far too slow.
 
Hatch marks would be nice, but the car needs to work in the real world, not in some fantasy land where Caltrans does their job. :)



I've seen many reports that the hand detection has a lot of false readings.

Regardless, do you think anybody would pay for Autopilot if they told people the truth? Autopilot is designed to ignore stationary objects; if there's not a moving car in front of you to cheat off of, it will run into every kind of wall, or parked emergency vehicle, or what have you. And if the car in front switches lanes to avoid the hazard, Autopilot will accelerate (aggressively) to your set speed while ignoring the stationary object.
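
For what it's worth, here is a generic illustration of why radar-based cruise systems commonly filter out stationary returns. This is a well-known industry trade-off, not Tesla's actual code; the names and the threshold are invented:

```python
# Generic sketch: why radar cruise control can ignore a stopped vehicle.
# Illustrative only; names and thresholds are made up.

MIN_GROUND_SPEED_MPS = 1.0  # returns slower than this look "stationary"

def relevant_targets(radar_returns, ego_speed_mps):
    """Keep only returns that are moving relative to the ground.

    A stopped car ahead closes at exactly our own speed, which is also
    what every bridge, sign gantry, and barrier returns. Dropping those
    returns avoids constant phantom braking, at the price of making a
    wall in the lane invisible to the cruise system.
    """
    moving = []
    for ret in radar_returns:
        ground_speed = ego_speed_mps - ret["closing_speed_mps"]
        if abs(ground_speed) > MIN_GROUND_SPEED_MPS:
            moving.append(ret)
    return moving

# A fire truck stopped in-lane closes at exactly ego speed, so its ground
# speed computes to ~0 and it is filtered out; the moving lead car stays.
returns = [{"id": "fire_truck", "closing_speed_mps": 30.0},
           {"id": "lead_car",   "closing_speed_mps": 5.0}]
print([r["id"] for r in relevant_targets(returns, ego_speed_mps=30.0)])
# -> ['lead_car']
```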


Or we could tell them the truth: that enabling Autopilot allows them to fly to their destination at light speed... /sarcastic statement that is almost as dumb as whatever you're trying to post as fact...
 
I don't know what research you are asking me for, but every major accident involving Autopilot or a self-driving car (Tesla and Uber) was deemed avoidable if the humans at the wheel had been paying attention. In the barrier incident, the driver was looking away with no hands on the wheel for more than 7 seconds before the accident. In the tractor-trailer accident, the driver was watching a movie. There was one in the UK where the driver was napping in the passenger seat; they lived but lost their license. For the Uber crash, while the tech did fail, the driver was clearly shown to be distracted; in the extended footage they spent very little time actively engaged with the car and more time on their phone. In each of these cases the driver was behaving like a passenger, not a driver, around the time of the accident. As for my comments on complacency, it is well regarded as the number 1 cause of workplace accidents.

You were claiming at first a rise in crash rate, then changed to saying it only helps with fender benders, implying it makes more serious crashes more common, which does not match up with any data or reports I have seen. I am asking if you have any data for this, or are you basing it on the fact that the news covers EVERY crash that involves a Tesla?

Your first sentence here explains the whole situation: the humans at the wheel were not paying attention, which is the most common cause of crashes as a whole, not just with assisted-driving systems. What that has to do with the system itself, I have no idea; it's a driver problem, period.

The Uber crash and Tesla's system are NOT the same thing either. The Uber system is meant to be a higher-level system, a level 3; level 3 is where responsibility starts moving from all/mostly driver to mostly car. Level 4 is "mind off", in that the car is mostly responsible and the "driver" can watch a movie or even go to sleep; level 5 is steering wheel optional, with the car responsible for everything.
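
For reference, here is the level ladder being argued about, as a quick sketch. The one-line descriptions are my own paraphrase of the SAE J3016 levels, not the official wording:

```python
# My paraphrase of the SAE J3016 driving-automation levels.

SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: one function helped, e.g. adaptive cruise",
    2: "Partial automation: steering + speed assisted, human must supervise",
    3: "Conditional automation: car drives in its domain, human takes over on request",
    4: "High automation ('mind off'): no takeover needed within the car's domain",
    5: "Full automation: steering wheel optional, car responsible everywhere",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```

Tesla's Autopilot sits at level 2 on this ladder; the Uber test cars were aiming at levels 3 and 4.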

The thing is, "autopilot" = "self-driving" for most people. Blame it on KITT :D:D

That's the fault of the person, not the system, as Autopilot does not mean self-driving in cars any more than it means self-flying in planes. People who buy a Tesla are sat down and have this explained to them; they are then given paperwork to sign stating the same thing; they get a prompt when turning it on; and there are active audible and visual warnings in the car when a person ignores all of that instruction.

But it's advertised as "Autopilot". Brilliant. "We'll drive the car, except when you have to."

Autopilot is the correct term for the Tesla system. You are confusing Autopilot with autonomous.
 
Autopilot is the correct term for the Tesla system. You are confusing Autopilot with autonomous.

They did choose the right terminology for it. The problem is people not understanding what it means. Autopilot in planes exists to keep the plane on course, not to make decisions, avoid hazards, or react to events.

People do the same thing with water-resistant devices: misunderstanding words and phrases, and instead of seeking clarity they just assume, then go underwater with their phone and wonder why it stops working.

Overall, we just assume so much and don't look for understanding. Hell, I don't even trust cruise control.
 
I say drop all this autopilot stuff and return to manual driving and paying full attention to the road.

Yes because everyone pays full attention to the road. No one has a drink while driving. No one eats anything while driving. No one adjusts the climate controls while driving. No one talks to anyone else while driving. No one blinks while driving. No one sneezes while driving. No one allows their mind to wander while driving.

Yeah... No. Automation is the future.
 
They did choose the right terminology for it. The problem is people not understanding what it means. Autopilot in planes exists to keep the plane on course, not to make decisions, avoid hazards, or react to events.

People do the same thing with water-resistant devices: misunderstanding words and phrases, and instead of seeking clarity they just assume, then go underwater with their phone and wonder why it stops working.

Overall, we just assume so much and don't look for understanding. Hell, I don't even trust cruise control.

Which would be the fault of the person, not the device.

You also have to remember that, unlike with a watch or phone, when buying a Tesla with this system you sit down with a person who explains it to you. They provide you with a full manual on it, and have paperwork that covers it again for you to read and sign. After that, when you turn it on in the car, it greets you with a prompt stating that you have to remain in full control; after agreeing to that, you get audible and visual warnings any time you take your hands off the wheel. There is not much else Tesla can do for these people; with anything, there is ALWAYS a subset of people who will ignore the safety instructions or the correct operation of a device.
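
To be concrete about what those escalating warnings look like, here is a toy sketch of a hands-off warning ladder. The stages and timings are invented for illustration and are not Tesla's actual values:

```python
# Toy escalating "hands off the wheel" warning ladder. The stages and
# timing thresholds are invented, not Tesla's real parameters.

WARNING_LADDER = [
    (10.0, "visual: 'Hold steering wheel' banner"),
    (25.0, "audible: chime plus flashing banner"),
    (60.0, "escalate: slow the car and lock out Autopilot for the drive"),
]

def active_warnings(hands_off_seconds):
    """Return every warning stage that should be active by now."""
    return [msg for limit, msg in WARNING_LADDER if hands_off_seconds >= limit]

print(active_warnings(30.0))
# -> ["visual: 'Hold steering wheel' banner", 'audible: chime plus flashing banner']
```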
 
I feel very sad for the family, but personally I consider this a suicide. The driver knew about, and had in the past complained about, Autopilot having a problem with that specific barrier. Yet he persisted in driving past the same barrier with Autopilot in control. If you know in advance that you might have an accident at a specific location under specific circumstances, like letting Autopilot drive, wouldn't you make a specific extra effort to avoid having that accident? He didn't, and he paid the price. I know I sound cold, but we'll never know what was going through his head to do such a thing that would result in his death.
 
You were claiming at first a rise in crash rate, then changed to saying it only helps with fender benders, implying it makes more serious crashes more common, which does not match up with any data or reports I have seen. I am asking if you have any data for this, or are you basing it on the fact that the news covers EVERY crash that involves a Tesla?

Your first sentence here explains the whole situation: the humans at the wheel were not paying attention, which is the most common cause of crashes as a whole, not just with assisted-driving systems. What that has to do with the system itself, I have no idea; it's a driver problem, period.

The Uber crash and Tesla's system are NOT the same thing either. The Uber system is meant to be a higher-level system, a level 3; level 3 is where responsibility starts moving from all/mostly driver to mostly car. Level 4 is "mind off", in that the car is mostly responsible and the "driver" can watch a movie or even go to sleep; level 5 is steering wheel optional, with the car responsible for everything.



That's the fault of the person, not the system, as Autopilot does not mean self-driving in cars any more than it means self-flying in planes. People who buy a Tesla are sat down and have this explained to them; they are then given paperwork to sign stating the same thing; they get a prompt when turning it on; and there are active audible and visual warnings in the car when a person ignores all of that instruction.



Autopilot is the correct term for the Tesla system. You are confusing Autopilot with autonomous.

OK, here are a number of articles and studies showing that drivers become overly complacent and reliant on the technology, and more often than not fail to respond in time to an emergency situation where the technology fails, because they are too distracted when they are not actively driving.

https://www.prnewswire.com/news-rel...an-lead-to-deadly-consequences-300618643.html
https://360.here.com/2017/04/10/will-autonomous-cars-make-drivers-complacent/
https://inews.co.uk/news/technology/driverless-cars-make-drivers-complacent-reliant-technology/

https://www.claimsjournal.com/news/national/2017/02/21/276963.htm
https://www.ft.com/content/3b0eaba6-38b5-11e8-8b98-2f31af407cc8
https://www.verisk.com/insurance/visualize/self-driving-cars-no-cure-for-distracted-driving/
https://www.wired.com/2011/07/active-safety-systems-could-create-passive-drivers/

Again, my point is that when you are behind the wheel of a self-driving car you are not the active driver; it becomes too easy to grow complacent while the technology does its job, and that easily leads a driver to become distracted. I am not saying the technology doesn't work; what I am saying is that it will change what kind of accidents you see. There are training programs designed to fight this very issue in a number of specialized tasks, from pilots to radar technicians on submarines; this has been a documented problem in any job where a machine does the majority of the work and the person is merely the operator, and every workplace-safety program has a section on complacency being the #1 cause of accidents. When you are not actively engaged your mind wanders; a wandering mind is a distracted mind, and when operating heavy machinery a distracted operator is a dangerous operator.
 
For the Uber crash, while the tech did fail, the driver was clearly shown to be distracted; in the extended footage they spent very little time actively engaged with the car and more time on their phone.

She was monitoring the system, which was in a silly place (and the car's own built-in auto-braking system was disabled, so it did not brake when it did see the object).
 
She was monitoring the system, which was in a silly place (and the car's own built-in auto-braking system was disabled, so it did not brake when it did see the object).
The video showed the driver looking away from the road more than 75% of the time they were behind the wheel. Assuming they were looking at the monitoring system, I have one quick question for you: at what point do you realize the vehicle has failed to identify the situation correctly and take control of it?
 
OK, here are a number of articles and studies showing that drivers become overly complacent and reliant on the technology, and more often than not fail to respond in time to an emergency situation where the technology fails, because they are too distracted when they are not actively driving.

https://www.prnewswire.com/news-rel...an-lead-to-deadly-consequences-300618643.html
https://360.here.com/2017/04/10/will-autonomous-cars-make-drivers-complacent/
https://inews.co.uk/news/technology/driverless-cars-make-drivers-complacent-reliant-technology/

https://www.claimsjournal.com/news/national/2017/02/21/276963.htm
https://www.ft.com/content/3b0eaba6-38b5-11e8-8b98-2f31af407cc8
https://www.verisk.com/insurance/visualize/self-driving-cars-no-cure-for-distracted-driving/
https://www.wired.com/2011/07/active-safety-systems-could-create-passive-drivers/

Again, my point is that when you are behind the wheel of a self-driving car you are not the active driver; it becomes too easy to grow complacent while the technology does its job, and that easily leads a driver to become distracted. I am not saying the technology doesn't work; what I am saying is that it will change what kind of accidents you see. There are training programs designed to fight this very issue in a number of specialized tasks, from pilots to radar technicians on submarines; this has been a documented problem in any job where a machine does the majority of the work and the person is merely the operator, and every workplace-safety program has a section on complacency being the #1 cause of accidents. When you are not actively engaged your mind wanders; a wandering mind is a distracted mind, and when operating heavy machinery a distracted operator is a dangerous operator.



None of those are actual research; they are news links stating that some person THINKS this or that. I can find stuff like that all day long. Hell, the first link is more or less a paid ad for a driving course. And nothing in them states that the systems result in more serious crashes more often. People will become dependent on any system; the question is whether the system offers a net benefit, and based on the research the answer is an absolute yes, and the systems are still young.

On another note, you keep saying "self-driving car"; a Tesla is NOT a self-driving car.

The video showed the driver looking away from the road more than 75% of the time they were behind the wheel. Assuming they were looking at the monitoring system, I have one quick question for you: at what point do you realize the vehicle has failed to identify the situation correctly and take control of it?

The Uber cars and those like them are not meant to be level 2 systems like Tesla's; they are a test bed for level 3 and 4 systems. The people in the cars are trained and should be monitoring, but as with anything else, things happen; the track record per mile driven is actually far, FAR better than the average human driver's. You will never have a perfect system, but humans are not perfect drivers either, and once these systems are as good as or better than the average driver, there is no reason they should not be allowed to exist. Now, responsibility and liability at that point is a separate discussion.
 
For some reason self driving almost seems like a religious or political discussion here.

What’s up with that?
 
Took his hands off the wheel...

'nuff said.

I feel for his family and friends experiencing the loss. It's the toughest thing to go through in life.
 
If we had the data from Tesla to compare, I think we would find a WAY safer driving record. We have to remember that every single autonomous/semi-autonomous vehicle incident makes national headlines, so it tends to skew our opinion. We must strive not to fall for this effect and base our judgements on the facts.

If we divide the number of miles driven with Autopilot engaged, across all Teslas equipped with Autopilot, by the number of accidents in Teslas with Autopilot turned on, and compare that to the number of miles driven by all other vehicles divided by the number of accidents, I think the numbers would speak to the clear advantages of autonomous/semi-autonomous systems (but I cannot be certain; a toy version of the calculation is sketched below). I worked for years as an avionics technician on large aircraft. In the sky there are far fewer obstacles and complexities, even accounting for three dimensions, and I can honestly attest that a pilot needs to maintain alertness and be ready to take control at a moment's notice. The same should be observed for a vehicle on the road with an autopilot feature. I think some Tesla drivers simply don't understand the capabilities and limitations of the technology.
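
Here is a toy version of that per-mile comparison. The inputs are invented placeholders, NOT real Tesla or national statistics, and a fair comparison would also need to control for road type, since Autopilot miles are mostly highway miles, which are safer per mile to begin with:

```python
# Toy accidents-per-million-miles comparison. All inputs are invented
# placeholders, not real statistics.

autopilot_miles = 1_000_000_000      # hypothetical miles driven on Autopilot
autopilot_accidents = 300            # hypothetical accidents in those miles

other_miles = 3_000_000_000_000      # hypothetical miles without Autopilot
other_accidents = 2_000_000          # hypothetical accidents in those miles

def per_million_miles(accidents, miles):
    return accidents / (miles / 1_000_000)

print(per_million_miles(autopilot_accidents, autopilot_miles))  # 0.3
print(per_million_miles(other_accidents, other_miles))          # ~0.67
```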
 
The video showed the driver looking away from the road more than 75% of the time they were behind the wheel. Assuming they were looking at the monitoring system, I have one quick question for you: at what point do you realize the vehicle has failed to identify the situation correctly and take control of it?

The system does not currently warn the driver of a possible object in the path (in this case the car did see it but did not brake or sound a warning, which is a major flaw that I assume they will correct). Disabling the car's built-in auto braking was a major error, as it works very well and does not do the unexpected random braking they claimed. (They even went to Australia with this car because it was not identifying kangaroos running across roads and was not braking automatically, until they taught the car to recognize kangaroos and objects like that.)

The self-driving cars should have automatic braking anyway; this type of self-driving car has lidar and did detect the person eventually, but chose not to brake, and the car's own built-in automatic braking was disabled.

With Tesla, the issue is that it mostly uses cameras, which leads to these situations where it can drive into a wall. You need to pay more attention to the road when approaching slip roads, road splits, or bad lane markings, as the car may do unexpected things (as this driver was fully aware of at this split, yet he still trusted it too much with his life).

Overall, Tesla's system is very good, but the missing lidar is a bit of a problem and drivers should be aware of it.
 