Tesla Driver in Fatal “Autopilot” Crash Got Numerous Warnings

I kinda think that the car should know better than to drive under a semi-trailer.
How much do you wanna bet that software Rev 2.0 will know the average height of the vehicle and take into account the height of the hole it is pointed at, because not every opening is a tunnel or overpass. :eek:
Doesn't change the fact it's safer with it than without. They have no legal concerns here; the worst that can possibly happen is a name change, which won't stop people doing what they're doing. People are idiots, plain and simple.
 
Pretty sure most of us knew that. Granted, I remember the thread from back then, and there were a number of people wanting to burn Tesla at the stake for this, even though at the time all the data pointed to driver error and not to the car or any system failing.

Just goes to show shitty drivers are shitty drivers.

Their semi-autonomous driving mode was failing, which is why it would alert the driver to the need to actually drive the car. I don't blame the car for that. Autonomy, be it semi-autonomous or fully autonomous, is just not there yet.

I don't blame the car for the accident. I blame the driver, and the company for overselling the capabilities of their semi-autonomous features. I think that's why all the other automakers are steering clear of the semi-autonomous quagmire.
 
I dunno. I mean, shouldn't we all know there's no such thing as a fully autonomous car?
All these things are not even semi-autonomous, so you have to be alert at all times.

Hasn't it been proven, in all crashes involving these kinds of cars, that the driver was at fault for not paying enough attention?
 


I don't particularly care if it was "the driver's fault". Fact is, there was an automated system in place that, by its very presence, dicked with the driver's sense of command and control. Either all cars on a particular road circuit are fully automated, or none at all. Mixing AI and human control, even in the same car, is a recipe for trouble.
 
I don't particularly care if it was "the driver's fault". Fact is, there was an automated system in place that, by its very presence, dicked with the driver's sense of command and control. Either all cars on a particular road circuit are fully automated, or none at all. Mixing AI and human control, even in the same car, is a recipe for trouble.

Totally disagree - the driver is responsible. Automation, no matter how advanced, is simply a tool; unfortunately, the driver failed in his duty to remain active in managing his resources.
If Tesla's figures are to be believed, then even partial automation has value in reducing accident rates.

Parallels exist in manufacturing, construction and transport, to name a few. There are issues such as a lack of effective training, but in these industries accident rates have improved immensely.
Reducing it to an all-or-nothing argument doesn't do it justice.
 
Their semi-autonomous driving mode was failing, which is why it would alert the driver to the need to actually drive the car. I don't blame the car for that. Autonomy, be it semi-autonomous or fully autonomous, is just not there yet.

I don't blame the car for the accident. I blame the driver, and the company for overselling the capabilities of their semi-autonomous features. I think that's why all the other automakers are steering clear of the semi-autonomous quagmire.

It is driver assist, not autonomous. Again, adding or manipulating words does nothing. This is, without a doubt, a user using a device in a way it was not meant to be used, and is expressly stated not to be, while active alarms and alerts are telling them to stop. Tesla bears no responsibility for that; this is 100% user fault.
 
Totally disagree - the driver is responsible. Automation, no matter how advanced, is simply a tool; unfortunately, the driver failed in his duty to remain active in managing his resources.
If Tesla's figures are to be believed, then even partial automation has value in reducing accident rates.

Parallels exist in manufacturing, construction and transport, to name a few. There are issues such as a lack of effective training, but in these industries accident rates have improved immensely.
Reducing it to an all-or-nothing argument doesn't do it justice.

Lol, you sound like a true bureaucrat. Lack of effective training... in other words, taking control away from automation and placing it in the hands of the controller. Where does that road end? That's right: no automation at all.
 
Am I the only person thinking this may have been more of a suicide? idk, lol.
It was one of my first thoughts too. Either way, we can all learn from this. I'd want my car to at least be able to detect whether I'm falling asleep, and wake me up with a swift jolt of electricity, or a slap to the face, or better yet a high-pitched Asian lady screaming "wake up!" - that would work too...

Also, why not have cameras inside the car to record the driver? Not only useful for facial/eye detection, but also as proof for insurance in case of accidents. I could see that being a thing...
 


I don't particularly care if it was "the driver's fault". Fact is, there was an automated system in place that, by its very presence, dicked with the driver's sense of command and control. Either all cars on a particular road circuit are fully automated, or none at all. Mixing AI and human control, even in the same car, is a recipe for trouble.
Eh, I don't agree with this at all.
An automated system could detect autonomous vehicles and human-operated vehicles and treat them differently.
Just because we haven't gotten to that level of sophistication yet doesn't mean it's not possible.
 
Doesn't change the fact it's safer with it than without. They have no legal concerns here; the worst that can possibly happen is a name change, which won't stop people doing what they're doing. People are idiots, plain and simple.
Tell the decapitated driver just how safe his vehicle was. Be sure to remind him of the Emergency Braking feature that never activated because there was a semi-trailer blocking his path.

And don't forget to mention that the beeping and alarms warning him to maintain control are easily ignored, further lulling the driver into a false sense of security.

A cognitive driver, in a normal vehicle, would have had more of a chance to avoid and/or survive this accident. IMHO

Like I said above, Tesla needs to program their software to look for big hovering white boxes; hindsight combined with death is 20/20.
 
Eh, I don't agree with this at all.
An automated system could detect autonomous vehicles and human-operated vehicles and treat them differently.
Just because we haven't gotten to that level of sophistication yet doesn't mean it's not possible.

Anything is possible if you throw enough money and engineering at it. I'm of the opinion that too much is being asked of the self-driving car too quickly. I have great hopes that it will become a reality, but since human bodies are squishy, mixing robots and people in what is statistically by far the most dangerous pastime people currently involve themselves in is just asking for trouble.

The technological ask is audacious to the point of arrogance and many more will be paying with their lives because instead of revolutionizing public transport infrastructure and taking people out of the equation completely, they wanna mix it up because it's politically and socially the most expedient.

I'll be grabbing the popcorn and watching the bloodbath.
 
Anything is possible if you throw enough money and engineering at it. I'm of the opinion that too much is being asked of the self-driving car too quickly. I have great hopes that it will become a reality, but since human bodies are squishy, mixing robots and people in what is statistically by far the most dangerous pastime people currently involve themselves in is just asking for trouble.

The technological ask is audacious to the point of arrogance and many more will be paying with their lives because instead of revolutionizing public transport infrastructure and taking people out of the equation completely, they wanna mix it up because it's politically and socially the most expedient.

I'll be grabbing the popcorn and watching the bloodbath.
Eh... roughly 16% of all accident-related deaths in the US come from automobiles. That's all human interaction at the moment. I feel as if adding some automated incidents will not make a big dent in the overall picture.
Personally, I feel it definitely can and will be done. If all cars were automated, that 16% should shoot down to almost nothing. Even if half the cars were automated, I fully suspect the percentage of automobile deaths would be greatly reduced.
I personally also feel as if there are too many groups looking for the holy grail when there is no one thing that will work. Connected cars would really help automation. Having multiple systems (radar and visual recognition, for example) combined with wayside equipment (beacons or sensors built into the road) would also help out a lot.
 
Eh... roughly 16% of all accident-related deaths in the US come from automobiles. That's all human interaction at the moment. I feel as if adding some automated incidents will not make a big dent in the overall picture.
Personally, I feel it definitely can and will be done. If all cars were automated, that 16% should shoot down to almost nothing. Even if half the cars were automated, I fully suspect the percentage of automobile deaths would be greatly reduced.
I personally also feel as if there are too many groups looking for the holy grail when there is no one thing that will work. Connected cars would really help automation. Having multiple systems (radar and visual recognition, for example) combined with wayside equipment (beacons or sensors built into the road) would also help out a lot.

And that is just a hacker's wet dream waiting to happen.

Sounds like a perfectly great idea to me to have all the automated cars communicating with each other on the same frequency.

All you would have to do to muck that up is use a signal jammer.

Or what happens when a sensor or radio messes up? We all know how unreliable current wireless connections can be and how infuriating it can be to get all your devices working properly on the same network.

Couple that with a large amount of traffic, with every single vehicle within a certain distance needing to be connected to each other and you are setting yourself up for a fuster-cluck.

Visual detection goes out the window in bad weather and I suspect low range radar wouldn't fare much better.

And what happens when the sensors/cameras/whatever get dirty?
 
And that is just a hacker's wet dream waiting to happen.

Sounds like a perfectly great idea to me to have all the automated cars communicating with each other on the same frequency.

All you would have to do to muck that up is use a signal jammer.

Or what happens when a sensor or radio messes up? We all know how unreliable current wireless connections can be and how infuriating it can be to get all your devices working properly on the same network.

Couple that with a large amount of traffic, with every single vehicle within a certain distance needing to be connected to each other and you are setting yourself up for a fuster-cluck.

Visual detection goes out the window in bad weather and I suspect low range radar wouldn't fare much better.

And what happens when the sensors/cameras/whatever get dirty?

Communication between cars would be used for optimizing flow of traffic. Primary car behavior would still be dependent on the sensors.

Redundant sensors as well as failure detection/limp mode would take care of sensor failure.

Human eyes are not that much better in bad weather. Worse, human hubris would often have people not slowing down to safe speeds for the conditions.
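The redundancy and limp-mode idea above can be sketched as a simple cross-check. This is a toy illustration only; the function name, the two-sensor setup and the 5 m disagreement threshold are all invented for the example:

```python
def fuse_range(radar_m, camera_m, tolerance_m=5.0):
    """Cross-check two independent range estimates to the obstacle ahead.

    A reading of None models a failed or blinded sensor. Returns the
    fused range plus a mode flag the rest of the system can act on.
    """
    readings = [r for r in (radar_m, camera_m) if r is not None]
    if not readings:
        return None, "limp_mode"        # both sensors lost: slow down, pull over
    if len(readings) == 1:
        return readings[0], "degraded"  # one sensor down: reduced capability
    if abs(radar_m - camera_m) > tolerance_m:
        # The sensors disagree badly: fail safe by trusting the nearer obstacle.
        return min(readings), "degraded"
    return (radar_m + camera_m) / 2, "nominal"
```

Real systems fuse far more sensors with probabilistic filtering, but the failure-detection shape is the same: no single reading is trusted on its own.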
 
OK. Imagine a world in which you're supposed to watch the Roomba. Just imagine it for a second.
Too hard? OK, let's change it to, I dunno, an automated train system where you're the operator who's supposed to watch it. Change it to autopilot (same name, wow) on an airplane that you're supposed to watch/keep track of. Change it to autopilot on a cruise ship that you're still supposed to watch and keep track of.
Instead of watching it, you decide to take a nap. Now there's a chance nothing bad will happen, but there's also a chance that catastrophe will strike.
Do you have a point somewhere?

Trains have alerters for exactly this reason, to keep the driver's attention. They're not supposed to take a nap. And airline pilots are not supposed to take a nap either when the "autopilot" is engaged.
Also:
 
And that is just a hacker's wet dream waiting to happen.

Sounds like a perfectly great idea to me to have all the automated cars communicating with each other on the same frequency.

All you would have to do to muck that up is use a signal jammer.

Or what happens when a sensor or radio messes up? We all know how unreliable current wireless connections can be and how infuriating it can be to get all your devices working properly on the same network.

Couple that with a large amount of traffic, with every single vehicle within a certain distance needing to be connected to each other and you are setting yourself up for a fuster-cluck.

Visual detection goes out the window in bad weather and I suspect low range radar wouldn't fare much better.

And what happens when the sensors/cameras/whatever get dirty?
The communication between cars would be optional, because not all systems would use it. It would basically send out a broadcast indicating what the car is currently doing to all the surrounding cars.
So if the car ahead is equipped with the system and detects that it's going to brake, it would broadcast that to let the cars behind it know. If it needs to change lanes and there's a car parallel to it, the parallel car would speed up or slow down a small amount, making room.
Honestly, in that scenario a human driving a car could have the same system on and just let the automated cars have a smoother interaction.
It's not like I'm advocating 100% dependence on the transmission of information. That's doomed to failure. But in those scenarios, better decisions could be made if that information were available.

I don't see how visual detection goes out the window. How is a camera different from human eyesight in that manner? If it's not safe for humans to drive, then it's probably going to be very difficult for automated vehicles to travel as well.

Well when the windshield becomes dirty, we have this mechanism called windshield wipers that assist in providing visibility. I suspect it won't be that much different with sensors.
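A minimal sketch of the broadcast idea described above. The message fields and the `react` policy here are invented for illustration (real V2V work standardizes this kind of payload, e.g. the SAE J2735 Basic Safety Message); the point is only that a follower can act on the broadcast intent before its own sensors see the gap close:

```python
from dataclasses import dataclass

@dataclass
class V2VMessage:
    """Hypothetical broadcast payload from the car ahead."""
    sender_id: str
    braking: bool
    changing_lanes: bool
    speed_mps: float

def react(own_speed_mps: float, ahead: V2VMessage) -> str:
    """A follower's decision, given the broadcast from the car ahead."""
    if ahead.braking:
        return "brake"        # start slowing before the gap visibly closes
    if ahead.changing_lanes:
        return "adjust_gap"   # make room for the merge
    if own_speed_mps > ahead.speed_mps:
        return "coast"        # closing on a slower car; shed speed early
    return "maintain"
```

As the post says, the broadcast supplements the car's own sensors rather than replacing them: losing the radio link just means falling back to sensor-only behavior.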
 
And that is just a hacker's wet dream waiting to happen.

Sounds like a perfectly great idea to me to have all the automated cars communicating with each other on the same frequency.

All you would have to do to muck that up is use a signal jammer.

Or what happens when a sensor or radio messes up? We all know how unreliable current wireless connections can be and how infuriating it can be to get all your devices working properly on the same network.

Couple that with a large amount of traffic, with every single vehicle within a certain distance needing to be connected to each other and you are setting yourself up for a fuster-cluck.

Visual detection goes out the window in bad weather and I suspect low range radar wouldn't fare much better.

And what happens when the sensors/cameras/whatever get dirty?

Ah, why won't you people get tired of these doomsday scenarios related to autonomous vehicles? There is FUD and there is [H]FUD.
 
Do you have a point somewhere?

Trains have alerters for exactly this reason, to keep the driver's attention. They're not supposed to take a nap. And airline pilots are not supposed to take a nap either when the "autopilot" is engaged.
Also:

There have been train accidents that have occurred with automated systems on them: https://en.wikipedia.org/wiki/June_2009_Washington_Metro_train_collision
And that's with an operator who's supposed to be watching it (similar to Tesla's Autopilot).
Airline pilots have taken naps with autopilot on: http://www.dailymail.co.uk/news/art...ng-asleep-overshooting-airport-150-miles.html
My point is that people get so used to automated features that they start to trust them when they're clearly not made for full automation. That false sense of security leads to accidents with loss of life at worst, or incidents at the least.
 
I remember reading about what I think was the first Autopilot death, where the car couldn't differentiate between a sunset and, apparently, the rear end of a semi.
 
Does anyone know if that Tesla had the version 2 hardware? Don't know if that would have made a difference. And if the car says keep your hands on the steering wheel, then fucking do it.
 
There have been train accidents that have occurred with automated systems on them: https://en.wikipedia.org/wiki/June_2009_Washington_Metro_train_collision
And that's with an operator who's supposed to be watching it (similar to Tesla's Autopilot).
Airline pilots have taken naps with autopilot on: http://www.dailymail.co.uk/news/art...ng-asleep-overshooting-airport-150-miles.html
My point is that people get so used to automated features that they start to trust them when they're clearly not made for full automation. That false sense of security leads to accidents with loss of life at worst, or incidents at the least.
Yes, but that's still human stupidity. I don't think it's Tesla's job to protect people from their own negligence and stupidity. In fact, you can't protect people from their own stupidity. They'll always find some way to counteract your countermeasures; just look at the bottle trick they came up with.
And if someone gets into an accident while using a bottle to keep the car at bay, Tesla would still get the blame. It's not right.
And even if there were absolutely no way to circumvent the alerters, people would try to force autonomous devices into dangerous situations to try and catch them out.
There should be zero tolerance for this. You got into an accident with a high blood alcohol level? You get done even if you're not the responsible party. The same zero tolerance should apply when there is evidence that you were hands-off for extended periods, because there is a pretty good chance that you could have avoided the accident if you had been paying attention, even if you weren't the one violating the traffic code.
 
The communication between cars would be optional, because not all systems would use it. It would basically send out a broadcast indicating what the car is currently doing to all the surrounding cars.
So if the car ahead is equipped with the system and detects that it's going to brake, it would broadcast that to let the cars behind it know. If it needs to change lanes and there's a car parallel to it, the parallel car would speed up or slow down a small amount, making room.
Honestly, in that scenario a human driving a car could have the same system on and just let the automated cars have a smoother interaction.
It's not like I'm advocating 100% dependence on the transmission of information. That's doomed to failure. But in those scenarios, better decisions could be made if that information were available.

I don't see how visual detection goes out the window. How is a camera different from human eyesight in that manner? If it's not safe for humans to drive, then it's probably going to be very difficult for automated vehicles to travel as well.

Well when the windshield becomes dirty, we have this mechanism called windshield wipers that assist in providing visibility. I suspect it won't be that much different with sensors.

And how many vehicles have you seen or driven where the windshield wipers were not replaced when they should have been? If people don't replace those when they wear out, how often do you think they are going to replace wipers on cameras or other sensors that have zero effect on whether they themselves can see out the windows?

Adding more complicated systems is going to end up adding even more points of failure, which will also increase the initial cost of the vehicle, but also add to the maintenance costs of the vehicle.

And what about when a car with the communication system is being manually driven? Is it still going to send the signal to other vehicles, or is it going to go into some bypass mode? Because if it is being manually driven and the system still sends out information, that could lead to disaster, especially if the driver is being an idiot.
 
And how many vehicles have you seen or driven where the windshield wipers were not replaced when they should have been? If people don't replace those when they wear out, how often do you think they are going to replace wipers on cameras or other sensors that have zero effect on whether they themselves can see out the windows?

Adding more complicated systems is going to end up adding even more points of failure, which will also increase the initial cost of the vehicle, but also add to the maintenance costs of the vehicle.

And what about when a car with the communication system is being manually driven? Is it still going to send the signal to other vehicles, or is it going to go into some bypass mode? Because if it is being manually driven and the system still sends out information, that could lead to disaster, especially if the driver is being an idiot.
The nice part about automated vehicles is that when one detects it's time for maintenance, it could just drive itself over to the maintenance shop of your choosing.
All I'm saying is that we already rely on a mechanism to clean the windshield. I have to believe something similar could be done for sensors. I wasn't trying to provide a bulletproof solution for you.
Driving is a complicated process. I'm saying the prototypes should be trying out lots of different sensors and mechanisms to find a solution that works really well. Right now it's a bit early in the evolution to say what will and won't work.

What's the problem with a car that's being driven manually having that connected component on? If you have an automated vehicle and the car opposite you is being driven erratically, and that information is being broadcast, maybe the automated vehicles can stay away? If the person can't stay in their lane properly and that's being transmitted, the automated vehicles could avoid being next to them. Hell, an automated vehicle could detect that a car is a hazard and automatically notify the authorities to pull the vehicle over because the other person is too busy texting or driving drunk.

I mean, let's look at current technologies. Lane adherence is already a simple problem that has been solved. Dashcams with very little computing power have the algorithms to warn the driver if the vehicle is slowly veering out of its lane. If a $50 dashcam can do that, autonomous vehicles can certainly do that. I'd say a very high proportion of accidents related to texting involve the car veering out of its lane and going either off-road, into oncoming traffic, or into the person next to it. I consider this to be in the low-hanging-fruit category and expect to see lane warning systems standard in cars very soon. Transmitting that information out is fairly straightforward, if we ignore the lack of any federal standards on how it should be done and in what format. Utilizing that information will at best save lives and at worst prevent some minor fender benders.
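The lane-adherence check mentioned above really is cheap once the lane lines have been located. Here is a toy version of the warning logic, assuming a camera pipeline has already found the two lane-line positions (the genuinely hard part); the function name and the 30% threshold are illustrative only:

```python
def lane_departure_warning(left_x: float, right_x: float,
                           vehicle_x: float, frac: float = 0.30) -> bool:
    """Warn once the vehicle's lateral offset from the lane centre
    exceeds `frac` of the half-lane width. All positions are pixel
    x-coordinates in the camera frame."""
    centre = (left_x + right_x) / 2
    half_width = (right_x - left_x) / 2
    return abs(vehicle_x - centre) > frac * half_width
```

With lane lines at pixels 100 and 300, the centre is 200 and the warning fires once the car drifts more than 30 pixels off centre, which is why even low-power dashcam hardware can run it per frame.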
 
Tell the decapitated driver just how safe his vehicle was. Be sure to remind him of the Emergency Braking feature that never activated because there was a semi-trailer blocking his path.

And don't forget to mention that the beeping and alarms warning him to maintain control are easily ignored, further lulling the driver into a false sense of security.

A cognitive driver, in a normal vehicle, would have had more of a chance to avoid and/or survive this accident. IMHO

Like I said above, Tesla needs to program their software to look for big hovering white boxes; hindsight combined with death is 20/20.

You say all that, but it doesn't change the fact that they are legally covered. They have done nothing against the law, and couldn't be held liable for anything outside of, maybe, the name, at best. The only reason you see this news so much is that it's Tesla; crazier shit than this happens in normal cars daily, hell, probably hourly. You just don't hear about it.
 
While everyone continues to argue about software versus human, the KEY point is that the SOFTWARE WARNED the human driver multiple times before the crash happened, and the person CHOSE to ignore the warnings... end of story.
 
You say all that, but it doesn't change the fact that they are legally covered. They have done nothing against the law, and couldn't be held liable for anything outside of, maybe, the name, at best. The only reason you see this news so much is that it's Tesla; crazier shit than this happens in normal cars daily, hell, probably hourly. You just don't hear about it.
This is how it will go down.

Ladies and Gentlemen of the Jury,
today we seek justice for An American Hero who survived the wars in Afghanistan, covert actions in Pakistan and the War on ISIL (pause for effect), (emphasize the following) only to be brutally slaughtered by his vehicle upon returning home.

After posting numerous YouTube videos of himself driving hands-free and accumulating hundreds of thousands of views, so many in fact that Tesla lauded him, he was lulled into a false sense of security.

Civil court is where it's at. Don't believe me? Just ask O.J. Simpson.
 
While everyone continues to argue about software versus human, the KEY point is that the SOFTWARE WARNED the human driver multiple times before the crash happened, and the person CHOSE to ignore the warnings... end of story.

To you and me, and probably to most people who use this forum, that is the end of the story.

Out in the idiocy of the rest of humanity it is not even close to the end.
 
To you and me, and probably to most people who use this forum, that is the end of the story.

Out in the idiocy of the rest of humanity it is not even close to the end.

Nah, most people on this forum will probably argue that this is why it shouldn't be called a self-driving car, since it should have been 100% able to do what was needed to avoid the crash, even if that meant pulling over and stopping on the side of the road when the driver didn't listen.
 
It is driver assist, not autonomous. Again, adding or manipulating words does nothing. This is, without a doubt, a user using a device in a way it was not meant to be used, and is expressly stated not to be, while active alarms and alerts are telling them to stop. Tesla bears no responsibility for that; this is 100% user fault.

Except the user was using the device exactly how Tesla themselves claimed it would work. They oversold the capabilities of their Autopilot system. Every other car company sells the same systems, but gives very little control to the system, and they don't call it autonomous anything.

Even after multiple videos came out showing users driving it incorrectly, and even after Musk himself came out saying he was disappointed that users were doing so, nothing was changed. It wasn't until this guy's accident that they made changes to Autopilot. They didn't change their advertising of it, though.

The accident is 100% the driver's fault, yes. I'm not denying that. I was simply stating that I blame Tesla for overselling its features. Even now, they still oversell it.
 
Except the user was using the device exactly how Tesla themselves claimed it would work. They oversold the capabilities of their Autopilot system. Every other car company sells the same systems, but gives very little control to the system, and they don't call it autonomous anything.

Even after multiple videos came out showing users driving it incorrectly, and even after Musk himself came out saying he was disappointed that users were doing so, nothing was changed. It wasn't until this guy's accident that they made changes to Autopilot. They didn't change their advertising of it, though.

The accident is 100% the driver's fault, yes. I'm not denying that. I was simply stating that I blame Tesla for overselling its features. Even now, they still oversell it.

How? Everywhere it is talked about, you are told you remain in control; every time the system is used, it tells you this and you have to click "Accept" on the screen. Tesla has not called it autonomous anything, either; it is called Autopilot. If you are too ignorant to understand that, I don't know what to tell you.

None of these users have been using the device as Tesla claims it should be used; where you get this, I don't know. Nowhere in the data, the manual or the on-screen warnings does it state ANYTHING about this being a fully autonomous driving system. Musk said he was not happy people were doing so in the context of him wanting a fully autonomous system for the cars; stop pulling things out of context. This has always been one of their goals, and they make it very clear it is not there yet and that even Autopilot is still a huge work in progress, just like all driver assists from all MFGs right now. We are in the early stages still. To suggest that having goals is a bad thing or something you should be found at fault for is just ridiculous.
 
How? Everywhere it is talked about, you are told you remain in control; every time the system is used, it tells you this and you have to click "Accept" on the screen. Tesla has not called it autonomous anything, either; it is called Autopilot. If you are too ignorant to understand that, I don't know what to tell you.

My bad. I must have gotten confused by Musk talking about autonomy while their website talks about self-driving and how the car does all this stuff for you. That, or the 2-minute video showing a car driving around with the user never putting their hands on the wheel.

None of these users have been using the device as Tesla claims it should be used; where you get this, I don't know. Nowhere in the data, the manual or the on-screen warnings does it state ANYTHING about this being a fully autonomous driving system. Musk said he was not happy people were doing so in the context of him wanting a fully autonomous system for the cars; stop pulling things out of context. This has always been one of their goals, and they make it very clear it is not there yet and that even Autopilot is still a huge work in progress, just like all driver assists from all MFGs right now. We are in the early stages still. To suggest that having goals is a bad thing or something you should be found at fault for is just ridiculous.

"Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver."

Ya, Tesla's never said anything about their car being able to drive itself. You know, except on their Tesla Autopilot page.
 
My bad. I must have gotten confused by Musk talking about autonomy while their website talks about self-driving and how the car does all this stuff for you. That, or the 2-minute video showing a car driving around with the user never putting their hands on the wheel.



"Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver."

Ya, Tesla's never said anything about their car being able to drive itself. You know, except on their Tesla Autopilot page.

Yes? And Tesla has been moving to release fully autonomous cars for some time now, which is an UPGRADE to the old system, which was just driver assist. You cannot talk about an old system and then quote a totally new system Tesla doesn't even have fully available yet, let alone at the time of this event LAST YEAR. And you just so conveniently ignored the rest of the text, which goes on to say: "Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction. It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval. Please note also that using a self-driving Tesla for car sharing and ride hailing for friends and family is fine, but doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year."

Stop witch hunting already.
 
My bad. I must have gotten confused by Musk talking about autonomy while their website talks about self-driving and how the car does all this stuff for you. That, or the 2-minute video showing a car driving around with the user never putting their hands on the wheel.
And Chevy or Ford or just about any other company shows ads where a car cutting off another will have your car stop super quick without you hitting the brakes... All of these companies tout this "enhanced safety feature" (or whatever they're marketing it as), yet if you read the fine print it does say you should not rely on this technology.
 
And Chevy or Ford or just about any other company shows ads where a car cutting off another will have your car stop super quick without you hitting the brakes... All of these companies tout this "enhanced safety feature" (or whatever they're marketing it as), yet if you read the fine print it does say you should not rely on this technology.
They have to, for liability reasons.
On the one hand, it should in theory save the car from crashes.
On the other hand, it reinforces bad driving behavior, like following too close and being distracted while driving. Thinking that you can be a bad driver and the technology will compensate is a fairly bad outcome.
If anything, I'm surprised more of this (https://www.wired.com/2016/08/hackers-fool-tesla-ss-autopilot-hide-spoof-obstacles/) isn't happening. Being able to trick a car into braking to get it off your ass should be *fun* in the future.
 
Nope, the Tesla is not like any other vehicle. It can avoid obstacles, steer and brake all by itself.
The failure here was in design; expect a civil trial.

No, the failure here was a dumbass behind the wheel.
 
No, the failure here was a dumbass behind the wheel.
So I guess the software patch that fixed the car's inability to see large hovering rectangles was not needed? Interesting logic you've got there.
 
So I guess the software patch that fixed the car's inability to see large hovering rectangles was not needed? Interesting logic you've got there.

It is entirely irrelevant, because the system demands that the driver pay attention at all times.
 