Tesla Model X Driver Blames Autopilot for Insane Collision with Semi-Truck

No. I also don't have proof there isn't a giant spaghetti monster flying above us.
Yes, mom.

So he can't back anything up, nor can you, yet you know you're the correct one... got it. Go to bed.
 
After looking at the picture of the wreck and reading what happened, this isn't a weakness in the autopilot as much as a celebration of how bad-ass Tesla's safety technology is. Tesla's safety engineers are magicians.
 
It's almost like driver inattentiveness is the whole problem to begin with. :eek:

I'm not sure how naming it something different is going to help avoid accidents. A human's response time (go figure, another "problem" when traveling near 100 ft/sec) just isn't suited to taking over at a moment's notice, especially when you otherwise haven't been doing anything but watching, both other traffic and your own car. In every single video I've seen of this, your hands aren't actually on the wheel, so you've added to your response time even more.
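As a rough back-of-the-envelope sketch of that point (assumed numbers: a 70 mph cruise and a 1.5 second perception-reaction time, a commonly cited ballpark, not anything from this incident):

```python
# Distance covered before a human takeover even begins, using assumed numbers.
speed_mph = 70
speed_ft_per_s = speed_mph * 5280 / 3600     # ~103 ft/s, i.e. "near 100 ft/sec"
reaction_time_s = 1.5                        # assumed; hands off the wheel adds more
distance_ft = speed_ft_per_s * reaction_time_s
print(f"~{speed_ft_per_s:.0f} ft/s -> ~{distance_ft:.0f} ft traveled before any input")
# ~154 ft, roughly ten car lengths, before braking or steering even starts.
```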


Gotta watch out for those Cessnas broken down on the side of the flight path, and those damn cargo planes blocking it.

That's collision avoidance, not autopilot.

But commercial jets do that as well, so your comeback kinda sucks on both levels.
 
And you know what really gets me about your statement: in some way you seem to believe that an aircraft's autopilot system is superior to Tesla's. I mean, the plane only has to maintain course and altitude and shit, in the sky, a sky with no lanes, guard rails, or bridge abutments, and where the next closest vehicle is usually 30 or more miles away, with several "minutes" in which to make adjustments as opposed to mere seconds.

I'm just thinking that the Tesla bar is a little bit higher to have to jump over.

If anything, your little diatribe only reinforced the need to NOT call it fucking autopilot lol.

Moving the goalposts is a logical fallacy.
 
That's collision avoidance, not autopilot.

But commercial jets do that as well, so your comeback kinda sucks on both levels.
Let me get these for you...
grabbingatstraws.jpg
 
After looking at the picture of the wreck and reading what happened, this isn't a weakness in the autopilot as much as a celebration of how bad-ass Tesla's safety technology is. Tesla's safety engineers are magicians.
It's quite good, but that's standard for modern cars and it wasn't like he hit a brick wall head on.
 
If anything, your little diatribe only reinforced the need to NOT call it fucking autopilot lol.

Moving the goalposts is a logical fallacy.

So we shouldn't call it AutoPilot because that suggests it is as capable as a system that is, in fact, less capable?

And that is moving a goal post?

Tell me this is not an accurate description of what Jim Kim is saying.

My understanding of Jim Kim's statement is that Tesla should not call their system AutoPilot because that suggests it is more capable than it actually is.

But as I pointed out, Tesla's AutoPilot is actually more capable than an aircraft AutoPilot system; it's just not fully autonomous, which an aircraft autopilot system isn't either.

No reason to let reality get in the way of things.

Do you believe I am misunderstanding Jim Kim's comment? That there is another reason that Tesla shouldn't call their system AutoPilot?
 
None of the other systems are doing what Tesla's Autopilot does, nor are they marketed, used, or perceived the way it is.

"yet miles driven under autopilot have a MUCH lower crash rate than normal unaided drivers"
There are no accurate statistics to back that up, and correlation does not equal causation.

From the National Highway Traffic Safety Administration.

"The agency found that Teslas without Autosteer crashed an average of 1.3 times per million miles driven, compared with 0.8 per million miles for vehicles equipped with Autosteer." So over all a 40% reduction in crashes just by the addition of autosteer, this was also on first gen hardware.

"Not only did the NHTSA report absolve Tesla of any blame in the accident, but analysis and testing of the Autopilot system found that across Tesla’s fleet, the deployment of Autopilot reduced accidents by around 40 percent. Sorry, humans: we’re all bad at driving, and even the first generation of trial software is better at paying attention than your brain on Red Bull."

 
After looking at the picture of the wreck and reading what happened. This isn't a weakness in the autopilot as much as a celebration of how bad-ass Tesla's safety technology is. Tesla's safety engineers are magicians.


And a little bit of luck. The article makes it sound like the car ran straight up the truck's ass, but the description of the damage suggests the car hit off-center: the passenger area was sheared off while the driver's side wasn't. This guy was saved because, for whatever reason, he did not slam straight into the rear of the truck, or the driver's side would have suffered the same fate. That's what it sounds like to me.
 
I've made no comparable claims.

It's the same damn thing when you assumed he was wrong off the bat; otherwise why would you have replied to it the way you did? How does that not compare, ffs?

Eventually you have to have real live beta tests with consumers. You can't think of everything some dumb fuck might do no matter how controlled it is.
 
It's the same damn thing when you assumed he was wrong off the bat; otherwise why would you have replied to it the way you did? How does that not compare, ffs?
That which can be asserted without evidence, can be dismissed without evidence.

From the National Highway Traffic Safety Administration.

"The agency found that Teslas without Autosteer crashed an average of 1.3 times per million miles driven, compared with 0.8 per million miles for vehicles equipped with Autosteer." So over all a 40% reduction in crashes just by the addition of autosteer, this was also on first gen hardware.

"Not only did the NHTSA report absolve Tesla of any blame in the accident, but analysis and testing of the Autopilot system found that across Tesla’s fleet, the deployment of Autopilot reduced accidents by around 40 percent. Sorry, humans: we’re all bad at driving, and even the first generation of trial software is better at paying attention than your brain on Red Bull."
Correlation does not imply causation and bad reporting is bad reporting. I should say, now standard reporting.
 
So the reality is that a human was the cause of the accident, both the Tesla driver and the truck driver.
 
That which can be asserted without evidence, can be dismissed without evidence.


Correlation does not imply causation and bad reporting is bad reporting. I should say, now standard reporting.

LOL that still does not make you right or him wrong. How long did it take you to search for that saying? Good try at deflection but it falls short. Apparently you both are full of bullshit without backup. :LOL:

It's not a stretch to think automation will cause fewer accidents than humans in most scenarios. Humans get distracted while driving all the time (simple Google search). How many of these accidents have turned out to be human error, versus the driver's story, once Tesla checks the logs? I can't remember one where the driver wasn't lying or leaving out details in the end. I'm not going to claim it as fact because we know how that turns out.
 
more capable than an aircraft AutoPilot system; it's just not fully autonomous, which an aircraft autopilot system isn't either.


But it is lol. You put your trip plan in, and it will fly your butt there and avoid collisions with other planes... it's an autopilot. It is not required to have hands on.

The Tesla system is not in any way designed to do the above; first and foremost, it requires hands-on, constant attention at all times. (https://www.faa.gov/regulations_pol...advanced_avionics_handbook/media/aah_ch04.pdf)


Now it's a valid point, sure, that programming a car-driving system is much more complicated than a plane's... but that isn't the argument, and that is what I meant by moving the goalposts.
 
LOL that still does not make you right or him wrong. How long did it take you to search for that saying? Good try at deflection but it falls short. Apparently you both are full of bullshit without backup. :LOL:
0.5 seconds to remember it, 10 seconds to Google and copy the exact wording. It's not a deflection. He stated a statistical and causation certainty without anything to back it up. It is not on me to prove it wrong but on him to prove it right. Before that we were dealing with opinions. So no, you didn't catch me in a clever gotcha.

It's not a stretch to think automation will cause fewer accidents than humans in most scenarios. Humans get distracted while driving all the time (simple Google search). How many of these accidents have turned out to be human error, versus the driver's story, once Tesla checks the logs? I can't remember one where the driver wasn't lying or leaving out details in the end. I'm not going to claim it as fact because we know how that turns out.
How does it turn out? With having to prove it? That's terrible.
Automation can certainly cause fewer accidents. The current state of Tesla's automation is in far too early stages to take that for granted, and I would argue that the false sense of security and capability does more harm than good.
 
The current state of Tesla's automation is in far too early stages to take that for granted and I would argue that the false sense of security and capability does more harm than good.

Can you call it automation if you have to have both hands on the steering wheel?


"look at this automated widget machine... but it requires a worker to put both hands on this handle to work..."
 
But it is lol. You put your trip plan in, and it will fly your butt there and avoid collisions with other planes... it's an autopilot. It is not required to have hands on.

The Tesla system is not in any way designed to do the above; first and foremost, it requires hands-on, constant attention at all times. (https://www.faa.gov/regulations_pol...advanced_avionics_handbook/media/aah_ch04.pdf)


Now it's a valid point, sure, that programming a car-driving system is much more complicated than a plane's... but that isn't the argument, and that is what I meant by moving the goalposts.

That's your argument, not "the argument".

Tell me if I have this incorrect ok?

Some people think that Tesla shouldn't call their system AutoPilot because it misrepresents the system's capabilities.... to who?

It misrepresents them to who?

To the customer who purchased the car and received the training in how to operate their vehicle to include the optional AutoPilot system?

Is that who Tesla misrepresented AutoPilot to?

The dealership explains what the system is to the customer, and after they buy they are told again what it is, how to operate it, and what it is and isn't. Is this who you are claiming was uninformed?

Because it's called AutoPilot?

Take your time, I have to go home, I'll be back in the morning. Good evening.
 
Can you call it automation if you have to have both hands on the steering wheel?
Only every X seconds. And the idiot in the article thought so. The marketing also implies lots of automation. The name kinda hints at it as well.
 
0.5 seconds to remember it, 10 seconds to Google and copy the exact wording. It's not a deflection. He stated a statistical and causation certainty without anything to back it up. It is not on me to prove it wrong but on him to prove it right. Before that we were dealing with opinions. So no, you didn't catch me in a clever gotcha.


How does it turn out? With having to prove it? That's terrible.
Automation can certainly cause fewer accidents. The current state of Tesla's automation is in far too early stages to take that for granted, and I would argue that the false sense of security and capability does more harm than good.

And what have you proven? The same as the other person claiming the opposite... nothing. And that false sense of security is dip-shit-human related, not automation related. I'd suggest you take a look again at that beta screenshot before you enable the software on the car. Like other software, you don't know all the issues and glitches until you get it in the hands of real users. Lots of them. I'd argue it's the future, so get on board or GTFO of the way. I hope they don't let morons with money who don't listen, or "naysayers" like yourself, keep Tesla from pushing forward.
 
Tesla is the second coming of Christ, everything it touches turns to gold and it's propelling mankind into the future. Should I get more excited about their reality defying cars or Hyperloop? After all, it's Tesla, they can do no wrong and are so hip and futuristic and cool and only old farts and backward losers can criticize them.
 
Tesla is Satan reincarnated, everything they touch turns to shit and they are holding back humanity. Should I get more depressed over their unrealistic push for electric cars and clean energy? After all, it's Tesla, they can do no right and are so boring and outdated, and only preteen nerds care about this sort of thing.
 
Even though it's amusing the people in here, the semi changed lanes suddenly, which is what resulted in the crash (not the autopilot), and it did not give the collision avoidance system, or the driver, enough time to react fully at highway speeds. (From the damage to the passenger side, it very likely did try to avoid it but was too close when it got cut off by the vehicle in front that changed lanes suddenly.)

There are some accidents you just can't avoid; not sure what the deal is with the driver blaming the autopilot.

"There was a pickup truck that was out of gas in the right lane (lights were either dim or off, and give the night, was hard to see). A semi was pulling up onto it, saw it, braked and swerved into my middle lane. Autopilot did not disengage, but did the emergency beep about 1 second before impact. I was looking off to the side, and impacted the truck immediately after I heard the beep and looked forward."

I guess if he was looking forward and not off to the side (not sure what he was looking at) he might have been able to react to it, though he might still have hit it (still surprised the collision avoidance did not kick in; maybe a blind spot in the radar?). I do hope the computer on board was still operational after the crash, as it would have sent the data to Tesla, so it would be nice to see the report on this one (all of this likely happened in 1-3 seconds).

Slight edit

This is a first-gen Tesla, so that's most likely why the collision avoidance system did not react until it was too late (the early models are more for rear-end crashes into stationary traffic or side-on lane-change crash avoidance). The 2nd-3rd gen have more sensors, but the 4th-5th gen Tesla cars have more computing power and a lot more sensors and would likely have reacted to the semi pulling out (as long as there was not a car beside the car).
 
This is a first-gen Tesla, so that's most likely why the collision avoidance system did not react until it was too late (the early models are more for rear-end crashes into stationary traffic or side-on lane-change crash avoidance). The 2nd-3rd gen have more sensors, but the 4th-5th gen Tesla cars have more computing power and a lot more sensors and would likely have reacted to the semi pulling out (as long as there was not a car beside the car).
There are only two generations of HW: the first gen based on MobilEye and the current gen based on Nvidia. In almost all respects, the MobilEye one is still better.
 
Correlation does not imply causation and bad reporting is bad reporting. I should say, now standard reporting.

And the truth finally comes out. You do not care about facts or the truth, you only used evidence as what you thought was an easy way out not realizing there is a mountain of it, not just from Tesla, but also Google cars.

"Correlation does not imply causation" despite what you seem to think here is not an argument nor does it suffice as a refutation of data. It also only deals with observation of two points of data assumed to be related, however, as linked this was a case study with data, not just assumption, it is also backed by the 1.3 billion miles of driving time in these cars. Unlike other tests, the cars are the same, all Tesla and the only change being equipped with Autopilot or not, also unlike most other cars Tesla has a streaming data connection to the cars that keeps and sends driving and crash data to their deep learning servers.

"Correlation does not imply causation" only means that things can correlate yet not be related, it does NOT mean that because things correlate they are not related, if that was the case, nothing would ever be proven. That is just a logical fallacy.

So, if you are going to use "Correlation does not imply causation", you should at least have a counter argument and data. Without that it means nothing and only displays your bias in being unwilling to even acknowledge the actual data you asked for. If you will not take a case study from the National Highway Traffic Safety Administration as proof, what will you accept? The Word of God only? Give me a break, at this point I am 95% sure you are just a troll.

Apparently you both are full of bullshit without backup. :LOL:.

Are you talking about me? Because as soon as he asked for data, my very next reply was a link to a NHTSA case study with the very data he asked for, not sure how that is "bullshit without backup".

0.5 seconds to remember it, 10 seconds to Google and copy the exact wording. It's not a deflection. He stated a statistical and causation certainty without anything to back it up. It is not on me to prove it wrong but on him to prove it right. Before that we were dealing with opinions. So no, you didn't catch me in a clever gotcha.

Without anything to back it up? Just because you don't like the outcome of the data doesn't mean it does not count, unless you are one of those special little snowflakes.
 
And the truth finally comes out. You do not care about facts or the truth, you only used evidence as what you thought was an easy way out not realizing there is a mountain of it, not just from Tesla, but also Google cars.

"Correlation does not imply causation" despite what you seem to think here is not an argument nor does it suffice as a refutation of data. It also only deals with observation of two points of data assumed to be related, however, as linked this was a case study with data, not just assumption, it is also backed by the 1.3 billion miles of driving time in these cars. Unlike other tests, the cars are the same, all Tesla and the only change being equipped with Autopilot or not, also unlike most other cars Tesla has a streaming data connection to the cars that keeps and sends driving and crash data to their deep learning servers.

"Correlation does not imply causation" only means that things can correlate yet not be related, it does NOT mean that because things correlate they are not related, if that was the case, nothing would ever be proven. That is just a logical fallacy.

So, if you are going to use "Correlation does not imply causation", you should at least have a counter argument and data. Without that it means nothing and only displays your bias in being unwilling to even acknowledge the actual data you asked for. If you will not take a case study from the National Highway Traffic Safety Administration as proof, what will you accept? The Word of God only? Give me a break, at this point I am 95% sure you are just a troll.



Are you talking about me? Because as soon as he asked for data, my very next reply was a link to a NHTSA case study with the very data he asked for, not sure how that is "bullshit without backup".



Without anything to back it up? Just because you don't like the outcome of the data doesn't mean it does not count, unless you are one of those special little snowflakes.

Hey BlueFireIce, sorry I missed that post, so obviously I was mistaken and you did have proof. I was too busy replying to Meeho the intellectual, who obviously was not backing up what he was saying and was using deflective wording. It's easy to sit on your high horse with all the doom and gloom, but that does not advance technology. Yes, you will have idiots who refuse to listen even when it's screamed in their face. We had those before "smart" anything, so nothing will change that.
 
Hey BlueFireIce, sorry I missed that post, so obviously I was mistaken and you did have proof. I was too busy replying to Meeho the intellectual, who obviously was not backing up what he was saying and was using deflective wording. It's easy to sit on your high horse with all the doom and gloom, but that does not advance technology. Yes, you will have idiots who refuse to listen even when it's screamed in their face. We had those before "smart" anything, so nothing will change that.

Trust me, I understand. It seems like many people want this kind of tech to be perfect, and if it's not, it's horrible. However, if it fills a need/want and is only as bad as current drivers, who cares? Let the people who want it use it; in this case it appears to be much safer, something he refuses to admit, and it's still in the very early stages. I myself am very much into driving and I will never give up my ability to drive a car... but at the same time, having this for my commute to work would be a godsend, big time, if it ever reaches the point of being fully autonomous.
 
Should I get more depressed over their unrealistic push for electric cars and clean energy?
Ah, yes, the clean electric cars. Their energy and components are made from unicorn dust.

And the truth finally comes out. You do not care about facts or the truth, you only used evidence as what you thought was an easy way out not realizing there is a mountain of it, not just from Tesla, but also Google cars.

"Correlation does not imply causation" despite what you seem to think here is not an argument nor does it suffice as a refutation of data. It also only deals with observation of two points of data assumed to be related, however, as linked this was a case study with data, not just assumption, it is also backed by the 1.3 billion miles of driving time in these cars. Unlike other tests, the cars are the same, all Tesla and the only change being equipped with Autopilot or not, also unlike most other cars Tesla has a streaming data connection to the cars that keeps and sends driving and crash data to their deep learning servers.

"Correlation does not imply causation" only means that things can correlate yet not be related, it does NOT mean that because things correlate they are not related, if that was the case, nothing would ever be proven. That is just a logical fallacy.

So, if you are going to use "Correlation does not imply causation", you should at least have a counter argument and data. Without that it means nothing and only displays your bias in being unwilling to even acknowledge the actual data you asked for. If you will not take a case study from the National Highway Traffic Safety Administration as proof, what will you accept? The Word of God only? Give me a break, at this point I am 95% sure you are just a troll.



Are you talking about me? Because as soon as he asked for data, my very next reply was a link to a NHTSA case study with the very data he asked for, not sure how that is "bullshit without backup".



Without anything to back it up? Just because you don't like the outcome of the data doesn't mean it does not count, unless you are one of those special little snowflakes.
Were those cars driven by the same people on the same roads in the same conditions? No? Thank you, that is all. But that's just trolling; don't let me stop you from forming whatever conclusions make you feel good from too limited a set of data. It's a study from a known agency, so that automatically makes any conclusion true. And you're accusing me of trolling? That's rich.
 
Right. Because magically if he were driving everything would have gone differently. Sometimes shit happens that nothing could avoid.

yeah because paying attention when driving never results in a more favorable outcome...

Considering that the braking performance of your average car is FAR SUPERIOR to any tractor-trailer's, I would say that the driver could have easily avoided contact with said tractor....
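For what it's worth, a rough stopping-distance sketch under assumed deceleration values (roughly 0.8 g for a passenger car on dry pavement, around 0.5 g for a loaded tractor-trailer; reaction time ignored) shows the kind of gap being pointed at:

```python
# Illustrative stopping distances from 70 mph with assumed decelerations.
G = 32.2                        # ft/s^2
v = 70 * 5280 / 3600            # ~103 ft/s
car_decel = 0.8 * G             # assumed for a modern passenger car, dry pavement
semi_decel = 0.5 * G            # assumed for a loaded tractor-trailer
car_stop_ft = v**2 / (2 * car_decel)    # ~205 ft
semi_stop_ft = v**2 / (2 * semi_decel)  # ~327 ft
print(f"Car: ~{car_stop_ft:.0f} ft, semi: ~{semi_stop_ft:.0f} ft to stop")
```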
 
Can a commercial pilot go hands off when autopilot is on?

Yes, they can. When the plane is flying itself, they are there to monitor and intervene if needed... if they put too much pressure on the control stick (Airbus) or yoke (Boeing), the AP will disengage, as it will determine that the pilot wants control.

Are you really comparing an airliner in the sky to a car on the road? Are you really?

And by not driving you are paying more attention to the road? And steering/braking/accelerating takes up a big part of one's mental capacity? I... no, just no.

The guy you quoted has no idea how an autopilot on an aircraft works and does not understand what the pilot's role in flying an aircraft is when autopilot is engaged.

Typically, from 500 ft off the ground to 500 ft before touchdown, the autopilot in the aircraft flies the plane on a predetermined heading along a specified flight plan with specified waypoints and altitudes. The PILOT is there to take control if something is not going to plan or intervention is required. Comparing a system that has been in use reliably for DECADES, powered by a computer cluster that costs more than your average upscale house, to a cheap, unreliable version in a Tesla is ludicrous...
 
So, if you are going to use "Correlation does not imply causation", you should at least have a counter argument and data. Without that it means nothing and only displays your bias in being unwilling to even acknowledge the actual data you asked for. If you will not take a case study from the National Highway Traffic Safety Administration as proof, what will you accept? The Word of God only? Give me a break, at this point I am 95% sure you are just a troll.

Are you talking about me? Because as soon as he asked for data, my very next reply was a link to a NHTSA case study with the very data he asked for, not sure how that is "bullshit without backup".
Have you actually read the report or are you just going by what some reporter wrote? It's a rhetorical question. I could write a whole article about what it does and doesn't say on the issues and aspects it covered, but I'll just leave this part here:

5.4 Crash Rates. ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles travelled prior to [21] and after Autopilot installation. [22] Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation.

[21] Approximately one-third of the subject vehicles accumulated mileage prior to Autopilot installation.
[22] The crash rates are for all miles travelled before and after Autopilot installation and are not limited to actual Autopilot use.

It didn't account for the feature's use at all; it only had to be present, whether it was ever being used or not, not to go into all the other variables left unaccounted for. Modern journalism was at its finest reporting on NHTSA's findings.


Another interesting piece of data from the report is that the higher the level of automation, the bigger its share of longer eyes-off-road driver distractions. That's logical and to be expected, and that is why such systems are at too early a stage to go public.
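To make that first criticism concrete, here's a minimal sketch (with made-up, purely illustrative numbers) of why a rate pooled over all post-install miles can't isolate what Autopilot itself did:

```python
# The report's "after" rate pools every mile driven once Autosteer was
# installed, whether or not Autopilot was actually engaged. Numbers below
# are hypothetical, just to show the bookkeeping.
crashes_on, miles_on = 10, 30e6      # hypothetical: Autopilot engaged
crashes_off, miles_off = 30, 20e6    # hypothetical: installed but not engaged
pooled_rate = (crashes_on + crashes_off) / ((miles_on + miles_off) / 1e6)
rate_on = crashes_on / (miles_on / 1e6)
rate_off = crashes_off / (miles_off / 1e6)
print(f"Pooled 'after' rate: {pooled_rate:.2f} per million miles")                # 0.80
print(f"Engaged-only: {rate_on:.2f}, installed-but-not-engaged: {rate_off:.2f}")  # 0.33 vs 1.50
# The single pooled number reported can't distinguish these two scenarios.
```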
 
They should not call it autopilot. It has quite clearly demonstrated that it cannot drive the car by itself in all conditions. Remember the guy that got his head cut off when neither he nor the Tesla saw the tractor-trailer crossing in front of him, or how about the time the Tesla failed to see the road construction, which resulted in significant damage to the car? It is a driver-assist technology, not an "I can drive the car for you" technology....
 
Have you actually read the report or are you just going by what some reporter wrote? It's a rhetorical question. I could write a whole article about what it does and doesn't say on the issues and aspects it covered, but I'll just leave this part here:



It didn't account for the feature's use at all; it only had to be present, whether it was ever being used or not, not to go into all the other variables left unaccounted for. Modern journalism was at its finest reporting on NHTSA's findings.


Another interesting piece of data from the report is that the higher the level of automation, the bigger its share of longer eyes-off-road driver distractions. That's logical and to be expected, and that is why such systems are at too early a stage to go public.

Yes, I have. Along with the ones for Google and the like.

Only one thing changed on the cars between before and after: the installation or equipping of Autopilot. Are you suggesting a magical improvement in driver ability after the install of Autopilot? If so, that is quite the leap, but it would suggest the install of Autopilot to be even more valuable. Looks like you are grasping at straws at this point. It is clear you will always find "something" you feel doesn't prove it to be worth it, because you already stated that these systems make people assholes and are totally useless. Good luck with your denial, I am out.

They should not call it autopilot. It has quite clearly demonstrated that it cannot drive the car by itself in all conditions. Remember the guy that got his head cut off when neither he nor the Tesla saw the tractor-trailer crossing in front of him, or how about the time the Tesla failed to see the road construction, which resulted in significant damage to the car? It is a driver-assist technology, not an "I can drive the car for you" technology....

Yet again, autopilot (even in planes) is NOT autonomous. Autopilot in planes is actually less capable than the Tesla's and functions in even fewer conditions than the Tesla can. Please do some simple research before proclaiming something.
 
Yes, I have. Along with the ones for Google and the like.

Only one thing changed on the cars between before and after: the installation or equipping of Autopilot. Are you suggesting a magical improvement in driver ability after the install of Autopilot? If so, that is quite the leap, but it would suggest the install of Autopilot to be even more valuable. Looks like you are grasping at straws at this point. It is clear you will always find "something" you feel doesn't prove it to be worth it, because you already stated that these systems make people assholes and are totally useless. Good luck with your denial, I am out.



Yet again, autopilot (even in planes) is NOT autonomous. Autopilot in planes is actually less capable than the Tesla's and functions in even fewer conditions than the Tesla can. Please do some simple research before proclaiming something.
Take your own advice. In a modern passenger airliner, the plane is fully capable of reaching the destination with virtually zero pilot input other than entering route information and adjustments as needed for things like changing altitude. Go watch Mayday and get back to me when you understand what happens when automation is overly relied upon.
 