China Sees First Tesla Autopilot Crash

HardOCP News

Honestly, as stupid as people are, Tesla should just rename Autopilot or require an IQ test before you are able to use the feature. Also, what is the point of Autopilot if you have to have your hands on the wheel at all times? Watch the video, it's kinda scary that the Tesla didn't even attempt to avoid the car that it sideswiped. Thanks to cageymaru for the link.
 
You contradict yourself there Steve. You blame the drivers saying they should take an IQ test, then also blame Tesla in the following sentence. What side of the fence are you on?

I'm with the driver here. If the cars are too stupid to handle even the most basic accident avoidance, they shouldn't be marketed the way Tesla is marketing them. I'm personally going to stay as far away from all of this shit as I can until they have it completely figured out. No halfway measures. Full autonomy, or nothing at all.
 
Yeah, most of the time I'm on Tesla's side in saying the driver is at fault, but this time I'm kinda with the driver. That video showed the car did nothing to move away from the parked car. What if that had been a person standing there? Not sure how the Tesla didn't see that car and move. The only reason I can think of is that a car was next to the Tesla, so it didn't want to change lanes and cause a problem. This looks like a very basic thing for the Tesla to handle.
 
I have always thought that Autopilot was a terrible name for the feature. I blame both the car owners and Tesla for accidents like this: Tesla for misleading marketing and overhyping the feature, and the owners for being gullible enough to fall for it.
 
Fucking snowflakes not taking the blame for being a dumbass. Why on earth would you just sit there and let it happen? He wasn't moving at such a speed that he couldn't have grabbed the wheel and avoided the car himself.

That's like bitching how ABS sucks on icy roads.
 
Of course they did. It usually only takes them about 2 weeks to copy something.



:)
 
That was entirely the driver's fault. The video just cements it. You could see the car go to the right, but no further than the boundary line. The driver should have taken control of the vehicle.
 
I know that is one of the cases mentioned in the manual that autopilot won't avoid. You can only go so far to help stupid.
 
Don't know enough about the feature as I don't have a Tesla, but if it can't actually change lanes to avoid non-moving objects, then wtf is it for? My adaptive cruise control is more intelligent than that. It would at least hit the brakes and just sit there instead of bouncing off the car or skidding through it. That being said, if anything's parked, I'm not going to depend on these automated systems to do anything. I think they can be life savers for something sudden, like a deer running in front of your car, or anything that requires faster-than-human response time, but they are not self-driving by any means.

Now I think Tesla has a marketing/communication problem here. They can try to pass the buck to the driver, but part of the blame is on them and how they have named the feature and how their salesmen have sold it to consumers. The corporate lawyers can point at the fine print all they want, and yes, the drivers are stupid, but it sounds like Tesla needs to communicate the capabilities of the car (or lack thereof) better. Of course it sounds sexier if you make people think it's self-driving and leave the dirty details in fine print or other bullshit disclaimers.

I don't think any of these customers should be reimbursed, but I do think Tesla should be punished in some form. Maybe a fine, or a requirement to change the name and market the feature differently and more accurately.
 
Tesla explicitly states that the car may not see a stopped car on the side of the road, if it was previously following a car in front of it. It's one of the current limitations of Autopilot - except in this case, the guy had PLENTY of time to take over and move the car away from the obstruction. No sympathy here, pay attention to wtf you are doing.
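The "may not see a stopped car" limitation mentioned above is a known weakness of radar-based driver-assist systems in general: to suppress clutter from guardrails, signs, and parked cars, many systems discard radar returns whose speed over the ground is near zero. A rough sketch of that filtering logic (function names and thresholds are illustrative assumptions, not Tesla's actual firmware):

```python
# Sketch of why radar-based cruise systems can ignore a stopped car.
# All names and thresholds here are illustrative, not any
# manufacturer's actual logic.

def ground_speed(relative_speed_mps: float, ego_speed_mps: float) -> float:
    """Radar measures speed relative to our car; add our own speed
    to estimate the target's speed over the ground."""
    return relative_speed_mps + ego_speed_mps

def is_tracked(relative_speed_mps: float, ego_speed_mps: float,
               stationary_cutoff_mps: float = 1.0) -> bool:
    """Many systems drop near-stationary returns to filter out
    roadside clutter (signs, guardrails, parked cars)."""
    return abs(ground_speed(relative_speed_mps, ego_speed_mps)) > stationary_cutoff_mps

# Car ahead doing 25 m/s while we do 30 m/s: tracked.
print(is_tracked(-5.0, 30.0))   # True

# Stopped car while we do 30 m/s: filtered out as "clutter".
print(is_tracked(-30.0, 30.0))  # False
```

Under this kind of filter, a car that was never moving while in the radar's view simply never becomes a tracked target, which is consistent with the documented limitation.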
 
Hmm, hard to say what the car could've done if there was another car in the right lane; all it could do is minimize the damage by hugging the lane separator. The driver should be taking over here, since the car can't bend the laws of physics imho.
 
Here's the video of the accident. The autopilot shouldn't have done that.

You shouldn't expect it to make evasive maneuvers when someone does something stupid in front of you, like partially blocking a lane. It's a level 2 autonomous vehicle, you should never cede full control of the vehicle to autopilot feature. The driver should have had his hands on the wheel and correctly steered around the stopped vehicle like every other car did.
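For readers unfamiliar with the "Level 2" terminology used above, the SAE J3016 standard defines six levels of driving automation. A quick paraphrased reference (the one-line descriptions are my own summaries of the standard):

```python
# SAE J3016 driving-automation levels, paraphrased. At Level 2 the
# human must supervise at all times, which is the point being made.
SAE_LEVELS = {
    0: "No automation: human does everything",
    1: "Driver assistance: steering OR speed assist (e.g. basic cruise)",
    2: "Partial automation: steering AND speed assist; human must monitor",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: no human needed within a defined domain",
    5: "Full automation: no human needed anywhere",
}

print(SAE_LEVELS[2])
```

Autopilot sits at Level 2, so the driver remains responsible for monitoring the road at all times.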
 
Tesla shipped a buggy, improperly tested advanced cruise system and claimed it was an autonomous system. They need to pull this product before they get buried in lawsuits.
 
Dumb driver, but wow, I am surprised the Tesla could not avoid such an obvious obstacle; my cruise control would at least begin hitting the brakes.
 
Honestly, as stupid as people are, Tesla should just rename Autopilot or require an IQ test before you are able to use the feature. Also, what is the point of Autopilot if you have to have your hands on the wheel at all times? Watch the video, it's kinda scary that the Tesla didn't even attempt to avoid the car that it sideswiped. Thanks to cageymaru for the link.

Autopilot != no human involvement required. Even on a plane, pilots still have to pay attention; the purpose of the autopilot is to simply help out with monotonous tasks, not to replace the human entirely.
 
Autopilot != no human involvement required. Even on a plane, pilots still have to pay attention; the purpose of the autopilot is to simply help out with monotonous tasks, not to replace the human entirely.
That word does not mean what you think it means.
Merriam-Webster defines "Autopilot" as "a device that steers a ship, aircraft, or spacecraft in place of a person."
 
That word does not mean what you think it means.
Merriam-Webster defines "Autopilot" as "a device that steers a ship, aircraft, or spacecraft in place of a person."

You're reading entirely too much into that definition.

What do you think would happen to a Cessna 172 with an active autopilot, if another aircraft crossed its flight path at close range? Hint: most aircraft autopilots are even less "intelligent" than Tesla's.
 
You're reading entirely too much into that definition.

What do you think would happen to a Cessna 172 with an active autopilot, if another aircraft crossed its flight path at close range? Hint: most aircraft autopilots are even less "intelligent" than Tesla's.
And the reason pilots can take their hands off the wheel is that the sky is a fairly empty place until one gets near an airport where air traffic control personnel on the ground keep track of where airplanes are in relation to each other.

It's simply not Tesla's fault that drivers operate their cars in unsafe ways and against the explicit instructions the company provides, any more than it's a car manufacturer's fault when idiots drive drunk.
 
Yes, Tesla should rename the feature and re-advertise how to use it, but I also can't believe people just turn the feature on, kick back, and check out mentally. We simply aren't there yet, folks, regardless of what the ads imply, and only a naive fool would truly believe that we are, especially with the recent high-profile accidents making the news.

In order for this to work, human drivers need to be test-condition perfect in their driving habits.
 
That word does not mean what you think it means.
Merriam-Webster defines "Autopilot" as "a device that steers a ship, aircraft, or spacecraft in place of a person."


I'm pretty sure you are supposed to follow the vehicle's operating manual and not the dictionary.

And as mentioned already, autopilot has never meant in aviation that you can stop paying attention to what the aircraft is doing.

Here is an example of the limitations of Airbus autopilot controls:
Airbus Flight Control Laws

Btw, even some experienced pilots have problems understanding the limitations of their plane's autopilot (I can provide plenty of examples if required), so this issue will continue no matter what Tesla does.
 
I'm pretty sure you are supposed to follow the vehicle's operating manual and not the dictionary.
Pretty sure the vast majority of car owners have never read more than a dozen pages of their car's manual, and then only if they get a flat and need to find out where the spare is hidden (only to discover that there isn't one.)

And in a lawsuit for deceptive advertising, I'll give good odds that what Tesla says "Autopilot" means in the operating manual would not even be admitted by the court as evidence -- because it's irrelevant.
 
Pretty sure the vast majority of car owners have never read more than a dozen pages of their car's manual, and then only if they get a flat and need to find out where the spare is hidden (only to discover that there isn't one.)

How is the owner's failure to read the manual the fault of the manufacturer? Failure to read the manual places the blame for any avoidable accidents or malfunctions squarely on the shoulders of the owners. Or are we to protect people from themselves and their own stupidity, impatience, and/or misplaced sense of capability?


And in a lawsuit for deceptive advertising, I'll give good odds that what Tesla says "Autopilot" means in the operating manual would not even be admitted by the court as evidence -- because it's irrelevant.

You'd payout on every bet taken against that position. One of the fundamental functions of a trial court is to define words, terms, phrases, and clauses whose definitions are not in agreement between parties.

Using a term in a context different from any generally recognized use is not inherently illegal. Words used as marketing terms are frequently used thus. For example, when Microsoft Windows 1.0 was released, few people outside the computer industry had any knowledge of the use of the word "window" as something other than a hole in a wall, vehicle, etc.

Using a term in a way that is similar, but not necessarily identical, to common use would likewise not be illegal-- especially if the term were further defined in official literature, like a manual. Challenge to Tesla's use of the term "Autopilot" would be unlikely to come, let alone succeed, solely from semantics.

The likely challenge is in a harmed party claiming that they were misled by Tesla's use of the term. But again, such a challenge is unlikely to succeed if the term was (further) defined in literature the plaintiff had access to and-- importantly-- should have read. The manual would thus be submitted as evidence, and the court would accept it as such. Whether the judge or jury considered it credible or important is another matter, but irrelevant it would not be.

Further, if Tesla shows that their use of the term is not significantly different from an already accepted use-- e.g. aircraft autopilot-- then a claim that the plaintiff was misled becomes much less likely to succeed.

I've already demonstrated, above, that Merriam-Webster's definition of "autopilot"-- or at least your interpretation of that definition-- is at odds with the actual use of the word in the aircraft industry. Keep in mind that the word "autopilot" originated in the aircraft industry-- its meaning there is far more germane than a dictionary definition.
 
How is the owner's failure to read the manual the fault of the manufacturer? ....

One of the fundamental functions of a trial court is to define words, terms, phrases, and clauses whose definitions are not in agreement between parties. ...

Using a term in a context different from any generally recognized use is not inherently illegal.

I take it then that you've never actually been to law school?
You certainly have no clue how product liability or truth-in-advertising laws work, that's for sure.

And before you ask: yes, I am an attorney. After 30 years as an engineer, I switched fields.
Gray hair is a liability for an engineer, but it's an asset for an attorney.
 
I take it then that you've never actually been to law school?
You certainly have no clue how product liability or truth-in-advertising laws work, that's for sure.

And before you ask: yes, I am an attorney. After 30 years as an engineer, I switched fields.
Gray hair is a liability for an engineer, but it's an asset for an attorney.

I have not, but so far you've not shown relevance, addressed my points, or offered facts or solid reasoning to support your own positions. You've not even stated your own area of expertise, while calling mine into question. In short, you've done little more than pound on the table.
 
Don't know enough about the feature as I don't have a Tesla, but if it can't actually change lanes to avoid non-moving objects, then wtf is it for?

It is exactly for keeping the car inside the lane, keeping a following distance to the car in front, and changing lanes on driver request. That's the end of its features. Asking it to change lanes automatically is like blaming a blender for not putting things back together; it's the exact opposite of what it was made for. My car, or rather one of my company cars, has radar technology to keep following distance automatically. That's not foolproof either: maybe 1 out of 100 times it will not see the car right in front of it, mostly white vans. Yet I've never seen anyone wanting to ban that technology, nor the car.
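The "keep following distance" feature described here is typically implemented as a time-gap controller: the target distance grows with speed rather than being fixed. A minimal sketch of the idea (all parameter names and gains are hypothetical, not any manufacturer's actual tuning):

```python
# Minimal sketch of a time-gap-based adaptive cruise controller.
# Gains and parameters are illustrative assumptions only.

def desired_gap_m(ego_speed_mps: float, time_gap_s: float = 2.0,
                  standstill_gap_m: float = 5.0) -> float:
    """Target distance grows with speed: a fixed time gap plus a
    minimum standstill buffer."""
    return standstill_gap_m + ego_speed_mps * time_gap_s

def accel_command(gap_m: float, ego_speed_mps: float,
                  lead_speed_mps: float, k_gap: float = 0.2,
                  k_speed: float = 0.5) -> float:
    """Proportional controller: close the gap error and match the
    lead car's speed. Positive = accelerate, negative = brake."""
    gap_error = gap_m - desired_gap_m(ego_speed_mps)
    speed_error = lead_speed_mps - ego_speed_mps
    return k_gap * gap_error + k_speed * speed_error

# Too close (20 m at 30 m/s) and closing on a slower lead car:
# the command comes out negative, i.e. brake.
print(accel_command(gap_m=20.0, ego_speed_mps=30.0, lead_speed_mps=25.0))
```

Note that a controller like this only reacts to whatever target the sensor hands it; if the radar never reports the car ahead (the "white van" failure described above), no amount of control logic helps.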
 
Four simple words may end autonomous cars: self-driving car bomb.
It's not like there is a shortage of suicidal religious fanatics. The hard part is procuring enough explosives and making a functioning bomb without anyone noticing. Finding a dumb idiot to drive it into a crowd is the easy part. Not to mention you'll never get an autonomous car to drive up to a curb or into an area barred from traffic. Yet that's kind of a requirement for bombing.
 