Man Says Tesla Car Started And Crashed On Its Own

HardOCP News

Okay, let's just say this guy is lying and we totally believe Tesla's version of events. That would mean that, best case scenario, this is an example of Tesla's self-parking feature in action. Nailed it (literally)! :D

“Tesla has reviewed the vehicle’s logs, which show that the incident occurred as a result of the driver not being properly attentive to the vehicle’s surroundings while using the Summon feature or maintaining responsibility for safely controlling the vehicle at all times,” the letter signed by a regional service manager read. The letter said the Summon feature — which enables a Tesla vehicle to park itself, among other functions — “was initiated by a double-press of the gear selector stalk button, shifting from Drive to Park and requesting Summon activation.”
 
He wanted to see if it would work in that scenario and didn't realize it wouldn't.
He set the car up in that scenario, and I bet he just let it happen before he reacted.
 
Summon mode is in beta; the documentation mentions that it can't see things hanging from above and explicitly says to use it on private property. So basically the owner failed in multiple ways.


Pretty much this, lol. Also, I like the fact that they made sure to include the child part at the end of it. Yeah, sorry, but that's not going to work as an excuse if you bothered to pay attention to how the system works...
 
Car was probably tired of being a slave to its owner & decided the best course of action was to destroy itself.

Basically like the Maximum Overdrive movie, but the Hipster version.
 
Steve, what kind of lame-ass title is this? How about "Skynet is Gonna Kill Us ALL: Tesla Edition"?
 
Autonomous automobiles are perfect for people who take no personal responsibility and blame everything bad on something else.
 
That guy claims that he witnessed it, but he also says he was inside a store at the time and came out to find the car already crashed. My bullshit meter is reading over 9000.
 
Reminds me of the guy who drove his Toyota truck into a dealership (literally through the front window) then claimed the truck accelerated on its own and wanted a refund for the truck.
 
While it appears likely this was mostly a stupid human trick, there are some things to be afraid of.
1. Was this feature on the car and explained when he bought it or was it a mandatory automatic update that occurred after he purchased the car?
2. Tesla is claiming they are not responsible in part because he had to agree to a TOS/EULA on the screen after the car was purchased.
3. I wonder how effective those sensors are at picking up something like an elk standing in the road. The main body of an elk is about the same height as the stuff the car ran into. Or does the Summon feature use a subset of the sensor suite?
 
Since the feature is in beta, it sounds like the kind of thing you need to specifically opt in to, and therefore be aware of its early status and limitations.

I don't think it would run into an elk, because elk legs are similar enough to human legs (or even whole human bodies), and I'm sure avoiding humans is the number one visual recognition priority. They did say that in this mode it has trouble seeing overhanging things and bicycles, though, if the guy had bothered to read the warnings...

Also, from the sounds of this, if the data says the crash occurred 3 seconds after Summon was activated, then the driver was sitting in the seat when it happened, since it's activated from the gear selector button. What are the chances that the data being sent from the car is actually wrong?

Even in a case like this, where it seems very clear that it was the driver's fault, I wouldn't put it past the US legal system to take the driver's side. That's why there are so many ridiculous rules and warnings these days: it seems people and companies can be sued for just about anything.
 
If anything, I believe that truck is loaded illegally.

In VA, much shorter overhangs require big ass flags on the back.
 
If you watch the video, you can see that that is the front of the trailer, where a semi would back up and attach to it. The Tesla driver apparently parked on the wrong side of the road.
 
All of Tesla's self-driving features require a driver to be there in the car, watching. This is for a reason: they're not very good. Tesla is definitely at least partly at fault here, because their advertising is totally out of whack with how well the features actually work.
 
This is not just about Tesla; it's a growing problem with all the technology integrated into today's vehicles, especially remote access. We have already seen a proof of concept of a vehicle being hijacked remotely while on the road. We need to reach a point where we just say, "STOP! THAT IS ENOUGH!"
 
If anything, I believe that truck is loaded illegally.

In VA, much shorter overhangs require big ass flags on the back.
It was a parked trailer with no truck attached. There were two of them there, probably waiting to be offloaded.
 
Isn't anyone concerned that a production car has BETA software?
 
And that's not all.

It also sucked down the entire contents of a bottle of scotch that was stored in the glove compartment!
And there wasn't any air at all in the spare tire, just useless white powder!

Don't buy a Tesla unless you like getting screwed.
 
Isn't anyone concerned that a production car has BETA software?

You opt in to the beta features to help test them. Like anything else, the best way to actually work out the issues is to put it out there.
 
OK, I have to make the joke about the Knight Rider 2000 movie, where the car says hitting the deer at the speed it was going would do less damage to the car than risking a swerve... clearly the AI decided to play chicken with a parked truck and lost... much like the ship and the lighthouse...
Seriously, it sounds more like the guy said "hey, watch this"... thirty seconds later it crashes into the truck, earning him a moving violation if he reports it to his car insurance, so he blames it on the shiny new auto-drive feature, which with no one at the wheel is sure to work great... I like Teslas, but I would not want a car that can drive itself. There are plenty of times I had to pull over to the side of the road, or at a rest stop, and sleep for twenty minutes so I could get where I was going that day instead of stopping at a motel and sleeping in, with my boss asking why I was two days late... I would shrug and say sorry, I had trouble getting out of bed. Really, does the car have a driver's license? Can it pass a written and practical driving test? Can it decide to crash rather than run over a deer, or worse, a kid?
 
It's quite critical if it can crash the car.

I wouldn't think of this in the same way as beta software you run on your computer. Think of it more as version 1. They have created a feature: self-driving. They can test it all they want in the lab, but the real test comes once you get people using it in the real world. So you give interested owners the option to enable "test" features that you have already tested and now want real-world drivers to exercise. This lets you see whether the average person does things your testers didn't.
 

This is not relevant to the original statement nor my reply.
 