Another Tesla Autopilot Crash Reported

HardOCP News

Tesla needs to just remove the Autopilot mode until it is completely finished and people aren't such morons. If it were up to me, I wouldn't even bother with autopilot until people are completely removed from the equation.

In his crash report, Vukovich stated that Scaglione's car was traveling east near mile marker 160, about 5 p.m. when it hit a guard rail "off the right side of the roadway. It then crossed over the eastbound lanes and hit the concrete median." After that, the Tesla Model X rolled onto its roof and came to rest in the middle eastbound lane.
 
Tesla needs to just remove the Autopilot mode until it is completely finished and people aren't such morons...

Steve, you sure you mean "and", and not "or"? :p
A might happen, but B may never be.

Anyway, this:
Tesla says that before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time.

This comment from the manufacturer sounds great as long as you make the assumption that the Autopilot feature isn't going to surprise the driver by creating a bad situation. It's one thing to assume that the driver has to be ready to take over if he sees a situation developing, it's a whole different thing to expect that driver to take over when his car suddenly veers off the road and into a guard rail. What, was it trying not to hit a rabbit?
 
I think they are trying this a little too soon. It would seem something that demands this much trust needs YEARS of testing in multiple controlled environments.
 
Tesla is to blame here, not for a system that isn't foolproof, but for a misleading name and for poorly educating their customers about what "autopilot" really is. It's a driving aid like cruise control, not a completely autonomous driving system.
 
That is sort of like saying we should never put a person in a rocket until we are 100% sure the rocket will never blow up... you cannot remove risk, and particularly in this case Tesla needs the data from people using it to hone the algorithms and help them deal with the crazy number of corner cases that exist in daily driving.
 
At least this time the car stopped and did not keep going.


I've read comments that the name is killing people, Tesla wanted to use "extremely dangerous beta cruise control, possibly deadly" but it would not fit on the autopilot button.
 
Guy was sleeping, someone pulled up next to him and honked their horn, and he freaked and grabbed the steering wheel. Or something similar. Could be anything, even an actual software glitch. But if it was driver error, I'm 99% sure the driver would swear it wasn't.
 
I think they are trying this a little too soon. It would seem something that demands this much trust needs YEARS of testing in multiple controlled environments.
Tesla owners are the beta testers; I'm fine with that.
 
for a misleading name and for poorly educating their customers about what "autopilot" really is. It's a driving aid like cruise control
Well yeah, isn't that what an autopilot is supposed to do?
An autopilot system on an aircraft can't do everything (taxi and takeoff), and at least one pilot has to be monitoring the flight.
Tesla says that before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time.
Personally I think a semi-autonomous car isn't the best idea, but I'm not the one selling it.
 
Trouble is, it isn't just the Tesla owners who are the beta testers. Every driver, passenger and pedestrian near a beta Tesla is also a beta tester, and they DIDN'T agree to any EULA or TOS or Disclaimer.

Just thank Christ that Bethesda's not in charge of self driving cars.
 
Well yeah, isn't that what an autopilot is supposed to do?
An autopilot system on an aircraft can't do everything (taxi and takeoff), and at least one pilot has to be monitoring the flight.
That depends on the system. Some of the most advanced autopilot systems in planes can basically take off and land the plane. However, once in the air with pretty much any aircraft autopilot system, it's basically hands off while the pilot "monitors" the system. You don't really have an issue with staying in your "lane", avoiding other planes, etc. like you do in a car. It's a much simpler problem to solve technically: the system just has to maintain heading and altitude, and adjust those at different waypoints. It's very different from what an equivalent system in a car would have to do.

The Tesla autopilot is the worst possible kind of semi-autonomous driving system. It seems fully autonomous to the driver most of the time, lulling the driver into a false sense of security or reliance. But when it gets itself into trouble, it disengages and puts the driver, who likely hasn't been paying close attention, back in control with insufficient time to properly assess the situation and take appropriate action.
 
That is sort of like saying we should never put a person in a rocket until we are 100% sure the rocket will never blow up... you cannot remove risk, and particularly in this case Tesla needs the data from people using it to hone the algorithms and help them deal with the crazy number of corner cases that exist in daily driving.


Personally, I would be willing to die to advance the research of something that required a rocket to get there. A commercial car to take me to yoga class? Not so much.
 
Just thank Christ that Bethesda's not in charge of self driving cars.

I rage quit Fallout 3 because, after several complications, I finally got stuck on two polygons overlapping while walking. Just said fuck it and never went back... didn't even get to blow up Atomic Town.
 
I completely agree that it is a horribly flawed system, and has no place in a production vehicle. I just wanted to nitpick your use of autopilot :) (I feel it is an autopilot system)
 
Thousands of people die every year in normal old manually driven cars: it's an acceptable statistic.

A handful of Tesla vehicles are crashing and being proven to be human error each time: MOTHER OF GOD IT'S SUCH A BAD IDEA, WE'RE ALL GOING TO DIE NOW
 
Thousands of people die every year in normal old manually driven cars: it's an acceptable statistic.

A handful of Tesla vehicles are crashing and being proven to be human error each time: MOTHER OF GOD IT'S SUCH A BAD IDEA, WE'RE ALL GOING TO DIE NOW
I get what you mean, but how many regular cars are there, and how many Tesla cars? I mean, if there are 1,000 deaths from 1,000,000,000 drivers, that isn't much. On the other hand, if it's 10 drivers out of 10,000, that stands out a bit.
 
I get what you mean, but how many regular cars are there, and how many Tesla cars? I mean, if there are 1,000 deaths from 1,000,000,000 drivers, that isn't much. On the other hand, if it's 10 drivers out of 10,000, that stands out a bit.

I drove by two collisions this morning. On one highway. In the span of one hour. In one city.

Last I checked, not every collision results in a fatality; I would guess not even 2% of collisions are fatal. So multiply the thousands of people dying every year by 50 or whatever to get the actual number of total collisions, fatal or non-fatal, then have a look at the number of Tesla crashes per Tesla vehicle that have been proven to be caused by the autopilot.
 
Thousands of people die every year in normal old manually driven cars: it's an acceptable statistic.

A handful of Tesla vehicles are crashing and being proven to be human error each time: MOTHER OF GOD IT'S SUCH A BAD IDEA, WE'RE ALL GOING TO DIE NOW

Yup, it's the same reactionary outrage as when one caught fire. The percentage of Tesla vehicles doing this vs standard cars was orders of magnitude smaller, but hey, what would the internet be without easy clickbait. :rolleyes:
 
multiply the thousands of people dying every year by 50 or whatever to get the actual number of total collisions, fatal or non-fatal, then have a look at the number of Tesla crashes per Tesla vehicle that have been proven to be caused by the autopilot.

That really is the question. What is the accident rate for self-driving vs the accident rate for people? If the self-driving rate is lower, it's still advantageous to use.

Many years ago I worked for a document imaging company. The largest customer ran into a bug that caused documents to occasionally disappear from the system instead of being filed (they conducted random audits of the documents filed and noticed the problem).
However, we lucked out, since even with this bug causing document losses, the number of lost documents was several times smaller than when they were manually filing the papers. :p
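To make the rate argument above concrete, here's a quick back-of-the-envelope sketch. All the numbers are invented for illustration; the point is only that crashes per vehicle, not raw crash counts, is the fair comparison:

```python
# Hypothetical fleet sizes and fatal-crash counts, purely for illustration.
human_fleet = 250_000_000   # conventional cars on the road
human_fatal = 35_000        # fatal crashes per year in that fleet

tesla_fleet = 100_000       # Autopilot-equipped cars (made-up number)
tesla_fatal = 10            # fatal crashes per year in that fleet (made-up)

# Fatalities per vehicle per year for each fleet.
human_rate = human_fatal / human_fleet
tesla_rate = tesla_fatal / tesla_fleet

print(f"human fleet: {human_rate:.6f} fatal crashes per vehicle/yr")
print(f"tesla fleet: {tesla_rate:.6f} fatal crashes per vehicle/yr")

# With these made-up numbers the two rates end up in the same ballpark,
# which is exactly why a handful of headline crashes proves nothing
# either way until you normalize by fleet size.
```

Swap in real fleet and crash figures and the same two divisions settle the argument one way or the other.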
 
They need to stop calling it "autopilot".

All it's doing is causing morons to disengage from the actual act of driving.

On a plane, an autopilot system doesn't have to worry about thousands of other planes in its general proximity, landmarks, etc. And even then, if the pilot walks away, there are other crew members in the cockpit monitoring the systems just in case.

They should rebrand it as "driver assist", or something to that effect. Because continuing to push it as an actual "autopilot" feature is just going to encourage more imbeciles to do something that could get them killed.
 
I drove by two collisions this morning. On one highway. In the span of one hour. In one city.

Last I checked, not every collision results in a fatality; I would guess not even 2% of collisions are fatal. So multiply the thousands of people dying every year by 50 or whatever to get the actual number of total collisions, fatal or non-fatal, then have a look at the number of Tesla crashes per Tesla vehicle that have been proven to be caused by the autopilot.
This whole autopilot thing... didn't they say it's not street ready? I thought that was the case the last time it was brought up. So I don't think it's fair to think of it as something people should rely on, which these people seemed to do.
 
I would never let my car drive itself, but the part where I can summon the car to come and get me in the parking lot? That there is priceless.
 
They need to stop calling it "autopilot".

All it's doing is causing morons to disengage from the actual act of driving.

On a plane, an autopilot system doesn't have to worry about thousands of other planes in its general proximity, landmarks, etc. And even then, if the pilot walks away, there are other crew members in the cockpit monitoring the systems just in case.

They should rebrand it as "driver assist", or something to that effect. Because continuing to push it as an actual "autopilot" feature is just going to encourage more imbeciles to do something that could get them killed.
I agree. Never underestimate human stupidity while driving.
 
I think they are trying this a little too soon. It would seem something that demands this much trust needs YEARS of testing in multiple controlled environments.

Problem is that only real-life testing works; a controlled environment is a joke when it comes to testing stuff like this. There is no test yet that can cover all the random things that happen in real life, and if there were, it would probably be a really long test.
 
No, I say leave autopilot in. This will weed out stupid rich people and only leave smart rich people.

The problem with that line of thinking is that the person inside the Tesla or whatever autonomous vehicle will more than likely not be the only person that sustains injuries or potentially dies.

If dumb rich people want to off themselves, I don't give a fuck, but when they choose a method to do such things and other people suffer for their stupidity, that's when I've got a major problem, as I would expect anyone to.

"Smart rich people..." is practically an oxymoron nowadays. :)
 
A semi-autopilot is just a plain stupid idea. The way Tesla thinks it works (at least pretending to for legal reasons) is you becoming the AI's driving instructor, constantly monitoring traffic and environment and evaluating the AI actions' correctness, always ready to jump in to correct it. That's horribly stressful.

I haven't had the privilege of driving one yet, but I'd imagine it doesn't work like a regular assistance system either. It does the steering and any force *I* apply to the steering wheel will disengage the autopilot, you know, user input overrides the AI. So I'd have to rest my hands on the steering wheel without force. Braking is worse. At which point do I come in and apply the brakes? By the time I realize the Tesla hadn't correctly identified the stopped car in front of me it may very well be too late for my manual correction.

Don't get me wrong, I believe in the autonomous car, I want one yesterday. And I'm willing to believe that Tesla's autopilot - and similar systems by other manufacturers - are probably safer right now than human drivers. But a semi-autopilot is just a stupid approach, even though I know it's just for legal reasons...
 
They need to stop calling it "autopilot".

All it's doing is causing morons to disengage from the actual act of driving.

On a plane, an autopilot system doesn't have to worry about thousands of other planes in it's general proximity, landmarks, etc. And, even then, if the pilot walks away, there are other crew members in the cockpit monitoring the systems just in case.

They should rebrand it as "driver assist", or something to that effect. Because continuing to push it as an actual "autopilot" feature is just going to encourage more imbeciles to do something that could get them killed.
GM has a similar system that's coming out in 2017 on the CT6. They call it SuperCruise, which is a much better name than Autopilot. People are used to the idea that they still have to pay attention when using cruise control.
 
Yup, its the same reactionary outrage when one caught into flames. The percentage of Tesla vehicles doing this vs standard cars was magnitudes smaller, but hey, what would the internet be without easy clickbait. :rolleyes:
You mean when Musk compared the record of a 6-month-old car to cars that have been on the road for up to 20 or more years and are victims of arson? OK...
 
"Smart rich people..." is practically an oxymoron nowadays. :)
With the exception of celebrities, athletes, and old money, most rich people didn't get rich without being smart. Of course, I'd argue most people buying Teslas are the wannabe rich who want to impress others with their fancy environmentally friendly car.
 
A semi-autopilot is just a plain stupid idea. The way Tesla thinks it works (at least pretending to for legal reasons) is you becoming the AI's driving instructor, constantly monitoring traffic and environment and evaluating the AI actions' correctness, always ready to jump in to correct it. That's horribly stressful.

I haven't had the privilege of driving one yet, but I'd imagine it doesn't work like a regular assistance system either. It does the steering and any force *I* apply to the steering wheel will disengage the autopilot, you know, user input overrides the AI. So I'd have to rest my hands on the steering wheel without force. Braking is worse. At which point do I come in and apply the brakes? By the time I realize the Tesla hadn't correctly identified the stopped car in front of me it may very well be too late for my manual correction.

Don't get me wrong, I believe in the autonomous car, I want one yesterday. And I'm willing to believe that Tesla's autopilot - and similar systems by other manufacturers - are probably safer right now than human drivers. But a semi-autopilot is just a stupid approach, even though I know it's just for legal reasons...

It's a retarded marketing scheme meant to sell stock.


Yeah, let me turn on autopilot yet keep both hands on the wheel and stay alert; that's logical. The human brain doesn't work that way. Based on their instructions, you're better off just driving the thing yourself. They knew good and damn well people weren't going to follow the rules.
 
Tesla doesn't need to change anything. People were having accidents long before Tesla ever implemented this glorified cruise control system, and they'll keep having them until the last human driver is taken off the road.
 
I'm sure there is code in the Tesla that reads:

IF AutoPilotMode AND DriverAsleep THEN
    Execute BumperCarMode("ForReal")
ENDIF
 
Tesla doesn't need to change anything. People were having accidents long before Tesla ever implemented this glorified cruise control system, and they'll keep having them until the last human driver is taken off the road.
And people have been successfully blaming and suing auto manufacturers just as long and will continue to do so.
 