Toyota Suspends Public Road Testing of Self-Driving Cars

DooKey

As a result of the March 18th crash in Arizona of an Uber self-driving car that killed a pedestrian, and the subsequent release of the video of the accident, Toyota has suspended all testing on public roads. They say they are giving their test drivers some time off and want to assess the Uber accident. As far as I'm concerned this is a good move, and I wish the other automakers would follow suit until we know what really happened.

Other automakers are still going ahead with their plans to keep testing self-driving cars. Ford and General Motors will continue to test their self-driving cars on public streets. BMW expressed sympathy for the victim of the collision but has said that this will not affect its plans to continue testing self-driving cars.
 
*sigh* Work on self-driving at freeway speeds, and human driving on city streets. Come on Toyota, you know all about hybrids!
 
Seriously? Toyota is the only one thinking about public safety for real?
Either they are:
1) that confident that their systems will not hit a person walking across the street,
or
2) really don't give two craps about potentially killing more people for "testing purposes". As famously said in Portal 2: "For Science!"

I know these companies have closed tracks and the capability to set up testing scenarios for exactly what happened in AZ.
Testing death cabs on public streets is probably something that should be taken seriously. Pedestrians are by far the biggest risk when these things are being tested; the smaller walking targets are soft objects and don't fare very well against a multi-ton vehicle. Kinda sad to see that they don't care to disclose how, and to what extent, they have tested these driverless cars for "seeing" walkers.
I do agree with the people saying that lady shouldn't have walked out into traffic, but I still maintain that the car should have seen her and attempted to brake; maybe at lower speeds she could have survived.
 
Because we all know that woman would have lived if it was a human driving the vehicle :rolleyes:

From NPR: https://www.npr.org/2017/03/30/522085503/2016-saw-a-record-increase-in-pedestrian-deaths
A report released today by the Governors Highway Safety Association shows that the number of pedestrians killed in traffic jumped 11 percent last year, to nearly 6,000. That's the biggest single-year increase in pedestrian fatalities ever, and the highest number in more than two decades.

How many pedestrians have been killed by self driving cars so far? 1?

People really need to put things into perspective.

This one event is not going to stop or even hinder self-driving technology. It's still coming quickly.
 
Because we all know that woman would have lived if it was a human driving the vehicle :rolleyes:

From NPR: https://www.npr.org/2017/03/30/522085503/2016-saw-a-record-increase-in-pedestrian-deaths


How many pedestrians have been killed by self driving cars so far? 1?

People really need to put things into perspective.

This one event is not going to stop or even hinder self-driving technology. It's still coming quickly.
You're taking skewed results. The limited number of driverless cars on the road is why very few pedestrians are being killed, and for good reason. Until there are 254 million driverless cars to compare against, claiming that 1 fatality is a low number is bad statistical analysis.
 
You're taking skewed results. The limited number of driverless cars on the road is why very few pedestrians are being killed, and for good reason. Until there are 254 million driverless cars to compare against, claiming that 1 fatality is a low number is bad statistical analysis.

This is being done on both sides though. The fact that there was "1", for some, is enough to kill off the whole thing in their minds. For an educated society, we still suffer from a human condition of ignoring data when it doesn't fit our narrative. I just find it more insulting that people think these companies are actively thinking "we don't care if people die"; it is about as disingenuous as you can get.
 
This is being done on both sides though. The fact that there was "1", for some, is enough to kill off the whole thing in their minds. For an educated society, we still suffer from a human condition of ignoring data when it doesn't fit our narrative. I just find it more insulting that people think these companies are actively thinking "we don't care if people die"; it is about as disingenuous as you can get.
It's really not, because they said they won't stop testing on the street even though a pedestrian was killed. This isn't a system that is ready; they are still testing it. Which means that even with flaws they don't care if it hits another person, or ten. They need to take responsibility and make sure the cars are equipped to "see" walkers in all conditions, that's all I'm saying.
It should pass close to 100% of the tests showing that it "sees" someone before it's tested on the street, IMO.
 
You're taking skewed results. The limited number of driverless cars on the road is why very few pedestrians are being killed, and for good reason. Until there are 254 million driverless cars to compare against, claiming that 1 fatality is a low number is bad statistical analysis.

Let's take WAYMO for example: 4 million miles driven on public roads and no pedestrians hit - https://www.theverge.com/2017/11/28...riving-autonomous-cars-public-roads-milestone

Result? 0 / 4 million = 0 pedestrian deaths per mile driven, or infinity if you want miles per pedestrian death :p


How about traffic accidents?

1 accident caused by WAYMO in 4 million miles

5,419,000 accidents in the US in 2010, so ballpark would be 5 million / 3 trillion (miles driven per year in US) ≈ 0.0000017 accidents / mile for human-operated vehicles
https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in_U.S._by_year
https://www.npr.org/sections/thetwo...ecord-number-of-miles-driven-in-u-s-last-year

1 autonomous accident / 4 million miles driven = 0.00000025 accidents / mile for autonomous vehicles (I am not adding the accidents caused by the human drivers here)

Which would make self-driving cars roughly 7 times safer to be in than human-driven vehicles.
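If anyone wants to sanity-check that arithmetic, here's a quick back-of-the-envelope Python sketch using the ballpark totals from the links above (rough figures, not authoritative data):

```python
# Back-of-the-envelope accident-rate comparison.
# All figures are the ballpark totals cited above, not authoritative data.

human_accidents = 5_419_000       # US traffic accidents in 2010 (rough total)
human_miles = 3_000_000_000_000   # ~3 trillion miles driven per year in the US

waymo_accidents = 1               # accident caused by WAYMO in its public-road testing
waymo_miles = 4_000_000           # WAYMO's public-road miles

human_rate = human_accidents / human_miles   # accidents per mile, human drivers
waymo_rate = waymo_accidents / waymo_miles   # accidents per mile, autonomous

print(f"human rate: {human_rate:.2e} accidents/mile")  # ~1.8e-06
print(f"waymo rate: {waymo_rate:.2e} accidents/mile")  # 2.5e-07
print(f"ratio:      {human_rate / waymo_rate:.1f}x")   # ~7.2x
```

Run it and you get roughly 1.8e-06 vs 2.5e-07 accidents per mile, i.e. a single-digit multiple, and that's before worrying about how comparable the two sets of miles even are.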
 
Let's take WAYMO for example: 4 million miles driven on public roads and no pedestrians hit - https://www.theverge.com/2017/11/28...riving-autonomous-cars-public-roads-milestone

Result? 0 / 4 million = 0


How about traffic accidents?

1 accident caused by WAYMO in 4 million miles

5,419,000 accidents in the US in 2010, so ballpark would be 5 million / 3 trillion (miles driven per year in US) ≈ 0.0000017 accidents / mile for human-operated vehicles

1 autonomous accident / 4 million miles driven = 0.00000025 accidents / mile for autonomous vehicles (I am not adding the accidents caused by the human drivers here)

Which would make self-driving cars roughly 7 times safer to be in than human-driven vehicles.
Where in the statistics does it say those miles were on public roads, at night, with people walking? It could say 100 million miles, but that doesn't give you the full picture. So why is this car different from any other? The car failed to even apply the brakes; why is that so hard to understand? The sensors are there to prevent this, and yet they didn't.
 
Where in the statistics does it say those miles were on public roads, at night, with people walking? It could say 100 million miles, but that doesn't give you the full picture. So why is this car different from any other? The car failed to even apply the brakes; why is that so hard to understand? The sensors are there to prevent this, and yet they didn't.

Will there be deaths? Yes

If the number who die is drastically lower than when humans were driving, is it better? Yes

If you're arguing that this one death means it's bad to allow autonomous driving, then you're indirectly arguing that humans killing more humans is fine and that robots killing fewer humans is bad.

Deep learning algorithms will get better over time, and rapidly. What you see today will be considered a very dumb version in a year's time. This is no longer linear; it is exponential.
 
Which would make self-driving cars roughly 7 times safer to be in than human-driven vehicles.

What are the rates for human drivers who spend most of their operating time below 35 mph and have a backup driver ready to take over if something goes wrong?
 
It's really not, because they said they won't stop testing on the street even though a pedestrian was killed. This isn't a system that is ready; they are still testing it. Which means that even with flaws they don't care if it hits another person, or ten. They need to take responsibility and make sure the cars are equipped to "see" walkers in all conditions, that's all I'm saying.
It should pass close to 100% of the tests showing that it "sees" someone before it's tested on the street, IMO.

Who defines the use cases and pass/fail criteria for "cars are equipped to see walkers in all conditions"? That is an untestable requirement as you stated it. What are "all conditions"? What is "near 100%"?

Mistakes happen. That is how we learn. The question is whether or not we are being "reckless" in pursuit of that knowledge. The amount of test time and investment sure indicates we aren't: over $100 billion and over 100 million miles. Do you know how many of the most common and advanced surgical techniques and drugs had less than stellar starts but are now considered "why weren't we doing it this way from the beginning"?
 
Lots of crosswalks near me are death traps. It feels like playing Frogger when crossing. They may as well not be there, as all they do is give the pedestrian a false sense of safety. I can see why people jaywalk when they don't feel safe in crosswalks or when crosswalks are stupidly far apart.

The worst-case scenario is usually what gets designed for in automotive. This unfortunate scenario seems close to exactly that.
 
You're taking skewed results. The limited number of driverless cars on the road is why very few pedestrians are being killed, and for good reason. Until there are 254 million driverless cars to compare against, claiming that 1 fatality is a low number is bad statistical analysis.
It's not the first time:
https://en.wikipedia.org/wiki/Mary_Ward_(scientist)
There was only one of that car on the road at the time. That didn't stop people from continuing to experiment with cars. There are a lot more than one self-driving car on the road now. If I knew how many, we could compare the stats; I don't, and it doesn't really matter. Unless this is like the Hindenburg accident (which seems unlikely), this technology will continue to develop.

Edit:
To clarify, it's bad whenever someone dies, but don't throw the baby out with the bathwater. Uber may have had a bad testing methodology, but if we discarded every technology because someone died in an accident involving it, we wouldn't even have achieved the Stone Age.
 
Where in the statistics does it say those miles were on public roads, at night, with people walking? It could say 100 million miles, but that doesn't give you the full picture. So why is this car different from any other? The car failed to even apply the brakes; why is that so hard to understand? The sensors are there to prevent this, and yet they didn't.

Let's ignore the fact that the pedestrian was walking in the pitch black, across what appears to be a highway. No lights, no reflectors; if this car had been driven by a human the result would have been the same. Because the car has sensors and other equipment, we are holding it to a higher standard! Suspending testing is idiotic. Fix the bug and move on...
 
IMO the only way a self-driving car should hit someone is in an impossible situation. Like a lot of people have said, the car never should have hit that person.
 
Who defines the use cases and pass/fail criteria for "cars are equipped to see walkers in all conditions"? That is an untestable requirement as you stated it. What are "all conditions"? What is "near 100%"?

Mistakes happen. That is how we learn. The question is whether or not we are being "reckless" in pursuit of that knowledge. The amount of test time and investment sure indicates we aren't: over $100 billion and over 100 million miles. Do you know how many of the most common and advanced surgical techniques and drugs had less than stellar starts but are now considered "why weren't we doing it this way from the beginning"?
"All conditions" is very simple once you come up with test cases. Weather, for one: did they think about how the vehicle is going to react in, say, rain, fog, snow, night, day, or mist, or any other environmental possibility? If you say they can't think of everything, I just came up with six off the top of my head, and yet a night drive kills a pedestrian crossing the street without the car even trying to brake. Pretty simple test case, I think: a closed track, using safety dummies like crash tests do, to make sure the car properly picks up the object with its sensors.
The failure is either with the sensors or with the software; either way, Uber (or whoever makes the driverless system) should be accountable as well. It honestly should scare the shit out of the car companies, because if this continues I see lawsuits in the near future for the negligence of putting faulty cars on the road......that is why they have recalls all the time: they find a problem and are made to fix it. This should be no different. In fact they should stop and make sure it doesn't happen again, not just say "it can't happen to our cars" and blow it off until it really does happen again.
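Just to show how cheap it is to sketch that kind of matrix, here's an illustrative Python snippet; the conditions, scenarios, and speeds are only my own made-up examples, not anyone's actual test plan:

```python
# Illustrative closed-track test matrix for pedestrian detection.
# Conditions, scenarios, and speeds are made-up examples, not a real test plan.
from itertools import product

weather = ["clear", "rain", "fog", "snow", "mist"]
lighting = ["day", "dusk", "night"]
pedestrian = ["in crosswalk", "jaywalking mid-block", "standing in lane"]
speed_mph = [25, 35, 45]

cases = list(product(weather, lighting, pedestrian, speed_mph))
print(f"{len(cases)} combinations from four obvious variables")  # 135

for w, l, p, v in cases[:3]:  # print a small sample of the matrix
    print(f"dummy test: {w}, {l}, pedestrian {p}, approach at {v} mph -> must detect and brake")
```

Even that trivial cross-product already contains "jaywalking mid-block at night", which is exactly the case that failed here.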
 
Let's ignore the fact that the pedestrian was walking in the pitch black, across what appears to be a highway. No lights, no reflectors; if this car had been driven by a human the result would have been the same. Because the car has sensors and other equipment, we are holding it to a higher standard! Suspending testing is idiotic. Fix the bug and move on...

You are ignoring the fact that the system should have seen the person in the pitch black. The driver may not have seen her, but the sensors should have. Have you used night-vision goggles? Thermal imaging? The tech is out there to be used, and this design failed to do what it was designed to do. The car struck her with the right side of the vehicle, which means she had been in front of the car from a good distance away, and yet the car never braked......

By no means am I saying "STOP DEVELOPING DRIVERLESS TECH!!!" Nope, I'm saying they should suspend testing on live streets until they figure out WTF happened and fix the bugs.
There is a saying that is completely true in the IT world: "we don't always test our code, but when we do, it's in production". Looks like that's the case here too, except this can actually kill people.
 
"All conditions" is very simple once you come up with test cases. Weather, for one: did they think about how the vehicle is going to react in, say, rain, fog, snow, night, day, or mist, or any other environmental possibility? If you say they can't think of everything, I just came up with six off the top of my head, and yet a night drive kills a pedestrian crossing the street without the car even trying to brake. Pretty simple test case, I think: a closed track, using safety dummies like crash tests do, to make sure the car properly picks up the object with its sensors.
The failure is either with the sensors or with the software; either way, Uber (or whoever makes the driverless system) should be accountable as well. It honestly should scare the shit out of the car companies, because if this continues I see lawsuits in the near future for the negligence of putting faulty cars on the road......that is why they have recalls all the time: they find a problem and are made to fix it. This should be no different. In fact they should stop and make sure it doesn't happen again, not just say "it can't happen to our cars" and blow it off until it really does happen again.

Ahh...so you want to test to specific conditions...not actual conditions? I mean, this needs to be 100% perfectly repeatable or it isn't a test. For example, do you realize how stupidly specific the crash tests are? Did it ever occur to you what would happen to these vehicles if you shifted the impact point a few inches to the left or to the right? Remember, the engineers know EXACTLY where the impact will occur and at what speed. This very logic is why facial recognition sucks; the AI studied for the "test" and passed the "test", and it can't see black people well at all and does a pretty shitty job on women as well. People put way too much faith in tests. It is a human failure in mental capacity.

Did UBER pull their fleet after the accident? Yes or no? By that very action they are acting responsibly. Ignoring the fact that the driver was in violation of the vehicle's operating requirements, and ignoring the fact that the pedestrian was violating traffic laws...it seems the response is to disproportionately attack an entity and not the person. But I guess that is easier, because blaming the human means you potentially need to blame yourself...and that is hard.
 
All Toyota needs to do is review the complacency of their drivers with the self-drive tech. Ford will just catch fire, no one buys GM, and the monkeys in the BMWs will be gassed and replaced.

At least those are the actual manufacturers, and not just someone outfitting a Chinese car that has apparently lost all its value as being pedestrian-safe in a collision. Uber shouldn't even be doing this, as self-driving shouldn't be for commercial ventures but rather for its end goal of driving those who otherwise can't, such as the elderly; that is where the life-saving comes from. The old stop crashing and can still get out.
 
Because we all know that woman would have lived if it was a human driving the vehicle :rolleyes:

From NPR: https://www.npr.org/2017/03/30/522085503/2016-saw-a-record-increase-in-pedestrian-deaths


How many pedestrians have been killed by self driving cars so far? 1?

People really need to put things into perspective.

This one event is not going to stop or even hinder self-driving technology. It's still coming quickly.

Your attitude is kinda appalling.

https://arstechnica.com/cars/2018/0...victim-came-from-the-shadows-dont-believe-it/
 
Ahh...so you want to test to specific conditions...not actual conditions? I mean, this needs to be 100% perfectly repeatable or it isn't a test. For example, do you realize how stupidly specific the crash tests are? Did it ever occur to you what would happen to these vehicles if you shifted the impact point a few inches to the left or to the right? Remember, the engineers know EXACTLY where the impact will occur and at what speed. This very logic is why facial recognition sucks; the AI studied for the "test" and passed the "test", and it can't see black people well at all and does a pretty shitty job on women as well. People put way too much faith in tests. It is a human failure in mental capacity.

Did UBER pull their fleet after the accident? Yes or no? By that very action they are acting responsibly. Ignoring the fact that the driver was in violation of the vehicle's operating requirements, and ignoring the fact that the pedestrian was violating traffic laws...it seems the response is to disproportionately attack an entity and not the person. But I guess that is easier, because blaming the human means you potentially need to blame yourself...and that is hard.

Look, man, all I'm saying is the whole thing is a mess; environmental conditions need to be tested, and times of day should also be included, BUT not on the street until everything they can think of has been tested on a closed circuit. Yes, some items may not be thought of, but night driving is one of the most obvious cases to consider.
It really looks like Uber is cutting corners. The system didn't see a person walking (key point 1), the person walking was dumb for crossing the street in front of a car at night (jaywalking), and the safety driver wasn't paying attention. They all need to be considered. Period.
The biggest question being why the car did not detect the moving object in the street.

Toyota is trying to be responsible and make sure it is not going to kill people needlessly while "testing". The rest seem to say "we don't care, we are just trying to get the work done before anyone else". Maybe they should take more caution, but then again we did test atomic bombs on live people for decades....I guess we never learn our lessons, do we?
 
Look, man, all I'm saying is the whole thing is a mess; environmental conditions need to be tested, and times of day should also be included, BUT not on the street until everything they can think of has been tested on a closed circuit. Yes, some items may not be thought of, but night driving is one of the most obvious cases to consider.
It really looks like Uber is cutting corners. The system didn't see a person walking (key point 1), the person walking was dumb for crossing the street in front of a car at night (jaywalking), and the safety driver wasn't paying attention. They all need to be considered. Period.
The biggest question being why the car did not detect the moving object in the street.

Toyota is trying to be responsible and make sure it is not going to kill people needlessly while "testing". The rest seem to say "we don't care, we are just trying to get the work done before anyone else". Maybe they should take more caution, but then again we did test atomic bombs on live people for decades....I guess we never learn our lessons, do we?

Okay...so how can you define this "whole thing is a mess"? Help me wrap my brain around that. You say "UBER is cutting corners"....but are you playing armchair autonomous-driving engineer? If we go that route, I can pretty much say everyone in every industry is cutting corners when it is convenient to my argument. On your statement though: you said they were cutting corners. What corners did UBER cut? Define them specifically. Or are you making an assumption they did? Can you point to leaked internal memos that say "hey boys, this algorithm isn't perfect but we need to release now to make our deadline"? Can you point to anything, even leaked on some random but at least somewhat respected news site? Again, if this industry were cutting serious corners, something would have leaked by now. It would be too damn juicy not to.

But more on point: this exact case of where this person got hit may have NEVER been in the test suite even if there was one. You can't say, with any reasonable certainty, that it would be there. Remember, tests are based upon being able to be repeated. Furthermore, tests usually come from existing use cases, which is why they are tested for. Will this data set now be part of the model for their AI? You can bet your ass it will. That is why I poke at the car impact tests: those tests come from statistical data from PREVIOUS ACCIDENTS. They didn't just appear from some group's brain. You think the bird-strike test on jet engines was thought about on day 1? In retrospect people will say "duh, you should have known that"...but that is a joke, since evidently we fundamentally missed it.
 
Let's take WAYMO for example: 4 million miles driven on public roads and no pedestrians hit - https://www.theverge.com/2017/11/28...riving-autonomous-cars-public-roads-milestone

Result? 0 / 4 million = 0 pedestrian deaths per mile driven, or infinity if you want miles per pedestrian death :p


How about traffic accidents?

1 accident caused by WAYMO in 4 million miles

5,419,000 accidents in the US in 2010, so ballpark would be 5 million / 3 trillion (miles driven per year in US) ≈ 0.0000017 accidents / mile for human-operated vehicles
https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in_U.S._by_year
https://www.npr.org/sections/thetwo...ecord-number-of-miles-driven-in-u-s-last-year

1 autonomous accident / 4 million miles driven = 0.00000025 accidents / mile for autonomous vehicles (I am not adding the accidents caused by the human drivers here)

Which would make self-driving cars roughly 7 times safer to be in than human-driven vehicles.

WAYMO has a much better record than Uber, and possibly better than humans. They're also pretty transparent and doing things above board. Their cars are very timid (except when trying to merge into the side of a bus) and frustrating to be around, but alright. I'm OK with them testing on the street, because I know they're trying to be safe, and I'm pretty sure they spent a lot of time on test tracks first. And we've seen the cars just kind of sit around doing nothing when they can't figure out what to do.

Uber moved to Arizona because they didn't feel like following California's rules about reporting, and AZ didn't care, so I haven't seen them in action, except for the reports of really aggressive rule-ignoring. I'd rather they figure their shit out on a test track.
 

How am I being appalling? Is it that I am taking a utilitarian stance where I am OK with a few deaths, asserting that many fewer will die as a result? It's not a glorious stance and I won't apologize for it either, as in this instance I do know - from the available data and first-hand experience - that it will save many, many more lives in the long run.

I am standing behind it by supporting, owning, and being driven by a Tesla.
 
The hypocrisy of some of you people just amazes me. If a car kills a member of your family, because of a failure in its systems, you are going to be fine with that. Sure, right. Uh-huh.

The attitude of some people just scares the crap out of me. Let's, voluntarily, put more people at risk! We can learn so much by needlessly killing people!

Anyone remember the Apollo 1 accident? The entire space program was shut down for a year while they studied the failure and corrected the problem. They did not put another person into a capsule just to be able to keep going and risk causing more deaths. That is irresponsible. It has nothing to do with advancing technology. You advance it by stopping what you are doing, studying it, correcting the flaws, then starting again.

The states allowing autonomous cars on the road need to shut it all down until an answer is found as to why the Uber car failed, in order to ensure a gross failure of this type does not happen again. Needlessly putting people at risk is just insane.

Toyota is doing the right thing.
 
The hypocrisy of some of you people just amazes me. If a car kills a member of your family, because of a failure in its systems, you are going to be fine with that. Sure, right. Uh-huh.

The attitude of some people just scares the crap out of me. Let's, voluntarily, put more people at risk! We can learn so much by needlessly killing people!

Anyone remember the Apollo 1 accident? The entire space program was shut down for a year while they studied the failure and corrected the problem. They did not put another person into a capsule just to be able to keep going and risk causing more deaths. That is irresponsible. It has nothing to do with advancing technology. You advance it by stopping what you are doing, studying it, correcting the flaws, then starting again.

The states allowing autonomous cars on the road need to shut it all down until an answer is found as to why the Uber car failed, in order to ensure a gross failure of this type does not happen again. Needlessly putting people at risk is just insane.

Toyota is doing the right thing.

I agree. There needs to be a pause to consider the problems. If it is the technology then fix it. If it is the company operating the technology, then maybe reconsider letting them use the technology.

As for a zero-accident requirement on self-driving autos, that is not practical. Imagine if we shut down all human-driven cars after the next accident in which a person was killed crossing a street outside a crosswalk. We would have to shut down our whole transportation system! Just recently a person was killed while trying to cross railroad tracks with a train approaching; should we stop all rail service? Going to the extreme in either direction is bad. Saying stop all self-driving vehicles over one accident is wrong, but saying do nothing about it is wrong too. Pause, evaluate, make corrections; this is what needs to be done.

I don't believe the technology is anywhere near ready for full deployment but it is getting better.
 
It seems to me that Uber was cutting corners; there were articles published going back years on how much of a mess their self-driving program was.

The problem here is that technology has surpassed regulation, because human drivers at least have to pass a test. What tests does autonomous driving software pass? Because the government officials granting them public road use sure as hell don't know shit about the technology or when it is safe.

And no, I'm not against self-driving cars; I'm against corporate rule, where all they have to say to get on the road is "nah, it's fine".

Because this is a mess: not detecting that pedestrian at all is really problematic, even if I'm about 99% sure that 99% of human drivers wouldn't have been able to react either.

Comparatively, Google's self-driving algorithms were detecting cyclists and pedestrians years ago that a human driver couldn't possibly notice. And I mean it would be literally impossible for a human driver to see them. It did that 3-4 years ago! So Uber's system is either seriously flawed or completely broken. And this only shows that we need an independent "driving test" for self-driving technologies before they're allowed on public roads.

The biggest problem here is that the ignorant naysayers will point back to this and say "we told you so", and will try to use this to stifle autonomous driving research and testing, like religious fundamentalists succeeded in stifling stem cell research, causing countless people to die unnecessarily for years. This is the same scenario, because autonomous driving software is already better than human drivers, and the sooner it is deployed on a mass scale, the sooner people will stop dying unnecessarily. That might sound weird right after this event, but the problem here again is not the technology; I know the technology is valid, as I understand the technology. So the problem again goes back to misuse of technology in service of corporate greed: pushing an unfinished system out onto the roads without the necessary controls.
 
Maybe I'm showing my age, but I was taught to use the crosswalk and look both ways.

Yeah, unfortunately it seems our generation failed to pass on basic concepts to the younger generation, like "don't walk in front of a car even if you think you have the right of way"; that, or the younger generation is just stupid and doesn't care. The number of people I watch nearly get hit in store parking lots because they don't look and just assume the driver is paying attention is astounding. Far too many people don't understand that "right of way" doesn't relieve you of the responsibility of paying attention to your surroundings.
 
Yeah, unfortunately it seems our generation failed to pass on basic concepts to the younger generation, like "don't walk in front of a car even if you think you have the right of way"; that, or the younger generation is just stupid and doesn't care. The number of people I watch nearly get hit in store parking lots because they don't look and just assume the driver is paying attention is astounding. Far too many people don't understand that "right of way" doesn't relieve you of the responsibility of paying attention to your surroundings.
I've even heard the "older generation" teaching bad habits as well; the fact that I've heard moms tell kids shit like "It's OK, we're pedestrians, they have to stop" more than once is a little disturbing.
 
Well, in most of the world that is not the USA, you can cross most streets wherever you like.

AVs had better be good at detecting pedestrians crossing the road wherever they cross; I think the crosswalk bit is just a red herring.
 
Has no one ever thought about this:

Uber's system is not representative of other companies' systems. It makes sense for Uber to pull its fleet, but not for others. The Apollo example is flawed: did other rocket programs come to a halt because of it? I highly doubt it. If a vaccine in development caused a death, would we suspend research on all vaccines from other companies?

Don't make this greater than what it actually is: a failure of Uber's self-driving system. It is NOT a failure of Ford's self-driving system, Tesla's system, or any other company's out there, and it shouldn't be extrapolated into a failure of self-driving tech as a whole. Toyota doing this is more of a publicity stunt than anything else. It's even right there in their statement: they want to "assess the situation."
 
Has no one ever thought about this:

Uber's system is not representative of other companies' systems. It makes sense for Uber to pull its fleet, but not for others. The Apollo example is flawed: did other rocket programs come to a halt because of it? I highly doubt it. If a vaccine in development caused a death, would we suspend research on all vaccines from other companies?

Don't make this greater than what it actually is: a failure of Uber's self-driving system. It is NOT a failure of Ford's self-driving system, Tesla's system, or any other company's out there, and it shouldn't be extrapolated into a failure of self-driving tech as a whole. Toyota doing this is more of a publicity stunt than anything else. It's even right there in their statement: they want to "assess the situation."

Are you certain Uber has not purchased any of its software from a third party? I know the hardware is all third party.

Suspending systems that are a direct threat to human life until this is resolved is the responsible thing to do.
 
Suspending systems that are a direct threat to human life until this is resolved is the responsible thing to do.
The problem is that won't ever be enough for people. I mean, Christ, look at this board in general, even before this incident: screaming rabble rabble rabble without knowing the code that's involved. Let's say they find out exactly why the car didn't slow down and update the code so that SPECIFIC incident won't happen again (more resolution on their laser rangefinder or something). People are still going to say the same thing: "It's unsafe."
 
Are you certain Uber has not purchased any of its software from a third party? I know the hardware is all third party.

Suspending systems that are a direct threat to human life until this is resolved is the responsible thing to do.

Even if the software was third-party, it would be exclusive to Uber. If another company were using the same software to test on their cars, it would be a collaboration that, at least 90% of the time, would be made public. The fact that it isn't almost certainly rules out other self-driving projects using the same software.
 