Nvidia Halts Self-Driving Car Testing Globally

rgMekanic

[H]ard|News
Joined
May 13, 2013
Messages
6,943
Despite the fact that Jensen just spent quite a bit of time showing off their self-driving car technology at GDC, Nvidia has announced that it is suspending self-driving car tests globally. The news comes a week after a self-driving car operated by Uber struck and killed a pedestrian in Arizona. Additionally, the NTSB has opened a field investigation of a fatal crash last week involving a Tesla, though it is unclear at this time whether the car was being driven by its automated-control system.

While this decision has caused NVIDIA's stock price to drop quite sharply, I completely agree with it. While I know many are excited about self-driving car technology, it is my opinion, especially in light of the recent crashes, that the technology is not far enough along to be tested on public roadways.

Around 320 firms involved in self-driving cars - software developers, automakers and their suppliers, sensor and mapping companies - use the Nvidia Drive platform, according to the company's website.
 
that the technology is not far enough along to be tested on public roadways
You can't put all of them under the same umbrella. Uber's system is flawed, it seems, but Google's system was far better than this four years ago. When one F-150 crashes, you don't pull all pickup trucks from the road.

Not that NVIDIA had that many public road tests going on with self-driving cars; they were still working on closed tracks, AFAIK. So this seems to me more like a PR stunt to ride the wave of attention.
 
You can't put all of them under the same umbrella. Uber's system is flawed, it seems, but Google's system was far better than this four years ago. When one F-150 crashes, you don't pull all pickup trucks from the road.

Not that NVIDIA had that many public road tests going on with self-driving cars; they were still working on closed tracks, AFAIK. So this seems to me more like a PR stunt to ride the wave of attention.

You mean this Uber system is flawed?

https://nvidianews.nvidia.com/news/uber-selects-nvidia-technology-to-power-its-self-driving-fleets
 
Uber used bits and pieces from here and there; the whole system was a Frankenstein's monster, from what I've recently read about it.

NVIDIA isn't providing the whole system, just some components. There are startups here in my country as well using NVIDIA technology, but their systems likely have almost nothing in common with Uber's.
 
I'd say their tech is a lot further along than most 16-year-olds that we let drive on our roadways.

I've got no problem with self-driving tech being on the road. But I also think the driver of the vehicle is liable, regardless of whether they have the autopilot engaged or not. That doesn't mean the AI/tech provider doesn't need to provide safe systems, just that flipping on the autopilot shouldn't absolve the driver of the ultimate responsibility of ensuring their vehicle is operating in a safe manner.
 
In the long term, suspending testing of autonomous technology until the current issues are investigated and resolved is the responsible thing to do.
 
I mean, it was bound to happen eventually. They could test this for 100 years and there's a pretty good chance it would still occasionally kill a pedestrian. I just don't get knee-jerk reactions like this. They'd better NEVER enter the self-driving space again.
 
I mean, it was bound to happen eventually. They could test this for 100 years and there's a pretty good chance it would still occasionally kill a pedestrian. I just don't get knee-jerk reactions like this. They'd better NEVER enter the self-driving space again.

The "bound to happen" excuse doesn't really work with emergent technology to me.

"Hey, this new drug we started testing last week has killed 2 people, but over 100 years, a couple people were bound to die."
 
The "bound to happen" excuse doesn't really work with emergent technology to me.

"Hey, this new drug we started testing last week has killed 2 people, but over 100 years, a couple people were bound to die."
The point I was making is that they could test it in a closed environment for 100 years, and when they started road tests, EVEN THEN it would probably kill someone. No amount of testing would be able to perfect it, and the drug comparison is bad because drugs having unforeseen side effects that kill people happens pretty regularly. Expecting these things to be flawless is unreasonable.
 
The backup driver may as well just drive. The problem is you're going to get sleepy or incredibly bored just looking out the window. You can try this now: just be a passenger with someone. Never talk to the driver, never use your phone.
So if I were a backup driver, I'd rather just be driving instead, because the boredom would make me quit for a different job.
 
Hope it's not that math error causing havoc again, because I saw that video and that car did not even hesitate.
 
It's called "don't have a convicted felon sit on their cell phone while being the safety driver". How hard is that?
 
I'd say their tech is a lot further along than most 16-year-olds that we let drive on our roadways.

I've got no problem with self-driving tech being on the road. But I also think the driver of the vehicle is liable, regardless of whether they have the autopilot engaged or not. That doesn't mean the AI/tech provider doesn't need to provide safe systems, just that flipping on the autopilot shouldn't absolve the driver of the ultimate responsibility of ensuring their vehicle is operating in a safe manner.

You can drive at 14 here. I definitely have anxiety when I see a pair of hands on a steering wheel in my rearview mirror and just the top of a head.
 
I mean, it was bound to happen eventually. They could test this for 100 years and there's a pretty good chance it would still occasionally kill a pedestrian. I just don't get knee-jerk reactions like this. They'd better NEVER enter the self-driving space again.

There is "bound to happen," and then there is failing the basic demo test for any car-assist technology, let alone autonomous operation. When your so-called AV can't handle a test case that *SHIPPING* cars can handle....
 
I can guarantee you that this will never work. The only solution is to change the laws so that "the unexpected" is allowed to "die," and I don't want to see that. Show me the "perfect" autonomous vehicle and I'll show you it killing someone. It's that simple and that easy. It boggles my mind that anyone couldn't see that.

Show me a robot surgeon (they exist) and I'll show you someone who died at its hands.

When a person inadvertently kills another person, we call it "involuntary manslaughter." Until you can give machines the same sort of "thing" - which you can't - this sort of fails. IMHO, make the CEO of the company behind the vehicle take the rap and the potential lawsuits. :)

The bad part is that you can "trick" these cars into doing a lot of evil. They aren't human, and what they call "AI" really isn't (by human standards). We need a robocar (like RoboCop) and not a killercar (ED-209). Anyone want to volunteer for the robocar project? (Just like with RoboCop, it needs to be one of the "best" drivers out there.)
 
Since Nvidia clearly has the best current solution available, I would hate to think of the next best using this temporary halt to go out and start wantonly killing pedestrians.
 
I was about to say it sounds like a bit of a knee-jerk/marketing-based reaction, but if you think about it, they are overhyping a shit product they don't trust and are now pulling it.
 
Since Nvidia clearly has the best current solution available, I would hate to think of the next best using this temporary halt to go out and start wantonly killing pedestrians.

Eh? Where are you getting this BS from? Nvidia doesn't have shit. They are a hardware vendor with no particular advantage with respect to any current AV project. Their software is WAY WAY WAY behind the market leaders.
 
It had to happen eventually, and I knew it would be just like this. I remember thinking of something like this happening back around, I'd say, 2000 or so, when people were into saying the future will have cars that drive themselves! :O
 
Maybe the NVidia cars are possessed like in that really bad movie Killdozer.
 
Eh? Where are you getting this BS from? Nvidia doesn't have shit. They are a hardware vendor with no particular advantage with respect to any current AV project. Their software is WAY WAY WAY behind the market leaders.

If you know of another hardware solution for self-driving vehicles that is currently more powerful than Nvidia's, please enlighten us. If you don't think they have an advantage in the AV field, you're sorely misguided. Their partner list is a who's who of tomorrow's vehicles (Toyota, Mercedes, Audi, Tesla, Volvo, Volkswagen, etc.).
 
If you know of another hardware solution for self-driving vehicles that is currently more powerful than Nvidia's, please enlighten us. If you don't think they have an advantage in the AV field, you're sorely misguided. Their partner list is a who's who of tomorrow's vehicles (Toyota, Mercedes, Audi, Tesla, Volvo, Volkswagen, etc.).

They have hardware; they don't have an AV solution. The compute hardware is quite honestly the easy part and can be had off the shelf from multiple vendors. From all available info, neither of the two leaders in AV development is using Nvidia Drive hardware. In fact, the current consensus leader is using an Intel platform. GM hasn't detailed its underlying compute hardware from what I've seen.
 
Gawd, while it's a tragedy, and it should be investigated, and consequences should be meted out.... if we as a society were this pantywaisted over everything, we wouldn't have rockets, airliners, cars, trains, bridges, tunnels, fuck - you name SOMETHING that we've built that people live in, use, ride around in, or walk over that HASN'T killed someone.....

If I remember right, we just had a pedestrian bridge collapse while under construction and kill several people - we better not build any more of THOSE either!

BB
 
I've been saying this for a while now: you can get self-driving tech to handle 99% of cases easily enough, but that last 1% will forever make it unviable.

Look, I'm a software engineer by trade, and I understand many of the issues that are going to show up in the software. I see road construction and wonder, "Will the software be smart enough to stay between the cones and ignore the lane markers, or will it follow the lane markers and run into a construction worker?" And don't even get me started on non-standard signage or lane markings.

Short of a federal law that makes all signage and lane markings consistent across all states, right down to how visible (brightness/thickness) they are, you are never going to have self-driving tech be seen as reliable enough for the population at large to trust it. [Disregard the fact that the software will still be better than human drivers; perception matters more than reality when it comes to consumers.]
 
Gawd, while it's a tragedy, and it should be investigated, and consequences should be meted out.... if we as a society were this pantywaisted over everything, we wouldn't have rockets, airliners, cars, trains, bridges, tunnels, fuck - you name SOMETHING that we've built that people live in, use, ride around in, or walk over that HASN'T killed someone.....

If I remember right, we just had a pedestrian bridge collapse while under construction and kill several people - we better not build any more of THOSE either!

BB

Ok, I'll bite. I'll go with NASA, as I have pretty good knowledge of that area.

We lost Apollo 1, Challenger, and Columbia. Almost lost Apollo 13 as well.

In every case, when something did not meet expectations, everything was stopped. Before we attempted to put any more human life at risk, the cause was investigated. If it could be corrected, it was, and once it was we resumed operations. It was the responsible thing to do.

Now, if the deaths were directly caused by human error, then there is not much that can be done about it (cars, trains, airliners....). Better education is about the extent of what can be done.

It is not "pantywaisted" to act in a responsible manner. It is completely irresponsible to not pause and investigate. Fix the problem, if possible, then resume.
 
The "bound to happen" excuse doesn't really work with emergent technology to me.

"Hey, this new drug we started testing last week has killed 2 people, but over 100 years, a couple people were bound to die."

The problem with that example is that most meds have probably killed that many people during testing.
 
The problem with that example is that most meds have probably killed that many people during testing.
"May cause impotence, elevated blood pressure, flatulence and sudden death. Please consult with your physician before taking Adhedra. Adhedra, healing dry skin has never been easier!"
 
Despite the fact that Jensen just spent quite a bit of time showing off their self-driving car technology at GDC, Nvidia has announced that it is suspending self-driving car tests globally. The news comes a week after a self-driving car operated by Uber struck and killed a pedestrian in Arizona. Additionally, the NTSB has opened a field investigation of a fatal crash last week involving a Tesla, though it is unclear at this time whether the car was being driven by its automated-control system.

While this decision has caused NVIDIA's stock price to drop quite sharply, I completely agree with it. While I know many are excited about self-driving car technology, it is my opinion, especially in light of the recent crashes, that the technology is not far enough along to be tested on public roadways.

Around 320 firms involved in self-driving cars - software developers, automakers and their suppliers, sensor and mapping companies - use the Nvidia Drive platform, according to the company's website.

I agree; in light of recent crashes involving people-driven car technology hitting pedestrians, we should ban them from public roadways.
Also, in light of multiple events involving people killing people with guns, we should ban all research on people-on-people gun violence.
 
The "bound to happen" excuse doesn't really work with emergent technology to me.

"Hey, this new drug we started testing last week has killed 2 people, but over 100 years, a couple people were bound to die."

That's how drug approvals work..... every approved drug could potentially kill some people. That doesn't mean the drug doesn't get approved, because it helps more than it kills.
 
Being a little facetious with this paraphrase but there's some truth to it.

"In light of recent crashes involving people driving. . . .hitting pedestrians. . . . .we should ban them from public roadways."

I agree with rgMekanic about the tech not being ready. Fully autonomous driving has many issues that still need to be solved before it can be trusted. Let's also not forget a major source of auto-related injuries: the drivers themselves. So trusting a human to babysit one of these things on the public roadway isn't the best idea either.
 