Researcher Hacks Self-Driving Car Sensors

HardOCP News

How in the world did the people building self-driving cars not realize this? The kicker is that it only cost $60 to make one of these. :eek:

“I can take echoes of a fake car and put them at any location I want,” says Jonathan Petit, Principal Scientist at Security Innovation, a software security company. “And I can do the same with a pedestrian or a wall.”
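For context on how the spoof works: a lidar estimates range from the round-trip time of a laser pulse, so an attacker who replays a recorded pulse after a chosen delay controls the distance at which the phantom "echo" appears. A minimal sketch of that arithmetic (the function names and the 20 m example are illustrative, not taken from the article):

```python
# Sketch: the time-of-flight arithmetic behind a lidar spoofing attack.
# A lidar converts echo delay into range: range = c * delay / 2.
# An attacker who replays a captured pulse after a chosen delay therefore
# controls the range at which the phantom object appears to sit.

C = 299_792_458.0  # speed of light, m/s

def range_from_delay(delay_s: float) -> float:
    """Range the lidar will report for an echo arriving after delay_s seconds."""
    return C * delay_s / 2.0

def delay_for_fake_range(fake_range_m: float) -> float:
    """Replay delay needed to make a phantom object appear at fake_range_m."""
    return 2.0 * fake_range_m / C

if __name__ == "__main__":
    # Make the car "see" an obstacle 20 m ahead: replay ~133 ns after the probe pulse.
    d = delay_for_fake_range(20.0)
    print(f"replay delay: {d * 1e9:.1f} ns -> reported range: {range_from_delay(d):.1f} m")
```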
 
This hacking is getting out of hand. At this rate it will prevent pretty much any new technical innovation. They should impose penalties for hacking so heavy that it's pretty much like getting caught committing murder.
 
This hacking is getting out of hand. At this rate it will prevent pretty much any new technical innovation. They should impose penalties for hacking so heavy that it's pretty much like getting caught committing murder.

That's what will happen. Crashes/injury/death caused by hacking will be punishable much like DUI crashes/injury/death. It'll just take some time for laws to catch up with tech.
 
That's what will happen. Crashes/injury/death caused by hacking will be punishable much like DUI crashes/injury/death. It'll just take some time for laws to catch up with tech.

You're ignoring the terror factor of hackable cars. Before, it could be assumed that every crash was either due to weather, mechanical failure, or driver error. Now, the cause can be something that is completely exterior to the crash event itself, potentially from thousands of miles away. How will a law eliminate the terror factor? It can't.
 
I'm pretty happy that they are doing this sort of hacking and calling it out in public. It lets the things they never thought through get ironed out and fixed before these cars start going out to the public en masse.
 
I'm pretty happy that they are doing this sort of hacking and calling it out in public. It lets the things they never thought through get ironed out and fixed before these cars start going out to the public en masse.

By that logic, our computer systems will one day be unhackable as well.
 
You're ignoring the terror factor of hackable cars. Before, it could be assumed that every crash was either due to weather, mechanical failure, or driver error. Now, the cause can be something that is completely exterior to the crash event itself, potentially from thousands of miles away. How will a law eliminate the terror factor? It can't.

It's no different than any time you get into your vehicle. You never know what external factors can play into your drive. You can't watch the news without seeing a "terror factor" in the way some people drive their cars now. In the last two weeks, I've seen two fatal crashes in which four teenagers died, and a stolen ambulance being driven around (and that's just what I remember).
 
This hacking is getting out of hand. At this rate it will prevent pretty much any new technical innovation. They should impose penalties for hacking so heavy that it's pretty much like getting caught committing murder.

I am happy that publicly minded folks are finding some of the flaws in these potential weapons. Imagine the death and damage toll if some terrorist had discovered a way to mass-send 'turn left and accelerate' commands to the remotely controllable Chrysler Jeeps during rush hour, or placed a lidar spoofer on an overpass over a major highway. The mayhem that would ensue as hundreds of self-driving car computers got overloaded and fell back to some default 'safe the car' action could take a day or two to clean up.
 
I think there is another active thread about driverless cars and the blind faith that people are putting into this technology and its developers.

Something about your kids never having to learn how to drive?

:)
 
Security and safety are not about perfection; they are about reducing the chance of something bad happening.

When it comes to physical security, a lot of systems are built around the idea that the perp does not want to get injured or captured. Most of the time that is true, but not always, and it's more or less a maxim that if someone is willing to die, they can probably get to you.

One of the questions posed about driverless cars had to do with AI: if the car has to make a dangerous move to preserve life, what should the calculation be? Preserve the life of the owner at the potential cost of others, or should it be willing to kill its own occupants if that means not killing a bus full of strangers?

Whatever the scenario, you can bet that it will be gamed and manipulated at some point.

Some sort of anti-hacking law isn't going to stop this.
 
I am happy that publicly minded folks are finding some of the flaws in these potential weapons. Imagine the death and damage toll if some terrorist had discovered a way to mass-send 'turn left and accelerate' commands to the remotely controllable Chrysler Jeeps during rush hour, or placed a lidar spoofer on an overpass over a major highway. The mayhem that would ensue as hundreds of self-driving car computers got overloaded and fell back to some default 'safe the car' action could take a day or two to clean up.

Well, most of the terrorists about at the moment aren't that technically proficient, plus quite a number are eager to sacrifice themselves.

I can see how braking sharply could cause problems, but it's not like they could make the car drive off a cliff.

quote:-

"Using such a system, attackers could trick a self-driving car into thinking something is directly ahead of it, thus forcing it to slow down. Or they could overwhelm it with so many spurious signals that the car would not move at all for fear of hitting phantom obstacles."
 
I'm more concerned with enforcing said laws.

Without enforcement, laws are really just nice intentions on paper. Great reading material.
 
I am not all that worried about this one. The worst thing that could happen is that you wouldn't move. Your life wouldn't be in danger, except from people who don't have self-driving cars and run into the back of you because they're following too closely and can't stop in time.
 
This hacking is getting out of hand. At this rate it will prevent pretty much any new technical innovation. They should impose penalties for hacking so heavy that it's pretty much like getting caught committing murder.

So, what you're saying is that it should be illegal to research security vulnerabilities? Fascinating.
 
By that logic, our computer systems will one day be unhackable as well.

Computers really became hackable when they started connecting full-time to the internet; aside from that, it takes someone physically breaking into an office/home and accessing the computer (which is something that can be done with a car as well). The solution: minimize (zero would be good) the ability to remotely contact anything for data that can affect the car.

And then pass some laws that basically say anyone trying to spoof signals for cars, or buying gear to do so, can be brought up on attempted murder or conspiracy to commit murder charges.
 
This is what I've been saying all along. When a system has been programmed without security in mind and people intentionally try to trick it into doing something, these things will happen.
It's good that it's public. Now they can combine radar sensor data with camera data to see if it adds up, thus making the system more robust. Will it ever be foolproof? Nope. But once it's better than normal drivers and has the bonus of being secure against most hackers, it will be worth using.
Just adding laws won't stop criminals from exploiting it, and it won't stop security agencies like the CIA from using it for political assassinations. Only a thoroughly tested system that has been designed to recognize intentional sensor faults and has been hardened will be acceptable.
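A rough sketch of the kind of cross-sensor plausibility check described above, where a lidar detection is only trusted if radar or camera roughly agrees on the range; all data structures, names, and thresholds here are assumptions for illustration:

```python
# Sketch of a cross-sensor plausibility check: only trust a lidar detection
# if at least one other modality (radar or camera) reports an object at
# roughly the same range. Names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # "lidar", "radar", or "camera"
    range_m: float   # estimated distance to the object

AGREEMENT_TOLERANCE_M = 3.0  # hypothetical: how closely two sensors must agree

def corroborated(lidar_det: Detection, others: list[Detection]) -> bool:
    """True if some non-lidar detection roughly matches the lidar range."""
    return any(
        o.sensor != "lidar" and abs(o.range_m - lidar_det.range_m) <= AGREEMENT_TOLERANCE_M
        for o in others
    )

# A spoofed lidar "car" at 10 m that neither radar nor camera sees gets flagged.
spoofed = Detection("lidar", 10.0)
scene = [Detection("radar", 48.2), Detection("camera", 47.5)]
print(corroborated(spoofed, scene))   # False -> treat as suspect, don't hard-brake
```

In a real system the comparison would happen in a shared coordinate frame with tracking over time; the single-range check is only meant to show the idea of cross-checking modalities.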
 
Sensor attacks are not limited to just robotic drivers, of course. The same laser pointer that Petit used could carry out an equally devastating denial of service attack on a human motorist by simply dazzling her, and without the need for sophisticated laser pulse recording, generation, or synchronization equipment.

How in the world did people driving cars not realize this?
 