Feds: Google Car’s Computer Can Be Considered A Driver

HardOCP News

If the computer in Google's cars can be considered a driver, who is held responsible for stuff like tickets and accidents? If an autonomous car swerves off a cliff to avoid hitting a pedestrian, is it Google's fault if the driver is injured or killed?

U.S. vehicle safety regulators have said the artificial intelligence system piloting a self-driving Google car could be considered the driver under federal law, a major step toward ultimately winning approval for autonomous vehicles on the roads. The National Highway Traffic Safety Administration told Google, a unit of Alphabet Inc, of its decision in a previously unreported Feb. 4 letter to the company posted on the agency’s website this week.
 
Strange thought occurred to me...
I've been hearing a lot that we aren't prepared for a cyber war, but we are getting more dependent on systems that can be hacked... it's like we're readying ourselves to get screwed over.
 
There is no question that it should be the software company, but they'll fight it forever because of the cost involved for them. It might end up killing the self-driving car.
 
Well that actually makes sense, you can't blame the owner of the car if he isn't driving it.
 
is it Google's fault if the driver is injured or killed?

Yes, wholeheartedly. Or how about GPS telling the car to drive off a cliff where it thought there was a road? Or what about these "cars" that don't have steering or pedals? How do they drive through snow if not? What if there is a turtle in the road? Does it turn into oncoming traffic to avoid it? Or what if it's not a turtle but a solid rock? Will it smash into it, or turn you off of a bridge? Can it distinguish a bicyclist from a large dog? Will it hit the human but swerve for the dog? How about a large block of styrofoam? Will it drive through it or swerve? How about following a sand or dump truck? Will it still follow within its measured distance and have the windshield and headlights busted out by rocks falling off the back of it?

I'm sorry, but there is absolutely no way that they can program a computer to make logical decisions that a real human could instantly make. Go ahead and trust a computer to make judgement calls on your very life - and the lives of others. There will be huge lawsuits.

Autonomous cars are another way to give up your freedom. I would guess Democrats love this idea. Let someone else make basic life decisions for you, like how to drive your car.

:)
 
I'm sorry, but there is absolutely no way that they can program a computer to make logical decisions that a real human could instantly make. Go ahead and trust a computer to make judgement calls on your very life - and the lives of others. There will be huge lawsuits.

Autonomous cars are another way to give up your freedom. I would guess Democrats love this idea. Let someone else make basic life decisions for you, like how to drive your car.

:)

Humans don't necessarily make very logical decisions in the heat of the moment either. That's why it is beneficial to take the logical decision making offline when you are thinking more rationally (not while you are hurtling towards that cliff) and then program it to be automated.

Humans make a lot of decisions, some of them good and some of them really bad, and humans also make mistakes under pressure. A computer algorithm CAN be more reliable on average.

At least in theory. I see some problems, but down the line it may actually be quite nice.

I - for one - wouldn't like to have my option to drive taken away and replaced by automation, but it would be nice to have the option to go out, not think about how many glasses of wine I have with dinner, stumble back to my car, press the "home" button, and get home safely and legally.

And that's probably how most early autonomous driving systems will work: optional. Turn them on when you want them, and turn them back off when you don't.

Professional drivers will likely also be the first to go. Trucks, cabs, etc. Replace the human who needs to sleep and be paid with a computer program which can run 24/7.

While I don't want to give up the ability to drive, I wouldn't mind if everyone else did :p (It's always those damned Other People on the road. You know the ones cutting across 3 lanes of traffic to make their exit, weaving in and out of traffic, tailgating, doing their makeup or shaving while driving, reading the newspaper while driving, texting, etc. etc. Oh and cabbies :p)

I don't think autonomous drivers are anywhere near being better than human drivers at their best, but it wouldn't take very much to be better than human drivers at their worst :p
 
who is held responsible for stuff like tickets and accidents? If an autonomous car swerves off a cliff to avoid hitting a pedestrian, is it Google's fault if the driver is injured or killed?

I would imagine before the cars can be registered in each state a caveat will be created that says the autonomous "driver" must be supervised by a licensed driver behind the wheel. Almost like a student driver must be supervised.

Another example is autoland-capable autopilots, which must be "supervised" by a qualified pilot.

The tort side would always come back to the owner of the vehicle, licensed supervising driver, and vehicle/parts manufacturer. If the autonomous system is at fault, I guarantee Google and/or the company that makes the hardware would have their products liability exposed. Just like aircraft.
 
I would imagine before the cars can be registered in each state a caveat will be created that says the autonomous "driver" must be supervised by a licensed driver behind the wheel. Almost like a student driver must be supervised.

Another example is autoland-capable autopilots, which must be "supervised" by a qualified pilot.

The tort side would always come back to the owner of the vehicle, licensed supervising driver, and vehicle/parts manufacturer. If the autonomous system is at fault, I guarantee Google and/or the company that makes the hardware would have their products liability exposed. Just like aircraft.

By "supervisor", do you mean someone playing with their phone or a passed out drunkard? And if you have to be alert and awake, why not just do the driving anyway? If you don't want to drive - ride a bus, taxi, or train.
 
Well that actually makes sense, you can't blame the owner of the car if he isn't driving it.

Of course you can. It's called "where you lay the mantle of responsibility". If it is reasonable that the driver be held liable for the safe operation of his vehicle, including the hardware and software of autonomous driving systems, then yes, you can still find the vehicle owner at fault, even if he isn't in the vehicle when an accident happens.
 
Yes, wholeheartedly. Or how about GPS telling the car to drive off a cliff where it thought there was a road? Or what about these "cars" that don't have steering or pedals? How do they drive through snow if not? What if there is a turtle in the road? Does it turn into oncoming traffic to avoid it? Or what if it's not a turtle but a solid rock? Will it smash into it, or turn you off of a bridge? Can it distinguish a bicyclist from a large dog? Will it hit the human but swerve for the dog? How about a large block of styrofoam? Will it drive through it or swerve? How about following a sand or dump truck? Will it still follow within its measured distance and have the windshield and headlights busted out by rocks falling off the back of it?

I'm sorry, but there is absolutely no way that they can program a computer to make logical decisions that a real human could instantly make. Go ahead and trust a computer to make judgement calls on your very life - and the lives of others. There will be huge lawsuits.

Autonomous cars are another way to give up your freedom. I would guess Democrats love this idea. Let someone else make basic life decisions for you, like how to drive your car.

:)



Wow, I have answers for you, although you might not understand or accept them. First issue: just because these people think they have some form of moral obligation to make a self-driving car do better than a human would, why does anyone think they really should?

Think it through. First, if the system is safe enough to ever, at a basic level, put your life in its hands, then keep that in your mind from now on. Now, why can't the vehicle owner be held liable for his vehicle's actions? He is the owner; it wouldn't be on the road if he hadn't bought it. Why must it do anything more than a human would? Swerve to avoid the dog and oops, "well that sucked for old Bob," but it "could have happened to anyone." The driver didn't react in an unreasonable manner given the circumstances, even if the driver was a machine.

Starts to clear up some issues, right? I mean, won't we still be better off? Human error being what it is, won't we still be better off even if the car doesn't make big moral rationalizations about who it should and shouldn't save? Just do what any human would do: try not to panic, try not to hit anything as each obstacle is presented, and if the car even manages to do this a little better, isn't that a win, ffs?
 
Autonomous cars have nothing at all to do with Democrats or your freedom. It's about one thing, money, and your free time, not your freedom. The more free time you have, the more tweets you can write, shopping you can do, games you can play, ad infinitum.

It's just a way to make more money from us.
 
I still wonder how they'll solve the liability/insurance problem.

Will the manufacturer be required to carry auto insurance for the self driving vehicle? I mean, liability for accidents caused can't rest with the owner if the owner isn't the one controlling the car.

Now, granted, the collision rate would likely be MUCH lower than for the average human piloted car, and as such the insurance cost would be cheaper. I'm just curious how this will be solved legally.
 
Zarathustra[H];1042132024 said:
I still wonder how they'll solve the liability/insurance problem.

Will the manufacturer be required to carry auto insurance for the self driving vehicle? I mean, liability for accidents caused can't rest with the owner if the owner isn't the one controlling the car.

Now, granted, the collision rate would likely be MUCH lower than for the average human piloted car, and as such the insurance cost would be cheaper. I'm just curious how this will be solved legally.

I would bet that there would have to be some type of forever recurring payment for a self-driving car (part of the benefit seen by corporations, no doubt), for software updates and insurance costs.

Although, to make it even more complicated, how would you split liability for maintenance of the vehicle (tires, etc) if the driver/owner didn't properly maintain the vehicle?

I see self driving vehicles more as breaking the ownership of vehicles as well. If you can drive the cost down enough to where it is more cost effective to get it on demand and as a commodity...
 
Nah, there will be disclaimers and agreements individuals will sign before owning an autonomous-capable vehicle. That agreement will probably place responsibility on the individual and not anybody else.
 
No software will ever be perfect. But if it makes better decisions on average - which it does - then I can imagine limited liability on federally owned roads via legislation. States would likely mimic the legislation, and so on. It is an easy 'needs of the many' argument.
 
Zarathustra[H];1042132024 said:
I still wonder how they'll solve the liability/insurance problem.

Will the manufacturer be required to carry auto insurance for the self driving vehicle? I mean, liability for accidents caused can't rest with the owner if the owner isn't the one controlling the car.

Now, granted, the collision rate would likely be MUCH lower than for the average human piloted car, and as such the insurance cost would be cheaper. I'm just curious how this will be solved legally.

Manufacturers already carry Products and Completed Operations liability coverage. Most auto maintenance facilities carry the same. So if the product was at fault, i.e. a faulty system, then the manufacturer is covered. If the fault was caused by faulty maintenance, then the CO section comes into play.

Insurers will probably play with the policies but the end result will be they will handle the coverages/declarations much like their aviation counterparts do. All the major insurers that write PCO for auto also write it for aviation.

The states will demand the person operating the vehicle be legal and able to supervise the actions of the autonomous system. Driving drunk / incapacitated is already illegal. Driving without a license is illegal. Cell phone / texting goes back to local laws. Just like a student driver is not legal to drive without a licensed legal driver sitting next to him/her. You can't get hammered then have your 15 year old student driver take you around bar to bar.

If the supervising occupant isn't monitoring the autonomous system or overrides it, the fault could lie back onto that person; however, if the system is at fault or played a part in the loss it could easily go back to the supervising occupant, owner, and manufacturer. It would then be up to the manufacturer's insurer and other insurers to fight over how it is handled and who pays what. Just like aviation.

And just like aviation, after losses are high enough the industry will ask the Govt for public law (statute of repose) limiting their exposure. Much like GARA did for aviation.
 
So if the car OS gets a virus, would that be considered driving DUI?
 
So, is the software considered one driver across all the vehicles, or is it thousands of different drivers for each individual auto installation? Does it need a license?
 
Human error being what it is, won't we still be better off even if the car doesn't make big moral rationalizations about who it should and shouldn't save? Just do what any human would do: try not to panic, try not to hit anything as each obstacle is presented, and if the car even manages to do this a little better, isn't that a win, ffs?

I can forgive people. I can't forgive the people who think they are better than humans by programming machines that can't account for all situations - and on top of that, they will be charging you much more money and also be making profit off of it. The programmers are responsible.

Although I don't understand some of your responses, I do have one question. How can you prove that automated cars are doing it "a little better"? The only thing I can think a car may do better is shorter response times. I absolutely believe that. What is it going to do in the case of a mechanical failure on a car? What if the brakes quit working? Is it going to know to downshift, or just keep on rolling into a car or a river? What about when the computer that controls it fails or shorts out? Modules and computers short out constantly. I know because most of my family work as mechanics at new car dealerships. One malfunction and you are history.

*Knock on wood* - I have a MUCH better driving record than even Google's cars limited to 25mph.
 
I can forgive people. I can't forgive the people who think they are better than humans by programming machines that can't account for all situations - and on top of that, they will be charging you much more money and also be making profit off of it. The programmers are responsible.

Although I don't understand some of your responses, I do have one question. How can you prove that automated cars are doing it "a little better"? The only thing I can think a car may do better is shorter response times. I absolutely believe that. What is it going to do in the case of a mechanical failure on a car? What if the brakes quit working? Is it going to know to downshift, or just keep on rolling into a car or a river? What about when the computer that controls it fails or shorts out? Modules and computers short out constantly. I know because most of my family work as mechanics at new car dealerships. One malfunction and you are history.

*Knock on wood* - I have a MUCH better driving record than even Google's cars limited to 25mph.

Yes, the big "I" again. When it comes to the "I" everybody is better than everyone and everything.

"one malfunction and you're history" Is that a conclusion based on your long in-depth study of the matter?

You, as an outsider with no particular expertise on self-driving cars, thought up a few situations that might happen. Are you really going to assume that the developers and engineers working on it never even considered those situations?

And humans are "programmed"? for all situations? No, they'll do something in an unfamiliar situation, that's usually worse than doing nothing. This happens on a daily basis. A large percentage of RTCs could be avoided if the drivers involved did the right thing, instead of something.

And you want to keep down automated driving systems because they might not work in a tiny fraction of cases? Over humans who do the wrong thing in many cases?

The program can have the combined driving experience of thousands or tens of thousands of expert drivers in it. Which human could claim that?
 
And humans are "programmed"? for all situations? No, they'll do something in an unfamiliar situation, that's usually worse than doing nothing. This happens on a daily basis. A large percentage of RTCs could be avoided if the drivers involved did the right thing, instead of something.

I wanted to say "Did nothing, instead of something"
 
Of course you can. It's called "where you lay the mantle of responsibility". If it is reasonable that the driver be held liable for the safe operation of his vehicle, including the hardware and software of autonomous driving systems, then yes, you can still find the vehicle owner at fault, even if he isn't in the vehicle when an accident happens.

Sure you can. It is still utterly retarded to do so. But, you're right, you definitely can.
 
Yes, the big "I" again. When it comes to the "I" everybody is better than everyone and everything.

"one malfunction and you're history" Is that a conclusion based on your long in-depth study of the matter?

You, as an outsider with no particular expertise on self-driving cars, thought up a few situations that might happen. Are you really going to assume that the developers and engineers working on it never even considered those situations?

And humans are "programmed"? for all situations? No, they'll do something in an unfamiliar situation, that's usually worse than doing nothing. This happens on a daily basis. A large percentage of RTCs could be avoided if the drivers involved did the right thing, instead of something.

And you want to keep down automated driving systems because they might not work in a tiny fraction of cases? Over humans who do the wrong thing in many cases?

The program can have the combined driving experience of thousands or tens of thousands of expert drivers in it. Which human could claim that?

Yes, the big "I" because I have personal proof that my driving record is better. You bet.

Thousands or tens of thousands of expert drivers? I didn't realize there are that many programmers working on these things. I withdraw all my conclusions then. ;) Some of you are making an awful lot of conclusions (and oversights) that these things are better when you may not have [enough?] personal experience examining these things. I guess we need to have Mythbusters do an examination. :D
 
Zarathustra[H];1042132024 said:
I still wonder how they'll solve the liability/insurance problem.

Will the manufacturer be required to carry auto insurance for the self driving vehicle? I mean, liability for accidents caused can't rest with the owner if the owner isn't the one controlling the car.

Now, granted, the collision rate would likely be MUCH lower than for the average human piloted car, and as such the insurance cost would be cheaper. I'm just curious how this will be solved legally.



It's actually quite simple, and Cr4ck is partially correct.

TLDR: I work in the insurance industry and with enterprise risk management; the insurance mechanisms already exist to insure these vehicles, it is just a matter of legislation.

You have two key components to insuring autonomous cars: one is the physical property, the other is the software component.

The physical property further falls into two insurance silos: products and completed operations (an extension to the general liability policy) and Errors and Omissions (professional liability). The E&O comes in to cover the allegation that the physical vehicle was not designed correctly.

Products and completed operations liability insures claims caused by products that have been sold, distributed, produced, or handled, and are no longer in the named insured's possession or operation.

On the software end, you have the professional liability of the software engineers on the line. Pretty obvious that they would be coming to the party, as any average lawyer will allege that the software failed by error or omission in its design.
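If it helps to see the moving parts, here's a toy sketch of which silo would respond to which allegation. The coverage names are the real terms above, but the one-to-one routing is my simplification; a real claim usually names every party at once and lets the insurers and courts sort it out:

Code:
# Toy sketch: which coverage silo responds to which allegation.
# Illustrative only -- real claims name every party and fight over the split.

COVERAGE_SILOS = {
    "manufacturing_defect":    "Products & Completed Operations (manufacturer)",
    "design_error":            "E&O / professional liability (manufacturer)",
    "software_error_omission": "E&O / professional liability (software firm)",
    "faulty_maintenance":      "Completed Operations (repair facility)",
}

def responding_coverage(allegation: str) -> str:
    """Return the silo most likely to respond to a given allegation."""
    return COVERAGE_SILOS.get(
        allegation, "disputed -- negligence assigned by settlement or the courts"
    )

print(responding_coverage("software_error_omission"))
# -> E&O / professional liability (software firm)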

Unlike aviation, the individual exposure to risk is small; I don't see the industry seeking to limit liability because of large losses. In a plane crash, it's easy to have over a hundred injured or likely dead parties, with estates all suing for compensation. Vehicles rarely have that many people involved.

Large corporations (like auto manufacturers) are insured through captive insurance companies (wholly owned by that corporation, usually set up in a tax haven like the Bahamas). These captives have deep pockets, purchase reinsurance, and have even formed reciprocals in addition to the prior.

A reciprocal is another form of risk transfer, common to public entities, where every entity pays into a pot and losses are paid out of that, with cash calls if required.

Lastly, you have the driver, which is currently in limbo and needs the legislation; you should not be directly responsible for the car's autonomous actions. Indirectly, yes, but again there is coverage for this existing under commercial insurance called a non-owned auto endorsement. While the non-owned portion would need a slight wording tweak, it essentially covers your company for lawsuits that arise out of others causing a vehicle accident. The most common is employees driving personal vehicles on work business, getting into an accident, and the lawyer alleging that the vehicle would never have been on the road except for performing that work-related function.

Legislation is needed to properly instruct the courts how to process the already existing insurance mechanisms and assign liability accordingly. As self-driving cars so far are safer than people-driven cars (in every accident to date the other party has been at fault, in which case their insurance is responsible), I don't foresee this becoming a huge claims problem for insurers, but it will be a revenue problem as that auto insurance dollar declines.

I for one look forward to the perfection of this technology and the end of the traffic ticket cop profession.
 
Legislation is needed to properly instruct the courts how to process the already existing insurance mechanisms and assign liability accordingly. As self-driving cars so far are safer than people-driven cars (in every accident to date the other party has been at fault, in which case their insurance is responsible), I don't foresee this becoming a huge claims problem for insurers, but it will be a revenue problem as that auto insurance dollar declines.

Correction: in October of 2015 Google reported 16 minor incidents, all human error. A January 13th report released by Google shows that, in analyzing 69 events where the human driver took control, 13 of those prevented accidents where the self-driving vehicle would have been at fault. Most accidents that went unreported were others rear-ending the self-driving vehicle at 5mph or less.

http://fortune.com/2015/10/09/self-driving-cars-crashing/
http://fortune.com/2016/01/13/google-self-driving-car-accidents/
http://time.com/4098303/self-driving-cars-accident/
 
As self-driving cars so far are safer than people-driven cars (in every accident to date the other party has been at fault, in which case their insurance is responsible)

I for one look forward to the perfection of this technology and the end of the traffic ticket cop profession.

Thank you for your insurance insight. :)

They are safer? Yes, because they are limited to 25mph. Have all accidents been the other party's fault? Yes, but because of how the self-driving car drives, which puts the self-driving car at fault in my book. In the reports I have read, it is because the self-driving car freaks out and hits the brakes in situations that didn't need it. This throws off humans, as they can see ahead and don't expect a random brake check. Perhaps this is all programmed to try showing humans as inferior.

End of traffic ticket cops? I doubt human driving cars will ever be outlawed. Are you expecting 100+ years and hundreds of millions (billions?) of cars to be banned? I will be damned if I have to give up my freedom to drive my own car (and be responsible for my own life, and not the government's).

While this is a tech website, I would expect more than the average percentage of you to side on technology. However, I am blown away that so many of you are easily parted from your freedoms. :confused:
 
While this is a tech website, I would expect more than the average percentage of you to side on technology. However, I am blown away that so many of you are easily parted from your freedoms. :confused:

I think it's because many people do not equate the daily commute with freedom. Also, can you define how you think you'd be losing freedom with self-driving cars?

My ideal picture is essentially a moving living area, where you can work (my main problem with getting vacation time), play, and relax. This to me is far more freedom than being stuck behind a wheel, focused on the road in heavy traffic barely moving. The most I can interact with (safely) is the radio, or at best a phone call Bluetoothed to my speakers.

I fail to see how this is an issue of freedom, outside the tinfoil hat 'they're watching me' theories.
 
Thank you for your insurance insight. :)

End of traffic ticket cops? I doubt human driving cars will ever be outlawed. Are you expecting 100+ years and hundreds of millions (billions?) of cars to be banned? I will be damned if I have to give up my freedom to drive my own car (and be responsible for my own life, and not the government's).

no Edit :(

You're welcome.

Yes I do. Not banned per se, but I expect the majority of our children will not drive. They will be happy to commute without that hassle, and as they did not grow up with the same driving culture we did (left over from the American Graffiti driving culture, which was the peak), they will not have the same nostalgia goggles for driving we do.

Basically, it will be a phasing out led by the consumer, as it should be. As fewer and fewer human-driven cars are on the road, fewer traffic cops will be required. This will be forced by budgets: as the taxpayer never wants to pay anything, police have come to rely on traffic tickets as a source of revenue. This revenue will dry up as self-driving cars do not violate the rules of the road, forcing them to spend less on traffic enforcement and lobby for greater tax dollars.

All in all, I think this is a good thing, police should be focused on issues other than ticketing idiots on the roads.
 
I think it's because many people do not equate the daily commute with freedom. Also, can you define how you think you'd be losing freedom with self-driving cars?

My ideal picture is essentially a moving living area, where you can work (my main problem with getting vacation time), play, and relax. This to me is far more freedom than being stuck behind a wheel, focused on the road in heavy traffic barely moving. The most I can interact with (safely) is the radio, or at best a phone call Bluetoothed to my speakers.

I fail to see how this is an issue of freedom, outside the tinfoil hat 'they're watching me' theories.

Freedom - as in several people want ALL cars to be self-driving, and to remove ALL humans from being able to drive their own car. This is forcing me to where I don't have the freedom to drive my own car. I am forced to not have my own choice. This in terms of removing freedom should make sense. Pretty much like taking away my freedom to own a gun. I will never give up my classic muscle car nor my choice in going offroad to enjoy my suv. As I stated earlier - many people just want these self-driving cars so they can be stuck into their phone texting. :(
 
Freedom - as in several people want ALL cars to be self-driving, and to remove ALL humans from being able to drive their own car. This is forcing me to where I don't have the freedom to drive my own car. I am forced to not have my own choice. This in terms of removing freedom should make sense. Pretty much like taking away my freedom to own a gun. I will never give up my classic muscle car nor my choice in going offroad to enjoy my suv. As I stated earlier - many people just want these self-driving cars so they can be stuck into their phone texting. :(

They text anyway, and are a huge hazard doing so.

It is unlikely that driver cars will be banned anytime soon, more likely in our grandchildren's generation or later, but it is still likely that certain areas will be set up and maintained for driver cars.

Without a dramatic social shift, I don't see the western world outlawing our current cars. As I said, I think it will be a generational shift, and it's one we can't and shouldn't stand in the way of; this is progress. If we do, we are just that old man yelling about the youts from our rocker while patting our shotgun.
 
There is no question that it should be the software company, but they'll fight it forever because of the cost involved for them. It might end up killing the self-driving car.

Yes and the gun company for making guns.

There must apparently be a question about it, since I believe it should foremost be the passenger riding in the car, unless a specific software/hardware bug can be found. Just like with any other car.
 
I can forgive people. I can't forgive the people who think they are better than humans by programming machines that can't account for all situations - and on top of that, they will be charging you much more money and also be making profit off of it. The programmers are responsible.

Although I don't understand some of your responses, I do have one question. How can you prove that automated cars are doing it "a little better"? The only thing I can think a car may do better is shorter response times. I absolutely believe that. What is it going to do in the case of a mechanical failure on a car? What if the brakes quit working? Is it going to know to downshift, or just keep on rolling into a car or a river? What about when the computer that controls it fails or shorts out? Modules and computers short out constantly. I know because most of my family work as mechanics at new car dealerships. One malfunction and you are history.

*Knock on wood* - I have a MUCH better driving record than even Google's cars limited to 25mph.

Wait wait wait. Why is a programmer responsible when it's the buyer who is putting the car on the road? As long as the manufacturer doesn't misrepresent the product then what have they done wrong?

I think too many people are making wild assumptions that this self driving car thing will absolve the owner of liability. There are many examples of accidents that happen and it's not the product manufacturer that winds up being responsible, it's the owner/operator that carries that risk.

I think you guys are jumping the gun with these ideas that business, insurance companies, and the government are going to let us get away with absolving ourselves of liability. The days of criminal charges for such accidents might end, but you will still be licensed to operate your vehicle, and if there is an accident it most likely will devolve into a simple "no fault" where both insurance companies pay. Unless of course someone sues claiming one of the owners/operators modified their software or tampered with it, or heaven forbid you weren't up to date on having your maintenance done. Let your car fall out of warranty, and who is keeping your car's software up to date?

Are you seeing the angle here? It can become a lifelong financial burden to buy a car. Imagine: your car is still good, but the software is no longer supported by the vendor, so you are risking an accident, criminal charges, and a civil suit for driving with unsupported software.
 
Freedom - as in several people want ALL cars to be self-driving, and to remove ALL humans from being able to drive their own car. This is forcing me to where I don't have the freedom to drive my own car. I am forced to not have my own choice. This in terms of removing freedom should make sense. Pretty much like taking away my freedom to own a gun. I will never give up my classic muscle car nor my choice in going offroad to enjoy my suv. As I stated earlier - many people just want these self-driving cars so they can be stuck into their phone texting. :(

It's similar to guns, but no constitutional right here - currently vehicles kill ~33k people per year, ~5k of them pedestrians. A further 2.3M people are injured by them annually as well.

The amount of lives that could be saved is worth banning driven cars. That doesn't even take into account:

Cost savings on fuel, time, efficiency, emissions, etc.
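To put rough numbers on that (back-of-envelope only; the death and injury counts are the figures cited above, but the adoption and crash-reduction rates are pure assumptions, not data):

Code:
# Back-of-envelope sketch. Counts come from the post above; the
# adoption and effectiveness rates are assumptions for illustration.
annual_deaths = 33_000
annual_injuries = 2_300_000

adoption = 0.80       # hypothetical share of driving done autonomously
effectiveness = 0.90  # hypothetical crash reduction on those miles

print(f"deaths averted per year:   ~{annual_deaths * adoption * effectiveness:,.0f}")
print(f"injuries averted per year: ~{annual_injuries * adoption * effectiveness:,.0f}")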
 
Sure you can. It is still utterly retarded to do so. But, you're right, you definitely can.

Better read my post above before you settle with this answer.

Remember, the development of self-driving cars is not being done for the betterment of humanity; it's for the betterment of business. Once you remember this and frame everything in the correct picture, you start seeing the angles.
 
Wait wait wait. Why is a programmer responsible when it's the buyer who is putting the car on the road? As long as the manufacturer doesn't misrepresent the product then what have they done wrong?

It's simple: because the programmer made the software that drives the car. It is as simple as that, and that is why the programmer will almost always be named in every self-driven vehicle accident.

Yes, there are other factors; you can see my email above for the basic insurance breakdown. Yes, it is possible that fault will lie with the owner of the vehicle (not updating software is a great example).

None of that will fully absolve the company that designed and implemented the software. If you want, I can run you through a brief set of legal arguments that could potentially be made in a case where the owner didn't update the software that would still give a good chance that the programmer's E&O will end up paying at least a partial settlement.
 
It's similar to guns, but no constitutional right here - currently vehicles kill ~33k people per year, ~5k of them pedestrians. A further 2.3M people are injured by them annually as well.

The amount of lives that could be saved is worth banning driven cars. That doesn't even take into account:

Cost savings on fuel, time, efficiency, emissions, etc.

Yes, and all while ensuring that you must buy a new car every six years so that you will forever have a car payment of $250+ a month, depending on how many cup holders you want. Your insurance will rise and fall based on statistics, or at least it will rise with statistics as an excuse and fall.... never, just like now.

You will not modify your vehicles without incurring great legal risk. You will not put off a service because "software maintenance" must be maintained to keep your vehicle updated and safe. And if you can't afford to drive a new self driving car you won't be driving. They won't even have to ban manually driven cars, the insurance and legal risks of getting into a no-win accident with a self driving car will keep them off the road and any old fart with a beater pickup won't keep his vehicle after his next fender bender.

Saving lives is not an objective, it's a justification. Money is the only objective here.

In Tucson today, if you get into an accident and there are no injuries, the City cops will not even come to write a ticket. You just call your insurance companies and they send people out to settle it for you, the cops are no longer in the picture.

Everything is being done to save costs associated with "managing the population". If they can avoid responding they will do that, if not they will try and bill someone. The bill doesn't go to the persons at fault, it goes to those who can/will pay.
 
It's not an email (doh).

Here is a snippet and I'll expand on how your responsibilities for a self driven car should flow:
"lastly you have the driver, which is currently in limbo and needs the legislation, you should not be directly responsible for the cars autonomous actions. Indirectly, yes, but again there is coverage for this existing under commercial insurance called a non-owned auto endorsement."

So you as the owner are responsible for that vehicle being on the road, and thus can bear a degree of liability for any accident. However, you are not solely responsible for the safe operation, as the task of driving (at this current tech level) is split between the vehicle and the driver. This split is where lawyers, adjusters, and insurers will fight over who pays what; if they cannot reach a settlement pre-trial, it will be up to the courts to assign negligence.

Legislation needs to address how much attention a 'driver' of a self driving car should pay to meet the general standard of a reasonable and prudent person. Until this is done, we are all shooting in the dark as to how much responsibility rests with the driver.

All my original post was stating is that the mechanisms to insure these vehicles already exist, and likely insurers will re-bundle them to meet the demand of self driving cars. Until that legislation is in effect however, insurers won't do anything because they don't know which way to jump on the driver responsibility.
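For a concrete (completely made-up) picture of how such a split could shake out once negligence is assigned, something like this; the parties come from the discussion above, and every percentage is hypothetical, the kind of number that would come from a settlement or a court:

Code:
# Minimal sketch of a comparative-negligence split on a settled loss.
# All shares below are hypothetical.

def split_damages(total, negligence):
    """Apportion a settled loss by each party's assigned share of negligence."""
    assert abs(sum(negligence.values()) - 1.0) < 1e-9, "shares must sum to 100%"
    return {party: total * share for party, share in negligence.items()}

# Hypothetical $100k settlement:
for party, amount in split_damages(100_000, {
    "supervising driver": 0.30,  # e.g. not paying the required attention
    "manufacturer":       0.50,  # e.g. system defect contributed to the loss
    "vehicle owner":      0.20,  # e.g. skipped a required software update
}).items():
    print(f"{party}: ${amount:,.2f}")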
 
Saving lives is not an objective, it's a justification. Money is the only objective here.

Welcome to capitalism. That's the whole idea, make money and do good simultaneously.

If cars become self-driving, I'll bet for 90% of people it becomes a commodity for which they just get the cheapest one that works; a 300hp engine doesn't matter any more. So overall prices should drop, particularly as all the stuff in the cabin for the driver doesn't need to be there anymore, simplifying design.
 
It's simple: because the programmer made the software that drives the car. It is as simple as that, and that is why the programmer will almost always be named in every self-driven vehicle accident.

Yes, there are other factors; you can see my email above for the basic insurance breakdown. Yes, it is possible that fault will lie with the owner of the vehicle (not updating software is a great example).

None of that will fully absolve the company that designed and implemented the software. If you want, I can run you through a brief set of legal arguments that could potentially be made in a case where the owner didn't update the software that would still give a good chance that the programmer's E&O will end up paying at least a partial settlement.

It is not as simple as that. To start with, the programmer just works for a company, so the programmer himself will never be held individually liable. I don't see Honda getting a ticket when someone gets rear-ended in town. When a bus driver falls asleep and slams into a semi, I don't see the bus manufacturer in the news. And I don't see Google lining up to accept massive responsibility for accident liability related to their products. Not any more than Ford or Dodge take on with theirs.

My new Challenger has much more software control over the driving mechanics than I really want to think about. My old 325i wasn't too much different. It was raining and I wasn't paying attention; I set the cruise control, hit a water puddle... OMG... that car freaked out. It was trying to correct the loss of control completely without any regard for how I was trying to correct the vehicle's attitude. The left wheel hit the puddle, the resistance yanked the car to the left, I steered right, the car braked right and applied power to the left front ("no traction there"), the car over-corrected to the right, and if I hadn't been on wet pavement I think I would have rolled. Again the car tried to brake and divert power to correct the car's attitude, which only ensured that I would never get any decent traction in order to straighten out my car. Finally the car slowed enough and the pavement was puddle-free enough that I got control. I was freaking scared to death. I never fucked up like that again; it sure taught me a lesson.

But don't worry, they are going to do this. They will do this because there is money in it, and they will convince us it's what we want. And we will pay more, and we will not be able to drive old beater cars any more. You won't be able to use a beater for work driving. But don't go thinking they are going to accept financial responsibility and added risk any more than what they carry now. We shoulder that burden today and it will be no different tomorrow.
 