Ford Calls for Standardized Self Driving Signals

AlphaAtlas

Human expressions like making eye contact with a pedestrian or waving someone by are an essential part of driving. Current autonomous vehicles simply have no way to replicate them, and Ford sees that as a significant hurdle to future development. Yesterday, the company announced that it wants to standardize the way robocars communicate with humans. In collaboration with the Virginia Tech Transportation Institute, the company is installing light bars on top of self-driving cars that communicate the AI's intentions. Interestingly, the Ford sedan used in the tests appears to be the same one featured in Ford's quantum computing research. Jaguar Land Rover performed their own creepy communication experiments earlier this year, and made a similar call for standardization.

Yielding: Two white lights moving side to side, indicating the vehicle is about to come to a full stop.
Active driving mode: Solid white light, indicating the vehicle intends to proceed on its current course (although it can respond appropriately to objects and other road users in the course of its travel).
Start-to-go: Rapidly blinking white light, indicating the vehicle is beginning to accelerate from a stop.
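Those three signals amount to a tiny state machine. A minimal sketch in Python (the enum names, the pattern strings, and the structure are my own illustration, not part of Ford's spec):

```python
from enum import Enum

class SignalState(Enum):
    """Light-bar states from Ford's proposal (names are illustrative)."""
    YIELDING = "two white lights sweeping side to side"   # about to come to a full stop
    ACTIVE_DRIVING = "solid white light"                  # proceeding on current course
    START_TO_GO = "rapidly blinking white light"          # accelerating from a stop

def pattern_for(state: SignalState) -> str:
    """Return the light pattern a pedestrian would see for a given state."""
    return state.value
```

The point of standardization is exactly this: a pedestrian should be able to read the same three patterns regardless of who built the car.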
 
 
Simpler solution: Train humans to follow the rules of the road, such as "Do not step in front of a moving vehicle".

In what world is it easier to train humans to perform a task flawlessly 100% of the time than it is to program a machine?

We're relying too much on technology to do the mundane things that humans can do without it.

I’d argue that we rely on humans to do too many things that we were never designed to do.
 
In what world is it easier to train humans to perform a task flawlessly 100% of the time than it is to program a machine?

In whatever world it is that you think a machine can perform a task flawlessly 100% of the time.

We need to focus on getting the cars able to drive before we start trying to communicate with peds...
 
Simpler solution: Train humans to follow the rules of the road, such as "Do not step in front of a moving vehicle".

That is not what this is for. When driving, body language and eye contact tell pedestrians and other drivers a person's intentions, as well as saying "hey, I see you." That is not possible with how an autonomous car works without the addition of some sort of signaling system. This system has uses even when a pedestrian is crossing at a crosswalk: by making eye contact with a human driver you can tell the person sees you and will yield, but with an AI car there is no such indication as of yet.
 
In whatever world it is that you think a machine can perform a task flawlessly 100% of the time.

We need to focus on getting the cars able to drive before we start trying to communicate with peds...

US regulations for cars tend to move pretty slowly. Getting standards set should help things go mainstream before each manufacturer ends up with different rules for their cars. It's why I'm all for US regulations placing standards on cars that make them safer and easier for humans.

That being said, "standards" are one thing, but they can be optional. So we'll still need regulations to set general guidelines.
 
I wonder what the signal is when the car decides to run over group B instead of group A because there's no way to avoid both.

It would be great to know I won't get hit just by making eye contact at the crosswalk. I just love how right-turners ignore that cross signal!
 
I think people need to observe college campuses to see what this means. Every time I drive on the campus near where I live, I swear there should be more kids dead from being hit by cars. They literally walk out and expect cars to stop for them even when a car is traveling at 30 mph and is 15 ft away. The law lets the ped be clear of fault, but stepping out anyway because the law states it's not the ped's fault when in a crosswalk is ridiculous. Common sense has been trumped by straight-up idiocy and self-centeredness.
Even if the bar is there to communicate with humans, very few people will even notice.
 
I think people need to observe college campuses to see what this means. Every time I drive on the campus near where I live, I swear there should be more kids dead from being hit by cars. They literally walk out and expect cars to stop for them even when a car is traveling at 30 mph and is 15 ft away. The law lets the ped be clear of fault, but stepping out anyway because the law states it's not the ped's fault when in a crosswalk is ridiculous. Common sense has been trumped by straight-up idiocy and self-centeredness.
Even if the bar is there to communicate with humans, very few people will even notice.

I like Texas's view on this: drivers are to "exercise due care," meaning even if a ped is jaywalking and the car can stop, it should, even though the ped doesn't have the right of way. However, if that person gets hit and it was not reasonable, or was more dangerous, for the car to stop, the ped can be found at fault. That way it's not an "always" case, and it can be viewed on a case-by-case basis.
 
The problem with this is that humans use eye contact and that type of hand waving to mean the same things whether driving, walking, or figuring out who gets on the elevator next, and it is handled pretty much at an automatic level. These blinking lights are an entirely new signalling setup with little use elsewhere that folks will be expected to learn. Plus it is yet another damn blinky light on a vehicle. Between blinky brake lights, flashing amber lights on a growing number of vehicles, and white strobes on many school buses, it is getting hard to see normal vehicles in between all the damn blinky lights calling for attention.
 
We're relying too much on technology to do the mundane things that humans can do without it.

Like what? Kill each other by doing something that humans have shown time and time again they can't do undistracted, like driving?

These are not fully autonomous, but this shows that technology can already predict accidents, and it's not a big step to add avoidance.

 
In whatever world it is that you think a machine can perform a task flawlessly 100% of the time.

We need to focus on getting the cars able to drive before we start trying to communicate with peds...

LOL what world do you live in? Look around you. There are countless machines doing countless tasks flawlessly 99.999% of the time.

If autonomous cars can drive even 200k miles between accidents that's still better than humans by ~20%. That would save roughly 6400 lives a year in the US.

Tesla's autonomous driving goes roughly 1.5 million miles between accidents. Waymo is at about the same number, and only one accident was caused by the Waymo itself, so, taking humans out, Waymo is closer to 8 million miles between accidents and only getting better.

Humans are hopelessly incompetent at driving safely compared to autonomous driving (even at this point).
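The back-of-envelope math above checks out if you assume the commonly cited baselines (the ~165k-mile human average and ~37k annual US road deaths are my assumptions, not figures from the post):

```python
# Sanity check of the post's figures (assumed baselines, not official stats)
human_miles_per_accident = 165_000   # rough US average, assumption
auto_miles_per_accident = 200_000    # the post's hypothetical
us_road_deaths_per_year = 37_000     # approximate US figure, assumption

# At equal total mileage, accident counts scale inversely with miles-per-accident.
reduction = 1 - human_miles_per_accident / auto_miles_per_accident  # about 0.175
lives_saved = reduction * us_road_deaths_per_year                   # about 6,475

print(f"{reduction:.1%} fewer accidents, roughly {lives_saved:,.0f} lives saved per year")
```

That lands on roughly 17.5% fewer accidents and about 6,500 lives, consistent with the "~20%" and "roughly 6400" quoted above.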
 
LOL what world do you live in? Look around you. There are countless machines doing countless tasks flawlessly 99.999% of the time.

If autonomous cars can drive even 200k miles between accidents that's still better than humans by ~20%. That would save roughly 6400 lives a year in the US.

Tesla's autonomous driving goes roughly 1.5 million miles between accidents. Waymo is at about the same number, and only one accident was caused by the Waymo itself, so, taking humans out, Waymo is closer to 8 million miles between accidents and only getting better.

Humans are hopelessly incompetent at driving safely compared to autonomous driving (even at this point).

The real one. Machines fail a lot more than you think, and in ways you probably never considered. Have you ever stopped to think that maybe there is such a thing as too much automation? I am not saying self-driving cars are bad, just that they won't ever be perfect.
 
I like Texas's view on this: drivers are to "exercise due care," meaning even if a ped is jaywalking and the car can stop, it should, even though the ped doesn't have the right of way. However, if that person gets hit and it was not reasonable, or was more dangerous, for the car to stop, the ped can be found at fault. That way it's not an "always" case, and it can be viewed on a case-by-case basis.
I think it's the same here in Illinois, but the problem is the students abuse this idea and think they can walk out whenever the hell they want. That's the problem: they are "legally" in the crosswalk but still step out in front of cars with a "F- you, the world revolves around me" attitude.

I guess the ones that just say F-- it and walk out are the ones who think that a law will physically save their life? :unsure:
 
The real one. Machines fail a lot more than you think, and in ways you probably never considered. Have you ever stopped to think that maybe there is such a thing as too much automation? I am not saying self-driving cars are bad, just that they won't ever be perfect.

What I work with has uptimes of 99.999%. If that's where autonomous cars get, then it's a much better world without human drivers. Even if it's just 95%, any car accident becomes a tragedy because they happen so rarely.
 
I think it's the same here in Illinois, but the problem is the students abuse this idea and think they can walk out whenever the hell they want. That's the problem: they are "legally" in the crosswalk but still step out in front of cars with a "F- you, the world revolves around me" attitude.

I guess the ones that just say F-- it and walk out are the ones who think that a law will physically save their life? :unsure:

Guess they haven't met Texas drivers....They will run your ass over, as everyone is doing double the posted limit. :ROFLMAO::ROFLMAO::ROFLMAO:
 
What I work with has uptimes of 99.9999%. If that's where autonomous cars get, then it's a much better world without human drivers. Even if it's just 95%, any car accident becomes a tragedy because they happen so rarely.

So you only have 31.6 seconds of downtime (including planned maintenance) a year for each machine?
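The 31.6-second figure falls straight out of the availability arithmetic. A quick sketch (the 365.25-day year is my assumption):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31,557,600 seconds

def downtime_seconds(availability: float) -> float:
    """Seconds of downtime per year at a given availability fraction."""
    return (1 - availability) * SECONDS_PER_YEAR

# "Six nines" vs "five nines":
downtime_seconds(0.999999)  # about 31.6 seconds per year
downtime_seconds(0.99999)   # about 316 seconds, i.e. ~5.3 minutes per year
```

Each extra nine cuts the downtime budget by a factor of ten, which is why the difference between five and six nines matters here.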
 
So you only have 31.6 seconds of downtime (including planned maintenance) a year for each machine?

Heh, hit 9 one too many times. In any case, with k8s it's all done for me and AWS covers the rest. I see you don't have any other rebuttal to the rest of my post, which was the main focus, so you must agree that humans should not be driving for much longer :)
 
Heh, hit 9 one too many times. In any case, with k8s it's all done for me and AWS covers the rest. I see you don't have any other rebuttal to the rest of my post, which was the main focus, so you must agree that humans should not be driving for much longer :)

I never disagreed that machines could be more reliable; I just disputed the claim that they are perfect or will ever perform flawlessly 100% of the time.

MOST humans should not be driving today. That being said, I don't trust a random programmer to make an ethical decision.
 
Like what? Kill each other by doing something that humans have shown time and time again they can't do undistracted, like driving?

These are not fully autonomous, but this shows that technology can already predict accidents, and it's not a big step to add avoidance.

I suppose you're the type of person who will eventually need a computer to tell you that you need to take a shit! Cars, even if fully automated, will still crash. There are simply too many variables for a human engineer to account for in the car's firmware. FYI, I've been driving for 32 years with no accidents.
 
I suppose you're the type of person who will eventually need a computer to tell you that you need to take a shit! Cars, even if fully automated, will still crash. There are simply too many variables for a human engineer to account for in the car's firmware. FYI, I've been driving for 32 years with no accidents.

As I said in another post, nothing is perfect. Tesla and Waymo are already around 1.5 million miles between accidents, as opposed to humans' 160k miles between accidents, and autonomous driving is only getting better. In the case of Waymo, taking into account who was at fault, they're at 8 million miles between accidents.

Once cars are autonomous, a car accident anywhere will be a surprise and a tragedy because they will happen so rarely.

The whole video is good, but here's where it starts on autonomous cars.
 
I wonder how it will react when it gets flipped off by a New Yorker ;)

That's actually an important point. Driver-pedestrian communication is two-way, so some day cars will have to interpret rude gestures too.
 
These cars have to work in countries other than the USA, so they have to be able to cope with pedestrians crossing wherever they like.
 
Seems like a good idea.
Standardization seems like an obvious and necessary step in the right direction.
Especially when your effort is trailing: standardization is a nice way to level the playing field, generally down to the lowest performers.
 
Yeah? And what about a vehicle making a right on red when pedestrians have the green light to cross?
Not sure why you chose to be snarky about something that I can't imagine being seen as a negative on any level.
In Chicago you do that at your own risk anyhow. I still yield to cars whether I have the walk signal or not when I'm a ped. I'm still alive, so I must be doing something right. :ROFLMAO:
 
As I said in another post, nothing is perfect. Tesla and Waymo are already around 1.5 million miles between accidents, as opposed to humans' 160k miles between accidents, and autonomous driving is only getting better. In the case of Waymo, taking into account who was at fault, they're at 8 million miles between accidents.

Once cars are autonomous, a car accident anywhere will be a surprise and a tragedy because they will happen so rarely.

The whole video is good, but here's where it starts on autonomous cars.

I'm calling BS on those statistics. They're not part of mandatory reporting, so those numbers can be anything, and there's a very good chance those companies have fudged them. Just like corporate creative accounting.
 
I'm calling BS on those statistics. They're not part of mandatory reporting, so those numbers can be anything, and there's a very good chance those companies have fudged them. Just like corporate creative accounting.

Umm, how are Tesla or Waymo hiding crashes, let alone fatalities? Remember all the stories after Uber killed that woman?
You can live in a fantasy world of hidden world governments; I will live in the advancing real one.
 