See the World Through the Eyes of a Tesla

AlphaAtlas

[H]ard|Gawd
Staff member
Joined
Mar 3, 2018
Messages
1,713
Tesla made some big changes to Autopilot with the v9 update. The neural net can now use all of the car's cameras to recognize objects on the road, and one Tesla owner managed to overlay the computer's detection metadata on top of the camera feeds.

Check out the footage here, or watch a more problematic video where the Tesla failed to recognize some road debris here.

Well, the great (I think) v8.1 autopilot footage was a bit marred by a quickly following v9 update, where Tesla improved their offering quite a bit (though they have taken away some of the new features since then). More importantly, they started using all of the cameras, and that created quite a few problems in capturing the feeds, especially without Tesla actually cooperating.

The solution I chose was to limit the framerate on all cameras but the main one (the storage driver on the ape can handle at most about 80 MB/sec). I use 9 fps for the side cameras and 6 fps for the fisheye (the car actually gets 36 fps feeds from all cameras but backup, where it gets 30 fps), and even that tends to overwhelm the drive from time to time, which is visible as stalled frames here and there.

One other problem is that the backup camera is actually quite unreliable, and in many runs there's no output captured from it at all, so I decided not to collect it for now. (You know your backup camera isn't working when, on a trip, a car pulls up real close behind you at a traffic light and nothing shows on the IC; surprisingly, the CID backup-cam display still works, so Tesla apparently papered over the old "freeze frame" issue without fixing the autopilot-side problem.) It's also notable that different cameras appear to get different detection rates, and since I cannot predict which ones are which, detections sometimes seem a little off - I know it's most likely a sampling artifact.
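The throttling tradeoff described above can be checked with some back-of-the-envelope arithmetic. Only the ~80 MB/sec storage ceiling and the frame rates come from the post; the per-frame size used here is a made-up placeholder, so treat this as a sketch of the reasoning, not actual measurements.

```python
# Rough check of the capture-bandwidth tradeoff described above.
# The ~80 MB/s ceiling and the fps figures are from the post;
# MB_PER_FRAME is a hypothetical average compressed frame size.

STORAGE_LIMIT_MB_S = 80   # rough ceiling of the ape's storage driver
MB_PER_FRAME = 1.2        # assumed, for illustration only

# (camera, captured fps) -- main kept at full rate, others throttled
cameras = [("main", 36), ("fisheye", 6), ("left_side", 9), ("right_side", 9)]

throttled = sum(fps * MB_PER_FRAME for _, fps in cameras)
unthrottled = 4 * 36 * MB_PER_FRAME  # all four cameras at the full 36 fps

print(f"throttled:   {throttled:.1f} MB/s")    # 72.0, under the 80 MB/s limit
print(f"unthrottled: {unthrottled:.1f} MB/s")  # 172.8, far over the limit
```

Under these assumed frame sizes, capturing every camera at its native rate would blow well past the storage budget, while throttling the secondary cameras keeps the total just under the ceiling, which is consistent with the drive still choking occasionally.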
 

sirmonkey1985

[H]ard|DCer of the Month - July 2010
Joined
Sep 13, 2008
Messages
22,231
that's pretty nuts seeing just how much it's keeping track of at the same time.
 

Darunion

2[H]4U
Joined
Oct 6, 2010
Messages
4,050
Vision systems are always neat to watch. The nerd in me always thinks of terminator vision.
 

Ihaveworms

Ukfay Ancerkay
Joined
Jul 25, 2006
Messages
4,366
I was watching this and realized I recognized the place. This is downtown Knoxville, TN.
 

travisty

Gawd
Joined
Feb 3, 2016
Messages
815
That's only a fraction of it, likely. But to put it in context, the human mind does this subconsciously. So just imagine the amount of data we're actually processing from our sensors at any given moment.

It is impressive what the human mind can process, but autonomous vehicles are far more advanced. We have two visual inputs, plus memory and audio inputs, to intelligently guess at what's outside our visual field of view.

Autonomous vehicles, by contrast, actually know what's around the vehicle at all times with lidar (Waymo/GM), 360-degree cameras (Tesla), and radar.
 

Darunion

2[H]4U
Joined
Oct 6, 2010
Messages
4,050
That's only a fraction of it, likely. But to put it in context, the human mind does this subconsciously. So just imagine the amount of data we're actually processing from our sensors at any given moment.

We surpassed what our brains could do a long time ago. And with the average person's reflexes being around 100 ms, we beat that as well. We learn and adapt and think in a different way, but we certainly don't process faster. Not to mention our data recording is terrible, and we are easily distracted, de-prioritizing important, life-threatening things for something flashy like a billboard or a text message.

Just look at how long it takes a person to decide what to eat for dinner. :p
 

cdabc123

2[H]4U
Joined
Jun 21, 2016
Messages
2,759
That's only a fraction of it, likely. But to put it in context, the human mind does this subconsciously. So just imagine the amount of data we're actually processing from our sensors at any given moment.

Our vision isn't all that impressive, and tech has long since passed it. What's impressive is how much we are able to do with this info thanks to our brains (which we are FAR from being able to replicate).
 

Dead Parrot

2[H]4U
Joined
Mar 4, 2013
Messages
2,831
It is impressive what the human mind can process, but autonomous vehicles are far more advanced. We have two visual inputs, plus memory and audio inputs, to intelligently guess at what's outside our visual field of view.

Autonomous vehicles, by contrast, actually know what's around the vehicle at all times with lidar (Waymo/GM), 360-degree cameras (Tesla), and radar.

Unless the target is a lady pushing a bicycle in the dark. The last report I read said the car saw her in plenty of time to avoid her, but spent several seconds switching between a target ID of person, bicycle, and basically WTF instead of simply steering around an unknown target.

Another advantage humans have is that our sensors are passive. 1 human or 10,000, our sensors don't interfere with each other. Not sure we will be able to say the same once we have 1,000 cars in an urban street grid all transmitting lidar and sonar pulses that bounce off the people, other cars, and buildings.
 

Jagger100

Supreme [H]ardness
Joined
Oct 31, 2004
Messages
7,626
We surpassed what our brains could do a long time ago. And with the average person's reflexes being around 100 ms, we beat that as well. We learn and adapt and think in a different way, but we certainly don't process faster. Not to mention our data recording is terrible, and we are easily distracted, de-prioritizing important, life-threatening things for something flashy like a billboard or a text message.

Just look at how long it takes a person to decide what to eat for dinner. :p
That's why AI in video games is so good?
 

Darunion

2[H]4U
Joined
Oct 6, 2010
Messages
4,050
That's why AI in video games is so good?

Completely different thing. The processor is controlling the environment, sound placement, NPCs, all the graphics. Meanwhile the player's brain is allowing drool to fall out the side of his mouth while he is only utilizing his thumbs and maybe a couple of fingers, and target correction is added to assist the player in aiming so they don't get discouraged.

We were talking about the capability, not what is scripted for a video game.

Our creativity is what separates us and puts us above a computer. Our processing power is not.
 

serpretetsky

[H]ard|Gawd
Joined
Dec 24, 2008
Messages
1,791
Computer vision is definitely very impressive, and it may very well excel beyond human vision at processing the particular types of inputs it was trained for. But I definitely disagree that computer vision at this current moment is "more complex" or "processes more data" than human vision. The entirety of human vision, from processing primitive shapes and lines, to being able to recognize and classify an enormous number of objects and use that as feedback to look at the original image again, is impressive.

Show the average human a random photo of a scene and he's going to be able to extrapolate and deduce SO MUCH MORE information from that photo than any computer vision system. Things like motion can be deduced from a still photo because of the amount of information a human can get from a photo (guy riding a bicycle? you can probably guess roughly how fast he is going and in which direction. ). See some weird contraption? You can probably begin seeing individual parts on the contraption and start deducing what the contraption might do based on parts you see.

There is a lot of "raw input <-> basic shapes <-> gradient/lighting and stereoscopic vision <-> 3D object modeling <-> object classification and deconstruction <-> reanalyze image based on new object data" processing going on, and it is not linear like I wrote it; it goes back and forth between one step and another to provide amazing amounts of information.

You can probably argue much of what I'm describing isn't vision, but I think that it's so intertwined in humans it's hard to say where vision stops and where other brain processes begin.

edit: Disclaimer: I am not a vision expert in either machine learning or human cognitive abilities
 
Last edited:

luke51087

2[H]4U
Joined
Aug 7, 2005
Messages
2,403
Did anyone notice how much the car drifted into the other lane at 5:30? It also seems that it had no idea it needed to slow down that much on the exit.
 

luke51087

2[H]4U
Joined
Aug 7, 2005
Messages
2,403
That's what you got out of all that?

I mean, if I was in the other lane and it decided to do that, I would not be that happy. It is pretty neat to see the overlays on the cameras and watch it function.
 

toast0

[H]ard|Gawd
Joined
Jan 26, 2010
Messages
1,039
Unless the target is a lady pushing a bicycle in the dark. The last report I read said the car saw her in plenty of time to avoid her, but spent several seconds switching between a target ID of person, bicycle, and basically WTF instead of simply steering around an unknown target.

Another advantage humans have is that our sensors are passive. 1 human or 10,000, our sensors don't interfere with each other. Not sure we will be able to say the same once we have 1,000 cars in an urban street grid all transmitting lidar and sonar pulses that bounce off the people, other cars, and buildings.

I never imagined I'd be defending the concept of autonomous cars, but it's clear that Uber's system isn't living up to the potential here in so many ways. It's still pretty cool to see this video with augmented data from the Tesla systems; it demonstrates the potential of object sensing and tracking in all directions in a way that's somewhere between hard and impossible for a human to do consistently at this level, although I'm far from ready to trust something like this to drive for me. I'd love to see a similar video from Waymo; I think they have made a lot of different decisions about what to look for, and it would be pretty cool to see the difference.
 

kju1

2[H]4U
Joined
Mar 27, 2002
Messages
3,293
I never imagined I'd be defending the concept of autonomous cars, but it's clear that Uber's system isn't living up to the potential here in so many ways. It's still pretty cool to see this video with augmented data from the Tesla systems; it demonstrates the potential of object sensing and tracking in all directions in a way that's somewhere between hard and impossible for a human to do consistently at this level, although I'm far from ready to trust something like this to drive for me. I'd love to see a similar video from Waymo; I think they have made a lot of different decisions about what to look for, and it would be pretty cool to see the difference.

You know what though...I will trust it to drive as long as I can override it anytime I want. I will not trust a computer to drive for me w/out that ability unless they invent a way to transfer my consciousness into a new body postmortem.
 

travisty

Gawd
Joined
Feb 3, 2016
Messages
815
Unless the target is a lady pushing a bicycle in the dark. The last report I read said the car saw her in plenty of time to avoid her, but spent several seconds switching between a target ID of person, bicycle, and basically WTF instead of simply steering around an unknown target.

Another advantage humans have is that our sensors are passive. 1 human or 10,000, our sensors don't interfere with each other. Not sure we will be able to say the same once we have 1,000 cars in an urban street grid all transmitting lidar and sonar pulses that bounce off the people, other cars, and buildings.

One pedestrian death out of how many millions of miles, with Uber's inferior technology. How about Waymo: 0 out of 10 million autonomous miles. Tesla: 0 out of 1 billion miles.
 
Last edited:

mdburkey

Limp Gawd
Joined
Jan 19, 2007
Messages
498
I was watching this and realized I recognized the place. This is downtown Knoxville, TN.

Yep...I was like this is cool...then, wait a minute, that's Summit Hill Dr....all the way out to the West Hill's exit and to West Town Mall.
 

Dead Parrot

2[H]4U
Joined
Mar 4, 2013
Messages
2,831
I never imagined I'd be defending the concept of autonomous cars, but it's clear that Uber's system isn't living up to the potential here in so many ways. It's still pretty cool to see this video with augmented data from the Tesla systems; it demonstrates the potential of object sensing and tracking in all directions in a way that's somewhere between hard and impossible for a human to do consistently at this level, although I'm far from ready to trust something like this to drive for me. I'd love to see a similar video from Waymo; I think they have made a lot of different decisions about what to look for, and it would be pretty cool to see the difference.

I am not against the concept of autonomous cars. I was addressing a post that implied such cars have everything around them properly identified all the time. According to the report I read, the Uber car didn't. The sad thing is that death was probably avoidable: the car DID detect the lady+bicycle in plenty of time to avoid her, but the code put identifying the target at a higher priority than avoiding the collision. It used all of the time between the initial detection, several seconds before impact, and the impact itself trying to figure out what it was going to hit. Complex targets like a lady+bicycle, or a homeless person in a shaggy blanket pushing a shopping cart full of crap with a dog in tow, are things these cars are going to have to identify and avoid.
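The priority inversion described above, classify first and avoid second, can be illustrated with a toy sketch. This is purely hypothetical code, not Uber's actual logic; the flickering detection track and both decision policies are invented for illustration.

```python
# Hypothetical sketch: a planner that gates braking on a *stable*
# classification burns its reaction time while the label flips, whereas
# treating any tracked obstacle in the path as avoid-worthy acts at once.

def react_when_classified(detections):
    """Brake only once the same label repeats (waits for a 'stable' ID)."""
    for t, (label, in_path) in enumerate(detections):
        if t > 0 and in_path and label == detections[t - 1][0]:
            return t  # time step at which braking starts
    return None  # never acted

def react_on_any_obstacle(detections):
    """Brake as soon as anything is tracked in the vehicle's path."""
    for t, (label, in_path) in enumerate(detections):
        if in_path:
            return t
    return None

# Flickering classification of the same obstacle, always in the car's path
track = [("vehicle", True), ("bicycle", True), ("person", True),
         ("bicycle", True), ("bicycle", True)]

print(react_when_classified(track))  # 4: acts only after the label settles
print(react_on_any_obstacle(track))  # 0: acts on the first detection
```

The point of the sketch is that the second policy does not need to know *what* the obstacle is before deciding it must not be hit, which matches the "simply steer around an unknown target" behavior argued for earlier in the thread.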
 

ol1bit

[H]ard|Gawd
Joined
Jan 15, 2007
Messages
1,232
That is some awesome programming, to be sure! Of course, replacing the Model 3 windshield is $1,800! My buddy found this out with his 2018 Audi A4: cracked windshield (covered), but the ADAS re-calibration by Audi took 8 hours and cost $850!

So I'm not so sure people are prepared for the cost of maintaining these systems. Human cost: a coffee and a doughnut. :)
 

kju1

2[H]4U
Joined
Mar 27, 2002
Messages
3,293
That is some awesome programming, to be sure! Of course, replacing the Model 3 windshield is $1,800! My buddy found this out with his 2018 Audi A4: cracked windshield (covered), but the ADAS re-calibration by Audi took 8 hours and cost $850!

So I'm not so sure people are prepared for the cost of maintaining these systems. Human cost: a coffee and a doughnut. :)

What would your buddy's A4 windshield replacement have to do with a Model 3?
 

dyzophoria

Gawd
Joined
Jan 17, 2006
Messages
946
That is actually cool. You can say it still has some quirks, but the technology really has come a long way.
 