See the World Through the Eyes of a Tesla

Discussion in 'HardForum Tech News' started by AlphaAtlas, Oct 23, 2018.

  1. AlphaAtlas

    AlphaAtlas [H]ard|Gawd Staff Member

    Messages:
    1,716
    Joined:
    Mar 3, 2018
    Tesla made some big changes to its Autopilot with the v9 update. The neural net can now use all of the car's cameras to recognize objects on the road, and one particular Tesla owner managed to overlay the computer's detection metadata on top of the camera feeds.

    Check out the footage here, or watch a more problematic video where the Tesla failed to recognize some road debris here.
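    For anyone curious how an overlay like the one in the video can be produced, here is a minimal sketch in Python/OpenCV. The file names, JSON layout, and box format are hypothetical placeholders for illustration; the actual format of the captured Autopilot metadata isn't documented here. The idea is simply to read a recorded feed frame by frame and draw labeled boxes from a per-frame detection log.

    ```python
    import json
    import cv2  # pip install opencv-python

    def draw_detections(frame, detections):
        """Draw labeled bounding boxes (x, y, w, h in pixels) onto one frame."""
        for det in detections:
            x, y, w, h = det["box"]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, f'{det["label"]} {det["score"]:.2f}',
                        (x, max(y - 5, 0)), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
        return frame

    # Hypothetical inputs: one recorded camera feed plus a per-frame detection log.
    cap = cv2.VideoCapture("main_camera.mp4")
    with open("detections.json") as f:
        # e.g. [[{"label": "car", "score": 0.93, "box": [100, 200, 80, 60]}, ...], ...]
        per_frame = json.load(f)

    writer, idx = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if writer is None:
            h, w = frame.shape[:2]
            writer = cv2.VideoWriter("overlay.mp4", cv2.VideoWriter_fourcc(*"mp4v"), 30, (w, h))
        dets = per_frame[idx] if idx < len(per_frame) else []
        writer.write(draw_detections(frame, dets))
        idx += 1

    cap.release()
    if writer is not None:
        writer.release()
    ```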

    Well, as great as the (I think) v8.1 autopilot footage was, its release was somewhat marred by the v9 update that quickly followed, in which Tesla improved their offering quite a bit (and has taken away some of the new features since then). More importantly, they started using all the cameras, and that created quite a few problems in capturing those feeds, especially without Tesla actually cooperating.

    The solution I chose was to just limit the framerate on all the cameras except the main one, since the storage driver on the ape (the Autopilot computer) can handle at most 80 MB/sec. 9 fps is used for the side cameras and 6 fps for the fisheye (the car actually gets 36 fps feeds from all cameras except the backup, where it gets 30 fps), and even that tends to overwhelm the drive from time to time, which shows up as stuck frames here and there.

    One of the problems is that the backup camera is actually quite unreliable, and in many runs there's no output captured from it, so I decided not to collect it at all for now. (You know it's not working when a car pulls up real close behind you at a traffic light and nothing shows on the IC; surprisingly, the CID backup cam display still works, so Tesla apparently papered over the old "freeze frame" issue on the display but not the autopilot problem of the same kind.)

    It's also notable that different cameras appear to get different detection rates, and since I cannot predict which ones are which, detections sometimes seem a little off; I know it's most likely a sampling artifact.
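    To get a feel for why the side and fisheye feeds had to be down-sampled, here is a rough back-of-the-envelope sketch. Only the frame rates (36 fps native, 9 fps side, 6 fps fisheye) and the ~80 MB/sec storage ceiling come from the post above; the per-frame size and the exact camera list are hypothetical assumptions.

    ```python
    # Back-of-the-envelope capture budget. ASSUMED_FRAME_BYTES and the camera
    # list are guesses; the storage limit and frame rates are from the post.
    MB = 1024 * 1024
    STORAGE_LIMIT = 80 * MB            # bytes/sec the storage driver can sustain
    ASSUMED_FRAME_BYTES = 1.0 * MB     # hypothetical average size of one captured frame

    feeds = {
        # camera feed: captured fps (backup camera is skipped entirely)
        "main": 36,            # left at the native 36 fps
        "fisheye": 6,          # down-sampled from 36 fps
        "left_pillar": 9,
        "right_pillar": 9,
        "left_repeater": 9,
        "right_repeater": 9,
    }

    total = sum(fps * ASSUMED_FRAME_BYTES for fps in feeds.values())
    print(f"Estimated write rate: {total / MB:.1f} MB/s "
          f"({total / STORAGE_LIMIT:.0%} of the {STORAGE_LIMIT / MB:.0f} MB/s budget)")
    ```

    With those assumptions the capture sits right at the edge of the budget, which is consistent with the drive occasionally being overwhelmed.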
     
  2. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    20,982
    Joined:
    Sep 13, 2008
    that's pretty nuts seeing just how much it's keeping track of at the same time.
     
  3. Darunion

    Darunion 2[H]4U

    Messages:
    3,669
    Joined:
    Oct 6, 2010
    Vision systems are always neat to watch. The nerd in me always thinks of Terminator vision.
     
  4. Spidey329

    Spidey329 [H]ardForum Junkie

    Messages:
    8,698
    Joined:
    Dec 15, 2003
    That's only a fraction of it, likely. But to put it in context, the human mind does this subconsciously. So just imagine the amount of data we're actually processing from our sensors at any given moment.
     
    BloodyIron and Skyblue like this.
  5. Ihaveworms

    Ihaveworms [H]ardness Supreme

    Messages:
    4,261
    Joined:
    Jul 25, 2006
    I was watching this and realized I recognized the place. This is downtown Knoxville, TN.
     
    mdburkey likes this.
  6. travisty

    travisty Gawd

    Messages:
    816
    Joined:
    Feb 3, 2016
    It is impressive what the human mind can process, but autonomous vehicles are far more advanced in this respect. We only have two visual inputs, plus memory and audio inputs, to intelligently guess at what's outside our visual field of view.

    Autonomous vehicles actually know what's around the vehicle at all times with lidar (Waymo/GM), 360-degree cameras (Tesla), and radar.
     
  7. Darunion

    Darunion 2[H]4U

    Messages:
    3,669
    Joined:
    Oct 6, 2010
    We surpassed what our brains could do a long time ago. And with the average person's reflexes being around 100 ms, we beat that as well. We learn and adapt and think in a different way, but we certainly don't process faster. Not to mention our data recording is terrible, and we are easily distracted, de-prioritizing important life-threatening things for something flashy like a billboard or a text message.

    Just look at how long it takes a person to decide what to eat for dinner. :p
     
  8. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,171
    Joined:
    Jun 21, 2016
    Our vision isn't all that impressive, and tech has long since surpassed it. What's impressive is how much we are able to do with that info thanks to our brains (which we are FAR from being able to replicate).
     
  9. Dead Parrot

    Dead Parrot 2[H]4U

    Messages:
    2,330
    Joined:
    Mar 4, 2013
    Unless the target is a lady pushing a bicycle in the dark. The last report I read said the car saw her in plenty of time to avoid her, but spent several seconds switching between a target ID of person, bicycle, and basically WTF instead of simply steering around an unknown target.

    Another advantage humans have is that our sensors are passive. 1 human or 10,000, our sensors don't interfere with each other. Not sure we will be able to say the same once we have 1,000 cars in an urban street grid, all transmitting lidar and sonar pulses that bounce off the people, other cars, and buildings.
     
  10. Jagger100

    Jagger100 [H]ardness Supreme

    Messages:
    7,375
    Joined:
    Oct 31, 2004
    That's why AI in video games is so good?
     
  11. Darunion

    Darunion 2[H]4U

    Messages:
    3,669
    Joined:
    Oct 6, 2010
    Completely different thing. The processor is controlling the environment, sound placement, NPCs, all the graphics. Meanwhile the player's brain is letting drool fall out the side of his mouth while he's only using his thumbs and maybe a couple of fingers, and target correction is added to assist the player in aiming so they don't get discouraged.

    We were talking about the capability, not what is scripted for a video game.

    Our creativity is what separates us and puts us above a computer. Our processing power is not.
     
  12. serpretetsky

    serpretetsky [H]ard|Gawd

    Messages:
    1,674
    Joined:
    Dec 24, 2008
    Computer vision is definitely very impressive, and it may very well excel beyond human vision at processing the particular types of inputs it was trained for. But I definitely disagree that computer vision at this moment is "more complex" or "processes more data" than human vision. The entirety of human vision, from processing primitive shapes and lines to being able to recognize and classify an enormous number of objects and use that as feedback to look at the original image again, is impressive.

    Show the average human a random photo of a scene and he's going to be able to extrapolate and deduce SO MUCH MORE information from that photo than any computer vision system. Things like motion can be deduced from a still photo because of the amount of information a human can get from it (guy riding a bicycle? You can probably guess roughly how fast he is going and in which direction). See some weird contraption? You can probably start picking out individual parts on the contraption and deducing what it might do based on the parts you see.

    There is a lot of "raw input <-> basic shapes <-> gradient/lighting and stereoscopic vision <-> 3D object modeling <-> object classification and deconstruction <-> re-analyze image based on new object data" processing going on, and it is not linear like I wrote it; it goes back and forth between one step and another to provide amazing amounts of information.

    You can probably argue much of what I'm describing isn't vision, but I think that it's so intertwined in humans it's hard to say where vision stops and where other brain processes begin.

    edit: Disclaimer: I am not a vision expert in either machine learning or human cognitive abilities.
     
    Last edited: Oct 23, 2018
  13. luke51087

    luke51087 2[H]4U

    Messages:
    2,381
    Joined:
    Aug 7, 2005
    Did anyone notice how much the car drifted into the other lane at 5:30? It also seems that it had no idea it needed to slow down that much on the exit.
     
    John721 likes this.
  14. kju1

    kju1 2[H]4U

    Messages:
    3,188
    Joined:
    Mar 27, 2002
    That's what you got out of all that?
     
  15. luke51087

    luke51087 2[H]4U

    Messages:
    2,381
    Joined:
    Aug 7, 2005
    I mean, if I was in the other lane and it decided to do that, I would not be that happy. It is pretty neat to see the overlays on the cameras and watch it function.
     
  16. toast0

    toast0 Gawd

    Messages:
    894
    Joined:
    Jan 26, 2010
    I never imagined I'd be defending the concept of autonomous cars, but it's clear that Uber's system isn't living up to the potential here in so many ways. It's still pretty cool to see this video with augmented data from the Tesla systems; it demonstrates the potential of object sensing and tracking in all directions in a way that's somewhere between hard and impossible for a human to do consistently at this level, although I'm far from ready to trust something like this to drive for me. I'd love to see a similar video from Waymo; I think they have made a lot of different decisions about what to look for, and it would be pretty cool to see the difference.
     
  17. kju1

    kju1 2[H]4U

    Messages:
    3,188
    Joined:
    Mar 27, 2002
    You know what though...I will trust it to drive as long as I can override it anytime I want. I will not trust a computer to drive for me w/out that ability unless they invent a way to transfer my consciousness into a new body postmortem.
     
  18. travisty

    travisty Gawd

    Messages:
    816
    Joined:
    Feb 3, 2016
    One pedestrian death out of how many million miles, with Uber's inferior technology? How about Waymo: 0 out of 10 million autonomous miles. Tesla: 0 out of 1 billion miles.
     
    Last edited: Oct 23, 2018
  19. toast0

    toast0 Gawd

    Messages:
    894
    Joined:
    Jan 26, 2010
  20. travisty

    travisty Gawd

    Messages:
    816
    Joined:
    Feb 3, 2016
    I was strictly referring to pedestrian deaths

    For deaths per mile in general, Waymo leads the way with 0. Tesla is around 1 death per 300 million miles driven, compared to humans at about 1 death per 86 million miles.

    Remember too, autonomous vehicles are only getting better exponentially :)
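    For what it's worth, here's a quick sketch that puts the figures quoted in this thread onto a common per-100-million-mile scale. The death counts and mileages below are the claims made in these posts, not verified statistics.

    ```python
    # Per-mile fatality rates from the numbers claimed in this thread.
    claims = {
        "Waymo":         (0, 10_000_000),    # deaths, autonomous miles (claimed)
        "Tesla":         (1, 300_000_000),   # ~1 death per 300M Autopilot miles (claimed)
        "Human drivers": (1, 86_000_000),    # ~1 death per 86M miles (claimed)
    }

    for who, (deaths, miles) in claims.items():
        rate = deaths / miles * 100_000_000
        print(f"{who}: {rate:.2f} deaths per 100 million miles")
    ```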
     
  21. mdburkey

    mdburkey Limp Gawd

    Messages:
    500
    Joined:
    Jan 19, 2007
    Yep... I was like, this is cool... then, wait a minute, that's Summit Hill Dr.... all the way out to the West Hills exit and to West Town Mall.
     
  22. Dead Parrot

    Dead Parrot 2[H]4U

    Messages:
    2,330
    Joined:
    Mar 4, 2013
    I am not against the concept of autonomous cars. I was addressing a post that implied such cars have everything around them properly identified all the time. According to the report I read, the Uber car didn't. The sad thing is that death was probably avoidable. The car DID detect the lady + bicycle in plenty of time to avoid her, but the code put identifying the target at a higher priority than avoiding the collision. It used all of the time between the initial detection, several seconds before impact, and the impact itself trying to figure out what it was going to hit. Complex targets like a lady with a bicycle, or a homeless person in a shaggy blanket pushing a shopping cart full of crap with a dog in tow, are things these cars are going to have to identify and avoid.
     
  23. Lamont

    Lamont 2[H]4U

    Messages:
    3,164
    Joined:
    Oct 26, 2004
    Kinda waited to see if it started marking people as "threats".
     
  24. BloodyIron

    BloodyIron 2[H]4U

    Messages:
    3,459
    Joined:
    Jul 11, 2005
    This is really neat. I want one!
     
  25. ol1bit

    ol1bit [H]ard|Gawd

    Messages:
    1,234
    Joined:
    Jan 15, 2007
    That is some awesome programming, to be sure! Of course, replacing the Model 3 windshield is $1,800! My buddy found this out with his 2018 Audi A4: cracked windshield (covered), then an ADAS re-calibration by Audi, 8 hours, $850!

    So I'm not so sure people are prepared for the cost of maintaining these systems. The human equivalent costs a coffee and a doughnut. :)
     
  26. termite

    termite [H]ardness Supreme

    Messages:
    4,385
    Joined:
    Aug 27, 2004
    They need to work on those hit boxes.
     
  27. kju1

    kju1 2[H]4U

    Messages:
    3,188
    Joined:
    Mar 27, 2002
    What would your buddy's A4 windshield replacement have to do with a Model 3?
     
  28. dyzophoria

    dyzophoria Gawd

    Messages:
    947
    Joined:
    Jan 17, 2006
    That is actually cool. You could say it still has some quirks to it, but the technology really has come a long way.