Researchers Use Eye Tracking and AI to Control a Drone

AlphaAtlas

[H]ard|Gawd
Staff member
Joined
Mar 3, 2018
Messages
1,713
Researchers from the University of Pennsylvania, the U.S. Army Research Laboratory, and New York University created a system that can control a drone with nothing but eye movement. The setup uses Tobii Pro Glasses 2 to track both eyes and provide an HD camera feed, which is fed to an Nvidia Jetson TX2 system running a deep neural network. The software uses the drone's apparent size and the user's head orientation to estimate the drone's position in 3D space, computes a pointing vector from the user's gaze, and sends the drone to the resulting waypoint. This makes the system self-contained, with no need for GPS or for any sensors external to the user.
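
The post doesn't spell out the localization math, but recovering a drone's 3D position from its apparent size is the standard pinhole-camera similar-triangles trick. Here's a minimal sketch of that idea; the function name, the assumption of a known drone width, and the intrinsics parameters are all illustrative, not taken from the paper:

```python
import numpy as np

def drone_position_from_apparent_size(bbox_px, drone_width_m, fx, fy, cx, cy):
    """Estimate a drone's 3D position in the camera frame from its
    detected bounding box, assuming a pinhole camera model.

    bbox_px: (u_min, v_min, u_max, v_max) detection in pixels
    drone_width_m: known physical width of the drone (assumed here)
    fx, fy, cx, cy: camera intrinsics (focal lengths, principal point)
    """
    u_min, v_min, u_max, v_max = bbox_px
    width_px = u_max - u_min
    # Similar triangles: depth = focal_length * real_width / pixel_width
    z = fx * drone_width_m / width_px
    # Back-project the bounding-box center to a 3D point at that depth
    u_c = (u_min + u_max) / 2.0
    v_c = (v_min + v_max) / 2.0
    x = (u_c - cx) * z / fx
    y = (v_c - cy) * z / fy
    return np.array([x, y, z])
```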

Check out a video of the drone in action here.

"To compute the 3D navigation waypoint, we use the 2D gaze coordinate provided from the glasses to compute a pointing vector from the glasses, and then randomly select the waypoint depth within a predefined safety zone. Ideally, the 3D navigation waypoint would come directly from the eye tracking glasses, but we found in our experiments that the depth component reported by the glasses was too noisy to use effectively. In the future, we hope to further investigate this issue in order to give the user more control over depth."

It's somewhat remarkable that the glasses report depth information from pupil-tracking data at all, to be honest, but you can see how difficult this would be: distinguishing the eye vergence between looking at something 20 feet away and something 25 feet away. Those five feet could easily be the difference between an intact drone and one in sad little pieces on the ground, especially if the drone is being controlled by an untrained user, which is, after all, the idea.
 
I would be in so much trouble... "Honey, I crashed the drone again today..." "Was she at least attractive this time?"
 