Eye-tracking tech uses cellphone camera, knows where users are looking

Over the past 40 years, many groups have used eye-tracking technology to determine where people are directing their gaze. The technology has been used in psychological experiments and marketing research, and it can now work with just an ordinary cellphone camera.

Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory and the University of Georgia have leveraged crowd-sourced data to determine where mobile users are looking by turning any smartphone into an eye-tracking device.

The system could be used to create new computer interfaces or detect signs of neurological disease or mental illness, as well as make existing applications of eye-tracking technology more accessible.

“The field is kind of stuck in this chicken-and-egg loop,” said Aditya Khosla, an MIT graduate student in electrical engineering and computer science and co-first author on the paper. “Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices. We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”

Researchers developed a simple application for devices that use Apple’s iOS operating system. (Illustration Credit: Christine Daniloff/MIT)

Khosla and colleagues Kyle Krafka of the University of Georgia, MIT professors of electrical engineering and computer science Wojciech Matusik and Antonio Torralba, and three others built their device using machine learning, in which computers learn to perform tasks by looking for patterns in large sets of training examples.

The team had a lot of data to work with. Its training set includes examples of gaze patterns from 1,500 mobile-device users. Previously, the largest data sets used to train experimental eye-tracking systems reached their limits at about 50 users.

The team employed crowdsourcing to help assemble their data sets.

In their initial round of experiments, the researchers used data from 800 mobile-device users and got the system’s margin of error down to 1.5 centimeters, a two-fold improvement over previous experimental systems. They have since reduced the margin of error to about a centimeter.

Larger groups help improve the system because the researchers train and re-train it using different-sized subsets of their data. Their work suggests that about 10,000 training examples should be enough to lower the margin of error to a half-centimeter, which Khosla says would be good enough to make the system commercially viable.
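The projection above comes from fitting a trend to how error shrinks as the training set grows. As an illustration only (not the researchers' actual method), the two reported data points can be fit with a simple power law in log space and extrapolated to a hypothetical 10,000-user data set:

```python
import numpy as np

# Reported figures from the article: (training-set size in users,
# gaze-estimation error in centimeters). Two points, so this is a
# toy extrapolation, not the team's analysis.
users = np.array([800.0, 1500.0])
error_cm = np.array([1.5, 1.0])

# Fit error = a * users**b by linear regression in log-log space;
# the slope b comes out negative, so error falls as data grows.
b, log_a = np.polyfit(np.log(users), np.log(error_cm), 1)
a = np.exp(log_a)

def predicted_error(n_users):
    """Extrapolated error (cm) for a hypothetical training-set size."""
    return a * n_users ** b
```

A two-point fit like this only illustrates the scaling idea; the article's half-centimeter estimate at 10,000 users presumably rests on many more subset sizes than are quoted here.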

Training examples were collected using a simple application for devices that use Apple’s iOS. The application flashes a small dot somewhere on the device’s screen, attracting the user’s attention, then briefly replaces it with either an “R” or an “L,” instructing the user to tap either the right or left side of the screen. During this process, the device camera continuously captures images of the user’s face.

Application users were recruited through Amazon’s Mechanical Turk crowdsourcing site and paid a small fee for each successfully executed tap. The data set contains, on average, 1,600 images per user.
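The R/L tap serves as an attention check: a tap counts only if it lands on the side of the screen the letter named. A minimal sketch of that validation logic, with hypothetical names and screen dimensions (the article does not describe MIT's actual code):

```python
# Hypothetical screen width in points; any fixed width works for the check.
SCREEN_WIDTH = 320

def expected_side(letter):
    """Map the flashed letter to the side the user should tap."""
    return "right" if letter == "R" else "left"

def tap_side(tap_x):
    """Classify a tap by which half of the screen it landed on."""
    return "right" if tap_x >= SCREEN_WIDTH / 2 else "left"

def tap_is_valid(letter, tap_x):
    """A tap is 'successfully executed' (and paid) only if it matches."""
    return tap_side(tap_x) == expected_side(letter)
```

For example, `tap_is_valid("R", 250)` accepts a tap on the right half, while `tap_is_valid("L", 250)` rejects it, filtering out inattentive workers before their camera frames enter the training set.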


“In lots of cases — if you want to do a user study, in computer vision, in marketing, in developing new user interfaces — eye tracking is something people have been very interested in, but it hasn’t really been accessible,” said Noah Snavely, an associate professor of computer science at Cornell University. “You need expensive equipment, or it has to be calibrated very well in order to work. So something that will work on a device everyone has, that seems very compelling. And from what I’ve seen, the accuracy they get seems like it’s in the ballpark that you can do something interesting.”

Story via MIT.
