This code will transform your webcam into eye-tracking tech

There may now be a way for website owners and developers to determine what catches a user’s eye on a webpage as they scroll.

Brown University computer scientists have developed new software, called WebGazer.js, that transforms ordinary webcams into eye-tracking tech that can determine where a user is looking on a particular webpage.

The software can be embedded in any website with a few lines of code and runs in the user’s browser. Of course, the user’s permission is required to access the webcam, but the only thing recorded is the location of the user’s gaze, which is reported back to the website in real time.
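Those few lines look roughly like the following, based on WebGazer’s published JavaScript API (the listener body and the `reportGaze` handler are illustrative, not part of the library):

```javascript
// Assumes webgazer.js has already been loaded via a <script> tag.
// This runs entirely in the user's browser; calling begin() triggers
// the browser's webcam-permission prompt.
webgazer.setGazeListener(function (data, elapsedTime) {
  if (data == null) return;      // no gaze prediction available yet
  // data.x / data.y are the predicted gaze coordinates in screen pixels
  reportGaze(data.x, data.y);    // hypothetical site-specific handler
}).begin();
```

Only the predicted coordinates ever leave the listener; the video frames themselves stay in the browser.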

Eye-tracking helps web developers make better websites, but it's expensive to do. New software, which can be embedded in any website, turns webcams into eye trackers. (Image Credit: Huang Lab / Brown University)

“We see this as a democratization of eye-tracking,” said Alexandra Papoutsaki, a Brown University graduate student who led the development of the software. “Anyone can add WebGazer to their site and get a much richer set of analytics compared to just tracking clicks or cursor movements.”

While eye tracking is not a new technology, and not even new when it comes to tracking web analytics, it usually requires the use of a standalone eye-tracking device that can cost tens of thousands of dollars. The studies using the technology are typically conducted in a lab setting and users need to hold their heads a certain distance from a monitor or wear a headset for it to work.

“We’re using the webcams that are already integrated in users’ computers, which eliminates the cost factor,” said Papoutsaki. “And it’s more naturalistic in the sense that we observe people in the real environment instead of in a lab setting.”

Once the code is embedded into a website, it prompts users to give permission to access their webcams. When permission is granted, the software employs a face-detection library to locate the user’s face and eyes. The system then converts the image to black and white, which enables it to distinguish the sclera (the whites of the eyes) from the iris.
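The sclera/iris separation described above can be sketched as a simple thresholding step: binarize a grayscale eye patch, treat dark pixels as iris, and take their centroid. This is a toy illustration under assumed pixel values, not WebGazer’s actual image pipeline, which relies on an external face-detection library and more robust processing:

```javascript
// Binarize a grayscale eye patch (2-D array of 0-255 intensities) to
// separate the dark iris from the lighter sclera, then estimate the
// iris centre as the centroid of the dark pixels.
function irisCenter(patch, threshold) {
  let sumX = 0, sumY = 0, count = 0;
  for (let y = 0; y < patch.length; y++) {
    for (let x = 0; x < patch[y].length; x++) {
      if (patch[y][x] < threshold) {   // dark pixel -> iris
        sumX += x; sumY += y; count++;
      }
    }
  }
  if (count === 0) return null;        // no dark region found
  return { x: sumX / count, y: sumY / count };
}

// Toy 5x5 patch: bright sclera (200) with a dark iris (30)
// centred at column 3, row 2.
const patch = [
  [200, 200, 200, 200, 200],
  [200, 200, 200,  30, 200],
  [200, 200,  30,  30,  30],
  [200, 200, 200,  30, 200],
  [200, 200, 200, 200, 200],
];
console.log(irisCenter(patch, 128)); // -> { x: 3, y: 2 }
```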

When the software locates the iris, it initializes a statistical model that is calibrated by the user’s clicks and cursor movements. The model assumes that a user is looking where they are clicking, so each click tells the model what the eye looks like when it’s viewing a particular spot. It takes about three clicks to get a reasonable calibration, after which the model can accurately determine the location of the user’s gaze in real time.
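A minimal sketch of that click-based calibration, assuming a single scalar eye feature (say, horizontal pupil offset) per sample: each click pairs the current feature value with the known click coordinate, and an ordinary least-squares fit then maps future feature values to gaze positions. WebGazer itself uses richer pixel features and regression models, so this is illustrative only:

```javascript
// Fit gazeX = slope * feature + intercept from calibration clicks,
// each pairing an eye-feature value with the clicked x coordinate.
function fitLine(samples) {
  const n = samples.length;
  let sf = 0, sx = 0, sff = 0, sfx = 0;
  for (const { feature, clickX } of samples) {
    sf += feature; sx += clickX;
    sff += feature * feature; sfx += feature * clickX;
  }
  const slope = (n * sfx - sf * sx) / (n * sff - sf * sf);
  const intercept = (sx - slope * sf) / n;
  return f => slope * f + intercept;   // predict gaze x from a feature value
}

// Three calibration clicks, matching the "about three clicks" above.
const clicks = [
  { feature: -1.0, clickX: 100 },  // clicked near the left edge
  { feature:  0.0, clickX: 500 },  // centre of the screen
  { feature:  1.0, clickX: 900 },  // right edge
];
const predictGazeX = fitLine(clicks);
console.log(predictGazeX(0.5)); // -> 700
```

A real calibration does this in two dimensions and keeps updating as more clicks arrive, so accuracy improves the longer the user browses.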

The team conducted a series of experiments, which showed that the software can infer gaze location within 100 to 200 screen pixels. While it’s not as accurate as expensive commercial eye trackers, it still provides a good estimation of where the user is looking, according to Papoutsaki.

Papoutsaki and her colleagues see the tool as one that can help website owners prioritize popular or eye-catching content, optimize a page’s usability, and place and price advertising space.

For example, a newspaper website “could learn what articles you read on a page, how long you read them and in what order,” said Jeff Huang, an assistant professor of computer science at Brown and co-developer of the software.

The team will now continue to refine the software, which it believes will open even more applications down the road, possibly in eye-controlled gaming or helping people with physical impairments to navigate the web.

“Our purpose here was to give the tool both to the scientific community and to developers and owners of websites and see how they choose to adopt it,” Papoutsaki said.

The software code is available to anyone who wants it.
