As you read this page, the movement of your eyes shows where you are focusing your attention. Researchers have found many uses for tools that track where people are looking. But the systems are still expensive, and their widespread adoption as user interfaces will need a compelling, high-volume application.
Over the years, engineers have explored various ways to track eye movement. Sensors mounted on the eyeball, or even bulky headgear, are accurate but are considered too intrusive for casual users. In recent years, however, companies have made great strides with more discreet systems that use a camera to follow the reflections of an infrared source in the eyeball while compensating for head motion. The Swedish company Tobii Technology, for example, claims to measure gaze position to within half a degree, roughly the apparent width of a thumbnail at arm's length.
So far, though, the systems cost thousands of dollars. This is acceptable for researchers who use a single system to study how people interact with information displays, for example in psychology, interface design, or advertising. It's also seen as a modest price to assist disabled people who can't control traditional mouse interfaces. But these premium markets are too small to drive down costs to a level that would make eye-tracking generally accessible.
"The biggest impediment is the widespread availability of low-cost accurate eye tracking devices," says Manu Kumar, founder of venture capital fund K9 Ventures. A few years ago, Kumar's research at Stanford University explored how gaze-based password entry could foil "shoulder surfers," who steal passwords by observing keystrokes. This April, at CHI 2010, the ACM Conference on Human Factors in Computing Systems, Robert Biddle, professor of human-computer interaction at Carleton University in Ottawa, Ontario, and his colleagues extended this idea to gaze-based "graphical passwords," which exploit people's image-recognition skills to make passwords easier to remember and harder for others to guess.
Biddle is cautious about the results so far. But eye tracking "is a sensible area to be conducting research, because we do expect this to become ubiquitous soon," he says. "If you look at the actual hardware costs of the technology, there's no reason that, in high volume production, the price couldn't come down enormously." Still, using the eyes for pointing has limitations, Biddle notes. "Physiologically that's not what eyes are all about." So most interface schemes require users to indicate when to act on their gaze, for example by pressing a key.
In contrast to such "explicit" control, researchers from the Kaiserslautern laboratory of the German Research Center for Artificial Intelligence (DFKI) describe the "implicit" use of eye tracking to enhance reading. This enhancement, which they call "Text 2.0," could take the form of Hollywood-style music or sound effects when particular passages are read. Alternatively, eye tracking could trigger context-sensitive translations, pronunciations, or explanations, as the team illustrates in a demonstration video. "What we are presenting is a sort of information delivery right at your—I would [ordinarily] say fingertips—but [in this case] right at a glance," says Ralf Biedert. His colleague Georg Buscher notes that the right software could provide much of this information without any extra human annotation of the text.
Despite recent advances in eye-tracking technology, the German researchers agree that cost is a barrier. "What we need is an application for the masses that everyone wants to have," Buscher notes. One likely candidate is gaming, which has previously adopted other novel interfaces. But Biedert says he wants to show that eye tracking can benefit serious activities, too. "It's not only games."