Carnegie Mellon University (CMU) robotics researchers have developed an algorithm that uses crowdsourcing to detect where people's gazes intersect. The researchers tested their method by monitoring the gaze patterns of volunteers equipped with head-mounted displays. The data enabled researchers to determine if the volunteers were listening to a single speaker, interacting as a group, or watching a bouncing ball in a ping-pong game.
The researchers say robots could use the algorithm to evaluate social cues, such as facial expressions or body movements, or to process data from other types of visual or audio sensors. "This really is just a first step toward analyzing the social signals of people," says CMU's Hyun Soo Park.
The technology could eventually help robots understand their social environment. The head-mounted cameras provide precise data about what users are looking at in social settings, and the CMU algorithm can automatically estimate the number and 3-D position of "gaze concurrences," the points where the gazes of multiple people intersect. CMU professor Yaser Sheikh says the cameras also could be used by people who work in cooperative teams with robots.
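The article does not detail the CMU algorithm itself, but the geometric core of a "gaze concurrence" can be illustrated: given each person's eye position and gaze direction as a 3-D ray, the point nearest to all rays can be found by least squares. The sketch below is an illustrative assumption, not the researchers' method; the function name `gaze_concurrence` is hypothetical.

```python
import numpy as np

def gaze_concurrence(origins, directions):
    """Least-squares 3-D point closest to a set of gaze rays.

    origins: list of (x, y, z) eye/camera positions, one per person.
    directions: list of (x, y, z) gaze directions (any length, nonzero).
    Returns the point minimizing the sum of squared distances to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)           # unit gaze direction
        P = np.eye(3) - np.outer(d, d)      # projects onto plane normal to d
        A += P                              # accumulate normal equations
        b += P @ o
    return np.linalg.solve(A, b)

# Two people at (1, 0, 0) and (-1, 0, 0), both looking at the point (0, 0, 2):
point = gaze_concurrence([[1, 0, 0], [-1, 0, 0]],
                         [[-1, 0, 2], [1, 0, 2]])
print(point)  # ≈ [0, 0, 2]
```

With noisy real-world gaze estimates the rays rarely intersect exactly, so the least-squares point serves as the estimated concurrence; clustering such estimates would be needed to count multiple simultaneous concurrences, as the CMU system reportedly does.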
From Carnegie Mellon News
Abstracts Copyright © 2012 Information Inc., Bethesda, Maryland, USA