A method for controlling gadgets by tapping one's skin, using a combination of sensors and software, has been developed by Carnegie Mellon University Ph.D. student Chris Harrison and colleagues at Microsoft Research. The system places sensors on the arm that listen for acoustic input, while the software can be trained to associate specific sounds with specific tap locations, to which different gadget functions can then be assigned. Early tests show the system can distinguish among five tap locations with more than 95 percent accuracy after a short training period.
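The article does not describe the classification algorithm, but the train-then-recognize workflow it outlines can be illustrated with a minimal sketch: extract simple acoustic features from each tap's waveform, average them per location during training, and then label a new tap by its nearest location centroid. Everything here (the `extract_features`, `train`, and `classify` names, the toy features, and the synthetic tap data) is a hypothetical illustration, not the researchers' actual method.

```python
import math
import random

def extract_features(waveform):
    """Toy acoustic features: signal energy and zero-crossing rate.
    (Illustrative only; the real system uses richer bio-acoustic features.)"""
    energy = sum(x * x for x in waveform) / len(waveform)
    zcr = sum(1 for a, b in zip(waveform, waveform[1:]) if a * b < 0) / len(waveform)
    return (energy, zcr)

def train(examples):
    """examples: list of (waveform, location_label).
    Returns one mean feature vector (centroid) per tap location."""
    sums = {}
    for waveform, label in examples:
        e, z = extract_features(waveform)
        se, sz, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (se + e, sz + z, n + 1)
    return {label: (se / n, sz / n) for label, (se, sz, n) in sums.items()}

def classify(model, waveform):
    """Assign a tap to the location whose centroid is nearest in feature space."""
    f = extract_features(waveform)
    return min(model, key=lambda label: math.dist(model[label], f))

# Synthetic stand-in for real sensor data: each of five locations produces a
# tone with a distinct frequency and amplitude, plus a little noise.
random.seed(0)
PROFILES = [(2, 0.2), (5, 0.5), (9, 0.9), (14, 1.3), (20, 1.8)]

def tap(loc, n_samples=128):
    freq, amp = PROFILES[loc]
    return [amp * math.sin(2 * math.pi * freq * t / n_samples) + random.gauss(0, 0.01)
            for t in range(n_samples)]

train_set = [(tap(loc), loc) for loc in range(5) for _ in range(10)]
model = train(train_set)
correct = sum(classify(model, tap(loc)) == loc for loc in range(5) for _ in range(20))
accuracy = correct / 100
```

On this cleanly separated toy data the nearest-centroid rule easily clears the 95 percent figure reported in the article; real on-body acoustics would demand far more robust features and classifiers.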
Harrison envisions three applications for the system: controlling an in-pocket gadget via Bluetooth; controlling a music player attached to the user's upper arm; and turning the forearm or hand into a display surface in conjunction with a pico-projector.
From BBC News
Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA