Imbuing robots and other machines with human-like sensory capabilities is an incredibly difficult task. In recent years, machine vision, hearing, and speech have advanced remarkably. Yet building machines that can touch and feel the way people do has proven elusive.
All of this is about to change. Haptics—the use of technology to simulate touch, feeling, and motion—is finally hitting its stride. "A confluence of new actuator and sensor technology, along with ever increasing computing power, is pushing the technology into the mainstream," says Jake Rubin, founder and CEO of HaptX, a developer of haptics systems for robots and virtual reality (VR).
The technology introduces ways to use VR far more effectively—particularly for training in industrial settings. Yet haptics also makes it possible to use robots for a wider array of tasks, and to produce biomechanical limbs that replicate a sense of feel. "We are much closer to creating systems that talk to the brain or interact with nerves in ways that seem realistic," says Gregory A. Clark, director of the Center for Neural Interfaces at the University of Utah.
Out of the Body
Simulating human touch is an incredibly complex endeavor. It requires a deep understanding of physiology and neurobiology along with an ability to translate electrical signals into and out of binary code. Although VR, robotics, and biomechanical limbs incorporate and communicate touch in different ways, there's a common denominator, Clark says: a system ultimately must convince the brain that the signals flowing into it from machines are real.
HaptX, for example, has developed highly specialized gloves that use 130 tiny actuators and microfluidic air channels embedded in a fabric. The gloves displace the skin by up to 2 mm with no perceptible latency, producing ultrarealistic sensations; they also offer 6 degrees of freedom. "When you touch a virtual object, the pixel-like bubbles on this material inflate or deflate precisely to produce the same forces over the same area that that object would produce if it touched your hand in real life," Rubin explains.
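HaptX's control software is proprietary, but the basic idea Rubin describes—driving each bubble-like "tactile pixel" to reproduce the contact force a virtual object would exert at that point on the hand—can be sketched in a hypothetical model. The class name, force range, and linear force-to-displacement mapping below are all illustrative assumptions, not HaptX's actual design:

```python
from dataclasses import dataclass

@dataclass
class TactilePixel:
    """Hypothetical model of one microfluidic bubble actuator."""
    max_displacement_mm: float = 2.0  # article: up to ~2 mm of skin displacement

    def command(self, contact_force_n: float, max_force_n: float = 5.0) -> float:
        """Map the simulated contact force at this point on the hand to a
        target bubble displacement, clamped to the actuator's range.
        A real controller would use a calibrated, likely nonlinear mapping."""
        frac = min(max(contact_force_n / max_force_n, 0.0), 1.0)
        return frac * self.max_displacement_mm

# A virtual sphere pressing hardest at one fingertip, fading outward:
pixel = TactilePixel()
forces = [0.0, 1.25, 2.5, 5.0, 7.0]  # newtons at five neighboring pixels
displacements = [pixel.command(f) for f in forces]  # mm of inflation per pixel
```

The clamp matters: forces beyond the actuator's range saturate at full inflation rather than commanding an impossible displacement.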
Haptics technology is also fueling advances in robotics and biomedicine. For example, Joseph Francis, professor of biomedical engineering and electrical and computer engineering at the University of Houston, has developed a brain-computer interface (BCI) that autonomously adapts its control of a prosthetic limb through user feedback. The system records neural activity from the brain and translates it into movement data, he says. As the brain registers perceived successes and errors, the system updates and improves the translation agent.
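Francis's decoder is not spelled out in the article; a minimal sketch of the general idea—a "translation agent" that maps neural firing rates to movement and nudges its weights whenever an error signal indicates the decoded movement missed the user's intent—might look like the toy model below. The linear decoder, the learning rule, and the simulated session are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

class AdaptiveDecoder:
    """Toy translation agent: maps neural firing rates to limb velocity,
    and nudges its weights using an error signal standing in for the
    brain's perceived success or failure of each movement."""

    def __init__(self, n_units: int, n_dims: int, lr: float = 0.1):
        self.W = rng.normal(scale=0.1, size=(n_dims, n_units))
        self.lr = lr

    def decode(self, rates):
        return self.W @ rates  # firing rates -> movement command

    def update(self, rates, intended, decoded):
        # Error-driven correction: shift the decoded output toward
        # the intended movement (a simple least-mean-squares step).
        err = intended - decoded
        self.W += self.lr * np.outer(err, rates)

# Simulated session: the "true" brain-to-movement mapping to be learned.
true_W = rng.normal(size=(2, 8))
dec = AdaptiveDecoder(n_units=8, n_dims=2)

for _ in range(2000):
    rates = rng.random(8)      # one sample of neural activity
    intended = true_W @ rates  # the movement the user meant to make
    dec.update(rates, intended, dec.decode(rates))
```

After enough feedback, the decoder's output converges toward the intended movements—the same closed loop, in miniature, that lets Francis's system improve as the brain registers successes and errors.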
At the University of Utah, Clark and a team of researchers use a sophisticated prosthetic arm that tricks the brain into feeling objects. The LUKE arm (developed by DEKA Research and Development Corp.) creates sensation using artificial sensors and stimulation of sensory nerve fibers via implanted electrodes. These devices connect to a nerve-computer interface and use encoding algorithms to "feel" objects and movements. Simply put, the system sends sensor outputs up the sensory nerve fibers to the brain via stimulation-evoked nerve discharges, which the brain then interprets as real.
Much the same holds true in reverse for the outgoing digital signals carried by motor fibers in the nerve. "Every digital pulse that travels along the biological 'wire' of the arm creates a corresponding single digital pulse in the targeted muscle. The electrical activity is converted into a twitch," he says. Researchers have successfully tested the system on amputees. Clark has learned that the more biologically realistic the nerve activation patterns are, the greater the improvement.
Things Get Touchy
What makes artificial touch difficult, Clark says, is that there's a vast array of human "touch" receptors that respond differently to steady-state pressure, velocity (how rapidly the pressure is changing), and acceleration (changes in velocity). The team has addressed this challenge by developing a system that maps the different types or locations of sensations evoked by specific electrodes, then uses algorithms to generate biologically realistic firing patterns that recreate natural-feeling sensations.
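Clark's encoding algorithms are not described in detail here, but the receptor taxonomy he outlines—separate channels for steady pressure, velocity, and acceleration—can be illustrated with a toy model. The function below is an assumption-laden sketch, not the Utah team's algorithm: it derives three nonnegative "firing drives" from a pressure trace by differentiation:

```python
import numpy as np

def receptor_channels(pressure, dt=0.001):
    """Toy model of the three receptor classes described above:
    slowly adapting fibers track steady-state pressure, rapidly
    adapting fibers track velocity (rate of pressure change), and
    Pacinian-like fibers track acceleration (changes in velocity)."""
    p = np.asarray(pressure, dtype=float)
    vel = np.gradient(p, dt)       # dp/dt
    acc = np.gradient(vel, dt)     # d²p/dt²
    return {
        "steady": np.clip(p, 0.0, None),  # sustained contact pressure
        "velocity": np.abs(vel),          # contact onset and release
        "acceleration": np.abs(acc),      # vibration, fine transients
    }

# A simple poke: press at t = 0.2 s, hold, release at t = 0.7 s.
t = np.arange(0.0, 1.0, 0.001)
press = np.where((t > 0.2) & (t < 0.7), 1.0, 0.0)
ch = receptor_channels(press)
```

The point of the decomposition shows up immediately: during the hold, only the steady channel is active, while the velocity and acceleration channels fire only at contact onset and release—which is why a single stimulation pattern cannot stand in for all receptor types.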
Francis says that, over a user's lifespan, "reliable and safe neural recording and stimulation methods would be game-changing" for medicine and other fields. Meanwhile, HaptX aims to develop full-body suits that deliver an immersive haptic experience. Telerobotics could make many difficult and dangerous jobs far safer by allowing a person to operate avatars or remote robots with the ability to feel objects in the environment as if they are real. It also could open new vistas for virtual travel and entertainment. "Our ultimate vision is to bring this feedback to the full body, not just the hands," Rubin explains.
Concludes Francis, "Realistic and natural sensations could change the world not only for prosthetics, but also introduce virtual worlds where you feel totally immersed."
Samuel Greengard is an author and journalist based in West Linn, OR, USA.