
Robots With a Human Touch

Empowering smart machines with tactile feedback could lead to tremendous new applications.
Figure. Equipped with the high-resolution GelSight tactile sensor, a robot can grasp a cable and plug it into a USB port.

In a Northeastern University lab in Boston, MA, a red and roughly humanoid robot named Baxter spots a USB plug dangling from a cord. Baxter, made by Boston’s Rethink Robotics, reaches out and grasps the plug between two fingers, then slowly guides it downward. On the desk below, a USB socket is plugged into a power strip. A camera mounted on Baxter’s arm helps the robot locate the socket; then the hand guides the USB plug into place, wiggling it briefly before completing the connection.

A job this delicate is difficult enough for people, let alone robots. Machines have long been able to use vision to identify the plug and socket, but to complete this task autonomously, Baxter relied on another type of feedback: a sense of touch. The robot was equipped with a new high-resolution GelSight tactile sensor. GelSight, developed by Massachusetts Institute of Technology (MIT) engineer Edward Adelson and put to work on Baxter with the help of Northeastern University roboticist Robert Platt, does not just tell the robot whether it is holding something in its hand; the sensor provides high-resolution feedback that allows the robot to identify the object from its shape and imprinted logo, then determine its orientation in space. As a result, Baxter can figure out whether it is holding the plug the right way and inserting it into the socket correctly.

Humans do the same thing when handling small objects; we rely on feel. In fact, the human hand is evidence that equipping robots with a sense of touch could be incredibly powerful, leading to new applications in surgery, manufacturing, and beyond. “We have in our own hands a proof that tactile sensing is very important,” says Bill Townsend, CEO of robotics manufacturer Barrett Technology. “The question is how do we get from where we are with tactile sensors now to what we know can be possible based on what our own hands can do.”


Feeling Around

Researchers have been working to develop powerful tactile sensors for decades, but few of these technologies have successfully transitioned out of the academic lab. “There have been lots and lots of designs, but nobody has ever come up with a really reliable tactile sensor,” says Ken Goldberg, director of the Center for Automation and Learning for Medical Robotics at the University of California, Berkeley. “It’s easy to build a camera and get high resolution at a low cost, but a sense of touch is extremely difficult. It’s just a very hard problem.”

Platt believes GelSight could be one such solution. The sensor consists of a square of synthetic rubber coated on one side with metallic paint. The other side of the rubber is attached to a transparent piece of plastic equipped with light-emitting diodes (LEDs) and a small camera. The camera captures the LED light as it bounces off the layer of metallic paint, and a computer processes this data.

When the robot grips that USB plug, the layer of rubber and paint deforms as if the device had been pressed into a piece of clay. This change in the surface also changes the way the LED light is reflected. The computer interprets these alterations, generating a fine-grained picture of what the robot has in its hand. “It’s able to feel the geometry of the surface it’s touching in a very precise way,” Platt explains. “If the sensor touches the back of your hand, you get an image showing that it sensed the hairs and the wrinkles in your skin. That kind of accuracy is really a first.”
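The reconstruction Platt describes rests on photometric stereo: each LED illuminates the painted membrane from a known direction, so per-pixel brightness reveals the surface normal, and the normals can be integrated into a height map. The sketch below illustrates that general principle in Python; the function names and the naive integration step are illustrative assumptions, not the actual GelSight pipeline.

```python
# A minimal sketch of photometric-stereo-style surface recovery, the general
# principle behind GelSight-like sensors (illustrative, not MIT's pipeline):
# the painted membrane reflects roughly like a Lambertian surface, each LED
# lights it from a known direction, and pixel intensities reveal the normal.
import numpy as np

def normals_from_images(images, light_dirs):
    """Recover unit surface normals from >=3 images under known lights.

    images:     list of HxW float arrays, one per LED
    light_dirs: Kx3 array of unit light-direction vectors
    """
    L = np.asarray(light_dirs, dtype=float)          # K x 3
    I = np.stack([im.ravel() for im in images])      # K x N pixels
    # Least-squares solve L @ g = I per pixel; g = albedo * normal.
    g, *_ = np.linalg.lstsq(L, I, rcond=None)        # 3 x N
    n = g / (np.linalg.norm(g, axis=0, keepdims=True) + 1e-9)
    return n.T.reshape(images[0].shape + (3,))       # H x W x 3

def depth_from_normals(normals):
    """Integrate surface gradients (-nx/nz, -ny/nz) into a rough height map:
    walk across the top row, then down each column (a naive path integration;
    real systems use least-squares/Poisson integration)."""
    nz = np.clip(normals[..., 2], 1e-3, None)
    p = -normals[..., 0] / nz                        # dz/dx
    q = -normals[..., 1] / nz                        # dz/dy
    z = np.zeros(nz.shape)
    z[0, :] = np.cumsum(p[0, :])
    z[1:, :] = z[0, :] + np.cumsum(q[1:, :], axis=0)
    return z
```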

Platt envisions a robot with this level of sensitivity accomplishing a range of fine-manipulation jobs, from assembling small parts in an electronics factory to changing the batteries in a smoke alarm. However, an effective tactile sensor also needs to be reliable, affordable, and robust enough to survive the wear and tear of the real world.

The Barrett Technology robot hand illustrates still another challenge. One model of the BarrettHand features 24 touch-sensitive pads on each of its three fingers and its palm. When the robot grabs an object, researchers can analyze data from the force-torque sensors in the finger joints to register that item’s presence and guess its shape. The tactile pads provide even finer-grained detail, in part because they are distributed widely across the fingers. Yet Townsend says researchers struggle to incorporate that feedback; he suspects this may be because the software needed to make sense of the data has not yet been developed.


Modeling the World

Roboticist Charlie Kemp and his team at the Georgia Institute of Technology in Atlanta have adopted a more software-centric approach to tactile sensing. They built their own sensors because they could not find any off-the-shelf models that met their particular needs, basing the hardware on a design by John Ulmen and Mark Cutkosky at Stanford University. Each sensor is about as wide as a Sharpie, and consists of alternating layers of conductive rubber, capacitive plates, foam, and steel mesh. Kemp and his team outfitted the entire arm of a humanoid named Cody with 384 of these tactile sensors, but they were not focusing on manipulating or identifying objects like Platt; instead, the researchers wanted their robot to use tactile feedback to learn.
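Capacitive taxels like these typically work by letting pressure compress the foam dielectric, which narrows the gap between the conductive layers and raises the measured capacitance. The sketch below inverts a simple parallel-plate model to turn a capacitance reading into a contact force; every constant and name here is an assumption for illustration, not the Stanford or Georgia Tech design.

```python
# Illustrative model of a capacitive taxel readout (assumed constants, not
# the Cody hardware): force compresses the foam dielectric, the plate gap
# shrinks, capacitance rises, and inverting the model recovers the force.
EPS = 8.854e-12      # vacuum permittivity, F/m
K_FOAM = 1.6         # assumed relative permittivity of the foam
AREA = 1e-4          # assumed plate area, m^2 (1 cm^2)
GAP0 = 2e-3          # assumed uncompressed foam thickness, m
STIFFNESS = 5e4      # assumed linear foam stiffness, N/m

def capacitance(gap):
    """Parallel-plate capacitance for a given dielectric gap."""
    return EPS * K_FOAM * AREA / gap

def force_from_capacitance(c_measured):
    """Invert the model: capacitance -> gap -> compression -> force,
    assuming the foam behaves like a linear spring."""
    gap = EPS * K_FOAM * AREA / c_measured
    compression = max(GAP0 - gap, 0.0)
    return STIFFNESS * compression

# Example: a taxel reading 20% above its rest capacitance
c_rest = capacitance(GAP0)
print(force_from_capacitance(1.2 * c_rest))  # ~16.7 N under these assumptions
```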

The group set up several experiments in which the Cody robot was given a task, such as grasping or simply making contact with a small ball. The trick was that the ball was half-hidden in a thicket of houseplants, tree trunks, or cinderblocks, or tucked behind a window with only select panes open. The traditional approach would be for the robot to study the scene visually, then make a plan and execute it, but with a cluttered setup like the one featuring the plants, that would have led to failure. “Based on just looking at the scene, the robot can’t tell what’s going on,” Kemp explains. “It’s just this big patch of green leaves.”

In Kemp’s experiments, when the robot’s arm bumped against the cinderblock or the tree trunks, and the tactile sensors registered high forces from the pressure of these solid objects, Cody recognized it would not be able to move them, so it tried another route. When it brushed up against the fronds of the houseplants, the robot kept moving, making the assumption, given the low forces, that these were not real impediments. Finally, as it moved and made contact with the different objects, it started to build a virtual model of what to avoid and which paths were more promising. “Every place it makes contact, it alters its model a little,” Kemp says. “It’s making a map based on tactile feedback.”
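A minimal sketch of the mapping idea Kemp describes might look like the following: each contact is classified by its force as rigid or compliant, and the labels accumulate in a grid the planner can consult. The threshold, grid resolution, and class names are illustrative assumptions, not the group’s actual controller.

```python
# Illustrative tactile mapping in the spirit of the Cody experiments
# (assumed threshold and representation, not the Georgia Tech software):
# high-force contacts mark rigid obstacles, low-force contacts are treated
# as compliant clutter the arm can push through.
import numpy as np

FORCE_THRESHOLD = 5.0   # newtons; assumed cutoff between leaves and cinderblock

class TactileMap:
    def __init__(self, size=(50, 50)):
        # 0 = unknown, 1 = passable (compliant contact), 2 = blocked (rigid)
        self.grid = np.zeros(size, dtype=np.uint8)

    def update(self, cell, contact_force):
        """Every place the arm makes contact alters the model a little."""
        x, y = cell
        if contact_force >= FORCE_THRESHOLD:
            self.grid[x, y] = 2     # solid: reroute around it
        elif self.grid[x, y] != 2:
            self.grid[x, y] = 1     # soft: keep moving through it

    def passable(self, cell):
        return self.grid[cell] != 2

# Usage: brushing a plant frond vs. bumping a cinderblock
m = TactileMap()
m.update((10, 12), 0.8)     # leaves: low force, passable
m.update((10, 13), 22.0)    # cinderblock: high force, blocked
print(m.passable((10, 12)), m.passable((10, 13)))  # True False
```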


Space, Health Care, and More

Kemp says that there is still plenty of work to be done on both the hardware and the software for tactile sensing, but addressing these challenges could lead to a whole new set of roles for robots. For example, at the Center for Automation and Learning for Medical Robotics, Goldberg has his eye on autonomous surgery, an application that will demand advances in both sensors and learning. Today’s da Vinci surgical robot does not include tactile feedback, so a doctor relies primarily on cameras when performing a remote operation. Surgeons would benefit from a sense of touch, but Goldberg says this feedback could also allow the machines to complete simple procedures on their own. In this case, developing the right sensor will only be the first step; the robots also will need to learn from their experiences. Goldberg pictures them uploading the data gathered during surgeries to the cloud, then allowing other robots to draw upon and learn from those experiences when performing their own basic operations.

Several researchers mention that the U.S. National Aeronautics and Space Administration has taken an interest in tactile sensors as well. Platt notes that robots with tactile feedback could be used to build structures on distant planets in advance of an astronaut crew’s arrival; as with surgery, the added sensitivity would give the robots the ability to complete these jobs autonomously, without waiting for orders from human operators. Eduardo Torres-Jara, a computer scientist at Worcester Polytechnic Institute in Massachusetts, explains this was part of the problem with the Spirit rover on Mars; if the robot had been able to feel its way around, it might not have spent so long stuck in the Martian soil.

Whether these robots end up toiling in operating rooms, offices, factories, or homes, the researchers say tactile feedback will be critical if the machines are going to do real work. Kemp, for one, envisions a bright future for sensitive machines. “I’m optimistic that robots are going to be able to serve as 24/7 assistants for people from older adults to those with severe disabilities,” Kemp says. “And for those types of things, touch is going to be critical.”


Further Reading

Li, R., Platt, R., Yuan, W., ten Pas, A., Roscup, N., Srinivasan, M., and Adelson, E.
Localization and Manipulation of Small Parts Using GelSight Tactile Sensing, IEEE Int’l Conf. on Intelligent Robots and Systems (IROS), 2014.

Kemp, C., Edsinger, A., and Torres-Jara, E.
Challenges for Robot Manipulation in Human Environments, IEEE Robotics and Automation Magazine, March 2007, pp. 20–29.

Jain, A., Killpack, M., Edsinger, A., and Kemp, C.
Reaching in Clutter with Whole-Arm Tactile Sensing, The International Journal of Robotics Research, 2013.

Khatib, O.
Mobile Manipulation: The Robotic Assistant, Robotics and Autonomous Systems, Volume 26, Issues 2–3, Feb. 28, 1999, pp. 175–183.

Automatic USB cable insertion using GelSight tactile sensing http://bit.ly/1IyFHNA

