
Robots With a Human Touch

By Gregory Mone

Communications of the ACM, Vol. 58 No. 5, Pages 18-19


In a Northeastern University lab in Boston, MA, a red and roughly humanoid robot named Baxter spots a USB plug dangling at the end of a cable. Baxter, made by Boston's Rethink Robotics, reaches out and grasps the plug between two fingers, then slowly guides it downward. On the desk below, a USB socket is plugged into a power strip. A camera mounted on Baxter's arm helps the robot locate the socket; then the hand guides the USB plug into place, wriggling it briefly before completing the connection.

A job this delicate is difficult enough for people, let alone robots. Machines have long been able to use vision to identify the plug and socket, but to complete this task autonomously, Baxter relied on another type of feedback: a sense of touch. The robot was equipped with a new high-resolution GelSight tactile sensor. GelSight, developed by Massachusetts Institute of Technology (MIT) engineer Edward Adelson and put to work on Baxter with the help of Northeastern University roboticist Robert Platt, does not just tell the robot whether it is holding something in its hand; the sensor provides high-resolution feedback that allows the robot to identify the object from its shape and imprinted logo, then determine its orientation in space. As a result, Baxter can figure out whether it is holding the plug the right way and inserting it into the socket correctly.
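To give a feel for the kind of reasoning the article describes, here is a deliberately simplified sketch of orientation estimation from a tactile imprint. It is not the actual GelSight pipeline (which works on high-resolution gel deformation images); it merely illustrates the idea of matching a sensed contact pattern against rotated copies of a known template to recover how the object sits in the gripper. The grid, template, and function names are all hypothetical.

```python
# Hypothetical sketch: recover an object's in-plane orientation by matching
# a tactile imprint against rotated copies of a known contact template.
# This illustrates the principle only, not the real GelSight algorithm.

def rotate90(grid):
    """Rotate a 2D binary grid 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def match_score(a, b):
    """Count cells where both grids register contact."""
    return sum(av & bv for ra, rb in zip(a, b) for av, bv in zip(ra, rb))

def estimate_orientation(imprint, template):
    """Return the rotation (0/90/180/270 degrees) of `template`
    that best explains the observed tactile `imprint`."""
    best_angle, best_score = 0, -1
    candidate = template
    for angle in (0, 90, 180, 270):
        score = match_score(imprint, candidate)
        if score > best_score:
            best_angle, best_score = angle, score
        candidate = rotate90(candidate)
    return best_angle

# Template: an asymmetric "plug" imprint on a tiny 4x4 tactile grid
# (asymmetry is what makes the orientation recoverable at all).
TEMPLATE = [
    [1, 1, 1, 0],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]

# Observation: the same imprint, rotated 90 degrees clockwise.
OBSERVED = rotate90(TEMPLATE)

print(estimate_orientation(OBSERVED, TEMPLATE))  # → 90
```

A real tactile sensor produces a dense depth image rather than a binary grid, and the search covers continuous rotations and translations, but the structure of the problem (match sensed contact against a model, pick the best-fitting pose) is the same one Baxter solves before wriggling the plug into the socket.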

