
Robot's In-Hand Eye Maps Surroundings, Determines Hand's Location

By Carnegie Mellon News (PA)

May 19, 2016


Carnegie Mellon University (CMU) researchers have developed Articulated Robot Motion for simultaneous localization and mapping (ARM-SLAM), a technique in which a camera attached to a robot's hand rapidly builds a three-dimensional (3D) model of its surroundings and locates the hand within that 3D world.

The researchers found they could improve the accuracy of the map by incorporating the arm as a sensor, using the angle of its joints to better determine the position of the camera.
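The arm-as-sensor idea rests on forward kinematics: the measured joint angles determine where the hand-mounted camera sits, independently of what the camera sees. A minimal planar sketch of that computation (illustrative only; the function name, planar-arm model, and link parameters are assumptions, and ARM-SLAM itself fuses this estimate with dense depth data):

```python
import math

def camera_pose_from_joints(joint_angles, link_lengths):
    """Planar forward kinematics: given each joint's angle (radians,
    relative to the previous link) and each link's length, return the
    (x, y, heading) of a camera mounted at the end of the last link.

    This is a toy 2-D model of how joint encoders alone pin down the
    camera's position, the cue ARM-SLAM adds to visual mapping.
    """
    x = y = 0.0
    heading = 0.0
    for theta, length in zip(joint_angles, link_lengths):
        heading += theta                 # accumulate joint rotation
        x += length * math.cos(heading)  # walk along the link
        y += length * math.sin(heading)
    return x, y, heading

# A fully extended two-link arm (all angles zero) places the camera
# straight out along the x-axis at the sum of the link lengths.
print(camera_pose_from_joints([0.0, 0.0], [1.0, 1.0]))  # → (2.0, 0.0, 0.0)
```

Because encoder readings arrive reliably at high rate, a pose estimate like this remains available even when the camera image is blurred or depth data drops out, which is why fusing it improves the map.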

Placing a camera or other sensor in the hand of a robot has become feasible as sensors have grown smaller and more power-efficient, according to CMU professor Siddhartha Srinivasa.

"Automatically tracking the joint angles enables the system to produce a high-quality map even if the camera is moving very fast or if some of the sensor data is missing or misleading," says CMU researcher Matthew Klingensmith.

The researchers demonstrated ARM-SLAM using a small depth camera attached to a lightweight manipulator arm. They used the system to build a 3D model of a bookshelf and found it produced reconstructions equivalent to or better than those of other mapping techniques.

"We still have much to do to improve this approach, but we believe it has huge potential for robot manipulation," Srinivasa says.



Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA

