Rutgers University's Matthew Purri and Kristin Dana have trained a camera-linked artificial intelligence (AI) to read the tactile properties of an object when presented with a photograph or series of images of it.
The researchers used a device with a mechanical arm to photograph more than 400 materials, capturing 100 images of each surface.
They linked the images to an existing dataset that logs 15 physical properties for each material, in categories including friction, adhesion, and texture. The Rutgers researchers fed this data to a deep learning algorithm and tested it on previously unseen surfaces. Given a single image taken from directly overhead, the algorithm could reliably estimate 14 of the 15 surface properties; only adhesion proved difficult to determine. Accuracy improved when the algorithm was given additional images from different camera angles. The researchers suggest the AI could be used in robots, and in cars to help estimate the surface properties of roads.
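The setup described above amounts to image-to-property regression: a model maps one or more photographs of a surface to a 15-dimensional vector of tactile properties. The sketch below is purely illustrative and is not the authors' model; the input size, the linear stand-in for their deep network, and the averaging over camera views are all assumptions made for the example.

```python
# Illustrative sketch only (not the Rutgers model): regression from an
# image to 15 tactile-property estimates (friction, adhesion, texture, ...).
# A single random linear layer stands in for the trained deep network.
import numpy as np

N_PROPERTIES = 15          # physical properties logged per material
IMAGE_SHAPE = (32, 32, 3)  # hypothetical downsampled input size

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(np.prod(IMAGE_SHAPE), N_PROPERTIES))
b = np.zeros(N_PROPERTIES)

def predict_properties(image: np.ndarray) -> np.ndarray:
    """Map one overhead image to estimates of the 15 surface properties."""
    x = image.reshape(-1)  # flatten pixels into a feature vector
    return x @ W + b       # linear stand-in for the deep network

def predict_from_views(images: list) -> np.ndarray:
    """Combine per-view predictions; the article reports that extra
    camera angles improved accuracy."""
    return np.mean([predict_properties(im) for im in images], axis=0)

single_view = predict_properties(rng.random(IMAGE_SHAPE))
multi_view = predict_from_views([rng.random(IMAGE_SHAPE) for _ in range(3)])
print(single_view.shape, multi_view.shape)
```

In a real system the linear layer would be replaced by a trained convolutional network, but the interface stays the same: images in, a fixed-length property vector out.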
From "AI Camera Can Tell What Surfaces Feel Like with Just a Glance"
New Scientist (09/22/20) Donna Lu