Communications of the ACM

ACM TechNews

AI Camera Can Tell What Surfaces Feel Like with Just a Glance


A camera equipped with artificial intelligence can tell what objects in this picture feel like.

Researchers at Rutgers University taught an artificial intelligence to read the tactile properties of an object from a photo of that object.

Credit: Cavan Images/Alamy

Rutgers University's Matthew Purri and Kristin Dana have trained a camera-linked artificial intelligence (AI) to read the tactile properties of an object when presented with a photograph or series of images of it.

Using a device with a mechanical arm, the researchers captured 100 images of each of more than 400 material surfaces.

They linked the images to an existing dataset that logs 15 physical properties for each material, in categories including friction, adhesion, and texture. The researchers fed this data to a deep learning algorithm and tested it on previously unseen surfaces. Given a single image taken from directly overhead, the algorithm could reliably estimate 14 of the 15 surface properties; adhesion proved difficult to determine. Accuracy improved when the algorithm was given additional images from different camera angles. The researchers suggest the AI could be used in robots, and in cars to estimate the surface properties of roads.
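The article does not give implementation details, but the general setup is multi-output regression: map an image of a surface to a vector of 15 property values. A minimal sketch of that idea, with synthetic data, hypothetical feature dimensions, and a simple least-squares model standing in for the deep network:

```python
import numpy as np

rng = np.random.default_rng(0)

N_PROPERTIES = 15   # friction, adhesion, texture, etc. (per the article)
N_FEATURES = 64     # stand-in for learned image features (hypothetical)
N_MATERIALS = 400   # roughly the number of materials photographed

# Synthetic training data: image features -> 15 surface-property values.
X = rng.normal(size=(N_MATERIALS, N_FEATURES))
W_true = rng.normal(size=(N_FEATURES, N_PROPERTIES))
Y = X @ W_true + 0.01 * rng.normal(size=(N_MATERIALS, N_PROPERTIES))

# Fit one linear map for all 15 properties at once
# (the researchers' deep network would replace this step).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict all 15 properties for a previously unseen surface.
x_new = rng.normal(size=(1, N_FEATURES))
pred = x_new @ W
print(pred.shape)  # one row of 15 estimated properties
```

In the actual work, the features would come from a convolutional network applied to the overhead photograph, and adding views from more camera angles would enrich the input rather than change this output shape.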

From "AI Camera Can Tell What Surfaces Feel Like with Just a Glance"

New Scientist (09/22/20) Donna Lu

