AI has given robots the ability to “hear” and “see” the world to understand human commands and perform tasks better, and now Meta’s AI researchers are testing ways to give robots a sense of touch, too. Meta’s Fundamental AI Research (FAIR) division has just introduced a suite of tools that allow robots to detect, decipher and respond to what they touch. That could make even the most basic robotic arm sensitive enough to handle delicate objects without damaging them, making such robots useful in far more environments.
Meta showcased a combination of new technologies and features that work together to give robots the ability to feel things. Sparsh, a touch-perception model, gives AI a way to identify properties like pressure, texture and movement without needing a huge library of labeled examples. It’s like an AI version of how you can feel something in the dark and describe what it feels like, even if you don’t know what you’re touching.
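To make that idea more concrete, here’s a minimal, purely illustrative sketch in Python of how a single tactile frame (a grid of pressure readings) might be boiled down into coarse properties like pressure, texture and movement. Everything here, from the function name to the way each property is estimated, is an assumption for illustration, not Sparsh’s actual code or interface.

```python
# Illustrative only: a stand-in "encoder" that turns raw tactile frames
# (2D grids of pressure readings) into a few human-readable properties.
import numpy as np

def encode_touch(tactile_frame: np.ndarray, prev_frame: np.ndarray) -> dict:
    """Summarize a tactile frame into coarse touch properties.

    tactile_frame / prev_frame: 2D arrays of per-cell pressure readings
    captured on consecutive time steps.
    """
    pressure = float(tactile_frame.mean())                        # overall contact force
    texture = float(tactile_frame.std())                          # roughness proxy: spatial variation
    movement = float(np.abs(tactile_frame - prev_frame).mean())   # slip proxy: change between frames
    return {"pressure": pressure, "texture": texture, "movement": movement}

# Example: two noisy 16x16 sensor frames a few milliseconds apart.
rng = np.random.default_rng(0)
prev = rng.normal(loc=0.2, scale=0.05, size=(16, 16))
curr = prev + rng.normal(loc=0.0, scale=0.01, size=(16, 16))
print(encode_touch(curr, prev))
```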
To feed information about what the robot is touching to the AI model, Meta partnered with a company called GelSight to create a robotic fingertip called Digit 360. Its sensors are so sensitive that the AI can not only pick out fine details about whatever the robot is touching, but also apply the right amount of pressure for a given task, such as lifting or rotating an object.
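The sort of feedback loop that would turn those fingertip readings into an appropriate grip might look roughly like the sketch below. The constants, function and sensor values are hypothetical stand-ins, not the Digit 360 interface.

```python
# Hypothetical control loop: read fingertip pressure each cycle and nudge
# grip force until the object is held securely without being crushed.
TARGET_PRESSURE = 0.5   # desired contact pressure (arbitrary units)
SLIP_THRESHOLD = 0.05   # frame-to-frame change that suggests the object is slipping
GAIN = 0.1              # how aggressively to correct each cycle

def adjust_grip(current_force: float, pressure: float, slip: float) -> float:
    """Return an updated grip force for the next control cycle."""
    if slip > SLIP_THRESHOLD:
        # Object is sliding: tighten regardless of the pressure error.
        return current_force + GAIN
    # Otherwise move pressure toward the target, loosening if squeezing too hard.
    error = TARGET_PRESSURE - pressure
    return max(0.0, current_force + GAIN * error)

# One simulated cycle: slight under-pressure, no slip -> force increases a little.
print(adjust_grip(current_force=0.3, pressure=0.45, slip=0.01))
```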
For the rest of the robotic hand (or its equivalent), Meta partnered with Wonik Robotics on a system called Plexus, which distributes multiple touch sensors across the device. Meta claims that Plexus can mimic the human sense of touch well enough to handle delicate or awkward objects. Together, the three technologies let a robotic hand sense what it’s holding and react accordingly.
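As a rough sketch of how those pieces could fit together in software, a hub could collect a frame from every sensor on the hand, summarize each one and decide per-finger adjustments. Every class, name and policy below is hypothetical, not Meta’s Plexus interface.

```python
# Hypothetical hand-level loop: a hub gathers frames from several touch
# sensors, each frame is summarized, and a simple policy picks an action
# per finger. Real systems would use a learned encoder and controller.
import numpy as np

class TouchHub:
    """Aggregates tactile frames from every sensor on a robotic hand."""

    def __init__(self, sensor_names):
        self.sensor_names = list(sensor_names)

    def read_all(self) -> dict:
        # Stand-in for real hardware reads: one random 16x16 frame per sensor.
        rng = np.random.default_rng()
        return {name: rng.random((16, 16)) for name in self.sensor_names}

def hand_step(hub: TouchHub) -> dict:
    """One perception-to-action step across the whole hand."""
    frames = hub.read_all()
    # Per-sensor summary (mean pressure here; a learned encoder in practice).
    pressures = {name: float(frame.mean()) for name, frame in frames.items()}
    # Toy policy: tighten fingers with light contact, relax those pressing hard.
    return {name: ("tighten" if p < 0.5 else "relax") for name, p in pressures.items()}

hub = TouchHub(["thumb", "index", "middle", "ring", "pinky"])
print(hand_step(hub))
```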
Sensitive AI
“The human hand is amazing at communicating information to the brain, from the fingertips to the palm. This allows the muscles in the hand to be controlled to make decisions, such as how to type on a keyboard or how to grip an object that is too hot,” Meta explained in a blog post. “Achieving embodied AI requires a similar coordination between tactile detection and motor activation of a robotic hand.”
There are plenty of ways robotic hands that can “feel,” connected to AI that can interpret those sensations, could be useful. Imagine robotic surgical assistants that sense small changes in the body and respond quickly, with movements that are precise yet gentle and that match or even exceed human responses. The same goes for assembling delicate devices without breaking them, and for coordinating multiple robotic hands the way humans use both of theirs. Touch data could even make virtual experiences more lifelike, with insights into how objects and environments should feel informing their virtual counterparts.
Touch isn’t the only human sense that AI is replicating for machines. Researchers at Penn State recently showed how AI models paired with an electronic tongue can simulate a sense of taste well enough to detect small differences in flavor. Meanwhile, a company called Osmo has taught AI models to mimic a sense of smell far better than a human’s. The company has shown that its AI can analyze a scent accurately enough to recreate it from scratch, selecting and combining chemicals without human intervention.