Meta AI researchers give robots a sense of touch and it’s giving us all creepy vibes

AI has given robots the ability to “hear” and “see” the world so they can understand human commands and perform tasks better, but now Meta’s AI researchers are testing ways to make robots mimic the sense of touch, too. Meta’s Fundamental AI Research (FAIR) division has just introduced a suite of tools that will allow robotic instruments to detect, decipher and respond to what they touch. That could make even the most basic robotic arm sensitive enough to handle delicate objects without damaging them, making such robots useful in many more environments.

Meta showcased a combination of new technologies and features that work together to give robots the ability to feel things. Touch-sensitive technology Sparsh gives AI a way to identify things like pressure, texture and movement without the need for a huge database. It’s like an AI version of how you can feel something in the dark and describe what it feels like, even if you don’t know what you’re touching.
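To make that idea concrete, here is a minimal, hypothetical sketch of the pattern in PyTorch: a pretrained, general-purpose touch encoder turns raw tactile images into embeddings, and small task-specific heads map those embeddings to properties like contact force or slip. The class names, dimensions and heads below are illustrative stand-ins, not anything from Meta’s actual Sparsh release.

import torch
import torch.nn as nn

class TouchEncoder(nn.Module):
    """Stand-in for a pretrained tactile backbone that produces general-purpose embeddings."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, tactile_image: torch.Tensor) -> torch.Tensor:
        return self.net(tactile_image)

encoder = TouchEncoder()
force_head = nn.Linear(256, 3)   # small head predicting a 3-axis contact force
slip_head = nn.Linear(256, 2)    # small head predicting slip / no-slip

# One fake 64x64 RGB tactile frame, standing in for a gel-based sensor image.
frame = torch.rand(1, 3, 64, 64)
embedding = encoder(frame)
print(force_head(embedding).shape, slip_head(embedding).softmax(-1))

The point of the pattern is that the heavy lifting lives in the pretrained encoder, so each new touch task only needs a small head rather than an enormous labeled dataset.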

To feed information about what the robot is touching to the AI model, Meta partnered with a company called GelSight to essentially create a robotic fingertip called Digit 360. The sensors in Digit 360 are very sensitive, so the AI can not only determine details about what the robot is touching, but also apply pressure appropriate to a task involving the object, such as lifting or rotating it.
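As a rough illustration of why that sensitivity matters, here is a generic, hypothetical grip-control loop in Python: fingertip sensors are polled, the readings are fused into a contact-force estimate, and the grip is nudged until contact is firm but not crushing. The function names, target force and thresholds are invented for the sketch and are not Digit 360’s or Meta’s actual interfaces.

import random

TARGET_FORCE_N = 1.5   # desired grip force in newtons (illustrative value)
TOLERANCE_N = 0.1      # acceptable error band around the target

def read_fingertip_sensors() -> list[float]:
    """Pretend to poll per-finger contact forces; real hardware would stream tactile frames."""
    return [random.uniform(0.0, 3.0) for _ in range(5)]

def estimate_contact_force(readings: list[float]) -> float:
    """Stand-in for a learned model that fuses raw tactile signals into one force estimate."""
    return sum(readings) / len(readings)

grip_command = 0.0  # normalized motor command in [0, 1]
for step in range(20):
    force = estimate_contact_force(read_fingertip_sensors())
    error = TARGET_FORCE_N - force
    if abs(error) <= TOLERANCE_N:
        break  # contact force is within tolerance; hold the current grip
    # Proportional adjustment: squeeze harder if contact is too light, relax if too firm.
    grip_command = min(1.0, max(0.0, grip_command + 0.1 * error))
    print(f"step {step}: force={force:.2f} N, command={grip_command:.2f}")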

For the rest of the robotic hand (or its equivalent), Meta partnered with Wonik Robotics to create a system called Plexus, which distributes multiple touch sensors across the device. Meta claims that Plexus can adequately mimic the human sense of touch when handling delicate or difficult objects, with all three technologies working together in a single robotic hand.

Sensitive AI

“The human hand is amazing at communicating information to the brain, from the fingertips to the palm. This allows the muscles in the hand to be controlled to make decisions, such as how to type on a keyboard or how to grip an object that is too hot,” Meta explained in a blog post. “Achieving embodied AI requires a similar coordination between tactile sensing and motor actuation of a robotic hand.”

There are many ways in which robotic hands that can “feel,” connected to AI that can interpret those sensations, could be useful. Imagine robotic surgical assistants that can sense small changes in the body and respond more quickly, with precise but gentle movements that match or even exceed human responses. The same goes for manufacturing delicate devices without breaking them, and perhaps for better coordination between multiple robotic hands, much as humans coordinate their own pair. It could even make virtual experiences feel more lifelike, with insights into how real objects and environments feel informing their virtual counterparts.

Touch isn’t the only human sense that AI is replicating for machines. Researchers at Penn State recently showed how AI models paired with an electronic tongue can simulate a sense of taste well enough to detect small differences in flavor. Meanwhile, a company called Osmo has taught AI models to mimic a sense of smell far better than a human’s. The company has shown how its AI can analyze a scent accurately enough to even recreate it from scratch, selecting and combining chemicals without human intervention.
