Meta develops new technology that gives robots human-like touch
Meta is betting big on the emerging field of embodied AI by adding the sense of touch to its robotics work. The technology giant is collaborating with US-based sensor company GelSight and South Korean robotics firm Wonik Robotics to commercialize tactile sensors for AI. The new devices are not designed for consumers; instead, they are aimed at researchers.
In this regard, Meta has released three research artifacts—Sparsh, Digit 360, and Digit Plexus—focusing on touch perception, robot dexterity, and human-robot interaction. Additionally, the company is introducing PARTNR, a new benchmark for assessing planning and reasoning in human-robot collaboration.
Robots to handle tasks that require reasoning
The new work builds on renewed optimism in the industry that foundation models, including large language models (LLMs) and vision-language models (VLMs), can enable robots to take on more complex tasks that require reasoning and planning.
Sparsh, developed in partnership with the University of Washington and Carnegie Mellon University, is a set of encoder models for vision-based tactile sensing, designed to give robots touch perception. That capability is essential for many robotics tasks, such as gauging how much pressure can be applied to an object without damaging it.
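As a rough illustration of the idea (not Meta's actual API: the encoder architecture, tensor shapes, and force head below are placeholder assumptions), a pretrained tactile encoder can serve as a frozen backbone whose touch representations feed a small task-specific head, for example one that estimates contact force:

```python
# Hypothetical sketch: using a pretrained vision-based tactile encoder
# (a stand-in for a model like Sparsh) as a frozen backbone for a
# downstream task such as estimating applied force. Names and shapes
# are illustrative, not Meta's actual interface.
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Placeholder for a pretrained tactile encoder."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, tactile_image: torch.Tensor) -> torch.Tensor:
        return self.backbone(tactile_image)

encoder = TactileEncoder()
encoder.requires_grad_(False)          # keep the pretrained representation frozen
force_head = nn.Linear(256, 3)         # predict a 3-axis contact force

# One RGB tactile image from a GelSight-style sensor: (batch, channels, H, W).
tactile_image = torch.rand(1, 3, 224, 224)
embedding = encoder(tactile_image)     # touch representation
force_estimate = force_head(embedding) # e.g. how hard the finger is pressing
print(force_estimate.shape)            # torch.Size([1, 3])
```

Freezing the backbone is the usual way such general-purpose representations get reused: only the small head needs to be trained on task data such as contact-force labels.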
Digit 360 is an artificial finger-shaped tactile sensor featuring over 18 sensing capabilities and more than 8 million taxels for capturing omnidirectional and granular deformations on its fingertip. This design allows for a more nuanced understanding of environmental interactions and object manipulation.
The sensor also includes on-device AI models, minimizing reliance on cloud servers and enabling local processing for quick touch responses, similar to the reflex arcs in humans and animals.
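A minimal sketch of that reflex idea follows; the sensor interface, sampling rate, and pressure threshold are invented for illustration and are not Digit 360's real API:

```python
# Hypothetical sketch of the "reflex arc" idea: process taxel readings
# on-device and react to excess pressure immediately, without a round
# trip to a cloud model. The interface and threshold are invented.
import random
import time

PRESSURE_LIMIT = 2.5  # illustrative threshold

def read_taxels(num_taxels: int = 16) -> list[float]:
    """Stand-in for sampling a patch of the sensor's taxel array."""
    return [random.uniform(0.0, 3.0) for _ in range(num_taxels)]

def reflex_loop(cycles: int = 5) -> None:
    for _ in range(cycles):
        readings = read_taxels()
        if max(readings) > PRESSURE_LIMIT:
            # Local, low-latency response: ease the grip right away,
            # then report the event to a host for higher-level planning.
            print("pressure spike -> loosen grip")
        time.sleep(0.01)  # ~100 Hz control loop

reflex_loop()
```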
Digit Plexus is a hardware-software platform that simplifies the development of robotic applications. It allows various fingertip and skin tactile sensors to be integrated into a single robot hand, encoding tactile data and transmitting it to a host computer through one cable. By sharing the code and design for Digit Plexus, Meta hopes to help researchers advance robot dexterity.
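Conceptually, the platform's role looks something like the sketch below, where the sensor names and the frame format are illustrative assumptions rather than Meta's specification:

```python
# Hypothetical sketch of the Digit Plexus idea: several fingertip and
# skin sensors on one hand are sampled, packed into a single encoded
# frame, and streamed to a host computer over one connection.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    sensor_id: str        # e.g. "index_fingertip", "palm_skin"
    pressure: float       # aggregated contact pressure (arbitrary units)
    timestamp: float

def sample_hand() -> list[SensorReading]:
    """Stand-in for polling every tactile sensor wired into the hand."""
    now = time.time()
    return [
        SensorReading("thumb_fingertip", 0.4, now),
        SensorReading("index_fingertip", 1.2, now),
        SensorReading("palm_skin", 0.1, now),
    ]

def encode_frame(readings: list[SensorReading]) -> bytes:
    """Pack all readings into one payload for the single host link."""
    return json.dumps([asdict(r) for r in readings]).encode("utf-8")

frame = encode_frame(sample_hand())
# In a real system this payload would travel over the single cable to
# the host computer; here we just show the consolidated frame.
print(len(frame), "bytes in one frame")
```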
New benchmark to evaluate AI models
Meta is also launching Planning and Reasoning Tasks in Human-Robot Collaboration (PARTNR), a benchmark to evaluate how well AI models assist humans with household tasks.
PARTNR is based on Habitat, Meta’s simulated environment, and includes 100,000 natural language tasks across 60 houses, featuring over 5,800 unique objects. It is designed to assess how effectively LLMs and VLMs follow human instructions.
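A toy sketch of what scoring a planner on one such episode might involve is shown below; the episode fields and the success check are assumptions made for illustration and do not mirror the benchmark's actual schema:

```python
# Hypothetical sketch of evaluating an instruction-following model on a
# PARTNR-style episode: a natural-language household task tied to a
# simulated house, scored by whether the agent's plan succeeds.
from dataclasses import dataclass

@dataclass
class Episode:
    house_id: str
    instruction: str            # natural-language household task
    target_objects: list[str]

def plan_with_llm(instruction: str) -> list[str]:
    """Stand-in for prompting an LLM/VLM planner; returns action steps."""
    return ["find(mug)", "place(mug, dishwasher)"]

def episode_success(plan: list[str], episode: Episode) -> bool:
    """Toy check: did the plan involve every object the task requires?"""
    return all(any(obj in step for step in plan) for obj in episode.target_objects)

episodes = [
    Episode("house_042", "Put the mug in the dishwasher", ["mug"]),
]
score = sum(episode_success(plan_with_llm(e.instruction), e) for e in episodes)
print(f"success rate: {score / len(episodes):.2f}")
```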
The new benchmark adds to a rising trend of projects investigating the use of LLMs and VLMs in robotics and embodied AI. Over the past year, these models have demonstrated significant potential as planning and reasoning components for robots handling complex tasks.
“With PARTNR, we aim to drive advancements in human-robot interaction and collaborative intelligence, transforming AI models from ‘agents’ to ‘partners’,” the company noted.