Without a sense of touch, Frankenstein’s monster might never have learned that “fire is bad,” potentially leaving it an unstoppable killing machine. So it’s worth appreciating the often-overlooked sense of touch, one that robots may soon share. Facebook has unveiled a collection of tactile technologies designed to give robots a sense of touch that, until recently, belonged to science fiction.
Why is Facebook venturing into robotics? During a recent press call, Yann LeCun, Facebook’s chief AI scientist, recalled a conversation with Mark Zuckerberg. Initially skeptical about robotics, Zuckerberg later recognized the significant advances taking place at the intersection of AI and robotics. That realization led Facebook AI Research (FAIR) to establish a program dedicated to the area.
FAIR's tactile technology research revolves around four primary aspects: hardware, simulation, processing, and perception. One notable achievement is the DIGIT sensor, introduced in 2020. Unlike conventional tactile sensors, which depend on capacitive or resistive methods, DIGIT utilizes vision-based technology.
“Inside the sensor are a camera, RGB LEDs, and silicone gel,” explained FAIR AI Research Scientist Roberto Calandra. “When the silicone is touched, it creates shadows and color changes recorded by the camera, allowing for high resolution and spectral sensitivity while maintaining mechanical robustness.” Remarkably, DIGIT costs about $15 to produce and is available as open-source hardware, accessible to universities and research institutions. It is also available for purchase through a partnership with GelSight.
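The principle behind a vision-based sensor like DIGIT can be sketched in a few lines: compare a frame of the illuminated gel against a no-contact baseline and flag the pixels that changed. This is only an illustrative sketch, not FAIR's actual processing pipeline; the function name, threshold, and toy images below are all assumptions.

```python
import numpy as np

def contact_map(baseline: np.ndarray, frame: np.ndarray, threshold: int = 25) -> np.ndarray:
    """Return a boolean mask of pixels where the gel image changed.

    baseline: RGB image of the untouched gel, shape (H, W, 3), uint8
    frame:    RGB image captured during contact, same shape
    """
    diff = np.abs(frame.astype(np.int16) - baseline.astype(np.int16))
    # A pixel counts as "in contact" if any color channel shifted enough.
    return diff.max(axis=2) >= threshold

# Toy example: a flat gray gel image with a circular "press" in the middle.
baseline = np.full((64, 64, 3), 128, dtype=np.uint8)
frame = baseline.copy()
yy, xx = np.ogrid[:64, :64]
press = (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2
frame[press] = [200, 90, 128]  # the colored LEDs shift hue where the gel deforms

mask = contact_map(baseline, frame)
```

In a real sensor the interesting signal is not just the binary mask but the magnitude and direction of the color shift, which encodes surface geometry.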
For simulation, FAIR developed TACTO, enabling machine learning systems to train in virtual environments without extensive real-world data collection. TACTO can generate hundreds of realistic touch readings per second, simulating sensors like DIGIT to streamline the data gathering process for researchers.
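TACTO's published API is not reproduced here; as a rough illustration of what a tactile simulator does, the sketch below shades a virtual gel from three directions, one per color channel, to mimic a DIGIT-style sensor's colored LEDs. Every name and constant is hypothetical.

```python
import numpy as np

def render_tactile(depth: np.ndarray, light_dirs) -> np.ndarray:
    """Render a synthetic tactile frame from a simulated indentation depth map.

    depth:      (H, W) array, how far the virtual object presses into the gel.
    light_dirs: three (dy, dx) light directions, one per color channel.
    """
    gy, gx = np.gradient(depth.astype(float))
    frame = np.zeros(depth.shape + (3,))
    for c, (dy, dx) in enumerate(light_dirs):
        # Simple slope-based shading: brightness follows the surface gradient.
        frame[..., c] = 0.5 + 0.5 * np.tanh(gy * dy + gx * dx)
    return (frame * 255).astype(np.uint8)

# Simulate pressing a sphere into the gel and render one synthetic reading.
yy, xx = np.ogrid[:64, :64]
r2 = (yy - 32) ** 2 + (xx - 32) ** 2
depth = np.sqrt(np.clip(15 ** 2 - r2, 0, None))
reading = render_tactile(depth, [(1, 0), (-0.5, 0.87), (-0.5, -0.87)])
```

Generating readings like this in bulk is what lets a learning system train on simulated touch before ever collecting real-world data.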
LeCun emphasized the need for machines to learn a model of the world similar to humans, who can drive a car with just 10-20 hours of practice. Understanding the implications of physical interactions, like gravity, is crucial for effective learning. The challenge lies in teaching machines to anticipate and plan for the consequences of their actions.
To lower the barrier for aspiring roboticists, FAIR created PyTouch, a touch-sensing library not to be confused with the similarly named PyTorch machine learning framework. PyTouch lets researchers easily connect a DIGIT, download pretrained models, and incorporate touch sensing into their robotic projects.
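PyTouch ships its own pretrained models; as a stand-in for the simplest possible touch-sensing task, here is a hypothetical calibrate-then-detect routine in plain NumPy. All function names and thresholds are invented for the example and are not PyTouch's API.

```python
import numpy as np

def calibrate(no_touch_frames):
    """Learn a per-pixel baseline and decision threshold from frames
    captured while nothing is touching the sensor."""
    stack = np.stack([f.astype(float) for f in no_touch_frames])
    baseline = stack.mean(axis=0)
    # Per-frame noise level: mean absolute deviation from the baseline.
    noise = np.abs(stack - baseline).mean(axis=(1, 2, 3))
    return baseline, noise.max() * 3 + 1.0

def is_touching(frame, baseline, threshold):
    """Flag a frame as 'touch' if it deviates from the baseline too much."""
    return np.abs(frame.astype(float) - baseline).mean() > threshold

# Synthetic idle frames: sensor noise only, no contact.
rng = np.random.default_rng(0)
idle = [rng.integers(120, 136, (64, 64, 3), dtype=np.uint8) for _ in range(5)]
baseline, thr = calibrate(idle)

touch = idle[0].copy()
touch[16:48, 16:48] = 255  # simulated contact brightens a patch
```

A real pretrained model would classify far subtler contacts, but the calibrate/predict workflow is the same shape.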
Additionally, FAIR, in collaboration with Carnegie Mellon University, has introduced ReSkin—a touch-sensitive “skin” for robots and wearables. Made from a deformable elastomer embedded with micro-magnetic particles, ReSkin detects force by measuring changes in magnetic flux as the material deforms.
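ReSkin's actual force models are learned from calibration data; the toy sketch below shows the underlying idea, fitting a linear map from simulated magnetometer readings (changes in magnetic flux) to applied force. The dimensions, noise level, and data are all synthetic and chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sensor: 15 flux values (3 axes x 5 magnetometer chips),
# assumed here to respond linearly to the applied normal force.
true_response = rng.normal(size=15)

# Synthetic calibration sweep: apply known forces, record flux changes.
forces = rng.uniform(0.0, 5.0, size=200)                      # newtons
flux = forces[:, None] * true_response + rng.normal(scale=0.01, size=(200, 15))

# Least-squares fit recovers the flux-to-force mapping.
w, *_ = np.linalg.lstsq(flux, forces, rcond=None)

# Predict the force for a fresh reading (here, a 2 N press).
new_reading = 2.0 * true_response
est = float(new_reading @ w)
```

The published ReSkin work uses learned models rather than a hand-fit linear map, partly so the mapping transfers across individually manufactured skins.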
“ReSkin provides rich contact data useful for AI in various touch-based tasks, including object classification and robotic grasping,” noted FAIR AI Research Manager Abhinav Gupta. Despite being inexpensive to produce, about $6 per unit in batches of 100, ReSkin is durable enough to withstand roughly 50,000 touches while delivering precise tactile feedback.
FAIR envisions applications for ReSkin in fields ranging from delicate manipulation, such as picking up fragile objects, to accurately measuring tactile forces in human-robot interaction. Emphasizing collaboration, FAIR has open-sourced DIGIT, TACTO, PyTouch, and ReSkin, aiming to propel tactile technology across the robotics domain.