Facebook Research is Creating Emotionally Engaging Robots with a Sense of Curiosity

Facebook's Advancements in AI and Robotics

As a global social media leader, Facebook relies heavily on artificial intelligence (AI) and machine learning to maintain platform integrity and minimize harmful content. Following an earlier announcement covering self-supervised learning, computer vision, and natural language processing, Facebook recently detailed three additional areas of research aimed at developing more capable AI.

A key focus of their robotics research is self-supervised learning, where systems learn from raw data to adapt to new tasks and environments. In a recent blog post, researchers from Facebook AI Research (FAIR) highlighted advancements in model-based reinforcement learning (RL), which allows robots to learn through trial and error via sensor input.

One notable project involves a six-legged robot designed to learn to walk on its own. According to FAIR researcher Roberto Calandra, "Locomotion is a challenging problem in robotics, making it an exciting area for research." The hexapod starts as a collection of legs with no situational awareness and gradually develops a controller that produces forward movement. By employing a recursive self-improvement function, the robot leverages its accumulated experience to optimize its behavior over time, significantly cutting the time it takes to learn to walk.
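The core loop of model-based RL described above can be illustrated with a toy sketch: collect trial-and-error experience, fit a dynamics model to it, then plan actions using the model's predictions. Everything here is an assumption for illustration, including the made-up linear dynamics and the single-step "move forward" planner; it is not FAIR's actual controller.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" dynamics, unknown to the agent: the next state
# depends linearly on the current state and the chosen action.
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
B_true = np.array([[0.5], [1.0]])

def step(state, action):
    return A_true @ state + B_true @ action

# 1. Collect trial-and-error experience with random actions.
states, actions, next_states = [], [], []
s = np.zeros(2)
for _ in range(200):
    a = rng.uniform(-1, 1, size=1)
    s_next = step(s, a)
    states.append(s); actions.append(a); next_states.append(s_next)
    s = s_next

# 2. Fit a dynamics model s' ~ [s; a] @ W to the experience by least squares.
X = np.hstack([np.array(states), np.array(actions)])
Y = np.array(next_states)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# 3. Plan with the learned model: pick the candidate action whose
# predicted next state moves the robot farthest forward (dimension 0).
def plan(state, candidates):
    preds = [np.concatenate([state, a]) @ W for a in candidates]
    return candidates[int(np.argmax([p[0] for p in preds]))]

best = plan(np.array([1.0, 0.0]), [np.array([c]) for c in np.linspace(-1, 1, 21)])
```

The key point the sketch captures is that planning happens inside the learned model, so every real interaction improves future decisions rather than being consumed by a single trial.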

Once the robot masters locomotion, its next challenge is exploration. Traditional AI typically focuses on achieving specific goals, but Facebook aims to instill curiosity in its robots with assistance from NYU researchers. Previous studies on AI curiosity centered on reducing uncertainty; however, the latest approach seeks a more structured method.

FAIR researcher Franziska Meier explained, "We began with a model that knows little about itself." As the robot learns to maneuver its arm, it optimizes its planning by predicting the necessary actions. To avoid repetitive action sequences, the research team rewards the robot for resolving uncertainty, allowing for more effective learning and adaptability to new tasks.
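One common way to operationalize "rewarding the robot for resolving uncertainty" is to keep an ensemble of dynamics models and treat their disagreement as a measure of what the agent does not yet know. The sketch below is a minimal, assumed construction (toy 1-D dynamics, bootstrap-trained linear models); the reward is the drop in ensemble disagreement, so repetitive, already-understood action sequences pay nothing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy dynamics over a 1-D state.
def true_dynamics(s, a):
    return s + np.sin(a)

class Ensemble:
    """Ensemble of linear models; disagreement is highest where data is scarce."""
    def __init__(self, n=5):
        self.params = [rng.normal(size=2) for _ in range(n)]  # [w_state, w_action]
        self.data = []

    def predict(self, s, a):
        return np.array([p[0] * s + p[1] * a for p in self.params])

    def uncertainty(self, s, a):
        # Spread of the ensemble's predictions = model uncertainty.
        return self.predict(s, a).std()

    def update(self, s, a, s_next):
        self.data.append((s, a, s_next))
        X = np.array([(si, ai) for si, ai, _ in self.data])
        y = np.array([sn for *_, sn in self.data])
        for i in range(len(self.params)):
            idx = rng.integers(0, len(self.data), len(self.data))  # bootstrap resample
            w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
            self.params[i] = w

# Curiosity loop: try the action the ensemble is least sure about,
# observe the outcome, retrain, and score the uncertainty resolved.
model = Ensemble()
s = 0.0
for t in range(30):
    acts = np.linspace(-1, 1, 11)
    a = acts[int(np.argmax([model.uncertainty(s, c) for c in acts]))]
    before = model.uncertainty(s, a)
    s_next = true_dynamics(s, a)
    model.update(s, a, s_next)
    reward = before - model.uncertainty(s, a)  # reward for resolving uncertainty
    s = s_next
```

Because the bonus shrinks wherever the models already agree, the agent is steered toward unfamiliar situations, which is the structured exploration behavior the researchers describe.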

Moreover, Facebook is exploring how robots can "feel" physically, using a predictive deep-learning model designed for video. Calandra elaborated that this technique predicts future states based on current images and actions. The team demonstrated that robots could learn to manipulate objects using high-resolution tactile sensors without external rewards, achieving tasks like rolling a ball and identifying the correct face of a die.
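The idea of predicting future states from current sensor readings and actions, and then acting without any external reward, can be sketched as a goal-conditioned forward model: the "goal" is simply a desired sensor reading. The tactile image here is a made-up 4-value pressure map with invented linear dynamics, not the high-resolution sensor or video-prediction network the researchers used.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for a tactile image: a 4-value pressure map
# whose next reading depends linearly on the current reading and a 2-D action.
M = rng.normal(scale=0.3, size=(4, 4))
N = rng.normal(size=(4, 2))

def sensor_step(tactile, action):
    return M @ tactile + N @ action

# Train a forward model t' ~ [t; a] @ W from random interaction.
X, Y = [], []
t = rng.normal(size=4)
for _ in range(300):
    a = rng.uniform(-1, 1, size=2)
    t_next = sensor_step(t, a)
    X.append(np.concatenate([t, a])); Y.append(t_next)
    t = t_next
W, *_ = np.linalg.lstsq(np.array(X), np.array(Y), rcond=None)

# Act toward a goal *sensor reading* instead of an external reward:
# choose the action whose predicted tactile frame is closest to the goal.
def act_toward(tactile, goal, n_candidates=500):
    cands = rng.uniform(-1, 1, size=(n_candidates, 2))
    inputs = np.concatenate([np.tile(tactile, (n_candidates, 1)), cands], axis=1)
    errs = np.linalg.norm(inputs @ W - goal, axis=1)
    return cands[int(np.argmin(errs))]

goal = np.zeros(4)       # e.g. "relax contact pressure everywhere"
t = rng.normal(size=4)
a = act_toward(t, goal)  # action chosen purely from predicted sensations
```

Specifying tasks as target sensor states is what lets the robot roll a ball or reorient a die without anyone hand-designing a reward function.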

By integrating visual and tactile inputs, Facebook aims to enhance robotic functionality and learning methods. "To create machines that learn through independent interactions with their environments, we need robots capable of processing data from multiple senses," the research team concluded. While the future applications of this research remain undisclosed, they hold significant potential for advancing robotic capabilities.
