Sophy AI Triumphs Over Gran Turismo's Elite Racers: A New Era in Racing Technology

Hyper-capable AI in Gaming: Introducing GT Sophy

For years, hyper-capable AIs have dominated our favorite games. From Go and Jeopardy! to Dota 2 and NetHack, artificial intelligence has repeatedly demonstrated its competitive edge, advancing game-playing technology alongside machine learning and computational science.

Now, Sony has unveiled its latest innovation: GT Sophy, an AI racer designed to challenge and surpass some of the world’s best Gran Turismo players. Created through a collaboration between Sony AI, Polyphony Digital, and Sony Interactive Entertainment, GT Sophy embodies over five years of extensive research and development.

“Gran Turismo Sophy represents a significant leap in AI, aimed not just at outperforming human players but at providing a compelling opponent that inspires players to enhance their skills and creativity,” stated Hiroaki Kitano, CEO of Sony AI. “This breakthrough not only benefits the gaming community but also paves the way for advancements in autonomous racing, driving, high-speed robotics, and control systems.”

Using an innovative deep reinforcement learning method, the research team trained Sophy to control a digital race car in the Gran Turismo environment. The AI learned vehicle dynamics, racing tactics such as slipstreaming, and essential track etiquette.

“To compete, GT Sophy must master driving at physical limits, optimizing braking and acceleration while finding the best racing lines to gain milliseconds,” explained Michael Spranger, COO of Sony AI. “This includes understanding its opponents and managing complex aerodynamic interactions on the track.”

Sophy’s training involved deep reinforcement learning to perfect its performance on the circuit. “By observing factors such as speed, acceleration, and the positions of competitors, GT Sophy learns to execute actions like throttling, steering, and braking effectively,” Spranger noted. “The AI receives positive feedback—or rewards—when it successfully navigates and overtakes, and negative signals when it falters.”
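
To make that observation-action-reward framing concrete, here is a minimal, purely illustrative reinforcement learning loop in Python. The `RacingEnv` class, its toy dynamics, the reward weights, and the placeholder random policy are all assumptions for demonstration only; Sony's actual system trains a deep neural network against the real Gran Turismo simulator with far richer observations and rewards.

```python
import numpy as np

# Illustrative sketch only: GT Sophy's real training setup is proprietary.
# This shows the generic RL pattern described above: observe, act, receive
# positive reward for progress and negative reward for mistakes.

class RacingEnv:
    """Toy stand-in for a racing simulator exposing RL-style observations."""

    def reset(self):
        # Observation: own speed, acceleration, and relative positions of rivals.
        return np.zeros(6)

    def step(self, action):
        throttle, steering, brake = action
        # Toy dynamics: reward forward progress, penalize leaving the track.
        progress = max(0.0, throttle - brake) * np.random.uniform(0.8, 1.2)
        off_track = abs(steering) > 0.9
        reward = progress - (5.0 if off_track else 0.0)
        next_obs = np.random.randn(6)   # placeholder for the next observation
        done = off_track                # episode ends on a crash
        return next_obs, reward, done


def policy(obs):
    """Placeholder policy; a deep RL agent would replace this with a neural network."""
    return np.clip(np.random.randn(3), -1.0, 1.0)


env = RacingEnv()
for episode in range(3):
    obs, total = env.reset(), 0.0
    for _ in range(100):
        obs, reward, done = env.step(policy(obs))
        total += reward                 # positive when driving well, negative on errors
        if done:
            break
    print(f"episode {episode}: return {total:.2f}")
```

In a real deep reinforcement learning setup, the returns collected this way would be used to update the policy network's weights so that actions leading to faster, cleaner laps become more likely over time.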

The results of this training have been remarkable. Within just two days, Sophy outperformed 95% of human competitors. In a recent exhibition race against top Gran Turismo drivers in Japan, GT Sophy and two other AI variants claimed victory at the Lago Maggiore circuit, finishing more than five seconds ahead of the nearest human racer. Notably, one of the AI racers showed a human-like misjudgment, miscalculating a passing maneuver and crashing into the wall.

“This project goes beyond mere technical achievement,” said Kenichiro Yoshida, CEO of Sony Group. “It’s about empowering game developers with AI tools to create new, engaging player experiences.”

Anticipation is building as players prepare to challenge GT Sophy themselves. Gran Turismo 7, launching March 4th on PS4 and PS5, could eventually integrate the groundbreaking AI as an in-game driving coach or teammate through future updates.
