Google is inviting users to help train its AI to understand human speech through a new initiative called LipSync. The project, built by YouTube for Chrome on desktop, encourages users to lip-sync a short segment of "Dance Monkey" by Tones and I.
As you perform, LipSync evaluates your lip movements and sends the video clips to Google's AI. Only visual data is collected; no audio is recorded. The aim is to help the AI learn how human faces move during speech. This work holds significant potential for people with ALS and other speech impairments, as an AI that can interpret facial movements could eventually speak on their behalf.
Google's ongoing commitment to accessibility is evident in a range of initiatives, from Android apps for people who are hard of hearing to Maps features that highlight accessible locations. Progress in AI speech recognition could lead to even more such solutions in the future.