Meta's efforts in generative AI have run into regulatory challenges in both the EU and Brazil over concerns about its data training practices. The parent company of Facebook and Instagram has decided not to bring its multimodal AI models to European users because of uncertainty surrounding data rules. The European Data Protection Supervisor has asked Meta to pause training future AI models on data from European users, a request with which Meta is currently complying. As a result, EU users may miss out on advanced AI features such as the Meta AI chatbot while Meta withholds its next multimodal AI model from the region.
The company, which has been working on several multimodal systems, including an enhanced version of its Llama model capable of processing text, video, image, and audio inputs, has cited the unpredictable regulatory environment in Europe as the reason. Meta is unsure whether its use of European user data complies with data protection laws such as the GDPR. That uncertainty also creates problems for businesses building on Meta's AI models, since it could limit their ability to offer products and services in Europe that incorporate the new multimodal models.
In an attempt to address the issue, Meta proposed a solution in June that would let EU users opt out of having their data used to train AI models. Dieter Gerdemann, a European lead at Kearney, believes the Data Protection Supervisor's request will affect all technology companies that use European data to train AI models, and that it will prompt a broader discussion about legislation that clearly governs how data may be used in AI training.
Meanwhile, Meta has halted its Meta AI chatbot service on its social media apps in Brazil after the country's National Data Protection Authority (ANPD) prohibited the use of Brazilian citizens' personal data to train AI models. The agency cited concerns with Meta's revised privacy policy, which included permissions for such data usage but lacked transparency. In response, Meta suspended the chatbot service in Brazil while it continues discussions with the ANPD.