TikTok Penalized in Italy Following Consumer Safety Investigation Linked to Controversial 'French Scar' Challenge

Italy’s competition and consumer authority, AGCM, has imposed a €10 million (approximately $11 million) fine on TikTok following an investigation into serious algorithmic safety concerns. This inquiry was initiated last year after reports surfaced regarding a disturbing “French Scar” challenge, where users shared videos showcasing facial marks created by pinching their skin.

In a press release issued on Thursday, AGCM detailed that three regional subsidiaries of ByteDance—TikTok Technology Limited (Ireland), TikTok Information Technologies UK Limited, and TikTok Italy Srl—were penalized for what it characterized as “unfair commercial practices.” The authority criticized TikTok for not establishing adequate mechanisms to oversee the content shared on its platform, especially concerning videos that could endanger minors and other vulnerable groups. Moreover, it noted that such harmful content is frequently promoted to users through algorithmic recommendations, leading to increased usage of the platform.

AGCM's investigation found that TikTok plays a role in distributing content that poses risks to the mental and physical safety of users, particularly minors. The authority highlighted the platform's failure to implement sufficient protections against this content and its failure to enforce its own community guidelines, which, it added, are not adequately tailored to the particular vulnerabilities of adolescents. The agency pointed to the developmental sensitivity of teenagers, who may be more susceptible to peer pressure and more likely to engage in risky behaviors to fit in socially.

The AGCM's remarks underscored the critical role TikTok's recommendation algorithms play in disseminating potentially harmful content. The platform's design incentivizes user engagement and interaction, aiming to prolong user activity to maximize advertising revenue. This recommendation system underpins TikTok's "For You" and "Following" feeds, relying on algorithmic profiling of users' digital behaviors to tailor the content they see.

"This creates an undue conditioning of users, leading them to spend increasing amounts of time on the platform," the AGCM remarked, signaling a notable critique of engagement driven by such profiling-based content feeds.

We have reached out to the authority for further clarification. However, the AGCM's critical stance toward algorithmic profiling is significant, especially as some European lawmakers are advocating for default settings that disable profiling-based content feeds. Advocacy groups such as the Irish Council for Civil Liberties (ICCL) have suggested that this could mitigate the outrage-driven dynamics that ad-supported social media platforms exploit, which can contribute to societal division and discord.

TikTok has contested the AGCM’s decision, framing the ruling as an overreaction to a single, minor challenge rather than acknowledging broader algorithmic risks. In a statement, the platform highlighted that “French Scar” content achieved only around 100 daily searches in Italy before the AGCM’s announcement, asserting that it had already limited visibility of such content to users under 18 and excluded it from the For You feed.

Although this enforcement action originates from a single EU member state, TikTok also falls under the European Commission's purview for compliance with the algorithmic accountability and transparency requirements of the Digital Services Act (DSA). Noncompliance could lead to fines of up to 6% of the company's global annual revenue. TikTok was designated a very large online platform (VLOP) under the DSA in April last year, with compliance expected by late summer.

A key change mandated by the DSA is that TikTok must offer feeds that are not based on profiling; however, these alternatives are opt-in, meaning users must actively choose to disengage from AI-driven tracking and profiling.

Last month, the EU launched a formal investigation into TikTok, focusing on issues such as addictive design practices and harmful content while prioritizing the protection of minors. This inquiry is still ongoing, and TikTok has expressed its eagerness to provide the Commission with comprehensive details about its strategies for protecting young users.

Nonetheless, TikTok has encountered various regulatory challenges over child safety in recent years, including interventions by the Italian data protection authority; a €345 million fine from Ireland's Data Protection Commission over violations involving minors' data; and ongoing concerns voiced by consumer protection organizations about the safety and profiling of younger users.

Furthermore, TikTok faces increasing potential for regulation from member state agencies implementing the Audiovisual Media Services Directive, including scrutiny from Ireland’s Coimisiún na Meán, which is contemplating rules mandating that profiling-based recommendation algorithms be disabled by default.

The platform also faces significant challenges in the U.S., where lawmakers have proposed a bill to ban TikTok unless it severs ties with its Chinese parent company, ByteDance, citing concerns over national security and the risk of foreign manipulation through user profiling and tracking.
