Europe's Digital Services Act Takes Effect Tomorrow—Essential Insights You Should Know

The European Union's updated e-commerce regulations will be fully implemented starting tomorrow. These regulations introduce new legal obligations for thousands of platforms and digital businesses that fall within their scope. The Digital Services Act (DSA) is a significant initiative by the EU aimed at establishing an online governance framework for platforms while utilizing transparency requirements to eliminate illegal content and products from the regional internet.

The fundamental principle is simple: if something is illegal to say or sell in a specific Member State, it shouldn't be possible to bypass those laws online. Therefore, online marketplaces in Europe cannot facilitate the sale of items like firearms in jurisdictions where such sales are prohibited. Similarly, social media platforms must remove hate speech if local laws are in place against it.

Protecting minors is another critical aspect of the DSA. The regulation mandates that relevant platforms provide "a high level of privacy, safety, and security" for children, including a ban on using their data for targeted advertising.

While it's challenging to pinpoint the exact number of companies affected—given the constant emergence of new digital platforms—the EU estimates that at least a thousand businesses will need to comply with these new rules. Failure to adhere to the DSA can result in severe penalties, with fines reaching up to 6% of global annual turnover for confirmed violations.

In addition to enforcing content moderation on platforms and know-your-customer (KYC) requirements for marketplaces, the DSA also imposes certain obligations on hosting services and online intermediaries, such as ISPs and domain registries. Smaller platforms, categorized as "micro" or "small" enterprises (employing fewer than 50 people and earning below €10 million annually), are largely exempt from many of these provisions. However, they must still provide clear terms and conditions and designate a contact point for authorities. Rapidly scaling startups that exceed these criteria will receive a "targeted exemption" for certain DSA provisions during a transitional 12-month period, as specified by the Commission.

Affected companies have had over a year to prepare their compliance strategies since the DSA text was published in October 2022, but many businesses are still working to understand how the rules will specifically apply to them.

More Regulations for Major Tech Companies

Large platforms and marketplaces will face the strictest DSA requirements. They have already met one compliance deadline: a subset of DSA regulations focusing on algorithmic transparency and risk mitigation has been in effect for major platforms and search engines (referred to as VLOPs and VLOSEs) since late August. In December, the Commission initiated its first formal investigation into a VLOP—Elon Musk's X (formerly Twitter)—over suspected violations.

Starting tomorrow, these tech giants, including X, must also comply with the DSA's general obligations. Whatever the state of Musk's compliance efforts to date, X now faces further requirements: providing content reporting tools for users, allowing users to appeal moderation decisions, collaborating with "trusted flaggers" (authorized third-party reporting agents), producing transparency reports, and enforcing KYC rules.

Platforms are also required to provide users with a "statement of reasons" for every moderation decision that affects them, such as content removal or demotion. The EU is already compiling these statements into a database, which has accumulated more than 4 billion entries from the larger platforms subject to VLOP rules. As smaller platforms begin contributing their statements, the Commission aims to build a comprehensive overview of content moderation practices across the bloc.

Additional obligations for platforms include disclosing information about the advertisements they run and any algorithmic recommender systems they use. The DSA explicitly forbids using minors' data for targeted advertising, so platforms must ensure children's information is excluded from existing ad targeting systems. However, the Commission acknowledges the difficulty of determining users' ages without infringing on privacy rights, for instance by mandating comprehensive age verification measures.

As a result, starting tomorrow, all platforms must provide "effective protection measures for minors." Ongoing discussions among DSA enforcers are focused on identifying acceptable technologies to support compliance, leaving platforms uncertain about the specific methods they should adopt.

Digital Services Coordinators

Oversight of the general DSA rules, even for tech giants, falls to Digital Services Coordinators (DSCs) at the EU Member State level; the Commission retains enforcement of the special obligations that apply to VLOPs and VLOSEs. The DSA brings an entirely new layer of digital supervision to online activities across Europe while maintaining the "country of origin" principle from previous e-commerce regulations. Regulatory oversight will therefore come from authorities in the country where a platform is established.

For instance, X's compliance with DSA regulations will likely be monitored by the Irish media regulator, Coimisiún na Meán, which will also oversee compliance for other tech giants like Apple, Meta, and TikTok, all of which have their European headquarters in Ireland. In contrast, the Luxembourg competition authority may oversee Amazon's compliance due to its choice of regional base.

Platforms without a regional establishment and those that have not appointed a local legal representative may face enforcement actions from any competent body within any Member State, increasing their risk of regulatory exposure. This scenario raises questions about the enforceability of EU laws on foreign entities, particularly considering existing challenges faced by EU data protection authorities in regulating companies like Clearview AI under the GDPR.

Smaller EU-based platforms and startups will likely be subject to general DSA oversight from the DSC appointed in their respective markets. For example, France's BeReal, a popular photo-sharing service, will presumably have its DSA compliance monitored by ARCOM, the country's communications and audiovisual regulator.

DSCs are a mix of existing regulatory agencies, including telecom, media, consumer, and competition regulators. Member States may appoint multiple bodies to ensure thorough oversight.

The EU has created a webpage for identifying DSCs in each Member State, though not all appointments have been finalized as of this writing. As "coordinators," these DSCs will work together to leverage relevant expertise and ensure effective oversight of various platforms and businesses. They will also assist the Commission's enforcement actions on larger platforms' systemic risks, although ultimate enforcement authority over VLOPs/VLOSEs remains with the Commission.

Additionally, the DSA establishes a new body, the "European Board for Digital Services," in which DSCs will meet regularly to share information and coordinate their efforts. Early workstreams are already focused on setting best practices for researcher data access, awarding trusted flagger status, and streamlining user complaint handling.

Until best practice consensus is achieved and compliance guidelines are issued, regulated platforms will have to navigate their own paths forward. DSCs will also serve as contact points for citizens wishing to file DSA-related complaints, taking responsibility for forwarding complaints about platforms that fall outside their jurisdiction to the appropriate body.

In addition to regulatory complaints, EU consumers will have options for collective redress litigation if a company fails to uphold their rights under the DSA, adding yet another layer of potential litigation for non-compliant platforms.

DSCs appointed in time for tomorrow's deadline may choose to initiate investigations or request information from the platforms they oversee. However, it remains uncertain how quickly these new digital enforcers will act. Based on previous EU digital regulation implementations, it is probable that platforms will be given some leeway to develop compliance measures as enforcement agencies adjust to their new roles.

DSCs can issue fines of up to 6% of a company's global annual turnover for violations of the general rules, matching the penalty ceiling the Commission can apply to VLOPs/VLOSEs for breaches of their special obligations. In practice, this means the largest platforms could face penalties under both regimes if they violate both the general rules and the extra VLOP/VLOSE requirements.

With the DSA now fully in effect, platforms operating in the EU face a denser regulatory landscape, with numerous new requirements and another layer of oversight on top of a growing body of existing law, including the General Data Protection Regulation, the ePrivacy Directive, the Data Act, and the anticipated AI Act.

Legal firms and consultants will likely find significant demand in interpreting how these complex regulations intertwine and overlap.

Emerging Challenges and Insights

In a sign of the evolving regulatory environment, Ireland’s Coimisiún na Meán is currently consulting on rules for video-sharing platforms that may require them to disable profiling-based content feeds by default in that market.

Although this proposed policy stems from EU audiovisual rules rather than the DSA, it exemplifies how Ireland’s regulatory body could conduct unique regulatory experiments when implementing the DSA for platforms like Meta, TikTok, and X.

Another intriguing topic is how the DSA will be applied to rapidly evolving generative AI tools. The swift popularity of AI chatbots like OpenAI’s ChatGPT arose after the DSA had already been drafted, but the regulation is designed to be adaptable to new types of platforms and services that may emerge.

When questioned about this, a Commission official noted two scenarios concerning generative AI tools: one involves embedding such AI within an existing DSA-compliant platform (e.g., integrating it into a search engine or recommendation system), where the DSA's guidelines already apply.

The second involves standalone AI tools not integrated into platforms defined under the DSA. Here, the challenge lies in determining whether the AI tool qualifies as either a platform or a search engine according to the regulation's definitions.

“A legal examination will clarify whether it functions as a search engine, or if it is technically hosting content in response to user requests,” the official explained. “If it meets the definition, the DSA applies straightforwardly.”

However, the timeline for this determination remains unclear and will depend on the discretion of the corresponding DSC.

According to the Commission, standalone AI tools meeting the DSA's platform or search engine definition and boasting at least 45 million monthly users could, in the future, be classified as VLOPs/VLOSEs. If that occurs, the additional requirements for algorithmic transparency and risk management would come into force, with the Commission accountable for oversight and enforcement. However, details from the upcoming AI Act may also influence how these regulations interact, further complicating this landscape.

This evolving regulatory landscape presents both challenges and opportunities as the DSA is set to reshape how platforms operate across the EU.
