Intermediary Liability Revisited: Platform Accountability in Recent Judgments

by Alexandra Blake, Key-g.com
5 min read
Legal advice
April 14, 2025

The concept of intermediary liability—whether and when digital platforms are responsible for user-generated content—has been a cornerstone of EU digital law since the adoption of the E-Commerce Directive (Directive 2000/31/EC). Under Article 14, hosting providers are exempt from liability for illegal user content as long as they have no actual knowledge of it or, upon obtaining such knowledge, act expeditiously to remove or disable access to it.

However, recent case law from the Court of Justice of the European Union (CJEU), particularly Glawischnig-Piesczek v Facebook Ireland (C-18/18), has challenged some long-held assumptions about the scope and limits of that exemption. Now that the EU’s Digital Services Act (DSA) applies, these decisions offer critical guidance on what platforms must do—and what they can no longer ignore.

Glawischnig-Piesczek v Facebook (C-18/18): Beyond Notice-and-Takedown

In this 2019 ruling, the CJEU considered whether Facebook could be ordered to remove or block access to defamatory content worldwide and whether such an obligation could extend to identical or equivalent content—not just the specific post reported.

The case was initiated by Austrian politician Eva Glawischnig-Piesczek, who sought removal of a user’s Facebook post that insulted and defamed her. She also requested that Facebook prevent equivalent content from appearing in the future.

Key Holdings:

  • A platform can be required to remove content identical to or equivalent to content that has already been found unlawful, provided that the search does not require independent assessment by the platform.
  • The obligation can extend globally, depending on the scope of the national court’s order and applicable international law.
  • The decision does not conflict with Article 15 of the E-Commerce Directive, which prohibits general monitoring obligations, because the obligation to monitor is specific and targeted.

Implications:

  • Platforms must be prepared to implement automated filtering or proactive identification tools once they are notified of unlawful content (a minimal technical sketch follows this list).
  • The idea of “equivalent content” introduces new complexity. Platforms must decide how broadly to interpret similarity and what tools are necessary to comply.
  • The decision suggests that jurisdictional boundaries may not limit removal obligations, raising risks of global takedown orders.
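
By way of illustration only, the sketch below shows one way a platform might operationalize the distinction between identical and "equivalent" re-posts of a specific item already found unlawful: a hash of the normalized text catches exact re-uploads, while a simple word-overlap score flags near-duplicates for targeted human review. The normalization rules, the similarity threshold, and the function names are assumptions made for this sketch, not a legal standard or any platform’s actual system.

```python
# Illustrative sketch only: distinguishing identical re-posts from "equivalent"
# wording of a post already found unlawful. Thresholds, normalization rules,
# and function names are assumptions for this example.
import hashlib
import re


def normalize(text: str) -> str:
    """Lower-case and collapse whitespace so trivial edits do not defeat matching."""
    return re.sub(r"\s+", " ", text.strip().lower())


def fingerprint(text: str) -> str:
    """Stable hash used to detect identical re-posts of a known unlawful post."""
    return hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()


def word_overlap(a: str, b: str) -> float:
    """Jaccard word overlap, a crude proxy for 'equivalent' wording."""
    wa, wb = set(normalize(a).split()), set(normalize(b).split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0


def classify(new_post: str, unlawful_posts: list[str], threshold: float = 0.8) -> str:
    """Return 'identical', 'equivalent' (route to human review), or 'no_match'."""
    known = {fingerprint(p) for p in unlawful_posts}
    if fingerprint(new_post) in known:
        return "identical"
    if any(word_overlap(new_post, p) >= threshold for p in unlawful_posts):
        return "equivalent"  # targeted match against specified posts, not general monitoring
    return "no_match"
```

Keeping the "equivalent" branch as a referral to human review, rather than an automatic removal, is one way to respect the Court’s caveat that the platform should not be forced into an independent assessment of the content.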

Other Notable Judgments on Hosting Liability

While Glawischnig-Piesczek is the headline case, several other decisions have shaped the liability framework for online intermediaries:

  • YouTube and Cyando (Joined Cases C-682/18 and C-683/18):
    • Reaffirmed that platforms can benefit from hosting liability exemptions if they remain passive and do not actively contribute to the presentation or promotion of illegal content.
    • However, they lose immunity if they play an active role—such as curating content or recommending infringing material.
  • SABAM v Netlog (C-360/10):
    • Confirmed that general filtering obligations are not permitted under EU law.
    • Platforms cannot be compelled to monitor all user content in advance.
  • L’Oréal v eBay (C-324/09):
    • Established that platforms may be held liable when they are aware of illegal activity and fail to act expeditiously.
    • Platforms engaging in commercial promotion of third-party goods may not be “neutral” intermediaries.

Key Principles Emerging from Case Law

  • Actual knowledge triggers responsibility. Once a platform is notified of illegal content, it must act promptly or face liability.
  • Passive vs. active distinction is critical. The more editorial or curatorial control a platform exerts, the less likely it is to qualify for immunity.
  • Obligations may be ongoing and proactive. Platforms may have to remove not just the original content, but also equivalent content—even proactively.
  • No general monitoring, but targeted filtering allowed. Courts can order platforms to prevent specific types of content, but not to pre-screen all uploads.

How Platforms Should React

  1. Implement Effective Notice-and-Action Mechanisms.
    Platforms must ensure they have fast, transparent, and user-friendly processes for handling illegal content notifications. Delays can trigger liability.
  2. Develop and Audit Filtering Technology.
    Proactive tools (e.g., content hashing, keyword matching) may be required to identify and remove equivalent or repeat content.
  3. Maintain Content Moderation Logs and Records.
    Platforms should document takedown decisions, timestamps, and user notifications to demonstrate good-faith compliance (a sketch of such a record follows this list).
  4. Establish Clear User Terms and Enforcement Policies.
    Strong and enforceable community guidelines can help demonstrate the platform’s commitment to preventing abuse.
  5. Assess Global Impacts of Court Orders.
    Legal teams must evaluate whether takedown obligations have international reach and how to balance conflicting laws.
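
As a purely illustrative complement to points 1 and 3 above, the sketch below shows a minimal notice-and-action record that timestamps receipt, decision, and notifications, and appends each record to a simple audit log. The field names, the JSON Lines storage, and the record_decision helper are hypothetical assumptions, not a prescribed compliance schema under the E-Commerce Directive or the DSA.

```python
# Illustrative sketch only: a timestamped notice-and-action record appended to
# an audit log. Field names and storage format are hypothetical assumptions,
# not a prescribed compliance schema.
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone


def now_iso() -> str:
    """UTC timestamp so takedown timelines can be evidenced later."""
    return datetime.now(timezone.utc).isoformat()


@dataclass
class NoticeRecord:
    notice_id: str
    content_url: str
    alleged_ground: str                      # e.g. "defamation", "trademark infringement"
    received_at: str = field(default_factory=now_iso)
    decision: str | None = None              # e.g. "removed", "disabled", "rejected"
    decided_at: str | None = None
    reporter_notified_at: str | None = None
    uploader_notified_at: str | None = None


def record_decision(record: NoticeRecord, decision: str, log_path: str) -> None:
    """Timestamp the decision and append the full record to an append-only log."""
    record.decision = decision
    record.decided_at = now_iso()
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(asdict(record)) + "\n")
```

Recording when a notice was received and when action was taken is what allows a platform to later show that it acted expeditiously once it obtained actual knowledge.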

Conclusion

Intermediary liability in the EU is no longer a static or purely reactive concept. The CJEU has opened the door to proactive, targeted obligations, and platforms must respond with scalable and legally robust processes. With the Digital Services Act reinforcing and expanding upon these principles, the need for legal risk assessments, compliance programs, and moderation infrastructure is greater than ever.

For platform operators, legal counsel, and digital service providers, staying ahead of this case law is essential—not just to avoid liability, but to build trust in an increasingly regulated online environment.

If your company operates an online platform, we can assist in reviewing your content policies, notification systems, and legal risk under EU and national laws. Contact our digital services legal team for tailored compliance support.