Legal consulting · April 14, 2025 · 5 min read

    Notice-and-Action Mechanisms in the Courts

    Notice-and-action mechanisms have become a cornerstone of online content regulation, shaping how platforms respond to illegal content. This article examines key court rulings that interpret these mechanisms, highlighting their impact on platform liability and user rights.

    The notice-and-action (N&A) mechanism lies at the heart of the EU’s framework for content moderation under the E-Commerce Directive and, more recently, the Digital Services Act (DSA). It requires online platforms and hosting providers (both categories of Information Society Services, ISS) to remove or disable access to illegal content once they have been duly notified, on pain of losing their liability exemption.

    While the legal foundation for N&A systems is harmonized at the EU level, national courts have played a central role in interpreting how these obligations apply in practice. Key cases from Member States reveal diverging standards on what constitutes a valid notice, how swiftly platforms must act, and when intermediary liability arises.

    The Legal Framework: From E-Commerce Directive to DSA

    Under Article 14 of the E-Commerce Directive (2000/31/EC), hosting providers are not liable for illegal content stored on their services provided they do not have actual knowledge of its illegality, or upon obtaining such knowledge, act “expeditiously” to remove it.

    The DSA (Regulation (EU) 2022/2065) builds on this foundation, introducing:

    • A standardized notice-and-action system for all hosting providers (Art. 16),
    • Requirements for user-friendly and accessible notification tools,
    • Obligations to give reasons when removing or retaining content (Art. 17),
    • Avenues for appeal and complaints (Art. 20).
    National courts are already shaping the practical meaning of these principles, especially in the absence of detailed CJEU guidance on procedural requirements.

    Germany: Rigorous Timelines and Systemic Duties

    German courts have long interpreted N&A obligations strictly, especially under the Network Enforcement Act (NetzDG), which imposes a 24-hour deadline for removing "manifestly unlawful content".

    In a 2021 decision by the Higher Regional Court of Dresden, a social media platform was held liable for defamatory comments posted by users because it delayed action after receiving a sufficiently detailed takedown request. The court emphasized that once notified, platforms must act swiftly—even without a formal court order—if the content’s illegality is clear.

    Key Takeaway:
    In Germany, a sufficiently detailed notice from an affected party can establish actual knowledge, and delays beyond the statutory windows (24 hours for manifestly unlawful content under the NetzDG, generally seven days otherwise) may lead to liability, even without further judicial involvement.

    France: Balancing Judicial Oversight with Platform Responsibility

    France’s approach under the LCEN (Loi pour la confiance dans l’économie numérique, 2004) reflects a more cautious stance, with a strong emphasis on judicial determination of illegality before content removal is mandated.

    In a 2022 case before the Paris Court of Appeal, the court ruled that a platform was not liable for failing to remove content upon user request, because the reported material was not manifestly illegal and required judicial assessment of defamation. The court underscored that platforms are not obliged to act as judges and may wait for a court decision unless the content clearly violates the law (e.g., hate speech, incitement to violence).

    Key Takeaway:
    French courts distinguish between manifestly illegal content, which must be removed upon notice, and content whose illegality is ambiguous or disputed—where judicial review is expected.

    Spain: Emphasis on Response Times and Duty of Diligence

    In Spain, the Ley de Servicios de la Sociedad de la Información (LSSI) incorporates the E-Commerce Directive’s N&A mechanism. Spanish courts have generally favored a pragmatic, diligence-based approach.

    In a 2020 decision by the Audiencia Provincial de Madrid, a platform was found partially liable for failing to remove slanderous content after receiving multiple user complaints. The court concluded that, although the platform lacked “actual knowledge” at first, the pattern of repeated notifications and lack of internal escalation procedures showed a breach of its duty to act diligently.

    Key Takeaway:
    Spanish courts expect platforms to implement effective internal escalation procedures and treat repeated user notifications as a trigger for deeper content review.
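    The Madrid court's reasoning implies that repeated notifications about the same item should itself trigger escalation. A minimal sketch of that trigger follows; the threshold value is purely illustrative, as the case law fixes no number.

```python
from collections import Counter

# Illustrative only: no Spanish ruling prescribes a specific count.
ESCALATION_THRESHOLD = 3

def should_escalate(complaints: list[str], content_id: str) -> bool:
    """Escalate a content item for deeper review once it attracts
    repeated user complaints, per the diligence-based approach."""
    return Counter(complaints)[content_id] >= ESCALATION_THRESHOLD
```

    In practice the complaint stream would come from the platform's notice intake, with escalation routing the item to trained human reviewers.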

    Cross-Border Implications Under the DSA

    The DSA now provides a harmonized structure that reflects many of the principles derived from national case law:

    • Platforms must ensure accessible, standardized mechanisms for users to report illegal content (Art. 16).
    • Notices must include a clear explanation of why the content is considered illegal, its exact electronic location (typically the URL), the identity of the notifier (with limited exceptions), and a statement that the notice is submitted in good faith.
    • Hosting providers must notify users when action is taken on their content, and must preserve due process rights through complaint and appeal mechanisms.
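    The Article 16 elements listed above map naturally onto a structured intake form. Below is a minimal Python sketch of how a hosting provider might model and triage incoming notices; the field names and the `validate` helper are illustrative, not drawn from the Regulation's text.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    """Illustrative model of an Art. 16 DSA notice (field names are ours)."""
    explanation: str        # why the notifier considers the content illegal
    content_url: str        # exact electronic location of the item
    notifier_name: str      # identity of the person or entity submitting
    notifier_email: str
    good_faith_statement: bool = False  # bona fide declaration
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def validate(notice: Notice) -> list[str]:
    """Return the missing or defective elements; an empty list means the
    notice is complete enough to enter the review workflow."""
    problems = []
    if not notice.explanation.strip():
        problems.append("missing explanation of alleged illegality")
    if not notice.content_url.startswith(("http://", "https://")):
        problems.append("content location is not a usable URL")
    if not notice.notifier_name.strip():
        problems.append("missing notifier identity")
    if not notice.good_faith_statement:
        problems.append("missing good-faith statement")
    return problems
```

    A complete notice yields an empty problem list; an incomplete one can be bounced back to the notifier with the specific defects listed, which itself evidences a functioning mechanism.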

    The first enforcement actions under the DSA are beginning to reflect these obligations. Regulators and courts are now focusing on whether platforms respond proportionately and promptly—and whether users are kept informed throughout the process.

    Recommendations for ISS Providers

    1. Implement Robust N&A Infrastructure
      Ensure that your platform has an easily accessible and well-documented complaint system, with built-in workflows for content review, escalation, and response.
    2. Train Moderation Teams on Legal Criteria
      Content moderators must understand the legal definitions of “manifestly illegal” under applicable national laws, especially when distinguishing hate speech from offensive opinion or satire.
    3. Document Every Action Taken
      Retain logs of notices received, actions taken, and timelines to demonstrate diligence and compliance in the event of legal challenge or regulatory audit.
    4. Review and Update Terms of Service
      Clearly explain moderation rules, notification procedures, and appeal channels to users in your platform’s terms and privacy policies.
    5. Monitor Jurisprudence Across Member States
      As national interpretations of the DSA evolve, platforms must stay up to date with the case law in key jurisdictions—especially where they operate or have significant user bases.
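    Recommendation 3 above (document every action) can be sketched as an append-only moderation trail. The record structure below is an illustration of the kind of log that supports a diligence defence; the DSA prescribes no particular format.

```python
import json
from datetime import datetime, timezone

def log_action(log: list, notice_id: str, action: str, reason: str) -> dict:
    """Append a timestamped record of one moderation step. Typical actions
    (our own labels): 'notice_received', 'escalated', 'removed', 'retained'."""
    entry = {
        "notice_id": notice_id,
        "action": action,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    log.append(entry)
    return entry

# A reviewable trail for a single (hypothetical) notice:
trail: list = []
log_action(trail, "N-2025-0042", "notice_received", "user report via web form")
log_action(trail, "N-2025-0042", "escalated", "possible defamation; legal review")
log_action(trail, "N-2025-0042", "removed", "manifestly unlawful under national law")
print(json.dumps(trail, indent=2))
```

    Because each entry carries a timestamp, the same trail also demonstrates compliance with response-time expectations in the event of a regulatory audit.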

    Conclusion

    The evolution of notice-and-action jurisprudence across EU Member States shows a steady move toward greater accountability and procedural transparency. While national approaches vary, the trend is clear: platforms must respond to credible notifications with diligence and traceability.

    With the DSA now in force, litigation is likely to focus on compliance with procedural rights and documentation of internal moderation practices. For platform operators and digital service providers, understanding the practical standards set by national courts is essential to avoiding liability and maintaining user trust.
