Notice-and-Action Mechanisms in the Courts

By Alexandra Blake, Key-g.com
5 min read
Legal Consulting
April 14, 2025

The notice-and-action (N&A) mechanism lies at the heart of the EU’s framework for content moderation under the E-Commerce Directive and, more recently, the Digital Services Act (DSA). These mechanisms require online platforms and hosting providers—collectively known as Information Society Services (ISS)—to remove or disable access to illegal content once they have been duly notified.

While the legal foundation for N&A systems is harmonized at the EU level, national courts have played a central role in interpreting how these obligations apply in practice. Key cases from Member States reveal diverging standards on what constitutes a valid notice, how swiftly platforms must act, and when intermediary liability arises.

The Legal Framework: From E-Commerce Directive to DSA

Under Article 14 of the E-Commerce Directive (2000/31/EC), hosting providers are not liable for illegal content stored on their services provided they do not have actual knowledge of its illegality, or upon obtaining such knowledge, act “expeditiously” to remove it.

The DSA (Regulation (EU) 2022/2065) builds on this foundation, introducing:

  • A standardized notice-and-action system for all hosting providers (Art. 16; a data-model sketch follows this list),
  • Requirements for user-friendly and accessible notification tools,
  • Obligations to give reasons when removing or retaining content (Art. 17),
  • Avenues for appeal and complaints (Art. 20).
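
To make these moving parts concrete, here is a minimal TypeScript sketch of the data such a notice-and-action pipeline might capture. The shapes and field names are our own illustrative assumptions, not terms defined in the Regulation.

```typescript
// Illustrative sketch only: field names are assumptions, not DSA-defined terms.

/** A notice submitted through an Art. 16-style mechanism. */
interface IllegalContentNotice {
  noticeId: string;
  submittedAt: Date;
  contentUrl: string;            // exact electronic location of the reported item
  explanation: string;           // why the notifier considers the content illegal
  notifierName?: string;         // optional: identity rules vary by offence type
  notifierEmail?: string;
  goodFaithDeclaration: boolean; // notifier confirms accuracy and completeness
}

/** An Art. 17-style statement of reasons for the affected user. */
interface StatementOfReasons {
  noticeId: string;
  decision: "removed" | "disabled" | "retained";
  ground: string;                // the legal provision or ToS clause relied on
  redressOptions: string[];      // e.g. internal complaint (Art. 20), courts
  decidedAt: Date;
}
```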

National courts are already shaping the practical meaning of these principles, especially in the absence of detailed CJEU guidance on procedural requirements.

Germany: Rigorous Timelines and Systemic Duties

German courts have long interpreted N&A obligations strictly, especially under the Network Enforcement Act (NetzDG), which imposes a 24-hour deadline for removing “manifestly unlawful content”.

In a 2021 decision by the Higher Regional Court of Dresden, a social media platform was held liable for defamatory comments posted by users because it delayed action after receiving a sufficiently detailed takedown request. The court emphasized that once notified, platforms must act swiftly—even without a formal court order—if the content’s illegality is clear.

Key Takeaway:
In Germany, a detailed notice from an affected party can be sufficient to establish actual knowledge, and delays beyond a few days may lead to liability, even without further judicial involvement.
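
A compliance team could encode these timelines as a simple deadline rule. The sketch below assumes NetzDG-style thresholds (24 hours for manifestly unlawful content, seven days otherwise); the categories and cut-offs are illustrative, not legal advice.

```typescript
// Hypothetical deadline helper modelled on NetzDG-style timelines;
// the assessment categories and thresholds are illustrative assumptions.
type Assessment = "manifestly-unlawful" | "unlawful" | "lawful";

function removalDeadline(receivedAt: Date, assessment: Assessment): Date | null {
  const HOUR = 60 * 60 * 1000;
  switch (assessment) {
    case "manifestly-unlawful":
      return new Date(receivedAt.getTime() + 24 * HOUR);      // act within 24 hours
    case "unlawful":
      return new Date(receivedAt.getTime() + 7 * 24 * HOUR);  // act within 7 days
    case "lawful":
      return null; // no removal obligation
  }
}
```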

France: Balancing Judicial Oversight with Platform Responsibility

France’s approach under the LCEN (Loi pour la confiance dans l’économie numérique, 2004) reflects a more cautious stance, with a strong emphasis on judicial determination of illegality before content removal is mandated.

In a 2022 case before the Paris Court of Appeal, the court ruled that a platform was not liable for failing to remove content upon user request, because the reported material was not manifestly illegal and required judicial assessment of defamation. The court underscored that platforms are not obliged to act as judges and may wait for a court decision unless the content clearly violates the law (e.g., hate speech, incitement to violence).

Key Takeaway:
French courts distinguish between manifestly illegal content, which must be removed upon notice, and content whose illegality is ambiguous or disputed—where judicial review is expected.
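
That two-track logic can be expressed as a simple triage step: act immediately on manifest illegality, otherwise hold for legal or judicial assessment. The routing below is a hedged sketch of that distinction, with hypothetical names throughout.

```typescript
// Sketch of the triage the French case law implies: remove manifestly illegal
// content on notice, route ambiguous cases to legal review rather than
// "playing judge". Categories and routing are illustrative assumptions.
type TriageResult =
  | { action: "remove"; reason: string }
  | { action: "escalate-to-legal"; reason: string };

function triageNotice(isManifestlyIllegal: boolean, claim: string): TriageResult {
  if (isManifestlyIllegal) {
    // e.g. hate speech, incitement to violence: act without awaiting a court
    return { action: "remove", reason: `Manifestly illegal: ${claim}` };
  }
  // e.g. disputed defamation: hold for judicial or in-house legal assessment
  return { action: "escalate-to-legal", reason: `Requires assessment: ${claim}` };
}
```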

Spain: Emphasis on Response Times and Duty of Diligence

In Spain, the Ley de Servicios de la Sociedad de la Información (LSSI) incorporates the E-Commerce Directive’s N&A mechanism. Spanish courts have generally favored a pragmatic, diligence-based approach.

In a 2020 decision by the Audiencia Provincial de Madrid, a platform was found partially liable for failing to remove defamatory content after receiving multiple user complaints. The court concluded that, although the platform initially lacked “actual knowledge”, the pattern of repeated notifications and the absence of internal escalation procedures amounted to a breach of its duty to act diligently.

Key Takeaway:
Spanish courts expect platforms to implement effective internal escalation procedures and treat repeated user notifications as a trigger for deeper content review.
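
One way to honour that expectation is to let repeated notices about the same item trigger escalation automatically. A minimal sketch, assuming an in-memory counter and an arbitrary threshold of three notices (the judgment sets no such number):

```typescript
// Illustrative escalation trigger: repeated notices about the same item force
// a deeper review even if no single notice established "actual knowledge".
const ESCALATION_THRESHOLD = 3; // assumption for illustration only

const noticeCounts = new Map<string, number>(); // contentUrl -> notices received

function recordNotice(contentUrl: string): "standard-review" | "escalated-review" {
  const count = (noticeCounts.get(contentUrl) ?? 0) + 1;
  noticeCounts.set(contentUrl, count);
  return count >= ESCALATION_THRESHOLD ? "escalated-review" : "standard-review";
}
```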

Cross-Border Implications Under the DSA

The DSA now provides a harmonized structure that reflects many of the principles derived from national case law:

  • Platforms must ensure accessible, standardized mechanisms for users to report illegal content (Art. 16).
  • Notices must include a clear explanation of the alleged illegality, the exact electronic location of the content (typically its URL), and the identity of the notifier, as the validation sketch below illustrates.
  • Hosting providers must notify users when action is taken on their content, and must preserve due process rights through complaint and appeal mechanisms.
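
An intake endpoint might validate incoming notices against those elements before processing begins. The sketch below reuses the IllegalContentNotice shape from the earlier example; the acceptance rules are illustrative assumptions, not a restatement of what Art. 16 legally requires.

```typescript
// Minimal validation sketch for the elements listed above; acceptance rules
// are illustrative assumptions, not legal thresholds.
function validateNotice(n: IllegalContentNotice): string[] {
  const problems: string[] = [];
  if (!n.contentUrl || !/^https?:\/\//.test(n.contentUrl)) {
    problems.push("Missing or malformed URL for the reported content.");
  }
  if (!n.explanation || n.explanation.trim().length < 20) {
    problems.push("Explanation of alleged illegality is missing or too thin.");
  }
  if (!n.goodFaithDeclaration) {
    problems.push("Good-faith declaration not confirmed.");
  }
  return problems; // empty array means the notice is processable
}
```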

The first enforcement actions under the DSA are beginning to reflect these obligations. Regulators and courts are now focusing on whether platforms respond proportionately and promptly—and whether users are kept informed throughout the process.

Recommendations for ISS Providers

  1. Implement Robust N&A Infrastructure
    Ensure that your platform has an easily accessible and well-documented complaint system, with built-in workflows for content review, escalation, and response.
  2. Train Moderation Teams on Legal Criteria
    Content moderators must understand the legal definitions of “manifestly illegal” under applicable national laws, especially when distinguishing hate speech from offensive opinion or satire.
  3. Document Every Action Taken
    Retain logs of notices received, actions taken, and timelines to demonstrate diligence and compliance in the event of a legal challenge or regulatory audit (a minimal logging sketch follows this list).
  4. Review and Update Terms of Service
    Clearly explain moderation rules, notification procedures, and appeal channels to users in your platform’s terms and privacy policies.
  5. Monitor Jurisprudence Across Member States
    As national interpretations of the DSA evolve, platforms must stay up to date with the case law in key jurisdictions—especially where they operate or have significant user bases.
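
On recommendation 3, an append-only audit record is one straightforward way to evidence diligence. A minimal sketch, with every name and shape assumed for illustration:

```typescript
// Append-only audit trail for recommendation 3; all names are illustrative.
interface ModerationAuditEntry {
  noticeId: string;
  contentUrl: string;
  receivedAt: Date;
  action: "removed" | "disabled" | "retained" | "escalated";
  actedAt: Date;
  actor: string;     // moderator id or "automated"
  rationale: string; // legal/ToS ground recorded at decision time
}

const auditLog: ModerationAuditEntry[] = [];

function logAction(entry: ModerationAuditEntry): void {
  auditLog.push(Object.freeze(entry)); // freeze to discourage after-the-fact edits
  // In production this would go to write-once storage, so the timeline of
  // notices and responses can be evidenced in a regulatory audit.
}
```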

Conclusion

The evolution of notice-and-action jurisprudence across EU Member States shows a steady move toward greater accountability and procedural transparency. While national approaches vary, the trend is clear: platforms must respond to credible notifications with diligence and traceability.

With the DSA now in force, litigation is likely to focus on compliance with procedural rights and documentation of internal moderation practices. For platform operators and digital service providers, understanding the practical standards set by national courts is essential to avoiding liability and maintaining user trust.