The responsibilities of Information Society Service (ISS) providers to address hate speech and defamation have come under increasing scrutiny by national courts across the European Union. While the E-Commerce Directive (Directive 2000/31/EC) provides a harmonised baseline for limiting intermediary liability, Member States have developed distinct approaches—particularly when it comes to notice obligations, takedown timelines, and content moderation expectations.
In this article, we examine how courts in Germany, France, and Spain are interpreting ISS obligations to remove unlawful content and what legal and compliance lessons can be drawn from these jurisdictions.
Germany: Proactive Obligations Under the Network Enforcement Act (NetzDG)
Germany has adopted one of the strictest national regimes for platform accountability through the Network Enforcement Act (NetzDG), which applies to social networks with more than two million registered users in Germany. It imposes a quasi-regulatory duty on platforms to act swiftly against “manifestly unlawful content,” including hate speech and defamation.
Judicial Trends:
- German courts often uphold high standards for removal obligations and interpret “actual knowledge” broadly, especially when the unlawful nature of content is obvious.
- In several rulings, courts have required systemic improvements to moderation practices, not just case-by-case responses.
Key Requirements Under NetzDG:
- Removal of manifestly unlawful content within 24 hours of receiving a complaint; other unlawful content must generally be removed within seven days.
- Reporting obligations to the Federal Office of Justice.
- Establishment of transparent and efficient user complaints procedures.
Lesson:
In Germany, ISS providers, and large platforms in particular, are expected to implement automated and scalable moderation mechanisms; failure to comply can result in administrative fines that can reach EUR 50 million for companies.
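To make the 24-hour and seven-day windows operational, platforms typically track a removal deadline for each complaint from the moment it is received. The following is a minimal Python sketch of such deadline tracking; the class, field, and constant names are illustrative assumptions, not a prescribed implementation.

```python
# Minimal sketch of NetzDG-style deadline tracking for complaints.
# All names are illustrative assumptions, not a real or mandated API.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

MANIFESTLY_UNLAWFUL_WINDOW = timedelta(hours=24)  # manifest cases
OTHER_UNLAWFUL_WINDOW = timedelta(days=7)         # other unlawful content

@dataclass
class Complaint:
    complaint_id: str
    received_at: datetime        # when the user complaint arrived
    manifestly_unlawful: bool    # outcome of an initial legal triage step

    def removal_deadline(self) -> datetime:
        """Deadline by which the content must be removed or blocked."""
        window = (MANIFESTLY_UNLAWFUL_WINDOW if self.manifestly_unlawful
                  else OTHER_UNLAWFUL_WINDOW)
        return self.received_at + window

    def is_overdue(self, now: datetime) -> bool:
        return now > self.removal_deadline()

# Usage: surface complaints approaching their statutory deadline.
c = Complaint("c-001", datetime.now(timezone.utc), manifestly_unlawful=True)
print("removal due by:", c.removal_deadline().isoformat())
```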
France: Liability in Light of Constitutional Freedoms and the Avia Law
France balances intermediary obligations with constitutional protections for freedom of expression. The French legal tradition distinguishes between content that is manifestly illegal (e.g. Holocaust denial, incitement to violence) and content requiring judicial review.
Legislative Background:
- The Avia Law (2020), which attempted to mandate 24-hour takedowns for hate speech, was largely struck down by the French Constitutional Council for being disproportionate and risking over-censorship.
- France continues to rely on the LCEN (Loi pour la Confiance dans l’Économie Numérique), which mirrors the E-Commerce Directive and requires removal upon notification if content is “manifestly unlawful.”
Judicial Application:
- French courts emphasise that platforms must act once they receive specific and detailed notifications of harmful content.
- In some cases, courts have imposed liability where platforms failed to act on repeated reports of defamatory or racist content.
Lesson:
In France, while the regulatory environment remains aligned with EU standards, platforms must ensure robust notice-and-action procedures and avoid arbitrary takedowns that could infringe on free speech rights.
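One way to operationalise the “specific and detailed notification” threshold is to validate notices at intake, so that incomplete reports are returned for completion rather than starting the action clock. The sketch below is a hedged illustration: the field list loosely reflects the elements the LCEN notification regime contemplates, but the exact requirements should be confirmed against the statute, and every name here is an assumption.

```python
# Sketch of validating that a takedown notice is "specific and detailed"
# before treating it as actionable. Field names are assumptions; confirm
# the required elements against the LCEN with counsel.
REQUIRED_NOTICE_FIELDS = (
    "notification_date",
    "notifier_identity",
    "content_url",           # precise location of the disputed content
    "factual_description",   # what the content says and why it is harmful
    "legal_grounds",         # e.g. the provision allegedly infringed
)

def is_actionable_notice(notice: dict) -> bool:
    """Return True only when every required element is present and non-empty."""
    return all(str(notice.get(f, "")).strip() for f in REQUIRED_NOTICE_FIELDS)

# Usage: incomplete notices are queued for follow-up, not actioned.
incomplete = {"content_url": "https://example.com/post/123"}
assert not is_actionable_notice(incomplete)
```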
Spain: Civil Liability and the Role of Platform Neutrality
In Spain, the Ley 34/2002 de Servicios de la Sociedad de la Información y de Comercio Electrónico (LSSI) implements the E-Commerce Directive, providing conditional immunity for hosting providers. Spanish courts have generally reinforced the passive role of intermediaries, but also recognise duties once knowledge of unlawful content is established.
Key Case Examples:
- In several defamation cases involving social media platforms and blogs, Spanish courts have ruled that failure to act after credible notice removes the platform’s liability shield.
- Courts have also recognised secondary liability where platforms, acting in bad faith, fail to enforce their own content policies or enable repeated violations.
Judicial Attitudes:
- Spanish judges tend to favour a balance between the protection of reputation and freedom of information, often requiring judicial review before ordering content removal unless the harm is obvious and serious.
Lesson:
ISS providers operating in Spain should ensure comprehensive logging and response systems to handle user complaints and should document actions taken in response to reported content.
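A simple way to meet that documentation expectation is an append-only, timestamped event log written at each step of complaint handling. The sketch below uses a JSON-lines file purely for illustration; the schema, file path, and event names are assumptions.

```python
# Minimal append-only audit trail sketch for complaint handling.
# Storage format and schema are illustrative assumptions.
import json
from datetime import datetime, timezone

AUDIT_LOG_PATH = "moderation_audit.jsonl"  # hypothetical path

def record_event(complaint_id: str, action: str, detail: str) -> None:
    """Append one immutable event per moderation step, timestamped in UTC."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "complaint_id": complaint_id,
        "action": action,   # e.g. "notice_received", "content_removed"
        "detail": detail,
    }
    with open(AUDIT_LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(event, ensure_ascii=False) + "\n")

record_event("c-001", "notice_received", "Defamation complaint via web form")
record_event("c-001", "content_removed", "Removed after legal review")
```

An append-only format makes it easier to show, after the fact, when notice was received and how quickly action followed.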
Comparative Insights
| Country | Key Standard | Removal Timeframe | Risk of Liability |
|---|---|---|---|
| Germany | Proactive duty under NetzDG | 24 hours (manifest cases) | High if systemic failures occur |
| France | Judicially balanced; detailed notice needed | No fixed time; must act “promptly” | Medium; judicial oversight crucial |
| Spain | Duty triggered by notice and credibility | Reasonable timeframe | Low if neutral and responsive |
Recommendations for ISS Providers
- Implement Differentiated Notice Systems
  Provide structured forms for users to submit detailed complaints and classify them by severity (e.g., hate speech vs. controversial opinion); a routing sketch follows this list.
- Maintain Clear Moderation Policies
  Publicly communicate your standards for illegal content removal and ensure your internal practices are consistent.
- Train Legal and Moderation Teams on National Nuances
  Moderation teams should understand local legal standards and cultural sensitivities, especially when handling hate speech complaints.
- Preserve Audit Trails
  Keep records of all complaints, investigations, and removals to demonstrate compliance if challenged.
- Review Cross-Border Jurisdiction Risks
  Platforms accessible across the EU must be prepared to navigate varying national standards while also preparing for Digital Services Act (DSA) obligations from 2024 onward.
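As a concrete illustration of the first recommendation, reporter-selected complaint categories can be mapped to internal queues with different service levels. The categories, queue names, and mapping below are assumptions for illustration only; severity classification should remain subject to human legal review rather than automation alone.

```python
# Sketch of severity-based routing for a differentiated notice system.
# Categories and queue names are illustrative assumptions.
from enum import Enum

class Severity(Enum):
    MANIFESTLY_UNLAWFUL = "manifestly_unlawful"    # fast-track removal
    POTENTIALLY_UNLAWFUL = "potentially_unlawful"  # needs legal review
    CONTESTED_OPINION = "contested_opinion"        # standard moderation

ROUTING = {
    Severity.MANIFESTLY_UNLAWFUL: "urgent-takedown",
    Severity.POTENTIALLY_UNLAWFUL: "legal-review",
    Severity.CONTESTED_OPINION: "standard-moderation",
}

def route_complaint(category: str) -> str:
    """Map a reporter-selected complaint category to an internal queue."""
    return ROUTING[Severity(category)]

print(route_complaint("manifestly_unlawful"))  # -> "urgent-takedown"
```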
Conclusion
As hate speech and defamation continue to pose legal and ethical challenges in the digital sphere, national courts across Europe are shaping a nuanced and often divergent picture of ISS responsibilities. Platforms operating in the EU must go beyond minimal compliance and adopt jurisdiction-sensitive, proactive strategies to manage unlawful content. Understanding these judicial trends is key to legal risk management and long-term trust with users and regulators.
Our firm advises tech companies, digital platforms, and hosting providers across the EU on content moderation strategies and liability risk. Contact us for tailored legal assessments, policy drafting, and compliance audits based on your target jurisdictions.