Legal consulting · April 14, 2025 · 5 min read

    Hate Speech, Defamation, and ISS Duties: Lessons from National Courts

The rise of online platforms has intensified challenges around hate speech and defamation, raising critical questions about the responsibilities of Information Society Service (ISS) providers.


    The responsibilities of Information Society Service (ISS) providers to address hate speech and defamation have come under increasing scrutiny by national courts across the European Union. While the E-Commerce Directive (Directive 2000/31/EC) provides a harmonised baseline for limiting intermediary liability, Member States have developed distinct approaches—particularly when it comes to notice obligations, takedown timelines, and content moderation expectations.


    In this article, we examine how courts in Germany, France, and Spain are interpreting ISS obligations to remove unlawful content and what legal and compliance lessons can be drawn from these jurisdictions.


    Germany: Proactive Obligations Under the Network Enforcement Act (NetzDG)


    Germany has adopted one of the strictest national regimes for platform accountability through the Network Enforcement Act (NetzDG), which applies to social networks with more than two million users in Germany. It imposes a quasi-regulatory duty on platforms to act swiftly against “manifestly unlawful content,” including hate speech and defamation.


    Judicial Trends:

    • German courts often uphold high standards for removal obligations and interpret “actual knowledge” broadly, especially when the unlawful nature of content is obvious.
    • In several rulings, courts have required systemic improvements to moderation practices, not just case-by-case responses.

    Key Requirements Under NetzDG:

    • Removal of manifestly unlawful content within 24 hours of receiving a complaint.
    • Reporting obligations to the Federal Office of Justice.
    • Establishment of transparent and efficient user complaints procedures.

    Lesson:
    In Germany, ISS providers—especially large platforms—are expected to implement automated and scalable moderation mechanisms, and failure to comply can result in significant administrative fines.
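For compliance teams, this timing duty can be wired into complaint intake itself. The Python sketch below is a minimal illustration rather than a statement of the law: the names (Complaint, REVIEW_WINDOWS, Assessment) are our own, and the 24-hour and 7-day windows paraphrase the NetzDG distinction between manifestly unlawful content and content that needs fuller legal review.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from enum import Enum

    class Assessment(Enum):
        MANIFESTLY_UNLAWFUL = "manifestly_unlawful"  # unlawfulness obvious on its face
        POSSIBLY_UNLAWFUL = "possibly_unlawful"      # needs fuller legal review
        LAWFUL = "lawful"

    # Review windows paraphrasing the NetzDG scheme: 24 hours for manifestly
    # unlawful content, up to 7 days where deeper assessment is required.
    REVIEW_WINDOWS = {
        Assessment.MANIFESTLY_UNLAWFUL: timedelta(hours=24),
        Assessment.POSSIBLY_UNLAWFUL: timedelta(days=7),
    }

    @dataclass
    class Complaint:
        complaint_id: str
        content_url: str
        received_at: datetime
        assessment: Assessment

        def removal_deadline(self) -> datetime | None:
            """Deadline by which the platform must act, or None if no duty arises."""
            window = REVIEW_WINDOWS.get(self.assessment)
            return self.received_at + window if window is not None else None

Attaching the deadline at intake, rather than at review time, makes it straightforward to alert moderators before a statutory window closes.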


    France: Liability in Light of Constitutional Freedoms and the Avia Law


    France balances intermediary obligations with constitutional protections for freedom of expression. The French legal tradition distinguishes between content that is manifestly illegal (e.g. Holocaust denial, incitement to violence) and content requiring judicial review.


    Legislative Background:

    • The Avia Law (2020), which attempted to mandate 24-hour takedowns for hate speech, was largely struck down by the French Constitutional Council for being disproportionate and risking over-censorship.
    • France continues to rely on the LCEN Law (Loi pour la Confiance dans l'Économie NumĂ©rique), which mirrors the E-Commerce Directive and requires removal upon notification if content is “manifestly unlawful.”

    Judicial Application:

    • French courts emphasise that platforms must act once they receive specific and detailed notifications of harmful content.
    • In some cases, courts have imposed liability where platforms failed to act on repeated reports of defamatory or racist content.

    Lesson:
    In France, while the regulatory environment remains aligned with EU standards, platforms must ensure robust notice-and-action procedures and avoid arbitrary takedowns that could infringe on free speech rights.
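In practice, a notice-and-action workflow can enforce that "specific and detailed" threshold before the response clock starts. The Python sketch below is purely illustrative: the Notice fields are loosely modelled on the elements French law expects in a notification (identity of the notifier, precise location of the content, a description, and the legal grounds), and all names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Notice:
        notifier_identity: str  # who is reporting the content
        content_location: str   # precise URL or identifier of the material
        description: str        # what the content says or shows
        legal_grounds: str      # why the notifier considers it unlawful

    def is_actionable(notice: Notice) -> bool:
        """Only a specific and detailed notice triggers the duty to act;
        incomplete reports are returned to the notifier for completion."""
        required = (
            notice.notifier_identity,
            notice.content_location,
            notice.description,
            notice.legal_grounds,
        )
        return all(value.strip() for value in required)

Rejecting vague reports in a documented way both respects free-speech constraints and preserves evidence that the platform acted diligently once a proper notice arrived.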


    Spain: Civil Liability and the Role of Platform Neutrality


    In Spain, the Ley de Servicios de la Sociedad de la InformaciĂłn (LSSI) implements the E-Commerce Directive, providing conditional immunity for hosting providers. Spanish courts have generally reinforced the passive role of intermediaries, but also recognise duties once knowledge of unlawful content is established.


    Key Case Examples:

    • In several defamation cases involving social media platforms and blogs, Spanish courts have ruled that failure to act after credible notice removes the platform’s liability shield.
    • Courts have also acknowledged secondary liability where platforms fail to enforce their own content policies in bad faith or enable repeated violations.

    Judicial Attitudes:

• Spanish judges tend to favour a balance between the protection of reputation and freedom of information, often requiring judicial review before ordering content removal unless the harm is obvious and serious.

    Lesson:
    ISS providers operating in Spain should ensure comprehensive logging and response systems to handle user complaints and should document actions taken in response to reported content.
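A minimal version of such a record-keeping system might look like the Python sketch below; the AuditLog class and its JSON-lines format are illustrative design choices on our part, not a requirement of Spanish law.

    import json
    from datetime import datetime, timezone

    class AuditLog:
        """Append-only JSON-lines log so the handling of every complaint
        can be reconstructed if liability is later challenged."""

        def __init__(self, path: str):
            self.path = path

        def record(self, complaint_id: str, action: str, detail: str = "") -> None:
            entry = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "complaint_id": complaint_id,
                "action": action,  # e.g. "received", "reviewed", "removed", "rejected"
                "detail": detail,
            }
            with open(self.path, "a", encoding="utf-8") as handle:
                handle.write(json.dumps(entry) + "\n")

    # Usage:
    # log = AuditLog("moderation_audit.jsonl")
    # log.record("c-123", "received", "defamation complaint via web form")
    # log.record("c-123", "removed", "post taken down after legal review")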


    Comparative Insights

Country | Key Standard | Removal Timeframe | Risk of Liability
Germany | Proactive duty under NetzDG | 24 hours (manifest cases) | High if systemic failures occur
France | Judicially balanced, detailed notice needed | No fixed time, must act "promptly" | Medium; judicial oversight crucial
Spain | Duty triggered by notice and credibility | Reasonable timeframe | Low if neutral and responsive

    Recommendations for ISS Providers

    1. Implement Differentiated Notice Systems
      Provide structured forms for users to submit detailed complaints and classify them by severity (e.g., hate speech vs. controversial opinion).
    2. Maintain Clear Moderation Policies
      Publicly communicate your standards for illegal content removal and ensure your internal practices are consistent.
    3. Train Legal and Moderation Teams on National Nuances
      Moderation teams should understand local legal standards and cultural sensitivities, especially when handling hate speech complaints.
    4. Preserve Audit Trails
      Keep records of all complaints, investigations, and removals to demonstrate compliance if challenged.
5. Review Cross-Border Jurisdiction Risks
  Platforms accessible across the EU must navigate varying national standards while also preparing for Digital Services Act (DSA) obligations that apply from 2024 onward; a minimal per-jurisdiction deadline lookup is sketched after this list.
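As a closing illustration of recommendation 5, the Python sketch below maps country codes to the response windows from the comparative table above. The RESPONSE_WINDOWS values are placeholders paraphrasing that table; actual deadlines and what triggers them must be confirmed with local counsel, with DSA obligations layered on top.

    from datetime import datetime, timedelta

    # Illustrative per-jurisdiction response windows paraphrasing the
    # comparative table above; placeholders only, not legal advice.
    RESPONSE_WINDOWS = {
        "DE": timedelta(hours=24),  # NetzDG: manifestly unlawful content
        "FR": None,                 # no fixed statutory window: act "promptly"
        "ES": None,                 # reasonable timeframe once notice is credible
    }

    def response_deadline(country_code: str, received_at: datetime) -> datetime | None:
        """Return a hard deadline where one exists; None means the complaint
        should be routed to an expedited manual-review queue instead."""
        window = RESPONSE_WINDOWS.get(country_code)
        return received_at + window if window is not None else None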

    Conclusion


    As hate speech and defamation continue to pose legal and ethical challenges in the digital sphere, national courts across Europe are shaping a nuanced and often divergent picture of ISS responsibilities. Platforms operating in the EU must go beyond minimal compliance and adopt jurisdiction-sensitive, proactive strategies to manage unlawful content. Understanding these judicial trends is key to legal risk management and long-term trust with users and regulators.


    Our firm advises tech companies, digital platforms, and hosting providers across the EU on content moderation strategies and liability risk. Contact us for tailored legal assessments, policy drafting, and compliance audits based on your target jurisdictions.

