Legal consulting · April 9, 2025 · 5 min read
    Victoria Hayes

    The E-Commerce Directive vs. the Digital Services Act: What's Changing for Platform Liability?

    Enter the Digital Services Act (DSA)—the EU’s shiny new regulatory overhaul, aiming to bring platform liability rules out of the dial-up era and into the age of TikTok.

    Since February 17, 2024, when the Digital Services Act took full effect across the European Union, online platforms have faced a wave of compliance deadlines. Over 100 million notices for content removal flooded major sites in the first quarter alone, signaling regulators' intent to enforce stricter rules on illegal activities. This shift builds on the E-Commerce Directive from 2000, but introduces precise mechanisms that demand immediate attention from platform operators in the USA, UK, and EU markets.

    Foundations of the E-Commerce Directive

The E-Commerce Directive (ECD), adopted in 2000, emerged as the EU's first major framework for the digital economy. It aimed to harmonize laws across member states, fostering cross-border trade without excessive regulatory hurdles. At its core, the directive established limited liability for intermediaries: entities that host, cache, or transmit user-generated content. For instance, a hosting provider like an online forum could avoid responsibility for illegal posts unless it gained actual knowledge of the wrongdoing and failed to act swiftly.

    This approach encouraged innovation by shielding platforms from the burden of constant surveillance. Consider early examples: ISPs routing traffic or bulletin boards displaying user messages operated under this umbrella, free from liability as long as they responded to specific complaints. The directive also enshrined the country-of-origin principle, meaning a platform established in France only needed to comply with French laws, not those of every EU nation it served. This simplified operations for growing businesses, allowing them to scale without navigating a patchwork of national rules.

    Yet, the ECD's simplicity became its limitation as the web expanded. Platforms like early eBay or Craigslist thrived under these protections, but the directive's vague definitions left room for inconsistent enforcement. National courts interpreted 'actual knowledge' differently, leading to disputes that slowed down responses to emerging threats like piracy or hate speech. By the mid-2010s, reports from the European Commission highlighted how these gaps allowed harmful content to persist, prompting calls for reform.

    For professionals managing cross-border platforms, understanding this baseline is crucial. The ECD's legacy persists in today's operations, but ignoring its evolution risks fines or operational disruptions. Review your current liability assessments against these original standards to spot vulnerabilities before DSA audits arrive.

    The Internet's Growth and the DSA's Origins

    Two decades after the ECD, the online world has transformed dramatically. User-generated content now drives economies worth trillions, with platforms like Amazon hosting millions of sellers and social networks amplifying voices to billions. This scale brought new risks: disinformation campaigns influenced elections, algorithmic feeds spread misinformation at lightning speed, and dark patterns tricked users into unwanted purchases. In 2022, the EU responded with the Digital Services Act, part of a dual strategy alongside the Digital Markets Act, to address these issues head-on.

    The DSA's development stemmed from years of consultations and reports documenting platform harms. For example, a 2018 Commission study revealed that illegal content removal rates varied wildly—some platforms acted within hours, others took days—exacerbating problems like child exploitation material or counterfeit sales. Lawmakers aimed to update the ECD without stifling growth, focusing on transparency and accountability. The act applies to all intermediary services, from small cloud hosts to giants like Meta, with phased implementation starting in 2023 for very large platforms.

    Key drivers included rising public pressure post-scandals, such as the Cambridge Analytica affair, which exposed data misuse and biased algorithms. EU officials sought to protect consumers while ensuring fair competition. Unlike the ECD's light-touch approach, the DSA imposes targeted obligations, backed by empirical data from pilot programs. Platforms in the UK, adapting post-Brexit, find parallels in their Online Safety Bill, while US operators eye similar pressures from Section 230 debates.

    Operators should note the DSA's extraterritorial reach: any platform targeting EU users must comply, regardless of headquarters. Conduct a jurisdictional audit now—list your EU user base and revenue streams—to gauge exposure. Early preparation can turn regulatory demands into competitive edges, like building trust through visible compliance.

    Principles Carried Over from ECD to DSA

    The DSA builds directly on the ECD's bedrock ideas, ensuring continuity for established practices. Central to both is the absence of a general monitoring obligation. Platforms remain free from scanning all content proactively, a protection that preserves free speech and reduces costs. This means a blog host doesn't need AI tools to pre-vet every post, as long as it responds to valid notices.

    Conditional liability also endures. Under both regimes, intermediaries escape fault if they lack knowledge of illegality or act expeditiously upon notification. Picture a marketplace removing a fraudulent listing within 24 hours of a complaint—such diligence shields against claims. The DSA refines this by clarifying timelines and processes, but the principle holds: ignorance isn't bliss, but prompt action is.

    These continuities reassure smaller operators who feared a total overhaul. The country-of-origin rule evolves slightly, with DSA emphasizing coordinated enforcement, but platforms still anchor compliance to their base country. For EU-based firms, this means aligning with local DSA coordinators; UK platforms must navigate separate but aligned rules. Examples abound: a Dutch forum continues operations much like before, focusing on reactive measures rather than invasive monitoring.

To build on these overlaps, audit your existing notice-handling protocols. Confirm they meet the ECD's standards, then layer on the DSA's specifics. This phased approach minimizes disruption while demonstrating regulatory savvy to stakeholders.

    Standardizing Notice-and-Action Procedures

    One of the DSA's clearest advances is formalizing how platforms handle complaints about illegal content. The ECD left notice formats to national discretion, resulting in delays and disputes—complainants often struggled to craft effective requests, and platforms rejected unclear ones. Now, the DSA mandates a uniform 'notice and action' system, specifying elements like the reporter's identity, content description, and legal basis for removal.

Platforms must confirm receipt of notices without undue delay, process them diligently, and explain decisions to both parties. If content stays online, reasons must be provided, with appeal options for users. Take a scenario: a user flags terrorist propaganda on a video site. Under the DSA, the platform processes the notice via a standardized portal, notifies the poster of potential suspension, and reports outcomes to authorities if needed. This reduces ambiguity and speeds resolutions.

    For very large online platforms, additional reporting kicks in, including annual transparency summaries on notice volumes and action rates. Smaller sites get basic requirements, but all must integrate these into user interfaces—think dedicated reporting buttons with guided forms. EU data shows early adopters cutting processing times by 40%, improving user trust.

Actionable steps include building or upgrading your complaint system around the required elements in Article 16 of the DSA. Train staff on verification to avoid false positives, and test with mock notices. US and UK platforms serving EU users should align these systems with local privacy rules like the GDPR and the UK's emerging online safety codes.
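To make those mock-notice tests concrete, here is a minimal sketch of a completeness check in Python. The field names are illustrative assumptions rather than the regulation's exact wording; Article 16 of the DSA lists the actual required elements, so map your form fields to that text with counsel's help.

```python
# Hypothetical completeness check for a DSA-style notice.
# Field names are illustrative; Article 16 of the DSA lists the
# actual required elements (explanation of alleged illegality,
# exact location, submitter details, good-faith statement).

REQUIRED_FIELDS = {
    "explanation",       # why the content is allegedly illegal
    "content_location",  # exact URL(s) of the material
    "submitter_name",
    "submitter_email",
    "good_faith_statement",
}

def validate_notice(notice: dict) -> list[str]:
    """Return the required fields that are missing or empty."""
    return sorted(
        field for field in REQUIRED_FIELDS
        if not str(notice.get(field, "")).strip()
    )

def process_notice(notice: dict) -> str:
    missing = validate_notice(notice)
    if missing:
        # Incomplete notices deserve a response telling the reporter
        # what to fix, rather than a silent rejection.
        return f"incomplete: missing {', '.join(missing)}"
    return "queued for review"
```

In practice this would sit behind the guided reporting form, with every accept or reject decision logged to feed the transparency reports.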

    Know Your Business Customer Requirements

    Marketplaces face a new verification mandate under the DSA: Know Your Business Customer, or KYBC. Unlike the ECD, which ignored seller vetting, platforms must now collect and check identities, addresses, and payment details for business users. This targets rogue operators, like those peddling fakes or scams, holding platforms accountable if they ignore red flags.

    Implementation involves risk-based checks—low-risk sellers get basic info, high-risk ones face deeper scrutiny, such as bank statements or trade licenses. Amazon, for instance, expanded its seller verification post-DSA, banning thousands of suspicious accounts in 2024. Platforms must suspend non-compliant sellers and report systemic issues to regulators, closing loopholes that let counterfeiters thrive.

    Liability shifts here: if a platform knowingly hosts illicit traders, it faces direct responsibility, including damages to buyers. This echoes US FTC actions but with EU-wide teeth. For cross-market operators, integrate KYBC with existing KYC for consumers to avoid duplication.

    Start by mapping your seller ecosystem. Categorize by risk—electronics sellers might need extra checks for IP infringement. Invest in automated tools for ID validation, aiming for 95% coverage within six months. Document everything; audits will probe these records closely.
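As a sketch of that risk mapping, here is one way a triage function could pick verification depth. The categories, thresholds, and document lists are assumptions for illustration, not DSA requirements; calibrate them to your own marketplace's risk profile.

```python
# Illustrative KYBC triage. Risk categories and required documents
# are assumptions for this sketch, not DSA text.

HIGH_RISK_CATEGORIES = {"electronics", "luxury_goods", "pharmaceuticals"}

BASIC_CHECKS = ["identity", "address", "payment_details"]
ENHANCED_CHECKS = BASIC_CHECKS + ["trade_license", "bank_statement"]

def required_checks(seller: dict) -> list[str]:
    """Choose verification depth from category risk and complaint history."""
    high_risk = (
        seller.get("category") in HIGH_RISK_CATEGORIES
        or seller.get("prior_complaints", 0) > 0
    )
    return list(ENHANCED_CHECKS if high_risk else BASIC_CHECKS)
```

A real rollout would plug these tiers into onboarding and re-run them whenever a seller's category or complaint history changes.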

    Tiered Obligations by Platform Size

    The DSA introduces scale-based rules, differentiating duties for small, medium, and very large online platforms (VLOPs). Under the ECD's uniform approach, everyone followed the same basics. Now, VLOPs—those reaching 45 million monthly EU users, about 10% of the population—undergo rigorous oversight, including independent audits and risk management plans.

    Smaller platforms stick to core obligations like notice handling and terms transparency, while VLOPs add systemic risk assessments for issues like election interference or mental health harms. TikTok, designated a VLOP in 2024, must now publish annual reports on algorithmic impacts and appoint a compliance officer. This tiering ensures resources match influence, preventing over-burdening startups.

Enforcement varies: national Digital Services Coordinators handle smaller platforms, while the Commission oversees VLOPs with powers for on-site inspections. UK equivalents classify platforms similarly, while US firms monitor for state-level analogs. Calculate your status using EU user metrics; user counts must be reported periodically, and designations can change.
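The threshold check itself is simple arithmetic. This sketch averages monthly active EU users against the 45 million figure; the multi-month averaging window is an assumption here, so confirm the exact counting methodology against the Commission's guidance before relying on it.

```python
# Sketch of the VLOP status check. The DSA designates platforms that
# average 45 million or more monthly active EU recipients (about 10%
# of the EU population). Averaging over several months mirrors the
# regulation's periodic user-count reporting; verify the exact
# methodology with official guidance.

VLOP_THRESHOLD = 45_000_000

def is_vlop(monthly_active_eu_users: list[int]) -> bool:
    """True when the average across the supplied months meets the threshold."""
    if not monthly_active_eu_users:
        return False
    average = sum(monthly_active_eu_users) / len(monthly_active_eu_users)
    return average >= VLOP_THRESHOLD
```

Feed in your recent months of EU user counts; hovering near the threshold is the signal to begin VLOP-level preparations early.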

    If nearing VLOP size, form a cross-functional team now. Conduct mock risk assessments, focusing on top harms in your niche. Budget for external auditors—costs can hit millions, but non-compliance risks dwarf them.

    Transparency in Algorithms and Terms of Service

    Algorithms, once black boxes, now demand daylight under the DSA. Platforms must disclose how recommendation systems prioritize content, including data sources and weighting factors. Users gain controls, like opting out of personalized feeds, addressing biases that amplify extremes. The ECD ignored this; DSA makes it mandatory for larger entities.

    Terms of service also evolve: they require plain language, detailing moderation rules, data use, and appeal processes. No more burying policies in fine print—everything must be accessible and consistent. Facebook's 2024 updates, for example, simplified sections into user-friendly summaries, reducing complaints by 25%.

    For algorithms, provide in-app explanations—e.g., 'This post appeared due to your interest in travel.' VLOPs face deeper scrutiny, including impact studies on vulnerable groups. Align with GDPR for data transparency to cover multiple bases.

Revise your terms quarterly. Use readability tools, aiming for an 8th-grade reading level. For algorithms, audit code for explainability; tools like SHAP can help generate reports. Educate users via pop-ups to build loyalty.
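To make the 8th-grade target measurable, you can score drafts with the well-known Flesch-Kincaid grade formula. The syllable counter below is a crude vowel-group heuristic, so treat scores as rough signals rather than exact grades; dedicated readability tools are more precise.

```python
# Rough Flesch-Kincaid grade scorer for terms-of-service drafts.
# Syllables are estimated by counting vowel groups, which is only
# approximate for English.
import re

def count_syllables(word: str) -> int:
    # Count runs of vowels; every word scores at least one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    total_syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * total_syllables / len(words) - 15.59)
```

Anything scoring well above 8 is a candidate for shorter sentences and plainer words.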

    Enforcement, Penalties, and Proactive Steps

Unlike the ECD's reliance on national courts, the DSA centralizes enforcement. The European Commission can fine VLOPs up to 6% of global annual turnover, a figure that runs into the billions for the largest players, plus daily penalties for delays. National DSA coordinators handle other platforms, cooperating across borders. The Commission opened its first formal proceedings against major platforms within months of the rules taking effect, underscoring the act's bite.

    Trusted flaggers—NGOs or officials—get priority on notices, demanding faster responses. Platforms must track and publish flagging stats, fostering accountability. This setup deters lax practices, as seen in Germany's swift crackdowns on hate speech hosts.

For platforms, immediate actions include mapping content flows to pinpoint risks, implementing KYBC and notice systems, simplifying terms, and preparing for audits if operating at scale. In priority order:

1. Assess your size and obligations.
2. Update tech stacks for compliance.
3. Train teams on DSA nuances.
4. Monitor regulatory updates via EU portals.

    View this as an opportunity. Compliant platforms gain user confidence and easier funding. Engage legal experts early—costs pale against penalties. UK and US operators, harmonize with local rules to simplify global ops.

    FAQ

    Does the DSA apply to non-EU platforms?

    Yes, the DSA has extraterritorial effect. Any intermediary service offering services to EU consumers must comply, even if based in the US or UK. This includes assessing EU user numbers and targeting activities like localized language support or EU payment options. For example, a California-based app with 1 million EU downloads falls under basic rules; scale up to VLOP thresholds, and deeper obligations apply. To determine applicability, calculate your EU audience share and revenue—tools from the European Commission provide guidance. Non-compliance invites fines from national authorities, so map your exposure and appoint an EU representative if needed.

    How does KYBC differ from general KYC?

KYBC focuses specifically on business users on marketplaces, requiring verification of commercial legitimacy like VAT numbers, business registry entries, and contact details. It's risk-based: simple for low-risk sellers, thorough for high-risk categories like luxury goods to curb counterfeits. Unlike broader KYC for individuals under AML laws, KYBC ties directly to platform liability: if you knowingly host bad actors, you're exposed. Implement it via automated checks integrated with your onboarding, and retain records for the duration of the contractual relationship and beyond, as the regulation requires. Platforms like Etsy have rolled this out, reducing fraud by verifying most sellers upfront. Consult EU guidance for templates to ensure alignment.

    What are the penalties for DSA non-compliance?

Penalties scale with severity and platform size. Supplying incorrect or incomplete information draws fines capped at 1% of annual income or turnover, while substantive breaches can cost up to 6% of worldwide annual turnover, imposed by national regulators for most platforms and by the Commission for VLOPs. Periodic penalty payments of up to 5% of average daily worldwide turnover can pile up for persistent issues, potentially billions for repeated offenses by the largest players. Examples include data access failures or ignoring trusted flaggers. Mitigate by self-reporting and corrective plans; the DSA encourages proportionality. Track enforcement via the Commission's published decisions, and budget 1-2% of revenue for compliance to stay safe.
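For budgeting, the ceilings translate into simple arithmetic. This sketch uses the caps set out in the regulation (6% of worldwide annual turnover for substantive breaches, 1% for incorrect information, periodic penalties up to 5% of average daily turnover); actual fines depend on severity, duration, and mitigation, so treat these as upper bounds only.

```python
# Back-of-envelope DSA fine ceilings: up to 6% of worldwide annual
# turnover for substantive breaches, 1% for supplying incorrect or
# incomplete information, and periodic penalties of up to 5% of
# average daily worldwide turnover. Upper bounds only; real fines
# depend on severity, duration, and mitigation.

def max_fine(annual_turnover: float, info_breach_only: bool = False) -> float:
    """Ceiling on a one-off fine for the given worldwide annual turnover."""
    return annual_turnover * (0.01 if info_breach_only else 0.06)

def max_daily_penalty(annual_turnover: float) -> float:
    """Ceiling on a periodic (daily) penalty payment."""
    return (annual_turnover / 365) * 0.05
```

Even rough numbers like these make it easier to justify compliance budgets to stakeholders.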

    Can platforms still avoid monitoring under DSA?

Yes. The ECD's prohibition on general monitoring obligations carries over. Platforms aren't required to scan all content proactively, protecting innovation and free expression. However, targeted monitoring is allowed and sometimes required, as with specific illegal risks in VLOP assessments. Respond to notices diligently to maintain safe harbor. If you're a content host, document your reactive policies clearly in your terms of service. This balance lets small forums operate freely while holding giants accountable for systemic harms; review CJEU case law for precedents.
