Copyright and the Internet: Poland v Parliament and Council (Case C-401/19), Advocate General's Opinion, 15 July 2021

Lorna Woods, Professor of Internet Law, University of Essex

Introduction

The rise of “Web 2.0” and social media has enabled content sharing on a vast scale, but it has also raised questions about how users’ rights are managed and about the role of online platforms in tackling the problems their services can facilitate. One recurring issue is the use of filtering technologies and, in particular, their impact on users’ freedom of expression; courts have voiced concerns about such techniques in earlier copyright cases. Given the sheer volume of content uploaded, manual review is impractical, yet copyright enforcement remains difficult and the “value gap” (the mismatch between the revenue platforms derive from unauthorized content and the remuneration returned to rightsholders) persists. Platforms, which may profit from that content through advertising, have had little incentive to intervene, and the existing system, under which rightsholders notify platforms to remove content in exchange for the liability protection of Article 14 of the e-Commerce Directive, is widely regarded as ineffective.

This spurred an overhaul of copyright law, culminating in the Directive on Copyright in the Digital Single Market (Directive (EU) 2019/790). The directive, heavily contested throughout its passage, seeks to close the “value gap” and support content creators through measures such as a new press publishers’ right (Article 15) and, most notably, Article 17, which addresses online platforms’ use of protected content uploaded by their users. Article 17’s controversial nature prompted a challenge from Poland, on which the Advocate General has now delivered his Opinion. While the Opinion clarifies the scope of Article 17, its reasoning may have broader implications.

Contested Provisions

Article 17 changes the copyright position of the platforms it covers: they are deemed to perform “acts of communication to the public” when they give access to copyright-protected content uploaded by their users, and they therefore need authorization from the relevant rightsholders. Article 17(3) disapplies Article 14 of the e-Commerce Directive, which gave neutral hosts conditional immunity. A platform that has not obtained authorization can escape liability only by satisfying the conditions in Article 17(4), which requires, cumulatively:

(a) demonstrating best efforts to secure authorization,

(b) demonstrating, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works for which rightsholders have provided the relevant and necessary information, and

(c) acting expeditiously, on receipt of a sufficiently substantiated notice, to disable access to or remove the notified works, and making best efforts to prevent their future upload in accordance with point (b).

While the first part of Article 17(4)(c) mirrors Article 14 of the e-Commerce Directive, the remaining elements are new. This raises several questions: does Article 17(4) in practice require upload filters, with the attendant risk of over-blocking? What counts as “best efforts”, especially in terms of the monitoring this implies? And does Article 17(4) amount to “general monitoring”, despite Article 17(8)’s statement that the application of Article 17 shall not lead to any general monitoring obligation?

Article 17(7) offers a potential counterbalance: the cooperation between online platforms and rightsholders must not result in the blocking of content that does not infringe copyright, including works covered by exceptions or limitations, and it expressly refers to the exceptions for quotation, criticism, review, caricature, parody and pastiche. Article 17(9) mandates complaint and redress mechanisms, which some industry players regard as burdensome. Commission guidance was intended to clarify these requirements, but it was not available when the case was brought.

The Legal Challenge

The Core Issue

Poland’s challenge seeks the annulment of Article 17(4)(b) and (c) or, in the alternative, of Article 17 in its entirety, on the ground that the provision is incompatible with freedom of expression under the EU Charter (Article 11 EUCFR): it either destroys the essence of that right or restricts it disproportionately.

The Obligation’s Nature

A key question is whether the obligations in Article 17(4), in particular the preventive monitoring they entail, require upload filters. Such tools are not explicitly mandated, but the Advocate General takes the view that in many cases they will be necessary in practice, and industry standards will shape what counts as best efforts. So, although the legislation does not prescribe particular methods, upload filters are likely to be used.
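To make the mechanism concrete, here is a minimal sketch (in Python, with purely hypothetical names and an exact hash standing in for real perceptual fingerprinting; it is not drawn from the directive, the Opinion or any actual platform) of the kind of filtering Article 17(4)(b) envisages: uploads are compared only against reference material that rightsholders have supplied, rather than screened for any and every possible infringement.

# Illustrative sketch only. Names, types and the fingerprinting method are
# hypothetical simplifications; real systems use perceptual audio/video
# fingerprinting rather than exact hashes.
import hashlib
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ReferenceWork:
    rightsholder: str
    title: str
    fingerprint: str  # the "relevant and necessary information" supplied by the rightsholder

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

class UploadFilter:
    """Matches uploads against notified works only, i.e. a 'specific' filter."""

    def __init__(self, references: List[ReferenceWork]) -> None:
        # The filter's reach is defined entirely by the notified catalogue.
        self.index = {ref.fingerprint: ref for ref in references}

    def check(self, upload: bytes) -> Optional[ReferenceWork]:
        # Returns the notified work the upload matches, or None if there is no match.
        return self.index.get(fingerprint(upload))

The point of the sketch is only that the filter’s reach is defined by the catalogue of works the rightsholders have notified, which is what allows the Advocate General, later in the Opinion, to characterize the obligation as one of specific rather than general monitoring.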

Impact on Freedom of Expression

Applicability of the Right

The Advocate General distinguishes situations in which platforms make a genuine choice from the situation created by Article 17. On paper, platforms have a choice: comply with Article 17(4) and benefit from the exemption, or decline and face potential liability. In reality, compliance is an obligation, so the resulting interference with users’ freedom of expression is attributable to the EU legislature rather than to the platforms’ own decisions.

Limitations – Lawfulness

Article 52(1) EUCFR sets out the conditions under which Article 11 EUCFR may be limited. The requirement that restrictions be “provided for by law” is read in line with existing case law as demanding accessibility and foreseeability. Accessibility is plainly satisfied; as for foreseeability, the Advocate General accepts, drawing on the ECtHR’s judgment in Delfi v Estonia, that the legislature may leave itself some flexibility in how obligations are framed without depriving them of foreseeability, provided there are safeguards against arbitrary interference.

Limitations – Essence of the Right

The obligation to respect the essence of the right restricts the legislature’s discretion in balancing competing interests: it shields an “untouchable core” from any interference. A blanket obligation on intermediaries to monitor content for any illegal or undesirable information would amount to such an interference, and Article 15 of the e-Commerce Directive, which the Advocate General treats as expressing a “general principle of law” governing the internet, prohibits it. Not all monitoring is forbidden, however; the CJEU distinguishes between specific and general monitoring, mirroring the ECtHR’s position in Delfi. Building on earlier cases such as L’Oréal, Scarlet Extended, SABAM, McFadden and Glawischnig-Piesczek, the Advocate General classifies Article 17 as an obligation of specific monitoring, targeted at specific content.

Limitations – Proportionality

The Advocate General accepts that the EU legislature, which enjoys broad discretion in this field, was entitled to revisit the balance struck by Article 14 of the e-Commerce Directive when adopting the Copyright Directive, given the changed context. The factors said to justify this include the economic harm caused by large-scale uploading of protected content, the ineffectiveness of notice-and-takedown, the difficulty of pursuing individual infringers, and the fact that the obligation is aimed only at certain categories of service provider.

To address concerns about over-blocking, the Advocate General interprets the “best efforts” obligation as itself requiring platforms to take users’ rights into account ex ante. On this reading, blocking content automatically simply because it matches protected material, without regard to possible legitimate uses, would not be acceptable.

Citing Glawischnig-Piesczek, where the platform was not required to carry out an independent assessment of the content, the Advocate General posits that platforms can be required to filter only content that a court has already found unlawful or whose unlawfulness is manifest from the outset.

Referring to the Opinion in YouTube and Cyando, the Advocate General reiterates that platforms cannot be compelled to screen all of the content they host in search of any and every infringement.

The Advocate General finds that Article 17 contains sufficient safeguards. Article 17(7), which prohibits the blocking of non-infringing content, rules out widespread preventive blocking and means that in unclear cases freedom of expression takes priority: “false positives” (blocking lawful content) are treated as more serious than “false negatives” (letting some infringing content through). Where content is not blocked at upload, rightsholders can still seek its removal by notification. A zero rate of false positives is not required, but the rate should be kept as low as possible. The stakeholder cooperation provided for in Article 17(10) will be crucial to how this works in practice.
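The practical upshot can be illustrated with a short, purely hypothetical sketch (the thresholds and names are invented for illustration and come from neither the directive nor the Opinion nor any real system): only manifestly infringing uploads are blocked at the point of upload, doubtful cases remain online pending ex post notification or review, and uploads that plausibly fall within an exception are never blocked automatically.

from enum import Enum

class Decision(Enum):
    BLOCK = "block"                # manifest infringement: block at upload
    PUBLISH = "publish"            # no meaningful match: publish
    PUBLISH_AND_REVIEW = "review"  # doubtful: publish now, review ex post

# Hypothetical confidence thresholds, chosen so that false positives
# (blocking lawful content) are kept to a minimum.
MANIFEST_MATCH = 0.95
POSSIBLE_MATCH = 0.60

def moderate(match_confidence: float, exception_likely: bool) -> Decision:
    # Content that plausibly benefits from an exception (quotation, parody,
    # pastiche, etc., per Article 17(7)) is never blocked automatically.
    if exception_likely:
        return Decision.PUBLISH_AND_REVIEW
    if match_confidence >= MANIFEST_MATCH:
        return Decision.BLOCK
    if match_confidence >= POSSIBLE_MATCH:
        return Decision.PUBLISH_AND_REVIEW
    return Decision.PUBLISH

The details are unimportant; what matters is the default: where infringement is not manifest, the content stays online and rightsholders fall back on notification.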

Commentary

The Opinion attempts to navigate the divergent interpretations of Article 17, highlighting its contentious nature. The outcome of the case matters beyond copyright, because similar mechanisms may be required in other fields, such as the Terrorist Content Online Regulation (Regulation (EU) 2021/784, the “TERREG”). While the Advocate General does not find Article 17 contrary to Article 11 EUCFR, the Opinion acknowledges the complexity of the obligations it imposes.

The Opinion also raises important questions about the interplay between fundamental rights and the actions of private platforms. When do platform decisions about content become attributable to the State, so that fundamental rights are engaged? Platforms are free to set their own terms of service, but where blocking is justified by reference to national law, freedom of expression concerns may arise.

The Advocate General’s approach of incorporating safeguards against abuse within the proportionality analysis, rather than addressing them under lawfulness, is noteworthy. This approach, borrowed from ECtHR case law on privacy and surveillance, might blur the lines between distinct legal issues.

The discussion on the “essence of the right” is novel. The Advocate General draws a parallel with surveillance case law, arguing that general content monitoring would violate this essence, unlike specific monitoring. However, this distinction rests on the assumption that filtering for specific content differs significantly from broader monitoring and leaves the definition of “specific” open to interpretation.

The Advocate General’s interpretation of the “best efforts” obligation, while aiming to prevent over-blocking by limiting reliance on imperfect technology, might hinder the development and implementation of effective content moderation tools. This could shift the balance towards less proactive measures, potentially contradicting the legislation’s goals. This content-agnostic approach, while based on Article 17(7), might have broader implications if applied to areas like child sexual abuse content.

It is crucial to remember that this is a non-binding opinion. Its influence on the Court’s final decision remains to be seen, as does its impact on future legislation involving proactive content moderation measures.

Photo credit: via Wikimedia Commons

Licensed under CC BY-NC-SA 4.0