Over the past few weeks, several legal decisions have addressed how individuals can control their information online. Two cases, GC et al and Google v CNIL, focused on how search engines should interpret their obligations under the General Data Protection Regulation (GDPR). This post will focus on a third case, Glawischnig-Piesczek v Facebook, which examined how the e-Commerce Directive’s prohibition on general monitoring applies to “stay down” notices. The case raises a broader question about the consistency of different internet regulations, including those related to intellectual property, child exploitation, and terrorism.
The Court’s Decision
This case originated from a simple situation: Glawischnig-Piesczek asked Facebook to remove defamatory posts about her. When Facebook didn’t comply, she obtained a court order. However, the order’s scope was unclear, prompting the Austrian Supreme Court to ask the Court of Justice of the European Union (CJEU) for clarification on several points regarding “stay down” notices:
- Do they violate Article 15 of the e-Commerce Directive when applied to identical content?
- Are there geographical limits to this obligation?
- Can they apply to content that conveys the same meaning as the original defamatory content but uses different wording?
- For such equivalent content, does the obligation arise only once the platform becomes aware of it?
The CJEU stated that the immunity provided by Article 14 of the Directive doesn’t exempt platforms from all legal obligations: national authorities can still demand the removal of illegal content. Additionally, Article 18 requires Member States to establish legal procedures for addressing such content, including measures to swiftly end infringements and prevent further harm. The Court determined that the e-Commerce Directive doesn’t restrict the reach of these national measures.
Regarding Article 15, the Court acknowledged that while general monitoring obligations are prohibited, monitoring in “specific cases” is allowed, as stated in Recital 47. It treated the search for a specific piece of content stored on a platform at a user’s request as such a “specific case.” Given how quickly information spreads online, the Court held that requiring a platform to block or remove content identical to that already found defamatory, regardless of who posts it, is a legitimate way to prevent further harm. This, in the Court’s view, doesn’t constitute general monitoring.
The Court defined “equivalent meaning” by reference to the message conveyed, not just the wording, so injunctions can cover slightly reworded content that would otherwise circumvent them. However, it emphasized that such injunctions must give platforms clear guidelines for identifying infringing content, so that they needn’t make independent judgments, and suggested that “automated search tools and technologies” could be used for this purpose.
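What such tools might look like is left open. As a purely illustrative sketch (in Python, with an invented post text and blocklist; nothing here reflects the judgment’s requirements or Facebook’s actual systems), detecting identical content can be as mechanical as comparing hashes of normalized text:

```python
import hashlib

def normalize(text: str) -> str:
    # Collapse case and whitespace so trivial formatting changes
    # don't defeat an exact match.
    return " ".join(text.lower().split())

def fingerprint(text: str) -> str:
    # A stable hash of the normalized text identifies identical reposts.
    return hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()

# Hypothetical: the specific content named in the injunction.
blocked = {fingerprint("the original defamatory post")}

def is_identical_repost(new_post: str) -> bool:
    # No assessment is involved: the post either matches or it doesn't.
    return fingerprint(new_post) in blocked
```

Matching of this kind requires no judgment by the platform, which may be why the Court treats it as a “specific case” rather than general monitoring. The harder question is what happens once the wording changes, as discussed below.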
The CJEU also confirmed that Article 18 doesn’t impose territorial limits on measures taken by Member States. Therefore, orders could have global implications, provided they comply with international law.
Finally, the Court decided that, in light of these answers, there was no need to address the remaining question of when the obligation regarding equivalent content begins.
Analysis
The ruling has several noteworthy aspects. This analysis focuses on three: the approach to general monitoring, the treatment of non-identical content, and the question of territorial scope. It also touches on freedom of expression concerns.
General Monitoring
The judgment confirms that searching for specific content doesn’t equate to general monitoring. It doesn’t, however, address the fact that such a search may involve scanning all content on a platform, which sits uneasily with previous rulings like McFadden and L’Oreal, which held that platforms can’t be required to actively monitor everything they host.
It’s possible that the Court distinguishes between targeted searches for specific content and broader monitoring activities. That distinction might rest on the assumption that targeted searches, especially automated ones, don’t inherently invade privacy. This assumption has been challenged in cases like Watson/Tele2, yet the Glawischnig-Piesczek judgment leaves the point unaddressed.
Non-identical Content
Allowing injunctions to cover non-identical content presents challenges. While the ruling insists that platforms shouldn’t have to make independent judgments, it doesn’t define how similar content must be to warrant removal. The judgment seems to assume technology can reliably identify equivalent content, but whether it can do so without human assessment remains unclear.
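To see why, consider a naive similarity check. The sketch below (again illustrative: the threshold and example text are invented) uses Python’s standard difflib to score how close a new post is to the enjoined one:

```python
from difflib import SequenceMatcher

ORIGINAL = "the original defamatory post"  # hypothetical enjoined content

def similarity(candidate: str) -> float:
    # Ratio in [0, 1]; 1.0 means the character sequences are identical.
    return SequenceMatcher(None, ORIGINAL.lower(), candidate.lower()).ratio()

# Any cut-off encodes a judgment: 0.9 catches minor rewording but misses
# paraphrases; 0.6 catches paraphrases but also sweeps in posts that
# merely discuss the same subject. The judgment leaves this choice open.
THRESHOLD = 0.8  # invented for illustration

def is_equivalent(candidate: str) -> bool:
    return similarity(candidate) >= THRESHOLD
```

Wherever the threshold is set, someone has decided how much rewording still conveys “the same meaning,” which looks very much like the independent assessment the Court says platforms shouldn’t have to make.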
Territorial Scope
The ruling’s position on territorial scope is significant. The Court doesn’t mandate that injunctions have extraterritorial effect; it holds only that EU law doesn’t prevent it. National courts, acting within their own legal frameworks, could therefore issue such orders.
The CJEU’s stance here contrasts with its decision in Google v CNIL, where it upheld the “right to be forgotten” but limited its geographical scope to EU Member States. The difference might stem from the EU’s objective of harmonizing regulation within its borders while acknowledging the complexity of global internet governance.
Freedom of Expression
The ruling in Glawischnig-Piesczek doesn’t explicitly instruct national courts to consider freedom of expression when issuing injunctions. This is notable because what’s considered defamatory in one country might be protected speech in another.
In conclusion, while the CJEU addressed several aspects of online content regulation, the Glawischnig-Piesczek case highlights the ongoing challenge of balancing competing interests like privacy, reputation, and freedom of expression in a globally interconnected digital environment.