Cracking down on online terrorism - a threat to freedom of speech? The EU Regulation on online terrorist content under scrutiny.

Professor Lorna Woods, University of Essex

On 12 September 2018, the European Commission presented a proposal for a regulation aimed at combating terrorist content online. The regulation would require EU Member States to oblige internet platforms to proactively remove such content, and to ensure that authorities have the powers to act against this illegal material. The proposal builds upon existing voluntary partnerships, such as the EU Internet Forum, and the Commission’s earlier non-binding recommendations on tackling illegal content. By moving from non-binding to legally enforceable measures, the Commission aims to strengthen its action against such content. This move aligns with a broader trend of increasing requirements on internet intermediaries, as seen in areas such as copyright and the revised Audiovisual Media Services Directive. Since the proposal is based on the “internal market” legal framework, it would apply to all EU Member States.

The Proposal’s Details

The proposal specifically targets “hosting service providers” (HSPs) responsible for storing and making available “illegal terrorist content,” with both terms clearly defined within the regulation. “Illegal terrorist content” encompasses various forms of information, including material that incites or glorifies terrorist acts, encourages participation in such activities, promotes terrorist groups, or provides instructions on methods for carrying out attacks. The format of the content, whether text, images, audio, or video, is irrelevant.

The regulation places several obligations on HSPs, including the requirement to explicitly prohibit terrorist content in their terms of service. They must also implement reasonable and proportionate measures to address this content, while upholding fundamental rights, particularly freedom of expression. The proposal introduces “removal orders”, which competent authorities in Member States can issue to mandate the removal of terrorist content within one hour of receipt. It also provides for “referrals”, by which authorities flag potentially problematic content to HSPs for their own assessment, without requiring removal. HSPs are obligated to establish mechanisms for assessing referred content and are encouraged to take proactive measures to prevent uploads of terrorist material. Furthermore, they are required to retain data for specified periods, provide transparency reports, and establish points of contact for cooperation with relevant authorities, including European bodies like Europol.

The regulation’s scope extends beyond EU-based services to include those operating outside the EU that cater to EU users. These providers must designate a legal representative within the EU, and failure to do so would subject them to the jurisdiction of all Member States. The proposal emphasizes that national implementing measures would generally not be required, as it takes the form of a regulation.

Member States are tasked with designating competent authorities to enforce the regulation and ensuring effective, proportionate, and dissuasive penalties are in place for violations. The regulation also includes provisions for monitoring the actions taken by both authorities and HSPs, emphasizing the need for Member States to equip their competent authorities with adequate resources to combat online terrorist content.

Initial Observations

This proposal complements the existing Terrorism Directive, which also includes provisions for content blocking and removal. However, the proposal’s introduction suggests that these existing measures might be perceived as inadequate. Furthermore, it reflects a shifting perspective on internet intermediaries, particularly platforms hosting third-party content. Unlike the earlier “safe harbour” approach embodied in the e-Commerce Directive, which emphasized platform neutrality, this proposal envisages a greater responsibility for platforms to actively address harmful content.

While acknowledging the duty of care placed on HSPs to proactively address terrorist content, the proposal clarifies that this should not be misconstrued as “general monitoring”, which is prohibited under the e-Commerce Directive. How the distinction between these two concepts will be drawn in practice is not yet clear, however, particularly given the regulation’s focus on both preventing uploads and ensuring rapid takedowns. Furthermore, the proposal suggests that measures taken under this regulation could deviate from the e-Commerce Directive’s approach in specific cases where overriding public security concerns outweigh the limitations on proactive monitoring. This represents a notable shift in the interpretation of the e-Commerce Directive.

The Commission has indicated that the regulation’s broad definition of HSPs could encompass a wide range of services, including social media platforms, video streaming services, file sharing platforms, and even websites that allow user comments and reviews. However, the requirement for content to be accessible to third parties introduces some ambiguity in defining the scope of HSPs. The regulation’s application to services whose primary function is not hosting but allow for user-generated content further complicates this issue.

The broad definition of HSPs and their obligations also raises concerns about potential overlap with existing regulations, such as the Audiovisual Media Services Directive, and about the proportionality of these obligations for smaller companies. The Commission has stated that the regulatory burden will be proportionate, taking account of the level of risk and the economic capabilities of the HSPs. How this proportionality principle will work in practice, however, remains an open question.

Similar to other recent legislation, such as the GDPR, this proposal has an extraterritorial reach, impacting HSPs that offer services within the EU, even if based outside the EU. While the proposal clarifies that mere accessibility of a service within the EU is insufficient to trigger its application, the potential for a “blackout effect” similar to that seen with the GDPR exists.

Although criminal law falls under the purview of individual Member States, the proposal relies on a European definition of terrorist content. This offers some consistency for companies operating across borders but raises questions about the suitability of this definition and the potential preclusion of national standards. The short one-hour timeframe for complying with removal orders is particularly noteworthy, contrasting sharply with existing legislation in some Member States. This timeframe poses significant challenges for HSPs, particularly those operating across time zones, requiring them to establish efficient mechanisms for handling removal orders, potentially including automation.

While the short timeframe for removal has been criticized for potentially encouraging the removal of any reported content to avoid penalties, it is important to note that HSPs are not required to make independent judgments about the content’s nature. The assessment of whether the content constitutes “terrorist content” rests with the competent authorities issuing the removal orders. Even so, concerns remain about the potential for over-removal, given the limited time for review.

The purpose of referrals, which carry no specific deadline for action, is less clear. They appear intended to formalize existing voluntary arrangements under which authorities flag potentially problematic content to HSPs. However, the absence of any obligation to remove, and the focus on assessing the content against the HSP’s terms of service rather than its legality, raise questions about the effectiveness of this mechanism.

Article 6, which mandates HSPs to take “effective proactive measures,” is somewhat vague, potentially providing HSPs with significant leeway in interpreting and implementing these measures. While this allows for flexibility and adaptation to different services and contexts, it also raises concerns about the privatization of countering terrorism and the potential for inconsistent approaches across different HSPs. The ability for national authorities to impose specific measures on HSPs under Article 6(4) further contributes to this potential for fragmentation across the EU.

The proposal’s impact on freedom of expression is another crucial consideration. While the regulation aims to prevent the erroneous removal of non-terrorist content by emphasizing safeguards like human oversight and verification, it remains unclear how this will play out in practice. The absence of explicit provisions for HSPs to enforce their own content standards, beyond the scope of the regulation, raises questions about the balance between combating terrorism and protecting freedom of expression for users and online communities.

Finally, the substantial penalties outlined in the proposal, with fines of up to 4% of global turnover for systematic non-compliance, are significant and reflect a trend observed in other areas of online regulation, such as the GDPR. These hefty penalties further underscore the importance of understanding and complying with the regulation’s requirements for HSPs operating within the EU.

Barnard & Peers: chapter 25, chapter 9

JHA4: chapter II:5

Photo credit: Europol

Licensed under CC BY-NC-SA 4.0