Unresolved concerns regarding the regulations for trusted flaggers under the Digital Services Act regime

Alessandra Fratini and Giorgia Lo Tauro, FratiniVergano European Lawyers

Photo credit: **Lobo Studio Hamburg**, via Wikimedia Commons

1 Introduction

The EU’s Digital Services Act (DSA) formally establishes the roles and responsibilities of “trusted flaggers,” entities that have operated in the online world, with varying degrees of influence, since the early 2000s. The new framework serves the DSA’s broader goals: consistent, effective, and balanced rules across the EU for the online marketplace, aimed at fostering a secure, predictable, and trustworthy online space that protects fundamental rights and encourages innovation, and relying heavily on responsible conduct by intermediary service providers. This article examines the evolution of EU regulations and practices leading to the DSA’s adoption, delves into the DSA’s approach to trusted flaggers, and highlights unresolved issues that will need practical solutions.

2 Trusted reporters: the precedents paving the way to the DSA

Flagging generally refers to third parties reporting harmful or unlawful content to the intermediary service providers hosting it, prompting them to take action. “Trusted flaggers” typically have special privileges, like faster processing of their reports or access to dedicated reporting channels. This raises questions about their accountability and trustworthiness, since the trust placed in a flagger by one platform or authority is not necessarily shared by others.

The concept of trusted flaggers in EU law dates back to Directive 2000/31 (the “e-Commerce Directive”), the foundation for online services regulation in the EU. The Directive shielded intermediaries from liability for illegal content under specific conditions: Articles 12 (“mere conduit”), 13 (“caching”), and 14 (“hosting”) – now Articles 4-6 DSA. Under Article 14 in particular, hosting providers were liable for information stored at a user’s request only if, upon becoming aware of unlawful content, they failed to act expeditiously to remove it or disable access to it. The Directive encouraged voluntary agreements among stakeholders to develop procedures for addressing illegal information.

This conditional liability framework prompted service providers to incorporate flagging systems into their content moderation policies. These systems, while not mandated by the Directive, emerged in response to its liability rules, allowing providers to act quickly on notified content and thereby preserve their exemption from liability.

Following Article 16 of the Directive, which promotes EU-level codes of conduct, the Commission introduced the EU Code of Conduct on countering illegal hate speech online in 2016. Signed by the Commission and numerous service providers, the Code represents a voluntary pledge to review most flagged content within 24 hours, remove or block access to illegal content, and collaborate with civil society organizations as “trusted reporters.” The Code designates trusted reporters to deliver high-quality notices, with signatories expected to provide information about them on their websites.

In 2017, the Commission adopted the Communication on tackling illegal content online to guide online service providers on managing illegal content. This Communication proposed that industry players agree on criteria respecting fundamental rights and democratic values, either through self-regulation or within EU standardization frameworks. It also stressed the need to balance the quality standards expected of trusted flaggers’ notices against the burden placed on companies to verify that those standards are met, including the possibility of revoking trusted flagger status in cases of abuse.

Building on these voluntary initiatives, the Commission adopted Recommendation 2018/334 on effectively tackling illegal content online. This Recommendation emphasized fostering cooperation between hosting providers and trusted flaggers, particularly through expedited procedures for processing their notices. It encouraged providers to publish clear, objective criteria for designating trusted flaggers, ensuring that they possess the necessary expertise and operate diligently, objectively, and in line with the values of the EU.

The 2017 Communication and 2018 Recommendation laid the groundwork for the DSA’s institutionalized trusted flaggers regime. However, other developments occurred in the interim.

In 2018, following consultations with citizens and stakeholders, the Commission released a Communication on tackling online disinformation, acknowledging the role of trusted flaggers in promoting information credibility and inclusive solutions. Platform operators agreed to voluntary self-regulatory standards against disinformation, adopting a Code of Practice on Disinformation. However, a 2020 Commission assessment revealed shortcomings, including inconsistent application across platforms and Member States and inadequate monitoring. Consequently, in May 2021, the Commission issued its Guidance on Strengthening the Code of Practice on Disinformation, recommending a function for users to report false or misleading information. This Guidance aimed to evolve the existing Code of Practice into a “Code of Conduct,” as envisioned by (now) Article 45 DSA.

Subsequently, 34 signatories involved in revising the 2018 Code signed and presented the Strengthened Code of Practice on Disinformation in 2022. The Code aims to become a mitigation measure and Code of Conduct under the DSA’s co-regulatory framework for Very Large Online Platforms (VLOPs).

Finally, predating the DSA, Article 17 of Directive 2019/790 (the “Copyright Directive”) builds on the e-Commerce Directive’s Article 14(1)(b) on intermediary liability limitations. The article recognizes the crucial role of copyright holders in flagging unauthorized uses of their protected works. Article 17(4) stipulates that, absent an authorization, online content-sharing service providers are liable for the unauthorized sharing of copyrighted works unless they can demonstrate that they made best efforts to obtain an authorization, made best efforts to prevent the availability of specific works flagged by rightsholders, and acted expeditiously to remove or disable access to such works upon receiving a sufficiently substantiated notice.

3 Trusted flaggers under the DSA

The DSA legitimizes trusted flaggers, formally recognizing a previously voluntary practice.

Under the DSA, a trusted flagger is an entity certified by the Digital Services Coordinator (DSC) of its home Member State, within a specific area of expertise, after meeting specific legal requirements. Online platforms must prioritize and promptly address notices from trusted flaggers regarding illegal content, which requires them to implement the necessary technical and organizational measures within their notice and action systems. Recital 61 clarifies that only entities (public, non-governmental, private, or semi-public), not individuals, can be designated as trusted flaggers. Consequently, private entities representing individual interests, such as brands or copyright holders, are not excluded, but industry associations representing their members are preferred. This preference seemingly stems from the need to preserve the value of the fast-track system by limiting the overall number of trusted flaggers under the DSA. Recital 62 emphasizes that these rules don’t prevent platforms from prioritizing notices from non-designated entities or individuals, or from collaborating with other entities under applicable law. Additionally, the DSA allows platforms to use such mechanisms for swift and reliable action against content violating their terms and conditions.

Eligibility requirements

Article 22(2) outlines three cumulative conditions for an applicant seeking trusted flagger status: proven expertise in identifying and reporting illegal content, independence from any online platform provider, and demonstrated diligence, accuracy, and objectivity in their operations.

The award of the status

Article 22(2) states that the trusted flagger status is awarded by the DSC of the Member State in which the applicant is established. Unlike voluntary schemes managed by individual platforms, the DSC-awarded status applies to all platforms falling within the DSA’s purview. The DSC informs the Commission and the European Board for Digital Services of awarded, suspended, or revoked statuses, which the Commission publishes and keeps up to date in a publicly accessible database.

Article 49(3) required Member States to designate their DSCs by February 17, 2024, and the Commission lists the designated DSCs on its website. Responsible for overseeing and enforcing the DSA, the DSCs ensure coordinated supervision and enforcement throughout the EU. The European Board for Digital Services will be consulted on the Commission’s forthcoming guidelines for trusted flaggers and on matters related to trusted flagger applications.

The fast-track procedure

Article 22(1) mandates that online platforms prioritize and handle notices from trusted flaggers promptly, in line with the general rules for notice and action mechanisms under Article 16. Recital 42 encourages platforms to establish a single electronic point of contact for both trusted flaggers and professional entities that have a specific relationship with them. Recital 62 adds that the faster processing of trusted flaggers’ notices hinges on the “actual technical procedures” put in place by platforms, which remain responsible for the organizational and technical aspects of this fast-track process.
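To illustrate what such a fast-track arrangement might look like in practice, the sketch below shows one way a platform could give trusted flaggers’ notices priority within a notice and action queue. This is a minimal, hypothetical example: the DSA does not prescribe any particular technical implementation, and every name in the snippet (Notice, NoticeQueue, the is_trusted_flagger field) is an assumption made purely for illustration.

```python
from dataclasses import dataclass
from itertools import count
from queue import PriorityQueue

# Hypothetical sketch: a notice-and-action queue in which notices from
# designated trusted flaggers are reviewed ahead of ordinary notices,
# as Article 22(1) DSA requires. Names and structures are illustrative only.

@dataclass
class Notice:
    submitter: str
    content_url: str
    is_trusted_flagger: bool  # e.g. set after checking the public database of designated flaggers


class NoticeQueue:
    def __init__(self) -> None:
        self._queue: PriorityQueue = PriorityQueue()
        self._order = count()  # preserves submission order within each priority class

    def submit(self, notice: Notice) -> None:
        # Priority 0 = trusted flagger (fast track); 1 = all other notices.
        priority = 0 if notice.is_trusted_flagger else 1
        self._queue.put((priority, next(self._order), notice))

    def next_for_review(self) -> Notice:
        # Returns the highest-priority pending notice.
        _, _, notice = self._queue.get()
        return notice


# Usage: the trusted flagger's notice surfaces first, even though it was submitted later.
q = NoticeQueue()
q.submit(Notice("individual user", "https://example.org/item/1", False))
q.submit(Notice("designated trusted flagger", "https://example.org/item/2", True))
assert q.next_for_review().is_trusted_flagger
```

Any real system would, of course, also need to verify a submitter’s designation against the Commission’s public database and confine the fast track to notices falling within the flagger’s designated area of expertise, as Article 22 requires.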

Activities and ongoing obligations of trusted flaggers

Article 22(3) requires trusted flaggers to publish, at least once a year, detailed reports on the notices they have submitted, to make those reports publicly available, and to send them to the awarding DSC. The trusted flagger status can be suspended or revoked if the entity no longer meets the required conditions or fails to fulfill its obligations. Revocation or suspension can only occur after an investigation by the awarding DSC, initiated either by the DSC itself or on the basis of information received from third parties, including reports from online platforms. Trusted flaggers can respond to the investigation’s findings and take corrective action.

Furthermore, under Article 53, trusted flaggers have the right to lodge a complaint with their local DSC if they identify any DSA violations by platforms, a right granted to all service recipients to ensure effective enforcement of the DSA.

The role of the DSCs

The DSA mandates that platforms prioritize notices from designated trusted flaggers. While platforms can still enter into bilateral agreements with trusted private entities or individuals for prioritized processing, they must give priority to those designated by the DSCs. This shifts some decision-making responsibility from platforms to DSCs, while increasing platforms’ accountability for implementing the measures that guarantee this mandated prioritization. For reporters, the DSA introduces harmonized requirements for obtaining and maintaining trusted flagger status across platforms.

While awaiting the Commission’s guidelines, some DSCs have issued their own guidance for potential applicants on the requirements for trusted flagger status. For instance, France’s ARCOM, Italy’s AGCOM, Ireland’s Coimisiún na Meán, Austria’s KommAustria, Denmark’s KFST, and Romania’s ANCOM have published guidelines or application forms on their websites. These national guidelines aim to ensure consistency and harmonization in implementing Article 22.

4 Open issues

Although still in its early stages, the DSA’s approach to trusted flaggers has been praised for standardizing existing practices, harmonizing eligibility criteria, complementing specialized regimes like the Copyright Directive’s Article 17, reinforcing collaboration among stakeholders, and formalizing the role of trusted flaggers in notice and action procedures.

However, the DSA leaves some practical issues unresolved. These include clarifying the roles of trusted flaggers in relation to other actors combatting illegal or harmful content online, such as end users and reporters with direct platform agreements.

The role of trusted flaggers vis-à-vis end users

While the DSA doesn’t explicitly address the interaction between trusted flaggers and end users, some national guidelines suggest that applicant entities should demonstrate mechanisms for end users to report illegal content to them, as part of the due diligence requirement during the application process. Essentially, applicants need to explain how they select the content they monitor, including how they use end-user reports, and demonstrate a balanced approach that considers all legitimate rights and interests. This leaves the organization and management of their relationships with end users, including onboarding procedures and the handling of their reports, to the trusted flaggers themselves. Some organizations already operating within existing voluntary schemes, such as those in the INHOPE network, offer hotlines for the public to report illegal content anonymously.

The DSA unequivocally allows end users to directly report content to platforms (Article 16) without involving trusted flaggers. They also retain the right to lodge complaints against platforms (Article 53) and seek compensation for damages (Article 54). It remains to be seen whether the expedited processing offered through trusted flaggers will incentivize end users to use them over direct reporting. Conversely, it’s unclear to what extent applicant entities will be obligated to create robust mechanisms for processing end-user reports within their area of expertise as part of the due diligence requirements.

Trusted flaggers might uncover platform violations while reviewing illegal content reports, just like any online service user. In such instances, Article 53 grants them the right to file a complaint with the relevant DSC, a right they share with end users. If “priority” is the cornerstone of the trusted flagger status when reporting illegal content to platforms, the question arises whether they should receive similar priority when reporting platform violations to DSCs. This raises another question: should trusted flaggers be empowered to lodge complaints with DSCs on behalf of end users, particularly concerning platform abuses like shadow banning?

The role of trusted flaggers vis-à-vis other reporters

The DSA requires online platforms to implement “easy to access and user-friendly” notice and action mechanisms (Article 16) and establish internal complaint-handling systems for service users (Article 20). However, these provisions apply universally, without distinguishing between trusted flaggers and other users. Despite the priority given to their notices under Article 22, the DSA doesn’t define the extent of this prioritization compared to notices from other trusted reporters with whom platforms have existing agreements.

Guidance is needed on the degree of precedence platforms should give trusted flaggers’ notices over those of other trusted reporters, and on whether the nature of the content affects this precedence. From the trusted flaggers’ standpoint, there should be a tangible incentive for taking on the role and its inherent obligations.

5 Concluding remarks

While the concept of trusted flaggers in addressing illegal online content isn’t new, the DSA introduces novel responsibilities for DSCs. This redistribution of responsibilities aims to harmonize best practices across the EU and enhance online user protection. However, addressing open questions like those outlined above is crucial for the effectiveness of the trusted flagger mechanism in ensuring swift action against harmful and illegal online content. The forthcoming Commission guidelines under Article 22(8) DSA are expected to provide clarity on these issues. Otherwise, the lack of certainty regarding the actual benefits might discourage potential applicants from taking on the role of a trusted flagger.

Licensed under CC BY-NC-SA 4.0