The complexities of the new EU Media Freedom Act

Samira Asmaa Allioui, research and tutorial fellow at the Center for International and European Studies at the University of Strasbourg

Photo credit: Bin im Garten, via Wikimedia Commons

Journalists face mounting pressures, and in recent years media freedom, and media pluralism in particular, have come under threat.

On December 15, 2023, the Council and the European Parliament reached a provisional agreement on a regulation to protect media freedom, pluralism, and editorial independence within the European Union. The European Media Freedom Act (EMFA) aims for greater transparency in media ownership and safeguards against government surveillance and the use of spyware against journalists. This agreement follows successive revisions of the Audiovisual Media Services Directive (AVMSD) and new regulations such as the Digital Markets Act (DMA) and the Digital Services Act (DSA). It is important to note that the EMFA builds upon the DSA.

This piece will provide an overview of the EMFA, analyzing the extent to which its rules might still hinder free speech, erode trust, disrupt democratic processes, enable disinformation, and create legal ambiguity.

The EMFA mandates that EU nations uphold editorial freedom, prohibit spyware and political interference, provide stable public media funding, protect online media, and ensure transparent state advertising. It establishes a new independent watchdog, the European Board for Media Services, to combat interference both within and outside the EU.

However, this new EU law, specifically Article 18, attempts to restrict how very large online platforms (VLOPs) may moderate media content. This raises concerns about the potential negative consequences of introducing a media exemption. A significant ambiguity lies in Article 2’s definition of “media service,” a widely recognized issue. This raises the question of who the EMFA truly protects. Does it genuinely strengthen democracy and citizens’ access to impartial, unbiased information? The potential for political interference during the European Parliament elections is a valid concern. External actors might exploit democratic elections to illegally influence the media by creating fake social media accounts and spreading divisive content through large-scale propaganda.

INFORMATION ACCURACY

The EMFA centers on two key aspects of VLOPs. First, it argues that platforms restrict access to reliable content when their terms and conditions impact media companies upholding journalistic standards and editorial responsibility. Essentially, the regulation challenges the control VLOPs have over access to media content, aiming to reshape the relationship between media and platforms. Media service providers exercising editorial responsibility are considered crucial for online information dissemination and freedom of information. By fulfilling this responsibility, they are expected to act diligently, providing reliable information that respects fundamental rights and adheres to relevant regulatory or self-regulatory frameworks in member states.

Second, it posits that media quality can combat disinformation. To address this, the EMFA aims to redefine the platform-media dynamic. According to Article 2, “media service” refers to a service, as defined in Articles 56 and 57 of the TFEU, whose primary purpose, or that of a separable section thereof, is to provide programs or press publications to the public under the editorial responsibility of a media service provider. This is done through any means to inform, entertain, or educate. A “media service” receives certain protections under the Act. Joan Barata argues that the EMFA’s definition of media is overly narrow, inconsistent with international and European human rights standards, and discriminatory as it excludes certain forms of media and journalistic work. The DSA classifies platforms or search engines with over 45 million monthly active users in the EU as VLOPs or Very Large Online Search Engines (VLOSEs). For instance, under Article 18, media service providers gain specific transparency and contestation rights on platforms. Moreover, according to Article 19, they can engage in structured dialogue with platforms on issues like disinformation. The agreement mandates that VLOPs inform media service providers of content removal or restriction plans, giving them 24 hours to respond (except in crises defined by the DSA).

Article 18’s 24-hour content moderation exemption for media effectively forces platforms to host content, delaying the removal of media content that violates community guidelines. This could not only threaten marginalized groups but also jeopardize free speech and fuel disinformation. It creates a cycle in which false information is posted, amplified by platforms’ algorithms or bots, then further disseminated by users who encounter and share it.

The EMFA stipulates that platforms establish a “special/privileged communication channel” before users sign up. This channel facilitates discussions about content restrictions with “media service providers,” defined as entities providing media services with editorial responsibility for content selection and organization. Instead of hosting all content, online platforms should prioritize certain media outlets.

However, this approach not only restricts platforms’ ability to enforce their terms of service (regarding nudity, disinformation, self-harm, etc.) but also threatens marginalized groups who are often the primary targets of disinformation and hate speech. Politics remains a breeding ground for both. Online platforms and social media have exacerbated the spread of hate speech and disinformation. Reports show these platforms are widely misused by political parties and governments. In fact, over 80 countries engage in political disinformation campaigns.

This could lead to misleading information remaining online long enough for widespread dissemination, undermining the EMFA’s goal of providing citizens with reliable information sources.

EXCESSIVE REGULATORY INTERVENTION AND ERODING TRUST

Any government intervention in areas like freedom of expression or media freedom raises concerns. EU member states, leveraging their EU Treaty competencies in security and defense, seem to have secured their ability to spy on journalists. However, the final text (April 11, 2024) includes European Parliament-added safeguards for spyware use. It’s now permissible only in specific cases, requiring authorization from a judicial authority investigating serious offenses with significant prison sentences.

Even in such cases, individuals have the right to be informed after surveillance and can challenge it in court. Spyware use against journalists, media outlets, and their families is explicitly prohibited. The rules also specify that journalists shouldn’t face prosecution for protecting their sources.

The law limits exceptions for national security reasons (falling under member states’ authority) or investigations into a defined list of crimes like murder, child abuse, or terrorism. In such situations, or cases of neglect, actions must be justified on a case-by-case basis, adhering to the Charter of Fundamental Rights and only if no other investigative tools suffice.

The law introduces concrete safeguards at the EU level. Affected journalists have the right to seek legal protection from an independent court in the relevant member state. Each state must designate an independent authority to handle journalists’ complaints regarding spyware use. These authorities must provide an opinion on compliance with the media freedom law within three months of a request.

Some European governments have intervened in journalists’ work, using national security as a pretext. To prevent erosion of trust, media service providers must be fully transparent about their ownership. Addressing this growing concern in the EU, the final version of the EMFA (April 2024) enhances transparency in media ownership. It expands transparency requirements for media ownership, prevents conflicts of interest (Article 6), and establishes a mechanism for national regulators to coordinate against propaganda from hostile external actors (Article 17).

Strengthening safeguards to prevent economic capture of all media by private owners is crucial to avert media bias. Unofficial intervention, leading to non-transparent and selective support for pro-government media, can be even more detrimental. This highlights the risks posed by political pressure and corruption to a free press.

The EMFA’s content moderation provisions could also damage public trust in the media and compromise information integrity. Online platforms already moderate illegal content. The new provisions include: solutions-oriented dialogue between VLOPs, media, and civil society to prevent unjustified content removal; mandatory annual reporting by VLOPs on content moderation practices (covering illegal content, complaints, automated tool use, training, etc.); prioritized processing of complaints from media service providers; and increased protection for professionally produced media content against unjustified removal by VLOPs. Platforms must take precautions to inform media service providers of content suspension reasons before enacting it. These safeguards aim to align this rapid alert procedure with the European Commission’s priorities, such as combating disinformation. The Electronic Frontier Foundation argues that “by creating a privileged class of self-declared media providers exempt from content removal on large platforms, the law not only dictates company policies but risks harming users within and outside the EU”.

MEDIA OUTLETS AND PLATFORMS NEGOTIATING CONTENT

The EMFA fails to address who oversees the self-declaration process outlined in Article 18(1). Specifically, this article states that VLOPs “shall provide a functionality allowing recipients of their services to declare” their status as media service providers. This self-declaration hinges on three criteria: meeting the definition of a media service provider under Article 2; declaring editorial independence from member states, political parties, and entities owned or controlled by third countries; and declaring adherence to regulatory requirements for editorial responsibility in one or more member states or a widely recognized self-regulatory mechanism for editorial standards within the relevant media sector in one or more states. If a VLOP suspends services for content from a self-declared media service provider based on terms and conditions violations, Article 18(4) mandates providing a statement of reasons to the affected provider before enacting the suspension or visibility restriction.

Furthermore, Article 18 EMFA fragments legislation by deviating from the rules established in the Digital Services Act (DSA). The DSA is a horizontal instrument aimed at building a more trustworthy online environment. It establishes a multi-level responsibility framework targeting various service types and proposes harmonized asymmetric obligations at the EU level for transparent and accountable online spaces. These DSA rules apply across all types of services and cover illegal content, goods, and services alike. Media regulators will be involved in the cooperation mechanisms for aspects under their purview, though the practical implications remain unclear.

The introduction of a “structured cooperation” mechanism seeks to enhance the robustness, legal certainty, and predictability of cross-border regulatory cooperation. This means increased coordination and collective deliberation among national regulatory authorities (NRAs), potentially benefiting the implementation of the EMFA. Although media regulators will participate in areas under their remit, the practicalities require clarification.

A key question is how this legislation will be practically applied to ensure it neither undermines free speech and democratic debate nor endangers vulnerable groups. While Article 18 includes safeguards against AI-generated content (details of which remain unclear; see Hajli et al. on “Social Bots and Disinformation” and Vaccari and Chadwick on “Deepfakes and Disinformation”), the use of generative AI to spread disinformation and deepfakes is concerning. In this age of dominant technologies, voluntary guidelines are insufficient. Stronger measures are needed to balance free speech with control over AI systems. While AI can be beneficial for journalists, it can also be misused.

INEQUALITY AMONG MEDIA PROVIDERS: SPECIAL STATUS

The EMFA treats media providers unequally: some (those negotiating content) receive special status while others do not. Platforms must ensure public access to most reported information. The primary privilege of this special status is that VLOPs face greater restrictions when moderating such providers’ content, not an outright prohibition on action, but requirements for greater transparency and information sharing with the affected provider. This creates an uncertain negotiation landscape in which influential media outlets and platforms dictate content visibility. This is particularly problematic because media outlets have financial incentives to prioritize rapid dissemination and visibility, potentially at the expense of smaller providers.

CONCLUSION

Article 18’s self-proclamation mechanism still poses a risk of manipulating public opinion by masking disinformation and propaganda as legitimate media content. Additionally, the fragmentation of legislation and its divergence from the DSA risks creating a two-tiered system of free speech. Article 18, by allowing self-proclaimed media entities to operate with limited oversight, could hinder our ability to make informed decisions. This unregulated spread of disinformation could severely damage democratic processes. Lastly, the lack of clarity in Article 18 regarding the verification of self-proclaimed media outlets creates challenges for compliance enforcement.

These points highlight the drawbacks of the new legislation, underscoring the need for future efforts to address the critical state of press freedom within the EU.

Licensed under CC BY-NC-SA 4.0