Christian Thönnes, Research Associate at the Max Planck Institute for the Study of Crime, Security and Law in Freiburg, Department of Public Law.
July 13, 2021, marked a crucial moment for the future of privacy and security in the European Union. On this day, the Court of Justice of the European Union (CJEU) held a hearing in preliminary ruling procedure C-817/19. The procedure was initiated after the Constitutional Court of Belgium submitted ten questions to the CJEU, prompted by a legal challenge from the Belgian NGO Ligue des Droits Humains. These questions concern the interpretation of Directive (EU) 2016/681 (the PNR Directive) and its compatibility with EU primary law. The directive governs the use of passenger name record (PNR) data for preventing, detecting, investigating, and prosecuting terrorist offenses and serious crime. The hearing offered significant insight into the court’s perspective. Notably, Judge von Danwitz, the judge-rapporteur, posed numerous critical questions to the EU Commission, signaling potential difficulties for advocates of this expansive surveillance measure. Having previously worked on a comparable case for the Berlin-based NGO Gesellschaft für Freiheitsrechte (GFF), I attended the hearing and offer a concise summary and analysis in this entry.
The significance of this case cannot be overstated: The PNR preliminary ruling procedure has the potential to become a landmark decision, shaping the legal foundation not only for the mass retention of travel data in the name of security but also, more specifically, for utilizing self-learning algorithms to predict and identify potentially suspicious travel patterns. Essentially, the CJEU is tasked with addressing one of the first EU-wide, large-scale instances of predictive policing. Should the court choose to validate this paradigm shift, it could pave the way for a dramatic expansion of technology-driven surveillance, extending its reach to encompass various aspects of ordinary human behavior, irrespective of individual suspicion or imminent threat. For instance, in its national transposition law, the Belgian parliament decided to broaden the application of the PNR Directive beyond aviation, encompassing international trains, buses, and ferries (Doc. parl., Chambre, 2015-2016, DOC 54-2069/001, p. 7).
The PNR Directive: An unprecedented tool of technology-driven mass surveillance
The PNR Directive mandates that EU member states require air carriers to provide national security authorities, designated as “Passenger Information Units” (PIUs), with a set of data points for each passenger. PNR datasets contain unverified passenger information given to airlines or travel agencies to streamline flight processing. While the specific data points within PNR datasets vary based on the commercial needs of each airline, Annex I of the PNR Directive outlines all required data categories for transmission. These categories encompass clearly defined data elements, such as date of birth, travel companions, itineraries, and baggage information. However, they also include less defined elements, such as “general remarks” from aviation staff.
Upon receipt by PIUs, PNR datasets undergo automated comparison against databases deemed “relevant for the purposes of preventing, detecting, investigating and prosecuting terrorist offences and serious crime” (Art. 6 § 3 letter a), as well as against “pre-determined criteria” (Art. 6 § 3 letter b). The latter aims to pinpoint “unknown” suspects (EU Commission PNR Directive Proposal, SEC(2011) 132, p. 12). While the PNR Directive remains vague about the exact nature of these pre-determined criteria, Art. 6 § 4 stipulates that they should be “targeted, proportionate and specific,” and not rely on factors such as “race or ethnic origin, political opinions, religion or philosophical beliefs, trade union membership, health, sexual life or sexual orientation.” The overarching objective, however, is to extrapolate suspicious patterns from passengers’ flight behaviors. Academic research (Korff, Passenger Name Records, data mining & data protection: the need for strong safeguards, p. 4; Rademacher, Predictive Policing im deutschen Polizeirecht, AöR 2017, 366, 410-415; Sommerer, Personenbezogenes Predictive Policing, pp. 96-98), policy papers, and organizations like the GFF broadly understand “pre-determined criteria” as a likely product of self-learning algorithms.
As outlined in Art. 6 § 5, when a “hit” occurs, meaning an automatically generated match between a PNR dataset and either a database or pre-determined criteria, it undergoes “review by non-automated means.” Verified hits can then be disseminated to other relevant national law enforcement bodies on a “case-by-case basis” (Art. 6 § 6), shared with other member states (Art. 9), or, under specific conditions, disclosed to third countries (Art. 11). These authorities can then choose to pursue further action under their respective national laws. While the PNR Directive initially mandates the collection of PNR data solely for flights entering and exiting the EU, it acknowledges the right of member states to extend this scope to intra-EU flights, a right exercised by nearly all member states. PNR data is stored in its raw form for six months, followed by an additional four and a half years in a (reversibly) pseudonymized form (Art. 12).
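For clarity, the processing pipeline described above can be condensed into a short sketch. The following Python fragment is purely illustrative and rests on my own simplifications; none of the names or data structures are drawn from an actual PIU implementation:

```python
# Illustrative sketch of the PNR processing flow (Art. 6 and Art. 12 PNR Directive).
# All names and data structures are my own simplification, not a real PIU system.
from dataclasses import dataclass
from datetime import timedelta
from typing import Callable

RAW_RETENTION = timedelta(days=183)             # six months in raw form (Art. 12)
PSEUDONYMIZED_RETENTION = timedelta(days=1642)  # a further four and a half years

@dataclass
class PNRRecord:
    date_of_birth: str
    itinerary: tuple[str, ...]
    travel_companions: tuple[str, ...]
    baggage_info: str
    general_remarks: str  # the vaguely defined free-text field mentioned above

Matcher = Callable[[PNRRecord], bool]

def assess(record: PNRRecord,
           database_matchers: list[Matcher],       # Art. 6 § 3 letter a
           predetermined_criteria: list[Matcher],  # Art. 6 § 3 letter b
           human_review: Matcher) -> bool:
    """Return True only for hits confirmed by non-automated review (Art. 6 § 5)."""
    automated_hit = (any(m(record) for m in database_matchers)
                     or any(c(record) for c in predetermined_criteria))
    return automated_hit and human_review(record)
```

Hits confirmed by such a review step would then be disseminated under Art. 6 § 6, Art. 9, or Art. 11.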
EU Member States scramble to defend the PNR Directive’s proportionality
In line with standard procedure, the hearing commenced with opening statements from involved parties, including member states and EU institutions. This segment, however, leaned heavily in favor of the PNR Directive, as the plaintiff’s representative was met with a united front of eleven delegations from member states and EU institutions, all advocating for the directive. They countered accusations of the PNR Directive’s disproportionality primarily by offering anecdotal evidence of its value and downplaying the interference with fundamental rights.
Many member states initially referenced the CJEU’s prior Opinion 1/15 of 26 July 2017, which addressed the Draft PNR Agreement between Canada and the European Union. They argued that this opinion neither categorically prohibited the mass retention of PNR data nor, in principle, opposed a five-year retention period. They further suggested that PNR data would be more secure under the PNR Directive than under the Canadian agreement, since data stored on European servers benefit from the protections of the GDPR and other legal frameworks. However, they failed to mention that the CJEU deemed the Draft PNR Agreement incompatible with Articles 7, 8, and 21, as well as Article 52 § 1 CFR. This incompatibility stemmed from the Draft Agreement’s lack of clear and precise criteria, both substantive and procedural, for the automated processing of PNR data. The CJEU also criticized the absence of a necessary connection between the Agreement’s stated objective of mitigating terrorist threats and the prolonged retention and processing of PNR data (n° 232), a link contingent on factors like emerging threats. As highlighted earlier, the PNR Directive perpetuates this ambiguity (see Art. 6 § 4) and expands its reach beyond the Draft Agreement’s scope of serious cross-border crimes. It encompasses any crime of a certain (not necessarily high) degree of severity, including fraud (Annex II, Number 7) or aiding illegal entry and residence (Number 11).
Furthermore, member states highlighted the effectiveness of the PNR Directive in combating crime. The Belgian government, for instance, mentioned its role in intercepting victims of human trafficking at Belgian airports. The French government lauded the directive for enabling the apprehension of an individual operating an illegal prostitution ring who, accompanied by multiple minors, was en route to Bangkok. The Polish government cited its use in detecting the smuggling of cigarettes from Poland to Germany and in identifying Ukrainian nationals attempting to use flights for illegal immigration. Notably, however, all of the evidence provided remained purely anecdotal. Like the EU Commission in its Evaluation Report, member states failed to present detailed statistical data demonstrating the PNR Directive’s contribution to preventing, detecting, investigating, or prosecuting terrorism or serious crime. The hearing also failed to clarify whether any of the cases cited by member states, and if so how many, could have been addressed using pre-existing data processing techniques.
Moreover, most member states asserted that the PNR Directive did not significantly infringe upon the right to respect for private and family life (Article 7 CFR) or the right to personal data protection (Article 8 CFR). They argued that the PNR Directive prohibits the processing of sensitive personal data (see Article 13 § 4 and Recital 37). Some member states, including Germany, Ireland, Spain, and Cyprus, reinforced this claim by emphasizing that all utilized pre-determined criteria are established by humans, not self-learning algorithms. Their opening statements strongly denounced the use of such algorithms as processing tools. The Dutch government went as far as to assert that the PNR Directive outright bans the use of artificial intelligence or self-learning algorithms in creating pre-determined criteria. This assertion, however, appears dubious at best. Academic literature has framed the PNR Directive as a model for utilizing self-learning algorithms (see references above), and the Directive itself lacks an explicit prohibition on artificial intelligence. Additionally, the European Parliamentary Research Service mentions the PNR Directive in its report on “Artificial Intelligence at EU borders” (pages 18-19). It’s important to note that artificial intelligence relies on massive datasets to function. This reliance is reflected in Article 6 § 2 letter c of the PNR Directive, which permits “analysing PNR data for the purpose of updating or creating new criteria to be used in the assessments carried out” [through pre-determined criteria].
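To illustrate what such a reading implies, consider a deliberately hypothetical sketch of Article 6 § 2 letter c in action: a classifier trained on accumulated PNR data, whose learned rules then serve as updated “pre-determined criteria.” This is my own illustration of the academic interpretation cited above, not a documented PIU practice:

```python
# Hypothetical illustration of Art. 6 § 2 letter c: deriving new "pre-determined
# criteria" from accumulated PNR data. Toy features and labels; not a real system.
from sklearn.tree import DecisionTreeClassifier

# Invented features per passenger: [booked_last_minute, paid_cash, one_way_ticket]
past_pnr_features = [[1, 1, 1], [0, 0, 0], [1, 0, 1], [0, 1, 0]]
past_hit_labels   = [1, 0, 1, 0]  # 1 = previously confirmed hit (toy labels)

model = DecisionTreeClassifier().fit(past_pnr_features, past_hit_labels)

# The learned decision rules would then act as updated criteria applied to
# every future passenger.
print(model.predict([[1, 1, 0]]))
```

The point is not the specific algorithm but the loop it creates: the system’s own outputs feed back into the criteria against which all future passengers are screened.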
Judge von Danwitz’s questions
While the opening statements proved largely repetitive and congratulatory, the second part of the hearing, dedicated to questions from the court, proved far more engaging. Judge-rapporteur Professor von Danwitz and the EU Commission representative engaged in a significant exchange during this portion. Judge von Danwitz structured his inquiries around four central themes: the statistical reliability (or lack thereof) of the PNR system (1), the degree of interference with fundamental rights caused by the PNR Directive (2), the potential for discriminatory outcomes within the PNR system (3), and the overall proportionality of the system (4).
(1) Judge von Danwitz initiated his questioning by referencing the concerningly high false-positive rates mentioned in the member states’ statements. The EU Commission’s Evaluation Report (page 28) notes that in 2019, only “0.59% of all passengers whose data have been collected have been identified through automated processing as requiring further examination.” However, merely 0.11% of all collected datasets were, after human verification, transferred to law enforcement agencies. As Judge von Danwitz emphasized, this suggests a false-positive rate exceeding 81%. Moreover, whether even the remaining 19% of matches were legitimately pursued remains uncertain, rendering a definitive assessment of the full false-positive rate impossible. Drawing a parallel to the COVID pandemic while questioning the PNR system’s suitability for its purpose, Judge von Danwitz remarked, “If a PCR test operated with a sensitivity of 19%, I doubt it would be welcomed with open arms.” (While the original statements were delivered in French, the translations provided are my own.) While these figures may appear small at first glance, consider this: During the adoption of the German transposition law, the German government estimated that roughly 170 million air passengers would be affected each year (Gesetzesbegründung, BT-Drs. 18/11501, p. 23). Applying this estimation, in Germany alone, 187,000 individuals could be subjected to false automated suspicion annually.
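The arithmetic behind these figures is easy to reproduce. Here is a minimal sketch using the Evaluation Report’s percentages and the German government’s passenger estimate:

```python
# Reproducing the figures above (EU Commission Evaluation Report, p. 28;
# Gesetzesbegründung, BT-Drs. 18/11501, p. 23).
flagged_share = 0.0059    # 0.59% of passengers flagged by automated processing
forwarded_share = 0.0011  # 0.11% forwarded to law enforcement after human review

# Share of automated hits discarded by human reviewers:
false_positive_rate = 1 - forwarded_share / flagged_share
print(f"False positives among automated hits: {false_positive_rate:.1%}")  # ~81.4%

# Extrapolation to Germany's estimated 170 million annual air passengers:
passengers_germany = 170_000_000
print(f"Passengers affected at the 0.11% rate: {passengers_germany * forwarded_share:,.0f}")  # 187,000
```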
Indeed, these EU figures are not unique. Member states have reported similarly high false-positive rates: In a case brought before the Administrative Court of Wiesbaden by the GFF (docket number 6 K 806/19.WI), the Bundeskriminalamt, acting as the German PIU, disclosed that out of 31,617,068 processed PNR datasets, only 237,643 automatic matches occurred. After human review, a mere 910 matches remained, amounting to a 99.6% false-positive rate. Of these 910 matches, 396 investigations proved fruitless because the implicated passengers were not the individuals actually sought, further highlighting the occurrence of severe false suspicions. It is important to note that this rate pertains solely to database matches; the error rate for the significantly more variable matching against pre-determined criteria remains unknown.
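The same arithmetic applied to the German figures disclosed in the Wiesbaden proceedings:

```python
# German PIU figures disclosed by the Bundeskriminalamt (docket number 6 K 806/19.WI).
auto_matches = 237_643  # automatically generated matches
confirmed = 910         # matches remaining after human review
wrong_person = 396      # confirmed matches that implicated the wrong individual

print(f"False-positive rate of automated matching: {(auto_matches - confirmed) / auto_matches:.1%}")  # ~99.6%
print(f"Share of verified hits naming the wrong person: {wrong_person / confirmed:.1%}")              # ~43.5%
```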
Judge von Danwitz highlighted the statistical cause of these high rates: the base rate fallacy. This fallacy describes the phenomenon whereby, when searching for exceptionally rare occurrences within massive datasets, even the most advanced detection tools are prone to producing more false positives than true positives. Within the vast pool of EU air passengers, genuine terrorists or serious offenders constitute a tiny fraction; European law enforcement is effectively searching for a needle in a haystack. Increasing the size of the haystack does not guarantee finding more needles; needles remain equally scarce and challenging to locate.
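A small numerical example makes the fallacy concrete. Assume, purely hypothetically, a screening system far more accurate than anything on record: it flags 99% of genuine targets, wrongly flags only 0.1% of innocent passengers, and one passenger in 100,000 is a genuine target:

```python
# Base rate fallacy illustration; all parameters are deliberately hypothetical.
passengers = 170_000_000         # annual passengers (German government estimate)
targets = passengers // 100_000  # hypothetical prevalence: 1 in 100,000
sensitivity = 0.99               # hypothetical: 99% of genuine targets flagged
fp_rate = 0.001                  # hypothetical: 0.1% of innocents wrongly flagged

true_positives = targets * sensitivity
false_positives = (passengers - targets) * fp_rate
precision = true_positives / (true_positives + false_positives)

print(f"Flagged passengers: {true_positives + false_positives:,.0f}")  # ~171,700
print(f"Flagged passengers who are genuine targets: {precision:.2%}")  # under 1%
```

Even under these generous assumptions, fewer than one in a hundred flagged passengers would be a genuine target; the rest would be false positives.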
The EU Commission, when confronted with this criticism, pointed to the inherent limitations of the legislator’s role in addressing mathematical constraints, stating, “There are mathematical limits and errors, but the legislator is not required to conduct mathematical demonstrations.”
(2) Judge von Danwitz then challenged the assertion made by member states and EU institutions that, because the PNR Directive does not involve the processing of sensitive data, it does not seriously interfere with Articles 7 and 8 CFR. He drew a direct comparison with telecommunications data, the subject of Digital Rights Ireland, another landmark CJEU ruling on mass data retention. While acknowledging that telecommunications data might inherently contain more sensitive information than passenger data, he emphasized that evaluating the severity of an interference requires considering both the scale and the methods of processing. Firstly, while Directive 2006/24/EC primarily intended telecommunications data to be retained and accessible but left largely unexamined, the PNR Directive mandates the automated analysis of every single PNR dataset. Secondly, Judge von Danwitz stressed that utilizing data mining via self-learning algorithms amplifies the severity of the interference, effectively dismissing the disavowals presented by numerous member states.
The EU Commission, in principle, agreed with Judge von Danwitz’s assessment (“Oui, certainement, la gravité de l’ingérence est déterminée dans la manière où le Data Mining se fait.” [“Yes, certainly, the severity of the interference is determined by the manner in which the data mining is carried out.”]) but contended that varying degrees of data mining exist (“Il y a Data Mining et Data Mining.” [“There is data mining, and then there is data mining.”]). They maintained that the safeguards incorporated into the PNR Directive mitigate the intensity of the data mining in question.
(3) Subsequently, Judge von Danwitz addressed the ambiguity surrounding the criteria for the manual review of automated matches mandated by Article 6 § 5. He posited that this lack of clarity creates an environment susceptible to discrimination. In response to the EU Commission’s assertion that Article 6 § 4 prohibits discrimination, Judge von Danwitz countered that indirect discrimination remains a distinct possibility, a point conceded by the EU Commission (“un risque de discrimination indirecte existe toujours” [“a risk of indirect discrimination always exists”]). Judge von Danwitz then questioned the EU legislator’s decision to omit more robust provisions within the text to curb the risk of indirect discrimination, particularly given the extraordinarily high false-positive rate exceeding 80%. The EU Commission responded by citing the inherent limitations of any legislator’s capacity for specificity (“tout législateur a des limites lorsque l’on doit réglementer une activité minutieuse et détaillée” [“every legislator faces limits when required to regulate a meticulous and detailed activity”]). In response, the European Data Protection Supervisor suggested reversing the burden of proof as a potential solution. However, the EU Commission rejected this proposal, arguing that it presupposes that “everyone is automatically a victim.”
(4) Judge von Danwitz’s final line of inquiry delved into the overall proportionality of the PNR system. His questions predominantly centered on the (absence of a) connection between the trigger for mass data retention—air travel—and the stated objective of the PNR Directive: combatting terrorism and serious crime. The lack of clear criteria substantiating this link was a primary concern for the CJEU in its Opinion 1/15 (see n°217). The EU Commission countered by arguing that criminals specifically exploit the efficiency of international air travel to facilitate their activities. Judge von Danwitz, however, pointed out that the notion of subjecting locations and behaviors conducive to crime to mass surveillance could be expanded almost indefinitely. He questioned, “Why not rock concerts?” and “Why not museum visits?”. Surprisingly, the EU Commission appeared to concur, suggesting that rock concerts could indeed be susceptible to drug-related offenses (“I don’t have any police experience, but I could imagine that there could be much drug-related crime occurring at rock concerts.”).
Lingering doubts regarding the PNR system’s proportionality
This report is not exhaustive and doesn’t encompass the entirety of the oral hearing’s discussion or the factors to consider when evaluating the mass surveillance of air passengers. For instance, Advocate General Pitruzzella raised additional questions regarding the ambiguity surrounding which databases qualify as “relevant” for comparison purposes (von Danwitz even inquired about the potential use of Facebook databases), whether the widespread extension of the PNR Directive’s scope to include intra-EU flights was justified or excessive, and whether the five-year retention period was disproportionate. Advocate General Pitruzzella also inquired about external oversight mechanisms for pre-determined criteria and whether false positives are systematically used to refine algorithms (some member states confirmed this practice).
However, from my perspective, the hearing exposed numerous constitutional vulnerabilities within the PNR Directive. The EU Commission and member states failed to assuage my concerns regarding its lack of clarity, questionable suitability and effectiveness, and unchecked potential for widespread discrimination. The Directive’s most glaring weakness, however, lies in its sheer overreach: Subjecting ordinary human behavior to an unprecedented level of technology-driven surveillance, impacting millions of European air passengers, and sifting through mountains of irrelevant and potentially discriminatory data simply to—perhaps, with some uncertainty—apprehend a handful of criminals is demonstrably disproportionate.
Photo credit: Juke Schweizer, via Wikimedia Commons