Lorna Woods, Professor of Internet Law, University of Essex
The Advocate General recently issued an opinion on two cases referred by national courts. These cases stem from the Court of Justice of the European Union’s (CJEU) decision in Digital Rights Ireland, which invalidated the Data Retention Directive. Both cases examine whether the mass retention of communications data aligns with EU law, raising a critical question: can such broad data retention ever truly comply with human rights? While the Advocate General suggests this is possible, the situation may be more nuanced.
Background
The confidentiality of communications sent through public electronic communications networks is guaranteed under EU law, specifically by the Privacy and Electronic Communications Directive (Directive 2002/58). The directive’s scope is limited to activities falling under the Treaty on the Functioning of the European Union (TFEU), excluding matters covered at the time by Titles V and VI of the Treaty on European Union (TEU), such as public security and defence. Even within the directive’s scope, Article 15 allows Member States to restrict the rights it grants
‘when such restriction constitutes a necessary, appropriate and proportionate measure within a democratic society to safeguard national security, defence, public security, and the prevention, investigation, detection and prosecution of criminal offences or of unauthorised use of the electronic communication system…’.
Specifically, Member States could legislate for the retention of communications data (the details of a communication, but not its content) relating to the population at large. The subsequent Data Retention Directive laid down common rules on retention periods and safeguards, which Member States then implemented, in some cases facing legal challenges.
Following the Data Retention Directive’s invalidation, the legality of Member States’ data retention laws became unclear. This led Tele2 and Watson, along with then-Conservative MP David Davis, to challenge their respective national data retention regimes, arguing that such systems were incompatible with the standards set in Digital Rights Ireland. The Tele2 case involved Swedish legislation implementing the Data Retention Directive, while the Watson case concerned UK legislation enacted after the directive’s invalidation: the Data Retention and Investigatory Powers Act 2014 (DRIPA). Due to their similarities, the cases were joined.
The Swedish referral questioned the compatibility of general traffic data retention laws with EU law, asking further questions about the Swedish regime’s specifics. Watson et al posed two questions: whether Digital Rights Ireland established requirements applicable to national regimes, and whether Articles 7 and 8 of the EU Charter of Fundamental Rights (EUCFR) set stricter requirements than Article 8 of the European Convention on Human Rights (ECHR) - the right to private life. Though Watson et al involves the UK, the CJEU’s decision remains relevant post-Brexit. CJEU case law dictates that non-Member States’ data protection laws must closely align with EU law to facilitate data flows.
Opinion of the Advocate General
The Advocate General first addressed the question on the EUCFR’s scope of protection, deeming it inadmissible because an answer was not necessary to resolve the dispute. He nonetheless confirmed that Article 52(3) EUCFR, which requires EUCFR rights corresponding to ECHR rights to be given the same meaning as under the ECHR, sets a minimum level of protection, not a ceiling: as Article 52(3) itself makes explicit, EU law may provide more extensive protection.
Article 8 EUCFR, which grants a right to data protection, has no direct ECHR equivalent. The Advocate General therefore argued that the consistent-interpretation rule in Article 52(3) EUCFR does not apply to Article 8 EUCFR. He later dismissed the argument that Digital Rights Ireland was inapplicable because Watson et al concerned a national regime rather than one established by the EU legislature: Articles 7, 8 and 52 EUCFR, as interpreted in Digital Rights Ireland, are equally at issue here, so that judgment remains relevant even though the measures under challenge are national rather than EU legislation.
Next, the Advocate General considered whether EU law permits Member States to establish general data retention regimes. The first question was whether Article 1(3) excluded such regimes from the scope of Directive 2002/58, on the basis that the sole purpose of the retained data was national security and the other grounds mentioned in Article 1(3). The Advocate General responded with three points:
Since Article 15(1) specifically anticipated data retention regimes, national laws establishing such regimes were actually implementing Article 15(1).
The governments’ argument centered on public authorities’ access to data, while national schemes concerned data acquisition and retention by private entities. The former potentially falling outside the directive doesn’t imply the latter does as well.
The Court’s approach in Ireland v Parliament and Council, which challenged the Data Retention Directive’s enacting Treaty provision, implies that general data retention obligations “do not fall within the sphere of criminal law.”
The next question was whether Article 15 of Directive 2002/58 applied. Since its wording expressly refers to data retention, such retention is not inherently incompatible with the directive; rather, the intention was to subject retention measures to safeguards, so that data retention is lawful provided the scheme complies with them. Following his earlier reasoning, the Advocate General rejected the argument that Article 15 is a derogation requiring restrictive interpretation.
This leads to whether adequate safeguards are in place. As the Advocate General considered general data retention regimes as Member States implementing Article 15, such measures fall under EU law. Thus, according to Article 51 EUCFR, the Charter applies, even if rules about authorities accessing data fall outside EU law. However, given the link between access and retention, constraints on access are important when assessing the data retention regime’s proportionality.
Assessing compliance with the EUCFR first requires identifying an interference with protected rights. Citing Digital Rights Ireland, the Advocate General acknowledged that “[g]eneral data retention obligations are in fact a serious interference” with the rights to privacy (Article 7 EUCFR) and data protection (Article 8 EUCFR). Any justification for such interference must satisfy both Article 15(1) of Directive 2002/58 and Article 52(1) EUCFR, which sets out the conditions under which Member States may limit EUCFR rights. The Advocate General identified six requirements flowing from those provisions:
Legal basis for retention;
Observance of the essence of the rights in the EUCFR (a requirement of Article 52(1) EUCFR only, not of Article 15 of the directive);
Pursuit of a general interest objective;
Appropriateness for achieving that objective;
Necessity for achieving that objective; and
Proportionality within a democratic society to the pursuit of the objective.
Regarding the legal basis requirement, the Advocate General argued for explicitly applying the “quality” considerations from ECHR jurisprudence within EU law. These include accessibility, foreseeability, adequate protection against arbitrary interference, and being binding on relevant authorities. National courts are responsible for these factual assessments.
The Advocate General considered the “essence of the rights” requirement, interpreted in light of Digital Rights Ireland, unproblematic: the data retention regime did not grant access to the content of communications, and the retained data had to be stored securely. A general interest objective is easily demonstrated: combating serious crime and safeguarding national security. The Advocate General rejected, however, arguments that fighting ordinary (non-serious) crime or ensuring the smooth conduct of proceedings outside the criminal context could justify a general retention obligation. As to appropriateness, data retention gives national authorities “an additional means of investigation to prevent or shed light on serious crime,” and general measures are particularly useful because they allow authorities to examine the communications of persons of interest from before those persons were identified. Such measures are therefore appropriate to the objective.
A measure must be necessary, meaning “no other measure exists that would be equally appropriate and less restrictive.” Furthermore, according to Digital Rights Ireland, privacy right derogations and limitations apply only when strictly necessary. The first question was whether a general data retention regime can ever be necessary. The Advocate General argued that Digital Rights Ireland only addressed a system with insufficient safeguards, lacking a definitive statement that general data retention is inherently unnecessary. While the lack of differentiation was problematic in Digital Rights Ireland, the Court “did not, however, hold that that absence of differentiation meant that such obligations, in themselves, went beyond what was strictly necessary.” The Court’s examination of safeguards in Digital Rights Ireland suggests it didn’t view general data retention regimes as inherently unlawful. Based on this, the Advocate General opined:
“a general data retention obligation need not invariably be regarded as, in itself, going beyond the bounds of what is strictly necessary for the purposes of fighting serious crime. However, such an obligation will invariably go beyond the bounds of what is strictly necessary if it is not accompanied by safeguards concerning access to the data, the retention period and the protection and security of the data.”
Comparing this measure’s effectiveness to others must occur within the relevant national regime, considering that generalized data retention allows for “examining the past.” The applicable test isn’t about utility but whether other measures or combinations can be as effective.
The focus then shifts to safeguards, particularly whether those identified in Digital Rights Ireland (paragraphs 60-68) are mandatory for all regimes. These rules concern:
Access to and use of retained data by relevant authorities;
The data retention period; and
Data security and protection during retention.
Contrary to arguments from various governments, the Advocate General argued that “all the safeguards described by the Court in paragraphs 60 to 68 of Digital Rights Ireland must be regarded as mandatory.” Firstly, the Court didn’t mention compensating for one safeguard’s weakness by strengthening another. Furthermore, such an approach wouldn’t guarantee individuals protection from unauthorized access to and abuse of their data: each aspect identified requires protection. Strict access controls and short retention periods are meaningless if retained data security is weak, exposing the data. The Advocate General noted that the European Court of Human Rights, in Szabo v Hungary, emphasized these safeguards’ importance, citing Digital Rights Ireland.
While the Advocate General emphasized that assessing compliance falls to national courts, he noted these points:
National regimes are not sufficiently restrictive as to the purposes for which retained data may be accessed, allowing access for the fight against crime in general rather than only serious crime, the only objective capable of justifying a general retention regime.
Prior independent review, required by Digital Rights Ireland, is absent, despite being necessary due to the interference’s severity and the need to handle sensitive cases (e.g., legal profession) individually. The Advocate General acknowledged that emergency procedures may be acceptable in certain cases.
The retention period must be determined on the basis of objective criteria and limited to what is strictly necessary. In Zakharov, the European Court of Human Rights accepted a six-month period as reasonable, but required that data be deleted once no longer needed. That deletion obligation should feature in national regimes and apply both to the security services and to service providers.
The final question concerns proportionality, unaddressed in Digital Rights Ireland. The test is:
“a measure which interferes with fundamental rights may be regarded as proportionate only if the disadvantages caused are not disproportionate to the aims pursued.”
This brings into play the importance of the values at stake. The advantages of the system were already discussed in relation to necessity. As to the disadvantages, the Advocate General referred back to Digital Rights Ireland and noted:
“in an individual context, a general data retention obligation will facilitate interference that is just as serious as targeted surveillance measures, including those which intercept the content of communications”
and it can impact a large number of people. Given the volume of access requests, the risk of abuse is real. While balancing advantages and disadvantages is left to national courts, the Advocate General emphasized that even with all Digital Rights Ireland safeguards, considered the minimum, a regime could still be deemed disproportionate.
Comment
Interestingly, the Court of Appeal’s reference did not ask whether DRIPA complied with fundamental rights under the EUCFR. Instead, the questions seemed designed to preclude that possibility: first, by limiting the EUCFR’s scope to a particular interpretation of Article 8 ECHR; and second, by treating Digital Rights Ireland as merely a challenge to a directive’s validity, with no relevance to national measures. While the Advocate General did not answer the first question, his reasoning for dismissing it highlights the flaw in the Court of Appeal’s approach. It is difficult to see how Article 52(3), read in its entirety, could support “reading down” the EUCFR to the ECHR level. Article 52(3) states:
In so far as this Charter contains rights which correspond to rights guaranteed by the Convention for the Protection of Human Rights and Fundamental Freedoms, the meaning and scope of those rights shall be the same as those laid down by the said Convention. This provision shall not prevent Union law providing more extensive protection.
The second question’s focus was similarly misguided. As noted by the Advocate General, Digital Rights Ireland hinged on interpreting two EUCFR provisions: Articles 7 and 8. Their meaning should be consistent regardless of where they’re applied.
Clearly, the Advocate General avoids stating that mass surveillance, here in the form of general data retention rules, inherently conflicts with human rights. One of the Opinion’s key statements is that “a general data retention obligation imposed by a Member State may be compatible with the fundamental rights enshrined in EU law.” The focus then shifts to reviewing safeguards rather than declaring certain activities off-limits for Member States. This is a familiar debate in this area, as illustrated by the European Court of Human Rights’ case law (see Szabo, particularly the dissenting opinion).
Fine distinctions abound. For instance, the Advocate General relies on the distinction between metadata and content to reaffirm that the essence of Articles 7 and 8 remains intact. Yet, while he strives to maintain that general data retention might be permissible, tensions emerge. His point about the “essence of the right” assumed that collecting metadata is less intrusive than intercepting content; when assessing the impact of a general data retention regime, he implies the opposite. He even quotes Advocate General Cruz Villalón in Digital Rights Ireland, who said such surveillance allows the creation of:
“a both faithful and exhaustive map of a large portion of a person’s conduct strictly forming part of his private life, or even a complete and accurate picture of his private identity.”
The Advocate General concludes that:
“the risks associated with access to communications data (or ‘metadata’) may be as great or even greater than those arising from access to the content of communications.”
Another example involves the scope of EU law. The Advocate General separates access to collected data (a matter of policing and security) from the acquisition and storage of data (an activity of private entities). The data retention regime concerns the latter, whose activities fall within EU law. Here, the Advocate General follows the Court’s approach in Ireland v Parliament and Council, the challenge to the Data Retention Directive’s legal basis (which concluded that the directive was correctly based on what is now Article 114 TFEU). Having separated these aspects for the purposes of the scope of EU law, the Advocate General then brings them back together when assessing the acceptability of safeguards.
Regarding safeguards, the Advocate General firmly reaffirms the requirements in Digital Rights Ireland. All the safeguards mentioned are mandatory minimums, and shortcomings in one area cannot be offset by strengths elsewhere. If the Court agrees, this could have significant consequences for national regimes, especially as regards the need for prior independent review (save in emergencies). The Advocate General might even be seen as going further than either European court in this respect. Furthermore, he limits the permissible purposes of general data retention to serious crime only, in contrast, for example, to the approach to internet connection records in the UK’s Investigatory Powers Bill.
Another novel aspect is the discussion of lawfulness. As noted by the Advocate General, the Court of Justice hasn’t explicitly addressed this issue, although the lawfulness requirement is well-established in Strasbourg case law. While this might not seem particularly groundbreaking, he highlights that the law must be binding:
“[i]t would not be sufficient, for example, if the safeguards surrounding access to data were provided for in codes of practice or internal guidelines having no binding effect.”
Traditionally, many of the specifics of UK surveillance practice were found in codes. As the security services’ practices came to light, many were formalised as codes of practice under the relevant legislation (e.g. s. 71 of the Regulation of Investigatory Powers Act). Historically, however, not all were publicly available, binding documents.
While headlines may focus on the suggestion that general data retention could be acceptable, with the final assessment left to national courts, that possibility seems more theoretical than practical. The Advocate General goes beyond endorsing the principles of Digital Rights Ireland: even a regime meeting those safeguards might still be found disproportionate. Member States may not have wanted a checklist of safeguards, but even complying with the checklist may not be enough. Of course, the opinion is not binding; the Court may reach a different conclusion. A date for the judgment has not yet been announced.
Photo credit: choice.com.au
Barnard & Peers: chapter 9
JHA4: chapter II:7