European Data Protection Authorities should reconsider their approach to Facial Recognition Technology in schools, which rests on presumed power imbalances and doubts about the validity of consent.

Dr. Asress Adimi Gikay is a lecturer in AI, Disruptive Innovation, and Law at Brunel Law School, Brunel University London. His research focuses on Artificial Intelligence, Law, and Policy, and he teaches courses on Artificial Intelligence and the Law, and on the Law, Policy and Governance of Artificial Intelligence (https://www.brunel.ac.uk/people/asress-gikay). Twitter: https://twitter.com/DrAsressGikay

Facial Recognition Technology in Schools

In the digital age, large technology companies like Facebook, Google, and Apple exert significant control over our personal information and privacy. These companies possess a vast amount of data about us, including our physical locations, contact details, online communications, personal preferences, financial situations, and other sensitive information that we might not even share with close acquaintances. Children are particularly vulnerable to this pervasive surveillance. In the UK, for instance, children as young as 13 can legally consent to the processing of their personal data, including the use of facial recognition technology, by these companies. In many cases, these companies might know more about our children’s preferences than we do. Unfortunately, society lacks effective mechanisms to regulate these powerful entities that have become deeply embedded in our lives. Surveillance has become the norm, and we have limited means to safeguard ourselves from this intrusion.

Despite the pervasive presence of Big Tech in our lives, the introduction of facial recognition technology (FRT) in European schools has generated significant alarm among citizens, advocacy groups, and Data Protection Authorities (DPAs). This alarm has triggered a strong response from DPAs, leading to consistent efforts to block the technology’s deployment in schools due to concerns surrounding privacy violations and potential breaches of the General Data Protection Regulation (GDPR).

Facial recognition is an AI-powered process that identifies or verifies individuals from facial images or videos. The technology works by comparing a captured digital image or video against stored biometric data to determine whether there is a match. FRT has already been used in schools for purposes such as attendance tracking in Sweden, access control in France, and even payment processing in UK school cafeterias. While schools have cited potential benefits such as increased efficiency and security, DPAs and, in the case of France, the Administrative Court of Marseille have intervened to halt the use of FRT due to serious privacy concerns.
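To make the matching step concrete, here is a minimal sketch of how a verification system typically decides whether two faces belong to the same person: a model reduces each face to a numeric embedding, and a match is declared when the similarity between a captured embedding and an enrolled template exceeds a threshold. The vectors, the verify helper, and the 0.6 threshold below are illustrative assumptions, not the workings of any particular system deployed in schools.

    import numpy as np

    # Illustrative only: real systems derive embeddings from a trained
    # neural network; here they are stand-in numeric vectors.

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Return how alike two face embeddings are (1.0 = same direction)."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(captured: np.ndarray, enrolled: np.ndarray,
               threshold: float = 0.6) -> bool:
        """Declare a match when similarity clears a tunable threshold.

        The threshold trades false accepts against false rejects;
        0.6 is an arbitrary illustrative value, not a recommended setting.
        """
        return cosine_similarity(captured, enrolled) >= threshold

    # Toy usage with made-up embedding vectors.
    enrolled_template = np.array([0.12, 0.85, -0.33, 0.41])
    captured_probe = np.array([0.10, 0.80, -0.30, 0.45])
    print("match" if verify(captured_probe, enrolled_template) else "no match")

The privacy stakes arise largely from the enrolled templates: once stored, they are biometric data capable of identifying a person indefinitely, which is part of why their collection and retention feature so heavily in the regulatory concerns discussed below.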

Although school authorities have often relied on obtaining explicit consent from students or their legal guardians to justify the use of FRT, DPAs have rejected this argument. They highlight the inherent power imbalance between school authorities and students, suggesting that genuine, freely given consent cannot be guaranteed in such a situation. This raises the crucial question of whether public institutions like schools should be permitted to use FRT even with explicit consent, and if not, what alternatives should be explored.

The Concerns about FRT in Schools and the Fix

Scholars and advocacy groups have expressed concerns that using FRT, particularly when processing children’s data, presents significant risks. These risks include the potential misuse of biometric information by both companies involved in providing or utilizing the technology and malicious actors like hackers. Additionally, there’s concern that normalizing surveillance, particularly among young people, could have long-term consequences for privacy expectations. DPAs, emphasizing the potential for disproportionate intrusion on student privacy, advocate for less invasive technological alternatives and strict assessments of the legal basis for using FRT, including the legitimacy of any consent obtained.

In a 2019 decision fining the secondary school board of Skellefteå Municipality, the Swedish DPA held that, even though the school had obtained explicit consent to use FRT to monitor student attendance, the inherent power imbalance between the school and its students invalidated that consent. Similarly, the Administrative Court of Marseille, upholding the French DPA’s stance, determined that school councils had not demonstrated sufficient measures to ensure freely given and informed consent from students for using FRT for access control. In the UK, the Information Commissioner’s Office (ICO) urged schools in North Ayrshire to reconsider plans to replace fingerprint payment systems in canteens with facial recognition, advocating for less intrusive methods. Consequently, the schools were compelled to postpone the implementation of the technology.

These decisions strongly suggest that the mere presence of a power dynamic between the data controller and the data subject is sufficient to invalidate explicit consent, especially when dealing with sensitive biometric information. This is evidenced by the UK schools’ decision to suspend FRT implementation solely based on a letter from the ICO, indicating an unwillingness to risk potential GDPR breaches and sanctions.

While documents obtained through a Freedom of Information request from the North Ayrshire Council revealed shortcomings in the consent process, the Council’s overall approach to data protection compliance appeared reasonable. The ICO’s intervention, however, effectively barred it from seeking valid consent. This stance was echoed in a House of Lords debate in which opposition to using FRT in schools was voiced, emphasizing the ethical concerns of involving children in such technologies. This illustrates the mounting pressure to categorically ban FRT in schools, a ban that is, for all practical purposes, already in effect in Europe despite the absence of specific legislation.

Imbalance of Power under the GDPR

The GDPR generally prohibits the processing of sensitive personal data, including biometric data like facial images. However, it outlines exceptions where processing such data is permitted. One exception allows for processing biometric data to uniquely identify an individual if explicit consent has been granted for specific, clearly defined purposes. This consent must be given through a clear, affirmative action, demonstrating a freely given, specific, informed, and unambiguous agreement to the data processing.

Demonstrating freely given consent becomes particularly difficult when a power relationship exists, as highlighted by the DPAs’ conclusions in the Swedish and French cases. The GDPR acknowledges this challenge, stating that consent obtained under such circumstances, especially when a public authority acts as the data controller, is unlikely to be truly freely given and therefore might not constitute a valid legal basis for data processing.

While the GDPR empowers DPAs and courts to consider power imbalances when assessing the validity of consent, it does not impose a blanket ban on public authorities relying on explicit consent to process personal data. The European Data Protection Board’s guidelines reiterate this point, stating that consent as a lawful basis for data processing by public authorities is not entirely excluded under the GDPR framework. Regulators and courts therefore have the authority to scrutinize, after the fact, whether explicit consent was obtained fairly and transparently, but they cannot treat consent as invalid solely because of a perceived power imbalance. If a European Union Member State wishes to prohibit the use of consent for processing specific categories of personal data entirely, the GDPR allows for such legislation. Absent such legislation, however, the validity of consent within a power dynamic must be evaluated on a case-by-case basis rather than through a categorical ban. Schools, therefore, should be permitted to demonstrate that the presumed power imbalance did not influence the consent-gathering process.

The Current Approach Should Be Reexamined

The concerns surrounding FRT’s intrusive nature, raised by scholars and privacy advocates, should be considered within the context of the data protection and privacy safeguards established by the GDPR. This regulation contains provisions ensuring that personal data is used only for its intended purpose, kept secure, and not shared with third parties without explicit consent from the data subject. Given these safeguards, the current approach, which leans towards a de facto ban on FRT in schools based on privacy concerns and an assumption of inherent power imbalance invalidating consent, appears unreasonable and requires reevaluation.

Firstly, this approach disproportionately disadvantages smaller entities and public institutions relative to large technology companies. Big Tech firms often enjoy significant leeway in handling personal data, while smaller organizations face stricter scrutiny and potential sanctions. The result is an uneven playing field in which the most powerful data collectors receive the least oversight despite potentially engaging in questionable data practices.

Secondly, an overly cautious approach to data innovation driven by an exaggerated fear of privacy violations could hinder technological progress. While addressing legitimate concerns about FRT is essential, a balanced approach that allows for responsible innovation while safeguarding privacy is crucial. Preventing smaller companies and institutions from harnessing data-driven technologies like FRT while Big Tech companies continue to amass data with less oversight is counterproductive.

In conclusion, while protecting student privacy remains paramount, the current approach to FRT in schools, which effectively functions as a ban, appears excessive. A more balanced approach is needed—one that acknowledges and addresses the legitimate concerns about FRT’s potential for misuse while also recognizing its potential benefits and the importance of fostering data innovation in a responsible and ethical manner. This balanced approach requires a nuanced understanding of consent in the context of power dynamics, ensuring that safeguards are in place to prevent coercion while not stifling the potential of technological advancements for the benefit of all stakeholders.

Art credit: Peder Severin Krøyer, via Wikimedia Commons

Licensed under CC BY-NC-SA 4.0