
This article has been written by Eascham, pursuing the Diploma in International Data Protection and Privacy Laws from LawSikho. This article has been edited by Zigishu Singh (Associate, LawSikho) and Smriti Katiyar (Associate, LawSikho).

Introduction

A lawsuit filed before the French Administrative Court of Marseille seeking the removal of biometric facial recognition technology (FRT) from two high schools was successful. The court invalidated the deliberation of the Provence-Alpes-Côte d’Azur (PACA) Regional Council and canceled the experiment.

This case is significant because it is the first French administrative court decision applying the General Data Protection Regulation (GDPR) to AI-based biometric technology. In a democratic society, where fundamental rights must be kept in mind, the case gives clearer insight into when facial recognition is necessary and when it is needless and unwarranted.


Facts of the case 

In October 2017, the President of the Provence-Alpes-Côte d’Azur (PACA) Regional Council sought the assistance of the CNIL (Commission nationale de l’informatique et des libertés, France’s data protection authority) to conduct a series of facial recognition tests at the entrances of two high schools located in Nice and Marseille (in the south of France), in order to grant or refuse students access.

Even though the CNIL did not authorize it, the PACA Regional Council ignored the warning and, by a deliberation dated 14 December 2018, went ahead with the plan, labeling it an experiment.

Several French data protection, human rights, and civil rights associations and NGOs, such as La Ligue des Droits de l’Homme, La Quadrature du Net, and the federation of parents of pupils of the public schools of the Alpes-Maritimes, protested the experiment, claiming that it was illegal because it violated the GDPR. They therefore filed an action for annulment of PACA’s deliberation before the French Administrative Court of Marseille on 14 February 2019.

On 27 February 2020, the French Administrative Tribunal annulled the facial recognition experiment, concluding that the biometric data of underage children had been processed without any legal basis.

Objective of the experiment

The experiment was intended to assist the staff of the high schools. It aimed to control and speed up the entry of students and to regulate access to the premises by occasional visitors, so that only authorized people could enter. Another objective was to prevent identity card theft or misuse.

The FRT that was to provide these security measures comprised virtual access-control devices: cameras that would recognize high school students, grant them access, and be able to follow the trajectory of people on the premises.

The role of CNIL

The CNIL, the French data protection authority (DPA), is an independent administrative authority that operates in conformity with the French Data Protection Act of 6 January 1978 (amended in August 2004). It has many powers, including advisory, investigative, and administrative sanctioning powers.

The CNIL’s role is to analyze the repercussions of innovations and technologies on citizens’ fundamental rights to privacy and liberty. It is also responsible for sharing information and raising awareness of data protection culture. DPAs ensure that information technology remains at the service of citizens, who should benefit from it, and that technology does not undermine human rights, privacy, or civil liberties in the name of service.

The CNIL is a member of the European Data Protection Board (EDPB), which was established by the GDPR under Articles 68 to 76.

GDPR

The General Data Protection Regulation is a legal framework ensuring the data protection and privacy of citizens (data subjects) in the European Union (EU) and the European Economic Area (EEA). It is a Regulation on the protection of natural persons with regard to the processing of personal data and on the free movement of such data.

Reasons why the CNIL, the data protection authority, did not authorize the experiment

The CNIL disagreed with PACA’s decision to carry out the experiment. It was convinced that FRT was unnecessary and a highly intrusive biometric mechanism that processed and stored sensitive personal data.

The experiment posed a considerable risk to citizens’ privacy and civil liberties. The CNIL emphasized that the people involved in the experiment (the data subjects) were minors, making matters all the more sensitive.

The GDPR requires a necessity and proportionality test as well as data minimization, but the experiment ran contrary to both principles.

Necessity and proportionality test

Necessity is the fundamental test for assessing the validity of the processing of personal data. It is an essential criterion for ensuring that any restriction of the fundamental right to the protection of personal data is reasonable.

Proportionality is a principle of EU law. Applied to the fundamental right to the protection of personal data, it ensures that the advantages of limiting the right are not outweighed by the disadvantages of that limitation. In short, any limitation or restriction of the right must be reasonable and justified.

Therefore, both necessity and proportionality require authorities to strike a balance between the means used and the intended aim. 

While the objective of the FRT was to increase security and the fluidity of traffic, the CNIL firmly believed that other, less intrusive means already existed (such as badge/ID control) that could achieve the same result.

The Court’s decision of 27 February 2020

The decision of 27 February 2020 confirmed the CNIL’s interpretation and analysis of the GDPR in its entirety. The Administrative Court of Marseille invalidated the decision of the PACA Regional Council.

The court justified its decision on the following basis:

  1. Consent, a legal basis for processing

Article 7 of the GDPR sets out the conditions for consent. For consent to be legally valid (Recital 32), it must be a freely given, specific, informed, and unambiguous indication of the data subject’s agreement to the processing of personal data.

In the present case, however, the PACA Regional Council tried to justify the processing of the biometric data on the basis of consent obtained from the students, and from the legal representatives of minor students, through a simple form signed by high school students who stood in a subordinate position to their respective school directors. For students under such direct authority, it cannot be guaranteed that the consent was freely given and an informed choice.

The Court concluded that a simple signature on a form is not a sufficient basis for valid consent under the GDPR.

The French Act of 20 June 2018, which amended the Data Protection Act in line with European texts, states that in the absence of consent, an operator, whether public or private, may implement biometric processing only if it has first been authorized by law.

  2. On the proportionality test

The Court adopted the CNIL’s interpretation of the proportionality test under the GDPR and concluded that the PACA Council had not shown that access control by badge or ID card, combined with video surveillance (CCTV), was insufficient or incapable of achieving the same results as FRT.

The proportionality criteria were not satisfied because other, less intrusive, and more appropriate means existed that could deliver the same results as the FRT. The FRT experiment therefore violated Article 9 of the GDPR and could not be justified by the exceptions set out in paragraph 2 of that Article.

  3. PACA’s authority to conduct the experiment

Responsibility for school safety lies with the head of each school, not with the PACA Region, so the latter acted ultra vires by engaging in such an FRT experiment.

  4. Processing of minors’ personal data

The Court remained silent on the processing of the biometric data of students who were predominantly minors. Minors are a vulnerable group within the meaning of the GDPR, yet there remains a void in certain provisions of the GDPR and the French Data Protection Act concerning the processing of their sensitive biometric data.

Conclusion

Facial recognition falls under the category of biometric technology. Biometrics are AI-driven, automated processes that can assess physical characteristics such as fingerprints, blood vessel patterns, and iris structure, and in certain cases physiological and behavioral characteristics as well. The GDPR defines these characteristics as “biometric data”, which falls under sensitive personal data. “Sensitive” data under the GDPR also includes data concerning health or sex life, political opinions, and religious beliefs.

With present technology, biometrics, no matter how legitimate, remain prone to cyber-attacks, which can have particularly serious consequences. Like other biometric techniques, facial recognition is therefore never a completely harmless type of data processing. Hence, a strict legal framework has been reinforced by the GDPR and national data protection laws.

Furthermore, data protection principles cannot be set aside even at the experimental stage, especially where the sensitive data of “vulnerable persons” such as minors, within the meaning of the GDPR, is concerned. Given the complexity of ensuring compliance when deploying facial recognition tools, controllers are expected to demonstrate a high level of compliance with data protection principles.


