Facial recognition in EU member states.
Although facial recognition appears to conflict with European legislation, the method is in practice applied in several ways at the national level. Its applications serve different purposes, such as security (in airports), the protection of public health, or personal entertainment (these issues are considered in more detail below).
France was the first country in the European Union to apply biometric data processing techniques, through the extensive use of cameras in the city of Nice following the terrorist attack there. Furthermore, in an effort to curb the spread of the Covid-19 pandemic, French police have used cameras with speakers to reprimand those violating Covid-19 rules. However, the French firm Datakalab, whose software is used for video surveillance in many French towns, maintains that personality rights and personal information are protected, since no image is stored or transmitted; in that sense, there is no facial recognition. It should certainly be noted that both the European Union and the United Nations encourage the use of digital tools and new technologies in the fight against Covid-19. In any case, it is explicitly declared that any contact-tracing or other modern digital application must meet all the safeguards for the respect of fundamental rights, especially data privacy. Hence, EU member states must adopt a necessary and proportionate data retention policy which conforms with the European data protection regulation (GDPR) and establishes strong safeguards against the stigmatization of infected persons or their close contacts.
In the context of the digitalization of services and e-government, it should be highlighted that France is the first EU country with a facial recognition ID system. Through its legislation on facial recognition, the French government launched a project called “Alicem” (Authentification en LIgne CErtifiée sur Mobile), which uses facial recognition on users’ smartphones to allow them to connect to government service applications. Its primary goal is to provide French citizens and legal residents with a secure and valid digital identity. According to the Ministry of the Interior, Alicem will comply with the “high” security level defined by the European eIDAS (electronic IDentification, Authentication and trust Services) regulation and is in the process of certification by ANSSI (Agence nationale de la sécurité des systèmes d'information – National agency for information systems security). However, France’s data regulator has condemned the project as violating the provisions of the GDPR, in particular the requirement of freely given consent.
It should also be noted that the method under consideration has been considered for use at airports for security and crime prevention purposes. However, there is no specific legislation regulating the legality of the use of facial recognition there. The opinion of the French Data Protection Agency (CNIL) is enlightening on these issues, as it provides a clear legal framework under which facial recognition can be considered legitimate. CNIL indicates that the GDPR should govern the application of facial recognition in airports. The principles of necessity and proportionality should be taken into account in order to prevent any damage to public security. Of course, the protection of privacy and the completion of data protection impact assessments are required. Furthermore, all the principles of lawful data processing must be applied, such as accuracy, storage limitation, integrity and confidentiality, and accountability. CNIL's position regarding obtaining prior valid consent is of great interest. According to this guidance, consent should be the legal basis for processing, and should therefore meet the requirements for consent under the GDPR.
Furthermore, CNIL sets out some additional requirements:
airports should provide an alternative to individuals who do not consent to the use of facial recognition technology;
airports should also allow individuals to withdraw their consent;
consent should not be tied to or mixed with the acceptance of the terms and conditions of a ticket;
individuals should receive enhanced information about the use of facial recognition technology and its alternative(s); and
facial recognition technology should be used only on individuals who have provided their prior consent (for example, by blurring the faces of other individuals in the background and indicating the control zones).
However, in our opinion, prior valid consent is not necessary for the lawfulness of facial recognition. The prior information of passengers could constitute the appropriate legal basis, since “processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”, according to Article 6(1)(e) of the GDPR. Moreover, Directive (EU) 2016/681, widely known as the Passenger Name Record (PNR) Directive, has been implemented; it provides for the transfer by air carriers of passenger name record (PNR) data for passengers on both extra-EU and intra-EU flights. While the fundamental rights of passengers and the safeguards for lawful processing must be respected, the purposes of security and the prevention of criminal acts take precedence.
It becomes obvious that the GDPR's provisions offer safe guidance for the application of facial recognition. Under those conditions, French courts declared facial recognition in schools illegal under the GDPR, regardless of whether the students' prior consent had been obtained. CNIL affirmed the decision, drawing attention to less intrusive alternative means, such as badge control. In the same vein, the Swedish Data Protection Authority (DPA) has fined a municipality 200,000 SEK (approximately 20,000 euros) for using facial recognition technology to monitor the attendance of students in schools.
However, facial recognition seems to be legitimate and legal for the purposes of public security. Hence, in October 2019, the Swedish DPA approved its use for criminal surveillance, finding it legal and legitimate (subject to clarification of how long the biometric data would be kept). Similarly, the UK DPA has advised police forces to “slow down” owing to the volume of unknowns, but has stopped short of calling for a moratorium. UK courts have not shared their DPA's concerns about facial recognition, despite citizens' fears that it is highly invasive. In the only European ruling so far, the High Court in Cardiff found police use of public facial surveillance cameras to be proportionate and lawful, while accepting that this technology interferes with the right to privacy.
In accordance with the aforementioned rationale, Greece has recently issued Presidential Decree 75/2020, which authorizes the installation and operation of surveillance systems that capture or record audio or video in public places. The prevention of criminal acts, as well as traffic management (including dealing with road network emergencies, regulating vehicle traffic and preventing road accidents), are defined as the legal grounds for the installation and use of surveillance systems. Furthermore, the principles of justification and proportionality must be observed. Accordingly, sufficient indications are required to demonstrate either the present or the probable future commission of criminal offences. Sufficient evidence is established by reference to factual data such as, in particular, statistical or empirical data, studies, reports, testimonies, and information on the frequency, type and specific characteristics of crimes committed in a particular area, as well as, on the basis of those elements, the probable spread or displacement of crime to another public place. Surveillance is deemed necessary when, in the light of the above facts, a reasonable belief is formed that serious public safety risks exist in those public areas. Prior authorization by the judicial authorities is also necessary in the case of public gatherings. The data collected must be erased 48 hours after the end of the event, unless there are serious reasons for the investigation of criminal acts, in which case the erasure period may be extended up to 15 days.