12.5 Biometric Recognition Technology

12.5.1 Access Control 

The following example, published by the ICO, provides useful clarity and applies equally to fingerprint recognition used for access control. 

A gym introduces a facial recognition system to allow members access to the facilities. It requires all members to agree to facial recognition as a condition of entry – there is no other way to access the gym. This is not valid consent as the members are not being given a real choice – if they do not consent, they cannot access the gym. Although facial recognition might have some security and convenience benefits, it is not objectively necessary in order to provide access to gym facilities, so consent is not freely given. 

However, if the gym provides an alternative, such as a choice between access via facial recognition and access via a membership card, consent could be considered freely given. The gym could rely on explicit consent for processing the biometric facial scans of the members who indicate that they prefer that option.

12.5.2 Information Commissioner’s Opinion 18th June 2021

Executive Summary 

Facial recognition technology (FRT) relies on the use of people’s personal data and biometric data. Data protection law therefore applies to any organisation using it. Live facial recognition (LFR) is a type of FRT that often involves the automatic collection of biometric data. This means it has greater potential to be used in a privacy-intrusive way. 

The Commissioner previously published an Opinion on the use of LFR in a law enforcement context. It concluded that data protection law sets high standards that must be met for the use of LFR in public places to be lawful. The Information Commissioner’s Office (ICO) has built on this work by assessing and investigating the use of LFR outside of law enforcement. This has covered controllers who are using the technology for a wider range of purposes and in many different settings. 

This work has informed the ICO’s view on how LFR is typically used today, the interests and objectives of controllers, the issues raised by the public and wider society and the key data protection considerations. The Commissioner has published this Opinion to explain how data protection law applies to this complex and novel type of data processing.

1.1 What is facial recognition technology?

Facial recognition is the process by which a person can be identified or otherwise recognised from a digital facial image. Cameras are used to capture these images and FRT software produces a biometric template.

Often, the system will then estimate the degree of similarity between two facial templates to identify a match (e.g. to verify someone’s identity), or to place a template in a particular category (e.g. age group). FRT can be used in a variety of contexts from unlocking our mobile phones, to setting up a bank account online, or passing through passport control. It can help make aspects of our lives easier, more efficient and more secure.
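As a simplified illustration of the matching step described above, the sketch below compares two hypothetical facial templates using a similarity score and a decision threshold. The template values, the 0.8 threshold and the `cosine_similarity` and `verify` helpers are assumptions for illustration only; real systems use proprietary models and vendor-specific thresholds.

```python
# Illustrative sketch only: a simplified view of how an FRT system might
# compare two biometric templates (numeric feature vectors derived from
# facial images) to verify an identity. The template values, threshold
# and helper names are hypothetical, not taken from the ICO Opinion or
# any specific product.
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Estimate the degree of similarity between two facial templates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def verify(probe: list[float], enrolled: list[float], threshold: float = 0.8) -> bool:
    """'One-to-one' check: does a newly captured image match the enrolled identity?"""
    return cosine_similarity(probe, enrolled) >= threshold


# Hypothetical templates: one enrolled when the user registered,
# one produced from a fresh capture (e.g. unlocking a phone).
enrolled_template = [0.12, 0.85, 0.33, 0.41]
probe_template = [0.10, 0.82, 0.35, 0.44]
print("Match" if verify(probe_template, enrolled_template) else "No match")
```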

1.2 What is live facial recognition? 

The uses of FRT referenced above typically involve a ‘one-to-one’ process. The individual participates directly and is aware of why and how their data is being used. LFR is different and is typically deployed in a similar way to traditional CCTV. It is directed towards everyone in a particular area rather than specific individuals. It has the ability to capture the biometric data of all individuals passing within range of the camera automatically and indiscriminately. Their data is collected in real-time and potentially on a mass scale. There is often a lack of awareness, choice or control for the individual in this process.
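The contrast with ‘one-to-one’ verification can be made concrete with a sketch of ‘one-to-many’ matching, in which each captured face is automatically compared against every entry on a watchlist. The watchlist contents, the threshold and the `identify` function below are illustrative assumptions, not a description of any deployed system.

```python
# Illustrative sketch only: the 'one-to-many' matching typical of LFR,
# where every face captured from a camera feed is automatically compared
# against a watchlist of enrolled templates. Watchlist entries, threshold
# and function names are hypothetical assumptions.
from typing import Optional

Template = list[float]


def similarity(a: Template, b: Template) -> float:
    """Simplified (cosine) similarity score between two facial templates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(x * x for x in b) ** 0.5)
    return dot / norm


def identify(captured: Template, watchlist: dict[str, Template],
             threshold: float = 0.8) -> Optional[str]:
    """Return the best watchlist match above the threshold, or None."""
    best_id, best_score = None, threshold
    for person_id, enrolled in watchlist.items():
        score = similarity(captured, enrolled)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id


# Unlike one-to-one verification, this comparison runs for every face the
# camera detects, whether or not the person is on the watchlist.
watchlist = {"subject-001": [0.11, 0.80, 0.30], "subject-002": [0.70, 0.20, 0.60]}
for captured_face in ([0.10, 0.82, 0.29], [0.90, 0.05, -0.40]):
    match = identify(captured_face, watchlist)
    print(match if match else "no match against the watchlist")
```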

1.3 Why is biometric data particularly sensitive? 

Biometric data is data that allows individuals to be recognised based on their biological or behavioural characteristics, such as data extracted from fingerprints, irises or facial features. It is particularly sensitive for several reasons:

  • It is more permanent and less alterable than other personal data; it cannot be changed easily.
  • Biometric data extracted from a facial image can be used to uniquely identify an individual in a range of different contexts. It can also be used to estimate or infer other characteristics, such as their age, sex, gender or ethnicity.
  • The UK courts have concluded that, ‘like fingerprints and DNA’, a facial biometric template is information of an ‘intrinsically private’ character.
  • LFR can collect this data without any direct engagement with the individual.

With any new technology, building public trust and confidence is essential to ensuring that its benefits can be realised. Given that LFR relies on the use of sensitive personal data, the public must have confidence that its use is lawful, fair, transparent and meets the other standards set out in data protection legislation.

1.4 How is LFR used?

The ICO has assessed or investigated 14 examples of LFR deployments and proposals (as summarised in this Opinion), as well as conducting wider research and engagement in the UK and internationally. 

Controllers often use LFR for surveillance purposes, aiming to prevent crime or other unwanted behaviours in physical retail, leisure and transport settings or other public places. LFR can identify particular individuals entering the premises and allow the controller to take action (e.g. removing them). The ICO has also seen an increasing appetite to use LFR for marketing, targeted advertising and other commercial purposes. This can involve using an individual’s biometric data to place them in a particular category. 

In the longer term, the technology has the potential to be used for more advanced practices. This could include integration with big data ecosystems which combine large datasets from multiple sources, such as social media. We are investigating some examples of FRT systems where images captured from online sources are being used to identify individuals in other contexts. 

Based on these examples, this Opinion focuses on the use of LFR for the purposes of identification and categorisation. It does not address verification or other ‘one-to-one’ uses. It defines public places as any physical space outside a domestic setting, whether publicly or privately owned. But it acknowledges that the nature and context of such places may be very different, as will the public’s expectations of privacy in different settings. This Opinion does not address the online environment.

1.5 What are the key data protection issues involved in LFR?

The Commissioner has identified a number of key data protection issues which can arise where LFR is used for the automatic collection of biometric data in public places. These have been identified through the ICO’s investigations, our work reviewing data protection impact assessments (DPIAs) and wider research. These issues include:

  • the governance of LFR systems, including why and how they are used
  • the automatic collection of biometric data at speed and scale without clear justification, including of the necessity and proportionality of the processing
  • a lack of choice and control for individuals
  • transparency and data subjects’ rights
  • the effectiveness and the statistical accuracy of LFR systems
  • the potential for bias and discrimination
  • the governance of watchlists and escalation processes
  • the processing of children’s and vulnerable adults’ data and
  • the potential for wider, unanticipated impacts for individuals and their communities.

Other parties, including international organisations and civil society groups, have raised further issues about LFR, including ethical, equalities and human rights concerns. This Opinion sets out where such issues may be relevant to data protection analysis, for example, where bias in facial recognition algorithms could lead to unfair treatment of individuals.
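To illustrate why statistical accuracy and algorithmic bias matter at the scale on which LFR typically operates, the short calculation below uses purely hypothetical false match rates for two demographic groups; the figures are invented for arithmetic clarity and do not describe any real deployment.

```python
# Hypothetical illustration only: how differences in statistical accuracy
# between demographic groups can translate into unfair treatment at scale.
# All figures are assumptions chosen for arithmetic clarity, not
# measurements of any real system.
scans_per_group_per_day = 10_000     # faces scanned from each group per day (assumption)
false_match_rate_group_a = 0.001     # assumed 0.1% false match rate for one group
false_match_rate_group_b = 0.005     # assumed 0.5% false match rate for another group

false_alerts_a = scans_per_group_per_day * false_match_rate_group_a
false_alerts_b = scans_per_group_per_day * false_match_rate_group_b

print(f"Group A: roughly {false_alerts_a:.0f} people wrongly flagged per day")
print(f"Group B: roughly {false_alerts_b:.0f} people wrongly flagged per day")
# Under identical deployment conditions, members of the second group would
# be wrongly flagged about five times as often as members of the first.
```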

It is not the role of the Commissioner to endorse or ban particular technologies. Rather, it is their role to explain how the existing legal framework applies to the processing of personal data, to promote awareness of the risks and safeguards and to monitor and enforce the law.

1.6 What are the requirements of the law?

LFR involves the processing of personal data, biometric data and, in the vast majority of cases seen by the ICO, special category personal data. While the use of LFR for law enforcement is covered by Part 3 of the Data Protection Act 2018 (DPA 2018), outside of this context the relevant legislation is the UK General Data Protection Regulation (UK GDPR) and the DPA 2018. 

Controllers seeking to deploy LFR must comply with all relevant parts of the UK GDPR and DPA 2018. This includes the data protection principles set out in UK GDPR Article 5, including lawfulness, fairness, transparency, purpose limitation, data minimisation, storage limitation, security and accountability.

Controllers must also enable individuals to exercise their rights. These requirements of UK law represent universal core principles of data protection common to many legal regimes worldwide. 

While all relevant elements of the legislation apply, based on the ICO’s experience the central legal principles to consider before deploying LFR are lawfulness, fairness and transparency, including a robust evaluation of necessity and proportionality. This evaluation is particularly important because LFR involves the automatic collection of biometric data, potentially on a mass scale and without individuals’ choice or control. 

For their use of LFR to be lawful, controllers must identify a lawful basis and a condition to process special category data and criminal offence data where required. They must ensure that their processing is necessary and proportionate to their objectives, in line with the development of these concepts in UK case law. Any processing of personal data must also be fair. This means that controllers should consider the potential adverse impacts of using LFR for individuals and ensure they are justified. They should also consider and take steps to mitigate any potential bias in their systems and ensure they are sufficiently statistically accurate. Controllers must be transparent and take a ‘data protection by design and default’ approach from the outset so that their system complies with the data protection principles. 

Controllers are accountable for their compliance with the law and must demonstrate that their processing meets its requirements. Before deciding to use LFR in public places, they should complete a DPIA. As part of this process, they must assess the risks and potential impacts on the interests, rights and freedoms of individuals. This includes any direct or indirect impact on their data protection rights and wider human rights such as freedom of expression, association and assembly. 

Overall, controllers should carefully evaluate their plans with a rigorous level of scrutiny. The law requires them to demonstrate that their processing can be justified as fair, necessary and proportionate. 

Together, these requirements mean that where LFR is used for the automatic, indiscriminate collection of biometric data in public places, there is a high bar for its use to be lawful. While this is the ICO’s general assessment of what the legislation requires in this context, they emphasise that any investigation or regulatory assessment would be based on the facts of the case, considering the specific circumstances and relevant laws.

1.7 Next Steps

The ICO will continue their investigative and advisory work. This includes completing investigations already underway, assessing DPIAs which identify high-risk processing, conducting a proactive audit of LFR systems in deployment and, where appropriate, supporting data protection Codes of Conduct or certification schemes. Further next steps for the ICO and for controllers are detailed in the conclusion to this Opinion, alongside recommendations for technology vendors and the wider industry. 

In considering any regulatory action or use of enforcement powers, the Commissioner may refer to this Opinion as a guide to how they interpret and apply the law. Each case will be fully assessed on the basis of its facts and relevant laws. The Commissioner may update or revise this Opinion based on any material legal or practical developments in this evolving area, such as judicial decisions and case law, or further findings from their regulatory work and practical experience.