Facial recognition and data protection for companies

You are walking along a side street towards your office. Unbeknown to you, a private company has installed closed circuit television with facial recognition capabilities and is tracking your movements. Is this lawful?

On 15 August 2019, the Information Commissioner, Elizabeth Denham, released a statement which may address this very issue.

Ms Denham advised that she would be investigating the use of facial recognition technology in King’s Cross London by a privately-owned development. Ms Denham states that she is “deeply concerned about the growing use of facial recognition in public spaces, not only by law enforcement agencies but also increasingly by the private sector.”

The investigation follows a blog post by Ms Denham in July stating that live facial recognition technology was a high-priority area for the ICO, and that her office was investigating and monitoring trials of the technology by South Wales Police and the Metropolitan Police.

Live facial recognition technology is different to CCTV monitoring. By using biometrics (certain physical and physiological features), the technology can map facial features and identify particular individuals by matching them against a database of known faces. The technology has been in use for some years by certain public and government agencies, but with the advent of AI and machine learning it has become more prevalent in the private sector.

Facial Recognition Concerns

Whilst the privacy legal framework for law enforcement is different to that for private companies, the privacy concerns about the use of facial recognition software in public spaces remain the same.

Some threats to privacy include:

  • Lack of transparency – an intrusion into the private lives of members of the public who have neither consented to, nor been made aware of, the collection of their images or the purposes for which those images are collected and stored.
  • Misuse – images retrieved may be used for purposes other than those consented to or notified.
  • Accuracy – inherent bias within the technology may result in false positive matches or discrimination.
  • Automated decision-making – decisions which may significantly affect individuals may be based solely on the output of facial recognition software.

Processing of Personal Data by Private Companies

Organisations which process personal data within the UK must comply with the General Data Protection Regulation (GDPR) and the Data Protection Act 2018.

Processing of personal data must only be undertaken if one of the grounds under Article 6 of the GDPR applies. Further, where the processing involves special category data, which includes biometric data, a further justification must be found within Article 9.

Article 6(1) lists the lawful bases for processing. In the context of video surveillance, the applicable bases include:

– the consent of the individuals concerned (Article 6(1)(a)). For consent to be valid under the GDPR, it must be freely given, specific, informed and unambiguously given prior to the processing.

– processing necessary for the legitimate interests pursued by the controller (Article 6(1)(f)). Processing for the purpose of the legitimate interests of a controller will be lawful unless those interests are overridden by the fundamental rights and freedoms of an individual.

Facial Recognition may be processing Special Category Data

If the processing of personal data involves special category data, then in addition to identifying a lawful basis under Article 6, an exemption must also be found in Article 9 to justify such processing.

Facial recognition technology will collect biometric data, a type of special category data, if it is capable of uniquely identifying an individual. Biometric data relates to the physical, physiological or behavioural characteristics of a person.

Therefore, if facial recognition technology is used to identify a particular individual, as opposed to a category of persons (such as the profiling of customers by race, gender or age), this will involve the processing of biometric data.

Article 9(2) of the GDPR lists a limited number of exemptions which may justify processing special category data. These grounds include the explicit consent of the data subject, as well as various others, including: the vital interests of the data subject (such as an immediate medical emergency); the establishment, exercise or defence of legal claims; processing of personal data already made public by the individual; substantial public interest; various medical and public health reasons; and scientific research or statistical purposes.

The European Data Protection Board (EDPB) recently issued draft guidelines for public consultation, Guidelines 3/2019 on processing of personal data through video devices. The draft guidelines specifically address the use of facial recognition technology.

The EDPB is an independent European body established under the GDPR and publishes guidance on the application of European data protection laws.

The draft guidelines make some interesting observations about both the use of CCTV and facial recognition:

  1. Video surveillance may be necessary to protect the legitimate interests of a controller such as for the protection of property against burglary, theft or vandalism (Para 19)
  2. Video surveillance measures should only be chosen if the purpose could not reasonably be fulfilled by other means which are less intrusive to the fundamental rights and freedoms (Para 24)
  3. It may be necessary to use video surveillance not just within the property boundaries but in some cases may include the immediate surroundings of the premises in which case some protective measures such as blocking out or pixelating could be employed (Para 27)
  4. In respect of facial recognition, the draft guidelines urge caution. The EDPB appears to suggest that, whilst other exemptions may arguably be available for processing special category data, in the context of private organisations explicit consent may in most cases be required (Para 76).
  5. Where explicit consent is required, an organisation cannot condition access to its services on consenting to the processing but must offer an alternative solution that does not involve facial recognition (para 85).
  6. In cases where the technology captures passers-by, an exemption under Article 9 will still be required for these individuals (Para 83). The difficulty in the case of passers-by is that consent must be obtained before the processing is undertaken, and therefore either another exemption under Article 9 must apply or the processing may be unlawful.

If the conclusions of the ICO’s investigation into the King’s Cross matter reflect the views of the EDPB (explicit consent likely to be required for facial recognition), the flow-on effects for companies could be widespread. Certainly, companies which use facial recognition on individuals in public areas ought now to be reviewing their data protection compliance and procedures.

Disclaimer
This information is for guidance purposes only and should not be regarded as a substitute for taking legal advice. Please refer to the full General Notices on our website.
