The Information Commissioner Elizabeth Denham has warned that the use of live facial recognition (LFR) technology must comply with privacy laws.

“Any organisation using software that can recognise a face amongst a crowd then scan large databases of people to check for a match in a matter of seconds, is processing personal data,” she wrote this month on the Information Commissioner’s Office blog.

The warning comes after two extensive trials of the technology by South Wales Police and the Met Police in London. Both trials showed that the technology can be used to make arrests. It works by capturing facial features on camera to produce a “facial fingerprint”. An algorithm compares that image against those held in suspect databases. Police officers then compare any match against the person who has been selected and decide whether it is accurate.

Companies are also trying to innovate to make facial biometrics as secure an authentication method as it should be. In the past, for example, people have bypassed facial recognition by presenting a picture of the required face. One such innovation is facial liveness detection, which requires the authentication system to determine that it is seeing a real person, not just an image of the right person.
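
The matching step described above can be sketched in simplified form: a face image is reduced to a numeric feature vector (the “facial fingerprint”), which is compared against vectors held in a watchlist. This is a minimal, hypothetical illustration, not the software used in the police trials; the function names, toy vectors, and similarity threshold are all assumptions for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_match(probe, watchlist, threshold=0.9):
    """Return the best-scoring watchlist entry above the threshold, else None.

    In a deployed system a candidate match would then be flagged for a
    human officer to confirm, as in the trials described here.
    """
    best_name, best_score = None, threshold
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy 4-dimensional "embeddings" stand in for real face templates.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3, 0.2],
    "suspect_b": [0.1, 0.8, 0.2, 0.7],
}
probe = [0.88, 0.12, 0.31, 0.19]  # close to suspect_a's template
print(find_match(probe, watchlist))  # → suspect_a
```

The threshold controls the trade-off the ICO highlights: set too low, it produces false matches on innocent passers-by; set too high, it misses genuine suspects.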

Denham said her office had done a “deep dive” into the trials with police co-operation.

Issues to address

“Legitimate aims have been identified for the use of LFR,” she said. “But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology. I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR.”

Denham gave expert evidence in a recent court case brought by a member of the public, supported by civil rights group Liberty, who challenged the lawfulness of South Wales Police’s use of LFR in the courts in May.

The man involved – Ed Bridges – was out shopping in Cardiff city centre when the police captured his image. The court heard the case to decide whether SWP’s use of facial recognition in this way is lawful.

“The resulting judgment will form an important part of our investigation and we will need to consider it before we publish our findings,” she said. “Whilst the judgment will be important, any force deploying LFR needs to consider a wide range of issues.”


For other forces considering the use of such technology, the ICO has the following advice:

  • Carry out a data protection impact assessment and update this for each deployment – because of the sensitive nature of the processing involved in LFR, the volume of people affected, and the intrusion that can arise. Law enforcement organisations are advised to submit data protection impact assessments to the ICO for consideration, with a view to early discussions about mitigating risk.
  • Produce a bespoke ‘appropriate policy document’ to cover the deployments – it should set out why, where, when and how the technology is being used.
  • Ensure the algorithms within the software do not treat the race or sex of individuals unfairly.
  • Familiarise themselves with the ICO’s Guide to Law Enforcement Processing, which covers Part 3 of the Data Protection Act 2018.

Although data protection law differs for commercial companies using LFR, the technology is the same and the intrusion that can arise could still have a detrimental effect, the ICO said.