The Information Commissioner has raised the bar on using live facial recognition technology in public places.
In a recent opinion on the issue, outgoing ICO head Elizabeth Denham said that companies implementing such systems needed to pass tough tests. Data controllers must make a clear case for necessity, proportionality, consent and privacy.
But the law on facial recognition is complex. Lawmakers and the courts appreciate the technology's benefits, but striking a balance between public protection and individual rights is difficult.
Cameras that capture biometric information from images fall under data protection law. The case for using facial recognition to identify individuals on smartphones, for instance, is relatively strong, because the user gives consent.
However, when live facial recognition (LFR) systems are used in places such as shopping centres, consent is not usually requested. Indeed, LFR can enable mass, covert surveillance.
The opinion draws a distinction between one-to-one recognition, used for security entry to offices, for example, and use in public places. Denham said she was most concerned about LFR capturing faces in public, applying categories to them and then using that data for marketing and other purposes. The measures in the opinion provide guidance for organisations seeking to deploy LFR and will inform the ICO's compliance activities in this area.
“These requirements mean that where LFR is used for the automatic collection of biometric data in public places, there is a high bar for its use to be lawful.”
The opinion draws on the commissioner's past work. In 2019, the ICO investigated the use of LFR by South Wales Police and the Metropolitan Police Service, finding multiple shortcomings in both cases. The Court of Appeal later ruled that South Wales Police's use of LFR was unlawful under the Data Protection Act (DPA).
Several legal frameworks cover the use of LFR in public places, including the UK GDPR and Part 2 of the DPA.
In forming the current opinion, the ICO examined several real-life LFR deployments. It found that data controllers often assessed risks poorly, especially where they bought in software from third parties without adequate due diligence.
Denham concluded: "Data protection by design and default principles must be at the heart of any advances."