In the last week there have been two decisions relating to the use of Facial Recognition (FR), with differing outcomes. So what sets them apart? The fundamental differences between the two cases lie in the lawful basis for processing and the proportionality of that processing to its purpose, with the public interest in crime prevention carrying far greater weight than automating the school register.

In the first case, R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin), the court found that South Wales Police's (SWP) use of Automated Facial Recognition (AFR) was consistent with the requirements of the Human Rights Act and data protection legislation (both the Data Protection Act 1998 and the Data Protection Act 2018 (DPA 2018)).

SWP carried out a pilot which used CCTV to capture images of members of the public, including at locations and events that it deemed susceptible to criminal activity. The facial biometrics extracted from those images were then compared against the facial biometric data of individuals on a police watchlist, to identify anyone known to the police. If an individual was matched to the watchlist, appropriate action would be taken; if there was no match, the facial biometrics and images were deleted.
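Purely by way of illustration, the sketch below outlines the match-or-delete workflow described above: capture a frame, compare its biometric template against the watchlist, flag a match for action, and otherwise delete both the template and the image. It is not SWP's actual system, and every name, type and threshold in it is a hypothetical placeholder (written here in Python for concreteness).

    from dataclasses import dataclass

    # Hypothetical placeholder types; real AFR systems use proprietary formats.
    @dataclass
    class Frame:
        image: bytes  # a single image captured from the CCTV feed

    @dataclass
    class Template:
        features: tuple[float, ...]  # a derived facial biometric template

    def extract_template(frame: Frame) -> Template:
        # Placeholder: a real system would run face detection and feature
        # extraction here.
        return Template(features=(0.0,))

    def similarity(a: Template, b: Template) -> float:
        # Placeholder similarity score in the range [0, 1].
        return 0.0

    def process_frame(frame: Frame, watchlist: list[Template],
                      threshold: float = 0.9) -> bool:
        # Match-or-delete: flag a watchlist match for appropriate action;
        # otherwise discard both the biometric template and the image.
        template = extract_template(frame)
        if any(similarity(template, w) >= threshold for w in watchlist):
            return True  # matched: appropriate action would follow
        del template, frame  # no match: biometrics and image deleted
        return False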

In terms of human rights, the court confirmed that the processing did interfere with the right to respect for private life under Article 8(1) of the European Convention on Human Rights (ECHR), but that the interference was in accordance with the law and the use of AFR in the context of crime prevention was proportionate. Therefore, the use of AFR in this context did not breach the Article 8(1) right.

When considering the data protection claims, the court made a number of findings:

1. The images captured constituted personal data under the data protection legislation, whether or not the individuals captured on camera were also on the watchlist.
2. SWP's processing was fair and lawful. The court held that the processing, which was also deemed sensitive processing under Section 35(8) DPA 2018, was necessary for SWP's legitimate interests, taking account of the common law obligation to prevent and detect crime. In terms of the necessity test, the court referred back to its analysis of the proportionality requirements under the ECHR. In that regard the court found that:

  • the AFR was deployed in an open and transparent manner, with significant public engagement;
  • the AFR was deployed at limited locations, for limited periods of time and for the specific and limited purpose of seeking to identify specific individuals in the area who were on the police watchlist and who might pose a threat; and
  • the purpose could not be met by the simple deployment of CCTV without the AFR.

3. SWP's processing was supported by a relevant impact assessment: initially a Privacy Impact Assessment and, after May 2018, a Data Protection Impact Assessment (DPIA). SWP argued that, where a data controller has genuinely approached the task of completing a DPIA, it is not for the court to second-guess that assessment. The court, however, rejected that argument, stating:

"What is required is compliance itself, i.e. not simply an attempt to comply that falls within a range of reasonable conduct."

The court dismissed the claim for judicial review on all grounds, finding that the current legal regime is adequate to ensure the appropriate use of AFR in these circumstances and that SWP's use of AFR to date has been lawful.

The second decision is that of the Swedish data protection authority (DPA), Datainspektionen, which has issued a fine under the General Data Protection Regulation (GDPR) to a school board that carried out a pilot using FR software, via a camera in a secondary school classroom, with the aim of automating the registration process to free up teacher time. The photos of students' faces captured by the camera were stored, along with the students' full names, on a local computer.

The Swedish DPA found that the school had breached the GDPR in three ways:

1. The processing was intrusive and the purpose, registering attendance, could be achieved in less intrusive ways, meaning the use of FR was disproportionate to the purpose. The processing was therefore in breach of the Article 5 GDPR data minimisation principle.
2. The school board had obtained explicit consent from parents/guardians for the processing, and it was possible to opt out of participating in the trial. However, the DPA stressed that the processing concerned children who were dependent on the school board, giving rise to an imbalance in the relationship between the school board and the students. Consent could therefore not be relied on as a lawful basis for processing, as it could not be considered freely given. Consequently, consent also could not be relied on in relation to the processing of special category personal data. The DPA also stated that managing attendance records does not necessarily constitute a matter of substantial public interest, as would be required under the Article 9(2)(g) lawful basis for processing special category personal data.
3. No DPIA was carried out, although the school board had made a form of risk assessment, documenting that the processing would not result in any high risks to the students. The DPA found that the risk assessment did not consider the risks to the rights and freedoms of the students and did not document the proportionality of the processing in relation to the purpose. Therefore the Article 35 requirement to carry out a DPIA, where processing is likely to result in a high risk to the rights and freedoms of data subjects, was not met, especially as this processing involved the use of new technology (FR), systematic surveillance and special category personal data.

The DPA issued the school board a fine of SEK 200,000 (around £16,500). 

Although the Bridges case highlights that, in the right circumstances, FR could lawfully be used for crime prevention or the protection of the public, the court also made it clear that assessing proportionality is fact sensitive. The lawfulness of any use of FR will therefore need to be assessed on a case-by-case basis, with the court stating that:

"It will, of course, be open to any person who considers that their Article 8(1) rights have been the subject of interference because of the use of AFR Locate by SWP (or other law enforcement agency) to call on SWP to demonstrate that the interference was justified on the particular facts of the case."

Given the aim of harmonising the application of the GDPR, other regulators may well draw on the Swedish DPA's analysis when considering similar cases. On that basis, ensuring that there is an appropriate lawful basis under both Article 6 and Article 9 GDPR, that the processing is proportionate to the purpose, and that a valid DPIA has been carried out will be key factors regulators look at when analysing such cases.

Furthermore, with the ICO also announcing last week that it will be investigating the use of live FR technology in King's Cross, London, after reports that property developer Argent has deployed the software across its 67-acre estate, it will be interesting to see how these two decisions are applied.

However you view these recent decisions relating to the use of FR, one thing that rings true from the introductory comments of the Bridges decision is that "the algorithms of the law must keep pace with new and emerging technologies".