07 January, 2020
With the recent public furore over several instances of its deployment across the UK, it is a good time to look at how the real-life application of facial recognition technology holds up against data protection legislation. From the use of facial recognition at King's Cross to the recent High Court case in which South Wales Police were challenged over their use of the technology, this is very much an issue of the day, and one which will only become more prevalent as the technology spreads.
Over the summer there were various news reports detailing the use of automated facial recognition cameras by a private firm, King's Cross Central Limited Partnership (KCCLP), which developed and manages the 67-acre King's Cross area, to identify individuals who had committed a previous offence.
The discovery prompted questions from London's Mayor, Sadiq Khan, as well as an investigation by the Information Commissioner's Office (ICO), which was "deeply concerned" about the use of the technology.
Several weeks after the news broke, KCCLP released a statement confirming that it was co-operating with the ICO investigation and would not comment whilst the probe was ongoing.
KCCLP confirmed that the King's Cross Estate does not currently use facial recognition technology though there were two facial recognition cameras, covering a single location at King's Boulevard, which were operational between May 2016 and March 2018. Regarding why the system was needed, KCCLP stated:
"The system was used only to help the Metropolitan Police and British Transport Police prevent and detect crime in the neighbourhood and ultimately to help ensure public safety,"
All data was "regularly deleted, with the final deletion taking place in March 2018."
The King's Cross Estate team has since undertaken work on the potential introduction of new facial recognition technology, but this work has now stopped with no plans to reintroduce any similar system in the future.
On 5 September 2019 the High Court rejected a judicial review claim in respect of South Wales Police's use of automated facial recognition technology. Edward Bridges, a civil liberties campaigner from Cardiff, brought the claim against the Chief Constable of South Wales Police, the force acting as the national lead on the use of automated facial recognition in UK policing. The force has been conducting trials of the technology since mid-2017.
The claim concerned a pilot project known as AFR (automated facial recognition) Locate. AFR Locate involves the deployment of cameras to capture images of members of the public, which are then processed and compared with images of persons on watch lists of "persons of interest", such as photos from custody records.
Mr Bridges claimed that the use of AFR Locate generally, and on two particular occasions when South Wales Police deployed it in Cardiff as part of a trial (in a busy shopping area and at an exhibition) while Mr Bridges was present and was caught on camera, was contrary to his right to respect for private life under Article 8 of the European Convention on Human Rights (as given effect by the Human Rights Act 1998, or HRA) and to data protection legislation.
The claim stated that South Wales Police had failed to comply with the data protection principles, in particular the requirement for law enforcement processing to be lawful and fair (note that this does not include transparency, from which law enforcement agencies are exempt, unlike 'normal' everyday data controllers), and that there had been a failure to carry out a data protection impact assessment for the processing, as required under section 64(1) of the DPA 2018.
The claim for judicial review was dismissed on all grounds. The High Court was satisfied both that the current legal regime was adequate to ensure the appropriate and non-arbitrary use of AFR Locate, and that South Wales Police's use of AFR Locate to date had been consistent with the requirements of the HRA and the data protection legislation.
The court concluded that there was a clear and sufficient legal framework (data protection legislation) governing whether, when and how AFR Locate could be used and there were sufficient police common law powers in relation to the use of AFR Locate in compliance with Article 8.
This is a significant decision, as it is possibly the world's first legal challenge to the use of automated facial recognition technology by a police force.
Whilst these two examples may not strike some as relevant, being large in scale and related to detecting crime, facial recognition technology can be used by anyone, and surprising types of organisation can run afoul of the law if they do not have a legitimate reason to use it.
An example of this is the recent situation in Sweden, where the Swedish Data Protection Authority issued the country's first GDPR fine on 27 August 2019 after a school was found to be improperly using facial recognition technology to monitor the attendance of its students.
The school in Skellefteå, which is in the north of Sweden, was fined 200,000 Swedish Krona (approximately £17,000) after conducting a trial where the attendance of 22 pupils was tallied using facial recognition over the course of three weeks.
Even though it was only a test, this was found to have infringed "several articles in GDPR", as the school had processed biometric data unlawfully and had not carried out an adequate risk assessment (in this case, a data protection impact assessment, or DPIA).
The test was run with the consent of the pupils' parents, but this consent was deemed invalid due to the asymmetrical power relationship between the school and the pupils and their parents.
Ranja Bunni, a lawyer at the Swedish Data Protection Agency who helped with the review of this violation, said that consent was not a valid legal basis, since the students depend on the high school board. The agency pointed out in its release that there are alternatives for checking student attendance that are not as intrusive as a facial recognition system.
The use of facial recognition technology remains divisive due to concerns about its accuracy, racial bias and the intrusion into people's everyday lives.
In July the ICO warned the police about the use of the technology, after a study found that 81 per cent of 'suspects' flagged by the Metropolitan Police's facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.
There remains a strong argument against implementing this technology without good reason, as seen at King's Cross and the Swedish school. However, the South Wales Police trial shows that, from a legal point of view, there are legitimate paths to its use.
As artificial intelligence and machine learning become more a part of day-to-day life, the implementation of facial recognition technology will become more prevalent. However, as the Swedish fine shows, if there is a less intrusive way to achieve your aims than using this technology, then that should likely be implemented instead.
For more information contact Daniel Milnes in our Governance, Procurement & Information department via email or phone on 01254 222313. Alternatively send any question through to Forbes Solicitors via our online Contact Form.