18 October, 2018
Recent data protection headlines have been filled with the statement from the Surveillance Camera Commissioner that advanced CCTV programmes in place in Manchester's Trafford Centre are 'not proportionate', causing many to wonder how far security systems can go before they fall on the wrong side of data protection law.
The system in question concerns the use of Automatic Facial Recognition Technology (AFR) to scan every single customer from April to September this year, estimated to be 15 million people, in an attempt to curtail criminal activity and increase general safety in the centre. The technology was used to match the faces of shoppers against a 'watchlist' provided by Greater Manchester Police (GMP), which included both wanted and missing persons. It is believed to be the largest scheme of its kind ever carried out in the UK, though the project has since been discontinued.
Such mass surveillance systems have unsurprisingly raised a few questions from a data protection standpoint. With so many people having their images taken, assessed and processed without their knowledge, many are wondering whether such measures are justifiable under data protection law, particularly given the recent introduction of more stringent legislation in the form of the GDPR and the Data Protection Act 2018.
There can be little doubt that data collected by facial recognition software has the potential to become 'biometric data', which is considered a 'special category' of personal data under the GDPR. This is before considering the abundance of personal data that could be incidentally processed from such collection regarding a person's race, sex or religion, all special categories in their own right.
Under the GDPR, special category data cannot be processed unless such processing has a lawful basis under the GDPR or falls under a specified exemption. The exemption most applicable to GMP in these circumstances is that such processing is lawful whenever it is 'necessary for reasons of substantial public interest…which shall be proportionate to the aim pursued'. Unsurprisingly, the prevention and detection of crime is a well-established 'public interest', but doubt remains over whether the deployment of facial recognition software could be considered either necessary or proportionate.
Following an assessment of the measures in place, the Surveillance Camera Commissioner, Tony Porter, has raised a number of concerns about the extent of the surveillance.
Specifically, the Commissioner noted that the scheme had not received approval from police officers with a sufficient level of 'strategic command', that there had not been sufficient legal oversight from the outset of the project, and that no written policy was put in place regarding what the data was being used for and how long it would be kept. Additionally, insufficient guidance had been made available to unsuspecting shoppers regarding how their personal data was being collected and whether it would be retained for any purpose.
Whilst acknowledging the legitimacy of using CCTV to curtail crime or find missing persons, Porter concluded that analysing the biometric data of so many people against such a small number of matches on the watchlist was in serious danger of failing to be 'proportionate'.
The issue of proportionality becomes more of a concern when considering that, by GMP's own admission, as many as 98% of the 'matches' generated by the software proved inaccurate, instead wrongly flagging up and processing data relating to unsuspecting, innocent bystanders. After analysing 15 million shoppers, the six-month scheme produced one successful match.
This assessment, whilst not the result of an official ICO investigation, is nevertheless an interesting addition to the debate on the limits of using 'public interest' as a justification for data processing. The lesson from the Commissioner's statement is that 'public interest' is clearly not a carte blanche excuse that can be used to justify every measure of public or even private surveillance.
The facts of the scheme may ring some bells for anyone who remembers the 2013 story of the Royston 'Ring of Steel', where a scheme of using cameras to monitor and record every licence plate going in and out of the Hertfordshire town was ruled unlawful by the ICO.
In Royston, as may yet be the case with the Trafford Centre, the response of the offending party was not to stop the scheme altogether, but rather to take the comments of the assessment on board with a view to altering how the scheme was used and deployed.
It will certainly be interesting to monitor how GMP, and other organisations, react to the Commissioner's feedback on mass surveillance. AFR has previously been deployed at many large-scale events, including the Notting Hill Carnival and Champions League football matches; assessments of the system's use and benefits may soon become much more of a concern for those looking to deploy such methods.
Intu Limited, the company that runs the Trafford Centre, have been eager to emphasise that not only is the scheme no longer in operation in the shopping centre, but that at no stage in the scheme were images retained. Nevertheless, many customers have raised concern over the practice and are eager to know exactly what information may have been gathered.
Anyone with such concerns should heed the advice of the Surveillance Camera Commissioner and make a subject access request to both Intu Limited and GMP. Such requests are free to make and require the organisations involved to investigate and produce all personal data they hold on the requester.
Forbes regularly advises on matters concerning Data Protection and GDPR. If you have concerns over any GDPR compliance issues in your organisation, or believe you have been affected by the Trafford Centre surveillance, contact us at email@example.com to find out more about how we can help.