24 February, 2023
In a statement published on 31 January 2023, the ICO confirmed its view that North Ayrshire Council's (NAC) use of Facial Recognition Technology (FRT) for cashless catering in canteens at nine of its schools is likely to have infringed the UK General Data Protection Regulation (UK GDPR). The system, which has proved controversial, was implemented with the intention of speeding up lunchtime queues and was suggested to be a more Covid-secure solution than card payments or fingerprint scanners.
In this article we explore the circumstances that gave rise to the ICO's investigation and the issues FRT poses under domestic data protection legislation in England and Wales.
The ICO undertook an enquiry after privacy campaigners raised concerns over NAC's use of FRT on some of its pupils. Significantly, the ICO concluded that whilst it may be possible to deploy FRT in schools lawfully, in this case NAC did not do so.
More specifically, the ICO was concerned that the technology had been deployed in a manner likely to have infringed various data protection laws, including Article 5(1)(a) of the UK GDPR, which requires personal data to be processed lawfully, fairly and in a transparent manner.
Moreover, the ICO raised concern that NAC sought parental consent, rather than consent from the children themselves, for pupils aged between 12 and 14 years old. It is important to note that whilst NAC is based in Scotland, schools in England and Wales must ensure they undertake similar safeguards in relation to consent, in accordance with Section 26 of the Protection of Freedoms Act 2012. This legislation states that if at any time a child refuses to participate in, or objects to, anything that involves the processing of their biometric information, the relevant authority must ensure the information is not processed, irrespective of any consent given by a parent. In England and Wales, children aged 13 or over are generally considered mature enough to provide their own consent. Therefore, if schools are relying on consent as their lawful basis for processing, usually only children aged 13 or over can provide their own consent, and whoever holds parental responsibility for children under this age must consent to the processing of their children's personal data.
FRT and other similar technologies can offer benefits in an education setting. However, the processing of special category data is not without risk, and in the education sector the risk is heightened because the data being processed relates to children, who are classed as vulnerable data subjects for the purposes of the UK GDPR. Recital 38 of the UK GDPR outlines that children merit specific protection regarding their personal data, as they may be less aware of the risks, consequences and safeguards concerned, and of their rights in relation to the processing of personal data.
One of the ICO's main concerns was that NAC was unable to demonstrate a lawful basis for the processing. As the biometric data processed by the FRT system is classed as special category data, there must be both a lawful basis for processing under Article 6 of the UK GDPR and a condition for processing special category data under Article 9 of the UK GDPR. NAC stated it was relying on consent as its lawful basis under Article 6(1)(a) and explicit consent under Article 9(2)(a) for processing special category data.
The ICO considered the forms sent to individuals within the school, including pupils and parents, and determined that consent was not freely given. There must be a genuine choice available to individuals, and in this case NAC's FRT consent form stated that "facial recognition will be used for authenticating all secondary school pupils that require access to school meals and/or snacks, including those eligible for free school meals." This does not present FRT as an option, and the ICO said it appears "unlikely" that consent was freely given.
Further to this, the ICO criticised the Data Protection Impact Assessment (DPIA) undertaken by NAC and said it was unlikely to have complied with Article 35 of the UK GDPR. The DPIA identified no risks relating to the processing of children's biometric data, and NAC should have ensured that its DPIA contained advice from its Data Protection Officer (DPO) to show that the controller had considered all relevant risks and what, if any, changes had been made as a result.
Whilst the decision to implement FRT in certain situations may be controversial, this investigation represents a recognition by the ICO that FRT can be a progressive means of processing data, particularly in the education sector. Notwithstanding the ICO's acknowledgement that implementation can be done lawfully, schools must remain mindful of the restrictions imposed on them, as data controllers, by domestic data protection legislation. To increase confidence in their compliance with the UK GDPR, we recommend that schools using FRT should:
Ensure there is a valid lawful basis for processing children's data. When processing special category data, schools must also ensure a further condition for lawful processing is met in accordance with Article 9 of the UK GDPR.
Ensure that the processing is transparent. It is vital that schools are able to explain, in age-appropriate language, how children's data will be collected, used, stored and retained. The risks associated with the processing should be clearly set out in a legible format, which can be accomplished by producing a children's privacy notice.
Ensure that a comprehensive DPIA complying with Article 35 requirements has been completed. The DPIA should also record, and document consideration of, the advice of the school's DPO.
You can view the ICO's full statement on NAC's use of FRT here.
For more information contact Laura Rae in our Governance, Procurement & Information department via email or phone on 01772 220221. Alternatively send any question through to Forbes Solicitors via our online Contact Form.
Learn more about our Governance, Procurement & Information department here