All singing and dancing? TikTok pays a high price for misuse of children's data


06 April, 2023

Bethany Paliga
Senior Associate

On 4 April 2023, the UK regulator for data protection, the Information Commissioner's Office (ICO), announced that it had concluded its investigations into TikTok Information Technologies UK Limited and TikTok Inc (TikTok), the popular video sharing platform, and confirmed that it had issued TikTok with a substantial fine of £12.7 million for what it describes as "a number of breaches of data protection law, including failing to use children's personal data lawfully." UK data protection legislation identifies TikTok as a 'data controller', meaning it has control over the purpose and means by which it uses personal data.

This announcement comes in the wake of the Cabinet Office's decision to ban the use of TikTok on government-issued electronic devices in response to privacy concerns, and will undoubtedly cast further doubt over wider use of the platform in the UK. Other jurisdictions, such as the USA, Canada and Australia, have raised similar concerns.

Having reviewed the ICO's announcement, we consider where these concerns arose and why, and what lessons others in the tech space, and those who control children's personal data, can learn from this decision.

Background

TikTok, which was released internationally in 2017, is a short-form video sharing platform developed and owned by Chinese company ByteDance. The platform gained significant worldwide popularity in 2020, during the onset of the Covid-19 pandemic. Since then, it has become one of the world's fastest growing brands, attracting tens of millions of users every day. More recently, TikTok has attracted criticism across major jurisdictions as a result of a series of privacy and security concerns.

In September 2022, the ICO served a "notice of intent" following a series of investigations it had undertaken into the policies and procedures adopted by TikTok and the effect these have on the personal data of its users, specifically children. At the time, the ICO indicated TikTok could face a fine of up to £27 million for what it deemed to be failures by the platform to protect children's privacy.

At the time, it was explained that investigations found that between May 2018 and July 2020, TikTok may have:

  • "processed the data of children under the age of 13 without appropriate parental consent,
  • failed to provide proper information to its users in a concise, transparent and easily understood way, and
  • processed special category data, without legal grounds to do so."

The ICO's notice of intent confirmed that representations from TikTok would be taken into consideration before a decision was made about whether a fine would become payable and, if so, in what amount.

TikTok subsequently submitted representations in its defence to the ICO, and the regulator decided not to pursue the provisional finding relating to the unlawful use of special category data, concluding that a reduced fine of £12.7 million was appropriate. It nonetheless remains one of the largest fines the ICO has issued in recent years.

In its statement, the ICO estimated that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020, despite its rules not allowing those under 13 to create an account. It also considered that TikTok failed to carry out adequate checks to identify and remove underage children from its platform.

The Law

UK data protection legislation treats individuals aged 18 and above as adults. However, specific rules apply to children of sufficient 'maturity', namely those under 18 who are able to understand their own data protection rights and consent to the use of their data, a principle adopted by the regulator and many other tech giants in the operation of their businesses.

Article 8 of the UK GDPR sets out specific obligations where an information society service (such as TikTok) is offered directly to a child. The UK GDPR permits consent to be obtained from the child themselves where the child is aged 13 or over. Where the child is below the age of 13, the processing will be lawful only if consent is given by a parent or other individual with parental responsibility for the child.
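In practice, the Article 8 thresholds reduce to a simple age-gated consent check. The following is a minimal sketch of that logic in Python; only the age-13 threshold and the parental-consent rule come from Article 8, while the function and variable names are purely illustrative assumptions:

    # Illustrative sketch only: applying the UK GDPR Article 8 age thresholds
    # where a service relies on consent. Names and structure are hypothetical.

    UK_GDPR_CONSENT_AGE = 13  # Article 8 threshold for a child's own consent

    def consent_is_valid(age: int, child_consents: bool, parent_consents: bool) -> bool:
        """Return True if consent provides a lawful basis under Article 8."""
        if age >= UK_GDPR_CONSENT_AGE:
            # A child aged 13 or over may give consent themselves.
            return child_consents
        # Below 13, only a parent or other individual with parental
        # responsibility can consent on the child's behalf.
        return parent_consents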

In addition, TikTok, and other platforms like it, must be able to demonstrate that they adhere to strict policies and procedures in order to protect the privacy of those children. Moreover, platforms marketing themselves solely to children, and specifically children under the age of 13 (as we have seen from other popular platforms such as YouTube with the launch of 'YouTube Kids'), must have in place an even higher degree of privacy control, to ensure that children's data is handled in a transparent way and is not misused.

In September 2021, the ICO, somewhat controversially, introduced a series of regulatory standards for those who use children's data, in the form of the 'Age Appropriate Design Code' or 'Children's Code' (the Code). From September 2021, websites and products affected by the Code have needed to provide additional layers of protection for children's data. At the time, the ICO released a statement saying:

"this might involve restricting or removing certain features to children if they're under 18. Some of the things you might see are:

  • privacy settings being automatically set to very high;
  • children and their parents/carers being given more control of the privacy settings;
  • non-essential location tracking being switched off;
  • children no longer being 'nudged' by sites through notifications to lower their privacy settings; and
  • clearer and more accessible tools being in place to help children exercise their data protection rights (including parental consent tools)."
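To make the 'high privacy by default' idea concrete, the sketch below shows one way a platform might encode account defaults along the lines the ICO describes above. This is a hypothetical illustration in Python, not any platform's actual configuration; the field names are our assumptions:

    # Hypothetical illustration of Children's Code-style defaults for a
    # child's account. Field names are assumptions, not a real API.
    from dataclasses import dataclass

    @dataclass
    class ChildAccountDefaults:
        privacy_level: str = "very_high"        # privacy settings set to very high
        location_tracking: bool = False         # non-essential tracking off by default
        privacy_nudges: bool = False            # no nudges to lower privacy settings
        parental_controls_enabled: bool = True  # parents/carers given more control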

Critics of the Code have previously argued that it creates difficulties for services to which it applies only incidentally, because children access them even though they are not intended to be suitable for children. The ICO, however, has made clear that its priority in introducing these standards is to create a more positive and safe experience for children online.

Whilst the ICO investigation into TikTok commenced before the Code was introduced, TikTok's legal obligation to protect children's information remained, as a result of existing legislation such as the UK General Data Protection Regulation and the Data Protection Act 2018. Moving forwards, TikTok and platforms like it should be adhering to the principles and guidance offered by the Code, to ensure their compliance with the standards expected by the legislation.

In the coming months, the anticipated UK Online Safety Bill will require social media platforms to undertake strict age verification processes, meaning TikTok may face even tougher sanctions should it fail to rectify the issues identified by the ICO.

Looking forwards and points to consider

In our view, the ICO's announcement not only reinforces its wide powers of sanction, but also indicates the dim view it takes of those it considers to be acting in ignorance of the enhanced protections afforded to children and other vulnerable individuals.

TikTok has 28 days from the date of issue to appeal its fine. It remains to be seen whether this right will be exercised, despite some early indications from TikTok representatives that the company disagrees with the conclusions reached by the ICO. Undoubtedly, the UK's increasingly stringent approach to protecting children's online safety will provide TikTok with some critical points to consider for improving and rebuilding, in response to these concerns.

Whilst TikTok has borne the brunt of this decision, the announcement provides an opportunity for other controllers of children's data, such as those in the education and technology sectors, to reflect on their own practices and procedures: not only to assess their degree of compliance with the Code and existing legislation, but also to ensure preparedness for the changes that will be brought about by the upcoming Online Safety Bill, and the consequences of non-compliance. In our view, now is a good time to audit the policies and procedures you have in place, alongside the adequacy of any contractual agreements with third party providers working on your behalf.

Should you wish to obtain further advice and support regarding your data protection policies, procedures and agreements, and/or wish to obtain a deeper understanding of the obligations placed on data controllers in respect of children's data, please contact Bethany Paliga.

For more information contact Bethany Paliga in our Governance, Procurement & Information department via email or phone on 01254 222347. Alternatively send any question through to Forbes Solicitors via our online Contact Form.

