The United Kingdom's Information Commissioner's Office (ICO) finalized its Age Appropriate Design Code (the Code) in September 2020. The Code applies to most companies that offer online services to, or otherwise collect personal data from, users in the UK, if their service is likely to be accessed by children. Given the finalization of this Code, there is no better time to resolve to stay current on children's privacy laws.

By publishing the Code, the ICO is signaling how it intends to interpret the data protection principles under the EU General Data Protection Regulation (GDPR) and the UK's Privacy and Electronic Communications Regulations (PECR). This means that violations of the Code could lead to GDPR-standard fines: up to €20 million (£17.5 million under the UK GDPR) or 4% of a company's annual worldwide turnover, whichever is higher.

Fortunately, the Code includes a 12-month transition period, so companies should use this window to learn the new requirements and take steps to comply before enforcement begins in September 2021.

Key Changes and Requirements

The Code consists of 15 standards to which all in-scope online services must adhere, applied through a holistic, risk-based approach that considers the best interests of the child. While some of the standards, such as data minimization, transparency and restrictions on data sharing, will be familiar to companies with developed privacy programs, even companies accustomed to handling underage privacy issues may be surprised by some of the Code's requirements. The most impactful changes are outlined below.

A child is now considered to be any user under 18. 

Most global privacy laws set a minimum age (13 in the US and 16 in many EU countries) under which no personal data may be collected from a child unless a parent consents or an established exception applies. The Code does not change the minimum age for data processing in the UK (it remains 13), but it separately asserts that children require special protections online until they are 18. The Code requires companies to treat children differently depending on their age group (for example, 0-5, 6-9, 10-12, 13-15 and 16-17), such as by tailoring privacy policy language to the reading comprehension level of each group.
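In practice, a service that knows (or estimates) a user's age might route each age band to a differently worded notice. Below is a minimal TypeScript sketch of that routing; the variant labels and presentation notes in the comments are our own illustrations, not terms prescribed by the Code.

```typescript
// Sketch: choose a privacy-notice variant by age band. The bands mirror
// the example groupings above (0-5, 6-9, 10-12, 13-15, 16-17); the
// variant names are hypothetical.
type NoticeVariant =
  | "early-years"    // 0-5: parent-facing material, minimal text
  | "primary"        // 6-9: cartoons, diagrams, very simple language
  | "transition"     // 10-12: short plain-language summaries
  | "early-teens"    // 13-15: simplified but fuller disclosures
  | "late-teens"     // 16-17: near-adult disclosures
  | "adult";

function noticeVariantFor(age: number): NoticeVariant {
  if (age <= 5) return "early-years";
  if (age <= 9) return "primary";
  if (age <= 12) return "transition";
  if (age <= 15) return "early-teens";
  if (age <= 17) return "late-teens";
  return "adult";
}
```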

Some of the ICO's requirements, such as the requirements to limit data sharing and personalization by default, are likely to affect many companies, especially services with large numbers of UK users aged 16 and 17, who will now need additional protection under the new guidance.

Analytics and "service enhancement" data can no longer be collected by default, setting up a collision course with COPPA.

The U.S. Children's Online Privacy Protection Act (COPPA) is one of the most well-known (and most enforced) children's privacy laws in the world. Notably, COPPA does not require parental consent where a child's data is used only for optimization, personalization, statistical reporting and other functions necessary to analyze and support the "internal operations" of the service. The ICO takes a much stricter view, holding that "collection of personal data in order to 'improve,' 'enhance,' or 'personalize' your users' online experience [is] beyond the provision of your core service" and requires separate consent. Combine this strict interpretation with the ICO's position that consent for non-essential processing must come from the parent where the UK child is under 13, and this provision will likely have a major impact on the practical ability of child-directed apps to perform analytics or otherwise use data from UK residents to improve their services.

That said, it is unclear whether the Code would necessarily prohibit the collection of de-identified data for analytics purposes, or the extent to which a company can define its personalization efforts as part of its "core service." With this in mind, we recommend exercising caution before defining a "core service" too broadly: the ICO writes "although there may be some limited examples of services where behavioral advertising is part of the core service (e.g. a voucher or 'money off' service), we think these will be exceptional. In most cases the funding model will be distinct from the core service and so should be subject to a privacy setting that is 'off' by default."

Privacy/GDPR as an enforcement hook for nearly any harm to children. 

One of the Code's standards requires companies to consider the "best interests of the child" when making design decisions around data, a concept referenced frequently throughout the Code's other standards. According to the Code, determining the best interests of a child is a holistic analysis incorporating a variety of factors, such as the child's rights to freedom of expression, thought, conscience, religion and association; access to information; age-appropriate play; protection against economic, sexual or other forms of exploitation; and more.

Child privacy laws and enforcement have traditionally overlapped with other child-protection initiatives. For example, the Italian Data Protection Authority recently placed a temporary ban on TikTok following the death of a 10-year-old girl from Palermo, who suffocated after participating in a "choking challenge" on the platform. (This came in addition to a ban on TikTok by the Indian government for activities deemed "prejudicial to sovereignty and integrity of India," suggesting a high level of international mistrust of these social network services.)

Nevertheless, the Code makes clear that the ICO intends to use GDPR/PECR (and its associated penalties) as the enforcement hook for just about any "detrimental impact" on a child that can be traced back to their data. For example, the ICO would consider the following activities to be violations enforceable under the Code:

  • Marketing practices that make direct exhortations to children to buy or persuade their parents to buy advertised products;
  • Mechanics that encourage excessive screen time;
  • Failures to adequately police or enforce a service's self-imposed community guidelines (setting up another potential conflict with immunity afforded to US companies under Section 230 of the Communications Decency Act); or
  • Any practice that is contrary to established guidelines on children's marketing, including showing ads for music content with explicit lyrics or for foods that are high in fats or sugars.

Children have privacy rights against their parents. 

Although many companies provide robust parental controls designed to let parents monitor and control their children's use of a given service, the Code recognizes that children have certain privacy rights specifically against their parents. For example, the Code requires that children be notified whenever a parent is monitoring their activity. This requirement may not be intuitive to many companies, especially those in cultures with a more expansive view of a parent's right to monitor and control a child's online behavior. It also creates another potential conflict with COPPA, which mandates that operators of services directed to children under 13 give parents the unqualified ability to access and delete personal information the service collects from their children.

Steps to Take Now to Prepare

For companies that already take steps to comply with children's privacy laws like COPPA and GDPR-K, below are some items to focus on in 2021:

  1. Double-check your service's audience and the steps you take to screen out and/or protect children. As with COPPA, the Code applies to services that appeal to, or are directed to, children. A company with an underage following that wishes to avoid being subject to the Code should document and take additional steps to keep children from accessing its service, such as relying on neutral age screens and cookies to prevent resubmission (already encouraged under COPPA; a minimal sketch of this approach follows this list) or, for very high-risk activities, requiring additional measures such as a third-party age-verification service and/or collecting IDs used solely for age verification.
  2. Make sure all optional data collection/sharing settings are off by default for all users under 18 in the UK. This includes all analytics and personalization features that cannot reasonably be classified as part of the company's "core" service (a sketch of age-based defaults also follows this list). For younger users, the Code recommends "interventions" before the user can choose a less protective privacy setting, instructing the child to ask a parent for help. Given that UK users younger than 13 cannot legally consent to changes in personal data processing without a parent, companies should primarily design these "interventions" with 13-17-year-old users in mind.
  3. Work towards kid-friendly privacy disclosures in your privacy policy and "just-in-time" notices. Companies offering online services that appeal to children should consider adding a "kid-friendly" version of their privacy policy to supplement their more robust legal disclosures. The Code also recommends diagrams, cartoons and/or graphics to explain privacy concepts to children between the ages of 6 and 12.
  4. Ensure your documentation is in place and consult with children where possible. The Code requires all companies to document their compliance through a Data Protection Impact Assessment (DPIA), and the ICO includes a sample DPIA template as part of the Code. One key requirement of an effective DPIA is actual consultation with children and parents to find out how they use the service, the risks they might encounter and whether they understand the privacy disclosures the company plans to present in the finished product. Because it can be difficult to have a full picture of how kids interact with a feature before it launches, DPIAs should ideally be updated over time.
  5. Consider how kids' privacy fits into a larger global compliance program. Companies that do allow underage users should take a holistic view of how their service could harm them and consider how to craft a child-friendly experience starting from the early stages of design. This includes thinking about the ads kids might see on the platform, how the service's monetization model affects kids, and what other interactions a kid might have with other users. Remember that children are impressionable and often lie about their age to access services they shouldn't; this is why they need extra protection.
  6. Check with your certification provider about compliance with the Code. The ICO contemplates that certification schemes will become available to certify adherence to the Code. Many such providers also offer safe harbor protection under COPPA, so companies should consider whether joining such a program is the right move for their business.
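As referenced in item 1, below is a minimal TypeScript sketch of a neutral age screen backed by a re-submission cookie. It assumes an Express app with cookie-parser; the route path, cookie name ("age_gate") and one-day retention period are hypothetical choices, not values prescribed by the Code or COPPA.

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());
app.use(express.urlencoded({ extended: false }));

// 13 screens out COPPA-age children; a service trying to avoid the Code
// entirely would instead need to screen out all users under 18.
const MIN_AGE = 13;
const ONE_DAY_MS = 24 * 60 * 60 * 1000;

function ageFromBirthdate(birthdate: Date, now = new Date()): number {
  const age = now.getFullYear() - birthdate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthdate.getMonth() ||
    (now.getMonth() === birthdate.getMonth() &&
      now.getDate() >= birthdate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

// A "neutral" screen asks for a full birthdate without signaling the
// passing answer (no pre-filled adult year, no "must be 13+" warning).
app.post("/age-gate", (req, res) => {
  // The cookie keeps a child from backing up and re-entering an older age.
  if (req.cookies.age_gate === "failed") {
    return res.status(403).send("This service is not available to you.");
  }
  const birthdate = new Date(String(req.body.birthdate)); // e.g. "2010-04-01"
  if (Number.isNaN(birthdate.getTime()) || ageFromBirthdate(birthdate) < MIN_AGE) {
    res.cookie("age_gate", "failed", { httpOnly: true, maxAge: ONE_DAY_MS });
    return res.status(403).send("This service is not available to you.");
  }
  res.cookie("age_gate", "passed", { httpOnly: true, maxAge: ONE_DAY_MS });
  return res.redirect("/");
});

app.listen(3000);
```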
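And as referenced in item 2, here is a minimal sketch of default-off settings for UK users under 18. The setting names are hypothetical examples of "non-core" processing under the Code; which features count as "core" is the judgment call discussed above, not something this sketch resolves.

```typescript
interface PrivacySettings {
  analytics: boolean;       // "service enhancement" analytics
  personalization: boolean; // personalized content and recommendations
  behavioralAds: boolean;   // ad targeting based on user data
  geolocation: boolean;     // precise location sharing
}

function defaultSettings(age: number, isUkUser: boolean): PrivacySettings {
  // Optional processing is off by default for UK users under 18;
  // other users receive the service's normal defaults.
  const optionalOn = !(isUkUser && age < 18);
  return {
    analytics: optionalOn,
    personalization: optionalOn,
    behavioralAds: optionalOn,
    geolocation: false, // high risk for everyone; keep off by default
  };
}

// Gate changes per the Code's recommended "interventions": 13-17-year-olds
// get a prompt to involve a parent; under-13s need parental consent.
type ChangeGate = "allowed" | "intervene" | "parental-consent";

function settingChangeGate(age: number, isUkUser: boolean): ChangeGate {
  if (!isUkUser || age >= 18) return "allowed";
  return age >= 13 ? "intervene" : "parental-consent";
}
```

Whatever shape the settings object takes, the defaults should be enforced server-side so that a modified client cannot opt a child into optional processing.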

While the Code presents a number of new challenges, now is the best time for companies to think through their compliance programs and implement changes, always keeping in mind a smart, risk-based approach. There will be even more to consider on the horizon: the Irish Data Protection Commission has announced its own children's code, which remains in draft consultation until March 2021. As always, Fenwick's attorneys are here to help you navigate these constantly shifting standards.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.