The Federal Trade Commission (FTC) and the New York Attorney General have announced that Google and YouTube will pay a record $170 million to settle allegations that YouTube violated the Children’s Online Privacy Protection Act (COPPA) Rule by collecting children’s personal information without required notice or verifiable parental consent.

The settlement marks the first instance in which the FTC has found a general audience online service in violation of the COPPA Rule for collecting personal information from a child-directed portion of its service, and demonstrates the FTC’s continued focus on children’s online privacy.

The COPPA Rule applies to operators of websites or online services that are directed to children under age 13 or that have actual knowledge that they are collecting personal information from a child under age 13. In 2013, the FTC amended the COPPA Rule to create liability for third-party providers like ad networks and plug-ins that have actual knowledge that they are collecting personal information from users of other child-directed websites or online services. In those instances, the third-party provider is deemed to be an online service directed to children by virtue of its collection of personal information from users of another child-directed service.

The FTC complaint alleges YouTube and Google collected persistent identifiers from users of YouTube channels that they knew were directed to children

In its complaint, the FTC alleged that YouTube and Google were required to comply with the COPPA Rule because they had actual knowledge that they were collecting personal information, including persistent identifiers for behavioral advertising purposes, from users of child-directed YouTube channels.

The FTC took the position that these individual YouTube channels are online services and that, because YouTube and Google had actual knowledge that the channels were child-directed, YouTube was deemed to be an operator of those online services responsible for COPPA compliance. The complaint alleged that YouTube and Google failed to provide parents with COPPA-required notices of their information practices or obtain verifiable parental consent.

The YouTube platform contained channels directed to children

The FTC considers several factors in determining whether a website or online service is directed to children, including subject matter, visual and audio content, language, audience composition, intended audience, use of animated characters, and the inclusion of child-oriented activities and incentives. In considering these factors, the FTC found that the YouTube platform hosted numerous channels that are directed to children. Many of these channels self-identified as being directed to children (e.g., in the “About” section of their YouTube channel webpage) or contained content with indicia that it was directed to children (for example, animated characters or subject matter that generally appeals to children).

YouTube’s actual knowledge

The FTC alleged that YouTube and Google had actual knowledge that the channels from which they collected personal information were child-directed. According to the FTC, the parties obtained this actual knowledge through communications with channel owners, who told YouTube and Google that their channels were directed to children; through YouTube’s internal automated content rating system, which identified certain content as directed to children; through YouTube and Google employees’ manual review of content from the YouTube website to curate specific content for the YouTube Kids app; and through YouTube and Google’s own research.

As evidence of the parties’ actual knowledge, the FTC pointed to the companies’ marketing of the YouTube service as a top destination for kids in presentations to makers of popular children’s products and brands.

According to the FTC, even though YouTube and Google consistently referred to the YouTube platform as a top website for children, they attempted to fall outside of the scope of COPPA by making inconsistent statements that YouTube was a general audience website and did not have users under the age of 13.

Even where a channel self-identified as child-directed, or YouTube’s content rating system assigned a “Made for Kids” or “Y” rating for content uploaded to the platform, YouTube and Google allegedly continued to collect personal information to engage in behavioral advertising.

The FTC’s settlement order requires Google and YouTube to:

  • Develop, implement, and maintain a system for YouTube channels to designate whether their content is directed to children.
  • Cease using, disclosing, or benefiting from personal information previously collected from users of content designated as directed to children, once the YouTube channels have made such designations (notably, the FTC did not require Google to delete such personal information).
  • Make reasonable technological efforts to ensure future compliance with COPPA parental notice and verifiable parental consent requirements.
  • Provide an annual COPPA compliance training to all individuals responsible for managing YouTube’s relationships with YouTube channel owners.
  • Create certain records associated with the collection, use, and disclosure of children’s personal information for a period of 10 years, and maintain such records for 5 years.
  • Certify compliance with COPPA in reports to the FTC and State of New York under penalty of perjury for a one-year period, and certify certain changes in contact information and entity structure for a period of 10 years.
  • Pay a civil penalty of $136 million to the FTC, and $34 million to the State of New York Department of Law.

YouTube has announced changes to its children’s information practices

In response to the FTC settlement, YouTube announced on its official blog that it will make changes to the treatment of children’s content on the platform, stating that it “will treat data from anyone watching children’s content on YouTube as coming from a child, regardless of the age of the user.” The blog explained that in order to identify content made for kids, YouTube will require content creators to tell it when content is child-directed, and YouTube will also use machine learning to identify videos that clearly target young audiences.

Two FTC commissioners dissented, saying it’s not enough

In what is beginning to look like a trend, Commissioners Rebecca Kelly Slaughter and Rohit Chopra dissented from the settlement order, raising concerns that the order did not adequately deter children’s privacy violations.

The commissioners – who also dissented from the FTC’s settlement with Facebook in July – opined that the settlement should have included greater injunctive and monetary relief. Commissioner Slaughter expressed the concern that “the order does not require YouTube to police the channels that deceive by mis-designating their content, such as by requiring YouTube to put in place a technological backstop to identify undesignated child-directed content and turn off behavioral advertising.” Commissioner Chopra stated that “the terms of the settlement were not even significant enough to make Google issue a warning to its investors,” and that the monetary penalty is insufficient because Google and YouTube’s profits from their COPPA violations exceed the monetary penalty in the settlement order. Commissioner Chopra also objected to the FTC not holding individual executives personally liable – an objection that both he and Commissioner Slaughter previously lodged in the Facebook settlement.

Implications

The FTC settlement has important implications for content creators and hosting platforms alike. Notably, content creators should ensure compliance with COPPA to avoid strict liability for violations of children’s online privacy requirements. YouTube content creators should be especially vigilant, as FTC Chairman Simons announced that the FTC “intends to conduct a sweep of the YouTube platform to determine whether child-directed content is being properly designated as such in order to ensure that the channels themselves are complying with COPPA.”

Further, online platforms with portions of their services directed to kids need to assess their COPPA obligations. Even where child-directed portions of a platform consist of third-party user generated content (UGC), the platform operator will be responsible for COPPA compliance to the extent it knows it is collecting personal information through a portion of the platform containing child-directed UGC.

Platforms and third-party providers collecting personal information from other services should consider whether they are at risk of obtaining actual knowledge (inadvertently or otherwise) that such services are child-directed, and should consider whether risk mitigation strategies such as signaling are appropriate for their business.

Last, businesses should brace themselves for increasing fines from the FTC – the $170 million fine against YouTube is nearly 30 times higher than any previous COPPA fine, and may signal a trend of the FTC seeking higher financial penalties in COPPA enforcement actions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.