In August, Viacom and a number of other app developers and ad-tech companies reached a settlement with parents who had alleged that the companies were illegally selling children's personal information for behavioral advertising. The deal involves 16 separate settlement agreements that impose limitations on the collection and use of children's personal information and could have far-reaching implications across the mobile app industry.

In 2017, parents whose children used gaming apps filed three lawsuits alleging that the defendants had embedded code into the games to collect personal information without parental consent. The code allegedly tracked children's activity across apps and devices and used that information for targeted advertising. The majority of the parents' claims survived motions to dismiss in May 2019, including state law claims under the California constitutional right to privacy, the California Unfair Competition Law, New York General Business Law, and Massachusetts privacy law, as well as a common law claim of intrusion upon seclusion.

According to the settlement motion, which is pending approval before the U.S. District Court for the Northern District of California, the app developer defendants—Viacom, Kiloo and Sybo—agreed to make certain changes regarding children's privacy in the gaming apps at issue in the litigation, as well as in dozens of other gaming apps. Among other provisions, the defendants committed to update certain age-screening gates to comply with federal guidance, limit the collection of location information and create an enforceable right for class members to ensure that certain apps meet federal privacy standards, including the Children's Online Privacy Protection Act (COPPA) and its implementing regulations.

The agreement requires all but one of the ad-tech defendants, including Twitter, AdColony and Chartboost (among others), to limit their services to contextual advertising—i.e., advertising based on the content of the surrounding app or page—in thousands of apps directed to children. It prohibits behavioral advertising—i.e., advertising based on the collection and retention of data about the consumer's online activity across different websites and apps over time—in these apps. Additionally, the ad-tech companies agreed to serve only contextual advertising in any app where a user is identified as under age 13.

COPPA prohibits behavioral advertising on child-directed apps without parental notice and consent. Although COPPA does not contain a private right of action, this settlement provides an example of how litigants can nonetheless seek to compel COPPA compliance—and even impose more stringent standards—through class action litigation.

Last year, the Federal Trade Commission (FTC), the agency charged with enforcing COPPA, solicited public comments and held a workshop on potential updates to COPPA due to continued rapid changes in technology. The agency also entered a $170 million settlement agreement with YouTube and Google in September 2019 over alleged COPPA violations.

Congress has also shown continued interest in children's privacy issues. In March, Sen. Ed Markey (D-MA) and Sen. Richard Blumenthal (D-CT) introduced the Kids Internet Design and Safety (KIDS) Act, which aims to create protections for online users under the age of 16 by limiting the use of certain features and advertising practices designed to appeal to children. Other bills currently pending before Congress, such as the Kids PRIVCY Act and the PROTECT Kids Act, seek to strengthen protections for children online by amending COPPA.

Originally Published by Akin Gump, December 2020

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.