The first months of 2019 have seen several key developments in the world of children's privacy. There have been major enforcement actions, new legislative proposals, and new best practices and guidance issued, both in the United States and abroad.

The running theme in all these developments is that companies — especially those that do not intend to target children, or are unaware that they do — now need to account for underage users and take necessary precautions to secure their privacy. It is not sufficient to simply state in a company privacy policy that one's service is not intended for users under 13; if kids are using a service, a regulator is going to demand that the service be kid-friendly.

This article analyzes recent developments and includes steps companies can take right now to protect their business.

Executive Summary

  • In the $5.7 million TikTok/Musical.ly settlement, the FTC relied heavily on "reliable empirical evidence" of audience composition when determining which sites are subject to the Children's Online Privacy Protection Act (COPPA).
  • The California Consumer Privacy Act (CCPA) extends additional privacy protections to California minors under the age of 16, creating a dilemma for U.S. businesses that must decide whether to single out Californians for special treatment.
  • Proposed federal amendments to COPPA would strengthen the law and hold providers accountable if they have "constructive knowledge" that a user is underage.
  • The UK Information Commissioner's Office proposed a sweeping new code of practice with strict requirements for providers to protect children's well-being.

FTC Considers TikTok/Musical.ly "Directed to Children," Leading to a Record-Setting Fine

On February 27, 2019, the FTC announced a $5.7 million fine against popular short-form video-sharing platform TikTok, formerly known as Musical.ly, as part of a consent order over allegations that the company violated COPPA. The settlement is the largest COPPA penalty the FTC has ever obtained.

COPPA applies to the operator of any website or online service "directed to children" that collects personal information from children (defined as those under 13 years of age), or any website or online service that has actual knowledge that it collects personal information from children. Unless an exception applies, an operator subject to COPPA must obtain verifiable parental consent before collecting any personal information from a child.

The FTC will evaluate whether a site or service is "directed to children" based on a variety of factors such as: the subject matter of the site or service, its visual content, the use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, and language or other characteristics of the website or online service. In addition to these factors, the Commission may also generally rely on other "competent and reliable empirical evidence regarding audience composition."

The TikTok/Musical.ly app at issue allowed users to create and share short videos with other users. These videos typically featured users lip-syncing to popular music. Musical.ly's 2018 Privacy Policy stated that "The Platform is not directed at children under the age of 13." Nonetheless, the FTC, weighing the factors, concluded that Musical.ly was a child-directed service. The complaint stated that creating and sharing lip-syncing videos was a "child-oriented activity" and pointed to the presence of emojis like "cute animals and smiley faces," "simple tools" for sharing content, songs related to "Disney" and "school," and kid-friendly celebrities such as Katy Perry, Selena Gomez, Ariana Grande, Meghan Trainor and others.

This is a broad and somewhat striking interpretation, given that this type of content — lip-syncing, approachable design, bright colors, emojis, presence of pop music, etc. — can arguably be found on many sites and services not directed to children. (Take, for example, RuPaul's Drag Race and its associated app.) While the FTC interpretation here appears to set a worrisome precedent, it's possible the Commission may be relying less on the "subject matter" factors, and more heavily on other "competent and reliable empirical evidence" of audience composition.

According to the FTC complaint, there was quite damning empirical evidence that Musical.ly's staff was aware of the platform's popularity with children. For example, Musical.ly had received "thousands" of complaints from parents that their children under 13 had created accounts. Meanwhile, prominent press articles highlighted the popularity of the app among tweens and younger children; Musical.ly seemed to acknowledge this itself when it published guidance stating, "If you have a young child on Musical.ly, please be sure to monitor their activity on the App." Lastly, the Children's Advertising Review Unit (CARU) met with Musical.ly's co-founder and flagged to the company the app's popularity with kids; when Musical.ly failed to address the issues CARU raised, CARU ultimately referred the case to the FTC.

As Musical.ly likely did not set out to appeal to kids when it launched its service, other companies should view the TikTok/Musical.ly settlement as a cautionary tale. However, if there is a silver lining here, it is that the FTC's shift toward relying on "reliable empirical evidence" of audience composition should provide a bit more certainty than the "subject matter" factors do. A company that does its own due diligence and can show hard evidence that kids are not using its service (for instance, through market surveys or demographic studies) should be better positioned to mitigate its risk.

CCPA: California's Privacy Rules for Minors Could Be a Major Headache Across the US

The CCPA, set to go into effect January 1, 2020, creates various new compliance burdens for many companies doing business in California. Among them is the requirement that a business may not sell the personal information of a consumer it knows to be less than 16 years old without affirmative, opt-in consent from a parent or guardian (or from the consumer themselves, if between the ages of 13 and 16). Moreover, under the CCPA, a company will still be responsible if it "willfully disregards" a consumer's age.

Notably, "sale" is defined broadly under the law (for example, it could include behavioral advertising or joint marketing promotions), and it will be difficult to obtain opt-in consent at scale. As a result, some have argued that, practically speaking, this change raises the minimum age from 13 (COPPA's threshold) to 16 for California residents only.

The CCPA creates a major compliance burden for businesses, as many global companies do not currently distinguish between users of different states within the United States. To adapt, some companies are considering adding a "state" field to user accounts (potentially based on IP address) and singling out California residents for different treatment. Another option is to raise the minimum age to 16 for the entire United States, though this approach might have a larger impact on revenue. It is also important to note that the law is silent regarding retroactive effect, so it is unclear whether users who are above the age of 13 but under the age of 16 at the time the CCPA is effective may be treated as adults under the law, or if they must go back to being treated as children.
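To illustrate the state-field approach described above, here is a minimal sketch of gating a "sale" of personal information on residence and age under the CCPA's opt-in rules. All names here are hypothetical, invented for illustration; they are not drawn from the statute or any real codebase.

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    state: str  # e.g., self-reported at signup or inferred from IP address
    self_opt_in: bool = False      # opt-in given by a 13-to-15-year-old
    parental_opt_in: bool = False  # opt-in given by a parent or guardian

def may_sell_personal_info(user: User) -> bool:
    """Hypothetical check modeling the CCPA's minor opt-in rules.

    Assumes COPPA handling for under-13 users (verifiable parental
    consent for collection) is handled elsewhere; this only models the
    California-specific restriction on "selling" a minor's data.
    """
    if user.state != "CA":
        return True  # the CCPA's minor rules cover California residents
    if user.age >= 16:
        return True  # 16 and over: the default opt-out model applies
    if 13 <= user.age < 16:
        return user.self_opt_in  # the minor may consent directly
    return user.parental_opt_in  # under 13: parent/guardian must opt in
```

A company that instead raises its minimum age to 16 nationwide could drop the `state` check entirely, trading revenue for a simpler compliance posture.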

That said, amendments to the CCPA are expected before it goes into effect in 2020. Moreover, the CCPA is vulnerable to a challenge based on federal preemption, and Congress could explicitly preempt the CCPA by passing new legislation, such as the bill described in the next section.

COPPA: New Proposed Amendments

In March 2019, Senators Ed Markey (D-MA) and Josh Hawley (R-MO) introduced a bill to amend and further expand the scope of COPPA. In addition to raising the minimum age to 16 across the United States, the bill text contains several other key provisions:

  • "Directed to Children" definition. In addition to looking at reliable empirical evidence of audience composition (as in the Musical.ly case described above), the bill would also allow the FTC to look at reliable empirical evidence related to "the intended audience" of the app. In other words, any internal communications discussing what a developer or marketing team wants its audience to be could be used against the company as evidence.
  • Continuation of Service required. The service provider must still provide the service to the minor even after deleting the child's personal information, unless the operator is not "capable of providing such service without such information." As a result, for a service subject to COPPA, there must be a child-friendly build available: A developer cannot rely on kicking underage players out.
  • "Constructive knowledge" regarding individual's minor status. Similar to the CCPA, providers are required to comply if they have "constructive" knowledge of a child's age. It is unclear what level of information would satisfy this standard or the extent to which a service provider must investigate users' ages.
  • FTC to establish minimum security standards for all connected devices. The bill also targets connected devices for children, and requires them to adhere to minimum security standards, to be determined by the FTC.

The UK ICO Proposes Sweeping Guidance for Underage Privacy

The United States is not the only jurisdiction with sweeping children's privacy laws. The EU General Data Protection Regulation (GDPR) that went into effect in 2018 contains parallel protections, and EU member states may set their own minimum age standards (anywhere from 13 to 16).

More recently, in April 2019, the UK Information Commissioner's Office (ICO) released a 122-page guidance document entitled "Age appropriate design: a code of practice for online services." The document is open for consultation until May 31, after which the ICO will draft a final version to be laid before Parliament, with the aim of coming into effect before the end of the year. Members of the public can read the code and respond to a survey to share their views.

Looking at the code of practice, there is a great deal that, if it remains in the final version, will create substantial new compliance burdens for companies. Three key takeaways are:

  • Broader scope of services subject to rules. The burden of proof is on a provider to show its services are not appealing to kids, rather than relying on a standard related to the provider's actual knowledge. Moreover, online services that have a substantial number of underage users are subject to the rules, even if underage users are an insubstantial percentage of the overall base. This could mean many popular adult-oriented services may have to consider children for the first time.
  • Strict transparency/control requirements. The ICO recommends a dynamic and comprehensive privacy notice, designed for kids to understand. Moreover, all settings must be set to "high privacy" by default, with features and notices specifically designed to steer kids toward making good decisions.
  • "Targeting" and "profiling" must be off by default. Personalization of services, including using a child's data to suggest things like in-app purchases, must be opt-in. Previous guidance under COPPA suggested that contextual personalization might be okay without an opt-in, but the ICO rules require that personalization efforts be off by default, unless keeping them on is in the "best interest" of the child.
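As a rough illustration of the "high privacy by default" and "off by default" requirements above, the sketch below shows one way default account settings might be structured. The setting names are invented for illustration and do not come from the ICO code itself.

```python
# Hypothetical defaults for a new child account under the draft ICO code:
# every privacy-impacting feature starts off and requires an explicit,
# per-feature opt-in rather than a single blanket consent.
DEFAULT_CHILD_SETTINGS = {
    "profile_visibility": "private",      # "high privacy" by default
    "geolocation": False,
    "behavioral_ads": False,              # profiling off by default
    "personalized_recommendations": False,
    "data_sharing_with_partners": False,
}

def create_account_settings(opt_ins=None):
    """Return settings for a new account: defaults plus explicit opt-ins."""
    settings = dict(DEFAULT_CHILD_SETTINGS)
    for key in (opt_ins or []):
        # Only known boolean features can be switched on individually;
        # profile visibility stays private unless changed deliberately.
        if key in settings and key != "profile_visibility":
            settings[key] = True
    return settings
```

The design point is that consent is granular and affirmative: nothing privacy-impacting turns on as a side effect of account creation.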

Five Practical Tips for Reducing Your Risk

The developments above make it clear that children's privacy is a hot topic, and it's unlikely to go away anytime soon. While much of the law here is in flux, there are a few things companies can do now to prepare:

  1. Re-analyze your site/services' appeal to children. The Musical.ly case and the ICO guidance both emphasize that a service may be subject to the rules even if the developers did not intend to target kids. Keep on the lookout for empirical evidence of your app's audience composition. For example, see if your marketing team has data regarding the target demographics of your app. Keep up on the news and see if the app is starting to become popular with younger users. Determine whether your app is being featured on any "Children's" or "Families" lists. In any case, counsel should be involved in this investigation to preserve attorney-client privilege. If there is any doubt, consider gating users based on age.
  2. Age-gate the right way. With respect to implementing an age-gate, the FTC has stated that a service provider cannot encourage children to lie about their age or make it easy for the child to circumvent the gate (for instance, by clicking the "back" button and trying again). When implementing an age-gate for a service that is already live, make sure that the gate is presented to existing users as well as new users, and that the language used in the age-gate is appropriate for your app's audience. Once you have users' ages, delete any personal information you may have collected from or about underage users.
  3. Perform privacy due diligence during and after M&A. Musical.ly was acquired by ByteDance Ltd. and merged with the TikTok app under the TikTok name in August 2018. If you acquire a company, make sure you do thorough due diligence on any privacy issues your target might be bringing along. Acquirers should conduct a post-close privacy assessment to evaluate and remediate any risks. COPPA Safe Harbor programs are slowly gaining popularity with some companies; consider if they might be right for you.
  4. Train, train, train: Teach your customer service reps to handle underage users. Several privacy laws specifically require employees who handle sensitive data to be adequately trained: It's important that customer service reps can handle complaints from parents and know what to do when it sounds like a child might be on the other end of a customer support line. Customer support should also keep in close contact with your company's legal team and flag if they sense that a game might be unexpectedly popular with kids.
  5. Consider what you can do now to adapt to the ICO guidance. The ICO guidance is not yet binding, but the requirements are extremely strict. Steps you can take to prepare in the meantime include drafting kid-friendly privacy notices and adjusting default privacy settings to give children a more privacy-protective experience.
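The FTC's age-gating guidance in tip 2 above can be sketched as follows. This is a minimal, hypothetical example (the function names and the in-memory "failed attempts" store are assumptions for illustration): ask for a full date of birth in neutral terms, and persist a flag so that a failed attempt cannot simply be retried with a different date.

```python
import datetime

MINIMUM_AGE = 13
# In practice this would be a persistent flag (e.g., a cookie or device
# identifier stored server-side); a module-level set keeps the sketch simple.
_failed_devices = set()

def passes_age_gate(device_id: str, birth_date: datetime.date,
                    today: datetime.date) -> bool:
    """Neutral age gate: no hint of the cutoff, and no easy retry."""
    if device_id in _failed_devices:
        return False  # block circumvention via the "back button and retry"
    # Compute age, accounting for whether this year's birthday has passed.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    if age < MINIMUM_AGE:
        _failed_devices.add(device_id)
        return False
    return True
```

Note what the sketch deliberately omits: the prompt should not say "you must be 13 to continue," because that invites children to lie, and failing the gate should route the user to a child-appropriate experience (or data deletion), not a dead end.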

These are complicated issues, so companies should work closely with privacy counsel during this period of enhanced focus on children's privacy and take the necessary precautions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.