This article provides Womble’s recommendations for companies processing children’s data, based on a recent discussion with Associate Director Wiseman regarding his impressions of the current state of the FTC’s enforcement of children’s privacy and online safety protections. Chair Ferguson and the FTC signaled that children’s privacy and online safety would be a key enforcement area for the FTC in 2025, and Mr. Wiseman acknowledged it remains so in 2026. In particular, companies should be prepared to demonstrate compliance with the COPPA amendments by April 22, 2026, and with the TAKE IT DOWN Act provisions enforced by the FTC by May 19, 2026.

The Legal and Regulatory Landscape

Children’s privacy and online safety is in the spotlight in 2026. At the end of 2025, the House Energy & Commerce Committee released nineteen draft bills focused on minors’ privacy and online safety. This federal legislative push follows significant federal and state regulatory enforcement activity in this space over the last several years: the FTC announced several settlements at the end of 2025, and states have continued to target social media platforms.

At the federal level, the FTC has two new sets of requirements to enforce: the amended COPPA Rule and the soon-to-be-effective TAKE IT DOWN Act. Associate Director Wiseman acknowledged that the FTC’s DPIP team is actively preparing to enforce both. Womble advises against a “wait and see” approach; companies subject to these laws should comply as they take effect.

Key Takeaways and Action Items

I. Updated COPPA Regulations

For the first time since 2013, the FTC has updated its COPPA Rule, with companies having until April 22, 2026, to comply. Among other changes, the updated COPPA Rule expands the definition of “personal information” to include biometric identifiers such as fingerprints, handprints, retina patterns, iris patterns, genetic data, voiceprints, gait patterns, facial templates, or faceprints, as well as government-issued identifiers like state identification card numbers, birth certificate numbers, and passport numbers. The updated COPPA Rule also gives companies more options to obtain parental consent, including the “Text Plus” method.

As emphasized by Associate Director Wiseman, the updated COPPA Rule now requires companies subject to COPPA to establish and implement the following policies and practices. Womble has generally summarized what each involves below.

  • Retention Policy. A written data retention policy setting forth the purposes for which children’s personal information is collected, the business need for retaining such information, and a timeframe for deletion of such information. The written data retention policy must be included in the company’s notice on its website or online service.
  • Child-Specific Information Security Program. A written information security program that contains safeguards appropriate to the sensitivity of the personal information collected from children and the company’s size, complexity, and nature and scope of activities. Note that the updated COPPA Rule outlines specific steps a company must take to meet this requirement, including an annual risk assessment and an annual evaluation and accounting of known risks.
  • Consent for Third-Party Disclosures. Where a company discloses personal information to one or more third parties, its direct notice to parents must identify those third parties (or the specific categories of them) and the purposes for such disclosure, and must state that the parent can consent to the collection and use of the child’s personal information without consenting to the disclosure of such personal information to third parties, except to the extent such disclosure is integral to the website or online service.
  • Online Notice Updates. A company’s online notice must now include the company’s data retention policy and disclose the identities or specific categories of any third parties to which the company discloses a child’s personal information (either directly in the notice or via hyperlink), the purposes for such disclosure, and state that parents may consent to the collection and use of their child’s information without agreeing to the third-party disclosure except to the extent that such disclosure is integral to the website or online service. 

Time-Sensitive Action Items:

  1. Ensure your company has a data retention policy that explicitly addresses children’s data.
  2. Make sure your company can demonstrate that its information security program—in some way—accounts for children’s data.
  3. Update the language of your company’s online children’s privacy notice and direct notice to parents no later than April 22, 2026.
  4. Determine whether any of your company’s contracts with your service providers require adjustments based on any new effects of the COPPA Rule updates on your company.

II. The TAKE IT DOWN Act

The FTC begins enforcing the TAKE IT DOWN Act (“Act”) on May 19, 2026. The Act imposes both criminal and civil liability on companies or individuals that violate its requirements around non-consensual intimate imagery, including imagery of minors. The FTC’s purview covers the Act’s civil prohibitions and takedown provisions; certain criminal prohibitions enforced by the DOJ are already in effect. In general, the Act builds on other recent federal efforts to combat exploitive sexual imagery online, including the Revising Existing Procedures on Reporting via Technology Act (“REPORT Act”), signed into law in 2024. The REPORT Act expanded providers’ mandatory reporting obligations to the National Center for Missing & Exploited Children (“NCMEC”) to include content involving child sex trafficking and enticement of a minor, on top of the already-mandatory reporting obligations for content involving child sexual abuse material (“CSAM”), sexual exploitation of children, child trafficking, misleading domain names, and production of CSAM for importation into the United States.

Broadly speaking, the Act targets the proliferation of nonconsensual intimate imagery (NCII) and deepfakes, imposing obligations on “covered platforms,” meaning a website, online service, online application, or mobile application that (i) serves the public and (ii) either primarily provides a forum for user-generated content (including messages, videos, images, games, and audio files) or publishes, curates, hosts, or makes available content of nonconsensual intimate visual depictions in the regular course of trade or business. The Act is addressed here as part of this overview of children’s privacy initiatives heading into 2026 because exploitive content often features children, but it is not an age-limited law and applies to exploitive content of adults as well.

Companies meeting the definition of a “covered platform” are subject to a few main requirements:

  • Consumer Notification Process. Covered platforms must establish a process for an identifiable individual (or an authorized person acting on behalf of such individual) to notify the covered platform of an intimate visual depiction published on the covered platform that includes a depiction of the identifiable individual and was published without the consent of the identifiable individual.
  • Removal Request Process. Covered platforms must establish a process for an identifiable individual (or an authorized person acting on behalf of such individual) to submit a request for the covered platform to remove such intimate visual depiction.
  • Notice of Process. Covered platforms must provide consumers with a clear and conspicuous notice that explains the covered platform’s notice and removal process.
  • Timely Removal. Covered platforms must, within 48 hours after receiving a removal request, (i) remove the intimate visual depiction; and (ii) make reasonable efforts to identify and remove any known identical copies of such depiction.

While the Act’s requirements are new, the FTC began mapping out its interpretation of some of them in its 2025 settlement with Aylo. Companies can look to that settlement for guidance on the FTC’s reading of the Act’s requirements (although note that the settlement granted Aylo 72 hours to take down reported content, whereas the Act allows only 48). In general, FTC enforcement actions can be a helpful source of information about how the FTC interprets a legal requirement, even where that information arises from a specific factual context that may not reflect the reader’s business.

Time-Sensitive Action Items:

  1. Determine whether your company meets the definition of a “covered platform” under the Act. Given the FTC’s focus on protecting minors, companies should put themselves in the FTC’s shoes and take a broad interpretation of what constitutes a covered platform.
  2. Review and update internal policies governing your company’s NCMEC mandatory reporting obligations (which apply more broadly than the TAKE IT DOWN Act).

III. Age Verification

The FTC has now released the agenda for its online Age Verification Workshop, scheduled for January 28, 2026. The FTC describes the workshop as bringing together a diverse group of stakeholders, including researchers, academics, industry representatives, consumer advocates, and government regulators, to discuss why age verification matters, age verification and estimation tools, navigating the regulatory contours of age verification, how to deploy age verification more widely, and the interplay between age verification technologies and COPPA requirements.

Womble believes the FTC’s focus on age verification and estimation is one to watch. There have been bipartisan efforts to narrow or close the “actual knowledge” prerequisite for COPPA to apply, including at the state level with four states’ passage of app store accountability acts. For those attending the FTC’s workshop, be on the lookout for what FTC officials say about when they believe a company has “actual knowledge” that it is collecting data from a user under the age of 13.

Ongoing Action Item

Remain watchful this year for any updated guidance from the FTC regarding age verification. As more companies implement it, practical “how to” lessons may emerge from both a legal and a technical perspective.

Closing Thoughts

The FTC communicated in a number of ways in 2025 that it intends to robustly enforce children’s privacy laws, and all indications from the FTC point to a similar sustained focus in 2026. Companies collecting children’s data, providing services to children, or offering services they know could be used by children should, at a minimum, review whether their practices comply with the updated COPPA Rule and, if applicable, take steps to comply in full. Covered platforms subject to the TAKE IT DOWN Act should likewise make sure they will be compliant. And given that nearly every regulator and legislature is focused on children’s privacy and online safety, companies should also ensure that their practices comply with state privacy laws.