Policymakers around the world are attempting to pass legislation that addresses data privacy issues impacting kids and teens. To avoid costly fines and litigation, Ceren Canal Aruoba, co-leader of BRG’s consumer protection, product liability and environment practice, says C-suite executives and corporate integrity professionals would do well to understand the fast-changing regulatory landscape — and how to proactively mitigate risk.
Last August, a bipartisan coalition of 44 US state attorneys general sent a formal letter to major AI firms expressing concerns regarding the safety of children interacting with AI chatbots. This letter was just a hint of what’s to come.
As AI use intensifies, so will regulatory scrutiny of data privacy issues impacting kids and teens, whether related to harmful content, manipulative design, noncompliant surveillance, automated profiling or any number of other violations. A wave of costly penalties, settlements and reputational damage is already emerging, including Disney's settlement last year of allegations that it unlawfully collected children's personal data.
But the US is not alone, not by a long shot, which could further complicate matters for corporate compliance, data privacy and cybersecurity professionals.
Global developments
Many jurisdictions around the world have recently passed, or are attempting to pass, legislation addressing data privacy issues affecting kids and teens. Most of these efforts aim to create guardrails, often limiting companies' ability to advertise to people under 18 and to collect and profit from their personal data.
Broadly speaking, new policies extend beyond data privacy into product governance and algorithmic accountability (a growing number of jurisdictions, for example, ban the use of algorithmic personalization and engagement features for minors). Enforcement is picking up, too, and sites, apps and platforms could be vulnerable even if they aren't specifically designed for or targeted at children.
A few key regulatory developments to keep top-of-mind:
US
Actions are taking place at the state and federal levels, including significant amendments to the Children's Online Privacy Protection Act (COPPA) that went into effect in June 2025 and for which operators have until April 22, 2026, to comply. Though COPPA applies to children under 13, there is pending federal legislation to expand its protections to those under 17. This legislation would also ban targeted advertising to minors and require companies to create an "eraser button" for users to delete their own data. Numerous states have passed their own legislation as well, including: age-appropriate design code laws in Nebraska and Vermont; social media and app store laws that require age verification and parental consent in Utah, Texas, Louisiana and Nebraska; and youth-related updates to the California Privacy Rights Act.
EU
Age assurance guidelines adopted by the European Data Protection Board in February 2025 will direct enforcement of age-gating methods across the EU. Relatedly, the EU may revise the GDPR as part of a broader digital omnibus effort. While the changes wouldn’t be written specifically for children, they could still materially change how companies collect and use children’s data.
UK
Multiple overlapping laws now govern children's online experiences. Together, they create stricter design, data-use and safety obligations for any digital service children might use, backed by active enforcement through 2026. For instance, the age-appropriate design code is a UK framework that requires digital services likely to be accessed by children to minimize data collection, default to high privacy settings and avoid nudging or dark patterns. Here is a summary of other measures:
- In April 2024, the Information Commissioner’s Office introduced the children’s code strategy, which seeks to ensure that social media and video sharing platforms comply with data protection law and conform with the standards of the children’s code.
- The Data (Use and Access) Act 2025, which will be phased in by June 2026, makes changes to data consent and lawful processing and expands data-sharing permissions; while not child-specific, it interacts with the children's code and may lower friction for data use, making children's protections more dependent on design compliance than on consent alone.
- The 2023 Online Safety Act requires platforms to conduct formal children’s risk assessments, identify harms (content, algorithms, data use) and implement mitigation measures. Enforcement is handled by Ofcom and could lead to significant fines and service restrictions.
Australia
A coordinated set of laws will require platforms to verify or estimate users’ ages, restrict underage access to social media and redesign data practices to better protect children. These include:
- Effective in December 2025, the Online Safety Amendment (Social Media Minimum Age) Act 2024 requires age‑restricted social media platforms to take reasonable steps to prevent people younger than 16 from creating or maintaining accounts.
- Enforceable starting December 2025, the internet search engine services online safety code requires search engines to implement age assurance for logged-in users, especially those likely to be minors.
- The eSafety industry codes, covering hosting/search and app stores, social media and messaging, mandate robust age assurance and require default parental controls.
Critical business implications
These global developments signal a sea change in how regulators are treating data privacy issues affecting kids and teens. Costly fines will inevitably follow for companies that do not comply with the new regulations, likely paired with mandatory product redesigns and/or ongoing oversight.
In Australia, for example, eSafety code violations could lead to fines of up to $50 million, while FTC settlements for COPPA violations can be in the tens of millions of dollars (Epic Games, maker of the popular “Fortnite” video game, settled with the agency over such violations for $275 million in 2022). In the US, large-scale class actions may also come in the wake of enforcement actions.
For executives and corporate compliance professionals, there are several lessons to be learned from recent enforcement trends, particularly in the US:
- Aggressive dual enforcement: Federal agencies (e.g., the FTC and DOJ) and private litigants are simultaneously pursuing cases, seeking both regulatory remedies and monetary penalties.
- COPPA vs. US state law limits: Private US state-law claims involving under-13 minors often face preemption by COPPA. However, claims involving teens 13 to 17 are gaining traction under state privacy and consumer protection statutes.
- Algorithmic personalization under scrutiny: Lawsuits focus not just on data collection but on how data is used in recommendation engines, extending enforcement to platforms' analytics and engagement mechanisms.
- Responsibility in EdTech: Stakeholders should treat school data collection as requiring parental consent under COPPA, even when tools are deployed at scale under school contracts.
- Global momentum: Regulators across jurisdictions are converging on similar issues — age assurance, algorithmic profiling, engagement-driving features and dark-pattern design.


Ceren Canal Aruoba is managing director at BRG and co-leader of the firm’s consumer protection, product liability, and environment practice. 






