Children's Online Privacy Laws by State (2026)

The federal Children's Online Privacy Protection Act (COPPA) protects children under 13 from online data collection, but a growing number of states have concluded that federal protections do not go far enough. Since 2022, state legislatures across the country have passed laws targeting social media access, age verification, data minimization, and platform design for young users.
These state laws vary widely in scope, enforcement mechanisms, and the ages they cover. Some focus narrowly on social media account creation. Others impose broad obligations on any digital service a child might access. This guide surveys the major state laws, compares their approaches, and tracks ongoing legal challenges. For the federal framework, see our COPPA Compliance Guide.
California: Age-Appropriate Design Code Act (AADC)
California enacted the Age-Appropriate Design Code Act (AB 2273) in September 2022, modeled after the United Kingdom's Age Appropriate Design Code. The law took effect on July 1, 2024, and applies to any online service, product, or feature "likely to be accessed by children" under 18.
The AADC requires covered businesses to:
- Complete a Data Protection Impact Assessment (DPIA) before offering any new feature or product likely accessed by children
- Configure all default privacy settings to the highest level for child users
- Provide privacy information in language suited to the age of the child
- Refrain from profiling children by default (unless profiling is necessary to provide the service and appropriate safeguards are in place)
- Refrain from using dark patterns to steer children toward options that lower their privacy
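Taken together, the design obligations amount to "privacy by default" for child users. A minimal sketch of that idea in Python (the setting names are hypothetical illustrations, not taken from the statute):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_public: bool
    location_sharing: bool
    personalized_profiling: bool

def default_settings(is_child: bool) -> PrivacySettings:
    """Return default privacy settings for a new account.

    Child accounts get the most protective defaults; the user may
    only loosen them later through an explicit, informed choice.
    """
    if is_child:
        # Highest privacy level by default, no profiling
        return PrivacySettings(profile_public=False,
                               location_sharing=False,
                               personalized_profiling=False)
    return PrivacySettings(profile_public=True,
                           location_sharing=False,
                           personalized_profiling=True)
```

The key design point is that the protective configuration is the starting state, so a child who never touches the settings screen still receives it.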
A federal judge in the Northern District of California issued a preliminary injunction against enforcement in September 2023 in NetChoice v. Bonta, finding that portions of the law likely violated the First Amendment. In August 2024, the Ninth Circuit upheld the injunction as to the DPIA requirement, vacated the remainder, and remanded. As of early 2026, the core provisions remain enjoined while litigation continues.
California also strengthened its approach through AB 1394 (2023), which requires social media platforms to report apparent child sexual abuse material within specified timeframes, and SB 976 (2024), the Protecting Our Kids from Social Media Addiction Act, which prohibits platforms from providing addictive feeds to minors without parental consent.
For California's broader data privacy framework, see our California Data Privacy Laws page. For family law context, see California Child Support Laws.
Utah: Social Media Regulation Act (SB 152)
Utah became one of the first states to directly regulate minors' access to social media. Governor Cox signed SB 152 in March 2023, and the amended version took effect on October 1, 2024.
Key provisions of Utah's law:
- Social media companies must verify the age of all Utah account holders
- Minors (under 18) need parental consent to create a social media account
- Parents receive access to their minor child's account
- Platforms cannot collect, use, or disclose a minor's personal data without parental consent
- Platforms must block minor accounts from access between 10:30 PM and 6:30 AM by default, unless a parent adjusts the setting (curfew provision)
- Companies face penalties up to $250,000 per violation, plus $2,500 per day of continued violation
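The curfew window, 10:30 PM to 6:30 AM, wraps past midnight, which is an easy edge case to get wrong in a single range check. A minimal sketch in Python of testing whether a local time falls inside such a window (a hypothetical helper, not statutory language):

```python
from datetime import time

CURFEW_START = time(22, 30)  # 10:30 PM
CURFEW_END = time(6, 30)     # 6:30 AM

def in_curfew_window(now: time) -> bool:
    """True if `now` falls in the overnight restricted window.

    Because the window crosses midnight, a time is inside it when
    it is at-or-after the start OR before the end, not between them.
    """
    return now >= CURFEW_START or now < CURFEW_END
```

A naive `CURFEW_START <= now < CURFEW_END` would be false for every time of day here, since the start is later than the end; the `or` form handles the wrap correctly.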
The law has faced criticism over its age verification mandate and parental access provisions. Industry groups challenged the original version, leading to amendments that narrowed some requirements. Enforcement began in 2024 under the Utah Division of Consumer Protection.
For related Utah legal information, see Utah Child Support Laws and Utah Data Privacy Laws.
Texas: Securing Children Online through Parental Empowerment Act (HB 18)
Texas Governor Abbott signed HB 18, the SCOPE Act, in June 2023. The law took effect on September 1, 2024, and creates a comprehensive framework for children's digital privacy.
The SCOPE Act requires:
- Parental consent for minors to create accounts on "digital services" (broadly defined)
- A ban on targeted advertising directed at known minors
- Restrictions on collecting, processing, or selling a known minor's personal data beyond what is necessary to provide the service
- Platforms to provide parents with tools to supervise their child's account settings, privacy preferences, and purchase capabilities
"Known minor" means the platform has actual knowledge or willful disregard that the user is under 18. The Texas Attorney General enforces the law with penalties up to $10,000 per violation.
A federal court in the Western District of Texas preliminarily enjoined HB 18's content monitoring and filtering requirements in August 2024 in CCIA v. Paxton, while other provisions remain in effect. The litigation follows the same pattern seen in other states: industry groups argue that age verification and content restrictions burden protected speech. For more Texas-specific legal context, see Texas Child Support Laws and Texas Data Privacy Laws.
Louisiana: Act 440 (Age Verification for Social Media)
Louisiana passed Act 440 in 2023, requiring social media platforms to verify the age of Louisiana users and obtain parental consent for users under 18 to create accounts. The law took effect on July 1, 2024.
Act 440 mandates that platforms use "commercially reasonable" age verification methods. The law allows parents to request deletion of their minor child's account and prohibits platforms from using a minor's data for purposes unrelated to the service.
Louisiana's enforcement approach places responsibility on the platforms, not on parents or children. The state Attorney General can bring civil enforcement actions, with penalties structured per violation.
For Louisiana family law context, see Louisiana Child Support Laws.
Arkansas: Social Media Safety Act (SB 396)
Arkansas enacted the Social Media Safety Act (SB 396) in April 2023, requiring social media companies to verify the age of Arkansas users and obtain parental consent before allowing minors under 18 to create accounts. Platforms must use third-party age verification services.
A federal judge in the Western District of Arkansas blocked enforcement of the law in August 2023 in NetChoice v. Griffin, finding it likely violated the First Amendment. The court held that the law's age verification requirements burdened adult users' access to lawful speech and that less restrictive alternatives existed. The case was appealed to the Eighth Circuit.
For Arkansas family law context, see Arkansas Child Support Laws.
Ohio: Social Media Parental Notification Act
Ohio's Parental Notification by Social Media Operators Act, enacted in 2023 as part of the state budget bill, required social media platforms to obtain verifiable parental consent before a child under 16 could create an account and to give parents tools to manage the account.
Although framed around notifying parents, the consent requirement drew the same First Amendment objections raised against other states' laws. A federal judge in the Southern District of Ohio preliminarily enjoined the law in NetChoice v. Yost in early 2024 and permanently enjoined it in 2025.
For Ohio family law context, see Ohio Child Support Laws.
Florida: Social Media for Minors Act (HB 3)
Florida's HB 3, signed in March 2024, prohibits social media platforms from allowing children under 14 to hold accounts and requires parental consent for users aged 14 and 15. The law was set to take effect on January 1, 2025.
The law defines covered "social media platforms" based on specific features: algorithmic content curation, addictive features (infinite scroll, push notifications, auto-play), and the ability to upload content visible to other users. Platforms that primarily provide email, messaging, streaming, news, or shopping are excluded.
In June 2025, a federal judge in the Northern District of Florida issued a preliminary injunction blocking the law's core provisions in NetChoice v. Uthmeier (filed as NetChoice v. Moody), finding that it likely violated the First Amendment by restricting minors' access to protected speech without demonstrating that less restrictive means were unavailable.
For Florida family law context, see Florida Child Support Laws and Florida Data Privacy Laws.
Virginia: Age Verification Requirements
Virginia enacted HB 1624 requiring certain websites to verify the age of users before granting access to content deemed harmful to minors. While primarily aimed at adult content, the law's age verification infrastructure has implications for children's privacy because any verification system that collects government IDs or biometric data from minors creates its own privacy risks.
Virginia's approach illustrates a tension present in many state laws: the tools used to protect children online can themselves create data collection risks. See Virginia Data Privacy Laws for the state's broader privacy framework.
Maryland, Minnesota, and Emerging State Laws
Maryland passed the Maryland Kids Code (SB 571), formally the Maryland Age-Appropriate Design Code, in 2024, modeled on California's AADC and the UK code. It requires data protection impact assessments, default high-privacy settings, and prohibitions on profiling children.
Minnesota enacted the Minnesota Age-Appropriate Design Code in 2024, similarly requiring DPIAs and default privacy protections for services likely accessed by children under 18.
New York enacted the SAFE for Kids Act and the Child Data Protection Act in June 2024; the former restricts addictive algorithmic feeds for minors, and the latter limits data collection from users under 18.
Connecticut, Georgia, Mississippi, Montana, and New Jersey have also introduced or passed children's online privacy bills during their 2024-2025 legislative sessions.
State Comparison Table
| State | Law | Age Covered | Key Requirement | Status (Early 2026) |
|---|---|---|---|---|
| California | AADC (AB 2273) | Under 18 | Data protection impact assessments | Enjoined (Ninth Circuit appeal) |
| California | SB 976 | Under 18 | Ban addictive feeds without consent | Effective 2025 |
| Utah | SB 152 | Under 18 | Parental consent for social media | Effective Oct 2024 |
| Texas | HB 18 (SCOPE) | Under 18 | Parental consent, no targeted ads | Effective Sep 2024 |
| Louisiana | Act 440 | Under 18 | Age verification, parental consent | Effective Jul 2024 |
| Arkansas | SB 396 | Under 18 | Third-party age verification | Enjoined (Eighth Circuit appeal) |
| Ohio | Parental Notification Act | Under 16 | Parental consent and notification | Enjoined (struck down 2025) |
| Florida | HB 3 | Under 14 (ban), 14-15 (consent) | Account ban under 14 | Enjoined (appeal pending) |
| Virginia | HB 1624 | Minors | Age verification for harmful content | Effective 2025 |
| Maryland | SB 571 | Under 18 | DPIA, default high privacy | Effective 2025 |
| Minnesota | Age-Appropriate Design Code | Under 18 | DPIA, default high privacy | Effective 2025 |
Age Verification: Methods and Challenges
States have taken varying approaches to age verification, and each raises distinct privacy and constitutional questions.
Government ID upload provides strong age verification but creates a database of identification documents linked to online activity. Louisiana initially used this model for its age verification law covering adult content sites.
Commercial age estimation uses AI and facial analysis to estimate a user's age without collecting an ID. California's AADC contemplated this approach, and the UK's Information Commissioner's Office has endorsed age estimation as a proportionate method.
Self-declaration (entering a date of birth) is the simplest method but the least effective, since children can easily enter false dates. The FTC has criticized age gates that allow repeated attempts.
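Self-declaration typically reduces to a date-of-birth comparison. A minimal sketch (hypothetical helpers, illustrating the method rather than any statute's requirement):

```python
from datetime import date

def age_on(dob: date, today: date) -> int:
    """Compute age in whole years from a self-declared date of birth."""
    years = today.year - dob.year
    # Subtract one year if the birthday hasn't occurred yet this year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def passes_age_gate(dob: date, today: date, minimum_age: int = 13) -> bool:
    """Return True if the declared birth date meets the minimum age."""
    return age_on(dob, today) >= minimum_age
```

A real gate would also need to deter repeated attempts, for example by flagging the session after a failed check, since easy retries are exactly the weakness the FTC has criticized.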
Parental verification requires confirming the parent's identity before granting consent for the child. Methods parallel COPPA's VPC options: credit card, knowledge-based questions, or government ID.
The core challenge is that any age verification system robust enough to reliably identify children also collects sensitive data from everyone who uses the service. Courts evaluating these laws have repeatedly noted this tension when assessing First Amendment challenges.
What These Laws Mean for Families
Parents navigating children's online privacy face a patchwork of protections that depend on where they live and which platforms their children use. Federal COPPA protections apply nationwide for children under 13, but state laws add layers that cover older teens and address platform design.
Families in states with active laws can expect:
- Social media platforms to ask for age verification during account creation
- Options to review and manage a child's account settings and data
- Restrictions on targeted advertising and algorithmic content curation for child accounts
- The ability to request deletion of a child's data and account
Where courts have blocked state laws, the federal COPPA baseline and the platform's own policies remain the primary protections.
For related family law topics, visit our United States Child Support Laws hub page, which covers state-by-state child support guidelines, enforcement, and parental rights.
This article provides general legal information about children's online privacy laws by state. It does not constitute legal advice. Consult an attorney for advice specific to your situation.
Sources and References
- California Age-Appropriate Design Code Act (AB 2273) (leginfo.legislature.ca.gov)
- California SB 976 (Protecting Our Kids from Social Media Addiction Act) (leginfo.legislature.ca.gov)
- California AB 1394 (CSAM Reporting) (leginfo.legislature.ca.gov)
- Utah SB 152 (Social Media Regulation Act) (le.utah.gov)
- Texas HB 18 (SCOPE Act) (capitol.texas.gov)
- Louisiana Act 440 (HB 61) (legis.la.gov)
- Arkansas Social Media Safety Act (SB 396) (arkleg.state.ar.us)
- Ohio Social Media Parental Notification Act (legislature.ohio.gov)
- Florida HB 3 (Social Media for Minors Act) (flsenate.gov)
- Virginia HB 1624 (Age Verification) (lis.virginia.gov)
- Maryland Kids Code (SB 571) (mgaleg.maryland.gov)
- Minnesota Age-Appropriate Design Code (2024 Session Law Ch. 124) (revisor.mn.gov)
- COPPA State Enforcement Authority (15 U.S.C. § 6502(d)) (uscode.house.gov)
- New York SAFE for Kids Act (S7694) (nysenate.gov)