COPPA 2.0 Passed the Senate: What Parents Need to Know About Kids' Privacy
In March 2026, the US Senate passed COPPA 2.0 — the Children and Teens' Online Privacy Protection Act — by unanimous vote. If signed into law, it would extend federal privacy protections from children under 13 to all minors under 17, ban targeted advertising to kids, and require platforms to get parental consent before collecting teen data. Meanwhile, the FTC's updated COPPA rules already took effect on April 22, 2026, adding biometric data to the list of protected information and restricting the use of children's data for AI training. Here's what all of this means for parents, what's actually in effect today, and what you can do right now.
A Brief History: What the Original COPPA Covers
The Children's Online Privacy Protection Act (COPPA) was signed into law in 1998 and took effect in 2000. It was groundbreaking at the time — the first US federal law specifically protecting children's online privacy. Here's what it established:
- Age threshold: Applies to children under 13
- Parental consent: Websites and apps must get verifiable parental consent before collecting personal information from children under 13
- Notice requirements: Operators must post clear privacy policies describing what data they collect from children and how they use it
- Data minimization: Companies cannot require children to provide more information than necessary to participate in an activity
- Parental access: Parents have the right to review, delete, and refuse further collection of their child's data
- FTC enforcement: The Federal Trade Commission enforces COPPA and can impose fines for violations
COPPA has been enforced actively. The FTC has brought dozens of cases resulting in hundreds of millions of dollars in fines. Epic Games (Fortnite) paid $275 million in 2022. Google/YouTube paid $170 million in 2019. TikTok (then Musical.ly) paid $5.7 million in 2019 and has faced additional scrutiny since.
But COPPA has a fundamental limitation that has become increasingly untenable: it only covers children under 13. A 13-year-old on Instagram, Snapchat, Discord, or TikTok has zero federal privacy protections. Platforms can collect their data, target them with ads, sell their information to data brokers, and use their content to train AI models — all legal under current federal law.
COPPA by the Numbers
- 1998: COPPA signed into law
- 13: Current age threshold for protections
- $275M: Largest COPPA fine (Epic Games, 2022)
- 95%: Percentage of teens aged 13–17 who use social media (Pew, 2025)
- 0: Federal privacy protections for teens 13 and older under current law
What COPPA 2.0 Would Change
COPPA 2.0 — formally the Children and Teens' Online Privacy Protection Act — passed the Senate unanimously in March 2026. It was co-sponsored by Senators Ed Markey (D-MA) and Bill Cassidy (R-LA), the same bipartisan pair behind the original 2024 version that passed the Senate but died in the House. Here's what the 2026 version includes:
Age Threshold: Under 13 Becomes Under 17
The most significant change. COPPA 2.0 raises the age of protection from under 13 to under 17. This means platforms would need verifiable parental consent before collecting personal information from any minor — including teenagers.
For the approximately 25 million American teenagers aged 13–16, this would create federal privacy protections where none currently exist. Platforms like Instagram, Snapchat, TikTok, YouTube, Discord, and Roblox would all be affected.
Ban on Targeted Advertising to Minors
COPPA 2.0 would prohibit targeted advertising directed at users known to be under 17. This doesn't ban all advertising — platforms can still show contextual ads (ads based on the content of the page, not the user's personal data). But the behavioral advertising model — where ads follow users based on their browsing history, location, interests, and activity patterns — would be illegal for minors.
This is a direct hit to the business model of platforms that rely on advertising revenue from teen users. Meta, Snap, and TikTok all generate significant ad revenue from users aged 13–17. The Interactive Advertising Bureau has lobbied heavily against this provision, arguing it would harm "age-appropriate" advertising.
Eraser Button: Right to Delete
COPPA 2.0 includes an "eraser button" provision requiring platforms to provide a clear mechanism for minors (or their parents) to delete personal information and account data. Platforms must process deletion requests within a reasonable timeframe and must actually delete the data — not just deactivate the account while retaining data on backend servers.
Prohibition on Dark Patterns
The bill explicitly bans the use of dark patterns — manipulative design choices intended to trick users into sharing more data than they intend to. Examples include pre-checked consent boxes, confusing privacy settings, "confirm-shaming" language on opt-out screens, and making privacy-protective choices deliberately harder to find than data-sharing options.
COPPA vs. COPPA 2.0: What Changes
| Provision | Original COPPA (1998) | COPPA 2.0 (2026 Senate Bill) |
|---|---|---|
| Age covered | Under 13 | Under 17 |
| Targeted ads | Not addressed | Banned for minors |
| Biometric data | Not covered | Covered (via FTC rule) |
| AI training | Not addressed | Requires parental consent |
| Dark patterns | Not addressed | Explicitly banned |
| Eraser button | Parental access only | Full deletion right for minors |
| Third-party sharing | Consent required | Separate opt-in consent required |
The New FTC COPPA Rules (Already in Effect)
While COPPA 2.0 awaits House action, the FTC has already updated its COPPA enforcement rules under existing authority. These updated rules took effect on April 22, 2026 and apply now, regardless of whether COPPA 2.0 becomes law. Here's what changed:
- Biometric data is now personal information. Voice prints, facial geometry, fingerprints, and other biometric identifiers collected from children under 13 now require parental consent. This affects apps that use face filters, voice recognition, or biometric login for children's accounts.
- Separate consent for third-party advertising. Previously, a single blanket consent could cover both first-party data use and sharing with advertising partners. The new rules require separate opt-in parental consent specifically for sharing children's data with third-party advertisers. Parents can consent to the app collecting data while refusing to allow that data to be shared for ads.
- AI and machine learning restrictions. Children's data cannot be used to train AI models or machine learning systems without explicit parental consent. This is directly aimed at companies that feed children's content, voice data, images, and behavioral patterns into training datasets.
- Data retention limits. Companies must delete children's data once the purpose for which it was collected has been fulfilled. Indefinite retention "for future product improvement" is no longer a valid justification.
- Stronger consent verification. The updated rules require more secure methods of verifying parental identity — knowledge-based authentication, video verification, or government ID matching. The previous standard of "send an email to a parent" was widely regarded as insufficient.
These FTC rules apply to all operators subject to COPPA — websites, apps, games, connected toys, and platforms that collect data from children under 13. They are enforceable now and carry the same fine structure as existing COPPA enforcement (up to $50,120 per violation as of 2026).
Where Things Stand: Senate vs. House
The Senate passed COPPA 2.0 unanimously — a rare display of bipartisan agreement on tech regulation. But the House has not scheduled a vote. The 2024 version of the bill (then paired with the Kids Online Safety Act, or KOSA) passed the Senate 91–3 but was never brought to the House floor.
Industry opposition is the primary obstacle. The tech industry's lobbying apparatus — led by NetChoice, the Computer and Communications Industry Association (CCIA), and the Interactive Advertising Bureau — has argued that raising the age threshold to 17 would require age verification that infringes on adult users' privacy and free speech. They also contend that banning targeted advertising to teens would devastate ad-supported platforms that teens use for free.
Supporters counter that children's privacy should not be a partisan issue and that the advertising industry's objections are fundamentally about protecting revenue, not protecting rights. The bill has support from the American Academy of Pediatrics, Common Sense Media, the Electronic Frontier Foundation (with reservations about age verification), and dozens of parenting organizations.
As of May 2026, there is no timeline for a House vote. Parents should not wait for COPPA 2.0 to take action.
Children's Data and Data Brokers
Here's an angle that doesn't get enough attention: children's information regularly appears on data broker and people-search sites, even though minors rarely sign up for these services themselves.
How does this happen? Several ways:
- Family records. When a people-search site builds a profile on a parent, it typically includes household members — including children. A search for a parent's name on Spokeo, BeenVerified, or Whitepages often shows children's names, approximate ages, and the family address.
- Public records. Birth records, school enrollment records, and property records (which list household occupants) are all sources that data brokers aggregate. In states without strong public records restrictions, this information is readily available.
- School directory data. Some school districts share student directory information with third parties unless parents opt out under FERPA (the Family Educational Rights and Privacy Act). Data aggregators acquire this information and feed it into broader consumer databases.
- App and game data. Children's apps that violate COPPA by collecting data without parental consent — a common occurrence despite FTC enforcement — feed that data into analytics and advertising pipelines that eventually reach data brokers.
- Social media scraping. Public social media profiles that mention children by name, post photos with geolocation data, or tag children's accounts provide data brokers with information they can correlate with other records.
The result: even young children can have data broker profiles that include their name, family members, home address, and approximate age. This data can be accessed by anyone who pays for a people-search report — which raises obvious safety concerns beyond just privacy.
What Parents Can Do Right Now
You don't need to wait for Congress. These steps protect your children's data today:
- Audit every app on your child's devices. Go through each app and check its privacy settings. Disable ad personalization, location sharing, and data sharing wherever possible. Remove apps your child no longer uses — apps left installed can keep collecting and sharing data in the background.
- Enable platform-level parental controls. On iPhone, use Screen Time (Settings > Screen Time > Content & Privacy Restrictions). On Android, use Family Link. Both allow you to restrict app installations, limit data sharing, and control privacy settings at the OS level.
- Disable ad tracking on their devices. On iPhone: Settings > Privacy & Security > Tracking > turn off "Allow Apps to Request to Track." On Android: Settings > Privacy > Ads > "Delete advertising ID."
- Check people-search sites for your family. Search for your own name on major people-search sites like Spokeo, BeenVerified, Whitepages, and TruePeopleSearch. If your profile lists your children, submit removal requests. This is tedious to do manually across dozens of sites — a data removal service automates the process.
- Opt out of school directory data sharing. Under FERPA, parents have the right to opt out of directory information sharing. Contact your child's school and submit a written request to exclude your child from directory data releases. Do this at the beginning of every school year.
- Use a family-covering data removal service. Services like GhostVault remove your entire household's data from broker databases — not just yours. At $3.99/month, it continuously monitors for your family's information across 500+ broker sites and submits removal requests automatically. This catches the family records, household listings, and relative associations that expose your children's data through your own profiles.
- Review social media privacy settings for teen accounts. If your teen uses Instagram, TikTok, Snapchat, or Discord, go through privacy settings together. Set accounts to private, disable location sharing, limit who can message them, and turn off activity status. On Instagram, enable Supervision features that give parents visibility into settings and contacts.
- Talk to your kids about data. Teens are more likely to follow privacy practices they understand. Explain that "free" apps make money by selling data, that public posts are scraped by companies they've never heard of, and that information shared online is effectively permanent. Make it concrete — show them a people-search result with the family's information on it.
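The FERPA directory opt-out above is just a short written request you resubmit each school year. As a convenience, here is a minimal sketch of a letter generator in Python; the wording is illustrative only (many districts provide their own opt-out form, which you should use when one exists).

```python
from datetime import date

# Illustrative FERPA directory-information opt-out letter template.
# The legal citation (20 U.S.C. 1232g) is FERPA itself; the phrasing
# is a sketch, not official language from any school district.
LETTER = """\
{today}

To the Records Office, {school}:

Under FERPA (20 U.S.C. 1232g), I request that {student}'s directory
information not be disclosed to any third party without my prior
written consent, effective for the {year} school year.

Sincerely,
{parent}
"""

def ferpa_opt_out(parent: str, student: str, school: str, year: str) -> str:
    """Fill in the opt-out letter template with one family's details."""
    return LETTER.format(today=date.today().isoformat(),
                         parent=parent, student=student,
                         school=school, year=year)

if __name__ == "__main__":
    print(ferpa_opt_out("Jane Doe", "Sam Doe",
                        "Lincoln Elementary", "2026-27"))
```

Run it once per child at the start of each school year, print the result, and keep a dated copy for your records.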
Quick Privacy Checklist for Kids' Devices
- Ad tracking disabled at OS level
- Location services off for all non-essential apps
- Social media accounts set to private
- Unused apps uninstalled (and their accounts deleted, not just abandoned)
- Parental controls enabled (Screen Time / Family Link)
- School directory opt-out submitted
- Family data checked on people-search sites
What Platforms Must Do Under the New FTC Rules
The updated FTC COPPA rules that took effect on April 22, 2026 create immediate obligations for platforms with users under 13:
- Separate advertising consent. Platforms can no longer bundle advertising data-sharing into a general consent flow. Parents must be asked specifically whether they consent to their child's data being shared with advertisers — and they must be able to say no without losing access to the core service.
- Biometric consent. Any app that collects voice prints, facial geometry, or other biometric identifiers from children must obtain specific parental consent for that collection. Face filters, voice assistants, and biometric login features in children's apps are all affected.
- AI training restrictions. Platforms cannot use children's data — including content they create, behavioral data, or biometric data — to train AI models without parental consent. This applies retroactively to data already collected: companies must obtain consent before using existing children's data for AI training.
- Mandatory data deletion. When the purpose for collecting a child's data is fulfilled — for example, the child stops using the app — the platform must delete the data. "We might need it later" is no longer an acceptable retention justification.
- Verified parental consent. The bar for verifying parental identity has been raised. Email-based consent (a parent clicks a link) is no longer sufficient for operations that share data with third parties. Platforms must use video verification, government ID matching, or knowledge-based authentication for these higher-risk data uses.
The Road Ahead
COPPA 2.0's passage through the Senate was historic — a unanimous vote on tech regulation is almost unheard of. But nearly identical legislation passed the Senate in 2024 and went nowhere in the House. The same industry lobbying that blocked it then is active now.
Even if COPPA 2.0 becomes law, enforcement will take time. The FTC would need to issue implementing regulations, platforms would get compliance deadlines, and the first enforcement actions would likely take 12–18 months. Children's privacy in practice will depend on parents taking action today — not waiting for legislation that may or may not pass, and regulatory enforcement that will take years to reach most platforms.
The state privacy laws already in effect offer some protection. California, Connecticut, Colorado, and several other states have laws that include specific provisions for children's data. But coverage is uneven, enforcement varies, and the patchwork of state laws creates confusion for both parents and platforms.
What's clear is that children's data has real value — to advertisers, to data brokers, to AI training pipelines, and unfortunately to bad actors. Whether Congress acts on COPPA 2.0 or not, protecting your family's data is something you can start today. Check what's already out there, lock down what you can control, and use the tools available to clean up what's already been exposed.
Frequently Asked Questions
Is COPPA 2.0 signed into law yet?
Not yet. The Senate passed COPPA 2.0 unanimously in March 2026, but the House of Representatives has not scheduled a vote. Until the House passes the bill and the President signs it, COPPA 2.0 is not law. The original COPPA (1998) remains in effect, and the FTC's updated COPPA rules — which took effect on April 22, 2026 — are enforceable now. These updated FTC rules add biometric data protections, restrict AI training on children's data, and require separate parental consent for advertising data-sharing.
What age does COPPA 2.0 cover?
COPPA 2.0 would extend protections to all children and teens under 17, up from the current threshold of under 13. This is the bill's most significant change. Approximately 25 million American teenagers aged 13–16 currently have no federal online privacy protections. Platforms would need verifiable parental consent before collecting data from anyone under 17, and targeted advertising to all minors would be prohibited.
What are the new FTC COPPA rules effective April 2026?
The FTC's updated COPPA rules took effect on April 22, 2026 and apply under existing COPPA authority (they don't require COPPA 2.0 to pass). Key changes: biometric data now requires parental consent; companies need separate opt-in consent before sharing kids' data with advertisers; children's data can't be used for AI training without explicit consent; data retention limits require deletion once the collection purpose is fulfilled; and consent verification methods must be more secure than email confirmation alone.
Do data brokers have children's information?
Yes. Children's data appears on data broker and people-search sites through family records that list household members including minors, school directory information, public records, app data from COPPA-violating children's apps, and social media scraping. A parent's profile on sites like Spokeo or BeenVerified typically lists children by name and approximate age. Removing your own data from broker sites also removes the family associations that expose your children. A data removal service like GhostVault handles this across 500+ broker sites continuously.
What can parents do right now to protect kids' privacy?
Don't wait for legislation. Audit and restrict privacy settings on every app your child uses. Disable ad tracking at the device level (iPhone: Settings > Privacy > Tracking; Android: Privacy > Ads). Check people-search sites for your family's information and submit removal requests. Opt out of school directory data sharing under FERPA. Enable parental controls (Screen Time on iOS, Family Link on Android). Set all social media accounts to private. And consider a data removal service that covers your entire household — GhostVault monitors 500+ broker sites for family data at $3.99/month, catching the household listings that expose your children through your own profiles.

Spokeo, BeenVerified, Whitepages: just three of the 500+ brokers selling your family's data.
GhostVault removes you from all of them automatically — and keeps you removed.