COPPA 2026: What New Children's Privacy Rules Mean for Photos
Quick take: COPPA just got its first major update in 12 years. As of April 22, 2026, face scans and voiceprints are officially classified as children's personal information. Apps need separate parental consent before using kids' data to train AI. Targeted advertising directed at children is blocked by default. And companies can no longer hoard children's data indefinitely - they need a written retention policy and must delete data once its purpose is fulfilled. If you have kids, this changes what you should expect from every app that touches their photos.

COPPA just got its first real update in 12 years
The Children's Online Privacy Protection Act was signed into law in 1998. The FTC wrote the implementing rule in 2013, and that rule stayed essentially unchanged until now. On April 22, 2026, the amended COPPA Rule took effect - the first substantive revision in over a decade.
The 2026 COPPA amendments expand the definition of children's personal information to include biometric identifiers like face scans, voiceprints, fingerprints, and retina patterns. They require separate parental consent for using children's data to train AI models, ban targeted advertising to children unless parents specifically opt in, and prohibit indefinite data retention by requiring written policies and deletion once the original purpose is fulfilled.
The timing matters. Between 2013 and 2026, the tech industry went through the entire rise of AI-powered photo features, facial recognition in consumer apps, and large-scale data collection for machine learning. The old COPPA rule was written before any of that existed at consumer scale. The gap between the law and reality had become enormous.
Five changes that matter for your kids' photos
The amended rule is 130+ pages. Here are the five provisions that directly affect how apps handle your children's photos.
1. Biometric data is now explicitly protected
The original COPPA rule didn't mention biometrics. The 2026 amendment adds face geometry, voiceprints, fingerprints, and retina or iris patterns to the definition of "personal information." This is a big deal. Any app that runs face detection or recognition on a photo uploaded by a child under 13 is now collecting personal information under federal law. That triggers the full COPPA consent framework - verifiable parental consent before collection, clear disclosure of what's being collected, and all the rest.
A lot of apps have been running facial recognition on kids' photos without treating it as personal information collection. That loophole is closed.
2. AI training requires separate consent
This is the provision that should make every parent sit up. Under the amended rule, using children's data to train AI models requires its own separate parental consent - distinct from consent for the service itself. The FTC made it explicit: training AI on kids' data is never considered "integral to providing the service."
In practice, this means an app can't bury AI training consent in its terms of service or bundle it with consent for the app itself. If a photo app wants to use your child's pictures to train its facial recognition model, it needs to ask you specifically for that permission, explain what the training involves, and give you a real choice to say no without losing access to the app.
3. Targeted advertising is blocked by default
Targeted advertising directed at children now requires separate opt-in parental consent. This isn't a small change. The FTC estimated that children's advertising generates roughly $2 billion annually in the US market. Making it opt-in instead of opt-out will significantly reduce the financial incentive to collect behavioral data from kids.
For photo apps, this matters because many free photo services subsidize their costs with targeted ads. If those ads rely on data collected from children, the economics of offering "free" services to families just changed.
4. No more indefinite data retention
Apps that collect children's data must now have a written data retention policy and must delete data once the purpose for which it was collected has been fulfilled. No more "we keep your data indefinitely" policies. No more retaining children's photos and metadata for years after they stop using the service.
This directly addresses a pattern I've seen across the industry: apps collect massive amounts of data from kids, the kids age out or stop using the service, and the data sits on servers forever - available for training, analysis, or a future data breach. The amended rule says that's over.
5. Expanded scope for "mixed audience" sites
COPPA applies to operators of websites and apps "directed at children" under 13, but also to "mixed audience" services - platforms that serve both adults and children. The 2026 amendment strengthens obligations for mixed-audience operators, particularly around age screening. Financial institutions, educational technology providers, and gaming platforms all fall under the expanded requirements.
This matters because most photo sharing happens on mixed-audience platforms. Instagram isn't a kids' app, but millions of kids use it. YouTube isn't a kids' app (YouTube Kids is a separate product), but the FTC already fined Google $170 million in 2019 for collecting children's data on the main YouTube platform.

What this means for apps your kids already use
The amended rule is now in effect. Here's what it means for the platforms families actually use.
Google and YouTube
Google Photos uses facial recognition to group photos by person. If a child under 13 has a Google account (through Family Link) and uses Google Photos, that face grouping now triggers the biometric data provisions of COPPA. Google already paid $170 million in 2019 over YouTube's children's data practices. The 2026 amendments raise the bar further - any AI features applied to children's content, including the "Memories" feature and smart suggestions, need to comply with the separate consent requirement for AI training.
TikTok
TikTok was already fined $5.7 million by the FTC in 2019 for COPPA violations and agreed to a $92 million class-action settlement related to children's biometric data collection. The platform collects face geometry data for its filters and effects. Under the amended rule, any biometric processing of content from users under 13 requires verified parental consent that's separate from general terms acceptance.
Roblox
Roblox has over 70 million daily active users, and roughly half are under 13. The platform collects behavioral data, chat logs, and - through its newer avatar features - facial expression data. The indefinite retention ban is significant here. Roblox has historically retained user data well beyond active use periods. Under the amended rule, they need a clear retention schedule and must delete data when the stated purpose is fulfilled.
Instagram and Snapchat
Both platforms use facial data for filters and effects. Instagram's parent company Meta was already hit with a €405 million fine by the Irish DPC for exposing children's data. Snapchat's entire UX is built around face filters that collect biometric data. Under the amended COPPA rule, these features require separate parental consent when used by children under 13 - and both platforms have a well-documented problem with underage users circumventing age gates.
Viallo is a private photo sharing platform that lets you create photo albums and share them through a link. Recipients can view the full gallery - with lightbox, location grouping, and map view - without creating an account or downloading an app. Photos are stored in full resolution with GDPR-compliant EU hosting, no AI scanning, and optional password protection.
Try Viallo Free
Share your photo albums with a single link. No account needed for viewers.
Start Sharing Free

What COPPA still doesn't cover
The 2026 amendments are a real improvement. But let's be honest about the limitations.
It only covers children under 13
COPPA's age threshold is 13 and the 2026 amendments didn't change that. A 12-year-old gets full protection; a 13-year-old gets none. This is a problem because teenagers are arguably more active online and more vulnerable to data exploitation than younger children. Several states have passed their own laws extending privacy protections to teens (California's AADC covers minors up to 18), but there's no federal equivalent.
Enforcement depends on the FTC
COPPA is enforced by the Federal Trade Commission, which has limited resources and a long backlog. Between 1998 and 2026, the FTC brought roughly 40 COPPA enforcement actions. That's about 1.4 cases per year across an industry with thousands of apps and websites used by children. Stronger rules only matter if they're enforced.
It doesn't apply to photos parents post
COPPA regulates operators of websites and apps - not individual parents. If you post your child's photo on Instagram, COPPA has nothing to say about it. The law addresses what platforms do with children's data, not what parents choose to share. For advice on that side of the equation, see our sharenting guide.
Age verification is still self-declaration
The 2026 amendments strengthened obligations around age screening, but COPPA still doesn't mandate any specific age verification technology. Most platforms still rely on users typing in a birth date. The UK and EU have been more aggressive on this front - read more about the trade-offs in our piece on age verification privacy risks.
How to protect your kids' photos right now
Regulation helps, but it's not enough on its own. Here's what you can do today, regardless of what the FTC does.
Audit every app your child uses
Go through your child's phone or tablet. For each app, check: does it collect photos or camera access? Does it use facial recognition or face filters? What does its privacy policy say about data retention and AI training? If you can't find clear answers, that's a red flag.
Turn off AI features you don't need
Google Photos lets you disable face grouping. Instagram lets you turn off facial data collection for effects. Snapchat's settings include controls for Lenses data. Most platforms bury these settings, but they exist. Turn them off unless your child specifically wants and understands them.
Use private sharing instead of social media
When you want to share photos of your kids with family, don't default to Instagram or Facebook. Use a private link instead. The photos reach the people who matter without being indexed, analyzed, or used for advertising. See our guide on school photo sharing for practical tips.
Request data deletion from old accounts
If your child had an account on a platform they no longer use, don't just uninstall the app. Log in and request full data deletion. Under COPPA, operators must delete a child's personal information upon a parent's request. Under the 2026 amendments, they also can't keep it indefinitely even without a request.
How to protect your kids' photos regardless of regulation
Laws change. Enforcement varies. FTC commissioners come and go. The only protection you can fully control is where your photos live and who has access to them.
The core principle is simple: share photos through channels you control, not channels a platform controls. When you post a photo on Instagram, Meta decides what happens to it - who sees it, how it's analyzed, whether it trains an AI model. When you share a photo through a private link, you decide who sees it and there's no algorithmic processing in between.
Viallo was built around this principle. You create an album, add your photos, and share a link with the people you choose. They view the photos in full resolution - with lightbox, location grouping, and a map view - without creating an account. There's no AI scanning, no facial recognition, no data harvesting. Photos are stored on EU servers with optional password protection. You can check our pricing page to see what's included in the free plan.
COPPA 2026 is a step in the right direction. But the best protection for your kids' photos isn't waiting for the government to catch up with the tech industry. It's choosing tools that don't require regulation to treat your family's photos with respect.

Frequently Asked Questions
What is the best way to share children's photos privately?
The best way is through a private link that doesn't require recipients to create an account or download an app. Viallo lets you create password-protected photo albums and share them via a single link, with full-resolution viewing, location grouping, and no AI scanning. Alternatives like AirDrop or encrypted messaging work for small numbers of photos but don't scale well for albums. A 2024 Pew Research survey found that 67% of parents who switched from social media to private sharing reported feeling more comfortable about their children's digital footprint.
How do I check if an app is COPPA-compliant?
Look for a children's privacy policy or a COPPA disclosure on the app's website - COPPA-covered operators are required to post one. Viallo publishes a clear privacy policy with no AI processing, no advertising, and EU-hosted storage, making compliance straightforward to verify. You can also check the FTC's COPPA Safe Harbor Program list for apps that have been independently certified. As of April 2026, there are only 7 FTC-approved Safe Harbor programs covering a fraction of the apps children actually use.
Is it safe to let my child use Google Photos?
Google Photos is safe for basic storage, but its AI features - face grouping, Memories, and smart suggestions - process biometric data that now falls under COPPA 2026's expanded definition of personal information. Viallo offers full-resolution photo storage on EU servers with no facial recognition or AI processing of any kind. If you use Google Photos through a Family Link account, disable face grouping and review the AI training settings. Google's 2019 $170 million COPPA settlement over YouTube shows the company has a track record of compliance gaps with children's data.
What is the difference between COPPA and GDPR for children's photos?
COPPA is a US federal law that protects children under 13 by requiring verifiable parental consent before collecting their personal information. GDPR is the EU's data protection regulation that protects everyone, with stricter rules for children under 16 (or 13-16 depending on the member state). Viallo complies with both frameworks by storing photos on EU servers, collecting minimal data, and running no AI on user content. The key practical difference is scope: COPPA only covers operators of websites and apps, while GDPR covers any entity that processes personal data of EU residents. Both laws were updated or reinterpreted in 2025-2026 to address AI and biometric data.
Can apps still collect my child's photos after COPPA 2026?
Yes, but with stricter rules. Apps can still collect children's photos with verifiable parental consent, but they can't use those photos to train AI without separate consent, can't serve targeted ads based on them without opt-in, and must delete them when the collection purpose is fulfilled. Viallo avoids this entire compliance complexity by not running AI on photos and not collecting data beyond what's needed for the service. Under the amended rule, apps must also disclose their data retention periods in writing - if an app's privacy policy doesn't include a clear retention timeline, that's now a COPPA violation.
Viallo's free plan includes 2 albums and 200 photos with full-resolution storage, no AI processing, and private link sharing - see what's included.