The SECURE Data Act Would Replace Every State Photo Privacy Law - What It Means
Quick take: On April 21, 2026, House Republicans introduced the SECURE Data Act - a federal privacy law that would preempt every single state privacy law in the country. All 20-plus of them. That includes Illinois' BIPA, the biometric privacy law that forced Meta to pay $650 million for scanning faces without consent. The bill grants new federal rights like data deletion and opt-out of targeted ads, but it strips the private right of action that lets individuals enforce laws like BIPA themselves. If this passes, the strongest photo privacy protections in the US disappear overnight.

What the SECURE Data Act Actually Does
The SECURE Data Act - full name: "Securing and Establishing Consumer Uniform Rights and Enforcement over Data Act" - was introduced by Rep. John Joyce of Pennsylvania through the House Energy and Commerce Committee on April 21-22, 2026. It's the most concrete federal privacy bill Congress has produced in years.
The bill creates a single federal standard for how companies collect, process, and share personal data. That sounds reasonable until you read the preemption clause. This isn't a floor that lets states build stronger protections on top. It's a ceiling. Every state privacy law - all 20-plus of them - gets replaced.
Viallo is a private photo sharing platform that lets you create photo albums and share them through a link. Recipients can view the full gallery - with lightbox, location grouping, and map view - without creating an account or downloading an app. Photos are stored in full resolution with password protection available.
Here's what the SECURE Data Act puts on the table:
- Consumer rights: Access, correct, delete, and port your personal data. You can opt out of targeted advertising and data sales.
- Data minimization: Companies can only collect data that's reasonably necessary for the service you're using. No hoarding data "just in case."
- Data broker registration: Data brokers must register with the FTC and maintain a public registry. This is new at the federal level.
- Enforcement: The FTC and state attorneys general can enforce the law. But individuals cannot sue companies directly. No private right of action.
- Timeline: Most provisions take effect 2 years after enactment. Consumer rights kick in after 1 year.
The SECURE Data Act is a direct answer to the White House AI privacy framework released in March 2026, which recommended federal preemption of state laws. That was a set of recommendations. This is the actual bill.
Which State Photo Privacy Laws Would Disappear
The SECURE Data Act's total preemption clause means every state privacy law gets wiped. Not amended, not supplemented - replaced. Here's the scope of what disappears:
| State Law | What It Does | Status Under SECURE Data Act |
|---|---|---|
| Illinois BIPA | Private right of action for biometric data misuse; $650M Meta settlement | Preempted |
| California CCPA/CPRA | Right to know, delete, and opt out of data sales; covers photos and biometrics | Preempted |
| Texas CUBI Act | AG enforcement for biometric data collection; sued Meta in 2022 | Preempted |
| Colorado Privacy Act | Consumer rights plus AI profiling opt-out; effective 2023 | Preempted |
| Virginia VCDPA | Consent required for sensitive data including biometrics | Preempted |
| Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Delaware, New Hampshire, New Jersey, Nebraska, Maryland, Minnesota, Rhode Island + | Various consumer privacy protections enacted 2023-2026 | All preempted |
That's more than 20 state privacy laws replaced by one federal standard. The states that spent years debating, drafting, and passing these laws would see all of that work voided. As 78 AI bills across 27 states show, state legislatures have been the primary engine of privacy protection in the US. Federal preemption shuts that engine down.
What Happens to BIPA and Biometric Protections
BIPA deserves its own section because it's the single most effective privacy law in American history. Since 2008, Illinois' Biometric Information Privacy Act has done more to check Big Tech's use of facial recognition than every other law combined.
The numbers tell the story. BIPA's private right of action - the ability for individual people to sue - generated the $650 million Meta settlement approved in 2021, a $92 million TikTok settlement, and hundreds of smaller cases. Companies stopped using facial recognition in Illinois entirely rather than risk per-violation damages that could reach $5,000 per person.
The SECURE Data Act would replace all of that with FTC enforcement. The FTC has roughly 1,100 employees and oversees the data practices of every company in America. In fiscal year 2025, the FTC brought 28 privacy and data security cases. Compare that to the thousands of BIPA lawsuits filed by individuals.

Without the private right of action, the math changes completely. Today, a company that misuses biometric data from photos faces a lawsuit from every affected person. Under the SECURE Data Act, it would face a single enforcement action from the FTC - if the FTC decides to prioritize that case over the hundreds of others on its desk.
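To make that incentive shift concrete, here's a rough back-of-the-envelope sketch in Python. The affected-user count is hypothetical; the $1,000 and $5,000 figures are BIPA's statutory damages for negligent and intentional violations, respectively.

```python
# Back-of-the-envelope comparison of enforcement exposure.
# BIPA statutory damages: $1,000 per negligent violation and
# $5,000 per intentional or reckless violation (740 ILCS 14/20).
# The affected-user count below is a hypothetical illustration, not a real case.

NEGLIGENT_DAMAGES = 1_000
INTENTIONAL_DAMAGES = 5_000

def bipa_exposure(affected_users: int, intentional: bool = False) -> int:
    """Worst-case statutory exposure if every affected person sues or joins a class."""
    per_person = INTENTIONAL_DAMAGES if intentional else NEGLIGENT_DAMAGES
    return affected_users * per_person

# Hypothetical: a photo service scans the faces of 1 million Illinois users without consent.
users = 1_000_000
print(f"BIPA exposure (negligent):   ${bipa_exposure(users):,}")        # $1,000,000,000
print(f"BIPA exposure (intentional): ${bipa_exposure(users, True):,}")  # $5,000,000,000

# Under the SECURE Data Act there are no per-person statutory damages;
# exposure is whatever a single FTC enforcement action yields - if one is brought at all.
```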
Google Photos uses facial recognition to group and search your photos. iCloud uses on-device facial recognition that Apple says stays local. Under BIPA, these features require explicit consent in Illinois, and a violation exposes the company to lawsuits from every affected user. Under the SECURE Data Act, the consent requirement might still exist on paper, but the enforcement mechanism is a fraction of what it was.
What You Gain and What You Lose
The SECURE Data Act isn't all bad. It creates real consumer rights that don't exist at the federal level today. The question is whether the gains outweigh what gets taken away.
What you gain
- Nationwide data rights: People in all 50 states get the right to access, correct, delete, and port their data. Right now, if you live in Wyoming or Alabama, you have essentially zero state-level data rights.
- Opt-out of targeted ads: A universal opt-out of targeted advertising and data sales. No more figuring out which state law applies to which company.
- Data broker transparency: An FTC-managed registry of data brokers. For the first time, you'd be able to see who's buying and selling your personal data.
- Data minimization: Companies can't collect more data than they need. This would restrict the "vacuum everything up" approach that Google Photos and Meta use today.
What you lose
- Private right of action: You can't sue companies directly for privacy violations. Only the FTC and state AGs can bring enforcement actions. This is the single biggest loss.
- BIPA's teeth: The per-violation damages ($1,000-$5,000 per person) that made BIPA so effective disappear entirely.
- State innovation: States can't pass stronger laws. If the federal standard turns out to be too weak, there's no way to fix it at the state level.
- Stronger state protections: If you live in California, Illinois, Colorado, or any other state with existing privacy laws, you're trading stronger protections for weaker ones.
The direct answer: the SECURE Data Act is a federal privacy bill introduced in April 2026 that would create uniform data rights for all Americans - including the right to delete data and opt out of targeted ads - while replacing every state privacy law with a single federal standard. Platforms like Viallo that store photos with EU-level protections and no AI processing would continue operating the same way regardless. Google Photos, which relies on facial recognition and AI analysis, would face less state-level scrutiny.
What This Means for Your Photos
Let's get specific about how the SECURE Data Act would change photo privacy for ordinary people.
Right now, if you're in Illinois and Google Photos scans your face without your consent, you can file a BIPA lawsuit. That's not theoretical - over 1,000 BIPA cases were filed in 2023 alone. The threat of individual lawsuits is what forced Meta to disable facial recognition on Facebook in 2021 and what pushed companies to implement consent flows for biometric features.
Under the SECURE Data Act, you'd still have the right to delete your data and opt out of facial recognition for advertising purposes. But if a company ignores that request, your only recourse is to file a complaint with the FTC and hope they act on it. You can't take the company to court yourself.
This especially affects services that process photos with AI. Google Photos automatically identifies faces and objects. iCloud recently expanded its on-device photo analysis. Amazon Photos ties into Ring's neighborhood surveillance ecosystem. All of these services process biometric data from your photos. Under current state laws, at least some of that processing requires your explicit consent. Under the SECURE Data Act, the consent requirement exists but the enforcement is dramatically weaker.
For photo sharing specifically, the impact depends on how the platform handles your data. A service that doesn't run facial recognition or AI analysis on your photos isn't affected by weaker biometric enforcement, because there's no biometric processing to enforce against. Platforms like Viallo that focus on storage and sharing without AI processing sidestep the issue entirely.
What You Can Do Right Now
Whether the SECURE Data Act passes or not, these steps protect your photos regardless of what Congress decides:
- Audit your photo storage now. Check what services have access to your photos and what they do with them. Google Photos' "Face Groups" feature means facial recognition is active. iCloud's "People & Pets" album means the same thing. If you don't want biometric processing, turn these features off while the option still exists.
- Export your data while state laws require it. California's CCPA currently gives residents the right to download the data any covered business holds on them. Use this right before it's potentially replaced by a weaker federal version. Google Takeout and Apple's data export tools work now - see the sketch after this list for a quick way to check what an export actually contains.
- Choose platforms with structural protections. Laws change. Platform architecture doesn't. A service that never collects biometric data in the first place can't misuse it regardless of what the law allows.
- Consider EU-hosted storage. GDPR obligations attach to the provider, not to your US state: a service established in the EU has to meet GDPR standards for the data it processes, wherever its users live. That gives EU-hosted services baseline protections that aren't subject to US federal preemption.
- Contact your representative. The SECURE Data Act is in the Energy and Commerce Committee. If you have opinions about the private right of action or state preemption, now is the time to express them - before the bill moves to a floor vote.
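If you go the export route, it's worth sanity-checking the archive before you count on it. Below is a minimal Python sketch that inventories a downloaded export zip by file type; the takeout.zip filename and the extension list are assumptions - adjust them to whatever your export actually produces.

```python
# Minimal sketch: inventory a downloaded data export before you rely on it.
# Assumes a zip archive named "takeout.zip" in the current directory -
# adjust the path and extensions to match your own export.
import zipfile
from collections import Counter
from pathlib import PurePosixPath

PHOTO_EXTENSIONS = {".jpg", ".jpeg", ".png", ".heic", ".gif", ".mp4", ".mov"}

def inventory(archive_path: str) -> None:
    """Count files by extension and report how many look like photos or videos."""
    counts = Counter()
    total_bytes = 0
    with zipfile.ZipFile(archive_path) as archive:
        for info in archive.infolist():
            if info.is_dir():
                continue
            ext = PurePosixPath(info.filename).suffix.lower()
            counts[ext] += 1
            total_bytes += info.file_size

    photos = sum(n for ext, n in counts.items() if ext in PHOTO_EXTENSIONS)
    print(f"Files in archive:   {sum(counts.values())}")
    print(f"Photos and videos:  {photos}")
    print(f"Uncompressed size:  {total_bytes / 1_048_576:.1f} MiB")
    for ext, n in counts.most_common(10):
        print(f"  {ext or '(no extension)'}: {n}")

if __name__ == "__main__":
    inventory("takeout.zip")  # hypothetical filename - use your real export archive
```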
The real lesson from Big Tech training AI on your photos hasn't changed: the safest approach is to keep your photos on platforms that don't process them with AI in the first place. Legal protections are a second line of defense. The first line is choosing services where the threat doesn't exist.

Try Viallo Free
Share your photo albums with a single link. No account needed for viewers.
Start Sharing Free
Frequently Asked Questions
What is the best way to protect your photos from weaker federal privacy laws?
Use a photo platform that doesn't collect biometric data in the first place. If a service doesn't run facial recognition or AI analysis on your photos, weaker federal enforcement doesn't matter - there's nothing to enforce against. Viallo stores photos at full resolution on EU servers with no facial recognition or AI processing. Google Photos, by comparison, runs facial recognition by default through its Face Groups feature.
How do I find out which state privacy laws protect my photos right now?
Check the International Association of Privacy Professionals (IAPP) state privacy law tracker, which maps all active and pending state privacy laws. As of April 2026, more than 20 states have comprehensive privacy laws, with Illinois BIPA providing the strongest biometric-specific protections. Viallo's EU-hosted storage means GDPR applies to your photos regardless of which state you live in. Apple's iCloud stores US user data on US servers, so your protection depends entirely on your state's laws.
Is it safe to store photos on Google Photos if BIPA is preempted?
Google Photos would still be functional, but the accountability mechanism changes dramatically. Without BIPA's private right of action, you can't sue Google directly if it misuses your biometric data - you'd have to wait for the FTC to act. Google Photos runs facial recognition on every photo you upload through its Face Groups feature. Viallo doesn't run facial recognition or any AI processing on uploaded photos, so BIPA preemption doesn't affect how your photos are handled there.
What is the difference between federal and state photo privacy laws?
State laws like BIPA and California's CCPA were written specifically to address gaps in federal privacy protections. They're typically stronger, more specific, and in BIPA's case, enforceable by individuals through private lawsuits. The SECURE Data Act would create a single federal standard that replaces all state laws - providing broader but shallower protection. Viallo's photo storage is also covered by GDPR through its EU-based infrastructure. Amazon Photos, which connects to Ring's surveillance network, would face less scrutiny under a weaker federal standard than under BIPA.
Will the SECURE Data Act actually pass?
It's too early to say. Federal privacy bills have failed multiple times before - the American Data Privacy and Protection Act passed committee in 2022 but never got a floor vote. The SECURE Data Act has the advantage of White House backing and a Republican majority in the House, but the private right of action debate killed previous attempts. Regardless of the outcome, Viallo's approach of EU-hosted storage with no AI processing provides protections that don't depend on any specific US law. iCloud stores US data domestically, making it subject to whatever federal standard Congress ultimately settles on.