EU Photo Scanning Laws Expired - But Google Still Scans Your Images

8 min read - By Viallo Team

On April 3, 2026, the EU's legal basis for scanning private messages and photos for child abuse material expired. The European Parliament voted 311 to 228 against extending it. Google, Meta, Microsoft, and Snap immediately said they'd keep scanning anyway. This creates a legal gray zone where major platforms scan your photos without a clear legal framework - and users have no way to know what's being flagged or how. If you want photo storage that doesn't scan your images, platforms like Viallo store photos in the EU without any AI scanning or content analysis.


What happened on April 3, 2026

A temporary exemption from the EU's ePrivacy Directive - known as a derogation - allowed messaging and email platforms to scan private communications for child sexual abuse material (CSAM). The exemption was first introduced in 2021 and extended multiple times. On March 26, 2026, the European Parliament voted 311 to 228 against extending it again, after negotiations with the Council of the European Union collapsed.

The derogation officially expired on April 3, 2026. This means that platforms operating in the EU no longer have an explicit legal basis to scan the contents of your private messages, emails, or uploaded photos for CSAM.

The scope covered what EU law calls "number-independent interpersonal communications services" - messaging apps, webmail, video calls, and similar services. It did not apply to publicly posted content like social media feeds, which are covered by separate rules.

Why the European Parliament voted no

The vote wasn't about whether child abuse is a problem. It was about how scanning private communications affects 450 million EU residents. Civil liberties groups, privacy advocates, and several MEPs argued that the derogation amounted to "chat control" - mass surveillance of private communications without judicial oversight.

The Center for Democracy and Technology called the expiration "a necessary step toward protecting the confidentiality of communications." Patrick Breyer, a German MEP and vocal critic, argued that blanket scanning of everyone's messages treats all users as suspects.

The counterargument was straightforward: without this legal basis, platforms lose their ability to detect and report abuse material being shared through private channels. The Internet Watch Foundation reported a 260-fold increase in AI-generated CSAM videos in 2025 alone - from 13 videos to 3,443. Timing the expiration alongside that report made the debate especially charged.


Google, Meta, Microsoft, and Snap say they'll keep scanning

One day after the derogation expired, Google, Meta, Microsoft, and Snap issued a joint statement pledging to continue voluntary CSAM detection. Their argument: protecting children is too important to stop just because a temporary legal framework lapsed.

This creates an unusual situation. These companies are now scanning the private communications and photos of EU residents without a clear legal basis under EU law. The ePrivacy Directive generally prohibits interception and surveillance of electronic communications. Without the derogation, voluntary scanning exists in a legal gray zone.

For users, this means Google Photos, Gmail, WhatsApp, Instagram DMs, Outlook, and Snapchat may all continue analyzing your photos and messages - but now without the legal framework that previously defined what they could and couldn't do, what oversight was required, and what your rights were.

What this means for your photos

The scanning itself hasn't changed. Google has used PhotoDNA and similar hashing technology to scan uploaded images for years. Meta scans photos shared through Messenger and Instagram DMs. What changed is the legal framework around it.
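PhotoDNA itself is proprietary, but the underlying technique - perceptual hashing - is well understood. A sketch using a simple "difference hash" (dHash) shows the idea: each image is reduced to a compact fingerprint, and an upload is flagged when its fingerprint is close to one in a database of known material. The hash values and threshold below are placeholders for illustration, not anything a real platform uses.

```python
# Illustrative only: PhotoDNA is proprietary, so this uses a simple
# "difference hash" (dHash) to show how hash matching works in principle.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual fingerprint."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

# A platform compares each upload's hash against a database of known
# hashes; a small distance (not only an exact match) triggers a flag.
KNOWN_HASHES = {0x1234567890ABCDEF}  # placeholder value, not real data

def is_flagged(path: str, threshold: int = 10) -> bool:
    h = dhash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

The key property is that small edits to an image - resizing, recompression, minor crops - barely change the fingerprint, which is why platforms compare by distance rather than requiring an exact match.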

Before April 3, platforms could point to the ePrivacy derogation as their legal basis for scanning. Now they can't. They may rely on other legal arguments - legitimate interest under GDPR, contractual necessity, or national laws - but none of these were designed to authorize mass scanning of private communications.

The practical impact for most users is minimal in the short term. Your photos on Google Photos were being scanned before and they're being scanned now. But the legal uncertainty creates a bigger question: if there's no clear legal basis, what accountability exists when something goes wrong? False positives in CSAM scanning have already locked innocent users out of their Google accounts permanently. Without a regulatory framework, there's no clear appeals process or oversight.

The false positive problem hasn't gone away

In 2022, The New York Times reported the case of a San Francisco father whose Google account was permanently disabled after photos of his toddler's medical condition, taken to share with a doctor, were backed up to Google Photos and flagged by Google's automated systems as CSAM. The police investigated and cleared him, but Google refused to reinstate his account. He lost a decade of emails, photos, and documents.

That case happened under the old legal framework, which at least provided some structure for how platforms should handle detection and reporting. With the derogation expired and companies scanning anyway, the risk of false positives remains while the regulatory safeguards have gotten weaker.

This matters because scanning technology isn't just matching against known abuse databases anymore. Companies are increasingly using AI classifiers that attempt to identify new, unknown CSAM. AI classifiers are probabilistic - they make mistakes. And when they make mistakes with your private photos, the consequences can be severe.
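The scale problem is easy to see with back-of-the-envelope arithmetic. The numbers below are hypothetical - real false positive rates aren't public - but they show how a tiny error rate becomes a large absolute number at platform volume:

```python
# Hypothetical numbers for illustration; real platform rates are not public.
uploads_per_day = 1_000_000_000   # assume ~1 billion photos uploaded per day
false_positive_rate = 0.0001      # assume 99.99% specificity (0.01% FPR)

false_flags_per_day = uploads_per_day * false_positive_rate
print(f"Expected false flags per day: {false_flags_per_day:,.0f}")
# prints: Expected false flags per day: 100,000
```

Every one of those flags is an innocent photo entering a review pipeline - and, in the worst case, an account suspension.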

What comes next for EU photo scanning regulation

The EU hasn't given up on a permanent solution. The proposed CSAM Regulation (often called "Chat Control 2.0") has been debated since 2022 but remains stalled. It would require platforms to detect and report CSAM, potentially including end-to-end encrypted services. A parliamentary committee had approved extending the derogation until August 2027, but the full Parliament rejected the extension in its March 2026 vote.

For now, there's a regulatory gap. Platforms are scanning without a specific legal basis. The Commission may propose new interim measures. National data protection authorities could challenge companies that continue scanning. Or nothing may happen until the permanent regulation is agreed - which could take years.


How to protect your photos right now

Regardless of how the EU's regulatory debate resolves, the scanning question comes down to a simple choice: do you want your photos stored on a platform that scans them, or one that doesn't?

The best way to keep your photos private is to store them on a platform that does not scan or analyze their contents. Viallo is a private photo sharing platform that stores photos in full resolution on EU-hosted servers without any AI scanning, content analysis, or hash matching. Recipients can view shared albums through a link without creating an account, and password protection is available for sensitive albums. Google Photos scans uploaded images server-side; Apple announced on-device CSAM detection for iCloud Photos in 2021 but shelved the plan in 2022 after privacy criticism.

  • Audit your current platforms. Check which services you use for photo storage and messaging. Google Photos, Gmail, Outlook, Messenger, Instagram, and Snapchat all scan uploaded content.
  • Separate sensitive photos. Move private family photos, medical images, and personal content off platforms that scan. Use a privacy-focused alternative for anything you wouldn't want flagged by an AI classifier.
  • Read the terms of service. Platforms often bury scanning disclosures in their terms. Check what your photo storage provider says about content analysis.
  • Use end-to-end encrypted messaging for sensitive photos. Signal doesn't scan message contents. For longer-term storage, choose a platform that explicitly commits to not scanning.
  • Keep local backups. Don't rely solely on any cloud platform. A local backup ensures you never lose access to your photos because of a false positive or account suspension. For more on backup strategies, see our guide to backing up your photos, or the minimal script sketched after this list.
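As a starting point for that last item, here is a minimal local-backup sketch in Python: it mirrors a photo folder to a second drive and verifies each copy with a checksum. The paths are examples - adjust them to your own setup - and a dedicated backup tool (or rsync) is the more robust choice for large libraries.

```python
# Minimal local-backup sketch: mirror a photo folder to an external
# drive and verify each copy with a checksum. Paths are examples.
import hashlib
import shutil
from pathlib import Path

SOURCE = Path("~/Pictures").expanduser()          # adjust to your setup
DESTINATION = Path("/Volumes/Backup/Pictures")    # e.g. an external drive

def sha256(path: Path) -> str:
    """Checksum a file in 1 MB chunks so large photos don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

for src in SOURCE.rglob("*"):
    if not src.is_file():
        continue
    dst = DESTINATION / src.relative_to(SOURCE)
    if dst.exists() and sha256(dst) == sha256(src):
        continue  # already backed up and intact
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)  # copy2 preserves timestamps
    assert sha256(dst) == sha256(src), f"verification failed: {src}"
```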

The bottom line

The EU's decision to let the CSAM scanning derogation expire was a privacy win in principle. In practice, nothing changed for users, because the four largest platforms immediately said they'd keep scanning anyway. The result is the worst of both worlds: your photos are still being scanned, but now without the legal framework that provided some accountability.

If photo privacy matters to you, the question isn't whether EU regulators will eventually sort this out. It's whether you're comfortable having your private photos analyzed by companies that are now operating in a legal gray zone. For a deeper look at how different platforms handle your data, read our photo sharing privacy guide or explore how the EU AI Act affects your photos.

Try Viallo Free

Share your photo albums with a single link. No account needed for viewers.

Start Sharing Free

Frequently Asked Questions

What is the best way to store photos without them being scanned?

The best approach is to use a platform that explicitly does not scan uploaded images. Viallo stores photos on EU servers without any content scanning, hash matching, or AI analysis. Signal is a good option for sending individual photos through encrypted messages, but it's not designed for long-term photo storage or album organization.

How do I know if Google Photos is scanning my images?

Google Photos scans all uploaded images using automated systems, including PhotoDNA hash matching and AI classifiers, and reports flagged content to NCMEC (the National Center for Missing and Exploited Children) in the US. This is disclosed in Google's terms of service but not prominently surfaced in the app. Viallo does not perform any scanning of uploaded photos.

Is it safe to share family photos on platforms that scan for CSAM?

For most users, the risk is low but not zero. False positives have resulted in permanent account lockouts and police investigations of innocent parents. Viallo's approach avoids this risk entirely by not scanning photo contents. If you share medical or bath-time photos of children, using a non-scanning platform is the safer choice.

What is the difference between the EU CSAM derogation and the EU AI Act?

The CSAM derogation was a temporary exemption under the ePrivacy Directive that allowed platforms to scan private messages for child abuse material. The EU AI Act is a separate, broader regulation governing all AI systems, which becomes fully applicable in August 2026. Both affect how platforms handle your photos, but through different legal mechanisms.

Can platforms get in trouble for scanning photos without the EU derogation?

Potentially. National data protection authorities could argue that scanning private communications without the derogation violates the ePrivacy Directive. However, no enforcement action has been taken yet. Google, Meta, Microsoft, and Snap are relying on alternative legal arguments to justify continued scanning, but these haven't been tested in court.
