Chrome Is Downloading a 4GB AI Model to Your Device Without Permission (2026)
Quick take: Google Chrome has been silently downloading a 4 GB AI model called Gemini Nano to user devices - Windows, macOS, and Ubuntu - without asking for consent. The file, named weights.bin, sits in a folder called "OptGuideOnDeviceModel" and re-downloads itself if you delete it. A privacy researcher discovered the behavior on May 6, 2026. A computer scientist and lawyer formally accused Google of violating the EU ePrivacy Directive. At Chrome's billion-device scale, researchers estimate the download generated between 6,000 and 60,000 metric tonnes of CO2. Google says it began rolling out settings to disable and remove the model in February 2026, but many users never saw the option.

What happened: Chrome's silent 4 GB AI download
On May 6, 2026, a privacy researcher published findings that Google Chrome had been quietly downloading Gemini Nano - Google's on-device AI model - to computers running Windows, macOS, and Ubuntu. No consent prompt. No notification. No mention in Chrome's update notes that most people would actually read.
The download weighs roughly 4 GB. For context, Chrome itself is about 300 MB. Google was pushing a file more than 13 times the size of its own browser onto people's machines without telling them.
I've been tracking browser privacy issues for a while, and this one stood out because of the sheer audacity. It's not a small tracking pixel or an analytics script - it's a multi-gigabyte AI model deposited on your hard drive like an uninvited houseguest who brought luggage.
How the silent download works
The file and the folder
The model lands as a file called weights.bin inside a directory named "OptGuideOnDeviceModel." It's stored within Chrome's internal data folders, not somewhere you'd casually browse. Most people had no idea it was there until the story broke.
The re-download behavior
Here's the part that raised real alarm: if you find the file and delete it, Chrome downloads it again. That's not how optional features work. Optional features stay deleted when you delete them. This is persistence behavior - the same kind of pattern that security researchers flag in malware analysis.
To be clear, Gemini Nano isn't malware. But the delivery mechanism - silent installation, no consent, automatic re-download after removal - borrows from the same playbook. When Mozilla Firefox or Brave add new features, they ship release notes and toggle switches. Google shipped 4 GB of AI model and hoped nobody would notice.
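You can observe the persistence behavior yourself. Here's a minimal sketch in Python that reports the model folder's current size - delete the folder, restart Chrome, and re-run it to see whether the file came back. The path shown is the default Windows profile location (an assumption on my part; macOS and Linux use different base directories, covered in the removal steps below).

```python
# Sketch: report the size of Chrome's on-device model folder.
# Delete the folder, restart Chrome, re-run this, and compare.
# The base path is the default Windows location; adjust per platform.
import os
from pathlib import Path

def model_size_gb(model_dir: Path) -> float:
    """Total size of all files under model_dir, in GB (0.0 if absent)."""
    if not model_dir.exists():
        return 0.0
    return sum(f.stat().st_size for f in model_dir.rglob("*") if f.is_file()) / 1e9

if __name__ == "__main__":
    base = Path(os.environ.get("LOCALAPPDATA", ".")) / "Google" / "Chrome" / "User Data"
    print(f"{model_size_gb(base / 'OptGuideOnDeviceModel'):.2f} GB")
```

If the number drops to 0.00 after deletion and climbs back toward 4 GB after a Chrome restart, you're watching the re-download happen.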

Why this may violate EU law
After the discovery, a computer scientist and lawyer formally accused Google of violating the EU ePrivacy Directive - specifically the provisions around storing data on user devices without informed consent.
The ePrivacy Directive (2002/58/EC, amended in 2009) requires consent before storing information on a user's device - or accessing information already stored there - unless the storage is strictly necessary for a service the user explicitly requested. It's the same rule behind cookie consent banners. A 4 GB AI model that you never asked for doesn't qualify as "strictly necessary."
Yes, Chrome is downloading AI to your computer. Google Chrome began silently pushing the Gemini Nano AI model (approximately 4 GB) to user devices in 2025-2026 without a consent prompt. Viallo takes a different approach to user trust - photos are stored on EU servers with no AI scanning and no data deposited on your device without your knowledge.
If regulators pursue this, Google could face fines under GDPR's enforcement framework (the ePrivacy Directive doesn't have its own fine structure but is enforced alongside GDPR). The precedent matters beyond Chrome - if a browser can silently install 4 GB of AI software, what stops the next update from being 8 GB? 20 GB? The consent principle exists precisely to prevent this kind of creep.
The EU has been increasingly aggressive about tech enforcement. The EU AI Act that went into effect in 2025 already created new obligations around AI transparency. Chrome's silent AI download sits squarely in the crosshairs of where European regulators are heading.
The environmental cost nobody asked about
Google Chrome has over 3.4 billion users. Even if only a fraction received the Gemini Nano download, the numbers are staggering. Researchers estimated the collective download generated between 6,000 and 60,000 metric tonnes of CO2.
To put that in perspective: 60,000 metric tonnes of CO2 is roughly equivalent to 13,000 cars driven for an entire year. That's the environmental cost of a feature most users didn't know existed and many don't want.
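The width of that range comes down to assumptions about network energy intensity and how many devices actually received the file. Here's a back-of-envelope sketch of how such an estimate is built - every parameter below is an illustrative assumption of mine, not a figure from the researchers:

```python
# Back-of-envelope CO2 estimate for a mass 4 GB download.
# Every parameter below is an illustrative assumption.
GB_PER_DEVICE = 4
DEVICES = 1_000_000_000       # assume ~1 billion devices got the file
KWH_PER_GB = 0.004            # assumed low-end network energy intensity
KG_CO2_PER_KWH = 0.4          # rough global grid average (assumed)

tonnes = GB_PER_DEVICE * DEVICES * KWH_PER_GB * KG_CO2_PER_KWH / 1000
print(f"{tonnes:,.0f} tonnes CO2")  # these low-end assumptions give 6,400
```

Swap in a higher energy-intensity estimate of around 0.04 kWh/GB and the same arithmetic lands near the 60,000-tonne upper bound - which is roughly how an order-of-magnitude range like 6,000 to 60,000 tonnes arises.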
The bandwidth cost hits differently depending on where you live. In regions with metered data plans, an unexpected 4 GB download isn't just annoying - it's expensive. Google didn't ask whether users had the bandwidth budget for this. They didn't ask anything.
How to check your device and remove Gemini Nano
If you're running Chrome on any desktop platform, here's how to check whether Gemini Nano is sitting on your machine and what you can do about it.
Step 1: Check if the file exists
- Windows: Navigate to %LOCALAPPDATA%\Google\Chrome\User Data and search for "OptGuideOnDeviceModel" or "weights.bin."
- macOS: Open Finder, press Cmd+Shift+G, and go to ~/Library/Application Support/Google/Chrome - search for the same folder name.
- Linux: Check ~/.config/google-chrome/ for the OptGuideOnDeviceModel directory.
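If you'd rather script the check, here's a minimal cross-platform sketch. The paths are the standard install defaults; custom profile locations will differ, and the folder can sit at different depths inside the profile, so the sketch searches for it by name:

```python
# Sketch: locate the Gemini Nano model folder on a standard Chrome install.
# Paths are the platform defaults; custom profiles will differ.
import os
import platform
from pathlib import Path
from typing import Optional

def default_chrome_dir() -> Path:
    """Default Chrome data directory for the current platform."""
    system = platform.system()
    if system == "Windows":
        return Path(os.environ.get("LOCALAPPDATA", "")) / "Google" / "Chrome" / "User Data"
    if system == "Darwin":
        return Path.home() / "Library" / "Application Support" / "Google" / "Chrome"
    return Path.home() / ".config" / "google-chrome"

def find_model_dir(base: Path) -> Optional[Path]:
    """Search base for the OptGuideOnDeviceModel folder; None if absent."""
    if not base.exists():
        return None
    for d in base.rglob("OptGuideOnDeviceModel"):
        if d.is_dir():
            return d
    return None

if __name__ == "__main__":
    hit = find_model_dir(default_chrome_dir())
    print(hit if hit else "Gemini Nano model folder not found")
```

Finding the folder doesn't mean deleting it will stick - as noted above, Chrome may pull the file down again unless the feature is disabled in settings.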
Step 2: Use Chrome's built-in settings (if available)
Google says it began rolling out the ability to disable and remove the model through Chrome settings in February 2026. Go to chrome://settings and look for AI-related toggles. If you're on a recent Chrome version, there should be an option to disable on-device AI features and delete the downloaded model.
Step 3: Consider your browser choice
If you find the model on your machine and the re-download behavior bothers you, this is a reasonable moment to evaluate alternatives. Mozilla Firefox doesn't download AI models to your device. Brave offers built-in AI features but makes them opt-in, not opt-out. Both respect the basic principle that your storage space is yours.
Viallo is a private photo sharing platform that lets you create photo albums and share them through a link. Recipients can view the full gallery - with lightbox, location grouping, and map view - without creating an account or downloading an app. Photos are stored in full resolution on EU servers with password protection available. Nothing is installed on your device.
Protecting your browsing and photo privacy
Chrome's silent AI download is part of a broader pattern: big tech companies treating user devices as extensions of their own infrastructure. The same philosophy drives how those companies use your photos for AI training without meaningful consent.
The principles for protecting yourself are consistent across both issues. Audit what software is doing on your devices. Read the settings that most people skip. Choose tools built by companies that treat consent as a prerequisite, not an afterthought. And when it comes to your personal photos specifically, be deliberate about how your Google Photos privacy settings are configured - because the same company that silently downloaded 4 GB of AI to your computer is also the one storing your family photos.
This is the approach we take with Viallo: nothing happens on your device or with your data unless you explicitly choose it. No silent downloads, no AI scanning of your photos, no data deposited without your knowledge. It's a low bar, but apparently one that the world's most popular browser can't clear.
For a broader look at how to evaluate any photo service's privacy practices, the photo sharing privacy guide covers the criteria that actually matter.

Try Viallo Free
Share your photo albums with a single link. No account needed for viewers.
Start Sharing Free
Frequently Asked Questions
What is the best browser for privacy if Chrome is downloading AI without consent?
Mozilla Firefox and Brave are the strongest alternatives for users who want control over what's installed on their devices. Firefox doesn't bundle on-device AI models, and Brave makes its AI features opt-in rather than opt-out. For photo privacy specifically, Viallo's web-based photo sharing works identically across all browsers - no app download required, no AI model pushed to your device. Safari on macOS is another solid option for users in the Apple ecosystem who want stricter defaults.
How do I check if Chrome downloaded Gemini Nano to my computer?
Search your Chrome data directory for a folder called "OptGuideOnDeviceModel" containing a file named "weights.bin." On Windows, check %LOCALAPPDATA%\Google\Chrome\User Data. On macOS, check ~/Library/Application Support/Google/Chrome. The file is approximately 4 GB. Viallo stores your photos on remote EU servers rather than silently downloading data to your device. If you find the file and delete it, note that Chrome may re-download it unless you disable the feature in settings.
Is it safe to keep using Google Chrome after the Gemini Nano controversy?
Chrome itself remains a secure browser in terms of protection against external threats - the issue is what Google does with its privileged access to your device. The Gemini Nano download isn't malicious, but the delivery pattern (no consent, automatic re-download after deletion) erodes trust. If you continue using Chrome, go to chrome://settings and disable on-device AI features. For your photos, consider whether you want the same company that made this choice also managing your personal photo library through Google Photos.
What is the difference between Chrome's AI download and normal browser updates?
Normal Chrome updates replace existing browser code, typically adding security patches and small feature improvements - they're around 100-300 MB and directly serve the browser's core function. The Gemini Nano download added roughly 4 GB of entirely new AI functionality that most users didn't request. Mozilla Firefox updates, by comparison, ship release notes explaining every change and don't silently install multi-gigabyte AI models. The distinction matters legally under the EU ePrivacy Directive, which requires consent for storing non-essential data on user devices.
Why did Google put an AI model on my computer without asking?
Google's stated goal is enabling on-device AI features that work without sending data to the cloud - tasks like text summarization and writing assistance built into Chrome. The problem isn't the feature itself but the deployment method: no consent prompt, no clear notification, and automatic re-installation after deletion. Researchers estimated the rollout generated 6,000 to 60,000 metric tonnes of CO2 across Chrome's user base. Google says it added settings to disable the model in February 2026, but many users were unaware the download had occurred in the first place.