

Chrome faces scrutiny after reports that it automatically installs an on-device AI model, raising privacy and transparency concerns. Privacy analysts argue that this kind of silent modification of a user's environment violates both user expectations and, in their view, European privacy law.
Google Chrome has started automatically downloading and installing an on-device AI model file, weights.bin, to power Gemini Nano. The 4GB model is installed on users' devices without consent, notice, or an opt-out toggle, and deleting the file simply causes Chrome to re-download it without telling you.
According to Alexander Hanff from The Privacy Guy, this behavior mirrors a pattern previously seen with Anthropic’s Claude Desktop.
The weights.bin file is stored in the OptGuideOnDeviceModel directory inside the Chrome user profile. It weighs a hefty 4GB and is installed on any device that meets certain system requirements, without user consent and without an option to disable it in the settings. To stop Chrome from re-downloading the model after deletion, users must disable the feature via chrome://flags or enterprise policy tools.
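To check whether the model has already landed on a machine, one can search the Chrome profile for the OptGuideOnDeviceModel directory. A minimal sketch is below; the candidate profile paths are common defaults and may differ by OS, Chrome channel, or custom install location, so treat them as assumptions rather than an exhaustive list.

```python
import os
from pathlib import Path

# Common default Chrome profile locations (assumptions; adjust for your setup).
CANDIDATE_PROFILE_DIRS = [
    Path.home() / "AppData/Local/Google/Chrome/User Data",      # Windows
    Path.home() / "Library/Application Support/Google/Chrome",  # macOS
    Path.home() / ".config/google-chrome",                      # Linux
]

def find_model_files():
    """Walk candidate profile dirs and list files under OptGuideOnDeviceModel."""
    hits = []
    for base in CANDIDATE_PROFILE_DIRS:
        if not base.is_dir():
            continue
        for root, _dirs, files in os.walk(base):
            if "OptGuideOnDeviceModel" in root:
                for name in files:
                    p = Path(root) / name
                    hits.append((p, p.stat().st_size))
    return hits

for path, size in find_model_files():
    print(f"{path}  ({size / 1e9:.2f} GB)")
```

Note that deleting anything this script finds is pointless on its own, since Chrome re-fetches the model; the flag or enterprise policy has to be changed first.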
In Hanff's testing, Chrome took just 14 minutes to create the OptGuideOnDeviceModel directory and download the model, all while giving no indication that it was fetching such a large file. The researcher said Google's behavior involved many of the same dark patterns he had previously documented in the Claude desktop app. The dark patterns he lists include:
Forced bundling across trust boundaries
Invisible default with no opt-in
Harder to remove than install
Pre-staging capability user did not request
Generic/obfuscated naming: OptGuideOnDeviceModel vs GeminiNanoLLM
Registration without user configuration
Documentation gap for normal users
Automatic re-install after deletion
Retroactive survival of future consent
Shipped via stable release channel
A key focus of Hanff's post is the environmental cost of silently distributing a 4GB AI model at global scale. If the file were pushed to hundreds of millions or billions of devices, he estimated that the emissions from distribution alone (before the model is ever used) could reach tens of thousands of tonnes of CO2 equivalent, comparable to the annual output of tens of thousands of cars. By his figures, the overall climate impact of deploying the model reaches 640,000 tonnes of CO2e. For users with data caps or on mobile data, the download could also silently exhaust their allowance, leaving them with no idea what happened.
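The order of magnitude of such an estimate can be sanity-checked with simple arithmetic: devices × file size × energy per gigabyte transferred × grid carbon intensity. The energy and grid figures below are illustrative assumptions for the sketch, not Hanff's actual inputs, so the outputs show only that the "tens of thousands of tonnes" scale is plausible.

```python
# Back-of-envelope emissions for distributing a 4 GB file once per device.
# ENERGY_PER_GB_KWH and GRID_KGCO2E_PER_KWH are assumed illustrative values.
FILE_SIZE_GB = 4
ENERGY_PER_GB_KWH = 0.05     # assumed network + end-device energy per GB moved
GRID_KGCO2E_PER_KWH = 0.4    # assumed average grid carbon intensity

def distribution_co2e_tonnes(devices: int) -> float:
    """Tonnes of CO2e to deliver the model file to `devices` machines once."""
    kwh = devices * FILE_SIZE_GB * ENERGY_PER_GB_KWH
    return kwh * GRID_KGCO2E_PER_KWH / 1000  # kg -> tonnes

for n in (100_000_000, 1_000_000_000):
    print(f"{n:>13,} devices -> {distribution_co2e_tonnes(n):,.0f} tCO2e")
```

Under these assumptions, a billion devices works out to roughly 80,000 tonnes of CO2e for the download alone; different per-GB energy figures shift the result by multiples, which is why published estimates of this kind vary widely.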