Google’s Chrome AI wording change raises GDPR and CCPA concerns over on‑device processing
#Privacy

Privacy Reporter

Google removed the phrase “without sending your data to Google servers” from Chrome’s on‑device AI description. While the company insists the model still runs locally, the edit has alarmed privacy advocates and prompted a closer look at compliance with the EU’s GDPR and California’s CCPA.

What happened

Google quietly updated the text that appears in Chrome’s System → On‑device AI settings. The original wording promised that AI features such as scam detection would run "directly on your device without sending your data to Google servers". In a recent rollout (Chrome 148, still being phased in) the clause "without sending your data to Google servers" was removed.

The change was first spotted on Reddit and quickly amplified by privacy advocate Alexander Hanff, who asked whether the edit signaled a shift in architecture – i.e., whether prompts and responses might now be transmitted to Google’s cloud for processing. A Google spokesperson replied that the data is still processed solely on the device, and that the wording was altered only to avoid legal ambiguity when a website calls the on‑device model via the new Prompt API.

Why the wording matters for regulators

GDPR (EU)

  • Article 5(1)(a) – personal data must be processed "lawfully, fairly and in a transparent manner". Removing a clear guarantee about data staying on‑device could be viewed as a reduction in transparency.
  • Article 25 – requires data protection by design and by default. If a feature that was marketed as on‑device suddenly allows data to leave the device, Google may need to reassess its DPIA (Data Protection Impact Assessment).
  • Article 32 – mandates appropriate security measures. Any new data flow to Google servers would need to be protected under GDPR‑compliant encryption and access controls.

CCPA (California)

  • Section 1798.100 – gives consumers the right to know what personal information is collected and how it is used. The removal of the “no‑cloud‑transfer” phrase could be interpreted as a change in the collection purpose that must be disclosed.
  • Section 1798.120 – gives consumers the right to opt out of the sale of their personal information via a clear and conspicuous “Do Not Sell My Personal Information” link. If on‑device AI prompts are now sent to Google, that could be a “sale” under the CCPA unless an opt‑out is offered.

Both regimes also impose penalties: up to €20 million or 4 % of global annual turnover (whichever is higher) for GDPR violations, and up to $7,500 per intentional violation (plus statutory damages for data‑breach claims) under the CCPA.

Impact on users and companies

Users

  • Privacy expectations – Many Chrome users rely on the on‑device label to justify enabling AI features in high‑security contexts (e.g., banking, health). If data can now travel to Google’s servers, users may be exposed to profiling or targeted advertising.
  • Control – The Prompt API lets websites query the local Gemini Nano model. While the model runs locally, the inputs and outputs are visible to the requesting site, meaning a third‑party site could harvest conversational data without the user’s explicit consent.
  • Device resources – The Nano model occupies ~4 GB of storage. Google’s recent UI addition to uninstall the model when resources are low is a positive step, but the removal of the privacy guarantee may discourage users from disabling the model, fearing loss of security features like scam detection.
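The exposure described in the second bullet is easiest to see in code. The sketch below shows roughly how a page might query the on‑device model through the Prompt API. The `LanguageModel` global and its `create()`/`prompt()` methods reflect Chrome’s experimental draft API and may change between releases, so treat this as an illustration rather than a stable interface; the key point is that the page’s own script sees both the prompt and the model’s response, even though inference itself runs locally.

```javascript
// Hypothetical sketch of a website calling Chrome's on-device model via
// the Prompt API. Names follow Chrome's experimental draft API and may
// change; this is illustrative, not a stable interface.
async function askOnDeviceModel(text) {
  // Feature-detect: the LanguageModel global only exists in Chrome builds
  // with the built-in AI features enabled.
  if (typeof LanguageModel === "undefined") {
    return null; // on-device model not available in this browser
  }
  const session = await LanguageModel.create();
  // Both the prompt and the reply pass through this page's JavaScript,
  // so the site can log or transmit them even though inference is local.
  const reply = await session.prompt(text);
  session.destroy();
  return reply;
}
```

In any browser or runtime without the built‑in AI features, the feature check fails and the function simply returns null; when the model is available, the calling site holds the full conversation text, which is exactly the data flow regulators would scrutinize.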

Companies (web developers, enterprises)

  • Compliance risk – Sites that invoke the Prompt API must now treat the exchanged prompts as personal data under GDPR and CCPA. They need to update privacy policies, obtain consent, and possibly conduct their own DPIAs.
  • Legal exposure – If a site inadvertently forwards user prompts to Google Cloud (e.g., via a mis‑configured API endpoint), it could be held liable for an unlawful data transfer.
  • Product design – Enterprises building internal tools on top of Chrome’s AI will need to ensure that any on‑device processing remains truly local, or else provide a clear opt‑out and data‑processing agreement.

What changes are needed

  1. Clear, auditable documentation – Google should publish a technical white‑paper showing the data flow for the Prompt API, confirming that no payload leaves the device unless the site explicitly requests it.
  2. Explicit consent mechanisms – When a website calls the on‑device model, Chrome could display a one‑time consent banner stating that "the site will receive the text you entered and the model’s response".
  3. GDPR‑style DPIA – Google must update its DPIA to cover the Prompt API interaction, documenting the lawful basis (likely legitimate interests) and the safeguards in place.
  4. CCPA “Do Not Sell” toggle – Adding a toggle that disables any website‑initiated data sharing with Google would align Chrome with California’s opt‑out requirements.
  5. User‑controlled uninstall – Continue to make the Nano model removal easy to access, and surface a reminder that turning it off will also disable related security features.

Bottom line

Google’s wording tweak does not, by itself, prove a technical change, but it opens a window for regulatory scrutiny. Under GDPR and CCPA, any ambiguity about where personal data travels can trigger enforcement action, especially when a global platform like Chrome markets a feature as on‑device. Users deserve a transparent statement that their prompts stay on their machine unless they explicitly agree to share them, and developers must treat any data exchanged via the Prompt API as personal data subject to the same strict safeguards that apply to traditional cloud‑based AI services.

What to watch next – Keep an eye on Chrome’s upcoming releases (v149‑v151) for a possible UI addition that clarifies data flow, and monitor statements from data‑protection authorities in the EU and California for any formal inquiries into the change.
