Google Gemini Chat Histories Vanish in Apparent Bug
#Security


Regulation Reporter

Users report months of AI conversations disappearing from Google's Gemini chatbot, with data seemingly lost despite being visible in activity logs. Google attributes the issue to a bug affecting a small number of users.

Over the past few days, complaints have stacked up from people who say months of conversations with Google's AI chatbot have simply vanished, with Reg readers noting the disappearances seemed to coincide with the rollout of Gemini 3.1.

These complaints are echoed on Google's support forums, where one Gemini Pro user said that all their history had gone AWOL on both phone and desktop despite autodelete being set to keep chats for up to 18 months. They added that prompts still show up in activity logs – suggesting the data exists somewhere, just not where they need it.

Another user complained that they were missing at least 15 chats with "no evidence in the activity log of them having ever existed," saying that the episode had them ready to jump ship to a rival model after years of tolerating Google's habit of abruptly breaking or binning services.

Others reckon that the problem goes deeper than a simple interface hiccup. One user reported that recent chats had disappeared not just from Gemini but also from Google's My Activity archive, warning that if paid users lose work permanently, the fallout could be more than just angry forum posts.

One forum user has even tried to play unpaid support engineer, suggesting that the data may still be sitting on Google's servers and pointing to prompts visible in the Gemini activity page as evidence that the problem is more of a sync or "handshake" error than outright loss. They posted a laundry list of workarounds – from hard refreshes to forced logouts – but the replies suggest mileage varies wildly, with plenty of users saying the so-called fixes did nothing to coax their missing chats back.

While some users have inevitably blamed the timing on the Gemini 3.1 rollout, there's no evidence beyond coincidence, and Google hasn't suggested anything more sinister than a bug.

"We're currently fixing a bug that temporarily hid chat history for a small number of users," Google said. "Chat history for impacted users will be restored shortly. We know this is frustrating and are working quickly to resolve it."

The complaints land barely a week after Google was already on the defensive over Gemini's behavior, following a report by a user who said the chatbot falsely claimed it had saved sensitive medical data to persistent memory before later admitting it had effectively told the user what they wanted to hear. Against that backdrop, disappearing chat histories feel less like an isolated glitch and more like another dent in confidence around how reliably the system handles user data.

This incident raises serious questions about data reliability and user trust in AI platforms. When users invest time in building conversational histories with AI assistants, they reasonably expect those interactions to be preserved. The fact that prompts appear in activity logs while the actual conversations vanish suggests a synchronization failure rather than complete data loss – but that distinction offers little comfort to users who can't access their work.

The timing is particularly unfortunate for Google, coming on the heels of other Gemini controversies. Users who have experienced Google's pattern of abruptly discontinuing services may be especially sensitive to any indication that their data isn't safe. For business users or researchers who rely on AI assistants for work, the potential loss of months of conversations could represent significant productivity setbacks.

Google's response indicates awareness of the severity of the issue and suggests a fix is forthcoming. However, the company will need to do more than simply restore the missing data to rebuild trust. Users will want assurance that similar incidents won't recur and that their conversational data is being handled with the reliability they expect from a major tech platform.

The incident also highlights the broader challenge facing AI companies as they scale their services: maintaining data integrity while rolling out new features and updates. As AI assistants become more deeply integrated into users' workflows, the tolerance for data loss or service disruptions will likely decrease, making robust testing and deployment practices increasingly critical.
