A Bloomberg report indicates Apple is developing a major Siri overhaul, codenamed 'Campos,' that will transform the assistant into a standalone chatbot powered by Google's Gemini, set to launch with iOS 27, macOS 27, and iPadOS 27 in 2026.
Apple is preparing to fundamentally reinvent Siri, moving it from a voice-activated command tool to a full-fledged conversational chatbot. According to a Bloomberg report, the company is developing a new system internally codenamed "Campos" that will debut with its next major software updates: iOS 27, macOS 27, and iPadOS 27.
This isn't a minor update. The report describes a standalone app built into Apple devices that will replace the existing Siri entirely. The new system will support natural-language conversations similar to those users have with ChatGPT or Google's Gemini, and it can be activated with the traditional "Siri" wake word or by holding the side button on iPhones and iPads.

The Gemini Foundation
Perhaps the most significant detail in the report is the technology powering this transformation. The Siri chatbot will run on a custom model derived from Google's Gemini, following earlier confirmation that Apple had chosen Google's AI for its revamped assistant. That choice marks a notable shift in Apple's approach to AI integration.
The choice of Gemini as the underlying model represents a pragmatic decision by Apple. Rather than developing its own large language model from scratch—a resource-intensive process that has challenged even well-funded companies—Apple appears to be leveraging Google's established AI infrastructure while customizing it for its ecosystem. This allows Apple to focus on integration and user experience rather than foundational model development.
Capabilities and Functionality
The new Siri chatbot is expected to handle a wide range of tasks that go far beyond the current assistant's capabilities:
- Web searches and query answering: perform web searches and provide detailed responses to complex questions
- Content generation: generate images and summarize text, bringing it in line with modern AI assistants
- Document and content analysis: analyze documents users upload or content they share from the screen
- Device control: continue to manage device settings and system functions
- Personal data use: access personal data for select tasks while maintaining privacy controls
Crucially, the report notes that the Siri chatbot will be deeply integrated across Apple's apps. Rather than being a separate chat interface, it will be able to read in-app content and act on it: composing emails in Mail, creating calendar events, or managing photos in the Photos app, all while understanding the context of what's on screen.
The system will support both voice and text-based interactions, giving users flexibility in how they communicate with the assistant. However, the report mentions it may lack persistent memory features found in some rival chatbots, which could be a deliberate privacy-focused limitation.
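The report does not say how apps will expose actions to the new assistant, but Apple's existing App Intents framework, which already lets apps surface actions to Siri and Shortcuts, is a plausible foundation. The Swift sketch below is a minimal, speculative example of how an app could declare a calendar-event action today; the intent name, parameters, and dialog are hypothetical, and nothing in the report confirms this is the mechanism the new Siri will use.

```swift
import AppIntents

// Hypothetical example: an app-defined action that an assistant could invoke.
// Built on Apple's existing App Intents framework; names and parameters are
// illustrative only, and the EventKit call that would actually save the event
// is omitted.
struct CreateCalendarEventIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Calendar Event"
    static var description = IntentDescription("Creates a calendar event from a natural-language request.")

    @Parameter(title: "Event Title")
    var eventTitle: String

    @Parameter(title: "Start Date")
    var startDate: Date

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would write the event to the calendar via EventKit here.
        return .result(dialog: "Added \(eventTitle) to your calendar.")
    }
}
```

In a model like this, the chatbot would map a natural-language request onto the declared parameters and call perform(), leaving each app in control of what data and actions it exposes; whether Apple extends this framework or introduces something new for the Gemini-backed Siri remains to be seen.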

Ecosystem Implications
This transformation has significant implications for Apple's ecosystem strategy. By making Siri a full chatbot, Apple is addressing a key competitive gap. While Siri has been effective for basic tasks like setting timers or controlling smart home devices, it has fallen behind competitors in natural language understanding and conversational ability.
The deep integration across apps could create a more powerful ecosystem lock-in. If Siri can seamlessly interact with Apple's native apps and understand user context across the system, it becomes more valuable to stay within Apple's ecosystem. This is particularly important as competitors like Google and Samsung are also enhancing their AI assistants with similar capabilities.
However, the reliance on Google's Gemini technology creates an interesting dynamic. While Apple maintains control over the user experience and integration, it's outsourcing the core AI model to a competitor. This could lead to questions about data handling, model updates, and long-term strategic independence.
Timeline and Expectations
According to Bloomberg, Apple is expected to unveil the new Siri chatbot at WWDC 2026. This timeline aligns with the typical annual software update cycle, suggesting that iOS 27 will be the vehicle for this major transformation.
The rollout will likely be gradual, with the chatbot appearing in developer betas after WWDC and reaching the public in the fall of 2026. Given the scope of the changes, Apple may also need to update its privacy policies and data-handling practices to accommodate the new capabilities.
Competitive Context
This move represents Apple playing catch-up in the AI assistant space. Google Assistant has been evolving toward more conversational interactions, and Samsung's Bixby has been integrating with its own ecosystem. Microsoft's Copilot and OpenAI's ChatGPT have set new expectations for what AI assistants can do.
By choosing Gemini, Apple is essentially buying into the current state of the art in large language models rather than trying to build its own from scratch. This could allow Siri to leapfrog competitors in raw capability, though success will ultimately depend on how well Apple integrates these capabilities into its ecosystem and user experience.
The report also raises questions about hardware requirements. While the processing could be done server-side, allowing older devices to access the new features, the deep integration with device settings and personal data might require specific hardware capabilities. This could influence upgrade cycles for users with older iPhones or iPads.
What This Means for Users
For consumers, this transformation could make Siri significantly more useful. Instead of limited commands, users could have full conversations, ask complex questions, and get help with a wider range of tasks. The integration with Apple apps means less switching between applications to get things done.
However, the shift also raises questions about privacy and data usage. While Apple has historically emphasized privacy, the new capabilities will require more data processing and potentially more data sharing with Google's systems. How Apple manages these privacy considerations will be crucial for user trust.
The lack of persistent memory, as mentioned in the report, could be a privacy feature rather than a limitation. While some users might want their AI to remember past conversations, others might prefer that their interactions remain private and not stored for future reference.
Looking Ahead
The reported transformation of Siri represents one of Apple's most significant software changes in recent years. It signals a recognition that the current voice assistant model is no longer sufficient in an era of advanced AI chatbots.
The success of this initiative will depend on several factors: the quality of the Gemini integration, the seamlessness of the app ecosystem integration, and how well Apple balances capability with privacy. If executed well, it could revitalize Siri and make it a compelling reason to choose Apple devices. If poorly implemented, it could highlight the challenges of integrating third-party AI technology into a tightly controlled ecosystem.
As we approach WWDC 2026, expect more details to emerge about how this new Siri chatbot will work, what it will cost (if anything), and how it will be rolled out across Apple's device lineup. For now, the report provides a glimpse into Apple's vision for the future of AI on its platforms—a future where Siri is no longer just a voice assistant, but a full conversational partner.
