Steven Levy’s Wired piece predicts an Apple‑centric AI product that will replace the iPhone’s role. A closer look at Apple’s history of embedding technology inside consumer‑focused experiences suggests that the claim that AI will make the phone obsolete ignores both practical constraints and the company’s proven product philosophy.
Apple’s AI Strategy: Technology vs. Product
By John Gruber – May 16, 2026
In a recent Wired column, Steven Levy argued that Apple’s next CEO must launch a “killer AI product” or risk losing relevance. The column drew on a conversation with Apple’s senior vice president of hardware engineering, John Ternus, who emphasized that Apple never ships a technology for its own sake; it ships experiences that hide the underlying tech. The tension between these two viewpoints highlights a broader question: will AI become a product category the way the iPhone did, or will it remain a set of capabilities embedded across Apple’s existing lineup?
What Levy Claims
Levy’s piece paints a future where an “always‑on AI agent” anticipates user needs, calls a ride‑share without a tap, and essentially eliminates the need for a phone interface. He suggests that by the end of the decade the iPhone ecosystem could be bypassed entirely, with AI agents handling tasks that currently require apps.
What Apple Actually Says
During a brief interview, Ternus described AI as “an immense kind of inflection point” but framed it as just another layer that will be woven into Apple’s hardware and software. He repeated a familiar Apple mantra:
“We never think about shipping a technology. We want to ship amazing products, features, and experiences, and we don’t want our customers to think about what [underlying] technology makes it possible.”
In practice, this means AI will be a feature set—Siri improvements, on‑device inference, and generative tools in iOS and macOS—rather than a standalone device.
Why the “AI‑Only” Product Idea Is Shaky
1. Hardware constraints
Even if an AI model can run on‑device, it still needs a sensor array, a power source, and a user interface. Apple’s current flagship, the iPhone 17 Pro, already packs a 3‑nanometer A‑series chip, a LiDAR scanner, and a high‑resolution display. Building a smaller, always‑listening device that can reliably capture voice, display status, and provide haptic feedback would require a new form factor that competes head‑on with the iPhone’s capabilities.
2. Interaction latency and reliability
Levy’s scenario assumes near‑zero latency between user intent, AI processing, and actuation (e.g., ordering a ride). In the real world, network jitter, model quantization errors, and edge‑case speech recognition failures introduce delays. Apple’s own experience with Siri shows that a voice‑first interface still needs a fallback visual confirmation to avoid misfires.
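The fallback pattern described here is essentially a confidence gate: act immediately only when the recognizer is sure of what it heard, otherwise surface a visual confirmation step. A minimal sketch, assuming a recognizer that reports a per‑utterance confidence score (the threshold and function names are hypothetical, not Apple’s):

```python
from dataclasses import dataclass

@dataclass
class Transcription:
    text: str
    confidence: float  # 0.0–1.0, reported by the speech recognizer

# Hypothetical cutoff; a real system would tune this per task,
# and likely set it higher for irreversible actions like payments.
CONFIRM_THRESHOLD = 0.90

def dispatch(command: Transcription) -> str:
    """Decide whether a voice command executes directly or
    requires an on-screen confirmation before acting."""
    if command.confidence >= CONFIRM_THRESHOLD:
        return "execute"           # high confidence: frictionless action
    return "confirm_visually"      # low confidence: show the user what was heard

# A clearly heard ride request runs straight through;
# a garbled one falls back to visual confirmation.
```

The point of the sketch is that the “frictionless” path only exists above the threshold; every misrecognition below it reintroduces exactly the screen interaction Levy’s scenario assumes away.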
3. Ecosystem lock‑in
Apple’s services—App Store, Apple Pay, iCloud—are tightly coupled to the iPhone’s identity. An AI agent that operates independently would need new authentication mechanisms, billing pipelines, and privacy safeguards. Until Apple builds a secure, user‑controlled identity system that works without a phone, the iPhone remains the most convenient anchor for any AI‑driven transaction.
What Is Actually New?
On‑device generative models
Apple announced the A‑Series 4 chip with a dedicated Neural Engine capable of 30 TOPS (trillion operations per second) while staying under 2 W. Early benchmarks show it can generate a 4,000‑token passage in under 200 ms, comparable to the smallest GPT‑2 variants. This is a concrete step toward offline AI that respects user privacy.
Source: Apple’s WWDC 2026 keynote (video)
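Those numbers are at least internally consistent. A back‑of‑envelope check, assuming a GPT‑2‑small‑class model (roughly 124 M parameters) and the common rule of thumb that a forward pass costs about 2 × parameters operations per generated token:

```python
# Back-of-envelope: compute demand of the claimed generation speed.
params = 124e6               # GPT-2 small parameter count (approximate)
ops_per_token = 2 * params   # rough rule of thumb for one forward pass

tokens = 4000
seconds = 0.200

tokens_per_sec = tokens / seconds                      # 20,000 tokens/s
required_tops = ops_per_token * tokens_per_sec / 1e12  # ~5 TOPS

print(f"{required_tops:.1f} TOPS required")  # prints "5.0 TOPS required"
```

About 5 TOPS of sustained compute, comfortably inside the claimed 30 TOPS budget. The caveat is that autoregressive generation is usually memory‑bandwidth bound rather than compute bound, so raw TOPS alone does not guarantee that throughput.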
Integration with existing services
- Siri 2.0 now supports on‑device summarization of emails and calendar events, reducing the need for cloud calls.
- Apple Vision Pro includes a “personal AI assistant” that can annotate AR scenes, but it still requires the headset’s display and battery.
These enhancements are incremental, not a wholesale product launch.
Limitations and Open Questions
| Area | Current State | What’s missing |
|---|---|---|
| Hardware | Powerful SoCs in phones and AR headsets | Dedicated low‑power always‑on AI wearables |
| Privacy | On‑device inference for many tasks | Transparent user controls for continuous listening |
| Ecosystem | Apps, payments, and identity tied to iPhone | Seamless cross‑device authentication without a phone |
| User experience | Voice + visual confirmation | Purely voice‑driven, frictionless transactions |
Until these gaps are closed, the claim that AI will make the phone obsolete is more speculation than engineering reality.
How Apple Might Actually Move Forward
- Incremental feature rollout – Continue to embed AI into existing products (e.g., smarter photo organization, on‑device translation) rather than launching a new hardware class.
- Developer tooling – Expand Core ML with support for larger generative models, giving third‑party apps the ability to build AI‑rich experiences without Apple having to ship a dedicated device.
- Privacy‑first defaults – Offer users a clear toggle for “always‑on listening” that defaults to off, addressing the creep factor that Levy’s vision ignores.
If Apple follows its historic playbook, the next big AI‑related headline will be a feature that makes the iPhone feel smarter, not a new gadget that replaces it.
Bottom Line
Levy’s headline‑grabbing scenario conflates technology with product. Apple’s track record shows it prefers to hide the tech behind polished experiences. The hardware, latency, and ecosystem challenges mean that an “always‑on AI agent” that eliminates the phone is unlikely before 2035, if ever. Expect AI to become a pervasive layer across Apple’s current devices rather than a standalone killer product.
Image credit: Daring Fireball 
