Meta made significant strides in blending artificial intelligence, augmented reality, and wearable hardware at its Connect 2025 event, moving beyond camera-focused smart glasses to true display-enabled computing worn on the face. The flagship announcement was the Meta Ray-Ban Display, a radical evolution of its Ray-Ban collaboration featuring a waveguide display projected over the user's right eye.

The new Meta Ray-Ban Display glasses project information directly into the wearer's field of view. (Image: ZDNET)

Beyond the Lens: The Display Breakthrough
The Ray-Ban Display represents Meta's first consumer device with an integrated visual interface. Key technical specs revealed include:
* A high-resolution waveguide display reaching 5,000 nits of brightness
* Integration with the Meta Neural Band – an sEMG (surface electromyography) wristband translating subtle muscle movements into controls
* Support for displaying live translations, navigation paths, messages, and contextual AI information
* Transitions lenses that darken in sunlight, enabling use as smart sunglasses
Priced at $799 (including the Neural Band) and launching September 30, the glasses aim to deliver always-available contextual computing. In a live demo, CEO Mark Zuckerberg showed off gesture controls such as mimicking handwriting to send texts and turning an imaginary dial to adjust music volume, though the tech hit a snag during a video call attempt.

Augmenting the Ecosystem
Meta didn't neglect its existing wearable line:
1. Meta Ray-Ban Gen 2: An iterative update ($379) with double the battery life (up to 6 hours of streaming), 3K video recording, new colors including transparent blue, and a software-based Audio Boost feature that amplifies real-world conversations (also coming to Gen 1 via update).
2. Oakley Meta Vanguard ($499): Targeting athletes, this rugged, IP67-rated model (shipping Oct. 21) features a centered 12MP camera for 3K video and Strava integration for real-time workout stat announcements.

The AI & Metaverse Angle
Beyond hardware, Meta emphasized software integration:
* Live AI: Contextual AI features will evolve post-launch on the Display glasses, providing real-time information overlays based on surroundings and conversation.
* Meta Horizon Studio: Unveiled as a generative AI playground powered by the new Meta Horizon Engine. Users can create intricate virtual worlds (e.g., UFC octagons, underwater scenes) via voice prompts, signaling a push towards more accessible metaverse creation tools.
* Meta Horizon TV: A new hub aggregating streaming services like Netflix and Disney+ with Dolby Atmos and Dolby Vision support, aiming to be the central entertainment platform within Meta's ecosystem.

Analysis: The Wearable Platform Play
Meta's announcements signal a cohesive strategy:
* Moving Beyond Camera-First: The Display glasses shift focus from passive recording to active information display and interaction.
* Gesture as Primary Interface: The Neural Band represents a bet on sEMG as a more intuitive, private control scheme than voice or touch for wearables.
* Tiered Market Approach: From accessible Ray-Ban Gen 2 to premium Display and specialized Oakley models, Meta targets diverse user needs and price points.
* AI as the Glue: Agentic AI, contextual awareness, and generative creation tools (Horizon Studio) are positioned as the core intelligence layer connecting hardware and software experiences.

The technical hiccup during Zuckerberg's demo underscores the challenges of seamlessly integrating these nascent technologies. However, Meta's aggressive push into display-based AR glasses, coupled with novel input methods and deep AI integration, marks a pivotal moment. It moves wearable computing closer to the long-promised vision of contextually aware, always-available augmentation, and it forces competitors and developers to take note of the evolving platform. Success now hinges on delivering the reliability that faltered on stage and convincing users that the display offers daily utility beyond novelty.

Source: ZDNET Live Coverage & Announcements from Meta Connect 2025