Meta’s flagship Ray‑Ban Display glasses receive Neural Handwriting for air‑gesture typing, expanded maps, video capture, and an open SDK that lets third‑party apps run on the wearable, marking the most feature‑rich update since launch.
Meta rolls out Neural Handwriting and a full developer toolkit for Ray‑Ban Display

Meta’s Ray‑Ban Display – the glasses that overlay digital content onto the real world – have just received a major software bump, dubbed Update 125. After a long beta period limited to Messenger and WhatsApp, the Neural Handwriting feature is now live for everyone, and the update also opens the platform to third‑party developers for the first time.
Air‑gesture typing with the Neural Band
The centerpiece of the update is Neural Handwriting, a system that translates finger movements into text without a physical keyboard. Users wear the Neural Band that ships with the $800 glasses. The band uses surface electromyography (sEMG) sensors to read the tiny electrical signals generated when you move your fingers on any surface – a desk, your palm, even your thigh. Those signals are mapped to letters, letting you "write" messages with small finger strokes wherever your hand happens to rest.
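Meta has not published the decoding model behind Neural Handwriting, but the underlying task is a classic signal-classification problem: turn a feature vector extracted from wrist-muscle activity into a character. A toy sketch (invented templates, nearest-centroid matching – not Meta's actual pipeline) shows the general shape:

```python
# Illustrative toy only: the templates and features below are invented,
# not Meta's published model. A real system learns these from training data.
import math

# Hypothetical per-letter "templates": mean sEMG feature vectors.
TEMPLATES = {
    "a": [0.9, 0.1, 0.3],
    "b": [0.2, 0.8, 0.5],
    "c": [0.4, 0.4, 0.9],
}

def decode(features):
    """Return the letter whose template is closest in Euclidean distance."""
    def dist(template):
        return math.sqrt(sum((f - t) ** 2 for f, t in zip(features, template)))
    return min(TEMPLATES, key=lambda ch: dist(TEMPLATES[ch]))

print(decode([0.85, 0.15, 0.25]))  # closest to the "a" template -> "a"
```

In practice the decoder would operate on streaming sensor windows and use a learned model rather than fixed templates, but the input/output contract – muscle-signal features in, characters out – is the same.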
The feature works across iOS and Android and integrates with:
- Instagram Direct
- WhatsApp chat windows
- Facebook Messenger
- System‑level message notifications (search contacts, reply, send new messages)
Because the band captures muscle activity rather than relying on a camera, it works in low‑light conditions and respects privacy – no video of your hand is recorded.
New capture mode and map upgrades
Update 125 adds a display‑recording mode that stitches together three streams into a single video file:
- The image shown on the glasses’ micro‑display
- The view from the front‑facing camera (POV)
- Ambient audio captured by the built‑in microphone
This makes it easy to create tutorials or share a moment from a first‑person perspective without post‑production.
Maps have been expanded with richer points of interest and walking directions that now cover the entire United States as well as major European cities such as London, Paris, and Rome. Users can now save home and work locations and receive voice‑guided navigation directly in the glasses.
Messaging and social app refinements
- WhatsApp now supports group video calls and live captions for phone calls.
- Instagram gets smoother Reels browsing and a revamped DM navigation pane.
- Facebook adds widgets for birthdays and live sports scores, visible as overlay cards.
Opening the platform to developers
Perhaps the most strategic change is the Device Access Toolkit SDK, available for both iOS and Android. Developers have three paths to bring apps to the glasses:
- Native integration – add a Ray‑Ban Display module to an existing mobile app, letting users launch a glasses‑specific UI from the phone.
- Dedicated native app – build a full‑screen experience that runs solely on the wearable.
- WebApp – with extra work, a responsive web app can be packaged to look and behave like a native glasses app.
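The WebApp path largely comes down to responsive design for an unusually small display. As a hypothetical illustration (the viewport thresholds below are invented, not Meta's published specs), an app might select a stripped-down layout tier from the reported viewport size:

```python
# Hypothetical sketch: the pixel thresholds are assumptions for illustration,
# not documented Ray-Ban Display viewport dimensions.
def choose_layout(width_px, height_px):
    """Pick a UI layout tier based on the viewport the runtime reports."""
    if width_px <= 640 and height_px <= 400:  # assumed glasses-class display
        return "glasses"   # minimal overlay cards, large text, no chrome
    if width_px <= 900:
        return "phone"     # standard mobile layout
    return "desktop"       # full layout

print(choose_layout(600, 360))  # -> "glasses"
```

In a real web app the same branching would live in CSS media queries or a client-side viewport check; the point is that a single responsive codebase can serve phone and glasses from one deployment.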
Early community projects already showcase the possibilities:
- A YouTube player that streams video directly to the display.
- An aviation dashboard showing flight data and weather overlays.
- A grocery‑list assistant that lets you add items by air‑gesture typing.
- Transit navigation tools that combine live bus times with AR directions.
- Simple games that use hand gestures for control.
The SDK and accompanying documentation are hosted on Meta’s developer portal: Device Access Toolkit.
What this means for the ecosystem
Meta’s move signals a shift from a closed, hardware‑first approach to a more open, app‑centric model. By lowering the barrier for third‑party content, the glasses can start to host a variety of utility apps that make the $800 price tag feel more justified. The addition of Neural Handwriting also addresses a long‑standing criticism: the lack of a practical text input method.
For users already invested in Meta’s social ecosystem, the update deepens integration – messages, Reels, and Facebook widgets are now first‑class experiences on the glasses. For developers, the open SDK offers a chance to experiment with AR overlays, contextual information, and hands‑free interaction without building a custom hardware stack.
Looking ahead
Meta has hinted that future updates will bring more advanced AI assistants and tighter integration with its upcoming smartwatch. If the current rollout is any indication, the company is positioning the Ray‑Ban Display as a versatile companion for everyday tasks rather than a niche novelty.
For the full changelog and download instructions, see the official Meta announcement page.
