Meta announced an update on Tuesday that adds conversation-focused audio to its AI-powered smart glasses, aiming to make it easier to hear the person you're speaking with in loud environments. The feature will initially roll out on Ray-Ban Meta and Oakley Meta HSTN glasses in the U.S. and Canada.
The update reflects Meta’s broader push to position smart glasses not just as lifestyle accessories, but as practical, everyday tools.
How the Conversation-Focus Feature Works
First previewed at Meta's Connect conference earlier this year, the new conversation-focused mode uses the glasses' open-ear speakers and onboard AI to amplify the voice of the person you're speaking with while reducing surrounding noise.
Wearers can fine-tune the amplification level by swiping the right temple of the glasses or adjusting settings in the companion app. This allows users to adapt the audio boost to different environments, such as busy restaurants, bars, clubs, commuter trains, or crowded public spaces.
While real-world performance remains to be tested, the feature points toward a growing role for smart wearables in accessibility and situational awareness — areas traditionally handled by dedicated hearing devices.
Smart Glasses as Hearing Assistance Tools
Meta is not alone in exploring this space. Apple has already introduced similar capabilities in its AirPods lineup. Apple’s Conversation Boost helps users focus on nearby speech, and newer AirPods Pro models support a clinical-grade hearing aid feature in certain regions.
Meta’s approach differs by integrating hearing assistance into a wearable that also handles vision-based AI tasks, photography, and voice interaction—suggesting a future where assistive features are bundled into general-purpose consumer devices rather than standalone products.
Spotify Integration Links Vision to Action
Alongside the hearing update, Meta is also adding a more playful feature: Spotify-powered music playback triggered by what the user is looking at. If the glasses detect something visually relevant—such as an album cover, holiday decorations, or a themed setting—users can ask the glasses to play music that matches the moment via Spotify.
While the feature leans more toward novelty, it highlights Meta’s vision for contextual AI—connecting visual input to immediate actions across apps without pulling out a phone.
Availability and Regional Rollout
The conversation-focused feature will be limited to the U.S. and Canada at launch. The Spotify integration, however, will be available in English across a wider set of markets, including Australia, Austria, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, India, Ireland, Italy, Mexico, Norway, Spain, Sweden, the United Arab Emirates, the U.K., and the U.S.
Early Access First, Wider Rollout Later
The update, labeled software version 21, will first be available to users enrolled in Meta’s Early Access Program, which requires joining a waitlist and receiving approval. Meta says the update will roll out more broadly after the initial testing phase.
A Step Toward More Useful Wearables
Taken together, the updates underscore Meta’s strategy of pushing smart glasses beyond novelty and toward everyday usefulness. Features that improve hearing, respond to visual context, and reduce reliance on smartphones could help smart glasses carve out a clearer role—especially as competition in AI-powered wearables continues to intensify.
Whether users embrace these features as essential tools or optional enhancements may determine how quickly smart glasses move from niche devices to mainstream accessories.