While Google prepares to revive its Google Glass concept, Meta is pressing ahead with new artificial intelligence capabilities set to arrive on its smart glasses this summer. The Ray-Ban smart glasses, built in collaboration with Meta, will receive several AI upgrades for users in the United States and Canada.
Meta’s AI-Powered Smart Glasses
With the Meta View app on a paired smartphone, Ray-Ban smart glasses wearers will be able to say “Hey Meta, start live AI” to give Meta AI a live feed of whatever is in view through the glasses.
Much like Google’s Gemini demonstration, users will be able to ask Meta AI conversational questions about what it sees and how to handle the task at hand. In one example Meta provided, the assistant suggests alternatives to butter after scanning the contents of a pantry through the glasses’ camera.
Users will also be able to ask targeted questions about objects in view even without enabling the live AI feature.
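For readers curious how such a session might be wired together, the following purely illustrative Python sketch pairs camera frames with spoken questions. The capture_frame and ask_assistant helpers are hypothetical placeholders for this example and are not part of any Meta SDK.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """A single still image grabbed from the glasses' live camera feed."""
    jpeg_bytes: bytes


def capture_frame() -> Frame:
    # Placeholder: a real implementation would pull the latest frame
    # streamed from the glasses via the paired phone.
    return Frame(jpeg_bytes=b"")


def ask_assistant(frame: Frame, question: str) -> str:
    # Placeholder: a real implementation would send the frame plus the
    # spoken question to a multimodal model and return its spoken reply.
    return f"(assistant reply about: {question!r})"


def live_ai_session(questions: list[str]) -> None:
    """Pair each conversational question with the current camera view."""
    for question in questions:
        frame = capture_frame()
        print(ask_assistant(frame, question))


if __name__ == "__main__":
    live_ai_session(["What can I use instead of butter for this recipe?"])
```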
Elevating User Experience with New Features
In addition to seasonal aesthetic updates, Ray-Ban’s smart glasses will support a “Hey Meta, start live translation” command for real-time translation of incoming speech in English, French, Italian, and Spanish. The glasses’ built-in speakers deliver the translation as the other person talks, and users can also show a translated transcript on their smartphone screen for the other party to read.
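Conceptually, live translation is a speech-to-speech pipeline with two outputs: audio for the wearer and a transcript for the phone screen. The sketch below is a minimal illustration under that assumption; the transcribe, translate, and synthesize helpers are stand-ins, not Meta’s actual components.

```python
from dataclasses import dataclass

# Languages named by Meta for live translation.
SUPPORTED_LANGUAGES = {"en", "fr", "it", "es"}


@dataclass
class TranslationResult:
    spoken_audio: bytes   # played through the glasses' built-in speakers
    transcript: str       # shown on the paired phone for the other person


def transcribe(audio: bytes, language: str) -> str:
    # Placeholder speech-to-text step.
    return "(transcribed speech)"


def translate(text: str, source: str, target: str) -> str:
    # Placeholder machine-translation step.
    return f"(translation of {text!r} from {source} to {target})"


def synthesize(text: str, language: str) -> bytes:
    # Placeholder text-to-speech step.
    return text.encode()


def live_translate(audio: bytes, source: str, target: str) -> TranslationResult:
    """Translate incoming speech and route output to speakers and screen."""
    if source not in SUPPORTED_LANGUAGES or target not in SUPPORTED_LANGUAGES:
        raise ValueError("Live translation covers English, French, Italian, Spanish")
    text = transcribe(audio, source)
    translated = translate(text, source, target)
    return TranslationResult(
        spoken_audio=synthesize(translated, target),
        transcript=translated,
    )


if __name__ == "__main__":
    result = live_translate(b"", source="fr", target="en")
    print(result.transcript)
```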
Addressing Privacy Concerns with Meta AI Glasses
Privacy is a significant consideration with these smart glasses. Inna Tokarev Sela, CEO and founder of AI data firm illumex, noted that people often spot the recording indicator light on Ray-Ban smart glasses, which can make them uneasy about being recorded. Concerns cover both being filmed by strangers and Meta’s data collection practices.
“In the new models, users can control the notification light, which could raise privacy issues,” Sela commented. “However, capturing videos is common at tourist attractions, public events, and more. I anticipate that Meta will not share any information about individuals unless they explicitly provide consent through registration.”
This could present additional hurdles regarding consent, depending on how users employ the recording capabilities. “For example, users should have the option to opt-in and select what information to expose when someone is within their frame—similar to platforms like LinkedIn,” Sela explained. “Naturally, any recordings generated by the glasses should not be admissible in court without explicit consent, akin to other recording devices.”
Additional Enhancements and Release Timeline
New Commands and Compatibility Features
Complementing the AI updates, Ray-Ban’s smart glasses will let users post directly to Instagram or send messages on Messenger via voice commands. Enhanced compatibility with music streaming services will also allow them to listen to Amazon Music, Apple Music, and Spotify straight through the glasses, with no need for earbuds.
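As a rough illustration of how spoken commands could be routed to these actions, here is a toy Python sketch. The keyword matching and action names are assumptions made for this example only, not Meta’s actual command grammar.

```python
def handle_voice_command(command: str) -> str:
    """Map a spoken command to one of the supported hands-free actions."""
    text = command.lower()
    if "instagram" in text:
        return "share latest capture to Instagram"
    if "messenger" in text or "message" in text:
        return "dictate and send a Messenger message"
    for service in ("amazon music", "apple music", "spotify"):
        if service in text:
            return f"start playback from {service.title()}"
    return "hand the request to Meta AI"


if __name__ == "__main__":
    for cmd in (
        "Hey Meta, post this to Instagram",
        "Hey Meta, send a message on Messenger",
        "Hey Meta, play my playlist on Spotify",
    ):
        print(cmd, "->", handle_voice_command(cmd))
```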
Feature Rollout and Availability
Meta has announced that these new features will be rolled out over the spring and summer, with object recognition updates for European Union users scheduled for late April and early May.