Weekend bike rides have long been a cherished escape; each push of the pedals helps work off the accumulated stress of the week. Over time, I’ve curated a collection of gadgets to elevate these journeys. But I’ve found that carrying too much gear can detract from the experience, forcing me to manage notifications and battery levels instead of simply enjoying the ride. If you’re in the market for smart glasses, here’s my experience.
Enter Ray-Ban Meta: smart glasses that have simplified my weekend rides and added a touch of enjoyment.
Instead of juggling sunglasses, headphones, and my phone for photos, I now utilize a single device that handles everything.
The Ray-Ban Meta smart glasses have resonated with a broader audience. Meta reports that it has sold millions of these devices, with CEO Mark Zuckerberg noting a threefold increase in sales over the past year.
Online forums and video reviews suggest that many individuals are using Ray-Ban Meta glasses while biking. Meta is aware of this trend and is reportedly developing a new generation of AI smart glasses with Oakley, specifically tailored for athletes.
Initially, I didn’t anticipate using my Ray-Ban Metas on bike rides. However, several months ago, I decided to experiment with them.
Now, I find myself wearing these glasses more often on bike rides than in any other setting. Meta has struck a chord with these smart glasses, convincing me of their potential. They offer a user experience that, while imperfect, is genuinely enjoyable, and with a few enhancements they could become something truly exceptional.
A crucial advantage of Ray-Ban Meta glasses lies in their fundamental quality as a pair of Ray-Ban sunglasses. I own the Wayfarer style, equipped with transition lenses and a transparent plastic frame.
I’ve found that these perform admirably on bike rides, shielding my eyes from sunlight, debris, and pollen. They provide a snug fit beneath a bike helmet, though not a seamless one (more on that later).
A notable feature of the Meta smart glasses is the camera built into the front corner of the frames. This allows me to capture photos and videos of scenes during my rides by simply pressing a button on the top right corner of the frames, eliminating the need to fumble with my phone, which can feel cumbersome and hazardous while cycling.
During a recent ride through San Francisco’s Golden Gate Park, I captured images of Blue Heron Lake, the park’s shrub-covered dunes along the Pacific Ocean, and the tree-lined path at the park’s entrance using the Ray-Ban Meta glasses.
While the camera isn’t exceptional, it’s reasonably effective, and I’ve managed to capture scenes that I would have otherwise missed. Therefore, I don’t view the camera as a replacement for my phone’s camera, but rather as a supplementary tool for capturing more photos and videos.
I frequently use the open-ear speakers in the arms of the glasses, which allow me to listen to podcasts and music while still hearing the sounds of people, cyclists, and cars around me. Meta isn’t the first to integrate speakers into glasses – Bose has offered a reliable pair for years. However, Meta’s approach to open-ear speakers is surprisingly competent. I’ve been impressed with the audio quality and how little I miss conventional headphones on these rides.
I’ve also engaged with Meta’s AI assistant during my weekend rides, posing questions about the natural environment and the historical context of buildings I encountered.
Since I usually cycle to disconnect, conversing with an AI chatbot seemed counterintuitive. However, I’ve discovered that these interactions sparked my curiosity without overwhelming me with notifications and endless content, which often occurs when I use my phone.
The key advantage is having all of these capabilities in one device.
This translates to fewer items to charge, less clutter in my biking equipment, and fewer devices to monitor during my rides.
Navigating the Road Bumps: Challenges with Ray-Ban Meta Smart Glasses
Though the Ray-Ban Meta glasses are suitable for walking, their design isn’t optimized for cycling.
The Ray-Ban Meta glasses frequently slip down my nose during bumpy rides. When I bend over on the bike and look up, the thick frames obstruct my view. (Most cycling sunglasses feature thin frames and nose pads to address these issues.)
The Ray-Ban Meta glasses also exhibit limitations in their compatibility with other apps. While I appreciate the ability to capture photos and pause music, I still need to access my phone for other functions.
For instance, the Ray-Ban Meta glasses offer Spotify integration, but requesting specific playlists from the AI assistant proved challenging. Sometimes, the glasses wouldn’t play anything, or they would play the wrong playlist.
I’d like to see these integrations improved and expanded to include biking-specific apps like Strava or Garmin.
The Ray-Ban Meta glasses also don’t integrate seamlessly with my iPhone, likely due to Apple’s strict policies.
Ideally, I’d like to send texts or navigate using Apple Maps through my Ray-Ban Meta glasses, but these features may remain unavailable until Apple releases its own version of smart glasses.
That leaves Meta’s AI assistant. The AI feature is frequently promoted as the main advantage of these glasses, but I found its performance lacking.
Meta’s voice AI is less compelling than other voice AI products from OpenAI, Perplexity, and Google. Its AI voices sound more robotic, and I find its responses to be less reliable.
I tested Ray-Ban Meta’s live video AI sessions, which were initially presented at Meta Connect. This feature streams live video and audio from Ray-Ban Meta into an AI model, designed to create a more integrated experience with the AI assistant and allow it to “see” your perspective. However, the feature failed to provide accurate responses.
I challenged Ray-Ban Meta to identify cars I passed while cycling near my apartment. The glasses incorrectly identified a modern Ford Bronco as a vintage Volkswagen Beetle, despite the clear differences in appearance. Later, the glasses confidently identified a 1980s BMW as a Honda Civic. While closer, they are still distinctly different cars.
During the live AI session, I requested the AI to identify plants and trees. The AI identified a eucalyptus tree as an oak tree. When I corrected it, the AI acknowledged, “Oh yeah, you’re right.” These experiences raise questions about the value of AI assistance.
Google DeepMind and OpenAI are also developing multimodal AI features similar to Meta’s live sessions. For now, all of them feel unfinished.
I hope to see an improved version of AI smart glasses that are suitable for bike rides. The Ray-Ban Meta glasses represent one of the most promising AI devices I’ve experienced, and, with a few critical improvements, riding with them would be a joy.