
Meta Ray-Bans Are Getting AI Improvements and I Got to Test Them Out

Last year's glasses will continue to get software upgrades to make one of my favorite gadgets even better.

Scott Stein, Editor at Large

Meta's second-gen Ray-Bans are getting improved AI services soon.

Scott Stein/CNET

In a year of promising AI gadgets, Meta's second-gen Ray-Ban smart glasses ended up being one of the best, and something I kept on my face far longer than I ever expected. They've been an unexpected hit for Meta, too, and a vehicle for exploring camera-enabled AI features that can analyze the real world on the fly. At Meta's Connect conference, a show dedicated to VR, AR and a lot of AI, Meta announced new updates for the glasses Wednesday, though no new hardware.

Watch this: I Tried Out Meta's More Affordable Quest 3S and New Ray-Ban AI Features

Instead, last year's versions will see new features such as live translation, camera recognition of QR codes and phone numbers, and support for AI analysis of live video recording. I got to check out just a few of those features at a press event before the announcement, using a pair of Ray-Bans perched over my own glasses (I didn't bring contact lenses).

The demos I walked through showed how I could look at a QR code with my glasses and automatically open the linked website on my phone after the glasses snapped a photo. I looked at a model street full of toy cars and asked the glasses to describe the cars in front of me; a photo was snapped without me having to say the usual "look" trigger word. I also tried deeper music control, making specific track requests in Spotify (support for Apple Music and Amazon Music is coming, too). All of these features will work on iOS and Android through the Meta View app.

I'm most interested in the live translation feature, and how quickly it might respond to actual conversations. I also wonder how AI analysis of recorded video clips will work. At some point, camera-equipped AI glasses might have a much more continuous awareness of my world, but taking photos and videos drains battery life, something the Meta Ray-Bans already struggle with over the course of a full day. According to Meta, battery life improvements are also in the works, which I'd love to see, although I'm not sure how the company will pull that off.

My demos had a few connection hiccups, too, something I've experienced at times using Meta Ray-Bans with my iPhone. For example, voice requests for music playback sometimes hang in a way that Siri requests with AirPods don't. Regardless, expect the Meta Ray-Bans to keep getting better. That's good news for anyone who already has a pair, although I wonder when Meta will make strides toward a third-gen version -- maybe next year.