One thing is clear in 2025: Smart glasses are real. I've worn them. Meta's Ray-Bans not only look normal, but they're successful: According to Counterpoint Research, over a million have already been sold.
By no means does that mean smart glasses are the next iPhone (or even AirPods), but I've found them on my face a lot -- and they have their uses. The same goes for display glasses like the Xreal One; all I can say is I'm ready to carry smart glasses fitted with my prescription in my bag wherever I go.
Read more: The official Best of CES winners, awarded by CNET Group
But will everyone else? This year's CES showed off a handful of requisite smart glasses once again, many of them promising -- no surprise -- AI. They look perfectly normal, or normal enough. And acceptance of smart glasses, or at least the way they look, is changing. Most people don't know I'm wearing Meta Ray-Bans.
But that's also a concern. The man who killed 14 people in New Orleans on New Year's Day wore Meta's Ray-Bans leading up to his attack, according to the FBI. The glasses don't do much more than phones already can -- they record video, take photos, make calls, play music, and they have camera-connected AI services. Authorities are not yet sure whether these services aided in his attack, but it's worth raising the question as these smart glasses are the beginning of a wave of wearable devices that will always have AI services on.
The glasses found at CES 2025 looked more real than they ever have before. Halliday's glasses, for instance, resemble something you'd pick up at LensCrafters, but they also have a tiny monochrome display perched above the frame that can show notifications or AI-delivered text. That circular display sits at the top of your field of view, and the glasses can do things like translate language in real time.
The RayNeo X3 Pro, a pair of AR glasses, have cameras, enable hand tracking and have dual displays built into the clear lenses in a smaller size than a pair I wore last year. They work with a wrist-worn neural band made by Mudra that, similar to Meta's prototype Orion glasses, can detect small finger movements and use them to control apps.
Meta's Ray-Bans can already analyze the world in real time with cameras and AI, but their reliability and phone connectedness are spotty.
The largest missing smart glasses ingredient is also the clearest
It's all very futuristic, but a big part of the picture is still missing: better connection with our phones. Most smart glasses still need to pair with phones to work, like Meta's Ray-Bans. The pairings are the weakest link. With Ray-Bans, I can use Meta AI or play music or sync photos that I take, but the connection can drop, and the glasses can't access or control anything else on my phone. They're not deeply linked like AirPods or the Apple Watch, or Google's Buds or Pixel Watch. That'll change slowly, starting later this year.
Google's Android XR, a planned framework for glasses and VR headsets to deeply link into phones and Google's Gemini AI, could make these glasses work a lot more fluidly with Android phones. Demos of Google's own smart glasses I tried in December had always-on AI modes, and Google promised they would connect with phones as well. A mixed-reality headset by Samsung can run Android apps.
Apple could and should do the same thing for iPhones, but nothing's happened yet. The Apple Vision Pro, oddly enough, doesn't pair directly with iPhones. Instead, it shares common apps and cloud services. A pair of Apple glasses could have the same sort of deeper phone hook-ins as the Apple Watch and AirPods, but that product is at best a far-off rumor right now.
It seems Google is taking small steps with Android XR this year. Samsung's larger Vision-Pro-like headset will be the first Android XR device, and glasses will follow later. Xreal is one of Google's first Android XR partners, but its most recent Xreal One glasses aren't meant to be worn all day: They're more like plug-in displays. Still, they may be among the first to be Android XR connected, along with Samsung's own smart glasses.
The Xreal One, a pair of display glasses that can connect to phones, already have onboard processors that can help anchor virtual content in place.
Deep phone integration is what will make any of these glasses start to feel absolutely necessary instead of just a novelty. I love the Meta Ray-Bans, but I do not love Meta's restricted relationship with my phone…or Meta's social media policies, for that matter. There should be easier ways for glasses to pick and choose AI services on my phone or act as a peripheral, more like the way earbuds or watches do. I have a feeling that'll be more on the agenda in 2026 than 2025, though.
For now, though, these glasses really aren't weird-looking anymore. Seriously. They look good! Now, they just need to work better with everything else.