It's been a shitty year for AI wearables.
Just a few months ago, the tech world was convinced that AI hardware could be the next big thing. It was a heady vision backed up by futuristic demos and sleek hardware. At the heart of the excitement were the Humane AI Pin and the Rabbit R1. Both promised a grandiose future. Neither delivered on its promises.
In the gadget world, this is old news. Smart glasses and augmented reality headsets experienced a similar hype cycle a decade ago. Google Glass promised a future where reality is overlaid with helpful information. In the years since, Magic Leap, Focals By North, Microsoft's HoloLens, Apple's Vision Pro, and most recently the new Snapchat Spectacles have tried to keep the vision alive, but without any real commercial success.
Given all of that, it's a little ironic that the best shot at a functional AI wearable is a pair of smart glasses – namely, the Ray-Ban Meta smart glasses.
The funny thing about the Meta smart glasses is that nobody expected them to be this successful. Partly because the first version, the Ray-Ban Stories, was a total flop. Partly because they didn't bring any new ideas to smart glasses. Bose had already made stylish audio sunglasses and then discontinued them. Snap's Spectacles had already tried short-form video recording for social media, and that clearly wasn't enough either. On paper, there was no compelling reason why the Ray-Ban Meta smart glasses would resonate with people.
And yet they have succeeded where other wearable AI devices and smart glasses failed, even exceeding Meta's own expectations.
Much of it boils down to the fact that Meta finally nailed the style and execution. Compared to the Stories, the Meta glasses come in a much wider range of styles and colors; you're almost guaranteed to find something that looks good on you. In this respect, Meta was smart enough to understand that the average person doesn't want to look like they just stepped out of a science fiction movie. They want to look cool by today's standards.
At $299, they're expensive but affordable compared to a $3,500 Vision Pro or a $699 Humane Pin. The audio quality is good. The call quality is surprisingly excellent thanks to a well-positioned microphone in the nose bridge. And unlike the Ray-Ban Stories or earlier Snap Spectacles, the photo and video quality is good enough to post to Instagram without shame – especially in the age of content creators, where Instagram Reels and POV-style TikToks reign supreme.
This is a device that fits easily into people's lives now. There's no future software update to wait for. It isn't a solution in search of a problem. And that, more than anything, is why the Ray-Bans have a real shot at making AI wearables stick.
That's because AI is already built in – it's just a feature, not the whole point. You can use it to identify objects you come across or learn more about a landmark. You can ask Meta AI to write dodgy captions for your Instagram post or translate a menu. You can video call a friend and let them see what you see. All of these use cases make sense for a device that sits on your face.
In practice, these features are a bit shaky and inelegant. Meta AI has yet to write me a good Instagram caption, and it often can't hear me well in noisy environments. But unlike the Rabbit R1, it works. Unlike Humane's Pin, it doesn't overheat, and there's little latency because processing is offloaded to your phone. Crucially, unlike those two devices, when the AI shits the bed, the glasses can still do everything else well.
That's good enough. For now. But the pressure will only increase. Meta's bet is that if people can get comfortable with simpler smart glasses, they'll be ready for full-on face computers once AI – and eventually AR – matures into prime time.
Meta has proven the first part of that equation. But for the second part to come true, the AI can't just be okay or occasionally useful. It has to be genuinely good. It has to make the leap from “Oh, that's kind of handy when it works” to “I wear smart glasses all day because my life is so much easier with them than without.” Right now, many of the Meta glasses' AI features are nice, but they're basically party tricks.
That's a tall order, but of everyone in the field, Meta seems to have the best shot at pulling it off right now. Style and wearability aren't an issue. The company just signed a deal with EssilorLuxottica to extend its smart glasses partnership beyond 2030. With a long-term hardware plan in place, incremental improvements like better battery life and lighter frames are possible. Now it remains to be seen whether Meta can deliver on the rest, too.
Next week, the company will get a chance to prove it at its Meta Connect event. The timing couldn't be better. Humane's daily returns are outpacing its sales. Critics accuse Rabbit of being little more than a sham. Analysts aren't convinced that Apple's big AI-driven “supercycle” with the iPhone 16 will even materialize. A win here wouldn't just cement Meta's lead – it would help keep the dream of AI hardware alive.