
After years of hype, AR glasses are finally on the horizon. Snap says its next-gen smart glasses will launch in 2026. Meta has already sold more than two million pairs of its Ray-Ban smart glasses, and it's reportedly prioritizing a new version over the next update to its Quest line of VR headsets. Apple is rumored to be quietly working on its own version. These devices will be much less obtrusive than bulky VR headsets: they look (mostly) like regular glasses and promise real-time translation, video calls, and turn-by-turn directions, all floating in front of your eyes.
It sounds amazing. But will people actually wear them?
We’ve already seen how virtual reality (VR) goggles, while fun for gaming, have yet to become part of most people’s routines. People are typically wowed by the technology but don’t want to strap something to their face that cuts them off from the real world. Think of it like an amusement park: great fun, but not somewhere you’d go every day or even every week.
I’ve used both the Apple Vision Pro and the Meta Quest 3 VR headsets. Both are fun, but even if they were lighter and cheaper, they would remain niche products because they’re bulky devices strapped to your head. AR is much more subtle, but it still involves wearing something on your head.
For AR glasses to go mainstream, they need to solve real problems without being awkward or invasive. Think language translation while traveling, guided workouts, hands-free help for delivery drivers, or facial recognition to remind you who that work associate is. They could also be an easy way to interact with increasingly ubiquitous AI tools. But wearing something on your face is a much bigger, more personal ask than wearing a watch. If the tech feels weird or clashes with people’s personal tastes, it won’t be widely adopted.
For now, the promise is exciting. The reality? Still very much TBD.