This story is part of CNET's look at how the world will continue to evolve in 2022 and beyond.
I had every intention of wearing Meta's (Facebook's) Ray-Ban Stories glasses, equipped with cameras, microphones and speakers, as my everyday glasses. Then something funny happened on the way to the optical shop.
At LensCrafters, they wouldn't install the special prescription lenses I had made for the glasses. According to a shop employee, they couldn't service the glasses; there was even an internal memo about it. The glasses needed to be made to order with the prescription lenses preinstalled, at least according to everyone I asked about it.
This underlines a big problem with smart glasses, still, at the end of 2021: these things aren't ready to be your everyday prescription glasses yet, much less full-on AR devices.
That’s a problem I see across the board, as multiple companies are promising some wild new vision of advanced augmented reality eyewear, assuming I’ll just drop them onto my face whenever I want. First, that requires these glasses to work as…well, real everyday glasses.
I haven't seen any attempts beyond Facebook's Ray-Ban partnership to move smart eyewear into actual optical shops. Amazon's Echo Frames approach normal glasses, too, but you can't just hop over to your regular optician and have them fit your prescription in, either.
Remember, Ray-Ban Stories aren’t AR glasses: they’re just camera-equipped glasses with Bluetooth audio in the frames. True AR headsets have an even harder road. Here’s what to expect.
Qualcomm’s wave of phone-connected glasses is here
To see where AR glasses are going, don't look at individual manufacturers making AR glasses next year: instead, look at the initiative being led by the chipmaker powering most of them. Qualcomm has been driving a wave of AR glasses that work with phones, loading custom software that bridges both devices. Qualcomm calls its glasses-to-Android-phone software bridge Snapdragon Spaces, and it's officially launching next year. But there are already some AR glasses using Qualcomm's tech.
The Nreal Light, which had a limited US launch with Verizon, is a good example of what to expect. The thick glasses look almost normal until you get closer: the bottom parts of the lenses are clear, but the upper halves are filled with processing tech, cameras to track location and angled half-mirrors that seemingly project 3D images onto the real world. The Nreal Light's visual effects feel like a Microsoft HoloLens or Magic Leap headset, shrunken down: unlike VR, these AR glasses project glowing, ghostly holographic effects over everything.
Using the glasses means either having vision good enough to skip prescription lenses, wearing contacts, or finding a way to add prescription inserts. These inserts don't work at my level of nearsightedness, though, so I need to pop in disposable contacts.
Lenovo's glasses are similar to Nreal's, with the same half-lens, half-camera-array design. And similarly, I can't wear them over my own glasses. They work with Windows PCs as well as Android phones, which is where things get interesting. Could smart glasses be headphones for your eyes, casting extra virtual monitors from whatever device you plug them into? For Lenovo's glasses, yes, if you have the right computers or phones.
Motorola (also owned by Lenovo) is expected to announce its own pair of AR smart glasses, according to Qualcomm.
Will smart glasses be heads-up displays (again)?
Almost 10 years after Google Glass tried to sell us on the idea of a wearable heads-up display for notifications, many companies are still taking the same approach for smart glasses. Oppo, a Chinese consumer electronics company already known for phones, recently announced its own smart monocle. The Air Glass rests over one eye, showing pop-up notifications that often just replicate what you'd also get on a smartwatch. Sound like the nearly decade-old Google Glass? Pretty much.
The advantage of Oppo's smart monocle is that it's designed to rest over glasses, but those glasses are still part of the Air Glass package. You can't attach it to your own glasses.
Oppo won’t be the last company to try this approach. I expect the next wave of smart glasses to use this idea as a steppingstone to full AR, which still feels like it’s going to need a few more years to bake.
Snap’s AR glasses show where things are going, eventually: outdoors
I recently test-drove a pair of developer-only Spectacles made by Snap, running around my own backyard for half an hour on a sunny day in December. Snap's Spectacles are wireless and can project 3D effects layered onto real-world environments outdoors. Current AR glasses are generally meant to be used inside and have a hard time with bright sunlight and high-contrast conditions. Snap's glasses are brighter and made to be worn both indoors and out. Unlike Qualcomm's current AR glasses from partner companies, they don't need to be tethered with a cable, either.
But, of course, there are downsides. Snap's glasses have a really narrow field of view, meaning 3D effects only appear in a small box in front of my vision. Their battery life is incredibly short: just half an hour. They're meant for developers, allowing them to begin exploring more immersive AR ideas that go beyond what's being done now on phones and in AR-rich apps like Snapchat. I can see why they'd be useful, since other full AR headsets like Magic Leap and HoloLens 2 are a lot bigger.
Snap’s glasses work now, but with so many limitations that there’s no way they’d be able to succeed as a consumer product yet. And again: I needed to take off my own glasses.
Business headsets will continue to multiply
Hands-free headsets for offsite fieldwork and use in factories will continue to be one of the biggest targets for AR glasses for now. Magic Leap, which originally pitched its AR glasses as everyday devices for creators, has pivoted to enterprise. The company's second-generation AR glasses could end up being a far more practical product, along the lines of the HoloLens 2, and able to comfortably fit over everyday glasses. Companies like Vuzix, which has made smart glasses for years, are working on new models that could take on similar roles in business spaces.
In fact, that's still my favorite thing about the HoloLens 2, which I got to wear again earlier this year: it works with my glasses! It doesn't ask me to take them off.
Meta’s next glasses may evolve (and Amazon’s could, too)
Meta’s surprisingly bare-bones first-gen Ray-Ban Stories glasses didn’t seem technologically groundbreaking; instead, maybe, they were a foot in the door. But they did look shockingly like regular glasses, which is maybe Meta’s starting point. For companies like Amazon, which also makes its own pair of audio-only Echo Frames glasses, that may be the foothold to build in more features over time.
Would displays come next? If Meta's next glasses were to add them, it would mean using some sort of nontraditional glasses lens. Or, maybe, there could be an overlay. Smart glasses that embed waveguides (an etched way of reflecting images to the eye) and other display tech into the lenses already exist, but that means drifting even further from glasses you could easily get new prescription lenses for. Meta is aiming to get there with its everyday-looking glasses, but as I said earlier, maybe it should also start by making the glasses easy to service in a store.
Will someone figure out how we interact with these glasses?
There's no mouse, trackpad or keyboard for AR. There isn't even an agreed-upon controller. While VR headsets generally have dual controllers that look like a gamepad split apart, AR headsets tend to rely on either hand tracking or phone interfaces. Neither is ideal. Controls often feel imprecise, even at the best of times. Microsoft's HoloLens 2 uses finger pinches and air-taps, but that's not good enough for me as an everyday device. Some AR headsets, like Magic Leap's, have used their own little handheld controllers. I've seen some that use connected rings.
The point is, no one's agreed on a killer solution. Meta (Facebook) sees wrist-worn input devices as the answer, sensing finger movement and translating it into precise controls. But that tech isn't expected next year. Meta may make its own watch, which could be a stepping-stone as a wrist-worn controller. Apple's expected VR headset could lean on the Apple Watch, too (who knows). But in 2022, figuring out how we control smart glasses seems as big an unsolved problem as anything else.
What will the metaverse do about my eyes?
You can see now, I think, why most purveyors of aspiring metaverse worlds are starting with experiences where no headsets are needed. Everyday glasses just aren't here yet, and most AR glasses don't even work with my prescription vision. While I'm an extreme case as far as my myopia goes, what does it mean for accommodating people in general, and working with all faces and all eyes? The answer is: there's no answer yet. It's a frustrating work in progress.
VR, at least, is something I can put my glasses-wearing face into…sometimes. Except for my current glasses, which are so wide that I now need a second pair of narrower glasses to use with VR. It’s exhausting. And it’s hardly something I’ll be able to use everywhere. The metaverse won’t stand for such limiting types of tech. And while this hardware is still being baked throughout 2022, it means that more advanced VR headsets will probably be how the future of the metaverse, and even mixed reality, is defined for now.
And while these glasses are being developed: figure out how to accommodate my prescription, please.