Thursday, January 1

Showtime for Android XR glasses


During my hands-on with Android XR glasses last month, I kept returning to the digital clock that was displayed over the real world. Professionally, I have been waiting over a decade for Google to re-enter the smart glasses space, and everything is lining up for this to happen in 2026.

It will start with screen-less glasses that just have microphones, speakers, and cameras, but the version with a single monocular display is coming soon after.

In the years since Google left the space, Meta has been the dominant player in the market, with a display-equipped version arriving last year. However, I think Meta pigeonholed itself with the Ray-Ban style by trying to find a market with sunglasses.

Google’s offerings are not meant to be an accessory you just wear outside when it’s sunny or during exercise. Android XR glasses aim to replace your eyeglasses, while appealing to those who don’t have prescription lenses. They are meant to be worn all the time and become another computing device alongside your phone and even watch.


This is why I think what Google releases this year will be the first real, wide-scale debut of smart glasses as the next consumer electronics form factor.

In terms of widespread appeal, there’s Samsung’s wide hardware reach and relationship with customers. Meanwhile, the Android-powered nature plays a big role in providing a seamless extension of your phone. The underrated flip side of that is how glasses could be a selling point for Android phones if support initially remains exclusive to the platform. (I would assume these Google/Samsung glasses will require Android phones for the foreseeable future since they will be handling so much processing.) 

I have so many questions about how Android XR glasses will be marketed and the initial feature set. 

Managing notifications will undoubtedly be a part of it, and I think the fact that you'll effectively always be wearing (open-ear) headphones could have an interesting impact on earbuds in the long term.

Then there’s having a world-facing camera at all times. The privacy and cultural implications cannot be ignored, but I think the value in having an always-ready point-of-view (POV) camera to capture pictures and videos you’d otherwise miss will be immense. It should be less distracting than holding up a phone and possibly being taken out of the moment. 

Finally, we come to the AI aspect. Google is referring to this form factor as “AI glasses.” On the positive front, this is the ideal form factor for real-time assistants like Gemini Live that can tap into the camera for context about the world. The fact that you don’t have to physically hold up a phone should increase the likelihood that you’ll invoke it. However, I am worried that leading with “AI” might constrain the appeal compared to something more general and familiar, like the “smart” of your phone or watch.

Besides functionality, Google needs to have a broader narrative about why all the things that can be done on smart glasses should be literally done in front of your face. One thing Google discussed during the demos was the idea that briefly using glasses instead of staring at your phone will help you be more present in everyday life. That’s something that everyone needs to test in their lives, but that’s the kind of pitch that Google needs to have on day one.

Of all the demos last month that emphasized the surprising size and fidelity of the screen, it was the humble time overlay in standby mode that kept me infinitely amused. It reminded me that augmented reality is finally happening, and that the quality of this iteration, if not yet its utility (adding the date and temperature would help), will be pretty high out of the gate.




