
I saw the future of Android XR smart glasses, and Google left me stunned at the progress

Google Android XR Smart Glasses Developer Kit
Kerry Wan/ZDNET

ZDNET’s key takeaways

  • Google has announced three advances in its Android XR platform.
  • They’re headlined by display AI glasses that will be available to developers first.
  • Galaxy XR and Project Aura also get updates that improve their immersive experiences.

Last week, within the confines of Google’s Hudson River office, I put on a pair of Android XR glasses and began to converse with Gemini as I walked around the room. These weren’t the Warby Parker or Gentle Monster models that had been teased at Google I/O in May, but rather a developer kit that will soon be in the hands (and on the faces) of Android developers worldwide.

The demos, ranging from visual assistance to gyroscopic navigation, progressed swiftly and, to my surprise, with impressive reasoning behind them. At one point, I asked Gemini to give me a fruit salad recipe using the pasta on the shelf, only for it to recommend a more traditional tomato sauce dish instead. That's a testament to both Gemini's smarts and the glasses' multimodal hardware.

Also: I invested in Samsung’s $1,800 XR headset to replace my dual monitors – and it’s paying off big time

By the time my briefing was over, I had switched from the Android XR glasses to Samsung's Galaxy XR headset and an upcoming pair by Xreal, Project Aura. This kind of seamless transition between wearables, most of which will also leverage your Android phone and smartwatch for added functionality, is one of Google's moonshots for 2026.

From what I’ve seen, that future can’t come soon enough.

Google’s vision for AI glasses is two-fold

Google Android XR Smart Glasses Developer Kit
Kerry Wan/ZDNET

Google’s plan for AI glasses comes in two forms: one that’s audio and camera only, similar to Meta’s Ray-Bans, and another that integrates a display for visual cues and floating interfaces, like Meta’s Ray-Ban Display. Clearly, there’s some competition in the space. However, Google has one key advantage before it even launches: a well-established software ecosystem, with Developer Preview 3 of the Android XR SDK (including APIs) set to release this week.

No, we're not just talking about first-party apps like Gmail, Meet, and YouTube, the way Messenger, Instagram, and WhatsApp are for Meta. Instead, the abundance of existing third-party Android apps, home screen and notification panel widgets, and hardware products will, in theory, transition fluidly into the Android XR operating system.
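To make that compatibility story concrete, here's a minimal sketch of how an existing Compose screen could be lifted into a floating spatial panel using the Jetpack Compose for XR libraries that accompany the SDK preview. The package, composable, and modifier names below reflect Google's public developer preview and may change before a stable release, and ExistingAppScreen is a hypothetical stand-in for an app's current 2D UI.

// Minimal sketch: wrapping an existing Compose screen in an Android XR spatial panel.
// API names come from the Jetpack Compose for XR developer preview and may shift.
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            // Subspace opts the UI into 3D space when running on an Android XR device.
            Subspace {
                SpatialPanel(
                    modifier = SubspaceModifier
                        .width(1280.dp)   // panel size in density-independent pixels
                        .height(800.dp)
                        .movable()        // let the user reposition the floating window
                        .resizable()      // let the user scale it
                ) {
                    // The app's existing 2D Compose UI renders unchanged inside the panel.
                    ExistingAppScreen()
                }
            }
        }
    }
}

// Hypothetical placeholder for the screen a phone app already ships.
@Composable
fun ExistingAppScreen() {
    Text("Same Compose UI, now floating in Android XR")
}

The point of the sketch is that the spatial panel wraps the UI an app already ships rather than demanding a rewrite, which is the compatibility advantage described above.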

Also: Watch out, Meta: Samsung just confirmed its smart glasses plans (with some spicy hints)

I got an early taste of it when I requested an Uber ride from the Google office to the third-best-rated pizzeria in Staten Island (a test request I had thrown at Gemini earlier). Besides populating a navigation path to my Uber pickup spot, the glasses' display projected the driver's information when I was nearby. This functionality is pulled directly from the native Uber app for Android, Google tells me, and it's a good indication of how seamless developing for the wearable platform will be.

Another interesting aspect of my demo was how Gemini provided environmental context the moment I put on the glasses. Instead of my asking the assistant about my location, the weather, or the random objects strategically placed around me for demo purposes, the Android XR experience began with a summary of contextual information and a prompt for follow-up questions. It's a thoughtful touch that makes conversing with the assistant more natural.

Galaxy XR gets better, but I’m more drawn to this headset

Samsung Galaxy XR headset
Sabrina Ortiz/ZDNET

As I mentioned earlier, I also tried the Samsung Galaxy XR headset (again), only this time with some new features: PC Connect, which syncs with a Windows PC or laptop for an extended, more immersive viewing experience; travel mode, which improves anchoring while you're in motion; and Likeness, a digital avatar generator similar to Apple's Spatial Personas.

As a Windows user, I was mostly invested in the PC Connect feature, which let me project a much larger (albeit virtual) screen of the game Stray. With a wireless controller in hand, the inputs were surprisingly responsive, and the refresh rate held stable throughout.

Also: The Samsung Galaxy XR headset comes with $1,000 worth of freebies – here’s what’s included

However, what stole the Galaxy XR headset's thunder was a more portable and comfortable pair of Xreal glasses, dubbed Project Aura. The glasses were first announced at Google I/O months ago, and using the tethered wearable for the first time made me realize that the future of comfortable face computers is not that far off.

Project Aura features a decently large 70-degree field of view, complemented by Xreal’s standard tinting feature, which enhances the screen’s brightness. It runs on the same Android XR platform as the Galaxy XR headset, meaning you can raise your hand for pinch and swipe gestures, view multiple floating windows simultaneously (including via PC Connect), and access various Android apps and services already available on your phone.

The big question with Project Aura is undoubtedly its price. Xreal’s existing lineup of extended reality glasses ranges from $300 to $650. With the enhanced computing (and innovation) of Project Aura, I’d expect it to be closer to the $1,000 mark at launch. Google and Xreal haven’t shared an official release date for the glasses yet, but have suggested to me that they’ll come late next year.

Bottom line (for now)

My journey through Google’s Android XR demos confirms that the competition in the wearable computing space is heating up, driven less by speculative concepts and more by tangible, functional hardware and software integration. The core strength of Google’s strategy lies not just in the smarts of Gemini but in leveraging the established Android ecosystem. That should be music to developers’ ears.

The ability to fluidly transition between diverse devices, from bulky developer kits to Xreal's Project Aura, underscores the company's commitment to flexibility. Ultimately, what I experienced suggests that Google's 2026 vision for seamless, multifunctional smart glasses is not merely marketing hype, but a technically sound and rapidly converging reality that could redefine how we interact with information and the digital world.
