
Android Show XR Edition: Google announces Project Aura, AI glasses and more

Google's Android Show XR Edition unveiled key updates across Project Aura, new AI glasses and Galaxy XR. Here's a list of everything that was announced at the event

Google's 'The Android Show: XR Edition'

Aashish Kumar Shrivastava New Delhi


Google hosted “The Android Show: XR Edition” on December 8, where it previewed XREAL's Project Aura wired XR (extended-reality) glasses, showcased the AI glasses it is building in partnership with Warby Parker and Gentle Monster, and highlighted the updates coming to the Samsung Galaxy XR headset.
 
At the event, the US-based technology giant also announced Developer Preview 3 of the Android XR SDK, giving developers new tools and APIs to design augmented, context-aware experiences.

Google Android Show XR Edition: Highlights

XREAL Project Aura

Google previewed Project Aura, a pair of Android XR-based glasses developed by XREAL. According to the company, the glasses feature a 70-degree field of view and optical see-through technology that overlays digital content onto the user's real-world surroundings. Google said the system can display multiple floating windows, enabling work or entertainment use without blocking the wearer's view.
 
 
The company added that the device is also intended for practical tasks, such as following a recipe video while cooking or viewing step-by-step instructions during appliance repairs. The wired XR glasses are powered by a tethered puck that houses the main compute and battery; the puck also doubles as a trackpad. Google said it plans to share more information about Project Aura's launch next year.

AI glasses built in collaboration with Warby Parker and Gentle Monster

Google showcased two types of AI glasses at the Android Show XR Edition, outlining its plans for hands-free, on-the-go assistance. According to the company, the first model is a screen-free pair equipped with speakers, microphones and cameras, enabling users to interact with Gemini, take photos and receive real-time voice assistance, an approach broadly similar to Meta's Ray-Ban smart glasses. Google said this pair of AI glasses will launch in 2026.
 
The second pair of AI glasses that Google highlighted at the event features an in-lens display that can privately show contextual information, including navigation cues and live translation captions. Google also demonstrated a prototype of this display-equipped model, showing how users could summon Gemini for quick help. The demo included Gemini 2.5 Flash Image, also referred to as Nano Banana, which can edit photos on the spot, including adding objects or people that were not present in the original image.
 
The display-equipped glasses also demonstrated memory capabilities. When a Google executive walked past a table laden with snacks, the glasses registered the items automatically. A few minutes later, when the executive asked whether any high-protein item had been on the table they walked past, the glasses were able to recall the specific item by name.

New features for Galaxy XR headset

Google has begun the rollout of a set of new features for the Galaxy XR headset. As per the company, the updates are intended to improve how users work and communicate in Android XR environments.
 
The update introduces PC Connect, a feature that links a Windows PC to the headset so users can pull in their desktop or individual app windows and place them alongside native Android XR apps. The company added that this setup is intended to give users more flexibility for work or gaming than a standard laptop display. PC Connect has been launched in beta.
 
Google also highlighted an auto-spatialisation feature for the Galaxy XR, which converts regular 2D content into 3D in real time. The company noted that system-level auto-spatialisation will arrive next year. A travel mode has also been added, designed to keep the headset's view stable during movement, such as on flights, enabling use as a personal cinema or portable workspace. For video calls, Google said users can now create a Likeness, a digital avatar that mirrors facial expressions and hand gestures in real time; this feature is rolling out in a beta testing phase.

Android XR SDK’s Developer Preview 3

Google announced that it is opening the next phase of development for Android XR, noting that hardware and platforms are “just the canvas” for future spatial experiences. As part of this push, the company has released Developer Preview 3 of the Android XR SDK.
 
According to Google, the latest preview enables developers to begin building apps for AI glasses, providing new tools and APIs to support augmented, context-aware experiences similar to those being developed by partners such as Uber and GetYourGuide. The update also introduces additional capabilities aimed at helping developers create more complex and immersive applications for XR headsets and wired XR glasses.


First Published: Dec 09 2025 | 12:16 PM IST
