Google gave a live demonstration of its Android XR-powered smart glasses during the TED2025 conference. Shahram Izadi, the company's Android XR lead, showed off the glasses' features, including visual memory, translation, and contextual object recognition, with the help of colleague Nishtha Bhatia.
During the demonstration, Izadi interacted with the Android XR glasses in various ways. Showcasing their integration with Google's Gemini AI assistant, the glasses followed voice commands and processed what they saw in real time. For example, they could locate missing objects, generate poems, and recall things that had only been glimpsed briefly.

When addressed in a different language, the Gemini AI assistant on the Android XR glasses provided quick translations without any manual settings change. The glasses were also shown giving visual explanations of diagrams and recognising objects in context.
The Google Android XR glasses, described as “lightweight and discreet”, feature a miniature camera, microphones, speakers, and a high-resolution colour display embedded in the lens. They can be paired with a smartphone to access a wide range of apps. Overall, it looks like Project Astra, first demoed last year, is finally taking shape in the form of a Google Glass revival.


Samsung is also reportedly planning to launch its own smart glasses, codenamed Haean, later this year, and these will also run Google's Android XR operating system. While their specifications have not been revealed, they may similarly feature an integrated camera, Qualcomm's Snapdragon XR2 Plus Gen 2 chip, and a lightweight frame.
(Source: TED / YouTube via TechSpot)
Manisha Dharmendra contributed to this article.