At Google I/O 2025, Google revealed its strategy for Android XR, a major push into the extended reality (XR) sector. With partners ranging from high-end eyewear brands like Gentle Monster to major tech firms like Samsung, Google is laying the groundwork for a new generation of smart glasses powered by Android and Gemini AI. The objective? To bring XR into the mainstream by 2026.
Taking inspiration from Meta’s strategy, Google is partnering with well-known eyewear brands to craft chic smart glasses. Gentle Monster and Warby Parker are the initial collaborators chosen to create frames compatible with Android XR. These glasses will not only look good—they’ll also be practical, equipped with integrated cameras, microphones, speakers, and optional in-lens displays for augmented reality applications.
This dual strategy—providing both display-enabled XR glasses and lower-cost audio-first variants—positions Google to compete head-on with Meta’s Ray-Ban smart glasses. The aim is to appeal to both technology aficionados and style-conscious consumers, making smart glasses more accessible and desirable.
In a strategic move, Google has partnered with Samsung to jointly create a “software and reference hardware platform” for Android XR. This partnership seeks to establish a standardized framework that other OEMs can use to design their own smart glasses, promoting innovation and widespread adoption throughout the sector.
Samsung, which is already working on its own Project Moohan XR headset, will now broaden its scope to smart glasses utilizing Android XR. This collaboration is anticipated to produce a vibrant ecosystem of devices that can work together, similar to what Android accomplished for smartphones.
To support its hardware ambitions, Google is expanding its developer support. The Android XR Developer Preview 2 introduces a set of new tools aimed at making XR app development more intuitive and powerful.
While these tools are primarily optimized for XR headsets, Google says smart glasses-specific SDKs will arrive by the end of 2025, letting developers tailor experiences for both immersive headsets and lightweight wearables.
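For developers curious what that tooling looks like today, here is a minimal sketch of a spatialized Compose app. It assumes the Jetpack Compose for XR artifacts (androidx.xr.compose) shipped with the Developer Preview; the package names and composable signatures shown are preview-era assumptions and may change before the stable SDK lands.

```kotlin
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.material3.Text
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel

// Minimal Android XR activity (preview-era API sketch): Subspace opens a 3D
// volume on a spatial device, and SpatialPanel hosts ordinary 2D Compose
// content inside it, so existing phone/tablet UI can be reused largely as-is.
class HelloXrActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            Subspace {
                SpatialPanel {
                    Text("Hello, Android XR")
                }
            }
        }
    }
}
```

The appeal of this model is that most of the app remains plain Compose code; only the outer Subspace and SpatialPanel scaffolding is XR-specific.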
Central to Google’s XR vision is Gemini, its next-generation AI assistant. Gemini will handle voice interactions, contextual understanding, and real-time assistance on Android XR glasses. During the I/O keynote, Google demonstrated several such scenarios.
These interactions are crafted to be hands-free and fluid, with Gemini “seeing and hearing what you do” through the glasses’ sensors. The AI will retain context and preferences, enabling more tailored and proactive support.
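Google has not published an on-glasses Gemini API, but purely as a hypothetical illustration, the sketch below shows one way an assistant could retain recent sensor-derived observations and user preferences when answering a spoken query. Every name here (GlassesObservation, AssistantContext, buildPrompt) is invented for this example and does not correspond to any real SDK.

```kotlin
// Hypothetical sketch of context retention for a glasses assistant.
data class GlassesObservation(val timestampMs: Long, val description: String)

class AssistantContext(private val maxItems: Int = 20) {
    private val recent = ArrayDeque<GlassesObservation>()
    val preferences = mutableMapOf<String, String>() // e.g. "units" -> "metric"

    // Keep a rolling window of what the glasses have recently seen or heard.
    fun remember(observation: GlassesObservation) {
        recent.addLast(observation)
        if (recent.size > maxItems) recent.removeFirst()
    }

    // Bundle preferences and recent observations into a prompt prefix that a
    // multimodal model could condition on before answering the spoken query.
    fun buildPrompt(userQuery: String): String = buildString {
        appendLine("User preferences: $preferences")
        recent.forEach { appendLine("Seen/heard: ${it.description}") }
        appendLine("Query: $userQuery")
    }
}

fun main() {
    val ctx = AssistantContext()
    ctx.preferences["units"] = "metric"
    ctx.remember(GlassesObservation(System.currentTimeMillis(), "Looking at a train departure board"))
    println(ctx.buildPrompt("When does the next train leave?"))
}
```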
Google’s concept videos at I/O 2025 provided insights into the future of Android XR. Users could engage with floating UI components, receive contextual alerts, and navigate their surroundings without needing to reach for a phone. The interface is designed to occupy only a portion of the user’s field of vision, ensuring it is beneficial without being disruptive.
Nevertheless, realizing this vision in a lightweight, consumer-friendly format is a real technical hurdle. Google must balance battery life, processing power, and display clarity, all within the constraints of a frame that still looks like ordinary, stylish glasses.
A few companies are already getting involved. XREAL, for example, displayed its Project Aura AR glasses at I/O, developed on Google’s initial Android XR platform. As more OEMs adopt the reference platform co-created by Google and Samsung, the XR ecosystem is expected to expand swiftly.
Google also confirmed that the Android XR Play Store