The latest Pixel 10 smartphones, the Pixel Watch 4 variants, and the budget-friendly Pixel Buds 2a earbuds were unveiled earlier this week, but Gemini AI was also a major highlight of the Made by Google event. Artificial intelligence sits at the core of all these devices, and Google revealed several intriguing AI features. Magic Cue, Camera Coach, and conversational edits in Google Photos stand out, but Gemini Live also received several enhancements, including the ability to see what you see and engage with the physical environment through the Pixel phone’s display.
The new Gemini Live functionality will not be surprising to any Pixel user who viewed Google’s Project Astra demonstration at I/O 2025 a few months back. At that event, Google presented ideas for Gemini Live, showcasing the AI assistant’s ability to recognize objects in the physical world and respond to inquiries in real time while marking those items on the display.
Perhaps you are trying to fix something by following an online guide and are unsure which tool to grab from a case or countertop. By sharing your screen with Gemini Live and asking the AI to identify the tool referenced in the guide, you could find your answer. The AI would analyze the camera feed, recognize the item, and then highlight the correct tool by outlining it with a rectangle on screen.
Visual assistance in Gemini Live will arrive on Pixel 10 smartphones first, debuting in time for the phones’ release on August 28. It will roll out to other Android devices within the same week, and iPhone users who rely on Gemini Live will get the visual guidance after that.
To enable Gemini Live to offer this kind of assistance, you start a conversation and share your camera or screen with the AI.