the multimodal AI features of Project Astra to Google Search on mobile devices. In other words, Search Live lets the AI see what you see in the real world in addition to responding to your voice commands, and it answers questions about what's in view almost instantly. However, most of the new features Google showcased at the event were made available to Labs users rather than the general public. Four months later, Google has officially launched Search Live in the U.S.
Android and iPhone users can now try the new Search Live feature without enrolling in Labs experiments. A new button in the Google app, as well as in Google Lens, lets them hold interactive voice conversations in Google's AI Mode while the AI has access to the real-time video feed from the camera.
Google announced the new Search Live feature in a blog post on Wednesday. Search Live works in the Google and Google Lens apps, though only English is supported at launch.
Users need to open the Google app and tap the new Live icon that appears beneath the search bar (see image above). Doing so shares the camera feed with the AI, letting it see what they see in real time. They can then ask questions about their surroundings, as shown in the examples Google shared on YouTube.
A similar flow is available if users are already pointing the smartphone camera at an object in Google Lens.