Google’s “Search Live” feature is rolling out to select iOS and Android users, letting them hold a back-and-forth conversation with the AI. The tool was introduced at the recent Google I/O event, where a demo showed how dynamic the interaction can be: a user asked Search to pull up the user’s manual for a Huffy mountain bike and then asked about specific sections, such as the brakes. The AI also surfaced relevant YouTube videos and emails for additional context, a practical application that goes beyond simple one-shot queries.
Users will know they have access to Search Live if a waveform icon with a sparkle appears beneath the Google Search bar in the app. Tapping that icon, or a second button next to the search field, starts a conversation. The interface is flexible, offering both dark and light themes, and includes buttons for voice interaction and transcription. The goal is a real-time dialogue with the AI that helps users find exactly what they need more efficiently.
However, Google cautions that the AI can make mistakes, acknowledging that Search Live is still experimental. It also warns of the possibility of “hallucinations,” inaccuracies in the information provided, reminding users to exercise caution when relying on the AI’s answers. Notably, a conversation can continue in the background even after the user exits the Google app. For now, some devices, including the Pixel 6 Pro and iPhone 15 Pro Max, are still awaiting the feature.
Many users, myself included, are eagerly anticipating its arrival, since it promises to make Google Search more useful through interactive, personalized assistance.