The latest generation of Android smartphones is gaining a new Google Lens-style feature that puts Search front and center in every Android app. Called Circle to Search, it lets users highlight parts of images or text to trigger Search results. The just-announced Samsung Galaxy S24 and last year's Pixel 8 models are among the devices getting the feature. Notably, it is the first time a phone ships with AI Search capabilities built in, with no need to enroll in a beta program.
Circle to Search works much like Google's multisearch, letting users launch a search from any app, whether they are viewing an image, text, or video. Tapping, swiping, or drawing a circle with a finger to highlight an object brings up a search bar from the bottom of the screen with details about the image, such as product pricing, or with search results based on the highlighted text. The feature is activated with a long press on the navigation bar, or on the home button for users who prefer to keep the classic back button.
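Google has not published how Circle to Search is implemented, but for readers curious what classifying the trigger gestures described above might look like in code, here is a minimal, purely illustrative Kotlin sketch of an overlay sorting taps from traced paths. Every name in it, including the listener class and the onRegionSelected callback, is an assumption made for illustration, not Google's actual code.

```kotlin
import android.view.GestureDetector
import android.view.MotionEvent

// Hypothetical sketch: a system overlay classifying Circle to Search gestures.
// All names here are illustrative assumptions, not Google's implementation.
class SelectionGestureListener(
    // Receives the selected region as a list of (x, y) screen coordinates.
    private val onRegionSelected: (List<Pair<Float, Float>>) -> Unit
) : GestureDetector.SimpleOnGestureListener() {

    private val tracedPath = mutableListOf<Pair<Float, Float>>()

    // A single tap selects whatever sits directly under the finger.
    override fun onSingleTapUp(e: MotionEvent): Boolean {
        onRegionSelected(listOf(e.x to e.y))
        return true
    }

    // Swipes, scribbles, and circles all arrive as scroll events; the traced
    // path is accumulated and its enclosed region becomes the search crop.
    override fun onScroll(
        e1: MotionEvent?,
        e2: MotionEvent,
        distanceX: Float,
        distanceY: Float
    ): Boolean {
        tracedPath += e2.x to e2.y
        return true
    }

    // Called by the overlay on ACTION_UP: the finger has lifted, so hand the
    // completed path off for cropping and the slide-up search bar.
    fun onGestureEnd() {
        if (tracedPath.isNotEmpty()) {
            onRegionSelected(tracedPath.toList())
            tracedPath.clear()
        }
    }
}
```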
With Circle to Search, a user who comes across an image on a platform like Reddit can pull up more information without resorting to a reverse image search. For text, simply highlighting a passage produces the same results as typing the query directly into Google. When finished, a downward swipe dismisses the interface and returns the user to the previous app.
Launching on January 31, Circle to Search will initially be exclusive to the Pixel 8, Pixel 8 Pro, and the new Samsung Galaxy S24 series. The feature is expected to roll out gradually across the Android ecosystem, but for now only that handful of devices will have it. Because it is built into the Google app, the app must be installed and up to date on the device for the feature to work.
Google is also expanding search on both Android and iOS with new options in Google Lens's multisearch. In addition to getting results on photos through a combination of images and text, users can now receive answers from Google's Search AI, which is still in beta. For instance, snapping a picture of a wilting azalea and asking Lens how to care for it can prompt Google to generate text-based guidance on nursing the plant back to health.
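Lens itself has no public developer API, so the following is only a rough analogue of that image-plus-question flow, sketched with Google's publicly documented Gemini Android SDK rather than Lens internals. It is a sketch under assumptions: the model name reflects what Google offered around the time of writing, and API key handling is left to the caller.

```kotlin
import android.graphics.Bitmap
import com.google.ai.client.generativeai.GenerativeModel
import com.google.ai.client.generativeai.type.content

// Rough analogue of a multisearch query: pair a photo with a natural-language
// question and get a generated answer back. Not how Lens works internally.
suspend fun askAboutPhoto(photo: Bitmap, question: String, apiKey: String): String? {
    val model = GenerativeModel(
        modelName = "gemini-pro-vision", // multimodal model available at the time of writing
        apiKey = apiKey
    )
    val response = model.generateContent(
        content {
            image(photo)   // e.g., the photo of the wilting azalea
            text(question) // e.g., "How do I nurse this plant back to health?"
        }
    )
    return response.text
}
```

Calling askAboutPhoto with a photo and a question like "How do I take care of this wilting azalea?" would return generated care instructions, roughly the experience described above, minus Lens's on-screen integration.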
This functionality is part of Google's ongoing Search Generative Experience (SGE) beta program. The in-Search answers appear in collapsible panels with links to the videos or articles the AI drew its information from. The new AI-powered multisearch effectively gives users an in-Search chatbot to go along with Circle to Search.
Google is now putting SGE in front of a broader audience, namely existing Pixel 8 owners and prospective Galaxy S24 buyers, without any sign-up process. More Search AI features may soon reach Android users directly rather than through the company's AI beta programs, even though Google still labels these features experimental. As AI text generation lands on more phones, the line between everyday users and participants in Google's experiments is getting harder to draw.