Hey there! I’ve got some exciting news for all you tech enthusiasts out there. Google has officially started rolling out a new feature called the Search Screen, powered by Google Lens and integrated directly into Google Assistant. This is a game-changer: Google Lens lets you search using images, translate text, find similar products, and more, and now you can use it directly from Google Assistant, making it super handy.
The new Search Screen replaces the old “What is on my screen?” feature.
While Google Lens functionality was available through shortcuts before, it wasn’t a convenient option for many users. Direct integration into Google Assistant makes it much easier and more streamlined. Previously, you had to use the “What is on my screen?” command; now, with direct integration, it’s a whole lot simpler.
Because the feature is built directly into Google Assistant, you don’t even need a voice command. Just press the power button to launch Google Assistant, and you’ll find the search option in the dialog bar. No more capturing a screenshot and manually sharing it to Google Lens. The Search Screen may take a moment to analyze your screen, but if it finds an article, you’ll also get a Read option, which is super helpful. It likely uses the recently launched Google Reading Mode to read articles aloud.
Google announced this feature in February, and it’s now officially rolling out. Pixel devices are the first to get the Lens shortcut, reducing reliance on “What is on my screen?”. That older command hasn’t been removed yet, but it doesn’t trigger everywhere, which made it unreliable; the new feature, by contrast, should appear everywhere. Google also recently made minor visual adjustments to Google Assistant, which will be refined further with this update.
With this new feature, you can use Google Lens in various ways: analyzing what’s on your current screen, translating, copying text, shopping, searching Google, and finding places. For now, it’s exclusively available on Google Pixel phones running Google App v14.3.1 Beta. Expect it to roll out to other devices globally in the coming weeks. What do you think of this new feature? Are you excited to try it out? Let me know in the comments below!