At its Search On event, Google explained how machine learning is helping it create search experiences that mirror the way people make sense of the world. By interpreting information in its many forms, from words to images to the real world, Google can help users gather and explore information in new ways. The company is also making visual search more natural and intuitive for users.
Helping you search outside the box
Lens lets you search with your camera or an image. This year Google introduced Multisearch, which makes visual search more natural: you can capture an image or screenshot and add words, much like pointing at something and asking a question about it. Multisearch is now available worldwide in English and will soon expand to more than 70 languages.
At Google I/O, the company previewed “Multisearch near me,” which lets users snap a picture of a meal or object and find it nearby. This new approach to search will help you identify local businesses, whether you want to support your neighborhood shop or simply need something today. “Multisearch near me” will launch in the U.S. this fall.
With multisearch, you can take a pic *and* ask a question to get the look you want or fix something. 🤯 We're bringing this new way to search to 70+ languages. And soon, you'll be able to add "near me" to your image to find what you're looking for nearby. #SearchOn pic.twitter.com/RHxRQm42EU
— Google (@Google) September 28, 2022
- Visual understanding breaks down language barriers. With Lens, Google translation now goes beyond text: Google translates visual text into more than 100 languages, over 1 billion times per month.
- Advances in machine learning let Google blend translated text into complex images so that it looks natural. Thanks to refined models, this can be done in as little as 100 milliseconds.
- Like Magic Eraser on Pixel, this uses generative adversarial networks (GAN models). The upgraded experience launches later this year.
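To give a rough feel for how image erasing works, here is a minimal, hypothetical sketch of classical diffusion-based inpainting in NumPy. It is not Google's GAN approach (the function and test image are invented for illustration); it simply fills erased pixels by repeatedly averaging their neighbours so the gap blends into the surrounding background.

```python
import numpy as np

def inpaint_diffusion(img, mask, iters=200):
    """Fill masked pixels by repeatedly averaging their four neighbours.

    img  : 2-D float array (grayscale image)
    mask : boolean array, True where pixels should be erased and filled
    """
    out = img.copy()
    out[mask] = out[~mask].mean()          # rough initial guess for erased pixels
    for _ in range(iters):
        # average of the four neighbours (np.roll wraps at the borders,
        # which is fine for this small demo)
        avg = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 4.0
        out[mask] = avg[mask]              # only masked pixels are updated
    return out

# A flat background with a bright "object" in the middle to erase.
img = np.full((9, 9), 10.0)
img[3:6, 3:6] = 200.0
mask = np.zeros_like(img, dtype=bool)
mask[3:6, 3:6] = True

restored = inpaint_diffusion(img, mask)
```

Against a uniform background the erased patch converges to the surrounding value, which is the same intuition (fill the hole from its context) that GAN models apply to far more complex scenes.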
What you love about Translating with Lens is now even better. 💡
With major advancements in AI, translated text appears seamlessly integrated, as if it was part of the original picture. Turning text… into context! #SearchOn pic.twitter.com/N8YySv87z1
— Google (@Google) September 28, 2022
- New Lens translation update: with this update, users can point their camera at a poster in another language and see the translated text overlaid on the image. The update will roll out later this year.
- Search shortcuts: with the Google app for iOS, Google is putting some of its most useful capabilities right at your fingertips. Starting today, you'll find shortcuts under the search bar to shop what's in your screenshots, translate text with your camera, and hum to search.
New ways to explore information
Google is aiming to let you ask questions with fewer words, or even none at all, and still understand what you mean or surface useful results.
- You can examine information structured in a way that makes sense to you, such as by digging deeper on a topic as it unfolds or discovering fresh perspectives.
- Getting to results quickly is key, and Google will soon roll out a faster search option. According to the company, Search can surface relevant content before you even finish typing your question.
New search experiences: Google’s new search experiences let you explore topics more naturally. As you type, Search will suggest keywords or topics to help you formulate your question. If you want to visit Mexico, for example, Google will help you narrow your query to more relevant results, such as “best cities in Mexico for families.”
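The query-narrowing behaviour described above can be sketched as a simple prefix match over a made-up list of popular searches. The `POPULAR_QUERIES` list and `suggest` helper below are hypothetical, a minimal stand-in for Google's far more sophisticated suggestion models.

```python
# Hypothetical corpus of popular searches (invented for illustration).
POPULAR_QUERIES = [
    "best cities in mexico",
    "best cities in mexico for families",
    "best cities in mexico for expats",
    "best beaches in mexico",
]

def suggest(prefix, corpus=POPULAR_QUERIES, limit=3):
    """Return up to `limit` refinements that extend the typed prefix."""
    p = prefix.lower().strip()
    return [q for q in corpus if q.startswith(p) and q != p][:limit]

suggestions = suggest("best cities in mexico")
```

Typing "best cities in mexico" would surface the two more specific "for families" and "for expats" refinements, mirroring how Search narrows a broad query toward a more relevant one.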
Visual stories and short videos
- Google makes it easy to investigate a topic by highlighting relevant and helpful material, including open Web resources.
- For cities, you may find visual stories and short videos from people who have visited, tips on how to explore the city, things to do, how to get there, and other travel-related information.
Zoom in and out: drawing on its analysis of how people search, Google will soon present you with topics that help you go deeper or take a new direction. You can add or remove topics to zoom in and out.
- This can help you discover things you hadn’t considered; Oaxaca’s beaches, for example, are among Mexico’s best-kept secrets. Google is redesigning how it displays results to better reflect how people search.
- You’ll see relevant content from various sources, whether text, images, or video. As you scroll, you’ll notice related search topics you may not have considered, such as visiting Oaxaca’s historic sites or listening to live music there.
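A minimal sketch of the add/remove-topic idea, assuming results are tagged with topic sets (the `RESULTS` data and `zoom` helper are invented for illustration, not Google's implementation): zooming in adds a required topic, zooming out removes one.

```python
# Hypothetical search results, each tagged with a set of topics.
RESULTS = [
    {"title": "Oaxaca's hidden beaches",   "topics": {"mexico", "beaches", "oaxaca"}},
    {"title": "Historic sites in Oaxaca",  "topics": {"mexico", "history", "oaxaca"}},
    {"title": "Live music in Mexico City", "topics": {"mexico", "music"}},
]

def zoom(results, include):
    """Keep only results tagged with every topic the user has zoomed into."""
    return [r["title"] for r in results if include <= r["topics"]]

broad  = zoom(RESULTS, {"mexico"})            # zoomed out: all Mexico results
narrow = zoom(RESULTS, {"mexico", "oaxaca"})  # zoomed in: Oaxaca only
```

Adding "oaxaca" narrows the list to Oaxaca-specific results; removing it widens the view again, which is the essence of the zoom interaction.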
The new features announced at Google’s Search On 2022, along with the new ways to explore information, will be available in the coming months.
Speaking about the announcements, Cathy Edwards, VP/GM of Search, said:
We’re advancing visual search to be far more natural than ever before, and we’re helping people navigate information more intuitively. We hope you’re excited to search outside the box, and we look forward to continuing to build the future of search together.