
MUM’s the word: Google Search to get 1,000 times more powerful


In the coming months, we’ll introduce a new way to search visually, with the ability to ask questions about what you see. 

Prabhakar Raghavan, Senior Vice President at Google

This week at Search On 2021, Google showed how Search will soon be able to not only better understand aspects of a topic the user is searching for, but also surface more insights and inspiration. 

The company showcased how it is using AI and new technologies to make information more helpful than ever before, while giving users new, more natural and intuitive ways to search and explore.

“Fewer searches to get things done”

At Google I/O earlier this year, the search giant announced that they had reached “a critical milestone for understanding information” with a new technology called Multitask Unified Model, or MUM for short.

“Today’s search engines aren’t quite sophisticated enough to answer the way an expert would. But with MUM, we’re getting closer to helping you with these types of complex needs,” said Pandu Nayak, Google Fellow and Vice President of Search. “So in the future, you’ll need fewer searches to get things done.”

MUM has the potential to transform how Google helps you with complex tasks. Built on the T5 text-to-text framework, MUM is 1,000 times more powerful than BERT.

MUM not only understands language, but also generates it. It’s trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models.
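The text-to-text framing mentioned above is what lets a single model train on many tasks at once: every task, whether translation, summarization, or question answering, is cast as plain text in and plain text out, distinguished only by a task prefix. The sketch below illustrates that idea using the published T5 prefix convention; MUM's actual training setup is not public, so this is a conceptual illustration only.

```python
# Illustrative sketch of T5-style text-to-text task framing.
# The task prefixes follow the published T5 convention; everything else
# (function name, task keys) is invented for this example.

def to_text_to_text(task: str, payload: str) -> str:
    """Prefix the input with a task description, T5-style."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
        "question": "question: ",
    }
    return prefixes[task] + payload

# Very different tasks all become the same kind of training example:
examples = [
    to_text_to_text("translate_en_de", "The house is wonderful."),
    to_text_to_text("summarize", "Google announced MUM at I/O earlier this year."),
    to_text_to_text("question", "What gear do I need to hike Mt. Fuji?"),
]
for example in examples:
    print(example)
```

Because every task shares one input/output format, one set of weights can be trained across all of them simultaneously, which is what allows knowledge learned on one task (or language) to transfer to another.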

And MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio.

Pandu Nayak, Google Fellow and VP, Search

Google has been experimenting with using MUM’s capabilities to make its products more helpful and enable entirely new ways to search. Now, it’s sharing an early look at what will be possible with MUM. 

Here are a few examples:

With this new capability, a user can look for something that might be difficult to describe accurately in words alone. By combining images and text into a single query, Google is making it easier to search visually and to express questions in more natural ways.

Some questions are even trickier. To fix a broken bike part, for example, instead of poring over parts catalogs and then hunting for a tutorial, a point-and-ask mode of searching will make it easier to find the exact moment in a video that can help.
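The shape of such a point-and-ask query, an image plus a natural-language question sent together, can be sketched as follows. The `MultimodalQuery` type and `search()` stub are invented for illustration; Google has not published an API for this feature.

```python
# Hypothetical sketch: a single query that pairs an image with a question.
# Nothing here is a real Google API; it only shows the shape of the request.
from dataclasses import dataclass

@dataclass
class MultimodalQuery:
    image_bytes: bytes   # e.g. a photo of the broken bike part
    question: str        # e.g. "how do I fix this?"

def search(query: MultimodalQuery) -> list[str]:
    # Stand-in for a multimodal model: echo the question so the
    # interaction pattern is visible without any real inference.
    return [f"results for {query.question!r} + {len(query.image_bytes)}-byte image"]

q = MultimodalQuery(image_bytes=b"\x89PNG...", question="how do I fix this?")
print(search(q)[0])
```

The key design point is that the image and the text arrive as one query, so the system can resolve "this" in the question against the photo rather than forcing the user to name the part.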

A redesigned search

The company is also applying AI advances like MUM to redesign Google Search and make searching more natural and intuitive. The search engine will be able to understand how people typically explore a particular topic and show the aspects they are likely to look at first.

Users can further explore ideas by zooming in and out of a topic with new features to refine and broaden searches. 

Google is also making it easier to find visual inspiration with a newly designed, browsable results page built for searches where users are looking for inspiration.

Introducing multimodal search

MUM is multimodal, which means it can understand information from different formats like webpages, pictures and more, simultaneously. 

Eventually, you might be able to take a photo of your hiking boots and ask, “can I use these to hike Mt. Fuji?” MUM would understand the image and connect it with your question to let you know your boots would work just fine. It could then point you to a blog with a list of recommended gear.

The Keyword, Google blog

Taking it a step further, with videos

While Google already uses advanced AI systems to identify key moments in videos, such as the winning shot in a basketball game or the steps in a recipe, it is taking this further by introducing a new experience that identifies related topics in a video, with links to easily dig deeper and learn more.

Using MUM, Search can even show related topics that aren't explicitly mentioned in the video, based on its advanced understanding of the information the video contains.

The first version of this feature will roll out in the coming weeks, with more visual enhancements in the coming months.

Removing language barriers

Language can be a significant barrier to accessing information. MUM has the potential to break down these barriers by transferring knowledge across languages.

It can learn from sources that aren’t written in the language the user wrote the search in, and help surface that information.
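One common way to make cross-language retrieval concrete is a shared multilingual embedding space, where a query and a document land near each other if they mean the same thing, regardless of language. The toy vectors below are hand-made purely for illustration; they are not output from any real multilingual model, and the source does not describe MUM's internals at this level.

```python
# Conceptual sketch of cross-lingual retrieval via a shared vector space.
# The tiny hand-made "embeddings" are illustrative only.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Same meaning -> nearby vectors, regardless of language.
query_en = [0.9, 0.1, 0.0]    # English query: "what to pack for Mt. Fuji"
docs = {
    "japanese_fuji_guide": [0.85, 0.15, 0.05],  # Japanese page about Fuji gear
    "unrelated_en_page": [0.1, 0.2, 0.9],       # unrelated English page
}

best = max(docs, key=lambda name: cosine(query_en, docs[name]))
print(best)  # -> japanese_fuji_guide
```

Because similarity is computed in the shared space rather than on surface words, the English query retrieves the Japanese guide, which is the behavior the article describes: surfacing sources not written in the searcher's language.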

Across all these MUM experiences, we look forward to helping people discover more web pages, videos, images and ideas that they may not have come across or otherwise searched for. 

Prabhakar Raghavan, Senior Vice President at Google