How Does Google Search Understand Human Language?

Google describes how its artificial intelligence (AI) systems interpret human language and produce appropriate search results.

Google Search can understand human language with the help of multiple AI models that collaborate to find the most relevant results.

Pandu Nayak, Google’s Vice President of Search, explains how these AI models work in simple terms in a news article on the company’s official blog.

Nayak deconstructs the following AI models, which play a significant role in how Google returns search results:

  • RankBrain
  • Neural matching
  • BERT
  • MUM

None of these models is sufficient on its own. They all assist one another, performing different tasks to understand queries and match them to the content searchers are looking for.

Here are the most important takeaways from Google’s behind-the-scenes look at what its AI models do and how it all translates into better search results for users.

Google’s AI Models Explained

RankBrain

RankBrain, Google’s first AI system, was released in 2015.

The goal of RankBrain, as the name implies, is to determine the best order for search results by ranking them according to relevance.

RankBrain, despite being Google’s first deep learning model, continues to play a significant role in search results today.

Google uses RankBrain to understand how the words in a search query relate to real-world concepts.

Nayak explains how RankBrain works:

“For example, if you search for ‘what’s the title of the consumer at the highest level of a food chain,’ our systems learn from seeing those words on various pages that the concept of a food chain may have to do with animals, and not human consumers.

By understanding and matching these words to their related concepts, RankBrain understands that you’re looking for what’s commonly referred to as an ‘apex predator.’”

Neural Matching

In 2018, Google added neural matching to search results.

Using knowledge of broader concepts, neural matching enables Google to understand how queries relate to pages.

Instead of focusing on individual keywords, neural matching examines entire queries and pages to determine the concepts they represent.

This model allows Google to cast a wider net when searching its index for content relevant to a query.

Nayak explains how neural matching works:

“Take the search ‘insights how to manage a green,’ for example. If a friend asked you this, you’d probably be stumped. But with neural matching, we’re able to make sense of it.

By looking at the broader representations of concepts in the query — management, leadership, personality and more — neural matching can decipher that this searcher is looking for management tips based on a popular, color-based personality guide.”
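To make the idea more concrete, here is a minimal sketch of concept-level matching using the open-source sentence-transformers library and the public all-MiniLM-L6-v2 checkpoint. It illustrates the general technique of comparing dense embeddings of whole queries and pages rather than keywords; it is not Google’s neural matching system, and the sample pages are invented for illustration.

```python
# A minimal sketch of concept-level query-to-page matching, NOT Google's
# neural matching system. Assumes the open-source sentence-transformers
# library and the public all-MiniLM-L6-v2 checkpoint; the pages below are
# made up for illustration.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "insights how to manage a green"
pages = [
    "Management tips for leading an employee with a 'green' personality type",
    "How to keep your lawn green all summer",
    "A beginner's guide to supply chain management",
]

# Encode the whole query and each page into dense vectors that capture
# broader concepts (management, leadership, personality), then score
# pages by cosine similarity instead of keyword overlap.
query_vec = model.encode(query, convert_to_tensor=True)
page_vecs = model.encode(pages, convert_to_tensor=True)
scores = util.cos_sim(query_vec, page_vecs)[0]

for page, score in sorted(zip(pages, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {page}")
```

A purely keyword-based match would latch onto the word “green” and little else; embedding the full query lets semantically related pages score well even when they share few exact words with it.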

BERT

BERT was first used in queries in 2019, and it is now used in all queries.

It’s intended to do two things: retrieve relevant content and rank it.

BERT can understand how words relate to each other when used in a specific sequence, ensuring that important words are not omitted from a query.

BERT is able to rank web content for relevance faster than other AI models due to its complex understanding of language.

In this example, Nayak shows how BERT works in practice:

“For example, if you search for ‘can you get medicine for someone pharmacy,’ BERT understands that you’re trying to figure out if you can pick up medicine for someone else.

Before BERT, we took that short preposition for granted, mostly sharing results about how to fill a prescription. Thanks to BERT, we understand that even small words can have big meanings.”
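The behavior Nayak describes rests on context-sensitive word representations: the same word gets a different vector depending on the words around it. The snippet below demonstrates that general mechanism with the open-source transformers library and the public bert-base-uncased checkpoint; it is only an illustration, not Google’s ranking code.

```python
# A small demonstration of context-sensitive word representations using the
# public bert-base-uncased checkpoint via the transformers library. This
# illustrates the general mechanism only; it is not Google's ranking pipeline.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# The same word, three different contexts.
v_river = word_vector("she sat on the bank of the river", "bank")
v_shore = word_vector("the boat drifted toward the river bank", "bank")
v_money = word_vector("she deposited cash at the bank", "bank")

cos = torch.nn.functional.cosine_similarity
print(cos(v_river, v_shore, dim=0).item())  # typically higher: same sense
print(cos(v_river, v_money, dim=0).item())  # typically lower: different sense
```

Because every word is encoded in light of its neighbors, small function words like “for” or “to” shift the representation of the whole query, which is why they can change which results are relevant.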

MUM

In 2021, Google introduced its most recent AI milestone in Search, the Multitask Unified Model, or MUM.

MUM is a thousand times more powerful than BERT, and it can understand and generate language.

It has a more comprehensive understanding of information and world knowledge because it was trained on 75 different languages and many different tasks at the same time.

MUM’s understanding of language extends to images, text, and, in the future, more. That is what it means when MUM is referred to as “multi-modal.”

Because Google is still in the early stages of realizing MUM’s potential, its use in search is limited.

MUM is currently being used to enhance searches for COVID-19 vaccine information. It will be used in Google Lens in the coming months to search using a combination of text and images.

Summary

Here’s a quick rundown of Google’s major AI systems and what they do:

  • RankBrain: This algorithm ranks content by determining how keywords relate to real-world concepts.
  • Neural matching: Provides Google with a broader understanding of concepts, allowing it to search through more content.
  • BERT: Allows Google to understand how words in a specific sequence can change the meaning of queries.
  • MUM: Understands information and global knowledge in dozens of languages and multiple modalities, including text and images.

All of these AI systems collaborate to find and rank the most relevant content for a query as quickly as possible.

To learn more about SEO, read: The Google MUM Algorithm is capable of more than just ranking websites.
