
How does Google’s AI Understand Human Language?


We’ve seen an explosion of AI language models in recent years. The ultimate goal of these systems is to extract, interpret, and communicate language at a human level.

Do you ever wonder how Google interprets your search queries? There’s a lot that goes into providing relevant search results, and one of the most critical skills is language interpretation. Search systems are comprehending human language better than ever before because of advancements in AI and machine learning.

Google has described how its artificial intelligence (AI) systems interpret human language and deliver relevant search results.

Over the years, Google has built hundreds of algorithms, such as its early spelling-correction system, to help deliver relevant search results. Its legacy algorithms and systems aren’t simply discarded when new AI systems are developed. In fact, Search relies on hundreds of algorithms and machine learning models, and it can only improve when old and new systems work together effectively. Each algorithm and model has a specific purpose, and they activate at different times and in different combinations to help deliver the most useful results. Some of the more advanced systems have a greater impact than others. Let’s take a closer look at some of the most prominent AI systems used in Search today, and what they do.

 

Pandu Nayak, Google’s Vice President of Search, explains in simple terms how these AI models work in a recent article.

Nayak breaks down the following AI models, which play a crucial role in how Google delivers search results:

 

RankBrain

RankBrain is a technique that Google uses to better understand the likely intent behind a search query. It was rolled out in the spring of 2015, but its existence was not publicly announced until October 26 of that year.

RankBrain was first applied to queries that Google had never seen before, which accounted for around 15% of all searches at the time (and still do today). It was later broadened to affect all search results.

RankBrain is a machine learning system that builds on Hummingbird, the update that shifted Google’s approach from “strings” to “things.”

According to Google, RankBrain helps surface information that searchers couldn’t find before by better understanding how the words in a search relate to real-world concepts. If you search for “what is the title of the consumer at the highest level of a food chain,” for example, the systems learn from seeing those words on different pages that the concept of a food chain can relate to animals rather than human consumers. By recognizing these terms and connecting them to their relevant concepts, RankBrain understands that you’re looking for what’s known as a “top predator.”
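As a rough, hands-on illustration of the idea that query words map to related real-world concepts, here is a minimal sketch using pretrained GloVe word embeddings loaded through gensim. This is not RankBrain; the library, the embedding model, and the example terms are stand-in assumptions used only to show that words near each other in an embedding space tend to describe related concepts.

```python
# Minimal sketch (not RankBrain): use pretrained word embeddings to see which
# concepts sit near the content words of a query in vector space.
import gensim.downloader as api  # assumes gensim is installed; downloads vectors on first use

vectors = api.load("glove-wiki-gigaword-100")  # pretrained GloVe word vectors

# Content words from the example query about the food chain.
query_terms = ["consumer", "food", "chain", "highest"]

# Words whose vectors are closest to the combined query vector.
for word, score in vectors.most_similar(positive=query_terms, topn=5):
    print(f"{word}\t{score:.2f}")

# The hope is that the neighbours lean toward the ecological sense of
# "food chain" (species, predator, prey) rather than the retail sense
# of "consumer"; the exact results depend on the embedding model.
```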

Neural matching

Many recent AI systems rely on neural networks, but it wasn’t until 2018 that Google added neural matching to Search, allowing it to better understand how queries relate to pages. Neural matching helps Search interpret and match fuzzier representations of concepts in queries and pages. It examines the full query or page rather than just the keywords, which gives it a better grasp of the underlying concepts they represent.

 Take the search “insights how to manage a green,” for example. If a friend asked you this, you’d probably be stumped. But with neural matching, we’re able to make sense of it. By looking at the broader representations of concepts in the query — management, leadership, personality, and more — neural matching can decipher that this searcher is looking for management tips based on a popular, color-based personality guide.
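To get a feel for what matching fuzzier representations of concepts might look like, the sketch below uses an open-source sentence-embedding model from the sentence-transformers library. It is only a stand-in for the general idea, not Google’s neural matching system: it scores a few made-up passages against the vague “manage a green” query by meaning rather than by keyword overlap.

```python
# Minimal sketch (not Google's neural matching): embed a fuzzy query and some
# candidate passages in one vector space, then rank passages by semantic similarity.
from sentence_transformers import SentenceTransformer, util  # assumes the library is installed

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose sentence encoder

query = "insights how to manage a green"
passages = [
    "Leadership tips for managing 'green' personality types from a color-based personality guide.",
    "How to keep your lawn green through a dry summer.",
    "Managing green energy projects for small businesses.",
]

query_vec = model.encode(query, convert_to_tensor=True)
passage_vecs = model.encode(passages, convert_to_tensor=True)

# Cosine similarity between the query and each passage; higher means closer in meaning.
scores = util.cos_sim(query_vec, passage_vecs)[0]
for passage, score in sorted(zip(passages, scores.tolist()), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {passage}")
```

Whether the personality-guide passage actually wins depends on the embedding model, but the point is that the ranking is driven by overall meaning, not by matching the literal word “green.”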


BERT — a model for understanding meaning and context

BERT, launched in 2019, was a big step forward in natural language understanding, allowing Search to understand how different combinations of words express different meanings and intents. Rather than simply looking for content that matches individual words, BERT understands how a group of words conveys a complex idea. It recognizes words in a sequence and how they relate to one another, ensuring that significant words in your query aren’t dropped from the results, no matter how small they are.

For example, if you search for “can you get medicine for someone pharmacy,” BERT understands that you’re trying to figure out if you can pick up medicine for someone else. Before BERT, we took that short preposition for granted, mostly sharing results about how to fill a prescription. Thanks to BERT, we understand that even small words can have big meanings.
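You can see this sensitivity to small words with the publicly released BERT checkpoint. The sketch below uses the Hugging Face Transformers library (not anything inside Google Search) to mask a word in two versions of a pharmacy-style query that differ only in one preposition, then compares what the model predicts for the blank.

```python
# Minimal sketch with the public bert-base-uncased checkpoint: changing a single
# preposition changes what the model expects in the masked position.
from transformers import pipeline  # assumes the transformers library is installed

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for sentence in [
    "can you pick up medicine for [MASK] at the pharmacy",
    "can you pick up medicine from [MASK] at the pharmacy",
]:
    print(sentence)
    for prediction in fill_mask(sentence, top_k=3):
        print(f"   {prediction['token_str']}  {prediction['score']:.3f}")
```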

BERT is now part of practically every English query. That’s because BERT excels at ranking and retrieval, two of the most important tasks in delivering relevant results: it can quickly rank documents for relevance based on its deep understanding of language.


MUM

In 2021, Google introduced the Multitask Unified Model, or MUM, which is the company’s most recent AI milestone in search.

MUM is a thousand times more powerful than BERT, and it can understand as well as generate language.

It has a broader understanding of information and world knowledge, having been trained across 75 languages and on many tasks at once.

MUM is also multimodal, meaning it can understand information across multiple formats such as text and images, with more to come in the future.

MUM’s use in Search is still limited, as Google is only beginning to tap its potential.

MUM is currently being used to improve searches for COVID-19 vaccine information. In the coming months, it will come to Google Lens as a way to search using a combination of text and images.
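MUM itself isn’t publicly available, but the general idea of relating text and images in one model can be sketched with the open CLIP model, which is an unrelated stand-in rather than anything Google Search uses. The image file name below is a hypothetical placeholder.

```python
# Minimal sketch of multimodal matching (CLIP, not MUM): score one image against
# several text descriptions in a shared embedding space.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor  # assumes the transformers library is installed

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("hiking_boots.jpg")  # hypothetical local photo
texts = [
    "hiking boots suitable for a mountain trail",
    "running shoes for a city marathon",
]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# One similarity score per (image, text) pair; softmax turns the scores into probabilities.
probs = outputs.logits_per_image.softmax(dim=1)
print(dict(zip(texts, probs[0].tolist())))
```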

Source: Google