In recent years, the capabilities of AI language models have advanced significantly, with the primary objective being to extract, interpret, and communicate natural human language.
Have you ever wondered about how Google comprehends your search queries? The process of providing relevant search results involves intricate language interpretation, and thanks to advancements in AI and machine learning, search systems are now better at understanding human language than ever before.
Google elucidates how its artificial intelligence (AI) systems decode human language and deliver pertinent search results.
Over time, Google has developed many algorithms, including its early spelling-correction system, to enhance the delivery of pertinent search results. Interestingly, Google does not discard these legacy algorithms and systems when introducing new AI systems. Instead, Google Search relies on a synergy of many algorithms and machine learning models, and its effectiveness hinges on how well the old and new systems work together. Each algorithm and model serves a specific purpose and activates at distinct times and in unique ways to deliver the most valuable outcomes. Some of these advanced systems have a more pronounced impact than others. Let's look at some of the prevalent AI systems currently employed in Search and explore their functions.
In a recent article, Pandu Nayak, Google’s Vice President of Search, simplifies the functioning of these AI models in plain language.
Nayak breaks down the following AI models, which play a crucial role in how Google provides search results:
Google employs RankBrain to enhance its comprehension of the probable user intent behind a search query. This technology was initially deployed in the spring of 2015 and publicly announced on October 26, 2015.
RankBrain was initially deployed for queries Google had not previously encountered, constituting approximately 15% of all searches at that time—a proportion that still holds today. Subsequently, its scope was expanded to influence all search outcomes.
RankBrain operates as a machine learning system that builds upon the foundation laid by Hummingbird, the 2013 overhaul that shifted Google's focus from interpreting "strings" of text to understanding real-world "things" (entities).
According to Google, RankBrain assists in uncovering information that was previously elusive by improving its grasp of how words in a search query relate to concrete concepts in the real world. For instance, if you were to search for “what is the title of the consumer at the highest level of a food chain,” RankBrain learns from various web pages that keywords like “food chain” primarily pertain to animals rather than human consumers. By recognizing and linking these terms to their pertinent concepts, RankBrain identifies that you are inquiring about a “top predator.”
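The idea of linking query terms to real-world concepts can be sketched with word embeddings: each term becomes a vector, and the system picks the concept whose vector lies closest to the term's. The tiny hand-set vectors below are purely hypothetical stand-ins for what a system like RankBrain would learn from web-scale text, but the mechanics (cosine similarity over vectors) are the standard technique.

```python
import math

# Toy 3-dimensional "embeddings". Real systems learn vectors with
# hundreds of dimensions; these hypothetical values are chosen only
# to make the example legible.
EMBEDDINGS = {
    "food chain":   (0.90, 0.10, 0.00),  # leans toward the ecology sense
    "top predator": (0.95, 0.05, 0.00),  # ecology concept
    "shopper":      (0.05, 0.90, 0.20),  # retail/consumer concept
}

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def closest_concept(term, concepts):
    """Return the concept whose vector is most similar to the term's."""
    return max(concepts, key=lambda c: cosine(EMBEDDINGS[term], EMBEDDINGS[c]))

# "food chain" lands near the ecology concept, not the retail one.
print(closest_concept("food chain", ["top predator", "shopper"]))  # top predator
```

Because "food chain" and "top predator" point in nearly the same direction in this toy space, the query resolves to the ecology concept even though neither query word literally appears in it.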
Many recent AI systems make use of neural networks. However, it wasn’t until 2018 that Google incorporated neural matching into Search, enhancing its ability to discern the relationships between searches and webpages. Neural matching is pivotal in interpreting and aligning more abstract representations of ideas within queries and pages. Rather than focusing solely on keywords, it scrutinizes the entirety of the question or webpage, enabling a more profound comprehension of the underlying concepts.
Consider the search query “insights on how to manage a green,” for instance. If a friend poses this question, it might perplex you. Nevertheless, we can decipher its meaning with the aid of neural matching. By examining the broader conceptual representations present in the query—such as management, leadership, personality, and more—neural matching can deduce that the searcher seeks guidance on managing based on a widely recognized personality framework associated with colors.
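The contrast between keyword matching and neural matching can be illustrated by embedding the *whole* query and each page as a single vector (here, a simple average of toy word vectors) and comparing similarities. The vectors and page contents below are invented for illustration; the point is only that a page can win on concepts while losing on literal keyword overlap.

```python
import math

# Hypothetical 2-dimensional word vectors; a real neural-matching model
# learns such representations jointly from queries and pages.
VEC = {
    "manage":      (0.8, 0.1),
    "green":       (0.4, 0.6),
    "leadership":  (0.9, 0.1),
    "personality": (0.5, 0.7),
    "lawn":        (0.1, 0.2),
}

def embed(words):
    """Average the word vectors into one vector for the whole text."""
    dims = zip(*(VEC[w] for w in words))
    return tuple(sum(d) / len(words) for d in dims)

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

query = embed(["manage", "green"])
page_personality = embed(["leadership", "personality"])
page_lawncare = embed(["lawn", "green"])

# The personality-framework page scores higher, even though the
# lawn-care page shares the literal keyword "green" with the query.
print(cosine(query, page_personality) > cosine(query, page_lawncare))  # True
```

A pure keyword matcher would favor the lawn-care page for its shared word "green"; comparing concept-level vectors instead surfaces the page about managing a "green" personality type.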
BERT — a model for understanding meaning and context
Launched in 2019, BERT represented a significant leap forward in natural language comprehension, enabling Search to grasp how various combinations of words convey diverse meanings and intentions. Rather than merely matching individual phrases, BERT comprehends how groups of words collaborate to express intricate concepts. BERT discerns the sequencing and interconnections of words, ensuring that even subtle components of your query are reflected in the search results.
For instance, when you input a query like “Can you get medicine for someone’s pharmacy,” BERT recognizes you intend to inquire about obtaining medication on behalf of someone else. In the pre-BERT era, the significance of the word “for” in the query often went unnoticed, resulting in search outcomes primarily focused on prescription filling. We now appreciate that even seemingly minor words can carry substantial meaning thanks to BERT.
BERT has become an integral component of nearly every English query due to its remarkable prowess in ranking and retrieving information—two pivotal functions in delivering pertinent search results. Leveraging its intricate language comprehension, BERT swiftly assesses articles’ relevance, enhancing search outcomes’ precision.
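Why word order and small function words matter can be shown with a deliberately simple contrast. A bag-of-words view (roughly the pre-BERT failure mode described above) discards order, so two queries with opposite meanings look identical; a contextual view keeps each word tied to its neighbors. The "context" below is just each word paired with its predecessor, a drastic simplification of BERT's many attention layers, but the contrast it demonstrates is the real one.

```python
# Bag-of-words ignores order: queries with the same words look identical
# even when they mean different things. A contextual model represents
# each word together with its neighbors, so order is preserved.

def bag_of_words(query):
    """Order-free representation: just the sorted multiset of words."""
    return sorted(query.split())

def contextual(query):
    """Toy contextual representation: each word paired with the word
    before it (None at the start). Real models use attention layers."""
    words = query.split()
    return [(w, words[i - 1] if i else None) for i, w in enumerate(words)]

a = "medicine for someone"
b = "someone for medicine"

print(bag_of_words(a) == bag_of_words(b))  # True: order is lost
print(contextual(a) == contextual(b))      # False: order is preserved
```

In the bag-of-words view, the word "for" contributes nothing to distinguishing the two phrasings; in the contextual view it anchors "someone" as the beneficiary, which is exactly the kind of nuance the "medicine for someone" example hinges on.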
In 2021, Google unveiled the Multitask Unified Model, abbreviated as MUM, marking the latest milestone in the company’s AI advancements for search.
MUM stands out as being roughly a thousand times more powerful than BERT, with the remarkable ability both to comprehend and to generate language.
Its understanding spans a broader spectrum of information and world knowledge: it is trained across 75 languages and performs many tasks concurrently.
MUM’s versatility extends to being multimodal, enabling it to fathom information encompassing various modalities, such as text, images, and potentially more in the future.
While MUM’s application in search is currently limited, Google is only beginning to explore its full potential.
MUM is being harnessed to enhance searches related to COVID-19 vaccine information, with plans to incorporate it into Google Lens in the forthcoming months. This integration will enable searches combining text and images for an enriched user experience.
Source: Google