Google made substantial changes to its crawler documentation, increasing information density and providing more focused coverage on critical topics.
Google has unveiled an overhaul of its crawler documentation, shrinking the main overview page and dividing the content into three more specialized pages. While the changelog downplays the scope of these changes, the update introduces a brand-new section and effectively rewrites the entire crawler overview. By splitting the content, Google has increased the information density across all crawler-related pages, enhancing topical depth and clarity.
What Changed?
Google’s documentation changelog lists only two updates, but the reality is that much more has been modified.
Here’s an overview of the fundamental changes:
- Updated user agent string for the GoogleProducer crawler
- Added content-encoding information
- Introduced a new section on technical properties
The new technical properties section includes fresh content that was not part of the previous documentation. While crawler behavior hasn’t changed, Google has reorganized the content into three topic-specific pages, which lets it add more detailed information to those pages while keeping the crawler overview itself more concise.
Here’s the new information about content encoding (compression):
“Google’s crawlers and fetchers support the following content encodings (compressions): gzip, deflate, and Brotli (br). The content encodings supported by each Google user-agent are advertised in the Accept-Encoding header of each request they make. For example, Accept-Encoding: gzip, deflate, br.”
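To see what this looks like from the site’s side, here is a minimal sketch in Python (standard library only, with a placeholder URL) that sends a request advertising the same Accept-Encoding values and reports which compression the server actually applies:

```python
import urllib.request

# Placeholder URL; substitute a page on the site you want to inspect.
url = "https://example.com/"

# Advertise the same content encodings Google's crawlers list in their requests.
request = urllib.request.Request(url, headers={"Accept-Encoding": "gzip, deflate, br"})

with urllib.request.urlopen(request) as response:
    # The Content-Encoding response header reveals which compression the server chose.
    encoding = response.headers.get("Content-Encoding", "none (uncompressed)")
    print(f"Server responded with Content-Encoding: {encoding}")
```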
Additionally, the updated documentation includes details about crawling over HTTP/1.1 and HTTP/2, along with a statement that Google aims to crawl as many pages as possible without negatively impacting a website’s server.
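Whether a server can actually be crawled over HTTP/2 depends on the protocol negotiated via ALPN during the TLS handshake. The rough sketch below (again standard-library Python, with a placeholder hostname) checks which protocol a server selects; it only tests negotiation and says nothing about how Google will choose to crawl the site.

```python
import socket
import ssl

host = "example.com"  # placeholder hostname

# Offer both HTTP/2 ("h2") and HTTP/1.1 via ALPN and see which the server picks.
context = ssl.create_default_context()
context.set_alpn_protocols(["h2", "http/1.1"])

with socket.create_connection((host, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        selected = tls_sock.selected_alpn_protocol()
        print(f"{host} negotiated: {selected or 'no ALPN protocol'}")
```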
What Is The Goal Of The Revamp?
The recent update to Google’s crawler documentation was driven by the need to streamline the content. The original overview page had become too large, and as more information was added, it risked becoming unwieldy. To solve this, Google broke the page into three focused subtopics. This expands the specific crawler content while keeping the overview page more concise and manageable. By spinning off subtopics into their own pages, Google has found an effective way to serve users better while maintaining a clean structure.
Here’s how the documentation changelog explains the change:
“The documentation grew very long, which limited our ability to extend the content about our crawlers and user-triggered fetchers… Reorganized the documentation for Google’s crawlers and user-triggered fetchers. We also added explicit notes about what product each crawler affects and added a robots.txt snippet for each crawler to demonstrate how to use the user agent tokens. There were no meaningful changes to the content otherwise.”
Although the changelog describes the update as a simple reorganization, it’s more significant. The crawler overview has been substantially rewritten, and three new pages have been created. While the core content remains the same, dividing it into subtopics makes it easier for Google to grow the documentation without making the original page larger.
The newly organized Overview of Google Crawlers and Fetchers page is now an accurate high-level summary, with more detailed information shifted to standalone pages.
Google introduced three new pages:
- Common Crawlers
- Special-Case Crawlers
- User-Triggered Fetchers
1. Common Crawlers
This page covers Google’s standard crawlers, many of which are tied to Googlebot, including Google-InspectionTool, which uses the Googlebot user agent. All crawlers listed here obey robots.txt rules (a quick way to test this is sketched after the list).
Here are the documented Google crawlers:
- Googlebot
- Googlebot Image
- Googlebot Video
- Googlebot News
- Google StoreBot
- Google-InspectionTool
- GoogleOther
- GoogleOther-Image
- GoogleOther-Video
- Google-CloudVertexBot
- Google-Extended
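Since the changelog mentions that each crawler now gets a robots.txt snippet demonstrating its user agent token, one quick way to sanity-check your own rules against those tokens is Python’s built-in urllib.robotparser, as in the sketch below; the robots.txt rules and URL here are made up purely for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, for illustration only.
robots_txt = """
User-agent: GoogleOther-Image
Disallow: /private-images/

User-agent: *
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# Check how different crawler tokens are treated for the same URL.
for token in ("Googlebot", "GoogleOther-Image"):
    allowed = parser.can_fetch(token, "https://example.com/private-images/photo.jpg")
    print(f"{token}: {'allowed' if allowed else 'blocked'}")
```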
2. Special-Case Crawlers
These crawlers are associated with specific Google products and operate under agreements with their users. They use IP addresses distinct from Googlebot’s and follow specific rules.
Here’s the list of special-case crawlers:
- AdSense
- User Agent: Mediapartners-Google
- AdsBot
- User Agent: AdsBot-Google
- AdsBot Mobile Web
- User Agent: AdsBot-Google-Mobile
- APIs-Google
- User Agent: APIs-Google
- Google-Safety
- User Agent: Google-Safety
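For illustration, here is a small hypothetical sketch that flags these special-case user agent tokens in a user agent string, such as one pulled from a server access log; the helper and sample string are assumptions for the example, not something taken from Google’s documentation.

```python
# User agent tokens for Google's special-case crawlers, per the list above.
SPECIAL_CASE_TOKENS = (
    "Mediapartners-Google",
    "AdsBot-Google-Mobile",  # checked before AdsBot-Google so the mobile variant wins
    "AdsBot-Google",
    "APIs-Google",
    "Google-Safety",
)

def match_special_case_crawler(user_agent):
    """Return the first special-case token found in the user agent string, if any."""
    for token in SPECIAL_CASE_TOKENS:
        if token in user_agent:
            return token
    return None

# Hypothetical user agent string for demonstration purposes.
sample_ua = "Mozilla/5.0 (compatible; AdsBot-Google; example string)"
print(match_special_case_crawler(sample_ua))  # -> AdsBot-Google
```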
3. User-Triggered Fetchers
These fetchers are activated by user requests within a Google product. For instance, a user might trigger Google Site Verifier to verify site ownership, or a site hosted on Google Cloud might retrieve an external RSS feed. Because these fetchers respond to user actions, they generally ignore robots.txt rules, though the general technical properties of Google’s crawlers still apply.
The documented user-triggered fetchers include:
- Feedfetcher
- Google Publisher Center
- Google Read Aloud
- Google Site Verifier
By splitting the content into these focused pages, Google improves clarity and creates room for future expansions in each area.
Takeaway
Google’s crawler overview page had become overly comprehensive, making it potentially less useful for readers seeking specific information. The revamped page is now more concise and lets users drill into subtopics for the three types of crawlers. This makes the content easier to navigate and understand, so users can find detailed information more efficiently.
This update provides a valuable lesson in refreshing a page that may be underperforming due to being overly comprehensive. By breaking a broad topic into standalone pages, subtopics can better address specific user needs and improve their visibility in search results.
It’s important to note that this change doesn’t reflect an update in Google’s algorithm—it’s simply an improvement in their documentation structure to enhance usability and make room for future content expansion.
If you’d like to feel more confident about your SEO, consider exploring our monthly SEO packages and let the experts guide you. We’re here to help you achieve the best results!