
Google Wants to Reduce Crawl Rate to Make It More Sustainable


In the new episode of the Search Off The Record podcast that aired on 20 January 2022, Google’s Search Relations team, comprising John Mueller, Martin Splitt, and Gary Illyes, discussed what to expect from Google in 2022 and beyond. One of the key takeaways from their discussion was crawling and indexing: the team considered reducing the frequency of crawling and indexing to make it more environmentally sustainable by conserving computing resources. But how will it impact you, as a website owner or an SEO professional, should the frequency of indexing and crawling be reduced? Let’s look at what the search trio had to say about it and what to expect in the coming months.

Crawling and indexing do have an impact on the environment

Gary pointed out that computing in general isn’t very environmentally sustainable. And since both crawling and indexing happen virtually, one is naturally inclined to assume they are carbon-free and have no impact on the environment. But that is not the case at all. Gary said:

“… what I mean is that computing, in general, is not really sustainable. And if you think of Bitcoin, for example, Bitcoin mining has real impact on the environment that you can actually measure, especially if the electricity is coming from coal plants or other less sustainable plants.

We are carbon-free, since I don’t even know, 2007 or something, 2009, but it doesn’t mean that we can’t reduce even more our footprint on the environment. And crawling is one of those things that early on, we could chop off some low-hanging fruits.”

How does Google intend to make crawling and indexing more environmentally sustainable?

To the uninitiated, there are two types of crawling: crawling to discover and crawling to refresh. The former discovers new content, while the latter revisits content Google already knows about to pick up any changes. Gary went on to say that one way crawling can be made more sustainable is by reducing the number of refresh crawls on websites that don’t update their content as frequently as others. Citing the Wall Street Journal as an example, he added that since sites like it frequently update their homepage with new content, those homepages need frequent refresh crawls, as opposed to, say, their About page, which in comparison almost always remains the same. Google can make a difference by crawling such pages less frequently.
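
If you want to see how this split plays out on your own site, one way is to count Googlebot requests per URL in your server access logs. Here is a minimal sketch, assuming Common Log Format logs; the sample lines, IPs, and paths are fabricated for illustration, and a production version should also verify that the requests really come from Google rather than trusting the user-agent string.

```python
import re
from collections import Counter

# Fabricated access-log lines in Common Log Format (illustrative only).
LOG_LINES = [
    '66.249.66.1 - - [20/Jan/2022:10:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [21/Jan/2022:10:05:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [20/Jan/2022:11:00:00 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '198.51.100.7 - - [20/Jan/2022:12:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Extract the request path from the quoted request line.
REQUEST_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')

def googlebot_hits(lines):
    """Count how often Googlebot requested each URL path."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip other visitors
        match = REQUEST_RE.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

print(googlebot_hits(LOG_LINES))
```

A homepage that gets hit daily while the About page is visited once in the same window is exactly the discover-versus-refresh asymmetry Gary describes.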

Here’s what Gary had to say about it.

“… one thing that we do, and we might not need to do that much, is refresh crawls. Which means that once we discovered a document, a URL, then we go, we crawl it, and then, eventually, we are going to go back and revisit that URL. That is a refresh crawl. And then every single time we go back to that one URL, that will always be a refresh crawl. Now, how often do we need to go back to that URL?”

“So you don’t have to go back there that much. And often, we can’t estimate this well, and we definitely have room for improvement there on refresh crawls. Because sometimes it just seems wasteful that we are hitting the same URL over and over again. Sometimes we are hitting 404 pages, for example, for no good reason or no apparent reason. And all these things are basically stuff that we could improve on and then reduce our footprint even more.”
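
One thing a site owner can do to make each refresh crawl cheaper is honour conditional requests: if a crawler revalidates with `If-Modified-Since` and the page hasn’t changed, answer `304 Not Modified` instead of resending the body. The sketch below is a simplified illustration of that logic, not Google’s method; the `PAGE_MODIFIED` timestamp stands in for whatever storage your server uses.

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from typing import Optional

# Illustrative last-modification time for a single page.
PAGE_MODIFIED = datetime(2022, 1, 15, tzinfo=timezone.utc)

def respond(if_modified_since: Optional[str]):
    """Return (status, headers), honouring an If-Modified-Since header."""
    headers = {"Last-Modified": format_datetime(PAGE_MODIFIED, usegmt=True)}
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            since = None  # unparseable header: fall through to a full response
        if since is not None and since.tzinfo is not None and PAGE_MODIFIED <= since:
            return 304, headers  # page unchanged: no body needed
    return 200, headers

# First crawl: full response.
print(respond(None)[0])                             # 200
# Refresh crawl revalidating with the stored date: cheap 304.
print(respond("Sat, 15 Jan 2022 00:00:00 GMT")[0])  # 304
```

Real servers and frameworks implement this for you; the point is that accurate `Last-Modified` (or `ETag`) handling turns a wasteful repeated download into a near-free check.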

What does this mean for your website?

If you are under the impression that a high crawl rate, even when you’re not updating your content frequently, is a positive ranking signal, you are mistaken. Gary was quick to dismiss the question raised by John, saying that a high crawl rate doesn’t necessarily translate to better or higher rankings. Here is a snippet of the conversation between John, Gary, and Martin.

John: [00:09:06] So I guess that’s kind of also a misconception that people have in that they think if a page gets crawled more, it’ll get ranked more. Is that correct that that’s a misconception, or is that actually true?

Gary: [00:09:20] It’s a misconception.

John: [00:09:20] OK, so no need to try to force something to be re-crawled if it doesn’t actually change. It’s not going to rank better.

Martin: [00:09:28] I mean it doesn’t make sense, right? If nothing has changed on the page, then we already know everything there is to know about that page. Why would that be a thing that makes it more relevant or more useful to users? It doesn’t make sense.

Since more crawling doesn’t mean higher rankings, we guess it won’t be such a bad thing for your website after all. But again, this is all conjecture at the moment, as nothing is confirmed. It certainly makes sense, though, to refresh-crawl pages that are updated often rather than the ones that rarely are. You can listen to this new episode of the Search Off The Record podcast, titled ‘What to expect from Search Central in 2022’, here, and, if you’d like, download its transcript to read and refer to later.
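
If you want to tell crawlers which pages actually change often, one widely used signal is an XML sitemap with accurate `<lastmod>` dates. Below is a minimal sketch that generates such a sitemap; the URLs and dates are illustrative, and keeping `<lastmod>` truthful matters more than the exact tooling.

```python
from datetime import date
from xml.sax.saxutils import escape

# Illustrative pages and their real last-modification dates.
PAGES = [
    ("https://example.com/", date(2022, 1, 20)),      # homepage, updated daily
    ("https://example.com/about", date(2019, 6, 1)),  # rarely changes
]

def build_sitemap(pages):
    """Render a sitemap.xml string with a <lastmod> per URL."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{modified.isoformat()}</lastmod>\n  </url>"
        for url, modified in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

print(build_sitemap(PAGES))
```

A sitemap like this gives crawlers a cheap, honest hint about which URLs deserve frequent refresh crawls and which can wait.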

Source: Search Engine Journal