
Rethinking Site Quality: Google’s Simplified Perspective


In a recent podcast, Google’s John Mueller, Martin Splitt, and Gary Illyes discussed the concept of site quality from several angles, arguing that, at its core, it isn’t as complex as commonly perceived.


Understanding Site Quality: Not as Complex as It Seems


During the discussion, a key emphasis was placed on referencing site quality documentation, with Gary Illyes recommending:

“I’d suggest exploring search engine documentation. Many of them outline how they operate. Identify areas where your content or page might need improvement. Frankly, it may seem patronizing, but it’s far from rocket science.”


Tools for Assessing Site Quality: A Missing Puzzle Piece


Gary Illyes highlighted the lack of tools for diagnosing site quality, in contrast to the many tools available for objectively identifying technical issues.

Metrics that show traffic going up or down lack explanatory power: they signal that something changed without revealing the cause. Illyes expressed frustration, stating, “I find the up-down metric completely useless because you still have to figure out what’s wrong with it or why people didn’t like it.”

Martin Splitt echoed this sentiment, asking how a writer is supposed to improve content when their own view of it differs from user feedback.

Illyes introduced an alternative perspective, suggesting that site quality is more straightforward than commonly perceived: “What if it’s about writing the thing that will help people achieve whatever they need when they come to the page? And that’s it.”

In response to Martin’s question about the user’s perspective, Illyes clarified that he was reframing the issue: reconsidering the problem from a different angle, namely whether the page actually delivers what it promises and meets users’ needs and expectations.


Enhancing Quality through Value Addition


In the subsequent segment of the podcast, John Mueller and Gary Illyes delve into the concept of adding value, a crucial aspect associated with site and page quality.

The discussion turns to the importance of providing something valuable in search engine results pages (SERPs). They point to queries where the SERPs are dominated by sites that users already expect to see, an expectation visible when Google Suggest pairs brand names with keywords.

When many users turn a keyword into a branded search, they are signaling their preference for those sites to Google.

In such competitive queries, relevance alone, or even having the ideal answer, isn’t enough. John Mueller illustrates:

“Sometimes, when speaking with individuals, they express the need to create a page for users. However, upon examining the search results, it’s evident that numerous others have already created similar pages. The dilemma emerges: Is this page truly contributing value to the Internet? Despite being a good page, the question arises: Is it genuinely needed when there are numerous other satisfactory versions already available and embraced by users?”

The discussion moves beyond relevance and content adequacy to the importance of contributing unique value in a landscape already filled with similar offerings.

In such scenarios, relying on competitive analysis to “reverse engineer” the SERPs can hinder SEO efforts.

This approach stagnates because it mirrors what’s already present in the SERPs, feeding Google redundant information.

Consider this analogy: assign a zero baseline to the sites already ranking in Google, extending that baseline across everything in the SERPs. Anything below zero signifies poor quality; anything above zero indicates higher quality.

In this analogy, emulating the entities or topics already in the SERPs doesn’t lift a page above the zero baseline; it merely maintains the same standing.

SEO practitioners who try to reverse engineer Google by replicating what already ranks earn a perfect score of zero: mimicking what’s prevalent never surpasses the baseline.
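As a loose illustration (not a real Google metric), the zero-baseline analogy can be sketched in a few lines of Python. The function and the topic sets here are invented for the example: a page scores only for points it covers that the ranking pages don’t.

```python
# Illustrative sketch of the "zero baseline" analogy (hypothetical scoring,
# not an actual ranking signal): content that merely repeats what already
# ranks adds zero value; each genuinely new point lifts it above the baseline.

def value_over_baseline(candidate_points, serp_points):
    """Count topic points the candidate covers that the SERPs don't."""
    return len(set(candidate_points) - set(serp_points))

serp = {"price", "specs", "pros_cons"}           # what ranking pages already cover
copycat = {"price", "specs", "pros_cons"}        # mirrors the SERPs exactly
original = {"price", "specs", "pros_cons",
            "hands_on_test", "original_photos"}  # adds firsthand experience

print(value_over_baseline(copycat, serp))   # 0: same standing as the baseline
print(value_over_baseline(original, serp))  # 2: above the zero baseline
```

The point of the sketch is only that scale doesn’t help: duplicating the SERP set ten times over still scores zero.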

So, as per Mueller, Google’s response echoes, “It’s a good page, but who needs it?”

In this scenario, Google isn’t seeking content aligning with the existing zero SERP baseline. Mueller suggests that Google is pursuing something distinct, surpassing the prevalent status quo.

In the analogy, Google wants content that exceeds the SERPs’ zero baseline: a one rather than another zero.

Merely duplicating existing content, or producing more of the same at a larger scale, doesn’t add value. The content remains unchanged no matter how much of it there is.


Cracking Tough SERPs: The Side Door Approach


Gary Illyes shares an unconventional strategy for cracking challenging SERPs: taking an indirect route, the “side door.”

While this tactic isn’t novel, it remains effective even today. Illyes advises avoiding confrontations that might seem daunting, suggesting a more pragmatic approach to competition.

Illyes draws a parallel to his own SEO days and the difficulty of breaking into competitive niches. He recalls topics like mesothelioma and legal keywords that were exceptionally hard to penetrate, likening them to today’s scenarios where abundant existing content blocks new entrants.

Recalling advice from Matt Cutts, the former head of Google’s web spam team, Illyes echoes Cutts’ suggestion to offer a unique perspective or original content, while acknowledging this can be difficult given how many viewpoints are already published.

However, Illyes contends that identifying less-discussed niches could offer a breakthrough opportunity. He emphasizes that niches with limited conversation present a more accessible pathway for entry, implying that success lies in genuinely assisting the audience with valuable content.

Illyes emphasizes the importance of expertise and genuine intent to assist users—an approach that aligns with my strategy to stay ahead in SEO.

For instance, even before the reviews update and before Google added Experience to E-A-T (making it E-E-A-T), I privately advised clients to implement these strategies for their review pages. I urged them to keep the approach confidential because I understood how well it worked.

I’m not clairvoyant; I simply observed what Google prioritized in rankings. Years before the review update, it was clear that original photos and hands-on experience with the reviewed product helped pages rank higher, which is what let me anticipate these factors in Google’s algorithms.

Gary’s emphasis on viewing the issue through the lens of “trying to help people” resonates profoundly.

Continuing, he touches upon the notion of selecting battles wisely.

He highlights the prevalent motivator, often money-driven, prompting individuals to target high-profit niches—an obvious choice for many. However, Gary suggests an alternative approach: delving into less-covered topics. For instance, if only a handful of individuals have discussed a particular subject on the Internet, seizing that niche might yield traffic opportunities.

By diversifying across multiple less-explored areas, there’s potential to surpass the traffic volume of highly competitive niches, offering an alternative route to success.

Gary is discussing strategies to navigate the barriers established sites pose—the entry hurdle. His advice revolves around avoiding duplicating what everyone else already offers, emphasizing the importance of distinctiveness, which directly impacts quality.

I’ve employed this approach with new websites, creating content that larger sites either can’t produce or haven’t considered. It involves pinpointing larger sites’ weaknesses: areas where they fall short, such as connecting with specific demographics like younger or older audiences.

These cases highlight the essence of offering something unique, which sets a site apart in quality. Identifying and capitalizing on what larger sites overlook or struggle to address ultimately allows a new site to stand out amid the competition.


Analyzing Quality Challenges


Assessing a site for quality issues proves more challenging than pinpointing technical glitches. Yet, here are a few crucial takeaways:

  1. Perception of quality varies: Those deeply involved in content are sometimes the best judges.
  2. Dive into Google’s search documentation: Explore on-page factors, content guidelines, and quality recommendations.
  3. Simplify content quality: Focus on understanding the topic thoroughly and providing valuable assistance.
  4. Originality matters: rather than mirroring competitors, look for angles the SERPs don’t already cover.
  5. Flexibility is key: Staying open-minded prevents rigid thinking, especially regarding site quality. It’s crucial to avoid fixating on one viewpoint that might obscure the genuine causes of ranking issues.

If navigating this feels overwhelming, consider exploring our monthly SEO packages. Our experts are here to lend a hand and make things easier for you.

Shilpi Mathur
[email protected]