
The Enigma of Ranking Drops: Why Google Keeps Some Secrets


Google explains why it is difficult to offer clarity around algorithmic ranking shifts and manual reviews.

  - Decreases in traffic don’t always indicate an algorithmic spam action.
  - Manual reviews to reverse ranking declines are unlikely.
  - Greater transparency in Search Console may be on the horizon.

In a recent Twitter conversation, Danny Sullivan, Google’s Search Liaison, discussed the search engine’s approach to algorithmic spam actions and ranking declines.

The dialogue began with a website owner expressing concern over a substantial decrease in traffic and the inability to request a manual review.

Sullivan clarified that a site’s decline in rankings could be due to an algorithmic spam action or other factors affecting its visibility, emphasizing the complexity of search ranking dynamics.


Sullivan emphasized that numerous websites encountering ranking declines often misinterpret them as algorithmic spam actions when, in fact, that might not be the root cause.

“I’ve examined numerous sites where individuals have lamented the loss of rankings and concluded that they’ve been hit with an algorithmic spam action, but in reality, they haven’t.”

Sullivan’s comprehensive statement illuminates the challenges Google faces in providing transparency.

Furthermore, he elaborated on why wishing for a manual review to supersede automated rankings may be misguided.

Navigating Transparency Challenges & Manual Intervention

Sullivan acknowledged the possibility of enhancing transparency in Search Console, such as alerting site owners to algorithmic actions the way it does for manual actions.

However, he underscored two significant hurdles:

  1. Exposing algorithmic spam signals could empower malicious actors to exploit the system.
  2. Algorithmic actions lack site specificity and cannot be manually overridden.

Sullivan empathized with the frustration of not knowing why traffic dropped and not being able to speak with someone at Google about it.

Nevertheless, he cautioned against wishing for manual intervention to override the automated ranking systems. Sullivan stated:

“…you don’t want to think, ‘Oh, I just wish I had a manual action; that would be so much easier.’ You don’t want your site catching our spam analysts’ attention. First, it’s not as if manual actions are promptly processed. Second, we retain information about a site in the future, particularly if it claims it has altered but hasn’t truly.”

Evaluating Content Utility & Credibility

Beyond spam detection, Sullivan delved into the various systems that assess the usefulness, relevance, and reliability of individual pieces of content and entire websites.

He acknowledged the imperfections within these systems, recognizing that some high-quality sites may not receive the recognition they deserve.

“Some of them rank quite well. However, they’ve experienced slight downward shifts in positions significant enough to trigger notable traffic drops. Website owners may assume there are fundamental issues when, in reality, there aren’t — which is why we’ve dedicated an entire section to this topic on our traffic drop troubleshooting page.”

Sullivan disclosed ongoing deliberations about introducing more indicators in Search Console to help creators understand their content’s performance.

“Another topic I’ve been contemplating, and I’m not alone, is whether we could enhance the Search Console to display some of these indicators. This poses challenges akin to those I mentioned about spam, regarding our reluctance to allow the systems to be manipulated and the absence of a straightforward solution such as a button that says ‘this content is actually more useful than our automated systems think — rank it better!’ However, perhaps there’s a method we can devise to share more information in a manner beneficial to everyone, coupled with improved guidance, that could assist creators.”

Empowering Small Publishers & Forward Momentum

In response to a proposal from Brandon Saltalamacchia, founder of RetroDodo, regarding the manual review of “good” sites and offering guidance, Sullivan shared his vision for potential solutions.

He mentioned contemplating ideas like enabling self-declaration through structured data for small publishers and leveraging that information to instigate positive transformations.

“I’ve been exploring and suggesting ideas about what we could do with small publishers and self-declaration through structured data and how to utilize that information to drive positive changes. It’s still in the exploratory phase, and there are no promises, but there’s potential for us to advance in a more constructive direction.”

Sullivan emphasized that implementing changes will take time, and he cannot make guarantees. Nevertheless, he expressed optimism about discovering avenues for progress.


If you’re feeling overwhelmed or uncertain, consider exploring our monthly SEO packages and let our experts assist you.

Shilpi Mathur
[email protected]