Recent remarks from Google employees suggest the algorithm is working as designed, and that site owners should focus on user experience rather than trying to manipulate rankings. Yet the same Googlers also say the search team is exploring ways to surface more high-quality content.
At first glance this seems contradictory: if the algorithm is working as intended, why invest effort in refining it? The explanation for this apparent paradox is rather unexpected.
Google’s Point Of View
Understanding Google’s perspective on search is crucial, and they offer insights through their Search Off The Record (SOTR) podcast, where Googlers discuss search from their unique viewpoint behind the search box.
In a recent episode, Googlers Gary Illyes and John Mueller discussed how issues can arise within Google’s systems that seem insignificant from their internal standpoint, yet become apparent to outside users, prompting questions and observations.
In this context, Gary Illyes addressed the decision-making process regarding whether to communicate these issues externally.
He shared:
“There’s also the flip side where we are like, ‘Well, we don’t actually know if this is going to be noticed,’ and then two minutes later there’s a blog that puts up something about ‘Google is not indexing new articles anymore. What up?’ And I say, ‘Okay, let’s externalize it.'”
John Mueller then asked:
“Okay, so if there’s more pressure on us externally, we would externalize it?”
And Gary answered:
“Yeah. For sure. Yeah.”
John followed up with:
“So the louder people are externally, the more likely Google will say something?”
Gary then clarified with a nuanced response, stating that while sometimes external pressure prompts Google to make announcements, it’s not a blanket rule because not every perceived issue is actually a failure on Google’s part.
He explained further:
“I mean, in certain cases, yes, but it doesn’t work all the time, because some of the things that people perceive externally as a failure on our end is actually working as intended.”
So, while external feedback does influence Google’s decisions to communicate, it’s not always indicative of an actual problem with Google’s systems. Sometimes, the issue lies with the site owners themselves, who may not realize that something is amiss. This can be observed when people mistakenly attribute changes in their site rankings to Google’s actions, such as during the Site Reputation Abuse crackdown, even though their sites were not affected by manual actions.
The Non-Existent Algorithms
SearchLiaison recently addressed the ongoing misconception surrounding the Helpful Content Update (HCU), clarifying that the HCU system no longer exists. Instead, the signals associated with helpfulness have been integrated into Google’s core ranking algorithm.
In a tweet, they stated: “I know people keep referring to the helpful content system (or update), and I understand that — but we don’t have a separate system like that now. It’s all part of our core ranking systems.”
Essentially, all the signals previously attributed to the HCU are now part of the broader core algorithm, which comprises numerous components. Therefore, while helpfulness remains a factor, there are also other signals considered during core updates, leading to a more comprehensive evaluation of websites.
For site owners, this underscores the importance of looking beyond helpfulness-related signals when assessing ranking changes, since a variety of other factors may be influencing a site’s performance.
Mixed Signals
However, there’s some ambiguity in the feedback from Googlers. On one hand, they affirm that things are working as intended; on the other, they mention ongoing efforts by the search team to surface more websites. That implies the algorithm still has room to improve.
On June 3rd, SearchLiaison responded to claims of algorithmic actions against specific sites, asserting that no such actions had occurred. The response was prompted by a tweet that day from someone who reported being hit by an algorithm update on May 6th (the tweet contains a typo, giving the date as June 6th) and who expressed uncertainty about what to fix given the absence of a manual action.
The tweet describes a sudden, severe drop in site visibility with no manual action to explain it — a situation the author found particularly frustrating as someone in the gaming media industry.
Before turning to SearchLiaison’s response, it’s worth noting that the tweet reflects a common tendency to fixate on a single signal or explanation for a ranking drop, possibly overlooking the broader range of factors that can contribute to such declines.
In response, SearchLiaison acknowledged this perspective, reiterating two key points from an earlier, more comprehensive post: first, some site owners perceive algorithmic spam actions where none have occurred; second, manual actions are generally not something to wish for.
In that same response, SearchLiaison also acknowledged the potential for improvement within Google’s search algorithms and indicated ongoing research into enhancing performance. He expressed awareness of concerns, like those raised in the tweet, within the search team, emphasizing their commitment to exploring ways to enhance the search experience.
Additionally, John Mueller echoed similar sentiments in a tweet from the previous month, highlighting efforts by the team to evaluate how sites can perform better in search results in upcoming updates. He underscored the importance of showcasing content that creators have put real effort into and that is genuinely useful to users.
SearchLiaison said they’re exploring improvements, while Mueller said they’re assessing how sites might perform better in future updates. So how do we reconcile something that works as intended with something that still needs improvement?
One way to look at it: the algorithm does its job adequately, but it isn’t flawless. And imperfection leaves room for refinement, as it does with most things.
Takeaways:
- The potential for refinement doesn’t mean something is broken; it simply acknowledges that nothing is perfect.
- Helpfulness is just one signal among many. What looks like a helpfulness-related problem may not stem from that factor alone, so it’s worth exploring a broader range of possibilities.
Original news from SearchEngineJournal