Exactly How Many Major “Sitewide” Signals Does Google Have?

How can your site’s reputation and overall quality be hurting your important pages?

For years, Google insisted that its major signals (relevance and PageRank) are page-level. It was supposedly looking at pages, not sites.

Well, that is not exactly true. In fact, domain-level metrics matter a great deal, and Google started talking about them as far back as the early 2010s.

What do we know about site-level signals, and more importantly, why do we care?

Well, if you are asking yourself the second question, you are probably new to SEO. The fact that Google looks at sites, not just pages, changes everything. But let’s look at each signal separately to understand just how much.

Note: None of these signals is straightforward. Each may in fact be a thousand smaller signals we will never know about. What follows is how Google has explained them to us; the reality is obviously much more complicated.

Topical Authority

As the term suggests, topical authority means being the go-to resource for, or a well-known expert on, a particular topic (or both).

Google doesn’t often use this term in its official documentation. The clearest mention of it was in the Google News guidelines, which state that topical authority on each story helps pages rank higher (presumably even if the page is not the original source that broke the news):

  • “How notable a source is for a topic or location” (this is the entity classification).
  • “How original reporting … is cited by other publishers” (these are backlinks/citations that help Google estimate the authoritativeness of that resource).
  • “...a source’s history of high-quality reporting, or recommendations from expert sources, such as professional societies [and] journalistic awards” (these are references from other entities Google knows about).

Google’s Quality Rater Guidelines mention topical authority, or resource authoritativeness, many times as well. For example, for the query [hotels], here’s how the guidelines explain why “Marriott” qualifies as a good candidate for ranking higher:

Page Quality: The Marriott website gives information on Marriott Hotels, a popular chain of hotels. Marriott has a good reputation and is an expert on Marriott hotels, making the information on this page highly authoritative. High+ to Highest is an appropriate rating.


Another example in the guidelines suggests checking the backlinks of higher-ranking pages. For example:

Positive reputation information: Users in the U.S. can obtain free credit reports on this website by providing their Social Security Number. Note that the Wikipedia article tells us that “AnnualCreditReport.com is the only federally mandated and authorized source for obtaining a free credit report.”
Linked mentions help topical authority!

Google’s Helpful Content guidelines also suggest that our sites need to have a primary topic or focus, which indirectly supports the topical authority concept:

Does your site have a primary purpose or focus?

Both the Google News article and the Quality Rater Guidelines suggest that for higher rankings your site needs to:

  • Make sure your site has a topic it covers consistently (don’t stray too far outside of it)
  • Have a reputation as a trusted resource
  • Be a recognizable brand within that niche
  • Have references from other trusted sources (like Wikipedia or other recognized entities)

This site-level signal creates two major challenges for SEOs and website owners:

  • It makes it harder for smaller and newer sites to rank even when they are original sources of information or provide higher-quality content (those hidden gems Google keeps saying it is determined to surface)
  • It makes it harder to identify why a particular URL is outranking other URLs that may provide a better experience.

The key here is, of course, to work on your site’s topical authority, but that is neither fast nor easy.

Content Quality

Since Panda, we’ve known that poor-quality content (especially a high percentage of it) can hurt the whole site. Google’s Michael Wyszomierski confirmed this a while ago:

it’s important for webmasters to know that low quality content on part of a site can impact a site’s ranking as a whole.

Helpful Content updates may hit the whole site as well if the algorithm finds a large enough part of it “unhelpful”:

Any content — not just unhelpful content — on sites determined to have relatively high amounts of unhelpful content overall is less likely to perform well in Search.

Just like with Panda, Helpful Content poses a clear problem in these cases: if the whole site is affected, how do we determine which content is actually deemed unhelpful?

Google’s recommendation here is to revamp your whole site and your whole content strategy. I’d say this recommendation is itself pretty unhelpful, but it is what we have to work with.
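Since Google doesn’t tell us which pages drag a site down, a common workaround (my assumption about what a practical audit looks like, not anything Google has published) is to flag pages by proxy metrics, such as pages that are both thin and receive no organic traffic, and then decide whether to improve, consolidate, or remove them. A minimal sketch with made-up thresholds:

```python
# Minimal content-audit sketch: flag likely "unhelpful" pages using
# proxy metrics. The field names and thresholds here are illustrative
# assumptions, not signals Google has confirmed.

def audit_pages(pages, min_words=300, min_clicks=1):
    """Return URLs of pages that are both thin and get no organic traffic."""
    flagged = []
    for page in pages:
        thin = page["word_count"] < min_words
        no_traffic = page["organic_clicks_90d"] < min_clicks
        if thin and no_traffic:
            flagged.append(page["url"])
    return flagged

# Hypothetical export, e.g. merged from a crawler and Search Console.
pages = [
    {"url": "/guide", "word_count": 2400, "organic_clicks_90d": 310},
    {"url": "/tag/misc", "word_count": 80, "organic_clicks_90d": 0},
    {"url": "/old-news", "word_count": 150, "organic_clicks_90d": 0},
]

print(audit_pages(pages))  # ['/tag/misc', '/old-news']
```

The point of the sketch is prioritization: instead of guessing which content the algorithm dislikes, you start with the pages least likely to be helping anyone.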

Site’s Reputation/Historical Records

This is the most interesting part for me because it implies that Google keeps a record for each site of whether it has proven itself trustworthy or not.

Google has been telling us about sites’ reputation since at least 2010: Quality links to your site | Google Search Central Blog

There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care for your site's reputation.

So we know the site’s reputation exists…

A bigger question here is: for how long can your site’s poor reputation (i.e. its past) keep hurting you?

In a Google office-hours video, John Mueller contrasts fixing technical issues (of which Google keeps no memory) with fixing website quality, which takes much more time. This, I assume, means that Google does have a memory of low quality, which in turn confirms that Google assigns some kind of “reputation” to the whole site:

…it takes a lot of time for us to recognize that maybe something is not as good as we thought it was.
Similarly, it takes a lot of time for us to learn the opposite again.
And that’s something that can easily take, I don’t know, a couple of months, a half a year, sometimes even longer than a half a year, for us to recognize significant changes in the site’s overall quality
So that’s something where I would say, compared to technical issues, it takes a lot longer for things to be refreshed in that regard.

So what’s the solution here? Sending high-quality signals will likely remedy things faster: with each high-quality link, Google will revisit the site and assign more trust signals, which will eventually outweigh its past mistakes. Creating and promoting high-quality linkable content is the way to go.

Finally, if you haven’t had enough, there are quite a few Google patents that discuss site quality.

Let me know what you think!

Ann Smarty

Co-Founder of Smarty Marketing, a boutique SEO agency based in NY


This is also the topic of our Wednesday (tomorrow) LIVE discussion here on LinkedIn. Please join us -> https://www.dhirubhai.net/events/thesitequalityingoogle-sorganic7155077904551284736/comments/
