Technical SEO Issue - Primary Pages

A few days back, we went through the value of having a technical SEO report, a snapshot of your website's technical SEO status. Crawling is just one of the challenges; we use Deepcrawl for that, its main advantages for this task being that it is a web crawler and that it can schedule monthly reports. The bigger challenge is usually reading the report. When it contains 50 to 100 issue labels, it is very hard for a professional inexperienced in SEO to decide whether those issues need attention and resources or not. Over the next few weeks, I will go through the most common issues and terms in a monthly report and try to give you some general guidance.

We are starting with Primary Pages. These are your highest-quality pages, with the greatest potential to attract organic traffic. Simply put, no other pages on your site are more important.

A Primary page is any page on the website that has unique content and is allowed to be indexed, or that is the primary page in a set of duplicates. In other words, these are all the pages that have the potential to represent you in Google.
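To make that rule concrete, here is a minimal Python sketch of the definition. The field names (url, status_code, noindex, canonical_url) are illustrative assumptions, not the actual schema of Deepcrawl's export or any other tool.

```python
# Minimal sketch of the Primary page rule described above.
# All field names are illustrative assumptions, not a real crawler's schema.

def is_primary(page: dict) -> bool:
    """A page is Primary if it loads, is allowed to be indexed,
    and is the primary version within its set of duplicates."""
    loads_ok = page["status_code"] == 200
    indexable = not page["noindex"]
    # A missing or self-referencing canonical means this URL is
    # the primary page among its duplicates.
    is_canonical_self = page["canonical_url"] in ("", page["url"])
    return loads_ok and indexable and is_canonical_self
```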


We need to pay attention to a few things in the Primary Pages report:

What is your ratio of Primary to All Pages

We care about this ratio to make sure we work with an optimized crawl budget. We want Googlebot to spend its crawl on Primary pages. All pages that are not Primary, such as Paginated, Non-200, or Failed URLs, should be noindexed, canonicalized, or improved so that they can join the Primary pages.

In short: if a page is not meant to show in Google results, it should not be spending crawl budget.
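A rough way to track this ratio from month to month is sketched below. The file name crawl_export.csv and its category column are assumptions; adapt them to whatever your crawler actually exports.

```python
import csv

def primary_ratio(path: str) -> float:
    """Share of crawled URLs that ended up in the Primary category."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return 0.0
    primary = sum(1 for row in rows if row["category"] == "Primary")
    return primary / len(rows)

# Example: print this month's ratio as a percentage.
print(f"Primary to All Pages: {primary_ratio('crawl_export.csv'):.1%}")
```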


Are important pages included

Check whether all the pages you expect to rank are included in the Primary list. If they are not, check which of the other categories they fell into. For example, Googlebot might have chosen another page as the canonical (original), so the page you care about is now marked as a duplicate. Other reasons might be that it is broken, too slow and failing to load, or erroneously noindexed or canonicalized as a copy.
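One hedged way to run this check is to compare a hand-maintained list of important URLs against the crawl export and print a likely reason when a page misses the Primary list. As in the earlier sketch, the column names are placeholders, and the values are the raw strings a CSV reader would give you.

```python
def audit_important_pages(important_urls, crawl_rows):
    """Report whether each important URL made the Primary list,
    and a likely reason when it did not."""
    by_url = {row["url"]: row for row in crawl_rows}
    for url in important_urls:
        row = by_url.get(url)
        if row is None:
            print(f"{url}: not found in the crawl at all")
        elif row["category"] == "Primary":
            print(f"{url}: OK, in the Primary list")
        elif row["status_code"] != "200":
            print(f"{url}: excluded, returns HTTP {row['status_code']}")
        elif row["noindex"] == "true":
            print(f"{url}: excluded, marked noindex")
        elif row["canonical_url"] not in ("", url):
            print(f"{url}: excluded, canonicalized to {row['canonical_url']}")
        else:
            print(f"{url}: excluded, categorized as {row['category']}")
```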

Are low-quality pages included

The reverse check is important too. See whether any low-quality pages have slipped into the Primary list. The most common case is automatically generated pages or UTM parameter pages making it into the list while the original is marked off as a duplicate.
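Here is a small sketch of that reverse check, flagging Primary URLs that carry UTM tracking parameters; the url and category fields are again placeholder names rather than a specific tool's schema.

```python
from urllib.parse import parse_qs, urlparse

def flag_utm_in_primary(crawl_rows):
    """Return Primary URLs carrying UTM parameters, a common sign
    that a tracking duplicate slipped in ahead of the clean URL."""
    flagged = []
    for row in crawl_rows:
        if row["category"] != "Primary":
            continue
        params = parse_qs(urlparse(row["url"]).query)
        if any(key.startswith("utm_") for key in params):
            flagged.append(row["url"])
    return flagged
```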

These three checks cover where your focus should go when examining the Primary Pages. Have fun.
