Technical SEO Issue - Primary Pages
Lily Grozeva
Driving SEO for B2B tech companies like Neo4j, Cribl, and Extreme Networks. Excited about helping brands transition from classic Google SEO to AI Search.
A few days back, we went through the value of having a technical SEO report, a snapshot of your website's technical SEO status. Crawling is just one of the challenges; we use Deepcrawl for it, its main advantage for this task being that it is a web crawler that lets us schedule the monthly reports. The bigger challenge is usually reading the report. When a report contains 50 to 100 issue labels, it is very hard for a professional inexperienced in SEO to decide whether those issues need attention and resources or not. In the next few weeks, I will go through the most popular issues and terms in a monthly report and try to give you some general guidance.
We are starting with the Primary Pages. These are your highest-quality pages with the greatest potential to attract organic traffic. You have no more important pages than these.
A primary page is any page on the website that has unique content and is allowed for indexing, or that is the primary page in a set of duplicates. In other words, these are all the pages that have the potential to represent you in Google.
We need to pay attention to a few things in the Primary Pages report:
What ratio of Primary to All Pages you have
We care about this ratio to make sure we are working with an optimized crawl budget. We want Googlebot to focus its crawling on Primary pages. All pages that are not Primary, such as paginated, non-200, and failed URLs, should be noindexed, canonicalized, or improved so they can join the Primary pages.
In short: if we are not showing a page in Google results, it is not for crawling.
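As a quick way to keep an eye on this ratio between reports, you can compute it from a crawl export. This is a minimal sketch assuming a hypothetical CSV export with `url` and `category` columns, where the category label `Primary` matches the report; Deepcrawl's real export schema may differ.

```python
import csv
from collections import Counter

def primary_ratio(crawl_export_path):
    """Count crawled pages per category and return the
    Primary count, the total, and the Primary-to-All ratio.

    Assumes a CSV with 'url' and 'category' columns, where
    Primary pages are labeled 'Primary' (a hypothetical schema).
    """
    counts = Counter()
    with open(crawl_export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["category"]] += 1
    total = sum(counts.values())
    primary = counts.get("Primary", 0)
    ratio = primary / total if total else 0.0
    return primary, total, ratio
```

Running this monthly and charting the ratio makes it easy to spot a sudden influx of non-Primary URLs eating crawl budget.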
Are important pages included
Check whether all pages you expect to rank are included in the Primary list. If they are not, check which of the other categories they fell into. For example, Googlebot might have chosen another page as the canonical (original), and the page you care about is now marked as a duplicate. Other reasons might be that it is broken, too slow and failing to load, or erroneously noindexed or canonicalized as a copy.
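The last two failure modes, an accidental noindex or a stray canonical, can be spot-checked directly in a page's HTML. This is a minimal stdlib sketch; `check_indexability` is a hypothetical helper, and in practice you would fetch the HTML with your crawler or an HTTP client first.

```python
from html.parser import HTMLParser

class IndexabilityParser(HTMLParser):
    """Collect the meta robots directive and the rel=canonical
    href from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots = attrs.get("content") or ""
        if tag == "link" and "canonical" in (attrs.get("rel") or "").lower():
            self.canonical = attrs.get("href")

def check_indexability(html, page_url):
    """Flag the two issues described above: an erroneous noindex,
    and a canonical pointing at a different URL."""
    parser = IndexabilityParser()
    parser.feed(html)
    issues = []
    if parser.robots and "noindex" in parser.robots.lower():
        issues.append("noindexed")
    if parser.canonical and parser.canonical != page_url:
        issues.append("canonicalized to " + parser.canonical)
    return issues
```

An empty result means the page at least declares itself indexable and self-canonical, though Google may still pick a different canonical on its own.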
Are low quality pages included
The reverse check is important as well. Look for low-quality pages that have slipped into the Primary list. The most common case here is automatically generated pages or UTM-parameter URLs making the list while the original page is marked off as a duplicate.
These three checks cover where your focus should go when examining the Primary Pages. Have fun.