5 Common Google Indexing Issues & How to Fix Them
Vijay Chauhan
SEO Manager @AllEvents.in | Drove Traffic from 0 to 150M | International SEO Specialist | Technical SEO Expert | SEO Growth Strategist | Shaping Global SEO Strategies.
If you own a website or an e-commerce store, you may have noticed that Google regularly has trouble indexing new content: some of your pages do not appear in Google's results even though you put real effort into writing them and are convinced they are of interest to Internet users.
Your site is likely having indexing issues. Here are some leads to follow.
What is an indexed page on Google?
To begin with, you need to understand what an indexed page is. Google is a search engine: a system designed to retrieve information stored on the Internet. That information is stored in an index, which is built and updated by robots called “web crawlers”. When a user enters keywords, the search engine looks in its index for web pages that contain those words.
To be discovered through search engines, the pages of your website must be indexed. When a page is indexed, it means a search engine has analyzed it, recorded its content in a database, and can offer it to users in search results. Pages are identified by indexing robots, and their content is attributed to their URL.
By contrast, a page that is not indexed will not be displayed on search engines: Internet users will only find it by browsing your site directly. Indexing content is sometimes a long process, and certain technical issues can slow it down significantly.
Read How Search organizes information
How do you know if your site has indexing problems?
There are several ways to find out if your site or a particular page is having indexing issues.
The first is to analyze the Coverage report in your Google Search Console, provided it is configured! The Coverage report gives you a lot of information about indexing errors and issues, in particular:
- indexed pages
- excluded pages
It is in this report in particular that you can identify the pages of your site that are not indexed. If your strategic pages are listed under “Crawled - currently not indexed” or “Discovered - currently not indexed”, your site may be experiencing indexing issues.
A second technique is to use the command site:https://yourwebsitename.com in the Google search bar. You will then see the pages of your site that are present in Google's index. To get an idea of the gap, compare the actual number of pages found by a crawl (with Screaming Frog, for example) with the number of pages indexed on search engines. You can thus identify the pages that are not indexed.
The technical reasons that block indexing on Google
1) A new website
How to make your site appear on Google is an essential question for any webmaster. “My website has been online for about a week, but I still can't find it on Google. How is that possible? What can I change? What should I do if I want my site to be indexed quickly by search engines as soon as it launches?”
If you have a new site, you may have to wait a few weeks for it to be listed by the major search engines. This delay is normal and cannot be avoided. To facilitate the indexing work, you can submit your site to Google to announce its creation. You can also verify that your sitemap.xml file has been created and is accessible, as it makes the search engines' indexing work easier. To do this, add the address of your sitemap in Search Console.
2) Blocking via robots.txt
In most cases, the robots.txt file consists of rules that prohibit robots from crawling a certain directory, category of pages, or part of a website.
If multiple pages or your entire site fail to index despite waiting a long time after creating your pages and submitting the sitemap, you may have blocked those pages in your robots.txt file. You must therefore quickly check whether your pages are blocked. This must be a priority before implementing any other solution, paid or not, to improve the indexing of a website. Logically, this file is located at the root of the site (www.yourwebsite/robots.txt); check that your unindexed pages are not blocked by this file.
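As a rough sketch, a robots.txt that blocks part of a site could look like the following; the paths and domain are hypothetical placeholders, not recommendations:

# Hypothetical rules for illustration; adjust the paths to your own site
User-agent: *
Disallow: /private/
Disallow: /drafts/
Sitemap: https://www.yourwebsite.com/sitemap.xml

If a page you want indexed falls under one of the Disallow rules, Googlebot will not crawl it, so the fix is to remove or narrow that rule.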
Search Console allows you to test your robots.txt file. In the Coverage report, under Excluded > “Discovered - currently not indexed”, click on one of your unindexed URLs and test whether it is blocked by robots.txt. You can then determine whether it is this file that is blocking the indexing of your page.
3) Blocking via .htaccess file
It is sometimes necessary to protect access to a directory on a web server without advertising it to everyone in robots.txt. To do this, a few directives are placed in the .htaccess file, a configuration file for Apache HTTP servers that is part of your site, which blocks the display of a page until a login and password are entered. You must therefore check that your page is not protected in this way and that your content is accessible to Google's robots.
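For illustration only, password protection in .htaccess usually relies on Apache's basic authentication directives, along the lines of this sketch (the .htpasswd path is a made-up example):

# Hypothetical basic-auth block; the .htpasswd path is only an example
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/yoursite/.htpasswd
Require valid-user

A URL covered by such a block answers Googlebot with a 401, and since the robot cannot enter credentials, the page will not be indexed.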
4) The page is blocked by a meta tag
The robots meta tag allows you to block a page from all search engines. Its noindex directive can be a very powerful tool if you don't want some of your site's URLs to appear in Google's search results. If, on the contrary, you want your pages indexed, you must remove this noindex; otherwise, you will have no chance of finding your page on the search engines.
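To make this concrete, here is a generic sketch of what the tag looks like in the page's <head> (not taken from any particular site):

<!-- Prevents all search engines from indexing this page -->
<meta name="robots" content="noindex">
<!-- Default behaviour (may be omitted): the page can be indexed and its links followed -->
<meta name="robots" content="index, follow">

Removing the noindex value (or the whole tag, since indexing is the default behaviour) makes the page eligible for indexing again.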
This is often the case when launching a WordPress site and forgetting to uncheck the “Discourage search engines from indexing this site” box. To check this point, go to your WordPress admin, Settings section, then Reading, under Search Engine Visibility.
If you do want to block the indexing of a page, make sure you add the robots meta tag before blocking the page in robots.txt. If you do the reverse, crawlers won't crawl your page and will never see your noindex tag.
5) Pages with JavaScript
To find out whether your page relies on JavaScript, try the free Web Developer browser extension. In the “Disable” tab, click “Disable JavaScript” and refresh your page. If no content appears on the screen anymore, it means the textual rendering of your web page is done with JS and indexing robots may be unable to read the content of your page. However, robots sometimes still take your content into account despite the JavaScript. To verify this, check the preview of your cached page: type “cache:https://urldemonsite.com” in the search bar and you will see your page from the point of view of a robot.
For Google, there is no point in indexing an empty page. You must therefore quickly make your content accessible and readable for Google's robots so that they can crawl and index your pages.
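To illustrate the problem, here is a minimal, hypothetical page whose visible text exists only after JavaScript runs; a crawler that does not execute JS sees an essentially empty document:

<!DOCTYPE html>
<html>
  <head><title>My page</title></head>
  <body>
    <div id="content"></div> <!-- empty until the script runs -->
    <script>
      // The actual content is injected client-side
      document.getElementById("content").textContent =
        "This text is only visible once JavaScript has executed.";
    </script>
  </body>
</html>

Server-side rendering or pre-rendering the critical content into the initial HTML is the usual way to make such a page readable for crawlers.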
6) Canonical URL tags
Analyze the canonical tag of your page. This tag lets you indicate to Google, when two pages have identical content, which page the search engine should consider the original. If you notice that the URL in this tag points to another page, chances are your page will not be indexed: Google will take the page it references and index that one instead.
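As a generic example, the tag sits in the <head> of the page and points to the URL you consider the original (the address below is a placeholder):

<!-- Hypothetical example: this page declares another URL as the canonical original -->
<link rel="canonical" href="https://www.yourwebsite.com/original-page/">

If that href points to a different page, Google is free to index that page instead of the one carrying the tag, so each page you want indexed should have a canonical pointing to itself (or none at all).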
How to speed up the indexing of your pages on Google?
We have just seen some of the technical problems that can slow down or block the indexing of your site's pages. This list is not exhaustive, and your site may still have indexing problems even after addressing them. Here are some recommendations to make the indexing of your site easier.
Publish quality content
We can't say it enough: quality content is essential if you want your page to appear in the SERP (Google's results pages). Quality content is useful to Internet users and generates engagement. Address your target audience and use an appropriate tone; you do not speak to professionals the way you speak to individuals, start-ups, or large companies. In short, adapt your message to make sure you are understood.
Your content must address a problem and answer it clearly. In most cases, Internet users are looking for an answer. If you provide it, you have a better chance that Google will serve your page, and therefore that it gets indexed.
Needless to say, you should avoid duplicate content: in the best case your page will not be indexed, and in the worst case you could suffer a penalty from Google.
Update your sitemap.xml file
Check that your page is in your sitemap! This XML file is placed at the root of your site and must include all of your pages; it helps Google detect them. In addition, you can submit the file to Google through Search Console. If your site is not being indexed, you can consult the Coverage report, which explains the errors Google encountered while crawling your site.
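Here is a minimal sitemap.xml sketch, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Placeholder URLs and dates, for illustration only -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourwebsite.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.yourwebsite.com/your-new-page/</loc>
    <lastmod>2024-01-20</lastmod>
  </url>
</urlset>

Each <loc> entry is a page you want Google to discover; keep the file up to date as you publish new pages.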
Make internal links to your page
Make sure your page is well linked to other pages on your site. To find new pages, Google's robots start from your home page and follow the various links present in your content. If no page links to the page you want indexed, robots will never reach it and it will not be able to appear in Google's index.
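Concretely, an internal link is simply an ordinary anchor placed in the HTML of a page Google already knows, pointing to the page you want discovered (the path below is a hypothetical example):

<!-- On a page Google already knows, such as the home page or a category page -->
<a href="/your-new-page/">Read our new guide</a>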
Make backlinks to your page
We have just seen that, to facilitate the indexing of your page, you should build internal links to it. Another solution is to get a backlink: a link from a third-party site to your page. We know that Google uses external links to gauge the popularity of a site. These links also help Google detect your pages and speed up indexing.