What are the errors listed in Google Search Console and how can they be resolved?
Prashant Kumar
Digital Marketing Manager | Driving Results, Strategy, and Growth in the Digital Landscape
If Google's web crawler, Googlebot, runs into a problem while crawling your website and cannot make sense of a page, it will abandon that page and move on. As a result, the page will not be indexed and won't appear in search results, which can have a significant impact on your search rankings.
Outlined below are some of the errors that can lead to this situation; focusing your efforts here is a great place to start.
How to Fix a Server Error (5xx):
If you encounter a server error (5xx), something went wrong on the website's server, preventing it from fulfilling Googlebot's request. To troubleshoot, first check whether you can load the page in your browser. If you can, contact your IT team or hosting company and ask whether the server has experienced any outages or whether any configuration is blocking access to the site.
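If you want to check this from the command line as well, here is a minimal sketch in Python that reports whether a URL currently answers with a 5xx status. It assumes the third-party requests library is installed, and the URL is just a placeholder.

```python
# Minimal sketch: check whether a URL currently returns a 5xx server error.
# Assumes the third-party "requests" library is installed; the URL is a placeholder.
import requests

def check_server_error(url: str) -> None:
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"Request failed entirely (possible outage): {exc}")
        return
    if 500 <= response.status_code < 600:
        print(f"{url} returned {response.status_code} - ask your host about outages or blocks.")
    else:
        print(f"{url} returned {response.status_code} - no server error right now.")

check_server_error("https://www.example.com/some-page")
```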
How to Fix a Redirect Error:
If you experience a redirect error, it means the URL redirects but Google could not follow it to a final destination. This usually happens when the redirect chain is too long, the redirects form a loop, or the redirect points to a URL that is itself too long. To fix this, make the redirect go directly to the final URL and eliminate all the steps in between.
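Here is a minimal sketch, assuming the requests library and a placeholder URL, that follows a URL's redirects and prints each hop so you can spot long chains or loops:

```python
# Minimal sketch: follow a URL's redirects and report each hop, so you can spot
# long chains or loops. Assumes the "requests" library; the URL is a placeholder.
import requests

def inspect_redirects(url: str, max_hops_warning: int = 3) -> None:
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = response.history  # each intermediate 3xx response, in order
    for hop in hops:
        print(f"{hop.status_code}: {hop.url} -> {hop.headers.get('Location')}")
    print(f"Final URL: {response.url} ({response.status_code})")
    if len(hops) > max_hops_warning:
        print("Long redirect chain - point the original URL straight at the final destination.")

inspect_redirects("https://www.example.com/old-page")
```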
Submitted URL blocked by robots.txt:
If your submitted URL is blocked by robots.txt, it means that there is a line of code in your robots.txt file that is preventing Google from crawling the page. To fix this, test your page using the robots.txt tester and remove the line if you want the page to be indexed. If you don't want the page to be indexed, check your sitemap.xml file to see if the URL is listed there and remove it if necessary.
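If you'd like to test this locally in addition to the robots.txt tester, here is a minimal sketch using only Python's standard library; the site and page URLs are placeholders:

```python
# Minimal sketch: test whether Googlebot is allowed to crawl a URL according to
# your robots.txt, using only the standard library. URLs are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

page = "https://www.example.com/blog/my-post/"
if robots.can_fetch("Googlebot", page):
    print("Googlebot may crawl this page - robots.txt is not the problem.")
else:
    print("Blocked by robots.txt - remove the matching Disallow rule if you want it indexed.")
```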
Submitted URL marked ‘noindex’:
Finally, if your submitted URL is marked 'noindex', the page carries a 'noindex' directive that prevents it from being indexed. To fix this, remove the noindex meta tag or HTTP header. You're sending Google mixed signals: "Index me… no, DON'T!" Check your page's source code and look for the word "noindex". If you see it, go into your CMS and look for a setting that removes it, or find a way to modify the page's code directly.
It's also possible to noindex a page through an HTTP response header via an X-Robots-Tag, which is a bit trickier to spot if you're not comfortable working with developer tools.
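As a rough way to check both places at once, here is a minimal sketch that looks for a noindex directive in the X-Robots-Tag header and in the meta robots tag. It assumes the requests library, uses a simple regex rather than a full HTML parser, and the URL is a placeholder.

```python
# Minimal sketch: check a page for a noindex directive in both the HTML meta
# robots tag and the X-Robots-Tag response header.
# Assumes the "requests" library; the URL is a placeholder.
import re
import requests

def find_noindex(url: str) -> None:
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', response.text, re.IGNORECASE)
    meta_tag = meta.group(0) if meta else ""
    if "noindex" in header.lower():
        print(f"X-Robots-Tag header contains noindex: {header}")
    if "noindex" in meta_tag.lower():
        print(f"Meta robots tag contains noindex: {meta_tag}")
    if "noindex" not in header.lower() and "noindex" not in meta_tag.lower():
        print("No noindex directive found in the header or meta robots tag.")

find_noindex("https://www.example.com/blog/my-post/")
```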
Submitted URL seems to be a Soft 404:
After you submitted this page for indexing, the server returned a response that looks like a soft 404. Soft 404 errors occur when a page appears broken or empty to search engines like Google, but the server doesn't return a proper 404 Not Found status code.
These errors usually arise in two situations: when you have a category page with no content, or when your website's theme generates pages that are not supposed to exist. To address this issue, you can either turn these pages into proper 404 pages, redirect them to their new location, or add some relevant content to them.
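One crude way to spot likely soft 404s is to flag pages that answer with HTTP 200 but contain almost no text. The sketch below assumes the requests library; the 200-word threshold is an arbitrary assumption, not a Google rule, and the URL is a placeholder.

```python
# Minimal sketch of a soft-404 check: the page answers with HTTP 200 but has
# almost no text content. The 200-word threshold is an arbitrary assumption.
# Assumes the "requests" library; the URL is a placeholder.
import re
import requests

def looks_like_soft_404(url: str, min_words: int = 200) -> bool:
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        return False  # a real error status is not a *soft* 404
    text = re.sub(r"<[^>]+>", " ", response.text)  # crude tag stripping
    word_count = len(text.split())
    print(f"{url}: status {response.status_code}, ~{word_count} words")
    return word_count < min_words

if looks_like_soft_404("https://www.example.com/category/empty/"):
    print("Thin page returning 200 - add content, redirect it, or serve a real 404.")
```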
Submitted URL returns unauthorized request (401):
The URL you submitted for indexing returned an unauthorized request (401). To resolve this, either remove the authorization requirement for the page or allow Googlebot to access it by verifying its identity. This warning is typically triggered when Google tries to crawl a page that is only accessible to a logged-in user. It's best to remove any such URLs from your sitemap so you don't waste Google's crawl resources.
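Here is a minimal sketch, assuming the requests library and placeholder URLs, that flags submitted URLs returning 401 so you can pull them out of your sitemap:

```python
# Minimal sketch: flag submitted URLs that sit behind a login and return 401,
# so they can be removed from the sitemap.
# Assumes the "requests" library; the URLs are placeholders.
import requests

submitted_urls = [
    "https://www.example.com/account/dashboard/",
    "https://www.example.com/blog/public-post/",
]

for url in submitted_urls:
    status = requests.get(url, timeout=10).status_code
    if status == 401:
        print(f"{url} returns 401 - remove it from the sitemap or lift the login requirement.")
    else:
        print(f"{url} returns {status}")
```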
Submitted URL not found (404):
If you're submitting a URL for indexing, it should also be listed in your sitemap, so check the sitemap first to confirm the URL is actually there.
If, on the other hand, you submitted a URL that no longer exists, you'll receive a "Submitted URL not found" (404) error. To prevent this, remove any pages you've deleted from your website from the sitemap as part of regular sitemap maintenance.
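As part of that maintenance, you can script a quick audit. The sketch below assumes the requests library, a simple (non-index) sitemap, and a placeholder sitemap URL; it reports any listed URL that now returns 404.

```python
# Minimal sketch: read sitemap.xml and report any listed URL that returns 404,
# as part of regular sitemap maintenance. Assumes the "requests" library and a
# simple sitemap (not a sitemap index); the sitemap URL is a placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def audit_sitemap(sitemap_url: str) -> None:
    xml = requests.get(sitemap_url, timeout=10).text
    root = ET.fromstring(xml)
    for loc in root.iter(f"{SITEMAP_NS}loc"):
        url = loc.text.strip()
        # HEAD keeps the audit light; swap for GET if your server rejects HEAD.
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status == 404:
            print(f"REMOVE from sitemap (404): {url}")
        else:
            print(f"OK ({status}): {url}")

audit_sitemap("https://www.example.com/sitemap.xml")
```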
The submitted URL has a crawl issue:
The URL you submitted for indexing encountered a crawling error that Google could not classify. To resolve this issue, you can utilize the URL Inspection tool to debug the page. It's possible that something obstructed Google's ability to fully download and display your page's content.
You may also want to run a live test in the URL Inspection tool (the successor to the old Fetch as Google tool) to identify any discrepancies between what Google renders and what you see when you load the page in your browser. If your page relies heavily on JavaScript to load content, that may be the issue, since JavaScript-dependent content is harder for search engines to render reliably. A long page load time or blocked resources could also be to blame.
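A very rough first check you can run yourself: fetch the raw HTML (roughly what a crawler downloads before rendering), time the response, and count how much visible text it contains. The thresholds below are arbitrary assumptions, the URL is a placeholder, and the requests library is assumed.

```python
# Minimal sketch: fetch the raw HTML, report how long the request took and how
# much visible text it contains. Very little text in the raw HTML suggests the
# content is injected by JavaScript; a slow response can disrupt crawling.
# Assumes the "requests" library; the URL and thresholds are placeholders.
import re
import requests

def quick_crawl_check(url: str) -> None:
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()
    text = re.sub(r"<script.*?</script>|<style.*?</style>|<[^>]+>", " ",
                  response.text, flags=re.DOTALL | re.IGNORECASE)
    words = len(text.split())
    print(f"Status {response.status_code}, fetched in {seconds:.2f}s, ~{words} words of raw text")
    if seconds > 5:
        print("Slow response - long load times can disrupt crawling.")
    if words < 100:
        print("Very little text in the raw HTML - content may depend on JavaScript rendering.")

quick_crawl_check("https://www.example.com/blog/my-post/")
```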