Developing Technical SEO Requirements For Filtered Pages
Arun Mahendran
Growth at Trebound & Monk Mantra | SEO Strategist | Growth Marketer | Marketing Science | Webflow Devotee | Fractional CMO | Reforge'24 | Harvard Business School - DMS | London School Of Business
I recently came across a question that was tricky, interesting, and challenging.
Below is the question:
Need help with developing technical SEO requirements for filtered pages?
I am an SEO specialist working for a large e-commerce company. We are going to enable filtered-state URLs on all of our product category pages, so users will see the clicked filter as a parameter in the URL. The purpose is to let users easily see the selected filter in the URL and share the filtered list of products. It's primarily aimed at improving the user experience, but obviously there will be SEO implications. I've been tasked with developing technical requirements for this project, and the immediate step would be to disallow crawling of filtered URLs at the initial launch via robots.txt. I'm not sure if there'd be any additional requirements since we'd just be disallowing them, but are there any crucial pieces to the 'requirement' that I may be missing here? Insights would be much appreciated. Thank you.
Below is my response:
Here are some insights and recommendations:
1. Robots.txt: It's good that you're thinking of disallowing the crawling of filtered URLs at the initial launch. However, please remember that robots.txt will prevent crawling but not necessarily indexing. If Google discovers these URLs from other sources or through direct links, they can still appear in the index.
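As a sketch, a disallow rule like the following would keep compliant crawlers away from filter-parameter URLs. The parameter names (color, size, brand) are hypothetical; substitute whatever your platform actually appends:

```text
User-agent: *
# Block any URL carrying one of the (hypothetical) filter parameters
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*?brand=

# If clean category URLs never use a query string at all, a single
# wildcard rule covers every filtered state instead:
# Disallow: /*?*
```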
2. Canonical Tags: Ensure you use canonical tags on your filtered pages, pointing back to the main category page. This signals to search engines that the main category page is the "master" version and helps prevent duplicate content issues. Keep in mind, though, that a canonical tag is only seen if the page can be crawled, so it has no effect on URLs you disallow in robots.txt.
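For example, on a hypothetical filtered URL such as /shoes?color=red, the head of the page would carry:

```html
<!-- Example canonical on a filtered page; example.com and the /shoes
     path are placeholders for your real category URL. -->
<link rel="canonical" href="https://www.example.com/shoes" />
```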
3. Parameter Handling in Google Search Console (GSC): Be aware that Google retired the URL Parameters tool in Search Console in 2022, so you can no longer tell Google there how to treat individual parameters. Google now handles most parameter URLs automatically, which makes the other controls on this list (canonicals, robots.txt, internal linking) your main levers.
4. Noindex Meta Tag: If you want to ensure these filtered pages don't get indexed, consider a "noindex" meta tag (or an X-Robots-Tag HTTP header) on the filtered pages. Note, however, that noindex and a robots.txt disallow work against each other: if a URL is blocked in robots.txt, Googlebot never fetches the page and so never sees the noindex directive. Choose one mechanism per phase rather than combining them.
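A minimal sketch of the tag form, assuming you opt for noindex and therefore leave the URLs crawlable:

```html
<!-- In the <head> of a filtered page -->
<meta name="robots" content="noindex, follow" />
```

The same directive can also be sent as an HTTP response header (X-Robots-Tag: noindex), which is useful when editing page templates is awkward.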
5. Faceted Navigation: Be cautious about the potential growth of URLs due to faceted navigation. If users can select multiple filters simultaneously, this could exponentially increase the number of possible URLs. This can lead to crawl depth and crawl waste issues. Using AJAX or other methods to dynamically load results without changing the URL could be a consideration.
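To see how quickly facets multiply, here is a small back-of-the-envelope script. The facet names and value counts are made up for illustration, and it assumes each facet is either unset or set to exactly one value (multi-select makes the numbers far worse):

```python
from math import prod

def url_state_count(facets: dict[str, int]) -> int:
    """Count distinct filter-state URLs when each facet is either
    unset or set to exactly one of its values."""
    return prod(n + 1 for n in facets.values())

# Hypothetical facets on a single category page
facets = {"color": 12, "size": 8, "brand": 40, "price": 6}

# (12+1) * (8+1) * (40+1) * (6+1) = 33,579 crawlable URL states
# for just one category page.
print(url_state_count(facets))
```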
6. Breadcrumb Schema: Using breadcrumb schema markup will help search engines understand the page hierarchy and could assist users in the SERPs.
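For example, a page under a hypothetical /shoes category might carry breadcrumb markup like this (URLs and names are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Shoes",
      "item": "https://www.example.com/shoes" }
  ]
}
```

This block would be embedded in a script tag of type "application/ld+json" in the page head.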
7. Pagination: If your filtered pages result in pagination, note that Google confirmed in 2019 that it no longer uses rel="next" and rel="prev" as indexing signals, though other search engines may still read them. Instead, make sure each paginated page is reachable through plain crawlable links and carries a self-referencing canonical, rather than canonicalizing every page back to page one.
8. Sitemap: Ensure your filtered pages aren't included in your XML sitemaps. Only the main canonical pages should be included to focus crawlers on your most important content.
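A quick sanity check along these lines can be scripted: pull the URLs out of your sitemap's <loc> elements and flag any that carry a query string. The sample URLs below are hypothetical stand-ins for a parsed sitemap:

```python
from urllib.parse import urlparse

def filtered_urls(urls: list[str]) -> list[str]:
    """Return sitemap entries carrying a query string -- likely
    filtered-state URLs that should not be in the sitemap."""
    return [u for u in urls if urlparse(u).query]

# Hypothetical URLs as they might come out of an XML sitemap
sitemap_urls = [
    "https://www.example.com/shoes",
    "https://www.example.com/shoes?color=red",  # filtered state: flag it
    "https://www.example.com/boots",
]

print(filtered_urls(sitemap_urls))  # → ['https://www.example.com/shoes?color=red']
```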
9. Test, Test, Test: Before fully launching, test your implementation. Check how search engines are accessing and interpreting the filtered URLs; crawling tools like Screaming Frog or DeepCrawl can help with this.
10. Analytics & Tracking: Make sure you can track usage of these filters in your analytics platform. This will help you gauge user interest in specific filters and adjust your strategy accordingly.
In conclusion, while disallowing crawling of the filtered URLs is a good starting point, you should consider the additional recommendations above to ensure you cover all aspects of potential SEO implications. Best of luck with your implementation!
Is there anything else that you would recommend?
#TechnicalSEO #RobotsTxt #CanonicalTags #GoogleSearchConsole #Noindex #FacetedNavigation #BreadcrumbSchema #PaginationSEO #SitemapOptimization #SEOtesting #AnalyticsTracking #SEOStrategy #EcommerceSEO #URLParameterHandling #SEOTips #DuplicateContent #CrawlDepth #SEOTools #SERPs #OnPageSEO