Cadence #70 - Google & Links...

Hello and welcome to another weekly Cadence newsletter.

This week, a couple of weeks late to the party, I'm covering remarks made by Google's Gary Illyes in a podcast about how Google actually "follows" links.

In a recent podcast, Google's Gary Illyes clarified that Googlebot doesn't "follow" links as many believe. Instead, it collects links and processes them later.

This contradicts Google's official documentation, which implies real-time link-following, and could change how SEO professionals think about crawl budgets, site architecture, and crawl frequency.

The discrepancy highlights the importance of staying updated on Google's actual practices, as Googlebot's behavior may be more nuanced than previously assumed.

Googlebot's Link Processing

According to Illyes, when Googlebot encounters a link on a webpage, it doesn't immediately follow that link.

Instead, Googlebot stores the link for later processing.

This approach suggests that the notion of Googlebot "following" links during a crawl might be somewhat misleading.

The process is more about collecting links and determining later which ones to crawl, based on various factors like site authority, content relevance, and crawl budget.
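The collect-then-process model described above can be sketched in a few lines of Python. This is a simplified illustration of the general pattern, not Google's actual implementation: the parser only gathers links, and a separate, later scheduling step decides which ones to crawl.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags instead of following them."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def collect_links(base_url, html):
    """Parse a page and return the absolute URLs found in it.

    Nothing is fetched here -- links are merely recorded, to be
    scheduled for crawling in a separate pass.
    """
    parser = LinkCollector()
    parser.feed(html)
    return [urljoin(base_url, link) for link in parser.links]

def schedule(discovered, crawl_queue, seen):
    """Later pass: decide which collected links to enqueue for crawling."""
    for url in discovered:
        if url not in seen and urlparse(url).scheme in ("http", "https"):
            seen.add(url)
            # Real prioritization (authority, relevance, crawl budget)
            # would happen here; this sketch just enqueues in order.
            crawl_queue.append(url)

# Usage: parse a page now, schedule its links later.
html = '<a href="/about">About</a> <a href="https://example.com/blog">Blog</a>'
found = collect_links("https://example.com/", html)
queue, seen = deque(), set()
schedule(found, queue, seen)
```

The key design point is the separation of discovery from fetching: the crawl queue can be reordered, trimmed, or deferred without re-parsing any pages.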

This revelation contrasts with what Google's official documentation has suggested over the years, where the idea of immediate link-following was a widely accepted concept.

The distinction is subtle but significant, particularly for those in the SEO industry who optimize websites for better crawlability and indexing.

Implications for SEO

For SEO professionals, this nuanced understanding of how Googlebot handles links could influence strategies around site architecture, internal linking, and crawl budget management.

Since Googlebot doesn't follow links in real time, the prioritization of which links to process could affect how quickly new or updated content is discovered and indexed by Google.

This delay in processing might mean that websites with stronger overall site authority could have their links processed and crawled faster than those on less authoritative sites.

As such, optimizing for crawl efficiency becomes even more critical. Ensuring that important links are easily accessible and prominent can help in getting them crawled and indexed sooner rather than later.
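One way to check whether important pages are "easily accessible" is to measure their click depth from the homepage over the internal-link graph. A minimal sketch using breadth-first search (the site structure and URLs below are hypothetical):

```python
from collections import deque

def click_depths(site_graph, start):
    """BFS over an internal-link graph; returns minimum click depth per page.

    `site_graph` maps each page to the pages it links to. Pages buried
    many clicks deep tend to be discovered later, so surfacing key
    pages closer to the start helps crawl efficiency.
    """
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in site_graph.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

# Hypothetical site structure:
graph = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
    "/blog/post-1": ["/products/widget"],
}
depths = click_depths(graph, "/")
```

If a page that matters commercially sits four or five clicks deep, adding a link to it from a shallower page is a cheap way to move it up the discovery order.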

The Importance of Staying Updated

The discrepancy between what has been long assumed about Googlebot's behavior and what is now being clarified by Google's own representatives underscores the importance of staying updated with the latest information.

SEO practices that worked based on older assumptions might need revisiting in light of these new insights.

Google's continuous updates and clarifications mean that SEO professionals must remain vigilant, constantly adapting their strategies to align with the reality of how search engines operate.

This ensures that their websites remain competitive in the ever-evolving landscape of search engine optimization.
