Weekly SEO News October 18-25
Google Analytics 4 Introduces Segment Sharing for Streamlined Data Consistency and Collaboration
Google Analytics 4 (GA4) has introduced a new segment sharing feature that aims to improve data consistency and streamline analysis. This update enables users with Editor or higher permissions to save segments that everyone within a property can access, eliminating the need to recreate segments for each exploration report, which was previously a time-consuming process prone to inconsistencies.
At the heart of the update is a shared segment library. An example showcased in the update illustrates creating a segment focused on sessions from California by selecting “region” as the dimension, setting it to “contains California,” naming the segment, and saving it in the shared library.
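Shared segments themselves are created and saved through the GA4 interface, but the same filter logic can be reproduced programmatically. As a minimal sketch, the snippet below uses the GA4 Data API’s Python client (google-analytics-data) to pull sessions where region contains “California”; the property ID is a placeholder, and the code illustrates the filter from the example rather than the segment-sharing feature itself.

```python
# pip install google-analytics-data
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Filter,
    FilterExpression,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="region")],
    metrics=[Metric(name="sessions")],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
    # Equivalent of the segment condition: region contains "California"
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="region",
            string_filter=Filter.StringFilter(
                match_type=Filter.StringFilter.MatchType.CONTAINS,
                value="California",
            ),
        )
    ),
)

response = client.run_report(request)
for row in response.rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)
```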
The update offers practical benefits across a range of use cases: by unifying data perspectives across teams, it helps organizations “speak the same data language.”
Google to Retire Sitelinks Search Box in November Due to Declining Usage
Google is set to retire its sitelinks search box feature on November 21, citing declining usage as the primary reason. Introduced in 2014, this feature allowed users to conduct site-specific searches directly from Google’s search results page, typically appearing above sitelinks for brand or company searches.
According to Google, removing the sitelinks search box won’t affect search rankings or impact other sitelinks in any way. The update is purely a visual change and will roll out globally across all languages and regions. Since this update isn’t considered a major algorithm change, it won’t be documented in Google’s Search Status Dashboard.
Following the retirement, Google will also update related tools to reflect the change. Though removing the sitelinks search box structured data is optional, Google assures that unsupported structured data won’t harm search performance or trigger errors in Search Console.
Initially launched in September 2014, the sitelinks search box offered features like autocomplete and the option to integrate site search schema. While this feature will be retired, the 'WebSite' structured data, which aids in site name display, will still be supported. Although no immediate action is necessary, website owners and SEO professionals should be aware of these changes.
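In practical terms, the 'WebSite' structured data can stay in place once the SearchAction portion that powered the sitelinks search box is removed. As a rough illustration, the Python snippet below emits the still-supported site-name markup; the name and URL values are placeholders.

```python
import json

# Placeholder values; substitute your site's actual name and homepage URL.
website_markup = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Example Site",
    "url": "https://www.example.com/",
}

# The SearchAction property that powered the sitelinks search box can be
# dropped after the retirement; the WebSite type itself remains supported
# for site-name display.
print('<script type="application/ld+json">')
print(json.dumps(website_markup, indent=2))
print("</script>")
```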
Google’s John Mueller Shares Advice on Managing Legacy AMP Subdomains
Google Search Advocate John Mueller recently advised site owners on handling legacy AMP subdomains, which can still be crawled despite redirects or outdated setups. Mueller responded to a query on Reddit from a site owner managing about 500,000 URLs who reported that Googlebot continues to crawl AMP URLs even after setting up 301 redirects three years ago. This highlights the complexities some large sites face when moving away from AMP.
Key Insights: Despite the 301 redirects having been in place for roughly three years, Googlebot continues to request the legacy AMP URLs, and the site owner asked whether anything further was needed to retire them cleanly.
Mueller’s Suggestions: Mueller recommended two possible solutions for resolving this issue: leaving the existing 301 redirects in place, or removing the AMP subdomain entirely at the DNS level.
Mueller also reassured the site owner that this crawling likely doesn’t impact the crawl budget significantly, explaining that if AMP URLs are on a separate subdomain, they typically have their own crawl budget. He also noted that for sites with about 500,000 pages, crawl budget concerns are usually less pressing, even without a separate hostname.
Context and Recommendations: This guidance comes as many publishers are reassessing AMP use. Mueller’s response suggests that managing legacy AMP URLs may not require complex solutions. For large sites facing similar issues, Mueller recommended Google’s “Large site owner’s guide to managing your crawl budget” for further guidance.
Conclusion: This update is useful for SEO professionals transitioning away from AMP implementations, offering two main options: maintaining current 301 redirects or implementing a DNS-level solution. Consulting Google’s crawl budget documentation can also aid in addressing specific needs for large sites.
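Whichever option is chosen, it is worth confirming that the legacy URLs still behave as intended. A quick spot-check along these lines, a sketch assuming Python’s requests library and hypothetical URL pairs, can verify that AMP URLs return a 301 to their canonical counterparts:

```python
import requests

# Hypothetical URL pairs; substitute real AMP and canonical URLs.
checks = [
    ("https://amp.example.com/article-1", "https://www.example.com/article-1"),
    ("https://amp.example.com/article-2", "https://www.example.com/article-2"),
]

for amp_url, canonical_url in checks:
    # allow_redirects=False so we can inspect the redirect response itself.
    resp = requests.get(amp_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    ok = status == 301 and location == canonical_url
    print(f"{amp_url} -> {status} {location} {'OK' if ok else 'CHECK'}")
```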
Google’s John Mueller Reiterates Core Web Vitals’ Limited Role in Search Rankings
Google Search Advocate John Mueller recently clarified that while Core Web Vitals contribute to rankings, they don’t have a substantial impact compared to content quality. Mueller addressed the growing debate around site performance's effect on visibility, emphasizing that Core Web Vitals alone are unlikely to cause a noticeable drop in rankings.
Mueller’s Perspective: On LinkedIn, Mueller remarked that Core Web Vitals are “not giant factors” in rankings and that significant traffic changes are unlikely to result from them alone. He highlighted that improving performance primarily enhances user experience, which can help retain visitors long-term.
Mueller noted, “A good user experience is worthwhile...if users don’t want to come back, it wastes the effort of bringing in first-time visitors.”
Advantages for Small Sites: Mueller pointed out that smaller sites have a distinct advantage in SEO agility, as they can more quickly adapt to necessary changes. Sharing his own experience, Mueller explained that implementing a change like a robots.txt update is straightforward for small sites but often time-consuming for larger organizations.
Industry Context: Mueller’s response followed research by Andrew Mcleod, who found rapid ranking and traffic shifts associated with performance changes. Mcleod’s experiments, which tracked sites with significant traffic, showed patterns like 20% drops in traffic within 48 hours of enabling ads, with recovery taking one to two weeks after removal.
Historical Position: This guidance aligns with Mueller’s past remarks. In a March podcast, he noted that Core Web Vitals are part of Google’s ranking systems, though achieving perfect scores doesn’t majorly boost rankings. His stance remains clear: prioritize content quality over technical performance metrics.
Conclusion: Mueller's message emphasizes that Core Web Vitals are not major ranking factors, but they contribute to user satisfaction. For website owners, focusing on quality content and a positive user experience is essential, as technical performance alone doesn’t directly drive search rankings.
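For site owners who want to know where they stand before deciding how much to invest in performance work, Core Web Vitals field data can be queried from the Chrome UX Report API. The sketch below assumes Python’s requests library, a hypothetical origin, and an API key from the Google Cloud console:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; create one in the Google Cloud console
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

# Hypothetical origin; substitute the site you want to check.
payload = {"origin": "https://www.example.com", "formFactor": "PHONE"}

resp = requests.post(ENDPOINT, json=payload, timeout=10)
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

# p75 values are what Google's "good" thresholds are assessed against.
for name in (
    "largest_contentful_paint",
    "interaction_to_next_paint",
    "cumulative_layout_shift",
):
    metric = metrics.get(name)
    if metric:
        print(f"{name}: p75 = {metric['percentiles']['p75']}")
    else:
        print(f"{name}: no field data for this origin")
```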
Google Recommends Higher-Resolution Favicons for Improved Search Appearance
Google Search Central has updated its documentation to encourage the use of higher-resolution favicons, enhancing site visibility in search results. Previously, the minimum size for a favicon was 8x8 pixels, but Google now recommends using at least 48x48 pixels for a clearer display across various platforms. Favicons, the small icons displayed in browser tabs and bookmarks, play a key role in helping users recognize websites, potentially boosting click-through rates from search results.
Updated Requirements: The documentation now specifies that favicons must be square (1:1 aspect ratio) and at least 8x8 pixels. However, Google strongly suggests using sizes larger than 48x48 pixels to ensure optimal display.
Old vs. New Guidelines: The 8x8 pixel minimum carries over from the previous documentation; what’s new is the explicit recommendation to supply favicons of at least 48x48 pixels for a crisp display across platforms.
Reason for Changes: Google revised its guidelines to improve clarity for publishers, emphasizing the value of reviewing and updating content regularly to meet current standards. This update aligns with Google's intent to provide clear requirements, which benefits website owners across various industries by enhancing how their sites appear in search results.
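A favicon can be checked against these requirements locally. The snippet below is a small sketch using the Pillow imaging library and a placeholder file name:

```python
from PIL import Image  # pip install Pillow

# Placeholder path; point this at your site's actual favicon file.
favicon = Image.open("favicon.png")
width, height = favicon.size

# Updated guidance: square (1:1) aspect ratio, 8x8 px minimum,
# 48x48 px or larger recommended.
if width != height:
    print(f"{width}x{height}: not square - favicons must be 1:1")
elif width < 8:
    print(f"{width}x{height}: below the 8x8 minimum")
elif width < 48:
    print(f"{width}x{height}: valid, but below the recommended 48x48")
else:
    print(f"{width}x{height}: meets the recommended size")
```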
Google Updates Guidelines for URL Parameters
Google has recently revised its Search Central documentation to include standardized guidelines for formatting URL parameters, recommending the use of "=" and "&" for better website crawling.
This update incorporates recommendations from a previous blog post about faceted navigation into the main documentation on URL structure, enhancing accessibility to these guidelines.
Key Changes: The updated guidelines advise developers to use an equals sign (“=”) to separate keys from values and an ampersand (“&”) to join multiple parameters.
Google advises against alternative separators, such as colons to separate keys from values, brackets to group parameters, or commas to list multiple values, which crawlers may parse less reliably.
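Python’s standard library produces the recommended format out of the box, which makes it easy to stay within the guidelines; the base URL and parameters below are placeholders:

```python
from urllib.parse import urlencode

base = "https://www.example.com/products"  # placeholder URL

# Recommended format: key=value pairs joined with "&".
params = {"category": "widgets", "sort": "price", "page": "2"}
print(f"{base}?{urlencode(params)}")
# -> https://www.example.com/products?category=widgets&sort=price&page=2

# By contrast, separator styles Google advises against look like:
#   https://www.example.com/products/category:widgets,sort:price
```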
Importance of the Update: URL parameters are crucial for website functionality, especially for e-commerce and content management systems, as they manage features like product filtering, sorting, tracking codes, and session IDs. Although they are beneficial, they can also lead to SEO issues, such as duplicate content and inefficient use of crawl budgets.
Using proper parameter formatting can improve crawling efficiency and help mitigate common indexing problems that may impact search performance. The documentation also addresses broader challenges associated with URL parameters, including dynamic content generation, session ID management, and sorting parameter implementation.
Previous Guidance: Prior to this update, developers had to consult an outdated blog post on faceted navigation for specific URL parameter formatting advice. Consolidating this information into the main guidelines streamlines access.
The revised documentation is available in the Crawling and Indexing section of Google’s Search Central resources.
Looking Forward: For those using non-standard parameter formats, it's advisable to start transitioning to the standard format, ensuring appropriate redirects are in place and monitoring crawl statistics during the change. Although Google has not indicated that non-standard parameters will negatively impact rankings, this update clarifies its preferences. New websites and redesigns should follow the standard format to prevent potential issues down the line.