SEO News Round-Up - January 2022

Welcome to the January 2022 SEO news round-up from SALT!

We've scoured the world of SEO to bring you all the latest happenings and what you need to look out for in the coming weeks/months.

Desktop Core Web Vitals

The Core Web Vitals Desktop update has been confirmed for a gradual rollout throughout February. However, don’t panic! The impact will likely be minimal, since Google tends to use Core Web Vitals only as a tie-breaker where two pages equally match user intent and content quality.

It would be a good idea to keep an eye on the Core Web Vitals Desktop report and make improvements where possible, but check whether desktop page performance has taken a hit before investing any significant time in changes.

Handling Robot Detection

John Mueller has shared that, if a hosting company detects a bot and serves a "bot detected" notice, we need to be careful about how this is implemented.

If Google encounters one of these notices and the page is served with a 200 status code, Google will think the detection notice is the page content of that URL, meaning any quality content on the page is lost as far as Google is concerned.

Additionally, a 200 status code alongside a noindex tag will result in the page dropping from SERPs, since Google will think the noindex tag is intended to be part of the page.

John Mueller highlighted that the correct approach is to utilise a 5xx status code.
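As a rough sketch of that handling, the server-side logic might look like the hypothetical `respond` function below (framework-agnostic, our own illustration rather than a specific implementation):

```python
def respond(is_suspected_bot: bool, page_html: str) -> tuple[int, str]:
    """Return a (status_code, body) pair for an incoming request."""
    if is_suspected_bot:
        # Serve the detection notice with a 5xx, never a 200: a 200 would
        # make Google treat the interstitial as the page's real content.
        return 503, "<html><body>Bot detected</body></html>"
    return 200, page_html
```

With a 503, Google treats the page as temporarily unavailable and retries later, rather than indexing the interstitial.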

This issue seems like something worth adding to brief technical audit checks.

However, it’s also worth noting that there are a few ways to test this handling, with different pros and cons:

  • Inspecting a URL in Google Search Console - when you inspect a URL and hit Live Test, you can view a screenshot of how the page was rendered. This is a quick way to double-check what Google really sees, and whether bot detection is posing a current and urgent problem. However, it won’t help if you already know Google can get through the bot handling and just want to check whether the bot detection notice follows best practice when it does encounter unexpected bots.
  • Spoofing Googlebot in the browser - if your spoofed bot gets flagged and shown the bot detection page, that does not necessarily mean the real Googlebot would be flagged, since sites tend to use more than the user agent alone for bot detection. In fact, spoofing Googlebot is far more likely to trigger bot detection, which we can use to our advantage: it lets us test the bot detection handling even where the real Googlebot can access the page (because it’s worth knowing whether a 200 or a 5xx is used, even if Googlebot can get past it).

New Google Video - Migrations & URL Structure Changes

A new video from Google on migration guidance highlights that the following items are crucial to any good migration:

  • Research
  • Timing
  • Redirect Mapping & Tracking
  • 301 redirects

It also highlights that any change to a URL, however small, should be considered a migration.

Additionally, Google notes that important pages are likely to see changes reflected quickly, but less-important pages may take several months (likely depending on site size).

While this video likely did not contain any new information to SEOs with experience in migrations, it is a great 2-minute starting point for a quick understanding, and may also be a great resource to send to non-SEOs when discussing a migration with other teams across a business.

Time ranges for cookTime, prepTime and totalTime no longer accepted in Recipe Schema

Previously, it was possible for Recipe markup to contain cookTime, prepTime and totalTime as a range (e.g. 10-20 minutes). However, this is no longer supported by Google, and the time must now be an exact number (e.g. 10 minutes).

This is quite a specific change that only affects Recipe Schema implementation, so unless you are using Recipe Schema you can rest easy.

However, if you are using Recipe Schema on your site, it may be worth checking the page in the Rich Results Testing Tool to make sure you’re meeting Google’s new standard.
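If your Recipe markup is generated rather than hand-written, enforcing exact durations is straightforward. This is an illustrative sketch (the `recipe_markup` helper is hypothetical) using the ISO 8601 duration format (e.g. PT10M for 10 minutes) that schema.org specifies for these fields:

```python
import json

def recipe_markup(name: str, prep_minutes: int, cook_minutes: int) -> str:
    """Emit Recipe JSON-LD using exact ISO 8601 durations (e.g. PT10M)
    rather than ranges like '10-20 minutes', which Google no longer accepts."""
    return json.dumps({
        "@context": "https://schema.org/",
        "@type": "Recipe",
        "name": name,
        "prepTime": f"PT{prep_minutes}M",
        "cookTime": f"PT{cook_minutes}M",
        "totalTime": f"PT{prep_minutes + cook_minutes}M",
    }, indent=2)
```

Because each value is a single exact duration, the output should pass the Rich Results Testing Tool check described above.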

New robots tag: "indexifembedded"

Google has recently announced a new robots tag: "indexifembedded". This allows Google to index page content embedded through means such as iframes as part of the page it’s embedded on.

The indexifembedded tag should be implemented on the embedded content source, along with a noindex tag, to ensure the content does not get indexed twice.
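Following the syntax in Google's announcement, the combination can be expressed as two X-Robots-Tag HTTP headers served on the embedded content's own URL; the helper below is just an illustrative way of building that header pair:

```python
def embedded_source_headers() -> list[tuple[str, str]]:
    """HTTP headers for the embedded content's own URL: noindex keeps it
    out of the index as a standalone page, while indexifembedded still
    allows its content to be indexed as part of the embedding page."""
    return [
        ("X-Robots-Tag", "googlebot: noindex"),
        ("X-Robots-Tag", "googlebot: indexifembedded"),
    ]
```

The same combination can also be set via robots meta tags in the embedded document's HTML.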

New URL Inspection API released

Google has announced a new URL Inspection API which allows up to 2,000 URLs per property to be inspected per day, at a rate of up to 600 URLs per minute!

This should prove useful for checking the indexing status of large websites, or sections of large websites. While we’ve always had tools to check indexation on a page-by-page basis, looking at everything on a micro scale can sometimes make it harder to see the overall picture. Being able to check this en masse should be really helpful when trying to identify trends and the causes behind issues.
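The API itself is called through the Search Console API, but before sending anything it's worth planning requests around the quotas. The helper below is a hypothetical sketch of that batching, assuming the limits above (2,000 inspections per day, 600 per minute):

```python
def plan_inspection_batches(urls: list[str],
                            per_minute: int = 600,
                            per_day: int = 2000) -> list[list[str]]:
    """Split a URL list into per-minute batches that stay within the
    assumed quotas: cap the total at `per_day`, then chunk what remains
    into groups of at most `per_minute`."""
    todays_urls = urls[:per_day]  # anything beyond the daily cap waits
    return [todays_urls[i:i + per_minute]
            for i in range(0, len(todays_urls), per_minute)]
```

Each inner list can then be sent in one minute's worth of inspection calls, with leftover URLs rolled over to the next day.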

Links

Desktop Core Web Vitals - https://www.seroundtable.com/google-page-experience-update-coming-to-desktop-soon-32733.html

Robot Detection - https://www.seroundtable.com/google-500-status-code-robot-detection-interstitial-32767.html

URL Migration & Structural Changes - https://www.youtube.com/watch?v=FHjVmi1tkEw&t=2s

Recipe markup requires specific times - https://searchengineland.com/google-recipe-markup-now-requires-specific-times-no-more-time-ranges-378686

Indexifembedded - https://developers.google.com/search/blog/2022/01/robots-meta-tag-indexifembedded

URL Inspection API - https://developers.google.com/search/blog/2022/01/url-inspection-api


