Is Google RankBrain an approach for its Real-time Algorithm Update?

For a few months now, Google has been talking about its "real-time algorithm update", and it recently confirmed that this update rolled out in the same month RankBrain was introduced.

But how can the "algorithm" be updated in "real time" unless it is an artificial intelligence that learns from users' query patterns and click-through behaviours in order to make ranking decisions "on the fly"?

Most webmasters know that Google Search uses over 200 ranking factors, but are these factors set in stone while new technologies that affect the user experience on the web keep emerging over time, e.g. the Mobile-Friendly Algorithm Update? Secondly, how can Google publicly disclose the prioritization of its ranking factors when spammers keep abusing that knowledge to fool the search engine? In my opinion, therefore, the search engine can never be smart enough to combat web spam while also maximizing user experience without employing powerful AI.

RankBrain: Ranking-Aided Artificial Intelligence?

As I mentioned above, the search algorithm can never be updated (or "deployed") in real time without an AI that changes the algorithm (or re-prioritizes the ranking factors, etc.) on the fly. The way I see it, there are three major reasons why Google wants to (and has to) use AI in its search engine:

  1. A search engine that adapts to technological trends and user behaviours
    To be honest, I personally feel that Google RankBrain is an "algorithm generator" in itself. It learns from user click-through behaviours, perhaps with the assistance of information collected from Google Chrome and Google Analytics, and refines its ranking allocation over time.
  2. A search engine that is less predictable to spammers
    As the AI-driven ranking algorithm goes beyond the ranking factors known to webmasters, spammers may no longer have any clue how to perform black-hat SEO that fools the search engine.
  3. The most cost-efficient "real-time algorithm update" approach
    What could be more cost-efficient than having the AI change and refine its algorithm by itself in real time, with almost no human intervention?

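To make point 1 concrete, here is a minimal, purely illustrative sketch of how click-through feedback could nudge a result's ranking score over time. Everything here is my own assumption for illustration (the function name, the exponential-moving-average update rule, the `alpha` rate); Google has never disclosed how RankBrain actually works.

```python
# Hypothetical sketch: adjust a result's score from click-through feedback.
# The EMA update rule below is an assumption, not a disclosed Google method.

def update_score(score: float, clicked: bool, alpha: float = 0.1) -> float:
    """Nudge a result's score toward 1.0 on a click, toward 0.0 on a skip."""
    observed = 1.0 if clicked else 0.0
    return (1 - alpha) * score + alpha * observed

# A page that keeps getting clicked drifts upward over repeated impressions,
# while one that keeps getting skipped drifts downward -- no human retuning.
score = 0.5
for _ in range(10):
    score = update_score(score, clicked=True)
```

The point of the sketch is only that such a feedback loop, once deployed, updates itself continuously from user behaviour, which is exactly what a "real-time algorithm update" would look like from the outside.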
If my prediction is true, what should we do next?

Again, as I always share with my friends on Facebook, just focus on the Google Webmaster Guidelines and the content quality of your website; this includes the user experience as well, on both mobile and desktop. If RankBrain is based on user click-through behaviours, perhaps this is the time for us to keep users on our websites a little longer, e.g. by improving Pageviews Per Session, Bounce Rate, User Activities, Duration Per Session, etc.

To me, Link Building isn't so important when Google places more value on "natural", high-quality links. Furthermore, how could Google deliver late-breaking news to its search users if it weighted backlinks too heavily? How could a newly published page have a large number of high-quality backlinks? And how could Google bury a high-quality new page that matches certain search queries perfectly well? Of course, social shares are important, and I think being actively involved in social media should be good enough; whether the content is high quality or low quality, just leave it to the users to decide -- because Google respects user actions more than webmasters anyway.
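For readers who want to track the engagement metrics mentioned above, here is a small sketch of how they are conventionally computed from session data. The session log and its field names are made up for illustration; in practice you would pull these numbers from your analytics tool.

```python
# Illustrative only: computing pageviews per session, bounce rate, and
# average session duration from a hypothetical session log.

sessions = [
    {"pageviews": 1, "duration_sec": 8},    # single-page visit: a bounce
    {"pageviews": 4, "duration_sec": 210},
    {"pageviews": 2, "duration_sec": 95},
]

n = len(sessions)
pageviews_per_session = sum(s["pageviews"] for s in sessions) / n
bounce_rate = sum(1 for s in sessions if s["pageviews"] == 1) / n
avg_duration_sec = sum(s["duration_sec"] for s in sessions) / n
```

Watching these numbers improve (more pageviews, fewer bounces, longer sessions) is a reasonable proxy for the "keep users a little longer" goal, whatever signals Google actually uses.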
