Google Latest Update “BERT” Algorithm (sample article)

What is the BERT Algorithm?

The BERT algorithm (Bidirectional Encoder Representations from Transformers) is a deep learning algorithm related to natural language processing. It helps a machine to understand what words in a sentence mean, but with all the nuances of context.

BERT is a natural language processing pre-training approach that can be trained on a large body of text. It handles tasks such as entity recognition, part-of-speech tagging, and question answering, among other natural language processes. BERT helps Google understand natural language text from the Web. Google has open sourced this technology, and others have created variations of BERT.

How will the Google BERT Algorithm update affect SEO?


Google introduced the BERT update to its Search ranking system last week. The addition of this new algorithm, designed to better understand what's important in natural language queries, is a significant change. Google said it impacts 1 in 10 queries. Yet many SEOs and tracking tools did not notice massive changes in Google's search results as the algorithm rolled out over the past week.

The question is, Why?

The short answer: this BERT update was really about understanding “longer, more conversational queries,” Google wrote in its blog post. The tracking tools, such as MozCast and others, primarily track shorter queries. That means BERT’s impact is less likely to be visible to these tools.

And for site owners: when you look at your rankings, you are likely not tracking a lot of long-tail queries. You track the queries that send higher volumes of traffic to your website, and those tend to be short-tail queries.
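To make the short-tail vs. long-tail distinction concrete, here is a minimal sketch of the kind of heuristic a rank-tracking tool might use to separate the two. The five-word threshold is an illustrative assumption, not an industry standard, and real tools would use richer signals than word count.

```python
def classify_query(query, long_tail_threshold=5):
    """Label a search query as short-tail or long-tail by word count (illustrative heuristic)."""
    words = query.lower().split()
    return "long-tail" if len(words) >= long_tail_threshold else "short-tail"

queries = [
    "seo tools",                                  # typical tracked head term
    "can you get medicine for someone pharmacy",  # conversational, BERT-style query
]
for q in queries:
    print(f"{classify_query(q):10s} {q}")
```

Under this toy heuristic, the conversational query lands in the long-tail bucket that tracking tools tend to miss, while the head term is exactly what they monitor.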

How has the Google BERT update affected you?

I don’t expect BERT to change the way people optimize their content. The change, if it’s seen, will be seen in how well the algorithm appears to understand the queries searchers enter, and which pages are returned based on that understanding. From what I’ve read, it’s apparently about understanding how the words in a query relate to each other, as opposed to Google looking for as much of the query as possible.


How Many Parameters Does the BERT Algorithm Have?

BERT is a multi-layer bidirectional Transformer encoder. There are two models introduced in the paper.

· BERT Base – 12 layers (Transformer blocks), 12 attention heads, and 110 million parameters.

· BERT Large – 24 layers, 16 attention heads, and 340 million parameters.
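As a rough sanity check on those figures, the parameter count of a BERT-style encoder can be estimated from its configuration. The sketch below assumes the standard published BERT hyperparameters (WordPiece vocabulary of 30,522, 512 positions, feed-forward size of 4× the hidden size) and sums the embedding, attention, feed-forward, and layer-norm weights; it is an approximation, not an exact count.

```python
def bert_param_estimate(layers, hidden, heads, vocab=30522, max_pos=512, segments=2):
    """Rough parameter count for a BERT-style Transformer encoder."""
    # Note: the head count splits the attention dimensions but does not
    # change the total parameter count.
    ffn = 4 * hidden  # feed-forward inner size (4x hidden in BERT)
    # Embeddings: token + position + segment tables, plus one LayerNorm (gamma, beta)
    embeddings = (vocab + max_pos + segments) * hidden + 2 * hidden
    # Self-attention: Q, K, V, and output projections (weights + biases)
    attention = 4 * (hidden * hidden + hidden)
    # Feed-forward: two linear layers (weights + biases)
    feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
    # Two LayerNorms per Transformer block
    layer_norms = 2 * (2 * hidden)
    per_layer = attention + feed_forward + layer_norms
    pooler = hidden * hidden + hidden  # pooler head on the [CLS] token
    return embeddings + layers * per_layer + pooler

base = bert_param_estimate(layers=12, hidden=768, heads=12)
large = bert_param_estimate(layers=24, hidden=1024, heads=16)
print(f"BERT Base  ~ {base / 1e6:.0f}M parameters")   # close to the quoted 110 million
print(f"BERT Large ~ {large / 1e6:.0f}M parameters")  # close to the quoted 340 million
```

The estimate lands within a few percent of the published figures, which is a useful way to see where the parameters actually live: most of them are in the stacked Transformer blocks, not the embeddings.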

How Does the Google BERT Algorithm Work?

Initially, Google runs a program called a spider (or crawler) that saves page attributes such as the title, page text, and hyperlinks to its own database. The spider then visits the hyperlinks on that page, then the links on those pages, and so on. Thus we get a structure like this:


[Image: a diagram of circles (web pages) connected by arrows (hyperlinks)]

Here each circle represents a web page, and the arrows pointing away from a circle are the hyperlinks on that page; e.g., page E has links to B, F, and D. This kind of structure is called a graph.

Now the important part. After generating this structure, Google uses an algorithm called PageRank to rank the pages by importance. How is importance decided? Simple: a page that is referred to by many other pages is important, like B, which is referred to by almost all the pages. Pages referred to by B, like page E, also become important, because they are referred to by an important page even though no other pages refer to them. Likewise, pages that are not referred to at all have a very low rank.
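That intuition can be sketched with the classic iterative PageRank formulation. The link graph below is a hypothetical one built to match the description above (E links to B, F, and D; B is linked from almost every page); this is a teaching sketch, not Google's production implementation.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iterative PageRank: a page's score accumulates from the pages linking to it."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = ranks[page] / len(outgoing)  # rank flows out evenly over links
            for target in outgoing:
                new[target] += damping * share
        ranks = new
    return ranks

# Hypothetical graph matching the article's example: E links to B, F, and D,
# and B is linked from almost every other page.
links = {
    "A": ["B"],
    "B": ["E"],
    "C": ["B"],
    "D": ["B"],
    "E": ["B", "F", "D"],
    "F": ["B"],
}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # prints "B"
```

Running this shows both effects described above: B, with inbound links from nearly every page, ends up with the highest rank, and E inherits a high rank purely because the important page B links to it.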


Features Of BERT Algorithm

The revisions to the search algorithm haven't been confined to software updates alone; there were new hardware advancements too. For the first time, Cloud TPUs are being used to serve search results.

Cloud TPUs already power Google products like Translate, Photos, Search, Assistant, and Gmail. TPUs, or Tensor Processing Units, are now connected to the search algorithms to serve search results and supply users with the most relevant information.

However, this update is currently confined to US English and will be rolled out to other languages over time.

When Is the BERT Algorithm Used?

BERT is used to understand search queries better and return more relevant results. The update also applies to featured snippets.

However, this doesn’t mean the update has replaced RankBrain. The BERT Google NLP update is additional support for a clearer understanding of your query and for providing you with relevant results.


BERT A.I. Language Modeling

According to some, BERT A.I. is a successful example of transfer learning, “which might help the transfer approach take off”.

Transfer learning can be really helpful in that it reduces the need for labeled data, which is a big concern when doing ML: high-quality labeled datasets are very hard (and expensive) to produce, which makes it difficult to deploy applications in many domains of interest. You can’t train a model if you don’t have the data.

Also, transfer learning makes it possible to train and use models with less computational power. Read the first part (or even only the abstract, if you don’t have much time) of the article “A Survey on Transfer Learning” (by Sinno Pan and Qiang Yang) if you want to know more about transfer learning.



What is Google’s Search Algorithm?

Google's algorithm does the work for you by searching out Web pages that contain the keywords you used to search, then assigning a rank to each page based on several factors, including how many times the keywords appear on the page. Higher-ranked pages appear further up in Google's search engine results page (SERP), meaning that the best links relating to your search query are, theoretically, the first ones Google lists.
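As a toy illustration of just the keyword-counting part of that description (real ranking weighs hundreds of signals, so this is a deliberate oversimplification over a made-up mini-corpus):

```python
def score_page(text, keywords):
    """Count how often each query keyword appears in a page's text (a single toy signal)."""
    words = text.lower().split()
    return sum(words.count(k.lower()) for k in keywords)

def rank_pages(pages, keywords):
    """Order pages by descending keyword score, like a minimal SERP."""
    return sorted(pages, key=lambda title: score_page(pages[title], keywords), reverse=True)

# Hypothetical mini-corpus of crawled page text
pages = {
    "page-a": "bert is a transformer model bert reads context in both directions",
    "page-b": "a general article about search engines",
    "page-c": "bert improves search",
}
print(rank_pages(pages, ["bert"]))  # page-a first: "bert" appears twice there
```

The page with the most keyword occurrences lands at the top of this toy SERP, which is exactly the naive behavior that later signals (PageRank, and now BERT's query understanding) were added to refine.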

What makes Google's search algorithms so unique?

Google focuses on two things: customer requirements and customer feedback. Users today want exact results for their queries, so Google keeps working on these factors to improve its search quality and features, and that focus is what makes Google's search algorithm unique.

Conclusion

The new update is here to stay! The Google BERT update is one of the most significant updates in recent years. Because it is focused on providing more context-based search results, there is no need to worry about getting penalized; instead, the update is focused on recognizing search intent much better.

Understanding natural language is a complex and ongoing challenge for Google, and the company admits that, even with BERT, it may not get results 100% right.

