BERT stands for Bidirectional Encoder Representations from Transformers.

What is Google's BERT Search Algorithm?
As we said, BERT is the biggest change in search since Google released RankBrain. It is essentially Google's neural network-based technique for natural language processing (NLP) pre-training.

BERT was officially introduced last year and was described in detail on the Google AI blog. In short, BERT helps computers understand language a little more like humans do.
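To make the "pre-training" idea concrete, here is a minimal sketch of BERT's masked-language-model objective using the open-source Hugging Face transformers library. The bert-base-uncased checkpoint is the public research model, chosen here for illustration; it is not the model Google Search actually runs.

```python
# A minimal sketch of BERT's masked-language-model pre-training objective.
# bert-base-uncased is the public research checkpoint (an illustrative
# assumption), not Google's production search model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the whole sentence at once, so context on BOTH sides of the
# mask informs the prediction -- the "bidirectional" in its name.
for prediction in fill_mask("The traveler needs a [MASK] to enter the USA."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```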

Since BERT's primary job is to help better understand the nuances and context of words in searches, it will primarily work to better match queries with more relevant results. That's why it's already working on featured snippets.
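As a rough illustration of matching a query to results with contextual embeddings, the sketch below mean-pools BERT's output vectors and scores a query against two candidate pages by cosine similarity. The checkpoint, the pooling scheme, and the candidate texts are all assumptions for demonstration; Google's actual ranking pipeline is not public and is certainly far more involved.

```python
# A hedged sketch of query-to-result matching with BERT-style embeddings.
# Mean pooling over bert-base-uncased is a simple illustrative choice,
# not how Google Search ranks pages.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into a single vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)

query = "2019 brazil traveler to usa need a visa"
candidates = [
    "Visa requirements for Brazilian citizens visiting the United States",
    "Visa requirements for U.S. citizens traveling to Brazil",
]
q = embed(query)
for doc in candidates:
    score = torch.cosine_similarity(q, embed(doc), dim=0).item()
    print(f"{score:.3f}  {doc}")
```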

Google gave an example: in the search "2019 brazil traveler to usa need a visa," the word "to" and its relationship to the other words in the query are important to understanding the meaning.

Until a few months ago, Google would not have understood the importance of this connection and would have returned results about American citizens traveling to Brazil.

As Google explains, "With BERT, Search is able to pick up on this nuance and know that the very common word 'to' is very important here, and we can provide a much more relevant result for this query."

The following examples are for illustrative purposes and may not work in live search results.



In another example, for the search "do estheticians stand a lot at work," Google said it would previously have matched the term "stand-alone" with the word "stand" in the query. Its BERT models, by contrast, can understand that "stand" here relates to "the concept of physical demands of a job," and display a more helpful response.
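The "stand" example can be reproduced in miniature: the sketch below pulls the contextual vector for the same word out of two different sentences and shows that the vectors diverge. Again, bert-base-uncased and the sample sentences are illustrative assumptions, not Google's setup.

```python
# A small sketch of why context matters: the SAME word gets a different
# vector depending on its sentence. Illustrative only, using the public
# bert-base-uncased checkpoint.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def vector_for(word: str, sentence: str) -> torch.Tensor:
    """Return the contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state.squeeze(0)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

physical = vector_for("stand", "estheticians stand on their feet all day at work")
compound = vector_for("stand", "a stand-alone application needs no installer")

# The two "stand" vectors differ because BERT encodes surrounding context.
print(torch.cosine_similarity(physical, compound, dim=0).item())
```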