After 15 years of handling unanticipated search queries, Google has announced progress in understanding language using machine learning.
Google said in its official blog that by using a Natural Language Processing (NLP) pre-training technique called Bidirectional Encoder Representations from Transformers (BERT), Search can better understand queries written in natural language rather than in "keyword-ese".
The company is using some of the BERT models with its latest Cloud TPUs to serve search results, surfacing more relevant information faster by understanding the context of the query. BERT has also shown search improvements in languages such as Korean, Hindi, and Portuguese.
(This story has not been edited by Business Standard staff and is auto-generated from a syndicated feed.)