BERT (Bidirectional Encoder Representations from Transformers) is a Natural Language Processing (NLP) model that helps Google understand language better in order to serve more relevant results. BERT will play a significant role in improving conversational search. According to Search Engine Journal, BERT will impact 10% of search queries, affecting both organic rankings and featured snippets.
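For readers curious about what "bidirectional" means in practice, the short sketch below uses the open-source Hugging Face Transformers library and the publicly released bert-base-uncased checkpoint to predict a masked word from the context on both sides of it. This is purely an illustration of the underlying model, not of Google's search pipeline, and neither the library nor the checkpoint is mentioned in the video.

```python
# Illustrative only: the public bert-base-uncased checkpoint via
# Hugging Face Transformers, not Google's internal search stack.
from transformers import pipeline

# BERT is pre-trained to predict a masked word using the words on
# BOTH sides of it -- this is what "bidirectional" refers to.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("Google uses [MASK] to understand search queries."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```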

The MOZ team has published a new video, ‘What Is BERT?’, featuring Britney Muller.

Watch this video to learn about Google’s BERT algorithm and how it works.

The MOZ team says, “There’s a lot of hype and misinformation about the new Google algorithm update. What actually is BERT, how does it work, and why does it matter to our work as SEOs? Join our own machine learning and natural language processing expert Britney Muller as she breaks down exactly what BERT is and what it means for the search industry”.

What Is BERT? – Whiteboard Friday

