What is Google BERT? No, it isn’t Ernie’s best pal. Google BERT (Bidirectional Encoder Representations from Transformers) is a natural language processing technique. The goal is to understand not just the words on a web page, but also their context. That requires multiple layers of logic: one to understand the words and the responses available, and another to determine what you really want.
Here is an example using the search term “math practice books for adults”:
Without Google BERT: Google understands that you want a book for practicing math, and it narrows the book type to something for an adult. It finds a math practice guide for Grades 6–8 aimed at young adults. It’s a decent response, but not good enough, because the context was lost.
With Google BERT: Google sees your request for a math book for an adult and returns Math for Grownups. Google BERT runs complex analyses, and that requires seriously intensive hardware. Currently, only about 10% of Google’s searches run through BERT.
What does BERT mean for developers?
BERT builds upon previous research in Artificial Intelligence into contextual representations. Because it is bidirectional, it reads the words before and after each term at once, so every word’s representation is informed by its full surrounding context across multiple layers. Bidirectional models were long considered nearly impossible to train, because in a standard language model each word could indirectly “see itself”; BERT sidesteps this by masking words in the text and learning to predict them from the surrounding context. Building on those relationships between words, the deep neural network learns to understand not just the information, but the way the information should be interpreted for use. This is a huge evolution in the use of AI.
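As a loose illustration (not BERT’s actual architecture), the value of bidirectional context can be sketched in a few lines of Python: a toy scorer that fills in a masked word using the words on both sides, where a left-to-right model would only see the left half. The sentence, candidates, and co-occurrence scores below are all invented for the example.

```python
# Toy sketch of why bidirectional context matters (NOT real BERT).
# In "I went to the [MASK] to withdraw cash", the left context
# "I went to the" fits "park", "river", or "bank" equally well;
# only the right context ("withdraw cash") disambiguates.
# All scores here are hand-invented for illustration.
CONTEXT_SCORES = {
    ("the", "park"): 1.0,        # left-context pairs: all candidates tie
    ("the", "river"): 1.0,
    ("the", "bank"): 1.0,
    ("bank", "withdraw"): 2.0,   # right-context pair: only "bank" fits
}

def score(candidate, left_words, right_words):
    """Sum co-occurrence scores of a candidate with words on BOTH sides."""
    s = sum(CONTEXT_SCORES.get((w, candidate), 0.0) for w in left_words)
    s += sum(CONTEXT_SCORES.get((candidate, w), 0.0) for w in right_words)
    return s

def fill_mask(tokens, candidates, bidirectional=True):
    """Pick the best candidate for the [MASK] token.

    With bidirectional=False, the scorer only sees the left context,
    mimicking a left-to-right language model.
    """
    i = tokens.index("[MASK]")
    right = tokens[i + 1:] if bidirectional else []
    return max(candidates, key=lambda c: score(c, tokens[:i], right))

tokens = "i went to the [MASK] to withdraw cash".split()
candidates = ["park", "river", "bank"]
print(fill_mask(tokens, candidates, bidirectional=False))  # ties; picks "park"
print(fill_mask(tokens, candidates, bidirectional=True))   # "bank"
```

The left-to-right scorer cannot separate the three candidates, while the bidirectional scorer uses the words after the mask to pick “bank” — the same intuition, in miniature, behind BERT’s masked-word training.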
What does BERT mean for marketers?
For Digital Marketing practitioners, Google BERT is still the new kid on the block, but it will change how Google’s spiders read web pages: it enables Google to understand the context of your content. Previously, it was important to spell everything out as if the reader were encountering the topic for the first time; that changes the landscape for authoritative articles. According to Google, it isn’t possible to optimize content for BERT. However, BERT does suggest some preliminary ways we can improve our copywriting so it doesn’t feel SEO “spammy”:
- Word count is not as important as it used to be
- BERT focuses on the context of words, but good content still needs to be there
- Content should still contain new, relevant, and useful information
We are as curious about BERT as you are, so as we learn more, we will share. Would you like to run with BERT and not make Ernie jealous? You can download BERT: the repository is available at https://github.com/google-research/bert.