The landscape of search engine optimisation (SEO) can be confusing. But over the past few years, the process has been transformed by two technological advancements: BERT & MUM. You may have noticed that search results feel less robotic lately, delivering more relevant outcomes based on your search intent rather than just your keywords.
This article will explain these changes and their importance for the internet’s future.
What is the BERT & MUM Update?
Previously, Google’s algorithms treated a search as a “bag of words”: they identified the most important-looking nouns and found web pages that contained them. BERT & MUM introduced the ability to interpret every word of a sentence in relation to the words around it.
BERT Update:
Before we look at the future, we need to understand the foundation. Launched in 2019, BERT stands for Bidirectional Encoder Representations from Transformers. It sounds complicated, but its job is simple: to understand the context of words in a search query. Below are its key features:
- Directionality: Previous models read text left-to-right or right-to-left. BERT reads in both directions at once, so it understands a word in terms of what precedes and follows it.
- Stop word recognition: Words like “to”, “for”, or “from” were previously ignored. BERT & MUM treat these words as meaningful parts of the query rather than noise.
- Nuance: BERT allows the search engine to differentiate between multiple meanings of a word (polysemy) according to its context (a minimal sketch of this follows the list).
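Google’s production BERT isn’t publicly accessible, but the idea is easy to demonstrate with the open-source bert-base-uncased model via the Hugging Face transformers library (our choice purely for illustration). The sketch below compares the contextual vector BERT assigns to the word “bank” in a river sentence versus two finance sentences:

```python
# A minimal sketch of BERT's contextual word understanding, using the
# open-source bert-base-uncased model (not Google's production system).
# pip install torch transformers
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding BERT assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    position = inputs["input_ids"][0].tolist().index(
        tokenizer.convert_tokens_to_ids(word)
    )
    return hidden[position]

river = word_vector("He sat on the bank of the river.", "bank")
money1 = word_vector("She opened an account at the bank.", "bank")
money2 = word_vector("The bank approved her loan.", "bank")

cos = torch.nn.functional.cosine_similarity
# Same spelling, different senses -> noticeably lower similarity.
print("river vs finance:  ", cos(river, money1, dim=0).item())
print("finance vs finance:", cos(money1, money2, dim=0).item())
```

Because the model reads the whole sentence in both directions, the two senses of “bank” produce measurably different vectors; that is the polysemy handling described above.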
MUM Update:
While BERT was a big step, MUM is a giant leap. The Multitask Unified Model (MUM) was unveiled at Google I/O 2021 and is, according to Google, 1,000 times more powerful than its predecessor. MUM isn’t merely an iteration of BERT; it is a whole new breed of search model.
Google’s focus on complex search queries, which typically require multiple searches, is evident in the BERT and MUM update cycles. For instance, suppose you have climbed Mt Fuji and are now planning to climb Mt Rainier next fall. You would have to research Rainier’s altitude, autumn weather, hiking gear, and difficulty level separately. MUM understands that you want to “compare” the two mountains and can give you a combined answer.
- Multilingual Capabilities: MUM is trained across 75 different languages. It can draw information from a source in Japanese to answer a question posed in English (see the sketch after this list).
- Multimodal Nature: MUM is multimodal, meaning it can “see” and “hear”: it can extract information from text and images, and eventually from video and audio.
- Knowledge Transfer: It can transfer what it learns about one concept to another, making it remarkably efficient at learning human concepts.
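MUM itself isn’t available outside Google, but the cross-lingual idea behind it can be sketched with the open-source sentence-transformers library (an assumption made purely for illustration). A multilingual model embeds an English query and a Japanese passage into the same vector space, so relevance can be scored across languages:

```python
# Illustrative sketch of cross-lingual matching, in the spirit of MUM's
# multilingual capabilities. Uses the open-source sentence-transformers
# library, not Google's MUM.  pip install sentence-transformers
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

query = "What gear do I need to hike Mt Fuji in autumn?"
passages = [
    "秋の富士山登山には防寒着と登山靴が必要です。",  # Japanese: gear for autumn Mt Fuji hikes
    "Paris is the capital of France.",             # unrelated English passage
]

query_vec = model.encode(query, convert_to_tensor=True)
passage_vecs = model.encode(passages, convert_to_tensor=True)

# The Japanese passage about Mt Fuji gear scores far higher than the
# unrelated English one, despite the language mismatch.
scores = util.cos_sim(query_vec, passage_vecs)
for passage, score in zip(passages, scores[0]):
    print(f"{score.item():.2f}  {passage}")
```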
Importance of BERT and MUM in SEO
You may be asking yourself whether MUM is going to replace BERT. The answer is no; they work together. BERT excels at understanding the language of a search query, while MUM handles the heavier, multi-step work.
The BERT and MUM impact on SEO has been profound. We are moving away from an era where “keyword density” was king. Today, the focus is on “topical authority” and “user intent”. Google is no longer just matching strings of text; it is matching concepts.
Differences between BERT & MUM
To appreciate their separate strengths, it’s important to understand the differences between BERT & MUM:
| Feature | BERT | MUM |
| --- | --- | --- |
| Launch Date | 2019 | 2021 (gradual rollout) |
| Power Level | Baseline AI | 1,000x more powerful than BERT |
| Data Types | Text only | Multimodal (text, image, and video) |
| Language | Understands 70+ languages | Transfers knowledge across 75+ languages |
| Primary Goal | Sentence context and nuance | Solving complex, multi-step queries |
BERT & MUM Examples
To truly grasp how these updates work, let’s look at BERT and MUM examples:
Example 1: The “To” Factor (BERT)
For the search query “2019 brazil traveler to usa need a visa”, the most significant word is “to”. Before BERT, Google might have displayed results for Americans travelling to Brazil. With BERT’s grasp of search intent, Google knows you’re travelling to the USA from Brazil and displays the correct visa information for Brazilians.
Example 2: Deep Comparison (MUM)
Search: “Hiked Mt Adams. What do I need to do to be ready for Mt Hood?”
MUM notices you are comparing two mountains. It takes into account the altitudes, the trails, and the equipment needed. It might even surface an image from a blog post describing the boots you need for Mt Hood’s trails, even if that post never explicitly mentions “Mt Adams”.
BERT & MUM Optimisation
Many people ask how to “hack” the update. The truth is, you can’t. These are AI models designed to reward high-quality content. However, you can align your strategy with how these models think.
1. Write for Humans, Not Robots
Forget about repeating your primary keyword every 100 words. Use natural language. If you are writing about a topic, cover the related subtopics a human would naturally ask about. This builds topical relevance. (A quick way to sanity-check your keyword usage is sketched below.)
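As a quick sanity check, here is a minimal sketch (standard library only; the 3% threshold is our illustrative assumption, not an official Google figure) that flags obvious keyword stuffing in a draft:

```python
# Quick keyword-density sanity check (standard library only).
# The 3% threshold is an illustrative assumption, not a Google rule.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

draft = "SEO tips: our SEO guide covers SEO basics and SEO tools for SEO."
density = keyword_density(draft, "seo")
print(f"Density: {density:.1%}")
if density > 0.03:
    print("Reads like keyword stuffing; rewrite in natural language.")
```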
2. Answer Questions Directly
Google often uses BERT & MUM to populate featured snippets. To increase your chances (a minimal markup sketch follows this list):
- Use H2 and H3 tags as questions.
- Provide a concise answer (40-50 words) immediately following the header.
- Use bulleted lists for step-by-step instructions.
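As an illustration, the sketch below checks that an answer stays near the suggested 40-50 word window and emits FAQPage structured data using the real schema.org vocabulary; the helper function itself is our own hypothetical example, not an official tool:

```python
# Minimal sketch: validate answer length and emit FAQPage JSON-LD
# (schema.org vocabulary). The helper is illustrative, not an official tool.
import json

def faq_jsonld(question: str, answer: str, max_words: int = 50) -> str:
    word_count = len(answer.split())
    if word_count > max_words:
        print(f"Warning: answer is {word_count} words; aim for ~40-50.")
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [{
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }],
    }
    return json.dumps(data, indent=2, ensure_ascii=False)

print(faq_jsonld(
    "Does MUM replace the BERT algorithm?",
    "No. BERT and MUM are complementary: BERT parses the language of a "
    "query, while MUM handles complex, multimodal, multilingual tasks.",
))
```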
3. Incorporate Visual Context
Since MUM is multimodal, your images and videos matter more than ever. Ensure your image alt text is descriptive and that your videos have clear transcripts. This helps the BERT and MUM algorithms understand your non-text assets.
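One practical way to act on this is to audit your own pages for missing or thin alt text. Here is a minimal sketch using the third-party requests and beautifulsoup4 libraries (the URL is a placeholder for one of your own pages):

```python
# Minimal alt-text audit for a single page. Uses the third-party
# requests and beautifulsoup4 libraries; the URL is a placeholder.
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://example.com/your-post/"  # placeholder: your own page
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("Missing alt text:", img.get("src"))
    elif len(alt.split()) < 3:
        print("Alt text may be too thin:", alt)
```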
4. Focus on Long-Form, Comprehensive Content
MUM wants to solve complex problems, and people are unlikely to get the answer they need from a 300-word blog post. Consider crafting in-depth guides that serve as a one-stop resource for a specific topic.
BERT & MUM Search Intent
The shift brought by BERT and MUM means that Google is becoming a “knowledge engine” rather than a “search engine”. It aims to reduce the number of clicks a user has to make. For content creators, this means your value lies in being the most authoritative and clearest source of information.
Your BERT & MUM optimisation strategy should focus on the “E-E-A-T” principles: Experience, Expertise, Authoritativeness, and Trustworthiness. Because MUM can cross-reference information across languages and formats, it is much harder to fool the algorithm with low-quality, AI-generated fluff.
FAQs
Does MUM replace the BERT algorithm?
MUM is not a replacement for BERT. They are complementary technologies. BERT is used to linguistically understand the words in a search query, while MUM is used for more complex, multimodal, and multilingual tasks.
How does the BERT and MUM update affect my website ranking?
These changes favour content that is helpful and comprehensive. If you have thin or keyword-stuffed content, you could see a decline. But if you provide helpful answers to people's questions, you'll likely see an improvement.
What are some BERT & MUM examples of improved search?
One example is that Google can now understand “negative” search queries, such as “Can I get someone's medicine filled at a pharmacy without a prescription?” BERT allows Google to recognise that “without” is the key word in this sentence.
Can I perform specific BERT and MUM optimisation for my blog?
The most important optimisations are to include structured data (schema markup), write in a simple, conversational style, use informative headings, and include relevant images with descriptive alt text to support MUM's multimodal capabilities.
Why is MUM considered more powerful than BERT?
It's more powerful because it's trained at a much larger scale, it can work across 75+ languages, and it can process multimodal data (text, images, and video), while BERT works mainly with text.
