


In addition, as Pandu Nayak writes on Google's blog, "MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio." "It's trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models." "MUM not only understands language, but also generates it." The main problem MUM aims to solve is, in Nayak's words, "having to type out many queries and perform many searches to get the answer you need." MUM is at its best on queries that don't have an easy answer, helping Google's search engine tackle complex tasks. It builds on its predecessor BERT, with Google claiming that MUM is 1,000 times more powerful. Like other state-of-the-art language models such as GPT-3 or LaMDA, Google MUM is based on the transformer architecture.
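
MUM's internals are not public, but the transformer building block it shares with BERT, GPT-3, and LaMDA is well documented. Below is a minimal sketch of the transformer's core operation, scaled dot-product self-attention, in plain NumPy; the token count and embedding size are illustrative, not anything specific to MUM:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, the core operation of a transformer.

    Q, K, V: (seq_len, d_k) query, key, and value matrices.
    Returns a (seq_len, d_k) matrix where each output row is a
    weighted mixture of the value rows, weighted by query-key similarity.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise token-to-token similarity
    # Row-wise softmax (shifted by the max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 4 "tokens" with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)  # (4, 8)
```

In a full transformer this operation is repeated across many heads and layers, which is what lets models like MUM relate every token (or, in a multimodal setting, image region) to every other one in a single pass.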

In short, MUM is an improvement to Google's search engine.
