M6

DailyPost 1874

What is M6? There would be very few, even in the AI domain, who would venture a reply. InfoQ describes M6's main skills: "(It) has cognition and creativity beyond traditional AI, is good at drawing, writing, question and answer, and broad application prospects in many fields such as e-commerce, manufacturing, literature and art." This is the latest building block in the fast-moving world of Artificial Intelligence research. GPT-3 and OpenAI Codex have already been making waves. The recent language model MT-NLG has shown remarkable improvements over those two. It is like an AI war now. In the everlasting tussle between AI inflection points and AI winters, it is now thought that we have entered the stage of a persistent inflection point, which will keep growing from strength to strength.

Just when MT-NLG was being seen as the clear-cut front runner in the race, M6 has completely changed the game. It comes out of the Alibaba stable, which had not been seen as a major contender to date; or at least nothing remarkable had been reported in the public domain. M6 is the product of Alibaba DAMO Academy. Announced on June 25, it is a large multimodal, multitasking language model with 1 trillion parameters, which is, so to say, 5x the size of GPT-3, the gold standard in this field. With multimodality and multitasking, it is certainly a notch ahead of previous models, moving in the direction of general intelligence.
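
A quick back-of-the-envelope check of that size claim can be sketched in a few lines of Python. The only figures used are GPT-3's published 175 billion parameters and the 1 trillion parameters cited above; this is illustrative arithmetic, not an official benchmark.

    # Rough parameter-count comparison based on the figures cited in this post.
    GPT3_PARAMS = 175e9      # GPT-3: ~175 billion parameters (published figure)
    M6_JUNE_PARAMS = 1e12    # M6, June 2021 announcement: ~1 trillion parameters

    ratio = M6_JUNE_PARAMS / GPT3_PARAMS
    print(f"M6 (June 2021) is roughly {ratio:.1f}x the size of GPT-3")
    # -> roughly 5.7x, consistent with the "5x GPT-3" description above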

It seems to have left nothing to either the competition or the imagination, even for the best in the Artificial Intelligence research and scientific fraternity. In just about a year, a neural network more than 50 times larger than another can be trained at 100 times less energy cost. All of us know what big energy guzzlers these AI platforms are. It simply means that M6's 10 trillion parameters can be achieved at 1% of GPT-3's energy cost. On energy and efficiency, Alibaba's researchers say "they reduce the consumption of the model by 80% and increased its efficiency x11 when compared to 100-million language models." It paves the way for smaller players to enter the game of large AI models.
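
To make the scale-versus-energy point concrete, here is a small illustrative calculation using only the figures quoted in this post (roughly 50x the parameters of GPT-3 at roughly 1% of its training energy). The numbers are Alibaba's claims as reported, not independently verified measurements.

    # Illustrative arithmetic for the scale-vs-energy claim cited above.
    GPT3_PARAMS = 175e9        # GPT-3: ~175 billion parameters
    M6_NOV_PARAMS = 10e12      # M6, November 2021 announcement: ~10 trillion parameters
    RELATIVE_ENERGY = 0.01     # claimed training energy: ~1% of GPT-3's

    size_ratio = M6_NOV_PARAMS / GPT3_PARAMS
    efficiency_gain = size_ratio / RELATIVE_ENERGY   # parameters trained per unit of energy

    print(f"Size ratio: ~{size_ratio:.0f}x larger than GPT-3")
    print(f"Parameters per unit of training energy: ~{efficiency_gain:,.0f}x better")
    # -> ~57x larger at 1% of the energy, i.e. roughly 5,700x more parameters
    #    per unit of training energy, if the claimed figures hold.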

Alibaba's two announcements, in June and November 2021, are now providing a new direction for the robustness and energy consumption of these models, and perhaps for their pricing too in the days to come. It will take some time to understand what this means computationally and what form the delivery will take. They have also brought energy consumption down to 1% of what GPT-3 needed to train. This would reduce the carbon footprint in a big way, moving towards green and clean cutting-edge solutions. If Alibaba publishes its techniques and methods, it would lead to a revolutionary shift in the democratization of AI. Small players could then compete with the large corporations monopolizing the super-profitable field of large AI models.

WE ARE AT THE CUSP OF DEMOCRATIZATION OF ARTIFICIAL INTELLIGENCE.

Sanjay Sahay
