DailyPost 2205

Research is our future, and for some time now a large part of it has moved to industry, at least the result-yielding part. It has also been variously argued that if the product of research brings no change to our existence in that field, it is better not conducted. DeepMind, a company which had the promise to make it happen, was bought by Google in 2014, largely on the terms of its founder Demis Hassabis, who still runs the show, with Google getting prefixed to its name. The acquisition was meant to give him the huge computational and other resources that a company of that nature would require. Their conviction for research is certainly game changing.

In 2016 it was claimed that Google DeepMind's AI was able to reduce the cooling bill of Google's data centres by 40%. This was literally out-of-this-world research, of a kind very rarely found and discussed in the public domain. Besides DALL-E 2, which is making waves in the area of text-prompted image creation, another revolutionary technology might just be on the cards. DeepMind has now found a new way to multiply numbers and thus speed up computers. While a lot can happen on the hardware side, algorithms of this nature can make a revolutionary change to compute, which is at the core of our day-to-day functioning.

The area is known as matrix multiplication. In this operation two grids of numbers are multiplied together, which forms the basis of a large number of computing tasks. An improved technique discovered by DeepMind's artificial intelligence could boost computation speeds by up to 20%. Matrix multiplication as a fundamental computing task is used by almost all software to a limited extent, but it is of immense value in the cutting-edge tech areas of the day: graphics, AI and scientific simulations. It is argued that "even a small improvement in the efficiency of these algorithms could bring large performance gains, or significant savings."
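To make the operation concrete, here is a minimal sketch (in plain Python, names of my own choosing) of the schoolbook method: each entry of the result grid is the dot product of a row of the first grid with a column of the second, so multiplying two n-by-n matrices this way costs n cubed multiplications.

```python
def matmul(A, B):
    """Schoolbook matrix multiplication of A (n x m) by B (m x p)."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):
                # one scalar multiplication per (i, j, k) triple: m*n*p in total
                C[i][j] += A[i][k] * B[k][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

For the 2-by-2 case shown, this costs 8 scalar multiplications, which is exactly the count Strassen's trick improves on.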

For centuries the truism was that the cost of the most efficient way of multiplying matrices would be proportional to the number of elements being multiplied. The task thus gets harder for larger matrices. In 1969, Volker Strassen proved that multiplying a two-by-two matrix by another of the same size does not necessarily require 8 multiplications and, with a clever trick, can be done in 7. The approach requires some extra additions, which are computationally less expensive than multiplications. It held ground for over 50 years. But DeepMind's AI has now discovered a faster technique. "It found an algorithm for multiplying two matrices of four rows of four numbers using just 47 multiplications, which outperforms Strassen's 49 multiplications." It has also developed improved techniques for multiplying matrices of other sizes, 70 in total.
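Strassen's trick for the two-by-two case can be written out directly. The sketch below (plain Python, my own variable names) forms seven cleverly chosen products and recombines them with additions and subtractions to recover all four entries of the result, using 7 multiplications instead of 8.

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using Strassen's 7 products."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    # Seven products, each costing a single scalar multiplication
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # The result entries are recovered with additions and subtractions only
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Applied recursively to larger matrices, treating each quadrant as a block, this trick is what yields 49 multiplications for the four-by-four case that DeepMind's 47-multiplication algorithm now beats.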

Sanjay Sahay
