COMPUTING THE WORLD

DailyPost 2009

Compute is the most critical capability defining the outstanding potential of the digital world. Whichever way you go, you will finally reach where the buck stops: the computing threshold that has been achieved. How do you manage the complexity and cost? Moore's law has supported us for the last fifty years, with compute doubling every 18 to 24 months. What is its final potential, or is there a need to move into a different genre of computing? Will it be possible to take subscription-based computing to the next level of computing as and when that happens? The world seems to be moving in the direction of computing the impossible, from today's standpoint. Data needs a new treatment for the bold new, completely transformed digital world, with requirements becoming more challenging than ever.
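
To get a rough feel for what doubling every 18 to 24 months compounds to, here is a minimal Python sketch; the fifty-year horizon and the two doubling periods are taken from the paragraph above, and the output is purely illustrative.

```python
# Illustrative only: the compound growth implied by Moore's law
# (compute doubling every 18 to 24 months) over fifty years.

def growth_factor(years: float, doubling_months: float) -> float:
    """Total multiplier after `years` if capacity doubles every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

for months in (18, 24):
    print(f"Doubling every {months} months -> "
          f"~{growth_factor(50, months):.3g}x over 50 years")
```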

As per IDC, 64.2 ZB of data was created in 2020, and that figure is expected to grow to 180 ZB by 2025. Of all the data created in 2020, only 10.6% was useful for analysis, and only about 44% of that was actually used. There is no way to confirm the backend calculation of these figures, or how the two connect to the utility they have delivered. The fact of the matter is that with all the hype around big data analytics, ML and AI, we may barely be at the tip of the iceberg. Only a few IT behemoths have made the first cut with the presently available technology. The compute paradigm shift has to happen. Every industry has its "grand challenges." How would life be if they weren't there, or if they were solvable with the available technology? One example: the financial services industry operates on the assumption that accurately modelling and predicting the stock market is very hard to do.
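
Taking the IDC figures at face value, a back-of-the-envelope calculation shows just how little of 2020's data was actually put to work; the percentages below are quoted straight from the paragraph above, and the chaining of the two shares is an assumption about how they combine.

```python
# Back-of-the-envelope using the IDC figures quoted above.
created_zb = 64.2          # data created in 2020, in zettabytes
useful_share = 0.106       # share deemed useful for analysis
used_share = 0.44          # share of the useful data actually used

useful_zb = created_zb * useful_share
used_zb = useful_zb * used_share
print(f"Useful for analysis: ~{useful_zb:.1f} ZB")   # ~6.8 ZB
print(f"Actually used:       ~{used_zb:.1f} ZB")     # ~3.0 ZB
```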

What, then, would be the computing landscape that helps sort out the data tomb over our heads and simultaneously solves the grand challenges? The new age of emerging machines will have the capability to tackle these core challenges, with quantum being the pinnacle of next-gen problem solving. Short of that final frontier, which remains quite distant even as and when it gets achieved, high performance computers (HPC), or massively parallel processing supercomputers, can help businesses make use and sense of the huge swaths of data under their command, work that is currently either too expensive or too inefficient for traditional computing.
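
To make the parallelism point concrete, here is a minimal, hypothetical sketch of the split-process-combine pattern that massively parallel machines apply at vastly larger scale. The chunking scheme and the toy `summarise` function are assumptions for illustration only, not any specific HPC framework or API.

```python
# Toy illustration of the split -> process-in-parallel -> combine pattern
# that massively parallel machines apply at far larger scale.
from concurrent.futures import ProcessPoolExecutor

def summarise(chunk: list[float]) -> float:
    """Stand-in for real per-chunk analysis (hypothetical)."""
    return sum(chunk)

def parallel_total(data: list[float], n_chunks: int = 4) -> float:
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor() as pool:       # chunks processed concurrently
        partials = pool.map(summarise, chunks)
    return sum(partials)                      # combine the partial results

if __name__ == "__main__":
    print(parallel_total([float(x) for x in range(1_000_000)]))
```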

Another area making waves is biology-inspired compute. There is no better teacher than nature. It "draws inspiration from, or relies on, natural biological processes to store data, solve problems or model complex systems in fundamentally different ways." It is expected that the three sets of machines enumerated above will exponentially enable us to solve the world's deepest challenges. Experimentation will be the key as we move forward. Already, a combination of GPUs, ASICs and other purpose-built chips has started to push HPC capabilities to the next level. One example: Tesla's D1 Dojo chip was built to run the computer vision neural networks that underpin the company's self-driving technology. Claiming to solve the "grand challenges" in a theoretical and abstract manner will no longer satiate the world's exponential and dire digital need.
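
As one concrete instance of biology-inspired computing, the sketch below implements a bare-bones genetic algorithm, a classic technique modelled on natural selection. The toy fitness function, target value and parameters are illustrative assumptions, not anything drawn from the article.

```python
# Bare-bones genetic algorithm: selection, crossover, mutation,
# as one classic example of biology-inspired problem solving.
import random

TARGET = 42  # toy goal: evolve integers toward this value (illustrative)

def fitness(x: int) -> float:
    return -abs(x - TARGET)  # closer to TARGET is fitter

def evolve(generations: int = 50, pop_size: int = 20) -> int:
    population = [random.randint(0, 100) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # Crossover + mutation: refill from averaged pairs, nudged randomly.
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            children.append((a + b) // 2 + random.randint(-3, 3))
        population = survivors + children
    return max(population, key=fitness)

print(evolve())  # typically converges near 42
```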

MAKING COMPUTING THE IMPOSSIBLE THE NORM IS THE BIGGEST GRAND CHALLENGE
