Research Team Shows AI Computing Power Doubling Every 6 Months

The years 2021 and 2022 could be considered the period when supercomputers began to reach their potential. Today’s super thinking machines have completely demolished the way we used to measure the growth of AI. If you are involved in computing, you have probably heard of Moore’s Law: the observation, made by Gordon Moore in 1965, that the number of transistors in a dense integrated circuit (IC) doubles about every two years.

Discovermagazine.com has published a story by M. Smith, who says computer power is exploding past Moore’s Law.

Smith writes this:

“Now we get an answer thanks to the work of Jaime Sevilla at the University of Aberdeen in the UK and colleagues who have measured the way computational power in AI systems has increased since 1959. Sevilla has been awarded a Marie Skłodowska-Curie grant to work on developing explainable tools for probabilistic reasoning as a researcher at Aberdeen University. This team says the performance of AI systems during the last 10 years has doubled every six months or so, significantly outperforming Moore’s Law.”
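It is worth pausing on the arithmetic behind that claim. A minimal Python sketch (the ten-year window is the one the quote refers to; the comparison against a roughly 24-month Moore’s Law doubling is our own) shows how far apart the two growth rates end up:

```python
# Compare total compute growth over a decade under two doubling times:
# the team's reported ~6 months for AI systems vs. roughly 24 months
# for Moore's Law.

def growth_factor(years: float, doubling_time_months: float) -> float:
    """Total multiplicative growth after `years` at the given doubling time."""
    doublings = years * 12 / doubling_time_months
    return 2.0 ** doublings

print(f"6-month doubling over 10 years:  {growth_factor(10, 6):,.0f}x")
print(f"24-month doubling over 10 years: {growth_factor(10, 24):,.0f}x")
# 6-month doubling over 10 years:  1,048,576x
# 24-month doubling over 10 years: 32x
```

In other words, a six-month doubling time compounds to roughly a million-fold increase in a decade, against about 32-fold under Moore’s Law.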

Measuring the Growth of AI

How to prove or disprove Moore’s Law for AI was a question put to early AI researchers. Even the methods AI uses to solve a given problem are not completely understood. But researchers have made progress.

This improvement has come about because of the convergence of three factors. The first is the development of new algorithmic techniques, largely based on deep learning and neural networks. The second is the availability of large datasets for training these machines. The final factor is increased computational power.

While the influences of new datasets and the performance of improved algorithms are hard to measure and rank, computational power is relatively easy to determine. And that has pointed Sevilla and others towards a way to measure the performance of AI systems.
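Training compute is usually counted in floating-point operations (FLOPs). As a rough illustration of why it is “relatively easy to determine,” here is one widely used rule of thumb for dense neural networks; this is an assumption for illustration, not necessarily the accounting Sevilla’s team performed:

```python
# Back-of-the-envelope training compute: a common rule of thumb for
# dense neural networks is ~6 FLOPs per parameter per training token,
# i.e. C ~ 6 * N * D. Illustration only; not necessarily the method
# used in the study discussed above.

def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOPs."""
    return 6 * n_params * n_tokens

# Hypothetical model: 1 billion parameters, 100 billion training tokens.
print(f"~{training_flops(1e9, 100e9):.1e} FLOPs")  # ~6.0e+20 FLOPs
```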

They say that between 1959 and 2010, the amount of computational power used to train AI systems doubled every 17 to 29 months. They call this time the pre-Deep Learning Era.
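A doubling time like that falls straight out of two compute measurements taken some years apart. A short sketch (the 30-million-fold growth figure below is a hypothetical input chosen to land inside the team’s reported range, not a number from the study):

```python
import math

def doubling_time_months(c1: float, c2: float, months_apart: float) -> float:
    """Doubling time implied by exponential growth from compute c1 to c2."""
    return months_apart * math.log(2) / math.log(c2 / c1)

# Hypothetical: training compute grows ~30-million-fold over the
# 51 years from 1959 to 2010.
print(f"{doubling_time_months(1.0, 3e7, 51 * 12):.0f} months")  # ~25 months
```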

“The trend in the pre-Deep Learning Era roughly matches Moore’s law,” conclude Sevilla and his team.

Sevilla and others point to what they are calling Parallel Processing. Smith’s article explains more about how teams have tied neural networks together. This trend has led to the development of systems such as AlphaGo and AlphaFold, which have cracked Go and protein folding respectively.

“These large-scale models were trained by large corporations, whose larger training budgets presumably enabled them to break the previous trend,” say Sevilla and co.

The larger the digital brain, such as Deep Blue or AlphaGo, the faster the processing. That, Smith says, is why Moore’s Law may no longer matter when it comes to measuring AI.

However, a common measure of this kind suggests that it ought to be possible to measure AI performance on an ongoing basis, perhaps in a way that produces a ranking of the world’s most powerful machines, much like the TOP500 ranking of supercomputers.
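Such a list would be straightforward to compile once per-system compute figures exist. A toy sketch of a TOP500-style table follows; the system names and FLOP counts are placeholders, not real measurements:

```python
# Toy TOP500-style ranking of AI systems by estimated training compute.
# All names and figures below are illustrative placeholders.
systems = {
    "System A": 3.1e23,  # training FLOPs (hypothetical)
    "System B": 2.3e24,
    "System C": 5.8e22,
}

for rank, (name, flops) in enumerate(
        sorted(systems.items(), key=lambda kv: kv[1], reverse=True), 1):
    print(f"{rank}. {name}: {flops:.1e} training FLOPs")
```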

Last month, Facebook’s parent company, Meta, announced that it had built the world’s most powerful supercomputer devoted to AI. Just where it sits according to Sevilla and co’s measure isn’t clear, but it surely won’t be long before a competitor challenges that position.

Perhaps it’s time computer scientists put their heads together to collaborate on a ranking system that will help keep the record straight.

read more at discovermagazine.com