Artificial intelligence has turbo-charged the search for new, useful algorithms, such as those behind text translation and image recognition.
Once again the result comes from Google's DeepMind, with AlphaTensor, a system that has identified some of the best 'shortcuts' for making certain types of complex calculations easier.
The results, obtained under the guidance of DeepMind researcher Alhussein Fawzi, are published in the journal Nature.
"AlphaTensor was able to identify the most efficient ways to multiply matrices, that is, more or less large blocks of numbers widely used in the IT field for a great variety of operations that have very concrete applications", - the expert told ANSA. of neural networks Simone Scardapane, of the Sapienza University of Rome, commenting on the result.
AlphaTensor is a sort of evolution of AlphaGo, the algorithm developed by DeepMind and specialized in playing Go, considered one of the most demanding games for a machine because it requires an enormous processing effort, which in 2016 also defeated the world Go champion, Lee Sedol.
"In fact, the researchers asked the algorithm to 'play' in finding the best ways to multiply two matrices together," explained Scardapane.
"There are in fact two paths: one is to perform all the operations step by step, the other is to find some sort of shortcuts that lead to identical results, but eliminating some calculations. AlphaTensor has found the best shortcuts and improved many of those already. discoveries ".
The possible impact of the new system is significant because it could reduce by as much as 10% the number of operations needed to complete very demanding calculations, such as those behind text translation, dialogue, and image recognition.
Fewer operations mean less energy used and, consequently, lower costs and reduced emissions from large data centers.