The Limited Times


Meta finalizes a supercomputer to power Mark Zuckerberg's metaverse

2022-01-24T22:57:29.245Z


The company will inaugurate, in the middle of the year, a computing center designed to train advanced artificial intelligence models, such as the one needed to translate in real time what people say in different languages


Artificial intelligence (AI), the technology behind the most complex computer applications we use today, requires processing huge amounts of data and a great deal of computing power. Meta (the company formerly known as Facebook) announced this Monday that a new supercomputer, the AI Research SuperCluster (RSC), specially designed to contribute to the development of AI, will come into operation in the middle of the year. The company stresses that its mission will be not only to create more accurate models for existing applications, but also to "develop the foundational technologies with which we will feed the metaverse."

The metaverse is a kind of virtual world in which, according to the plans of Mark Zuckerberg, founder of Facebook, we will spend a good part of our lives. There we will interact with each other, entertain ourselves, work, study, exercise and shop. The metaverse, Meta's big bet, does not exist yet. Creating this virtual environment will require relying on machine learning and deep learning systems, the most sophisticated and promising AI techniques.

These tools make it possible, for example, to ensure that avatars' movements are as natural as possible, that virtual reality glasses capture users' facial movements and reproduce them in their avatars, or that the leaves of the trees in that world move convincingly.

"RSC will help Meta's researchers build new and better AI models that can learn from trillions of examples, work across hundreds of different languages, analyze text, images and video together, and develop new augmented reality tools," the company maintains, though it has not revealed where the new supercomputer will be located.

Appearance of a processor room of the Meta AI RSC supercomputer.

The term machine learning is not a euphemism. This artificial intelligence technique consists of letting the program fend for itself to solve a given problem. The key to its success is not so much how the system is programmed as the training it is given to draw its own conclusions. For example, if you want it to recognize a tree, the result will differ depending on whether it is shown a thousand trees to learn what they look like, or a million. In machine learning, the size of the dataset matters. And managing huge datasets requires a lot of computing power and a certain infrastructure, which is what the new Meta supercomputer aims to provide.
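The effect of dataset size can be sketched with a deliberately tiny, hypothetical example (not related to Meta's actual systems): a toy "classifier" that learns a single numeric threshold separating two made-up classes. Trained on more examples, its learned boundary lands closer to the true one.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

TRUE_BOUNDARY = 0.5  # toy rule: values above this are "tree", below are "shrub"

def sample(n):
    """Generate n labeled training examples under the toy rule."""
    xs = [random.random() for _ in range(n)]
    return [(x, x > TRUE_BOUNDARY) for x in xs]

def train(data):
    """Learn a threshold: midpoint between the largest 'shrub' and smallest 'tree'."""
    shrubs = [x for x, is_tree in data if not is_tree]
    trees = [x for x, is_tree in data if is_tree]
    return (max(shrubs) + min(trees)) / 2

# Error of the learned boundary with few vs. many training examples
error_small = abs(train(sample(20)) - TRUE_BOUNDARY)
error_large = abs(train(sample(20_000)) - TRUE_BOUNDARY)
print(f"error with 20 examples:     {error_small:.5f}")
print(f"error with 20,000 examples: {error_large:.5f}")
```

The larger training set pins down the boundary far more precisely, which is the intuition behind "the size of the dataset matters"; real deep learning models repeat this effect across billions of parameters, which is why they need supercomputer-scale hardware.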

Among the new applications to be investigated in the short term is a simultaneous voice translator for large groups of people: a program capable of translating in real time what people say in different languages, so that they can collaborate on work projects or in video game matches. Work will also be done on improving computer vision systems (automatic recognition and labeling of objects in images) and on detecting inappropriate content. So-called natural language processing, which allows machines to process our spoken instructions, also has plenty of room for improvement. With more training, the system could be fine-tuned in how it recognizes dialects, separates ambient noise from the human voice, or fills in silences.

Although the supercomputer is not yet finished, preliminary tests show that it processes computer vision workflows 20 times faster. It has also been shown to run NVIDIA's Collective Communications Library (NCCL), which handles communication between the GPUs Meta uses, nine times faster than the best data center available so far, and to train large natural language processing models three times faster. "That means that a model with tens of billions of parameters can finish training in three weeks, compared with the nine it needed until now," Meta says.

You can follow EL PAÍS TECNOLOGÍA on Facebook and Twitter, or sign up here to receive our weekly newsletter.

Source: El País

