"We gambled and it paid off": this is how Nvidia became a chip monster in the field of artificial intelligence - voila! technology

2024-03-22T23:34:07.264Z

Highlights: "We gambled and it paid off": this is how Nvidia became a chip monster in the field of artificial intelligence - voila! technology. Nvidia became in a short time the hottest stock in the world and such that even your uncle bought its shares. Nvidia actually started with video cards, the same hardware component to which the screen connects and is responsible for the display output of the computer. Probability, by chance or not, is what lies at the heart of AI and language understanding algorithms: Is the answer I gave to the question accurate enough?


About the path Nvidia has taken, the important Israeli milestone along the way, floating points, and the new processors the company announced at its huge conference in San Jose, California


Nvidia/ShutterStock

"We bet on an artificial one and it paid off. After 22 years."

So said Jensen Huang, the legendary CEO and co-founder of Nvidia, on stage at GTC, the company's huge annual developer conference (the company, by the way, writes its name in English as nVidia, not Nvidia as many mistakenly do). Huang said it half in jest, but there is quite a bit of truth in it.



His keynote once again filled the "Shark Tank", the SAP Center and home arena of the San Jose Sharks hockey team, to all 19,000 of its seats, arranged in a rock-concert configuration. And it is fitting, because Nvidia's CEO really does resemble a kind of rock star: beyond the trademark leather jackets he is known for, everyone wants to be photographed with him (and he happily obliges; the man is approachable and completely without airs), and some people genuinely adore him.



The only previous tech leader to have a similar aura and cult of personality was the fondly remembered Steve Jobs, but unlike him, Huang is far from a misanthrope.

He is approachable, not condescending, and surprisingly modest considering that he runs the third most valuable company in the world (a little over two trillion dollars, behind only Apple, at about $2.7 trillion, and Microsoft, at a little more than three trillion dollars).

Jensen Huang, legendary CEO and co-founder of Nvidia/Reuters

From 3D graphics to artificial intelligence

Although Nvidia became, in a short time, the hottest stock in the world, the kind even your uncle bought and whose name he suddenly knows how to pronounce, it was not always this way.

Nvidia has been a well-known name for decades, but only if you are a hardware enthusiast, gamer or 3D designer, because Nvidia actually started with video cards: the hardware component the screen connects to, responsible for the computer's display output in both 2D and 3D, and above all for the graphics processor (GPU) at the heart of the card. That story goes back to the late 1990s.



Unlike the computer's main processor, the CPU, which performs operations one after another, graphics processors can perform many computational operations at the same time (modern CPUs have since adopted this approach too, by the way, but that was not always the case). 3D graphics requires a huge amount of floating-point calculation, known by the English acronym FLOP (FLoating point OPeration), and this is something Nvidia's cards truly excelled at. That is why gamers and designers so often preferred them: they could process complex 3D graphics, which rests on a great deal of math, and do it quickly.
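To get an intuition for the difference, here is a minimal Python/NumPy sketch, purely an illustration (a real GPU does this in hardware, across thousands of cores at once): the same million multiplications performed one at a time versus issued as a single batched operation.

    import time
    import numpy as np

    # Two vectors of a million floating-point numbers each.
    a = np.random.rand(1_000_000)
    b = np.random.rand(1_000_000)

    # CPU-style: one multiplication after another, in a plain Python loop.
    start = time.perf_counter()
    one_at_a_time = [x * y for x, y in zip(a, b)]
    print(f"one at a time: {time.perf_counter() - start:.3f} s")

    # GPU-style: the same million multiplications issued as one batched operation.
    start = time.perf_counter()
    batched = a * b
    print(f"batched:       {time.perf_counter() - start:.3f} s")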


Nvidia actually started with video cards, the hardware component the screen connects to/ShutterStock

In itself, 3D graphics is a fine business, but still a relatively niche one.

But Huang discovered something interesting: the floating-point calculations that his company's cards are so good at are also good for other things, such as calculating probabilities.

Probability, by chance or not, is what lies at the heart of AI and language-understanding algorithms: how likely is it that the answer I gave to the question is accurate enough?
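As a rough illustration of what "probability at the heart of AI" means, here is a minimal sketch, with made-up words and scores, of how a language model turns raw scores into probabilities for its next word; the whole step is floating-point arithmetic of exactly the kind graphics processors are built for.

    import numpy as np

    def softmax(scores):
        # Turn raw model scores into probabilities that sum to 1.
        shifted = np.exp(scores - np.max(scores))
        return shifted / shifted.sum()

    # Hypothetical scores a model might assign to candidate next words.
    candidates = ["Paris", "London", "banana"]
    scores = np.array([5.1, 3.2, -1.0])

    for word, p in zip(candidates, softmax(scores)):
        print(f"{word}: {p:.3f}")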

And so, almost overnight, Jensen Huang decided to transform Nvidia from a company that only makes powerful graphics cards into one that also supports machine learning and artificial intelligence. He informed his employees of the decision by email, and Nvidia, which had already begun to build a name for itself in the server world, entered the new field very aggressively.

Nvidia became, in a short time, the hottest stock in the world/ShutterStock

Blackwell is the new black

Nvidia began to introduce processors that specialize in complex computational work, measured in enormous numbers of floating-point operations per second. It started with a unit called the gigaflop, that is, billions of floating-point operations per second, quickly moved to larger units such as the teraflop, and today we talk about the petaflop, which is one thousand teraflops, or:



1,000,000,000,000,000 floating point operations per second.

And yes, that's a 1 followed by 15 zeros...
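In code, the units mentioned here are simply powers of ten; a trivial Python sketch, just to spell out the zeros:

    gigaflop = 10**9    # a billion floating-point operations per second
    teraflop = 10**12   # a thousand gigaflops
    petaflop = 10**15   # a thousand teraflops

    print(f"{petaflop:,}")        # 1,000,000,000,000,000
    print(petaflop // teraflop)   # 1000 teraflops in one petaflop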


The new processing unit that Nvidia announced this week at its conference in San Jose, California, called "Blackwell", is a crazy leap forward.

What started with the "Pascal" processors that Nvidia introduced in 2016 (all of Nvidia's processor architectures are named after famous mathematicians and scientists) has today reached the form of a huge cabinet, weighing well over a ton, with trays upon trays of advanced electronics inside.



Inside one such cabinet, named GB200, sit 72 new Blackwell processors alongside a number of Grace CPUs, all connected by a super-fast interconnect called NVLink.

Much of the high-speed networking in these systems, by the way, is an Israeli development from Mellanox, which Nvidia acquired in its entirety back in 2020.



Remember we talked about petaflops?

Good.

So an array of Blackwell processors is capable of 40 petaflops, that is, 40,000,000,000,000,000 floating-point operations per second, five times more than Nvidia's previous generation of processors in this field...
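Taking the article's figures at face value, the comparison works out like this (the numbers are the ones quoted above, not independent measurements):

    petaflop = 10**15
    blackwell_array = 40 * petaflop             # figure quoted for a Blackwell array
    previous_generation = blackwell_array // 5  # "five times more" than before

    print(f"{blackwell_array:,}")               # 40,000,000,000,000,000
    print(previous_generation // petaflop)      # 8 petaflops for the previous generation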


Jensen Huang introduces the new processors/Reuters

The secret sauce

But what is all this good for, you ask?

Excellent question.

Such an insane number of floating-point operations is exactly what you need to run a large language model (LLM), or as you know it: ChatGPT and the like.

Yes, every query you send to ChatGPT, whether asking a question or generating images of pink elephants, ends up being executed on Nvidia processors, which work out what you asked for in the form of probabilities and inference and return an answer.

And to serve so many requests per second, you need monstrous processors like Blackwell.
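To get a sense of the scale, here is a back-of-the-envelope sketch; the model size, reply length and traffic are made-up assumptions, and the figure of roughly two floating-point operations per model parameter per generated token is a common rule of thumb, not an Nvidia number:

    # Why serving chatbots needs petaflop-class hardware (illustrative numbers only).
    params = 70e9                     # a hypothetical 70-billion-parameter model
    flops_per_token = 2 * params      # rough rule of thumb for one generated token
    tokens_per_reply = 500            # assumed length of a typical answer
    requests_per_second = 1_000       # assumed traffic across all users

    needed = flops_per_token * tokens_per_reply * requests_per_second
    print(f"about {needed / 1e15:.0f} petaflops, every second")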



Nvidia today is more or less a de facto monopoly in the field of artificial intelligence.

Almost the entire industry works with its processors and its software.

Nvidia's secret sauce is that it offers developers a complete package: both the hardware and the software that knows how to work with it, including, today, the ability to take a third-party artificial intelligence model from Google, OpenAI or anyone else and simply start working with it.
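In practice, "take a third-party model and just start working with it" can look as simple as the sketch below. To be clear, this is not Nvidia's own software stack (that would be CUDA and the libraries built on top of it) but the open-source Hugging Face transformers library, which runs such models on Nvidia GPUs through CUDA; the model chosen is just a small public example.

    import torch
    from transformers import pipeline

    # Use the first Nvidia GPU if one is available, otherwise fall back to the CPU.
    device = 0 if torch.cuda.is_available() else -1

    # Download a ready-made third-party model and start generating text with it.
    generator = pipeline("text-generation", model="gpt2", device=device)
    print(generator("Nvidia's processors are", max_new_tokens=20)[0]["generated_text"])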

And it costs developers and companies money.

A lot of money.

Especially since today, artificial intelligence is a tool that is applied in almost every industry in an attempt to improve it.



Nvidia's processors and software are not only in artificial intelligence.

They are in robots, autonomous cars and everything that requires the calculation of probabilities, and Nvidia is currently the largest company in the world in this field, even though there are already several startups trying to steal the glory - and the cash - from it.



But at a question-and-answer session with reporters, Jensen Huang didn't sound worried.

"There are a series of characteristics that make our processor unique. We make the model work well. It's not a hardware problem, it's a software problem... energy efficiency and cost are also important considerations."

And when you are buying processors at $30,000-40,000 apiece, cost really is an important consideration.



Now multiply such costs by the number of processors actually sold across the entire industry that runs artificial intelligence models, from the giants (Google, Microsoft, Amazon, Facebook) to small start-ups alike, and you will understand why nobody expects Nvidia's stock to drop any time soon.
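One made-up example of that multiplication (the order size is invented purely for illustration; only the price range comes from the article):

    price_per_gpu = 35_000        # dollars, mid-range of the $30,000-40,000 quoted above
    gpus_for_one_buyer = 20_000   # a hypothetical order from a single large customer

    print(f"${price_per_gpu * gpus_for_one_buyer:,}")   # $700,000,000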

Huang was right.

Their gamble paid off.

It paid off very well.


Source: walla
