Nvidia Unveils New AI Chip, The ‘B200 Blackwell’


Nvidia chief executive Jensen Huang has unveiled the company’s new AI chip, the ‘B200 Blackwell’, aimed at extending its dominance of the artificial-intelligence industry.

Mr Huang made the announcement on Monday at the company’s annual developer conference in San Jose, California, the first since the pandemic, an event that has been called a “Woodstock” for AI developers.

The new processor design, ‘Blackwell’, is 30 times faster at handling some tasks than its predecessor.

The Blackwell chips, which are made up of 208 billion transistors, will be the basis of new computers and other products being deployed by the world’s largest data centre operators – a roster that includes Amazon, Microsoft, Alphabet’s Google and Oracle.

Blackwell is named after David Blackwell, the first Black scholar inducted into the National Academy of Sciences.

Its predecessor, Hopper, fueled explosive sales at Nvidia by building up the field of AI accelerator chips. The flagship product from that line-up, the H100, has become one of the most prized commodities in the tech world – fetching tens of thousands of dollars per chip.

The growth has sent Nvidia’s valuation soaring as well. It is the first chipmaker to have a market capitalisation of more than $US2 trillion ($3 trillion) and trails only Microsoft and Apple overall.

The announcement of new chips was widely anticipated, and Nvidia’s stock is up 79 per cent this year through Monday’s close. That made it hard for the presentation’s details to impress investors, who sent the shares down about 1 per cent in extended trading.

Mr Huang, Nvidia’s co-founder, said AI was the driving force in a fundamental change in the economy and that Blackwell chips were “the engine to power this new industrial revolution”.

The new design has so many transistors – the tiny switches that give semiconductors their ability to store and process information – that it’s too big for conventional production techniques. It’s actually two chips married to each other through a connection that ensures they act seamlessly as one, the company said.

Nvidia’s manufacturing partner, Taiwan Semiconductor Manufacturing Co, will use its 4NP technique to produce the product.

Blackwell will also have an improved ability to link with other chips and a new way of crunching AI-related data that speeds up the process. It’s part of the next version of the company’s “super chip” line-up, meaning it’s combined with Nvidia’s central processing unit called Grace.

Users will have the choice to pair those products with new networking chips – one that uses the proprietary InfiniBand standard, and another that relies on the more common Ethernet protocol. Nvidia is also updating its HGX server machines with the new chip.

The company got its start selling graphics cards that became popular among computer gamers. Nvidia’s graphics processing units, or GPUs, ultimately proved successful in other areas because of their ability to divide up calculations into many simpler tasks and handle them in parallel. That technology is now graduating to more complex, multistage tasks, based on ever-growing sets of data.
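The parallel-processing idea described above – splitting one large job into many small, independent tasks and running them simultaneously – can be sketched in a few lines of Python. This is purely an illustrative analogy (a thread pool standing in for a GPU’s thousands of cores), not Nvidia code; the `brighten` function and pixel values are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel):
    # One small, independent task: brighten a single pixel,
    # clamping the result to the 0-255 range.
    return min(pixel + 50, 255)

# A toy "image" of 16 pixel values.
pixels = list(range(0, 256, 16))

# Each pixel's task depends on no other pixel, so all of them
# can be handed out to workers and computed in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    result = list(pool.map(brighten, pixels))

print(result)
```

Because every sub-task is independent, the work scales with the number of workers; a GPU applies the same principle across thousands of cores at once.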

Blackwell will help drive the transition beyond relatively simple AI jobs, such as recognising speech or creating images, the company said. That might mean generating a three-dimensional video by simply speaking to a computer, relying on models that have as many as 1 trillion parameters.

For all its success, Nvidia’s revenue has become highly dependent on a handful of cloud computing giants: Amazon, Microsoft, Google and Meta Platforms.

The challenge for Nvidia is broadening its technology to more customers. Mr Huang aims to accomplish this by making it easier for corporations and governments to implement AI systems with their own software, hardware and services.
