AMD debuts rival to Nvidia’s AI chips, gives eye-popping industry forecast

AMD CEO Lisa Su said the market for AI chips is set to explode to more than US$400 billion. PHOTO: BLOOMBERG

SAN FRANCISCO – Advanced Micro Devices (AMD), taking aim at a burgeoning market dominated by Nvidia, unveiled new so-called accelerator chips that it said will be able to run artificial intelligence (AI) software faster than rival products.

The company introduced a long-anticipated line-up called the MI300 at an event on Dec 6 in California.

Chief executive Lisa Su also gave an eye-popping forecast for the size of the AI chip industry, saying it could climb to more than US$400 billion (S$536.6 billion) in the next four years.

That is more than twice as high as a projection AMD gave in August, showing how rapidly expectations are changing for AI hardware.

The launch is one of the most important in AMD’s five-decade history, setting up a showdown with Nvidia in the red-hot market for AI accelerators. Such chips help develop AI models by bombarding them with data, a task they handle more adeptly than traditional computer processors.

Building AI systems that rival human intelligence – considered the holy grail of computing – is now within reach, Ms Su said in an interview.

But deployment of the technology is still only just beginning. It will take time to assess the impact on productivity and other aspects of the economy, she added.

“The truth is we’re so early,” Ms Su said. “This is not a fad. I believe it.”

AMD is showing increasing confidence that the MI300 line-up can win over some of the biggest names in technology, potentially diverting billions in spending towards the company.

Customers using the processors will include Microsoft, Oracle and Facebook owner Meta Platforms, AMD said.

Nvidia shares dropped 2.3 per cent to US$455.03 in New York on Dec 6, a sign investors see the new chip as a threat. Still, AMD's own shares did not rise in response: on a day when tech stocks were generally down, they fell 1.3 per cent to US$116.82.

Surging demand for Nvidia chips by data centre operators helped propel that company’s shares in 2023, sending its market value past US$1.1 trillion.

The big question is how long Nvidia will essentially have the accelerator market to itself.

AMD sees an opening: Large language models – used by AI chatbots such as OpenAI’s ChatGPT – need a huge amount of computer memory, and that is where the chipmaker believes it has an advantage.

The new AMD chip has more than 150 billion transistors and 2.4 times as much memory as Nvidia’s H100, the current market leader.

It also has 1.6 times as much memory bandwidth, further boosting performance, AMD said.

Ms Su said that the new chip is equal to Nvidia’s H100 in its ability to train AI software and is much better at inference – the process of running that software once it is ready for real-world use.

While the company expressed confidence in its product’s performance, Ms Su said it will not just be a competition between two companies. Many others will vie for market share too.

At the same time, Nvidia is developing its own next-generation chips.

The H100 will be succeeded by the H200 in the first half of 2024, giving access to a new high-speed type of memory. That should match at least some of what AMD is offering.

Nvidia is then expected to introduce an entirely new processor architecture later in 2024.

AMD’s prediction that AI processors will grow into a US$400 billion market underscores the boundless optimism in the AI industry. That compares with US$597 billion for the entire chip industry in 2022, according to the International Data Corporation.

As recently as August, AMD had offered a more modest forecast of US$150 billion over the same period.

But it will take the company a while to grab a large piece of that market. AMD has said that its own revenue from accelerators will top US$2 billion in 2024, with analysts estimating that the chipmaker’s total sales will reach about US$26.5 billion.

The AI processors are based on a type of semiconductor called the graphics processing unit, or GPU, which video gamers have typically used to get the most realistic visual experience.

Their ability to perform a certain type of calculation rapidly by doing many computations simultaneously has made them the go-to choice for training AI software. BLOOMBERG
