Everybody knows Nvidia (NASDAQ: NVDA) is a leading provider of chips specially designed for artificial intelligence (AI) systems. And everybody knows Nvidia's chief rival across the computing sector is Advanced Micro Devices (NASDAQ: AMD). AMD inspires far fewer AI headlines than Nvidia -- but how close is this fabled two-horse race within the confines of the AI industry?

Nvidia's AI strategy in a nutshell

It is widely known that OpenAI relied on a few thousand Nvidia A100 Tensor Core chips for training the GPT-3 and GPT-4 AI engines. The GPT-3.5 model now serving the entry-level ChatGPT tool was trained on the Selene supercomputer, featuring 4,480 A100 chips crunching training data in parallel.

The A100 GPU (graphics processing unit) is no lightweight. These chips are hard to find on store shelves, but you can run across used A100 cards on online marketplaces at unit prices ranging from $10,000 to $18,000. The newer and faster H100 chip, which will train the next generation of OpenAI's GPT system, is even harder to find and more than twice as expensive.

So the hardware going into a single AI-training system can significantly affect Nvidia's income statement. That's not the whole story, of course.
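
For a rough sense of scale, here is a minimal back-of-envelope sketch in Python that uses only the figures quoted above (Selene's 4,480-chip count and the secondhand price range). It ignores networking gear, CPUs, and everything else in the rack, and secondhand marketplace prices are not what hyperscale buyers actually pay Nvidia.

    # Rough estimate of the GPU bill for a Selene-class training system.
    # Inputs are the chip count and used-market A100 prices cited in this
    # article, not Nvidia's official data-center pricing.
    a100_count = 4_480                       # A100 GPUs in Selene
    price_low, price_high = 10_000, 18_000   # used-market price range per card

    low_estimate = a100_count * price_low    # roughly $44.8 million
    high_estimate = a100_count * price_high  # roughly $80.6 million

    print(f"GPU hardware alone: ${low_estimate:,} to ${high_estimate:,}")

Even at depreciated used-market prices, the accelerators alone for one such system land in the tens of millions of dollars.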

Supercomputers like Selene also rely on extremely high-performance network connections provided by another Nvidia product -- the InfiniBand networking solutions that came with the Mellanox acquisition in 2019. And the ultra-specialized high-performance computing cards require some data-feeding handholding from more traditional server CPUs (central processing units). In Selene's case, 8,680 AMD EPYC processors play that role.

Wait, what? Nvidia built an AMD-based supercomputer?

Yes, indeed -- Nvidia's biggest AI win to date also included thousands of 64-core AMD processors. That's how the supercomputer story goes. Bitter rivals often find themselves working together to create systems with world-class performance for hyperspecific workloads.

The AMD EPYC chip in question offered superior data bandwidth and a slightly more modern feature set than the best Intel (NASDAQ: INTC) Xeon Scalable processors available in 2020, when Selene was designed. Other Nvidia-powered systems work with Intel chips instead.

And you know what else? Two of the three fastest supercomputers in the world today use AMD Instinct MI250X chips instead of Nvidia's A100 or H100.

Just like Nvidia, AMD started churning out AI-specific accelerator chips years ago. Furthermore, the company's $60 billion Xilinx buyout made it an instant expert in field-programmable gate arrays (FPGAs) and other custom, reprogrammable chip designs that meet the unique needs of high-end customers.

AMD's AI system experts are currently chasing large contracts for training and day-to-day operation of GPT-style language models. This team includes many engineers from the Xilinx side of that big merger.

Battle of the sky-high valuations

So the battle for AI chip contracts may be more competitive than it seems at first glance. Nvidia is hogging the spotlight so far, but AMD is also attacking this opportunity with plenty of gusto and expertise. And from time to time, the companies are AI partners instead of rivals.

Yet Nvidia's stock has skyrocketed 214% in 2023, while AMD shares have gained "only" 72% (cue the sardonic air quotes). The unbalanced price gains have resulted in a much richer price tag for Nvidia shares. That stock trades at 41 times forward earnings and 45 times sales today, compared to 26 times forward earnings and 8 times sales for AMD.
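
To put that gap in plain numbers, here is a minimal sketch using the multiples cited in the previous paragraph (this article's figures, not live market data):

    # How much richer is Nvidia's valuation, using the multiples quoted above?
    nvidia = {"forward_pe": 41, "price_to_sales": 45}
    amd = {"forward_pe": 26, "price_to_sales": 8}

    pe_premium = nvidia["forward_pe"] / amd["forward_pe"]          # ~1.6x
    ps_premium = nvidia["price_to_sales"] / amd["price_to_sales"]  # ~5.6x

    print(f"Nvidia trades at {pe_premium:.1f}x AMD's forward earnings multiple")
    print(f"and {ps_premium:.1f}x AMD's price-to-sales multiple")

In other words, investors are paying roughly 1.6 times as much for each dollar of Nvidia's expected earnings, and more than 5 times as much for each dollar of its sales, as they are for AMD's.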

And Nvidia hasn't exactly knocked this business opportunity out of the park yet. Both Nvidia and AMD saw their total revenues shrink year over year in the first quarter. Their second-quarter reports might tell a different story over the next few weeks. Until then, rip-roaring AI sales are more of a future growth prospect than a proven moneymaker for these two companies.

On that note, I'm not exactly sprinting to the buying window for Nvidia shares at these lofty prices. Even AMD's far-lower valuation looks overheated these days. More affordable and equally impressive investment options are available elsewhere in the AI market. You should probably look at those attractive alternatives first.