Nvidia, Fueled by the AI Boom, Reports Rising Revenue and Profit

Nvidia, which makes microchips that power most artificial intelligence applications, began an extraordinary journey a year ago.

Fueled by an explosion of interest in artificial intelligence, the Silicon Valley company said last May that it expected chip sales to skyrocket. They did, and the fervor hasn't stopped, with Nvidia increasing its revenue projections every few months. Its shares have soared, giving the company a market capitalization of more than $2 trillion that makes it more valuable than Alphabet, Google's parent company.

On Wednesday, Nvidia again reported rising revenues and profits that underlined how it remains the dominant winner of the artificial intelligence boom, even as it grapples with outsized expectations and growing competition.

Revenue was $26 billion for the three months ended in April, topping the company's February forecast of $24 billion and tripling sales from a year earlier for the third straight quarter. Net income increased sevenfold to $5.98 billion.

Nvidia also expects revenue of $28 billion for the current quarter, ending in July, more than double the amount from a year ago and higher than Wall Street estimates.

“We are fundamentally changing how computers work and what computers can do,” Jensen Huang, Nvidia's chief executive, said on a conference call with analysts. “The next industrial revolution has begun.”

Shares of Nvidia, which have risen more than 90% this year, rose in after-hours trading after the results were released. The company also announced a 10-for-1 stock split.

Nvidia, which originally sold chips for rendering images in video games, profited after making an early and expensive bet on adapting its graphics processing units, or GPUs, to perform other computing tasks. When AI researchers began using these chips more than a decade ago to speed up tasks like recognizing objects in photos, Mr. Huang jumped at the opportunity. Since then, Nvidia has enhanced its chips for artificial intelligence tasks and developed software to aid work in the field.

The company's flagship processor, the H100, has enjoyed strong demand to power AI chatbots like OpenAI's ChatGPT. While most standard high-end processors cost a few thousand dollars, the H100s have been selling for between $15,000 and $40,000 each, depending on volume and other factors, analysts say.

Colette Kress, Nvidia's chief financial officer, said Wednesday that she had worked in recent months with more than 100 customers building new data centers — which Mr. Huang calls AI factories — ranging in size from hundreds of GPUs to tens of thousands, with some reaching 100,000. Tesla, for example, is using 35,000 H100 chips to help train models for autonomous driving, she said.

Nvidia will soon begin shipping Blackwell, a powerful successor to the H100 that it announced in March. Demand for the new chips already appears to be strong, raising the possibility that some customers will wait for the faster models rather than buy the H100. But there were no signs of such a pause in Nvidia's latest results.

Ms. Kress said demand for Blackwell is well in excess of supply, and “we expect demand could exceed supply well into next year.” Mr. Huang added that the new chips should be operational in data centers by the end of the year and that “we will see a lot of revenue from Blackwell this year.”

The comments could ease fears that Nvidia's momentum is slowing.

“Lingering concerns investors had in the near term about an 'air bubble' for GPU demand appear to have faded,” Lucas Keh, an analyst at research firm Third Bridge, said in an email.

Wall Street analysts are also watching for signs that some richly financed rivals could grab a sizable share of Nvidia's business. Microsoft, Meta, Google and Amazon have all developed their own chips that can be adapted for artificial intelligence work, although they have also said they are increasing purchases of Nvidia chips.

Traditional rivals such as Advanced Micro Devices and Intel have also made optimistic predictions about their AI chips. AMD said it expects to sell $4 billion worth of its new AI processor, the MI300, this year.

Mr. Huang often highlights what he says is a sustainable advantage: Only Nvidia GPUs are offered by all major cloud services, such as Amazon Web Services and Microsoft Azure, so customers don't have to worry about being locked into a single service because of its exclusive chip technology.

Nvidia also remains popular among computer makers who have long used its chips in their systems. One is Dell Technologies, which on Monday hosted an event in Las Vegas where Mr. Huang spoke.

Michael Dell, Dell's chief executive and founder, said his company will offer new data center systems that pack 72 of the new Blackwell chips into a single computer rack, a standard structure slightly taller than a refrigerator.

“Don't seduce me with talk like that,” Mr. Huang joked. “This makes me super excited.”
