
The popularity of ChatGPT may give Nvidia an unexpected boost


The constant buzz around OpenAI’s ChatGPT refuses to wane. With Microsoft now using the same technology to power its brand-new Bing Chat, it’s safe to say that ChatGPT may continue this upward trend for quite some time. That’s good news for OpenAI and Microsoft, but they’re not the only two companies to benefit.

According to a new report, the sales of Nvidia’s data center graphics cards may be about to skyrocket. With the commercialization of ChatGPT, OpenAI might need as many as 10,000 new GPUs to support the growing model — and Nvidia appears to be the most likely supplier.

Nvidia's A100 data center GPU.

Research firm TrendForce shared some interesting estimates today, and the most interesting bit pertains to the future of ChatGPT. According to TrendForce, the GPT model that powers ChatGPT will soon need a sizeable hardware increase in order to keep scaling up.


“The number of training parameters used in the development of this autoregressive language model rose from around 120 million in 2018 to almost 180 billion in 2020,” said TrendForce in its report. Although it didn’t share any 2023 estimates, it’s safe to assume that these numbers will only continue to rise as much as technology and budget allow.

The firm claims that the GPT model needed a whopping 20,000 graphics cards to process training data in 2020. As it continues expanding, that number is expected to rise to above 30,000. This could be great news for Nvidia.

These calculations are based on the assumption that OpenAI would be using Nvidia’s A100 GPUs to power the language model. These ultrapowerful graphics cards are really pricey — in the ballpark of $10,000 to $15,000 each. They’re also no longer Nvidia’s top data center cards, so it’s possible that OpenAI would go for the newer H100 cards instead, which are supposed to deliver up to three times the performance of the A100. These GPUs come with a steep price increase, with one card costing around $30,000 or more.
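As a rough illustration, the article’s figures imply order-of-magnitude totals like these. This is a back-of-envelope sketch in Python; the per-card prices and the 10,000-card count are the report’s estimates, not confirmed pricing or order sizes:

```python
# Back-of-envelope cost estimate based on the figures reported above.
# All inputs are the article's estimates; totals are illustrative only.

A100_PRICE = (10_000, 15_000)   # reported per-card price range, USD
H100_PRICE = 30_000             # reported as "around $30,000 or more"
GPU_COUNT = 10_000              # reported number of new GPUs OpenAI may need

a100_low, a100_high = (price * GPU_COUNT for price in A100_PRICE)
h100_total = H100_PRICE * GPU_COUNT

print(f"A100 order: ${a100_low / 1e6:.0f}M to ${a100_high / 1e6:.0f}M")
print(f"H100 order: at least ${h100_total / 1e6:.0f}M")
```

Even at the low end, that puts a 10,000-card A100 order around $100 million, and an all-H100 order at roughly triple that — which is why a deal of this size would matter to Nvidia’s data center business.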

The data center GPU market doesn’t only consist of Nvidia — Intel and AMD also sell AI accelerators. However, Nvidia is often seen as the go-to solution for AI-related tasks, so it’s possible that it might be able to score a lucrative deal if and when OpenAI decides to scale up.

Should gamers be worried if Nvidia does, indeed, end up supplying a whopping 10,000 GPUs to power up ChatGPT? It depends. The graphics cards required by OpenAI have nothing to do with Nvidia’s best GPUs for gamers, so we’re safe there. However, if Nvidia ends up shifting some production to data center GPUs, we could see a limited supply of consumer graphics cards down the line. Realistically, the impact may not be that bad — even if the 10,000-GPU prediction checks out, Nvidia won’t need to deliver them all right away.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…
AI’s chip hunger could keep memory prices painfully high for years
Memory shortages may haunt your next phone, laptop, and GPU for years

Recent reports claimed that memory prices may not fall until 2027, but it now seems the memory chip crunch could stretch even longer than that. That's bad news for anyone hoping phone, laptop, and GPU prices will come down again soon.

Reuters reports that SK Group chairman Chey Tae-won said the global chip wafer shortage is likely to last until 2030, with artificial intelligence demand continuing to outpace supply. Chey said the current supply shortfall could remain above 20%, largely because AI systems require huge amounts of high-bandwidth memory and therefore burn through a lot of wafers.

One of the most controversial US agencies is reportedly testing out Anthropic's uber-powerful Mythos AI
The agency's reported use of Mythos highlights a widening split inside the US government over AI risk

The US government's AI fight just got harder to square. The National Security Agency is reportedly using Anthropic's Mythos Preview even as senior Pentagon officials keep pushing to cut the company off over supply chain concerns. It shows how quickly real security needs can outrun official policy.

Since February, the Defense Department has been trying to block Anthropic and push vendors to do the same. Yet, according to an Axios report, the NSA appears to be moving ahead with one of the company's most powerful models anyway, suggesting cybersecurity demand is carrying more weight than the feud now playing out inside government.

AI streaming is going mainstream in China, whether audiences want it or not
iQiyi wants AI to make most of its content someday, and it's already starting.

China's Netflix, iQiyi, is making one of the biggest bets in streaming history. The company wants AI to create the bulk of its films and shows someday soon, and it's already restructuring its 16-year-old business to make that happen.

At its annual content showcase in Beijing, founder and CEO Gong Yu announced that iQiyi is pivoting its popular streaming platform into a social media destination built around AI-generated content. 
