

Microsoft explains how thousands of Nvidia GPUs built ChatGPT


ChatGPT rose to viral fame over the past six months, but it didn’t come out of nowhere. According to a blog post published by Microsoft on Monday, OpenAI, the company behind ChatGPT, reached out to Microsoft to build AI infrastructure on thousands of Nvidia GPUs more than five years ago.

OpenAI and Microsoft’s partnership has attracted a lot of attention recently, especially after Microsoft made a $10 billion investment in the research group behind tools like ChatGPT and DALL-E 2. However, the partnership started long before that, according to Microsoft. Since then, Bloomberg reports, Microsoft has spent “several hundred million dollars” developing the infrastructure to support ChatGPT and projects like Bing Chat.

Hopper H100 graphics card.

Much of that money went to Nvidia, which is now at the forefront of the computing hardware required to train AI models. Instead of the gaming GPUs you’d find on a list of the best graphics cards, Microsoft went after Nvidia’s enterprise-grade GPUs like the A100 and H100.


It’s not just as simple as getting graphics cards together and training a language model, though. As Nidhi Chappell, Microsoft head of product for Azure, explains: “This is not something that you just buy a whole bunch of GPUs, hook them together, and they’ll start working together. There is a lot of system-level optimization to get the best performance, and that comes with a lot of experience over many generations.”

With the infrastructure in place, Microsoft is now opening up its hardware to others. The company announced on Monday in a separate blog post that it would offer Nvidia H100 systems “on-demand in sizes ranging from eight to thousands of Nvidia H100 GPUs,” delivered through Microsoft’s Azure network.

The popularity of ChatGPT has been a windfall for Nvidia, which has invested in AI through hardware and software for several years. AMD, Nvidia’s main competitor in gaming graphics cards, has been attempting to make headway into the space with accelerators like the Instinct MI300.

According to Greg Brockman, president and co-founder of OpenAI, training ChatGPT wouldn’t have been possible without the horsepower provided by Microsoft: “Co-designing supercomputers with Azure has been crucial for scaling our demanding AI training needs, making our research and alignment work on systems like ChatGPT possible.”

Nvidia is expected to reveal more about future AI products during its GPU Technology Conference (GTC), with the keynote presentation kicking things off on March 21. Microsoft is expanding on its AI road map later this week, with a presentation focused on the future of AI in the workplace scheduled for March 16.

Jacob Roach
Former Lead Reporter, PC Hardware