READING THE CHIP LEAVES

What OpenAI's latest batch of chips says about the future of AI

Sam Altman said OpenAI has secured more H100 chips and is expecting even more soon

[Image: All eyes are on OpenAI, the leading AI company at the moment. Photo: Davo Ruvic (Reuters)]

OpenAI has received a coveted order of H100 chips and is expecting more soon, CEO Sam Altman said in a Nov. 13 interview with the Financial Times, adding that “next year looks already like it’s going to be better” when it comes to securing more chips.

The attention lavished on AI chatbots like OpenAI’s ChatGPT and Google’s Bard this year has been matched by the focus on Nvidia’s $40,000 H100 chips. OpenAI, like many other AI companies, uses Nvidia’s latest chips to train its models.

OpenAI’s procurement of more chips signals that more sophisticated AI models, which go beyond powering the current version of chatbots, could be ready in the near future.

The steep demand for H100 chips from Nvidia

Generative AI systems are trained on vast amounts of data to generate complex responses to questions, and that requires a lot of computing power. Enter Nvidia’s H100 chips, which are tailored for generative AI and run much faster than previous chip models. The more powerful the chips, the faster you can process queries, Willy Shih, a professor at Harvard Business School, previously told Quartz.


In the background, startups, chip rivals like AMD, and Big Tech companies like Google and Amazon have been working on building more efficient chips tailored to AI applications to meet the demand—but none so far have been able to outperform Nvidia.

Such intense demand for a specific chip from one company has created something of a buying frenzy for Nvidia, and it’s not just tech companies racing to snap up these hot chips—governments and venture capital firms are champing at the bit too. But if OpenAI has been able to obtain its order, perhaps that tide is finally turning, and the flow of chips to AI businesses is improving.

And while Nvidia reigns, rivals are getting a look. Just last week, Prateek Kathpal, the CEO of SymphonyAI Industrial, which builds AI chatbots for internal use by manufacturers, told Quartz that although its AI applications run on Nvidia’s chips, the company has also been in discussions with AMD and Arm about their technology.


What do more chips mean for OpenAI?

OpenAI’s growing chip inventory means a couple of things.

The H100 chips will help power the company’s next AI model, GPT-5, which Altman said is currently in the works. The new model will require more data to train on, which will come from both publicly available information and proprietary data from companies, he told the Financial Times. GPT-5 will likely be more sophisticated than its predecessors, although it’s not clear what it will do that GPT-4 can’t, he added.


Altman did not disclose a timeline for the release of GPT-5. But GPT-4 arrived just eight months ago, and its predecessor, GPT-3, was released in 2020, a cadence that highlights a rapid development cycle.

The procurement of more chips also suggests that the company is getting closer to creating artificial general intelligence, or AGI: an AI system that can accomplish essentially any task that human beings can do.