OpenAI says no plans to use Google's AI chips at scale
OpenAI said it has no active plans to use Google's in-house artificial intelligence (AI) chips to power its products, two days after several news outlets reported that the AI lab was turning to its competitor's chips to meet growing demand.
A spokesperson for OpenAI said on Sunday that while the AI lab is in early testing with some of Google's tensor processing units (TPUs), it has no plans to deploy them at scale for now.
While it is common for AI labs to test different chips, deploying new hardware at scale takes far longer than testing and requires different architecture and software support. OpenAI is actively using Nvidia's graphics processing units (GPUs) and AMD's AI chips to meet its growing demand. The company is also developing its own chip, an effort that remains on track to reach the "tape-out" milestone this year, the point at which a chip's design is finalized and sent for manufacturing.
OpenAI has signed up for Google Cloud services to support its growing need for computing capacity, a surprising collaboration between two major competitors in the AI sector. Most of the computing power used by OpenAI, however, will come from GPU servers operated by CoreWeave, a so-called neocloud company that specializes in GPU-based cloud infrastructure.
Google has been expanding external access to its in-house TPUs, which were historically reserved for internal use. That shift has helped Google attract customers including Big Tech player Apple and startups such as Anthropic and Safe Superintelligence, two competitors to the ChatGPT maker launched by former OpenAI leaders.