OpenAI Turns to Google's AI Chips Amid Rising Compute Needs


OpenAI has started renting AI chips from Google, a subsidiary of Alphabet (GOOGL), to power ChatGPT and its other products, according to a recent report. OpenAI plans to use Google Cloud services to meet its growing compute demands. The company had previously relied mainly on NVIDIA (NVDA) chips, making this its first significant move toward non-NVIDIA processors.

The report suggests that OpenAI's decision to rent Google's tensor processing units (TPUs) signals a strategic shift to reduce its dependence on Microsoft's (MSFT) data centers. TPUs could also prove a more cost-effective alternative to NVIDIA's graphics processing units (GPUs) for AI workloads. While OpenAI aims to cut inference costs with the TPUs rented through Google Cloud, the report notes that Google, a competitor in the AI race, may withhold its most powerful TPUs from OpenAI.

Disclosures

I/We may personally own shares in some of the companies mentioned above. However, those positions are not material either to the companies or to my/our portfolios.