OpenAI has clarified that it currently has no plans to use Google's (GOOGL) in-house AI chips for its products, addressing media speculation that rising demand might push it toward a competitor's hardware. An OpenAI spokesperson said that while the company is running early tests with some of Google's Tensor Processing Units (TPUs), it has no plans for large-scale deployment.
Testing various chip types is standard practice in the AI industry, but deploying new hardware at scale requires significant changes to system architecture and software integration, which can be time-consuming. OpenAI currently relies on Nvidia's (NVDA) GPUs and AMD's AI chips as its primary sources of computational power for its growing model training and inference workloads.
Additionally, OpenAI is developing its own specialized AI chips, with plans to finalize their design and commence production this year. Despite a service agreement with Google Cloud to enhance computing resources, OpenAI primarily utilizes computational power from third-party cloud provider CoreWeave (CRWV).
Meanwhile, Google is actively promoting its internally developed TPU chips to major tech firms, including Apple (AAPL), and to AI startups such as Anthropic and Safe Superintelligence, both of which were founded by former OpenAI members and are potential competitors to ChatGPT.