OpenAI, the company behind ChatGPT, is exploring making its own AI chips, possibly by acquiring a chip company. Prompted by shortages and the high cost of AI chips, it is also considering working more closely with other chipmakers and diversifying its suppliers.
CEO Sam Altman has spoken publicly about the scarcity of graphics processing units (GPUs), a market in which Nvidia controls roughly an 80% share. OpenAI's generative AI runs on a massive supercomputer built by Microsoft that uses 10,000 of Nvidia's GPUs. Each ChatGPT query costs approximately 4 cents, according to Bernstein analyst Stacy Rasgon. By his estimate, if ChatGPT queries grew to a tenth the scale of Google search, OpenAI would need roughly $48.1 billion worth of GPUs up front and about $16 billion worth of chips annually to keep pace.
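A rough sketch of the arithmetic behind those estimates. The ~4-cent per-query cost comes from the article; the Google search volume (about 8.5 billion queries a day) is an outside assumption used here only for illustration, not a figure from the article:

```python
# Back-of-envelope serving-cost math for ChatGPT at a tenth of Google's scale.
COST_PER_QUERY = 0.04            # dollars per query, per the Bernstein estimate
GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumption, not stated in the article
SCALE_FRACTION = 0.1             # "a tenth the scale of Google search"

chatgpt_queries_per_day = GOOGLE_QUERIES_PER_DAY * SCALE_FRACTION
daily_cost = chatgpt_queries_per_day * COST_PER_QUERY
annual_cost = daily_cost * 365

print(f"Queries per day: {chatgpt_queries_per_day:,.0f}")
print(f"Daily serving cost: ${daily_cost:,.0f}")
print(f"Annual serving cost: ${annual_cost / 1e9:.1f} billion")
```

Under these assumptions the annual serving cost lands on the order of $12 billion, in the same ballpark as the roughly $16 billion annual chip figure Rasgon cites, though his estimate also factors in hardware costs beyond simple per-query spend.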
If OpenAI goes forward, it would mean spending hundreds of millions of dollars a year. The company has reportedly performed due diligence on a potential acquisition target, which remains unnamed. Even with an acquisition, creating a custom chip would likely take several years. In the meantime, Microsoft is developing a custom AI chip of its own, which OpenAI is testing.
AI accelerators are necessary to train and run the latest generative AI technology, and Nvidia, one of the few chipmakers producing capable AI chips, clearly dominates the market.