After signing deals to use computer chips from Nvidia and AMD and design its own chips with Broadcom, OpenAI has reached an agreement with yet another chip maker.
OpenAI, the maker of ChatGPT, said on Wednesday that it would begin using chips from Cerebras, a start-up in Sunnyvale, Calif. OpenAI said the Cerebras chips it eventually deploys would consume 750 megawatts of electricity, enough to power tens of thousands of households.
The agreement is the latest in a series with various partners as OpenAI works to expand the computing power needed to build and deploy its artificial intelligence technologies, including ChatGPT.
OpenAI is one of many tech companies that are spending hundreds of billions of dollars on new data centers for A.I. OpenAI, Amazon, Google, Meta and Microsoft plan to spend more than $325 billion combined on these facilities by the end of this year alone. The company is building data centers in Abilene, Texas, and plans additional computing facilities in other parts of Texas, New Mexico, Ohio and the Midwest.
OpenAI previously said it would deploy enough Nvidia and AMD chips to consume 16 gigawatts of power, which could power millions of households. The chips that the company is designing with Broadcom are slated to consume 10 gigawatts.
As OpenAI and its partners pack new chips into data centers, some will be used to create its A.I. technologies. Others will serve these technologies to people and businesses around the globe — a process that industry insiders call inference. The Cerebras chips will be used for inference.
“This partnership will make ChatGPT not just the most capable but also the fastest A.I. platform in the world,” Greg Brockman, OpenAI’s president, said in a statement.
Co-founded in 2015 by Andrew Feldman, a chip industry veteran who previously sold a start-up to AMD, Cerebras is among a number of start-ups that have spent years building chips just for A.I.
In 2019, the company unveiled what it called the largest chip ever built. As big as a dinner plate — about 100 times the size of a typical chip — it would barely fit in your lap.
A.I. systems are typically powered by many chips that work together. But moving data between chips can be slow, limiting how quickly A.I. software operates. While some chip makers are broadening the pipes that run between chips, Cerebras has taken a different approach: Keep all the data on one giant chip so a system can operate faster.
(The New York Times has sued OpenAI and Microsoft, claiming copyright infringement of news content related to A.I. systems. The two companies have denied the suit’s claims.)
Cade Metz is a Times reporter who writes about artificial intelligence, driverless cars, robotics, virtual reality and other emerging areas of technology.