Broadcom and OpenAI announce custom chip partnership

OpenAI is on a roll. Hot on the heels of landmark agreements with NVIDIA and AMD, it has now announced a partnership with Broadcom to develop and deploy 10 gigawatts' worth of custom AI accelerator chips and networking systems across its global data centres.

The collaboration aims to enhance the efficiency, scalability and sustainability of AI infrastructure as the computational demands of advanced models such as ChatGPT and Sora continue to soar.

Under the agreement, OpenAI will design the silicon and systems architecture for the new accelerators, embedding insights gained from training large-scale AI models directly into the hardware.

Meanwhile, Broadcom will lead chip development, manufacturing and network integration using its high-performance Ethernet switching and connectivity technology.

The custom chips will support OpenAI’s next-generation models while reducing its dependence on third-party silicon. The energy footprint of the planned deployment is massive: 10 gigawatts is roughly equivalent to the electricity usage of several million US households.

“Developing our own accelerators adds to the broader ecosystem of partners all building the capacity required to push the frontier of AI to provide benefits to all humanity,” said Sam Altman, Co-founder and CEO of OpenAI.

“OpenAI has been in the forefront of the AI revolution since the ChatGPT moment, and we are thrilled to co-develop and deploy 10 gigawatts of next-generation accelerators and network systems to pave the way for the future of AI,” said Hock Tan, President and CEO of Broadcom.