AI

OpenAI-Broadcom agreement sends shares of chipmaker soaring

By Dina Bass, Shirin Ghaffary, and Bloomberg
October 13, 2025, 12:08 PM ET
OpenAI CEO Sam Altman. Kyle Grillot/Bloomberg via Getty Images

Broadcom Inc. shares jumped after OpenAI agreed to buy the company’s custom chips and networking equipment in a multiyear deal, part of an ambitious plan by the startup to add artificial intelligence infrastructure.


In a pact announced Monday, OpenAI agreed to buy custom chips and networking components from Broadcom to help power its artificial intelligence services. OpenAI had already struck deals for data centers and chips that easily top $1 trillion, and the company plans to spend tens of billions of dollars more on Broadcom chips, according to people familiar with the matter.

By customizing the processors, OpenAI said it will be able to embed what it has learned from developing AI models and services “directly into the hardware, unlocking new levels of capability and intelligence.” The hardware rollout should be completed by the end of 2029, according to the companies.

For Broadcom, the move provides deeper access to the booming AI market. Monday’s agreement confirms an arrangement that Broadcom Chief Executive Officer Hock Tan had hinted at during an earnings conference call last month.

Investors sent Broadcom shares up as much as 11% on Monday, betting that the OpenAI alliance will generate hundreds of billions of dollars in new revenue for the chipmaker. But the details of how OpenAI will pay for the equipment aren’t spelled out. While the AI startup has shown it can easily raise funding from investors, it’s burning through wads of cash and doesn’t expect to be cash-flow positive until around the end of this decade.

OpenAI, the creator of ChatGPT, has inked a number of blockbuster deals this year, aiming to ease constraints on computing power. Nvidia Corp., whose chips handle the majority of AI work, said last month that it will invest as much as $100 billion in OpenAI to support new infrastructure — with a goal of at least 10 GW of capacity. And just last week, OpenAI announced a pact to deploy 6 GW of Advanced Micro Devices Inc. processors over multiple years.

As AI and cloud companies announce large projects every few days, it’s often not clear how the efforts are being financed. The interlocking deals also have boosted fears of a bubble in AI spending, particularly as many of these partnerships involve OpenAI, a fast-growing but unprofitable business.

While purchasing chips from others, OpenAI has also been working on designing its own semiconductors. They’re mainly intended to handle the inference stage of running AI models — the phase after the technology is trained.

There’s no investment or stock component to the Broadcom deal, OpenAI said, making it different from the agreements with Nvidia and AMD. An OpenAI spokesperson declined to comment on how the company will finance the chips, but the underlying idea is that more computing power will let the company sell more services.

A single gigawatt of AI computing capacity today costs roughly $35 billion for the chips alone, with 10 GW totaling upwards of $350 billion. But a chief reason OpenAI is working to develop its own chip is to bring down its costs, and it’s unclear what price Broadcom’s chips will command under the deal.

OpenAI might be trying to emulate Alphabet Inc.’s Google, which made its own chips using Broadcom’s technology and saw lower costs compared with other AI companies, such as Meta Platforms Inc., according to Bloomberg Intelligence analyst Mandeep Singh. Google’s success with Broadcom might have steered OpenAI to that chipmaker, rather than suppliers such as Marvell Technology Inc., Singh added.

In announcing the agreement, OpenAI CEO Sam Altman said that his company has been working with Broadcom for 18 months.

The startup is rethinking technology starting with the transistors and going all the way up to what happens when someone asks ChatGPT a question, he said on a podcast released by his company. “By being able to optimize across that entire stack, we can get huge efficiency gains, and that will lead to much better performance, faster models, cheaper models.”

When Tan referred to the agreement last month, he didn’t name the customer, though people familiar with the matter identified it as OpenAI. 

“If you do your own chips, you control your destiny,” Tan said in the podcast Monday.

Broadcom has increasingly been seen as a key beneficiary of AI spending, helping propel its share price this year. The stock had gained 40% this year through Friday’s close, outpacing a 29% rise in the benchmark Philadelphia Stock Exchange Semiconductor Index. OpenAI, meanwhile, has garnered a $500 billion valuation, making it the world’s biggest startup by that measure.

By tapping Broadcom’s networking technology, OpenAI is hedging its bets. Broadcom’s Ethernet-based options compete with Nvidia’s proprietary technology. OpenAI also will be designing its own gear as part of its work on custom hardware, the startup said. 

Broadcom won’t be providing the data center capacity itself. Instead, it will deploy server racks with custom hardware to facilities run by either OpenAI or its cloud-computing partners.

A single gigawatt is about the capacity of a conventional nuclear power plant. Still, 10 GW of computing power alone isn’t enough to support OpenAI’s vision of achieving artificial general intelligence, said OpenAI co-founder and President Greg Brockman.

“That is a drop in the bucket compared to where we need to go,” he said.

Getting to the level under discussion isn’t going to happen quickly, said Charlie Kawwas, president of Broadcom’s semiconductor solutions group. “Take railroads — it took about a century to roll it out as critical infrastructure. If you take the internet, it took about 30 years,” he said. “This is not going to take five years.”

