Broadcom CEO Hock Tan says hyperscalers and AI companies lack the expertise to design and manufacture custom silicon at scale, while the company's AI chip business surges with major deals from Meta, OpenAI, and Anthropic.
Broadcom is making a bold claim about the future of AI chip development: the hyperscalers and AI companies racing to build their own silicon simply can't do it at scale, at least not anytime soon. Speaking during the company's Q1 2026 earnings call, CEO Hock Tan pointed to Broadcom's booming custom accelerator business as evidence that even tech giants like Meta, OpenAI, and Anthropic are better off relying on specialized chipmakers rather than trying to go it alone.
Massive AI chip deployments signal Broadcom's dominance
The numbers are staggering. Broadcom reported 106 percent year-over-year growth in AI-related silicon, generating $8.4 billion in revenue for the quarter. The company has secured deals to deploy multiple gigawatts of custom accelerators across its major customers:
- Anthropic: Already implementing one gigawatt of Broadcom TPUs, with plans for a three-gigawatt deployment in 2027
- Meta: Installing "multiple gigawatts" of Broadcom's XPU accelerators in 2027 and beyond
- OpenAI: Deploying "over one gigawatt" of compute capacity based on custom XPUs in 2027
- Google: Expected to show "even stronger demand" for Broadcom silicon with its next-generation TPU
The CEO emphasized that Broadcom has already secured the necessary supplies, including high-bandwidth memory, to meet this demand through 2028.
Why Broadcom thinks AI companies can't compete
Tan's argument centers on the immense complexity of chip manufacturing at scale. "Anybody can design a chip in a lab that works well," he said. "Can you produce 100,000 of those chips quickly, at yields that you can afford? And we do not see too many players in the world that can do that."
He outlined several key challenges that hyperscalers face:
- Talent acquisition: Attracting silicon design experts capable of creating chips tuned to specific workloads
- Production management: Navigating the complex manufacturing process
- Packaging expertise: Developing advanced packaging capabilities
- Networking integration: Connecting custom chips into functional systems
Perhaps most critically, Tan argued that homebrew chipmaking efforts must produce chips competitive not just with NVIDIA, but with "all the other LLM platform players that you are competing against." He believes this level of competition is beyond the reach of any hyperscaler or AI company "for many years to come."
Networking business booming alongside AI chips
Broadcom's AI-relevant networking business is also experiencing explosive growth, with revenue up 60 percent year-over-year. The company plans to debut a seventh-generation Tomahawk switching chip next year that will double the current model's performance, along with similar improvements for its direct copper interconnects.
This networking strength reinforces Broadcom's position in the AI ecosystem. Tan noted that these advancements mean customers won't need to contemplate a move to optical networking in the near term, giving Broadcom a clear path to capture what it predicts will be $100 billion or more in AI chip revenue alone by 2027.
VMware props up software business amid CA and Symantec struggles
While the semiconductor business is booming, Broadcom's software infrastructure division—which includes the combined CA, Symantec Enterprise, and VMware businesses—delivered more modest results. The division saw just one percent revenue growth to reach $6.8 billion, with VMware itself growing 13 percent.
Tan was notably bullish about VMware's prospects, positioning its flagship Cloud Foundation (VCF) private cloud suite as an "essential layer" of infrastructure for enterprise AI deployments. "VCF cannot be disintermediated or replaced," he asserted. "AI will create the need for more VMware, not less."
The company forecast Q2 software revenue of $7.2 billion, representing nine percent growth, and overall Q2 revenue of $22 billion, up 47 percent year-over-year. These strong projections, along with the announcement of a new share buyback scheme, sent Broadcom stock up almost five percent in after-hours trading.
The broader implications for AI chip development
Broadcom's position reflects a broader industry reality: while companies like Google, Amazon, and Microsoft have successfully developed custom AI chips (TPUs, Trainium, Maia), scaling that capability across diverse workloads and maintaining competitive performance is an entirely different challenge.
The company's confidence in its market position is evident in its supply chain strategy—securing components through 2028 suggests Broadcom expects sustained demand and limited competition in the custom AI accelerator market for years to come.
As AI workloads continue to evolve and demand for specialized silicon grows, Broadcom appears to be betting that the complexity and cost of chip development will keep most companies as customers rather than competitors. Whether this prediction holds true may depend on how quickly AI companies can overcome the very challenges Tan outlined—or whether they decide that partnering with specialists like Broadcom is the more pragmatic path forward.