Meta and Broadcom Expand MTIA Chip Partnership to Co-Develop Multiple AI Accelerator Generations

Meta and Broadcom have announced a major expansion of their chip partnership, with the two companies committing to co-develop multiple generations of Meta's custom MTIA (Meta Training and Inference Accelerator) chips. The deal deepens Meta's push toward AI chip independence and comes alongside news that Broadcom CEO Hock Tan will leave Meta's board of directors.

What the MTIA Partnership Covers

Under the expanded agreement, Broadcom will serve as a key design and manufacturing partner for multiple future generations of MTIA chips. Meta's MTIA chips are purpose-built for AI training and inference workloads across Meta's platforms — including content ranking, recommendation systems, and generative AI features in Facebook, Instagram, and WhatsApp. By co-developing across multiple generations rather than a single chip, Meta is signaling a long-term commitment to custom silicon rather than relying solely on Nvidia GPUs.

Why This Matters for Meta's AI Strategy

Meta has been aggressively reducing its dependence on Nvidia as AI chip prices remain high and supply constraints persist. Custom ASICs like MTIA give Meta greater cost control, better power efficiency, and hardware optimized for its specific workloads. The Broadcom partnership adds Broadcom's deep expertise in custom chip design and advanced packaging, which should significantly accelerate MTIA development timelines.

Hock Tan Leaves Meta's Board

The announcement includes the departure of Broadcom CEO Hock Tan from Meta's board, likely to resolve potential conflicts of interest now that the commercial relationship between the two companies has deepened significantly. Tan has led Broadcom's transformation into one of the world's most important semiconductor companies through a series of aggressive acquisitions, including the landmark VMware deal. His exit from the Meta board is largely procedural, but it symbolically marks the shift from an advisory relationship to a formal, deep commercial partnership.

Custom Silicon as Competitive Moat

Meta joins a growing list of hyperscalers — including Google (TPUs), Amazon (Trainium/Inferentia), and Microsoft (Maia) — that are building proprietary AI chips to reduce costs and gain performance advantages. As AI inference demands scale with the deployment of large models across billions of users, the economics of custom silicon versus general-purpose GPUs become increasingly compelling. Meta's multi-generational MTIA roadmap with Broadcom suggests it is committed to this path for the long term.

The Bottom Line

Meta's expanded MTIA partnership with Broadcom is a serious escalation in Big Tech's custom AI chip race. By committing to multiple chip generations with one of Silicon Valley's most capable semiconductor partners, Meta is laying the groundwork for AI infrastructure independence — and sending a clear message to Nvidia that the hyperscalers are determined to build their own path forward.