Podcast Episode
Broadcom and Meta Strike Multi-Year Deal to Co-Develop Custom AI Chips Through 2029
April 15, 2026
Meta and Broadcom have announced a strategic partnership to co-develop multiple generations of Meta's custom MTIA silicon, starting with a deployment exceeding one gigawatt. The deal extends through 2029 and positions Broadcom as the engineering backbone for Meta's rapidly expanding AI compute infrastructure.
A New Era for Custom AI Silicon
Meta and Broadcom announced on April 14, 2026, a sweeping multi-year strategic partnership to co-develop multiple generations of Meta's custom MTIA (Meta Training and Inference Accelerator) chips, with an initial deployment exceeding one gigawatt and a sustained, multi-gigawatt rollout planned through 2029.

The deal deepens an existing collaboration between the two companies and places Broadcom's XPU platform at the centre of Meta's AI compute expansion. Broadcom will provide technology spanning chip design, advanced packaging, and Ethernet-based networking to support current and future MTIA iterations.
Four Chips, Six-Month Cadence
Meta revealed in March 2026 that it had developed four MTIA chip models: the 300, 400, 450, and 500. The MTIA 300 is already running in production for ranking and recommendation workloads, while the 450 and 500 are designed for generative AI inference and slated for mass deployment in 2027. Across the full lineup, Meta reports a 4.5x increase in HBM bandwidth and a 25x increase in compute FLOPs. The company says it can now ship a new custom chip roughly every six months.

Reducing Reliance on Nvidia
The partnership reflects Meta's broader push to diversify its AI hardware away from commercial GPUs. Meta has claimed its MTIA 450 delivers performance "much higher than that of existing leading commercial products" for generative AI inference. However, the company shelved its most ambitious custom training chip, codenamed Olympus, meaning it will continue to rely on Nvidia GPUs for large-scale model training.

Broadcom's Expanding Custom Chip Empire
The Meta deal adds to a string of major custom silicon agreements for Broadcom. Earlier in April, the company disclosed deals to develop and supply future generations of Google's Tensor Processing Units through 2031, and to deliver approximately 3.5 gigawatts of TPU-based compute capacity to Anthropic starting in 2027. As part of the Meta expansion, Broadcom CEO Hock Tan will step down from Meta's board to serve as an advisor on its custom silicon roadmap.

Market Impact
Broadcom's shares have surged on the back of these announcements, with the company now sitting on a reported $73 billion backlog in custom silicon orders. The deals cement Broadcom's position as the leading behind-the-scenes architect for custom AI chips across the hyperscaler landscape.

Published April 15, 2026 at 7:30am