Podcast Episode
Meta Bets Big on Homegrown AI Chips with Four-Generation Silicon Roadmap
March 12, 2026
Meta has unveiled four new custom AI chips, the MTIA 300 through 500, designed to power its data centres through 2027. The inference-first strategy aims to reduce reliance on outside chipmakers like Nvidia and AMD while handling exploding demand for generative AI workloads.
Meta Goes All-In on Custom Silicon
Meta has announced an ambitious roadmap for four new in-house artificial intelligence chips, marking a significant escalation in the company's push to build its own AI hardware. The chips, part of the Meta Training and Inference Accelerator programme, span four generations: MTIA 300, MTIA 400, MTIA 450, and MTIA 500.
From Recommendations to Generative AI
The MTIA 300 is already in production, handling ranking and recommendation workloads across Meta's apps. The MTIA 400, codenamed Iris, has completed lab testing and is moving toward deployment with 72 chips per data centre rack. The more advanced MTIA 450 and MTIA 500, codenamed Arke and Astrid respectively, target generative AI inference tasks such as image and video generation, with mass deployment expected throughout 2027.
An Inference-First Philosophy
Rather than following the industry convention of designing chips for training first, Meta is prioritising inference, the process by which trained models respond to queries and generate content. With inference demand surging across its platforms, the company sees this as the most pressing bottleneck to address.
Complementing, Not Replacing
Despite recently signing deals worth tens of billions of dollars with Nvidia and AMD, Meta frames its custom chips as complementary rather than competitive. The company plans capital expenditure of up to $135 billion in 2026, reflecting the sheer scale of its AI infrastructure ambitions. Broadcom assists with chip design while TSMC handles fabrication.
A Broader Industry Trend
Meta joins Alphabet and Amazon in the race to develop bespoke AI silicon, as the share of custom chip-based AI servers is projected to reach nearly 28 percent in 2026.
Published March 12, 2026 at 1:32am