Meta doubles down on custom inference chips after reportedly scrapping training chip
Meta said today that it’s expanding its custom silicon development to include four new generations of Meta Training and Inference Accelerator (MTIA) chips. The announcement comes just weeks after The Information reported that the social media company had scrapped its most advanced AI training chip, dubbed Olympus, after facing design challenges. In the meantime, Meta signed outside chip deals with Nvidia and Advanced Micro Devices.
Early in its recent conference call, Broadcom CEO Hock Tan sought to reassure investors that the custom chip specialist’s relationship with the social media giant was only getting stronger.
“Now contrary to recent analyst reports, Meta’s custom accelerator MTIA road map is alive and well,” he said. “We’re shipping now.”
The new road map suggests Meta’s in-house chips will focus more on inference, which has more predictable workloads, than on training, a technically more demanding area dominated by Nvidia:
“MTIA 300 will be used for ranking and recommendations training, and is already in production. MTIA 400, 450 and 500 will be capable of handling all workloads, but we will primarily use these chips to support GenAI inference production in the near future and into 2027.”
Meta CFO Susan Li told attendees at Morgan Stanley’s tech conference earlier this month that the company “eventually” plans to expand its custom chip design to include training models.