Breaking: a Big Tech company is ramping up its AI development. (Shocking, we know.) In this case, the protagonist is a very familiar one: Meta. According to a Reuters report, the company is testing its first in-house chip for AI training. The idea is to curb its gargantuan infrastructure costs and reduce its reliance on Nvidia (a company that, one imagines, brings out Mark Zuckerberg's more colorful language). If all goes well, Meta hopes to use the chip for training by 2026.
The chip is reportedly a dedicated AI accelerator, as opposed to the general-purpose NVIDIA GPUs Meta currently relies on. The deployment began after the company completed its first "tape-out," the silicon development milestone in which a finished design is sent out for a manufacturing test run.
The chip is part of the Meta Training and Inference Accelerator (MTIA) series, the company's custom in-house silicon family focused on generative AI, recommendation systems and advanced research.
Last year, the company began using MTIA chips for inference, the behind-the-scenes prediction process that runs when an AI model is put to work. Meta first deployed the inference chips for its Facebook and Instagram news feed recommendation systems. Reuters says the company plans to start with recommendations for the training silicon as well. The reported long-term plan for both chips is to begin with recommendations and eventually move on to generative products such as the Meta AI chatbot.
The company is one of NVIDIA's biggest customers, having placed orders for billions of dollars' worth of GPUs in 2022. That pivot came after an earlier in-house inference chip failed a small-scale test deployment, similar to the one Meta is now running for its training chip.