
Meta unveils next-gen custom AI chip to power its apps and services


Meta has unveiled its next-generation silicon chip for handling artificial intelligence (AI) workloads. The company said it would use the new Meta Training and Inference Accelerator (MTIA) chip to run ranking and recommendation models that improve content organisation on its apps and services, such as Facebook and Instagram.

Meta released the first generation of MTIA chips last year to run its in-house AI models, including the deep-learning recommendation models that manage content across Meta-owned social media platforms.


The company also has its own Large Language Model (LLM), LLaMA, released last year to compete with the likes of Microsoft-backed OpenAI and Google. The social media giant also offers various generative AI tools and features on its platforms, such as custom stickers in Messenger and AI chatbots. Meta's focus on AI has increased its demand for computing power, which it plans to meet by investing in AI infrastructure, including data centres and hardware.

The company said in a press note that the next-generation MTIA chip was part of its broader plan to build custom, domain-specific silicon that can handle internal workloads. The move is also aimed at reducing the company's reliance on Nvidia, which supplies Meta with H100 graphics cards to power its AI models.

Lately, there has been a growing trend among big technology companies to develop in-house AI chips. Earlier this week, Google revealed its own custom Arm-based CPU, Axion, specifically built for handling general-purpose workloads, including CPU-based AI training and media processing. Microsoft also has its own pair of Arm-based custom AI chips, designed to train AI models and power Azure Cloud services.
