Published February 12, 2024

Meta’s Artemis AI Chips Signal a Shift in the AI Landscape, Eyeing Independence from Nvidia

Meta is set to deploy its own AI chips in its servers in 2024, posing a challenge to Nvidia's AI dominance. However, the company will continue using Nvidia H100 GPUs in its data centers for the time being.

In a strategic move set to reshape the dynamics of artificial intelligence (AI) infrastructure, Meta Platforms, the parent company of Facebook, is gearing up to deploy its proprietary AI chips, codenamed Artemis, in its data centers in 2024, as revealed in an internal document obtained by Reuters. This initiative is part of Meta’s plan to reduce its dependence on Nvidia’s dominant H100 chips, offering potential cost savings and greater control over the surging expenses associated with AI workloads.

Meta’s substantial investments, running into billions of dollars, underline its commitment to enhancing computing capacity, particularly for the resource-intensive generative AI products integrated into platforms like Facebook, Instagram, and WhatsApp. This entails the acquisition of specialized chips and the meticulous reconfiguration of data centers to accommodate these advancements.

According to Dylan Patel, founder of the silicon research group SemiAnalysis, successful deployment of Meta’s Artemis chip could yield significant annual energy cost savings and potentially save the company billions of dollars in chip purchases. The move toward in-house silicon marks a pivotal moment for Meta’s chip program, which is rebounding from the discontinuation of its first chip iteration in 2022 in favor of Nvidia’s GPUs.

While Meta strives for self-reliance, it acknowledges that Nvidia’s H100 GPUs will remain in its data centers for the foreseeable future. CEO Mark Zuckerberg has outlined plans to have approximately 350,000 H100 processors in service by the end of the year. The Artemis chip is designed for AI inference, aligning with Meta’s focus on models that make ranking judgments and generate responses to user prompts.

A Meta spokesperson emphasized that the company’s internally developed accelerators are intended to complement commercially available GPUs, with the goal of achieving optimal performance and efficiency for Meta-specific workloads.

While Meta’s pursuit of reduced dependency on Nvidia processors may signal a shift in the AI landscape, it is clear that, at least for now, Nvidia’s GPUs will continue to play a pivotal role in Meta’s AI infrastructure. As the tech giant navigates this transition, the industry is watching closely to see how the dynamics between the two companies evolve.

Elizabeth Betty

Elizabeth Betty brings a wealth of experience and a keen eye for detail to her role as the Editor at BagItToday. With a passion for technology and an insatiable curiosity for the latest gadgets, Elizabeth is dedicated to providing readers with insightful and unbiased reviews. As a seasoned tech enthusiast, Elizabeth stays at the forefront of the rapidly evolving tech landscape. Her in-depth knowledge allows her to dissect complex features and functionalities, providing readers with comprehensive evaluations that empower them to make informed purchasing decisions. Driven by her commitment to journalistic integrity, Elizabeth ensures that every review published on BagItToday is grounded in thorough research and hands-on testing. Her goal is to deliver content that not only informs but also inspires and entertains readers, fostering a community of tech enthusiasts who share her excitement for innovation.
