Meta Announces New Details on Data Centers for AI Projects

Meta Platforms on Thursday shared new details on projects it was pursuing to make its data centers better suited to artificial intelligence work, including a custom chip “family” that it said it was developing in-house.

The Facebook and Instagram owner said in a series of blog posts that it designed a first-generation chip in 2020 as part of the Meta Training and Inference Accelerator (MTIA) program, which was aimed at improving efficiency for the recommendation models it uses to serve ads and other content in news feeds.

Reuters previously reported that the company was not planning to deploy its first in-house AI chip widely and was already working on a successor. The blog posts portrayed the first MTIA chip as a learning opportunity.

“From this initial program, we have learned invaluable lessons that we are incorporating into our roadmap,” it wrote.

The first MTIA chip was focused exclusively on an AI process called inference, in which algorithms trained on huge amounts of data make judgments about whether to show, say, a dance video or a cat meme as the next post in a user’s feed, the posts said.

A Meta spokesperson declined to comment on deployment timelines or elaborate on the company’s plans to develop chips that could train the models as well.

Meta has been engaged in a massive project to upgrade its AI infrastructure this past year, after executives realized it lacked the hardware and software needed to support demand from product teams building AI-powered features.

As part of that, the company scrapped plans for a large-scale rollout of an in-house inference chip and started work on a more ambitious chip capable of performing both training and inference, according to the Reuters reporting.

Meta acknowledged in its blog posts that its first MTIA chip stumbled with high-complexity AI models, although it said the chip handled low- and medium-complexity models more efficiently than competitor chips.

The MTIA chip also used only 25 watts of power — a fraction of what market-leading chips from suppliers such as Nvidia consume — and used an open-source chip architecture called RISC-V, Meta said. 

In addition to detailing its chip work, Meta provided an update on plans to redesign its data centers around more modern AI-oriented networking and cooling systems, saying it would break ground on its first such facility this year.

The new design would be 31 percent cheaper and could be built twice as quickly as the company’s current data centers, an employee said in a video explaining the changes.

© Thomson Reuters 2023

