
Microsoft Unveils Custom AI and Cloud Computing Chips at Ignite 2023

Published on 16 Nov, 2023, 8:32 AM IST
Updated on 16 Dec, 2024, 6:40 PM IST
Siddharth Chauhan
3 min read

The chips are named Maia and Cobalt, marking Microsoft's entry into the custom silicon space.

At its annual Ignite 2023 developer conference held this week in Seattle, Microsoft announced that it has developed two custom-designed semiconductors tailored for its cloud and artificial intelligence workloads. The chips are named Maia and Cobalt, marking Microsoft's entry into the custom silicon space dominated by tech giants like Amazon, Google, and Apple.

The Maia chip is an AI accelerator designed to speed up large language models and other machine-learning workloads. It will power Microsoft's Azure OpenAI service, which underpins products such as its Copilot coding assistant. Maia is intended to help reduce the cost of delivering advanced AI services via the cloud by optimizing performance for Microsoft's standard set of AI models.

Cobalt is a general-purpose CPU built on Arm architecture. Microsoft said it has already tested Cobalt to power its Teams collaboration software. Looking ahead, Cobalt will serve as an alternative to Amazon's own Graviton chips for running compute workloads on the Azure cloud platform. Both chips leverage the latest 5nm manufacturing node and are tailored specifically for Microsoft's cloud and AI offerings.

While Microsoft noted that the chips would initially power its internal services, it signalled plans to eventually give enterprise customers direct access to the custom silicon. The move reflects the company's strategy of taking greater control over the full hardware and software stack underpinning its cloud infrastructure, with the aim of designing systems optimized for functionality, performance, cost, sustainability and flexibility.

Microsoft also announced partnerships to provide cloud services running on Nvidia's and AMD's latest GPU accelerators. The company said it has tested OpenAI's GPT-4 model on AMD chips and plans to add Nvidia's H200 GPU to Azure next year for inferencing on large AI models. Both the custom and partner chips will begin rolling out across Microsoft's global datacenter network early next year.

Tags
Microsoft
Ignite 2023
AI
Cloud Computing
Ignite 2023 developer conference
Maia
Cobalt
