
Microsoft Unveils Azure Maia AI Accelerator and Azure Cobalt CPU for Cloud and AI


At the recent Ignite conference, Microsoft introduced two new chips for its cloud infrastructure: the Microsoft Azure Maia AI Accelerator (codenamed Athena), tailored for artificial intelligence (AI) tasks and generative AI, and the Azure Cobalt CPU, an Arm-based processor optimized for general-purpose compute workloads on the Microsoft Cloud.

The Azure Maia AI Accelerator and Azure Cobalt CPU are set to begin rolling out to Microsoft's data centers in early 2024, powering services such as Microsoft Copilot and the Azure OpenAI Service.
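To illustrate where such workloads come from, here is a minimal, hedged sketch of calling a model hosted on the Azure OpenAI Service with the openai Python SDK; the endpoint, key, API version, and deployment name below are placeholders rather than details from the announcement.

```python
# Minimal sketch (not from the article): calling a model hosted on the
# Azure OpenAI Service with the openai Python SDK (v1.x).
# Endpoint, key, API version, and deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # hypothetical resource endpoint
    api_key="YOUR-API-KEY",                                   # hypothetical key
    api_version="2024-02-01",                                 # assumed API version
)

response = client.chat.completions.create(
    model="my-gpt-deployment",  # the name of your deployment, not the base model
    messages=[{"role": "user", "content": "Summarize Microsoft's Ignite chip announcements."}],
)
print(response.choices[0].message.content)
```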

The Maia AI Accelerator packs 105 billion transistors and is manufactured on a 5-nanometer TSMC process. The chip is designed to run large-scale AI workloads efficiently on Microsoft Azure.

Unveiled alongside it was Azure Cobalt, a 128-core chip built on the Arm Neoverse CSS platform and customized for Microsoft. Wes McCullough, Microsoft's corporate vice president of hardware product development, said the choice of Arm technology reflects the company's focus on maximizing performance per watt across its data centers.
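As a quick illustration of the metric McCullough refers to, the sketch below computes performance per watt from entirely hypothetical throughput and power figures; Microsoft has not published such numbers for Cobalt.

```python
# Illustrative only: performance per watt computed from made-up figures,
# not published Azure Cobalt numbers.
def perf_per_watt(throughput_ops_per_sec: float, power_watts: float) -> float:
    """Return throughput delivered per watt of power consumed."""
    return throughput_ops_per_sec / power_watts

# Hypothetical chips: 2.0e12 ops/s at 250 W versus 1.5e12 ops/s at 300 W.
chip_a = perf_per_watt(2.0e12, 250)  # 8.0e9 ops/s per watt
chip_b = perf_per_watt(1.5e12, 300)  # 5.0e9 ops/s per watt
print(f"Chip A delivers {chip_a / chip_b:.1f}x the performance per watt of chip B")
```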

Microsoft's commitment to cost reduction and environmental sustainability also shows in its custom rack and liquid-cooling designs, which improve server density and cooling efficiency in line with its 2030 carbon-negative target.

In the realm of AI workloads, Microsoft faces competition from Google and AWS, both of which have developed specialized silicon. Google introduced its Tensor Processing Unit (TPU) in 2016, while AWS unveiled its Inferentia AI chip and Graviton Arm-based processor in 2018, followed by Trainium for model training in 2020.

Responding to discussions on Hacker News, a user named Aromasin noted Microsoft's historical use of custom FPGA-based accelerator cards and highlighted its recent shift toward ASICs for better performance and scalability.

Microsoft continues to collaborate closely with silicon partners such as AMD and Nvidia, offering AMD MI300X accelerated VMs and NVIDIA H200 Tensor Core GPUs on Azure. Unlike AMD and Nvidia, however, Microsoft does not sell its chips or servers directly; its custom silicon is available only through Azure services.
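For a sense of how those accelerated VM families surface to customers, here is a hedged sketch that lists GPU-class (ND-series) VM sizes in one region using the azure-mgmt-compute SDK; the subscription ID and region are placeholders, and available size names vary by region and over time.

```python
# Hedged sketch: listing GPU/accelerator (ND-series) VM sizes in one Azure
# region with the azure-mgmt-compute SDK. The subscription ID and region are
# placeholders; available size names differ by region and over time.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"  # hypothetical subscription
client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

for size in client.virtual_machine_sizes.list(location="eastus"):
    # ND-series sizes are Azure's GPU/accelerator families.
    if size.name.startswith("Standard_ND"):
        print(size.name, size.number_of_cores, "cores,", size.memory_in_mb, "MB")
```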

Scott Guthrie, Executive Vice President of Microsoft's Cloud and AI Group, said Azure Maia and Azure Cobalt reflect Microsoft's commitment to meeting customers' AI and cloud requirements with greater performance and flexibility.

Looking ahead, Microsoft plans to expand its chip offerings, focusing on developing second-generation iterations of the Azure Maia AI Accelerator and Azure Cobalt CPU to cater to evolving needs and technological advancements.
