
Microsoft joins the AI hardware race with two custom silicon chips

The reports are accurate: Microsoft has developed a proprietary AI chip tailored for training large language models, potentially reducing its reliance on Nvidia's hardware and the associated costs. Microsoft has also created its own Arm-based CPU for cloud workloads. Both custom chips are designed to power Microsoft's Azure data centers and prepare the company and its corporate clients for an AI-centric future.

Microsoft’s Azure Maia AI chip and Arm-based Azure Cobalt CPU are set to debut in 2024, following a surge in demand for Nvidia’s H100 GPUs this year, which are widely utilized for training generative image tools and large language models. The escalating demand for these GPUs has led to staggering prices exceeding $40,000 on platforms like eBay.

Rani Borkar, head of Azure hardware systems and infrastructure at Microsoft, affirmed the company’s extensive history in silicon development, including collaboration on Xbox silicon over two decades ago and co-engineering chips for Surface devices. The development of the Azure Maia AI chip and Azure Cobalt CPU stems from this legacy, with Microsoft commencing the architecture of the cloud hardware stack in 2017 to pave the way for the creation of these new custom chips.

The Azure Cobalt CPU, named after the blue pigment, boasts 128 cores and is based on an Arm Neoverse CSS design customized for Microsoft's requirements. It is tailored to support general cloud services on Azure, emphasizing not only high performance but also meticulous power management. Microsoft is currently evaluating the Cobalt CPU on workloads such as Microsoft Teams and SQL Server, with plans to offer virtual machines to customers next year for various tasks.

On the other hand, the Maia 100 AI accelerator, named after a brilliant blue star, is specifically designed for running cloud AI workloads, particularly large language model training and inference. This chip will power significant AI workloads on Azure, including those from the multibillion-dollar partnership with OpenAI. Manufactured on a 5-nanometer TSMC process, the Maia chip incorporates 105 billion transistors and supports sub-8-bit data types for enhanced hardware and software co-design, resulting in faster model training and inference times.
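For readers unfamiliar with sub-8-bit data types: the idea is to represent model weights and activations with fewer bits than a standard 8-bit integer or 16-bit float, trading a little precision for large savings in memory and compute. As a rough illustration only (this is not Microsoft's actual format, and the function names here are hypothetical), a toy symmetric 4-bit quantizer might look like this:

```python
import numpy as np

def quantize_sub8bit(x, bits=4):
    """Toy symmetric per-tensor quantization to a signed `bits`-wide grid.

    Illustrative sketch only; real sub-8-bit formats (e.g. the MX formats
    standardized via the Open Compute Project) are more sophisticated.
    """
    qmax = 2 ** (bits - 1) - 1                 # e.g. 7 for 4 bits
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Map the integer codes back to approximate floating-point values.
    return q.astype(np.float32) * scale

weights = np.array([0.9, -0.31, 0.07, -1.2], dtype=np.float32)
q, s = quantize_sub8bit(weights, bits=4)
approx = dequantize(q, s)
```

Each value is stored as a 4-bit integer plus one shared scale factor, so storage drops by roughly 8x versus 32-bit floats while the dequantized values stay within half a quantization step of the originals.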

Microsoft, in collaboration with industry leaders, is standardizing the next generation of data formats for AI models and leveraging the Open Compute Project (OCP) to adapt entire systems to AI requirements. The Maia chip represents Microsoft’s first complete liquid-cooled server processor, aimed at enhancing server density and efficiency within existing data center footprints.

As Microsoft progresses with the deployment of Maia 100 and Cobalt CPU, the impact on AI cloud service pricing and the acceleration of Microsoft’s broad AI initiatives remain to be seen. The company’s commitment to partnerships with industry giants like Nvidia and AMD underscores its strategy of diversifying the supply chain and offering customers a range of infrastructure choices.

Last modified: February 25, 2024