Soon after Nvidia announced its HGX H200 AI chip, Microsoft unveiled a rival offering on Monday at its Ignite conference in Seattle.
Microsoft’s new AI chip, called Maia 100, is meant to compete with Nvidia’s AI-focused GPUs, though its comparative performance remains unclear. The Windows maker also announced a second chip, Cobalt 100, which will handle general computing tasks much as Intel’s Core processors do.
Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group, said: “Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centers to meet the needs of our customers. At the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain, and give customers infrastructure choice.”
Amid the GPU shortage that has plagued the industry, AI chips from cloud providers could help meet demand. However, unlike Nvidia or AMD, Microsoft and its cloud computing counterparts don’t intend to sell servers incorporating their chips. According to Rani Borkar, a corporate vice president at Microsoft, the company developed its AI computing chip based on customer feedback.
Microsoft is currently testing how Maia 100 performs with its Copilot AI chatbot, the GitHub Copilot coding assistant, and OpenAI’s GPT-3.5-Turbo. OpenAI’s CEO Sam Altman said: “We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models. Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.”
As for Cobalt 100, Microsoft has been testing it with the Teams app and Azure SQL Database, and the company claims it has so far performed 40% better than Azure’s existing Arm-based chips, which come from the startup Ampere.
In an interview with CNBC, Borkar said the Cobalt 100 chips will power virtual machines on Microsoft’s Azure cloud in 2024, but she did not say when Maia 100 would be rolled out. As mentioned earlier, how Maia stacks up against Nvidia’s H100 also remains unknown.