Microsoft Unveils Homegrown Maia 100 & Cobalt 100 Chips To Tackle NVIDIA & AMD In AI & Compute

Microsoft has finally unveiled its first batch of "homegrown" AI chips: the Azure Maia AI Accelerator and the Azure Cobalt ARM CPU, both targeted at cutting-edge computational performance.
Let's first look at the factors that led Microsoft to shift to in-house AI silicon. The first is standing out among competitors: Microsoft is currently well ahead in the "AI race," largely because it is deep into integrating generative AI across its entire portfolio, from mainstream to enterprise applications.
The second is reducing dependence on the supply chain, which is currently confined to a handful of suppliers, most notably NVIDIA and TSMC, both of which are facing huge order backlogs.
At today's Microsoft Ignite event, the company put out a statement revealing that it plans to take the reins into its own hands. Microsoft describes the introduction of homegrown AI components as "the last puzzle piece" in delivering top-notch infrastructure to its clients and partners, as well as a step toward reducing dependence on the industry's suppliers.
Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our datacenters to meet the needs of our customers.
At the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain and give customers infrastructure choice.
-Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group
The introduction of Microsoft's custom AI products isn't a surprise, since the company has long been rumored to be developing in-house solutions. The advantage Microsoft has here is that it can optimize the chips for its own cloud and AI workloads. By harnessing its already-built software resources and pairing them with purpose-built hardware, Microsoft aims to produce an end product that moves the company forward in power, performance, sustainability, and cost.
Moving to the more interesting bits: unfortunately, Microsoft hasn't revealed detailed specifications or benchmarks for either chip. The firm did confirm that the Maia AI accelerator has already seen adoption by OpenAI, and since the accelerator is specifically targeted at the Azure hardware stack, Microsoft believes the generational improvements in chip design and AI infrastructure could deliver "huge gains."
Diving into the technical bits, the Microsoft Azure Maia 100 is an ASIC built on TSMC's 5nm node and paired with an x86 host. The chip will be mounted in custom liquid-cooled racks holding up to four chips, will support standard INT8 and INT4 data formats, and will utilize embedded Ethernet interfaces.
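The INT8 and INT4 formats mentioned above are low-precision integer types commonly used to shrink and speed up AI inference. As a generic illustration of the idea (not Maia-specific code; the function names here are our own), a minimal sketch of symmetric INT8 quantization:

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor INT8 quantization: map floats into [-127, 127]."""
    scale = np.max(np.abs(x)) / 127.0          # one scale for the whole tensor
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original floats."""
    return q.astype(np.float32) * scale

x = np.array([0.1, -0.5, 1.0, -1.0], dtype=np.float32)
q, s = quantize_int8(x)
x_hat = dequantize(q, s)   # close to x, within one quantization step
```

Each value is stored in a single signed byte instead of a 4-byte float, which is where the memory and bandwidth savings come from; INT4 halves that again at the cost of more rounding error.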
We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models. Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.
-Sam Altman, CEO of OpenAI
The Microsoft Azure Cobalt 100 CPU is built on an ARM architecture; Microsoft believes the British firm has the best designs on the market, enabling it to squeeze the maximum "performance per watt" out of its data centers. The company has outlined a total of 128 Neoverse N2 cores with 12-channel DDR5 memory support and up to 40% higher per-core performance versus the outgoing ARM server chips.
What impact will Microsoft's own AI chips have on the industry? It would certainly increase competitiveness, but that can't be judged yet, since we don't have any performance figures to back such a conclusion. What Microsoft has revealed is that the Azure AI and Azure CPU platforms will be backed by a fully custom approach, involving custom-tuned drivers and custom delivery timeframes.
The shift to homegrown products has been "cooking" for a long time now, and Microsoft believes the industry needs something different. It will be interesting to see how the Azure Maia 100 AI accelerator and Azure Cobalt 100 CPU stack up against the industry's offerings, but competitors should stay cautious. Microsoft will still offer Azure cloud services on third-party silicon: it has already announced new NVIDIA Hopper H100 and H200 instances and will also leverage AMD's upcoming Instinct MI300X and 4th Gen EPYC platform to power VMs.