Microsoft has unveiled the first cloud private preview of Nvidia’s much-anticipated Blackwell AI infrastructure at its annual developer conference, Microsoft Ignite. CEO Satya Nadella introduced the Azure ND GB200 V6 virtual machine series, built on Nvidia’s Blackwell platform: each rack connects 72 GPUs in a single NVLink domain and is backed by InfiniBand networking. The new server racks are designed to handle cutting-edge AI training and inference workloads.
Microsoft has also equipped these servers with the Azure Integrated HSM, a custom-built hardware security module that keeps encryption and signing keys protected inside dedicated hardware. The integration highlights Microsoft’s commitment to both innovation and security in its AI offerings.
Despite the excitement surrounding Blackwell, Nvidia has faced challenges in bringing it to production. Reports indicate that overheating issues forced the company to request redesigns of its custom server racks late in the production phase. The racks, which link 72 GPUs, are intended to accelerate training of ever-larger AI models. Nvidia nevertheless remains optimistic about meeting its shipment timeline, set for the end of the first half of next year. CEO Jensen Huang acknowledged an earlier delay caused by a design flaw, describing it as entirely Nvidia’s responsibility.
Meanwhile, Microsoft expanded its AI ecosystem with enhancements to Microsoft 365 Copilot. New features include a real-time interpreter agent for Microsoft Teams that can handle up to nine languages, launching in preview next year, and a project manager agent, designed to streamline project planning and oversight, that is available in preview now. The company also introduced autonomous AI agents within Copilot Studio, letting users build agents tailored to their own workflows.
These developments underscore Microsoft’s efforts to advance AI capabilities while addressing evolving customer needs and positioning Azure as a leader in AI innovation.