Microsoft Dives into AI Infrastructure with Multi-Billion Dollar Investment in Nvidia-Backed CoreWeave
Microsoft, already a major player in the AI industry through its investment in OpenAI, is expanding its involvement in the surging demand for AI-driven services. CNBC has learned from people familiar with the matter that Microsoft has agreed to spend potentially billions of dollars over multiple years on cloud computing infrastructure from CoreWeave, a startup that recently raised $200 million in a funding round that came shortly after it reached a $2 billion valuation.
CoreWeave specializes in providing simplified access to Nvidia’s high-performance graphics processing units (GPUs), widely regarded as the best available for running AI models. Microsoft signed the CoreWeave deal earlier this year to ensure that OpenAI, the operator of the popular ChatGPT chatbot, will have ample computing power for its future needs. OpenAI relies on Microsoft’s Azure cloud infrastructure to meet its substantial computational requirements.
Both Microsoft and CoreWeave declined to comment on the specifics of the agreement.
The rush to leverage generative AI gained momentum after OpenAI introduced ChatGPT, a chatbot capable of producing sophisticated responses based on human input. Numerous companies, including Google, have since embraced generative AI for their own products, and Microsoft has been actively deploying chatbots across its services such as Bing and Windows.
With demand for its AI services running high, Microsoft is seeking additional avenues for accessing Nvidia’s GPUs. CoreWeave CEO Michael Intrator declined to discuss the Microsoft deal in an interview last month, but he said the company’s revenue has grown many times over from 2022 to 2023.
CoreWeave recently secured funding from hedge fund Magnetar Capital, extending its financing round to a total of $221 million. Nvidia, in a prior financing round, invested $100 million, as confirmed by Intrator. Established in 2017, CoreWeave presently employs 160 individuals.
Nvidia’s stock price has surged by 170% this year, briefly surpassing a $1 trillion market cap for the first time after issuing a forecast for the July quarter that exceeded Wall Street estimates by over 50%. Nvidia’s finance chief, Colette Kress, stated during an earnings call that the company’s growth would be primarily driven by data centers, reflecting the increasing demand for generative AI and large language models. OpenAI’s GPT-4, a large language model trained on Nvidia GPUs using extensive online data, serves as the foundation for ChatGPT.
Kress specifically mentioned CoreWeave by name during the call, and Nvidia CEO Jensen Huang referred to the company in his presentation at Nvidia’s GTC conference in March.
According to CoreWeave’s website, the company offers computing power that is “80% less expensive than legacy cloud providers.” Among its offerings are Nvidia’s A100 GPUs, which are also available on the Amazon, Google, and Microsoft clouds, as well as lower-cost Nvidia A40 GPUs aimed at visual computing, whereas the A100 targets AI, data analytics, and high-performance computing. Some CoreWeave clients, unable to obtain sufficient GPU power from the major cloud providers, have asked for Nvidia’s A100 or newer H100 GPUs. In those cases, CoreWeave has recommended the A40, which it says offers excellent performance at a competitive price.
Microsoft has also engaged in discussions with Oracle about the two companies renting servers from each other to handle increased capacity, The Information reported earlier this month, citing an anonymous source.