xAI Plans to Let Cursor Train Its Composer 2.5 AI Coding Model on Tens of Thousands of xAI GPUs

Elon Musk's AI company xAI plans to allow Cursor, one of the most popular AI coding tools, to use its GPU infrastructure to train Cursor's Composer 2.5 coding model, according to Business Insider. The arrangement would give Cursor access to tens of thousands of xAI's GPUs — significant compute for a company that primarily builds AI tooling rather than trains frontier models.
What's Being Shared
xAI's GPU cluster, built around Nvidia's latest accelerators at the company's Memphis "Colossus" facility, is one of the largest AI training clusters in the world. Allowing Cursor to use this infrastructure to train Composer 2.5 is a meaningful resource transfer — one that could give Cursor's coding model capabilities it couldn't achieve as easily with standard cloud compute.
Composer is Cursor's agentic coding feature, designed to handle multi-step coding tasks rather than simple completions. Training a more capable version of Composer on xAI's infrastructure would directly improve the product that has made Cursor one of the fastest-growing developer tools in the market.
xAI's Strategy
For xAI, providing compute to Cursor is a new kind of strategic play. Rather than competing head-on in every AI application category, xAI is positioning its infrastructure as a resource that other AI companies can use — potentially in exchange for equity, data, or distribution relationships. It's a model that resembles what cloud providers have done with AI startups, but with xAI's own proprietary hardware as the differentiator.
This is also consistent with xAI's broader ambition to monetize its massive compute investment. Training runs for other companies' models generate revenue and utilization without requiring xAI to build every application layer itself.
Cursor's Position
Cursor has become one of the defining tools of the AI coding era, with significant adoption among professional developers. The company recently made headlines when Uber reportedly blew through its Cursor Pro budget, leaving hundreds of engineers on a waitlist. That kind of demand puts pressure on Cursor to deliver ever-more-capable models — and access to xAI's compute could help it do exactly that.
The Bottom Line
xAI sharing GPUs with Cursor is compute-as-strategy. It's a novel form of partnership in the AI ecosystem — one that trades infrastructure access for strategic relationships. If it works, expect more such arrangements as AI compute owners look for ways to monetize their infrastructure beyond first-party products.