
Elon Musk’s AI company, xAI, plans to put its stockpile of computing power to use in a new arrangement with coding startup Cursor, according to people familiar with the matter.
Cursor plans to train its latest AI coding model, Composer 2.5, on xAI infrastructure, the people said. Cursor will use tens of thousands of xAI's graphics processing units (GPUs), the chips used to train AI models, they said.
The setup effectively turns xAI into a kind of cloud provider. By renting some of its GPUs to other companies, xAI could start generating revenue from its massive infrastructure while still developing its own AI models. The arrangement could help the company offset the costs of building and operating data centers, while also deepening ties with a startup that has access to valuable coding data.
Amazon, Microsoft, and Google, the largest cloud providers, own millions of chips and rent computing power out to thousands of companies and developers, generating huge profits. Newer players like CoreWeave and Lambda have built businesses around supplying GPUs to AI model developers. Access to computing power has become an increasingly competitive aspect of the AI arms race.
Representatives for xAI and Cursor did not respond to a request for comment.
It's not the first time Cursor and xAI have overlapped. xAI hired two former Cursor product engineering leads, Andrew Milich and Jason Ginsburg, in March. Ginsburg and Milich oversee xAI's product team and report directly to Musk and xAI president Michael Nicolls, Business Insider previously reported.
xAI is one of many companies racing to build the best AI models, and it has one of the largest data center footprints. Musk said during an all-hands last December that xAI would beat competitors like OpenAI and Anthropic because it would have access to more power to train its models.
Over the past two years, xAI has rapidly expanded the footprint of its data centers, a project it has named Colossus. Last year, the company said it had around 200,000 Nvidia GPUs, and Musk has said it plans to expand to 1 million GPUs.
xAI's infrastructure team has been experiencing a leadership shake-up. Its infrastructure lead, Heinrich Küttler, left last week. The company moved Jake Palmer into a leadership role over the physical infrastructure team, and SpaceX's Daniel Dueri took over the compute infrastructure team, Business Insider previously reported.
In a memo to staff last week, Nicolls said the company's model FLOPs utilization (MFU), a measure of how efficiently GPUs are used during AI training, was "embarrassingly low" at about 11%. Nicolls said he aims for the team to reach 50% in the next few months. For comparison, most large-scale AI training operates at between 35% and 45% MFU, according to AI infrastructure company Lambda.
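MFU compares the compute a training run actually achieves against the hardware's theoretical peak. A minimal sketch of the calculation, assuming a dense transformer model and the common rule of thumb of roughly 6 FLOPs per parameter per training token (the function name and all example figures below are illustrative, not xAI's):

```python
def model_flops_utilization(tokens_per_sec: float,
                            n_params: float,
                            n_gpus: int,
                            peak_flops_per_gpu: float) -> float:
    """Estimate MFU: achieved training FLOPs divided by theoretical peak.

    Uses the approximation that training a dense transformer costs
    about 6 FLOPs per parameter per token.
    """
    achieved_flops_per_sec = 6 * n_params * tokens_per_sec
    peak_flops_per_sec = n_gpus * peak_flops_per_gpu
    return achieved_flops_per_sec / peak_flops_per_sec

# Hypothetical run: a 70B-parameter model processing 1M tokens/sec
# on 1,000 GPUs, each rated at 1 petaFLOP/s peak.
mfu = model_flops_utilization(
    tokens_per_sec=1e6,
    n_params=70e9,
    n_gpus=1000,
    peak_flops_per_gpu=1e15,
)
print(f"MFU: {mfu:.0%}")  # 42%
```

By this measure, an 11% MFU means roughly nine in ten available GPU cycles go unused during training, which is why raising it toward the 35-45% range typical of large-scale runs is a direct cost lever.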
Cursor is reportedly in talks to raise funding at a valuation of around $50 billion, Bloomberg reported last month. Meanwhile, it faces pressure as major AI startups like Anthropic and OpenAI push aggressively into building coding assistants.
In March, Cursor released Composer 2, a coding model designed to generate and edit code across large projects. Cursor built the model on top of an open-source AI model from Chinese startup Moonshot AI and fine-tuned it using data from its own developer user base.
Do you work at xAI or have a tip? Contact this reporter via email at [email protected] or Signal at 248-894-6012. Use a personal email address, a non-work device, and non-work WiFi; here’s our guide to sharing information securely.
Read the original article on Business Insider




