Why is Elon Musk taking space data centers seriously?
Over the past few years, the AI industry has appeared to be booming: model parameter counts keep growing, ever more training data is fed in, and funding rounds routinely reach tens of billions of dollars.
However, behind this prosperity, electricity remains the biggest bottleneck.
On February 5, 2026, Elon Musk described xAI's real situation in a recent interview: to bring the Colossus cluster online, the company had to build a power plant and run transmission lines across state lines, and even considered manufacturing key components itself. Chip production capacity is growing exponentially, but the power supply is stuck in the long cycles of permitting, cooling, and equipment delivery.
His conclusion is: The path on the ground won't work. In the next 36 months, the cheapest place to deploy AI will be in space.
Why space? Musk has done the math.
If this happens, the rules of the entire AI industry will be rewritten:
- Data centers will no longer rely on the power grid.
- Computing power expansion will no longer be restricted by ground permits.
- The ceiling for large-scale growth will move from the Earth to space.
The only question left is: Do you believe it?
Section 1 | It's not a lack of chips, but a lack of electricity to start them
In the past, when the industry discussed the computing-power bottleneck, it focused on chip production capacity. But Musk pointed out that what really restricts the large-scale growth of AI is not GPUs, but electricity.
The Colossus 2 cluster of xAI is equipped with 330,000 GB300 chips. Musk has calculated that to run them all, including network equipment, CPUs, storage, cooling systems, and backup power, a total power generation capacity of 1 gigawatt is required.
What does that mean? Average electricity demand in the United States is about 500 gigawatts, so a single cluster would draw 0.2% of it.
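These figures can be sanity-checked with a bit of arithmetic. A minimal sketch using only numbers quoted in the article (the per-chip wattage is derived here, not stated in the interview):

```python
# Sanity check of the power figures quoted in the article.
cluster_power_gw = 1.0        # Colossus 2 total draw, per Musk (incl. networking, cooling, backup)
us_average_demand_gw = 500.0  # average U.S. electricity demand cited in the article
chips = 330_000               # GB300 chips in the cluster

share = cluster_power_gw / us_average_demand_gw
# Derived, not stated: average draw per chip, overhead included.
watts_per_chip = cluster_power_gw * 1e9 / chips

print(f"One cluster = {share:.1%} of average U.S. demand, ~{watts_per_chip:.0f} W per chip")
```

The roughly 3 kW per chip reflects the whole cluster's draw, networking and cooling included, not the GPU's own rated power.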
However, even if there is theoretically enough electricity, this 1 gigawatt cannot be connected in reality.
Why can't it be connected? Musk sorted out the entire process before getting the chips up and running:
Power generation permit: xAI originally planned to build a power plant in Tennessee but got stuck in state-level approval. It ultimately moved to Mississippi and had to run high-voltage lines several miles across state lines.
Equipment backlog: only three foundries in the world can manufacture the rotating and stationary blades of gas turbines, and their order books are full until 2030. Musk said SpaceX and Tesla may have to build their own blade production lines.
Cooling and grid connection: chip power draw is just the starting point. In actual operation, power must also feed servers, networking, storage, and air conditioning, and a 40% cooling-power margin has to be reserved for peak loads. Want to connect to the public grid? The utility will first run a one-year study, then tell you whether you can connect based on the report.
One year is too long for an AI company to wait. xAI needs to launch the cluster within a few months, but the power industry offers an approval cycle measured in years. The time difference between a few months and a year makes rapid expansion impossible.
Finally, xAI built its own power plant and purchased its own power generation equipment.
These specific cases occurred in the United States, but the power bottleneck is not just a problem in the United States. Musk mentioned in the interview that except for China, global power output is basically flat, while chip production capacity is growing exponentially. No matter which country it is, it will soon face the problem of power shortage.
Musk predicts that by the end of this year, there will be a batch of AI cluster chips sitting in warehouses unable to be started. The problem is not a lack of chips, but a lack of electricity.
Section 2 | Space is not a dream, it's a calculated choice
If the ground path doesn't work, then look up.
This idea sounds radical, but Musk didn't make this bet based on intuition. He compared the costs: Every obstacle on the ground can become an advantage in space.
Take power generation. Space solar has two advantages: it runs at full power 24 hours a day, with no clouds or atmosphere blocking the sunlight (roughly 30% more intense than on the ground), and it needs no batteries.
Musk calculated:
"Chinese solar panels are already as cheap as $0.25 per watt. In space, power-generation efficiency is five times that on the ground, and the battery cost is saved. Overall, the cost per kilowatt-hour is one-tenth of that on the ground."
More importantly, none of the ground-based obstacles from Section 1 exist in space: no permit cycles, no equipment bottlenecks, no grid coordination, no cooling burden.
Combining these factors: The cost is ten times lower, and there are no obstacles. Musk's judgment is that within 36 months, space will become the cheapest place to deploy AI.
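Musk's one-tenth figure can be roughly reconstructed. The sketch below uses the article's $0.25/W panel price and 5x space yield; the storage-cost assumption (batteries roughly doubling a ground system's cost) is an illustrative guess, not a number from the interview:

```python
# Rough reconstruction of Musk's cost comparison (assumption-laden sketch).
panel_cost_per_watt = 0.25  # $/W, figure quoted in the article
ground_yield = 1.0          # relative annual energy per watt of panel on the ground
space_yield = 5.0           # article: ~5x yield in space (24h sun, no atmosphere)

# Assumption (not from the article): off-grid ground solar needs storage
# that roughly doubles system cost; space needs none.
ground_system_cost = panel_cost_per_watt * 2.0  # panel + batteries
space_system_cost = panel_cost_per_watt         # panel only

ground_cost_per_unit_energy = ground_system_cost / ground_yield
space_cost_per_unit_energy = space_system_cost / space_yield

ratio = ground_cost_per_unit_energy / space_cost_per_unit_energy
print(f"Space is ~{ratio:.0f}x cheaper per unit of energy under these assumptions")
```

Under these assumptions the 5x yield and the eliminated storage each contribute a factor, multiplying out to roughly the 10x Musk cites.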
Moreover, he's not just talking. SpaceX and Tesla have already started making plans.
SpaceX aims for a Starship launch cadence of 10,000-30,000 flights per year, at 100-150 tons of payload each. The number sounds extreme, but in Musk's plan it is the prerequisite for large-scale space computing power.
Both companies are also pushing a plan to produce 100 gigawatts of solar cells per year. Solar panels used in space can be lighter and thinner, with no need for thick frames or wind-resistant designs, which lowers cost.
From technology to cost to production capacity, every aspect points in the same direction: Space is the next stop for AI computing power expansion.
Section 3 | In five years, space AI will exceed the total on Earth
Space is the best choice. When can it be realized?
Within 36 months, space will become the cheapest place to deploy AI. This is the first stage.
In five years, the AI computing power launched into space each year will exceed the cumulative total of all AI on Earth. From then on, space will become the main battlefield for AI computing power competition.
1. What's the scale in five years?
Musk predicts that in five years, the annual increment of space AI will reach hundreds of gigawatts. What does that mean? Let's take 100 gigawatts as an example:
- It requires 100 million chips (assuming each chip draws 1 kilowatt).
- It requires matching solar arrays and radiators.
- It requires roughly 10,000 Starship launches (at 100-150 tons of payload each).
For comparison: average U.S. electricity demand is 500 gigawatts. Launching 200 gigawatts of capacity per year would add the equivalent of total U.S. demand every two and a half years.
Musk said that SpaceX is preparing for 10,000-30,000 launches per year, and that this transport capacity can support an annual increment of hundreds of gigawatts of space AI. It won't hit the ceiling of rocket-propellant supply until around 1 terawatt per year.
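The launch numbers above also imply a mass budget. A sketch using the article's figures; the kilograms-per-chip result is derived here, not claimed in the interview:

```python
# What the launch numbers imply (derived figure flagged below).
annual_increment_gw = 100   # article's example: 100 GW of space AI per year
chip_power_kw = 1.0         # article's assumption: 1 kW per chip
launches_per_year = 10_000  # article's figure for 100 GW
payload_tons = 100          # conservative end of the 100-150 t range

chips = annual_increment_gw * 1e6 / chip_power_kw  # GW -> kW -> chip count
chips_per_launch = chips / launches_per_year
# Derived (not stated in the article): mass budget per chip, including its
# share of solar arrays, radiators, and structure.
kg_per_chip = payload_tons * 1000 / chips_per_launch

print(f"{chips:.0f} chips/year, {chips_per_launch:.0f} per launch, ~{kg_per_chip:.0f} kg each")
```

A budget of about 10 kg per kilowatt of compute has to cover the chip plus its share of array, radiator, and structure, which is itself an aggressive engineering target.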
2. Why can't the Earth support it?
It's not that the technology can't get there; the physics won't allow it.
The Earth intercepts only about one two-billionth of the sun's total radiation. Even if you only wanted to tap one-millionth of that solar energy, it would still be 100,000 times humanity's current global power output.
To absorb this level of energy, you have to go to space. Musk said: You can't expand on a large scale on Earth.
Once the computing power demand reaches the terawatt level, the Earth has only one option: to supply power to edge devices (such as Tesla cars and Optimus robots). Centralized computing power must go to space.
3. A longer-term plan: the Moon
The limit for launching from the Earth is 1 terawatt per year. Beyond this scale, launches will have to be made from the Moon.
The lunar soil is about 20% silicon and rich in aluminum, which can be used to manufacture solar cells and radiators on-site; the chips would be shipped from Earth. Musk envisions a lunar base using a mass driver to launch AI satellites into deep space at 2.5 kilometers per second, with a transport capacity of up to 1 petawatt (1 million gigawatts) per year.
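The 2.5 km/s figure is just above the Moon's escape velocity, which is why a mass driver alone, with no rocket stage, could send payloads into deep space. A quick check using the standard escape-velocity formula (the lunar constants are textbook values, not from the article):

```python
import math

# Escape velocity v = sqrt(2GM/R); a payload launched faster than this
# leaves the Moon's gravity well without further propulsion.
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
moon_mass_kg = 7.342e22   # lunar mass
moon_radius_m = 1.7371e6  # mean lunar radius

v_escape = math.sqrt(2 * G * moon_mass_kg / moon_radius_m)  # ~2.38 km/s
print(f"Lunar escape velocity: {v_escape/1000:.2f} km/s (mass driver: 2.50 km/s)")
```

With no atmosphere to fight, a purely electromagnetic launcher clears this threshold, which is what makes the no-rocket lunar export scenario physically coherent.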
He said this is "true large - scale development."
From 36 months to five years to the lunar base, this is the expansion path deduced by Musk based on physical laws.
The Earth can't support the exponential growth of computing power. This is physics.
Space is the only place with unlimited expansion potential. This is also physics.
Section 4 | Three companies, one goal
From physical feasibility to actual implementation, Musk didn't suddenly decide to do space AI. It's a natural convergence of his three business lines.
1. SpaceX has transportation capacity.
Starship's transport capacity was originally meant for colonizing Mars. Now it has an additional interim goal: becoming a super-large-scale cloud service provider. The capacity is there, just waiting for launches.
2. Tesla has manufacturing capabilities.
Solar panels used in space can be lighter and thinner, with no need for thick frames or wind-resistant designs, which lowers manufacturing cost. Tesla's mass-production capability matches SpaceX's launch requirements.
3. xAI has demand.
xAI aims to develop super-large-scale AI models with the goal of understanding and simulating complex systems.
Musk said he has locked in all available chip capacity at TSMC and Samsung. But by the end of this year, the limiting factor will shift from chips to electricity. For xAI to lead, it must bring more chips online faster than anyone else, and space is the only place that can break the ground-based power bottleneck.
These three companies form a complete closed loop:
SpaceX provides transportation capacity to send computing power into orbit.
Tesla provides large - scale manufacturing capabilities for solar panels.
xAI provides application scenarios to verify the commercial value of space computing power.
Each of these capabilities alone takes a decade to build. But Musk holds all three lines at once, which is how space AI went from an idea to an executable plan.
The space data center is not the end, but a necessary step for the three companies to move towards a larger scale.
Conclusion | The limit of the Earth, the starting point of space
Musk has said: There will be more and more chips, but you may not be able to run them at all.
Because ground-based data centers are approaching their energy limits. Power constrains expansion, and permits, cooling, and equipment throw up obstacles everywhere. Space has become the only way out.
SpaceX has transportation capacity, Tesla has manufacturing capabilities, and xAI has demand. With these three lines ready, it's just a matter of time.
The next stop is not on the ground, but in space.
📮 References:
https://www.youtube.com/watch?v=BYXbuik3dgA&t=2217s
https://www.reuters.com/business/aerospace-defense/spacex-seeks-fcc-nod-solar-powered-satellite-data-centers-ai-2026-01-31
https://www.dwarkesh.com/p/elon-musk
https://www.reuters.com/sustainability/climate-energy/why-does-elon-musk-want-put-ai-data-centers-space-2026-01-29/
https://apnews.com/article/elon-musk-orbital-ai-data-centers-xai-spacex-92bc8ad95593bf3b5b801ddf36427194
This article is from the WeChat official account "AI Deep Researcher", author: AI Deep Researcher, published by 36Kr with authorization.