Elon Musk announced the "Space AI" plan.
Beyond hardware costs, electricity generation, transmission, and cooling are set to become the main constraints on large artificial intelligence data centers over the next few years. With this in mind, Elon Musk recently proposed a disruptive vision: deploying AI computing centers in space.
Musk serves as CEO of xAI, SpaceX, and Tesla. The first two develop large AI models and commercial spaceflight respectively, while Tesla spans electric vehicles, energy storage, robotics, and other businesses. Linked together, these companies could provide nearly closed-loop support for his vision; if it succeeds, they may also be its biggest beneficiaries.
Why this vision?
Musk believes that within the next four to five years, running large-scale artificial intelligence systems in orbit will be more cost-effective than running comparable systems on Earth, mainly thanks to "free" solar energy and relatively straightforward cooling.
He previously said at the U.S.-Saudi Investment Forum: "I estimate that even before Earth's energy potential is exhausted, the cost-effectiveness of electricity and artificial intelligence in space will be far better than that of terrestrial artificial intelligence today. I think that even within a time frame of four to five years, the most cost-effective way to perform artificial intelligence computing will be to use solar-powered AI satellites."
"I think it won't be more than five years from now," he added.
Musk emphasized that as computing clusters grow, the combined demand for power and cooling will escalate to a level that terrestrial infrastructure can hardly keep up with. He claimed that reaching 200 to 300 gigawatts of continuous computing capacity would require building large, expensive power plants, given that a typical nuclear power plant generates about 1 gigawatt of continuous power.
Meanwhile, continuous power generation in the United States currently averages about 490 gigawatts (although Musk said "per year", he means continuous generating capacity over a given period), so most of it cannot be allocated to artificial intelligence. Musk said that any AI-related power demand approaching the terawatt level is unfeasible on Earth's grid.
"You can't build a power plant of that scale. A continuous generating capacity of 1 terawatt, for example, is simply impossible. You have to do it in space. In space you can use continuous solar energy; in fact, you don't need batteries, because there is always sunlight. Moreover, solar panels actually become cheaper because you don't need glass or frames, and cooling is just radiative cooling," he explained.
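The power-scale argument above can be sanity-checked with back-of-envelope arithmetic, using only the figures cited in this article:

```python
# Back-of-envelope check of the power-scale argument, using the
# figures cited above (all values are continuous power, in gigawatts).
AI_TARGET_GW = 300        # upper end of the cited 200-300 GW AI demand
NUCLEAR_PLANT_GW = 1      # typical continuous output of one nuclear plant
US_CAPACITY_GW = 490      # cited continuous U.S. generating capacity

plants_needed = AI_TARGET_GW / NUCLEAR_PLANT_GW
share_of_us_grid = AI_TARGET_GW / US_CAPACITY_GW

print(f"Equivalent nuclear plants: {plants_needed:.0f}")
print(f"Share of U.S. continuous capacity: {share_of_us_grid:.0%}")
```

On these numbers, AI alone would need the output of roughly 300 nuclear plants, about 60% of current U.S. continuous generation, which is the core of Musk's claim that terawatt-class AI demand cannot be met on the terrestrial grid.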
Musk's plan
According to reports, Musk's core plan is to deploy 100 gigawatts of solar-powered AI satellites in orbit every year, a scale comparable to a quarter of total U.S. electricity consumption.
He posted on November 19: "Starship should be able to deliver roughly 300 gigawatts, perhaps even 500 gigawatts, of solar-powered AI satellites to orbit per year." He added that at this rate, orbital AI computing power could exceed total U.S. electricity consumption, which averages about 500 gigawatts, within a few years.
This is not just a matter of launching hardware. It is an important step towards what Musk describes as a "Kardashev Type II civilization", a theoretical milestone that refers to a society's ability to harness the entire energy output of a star.
According to posts on X, Musk has repeatedly linked Starship's capabilities to this scale, noting that the energy accessible to space solar power is "more than a billion times" Earth's total resources. The concept draws on ideas such as the "Dyson sphere", but Musk's version centers on a swarm of AI satellites that process data while harvesting virtually unlimited solar energy.
However, according to Musk, "there is still a key link holding it back." That link is most likely the scaling-up of manufacturing and of on-orbit assembly.
Some analysts note that these satellites would not simply drift idle: they would form a network of solar-powered computing nodes. According to a PCMag report earlier this month, the concept resembles a "Dyson sphere" of satellites that harvest solar energy and could even cool the Earth by blocking sunlight, assisting in climate control.
Musk also previously wrote on X: "Ultimately, solar - powered AI satellites are the only way to achieve a 'Kardashev Type II civilization'."
In addition, to reach the upper bound of 300-500 gigawatts deployed per year, Musk also suggests manufacturing on the Moon. In a post on X on November 2, 2025, he said: "A lunar base can produce 100 terawatts of electricity per year. The base can manufacture solar-powered AI satellites on-site and use a mass driver to accelerate them to escape velocity."
Still just a dream
Although the future Musk describes is extremely optimistic, numerous obstacles stand in the way. Orbital debris, regulatory approvals, and international space policy all pose risks. NVIDIA CEO Jensen Huang commented: "This is just a dream."
In theory, space is an ideal place to generate power and cool electronics, since temperatures in shadow can drop as low as -270°C. In practice it is not that simple: in direct sunlight, temperatures can climb as high as +120°C.
In Earth orbit, however, the temperature swings are narrower: -65°C to +125°C in low Earth orbit (LEO), -100°C to +120°C in medium Earth orbit (MEO), -20°C to +80°C in geostationary orbit (GEO), and -10°C to +70°C in high Earth orbit (HEO).
LEO and MEO are poorly suited to "space data centers" because of unstable illumination, severe thermal cycling, passes through the radiation belts, and frequent eclipses. GEO is more feasible: it enjoys abundant sunlight year-round (eclipse seasons do occur annually, but they are brief), and radiation levels are comparatively low.
Even in geostationary orbit, however, building large-scale AI data centers faces severe challenges: megawatt-class GPU clusters need enormous heat-dissipation wings, since heat can only be shed by infrared radiation. Each gigawatt-class system would therefore require tens of thousands of square meters of deployable structures, far beyond anything any spacecraft has flown to date.
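The radiator-area problem follows directly from the Stefan-Boltzmann law: in vacuum, a panel sheds heat only by infrared radiation, at roughly ε·σ·T⁴ watts per square meter of emitting surface. A minimal sketch (the radiator temperature, emissivity, and double-sided geometry below are illustrative assumptions, not figures from the article):

```python
# Radiator area needed to reject waste heat purely by infrared
# radiation (Stefan-Boltzmann law). Temperature, emissivity, and the
# double-sided panel geometry are illustrative assumptions.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_watts, temp_kelvin=300.0, emissivity=0.9,
                     sides=2):
    """Panel area that radiates `heat_watts` to deep space.

    Ignores absorbed sunlight and Earthshine, so this is a lower bound.
    """
    flux = sides * emissivity * SIGMA * temp_kelvin ** 4  # W per m^2 of panel
    return heat_watts / flux

print(f"1 MW cluster: {radiator_area_m2(1e6):,.0f} m^2")
print(f"1 GW cluster: {radiator_area_m2(1e9):,.0f} m^2")
```

Under these assumptions a megawatt-class cluster already needs on the order of a thousand square meters of panel, and the area scales linearly with power. Running the radiators hotter shrinks the area as T⁴, but forces the chips to operate at higher temperatures.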
Moreover, launching a project of this scale would require thousands of Starship-class flights, which is unrealistic within Musk's four-to-five-year horizon and extremely costly.
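The "thousands of flights" figure can be reproduced from two assumed parameters: the satellites' specific power (watts of capacity per kilogram of satellite mass, an assumption here) and Starship's payload to low Earth orbit (SpaceX's publicly stated target of roughly 100 tonnes):

```python
# Rough launch-cadence estimate. Both parameters are assumptions:
# satellite specific power (W of capacity per kg of satellite mass,
# optimistic for current technology) and Starship payload to LEO
# (SpaceX's stated target of roughly 100 tonnes).
SPECIFIC_POWER_W_PER_KG = 1_000
STARSHIP_PAYLOAD_KG = 100_000

def flights_per_year(deployed_gw_per_year):
    """Starship launches per year to orbit the given satellite capacity."""
    mass_kg = deployed_gw_per_year * 1e9 / SPECIFIC_POWER_W_PER_KG
    return mass_kg / STARSHIP_PAYLOAD_KG

print(f"100 GW/yr: {flights_per_year(100):,.0f} flights")
print(f"300 GW/yr: {flights_per_year(300):,.0f} flights")
```

Even with these generous assumptions, Musk's 100 GW/yr baseline implies roughly a thousand flights per year, and the 300 GW upper bound several thousand; a lower specific power pushes the cadence up proportionally.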
Furthermore, high-performance AI accelerators such as Blackwell or Rubin, along with their supporting hardware, cannot operate reliably under GEO radiation without heavy shielding or thorough radiation-hardening. Such modifications significantly reduce clock frequency and/or require new process technologies optimized for radiation tolerance rather than performance, further undermining the feasibility of AI data centers in GEO.
Finally, at the proposed scale, technologies such as high-bandwidth links to Earth, autonomous maintenance, debris avoidance, and robotic servicing remain in their infancy. This may be why Jensen Huang said that, for now, all of this is just a "dream".
This article is from the WeChat official account "Caixin Lianxun", author: Huang Junzhi. It is published by 36Kr with authorization.