
How hard is it to build one of the world's largest data centers?

Shenyiju (神译局) · 2026-05-05 08:00
The current consensus is that the scale must be large enough.

Shenyiju is 36Kr's translation and compilation team. It focuses on technology, business, the workplace, and lifestyle, mainly introducing new technologies, new ideas, and new trends from abroad.

Editor's note: Giant data centers are forcing engineers to rewrite the industry's old rules from scratch. This article is a translation; we hope it offers some insight.

Human demand for more intelligent (which today effectively means larger) AI models appears endless. Combined with the broad adoption of existing AI technologies, global data center construction has exploded, with both the number of projects and their scale at historical highs. The most eye-catching is Hyperion, a data center in Louisiana announced by Meta in June 2025, with a planned total power capacity of 5 gigawatts. Meta CEO Mark Zuckerberg has said that Hyperion's footprint will rival much of Manhattan; the project's first phase, with 2 gigawatts of capacity, is slated for completion in 2030.

Although a 5-gigawatt campus is unmatched in the industry, dozens of similarly giant projects are now advancing around the world. Michael Guckes, chief economist at the construction data platform ConstructConnect, noted that global data center investment had already topped $27 billion by July 2025, with the annual total easily exceeding $60 billion; the Hyperion project alone accounts for roughly a quarter of that investment.

For the engineers tasked with delivering these projects, the challenges are unprecedented. The top technology giants are sparing no expense on the three core technologies of compute, cooling, and networking, building ultra-large-scale infrastructure that would have been unimaginable five years ago.

But rapid construction hides many risks. Building a large data center requires a huge influx of temporary workers, which brings noise, traffic congestion, and environmental damage, and often drives up local electricity prices. Moreover, AI data centers draw enormous amounts of power around the clock, year-round, so the environmental burden persists after completion. Recent research shows that in the United States alone, such data centers can emit tens of millions of tons of carbon dioxide equivalent per year.

Even though problems occur frequently, leading AI companies and engineers are still fully promoting the construction of giant data centers. So, what difficulties need to be overcome to build an unprecedented super-large data center?

AI Completely Rewrites Building Design Standards

Traditional data centers usually have a reinforced concrete foundation and a steel structure frame, and then concrete walls are poured. All that is needed is to build a stable building shell. Meta even once built giant temporary tents to quickly establish simple data centers.

However, the ultra-large scale of top AI data centers has created a host of new problems. Robert Haley, a vice president at the architectural consulting firm Jacobs, says the biggest problems lie underground: soft, corrosive, or expansive foundation soils can delay schedules and require costly reinforcement and remediation.

Amanda Carter, a senior technical leader at the engineering design firm Stantec, added that soil thermal conductivity is also crucial because most power facilities are buried underground. If the soil has poor thermal conductivity, the heat generated by the equipment cannot be dissipated, which can easily lead to malfunctions. Before starting construction, engineers often need to collect hundreds or thousands of soil samples for repeated testing.

Computing power hardware: cabinets as heavy as industrial machinery. AI data centers now generally deploy rack-scale integrated systems such as the NVIDIA GB200 NVL72. A single cabinet integrates 72 GPUs and 36 CPUs with up to 13.4 TB of high-bandwidth memory; it stands 2.2 meters tall and weighs more than 1.5 tons. Data centers must therefore thicken concrete slabs and add rebar for load-bearing, far beyond the standards of ordinary buildings. A single GB200 cabinet draws up to 120 kilowatts. If Hyperion reaches its full 5-gigawatt supply, the campus could host more than 41,000 cabinets, with a total chip count exceeding 3 million. Future GPU generations will deliver higher performance at higher power draw; the final number of installed cabinets may shrink slightly, but the energy burden will only grow.
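The cabinet and GPU counts above can be sanity-checked with a back-of-the-envelope calculation. This is a rough sketch: it ignores the share of power consumed by cooling, networking, and power conversion, so a real deployment would house somewhat fewer cabinets.

```python
# Rough check of the cabinet math quoted above.
# Figures from the article: 5 GW campus, ~120 kW and 72 GPUs
# per GB200 NVL72 cabinet. Overhead power is ignored.

CAMPUS_POWER_W = 5e9        # 5 gigawatts of total supply
CABINET_POWER_W = 120e3     # ~120 kW per NVL72 cabinet
GPUS_PER_CABINET = 72

cabinets = int(CAMPUS_POWER_W // CABINET_POWER_W)
gpus = cabinets * GPUS_PER_CABINET

print(f"cabinets: {cabinets:,}")  # 41,666 -> "more than 41,000"
print(f"GPUs:     {gpus:,}")      # 2,999,952 -> roughly 3 million
```

Counting the 36 CPUs per cabinet as well pushes the total chip count to about 4.5 million packages, comfortably past the article's "exceeding 3 million".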

Capital investment: hundreds of billions poured into infrastructure. Statistics show that global data center investment in the first seven months of 2025 was nearly $27 billion, with the annual figure approaching $60 billion. Meta's Hyperion project alone cost as much as $10 billion. Data center construction has become a core pillar of the construction industry. Against shrinking demand for residential and public infrastructure, ConstructConnect's third-quarter 2025 report put it bluntly: without $110 billion in new data center projects propping it up, the construction industry's downturn would be far worse.

Ample funding has led companies to abandon traditional cost-saving practices altogether. Before the AI boom, data centers prioritized low cost and ease of construction; now they compete only on speed and scale. Oversized precast concrete wall panels and floor slabs have become standard: some slabs span up to 23 meters and bear up to 3,000 kilograms per square meter, more than twice the typical limit for general industrial buildings. Many components must be custom-made, which would have made no economic sense in the past.

Construction timelines have also been compressed dramatically: a senior executive at the compute infrastructure company Cru revealed that projects that once took 30 to 36 months can now be finished in as little as 12. At that pace, allocating labor and materials becomes far harder.

Hyperion sits in Richland Parish, Louisiana, which has a permanent population of only about 20,000; at least 5,000 temporary workers have poured in during construction. High-paying jobs have boosted short-term revenue for local restaurants and shops, but residents also complain of traffic congestion, construction noise, dust, and light pollution from round-the-clock work, and tensions are rising. Construction can also disturb aquifers and strain wastewater systems; some residents who rely on well water have already seen their water quality degrade. Some US cities have gone so far as to ban new data center construction outright.

Power Demand: Mostly "Self-Built Power Supply"

Power is the top challenge for AI data centers and the heart of the environmental controversy. Traditional data centers are sited near power hubs with stable, responsive supply; utilities in Virginia, for example, have committed to building new clean-energy plants to guarantee supply for data center campuses.

However, the power consumption of top AI data centers has long exceeded the carrying capacity of the conventional power grid. Data from the Lawrence Berkeley National Laboratory in California shows that in 2014, the average total power consumption of all data centers in the United States was only 8 gigawatts; now, a top AI park can consume up to 1 gigawatt, and Hyperion consumes as much as 5 gigawatts.

Abe Ramanan, a project lead at a clean energy institution, admitted that data centers are utilities' biggest headache. To meet sudden spikes in demand, many aging, heavily polluting fossil fuel backup plants that should have been retired are being pressed back into service.

To power Hyperion, Meta partnered directly with Entergy, a Louisiana utility, to build three gas turbine power plants: two near the campus and one in the state's southeast. Entergy says Meta will bear the full cost of the plants and ordinary residents' electricity bills will not rise. A Bloomberg investigation, however, found that electricity prices rise in most areas where data centers are built; Meta's arrangement is the exception.

The carbon emission pressure is huge. A study in the journal Nature in 2025 predicted that by 2030, the annual carbon emissions of data centers in the United States alone will reach 24 million to 44 million tons of carbon dioxide equivalent. In addition to the emissions from the production of concrete building materials, most of the pollution comes from the extremely high power consumption of AI servers. The three newly built gas power plants of Hyperion are estimated to emit 4 million to 10 million tons of carbon dioxide per year throughout their life cycle, comparable to the annual emissions of the entire country of Latvia.

Data centers generally use concrete for the main structure, reinforced by a steel framework supporting the concrete exterior walls. Foundations are usually cast-in-place concrete, while walls and floor slabs are mostly precast panels spanning up to 23 meters. The slabs use a reinforced T-shaped cross-section similar to steel beams, up to 1.2 meters across at the thickest point, and a large data center can use hundreds of such panels. The American Concrete Institute estimates that this construction boom will consume 1 million tons of cement over the next three years. That is still a small share of the industry: total US cement production in 2024 was about 103 million tons.

The total installed power generation capacity of these power plants can reach 2.26 gigawatts, and all use combined-cycle gas turbines, which can recycle the waste heat in the exhaust gas. This technology can increase the thermal efficiency to over 60%, allowing more fuel to be converted into usable electricity. In contrast, simple-cycle gas turbines directly discharge exhaust gas, and the thermal efficiency is only maintained at about 40%.
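The efficiency gap described above compounds directly into fuel burned and CO2 emitted. A minimal sketch, using only the ~60% and ~40% figures quoted in the text (illustrative round numbers, not plant-specific data):

```python
# Combined-cycle vs simple-cycle gas generation, using the
# thermal efficiencies quoted above (~60% vs ~40%).

FUEL_ENERGY_MWH = 1000.0  # thermal energy in the fuel burned

def electricity_out(fuel_mwh: float, efficiency: float) -> float:
    """Electric output for a given fuel input and thermal efficiency."""
    return fuel_mwh * efficiency

combined = electricity_out(FUEL_ENERGY_MWH, 0.60)  # 600.0 MWh
simple = electricity_out(FUEL_ENERGY_MWH, 0.40)    # 400.0 MWh

# Flip it around: for the SAME electric output, fuel (and CO2)
# scales as 1/efficiency, so a simple-cycle unit burns 50% more
# gas per MWh delivered.
extra_fuel_ratio = (1 / 0.40) / (1 / 0.60)
print(f"{combined:.0f} vs {simple:.0f} MWh out; "
      f"simple cycle burns {extra_fuel_ratio:.2f}x the fuel per MWh")
```

This is why the choice of turbine cycle, discussed for Colossus II and Stargate below, matters so much for the emissions totals.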

Even with high-efficiency units, the annual carbon emissions of Hyperion's supporting power plants over their life cycle will still reach 4 million to 10 million tons of carbon dioxide, depending on how often the units run and the plants' realized efficiency. At the upper end, that equals the annual tailpipe emissions of more than 2 million private cars.
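The car comparison checks out arithmetically. One assumption not from the article: a typical US passenger car emits roughly 4.6 tonnes of CO2 per year (the commonly cited EPA figure).

```python
# Sanity check on the "more than 2 million cars" comparison.
# Assumption: ~4.6 t CO2/year per average US passenger car
# (EPA's commonly cited figure; not from the article).

PLANT_EMISSIONS_T = 10e6  # upper estimate, tonnes CO2 per year
CAR_EMISSIONS_T = 4.6     # tonnes CO2 per car per year

cars = PLANT_EMISSIONS_T / CAR_EMISSIONS_T
print(f"{cars:,.0f} cars")  # ~2.2 million, i.e. "more than 2 million"
```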

Fortunately, not all of Meta's data centers rely on gas. The company has announced that its Prometheus data center project in Ohio, planned to come online by the end of 2026, will run entirely on nuclear power. Other technology giants, however, have chosen less efficient power solutions in order to build quickly.

The extreme example is xAI's Colossus II data center in Memphis: the company simply trucked in dozens of temporary gas turbine generators to power the suburban campus. OpenAI's new Stargate data center in Abilene, Texas (expected to come online in the second half of 2026) is likewise equipped with gas turbines totaling up to 300 megawatts. Both use simple-cycle units, far less efficient than the combined-cycle plants Entergy is building for Hyperion.

Gas turbines are currently in short supply, with lead times on new orders stretching as long as seven years. To stay on schedule, some data centers are even refurbishing retired aircraft engines to serve as power generation turbines.

AI Cabinets: Redefining Load-Bearing and Energy Consumption

Today's urgent demand for new, stable power stems from the astonishingly power-hungry GPUs in modern AI data centers. In January 2025, Mark Zuckerberg announced that Meta plans to have at least 1.3 million GPUs in operation by the end of 2025. OpenAI's Stargate data center is planned to carry more than 450,000 NVIDIA GB200 GPUs; the designed capacity of xAI's Colossus II (an expansion of Colossus I) exceeds 550,000 GPUs.

GPUs remain the hardware of choice for AI compute workloads. They are assembled into integrated rack-scale cabinets the size of large wardrobes; like the data centers that house them, these cabinets are rapidly growing in weight, structural complexity, and power draw.

Beyond raw compute, the NVIDIA GB200 NVL72 cabinet also demands enormous memory. A single cabinet carries up to 13.4 TB of high-bandwidth memory, and an ultra-large campus like Hyperion needs an aggregate memory pool of at least dozens of petabytes. Soaring demand has driven up memory prices directly: in 2025, DRAM prices (especially DDR5) rose by as much as 172%.
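The "dozens of petabytes" floor is easy to verify. A hypothetical assumption not from the article: even a partial deployment of a few thousand NVL72 cabinets, using only the 13.4 TB of HBM per cabinet quoted above.

```python
# Illustrating the "at least dozens of petabytes" claim.
# Hypothetical: a partial deployment of 3,000 NVL72 cabinets,
# counting only the 13.4 TB of HBM each (figure from the article).

CABINETS = 3000              # hypothetical partial deployment
HBM_PER_CABINET_TB = 13.4

total_pb = CABINETS * HBM_PER_CABINET_TB / 1000  # TB -> PB
print(f"{total_pb:.1f} PB")  # 40.2 PB of HBM alone
```

A full 41,000-cabinet campus would push the HBM total into the hundreds of petabytes, so "at least dozens" is a conservative lower bound.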

The Hyperion data center comprises 11 main buildings with a combined power capacity of 5 gigawatts. Assuming similar configurations, each building draws close to 500 megawatts, and the full 5-gigawatt campus could supply roughly 4.2 million American households. This single Richland Parish campus consumes twice as much electricity as xAI's "Colossus" data center, which ranked among the world's largest super-large data centers when it was completed in the summer of 2024.
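The per-building and household figures follow from simple division. One assumption not in the article: an average US household draws about 1.2 kW continuously (roughly 10,500 kWh per year, in line with commonly cited EIA figures).

```python
# Checking Hyperion's per-building and household figures.
# Assumption: average US household draws ~1.2 kW continuously
# (~10,500 kWh/year; a rough EIA-style figure, not from the article).

CAMPUS_POWER_W = 5e9   # 5 GW total
BUILDINGS = 11
HOUSEHOLD_W = 1.2e3    # ~1.2 kW per household, continuous

per_building_mw = CAMPUS_POWER_W / BUILDINGS / 1e6
households = CAMPUS_POWER_W / HOUSEHOLD_W

print(f"{per_building_mw:.0f} MW per building")    # ~455 MW
print(f"{households / 1e6:.1f} million households")  # ~4.2 million
```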

The NVIDIA GB200 NVL72 rack system is the current mainstream choice for AI data centers. A single cabinet houses 72 GPUs and 36 CPUs with up to 17 TB of total memory; it stands 2.2 meters tall, weighs up to 1,553 kilograms, and draws about 120 kilowatts, roughly the combined consumption of 100 American households. NVIDIA says this is only the beginning: per-cabinet power draw may eventually climb to 1 megawatt.

Viktor Petrik, the senior vice president of infrastructure solutions at Vertiv, admitted that the rapid iteration of AI full cabinet technology