The US Department of Energy forecasts that data centre electricity consumption could nearly triple within three years, potentially reaching 12 percent of total US electricity supply. Every watt of electricity consumed by computing hardware eventually becomes heat. And every unit of heat generated inside a data centre requires a cooling system to remove it.

The HVAC industry has spent decades talking about data centres as a growing opportunity. In 2026, the DOE's electricity forecast makes the argument definitively: data centre cooling is not a niche vertical for specialised contractors. It is the fastest-growing segment of commercial HVAC in the country, and the companies that build credible capability in it today are positioning for a decade of compounding growth.

The DOE Forecast Explained

The Department of Energy's estimate reflects the convergence of three demand drivers that are simultaneously accelerating:

• AI training workloads: Training large language models and other AI systems requires enormous amounts of computation. The GPU clusters used for AI training draw 10 to 50 megawatts per facility — equivalent to the power consumption of a small city. As AI models grow larger and more numerous, training compute demand grows with them.

• AI inference at scale: Once trained, AI models are deployed to answer questions, generate content, and process data. Every ChatGPT query, every AI image generation, every AI-assisted software function consumes computing power — and generates heat. The cumulative inference demand from billions of daily interactions is enormous.

• Cloud computing and storage expansion: Separate from AI specifically, the ongoing expansion of cloud computing infrastructure for business applications, streaming, and data storage continues to add data centre capacity. AI is the marginal growth driver, but cloud expansion is the baseline.

In short: the DOE expects data centre electricity consumption to nearly triple within three years, driven primarily by AI training and inference workloads that generate heat densities far beyond what conventional air cooling can manage economically.

What Tripling Electricity Means for Cooling

The relationship between data centre electricity consumption and cooling demand is direct: roughly 30 to 40 percent of total data centre energy goes to cooling infrastructure. If data centre electricity consumption triples from its current level, cooling energy consumption scales up roughly in proportion, and capital expenditure on cooling equipment scales with it as that infrastructure is built and upgraded.
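The proportionality above can be sketched in a few lines. The 30 to 40 percent cooling share and the "nearly triple" multiplier come from this article; the baseline consumption figure below is a hypothetical placeholder, not DOE data.

```python
# Illustrative arithmetic only: the ~35% cooling share is the midpoint
# of the 30-40% range cited above; baseline_twh is a made-up example.
def cooling_energy(total_twh: float, cooling_share: float = 0.35) -> float:
    """Energy consumed by cooling, given total facility energy."""
    return total_twh * cooling_share

baseline_twh = 100.0            # hypothetical annual consumption
tripled_twh = baseline_twh * 3  # the DOE's "nearly triple" scenario

before = cooling_energy(baseline_twh)  # 35.0 TWh
after = cooling_energy(tripled_twh)    # 105.0 TWh
print(f"Cooling load grows from {before:.0f} to {after:.0f} TWh")
```

Whatever the baseline, the cooling term scales one-for-one with total consumption, which is why a tripling forecast for electricity is effectively a tripling forecast for cooling demand.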

The global data centre cooling market was valued at approximately $19.5 billion in 2025 and is projected to reach $45.8 billion by 2033, an implied compound annual growth rate of roughly 11 percent. The DOE's tripling forecast, if accurate, suggests the upper end of that projection — or possibly beyond it.
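The growth rate implied by those two market figures is straightforward to derive from the endpoints:

```python
# Implied compound annual growth rate (CAGR) from the market figures
# cited above: $19.5bn in 2025 growing to $45.8bn in 2033.
def implied_cagr(start: float, end: float, years: int) -> float:
    """Annualised growth rate that takes `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

cagr = implied_cagr(19.5, 45.8, 2033 - 2025)
print(f"Implied CAGR: {cagr:.1%}")  # roughly 11%
```

Note that different market reports use different base years and scopes, so published CAGR figures for this market vary; the calculation above is simply what these particular endpoints imply.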

For HVAC manufacturers and commercial contractors, the implications are straightforward: the market for data centre cooling equipment and services is growing faster than any other segment of the industry, and the growth is driven by structural technology demand that will not reverse.

The Shift to Liquid Cooling — And Why HVAC Expertise Is Central

The tripling of data centre electricity consumption is not uniformly distributed across facility types. The growth is concentrated in high-density AI deployments — facilities where GPU clusters generate 30 to 100 kilowatts per rack. Conventional air cooling cannot manage these loads economically.
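To see why those rack densities break conventional air cooling, it helps to translate electrical draw into cooling load. The sketch below assumes essentially all rack power becomes heat and uses the standard conversion of 1 ton of refrigeration = 3.517 kW; the rack figures are the 30 to 100 kW range cited above.

```python
# Rough sizing sketch: per-rack electrical draw converted to cooling
# load. Assumes all rack power is rejected as heat (a reasonable
# first-order assumption for IT equipment).
KW_PER_TON = 3.517  # 1 ton of refrigeration in kW

def rack_cooling_tons(rack_kw: float) -> float:
    """Cooling tonnage needed to remove a rack's heat output."""
    return rack_kw / KW_PER_TON

for kw in (30, 100):
    print(f"{kw} kW rack needs about {rack_cooling_tons(kw):.1f} tons of cooling")
```

A single 100 kW rack needs on the order of 28 tons of heat removal — roughly the cooling plant of a small office building, concentrated in one cabinet footprint. That concentration, not the total load, is what pushes operators toward liquid cooling.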

The industry is shifting toward liquid cooling: rear-door heat exchangers, coolant distribution units, direct-to-chip cooling, and immersion systems. These technologies require HVAC engineering expertise to design, install, commission, and maintain — but they are not the same products that commercial HVAC contractors have historically specified and installed.

The contractors and distributors who are investing in liquid cooling capability now — training their engineers, developing manufacturer relationships, pursuing data centre project experience — are positioning themselves at the front of a wave that the DOE's own projections confirm will be enormous.

Nvidia Said It Doesn't Need Chillers — But the Story Is More Complicated

When Nvidia's CEO stated that the company's latest AI system could be cooled using hot water without traditional chillers, HVAC manufacturer stocks dropped sharply. The reaction reflected genuine investor concern about the long-term role of conventional chilled water infrastructure in the AI data centre.

The reality is more nuanced. Nvidia's claim referred to specific, optimised configurations of its newest hardware. The majority of deployed AI infrastructure — including the many generations of existing GPU installations across hyperscaler data centres globally — still requires conventional cooling infrastructure. Retrofitting existing facilities is a massive, long-duration HVAC project opportunity. And even new AI-optimised facilities require substantial building-level cooling infrastructure — chillers, cooling towers, heat rejection systems — even when the server-level cooling is liquid-based.

The Nvidia announcement created noise. The DOE's tripling forecast is the signal. For HVAC professionals, the signal is the one worth reading.

Frequently Asked Questions

How much will data centre electricity consumption grow?

The US Department of Energy forecasts data centre electricity consumption could nearly triple within three years, potentially reaching 12% of total US electricity supply, driven primarily by AI training, inference, and continued cloud computing expansion.

What does data centre electricity growth mean for HVAC?

Cooling accounts for approximately 30 to 40% of data centre energy consumption. Tripling electricity consumption means tripling cooling infrastructure demand. The global data centre cooling market is already projected to more than double from roughly $19.5 billion in 2025 to $45.8 billion by 2033 — the DOE forecast supports or exceeds the upper end of that projection.

Do AI data centres still need HVAC contractors?

Yes. Even facilities using advanced liquid cooling at the server level require substantial building-level HVAC infrastructure — chillers, cooling towers, precision air units, and heat rejection systems. The Nvidia claim that its latest hardware can run without traditional chillers applies to specific server-level configurations, not the building infrastructure that still requires conventional HVAC engineering.

How can HVAC contractors get into data centre work?

The most effective entry path involves pursuing manufacturer certifications for data centre cooling products, building relationships with mechanical engineering firms that design data centre systems, developing safety and quality documentation required for mission-critical projects, and gaining initial experience through smaller colocation facility projects before pursuing hyperscale work.