
Innovations and Challenges in Data Centre Cooling Amid Growing AI Demand

Posted 27th Dec 2025


Data centre cooling is evolving with innovative technologies aimed at reducing energy consumption and environmental impact.

Iceotope's liquid-cooling technology can cut data-centre cooling energy by up to 80%: water circulating in a closed loop cools an oil-based fluid that sits in direct contact with the components, allowing servers to run at high speed without fans.
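
To put the "up to 80%" figure in perspective, the short Python sketch below applies it to a hypothetical facility. The IT load and the baseline cooling overhead are assumptions chosen for illustration, not figures from the BBC article.

# Illustrative only: what an "up to 80%" cut in cooling energy could mean
# for overall facility power. All input numbers are assumptions.
IT_LOAD_KW = 1_000            # assumed IT equipment load
BASELINE_COOLING_KW = 400     # assumed conventional air-cooling overhead
LIQUID_COOLING_SAVING = 0.80  # the "up to 80%" reduction claimed for liquid cooling

liquid_cooling_kw = BASELINE_COOLING_KW * (1 - LIQUID_COOLING_SAVING)
total_before = IT_LOAD_KW + BASELINE_COOLING_KW
total_after = IT_LOAD_KW + liquid_cooling_kw

print(f"Cooling load: {BASELINE_COOLING_KW:.0f} kW -> {liquid_cooling_kw:.0f} kW")
print(f"Facility total: {total_before:.0f} kW -> {total_after:.0f} kW "
      f"({1 - total_after / total_before:.0%} overall saving)")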

In one novel dual-purpose application, a US hotel chain plans to use the heat generated by its servers to warm guest rooms, laundry facilities and a swimming pool.

However, challenges remain as demonstrated in November when a data-centre cooling failure at CME Group disrupted trading technology. CME Group responded by adding external cooling capacity to reduce the risk of similar outages.

The sustainability of data centres is under scrutiny due to their high energy and water usage. Demand is increasing, particularly driven by AI workloads, leading to calls from more than 200 environmental groups in the US for a moratorium on new data centres.

Some cooling methods use refrigerants containing PFAS, which pose safety and climate risks if their vapours leak. Consequently, some providers are transitioning to PFAS-free refrigerants.

Microsoft's subsea data centre project off Orkney achieved a Power Usage Effectiveness (PUE) of 1.07 and zero water consumption. Although the project was discontinued due to unfavourable economics, it contributed valuable insights for future cooling work.
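
For context, PUE is the ratio of total facility energy to the energy used by the IT equipment alone, so a PUE of 1.07 means the cooling and power-distribution overhead is just 7% of the IT load. The minimal Python sketch below makes that concrete; the annual IT energy figure is a hypothetical value used only for illustration.

# PUE = total facility energy / IT equipment energy.
def non_it_share(pue: float) -> float:
    """Fraction of total facility energy used by everything other than IT."""
    return 1 - 1 / pue

pue = 1.07             # reported for Microsoft's subsea data centre
it_energy_mwh = 100.0  # hypothetical annual IT energy, for illustration only
total_energy_mwh = it_energy_mwh * pue

print(f"Total facility energy: {total_energy_mwh:.1f} MWh")
print(f"Non-IT overhead: {non_it_share(pue):.1%} of total facility energy")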

Researchers at UC San Diego have proposed a passive cooling system based on a pore-filled membrane, in which the chip's own heat pumps the cooling fluid; the team is aiming for commercialisation.

The rapid growth of AI is driving ever-greater cooling demand, and experts are calling for transparency about the energy consumption of AI models: one highlights that large language models (LLMs) are highly energy-intensive, while another argues that LLMs have reached the limit of their productivity.

Sources
https://bbc.co.uk/news/articles/cp8zd176516o
* This article has been summarised using Artificial Intelligence and may contain inaccuracies. Please fact-check details with the sources provided.