More Efficient Cooling in Data Centers
In our increasingly digital world, data centers are the backbone of technological advancement, especially with the rise of artificial intelligence (AI). However, one of the most significant challenges they face is managing the waste heat generated by high-performance hardware. Tom Earp, engineering director at Page, a renowned design firm, has been at the forefront of data center design since 2006. His extensive experience—particularly during a six-year tenure with Meta—gives him a unique perspective on optimizing energy efficiency across various aspects of data center operations, from building structures to cooling systems and electrical supply.
For over a decade, data center designs remained relatively stable, thanks largely to the predictable cadence of Moore’s Law. However, as Earp notes, the landscape has shifted dramatically. The adoption of advanced processors like Graphics Processing Units (GPUs) and the emergence of innovative chip designs have made it increasingly difficult to predict the energy demands a new data center will face in the near future. What remains certain is that as chips evolve, they are becoming not only faster but also hotter. Earp observes, “The people making these choices are planning for a lot of upside in how much power we’re going to need.”
The implications of these developments are profound. AI models rely heavily on chips that consume more power per unit of space than traditional processors, necessitating a reevaluation of cooling infrastructure. Earp succinctly states, “When power goes up, heat goes up.” As a result, data centers housing clusters of high-powered chips require more sophisticated cooling than conventional air systems, which simply can’t keep pace.
Water cooling has emerged as the preferred method of managing heat in modern data centers. Water is significantly more effective than air at transferring heat away from equipment. However, this shift towards water cooling raises concerns about the sustainability of local water sources. Fortunately, innovative strategies are being developed to enhance the efficiency of water cooling systems.
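To see why water is so much more effective, a back-of-envelope comparison helps. The sketch below uses round textbook values for specific heat and density, and a hypothetical 100 kW rack cluster with a 10 K coolant temperature rise; these numbers are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope comparison of water vs. air as a coolant.
# Property values are round textbook figures, not measurements.
SPECIFIC_HEAT_WATER = 4186.0   # J/(kg*K)
SPECIFIC_HEAT_AIR = 1005.0     # J/(kg*K)
DENSITY_WATER = 998.0          # kg/m^3
DENSITY_AIR = 1.2              # kg/m^3

def volume_flow_needed(heat_watts, specific_heat, density, delta_t_k):
    """Volumetric flow (m^3/s) needed to carry away heat_watts
    with a coolant temperature rise of delta_t_k."""
    mass_flow = heat_watts / (specific_heat * delta_t_k)  # kg/s
    return mass_flow / density

# Hypothetical rack cluster rejecting 100 kW with a 10 K coolant rise.
water_flow = volume_flow_needed(100_000, SPECIFIC_HEAT_WATER, DENSITY_WATER, 10)
air_flow = volume_flow_needed(100_000, SPECIFIC_HEAT_AIR, DENSITY_AIR, 10)

print(f"water: {water_flow * 1000:.1f} L/s, air: {air_flow:.1f} m^3/s")
print(f"air needs roughly {air_flow / water_flow:,.0f}x the volume flow")
```

Under these assumptions, moving the same heat with air takes thousands of times the volumetric flow that water does, which is why dense GPU clusters push designers toward liquid loops.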
One promising approach involves redirecting waste heat from data centers to nearby facilities where it can be repurposed. In Denmark, for example, heat generated by data centers is used to warm homes, while in Paris, it has been utilized to heat swimming pools in preparation for the upcoming Olympics. This not only aids in cooling the data centers but also contributes to community sustainability.
Water can also function as a thermal battery. By using renewable energy sources—such as wind turbines or solar panels—data centers can chill water and store it for later use. This strategy allows for cooling during peak demand periods, significantly reducing power consumption and optimizing energy efficiency when it matters most.
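The thermal-battery idea reduces to the basic heat equation Q = m·c·ΔT: chill a tank when power is cheap, then draw on the cold water at peak. The sketch below assumes a 1,000 m³ tank chilled from 14 °C to 6 °C and a 2 MW peak cooling load; all of these figures are illustrative assumptions, not values from the article.

```python
# Sketch of water as a thermal battery: chill a tank with off-peak
# renewable power, then use the cold water for cooling at peak.
# Tank size, temperatures, and load are illustrative assumptions.
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)
DENSITY_WATER = 998.0         # kg/m^3

tank_volume_m3 = 1000.0                # assumed storage tank
chill_from_c, chill_to_c = 14.0, 6.0   # assumed charge temperatures

# Q = m * c * delta_T
stored_joules = (tank_volume_m3 * DENSITY_WATER
                 * SPECIFIC_HEAT_WATER * (chill_from_c - chill_to_c))
stored_kwh_thermal = stored_joules / 3.6e6

peak_cooling_load_w = 2_000_000  # assumed 2 MW thermal load at peak
hours_covered = stored_joules / peak_cooling_load_w / 3600

print(f"~{stored_kwh_thermal:,.0f} kWh of cooling banked, "
      f"covering ~{hours_covered:.1f} h of a 2 MW peak load")
```

Even this modest tank banks several megawatt-hours of cooling, enough to ride through a few hours of peak demand without running chillers at the grid's most expensive moment.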
Yet, as data centers continue to heat up due to the increasing performance demands of modern chips, water cooling alone may not suffice. Tony Atti, CEO of Phononic, highlights the pressing need for advanced cooling solutions. With chip manufacturers like Nvidia developing processors capable of processing data at staggering rates—such as 1.6 terabytes per second—Atti warns that the exponential increase in cooling demand poses significant challenges.
Currently, the chips in servers account for approximately 45% of a data center’s energy consumption, while the cooling systems themselves consume nearly as much power—around 40%. Atti emphasizes, “For the first time, thermal management is becoming the gate to the expansion of this AI infrastructure.” This shift underscores the critical need for innovative cooling strategies as the demand for processing power escalates.
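The split quoted above implies a striking overhead per watt of compute. The arithmetic below uses only the article's 45%/40% figures; the 10 MW facility size is a hypothetical added for scale.

```python
# Figures quoted in the article: chips ~45% of facility energy,
# cooling ~40%. What that implies per watt of compute.
chip_share = 0.45
cooling_share = 0.40

# Watts spent on cooling for every watt the chips draw.
cooling_per_compute_watt = cooling_share / chip_share

# For a hypothetical 10 MW facility:
facility_mw = 10.0
print(f"chips: {facility_mw * chip_share:.1f} MW, "
      f"cooling: {facility_mw * cooling_share:.1f} MW, "
      f"~{cooling_per_compute_watt:.2f} W of cooling per W of compute")
```

In other words, at these shares nearly 0.9 W goes to cooling for every watt of compute, which is why Atti frames thermal management as the gate on AI infrastructure growth.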

