Feature Article

What is PUE? How to Enhance Energy Efficiency in Liquid Cooling Systems?


In 2007, The Green Grid (TGG), an affiliate member organization of the Information Technology Industry Council (ITI), proposed Power Usage Effectiveness (PUE): the ratio of total facility power consumption to IT equipment power consumption.

Power Usage Effectiveness (PUE) has since become the baseline indicator for evaluating the energy efficiency of data centers, and it also helps data centers monitor their energy consumption effectively.

What is PUE?

[Figure: PUE_en.png]

According to an International Energy Agency (IEA) report, global data center energy consumption reached 220 to 320 billion kWh in 2021, accounting for 0.9% to 1.3% of worldwide electricity usage. In Taiwan, total power consumption last year was 283 billion kWh, of which an estimated 3.7 billion kWh was consumed by data centers.

As big data applications flourish, data volumes are growing rapidly, roughly doubling every two years. Reducing power consumption and increasing spatial capacity are therefore key goals for the future development of enterprise data centers.

An ideal PUE would be 1.0. However, data centers also require cooling systems, lighting, and other equipment that consume power, so the PUE of a data center will always be greater than 1.0.
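To make the ratio concrete, here is a minimal Python sketch of the calculation; the 1,500 kW and 1,200 kW figures are hypothetical, chosen only to illustrate how overhead power pushes PUE above 1.0:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical example: a facility drawing 1,500 kW in total, of which
# 1,200 kW goes to IT equipment; the 300 kW of overhead yields PUE > 1.
print(pue(1500.0, 1200.0))  # 1.25
```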

The main power consumers in a data center include IT servers, refrigeration equipment, the power supply and distribution system, lighting, and other systems.

According to statistical reports, the power consumption distribution of a data center is roughly as follows:

[Figure: Data center energy consumption distribution.png]
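The breakdown also shows why PUE is simply the reciprocal of the IT share of total facility power. A small sketch, using assumed round-number shares rather than the figures from the chart above:

```python
# Illustrative subsystem shares of total facility power
# (assumed round numbers for demonstration, not measured data).
breakdown = {
    "IT servers": 0.50,
    "Refrigeration/cooling": 0.35,
    "Power supply & distribution": 0.10,
    "Lighting & other": 0.05,
}
assert abs(sum(breakdown.values()) - 1.0) < 1e-9

# PUE = total power / IT power = 1 / (IT share of total power).
print(f"PUE = {1.0 / breakdown['IT servers']:.2f}")  # PUE = 2.00
```

Under these assumed shares, only half of the facility's power reaches the IT load, so the PUE works out to 2.0; shrinking the cooling share is the most direct lever for pulling it back toward 1.0.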

How to achieve a lower PUE?

PUE can be effectively reduced with traditional methods such as separating hot and cold aisles, improving the power usage of IT gear, and adopting energy-saving uninterruptible power supplies (UPS) and lighting facilities. However, the key factor in reducing PUE further is replacing traditional air cooling with an immersion cooling system.
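A rough way to quantify that shift is to express each overhead as a fraction of the IT load and compare the two approaches. The fractions in this sketch are illustrative assumptions, not measured or vendor-supplied values:

```python
def pue_from_overheads(cooling: float, power_dist: float, other: float) -> float:
    """PUE when each overhead is expressed as a fraction of the IT load."""
    return 1.0 + cooling + power_dist + other

# Assumed overhead fractions (illustrative only):
air_cooled = pue_from_overheads(cooling=0.40, power_dist=0.10, other=0.05)
immersion = pue_from_overheads(cooling=0.04, power_dist=0.08, other=0.02)

print(f"Air-cooled PUE ~ {air_cooled:.2f}")  # ~ 1.55
print(f"Immersion PUE  ~ {immersion:.2f}")   # ~ 1.14
```

Even with generous assumptions for the air-cooled case, eliminating fans and compressors removes most of the cooling term from the overhead, which is how the near-1.0 figures in the cases below become possible.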

[Figure: OCP Summit 2018.jpg]

Source: OCP Summit, March 2018

Market cases: global data centers with PUE approaching 1.0

  • Alibaba Cloud's Zhejiang data center lowered its PUE to 1.09 with a single-phase immersion cooling system that requires no fans or air-conditioning facilities.
  • In 2023, each of Google's large-scale data centers (those already in stable operation) reported an actual PUE of 1.10, a figure that includes all indirect electricity consumption.
  • Facebook developed an indirect cooling technology called the StatePoint Liquid Cooling (SPLC) system; facilities completed in 2022 exhibit a Power Usage Effectiveness (PUE) of 1.09 and a Water Usage Effectiveness (WUE) of 0.20.

What can KAORI do?

KAORI's EMBU team has developed In-Row 800 kW L/L (Liquid-to-Liquid) CDU (Coolant Distribution Unit) systems, In-Rack 80 kW Direct-to-Chip (D2C) cold plate CDU systems, and 2U 3 kW / 4U 7 kW / 25U 90 kW immersion cooling systems, offering a range of efficient liquid cooling solutions to help data centers lower their PUE.
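As a back-of-the-envelope illustration of deployment scale, the sketch below pairs the In-Row CDU's 800 kW capacity with an assumed per-rack load; the rack figure is hypothetical, and real sizing also depends on coolant temperatures, flow rates, and redundancy requirements:

```python
# Rough sizing sketch: racks served by one In-Row CDU.
CDU_CAPACITY_KW = 800  # In-Row L/L CDU capacity mentioned above
RACK_LOAD_KW = 80      # assumed Direct-to-Chip load per rack (hypothetical)

print(CDU_CAPACITY_KW // RACK_LOAD_KW)  # 10 racks
```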

[Figure: 高力液冷產品.jpg (KAORI liquid cooling products)]

Reference articles:

Green Grid Alliance white paper "PUE: A Comprehensive Examination of the Metric"

Data center launched in Taiwan, energy consumption verification becomes focus

Power Usage Effectiveness (PUE)

Forecast of China’s data center electricity demand scale and proportion of total social electricity consumption from 2019 to 2025

A brief discussion on data center efficiency and PUE measurement

Zhejiang cloud computing data center project started 

Meta Data centers

Google Data centers

You might be interested in:

Breakthrough in Liquid Cooling Technology: Heat Dissipation Capacity Reaches 800 kW 

Future Trends in Server Room Energy Efficiency: In-Rack CDU (Coolant Distribution Unit) for Liquid Cooling Systems

KAORI’s Liquid Cooling and Immersion Cooling Technologies: High-Performance Server Cooling Capabilities and Competitiveness

Brazed Plate Heat Exchanger Applications for Data Center Cooling