by Roger Lilley, Energize
Data centres, which house racks of computers and associated equipment for the processing and storage of data, consume enormous amounts of electrical power. Data is a vital asset of any business: organisations use information which must be available on a real-time basis. It has been said that every activity that human beings engage in is backed by the functioning of one or more data centres. Therefore, the data stored in a data centre must be both secure and instantly available. Data centres store huge volumes of data; those of giants such as Facebook, Google, Amazon and many other large organisations can be as large as a football field.
Energy use is a central issue for data centres. Power draw ranges from a few kW for a rack of servers in a cabinet to several tens of MW for a large facility. Some facilities have power densities more than 100 times that of a typical office building. In such high-density facilities, electricity is a dominant operating expense, accounting for over 10% of the total cost of ownership (TCO) of a data centre.
Power is consumed by the computers and associated components, by cooling systems to keep the computers working at an optimal temperature, and by lighting, security and other electrical systems in operation at the data centre.
Power consumption not only costs the data centre operator money, but also adds to the nation’s carbon footprint. This is particularly true in countries like South Africa, where over 90% of electricity is generated by coal- and diesel-fired generators. Frequent lengthy power outages (caused by load shedding and breakdowns) make standby generators an essential element of any data centre. These are usually diesel gensets, which increase the data centre’s carbon footprint further.
The International Energy Agency reports that data centres used about 1% of the world’s electricity in 2020, and this figure will keep increasing as demand for data processing continues to grow. Massive growth in data could therefore drive up emissions of greenhouse gases, which contribute to changes in the world’s climate patterns. It has been said that data centres were responsible for 0,5% of US greenhouse gas emissions in 2018.
Future trends
Karsten Winther, the president for EMEA at IT infrastructure company Vertiv, predicts the following:
- Data centre business and footprint will continue to grow
- Higher speed (5G) networks will put pressure on data centres
- Densities will increase to handle ever-growing data demand
- Data centres will consume up to 3% of world electricity generation
- Liquid cooling of data centre equipment will become more common
- Data centre design will become standardised, resulting in faster new builds
- Standby diesel gensets will give way to renewable energy systems and hydrogen-powered fuel cells.
In Africa, the data centre market is expected to grow to about US$5,4-billion by 2027 (a CAGR of 12,73%), requiring over 260 MW of additional electrical capacity. Of this, South Africa is expected to reach US$3,23-billion by 2027 (a CAGR of 11,15%), with an additional electrical load of 103 MW (https://www.arizton.com/).
Energy efficiency
While demand for data processing and storage is expected to increase, it is possible to slow the growth in electricity demand through the use of newer technologies and energy-efficient systems. The most commonly used metric of data centre energy efficiency is power usage effectiveness (PUE): the total power entering the data centre divided by the power used by the IT equipment.
PUE therefore captures the overhead consumed by everything other than the IT equipment (cooling, lighting, etc.). The average American data centre has a PUE of 2, meaning 2 W of total power (ancillary equipment plus IT equipment) for every watt delivered to the IT equipment. A PUE of 1,2 is regarded as the practical ideal; a value of 1,0 would mean that every watt entering the facility reaches the IT equipment.
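As a simple illustration of how the metric works (hypothetical figures, not measurements from any particular facility), PUE can be computed directly from two power readings:

```python
# Hypothetical example: computing power usage effectiveness (PUE).
# PUE = total facility power / power delivered to IT equipment.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio for a given pair of power readings."""
    return total_facility_kw / it_equipment_kw

# A facility drawing 1 500 kW in total, of which 1 000 kW reaches the IT
# equipment, has a PUE of 1.5, i.e. 0.5 kW of cooling, lighting and other
# overhead for every kilowatt of IT load.
print(pue(1500, 1000))   # 1.5
print(pue(2000, 1000))   # 2.0 - the "average American data centre" case above
print(pue(1200, 1000))   # 1.2 - the efficient target mentioned above
```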
Measuring and analysing energy use therefore goes beyond what the IT equipment itself consumes, because facility support hardware such as chillers and fans also uses energy.
In 2011, server racks in data centres were designed for more than 25 kW, and the typical server was estimated to waste about 30% of the electricity it consumed as heat. The energy demand of information storage systems was also rising. A high-availability data centre was estimated to have a demand of 1 MW, with cooling representing 35% to 45% of the facility’s total cost of ownership.
Calculations show that within two years the cost of powering and cooling a server could equal the cost of purchasing the server hardware. Power is the largest recurring cost to the user of a data centre, and cooling racks down to 21°C wastes money and energy. Furthermore, overcooling equipment in environments with high relative humidity can expose it to enough moisture to promote the growth of salt deposits on conductive elements in the circuitry. It has been found that servers operate optimally at 60°C at 75% workload.
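The two-year figure can be sanity-checked with rough arithmetic. The sketch below uses assumed, illustrative inputs (a 500 W server, a PUE of 2, a tariff of US$0,15/kWh and a US$2 500 server); actual values vary widely between facilities and tariffs.

```python
# Rough, illustrative estimate of how quickly the cost of powering and
# cooling a server approaches its purchase price. All inputs are assumptions.

server_power_kw = 0.5        # average draw of one server (assumed)
pue = 2.0                    # total facility power per watt of IT power (assumed)
tariff_usd_per_kwh = 0.15    # electricity tariff (assumed)
hours_per_year = 8760

annual_energy_kwh = server_power_kw * pue * hours_per_year   # ~8 760 kWh
annual_cost_usd = annual_energy_kwh * tariff_usd_per_kwh     # ~US$1 314

server_price_usd = 2500      # purchase price of the server (assumed)
years_to_match_price = server_price_usd / annual_cost_usd    # ~1.9 years

print(f"Annual power + cooling cost: US${annual_cost_usd:,.0f}")
print(f"Years until it equals the hardware price: {years_to_match_price:.1f}")
```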
The need for liquid cooling
For decades, the common wisdom was that data centres needed raised floor environments to aid the cooling of technology. Yet, data centres are densifying – placing more computing power in tightly packed places to support big-data analytics and other digital workloads. That’s creating new burdens for cooling systems, which must protect these hot-running, often mission-critical workloads.
Jon Summers of the Research Institutes of Sweden (RISE) says that liquid cooling offers the most efficient cooling for data centre servers. He says that only about 0,1% of the electrical energy drawn by a microprocessor is actually used to process data; the rest is converted to heat. That heat has to be drawn away from the microprocessor, out of the server housing, away from the rack, and out of the building. This can be achieved by fans, and often is, he adds, but many fans are needed in every rack, and they are an inefficient way of cooling.
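As a rough illustration of why this is so (a sketch with assumed figures, not a calculation from the article), the coolant flow needed to carry away a heat load follows from the basic relation Q = ṁ·cp·ΔT. Using standard property values for air and water, removing the heat from a single 20 kW rack with a 10 K coolant temperature rise requires roughly 1,7 m³ of air per second but only about half a litre of water per second:

```python
# Illustrative heat-removal arithmetic: the coolant flow needed to carry
# away a heat load Q follows from Q = m_dot * c_p * delta_T.

def volume_flow_m3_per_s(heat_w: float, cp_j_per_kg_k: float,
                         density_kg_per_m3: float, delta_t_k: float) -> float:
    """Volumetric coolant flow required to remove heat_w watts with a
    temperature rise of delta_t_k kelvin across the rack."""
    mass_flow = heat_w / (cp_j_per_kg_k * delta_t_k)
    return mass_flow / density_kg_per_m3

rack_heat_w = 20_000   # assumed 20 kW rack
delta_t = 10           # assumed 10 K coolant temperature rise

air = volume_flow_m3_per_s(rack_heat_w, 1005, 1.2, delta_t)     # ~1.7 m^3/s of air
water = volume_flow_m3_per_s(rack_heat_w, 4186, 997, delta_t)   # ~0.5 L/s of water

print(f"Air:   {air:.2f} m^3/s")
print(f"Water: {water*1000:.2f} L/s  (~{air/water:.0f}x less volume than air)")
```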
Developments in liquid cooling are continuing, Summers says, with technology companies experimenting with new methods for improved cooling such as “direct-to-chip” liquid cooling technology, “rear-door” heat exchangers, and circuit boards immersed in non-conductive liquids.
The liquid cooling system can also be used to transfer the heat extracted from the servers to places where heat is needed, such as district heating systems or industrial applications, Summers says.
IT infrastructure company Vertiv has unveiled a new liquid cooling solution for data centres in EMEA. Known as the Liebert XDU, this new generation of thermal management systems supports liquid-cooled servers and enables the control of liquid quality, flow and pressure.
As high-density computing applications such as data analytics and machine learning increase, rack densities and temperatures are exceeding the cooling capabilities of traditional air-cooled units and require more efficient and sustainable solutions. The Liebert XDU coolant distribution system enables the deployment of liquid cooled server applications into any data centre environment, from core to edge computing sites.
Vertiv says its Liebert CRV is a precision data centre cooling solution, integrated within a row of data centre racks, designed to address some of the major challenges seen in high-density applications. With adjustable airflow baffles and controls that manage airflow and temperature independently, the Liebert CRV can deliver highly efficient cooling precisely in the row where it is needed. The system is available in multiple options, providing flexibility for any installation.
Send your comments to rogerl@nowmedia.co.za