Data centres have long been known as heavy power consumers, accounting for up to 3% of global electricity production and using roughly ten times more energy per square metre than the average office.
Previously, energy efficiency wasn’t necessarily at the top of an information technology (IT) organisation’s priority list, but rising power costs, an ongoing need for more hardware and equipment, and booming data consumption are changing the way data centre operators plan and run their facilities.

This interview with Peter Greaves, Aurecon’s Expertise Leader, Data & ICT Facilities (http://www.aurecongroup.com), explores why data centres consume so much energy; how design principles can help minimise a data centre’s energy needs; how operators can deal with load shedding; and future trends that may help reduce energy consumption.

As the uptake of data centres increases globally, there are rising concerns around the availability of electricity to support this trend. Why do data centres consume so much energy?

Data centres are complex environments built to house IT equipment, and within them the IT equipment itself is the primary driver of energy consumption. This equipment includes communication systems, storage systems and other IT systems such as processors, server power supplies, network infrastructure and hardware, computers, uninterruptible power supply (UPS) units and connectivity systems.

Most of the energy consumed within a data centre passes through several stages of distribution before it reaches the IT systems, and virtually all of it ends up as heat, which is why these facilities require a significant amount of cooling.

As server densities continue to rise, cooling systems come under increasing pressure to keep IT equipment and servers cool enough to operate efficiently. If temperature or humidity levels are too high, IT equipment can be damaged and tape media errors can occur.

There are a number of opportunities available that can help IT organisations and data centre developers optimise their energy consumption. What do these include?

Examples of these opportunities are virtualisation and the use of ARM-based processors, which are designed to execute a smaller set of instruction types so that they can operate at a higher speed. This provides outstanding performance at a fraction of the power. The technological development of both options is making them viable, but they are still outside the remit of most data centre developers.

Good practical management of data centre space remains a simple, fundamental way of reducing energy consumption. Using aisle containment systems, installing blanking panels in unused rack slots and fitting brushed grommets in raised-floor penetrations are all simple yet effective energy-saving measures, but they are still overlooked in many smaller facilities.

Implementing aggressive power usage effectiveness (PUE) targets will also drive more energy-saving initiatives and improvements within data centres. New facilities will find it easier to meet PUE targets, as high-efficiency equipment can be selected from the outset to reduce parasitic load requirements.
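To make PUE concrete: it is the ratio of total facility energy to the energy delivered to the IT equipment, so a value approaching 1.0 means almost no overhead. The short sketch below illustrates this with invented figures; they are examples only, not values from the interview:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical legacy facility: 1 000 kWh of IT load plus 800 kWh of
# cooling, distribution losses and other parasitic loads.
print(pue(1_800, 1_000))  # 1.8

# The same IT load after efficiency upgrades cut the overhead to 300 kWh.
print(pue(1_300, 1_000))  # 1.3
```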

Low PUE targets are also achievable in existing facilities, through measures such as energy-efficient lighting, but they take more financial backing and careful planning to realise. When equipment needs to be replaced, more energy-efficient options can also be chosen.

Cooling systems in data centres seem to be the largest power guzzlers. Do you believe that more data centres could be using natural cooling and night cooling opportunities to save energy?

Free cooling opportunities are possible in many locations, including South Africa, especially if the supply air temperature is in line with the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines (18°C-27°C).

With supply air temperatures of up to 27°C, we need outside air temperatures of 25°C or less to get significant benefit from free cooling. Data centre managers then need to decide whether to use direct or indirect free cooling. I tend to prefer indirect free cooling via a heat wheel or heat exchanger, because outside air contaminants and humidity levels then do not restrict its use.

There are definitely more opportunities to use this type of indirect free cooling in certain areas of South Africa, particularly where the temperature falls below 19°C and the relative humidity (RH) stays below 60% for more than 2 500 hours per year.
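As a minimal sketch of that screening test, the snippet below counts the hours in a year of hourly weather records that satisfy the thresholds above. The 19°C, 60% RH and 2 500-hour figures come from the interview; the record format and the loader function are assumptions for illustration:

```python
def free_cooling_hours(records) -> int:
    """Count hours where indirect free cooling is viable.

    records: iterable of (dry_bulb_temp_c, relative_humidity_pct),
    one entry per hour of the year (8 760 in total).
    """
    return sum(1 for temp_c, rh_pct in records if temp_c < 19.0 and rh_pct < 60.0)

# Hypothetical usage, assuming a weather-file loader exists:
# records = load_hourly_weather("site_weather.csv")  # hypothetical helper
# site_qualifies = free_cooling_hours(records) > 2_500
```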

Is running a data centre at a higher ambient temperature (than has been the norm to date) a practical option to reduce energy consumption that is needed for cooling?

Operators are still concerned about the efficiency of their data centres when they walk into a hot aisle. This perception, however, is gradually changing and people are becoming used to the idea that a hot aisle isn’t necessarily a problem.

Warmer data centres do pose a health and safety concern, because staff cannot work in elevated temperatures for extended periods. This can be managed by limiting the need to access the hot aisle, either through the use of chimney-type racks or by arranging all connections and operator work to be located in the cold aisle.

Running at elevated temperatures requires some form of aisle containment to achieve optimal efficiency, and this can cause problems for code compliance. Installing sprinkler and gas suppression systems can be problematic because enclosed aisles create an extra layer of infrastructure, with the associated costs.

How will load shedding – if it is implemented on an ongoing basis – affect data centres?

Load shedding will drive a greater level of reliance on the backup generator systems that are installed in data centres.

Facility operators will need to manage fuel delivery protocols carefully, and facilities with better supply chain management systems will run less risk once fuel demand ramps up. On-site fuel quantities will be a key asset, with longer storage requirements becoming commonplace to deal with any local disruptions.

If load shedding is implemented on a general basis, facilities with cogeneration energy systems will become more viable, as they will be able to reduce their cost base substantially compared with operators running exclusively on diesel supplies.

Older facilities with standby-rated generator systems will need to consider derating their generator capacity, as the generators will effectively be running in prime or continuous operational modes. This favours facilities rated under the Uptime Institute’s tier system (a standardised methodology data centres use to measure their performance and return on investment), as these will have been designed to cater for this requirement.

Can some activities in a data centre be timed to take place after peak hours?

It is possible for some users to schedule key processing tasks to run on an overnight cycle; however, this depends on the type of business and probably isn’t a workable solution for most operators. Other options to consider include:

• Provision of energy storage systems may provide some ability to defer energy usage to off-peak periods (a rough sizing sketch follows this list);

• Larger battery strings could provide an alternative to diesel generation; however, continuous deep cycling of batteries will significantly reduce their lifespan, necessitating early replacement;

• Use of capacitor banks may be a viable alternative to batteries. These banks could be charged overnight for progressive use throughout the day; as charge levels run low, the engines could be brought in to replace or supplement them; and/or

• Cooling storage may be a more viable alternative for reducing mechanical cooling loads; however, some form of free cooling would probably negate its benefit.
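On the first option, a back-of-envelope sizing sketch gives a feel for the scale involved. All figures below are hypothetical assumptions for illustration, not values from the interview:

```python
IT_LOAD_KW = 500          # assumed critical IT load
PUE = 1.5                 # assumed facility overhead factor
SHIFT_HOURS = 4           # length of the peak window to ride through
DEPTH_OF_DISCHARGE = 0.8  # usable fraction of battery capacity

facility_load_kw = IT_LOAD_KW * PUE                 # 750 kW
usable_kwh_needed = facility_load_kw * SHIFT_HOURS  # 3 000 kWh
nameplate_kwh = usable_kwh_needed / DEPTH_OF_DISCHARGE

print(f"Battery nameplate capacity: {nameplate_kwh:.0f} kWh")  # 3750 kWh
```

Even for a modest facility, shifting a few peak hours daily calls for megawatt-hour-scale storage, and cycling it that deeply every day is exactly what shortens battery life, as noted above.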

As data centres are largely run off UPSs, to what extent could solar power be used to keep the UPSs charged?

A great many solar panels would be needed to meet the electricity demand of most data centres. The most likely application is therefore to reduce the demand on the grid by a percentage.
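A rough, hypothetical estimate illustrates the scale (the figures are assumptions for this example, not from the interview):

```python
FACILITY_LOAD_KW = 1_000  # assumed average facility demand (1 MW)
PANEL_RATING_KW = 0.4     # typical rating of a single ~400 W panel
CAPACITY_FACTOR = 0.22    # assumed average yield for a sunny site

avg_output_per_panel_kw = PANEL_RATING_KW * CAPACITY_FACTOR
panels_needed = FACILITY_LOAD_KW / avg_output_per_panel_kw

print(f"~{panels_needed:,.0f} panels")  # ~11,364 panels
```

And even a plant of that size only matches the load on average; it produces nothing at night, which is why offsetting grid demand rather than directly supplying the facility is the realistic model.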

Although solar energy could supply a data centre with energy, it would need to be significantly ramped up to be usable by the UPS. At this time, I would be very hesitant to suggest this as a solution, due to the inherent unreliability of solar energy.

Big operators like Google, however, are making use of solar energy by establishing solar generation plants that offset their data centres’ usage on the grid. Small panel arrays coupled with battery storage could also be used to reduce non-critical parasitic loads on site, such as fuel polishing, engine heaters, office air conditioning and lighting.

How do you think data centre design and development in South Africa will change in the future?

Data centres in South Africa are in the early, exciting stages of development. As such, owners and operators are in an advantageous position to integrate sustainable and, importantly, cost-effective energy solutions such as wind energy to drive energy costs down significantly.

If we look at what big operators are achieving overseas, we are in an ideal position to start designing and developing more sustainable facilities. For example, Google’s data centre in Hamina, Finland, is working towards carbon neutrality, and Google recently signed a deal with a wind farm operator in Sweden to power its Finnish facility with wind energy.

Companies like Google are always looking for a competitive edge. They are looking for smarter solutions in their engineering for a variety of things including data centres, corporate headquarters and research and development facilities. Wind investment is just another competitive solution, but there are many more.

As South African data centres continue to develop, I predict that a growing number of operators will be more willing to tackle sustainability challenges head-on and incorporate more progressive solutions into their data centre designs and development.