- Cooling systems account for up to 45 percent of total energy costs in a typical data center or server room.
- Efficient equipment, improved airflow management and free cooling can reduce cooling energy consumption.
- One data center saved over 1 million kWh/yr after implementing energy-saving measures.
Data centers and server rooms use far more energy than typical commercial building space, and their energy consumption is growing rapidly. For each kilowatt-hour of electricity consumed by the equipment, an equivalent amount of heat is added that must be removed. In fact, the cooling equipment used to remove this heat accounts for nearly 45 percent of data center energy costs. While cooling is essential for maintaining the performance and reliability of critical equipment, improving the efficiency of your air-conditioning system can reduce energy costs significantly.
The following steps can help you save substantially on data center operating costs, while ensuring the reliability of critical equipment:
- Use smart cooling systems. These systems incorporate sensors to track rack power use and heat generation. Sensors should respond to localized temperature buildup and must be maintained regularly.
- Integrate with energy-management systems. To lower energy costs, the cooling system should be tied into an energy-management system using sophisticated temperature and time controls coordinated with performance and efficiency of servers, storage and networks.
- Increase server inlet temperature. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends the temperature at the server inlet be set between 65°F and 80°F (and allows up to 90°F), yet many facilities use a much lower setting. By gradually raising server inlet temperatures, you can reduce the need to remove heat from the area around the server. ENERGY STAR estimates that data centers can save 4 percent to 5 percent in energy costs for every 1°F increase in temperature.
Source: Federal Energy Management Program
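The ENERGY STAR rule of thumb above can be sketched as a small calculation. The baseline cooling cost below is a hypothetical example value, and the per-degree rate is taken from the estimate cited above:

```python
# Estimate cooling-energy cost savings from raising the server inlet
# temperature, using the ENERGY STAR rule of thumb of roughly 4-5 percent
# savings per 1°F increase. The baseline cost is an illustrative assumption.

def setpoint_savings(baseline_cost, degrees_raised, pct_per_degree=0.04):
    """Compound the per-degree savings over each 1°F step."""
    remaining = baseline_cost * (1 - pct_per_degree) ** degrees_raised
    return baseline_cost - remaining

baseline = 200_000  # hypothetical annual cooling energy cost, USD
for dT in (2, 5, 10):
    print(f"Raise inlet {dT}°F: save about ${setpoint_savings(baseline, dT):,.0f}/yr")
```

Because each degree's savings applies to an already-reduced cost, the total compounds slightly below a straight-line 4 percent per degree.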
Improve airflow management. Data centers draw in cool air and exhaust hot air. To increase efficiency, eliminate the mixing and recirculation of warm exhaust air. Common strategies include the following:
- Hot aisle/cold aisle layout. Arrange server equipment back-to-back and front-to-front to create hot air exhaust aisles and cold air intake aisles; and place physical barriers in open spaces above or below racks to prevent the mixing of air between the aisles.
- Block openings. Hot air is typically exhausted out the back of a server rack, while cool air is drawn in from the front. Block openings in the rack with blanking panels to prevent exhaust air from flowing forward and recirculating around the equipment.
- Maintain underfloor pressure. Do not overuse permeable floor tiles. Use just enough to maintain proper air velocity.
- Use free cooling. Unlike many commercial buildings, data centers require significant cooling at night. Air-side economizers control the amount of outside air entering the building and can be adjusted to bring in cool outside air to reduce the air-conditioning load. Water-side economizers and in-row coolers use outside air to chill water in conjunction with a cooling tower and heat exchanger. They are effective in dry climates and for facilities with concerns about outdoor air quality.
- Variable frequency drives (VFD). Oversized motors and fans are common in data centers, since larger sizes are more efficient. VFDs offer significant savings by controlling fan volume and allowing the system to adjust to data center needs. After one facility installed VFDs on 20 air handlers, energy costs were reduced by nearly $100,000 (866,000 kWh/year).
- Heat recovery. The capture and use of waste heat through combined heat and power systems can improve data center efficiency. By using absorption or adsorption chillers powered by waste heat, chilled-water plant energy costs can be reduced by well over 50 percent. One data center added heat recovery to an air-side economizer for a payback of a little more than a month and an annual savings of $7,000.
- Reduce lighting use. Lighting only makes up a small percentage of overall data center energy use; however, it gives off waste heat, driving up cooling costs. Install occupancy sensors to ensure lights only operate when needed and limit heat-intensive task lighting to specific detail work. At one 16,000-square-foot data center with 440 racks, lighting controllers with 30-minute enabled zones saved 238,000 kWh/year or $27,000, with a six-month payback.
- Consider switching to LEDs. Light-emitting diodes produce less heat than conventional lighting. In one case, an intelligent LED system helped reduce HVAC loads, contributing to an overall power usage effectiveness (PUE) of 1.18.
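PUE, cited in the LED example above, is simply total facility energy divided by IT equipment energy; a value of 1.0 would mean every kilowatt-hour goes to IT, and lower is better. A minimal sketch, using illustrative meter readings (not figures from the case study):

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# The annual meter readings below are hypothetical example values.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

total_kwh = 11_800_000   # hypothetical total facility energy, kWh/yr
it_kwh = 10_000_000      # hypothetical IT equipment energy, kWh/yr
print(f"PUE = {pue(total_kwh, it_kwh):.2f}")
```

Cooling and lighting savings show up directly in the numerator, which is why measures like those above pull PUE toward 1.0.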
Cool savings: over one million kWh per year
At Verizon's nearly 25,000-square-foot data center, four chiller plants provide cooling to more than 750,000 square feet. After an energy assessment, the company identified three strategies to cut cooling costs. The chilled water set point was increased from 42°F to 48°F to save chiller energy. To reduce chilled-water plant power consumption and increase reliability, the water-side economizer was repaired. VFDs were also installed on the condenser water and chilled water pumps. The result was an annual savings of 1,273,300 kWh, worth about $150,000, with a simple payback of one year. (DOE 2008)
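Simple payback, as used in the case study, is just project cost divided by annual savings. A minimal sketch using the stated savings figures; the project cost and electricity rate are assumptions chosen to be consistent with the reported one-year payback, since neither is given in the source:

```python
# Simple payback (years) = project cost / annual cost savings.
# kWh savings are from the case study; the electricity rate is the rate
# implied by ~$150,000 / 1,273,300 kWh, and the project cost is hypothetical.

def simple_payback_years(project_cost, annual_savings):
    return project_cost / annual_savings

annual_kwh_saved = 1_273_300
price_per_kwh = 0.118                      # assumed rate, $/kWh
annual_savings = annual_kwh_saved * price_per_kwh
project_cost = 150_000                     # hypothetical project cost, USD
print(f"Payback: {simple_payback_years(project_cost, annual_savings):.1f} years")
```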
U.S. Department of Energy (DOE). DOE Assessment Identifies 30 Percent Energy Savings for Broadband and Wireless Communication Company. Industrial Technologies Program. December 2008. (Last accessed January 21, 2013.)
U.S. Department of Energy. Best Practices Guide for Energy-Efficient Data Center Design. Federal Energy Management Program. Revised March 2011. (Last accessed January 21, 2013.)