Leveraging New Data Center Temperature Control Standards to Cut Cooling Costs

July 17, 2018

Cooling is one of the biggest expenses in data center operation. Between buying, powering, and maintaining cooling equipment, the costs add up to a significant share of the total cost of ownership.

As any data center operator is well aware, proper cooling is critical to keeping your systems in working order and prolonging equipment life. However, the belief that server rooms and data center environments need to be kept extremely cold is quickly changing. The main goal today is to find a temperature that effectively balances performance, reliability, and computing efficiency.

Data Center Temperature Standards  

Traditional data centers typically rely on recommendations from ASHRAE (American Society of Heating, Refrigerating, and Air-Conditioning Engineers) to set their environmental operating conditions. ASHRAE publishes “Thermal Guidelines for Data Processing Environments,” which has been updated several times since its first publication in 2004.

Each revision of the ASHRAE guidelines has widened the recommended temperature range for data center operation.

                       2004 Version    2011 Version
Low End Temperature    68°F            64.4°F
High End Temperature   77°F            80.6°F
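Those 2011 endpoints look oddly precise in Fahrenheit because ASHRAE publishes its guidelines in Celsius; converting back shows the recommended envelope is a clean 18–27°C. A quick sketch of the conversion:

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert a Fahrenheit temperature to Celsius."""
    return (temp_f - 32) * 5 / 9

# The 2011 endpoints map to round Celsius values, since ASHRAE
# publishes its guidelines in Celsius.
for temp_f in (64.4, 80.6):
    print(f"{temp_f}°F = {fahrenheit_to_celsius(temp_f):.1f}°C")  # 18.0°C and 27.0°C
```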


Wider Temperature Ranges Allow for Increased Efficiency

In an article about the evolving standards of data center environmental ranges, Data Center Frontier points out that the Power Usage Effectiveness (PUE) metric was first introduced in 2007. PUE is defined as the ratio of total power consumed by the facility to the power consumed by the IT equipment alone.

PUE became the metric data center operators focused on to analyze and improve their efficiency. It soon became clear that cooling consumed the largest share of facility overhead power, and therefore offered the greatest opportunity for improvement.
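As a minimal sketch of how the metric works (the power figures below are made up for illustration), PUE falls straight out of two power readings:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.

    A PUE of 1.0 would mean every watt goes to IT equipment;
    real facilities always run above that.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,000 kW total facility draw, 625 kW of IT load.
total_kw, it_kw = 1000.0, 625.0
print(f"PUE = {pue(total_kw, it_kw):.2f}")      # 1.60
print(f"Overhead = {total_kw - it_kw:.0f} kW")  # cooling, power distribution, lighting
```

The closer that ratio gets to 1.0, the less power the facility spends on anything besides compute, which is why cooling became the obvious target.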

With ASHRAE’s expanded allowable temperature ranges for data centers and the resiliency of modern IT equipment, data center owners can seek ways to increase operational efficiency. The updated ASHRAE guidelines even encourage the use of compressor-less cooling methods such as direct outside air economizers.  

Data Center Cooling Considerations

Standard temperature ranges for data centers have certainly widened, but each data center's needs are unique. Here are a few additional considerations operators should keep in mind when planning their cooling systems.

Consult equipment specifications

Experts can provide guidance on recommended temperature ranges, but always read the specifications for your equipment. Each model carries its own low and high temperature ratings, so factor the manufacturer's recommendations into your plans.

Factor in your location

Your local climate influences how your IT equipment will respond to temperature shifts. If your cooling system fails, how long will it take before your IT equipment reaches critical temperatures?
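One rough way to answer that question is a worst-case heat balance: assume that once cooling stops, all IT heat goes into the room air alone. The sketch below uses hypothetical room dimensions and loads, and it ignores the thermal mass of equipment, floors, and walls, which in practice buys extra time:

```python
# Back-of-the-envelope estimate of how quickly room air heats up after a
# cooling failure. Worst case: all IT heat warms the room air alone,
# ignoring the thermal mass of equipment, floors, and walls.
AIR_DENSITY_KG_M3 = 1.2     # approximate density of air at room conditions
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg·K), specific heat of air at constant pressure

def minutes_until_critical(it_load_kw: float, room_volume_m3: float,
                           start_temp_c: float, critical_temp_c: float) -> float:
    """Minutes until room air reaches a critical temperature with no cooling."""
    air_mass_kg = room_volume_m3 * AIR_DENSITY_KG_M3
    rise_rate_c_per_s = (it_load_kw * 1000.0) / (air_mass_kg * AIR_SPECIFIC_HEAT)
    return (critical_temp_c - start_temp_c) / rise_rate_c_per_s / 60.0

# Hypothetical room: 50 kW of IT load in a 500 m³ space, starting at 24°C,
# with 35°C treated as the critical inlet temperature.
print(f"{minutes_until_critical(50, 500, 24.0, 35.0):.1f} minutes")  # ≈ 2.2
```

The answer for a densely loaded room can be just a few minutes, which is why hotter climates call for more conservative setpoints and a plan for backup cooling.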

Beware of humidity

Many argue that humidity is an even bigger concern than temperature when it comes to protecting IT equipment in a data center environment. The ASHRAE standards also include recommendations for allowable humidity ranges based on relative humidity (RH) and dew point temperatures. Operating equipment outside these humidity levels can shorten its lifespan or cause outright failure: air that is too dry raises the risk of electrostatic discharge, while air that is too humid invites condensation and corrosion.
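As an illustration of working with those two metrics, the Magnus approximation (a standard meteorological formula, not part of the ASHRAE guidelines themselves) converts a temperature and relative humidity reading into a dew point that can be checked against a target envelope. The limits below follow the commonly cited 2011 recommended moisture range, but verify the values for your equipment class against the published guidelines:

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate the dew point via the Magnus formula."""
    b, c = 17.62, 243.12  # Magnus coefficients for water, roughly -45°C to 60°C
    gamma = math.log(rh_percent / 100.0) + (b * temp_c) / (c + temp_c)
    return c * gamma / (b - gamma)

# Dew-point envelope: illustrative values based on the 2011 recommended
# moisture range; check the guidelines for your ASHRAE equipment class.
DEW_POINT_MIN_C, DEW_POINT_MAX_C = 5.5, 15.0

temp_c, rh = 24.0, 45.0  # hypothetical sensor reading
dp = dew_point_c(temp_c, rh)
print(f"Dew point: {dp:.1f}°C "
      f"(within envelope: {DEW_POINT_MIN_C <= dp <= DEW_POINT_MAX_C})")
```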

While recommended standards and benchmarks exist, every data center’s needs are unique. The balance between proper cooling, efficiency, and equipment performance is constantly shifting as technology advances. With IT equipment becoming more advanced and resilient, higher allowable temperature ranges allow data center operators to maximize efficiency and save money without compromising system performance.

Questions about network infrastructure for your facility? Ask an expert today! 
