Michael Young explains how to get the most out of mission-critical mechanical systems.
Humidity control is a critical component in the design and operation of data centers, though it can take a backseat to the more obvious issue of heat generation.
Too much humidity, and you risk condensation, internal corrosion and electrical shorts; too little, and you risk electrostatic discharge (ESD). In addition, a rapid rate of change in humidity can cause sensitive equipment (such as tape media storage) to expand and contract, leading to damage or premature failure.
There is a fine line between a well-controlled data center and one that houses equipment in danger of catastrophic failure. The installed mechanical systems must be provisioned and optimized to prevent downtime. Here are some considerations to ensure you are getting the most out of your mechanical systems.
Isolate Critical Spaces from Outside Conditions
Local climate is one of the biggest factors in humidity control. Typically, a make-up air system provides minimum ventilation to critical spaces. Controlling and conditioning ventilation air precisely prevents excess moisture from being introduced into critical spaces. Buildings that utilize vapor barriers and tight construction will naturally limit the introduction of outside humidity. In certain cases, it may make sense to buffer critical spaces from external walls with non-critical spaces to minimize infiltration. Regardless of the mode of entry, action must be taken to control humidity once inside critical spaces.
Controlling Humidity Within the Space
Low relative humidity (RH%) is typically addressed with a humidification system that precisely controls the amount of water added to the air. Humidification systems must be carefully designed and operated to avoid excessive energy or water use. There may also be opportunities to limit outside air or correct issues with the cooling system without adding humidification; these options should be considered before additional equipment is installed.
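As a simple illustration of precise but efficient control, the sketch below shows a deadband (hysteresis) scheme: the humidifier changes state only when RH crosses a threshold, which prevents short-cycling and the wasted water and energy that come with it. The thresholds and interface here are illustrative assumptions, not values from any particular control system.

```python
# Minimal sketch of deadband (hysteresis) humidifier control.
# Setpoints below are hypothetical; use your design values.

LOW_RH = 40.0    # % RH: enable humidification below this
HIGH_RH = 45.0   # % RH: disable humidification above this


def humidifier_command(space_rh: float, currently_on: bool) -> bool:
    """Return True if the humidifier should run.

    The deadband between LOW_RH and HIGH_RH prevents short-cycling:
    the humidifier holds its current state until the space RH
    crosses the opposite threshold.
    """
    if space_rh < LOW_RH:
        return True
    if space_rh > HIGH_RH:
        return False
    return currently_on  # inside the deadband: hold state
```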
High RH% can be controlled in a variety of ways, the most common being removal of moisture via direct-expansion or chilled water cooling coils. This can be done at the make-up air system or directly within the cooling system serving the critical spaces.
Another approach to high RH% is raising the space temperature of critical areas; this can naturally lower RH to appropriate levels for the equipment. Note: This can only be done within allowable temperature ranges for the IT equipment and does have its limitations. Make sure you fully understand the root of your high RH% issues before implementing solutions.
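The underlying psychrometrics are straightforward: at a fixed moisture content, warming the air raises its saturation capacity, so relative humidity falls. The sketch below estimates this effect using the Magnus approximation for saturation vapor pressure; the temperatures in the example are hypothetical.

```python
import math


def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Saturation vapor pressure (hPa) via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))


def rh_after_reheat(rh_initial: float, t_initial_c: float, t_final_c: float) -> float:
    """RH after changing dry-bulb temperature at constant moisture content."""
    ratio = saturation_vapor_pressure_hpa(t_initial_c) / saturation_vapor_pressure_hpa(t_final_c)
    return rh_initial * ratio


# Example: raising a space from 20 °C to 24 °C at constant moisture
# content drops RH from 60% to roughly 47%.
print(round(rh_after_reheat(60.0, 20.0, 24.0), 1))
```

Note that this only lowers relative humidity; the absolute moisture content (and the dew point) is unchanged, which is why the root cause of high RH% must still be understood.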
Transients and excursions are more difficult to identify and control. Certain factors such as sensor placement and rack density could create unfavorable conditions for IT equipment and exacerbate transients and excursions. A common approach is to analyze averaged space condition trend data, but this must be done with caution. Averaged data typically smooths trends and may not indicate transients or excursions that could be damaging your equipment. Where possible, analyze raw data at the sensor or device level.
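To see why averaging can hide problems, consider the sketch below: an hour of hypothetical 5-minute RH readings containing a brief excursion. The hourly average looks healthy, while the step between consecutive raw readings reveals a swing large enough to matter for sensitive media.

```python
import statistics


def max_step(readings: list[float]) -> float:
    """Largest change between consecutive readings."""
    return max(abs(b - a) for a, b in zip(readings, readings[1:]))


# One hour of hypothetical 5-minute RH readings with a short excursion.
raw = [45, 45, 46, 45, 58, 57, 46, 45, 45, 44, 45, 45]

hourly_average = statistics.mean(raw)   # ~47% RH: looks fine
worst_transient = max_step(raw)         # 13% RH swing in 5 minutes

print(f"hourly average: {hourly_average:.1f}% RH")
print(f"worst 5-minute step: {worst_transient:.1f}% RH")
```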
Trends for the Future
Advances in IT equipment technology, including improved thermal capabilities and higher exhaust temperatures, continually push the boundaries of allowable operating conditions. ASHRAE Technical Committee 9.9 provides guidelines for many aspects of data center design, including recommended operating temperature and humidity ranges for critical IT equipment. TC 9.9's Thermal Guidelines for Data Processing Environments originally recommended a range of 68°F–77°F (20°C–25°C) in 2004, which was reasonably conservative based on the data available at the time. The latest Thermal Guidelines, published in 2015, recommend 64.4°F–80.6°F (18°C–27°C) for all non-legacy equipment, with much wider allowable ranges for certain classes of equipment.
Use caution when applying more aggressive thermal guidelines to legacy equipment or where tape storage media is deployed. As always, ensure that all equipment is operating within manufacturer-recommended ranges, regardless of the latest Thermal Guidelines.
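A simple way to operationalize this is to screen trend data against an explicit envelope. The sketch below hard-codes the 18°C–27°C recommended range cited above purely for illustration; in practice, the limits should come from the manufacturer for each equipment class, especially legacy gear or tape media.

```python
# Hedged sketch: flag temperature readings outside an allowable envelope.
# The 18-27 °C range is the recommended range cited in this article;
# substitute manufacturer limits for your actual equipment.

RECOMMENDED_C = (18.0, 27.0)  # ASHRAE TC 9.9 recommended range (non-legacy)


def out_of_envelope(readings_c: list[float],
                    low: float = RECOMMENDED_C[0],
                    high: float = RECOMMENDED_C[1]) -> list[float]:
    """Return the readings that fall outside the [low, high] envelope."""
    return [t for t in readings_c if not (low <= t <= high)]


print(out_of_envelope([21.5, 26.8, 28.3, 17.2]))  # -> [28.3, 17.2]
```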
Continuing Developments
Humidity and temperature control are interdependent processes; they must be considered together. If any part of the mechanical system is not operating as expected, critical failures and data center downtime could result. The thermal capabilities of IT equipment continue to expand, giving the designer more options.
This allows for reduced costs across the board, not only in construction but also in operation. Economizer availability increases, directly reducing the need for mechanical cooling, and the need for humidification and dehumidification, both resource-intensive processes, is greatly reduced. Be sure to follow these continuing developments to ensure you are getting the most out of your data center mechanical systems.
This article was subsequently published in Engineered Systems, a leading industry publication.
Want to know more? If you would like further assistance in better understanding humidity control in data centers, reach out to Michael.