Our Favorite Data Center Cooling Techniques

July 14, 2017

Equipment cooling is probably the least glamorous aspect of data center operations. Data center managers tend not to give it much thought, but solution providers who understand the mechanics of cooling, and can offer more cost-efficient options, can save their customers a lot of money on energy and equipment.

Although cooling isn’t necessarily sexy, it is crucial. Without proper cooling, data center temperatures can rise 30 degrees Fahrenheit in a single hour, and the heat that servers and computing hardware generate can damage equipment. In fact, continuous operating temperatures of 80 degrees or more can shorten the useful life of server-room hardware. At the same time, cooling tends to be the biggest line item in the data center budget, typically accounting for about 40 percent of operating costs.
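
To put that 40 percent in perspective, the standard yardstick for cooling overhead is power usage effectiveness (PUE): total facility power divided by the power delivered to IT equipment. Here is a minimal back-of-the-envelope sketch in Python; all of the kilowatt figures are made up purely for illustration:

```python
# Back-of-the-envelope PUE calculation.
# PUE = total facility power / IT equipment power.
# All figures below are hypothetical, for illustration only.

it_load_kw = 500.0         # power drawn by servers, storage, network
cooling_kw = 400.0         # cooling plant (chillers, CRAC fans, pumps)
other_overhead_kw = 100.0  # lighting, power distribution losses, etc.

total_kw = it_load_kw + cooling_kw + other_overhead_kw
pue = total_kw / it_load_kw
cooling_share = cooling_kw / total_kw

print(f"PUE: {pue:.2f}")                               # 2.00 here
print(f"Cooling share of total: {cooling_share:.0%}")  # 40%
```

A PUE of 2.0 means the facility burns a full watt of overhead for every watt of compute; the most efficient modern facilities report figures closer to 1.1 to 1.2.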

The most common means of data center cooling is the high-output chiller: a large air conditioner that runs continuously to keep the data center well within its operating temperature range. However, chillers require a lot of energy, which is why data centers are often built where electricity is readily available and inexpensive. Old-fashioned chillers are rapidly becoming outdated as new cooling options emerge. Here are just a few of our favorites:

Hot and cold aisles

Data center layout is important for cooling. A data center is not like an office building, where conventional HVAC techniques apply. Strategic placement of heat-generating equipment and equipment racks, combined with optimized airflow, has a big impact on cooling efficiency.

For years, data center cooling has taken advantage of raised-floor designs, using the space under the floor to consolidate cold air and circulate it under the server racks. While raised floors remain, today’s cooling strategy relies more on efficient air movement and equipment placement. Hot aisle/cold aisle layouts are designed to optimize cooling while cutting costs. Basically, the hot aisle/cold aisle strategy places server racks in alternating rows, with cold-air intakes facing one way and hot-air exhausts facing the other. Cold aisles face the air conditioner supply ducts; the heated exhaust then blows into the hot aisles, which usually face the air conditioner return ducts.
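
To make the alternating pattern concrete, here is a toy Python sketch; the row count, labels, and orientation rule are invented for illustration, and real layouts also involve containment panels, blanking plates, and cable paths:

```python
# Toy sketch of a hot aisle/cold aisle floor plan.
# Flipping every other rack row means each aisle is shared by
# two rows of intakes (cold) or two rows of exhausts (hot).

NUM_ROWS = 4  # hypothetical number of rack rows

def aisle_layout(num_rows: int) -> list[str]:
    """Return aisle/row labels so aisles alternate cold, hot, cold, ..."""
    plan = []
    for row in range(num_rows):
        # Aisle in front of this row: even rows get a cold aisle.
        plan.append("COLD AISLE  <- CRAC supply" if row % 2 == 0
                    else "HOT AISLE   -> CRAC return")
        # Flip every other row so intakes always face a cold aisle.
        facing = ("intakes face previous aisle" if row % 2 == 0
                  else "intakes face next aisle")
        plan.append(f"  rack row {row + 1} ({facing})")
    # Closing aisle after the last row.
    plan.append("COLD AISLE  <- CRAC supply" if num_rows % 2 == 0
                else "HOT AISLE   -> CRAC return")
    return plan

for label in aisle_layout(NUM_ROWS):
    print(label)
```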

Water cooling

More data centers are adopting water cooling in lieu of conventional air conditioning. Most water-cooling systems run water through pipes to cool the server racks. Water can carry on the order of 50 to 1,000 times as much heat as air, depending on how you measure, and it’s more efficient for targeted cooling of hot spots.
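
That advantage follows from the sensible-heat relation Q = ρ · V̇ · c_p · ΔT: water is roughly 800 times denser than air and has about four times the specific heat, so the same volumetric flow carries vastly more heat. A quick sketch using standard textbook fluid properties; the flow rate and temperature rise are arbitrary illustration values:

```python
# Sensible heat removal: Q = rho * V_dot * c_p * delta_T  (watts).
# Fluid properties are standard textbook values; the flow rate and
# temperature rise below are arbitrary, for illustration only.

def heat_removed_w(rho_kg_m3: float, cp_j_kgk: float,
                   flow_m3_s: float, delta_t_k: float) -> float:
    """Heat carried away by a fluid stream, in watts."""
    return rho_kg_m3 * flow_m3_s * cp_j_kgk * delta_t_k

FLOW_M3_S = 0.001  # 1 liter per second of coolant
DELTA_T_K = 10.0   # coolant warms 10 K passing the racks

q_water = heat_removed_w(998.0, 4186.0, FLOW_M3_S, DELTA_T_K)
q_air = heat_removed_w(1.2, 1005.0, FLOW_M3_S, DELTA_T_K)

print(f"Water: {q_water / 1000:.1f} kW per L/s")   # ~41.8 kW
print(f"Air:   {q_air:.1f} W per L/s")             # ~12 W
print(f"Per-volume ratio: {q_water / q_air:.0f}x") # ~3,500x
```

Per unit of volume moved, the ratio works out to the thousands; quoted figures like 50 to 1,000 times are lower because air handlers can practically move far more volume than a water loop can.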

In the past, copper pipes were used for rack cooling, but they were susceptible to leaks that could damage equipment. Now there are self-contained systems mounted as heat exchangers, so any leak stays contained within the cooling unit, eliminating the risk to hardware.

The latest trend is the floating data center, in which the server setup literally floats on a barge and is cooled with seawater that is filtered and pumped through heat exchangers. Microsoft is even experimenting with a submersible data center to address the cooling problem.

Environmental cooling

Rather than finding new ways to cool the data center, some operators are simply building colocation facilities in cooler climates. Data centers are going up in high-desert and mountain locations where cooler, drier air can offset chiller operations: the air arrives cool and with little humidity, so the cooling systems don’t have to work as hard. Many of these locations also offer other natural cooling features, such as rivers and streams.

In addition to taking advantage of the drier, cooler air at higher altitudes, some data center vendors are going underground. Iron Mountain, for example, has a facility in western Pennsylvania that stays cool because it sits 220 feet below ground. Other providers are converting abandoned mines and even abandoned missile silos into underground data centers that take advantage of the earth’s natural cooling while also offering added security and protection.

Raise the operating temperature

Rather than spending more on cooling, some data center operators are simply raising the temperature of their server rooms, arguing that newer hardware is more robust and less temperature-sensitive. Google, for example, has raised the operating temperature of its data centers from 70 to 80 degrees Fahrenheit, saving hundreds of thousands of dollars in power expenditures. ASHRAE, the American Society of Heating, Refrigerating and Air-Conditioning Engineers, now recommends allowing data center operating temperatures as high as 80.6 degrees Fahrenheit.
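
The arithmetic behind those savings is straightforward. A commonly cited rule of thumb is that each degree Fahrenheit of setpoint increase trims a few percent off cooling energy; the 4 percent figure and the dollar amount in the sketch below are assumptions for illustration, not measured values:

```python
# Estimated cooling-energy savings from raising the setpoint.
# The 4%-per-degree-F figure is a commonly cited rule of thumb,
# not a guarantee; actual savings vary by facility and climate.

SAVINGS_PER_DEG_F = 0.04         # assumed fractional savings per deg F
annual_cooling_cost = 1_000_000  # hypothetical cooling spend, USD/year

old_setpoint_f = 70.0
new_setpoint_f = 80.0

degrees_raised = new_setpoint_f - old_setpoint_f
# Apply the per-degree savings compounding, degree by degree.
remaining_fraction = (1 - SAVINGS_PER_DEG_F) ** degrees_raised
savings = annual_cooling_cost * (1 - remaining_fraction)

print(f"Raising {old_setpoint_f:.0f}F -> {new_setpoint_f:.0f}F "
      f"saves an estimated ${savings:,.0f}/year")  # ~$335,000 here
```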

Mix and match strategies

Of course, there is no reason to limit yourself to a single cooling strategy. Many of these approaches are complementary, such as using natural environmental cooling while also raising the operating temperature. Combining cooling strategies can reduce operating expenses substantially.
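
One note on the arithmetic of stacking: each added strategy cuts a fraction of whatever cooling load the previous ones left behind, so the reductions compound multiplicatively rather than adding. A quick sketch with hypothetical percentages:

```python
# Combined effect of stacked cooling strategies.
# Each strategy cuts a fraction of the *remaining* cooling energy,
# so reductions compound multiplicatively. Percentages are
# hypothetical illustration values.

strategies = {
    "hot/cold aisle containment":  0.15,
    "free cooling (cool climate)": 0.30,
    "raised setpoint":             0.20,
}

remaining = 1.0
for name, cut in strategies.items():
    remaining *= (1 - cut)
    print(f"after {name:<30} {1 - remaining:.0%} total reduction")

# 15% + 30% + 20% would be 65% if the cuts simply added,
# but the compounded total is about 52%.
```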

So when advising clients on how to get more from their data center infrastructure, be cool. Consider alternative data center cooling strategies that can save your customers money: budget dollars that can then be reallocated to further data center upgrades.