You may already be sustainability-minded!

Corporate sustainability initiatives are increasingly finding their way into the data center. But many CIOs who invest in and deploy leading-edge carbon-cutting technology aren’t even aware they’re taking climate action. They’re just trying to improve the bottom line.

That puts CIOs in an enviable position. In most corners of the enterprise, ESG decision-making necessitates squaring potential climate benefits with the added expense, uncertainty and disruption of next-generation fuels, materials and processes.

But IT decision-makers have opportunities to skirt that problem in the data center, which is responsible for about two percent of US greenhouse gas emissions. Emerging innovations are helping to convert their massive resource-intensive server complexes into a rare tradeoff-free front in the sustainability war, where reductions in carbon emissions often go hand in hand with lower costs.

Increasingly, it’s a front worthy of investment. Supply-side concerns like elevated fuel prices and geopolitical instability are only part of the motivation. Power is already the data center’s largest expense. And the seemingly insatiable desire to glean more insights from more data is driving denser deployments of racks stuffed with ever more power-hungry CPUs and GPUs.

To make matters worse, data centers are reaching the point where the energy required to cool racks packed with those brawny servers is rising faster than the energy needed to run them.

By some estimates, as much as 40 percent of a data center’s power budget is now devoted to cooling. To try to reverse the trend, many facilities managers have adopted evaporative cooling. That can help trim electricity demands from HVAC units and on-board fans. But it requires huge amounts of water – collectively, about 174 billion gallons per year in the US.

That’s simply unsustainable, which is why cost- and sustainability-minded CIOs are looking beyond evaporative cooling.

“The way we’ve been cooling data centers is going to have to change,” said Scott Tease, Vice President and General Manager of HPC and AI at Lenovo. “Traditional air cooling just can’t keep up. And we don’t have the water anymore to supply the massive amounts that evaporative cooling requires. It’s a big, big problem.”

More enterprises recognize that reducing water usage in the data center is becoming as urgent as slashing fossil fuel consumption. AWS, for one, is already getting a jump on both. At its re:Invent event late last month, the world’s largest cloud provider pledged to be water-positive by 2030. The company also reiterated its commitment to power 100 percent of its operations with renewable energy by 2030. During his keynote, attendees cheered when CEO Adam Selipsky told them that the company was already more than 85 percent of the way there.

“Please everyone, get involved,” Selipsky implored the crowd. “It’s a problem for all of us.”

AI to minimize the cost of AI

As with almost any data center upgrade, energy-efficient options are far more varied and plentiful if you’re starting from scratch with a new facility, whether it’s on-premise, colocated or in the cloud. That said, CIOs have compelling options for cutting carbon emissions – and lowering energy costs – that are effective whether they’re deployed in new sites or as individual server upgrades in existing racks.

Ironically, one increasingly popular way to reverse the impact of mushrooming AI and machine learning on data center power demand is with more AI and ML: that is, software and services to help IT optimize storage and workloads across existing assets to ensure the most climate-friendly, cost-effective distributions.
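In practice, that optimization can be as simple as scoring candidate placements by estimated energy cost and carbon intensity, then picking the lowest-scoring site. The minimal sketch below assumes a simple weighted score; the site names, PUE figures, carbon intensities, prices and the placement_score helper are hypothetical illustrations, not any vendor’s actual product.

# Minimal sketch of carbon- and cost-aware workload placement.
# All site names, PUE figures, carbon intensities and prices are hypothetical.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    pue: float               # power usage effectiveness (total power / IT power)
    carbon_g_per_kwh: float  # grid carbon intensity, gCO2e per kWh
    price_per_kwh: float     # electricity price, $ per kWh

def placement_score(site: Site, it_kwh: float, carbon_weight: float = 0.5) -> float:
    """Lower is better: blends estimated energy cost with estimated carbon emissions."""
    total_kwh = it_kwh * site.pue                          # add facility overhead (cooling, etc.)
    cost = total_kwh * site.price_per_kwh                  # dollars
    carbon_kg = total_kwh * site.carbon_g_per_kwh / 1000   # kilograms of CO2e
    return (1 - carbon_weight) * cost + carbon_weight * carbon_kg

sites = [
    Site("on-prem-east", pue=1.6, carbon_g_per_kwh=450, price_per_kwh=0.11),
    Site("colo-hydro-north", pue=1.2, carbon_g_per_kwh=30, price_per_kwh=0.07),
    Site("cloud-region-west", pue=1.15, carbon_g_per_kwh=250, price_per_kwh=0.09),
]

# Place a workload that needs roughly 500 kWh of IT energy at the best-scoring site.
best = min(sites, key=lambda s: placement_score(s, it_kwh=500))
print(f"Best placement: {best.name}")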

Anthony Behan, Managing Director of Cloudera’s Telecommunications, Media and Entertainment efforts, said the transition to high-powered 5G has made his customers laser-focused on minimizing data transfers by pushing workloads closer to where the data is collected and stored – a move that drives power savings and, as a consequence, cost savings.

“In telecommunications, every single C-level executive has cost as a line item on their performance review,” Behan said. “It’s something that they’re constantly thinking about. So this tends to be what lights their eyes up.”

Liquid cooling

IT decision-makers can make a big dent in power demands with liquid cooling, which is quickly emerging as a leading alternative. It’s far more efficient than air cooling and even evaporative cooling. Suppliers say it can cut the energy bill for cooling by as much as 40 percent while enabling much higher-density server and rack deployments.
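Putting the article’s rough figures together gives a sense of the payoff: if cooling consumes as much as 40 percent of a facility’s power budget and liquid cooling trims that cooling energy by up to 40 percent, the back-of-the-envelope facility-wide saving approaches 16 percent. A quick sketch of that arithmetic, using the upper-end estimates cited above:

# Back-of-the-envelope estimate using the upper-end figures cited in this article.
cooling_share = 0.40    # share of facility power devoted to cooling (up to 40 percent)
cooling_cut = 0.40      # cooling-energy reduction suppliers claim for liquid cooling
facility_savings = cooling_share * cooling_cut
print(f"Approximate facility-wide energy reduction: {facility_savings:.0%}")  # prints 16%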

And because liquid cooling systems are closed-loop, there’s no water bill.

The technology has been used to cool supercomputers for years. Only recently has it begun trickling down into everyday data centers. Suppliers now offer liquid cooling implementations for racks, servers and even individual components. Nvidia, for example, announced plans to begin offering liquid-cooled GPUs this winter.

Intel, the provider of literally the other hottest server component, has been working for several years to help ensure viable liquid cooling options for the data center. In 2019, it partnered with liquid cooling pioneer Lenovo to pair Lenovo’s Neptune liquid cooling technology and TruScale infrastructure platform with Intel’s Xeon Scalable server hardware for HPC and AI applications.

More recently, Intel has announced efforts with other suppliers, including Submer and Green Revolution Cooling, as well as a $700 million investment in a liquid cooling R&D facility in Oregon.

Starting from scratch

CIOs who are looking to deploy a new site, whether that be on-premise, with a colocation provider or with a cloud vendor, can find opportunities to reuse the heat that liquid cooling systems carry away from the servers.

Cloud&Heat, for example, offers self-contained data centers that can warm nearby office buildings in the winter. If you’re looking for other heat reuse applications, you can see what others are doing on this map of projects, which the German provider of sustainable cloud and on-premise technologies coordinates under the auspices of the Open Compute Project.

One of the sites on the reuse map, in fact, is a Montreal colocation facility being developed by QScale, a startup offering high-density data center facilities. The first phase, which is expected to come online in early 2023, checks most of the sustainability boxes. For example, it is:

- Located in a northern climate that requires much less cooling,
- Powered almost entirely by renewable energy, and
- Providing heat to what could be thought of as agricultural colocation: a complex of greenhouses built to grow fruits and vegetables throughout the winter.

“We’re all used to sustainable things being more expensive,” said Martin Bouchard, Co-founder and CEO of QScale. “But we’re not more expensive than a plain vanilla, dirty data center. So you can have sustainable, clean carbon output and be super-efficient all at the same time.”

That should be music to CIOs’ ears – whether they’re working to make their data centers more sustainable or just to cut costs.
