The environmental cost of data centers isn’t an AI problem. It’s an engineering one.

keywords

AI, Artificial Intelligence, Data Centers, Machine Learning, Renewable Computing

By Nicola Phillips and Dip Patel

Earlier this month, Fortune published an article detailing the 34% increase in global water usage by Microsoft’s data centers from 2021 to 2022, a spike that researchers attribute to Microsoft’s investments in artificial intelligence (AI) products like ChatGPT. During the same period, Google reported a 20% increase in its global water usage.

When I read pieces like this, I’m struck by the framing of the problem. Attributing these extraordinary environmental costs to the advent of mainstream AI makes it seem like the environmental crisis driven by Big Data is an AI problem. Yet, data centers have never been built or run efficiently. They have always been scourges on the environment. This is not a new issue.

The researchers cited in the Fortune article estimate that ChatGPT uses 500 milliliters of water, about two 8-ounce water bottles, for every series of 5 to 50 questions it is asked.

Data centers had been gulping up prodigious amounts of water long before ChatGPT came along. By some estimates, a typical hyperscale data center (like the kinds run by Microsoft, Amazon, and Google) can use between 1 and 5 million gallons of water per day. Precedence Research predicts that the market size for hyperscale data centers will grow from $102 billion in 2023 to $935 billion in 2032, a compound annual growth rate of 28%.
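That 28% figure is easy to sanity-check. A minimal sketch, using only the numbers quoted above, showing how the compound annual growth rate falls out of the 2023 and 2032 market sizes:

```python
# Verify the compound annual growth rate (CAGR) implied by the
# Precedence Research figures quoted above.
start_value = 102   # market size in 2023, $ billions
end_value = 935     # projected market size in 2032, $ billions
years = 2032 - 2023 # nine compounding periods

# CAGR = (end / start)^(1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 28%, matching the report
```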

You can choose not to use chatbots or other AI products, but if you use any streaming service or any search engine — indeed, if you use the Internet — you are contributing to the energy use of these hyperscalers. And even if you personally use none of these services, data centers will continue to be built and run.

One of the unintended consequences of new technologies is that they can shed new light on old issues. Mainstream AI products didn’t create this scenario, but they are a renewed reminder that it exists, and persists.

Data center efficiency is most often measured by a single calculation: power usage effectiveness (PUE). PUE is the ratio of a facility's total energy draw to the energy delivered to its computing equipment, so a PUE of 1.0 would mean every watt goes to computing rather than to cooling, lighting, and other overhead. PUE is marketed as a measure of usefulness, but it ignores several variables that have a significant impact on the usefulness of a data center, including the efficiency of the chips, maintenance of equipment, the volume of water being used, location, waste materials, and source of power.
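To make the metric concrete, here is a minimal sketch of the PUE calculation with illustrative (not real) power figures. Note what the number cannot see: a facility could halve its water use or double its chip efficiency and report the exact same PUE.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by
    the power delivered to IT (computing) equipment. 1.0 is ideal."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers: a facility drawing 1,500 kW in total,
# of which 1,000 kW reaches the computing equipment.
print(pue(1500, 1000))  # 1.5 -- half a watt of overhead per watt of compute

# PUE takes no input for water consumption, chip efficiency,
# power source, or waste, which is exactly the critique above.
```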

The focus on PUE at the expense of a more holistic approach to analyzing data center efficiency distorts reality and undermines optimization efforts. In order to understand the actual environmental cost of a facility, additional considerations beyond PUE are necessary.

Water usage numbers are frequently highlighted in news reports about data centers and the environment, but what are individual facilities actually doing to mitigate the issue?

Hyperscale data centers are, inherently, environmental villains. Building a 10,000-square-foot facility doesn’t just require a lot of water; it requires massive amounts of land, equipment, and labor. It takes time to build and is difficult and costly to maintain.

The problem with these facilities is an engineering problem. There are ways to build data centers more efficiently.

Rather than constructing one massive building, Soluna designs its sites as a series of compact, modular buildings arranged in a diamond formation. Each building is outfitted with giant fans and slanted air filters to maximize filtration, and the buildings face inwards, toward one another, so the sound produced by the fans has to bounce off the neighboring buildings before it escapes. Giant fans can move more air with less energy than their smaller counterparts, and they naturally cool the buildings enough that no additional cooling technology is needed. The facilities themselves use zero water.

AI — and data more broadly — isn’t going away, but it’s not a foregone conclusion that resulting data centers have to be a scourge on the environment. With better engineering, data centers, and their purveyors, can become better stewards of the land they inhabit and the resources they consume.


In a recent episode of Soluna’s Clean Integration podcast, we took a deep dive into the use of water in AI computing in traditional data center models, and some of the looming questions concerning AI’s sustainability with Shaolei Ren, Associate Professor of Electrical and Computer Engineering at the University of California, Riverside. Listen here