Planning for the Power Demands of AI


AI, Artificial Intelligence, Data Centers, Machine Learning, Renewable Computing

In a series of reports released in April, the U.S. Department of Energy outlined use cases and guardrails for utilizing artificial intelligence solutions in aid of the energy transition, with particular emphasis on solutions that could be deployed on the grid in the near term.

The same week, the department launched a website summarizing how it is already leveraging AI, from emergency response to forecasting, and the opportunities it sees for further harnessing the technology, including permitting and improved grid reliability.

“The electrical grid of the United States is among the most complex machines on Earth,” the department noted.

The AI-energy nexus remains a two-sided conversation. As AI applications are increasingly deployed in service of the energy transition, concerns over the energy demands of AI itself persist.

In early May, Soluna CEO John Belizaire participated in a virtual conversation hosted by Latitude Media that tackled how utilities, operators, and regulators are responding to burgeoning load growth, driven in large part by AI workloads.

As AI applications dominate a larger slice of the computing pie, John argued that “the convergence of computing and energy is inevitable.”

In particular, the vast computing needs and batchable nature of many AI applications make them well suited to the intermittent generation of renewable energy sources. Traditional hyperscalers require 24/7 connectivity and layers of redundancy, but batchable loads can be ramped up or down in step with energy supply and grid demand without compromising the integrity of their workloads.
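The ramping behavior described here can be sketched in a few lines. This is purely illustrative, a hedged example of clamping a batch workload to available supply; the function name, the capacity figures, and the clamping rule are our assumptions, not Soluna's actual scheduling logic:

```python
def target_utilization(available_mw: float, site_capacity_mw: float) -> float:
    """Scale a batchable compute workload to the energy currently available.

    Returns the fraction of the site's compute capacity to run, clamped
    to the range [0, 1]. Because a batchable job can checkpoint and
    resume, ramping down loses no work. (Illustrative sketch only.)
    """
    if site_capacity_mw <= 0:
        raise ValueError("site capacity must be positive")
    return max(0.0, min(1.0, available_mw / site_capacity_mw))


# With 15 MW of otherwise-curtailed supply at a hypothetical 30 MW site,
# the workload runs at half capacity; surplus supply caps out at full.
print(target_utilization(15.0, 30.0))  # 0.5
print(target_utilization(40.0, 30.0))  # 1.0
```

The key design point is the clamp: a 24/7 hyperscale load cannot tolerate the zero case, but a checkpointable batch job simply pauses and resumes when supply returns.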

John was joined on the Latitude Media panel by Brian Janous, co-founder of Cloverleaf Infrastructure, and Michelle Solomon, senior policy analyst at Energy Innovation. Both Janous and Solomon provided insight into how utilities and regulators are contending with load growth precipitated by a surge in AI use.

Solomon co-authored a report that her team released in April on how utilities can meet near-term load growth without building new gas plants. According to the report, a suite of technology solutions, paired with key policy action, can manage short-term growth in lieu of a reversion to fossil fuels.

As John noted, the solution might already exist. For years, he argued, power producers have operated under the assumption that they “need to get more power lines built or get very large batteries built behind there or advance the battery space, and it turns out that the solution to that problem was always there. Computing was a perfect solution.”

Indeed, there is potential to continue cleaning the grid even as demand surges. By giving independent power producers (IPPs) a way to offload their unprofitable, otherwise stranded, energy, Soluna hopes to incentivize the development of more renewable resources.

Janous noted that in this time of growth, collaboration between customers and utilities will be imperative. Co-location models, where computing loads are brought directly to the energy source, are one example of how such collaboration can play out in new and creative ways.

Soluna’s superpower lies in the convergence of two separate solutions: curtailment mitigation for power producers on the one hand, and sustainable cloud and compute resources for enterprises on the other. We’d argue it’s the future of computing: renewable computing.