AI Data Center Energy Use

by: Matt Penfold

In electricity circles, it feels like everyone is talking about AI data center energy use and load growth.  

Overall electricity load growth, which was nearly flat for decades, is now predicted to increase sharply. A December 2023 report from consulting firm Grid Strategies found that FERC filings in 2023 reflected a near doubling of grid planners’ five-year electricity load growth forecast, with the “nationwide forecast of electricity demand [shooting] up from 2.6% to 4.7%.”

Energy-hungry AI data centers are a major driver of this load growth. A May 2024 study by the Electric Power Research Institute (EPRI) found that AI could drive data centers to consume nearly 10% of total U.S. electricity generation by 2030, up from 2% in 2020. Latitude Media has covered the intersection of AI, energy, and data centers extensively, including on the Carbon Copy podcast. Even mainstream business media is joining the conversation: the Washington Post produced a great podcast on how AI, cleantech manufacturing, and cryptocurrency mining are all adding strain to aging electric grids.

[Chart: U.S. power demand growth, driven largely by data center energy use]

This is not a problem to which I have the answer. However, as co-founder of a company focused on helping organizations (like data centers) plan, procure, and manage clean energy, I see important roles for cloud customers, utilities, and data center companies in meeting rising demand from AI data center energy use while reducing their carbon emissions.

What Cloud Customers Can Do

To avoid a rapid increase in associated carbon emissions, the companies powering the generative AI boom should match their data centers’ energy use with additional clean energy purchases.

Cloud customers – and ideally regulators – need to hold data center companies to high standards. These standards should include next-generation clean energy goals (e.g., hourly matching or carbon matching rather than annual matching) along with requirements for additionality.
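To make that distinction concrete, here is a minimal sketch (in Python, with invented numbers) of how a load can look 100% matched on an annual basis while leaving much of its consumption unmatched hour by hour:

```python
# Hypothetical illustration of annual vs. hourly clean energy matching.
# All load and generation figures are invented for this example.

# Flat data center load and solar-heavy contracted clean generation (MWh)
# over a simplified six-hour "day".
load      = [10, 10, 10, 10, 10, 10]
clean_gen = [ 0,  5, 20, 25,  8,  2]

# Annual-style matching compares totals only: surpluses at noon
# offset deficits at night on paper.
annual_match = min(sum(clean_gen) / sum(load), 1.0)

# Hourly matching only credits clean energy in the hour it is produced.
hourly_matched = sum(min(l, g) for l, g in zip(load, clean_gen))
hourly_match = hourly_matched / sum(load)

print(f"Annual matching: {annual_match:.0%}")  # 100% -- totals line up
print(f"Hourly matching: {hourly_match:.0%}")  # 58% -- nights go unmatched
```

The gap between those two numbers is exactly what the dirtiest hours of the grid see; carbon matching extends the same idea by weighting each unmatched hour by its grid carbon intensity.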

This more granular matching, especially with a focus on additionality (adding new clean energy resources to the grid vs. tapping those already online), would help alleviate legitimate concerns about reliability and increased emissions in communities with high concentrations of data centers. Singapore halted all new data center construction between 2019 and 2022 because of concerns that their high electricity demand would strain the grid and negatively impact the city-state’s sustainability goals. Both Singapore and Ireland have published rules in recent years that make new data center construction contingent on things like better energy and water efficiency, the ability to use backup generators, and “the ability to reduce power consumption when requested” (i.e., demand response). 

New data center construction without additional, clean power risks raising power prices and grid emissions. We can expect more moratoriums if we get this next phase of data center growth wrong. 

To its credit, Big Tech pioneered the concept of voluntary corporate renewable energy purchases, with tremendous follow-on benefits in terms of policy advances and action by other corporate energy users. Now is the time to stay the course – we can’t let the AI mania become an excuse for slipping corporate power procurement standards. That said, cloud customers can only push so far – the electric utilities and their regulatory commissions hold most of the keys in this critical moment.

What Utilities Can Do

My primary suggestion for utilities and their regulators is this: while building new generation is a key part of the solution, let’s not treat the AI boom as a land grab for more rate-based generation. Business as usual is not a viable option in these unprecedented times, and investor-owned utilities are naturally predisposed to make large capital expenditures that grow their rate base.

When all you have is a hammer, everything looks like a nail. Let’s remember that we have other tools in the toolbox. There are many opportunities for utilities to make smaller capital improvements that get more out of their existing infrastructure. Brian Janous (CEO of CloverLeaf Infrastructure) made some great points on this topic in a recent Catalyst podcast. For instance, Janous noted that data centers are effectively microgrids – they have backup generation and storage, and there’s significant opportunity to tap those behind-the-meter assets to respond to grid signals.
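As a rough illustration of that microgrid point, here is a hypothetical dispatch sketch (asset sizes, names, and the signal format are all invented; this is not any real operator’s system) showing how a facility’s behind-the-meter assets might respond to a utility’s request to shed load:

```python
# Hypothetical behind-the-meter dispatch during a grid event.
# Asset sizes and thresholds are invented; assumes a one-hour dispatch
# window, so MW and MWh are interchangeable here.

def dispatch(reduction_requested_mw: float,
             battery_soc_mwh: float,
             battery_max_mw: float = 5.0,
             generator_mw: float = 8.0) -> dict:
    """Decide how much grid load to shed when the utility requests
    a reduction (i.e., a demand response event)."""
    plan = {"battery_mw": 0.0, "generator_mw": 0.0, "unmet_mw": 0.0}

    # Discharge the battery first (clean and fast-responding), limited
    # by both its power rating and its remaining state of charge.
    plan["battery_mw"] = min(reduction_requested_mw, battery_max_mw, battery_soc_mwh)
    remaining = reduction_requested_mw - plan["battery_mw"]

    # Cover the rest with on-site backup generation.
    plan["generator_mw"] = min(remaining, generator_mw)
    plan["unmet_mw"] = remaining - plan["generator_mw"]
    return plan

print(dispatch(reduction_requested_mw=10.0, battery_soc_mwh=4.0))
# {'battery_mw': 4.0, 'generator_mw': 6.0, 'unmet_mw': 0.0}
```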

[Image: server racks in a data center]

Utility history wonks will recognize an opportunity to dust off the old “non-wires alternative” playbook, exemplified in the 2013 SCE Local Capacity Requirement procurement, which replaced the 2 GW San Onofre Nuclear Generating Station with dispatchable behind-the-meter resources.

Utilities also need to be creative and work with private-sector companies on innovative tariff structures that incentivize load flexibility by passing through wholesale prices, and that include bring-your-own-generation options (both behind-the-meter and front-of-the-meter). Duke Energy provided a great example of this in its recent proposal of a suite of new tariffs that would enable large corporate customers to “fund novel technologies like long-duration storage and advanced nuclear as they try to decarbonize.”
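A toy calculation (all prices and load shapes invented) shows why a wholesale pass-through rewards flexibility in a way a flat volumetric tariff cannot:

```python
# Toy comparison: flat volumetric tariff vs. wholesale price pass-through.
# Prices and load shapes are invented for illustration.

wholesale = [30, 25, 20, 60, 120, 80]   # $/MWh over six hours, evening spike
flat_rate = 55                          # $/MWh, shape-agnostic flat tariff

inflexible = [10, 10, 10, 10, 10, 10]   # MWh per hour, 60 MWh total
# The flexible customer shifts consumption out of the two priciest hours.
flexible   = [14, 14, 14,  8,  2,  8]   # same 60 MWh total

def cost(load, prices):
    return sum(l * p for l, p in zip(load, prices))

print("Flat tariff (any shape):  $", cost(inflexible, [flat_rate] * 6))  # $3300
print("Pass-through, inflexible: $", cost(inflexible, wholesale))        # $3350
print("Pass-through, flexible:   $", cost(flexible, wholesale))          # $2410
```

Under the flat rate, reshaping load saves nothing; under the pass-through, the flexible customer cuts its bill by roughly 28%, which is precisely the incentive such tariffs are designed to create.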

What Data Center Companies Can Do

There’s no silver bullet that will solve these load growth challenges for data center companies, utilities, and their regulators. Data center companies need better tools to site new facilities in regions with interconnection capacity for both load and supply, to procure the right renewable resources, and to operate their facilities in ways that minimize both costs and emissions. That’s what Verse focuses on.

Data center operators also need to combine those tools with other strategies, such as optimizing the timing of compute load for AI model training. Google explained this approach in a 2020 blog post. The overall principle is to move “the timing of many [non-urgent] compute tasks to when low-carbon power sources, like wind and solar, are most plentiful.” Google notes that while they began temporally shifting tasks within a single data center, “it is also possible to move flexible compute tasks between different data centers.”
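The mechanics of that kind of carbon-aware shifting can be sketched in a few lines. This is not Google’s actual scheduler; it is a greedy illustration (with an invented carbon intensity forecast) that places deferrable compute into the lowest-carbon hours before a deadline:

```python
# Simplified sketch of carbon-aware load shifting (not Google's actual
# scheduler). Given an hourly grid carbon intensity forecast, place
# deferrable compute hours into the cleanest slots before a deadline.

def schedule_flexible_compute(carbon_intensity: list[float],
                              hours_needed: int,
                              deadline_hour: int) -> list[int]:
    """Return the indices of the lowest-carbon hours before the deadline."""
    window = list(enumerate(carbon_intensity[:deadline_hour]))
    window.sort(key=lambda pair: pair[1])  # cleanest hours first
    return sorted(h for h, _ in window[:hours_needed])

# Invented gCO2/kWh forecast for the next 12 hours (midday solar dip).
forecast = [450, 420, 400, 350, 250, 180, 150, 160, 260, 380, 430, 460]

# A four-hour training job due within 12 hours lands in the midday hours.
print(schedule_flexible_compute(forecast, hours_needed=4, deadline_hour=12))
# -> [4, 5, 6, 7]
```

Moving the same tasks between data centers, as Google describes, is the spatial version of the same optimization, with carbon intensity varying by region rather than by hour.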

Other technical innovations could also help, ranging from improved server design and new cooling techniques to more efficient AI chips, like Google’s Tensor Processing Units (TPUs), which Verse’s head of engineering wrote about earlier this year.

What We Can’t Do — Be Complacent About AI Data Center Energy Use

There’s no question that balancing explosive load growth (driven in large part by AI data center energy use) with reducing carbon emissions is a huge challenge. This is particularly true on aging grids with long interconnection queues, as we’re seeing in many parts of the U.S. However, with the right collaboration (between utilities, regulators, data center companies, and community stakeholders), incentives (from utilities and data center customers), and tools, we can expand powerful new technologies without undermining critical climate ambitions. 

This op-ed was originally posted on MCJ Collective’s substack.