AI data centers could flood the overtaxed U.S. power grid with demand and further drive up energy costs for consumers. Or, they could simply agree to use less electricity during the handful of hours per year when the grid is under the greatest stress, making it possible for tech companies to get the power they need without straining the system.
It sounds like an easy fix, but in reality it’s complicated to modulate the demand of a data center that can use as much power as a small city. That’s why it’s rarely done today. In fact, a Department of Energy report last year “identified no examples of grid-aware flexible operation at data centers” in the U.S., with one exception — Google.
Now, the tech giant is taking its flexibility efforts one step further and applying the concept to the machine learning operations that underpin its large language models, the technology driving the current boom in AI development.
This month, Michael Terrell, Google’s head of advanced energy, announced agreements with Indiana Michigan Power (I&M) and Tennessee Valley Authority (TVA), two utilities facing a lot of data center demand, that “represent the first time we’re delivering data center demand response by targeting machine learning workloads.”
Google’s new announcements are a really big deal, said Tyler Norris, a Duke University doctoral fellow and former solar developer and special adviser at the Department of Energy. That’s because they’re the first example of the kind of collaboration between data centers and utilities that needs to happen to keep costs from spiraling out of control.
Estimates of power demand from the AI race are all over the map and hard to trust, but at a minimum, most experts agree that data center demand for electricity outstrips supply. Consumer advocates and state lawmakers are increasingly worried that this dynamic is going to cause electricity rates to surge, as utilities incur the costs of building the power plants and grid infrastructure to serve data centers, and potentially push those costs onto customer bills.
That supply-demand imbalance is also a problem for firms like Google and its competitors, which are locked in a multibillion-dollar race to build the best possible AI system — a heated competition in which who gets electrons first could be a deciding factor.
Over the past five years or so, through its “carbon-intelligent computing” program, Google has been actively shifting nonurgent computing workloads — like processing YouTube videos — to prioritize clean power and avoid dirtier energy. It has also shifted computing load to help utilities manage grid emergencies.
Until recently, it hadn’t messed with power demand for machine learning. This sort of flexibility is new territory for utilities and grid operators, too: It’s not standard practice to allow large customers to come online only if they agree to curtail their power use, Norris pointed out.
“We’ve never planned loads this way, essentially for the entire history of the electric utility industry,” Norris said. But without this kind of approach, “it’s effectively impossible to see how some of these load forecasts can be met with purely physical infrastructure building.”
Flexible data centers like Google’s may have a significant advantage in getting those all-important electrons in the near term. A more rigid project may have to wait years to come online as grid infrastructure and power plants are built; a flexible data center, meanwhile, could be fast-tracked for interconnection using the grid capacity that’s already available.
The solution has its limitations, Terrell told Canary Media in an interview, but where it makes sense, it can be a powerful tool.
“We can’t do it everywhere. Some of our loads can’t be curtailed,” Terrell said. But where Google is able to do it, “there’s value to being able to secure capacity without having to wait for new infrastructure.”
The grid is crowded — but there’s plenty of room for data centers that can be flexible. That’s what Norris and a team of researchers at Duke University concluded in a February report.
The analysis found nearly 100 gigawatts of existing capacity on U.S. grids for data centers that can commit to 0.5% “annual average load curtailment.” In practice, that means curtailing less than half of their total power use for about two hours at a time, during peak demand events that add up to roughly 100 hours per year.
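The curtailment figure is easy to sanity-check. A back-of-the-envelope calculation, using the roughly 100 event hours cited in the report (illustrative arithmetic only, not taken from the report itself):

```python
# Back-of-the-envelope check of the Duke study's 0.5% "annual average
# load curtailment" figure. Illustrative arithmetic, not the report's model.
HOURS_PER_YEAR = 8760

# Peak demand events assumed to total ~100 hours per year.
event_hours = 100

# For annual curtailed energy to equal 0.5% of total consumption, the
# fraction of load shed during those events must satisfy:
#   shed_fraction * event_hours / HOURS_PER_YEAR = 0.005
shed_fraction = 0.005 * HOURS_PER_YEAR / event_hours

print(f"Load shed during events: {shed_fraction:.0%}")  # ≈ 44%, i.e. less than half
```

The result, about 44%, matches the report’s characterization: shedding less than half of a facility’s load during those ~100 hours keeps curtailed energy to just 0.5% of the year’s total.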
There’s a simple explanation for this spare space: Grids and power plants are overbuilt to meet peak demands, or “worst-case conditions,” as Norris put it. But if data centers agree to avoid using power during those moments, it can obviate the need to expand the grid further to serve new, higher peaks.
It’s not a new idea in principle. Utilities have paid customers to reduce power use during peak demand for decades. But existing “demand response” programs tap current customers to help prevent emergencies for the grid as it is built today, Norris explained.
Google’s new deals with I&M and TVA, by contrast, are aimed at managing growing demand “in the form of a definitive long-term contract the utility can use for planning purposes,” he said. In other words, instead of using demand response to manage existing power needs, the utilities and Google are now wielding this approach to allow new users to come online. “That’s what sets it apart.”
Norris isn’t aware of any other data center-utility projects that are taking this longer-term planning view. In fact, “most data center developers won’t even release the nameplate megawatt scale of the facility,” he said — a feature of the highly competitive AI race.
In part because of this secrecy, it’s unclear how data center growth will play out in the real world. Most projections are based on speculative requests from developers seeking power in multiple locations for projects that may or may not end up being built.
But forecasts of data center growth generally indicate that they’re set to overwhelm the grid.
A December report from consultancy Grid Strategies found that five-year growth forecasts for U.S. utilities and grid operators have quintupled between 2022 and 2024, with data center hot spots such as Virginia, Georgia, Texas, and swaths of the Midwest particularly impacted. The past month has seen utilities in California, Colorado, New Jersey, and Pennsylvania report gigawatts of new data center requests.
That’s going to drive up utility rates, which are already rising due to a number of factors, including expensive investments in grid maintenance and expansion. While it can take time for the costs of accommodating new data centers to arrive on customers’ utility bills, the sheer scale of that expansion means that “the affordability concerns here are being put into stark focus,” Norris said.
Those future costs are starting to pile up.
Georgia Power won regulatory approval earlier this year to move ahead with plans for a controversial and unprecedentedly rapid buildout of power plants, almost entirely based on huge, uncertain forecasts of data center growth. The company has filed a more than $15 billion proposal revealing that much of its new infrastructure will be gas-fired. Virginia utility Dominion Energy is pushing for a similarly massive investment in fossil-fueled power to serve the world’s highest concentration of data centers. And Louisiana regulators last week approved utility Entergy’s plan to spend billions of dollars on gas-fired power plants and grid investments to serve a $10 billion data center from Meta.
In some states, customers are already paying more for energy because of data centers. PJM Interconnection, the grid operator serving Washington, D.C., and 13 states from Illinois to Virginia, has seen prices skyrocket in the past year in its capacity market, which pays power plants to be available when the grid needs them. PJM’s inability to bring new generation online is a chief culprit. But its ballooning future demand, another important part of the equation, is “almost entirely due to existing and projected data center load additions,” according to PJM’s independent market monitor.
Norris argued that utilities, regulators, and grid operators must start demanding that would-be data centers commit to some level of flexibility to receive grid interconnection. “If you’re planning for all new loads to be inflexible and serving them with firm [supply] at all hours of the year, that’s going to be extraordinarily expensive,” he said.
While data centers could build their own power supplies, “we also don’t want them to be running the backup [diesel generators] 200 hours a year to get online faster,” Norris said. Reliance on polluting on-site generators is already a problem for communities in Memphis, Tennessee, which are protesting the use of hundreds of megawatts of gas-fired turbines at a data center built by Elon Musk’s xAI.
Google’s approach of managing its data center power use to reduce carbon emissions represents a much cleaner alternative. “The capabilities we developed to do load shifting for carbon, we use the same capabilities to do demand response,” Terrell said.
Google’s agreement with TVA applies to existing data centers “north of Nashville and in North Alabama. We need to grow, but [TVA was] not in a position to serve us” in the short term, Terrell said. TVA has not released details of its agreement with Google, and Terrell declined to provide more specifics.
Google’s agreement with I&M centers on the tech giant’s $2 billion data center in Fort Wayne, Indiana, which started operations late last year but expects to ramp up its power needs over time, Terrell said.
In broad terms, the plan commits Google to restraining power use at its Fort Wayne data center during critical hours, and to transferring to I&M credits for a portion of the carbon-free energy it has contracted for in the region, which will help the utility meet its capacity requirements. “We need to be bringing new resources onto the system,” Terrell said.
Many of the details of I&M and Google’s plan filed with state regulators last month have been redacted for confidentiality reasons, which has raised concerns from consumer advocates. But the proposal does appear to align with new regulations aimed at controlling data center costs.
Earlier this year, Indiana utility regulators approved a settlement between I&M, data center developers, and consumer advocates that set new requirements for “large loads” — namely data centers — to commit to covering a significant portion of the costs they incur. The goal is to avoid forcing customers to pay higher bills for decades due to investments made to meet data centers’ needs.
More such rules are coming. Ohio regulators in July approved a similar settlement agreement, and a broader energy law passed in Texas this year will require large data centers to reduce power use during grid emergencies. PJM launched a fast-tracked effort this month to create new large-load interconnection rules, and Southwest Power Pool, a grid operator serving 14 Midwest and Great Plains states, plans to streamline connection for data centers and other big facilities that can commit to flexible operations or to providing their own power.
Data center operators have traditionally shied away from altering operations to save or shift energy, said Astrid Atkinson, CEO of grid-software startup Camus Energy. That makes sense, given the high value of the computing they’re doing, a tradeoff Atkinson knows firsthand from her previous role leading the Google teams that kept computing reliable at data centers serving web services and social media.
But data centers training AI models have more flexibility than those providing time-sensitive or business-critical services like processing financial transactions, she said. “They’ll be running large training jobs that use up a lot of their nameplate capacity for a period of time, but then they may sit idle for a long period of time,” she said. “You can potentially move them around in time a little bit.”
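That kind of temporal flexibility can be pictured as a simple scheduling rule: delay flexible training jobs until they fall entirely outside peak hours, while leaving time-sensitive workloads alone. A minimal sketch, with hypothetical job and peak-hour data (this is not Google’s actual scheduler):

```python
# Sketch of shifting flexible training jobs out of grid peak hours.
# The peak window and job list are hypothetical, for illustration only.
peak_hours = {17, 18, 19}  # utility-declared peak hours (hour of day)

jobs = [
    {"name": "train-run-a", "start": 16, "duration": 3, "flexible": True},
    {"name": "serve-web",   "start": 17, "duration": 2, "flexible": False},
]

def shift_out_of_peak(job):
    """Delay a flexible job's start until its whole run avoids peak hours."""
    if not job["flexible"]:
        return job  # time-sensitive workloads run as requested
    start = job["start"]
    while any((start + h) % 24 in peak_hours for h in range(job["duration"])):
        start = (start + 1) % 24  # push back one hour and re-check
    return {**job, "start": start}

scheduled = [shift_out_of_peak(j) for j in jobs]
for job in scheduled:
    print(job["name"], "starts at hour", job["start"])
```

Here the flexible training job, originally overlapping the peak window, slides to a later start, while the inflexible serving workload keeps its original schedule.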
Camus Energy is already working on projects to enable flexible EV charging, but much of its recent work with utilities centers on managing new data centers, she said. “If it makes the difference in being able to build or expand a site now, or having to wait five years, that makes it worth doing.”
Indeed, some other utilities and data center operators are exploring grid flexibility. The Electric Power Research Institute, a largely utility-funded nonprofit, last year launched its DCFlex initiative, a collaboration that’s testing flexible computing at sites including an Oracle data center in Arizona and a Google data center in North Carolina.
And although most regulated utilities lack incentives to work on flexible interconnection — they earn guaranteed profits based on how much money they spend on grids and new power plants — Norris thinks the surge in demand could change their calculus. There’s only so much cost regulated utilities can put on their customers before regulators are forced to intervene, and the AI boom is testing those limits.
“The more they’re looking at big upgrades and infrastructure investments, the more they need to balance that with affordability,” Norris said.
Terrell said that Google is “having more conversations with utilities” about data center flexibility, though he declined to provide further details. “It’s an advantage for our business to be able to go to utilities and offer this.”