Lots of costly and dangerous things can go wrong with high-voltage transmission lines.
Strong winds or equipment failures can cause individual lines on a tower to contact one another and short-circuit, or even break entirely. Lines can sag hazardously near trees or the ground, either due to overheating or being caked in ice. They can be damaged by wildfire smoke, windblown debris, or the simple wear and tear of time.
Sensors can help detect such threats. But it’s expensive and time-consuming to install, connect, and maintain those devices along power lines crossing remote plains, forests, and mountains.
It would be a lot cheaper and faster for utilities if they could simply use existing infrastructure to detect these issues instead of tacking on new sensors. That’s the route that Prisma Photonics has taken.
The Israel-based company has plugged its technology into the fiber-optic communications cables strung alongside thousands of miles of transmission lines in Israel, Europe, and the U.S.
That technology has been able to identify the precise locations of problems causing power outages, including ice buildup on power lines and nearby wildfires, all by “using the fiber as a microphone, or as an array of thousands of small microphones,” Eran Inbar, Prisma Photonics’ CEO, explained. The firm’s product has even picked up on the explosion of a meteor in the Earth’s upper atmosphere.
For decades, utilities have strung “optical ground wire” fiber-optic cables atop transmission towers, both to protect high-voltage wires from lightning strikes and to provide telecommunications for internal or third-party use. Both globally and in the U.S., more than half of the transmission system is outfitted with these cables, Inbar said — and that coverage is only growing as more power lines are built.
That opens up a huge market for Prisma Photonics, as long as it can prove its technology is accurate enough to replace purpose-built sensors for a growing number of tasks. The more jobs Prisma’s technology can do, the more money it could potentially save transmission grid operators, Inbar said.
Finding ways to do more with less money and time is important for transmission owners striving to solve multiple challenges at once. Climate change-driven heat waves and winter freezes are causing more grid emergencies, both by increasing demand for electricity for air conditioning and heating and by subjecting transmission networks to increased stress.
“The grid is a very important market — but it was also a blue ocean. There were no other companies playing in this field,” said Inbar, a physicist by training who spent 20 years in the field of lasers for semiconductor and mobile technology markets before selling his company in 2014 and launching Prisma Photonics in 2017. “Can we detect short circuits? Can we detect partial discharge? Can we detect wildfire? Can we detect wind for dynamic line rating?”
Prisma’s “optical interrogator” devices plug into substations, where multiple transmission lines and the fiber-optic cables that run atop them converge. From these central points, its devices send pulses of light down an optical fiber, then capture and analyze infinitesimal reflections cast back from points along the fiber.
This is one of several methods of using fiber optics as sensors to detect shifts in temperature, pressure, strain on the cables, and other signals, Inbar said. “It’s a very small change — but if you’re using a precise optical method, you can measure it.”
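The localization half of that trick rests on simple time-of-flight arithmetic: light travels through silica fiber at a known speed, so the delay between firing a pulse and receiving a reflection pins down where along the cable the reflection originated. The sketch below illustrates that general principle of optical reflectometry, not Prisma’s proprietary processing; the refractive index is a typical assumed value.

```python
# Time-of-flight arithmetic behind optical reflectometry in general;
# Prisma's actual processing is proprietary and far more sophisticated.

C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
FIBER_INDEX = 1.468        # assumed refractive index of silica fiber

def event_distance_km(round_trip_s: float) -> float:
    """Distance along the fiber to a reflection, given its round-trip delay.

    The pulse travels out and the backscatter travels back, so the
    one-way distance is half the total path.
    """
    speed_in_fiber = C_VACUUM / FIBER_INDEX   # roughly 204,000 km/s
    return speed_in_fiber * round_trip_s / 2.0 / 1000.0

# A reflection arriving 500 microseconds after the pulse left the
# substation originated roughly 51 km down the line.
print(f"{event_distance_km(500e-6):.1f} km")
```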
Connecting these nearly imperceptible variations to changes in the surrounding environment takes a ton of data, some complicated machine-learning algorithms to convert it into usable information, and a lot of real-world cross-checking, he said. One of Prisma’s first tasks was to eliminate the false positives — alerts of events that didn’t actually happen — that have stymied earlier efforts to use already-deployed fiber-optic cables to sense environmental conditions.
“You have to be highly committed to data collection,” he said. “There’s no way to do it in the lab — you have to go to the grid and collect data over months or, in our case, years.”
But the more Prisma’s systems are used in the field, the more confident customers can be in correlating their measurements with real-world events. After a yearslong deployment with Prisma investor Israel Electric Corp. and other pilot projects with the New York Power Authority and as-yet-unnamed European grid operators, “we don’t have to do months and years of data collection” when deploying with its next utility partners, Inbar said. “We can go to work immediately.”
That’s the goal of Prisma’s latest project with Great River Energy, a rural electric cooperative operating transmission lines across Minnesota and Wisconsin. The initial project is targeting about 90 miles of power lines, all being monitored from computers operating inside substations.
“There is a huge impetus for reliability on the electric system,” said Michael Craig, Great River Energy’s manager of energy management systems. At the same time, “every time we’re looking to spend money, we have to justify that it’s going to be worth it to everybody in our membership. With Prisma, there were a couple of things that made it easier for us.”
“First, it’s using existing assets,” he said, referring to the fiber-optic cables already deployed on the 90 miles of line the co-op is monitoring. “We didn’t have to do this big buildout. It’s just going into the substation and doing the work” to connect Prisma’s equipment to the fiber-optic system.
Second, Prisma’s technology could offer a lower-cost way to deal with grid faults, Craig said. A bewildering array of technologies and techniques goes into determining just where and how power flows have been interrupted along high-voltage power lines. Generally speaking, more precise approaches require more costly technologies, including sensors installed on transmission lines themselves.
Being able to tap into the continuous sensor of a fiber-optic line could help. “If there is an outage, maybe we’ll be able to detect what happened before we get out there,” he said. “Hopefully we can detect things more quickly. We hope it can allow us to be preventative, rather than reactive.”
Third, Prisma’s technology can potentially do many things at once, Craig said. One option is to use it for “dynamic line rating” — determining how much power transmission lines are able to safely carry under different temperatures and wind speeds, a process which can expand the capacity of existing grids without costly upgrades. Great River Energy is already testing dynamic line rating sensors installed on power lines but is eager to explore other approaches, he said.
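To make the concept concrete: production dynamic line rating systems implement a detailed conductor heat balance, such as the IEEE 738 standard, solving for the largest current whose resistive heating the weather can carry away. The toy sketch below captures only the shape of that calculation; every coefficient in it is an illustrative placeholder rather than a standard value.

```python
import math

# Toy steady-state heat balance for a transmission conductor, in the
# spirit of (but much simpler than) the IEEE 738 method real dynamic
# line rating systems use. All coefficients are illustrative placeholders.

def ampacity_amps(wind_speed_ms: float, ambient_c: float,
                  max_conductor_c: float = 75.0,
                  resistance_ohm_per_m: float = 7.3e-5,
                  solar_gain_w_per_m: float = 15.0) -> float:
    """Largest current whose I^2*R heating the ambient conditions can shed."""
    delta_t = max_conductor_c - ambient_c
    # Convective cooling grows with wind; the baseline and sqrt term are
    # crude stand-ins for the standard's forced-convection correlations.
    convective = (3.0 + 10.0 * math.sqrt(max(wind_speed_ms, 0.0))) * delta_t
    radiative = 0.2 * delta_t                               # linearized radiation
    cooling = convective + radiative - solar_gain_w_per_m   # watts per meter
    return math.sqrt(max(cooling, 0.0) / resistance_ohm_per_m)

print(f"still air: {ampacity_amps(0.0, 35.0):.0f} A")   # ~1,240 A
print(f"5 m/s wind: {ampacity_amps(5.0, 35.0):.0f} A")  # ~3,700 A
```

Even this crude version shows why utilities are interested: a steady breeze lets the same conductor carry far more current, capacity that a fiber-based wind estimate could unlock without building anything new.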
At the same time, Craig said, Great River Energy is exploring the use of cameras in remote areas to detect wildfires — another pressing concern for power grid operators. With Prisma, “instead of installing cameras and having this one system, we can use the existing fiber as a sensor. That’s a cool way to do it.”
Inbar stressed that Prisma Photonics is still in the process of gaining real-world experience to prove how useful it can be for different tasks. Utilities must wait for bad things like faults or wildfires to happen to be able to check the technology’s accuracy, for example. Inbar added that Prisma plans to evaluate its dynamic line rating capabilities through tests hosted by the Electric Power Research Institute, a nonprofit power-sector research group considered the gold standard for utilities.
Whatever the use case, “one of the benefits of our solution is that it’s a platform,” he said. “It’s not like we’re developing a sensor for dynamic line rating or wildfire detection. It’s a platform that can collect very sensitive data on the grid, and eventually we can improve it over time — and build additional use cases.”
Antonio Baclig spent eight years as a researcher at Stanford University combing through as many battery designs as he could find in search of something cheap enough to transform the grid. He homed in on one from the 1980s that stores energy with iron and table salt, and founded the startup Inlyte Energy in 2021 to commercialize it anew.
“Our goal is solar-plus-storage baseload power that costs less than fossil fuels,” Baclig said.
Now Inlyte has secured its first major utility contract, a crucial step in proving the viability of the technology.
Southern Co., which owns the biggest utilities in Alabama, Georgia, and Mississippi, has agreed to install an 80-kilowatt/1.5-megawatt-hour Inlyte demonstration project near Birmingham, Alabama, by the end of the year. Utilities need to see new technologies work in the field before they take a chance on large-scale installations, so this project marks a necessary, but still early, stage in Inlyte’s commercialization.
New types of batteries are notoriously difficult to bring from the lab to large-scale production. Many startups have toiled at this task for years, pitching anyone who will listen about the superiority of their technology. None have come close to unseating the dominant lithium-ion battery designs that have plummeted in cost over the last decade as China massively scaled up production. But researchers have concluded that lithium-ion batteries can never get cheap enough for the mass deployments of storage that will be needed to run a grid dominated by renewable energy.
The onus is on Inlyte, then, to avoid the lackluster fate of its peers and prove its exceptionality among the ragtag camp of lithium-ion alternatives. The company has three important things going for it: dirt-cheap cost of materials, a simpler-than-usual manufacturing process, and system-level round-trip efficiency on par with lithium-ion battery systems. (Round-trip efficiency is a metric for how much of the electricity stored in a battery can later be recovered. The technologies challenging lithium-ion tend to fare poorly on this front.) Plus, the work of researchers in prior decades has already helped speed Inlyte’s path to market.
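For a concrete sense of the metric, here is a tiny worked example with hypothetical numbers (lithium-ion systems are often cited in the mid-80s to low-90s percent range):

```python
# Round-trip efficiency: the share of charging energy recovered on
# discharge. The figures below are hypothetical, purely illustrative.
energy_in_mwh = 1.50    # energy drawn from the grid to charge
energy_out_mwh = 1.28   # energy delivered back to the grid later
print(f"{energy_out_mwh / energy_in_mwh:.0%} round-trip efficiency")  # 85%
```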
Some battery-startup founders have spent years toiling away in a lab on a favored chemistry, only to spend more years figuring out how to turn it into a viable product. Baclig, a materials scientist, surveyed the annals of battery science and plucked something off the proverbial shelf that had almost hit the big time but not quite.
He landed upon the family of sodium metal halide batteries, first developed in the late 1970s. A British firm called Beta Research explored iron-sodium batteries but in 1987 pivoted to nickel-sodium because its superior energy density made it more promising for electric vehicles.
The sodium-nickel-chloride chemistry became known in battery industry lore as the ZEBRA battery, because it was developed by a group called Zeolite Battery Research Africa. It got some traction in the 1990s: Daimler-Benz built cars with this kind of battery and test-drove them for more than 60,000 miles. A European company called Horien still manufactures the battery for specialized uses, like a NATO rescue submarine and uninterruptible power supplies at industrial facilities that can’t afford to go dark for even an instant.
Baclig contends that the historical abandonment of iron-salt chemistries did not reflect an intrinsic failing in the technology, just a different set of needs at the time. Today, with record solar panel installations reshaping electricity systems around the globe, there’s growing interest in cheap, long-term energy storage. And power plants don’t need to cram as much energy into a confined space as electric vehicles do.
“We have to focus this on cost now. It’s not [primarily] about energy density,” Baclig said. With those new parameters, putting the iron back into the battery might just work.
Baclig reached out to the company that had pioneered the technology in the first place: Beta Research. That firm was looking for a new project to focus on as the Covid pandemic receded, Baclig said; after a year of conversations, he and Beta Research decided to join forces in 2022. Inlyte thus pulled off a rare feat among climatetech startups, or any kind of startup: successfully conducting an international acquisition before it had even raised seed funding. The startup subsequently closed an $8 million seed round in 2023.
Since then, the team has worked at a steady clip to dial in the best iron cathode for the grid storage job. They also scaled up the size of each cell, which, unlike in lithium-ion batteries, takes the form of a ceramic tube that gets filled with powdered iron and salt. The new tubes hold 20 times the energy of the previous, EV-oriented cells.
From there, Inlyte set about testing its new battery cells, culminating in a recent third-party engineering test of a 100-cell module. Engineers typically have to tinker and improve a newfangled battery to unlock the desired level of performance. In this case, Baclig said, “That was our first module, and it just worked. We’re building on something that has a long track record, so we don’t have to reinvent.”
The chance to innovate on a legacy design attracted the firm’s chief commercial officer, Ben Kaun, who spent many years analyzing alternative grid storage concepts for the Electric Power Research Institute, a nonprofit research arm of the U.S. utility industry.
“It takes a long time to take a technology from the lab to deployment — there’s a lot of layers of scale-up and integration,” Kaun said. “It was appealing to me how much of that had been worked out with [Inlyte’s battery].”
Southern Co. will install and operate the first large-scale Inlyte battery system for at least a year as part of its ongoing efforts to test emergent long-duration storage technologies in a real-world environment.
The utility company was attracted to Inlyte’s low fire risk (unlike conventional lithium-ion batteries, it does not rely on flammable electrolytes) and its ability to be sourced domestically, Southern Co. R&D manager Steve Baxley noted in an email.
“This system has the potential to be cost-competitive with lithium-ion batteries, particularly for longer durations,” he said.
The company’s subsidiary Georgia Power previously signed a landmark deal to test out another iron-based long-duration battery, from Form Energy, starting in 2026. (Baxley confirmed that project is still being developed.)
Researchers from the Electric Power Research Institute, Kaun’s former employer, will document the results of the Inlyte installation and share them with utilities around the country.
“A lot of companies will share the same learnings, so we don’t have to do the same pilot over and over again in every service territory,” Kaun said.
Of course, habitually risk-averse utilities often prefer to test-drive new technologies in their own backyard, even if that duplicates efforts elsewhere. Many utilities continued to tiptoe into lithium-ion battery installations even after the batteries had been operating for years on a massive scale in other parts of the country.
Baclig, for his part, hinted at many more trial runs in the works for next year. These projects will be doable because Inlyte gained possession of a pilot-scale factory in the U.K. as part of the Beta Research acquisition. That facility can pump out megawatt-hour-sized volumes for early test projects, but it won’t keep up if Inlyte starts closing commercial deals.
Baclig has begun seeking a location for a factory in the U.S. Building a first-of-a-kind factory can be risky, but he stressed that four factories have been set up around the world for essentially the same technology, and the Beta Research team advised on all of them. The plan is to build at the same scale as those previous facilities, to minimize uncertainty around factory economics.
“It’s not quite Intel’s ‘copy exact,’” Kaun said, referring to the pioneering microchip firm’s famous approach to replicating its factory designs. “But it’s ‘copy very similar.’”
Filling ceramic tubes with metal powders doesn’t require the same pinpoint precision as a lithium-ion battery factory. When companies construct lithium-ion factories in the U.S., they have tended to cost at least $1 billion; the capital cost for a full-scale Inlyte factory should be a small fraction of that, Kaun noted.
Furthermore, the machinery to manufacture this unique battery does not come from China’s dominant battery sector, a boon at a time when the Trump administration’s tariffs are driving up prices on Chinese imports (even the equipment needed to build factories in America).
Going forward, Inlyte will need to move from field demonstration to customer contracts, and the company is focused on buyers who need power every day but also have occasional long-term backup requirements.
Inlyte is pursuing utilities like Southern Co., which must deliver power to a fast-growing region while surviving hurricanes and other extreme weather. The startup also has a dedicated pitch for providing data centers with backup energy: The long-lasting iron-sodium batteries can ostensibly replace both the instant response from uninterruptible power supply systems and the diesel generators that would kick in until power is restored. And the batteries could run every day to lower a data center’s demand from the grid.
Convincing data center owners to adopt Inlyte’s product will not be trivial, but that sector is struggling to find the power capacity to fuel its growth, not to mention maintain corporate commitments to sourcing clean electricity. If Inlyte can really deliver clean, long-lasting power that’s cheaper than fossil-fueled alternatives, it would almost certainly find willing takers with the ability to pay.
Illinois’s ambitious clean energy transition, which mandates a phaseout of fossil-fuel power by 2045, depends on adding large amounts of energy storage to the grid. This is especially true now with the proliferation of data centers. Utility-scale battery installations will be key to ensuring that renewables — along with the state’s existing nuclear fleet — can meet electricity demand.
That’s why energy companies and advocates are racing to get legislation passed that incentivizes the addition of battery storage on the grid, before the state legislative session ends May 31.
On May 1, a state regulatory commission released a report outlining its recommendations for a summer procurement of grid-connected battery storage by the Illinois Power Agency, which procures power on behalf of utilities ComEd and Ameren.
Clean energy industry leaders and advocates have been pushing for storage incentives for years and were disappointed that such provisions were not included in the 2021 Climate and Equitable Jobs Act.
In a January lame-duck session, the legislature passed a narrow bill that ordered the Illinois Commerce Commission — which regulates utilities — to hold stakeholder workshops to study grid storage capacity needs and possible incentive structures. The resulting report is meant to inform legislation that backers hope will pass this spring and lead to the storage procurement this summer.
In the report, the commission noted that energy storage would reduce prices, increase grid reliability and resilience, avoid costly grid upgrades and power plant construction, facilitate renewable energy deployment, and create “macroeconomic benefits” like jobs and investment in local infrastructure.
Jeff Danielson, vice president of advocacy for the Clean Grid Alliance, whose members include renewable power and battery storage developers, said the plans are long overdue.
“Wind and solar are important, but for the grid itself to be holistically sustainable requires battery storage,” Danielson said. “Battery storage has value. It’s time for Illinois to add this tool in its toolbox for a sustainable grid.”
The report recommends that the Illinois Power Agency do an initial procurement for 1,038 megawatts of grid-connected storage this summer — a total that the commission says should include 588 MW in the PJM Interconnection regional transmission organization’s territory in northern Illinois and 450 MW in the territory managed by the Midcontinent Independent System Operator. Additional procurements by the end of 2026 should incentivize the construction of 3 gigawatts of storage to be in operation by 2030, the report says. And it calls for setting a second target for additional storage beyond 2030.
Advocates and industry groups said they are generally happy with the proposals, though energy storage and renewable industry leaders were asking for a 1,500-MW initial procurement and up to 15 GW of storage by 2035. The commission’s draft report had called for only 840 MW in an initial procurement, but after hearing public comments, it upped the amount in the final version. The industry also wants incentives for both stand-alone storage and storage paired with renewable energy, but the commission’s report recommends that the initial procurement only be for stand-alone batteries.
“It’s now up to lawmakers to meet the moment and provide both a short-term and long-term solution to high utility bills,” said Danielson. “Energy storage is the right answer, at the right time, for the right reasons.”
It’s crucial that legislation pass this year since storage developers are seeing increasing demand nationwide and deciding where to invest, said Samarth Medakkar, Illinois lead for Advanced Energy United, an industry group whose members include renewable and storage companies.
There are already gigawatts of proposed battery storage projects in Illinois that are waiting for approval from the Midcontinent Independent System Operator to interconnect to the grid. Those projects need funding to progress and meet deadlines set by the grid operator to stay in the queue, Medakkar explained.
“There’s competition — developers are looking at Illinois as a market, but they’re looking at other states as a market too,” Medakkar said. “We need to make these as least risky as possible. Procurement would give them confidence to make the payments to stay on course in the queue. We can send a signal to developers that if you make these nonrefundable payments, we will have a market for energy storage and you can bid your project into this market.”
A letter from storage and renewable developers to the chief of the Commerce Commission’s Public Utilities Bureau, offering comments on the draft report, noted that storage projects take seven to 10 years to develop, so the state needs to act soon to procure the grid battery capacity it wants online beyond the 2030 date discussed in the study.
“Developers across the country are facing a challenging federal environment, including newly announced tariffs,” the letter says. “As a result, many developers are now prioritizing their limited capital across fewer projects — focusing on states with established and supportive markets, and divesting from states that are not as far along.”
The commission’s report proposes incentivizing storage through a market for indexed storage credits, structured similarly to the state’s renewable energy credit program that has fueled a boom in solar power and, to a lesser degree, wind power. Under this design, the developer or owner of the storage is essentially allowed to sell credits for funds awarded by the state and collected from utility customers.
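The settlement mechanics, modeled loosely on New York’s design, work roughly like a two-way contract for differences: the project bids a strike price, and the credit covers the gap between that strike and an index of what the market actually paid. The numbers below are hypothetical, and the real formulas include more terms.

```python
# Hypothetical indexed storage credit settlement, sketched as a
# two-way contract for differences. Real program formulas are more
# involved; all numbers here are invented.
strike_price = 15.0      # $/kW-month the project bid to receive
reference_index = 11.5   # $/kW-month an index says the market paid
credit = strike_price - reference_index   # negative would mean a refund
print(f"${credit:.2f}/kW-month paid to the project")   # $3.50
```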
New York is the only other state with a storage credit market, according to experts. If Illinois passes legislation and launches the program this summer, it will be rolling out around the same time as the nascent program in New York, scheduled to hold its first procurement by the end of June. In other states, grid storage is supported through a structure known as tolling agreements, wherein utilities or other companies build and operate battery installations on the grid, and utility customers are essentially charged for their use.
In both models, residents pay for the new storage through their electric bills — just as they pay for renewables under Illinois’ existing renewable energy credit program. The Commerce Commission found that 3 GW of storage incentivized through credits would cost utility customers between 39 cents and $1.69 per month, though storage would also lead to bill savings by avoiding costlier investments in generation.
Danielson said battery storage developers prefer a tolling structure since it is a much more common and potentially more effective practice. It would be “pretty odd” if Illinois did not offer that option, he said, though ultimately, companies are eager to get legislation passed in whatever way possible.
“We’re not making a judgment about which one’s better. It just needs to be a choice,” Danielson said.
James Gignac, Midwest policy director of the Union of Concerned Scientists, said clean energy advocates are on the same page.
“I would be hopeful we can identify a way to use tolling agreements because the more options we can offer to the market, that means we’ll be getting more companies interested in proposing projects,” Gignac said. “That’s good for consumers and provides more competition. We may learn that the indexed storage credit approach is producing a certain type of project, and a tolling agreement could be offered to attract a different size of facility or different use case.”
Danielson noted that California, New York, and Texas have the largest amounts of on-grid storage in the country, and Illinois could be poised to join them.
“One thing those three states have in common is density of businesses and people,” Danielson said. “There is no good reason why Illinois should be lagging these other states in terms of these projects being built.”
The battery storage workshops this spring were “eerily similar to what we just did” in the leadup to CEJA, Illinois’ 2021 climate law, he continued. “For five years, these ideas have been studied and bandied about. Now demand is higher for sustainable power, the technology is better, [and] the costs are lower, which means Illinois leadership matters now more than ever.”
One subject of debate is whether the storage incentives should include the same focus on equity that has characterized Illinois’ existing clean energy laws, CEJA and the Future Energy Jobs Act before it. Workforce training and solar deployment programs created by these laws prioritize people and communities impacted by fossil-fuel power plants, the criminal justice system, and other indicators of inequity. The commission’s draft report recommended that storage procurement exclude such equity provisions, in part because battery storage jobs involve dangerous, high-voltage working conditions.
Members of the Illinois Clean Jobs Coalition objected, noting that solar and wind jobs also involve high voltage. In comments to the commission on behalf of clean energy groups, Gignac stated that solar and wind developers can request waivers under the state law if they can’t find equity-qualifying candidates for certain jobs; meanwhile, there is “no evidence” that equity-eligible employees and contractors would be unqualified for storage development.
The final recommendations encourage the same equity standards for storage development as for renewables, a change lauded by advocates.
“This will help ensure that Illinois is advancing equitable workforce opportunities in battery storage facilities alongside other clean energy technologies such as wind and solar,” said Gignac.
Clean energy advocates and industry representatives said they plan to encourage lawmakers to amend or introduce legislation based on the findings in the Commerce Commission’s report.
The Illinois Clean Jobs Coalition, which helped pass CEJA, is pushing for a new energy omnibus bill this legislative session. Members said they are hoping to work with industry to add storage-related language. Meanwhile, renewable and storage industry stakeholders are backing a bill that would require the Illinois Power Agency to procure energy storage totaling 15 GW online by 2035, and require utilities to charge customers to fund it. The bill would allow both credit and tolling incentive structures.
Samira Hanessian, energy policy director of the Illinois Environmental Council, said she is “cautiously optimistic” about a bill incentivizing storage passing this legislative session.
“I’m feeling a lot more positive around how storage is now coming up in most conversations with legislators and in our coordination spaces,” Hanessian said. “To me it’s become a very real policy issue that we are on track to address this session.”
Conversations about AI and the power grid tend to focus on the demands that the developing technology will place on the country’s aging energy infrastructure. But Josh Brumberger, CEO of Utilidata, has a vision for how AI can actually help the grid.
The Rhode Island-based grid technology company is working on what it calls “edge AI intelligence” — smart meters or grid control devices embedded with chipsets designed by leading AI chipmaker Nvidia. Those devices have the computational capacity to process massive amounts of data and make split-second decisions, enabling utilities to better manage increasingly complicated power grids.
On Tuesday, Utilidata announced $60.3 million in funding to expand production and deployment of this technology with a growing list of utility partners. The new round brings Utilidata’s total venture funding to $126.5 million and was led by Renown Capital Partners and joined by existing investor Keyframe Capital, as well as Nvidia and Quanta Services, a major utility grid, energy infrastructure, and data center engineering and services company.
“We want to make it as easy as possible for hardware manufacturers to embed AI and distributed intelligence into their devices,” Brumberger said. “There’s this concept that AI is going to be crucial as we go about developing our next-generation infrastructure — in this case, the power grid. We were kind of on the edges of those discussions a few years ago. Now we’re at the center.”
Utilidata and Nvidia began to jointly develop their technology in 2021. The next year, they launched a consortium of U.S. utilities, along with leading U.S. residential solar and battery installer Sunrun, to support its deployment.
Since then, the technology has been selected to play a role in several cutting-edge utility grid projects.
In 2023, Portland General Electric in Oregon landed a Department of Energy grant to deploy Utilidata and Nvidia’s “grid-edge computing platform” to support its long-running effort to integrate batteries, EVs, and community solar into its grid. Pennsylvania utility Duquesne Light won a similar DOE grant to use the devices to collect data to better assess and mitigate threats to the grid from climate change and extreme weather. And Consumers Energy in Michigan launched a project with Utilidata last year that uses the technology to determine the grid impacts of home EV charging.
The projects share some common features, Brumberger said. They involve collecting massive amounts of data, such as subsecond readings of the voltage and frequency of power flowing through utilities’ distribution grids. That data must then be processed by integrated circuits running hefty mathematical calculations before grid operators can make use of it.
The in-the-field computers must also be reprogrammable to perform a shifting variety of tasks, rather than “hard-wired” for only a preset range of duties, he said. And those tasks may require autonomous decision-making, like coordinating utility and customer-controlled devices to respond to changing grid circumstances, which is possible only with technology capable of acting faster than traditional low-bandwidth wireless communications from central utility control rooms.
Utilidata, which got its start providing grid voltage control equipment to the utility industry, restructured its business in 2020 to focus on these kinds of “grid-edge” challenges. The goal was to develop new versions of long-standing utility technologies that simply don’t have the speed necessary for the modern grid.
Take the more than 100 million smart meters deployed across the U.S. since the mid-2000s. Those meters — essentially stripped-down, weatherproofed computers linked via wireless networks — are collecting energy billing data, alerting utilities to power outages, and enabling some basic grid control capabilities today.
But the chipsets in those earlier generations of advanced metering infrastructure — AMI 1.0, in utility parlance — were designed to do a preset list of tasks and to be cheap enough to be deployed in the millions. The underlying computing technology has gotten both cheaper and better since then.
Major smart-meter vendors such as Itron and Landis+Gyr have been boosting the capabilities of their latest “AMI 2.0” systems to carry out increasingly complex activities. Utilidata and Nvidia claim that their technology platform, dubbed “Karman,” exceeds the capabilities of its peers in the field, though its price point is likely higher too. (AMI vendors tend not to disclose per-unit prices, and cost and pricing vary widely depending on order volumes and vagaries of industry demand.)
Utilities take a long time to move from testing a technology to deploying it at large scale. Utilidata’s earliest pilot projects, launched in 2022, embedded Nvidia chips in devices that attach to existing meters. Last year, the company signed a deal with Aclara, a longtime smart meter manufacturer and subsidiary of electronics giant Hubbell, to develop an integrated smart meter using the Karman platform.
Utilidata and Nvidia’s projects with Portland General Electric, Duquesne Light, and other utility partners aren’t going to completely replace those utilities’ existing smart meters — at least, not right away. Instead, these initial projects are tied to strategic deployments at parts of the grid where the utilities are seeking more granular information, Brumberger said.
One major area of interest is in assessing the grid impacts of rooftop solar systems, backup batteries, EV chargers, and other so-called distributed energy resources, he said. A growing number of utilities are looking for ways to enlist these kinds of devices to reduce strains on their power grids — say, by instructing batteries to store solar power at midday to release it later when it’s more valuable to the grid, or by coordinating when EV charging happens to avoid overloading local grid infrastructure when everyone charges at once.
These virtual power plants (VPPs) or distributed energy resource management systems (DERMS) can sometimes be handled in a command-and-control fashion by utility grid operations centers communicating to in-field devices via hard-wired, cellular, or broadband internet. But more advanced tasks require complex computations of local grid conditions and real-time communications between multiple local devices — exactly what Karman was designed to handle, Brumberger said.
“How can you have a VPP that’s, from a capacity perspective, as big and as reliable as a gas-fired plant, without accelerated computing and AI? You’ve got so many disparate resources that have so much untapped value that we ultimately have to unlock,” he said.
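What acting faster than the control room can look like in practice is a local feedback rule running on the device itself. The sketch below shows one generic pattern, a volt-watt droop response for a home battery; it illustrates edge autonomy in general and is not Utilidata’s actual Karman logic, which is proprietary.

```python
# Generic volt-watt droop rule a grid-edge device could run locally,
# without waiting on a central control room. Not Karman's actual logic;
# the nominal voltage, deadband, and limits are illustrative.

NOMINAL_V = 240.0

def battery_setpoint_kw(measured_volts: float, max_kw: float = 5.0) -> float:
    """Absorb power when local voltage runs high (say, midday solar
    backfeed); inject power when it sags. Positive = charge."""
    deviation = (measured_volts - NOMINAL_V) / NOMINAL_V
    # Saturate at full power for a +/-5% voltage deviation.
    response = max(-1.0, min(1.0, deviation / 0.05))
    return response * max_kw

print(f"{battery_setpoint_kw(249.6):+.1f} kW")  # +4.0 kW: soak up excess solar
print(f"{battery_setpoint_kw(232.8):+.1f} kW")  # -3.0 kW: support sagging voltage
```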
Portland General Electric, which is planning to rely on distributed energy resources for a significant chunk of its future grid needs, sees technologies like Karman as a way to better understand the reliability of VPPs and DERMS, Ananth Sundaram, the utility’s senior manager of integrated grid, told Canary Media in a 2023 interview.
“We can look at grid services, we can look at disaggregation of the power, and what customer behavior and customer signatures we can track,” he said. “That will not only provide us a solid platform for serving our clients, it also helps us harvest massive amounts of data we need to understand what exactly is happening on the grid edge.”
Utilidata is hoping that utilities and regulators will keep these future needs in mind when planning the next cycle of large-scale smart meter deployments. Brumberger noted that Quanta, a new investor in the latest funding round, is a key partner in many large-scale utility infrastructure and smart meter deployments.
Utilidata’s near-term deployments are also dependent on the Trump administration preserving the DOE grants supporting its first-of-a-kind utility projects. The administration has frozen and threatened to end federal climate and energy funding approved during the Biden administration, as well as to eliminate large swaths of the federal workforce, including DOE offices that manage these grant programs.
“We have not received any word that those projects are not happening,” Brumberger said about the grants. “If you sort of peel back the layers of our project, at its core, it’s next-generation AI infrastructure. That theme does seem central to this administration, when they talk about winning the AI race, about hardening our critical infrastructure.”
The latest round of funding will allow Utilidata to expand to new markets, both outside the U.S. and outside the utility grid, Brumberger said. In particular, it’s exploring the prospects for embedding its Nvidia-enabled distributed energy control devices within data centers themselves, enabling them to better understand and control power usage down to the server level, he said.
“We think of data centers as incredibly powerful microgrids,” he said — a perspective shared by data center developers adding generators, batteries, and energy management controls to their massive installations.
Utilidata and Nvidia’s computing platforms will also analyze and “train” on the data they collect, much like large language models (LLMs) “train” on massive amounts of human-generated text and images, Brumberger noted. That data might include things like differences between the grid voltage signals that accompany power disruptions caused by people turning things on and off in their homes and those caused by external impacts like tree branches hitting power lines.
“It’s no longer just a sensor, but a little hub of activity where you can train locally, so not every piece of information needs to leave the site,” he said. “The question is going to be, is this the kind of tech you need on 10% of your territory, on 50%, on 80%, or 100%?”
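One way to picture the signature idea is a toy heuristic like the one below, which separates events by the size and duration of a voltage excursion. The features, thresholds, and labels are all invented for illustration; production systems would learn such signatures from large volumes of field data rather than from hand-coded rules.

```python
import numpy as np

# Toy classifier for grid-edge voltage disturbances. Everything here
# (features, thresholds, labels) is invented for illustration only.

def classify_disturbance(volts_rms: np.ndarray, sample_hz: int = 100) -> str:
    """Crude heuristic: appliance switching tends to leave a modest but
    sustained step in voltage, while an external hit (a branch striking
    a line, say) shows up as a large, short-lived transient."""
    baseline = np.median(volts_rms)
    deviation = np.abs(volts_rms - baseline)
    if deviation.max() < 2.0:                          # volts
        return "normal"
    excursion_s = (deviation > 2.0).sum() / sample_hz  # how long it lasted
    return "load switching" if excursion_s > 0.5 else "external transient"

# A 3-second trace with a brief 8-volt spike reads as an external event.
trace = np.full(300, 240.0)
trace[120:140] += 8.0
print(classify_disturbance(trace))   # external transient
```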
As electric and gas bills rise across the country, a poll released today finds that an overwhelming majority of people in the U.S. are concerned about growing energy costs — and experiencing greater financial stress because of them.
In a nationwide survey of about 2,000 adults, conducted by the consumer education nonprofit PowerLines and the polling company Ipsos in late March, 73% of respondents reported feeling concerned about rising utility bills. Nearly two-thirds of surveyed billpayers said they have seen their gas and electric bills rise over the last year, and 63% reported feeling more stressed as a result of energy costs. The results held consistent across the political spectrum, with Republicans, Democrats, and Independents alike expressing similar levels of concern.
The findings arrive as the Trump administration’s continued attacks on clean energy — and its support for coal and other fossil fuels — threaten to raise utility bills even higher, according to energy experts.
“Bottom line is, American energy consumers are hurting and they’re stressed out,” Charles Hua, executive director of PowerLines, said of the survey’s findings.
Yet according to the poll, most Americans aren’t familiar with the state entities in charge of regulating energy utilities and setting those prices: public utility commissions. That’s a problem, said Hua, because a lack of public participation prevents consumer interests from being fully considered when state regulators receive and approve rate-hike requests from utilities.
In the survey, 60% of respondents said they aren’t familiar with the state or local authority that oversees gas and electric bills. Around 90% of people couldn’t name their public utility commission as the correct regulatory body.
Meanwhile, these relatively unknown regulators have approved ballooning utility cost increases in recent years. In 2022, state utility regulators collectively approved $4.4 billion in bill increases; in 2023, they approved $9.7 billion. In the first quarter of 2025 alone, gas and electric utilities requested or received rate hikes totaling about $20 billion. Residential electricity costs have grown by nearly 30% since 2021, while gas prices have risen by 40% since 2019, far outpacing inflation, according to a separate report released today by PowerLines.
The reasons behind these fast-rising rates vary by utility and state. Still, Hua singled out one driver of higher electricity rates in particular: utility spending on transmission lines and distribution systems — in other words, the poles, wires, and lines that deliver power to customers.
Utilities have spent increasing amounts of money to replace aging infrastructure and repair or harden the grid after storms, wildfires, and other disasters made more likely by climate change. State rules guarantee investor-owned utilities a rate of return on those investments, creating a financial incentive to overspend on grid infrastructure that some researchers have estimated costs consumers billions of dollars each year. Volatility in global gas markets has also contributed to rising gas bills.
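The incentive arises from the arithmetic of cost-of-service regulation: capital spending goes into the rate base, and the utility earns its allowed return on that base every year. A stylized example with made-up numbers:

```python
# Stylized cost-of-service ratemaking, with made-up numbers. Real
# ratemaking nets out depreciation, debt costs, and taxes; this keeps
# only the core incentive: more capital spending, more profit.
capex = 100_000_000       # $100M of new poles and wires added to the rate base
allowed_return = 0.095    # illustrative allowed rate of return
annual_profit = capex * allowed_return
print(f"${annual_profit:,.0f} per year for shareholders")  # $9,500,000
```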
The extent to which customers are suffering proves that the current regulatory system isn’t working, said Hua. “Eighty million Americans are struggling to pay their utility bills, and that issue is not only not going away, but it’s only going to get significantly worse in the coming years.”
Households that struggle to afford utilities often have no choice but to sacrifice needs like food, medicine, or basic physical comfort in order to pay their energy bills. Total utility bill debt in the U.S. has reached $17 billion, according to PowerLines, and power shutoffs due to nonpayment have risen across the country, posing potentially deadly health risks.
Four in five respondents to the poll said they felt powerless to control increasing utility costs. Around 60% — across all political affiliations — said they don’t think their state governments are sufficiently protecting consumers when regulating utilities.
For that to change, public utility commissions need to better engage the communities they serve, said Hua.
They could, for example, hold public meetings virtually or at night so that more people can attend, he said. Commissions could also allow consumers to comment on regulatory proceedings online or in person, and could provide intervenor compensation that covers the legal fees of advocates and stakeholders so that more groups can get involved in ratemaking cases. Hua added that states should invest in expanding the staff and capacity of public utility commissions and consumer advocacy offices, which are often vastly out-resourced by large investor-owned utilities.
Other consumer advocates have called for a range of reforms to rein in high rates, such as implementing performance-based ratemaking, which rewards utilities for reaching certain environmental or equity goals. States could also prohibit utilities from charging customers for trade association and lobbying fees, and lower the rate of return utilities can earn on infrastructure investments.
Electricity and gas bills may rise even more under the Trump administration’s energy policies. Several reports have found that repealing the clean energy tax credits under the Inflation Reduction Act, which some GOP lawmakers have promised to do, would significantly raise household energy costs, given that solar and wind are now far cheaper sources of electricity than coal, oil, and gas. President Donald Trump’s sweeping tariffs — now on pause for most countries except China — and recent executive orders to keep aging, unprofitable coal power plants running would make energy costs even more unaffordable.
The administration has also targeted a popular federal assistance program that helps more than 6 million U.S. households pay for their heating and cooling bills. Early this month, Trump officials laid off the entire staff running the Low Income Home Energy Assistance Program, and a budget proposal leaked last week would eliminate the program altogether. States are still waiting on about $378 million in funding this year for utility bill assistance, and lawmakers from both sides of the aisle have called for program staff to be reinstated.
“At a time when so many families are struggling to make ends meet — and tariffs are poised to drive prices even higher — it’s unconscionable to rip away the help that Congress has already offered to people in need,” Mark Wolfe, executive director of the National Energy Assistance Directors Association, told USA Today.
The Trump administration is threatening to force U.S. grid operators and utilities to keep money-losing coal-fired power plants running, no matter how dirty and expensive their power is.
Its stated reason? To shore up the reliability of the U.S. power grid.
It’s the latest salvo in a long-running battle over the country’s increasingly brittle grid — one that pits those in favor of hanging on to fossil fuels, and particularly coal, against those who put their faith in a future powered by cleaner and cheaper alternatives.
That battle is entering a critical phase as the grid faces challenges on multiple fronts.
Ever more intense summer heat waves and winter cold snaps driven by climate change are already straining the grid. An unprecedented boom in electricity demand, spurred by AI data centers and a resurgent manufacturing sector, threatens to push the grid even closer to its limit. Meanwhile, aging and unprofitable coal power plants have been closing at a rapid clip — and grid backlogs are preventing new solar, wind, batteries, and even fossil-gas plants from being built fast enough to replace them.
If these imbalances persist, more than half of North America faces significant risk of energy shortages over the next five to 10 years, according to a 2024 report from the North American Electric Reliability Corp., which sets and enforces reliability standards for the continent’s power grid. Utilities and regional grid operators are sounding the same alarm. Many say they need to build more fossil gas–fired power plants and keep costly coal plants open to deal with what is evolving into a genuine crisis.
But energy experts insist that there’s no single, simple solution to this high-stakes challenge. Maintaining reliability will certainly require retaining some otherwise unprofitable fossil-fueled power plants. But it also requires pressing utilities and regional grid operators to rapidly bring solar, wind, and batteries online — and enlisting electricity users to shift power use to reduce costly peaks in demand.
This multifaceted approach would bolster the grid without sacrificing cost and climate concerns. Doubling down on coal and ratcheting up gas, meanwhile, would cost consumers more, increase climate and air pollution, and ultimately result in a less diversified and therefore more fragile grid than one balanced by renewables.
In the longer term, the U.S. must break barriers to building far more high-voltage power lines to connect clean energy across regions and share power when extreme weather strikes, experts say. It also needs to support the economics for “clean firm” technologies like advanced nuclear and enhanced geothermal power, or long-duration energy storage systems that can fill the gaps when the sun isn’t shining and the wind isn’t blowing.
“Reliability is actually a characteristic of the entire electricity system, and individual resources contribute to reliability as part of a balanced portfolio,” Sara Baldwin, senior electrification director at think tank Energy Innovation, said during a March webinar introducing the group’s February report on the complexities of keeping the grid running in a time of energy transition.
“So whenever you hear someone talking about the reliability of a single resource, that should raise a flag that is not necessarily grounded in full truth,” she said.
Baldwin’s “single resource” comment nods to a common refrain from the Trump administration and congressional Republicans that today’s grid reliability problems have a simple solution: Keep burning coal.
Last week, President Donald Trump issued an executive order that authorizes the Department of Energy to prevent uneconomical coal plants from closing, even if they’re violating federal and state carbon and environmental mandates and imposing higher power costs on customers. This would be the most aggressive federal intervention in modern history into the authority of states to regulate utilities, and of regional grid operators to manage competitive energy markets.
Many Republicans in Congress have long insisted that grid reliability problems are primarily caused by climate and clean-energy policies put in place by states and the Biden administration. They argue that regulations, not economics, are forcing coal plants into “premature” retirement and that cleaner alternatives can’t be relied on to fill the gap left by those retirements.
During two grid-reliability hearings last month in the U.S. House of Representatives, Republicans leveraged this framing in questions for utility executives, grid operators, and energy experts. “Too many electric-generating facilities have been retired in recent years,” said Rep. Bob Latta, an Ohio Republican who chairs the House Energy and Commerce subcommittee that held the hearings.
Under questioning, most representatives of the U.S. grid operators, which are responsible for managing the systems that deliver electricity to about two-thirds of U.S. customers, concurred with Republicans that the shuttering of coal power plants presents a major reliability challenge.
But Rep. Frank Pallone, a New Jersey Democrat and ranking member on the energy subcommittee, pushed back on Republicans’ framing. Instead of trying to keep coal plants open, he said, utilities, grid operators, and regulators must clear bottlenecks preventing new clean energy and energy storage from taking over the role that fossil fuels have previously played.
Grid operators are “saying they need every new electricity generator they can get to come online in the next five years,” Pallone said. “If Republicans are really interested in unleashing American energy, they should work with us to clear interconnection queues and let resources get on the grid as quickly as possible.”
Instead, Republicans are considering repealing the clean-energy tax credits created by the 2022 Inflation Reduction Act, Pallone said in his opening statement. That would undercut the economics of not just solar, wind, and battery projects but of many other forms of carbon-free generation and storage, including geothermal power, advanced nuclear power, and long-duration energy storage.
“Repealing billions of dollars in technology-neutral funding for all types of new energy is not the way you address the increasing need for energy,” he said.
Pallone’s comments underscore two of the biggest problems facing the U.S. grid.
The first is that coal simply cannot compete economically against alternatives. Coal has dwindled to providing only about 15% of U.S. electricity supply, as both fossil gas and renewables fall in cost. Last year, solar, wind, and batteries alone made up more than 90% of the 56 gigawatts of power capacity built in the U.S., and they are expected to lead new additions in 2025, too.
The second problem is that many U.S. utilities and grid operators have been unable to move fast enough to embrace the advantages of cheap, clean energy. As Energy Innovation’s February report highlights, “outdated views on grid reliability are colliding with slow-moving institutions,” which has confounded the potential for solar, wind, and batteries to fill the reliability gaps left by shuttered coal plants.
Much of the reluctance from certain grid operators stems from the fact that solar and wind ebb and flow with the weather, whereas fossil-fueled power plants can be turned on and off on command.
“Renewable generators play an important role, and we want them to come onto the grid, but they are not a one-for-one substitute for the fossil-fuel generators that we are replacing,” said Manu Asthana, president and CEO of PJM Interconnection, which manages the transmission grid delivering electricity to about 65 million people from the mid-Atlantic coast to the Great Lakes.
But daytime solar production and nighttime wind generation can still provide a large and predictable amount of the power needed during hot summer afternoons and evenings and cold winter mornings, when demand for electricity spikes and reliability issues loom, said Wilson Ricks, a researcher and energy-modeling expert at Princeton University’s ZERO Lab.
Meanwhile, lithium-ion batteries are becoming a more cost-effective way to store clean power for use when the grid needs it, he said. Utility-scale batteries deployed in California and Texas are storing gigawatts of solar power to cover peak grid demands during sunset and evening hours, for example.
Taken together, “the technologies that are currently experiencing widespread commercial adoption — wind, solar, lithium-ion batteries — can actually go a long way towards ensuring system-wide resource adequacy,” Ricks said during last month’s webinar.
Real-world experience bears this out, said Ric O’Connell, founding executive director of GridLab, a think tank that contributed to the Energy Innovation reliability report. The standout example is Texas, which is adding wind, solar, and batteries faster than any other state.
The transmission grid operated by the Electric Reliability Council of Texas “has been running at 85% carbon-free for the last several weeks in the spring,” O’Connell said during the webinar. That clean power has helped cover the absence of fossil-fueled plants that were shut down for seasonal maintenance, he noted.
Similarly, Southwest Power Pool, which manages a grid stretching from the Dakotas to Oklahoma, has hit 90% wind power during some hours of the year and has seen nearly half of annual power needs supplied by carbon-free energy in recent years, he said.
Last year, on California’s solar- and battery-rich grid, grid operator CAISO clocked 100 days in a row with 100% carbon-free electricity for at least a part of each day. Gigawatts of battery capacity also improved the state’s summer grid reliability by shifting solar power into the evenings.
That’s a big change from the summers of 2020 and 2022, when the California grid faced serious emergencies, CAISO CEO Elliot Mainzer told Congress during last month’s hearings.
“A reliable grid relies on a portfolio of resources with different attributes and complementary characteristics,” he said. Pairing batteries with solar “has helped to increase reliability in recent years.”
By contrast, O’Connell said, PJM’s grid “is in the single digits for wind and solar. PJM has tons of gas, tons of coal, tons of nuclear — and they say, ‘We need even more to meet growing load.’ I’m like, ‘No, we need to add wind and solar and batteries to meet that growing load.’”
PJM’s grid is a microcosm of this problematic dynamic.
The region expects to lose about 40 gigawatts of generation, more than 20% of its capacity, by 2030. But critics say its bigger challenge is its inability to interconnect new resources to replace what it’s losing.
For years, PJM lagged behind other grid operators in reforming its interconnection processes. It has also failed until recently to undertake the kind of regional grid expansion plans pursued by grid operators in Texas, California, and the Midwest, which have enabled them to bring much more clean energy online.
Critics say PJM’s failure to institute these reforms and forward-looking plans has played an outsized role in its struggle to replace retiring power plants and in the spiking cost of securing new generation resources. Multiple studies find PJM could have avoided billions of dollars of costs and significantly eased its reliability concerns if it had connected even a fraction of those pent-up clean energy and battery projects over the past decade.
Instead, state regulators and grid operators have been forced to use costly emergency mechanisms to keep power plants from closing. In Maryland, for example, PJM has used its “reliability must-run” authority to pay the owners of coal- and oil-fired power plants to postpone their retirements until at least 2028 to prevent the threat of regional grid instability or outages, at a cost of hundreds of millions of dollars in the coming years.
The high price tag of these emergency stay-open measures highlights the economic burden of poor planning around power plant retirements, O’Connell said.
Those costs are bound to rise if the Trump administration forces coal power plants to stay open under emergency orders — or even demands that utilities reopen closed coal plants, as Interior Secretary Doug Burgum has said the Trump administration may seek to do.
These poor economics are why backers of renewables are frustrated with those who insist that clean-energy policies are to blame for coal plants closing and thus for grid reliability challenges.
“People are retiring coal plants because they’re uneconomic,” O’Connell said.
Karen Palmer, a senior fellow and director of the electric power program at think tank Resources for the Future, agreed that coal retirements are “not the fault of environmental regulation. The market prices just aren’t there.”
Nor are fossil-fueled power plants as reliable as they’re often made out to be, particularly during weather extremes when the grid needs them the most. Summer heat waves reduce the efficiency of gas-fired power plants and can lead to equipment failures.
And major wintertime grid emergencies of the past several years, such as the Texas grid collapse in February 2021 and rolling outages in the Southeast in December 2022, have been linked to cold-related failures not only at coal and gas-fired power plants but across the wells and pipelines that deliver gas to generate power.
No one technology is immune to weather stresses and disruptions. Subzero temperatures can freeze up wind turbines and sap battery capacity, and scorching temperatures reduce solar-panel output and dampen battery performance. But this reinforces the importance of a portfolio approach to solving reliability challenges, Baldwin said.
None of this is to downplay the complexity that grid operators face — particularly at a time when demand for electricity is growing at a pace not seen in decades.
The Midcontinent Independent System Operator, which manages a grid serving 15 U.S. states from Louisiana to North Dakota, warned last year that its latest expectations for new power demand from data centers and manufacturing facilities put it at increased risk of reliability challenges if generating capacity doesn’t increase at the same rate.
In New York, state grid operator NYISO has identified a “very concerning decline in statewide resource margins” by 2034 unless it can expand clean-energy deployments, which lag behind state targets. NYISO has already delayed the closure of some gas-fired “peaker” power plants in New York City, which lies at the southern end of a grid bottleneck that constrains how much clean power from upstate New York and Canada can reach the city, NYISO CEO Richard Dewey said at the March 25 congressional hearing.
ISO New England, which manages the grid across six New England states, is struggling during winter cold snaps to meet simultaneous demand for gas for heating and for generating power, CEO and President Gordon van Welie said at the hearing. New England is also relying on Canadian hydropower in the near term and offshore wind farms in the longer term, he said — both sources threatened by Trump administration policies.
And while grid operators at last month’s hearing concurred with Republicans that losing existing generation is one reliability threat, they also agreed that losing federal clean-energy tax credits is another. Rep. Diana DeGette, a Democrat from Colorado, asked PJM’s Asthana if losing Inflation Reduction Act funding would “help or hurt our ability to stabilize the grid and to increase production.”
“In the near term, the interconnection queue is full of a lot of renewable projects, many of whom are, I’m sure, counting on the IRA,” Asthana replied. Repealing those tax credits “would make it less likely for them to come — and we do need them to come.”
“Anybody disagree with that on this panel?” DeGette asked the other grid operators. “No? They’re all shaking their heads no.”
As Illinois looks to prepare its electric grid for the future, a new voluntary program in the Chicago area promises to lower costs for both customers and the utility system as a whole.
ComEd is finalizing plans to roll out time-of-use rates in 2026 following a four-year pilot program in which participants saved money and reduced peak demand by 6.5% to 9.7% each summer.
Under a plan before state utility regulators, customers who sign up would pay a much steeper energy-delivery charge during afternoon and early evening hours but see a significant discount overnight. The goal is to shift usage to hours when the grid tends to be less congested and power tends to be cleaner and cheaper.
“What we saw from the pilot was people did change their habits,” said Eric DeBellis, general counsel for the Citizens Utility Board, the state’s main utility watchdog.
Most standard customers are set to pay an energy-delivery charge of 5.9 cents per kilowatt-hour next year, while customers who enroll under ComEd’s proposed time-of-use program would pay a “super peak” delivery rate of 10.7 cents per kilowatt-hour between 1 p.m. and 7 p.m. but just 3 cents during overnight hours.
“The key to savings will be customers limiting usage from 1 pm to 7 pm,” John Leick, ComEd’s senior manager of retail rates, said in an email.
Under the proposal, most time-of-use rate customers would pay delivery rates of around 4 cents per kilowatt-hour during the morning, from 6 a.m. to 1 p.m., and in the evening between 7 p.m. and 9 p.m.
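Put together, those figures imply real savings for households that can shift load. As a rough illustration (this is not ComEd’s billing methodology, and the daily usage profile below is invented for the example), here is how a month of delivery charges might compare under the proposed time-of-use rates versus the roughly 5.9-cent flat rate:

```python
# Rough illustration only -- not ComEd's billing code. Rates and hour
# boundaries come from the proposal described above; the household usage
# profile is a made-up example.
TOU_RATES = (
    [0.03] * 6     # midnight-6 a.m.: overnight rate
    + [0.04] * 7   # 6 a.m.-1 p.m.: morning shoulder rate
    + [0.107] * 6  # 1 p.m.-7 p.m.: "super peak"
    + [0.04] * 2   # 7 p.m.-9 p.m.: evening shoulder rate
    + [0.03] * 3   # 9 p.m.-midnight: overnight rate
)  # dollars per kWh, indexed by hour of day
FLAT_RATE = 0.059  # standard delivery charge, dollars per kWh

def monthly_delivery_charges(daily_kwh_by_hour, days=30):
    """Compare time-of-use vs. flat delivery charges for a typical day's usage."""
    tou = days * sum(kwh * rate for kwh, rate in zip(daily_kwh_by_hour, TOU_RATES))
    flat = days * sum(daily_kwh_by_hour) * FLAT_RATE
    return tou, flat

# Hypothetical household that keeps usage low during the 1-7 p.m. window:
profile = [0.5] * 6 + [1.0] * 7 + [0.6] * 6 + [1.5] * 2 + [1.0] * 3
tou, flat = monthly_delivery_charges(profile)
print(f"time-of-use: ${tou:.2f}/month   flat: ${flat:.2f}/month")
# -> time-of-use: $28.96/month   flat: $34.69/month
```

A household that instead ran heavy air conditioning through the super-peak window would come out behind the flat rate, which is exactly the behavior the rate design is meant to discourage.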
The details were approved in January by the Illinois Commerce Commission but put on hold last Thursday after commissioners agreed to consider a request by ComEd to also incorporate energy-supply charges into the program. (The initial proposal included only the portion of customers’ bills that pays for the delivery of energy but not the cost of electricity itself.)
The Citizens Utility Board is happy with the plan approved in January, DeBellis said, and it wants to ensure that a delivery-only option remains for customers who buy power from alternative retail electric suppliers, since those customers could not participate in a ComEd program that includes supply charges. The utility watchdog also would have liked the “super peak” hours to be a bit shorter.
“For your typical upper-Midwest household, about half of their electricity is HVAC, and in the summer bills go up because of air conditioning,” DeBellis explained. “We were worried about the hours of super peak, the length of time we’re asking people to let the temperature drift up.”
The prospect of limiting air conditioning on hot afternoons could dissuade people from enrolling in the program, DeBellis continued. “Since each person’s subtle behavior changes are going to be small, it needs to be really popular to have an impact,” he said.
Consumer advocates have long asked for time-of-use rates, which are considered crucial as Illinois moves toward its goal of 1 million electric vehicles by 2030. If too many EVs charge during high energy-demand times, the grid could be in trouble.
“We want EVs to be good for the grid, not bad,” said DeBellis.
Under the proposal, people with electric vehicles would save an extra $2 on their monthly energy bill per vehicle just by enrolling in the ComEd time-of-use program, with a cap of two vehicles for up to two years.
“This will help incentivize customers with EVs to sign up and pay attention to the rate design and hopefully charge in the overnight or lower-priced periods,” said Leick.
Richard McCann, a consultant who testified before regulators on behalf of the Citizens Utility Board and the Environmental Defense Fund, recommended that customers be required to enroll in the new time-of-use program, or another program tied to when electricity is used, in order to qualify for rebates on Level 2 chargers.
DeBellis and his wife recently bought an EV, and he thinks time-of-use prices could be an extra incentive for others to follow suit. However, he thinks dealers and buyers are more focused on EVs “being cool” and tax incentives toward the lease or purchase price, rather than fuel savings.
“I don’t have the impression people trying to buy a car are doing the math and thinking about how much they would save on fuel” with an EV, he said. “Time-variant pricing makes those savings even more, but the math is kind of impenetrable for most people. In my humble opinion, fuel savings should be a way bigger factor, and time-of-use rates should be part of it.”
ComEd crafted the time-of-use program after a four-year pilot and a public comment process overseen by state regulators. The pilot was focused on the energy supply part of the bill, but DeBellis noted that the lessons apply to any time-of-use program, since demand peaks are the same regardless of which part of the bill one looks at.
The pilot program’s final annual report showed that both EV owners and people who did not own EVs significantly reduced energy use during “super peak” hours in the summer. Even though participants without EVs did little load-shifting during the non-summer seasons, they still saved an average of $6 per month during the last eight months of the pilot because of the way electricity was priced in the program. EV owners saved significantly more, with most cutting bills by $10 to $30 per month and some saving over $70 per month from October 2023 to May 2024.
Overall, in the pilot’s final year, energy usage did shift significantly from peak hours to nighttime hours, thanks to an increasing number of EV owners participating in the program.
In 2021 and 2022, participants in the pilot faced high peak-time energy rates because of market fluctuations. Some customers dropped out for this reason, Leick said, but participation remained near the cap of 1,900 residents during the pilot, with over a quarter of them being EV owners by the end.
Many utilities around the country already offer time-of-use programs: 42% of 829 utilities responding to a federal survey have such rates, according to McCann’s testimony.
ComEd’s filings with state regulators include a survey of 15 time-of-use rates established by seven other large utilities nationwide. The study found that in the first year of ComEd’s pilot program, the utility had a larger proportional difference between peak and off-peak pricing than most of the rates studied. (ComEd’s peak and off-peak rates, however, were lower than most of the others, so the absolute price difference was larger for other utilities.)
ComEd also saw a larger reduction in summer peak-time electricity demand than many of the other time-of-use rates achieved. ComEd’s program structure was similar to that of the other utilities, the firm that conducted the study found, though in California, peak times were later in the afternoon, likely because of the state’s climate and proliferation of solar energy.
The program is designed to be revenue-neutral, meaning ComEd won’t earn more or less depending on when people use energy. Still, it has potential to lower costs to the system overall by helping to make more efficient use of existing infrastructure and postponing the need for new generation or transmission investments.
The new time-of-use rate would apply only to residential customers. Industrial and commercial customers already have an incentive to use power at night, since their energy-delivery charge is based on their highest 30-minute spike between 9 a.m. and 6 p.m. on weekdays. If a business’s most intense energy use comes after 6 p.m. or on weekends, its bill will be lower than if that same spike happened during usual business hours, Leick explained. Commercial and industrial customers’ energy-supply rates, meanwhile, are based on hourly market prices, which are typically higher during peak times.
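To make that mechanism concrete, here is a minimal sketch of how such a demand-based delivery charge can be computed (illustrative only; the dollars-per-kilowatt rate is a hypothetical placeholder, since no such figure appears above):

```python
# Illustrative sketch of the peak-demand charge described above: the bill is
# driven by the single highest 30-minute average draw that falls on a weekday
# between 9 a.m. and 6 p.m. The $10/kW rate is invented for the example.
DEMAND_RATE = 10.0  # hypothetical dollars per kW of billing demand

def billing_demand_kw(readings):
    """readings: (weekday, hour, avg_kw) tuples, one per 30-minute interval;
    weekday 0-4 = Mon-Fri, hour 0-23."""
    on_peak = [kw for weekday, hour, kw in readings if weekday < 5 and 9 <= hour < 18]
    return max(on_peak, default=0.0)

# A business whose biggest spike (250 kW) comes at 8 p.m. on a Tuesday, and
# whose Saturday spike (300 kW) also falls off-peak, is billed on its highest
# weekday-daytime draw of 180 kW instead:
readings = [(1, 20, 250.0), (2, 14, 180.0), (5, 11, 300.0)]
print(billing_demand_kw(readings) * DEMAND_RATE)  # -> 1800.0
```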
Since 2007, ComEd has offered a voluntary real-time pricing program that lets people save money by using energy when system demand, and hence market prices, are lower. But keeping tabs on the market is beyond what most customers other than “energy wonks” are willing to do, DeBellis noted. Only about 1% of residential customers have enrolled in the real-time pricing program since it started, according to McCann’s testimony.
“Real-time pricing was so complicated it was basically gambling,” DeBellis said. “We’re very happy to have a time-based rate offering that’s very predictable, where people can be rewarded for establishing good habits.”
Some clean-energy groups had argued for a program that automatically enrolls residents unless they opted out. DeBellis said the Citizens Utility Board promoted the opt-in version that is currently proposed.
“We want people on this program to be aware they’re on the program, otherwise you won’t get behavior change. You’re just throwing money at random variation,” DeBellis said.
The Clean and Reliable Grid Affordability Act, introduced in the state legislature in February and backed by a coalition of advocacy groups, would mandate that large utilities offer residential time-of-use rates, outlining a structure similar to what ComEd is planning. After the rates have been in place for a year, the Illinois Commerce Commission could study whether they are indeed reducing fossil-fuel use and work with utilities to adjust them if needed.
While the commission’s ruling last week reopens debate about the program’s structure, a ComEd spokesperson said the company is confident it will be available by next year.
See more from Canary Media’s “Chart of the week” column.
Last year was fantastic for battery storage. This year is poised to be even better.
The U.S. is set to plug over 18 gigawatts of new utility-scale energy storage capacity into the grid in 2025, up from 2024’s record-setting total of almost 11 GW, per Energy Information Administration data analyzed by Cleanview. Should that expectation bear out, the U.S. will have installed more grid batteries this year alone than it had installed altogether as of 2023.
The U.S. grid battery sector has been on a tear in recent years — and California and Texas are the reasons why. Combined, the two states have installed nearly three-quarters of the country’s total energy storage capacity of over 26 GW.
California has long held the top spot in large-scale battery storage installations. Even last year, when the EIA forecast that Texas would claim the lead, California held on by a few hundred megawatts. This year, the EIA again expects Texas to outpace California, only now by an even wider margin than last year. The Lone Star State could build nearly 7 GW of utility-scale storage in 2025, compared with California’s 4.2 GW.
But the new state-level storyline to watch is the rise of Arizona. The state built just under 1 GW of storage in 2024, buoyed by massive new projects like the Sonoran Solar Energy Center and the Eleven Mile Solar Center that pair solar with batteries to soak up as much desert sun as possible. This year, EIA says Arizona is on track to nearly quadruple last year’s total and build 3.6 GW of storage.
It’s worth noting that EIA’s 2024 storage forecast overshot actual installations by about 3 GW — and developers didn’t have the Trump administration to contend with then. President Donald Trump has not outright targeted energy storage, but the uncertainty surrounding the future of clean energy tax credits could have a chilling effect on investment, as it has had on projects in adjacent sectors like solar and battery manufacturing.
Despite the political chaos, developers are barreling ahead. Just over 12 GW of storage projects are either under construction or complete and waiting to plug into the grid. And, as Cleanview points out, the crucial tax credit for battery storage projects is already locked into the tax code for 2025, giving developers some measure of certainty — at least for the months ahead.
This is the first article in our four-part series “Boon or bane: What will data centers do to the grid?”
In January, Virginia lawmakers unveiled a raft of legislation aimed at putting some guardrails on a data center industry whose insatiable hunger for electricity threatens to overwhelm the grid.
As the home of the world’s densest data center hub, Virginia is on the vanguard of dealing with these challenges. But the state is far from alone in a country where data center investments may exceed $1 trillion by mid-2029, driven in large part by “hyperscalers” with aggressive AI goals, like Amazon, Google, Meta, and Microsoft.
“If we fail to act, the unchecked growth of the data center industry will leave Virginia’s families, will leave their businesses, footing the bill for infrastructure costs, enduring environmental degradation, and facing escalating energy rates,” state Sen. Russet Perry, a Democrat representing Loudoun County, the heart of Virginia’s “Data Center Alley,” told reporters at the state capitol in Richmond last month. “The status quo is not sustainable.”
Perry’s position is backed by data. A December report commissioned by Virginia’s legislature found that a buildout of data centers to meet “unconstrained demand” would double the state’s electricity consumption by 2033 and nearly triple it by 2040.
To meet the report’s unconstrained scenario, Virginia would need to erect twice as many solar farms per year by 2040 as it did in 2024, build more wind farms than all the state’s current offshore wind plans combined, and install three times more battery storage than Dominion Energy, the state’s biggest utility, now intends to build.
Even then, Virginia would need to double current power imports from other states. And it would still need to build new fossil-gas power plants, which would undermine a state clean energy mandate. Meeting just half the unconstrained demand would require building seven new 1.5-gigawatt gas plants by 2040. That’s nearly twice the 5.9 gigawatts’ worth of gas plants Dominion now plans to build by 2039, a proposal that is already under attack by environmental and consumer groups.
But Perry and her colleagues face an uphill battle in their bid to more closely regulate data center growth. Data centers are big business in Virginia. Gov. Glenn Youngkin, a Republican, has called for the state to “continue to be the data center capital of the world,” citing the up to 74,000 jobs, $9.1 billion in GDP, and billions more in local revenue the industry brings. Most of the proposed data center bills, which include mandates to study how new data centers could impose additional costs on other utility customers and worsen grid reliability, have failed to move forward in the state legislature as of mid-February.
Still, policymakers can’t avoid their responsibility to “make sure that residential customers aren’t necessarily bearing the burden” of data center growth, Michael Webert, a Republican in Virginia’s House of Delegates who’s sponsoring one of the data center bills, said during last month’s press conference.
From the mid-Atlantic down to Texas, tech giants and data center developers are demanding more power as soon as possible. If utilities, regulators, and policymakers move too rashly in response, they could unleash a surge in fossil-gas power-plant construction that will drive up consumer energy costs — and set back progress on shifting to carbon-free energy.
But this outcome is not inevitable. With some foresight, the data center boom can actually help — rather than hurt — the nation’s already stressed-out grid. Data center developers can make choices right now that will lower grid costs and power-system emissions.
And it just so happens that these solutions could also afford developers an advantage, allowing them to pay less for interconnection and power, win social license for their AI products, and possibly plug their data centers into the grid faster than their competitors can.
When it comes to the grid, the nation faces a computational crossroads: Down one road lie greater costs, slower interconnection, and higher emissions. Down the other lies cheaper, cleaner, faster power that could benefit everyone.
After decades with virtually no increase in U.S. electricity demand, data centers are driving tens of gigawatts of power demand growth in some parts of the country, according to a December analysis from the consultancy Grid Strategies.
Providing that much power would require “billions of dollars of capital and billions of dollars of consumer costs,” said Abe Silverman, an attorney, energy consultant, and research scholar at Johns Hopkins University who has held senior policy positions at state and federal energy regulators and was an executive at the utility NRG Energy.
Utilities, regulators, and everyday customers have good reason to ask if the costs are worth it — because that’s far from clear right now, he said.
A fair amount of this growth is coming from data centers meant to serve well-established and solidly growing commercial demands, such as data storage, cloud computing, e-commerce, streaming video, and other internet services.
But the past two years have seen an explosion of power demand from sectors with far less certain futures.
A significant, if opaque, portion is coming from cryptocurrency mining operations, notoriously unstable and fickle businesses that can quickly pick up and move to new locations in search of cheaper power. The most startling increases, however, are for AI, a technology that may hold immense promise but that doesn’t yet have a proven sustainable business model, raising questions about the durability of the industry’s power needs.
Hundreds of billions of dollars in near-term AI investments are in the works from Amazon, Google, Meta, and Microsoft as well as private equity and infrastructure investors. Some of their announcements strain the limits of belief. Late last month, the CEOs of OpenAI, Oracle, and SoftBank joined President Donald Trump to unveil plans to invest $500 billion in AI data centers over the next four years — half of what the private equity firm Blackstone estimates will be invested in U.S. AI in total by 2030.
Beyond financial viability, these plans face physical limits. At least under current rules, power plants and grid infrastructure simply can’t be built fast enough to provide what data center developers say they need.
Bold data center ambitions have already collided with reality in Virginia.
“It used to take three to four years to get power to build a new data center in Loudoun County,” in Virginia’s Data Center Alley, said Chris Gladwin, CEO of data analytics software company Ocient, which works on more efficient computing for data centers. Today it “takes six to seven years — and growing.”
Similar constraints are emerging in other data center hot spots.
The utility Arizona Public Service forecasts that data centers will account for half of new power demand through 2038. In Texas, data centers will make up roughly half the forecasted new customers that are set to cause summer peak demand to nearly double by 2030. Georgia Power, that state’s biggest utility, has since 2023 tripled its load forecast over the coming decade, with nearly all of that load growth set to come from the projected demands of large power customers including data centers.
These saturated conditions are pushing developers into new markets, according to data from the energy consultancy Wood Mackenzie.
“There’s more demand at once seeking to connect to the system than can be supplied,” Chris Seiple, vice chairman of Wood Mackenzie’s power and renewables group, said in a December interview.
Of that demand growth, only a fraction can be met by building new solar and wind farms, Seiple added. And “unfortunately, those additions aren’t necessarily happening where a lot of the demand growth is happening, and they don’t happen in the right hours,” he said.
Seiple is referring to data centers’ need for reliable round-the-clock power, and utilities’ responsibility to provide it, including at moments of peak demand, usually during hot summers and cold winters. A lot of the money utilities spend goes toward building the power plants and grid infrastructure needed to meet those demand peaks.
Where renewables fall short, other resources like carbon-free hydropower, nuclear, geothermal, or clean-powered batteries could technically be built to serve at least a large portion of data center demand. But that’s generally not what’s playing out on the ground.
Instead, the data center boom is shaping up to be a huge boon for fossil-gas power plants. Investment bank Goldman Sachs predicts that powering AI will require $50 billion in new power-generation investment by 2030: 40% of it from renewables and 60% from fossil gas.
Even tech firms with clean energy goals, like Amazon, Google, Meta, and Microsoft, are having trouble squaring their soaring power needs with their professed climate commitments.
Over the past decade, those four companies have secured roughly 40 GW of U.S. clean power capacity, according to research firm BNEF. But in the past four years, their growing use of grid power, which is often generated using plenty of fossil gas, has pushed emissions in the wrong direction. Google has seen a 48% increase since 2019, and Microsoft a 29% jump since 2020.
The inability to match growing demand with more clean power isn’t entirely these hyperscalers’ fault. Yearslong wait times have prevented hundreds of gigawatts of solar and wind farms and batteries from connecting to congested U.S. power grids.
Tech firms are also pursuing “clean firm” sources of power. Amazon, Google, Meta, and Microsoft have pledged to develop (or restart) nuclear power plants, and Google has worked with advanced-geothermal-power startup Fervo Energy. But these options can’t be brought online or scaled up quickly enough to meet short-term power needs.
“The data center buildout, especially now with AI, is creating somewhat of a chaotic situation,” said James West, head of sustainable technologies and clean-energy research at investment banking advisory firm Evercore ISI. Hyperscalers are “foregoing some of their sustainability targets a bit, at least near-term.”
And many data center customers and developers are indifferent to clean energy; they simply seek whatever resource can get their facility online first, whether to churn out cryptocurrency or win an advantage in the AI race.
Utilities that have sought for years to win regulatory approval for new gas plants have seized on data center load forecasts as a new rationale.
A recent analysis from Frontier Group, the research arm of the nonprofit Public Interest Network, tallied at least 10.8 GW of new gas-fired power plants being planned by utilities, and at least 9.1 GW of fossil-fuel power plants whose closures have been or are at risk of being delayed, to meet projected demand.
In Nebraska, the Omaha Public Power District has delayed plans to close a 1950s-era coal plant after Google and Meta opened data centers in its territory. Last year, Ohio-based utility FirstEnergy abandoned a pledge to end coal use by 2030 after grid operator PJM chose it to build a transmission line that would carry power from two of its West Virginia coal plants to supply northern Virginia.
“This surge in data centers, and the projected increases over the next 10 years in the electricity demand for them, is really already contributing to a slowdown in the transition to clean energy,” said Quentin Good, policy analyst with Frontier Group.
In some cases, tech companies oppose these fossil-fuel expansion plans.
Georgia Power last month asked regulators to allow it to delay retirement of three coal-fired plants and build new gas-fired power plants to meet a forecasted 8 GW of new demand through 2030. A trade group representing tech giants, which is also negotiating with the utility to allow data centers to secure more of their own clean power, was among those that criticized the move for failing to consider cleaner options.
But in other instances, data center developers are direct participants.
In Louisiana, for example, Meta has paired a $10 billion AI data center with utility Entergy’s plan to spend $3.2 billion on 1.5 GW of new fossil-gas power plants.
State officials have praised the project’s economic benefits, and Meta has pledged to secure clean power to match its 2 GW of power consumption and to work with Entergy to develop 1.5 GW of solar power. But environmental and consumer advocates fear the 15-year power agreement between Meta and Entergy has too many loopholes and could leave customers bearing future costs, both economic and environmental.
“In 15 years, Meta could just walk away, and there would be three new gas plants that everyone else in the state would have to pay for,” said Logan Atkinson Burke, executive director of the nonprofit Alliance for Affordable Energy. The group is demanding more state scrutiny of the deal and has joined other watchdog groups in challenging Entergy’s plan to avoid a state-mandated competitive bidding process to build the gas plants.
In other cases, AI data center developers appear to be making little effort to coordinate with utilities. In Memphis, Tennessee, a data center being built by xAI, the AI company launched by Elon Musk, was kept secret from city council members and community groups until its June 2024 unveiling.
In the absence of adequate grid service from Memphis’ municipal utility, the site has been burning gas in mobile generators exempt from local air-quality regulations, despite concerns from residents of lower-income communities already burdened by industrial air pollution.
In December, xAI announced a tenfold increase in the site’s computing capacity — a move that the nonprofit Southern Alliance for Clean Energy estimates will increase its power demand from 150 MW to 1 GW, or roughly a third of the entire city’s peak demand.
The nonprofit had hoped that “Musk would use his engineering expertise to bring Tesla megapacks [batteries] with solar and storage capability to make this facility a model of clean, renewable energy,” its executive director, Stephen Smith, wrote in a blog post. But now, the project seems more like “a classic bait and switch.”
If gas plants are built as planned to power data centers, dreams of decarbonizing the grid in the near future are essentially out the window.
But data centers don’t need to be powered by fossil fuels; it’s not a foregone conclusion.
As Frontier Group’s Good noted during last month’s Public Interest Network webinar, “The outcome depends on policy, and the increase in demand is by no means inevitable.”
It’s up to regulators to sort this out, said Silverman of Johns Hopkins University. “There are all these tradeoffs with data centers. If we’re asking society to make tradeoffs, I think society has a right to demand something from data centers.”
That’s what the Virginia lawmakers proposing new data center bills said they were trying to do. Two of the bills would order state regulators to study whether other customers are bearing the costs of data center demand — a risk highlighted by the legislature’s December report, which found unconstrained growth could increase average monthly utility bills by 25% by 2040. Those bills are likely to be taken up in conference committee next month.
Other bills would give local governments power to review noise, water, and land-use impacts of data centers seeking to be sited in the state, require that data centers improve energy efficiency, and mandate quarterly reports of water and energy consumption.
Another sought to require that proposed new data centers undergo review by state utility regulators to ensure they won’t pose grid reliability problems. Democratic Delegate Josh Thomas, the bill’s sponsor, said that’s needed to manage the risks of unexamined load growth. Without that kind of check, “we could have rolling blackouts. We could have natural gas plants coming online every 18 months,” he said.
But that bill was rejected in a committee hearing last month, after opponents, including representatives of rural counties eager for economic development, warned it could alienate an industry that’s bringing billions of dollars per year to the state’s economy.
Data center developers have the ability to minimize or even help drive down power system costs and carbon emissions. They can work with utilities and regulators to bring more clean energy and batteries onto the grid or at data centers themselves. They can manage their demand to reduce grid strains and lower the costs of the infrastructure needed to serve them. And in so doing, they could secure scarce grid capacity in places where utilities are otherwise struggling to serve them.
The question is, will they? Silverman emphasized that utilities and regulators must treat grid reliability as their No. 1 priority. “But when we get down to the next level, are we going to prioritize affordability, which is very important for low-income customers? Are we going to prioritize meeting clean energy goals? Or are we going to prioritize maximizing data center expansion?” he asked.
Given the pressure to support an industry that’s seen as essential to U.S. economic growth and international competitiveness, Silverman worries that those priorities aren’t being properly balanced today. “We’re moving forward making investments assuming we know the answer — and it’s not like if we’re wrong, we’re getting that money back.”
In part 2 of this series, Canary Media reports on a key problem with data center plans: It’s near impossible to know how much they’ll impact the grid.
This is the second article in our series “Boon or bane: What will data centers do to the grid?”
There’s no question that data centers are about to cause U.S. electricity demand to spike. What remains unclear is by how much.
Right now, there are few credible answers. Just a lot of uncertainty — and “a lot of hype,” according to Jonathan Koomey, an expert on the relationship between computing and energy use. (Koomey has even had a general rule about the subject named after him.) This lack of clarity around data center power requires that utilities, regulators, and policymakers take care when making choices.
Utilities in major data center markets are under pressure to spend billions of dollars on infrastructure to serve surging electricity demand. The problem, Koomey said, is that many of these utilities don’t really know which data centers will actually get built and where — or how much electricity they’ll end up needing. Rushing into these decisions without this information could be a recipe for disaster, both for utility customers and the climate.
Those worries are outlined in a recent report co-authored by Koomey along with Tanya Das, director of AI and energy technology policy at the Bipartisan Policy Center, and Zachary Schmidt, a senior researcher at Koomey Analytics. The goal, they write, “is not to dismiss concerns” about rising electricity demand. Rather, they urge utilities, regulators, policymakers, and investors to “investigate claims of rapid new electricity demand growth” using “the latest and most accurate data and models.”
Several uncertainties make it hard for utilities to plan new power plants or grid infrastructure to serve these data centers, most of which are meant to power the AI ambitions of major tech firms.
AI could, for example, become vastly more energy-efficient in the coming years. As evidence, the report points to the announcement from Chinese firm DeepSeek that it replicated the performance of leading U.S.-based AI systems at a fraction of the cost and energy consumption. The news sparked a steep sell-off in tech and energy stocks that had been buoyed throughout 2024 on expectations of AI growth.
It’s also hard to figure out whose data is trustworthy.
Companies like Amazon, Google, Meta, Microsoft, OpenAI, Oracle, and xAI each have estimates of how much their demand will balloon as they vie for AI leadership. Analysts also have forecasts, but those vary widely based on their assumptions about factors ranging from future computing efficiency to manufacturing capacity for AI chips and servers. Meanwhile, utility data is muddled by the fact that data center developers often surreptitiously apply for interconnection in several areas at once to find the best deal.
These uncertainties make it nearly impossible for utilities to gauge the reality of the situation, and yet many are rushing to expand their fleets of fossil-fuel power plants anyway. Nationwide, utilities are planning to build or extend the life of nearly 20 gigawatts’ worth of gas plants as well as delaying retirements of aging coal plants.
If utilities build new power plants to serve proposed data centers that never materialize, other utility customers, from small businesses to households, will be left paying for that infrastructure. And utilities will have spent billions in ratepayer funds to construct those unnecessary power plants, which will emit planet-warming greenhouse gases for years to come, undermining climate goals.
“People make consequential mistakes when they don’t understand what’s going on,” Koomey said.
Some utilities and states are moving to improve the predictability of data center demand where they can. The more reliable the demand data, the more likely that utilities will build only the infrastructure that’s needed.
In recent years, the country’s data center hot spots have become a “wild west,” said Allison Clements, who served on the Federal Energy Regulatory Commission from 2020 to 2024. “There’s no kind of source of truth in any one of these clusters on how much power is ultimately going to be needed,” she said during a November webinar on U.S. transmission grid challenges, hosted by trade group Americans for a Clean Energy Grid. “The utilities are kind of blown away by the numbers.”
A December report from consultancy Grid Strategies tracked enormous load-forecast growth in data center hot spots, from northern Virginia’s “Data Center Alley,” the world’s densest data center hub, to newer boom markets in Georgia and Texas.
Koomey highlighted one big challenge facing utilities and regulators trying to interpret these forecasts: the significant number of duplicate proposals they contain.
“The data center people are shopping these projects around, and maybe they approach five or more utilities. They’re only going to build one data center,” he explained. “But if all five utilities think that interest is going to lead to a data center, they’re going to build way more capacity than is needed.”
It’s hard to sort out where this “shopping” is happening. Tech companies and data center developers are secretive about these scouting expeditions, and utilities don’t share them with one another or the public at large. National or regional tracking could help, but it doesn’t exist in a publicly available form, Koomey said.
To make things more complicated, local forecasts are also flooded with speculative interconnection requests from developers with land and access to grid power, with or without a solid partnership or agreement in place.
“There isn’t enough power to provide to all of those facilities. But it’s a bit of a gold rush right now,” said Mario Sawaya, a vice president and the global head of data centers and technology at AECOM, a global engineering and construction firm that works with data center developers.
That puts utilities in a tough position. They can overbuild expensive energy infrastructure and risk whiffing on climate goals while burdening customers with unnecessary costs, or underbuild and miss out on a once-in-a-lifetime economic opportunity for themselves and their communities.
In the face of these risks, some utilities are trying to get better at separating viable projects from speculative ones, a necessity for dealing with the onslaught of new demand.
Utilities and regulators are used to planning for housing developments, factories, and other new electricity customers that take several years to move from concept to reality. A data center using the equivalent of a small city’s power supply can be built in about a year. Meanwhile, major transmission grid projects can take a decade or more to complete, and large power plants take three to five years to move through permitting, approval, procurement, and construction.
Given the mismatch in timescales, “the solution is talking to each other early enough before it becomes a crisis,” said Michelle Blaise, AECOM’s senior vice president of global grid modernization. “Right now we’re managing crises.”
Koomey, Das, and Schmidt highlight work underway on this front in their February report: “Utilities are collecting better data, tightening criteria about how to ‘count’ projects in the pipeline, and assigning probabilities to projects at different stages of development. These changes are welcome and should help reduce uncertainty in forecasts going forward.”
Some utilities still aren’t screening their load forecasts this carefully, however — and tech giants with robust clean energy goals such as Amazon, Google, and Microsoft are speaking up about it. Last year, Microsoft challenged Georgia Power on the grounds that the utility’s approach is “potentially leading to over-forecasting near-term load” and “procuring excessive, carbon-intensive generation” to handle it, partly by including “projects that are still undecided on location.” Microsoft contrasted Georgia Power’s approach with other utilities’ policies of basing forecasts on “known projects that have made various levels of financial commitment.”
Utilities also have incentives to inflate load forecasts. In most parts of the country, they earn guaranteed profits for money spent on power plants, grid expansions, and other capital infrastructure.
In fact, many utilities have routinely over-forecasted load growth during the past decade, even as actual electricity demand remained flat or increased only modestly. But today’s data center boom represents a different kind of problem, Blaise said — utilities “could build the infrastructure, but not as fast as data centers need it.”
In the face of this gap between what’s being demanded of them and what can be built in time, some utilities are requiring that data centers and other large new customers prove they’re serious by putting skin in the game.
The most prominent efforts on this front are at utility American Electric Power. Over the past year, AEP utilities serving Ohio as well as Indiana and Michigan have proposed new tariffs — rules and rates for utility customers — to deal with the billions of dollars of data center investments now flooding into those states.
AEP Ohio and state regulators proposed a settlement agreement in October that would require data centers to commit to 12-year agreements to “pay for a minimum of 85% of the energy they say they need each month — even if they use less.” The proposed tariff would also require prospective data center projects to provide more financial disclosures and to pay an exit fee if they back out or fail to meet their commitments.
A counterproposal filed in October by Google, Meta, Microsoft, and other groups would also set minimum power payments but loosen some other requirements.
AECOM’s Sawaya and Katrina Lewis, the company’s vice president and energy advisory director, laid out the differences between the competing proposals in a December opinion piece.
“On the one hand, the utility is seeking to protect other customer classes and reduce unnecessary investment by ensuring longer term commitments,” they wrote. “While on the other, Big Tech is looking to establish tariffs that drive innovation and growth through the appropriate grid investments without individual industries being singled out.”
AEP utility Indiana Michigan Power (I&M) has made more progress than AEP Ohio threading this needle. In November, it reached an agreement with parties including consumer advocates and Amazon Web Services, Google, Microsoft, and the trade group Data Center Coalition. That plan was approved by Indiana utility regulators last week.
A number of nitty-gritty details distinguish that deal from the disputes AEP Ohio is still wrangling over. One key difference is that I&M’s plan would apply to all large power-using customers, not just data centers.
But AECOM’s Lewis noted that data centers aren’t the only stakeholders to consider in making such a plan. Economic-development entities, city and county governments, consumer advocates, environmental groups, and other large energy customers all have their own opinions. Negotiations like those in Ohio and Indiana “will have to happen around the country,” she said. “They’re likely to be messy at first.”
That’s a good description of the various debates happening across the U.S. In Georgia, utility regulators last month approved a new rule requiring data centers of 100 megawatts or larger to pay rates that would include some costs of the grid investments needed to support those centers. But business groups oppose legislative efforts to claw back state sales tax breaks for data centers, including a bill passed last year that Republican Gov. Brian Kemp vetoed.
In Virginia, lawmakers have proposed a raft of bills to regulate data center development, and regulators have launched an investigation into the “issues and risks for electric utilities and their customers.” But Republican Gov. Glenn Youngkin has pledged to fight state government efforts to restrain data center growth, as have lawmakers in counties eager for the economic benefits they can bring.
But eventually, Koomey said, these policy debates will have to reckon with a fundamental question about data center expansion: “Who’s going to pay for this? Today the data centers are getting subsidies from the states and the utilities and from their customers’ rates. But at the end of the day, nobody should be paying for this except the data centers.”
Not all data centers bring the same economic — or social — benefits. Policies aimed at forcing developers to prove their bona fides may also need to consider what those data centers do with their power.
Take crypto mining, a business that uses electricity for number-guessing computations to claim ownership of units of digital currency. Department of Energy data suggests that crypto mining represents 0.6% to 2.3% of U.S. electricity consumption, and researchers say global power use for mining bitcoin, the original and most widely traded cryptocurrency, may equal that of countries such as Poland and Argentina.
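That “number-guessing” is a brute-force search: miners race to find an input whose hash falls below a network-set target, and the electricity goes into the sheer volume of guesses. A minimal sketch of the idea (simplified; real bitcoin mining uses double SHA-256 over block headers at astronomically higher difficulty):

```python
# Minimal illustration of proof-of-work mining: keep trying nonces until the
# hash of (data + nonce) falls below a difficulty target. Raising the
# difficulty multiplies the expected number of guesses -- and the energy used.
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 20) -> int:
    target = 1 << (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # this guess "wins"
        nonce += 1

print(mine(b"example block"))  # ~a million hashes on average at 20 bits
```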
Crypto mining operations are also risky customers. They can be built as rapidly as developers can connect containers full of servers to the grid and can be removed just as quickly if cheaper power draws them elsewhere.
This puts utilities and regulators in an uncomfortable position, said Abe Silverman, an attorney, energy consultant, and research scholar at Johns Hopkins University who has held senior policy positions at state and federal energy regulators and is a former executive at the utility NRG Energy.
“Nobody wants to be seen as the electricity nanny — ‘You get electricity, you don’t,’” he said. At the same time, “we have to be careful about saying utilities have to serve all customers without taking a step back and asking, is that really true?”
Taking aim at individual customer classes is legally tricky.
In North Dakota, member-owned utility Basin Electric Cooperative asked the Federal Energy Regulatory Commission for permission to charge crypto miners more than other large customers to reduce the risk that gigawatts of new “highly speculative” crypto load could fail to materialize. Crypto operators protested, and FERC denied the request in August, saying that the co-op had failed to justify the change but adding that it was “sympathetic” to the co-op’s plight.
Silverman noted that AEP Ohio’s proposed tariff also sets more stringent collateral requirements on crypto miners than on other data centers. That has drawn opposition from the Ohio Blockchain Council, a crypto trade group, which joined tech companies in the counterproposal to the proposed tariff.
It’s likely that the industry, boosted by its influence with the crypto-friendly Trump administration, will challenge these kinds of policies elsewhere.
Then there’s AI, the primary cause of the dramatic escalation of data center load forecasts over the past two years. In their report, Koomey, Das, and Schmidt take particular aim at forecasts from “influential management consulting and investment advising firms,” which, in Koomey’s view, have failed to consider the physical and financial limitations to unchecked AI expansion.
Such forecasts include those from McKinsey, which has predicted trillions of dollars of near-term economic gains from AI, and from Boston Consulting Group, which projects that data centers could use as much power as two-thirds of all U.S. households by 2030.
But these forecasts rely on “taking some recent growth rate and extrapolating it into the future. That can be accurate, but it’s often not,” Koomey said. These consultancies also make money selling services to companies, which incentivizes them to inflate future prospects to drum up business. “You get attention when you make aggressive forecasts,” he said.
Good forecasts also must consider hard limits to growth. Koomey helped research December’s “2024 United States Data Center Energy Usage Report” from DOE’s Lawrence Berkeley National Laboratory, which used real-world data on shipments and orders of the graphical processing units (GPUs) used by AI operations today, as well as forecasts of how much more energy-efficient those GPUs could become.
That forecast found data centers are likely to use from 6.7% to 12% of total U.S. electricity by 2028, up from about 4.4% today.
To be clear, those numbers contain an enormous range of uncertainty, and nationwide estimates don’t help individual utilities or regulators figure out what may or may not get built in their territories. Still, “we think this is a pretty reliable estimate of what energy demand growth is going to look like from the sector,” said Avi Shultz, director of DOE’s Industrial Efficiency and Decarbonization Office. “Even at the low end of what we expect, we think this is very real.”
Koomey remains skeptical of many current high forecasts, however. For one, he fears they may be discounting the potential for AI data centers to become far more energy-efficient as processors improve — something that has happened reliably throughout the history of computing. Similar mistakes informed wild overestimates of how much power the internet was going to consume back in the 1990s, he noted.
“This all happened during the dot-com era,” when “I was debunking this nonsense that IT [information technology] would be using half of the electricity in 10 years.” Instead, data centers were using only between 1% and 2% of U.S. electricity as of 2020 or so, he said. The DeepSeek news indicates that AI may be in the midst of its own iterative computing-efficiency improvement cycle that will upend current forecasts.
Speaking of the dot-com era, there’s another key factor that today’s load growth projections fail to take into account — the chance that the current AI boom could end up a bust. A growing number of industry experts question whether today’s generative AI applications can really earn back the hundreds of billions of dollars that tech giants are pouring into them. Even Microsoft CEO Satya Nadella spoke last week about an “overbuild” of AI systems.
“Over the next several years alone, we’re going to spend over a trillion dollars developing AI,” Jim Covello, head of stock research at investment firm Goldman Sachs, said in a discussion last year with Massachusetts Institute of Technology professor and Nobel Prize–winning economist Daron Acemoglu. “What trillion-dollar problem is AI going to solve?”
In part 3 of this series, Canary Media reports on how to build data centers that won’t make the grid dirtier.