
Stressed US grid forcing data centers to get more flexible
News Clip · Jefferson City News Tribune · NY · 3/27/2026
The U.S. technology industry is under increasing pressure to make data centers more flexible in their power consumption to alleviate strain on the national grid. Amid growing public concern over Big Tech's massive electricity needs, regulators and utilities are pushing for demand response strategies. This shift aims to prevent blackouts, manage surging power bills, and accelerate agreements for new data center connections.
Topics: electricity · Amazon, Google, Nvidia, Meta · Gov: Department of Energy, PJM Interconnection
The U.S. technology industry is facing mounting pressure to reduce the power consumption of its data centers during periods of high electricity demand, driven by growing public concern that these facilities are maxing out the country's power grid. Utilities and regulators are increasingly urging tech companies to adopt demand response practices, which involve scaling back energy use at data centers when requested by grid operators.
Though such programs remain largely in pilot stages, the goal of making data centers more flexible is to prevent blackouts and mitigate surging power bills during peak demand. Industry experts, including U.S. Energy Secretary Chris Wright, emphasize the critical need to meet electricity demand to avoid crises. The Electric Power Research Institute (EPRI) projects that data center electricity use could quadruple by 2030, consuming up to 17 percent of U.S. power supplies. PJM Interconnection, which covers the world's largest data center market, anticipates supply shortages as early as next year.
This increased flexibility could save $40 billion to $150 billion in capital investments over the next decade, ultimately reducing costs for households and small businesses that would otherwise bear the expense of grid build-outs for data centers, according to research from Duke University. Companies like Carlyle, an investor in data centers, believe demand response must be a key part of the solution.
Traditionally, cloud data centers have required constant power, but newer AI-focused facilities may offer more flexibility by shifting energy-intensive workloads across locations or by drawing on backup power. Google has already announced contracts to lower consumption at some facilities, and Nvidia has partnered with Emerald AI on initiatives to manage power usage. Meta also contributed to an EPRI framework released this week that outlines how data centers can become more adaptable to grid demands, with the aim of speeding up connection times for new facilities.