
October 8, 2025 | Sandy Dornsife
We recently discussed data centers’ NIMBY (“Not In My Back Yard”) problem. However, one solution is to avoid the NIMBYs altogether by building massive data centers far away from populous areas. And that seems to be the game plan for the major AI players investing many billions, if not trillions, of dollars into ginormous data centers.
How will they power these behemoths? As a McKinsey report put it in 2024: “The power sector is rapidly becoming a protagonist in the AI story.” But first, with data center construction now outpacing office buildings, let’s get an idea of the scale we’re talking about here.
The undisputed leader in AI right now is OpenAI with its ubiquitous ChatGPT. The company is currently finishing up construction on a data center complex in Abilene, Texas, that is expected to suck up 0.9 gigawatts (GW) of electricity. But as the WSJ reports, “Company executives made clear that the Abilene site was just the beginning, noting that they envision a need for more than 20 gigawatts of computing capacity to meet the explosive demand for ChatGPT. . . . Demand is likely to eventually reach closer to 100 gigawatts, one company executive said, which would be $5 trillion” in cost.
That final 100 GW projection from an OpenAI executive is an enormous amount of power. Energy demand for the entire U.S. data center sector was 25 GW in 2024, so even OpenAI’s shorter-term goal of 20 GW would nearly double that figure on its own. And 100 GW would be more than 100 times the capacity of the company’s Abilene complex, roughly 8% of total U.S. electricity generation capacity today, and enough to power approximately 75 million U.S. homes, which is more than half of all U.S. households.
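The comparisons above are back-of-envelope arithmetic, which we can sketch in a few lines of Python. The Abilene (0.9 GW), 2024 sector demand (25 GW), and 100 GW figures are from the reporting cited above; the total U.S. generation capacity (~1,250 GW) and average household draw (~1.3 kW) are rough assumptions used only to illustrate the scale.

```python
# Back-of-envelope check of the scale comparisons above.
# Cited figures: Abilene complex, 2024 sector demand, projected demand.
# Assumed figures (not from the article): US capacity, avg household draw.

ABILENE_GW = 0.9          # OpenAI's Abilene, TX complex
SECTOR_2024_GW = 25.0     # entire US data center sector, 2024
PROJECTED_GW = 100.0      # long-term demand cited by an OpenAI executive
US_CAPACITY_GW = 1250.0   # rough total US generation capacity (assumption)
AVG_HOME_KW = 1.3         # rough average household draw (assumption)

print(f"{PROJECTED_GW / ABILENE_GW:.0f}x the Abilene complex")   # ~111x
print(f"{PROJECTED_GW / US_CAPACITY_GW:.1%} of US capacity")     # ~8%
homes_millions = PROJECTED_GW * 1e6 / AVG_HOME_KW / 1e6          # GW -> kW, then homes
print(f"~{homes_millions:.0f} million homes powered")            # ~77 million
```

Note that the "75 million homes" figure is sensitive to the assumed per-household draw; anything in the 1.2–1.4 kW range lands in the mid-70s to low-80s of millions.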
And OpenAI is not alone. Elon Musk’s xAI built a 0.3 GW data center in Memphis — dubbed Colossus 1 — in only 122 days. The company has already started construction on Colossus 2, which will have 1 GW of capacity powered by a gas turbine plant across the border in Mississippi (after receiving pushback in Tennessee). And the remaining AI players are determined to keep pace. Meta has turned to quickly constructing data centers in tents, adding a 1 GW cluster in Ohio and a 2 GW cluster in Louisiana. Google already had a massive data center presence but is adding another 1 GW of capacity by the end of the year. And Anthropic is partnering with Amazon’s AWS to bring on 1.3 GW of data center capacity. These companies are pairing their already huge revenue streams with outside investors clamoring to pour money into data center construction, putting infrastructure spending on a scale comparable to the 19th-century railroad build-out.
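As a quick sanity check on the combined scale of these announcements, here is a minimal tally of the non-OpenAI capacity figures cited above, compared against the 25 GW the entire U.S. data center sector drew in 2024. These are only the numbers mentioned in this article (Google's and Meta's broader footprints are not included), so the total understates actual plans.

```python
# Tally of the announced capacity figures cited above (GW).
# Partial by construction: only the projects named in this article.
announced_gw = {
    "xAI Colossus 1 (Memphis)": 0.3,
    "xAI Colossus 2 (Mississippi)": 1.0,
    "Meta (Ohio)": 1.0,
    "Meta (Louisiana)": 2.0,
    "Google (new capacity this year)": 1.0,
    "Anthropic / AWS": 1.3,
}

total = sum(announced_gw.values())
print(f"Announced (ex-OpenAI): {total:.1f} GW")              # 6.6 GW
print(f"Share of 2024 sector demand: {total / 25.0:.0%}")    # ~26%
```

Even this incomplete list amounts to roughly a quarter of 2024's entire sector demand, before counting OpenAI's 20–100 GW ambitions.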
If the rest of the AI industry has plans anywhere near OpenAI’s stated goals, where will all these gigawatts come from? As we discussed earlier, electricity consumption on the U.S. power grid has remained flat over the past 20 years. That’s clearly about to change, and the sluggish regulatory process for bringing new power generation online and connecting it to the grid will not cut it going forward.
The good news is that these massive data center clusters will not necessarily be connected to the grid. At the speed these hyperscalers want to bring compute online, there’s no way they’ll wait their turn in the interconnection queue, which can stretch for years, to get new power generation connected to the grid. Instead, the clusters cited above are largely relying on on-site (“behind-the-meter”) generation, which doesn’t necessarily require connection to the larger electrical grid or the regulatory headaches and slow timelines that come with it. Another proposal is to power data centers with the grid’s excess capacity, since the grid only rarely runs at full output. But that kind of “demand response” — with data centers curtailing usage when the grid is strained — will require complex planning, coordination, and grid connections to pull off.
Nonetheless, an electricity build-out this large, combined with the broader electrification of the economy, cannot avoid straining the country’s already fragile electrical grid. And if states can’t bring enough new generation online to meet this demand, policymakers will have to answer to ratepayers watching their electricity bills climb. Lawmakers are already starting to push back.
In 2025, Virginia enacted a measure (VA HB 2084) directing state regulators to determine whether utilities should create a special rate for specific customers like data centers. The idea of charging data centers extra fees to ease the burden on other ratepayers is gaining steam in other states as well. Texas, for example, enacted a law (TX SB 6) requiring data centers to curtail operations when demand on the grid is high.
Electricity rates around the country are very likely to increase in the coming years. The reasons are multifaceted and complex, and while data centers are certainly a contributing factor, we’re willing to predict that they’ll receive an inordinate share of the blame from both political parties and the public at large. But considering the insatiable demand for data center capacity, this compute will get built somewhere. If these data centers — a key to training and running future AI models — aren’t built in the U.S., they’ll be built in places like the UAE and Saudi Arabia.
This article appeared in our Morning MultiState newsletter on September 30, 2025. For more timely insights like this, be sure to sign up for our Morning MultiState weekly morning tipsheet. We created Morning MultiState with state government affairs professionals in mind — sign up to receive the latest from our experts in your inbox every Tuesday morning. Click here to sign up.