Biggest myths about AI data centers
Across the country, Americans are flocking to city council, planning commission, and water board meetings to try to stop data centers from being built. Driving this NIMBYism are two main arguments: that data centers will drive up electricity rates, and that they will guzzle local water supplies. Data centers are prodigious consumers of both power and water, so these concerns are understandable. They are also (mostly) wrong.
Let’s examine them in order.
First, data centers have historically reduced inflation-adjusted electricity rates, or at least kept them in check. They’ve done this because power markets in the U.S. do not operate on simple supply and demand. For power to be available whenever you want it, the grid has to produce and carry at least as much electricity as customers collectively demand at any given moment. But that demand fluctuates heavily, many large power plants can’t be turned on or off at a moment’s notice, and solar and wind power aren’t always available when they are needed most. Utilities therefore build more generating capacity than average demand requires in order to stay reliable during peaks. As a result, America’s utilities operate at an average load factor – the energy actually consumed as a share of what could have been consumed had demand stayed at its peak – of just 53 percent, according to a recent analysis from Duke University.*
Unlike variable loads such as lighting or air conditioning, data center electricity demand is both large and steady, allowing utilities to use more of their existing capacity. With more electricity flowing through the grid, fixed generation and infrastructure costs are spread across more kilowatt-hours sold, reducing costs for everyone. This helps explain why Charles River Associates recently found that data center buildouts did not trigger increases in retail utility rates over the past decade.
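A rough sketch of why steadier demand lowers per-unit costs, again with hypothetical numbers: the utility’s fixed costs stay the same, but they are divided over more kilowatt-hours sold as the load factor rises.

```python
# Hypothetical example: the same fixed grid and generation costs spread over energy sold.
fixed_costs = 100_000_000   # annual fixed costs in dollars (illustrative)
peak_demand_mw = 1000       # peak capacity, as above

def fixed_cost_per_mwh(load_factor: float) -> float:
    """Fixed cost per MWh sold, given how much of peak capacity is used on average."""
    mwh_sold = peak_demand_mw * load_factor * 8760   # 8,760 hours in a year
    return fixed_costs / mwh_sold

print(f"At 53% load factor: ${fixed_cost_per_mwh(0.53):.2f}/MWh")  # ~$21.54/MWh
print(f"At 65% load factor: ${fixed_cost_per_mwh(0.65):.2f}/MWh")  # ~$17.56/MWh
```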
The speed of the current AI data center buildout could imperil this historical trend, however. If demand rises faster than the grid can adapt, costs will almost certainly rise for ratepayers. But that outcome can be avoided if policymakers incentivize or require data center operators to install their own on-site generation or battery storage, or mandate that they pay for the grid upgrades needed to deliver the power they use. In part because of public backlash, large data center operators by and large have not resisted these measures.
“The hyperscalers are increasingly on board with paying for any necessary grid and generation upgrades,” Brian Potter, Senior Infrastructure Fellow at the Institute for Progress, told RealClearScience.
Worries over rising electricity rates typically attract the most attention, but close behind are concerns that data centers will drain local water supplies. Here, too, Potter says the anger is overblown.
“Data centers use a lot of water compared to a single family home, but not all that much when you compare it to other industrial uses (which I think is the proper comparison). Data centers use less water than golf courses, less water than steel mills. The state of Arizona alone uses on the order of six to seven times as much water as data centers do for growing crops in the desert, and the data centers are generating vastly more economic value than growing alfalfa.”
AI data centers are being built, whether most Americans like them or not. Around 3,000 were planned or under construction as of December. The good news is that there actually seems to be more to like than to hate.
*This section was corrected 3/21 to clarify the meaning of “load factor” and explain how the power grid actually works. H/T to retired electrical engineer and RCS reader Ken Davis.