
How America Can Power the AI Revolution

The adage “knowledge is power” could be recast: knowledge consumes power. Throughout history, the advance of knowledge has led to the invention of new products and services. These innovations inevitably increase energy consumption. With the invention of useful artificial intelligence (AI), we have another example of that truth.

AI-focused data centers, more than any other single factor, are driving growth in regional and national electricity demand at a rate not seen in half a century. The boom has caused the Federal Energy Regulatory Commission (FERC) to more than triple its formerly tepid forecast for growth in U.S. power demand by 2030. The lower end of its growth estimate would be equivalent to adding about five times New York City’s peak power usage.
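As a quick sanity check of that comparison, here is a back-of-the-envelope sketch; the roughly 10 GW figure for New York City’s peak load is an assumption for illustration, not a number from the article, and the 50 GW low end comes from the FERC-cited range discussed later in the piece:

```python
# Back-of-the-envelope: how many "New York Cities" of peak demand
# does the low end of the forecast growth represent?
nyc_peak_gw = 10          # assumed NYC peak load, roughly 10 GW
low_end_growth_gw = 50    # low end of the 50-130 GW range cited later
multiple = low_end_growth_gw / nyc_peak_gw
print(multiple)  # 5.0
```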


The scale of digital power is truly daunting. Consider the energy demands of other big facilities: the world’s tallest skyscraper, the Burj Khalifa, requires about 40 megawatts (MW); an electric steel mill, about 100 MW; a giant TSMC semiconductor fabrication plant, about 200 MW; and a giant oil refinery or LNG export terminal, about 500 MW. In the U.S., we’re building only a few such facilities at any given time. Meanwhile, over the next few years in the U.S. alone, construction is underway on dozens, and soon likely hundreds, of AI-centric data centers, each requiring 200 MW to 1,000 MW.

Companies ranging from Anthropic—the four-year-old AI startup valued at nearly $200 billion—to OpenAI (valued at some $300 billion), Google ($2.5 trillion), and numerous other tech giants (collectively, more than $20 trillion) have all weighed in with national energy plans to ensure power for “a generational opportunity to build a new era of American innovation and growth,” as Google put it. Anthropic’s July 2025 roadmap, like many others, notes that AI “will require a broader-based effort to unlock energy and data center buildouts around the country,” including the need to accelerate “geothermal, natural gas, and nuclear permitting.” [emphasis added] It is a sign of the times that Big Tech is calling for more conventional power plants.

The top question for pro-growthers: What can be done to ensure that America can power the AI and data-center revolution? The same concern has attended the emergence of all radically new technologies over history.

For example, economist William Stanley Jevons’s iconic 1865 paper, “The Coal Question,” was animated by worries that the “exhaustion of coal” would end the prosperity wave driven by the invention of steam engines for factories and ships. More recently, the 1973 Arab oil embargo engendered a half-century of handwringing about “peak oil,” fears that oil scarcity would stall the booming automobile and aviation sectors. And in 1978, Congress passed the Fuel Use Act, banning the consumption of natural gas in electric power plants because of fears of imminent exhaustion of that fuel, which the (politically important) residential sector craved after a 250 percent rise in gas demand over the prior few decades.

But today, there is finally widespread recognition of the “bottomless well” (to borrow the title of a book I coauthored with Peter Huber) of underlying energy resources. That’s why the no-growth greens have switched from preaching “peak oil” to a “keep it in the ground” campaign. Thus, when it comes to building the energy systems needed to fuel growth, the issue is no longer about the exhaustibility of resources but about the need to weigh choices and tradeoffs, while minimizing government friction.

Much of that friction comes from a cohort of influential analysts fearful of any growth in fuel use, anywhere. For these thinkers, the key question can be summarized as: Should private markets be allowed to do what’s needed to power the AI and data-center revolution?

Stories and articles about AI’s “excessive” energy use are proliferating in the technical literature and popular media. The headline of a recent article by the editors of Scientific American cuts to the chase: “Artificial Intelligence uses too much energy.” The article asserts that AI’s “skyrocketing” energy use is “exorbitant” and antithetical to the need to “keep the energy footprint of the U.S. reined in.” Or consider a French-government-commissioned analysis that—echoing many others, such as one from the International Energy Agency—proposes schemes to encourage or enforce “digital sobriety” to curb the market’s appetite for digital services, and thus energy.

It should be obvious that energy-constrained “digital sobriety” would merely choke expansion. Statist inclinations to constrain freedom of choice are not new to the energy debates. But this is different. It’s not about regulating, say, how many flights you take on vacation each year, or your thermostat setting, but about throttling a nascent revolutionary technology.

Venture capitalist Marc Andreessen has called the global race to “dominate” AI our modern “Sputnik moment.” Space travel, at least to near-Earth orbit, has yielded manifold benefits—GPS and superior weather forecasts, to note just two. And rocket ships are epic energy hogs. But the 1957 Sputnik moment is really the wrong analogy.

In infrastructure terms, facilitating the AI boom is more like the 1956 Highway Act or, in technology terms, the invention of the plane that kicked off the boom in modern aviation—the Boeing 707.

The 1958 introduction of the 707, the first practical commercial jet aircraft, made possible high-altitude (more comfortable and above-the-weather) long-distance flights, propelling the transatlantic travel boom. While passenger aviation was by then nearly a four-decade-old industry, the reliable jet engine was the pivotal invention enabling global air traffic to soar—as did aviation fuel use, up some 500 percent since 1958 (during which time aviation fuel efficiency tripled).

Similarly, when President Dwight Eisenhower signed the 1956 Highway Act, the automobile age was four decades old and the nation was already festooned with thousands of miles of local roadways. That Act’s goal of building 40,000 miles of interstate superhighways was unambiguously both economic and strategic—the legislation’s full title is the National Interstate and Defense Highways Act. Few doubt that it unleashed freedom and economic mobility and facilitated growth. Those benefits entailed a 300 percent rise in overall highway fuel use since 1956, despite a doubling in the energy efficiency of car engines over that period.

Consider some parallels.

A single AI server rack, a refrigerator-sized assembly of silicon chips and the attendant supporting hardware, can weigh as much as a car, though each rack uses as much energy annually as 100 cars. A single data center hosts hundreds or thousands of such racks.


The hundreds of mega-scale data centers under construction in the U.S. are joining thousands of existing smaller data centers, all woven together by a telecommunication network—made up of hundreds of thousands of miles of physical cables and wireless “roads”—that is itself power hungry. You might guess where this logic is going: $30 billion is about what it costs to build 1,000 miles of car-carrying superhighway (including the cost of the cars it can carry). It also costs about $30 billion to build a single 1 GW data center (including the cost of the AI chips). That single data center will consume twentyfold more energy a year than the annual auto traffic on that span of highway.

Thus, the $64,000 question for the twenty-first century: Are we at the equivalent of the 1958 starting point in building out a new information highway infrastructure, or are we approaching the equivalent of 1992, when the last section of the original interstate network was completed? That question should answer itself.

The challenge now lies in estimating how much more power the nation will require. The FERC forecast draws on a range of independent estimates suggesting the need for between 50 GW and 130 GW of extra generating capacity. We already know that demand will exceed the lower end of that range, since a detailed national survey from a team at the University of Southern California’s Marshall School of Business found some 55 GW of new data-center projects already committed or under construction.

Pro-growthers worry that the nation won’t be able to build energy infrastructure fast enough. The “digital sobriety” crowd worries that it will.

The latter group frets especially about how conventional power plants will be used to meet most digital demand—and for good reason. Consider that last year, 30 GW of solar accounted for nearly two-thirds of the power capacity added to the nation’s utility grids. That sounds like a lot, but because of solar’s inherent intermittency, the energy-producing capability of that 30 GW equals less than 8 GW of conventional generation capacity. Building 30 GW of data centers creates a need for 30 GW of round-the-clock energy production, not 8 GW.
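The intermittency arithmetic can be sketched as follows. The roughly 25 percent solar capacity factor and roughly 95 percent availability for a dispatchable plant are illustrative assumptions, not figures from the article:

```python
# Convert 30 GW of solar nameplate capacity into its round-the-clock
# conventional-generation equivalent.
solar_nameplate_gw = 30
solar_capacity_factor = 0.25       # assumed: average share of nameplate solar actually delivers
conventional_availability = 0.95   # assumed: uptime of a dispatchable plant

avg_solar_output_gw = solar_nameplate_gw * solar_capacity_factor
equivalent_conventional_gw = avg_solar_output_gw / conventional_availability
print(f"{equivalent_conventional_gw:.1f} GW")  # just under 8 GW
```

Under these assumptions, 30 GW of solar nameplate capacity delivers the annual energy of a bit less than 8 GW of conventional plant, consistent with the figure in the text.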

So far, we’re meeting the digital power challenge with the proverbial “all of the above.” A handful of projects (in the sunny Southwest) will use utility-scale solar, with huge battery arrays. At the other end of energy’s political spectrum, numerous planned retirements of coal plants have been reversed. There have also been highly publicized plans to restart a few retired nuclear plants, as well as a rush to invest in yet-to-be-proven small modular reactors, most of which can provide only about 0.1 GW.

But more than 50 GW of new data-center demand will be online long before any such plans are realized. With the need to light up hundreds of megawatts of silicon in the year or two it takes to build a mega-scale data center, we’re seeing the market overweight natural gas.

The three major vendors of utility-scale gas-fired turbines are sold out through 2030. Expanding their manufacturing capacity takes time, given the complexities of building massive combustion turbines. Thus, many data-center builders have turned to huge numbers of smaller, easier-to-build turbines, and to ordering massive diesel engines (the kinds that power ships and mines around the world) capable of burning natural gas. Caterpillar, for example, announced on August 7 a partnership with a Utah developer to supply some 4 GW of diesel engine generation.

From a purely technical viewpoint, we know the engineering capabilities exist to supply the levels of power needed—and in the timeframes needed. For perspective, some 150 GW of power capacity is manufactured every year by the industry that supplies diesel truck engines. Each such engine is about 0.5 MW and similar to those used to generate electricity. The constraints to deployment are mainly regulatory and political.
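For a rough sense of that manufacturing scale, here is a sketch using only the figures in the paragraph above:

```python
# How many 0.5 MW truck-class engines does 150 GW of annual
# manufacturing capacity imply?
annual_capacity_gw = 150   # engine capacity manufactured per year, per the article
engine_rating_mw = 0.5     # per-engine rating cited
engines_per_year = annual_capacity_gw * 1_000 / engine_rating_mw  # GW -> MW, then divide
print(f"{engines_per_year:,.0f} engines per year")  # 300,000
```

That is, the existing truck-engine industrial base already turns out roughly 300,000 generator-class engines’ worth of capacity every year, which is why the deployment constraint is regulatory and political rather than technical.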

When it comes to the policy implications for delivering the power needed, the “digital sobriety” crowd is eager to frame the challenge as partisan. A recent article in The Atlantic asserted that the administration “solicited input from AI firms, civil-society groups, and everyday citizens,” indicating that the “White House is clearly deferring to the private sector, which has close ties to the Trump administration.”

It is no surprise that the private sector is seeking to influence and inform the government, given the hundreds of billions of dollars of private capital involved, and the unavoidable role that policies play (often an impeding one) at these scales. But the nation will be better served if the capital does not come from the public purse.

The good news: Big Tech seems to be tilting hard toward market-based solutions for deploying its ample capital. Thus, it’s also no surprise that the debate around Big Tech and Big Energy is becoming embroiled in naked politicking. As a recent Wall Street Journal headline put it: “Democrats Try to Halt Silicon Valley’s Swing to the Right.”

Remember the backdrop. New vectors for massive energy demand have always followed the invention of transformational technologies—steamship, lightbulb, automobile, airplane, air conditioner, fertilizer, and pharmaceuticals. Such innovations are what drove the great expansion of well-being in earlier times. The innovation-energy linkage is seen as a Faustian bargain by the “digital sobriety” cohort, exposing its implicit (sometimes explicit) antipathy to growth and abundance.

Energy’s bottomless well has fueled and can continue to fuel prosperity—provided, to quote the late, great political analyst Charles Krauthammer, “we get the politics right.”


