Jeff Bezos Warns the AI Data Center Rush Is Repeating a 110-Year-Old Mistake
Bezos: Private AI Data Centers Are a Massive Waste

Jeff Bezos, the founder of Amazon, has issued a stark warning to the tech industry, labelling the current frenzy to build private, dedicated data centres for artificial intelligence as a colossal misallocation of resources. Speaking at the 2026 New York Times DealBook Summit, he drew a direct parallel to a costly industrial error from over a century ago.

The Historical Mistake: From Power Plants to Data Centers

Bezos argued that major companies like Meta and OpenAI, which are constructing their own massive computing facilities, are repeating the mistake of early 20th-century factories. Before the advent of widespread electrical grids, factories were forced to generate their own power, a highly inefficient and costly endeavour.

"Right now, everybody is building their own data center, their own generators essentially. And that's not going to last," Bezos stated. He illustrated his point with an anecdote from a visit to a 300-year-old brewery in Luxembourg, which once operated its own power plant—a relic of a bygone, less efficient era.

The Amazon executive predicts an inevitable industry pivot towards centralized, utility-like cloud computing services, such as Amazon Web Services (AWS). He emphasised that these platforms offer vastly superior efficiency and scale compared to isolated, company-specific infrastructure projects.

AI's Insatiable Appetite for Power Reshapes Global Energy

The urgency behind Bezos's comments is underscored by striking figures on AI's energy consumption. In 2024 alone, data centres worldwide consumed approximately 415 terawatt-hours (TWh) of electricity, accounting for about 1.5% of global demand. Projections indicate this figure will more than double by 2030, reaching 945 TWh, roughly equivalent to the entire annual electricity consumption of Japan.

The impact is particularly acute in the United States, where data centre electricity use is forecast to leap from 4% to a staggering 12% of national demand. The computational hunger of AI models is at the core of this surge. For instance, a single query to ChatGPT uses nearly ten times the energy of a standard Google search. Furthermore, training just one large AI model can consume as much power as 200 American homes use in a year.
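The figures above hang together, as a quick back-of-the-envelope check shows. The sketch below uses only numbers cited in this article, with one exception: the per-household consumption value is an assumed round figure for a US home, not something the article states.

```python
# Back-of-the-envelope check of the energy figures cited above.
# All inputs except HOME_KWH_PER_YEAR come from the article;
# HOME_KWH_PER_YEAR is an assumed round value for a US household.

DC_2024_TWH = 415          # global data-centre electricity use, 2024
GLOBAL_SHARE_2024 = 0.015  # ~1.5% of worldwide demand
DC_2030_TWH = 945          # projected use in 2030

# Total global electricity demand implied by the 2024 figures
global_demand_twh = DC_2024_TWH / GLOBAL_SHARE_2024
print(f"Implied global demand, 2024: {global_demand_twh:,.0f} TWh")  # ~27,667 TWh

# "More than double by 2030" checks out
growth = DC_2030_TWH / DC_2024_TWH
print(f"Growth factor 2024 -> 2030: {growth:.2f}x")  # ~2.28x

# "Training one large model = 200 US homes for a year", in energy terms
HOME_KWH_PER_YEAR = 10_500  # assumption, not from the article
training_gwh = 200 * HOME_KWH_PER_YEAR / 1e6
print(f"Implied energy to train one large model: {training_gwh:.1f} GWh")
```

The implied global total of roughly 27,700 TWh is consistent with the Japan comparison: the projected 945 TWh would indeed be a few percent of worldwide demand.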

This exploding demand has triggered a global scramble for solutions, ranging from long-term nuclear power agreements to futuristic concepts like space-based data centres.

Industry Leaders Echo: Energy is the True AI Bottleneck

Bezos is not a lone voice in highlighting this critical challenge. Other tech CEOs have publicly acknowledged that energy, not just advanced chips, is becoming the fundamental constraint on AI progress.

Microsoft's Satya Nadella has pointed out that valuable GPU processors often sit idle due to a lack of power infrastructure, succinctly noting, "I don't have warm shells to plug into." OpenAI's Sam Altman has called for radical breakthroughs in energy technology, such as nuclear fusion or significantly cheaper solar power. Google CEO Sundar Pichai has also identified power as "the long-term bottleneck for AI."

The industry is already reacting at a strategic level. Meta recently secured nuclear power agreements targeting 6.6 gigawatts of capacity by 2035 to supply its Prometheus AI supercluster in Ohio. Similarly, Alphabet (Google's parent company) acquired Intersect Power for $4.75 billion to develop co-located energy and computing infrastructure.
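To put the 6.6 gigawatts in context, a rough conversion to annual energy is sketched below. The capacity factor is an assumption on my part (nuclear plants typically run near 90% of the time); the article gives only the capacity target.

```python
# Rough conversion of the reported 6.6 GW nuclear capacity target into
# annual energy output. CAPACITY_FACTOR is assumed, not from the article.

CAPACITY_GW = 6.6
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.90  # typical for nuclear plants; an assumption here

annual_twh = CAPACITY_GW * HOURS_PER_YEAR * CAPACITY_FACTOR / 1000
print(f"~{annual_twh:.0f} TWh per year")

# Compared with the projected 945 TWh of global data-centre demand in 2030
share = annual_twh / 945
print(f"~{share:.0%} of projected 2030 global data-centre demand")
```

On these assumptions, one company's nuclear deals would cover only a mid-single-digit percentage of projected 2030 data-centre demand, which underlines how large the overall energy gap is.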

In his concluding remarks, Bezos positioned AI as a "horizontal enabling layer" akin to electricity itself. However, he issued a clear warning: the infrastructure supporting this new technological layer must undergo the same centralised efficiency revolution that transformed global industry a century ago, moving away from fragmented, private solutions to robust, shared utilities.