Nvidia's $100bn OpenAI Bet: A Game-Changer or a Risky Loop?

The announcement on September 22nd that chipmaking giant Nvidia may invest up to $100 billion in OpenAI has sent shockwaves through Silicon Valley and global markets. This colossal deal, set to begin in the second half of 2026, is designed to help OpenAI purchase 4 to 5 million of Nvidia's advanced AI chips. The move underscores a deepening, and some say incestuous, interdependence among tech titans, coming just days after Nvidia pledged a $5 billion investment in its rival, Intel.

The Mechanics of a Mega-Deal

Under the proposed partnership, Nvidia will invest in tranches of $10 billion for every gigawatt (GW) of data-centre capacity that OpenAI builds using Nvidia's technology, up to a cap of 10GW. In essence, Nvidia would be funding its own GPU sales. Pierre Ferragu of New Street Research calculates that for every $35 billion of GPUs OpenAI buys, Nvidia invests $10 billion in return for equity, meaning OpenAI pays roughly 71% of each purchase in cash ($25 billion of the $35 billion) and the remaining 29% in its privately held shares.
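
A quick back-of-the-envelope check of those percentages, using only the $35 billion and $10 billion figures from Ferragu's estimate (the function and its defaults are purely illustrative, not deal terms):

```python
# Back-of-the-envelope check of the cash/equity split implied by Pierre
# Ferragu's estimate: Nvidia invests $10bn (taken as equity in OpenAI) for
# every $35bn of GPUs OpenAI buys. All figures in billions of dollars.

def cash_equity_split(gpu_purchase_bn: float = 35.0,
                      nvidia_investment_bn: float = 10.0) -> tuple[float, float]:
    """Return the share of a GPU purchase paid in cash vs. in OpenAI equity."""
    cash_bn = gpu_purchase_bn - nvidia_investment_bn        # cash OpenAI must find itself
    cash_share = cash_bn / gpu_purchase_bn                  # ~0.714 -> "71% in cash"
    equity_share = nvidia_investment_bn / gpu_purchase_bn   # ~0.286 -> "29% in shares"
    return cash_share, equity_share

cash, equity = cash_equity_split()
print(f"cash: {cash:.0%}, equity: {equity:.0%}")  # cash: 71%, equity: 29%
```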

Nvidia's CEO, Jensen Huang, noted that an extra 5 million GPUs would roughly equal the company's total shipments for this year, giving a substantial boost to sales. The deal also has an unspoken strategic benefit: it makes OpenAI more reliant on Nvidia's hardware, potentially discouraging the AI lab from developing its own chips. Following the news, Nvidia's share price jumped nearly 4%, pushing its valuation close to $4.5 trillion.

Mounting Concerns and Cash Constraints

However, the scale of the deal has raised eyebrows. Stacy Rasgon of Bernstein pointed out in a CNBC interview that it exacerbates worries about the "circular dynamics" of Nvidia investing in the very companies it supplies. The arrangement also highlights the gap between OpenAI's ambitious spending and its revenue. Despite ChatGPT boasting over 700 million weekly active users, OpenAI's annual revenue is around $13 billion, dwarfed by its commitments.

OpenAI's spending spree includes a $300 billion deal with cloud-computing giant Oracle to build 4.5GW of data-centre capacity over five years from 2027. To finance its construction boom, Oracle launched an $18 billion bond sale on September 24th. While OpenAI's boss, Sam Altman, claims that financing is secured for these $400 billion of projects, the fact that the firm is paying partly in its own shares deepens concerns about its cash constraints.

The Colossal Infrastructure Challenge

The physical scale of these plans is staggering. The additional 10GW of power capacity OpenAI seeks amounts to almost half of all the utility-scale generating capacity added in the US in the first half of this year, or the equivalent of roughly ten nuclear power plants (a typical large reactor produces about 1GW). Building this infrastructure, even with faster permits, will take years.
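
The comparison can be sanity-checked in a couple of lines of arithmetic; the roughly-1GW-per-reactor figure below is an assumption, and the "almost half" share is taken straight from the claim above:

```python
# Illustrative check of the capacity comparison. The ~1GW-per-reactor figure
# is an assumption (large nuclear reactors typically produce about 1GW); the
# "almost half" share is the article's own claim, used here to back out the
# implied utility-scale capacity added in the US in the first half of the year.

OPENAI_EXTRA_CAPACITY_GW = 10.0   # the additional 10GW OpenAI seeks
TYPICAL_REACTOR_GW = 1.0          # assumption: roughly 1GW per large reactor
SHARE_OF_US_H1_ADDITIONS = 0.5    # "almost half", per the article

reactor_equivalents = OPENAI_EXTRA_CAPACITY_GW / TYPICAL_REACTOR_GW
implied_us_h1_additions_gw = OPENAI_EXTRA_CAPACITY_GW / SHARE_OF_US_H1_ADDITIONS

print(f"Nuclear-plant equivalents: ~{reactor_equivalents:.0f}")                          # ~10
print(f"Implied US utility-scale additions in H1: ~{implied_us_h1_additions_gw:.0f}GW")  # ~20GW
```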

Yet progress is visible. At a site in Abilene, Texas, Altman and Oracle's new co-CEO, Clay Magouyrk, showcased the rapid development of Stargate, an AI-infrastructure project announced at the White House alongside President Donald Trump in January. In just over a year, a vast field has been transformed into a complex with more than 6,000 workers on site. The first data-centre building, with a capacity of over 100MW, is already operational. By September 2026, eight buildings totalling 900MW should be running, powered initially by on-site gas turbines before the site connects to Texas's grid.

Altman calls the Abilene site just "a small fraction" of what's planned, identifying five more locations for development with partners like Oracle and SoftBank. All told, OpenAI's plans run to about 7GW of capacity.

Riding on Interconnected Fortunes

Altman has laid out three priorities: advancing AI research, building compelling products, and solving the "unprecedented infrastructure challenge" of securing chips and power. A vast amount of interconnected wealth now hinges on his ability to tackle all three simultaneously. While the dirt is moving in Texas, convincing his well-heeled Silicon Valley peers of his vision appears easier than actually delivering on these monumental promises in a world constrained by chips, power and capital.