

Dust from West Texas, orange-red and laced with iron, rides the wind and leaves a gritty film on everything it touches. It clings to skin and coats the inside of your mouth, turning each breath into a reminder of where you are. This is the environment where Sam Altman, CEO of OpenAI, is directing a project named Stargate, a rapidly growing network of data centers backed by allies such as Oracle, Nvidia, and SoftBank.
Each morning, six thousand workers’ vehicles stream onto the site, their tires kicking up a constant cloud of dust over a construction area the size of a small city, one that employs more people at this single location than OpenAI has on its entire payroll.
Rain comes in bursts. One moment the roads are dry dust; the next they are saturated mud, thick and sticky, grabbing at boots and bogging down machinery. When the storm passes, the sun returns and the ground bakes hard again, cracked and chalky, as if the place were trying to erase any trace that water had ever touched it.
At dusk, the same harsh dust that makes life there punishing turns the sky into a wash of vivid color: the airborne particles scatter away the shorter wavelengths of light, leaving only the reds and oranges.
“This is what it requires to produce AI,” Altman remarked to CNBC at the site in September. “Unlike previous technological upheavals or earlier versions of the internet, substantial infrastructure is necessary. And this is merely a small indication of it.”
A small indication, indeed: at an average cost of roughly $50 billion per site, OpenAI’s Stargate projects add up to an estimated $850 billion, nearly half of the $2 trillion global AI infrastructure buildout projected by HSBC.
The Abilene campus has already brought one data center online, with a second nearing completion. OpenAI CFO Sarah Friar told CNBC that the site could eventually exceed a gigawatt of capacity, enough electricity to power about 750,000 homes, roughly as many as in Seattle and San Francisco combined.
“The excavation work being done today is indeed about computing power that will become operational in 2026,” she stated in September. “The initial push from Nvidia will focus on Vera Rubins, the latest in frontier accelerator chips. Following that, it involves what will be constructed for 2027, 2028, and 2029. What we observe today is a significant computing crunch.”
“We are expanding at an unprecedented pace,” Altman said, squinting in the sunlight. “And we would be significantly larger now if we had vastly more capacity.”
Land costs are low. Government support is available. And for now, the power grid can be adjusted to accommodate these needs.
Altman isn’t the only one building an empire.
Zuckerberg’s Hyperion and Musk’s Colossus
In the flatlands of northeast Louisiana, where expansive soybean fields used to reach the horizon, Meta’s Mark Zuckerberg is constructing a four-million-square-foot facility dedicated to artificial intelligence. He refers to it as Hyperion, named after the Greek titan. Once completed, it will use more electricity than New Orleans — and occupy an area similar to that of lower Manhattan.
On the opposite side of the Mississippi River, in West Memphis, Arkansas, Alphabet’s Google has commenced the largest private capital investment in state history — a multibillion-dollar campus emerging from 1,100 acres of undeveloped land.
Thirty minutes south, on the Tennessee side of the line, Elon Musk has already started transforming the industrial ruins of South Memphis. His supercomputer, Colossus, was constructed in 122 days within a closed Electrolux facility. Now he is building Colossus 2, targeting a million GPUs — and has just acquired a third property to further enlarge the complex. To power this site, Musk purchased a defunct Duke Energy power facility across the border in Southaven, Mississippi.
In southeast Wisconsin, Microsoft is investing more than $7 billion in what CEO Satya Nadella has called the world’s most powerful AI data center, a facility expected to house hundreds of thousands of Nvidia chips when it comes online in early 2026. And in rural Indiana, near Lake Michigan, Amazon has turned 1,200 acres of farmland into Project Rainier, an $11 billion complex that runs entirely on custom silicon and was built solely to develop AI models for a startup named Anthropic.
“Transforming cornfields into data centers, almost instantaneously,” Amazon Web Services CEO Matt Garman remarked to CNBC in Seattle in October.
This is the AI explosion manifested in steel and gravel — a gradual reshaping of the country into power and computing zones. What is being built is not infrastructure in the traditional sense. It represents the physical embodiment of a conviction — that intelligence itself can be produced at an industrial scale, and that whoever constructs the largest factory will prevail.
“This is the most significant market ever in human history,” asserted Sameer Dholakia, a partner at Bessemer Venture Partners. “This surpasses oil, as everyone around the world requires intelligence.”
The funding
The amounts in play have become hard to grasp.
The top five hyperscalers, a group that includes Amazon, Microsoft, Alphabet, and Meta, are projected to spend close to $443 billion on capital expenditures this year. CreditSights expects that figure to climb to $602 billion in 2026, a 36% increase year over year, and its analysts predict that around 75% of the spending will go directly toward AI infrastructure.
Today’s big technology companies are among the most profitable in history, yet even they don’t necessarily have the cash on hand to cover spending on this scale.
The borrowing has been extraordinary. Hyperscalers have issued $121 billion in new debt this year, more than four times their average annual issuance over the preceding five years, according to Bank of America. More than $90 billion of that was raised in just the past three months. Meta tapped the bond market for $30 billion. Alphabet raised $25 billion. And Oracle recently executed an $18 billion bond sale, making it the largest issuer of investment-grade debt among non-financial U.S. corporations, according to Citi.
Wall Street expects the pace of borrowing to pick up.
Analysts at Morgan Stanley and JPMorgan estimate that the AI infrastructure push could drive as much as $1.5 trillion in additional borrowing by tech companies in the coming years, and UBS analysts project as much as $900 billion in new issuance in 2026 alone.
“There is a certain discomfort inherent in being a credit investor facing the kind of transformation that will demand an immense amount of capital,” Daniel Sorid, head of U.S. investment-grade credit strategy at Citi, commented to investors during a video call earlier this month.
This discomfort is visible in the derivatives market.
Spreads on credit-default swaps, derivatives that pay out if a borrower defaults on its debt, have widened to multiyear highs for Oracle. Barclays and Morgan Stanley have advised clients to buy protection, and toward the end of October an active CDS market tied to Meta began trading as investors rushed to hedge against what is shaping up to be a surge in hyperscaler debt.
History offers evidence that debt-funded buildouts can outrun immediate demand. During the dot-com era, telecommunications companies piled on debt to speed fiber deployment. When credit tightened, many were forced to restructure. The network survived, but the outcomes ranged from heavy losses for early investors to total wipeouts for shareholders.
OpenAI and the intricate network
OpenAI stands at the heart of this infrastructure race — entwined in a network of interrelated contracts that have altered the competitive landscape for artificial intelligence.
Within just two months this fall, the company unveiled partnerships totaling around $1.4 trillion in announced commitments, a figure that has led critics to warn of an AI bubble and raised basic questions about whether the power, land, and supply chains exist to meet such ambitions.
The agreements were made in swift succession.
In September, OpenAI revealed a $100 billion equity-and-supply agreement with Nvidia — the chip manufacturer acquiring an ownership stake in OpenAI in return for 10 gigawatts of its next-generation systems.
In October, OpenAI struck a deal with AMD to deploy its Instinct GPUs, an agreement that could ultimately give OpenAI a 10% stake in the chipmaker. Shortly afterward, Broadcom agreed to supply 10 gigawatts of custom chips co-developed with OpenAI. And in November, OpenAI signed its first cloud deal with Amazon Web Services, further loosening Microsoft’s once-exclusive hold.
“This is essential for us,” OpenAI President Greg Brockman mentioned to CNBC in October, referring to the company’s urgent need to secure the raw computing capabilities for its goals. “It’s crucial to our mission if we genuinely want to scale to serve all of humanity.”
Nvidia is effectively underwriting the demand for its own chips, Oracle is constructing the sites, AMD and Broadcom are positioning themselves as alternative suppliers, and OpenAI is anchoring that demand. Detractors term this a circular economy: capital, capacity, and revenue revolving through a limited group of players. It functions as long as growth persists — but if demand decreases or funding tightens, the strain can swiftly spread across a network of shared risks.
Already, Nvidia has warned investors that there was “no guarantee” it would finalize a definitive agreement with OpenAI or complete the investment under expected conditions, serving as a reminder that significant AI partnerships frequently start as mere frameworks.
Oracle’s perspective from the ground is more straightforward: the demand is tangible, varied, and firmly committed.
“We observe widespread demand across a broad range of sectors, so it’s not solely reliant on a single source,” Clay Magouyrk, Oracle’s newly appointed co-CEO, relayed to CNBC in West Texas in September. “I don’t see a bubble since I am witnessing genuine demand for it.”
He depicted the craving for computing power as nearly limitless. “When I observe my teams at Oracle and our clients, I detect what seems to be boundless demand for technology — if we can facilitate their use of it.”
At the DealBook Summit in December, Anthropic CEO Dario Amodei described a “cone of uncertainty”: the gap between long lead times and a market that can shift within a quarter. Data centers take 18 to 24 months to build, and chip orders are placed years in advance, even as demand forecasts remain in flux.
“You don’t have $50 billion available,” he said. So the financing often gets folded into partnerships with chipmakers or cloud providers, letting companies “pay as you go.”
Amodei says Anthropic is trying to stay prudent. “I think there are some players who are not managing that risk well,” he said, declining to name them.
The new doctrine of scale
Critics question how much of the demand is firm and under contract, and how much is aspirational headline math.
Gil Luria, who covers the technology sector at D.A. Davidson, points to Oracle as a case study.
“OpenAI made pledges that it is quite improbable they will fulfill,” he remarked. “Now they are retracting those and stating these aren’t genuine commitments — they’re frameworks. But discuss that with Oracle. Oracle believed they had a contract for $300 billion. They recorded that in their remaining performance obligations and made commitments to Wall Street based on that.”
Oracle shares plunged 23% in November, the stock’s worst month since 2001.
OpenAI’s Friar contested the characterization of a “circular economy” during an interview with CNBC in West Texas.
She likened it to the nascent phase of the internet. “When the internet was emerging, many believed that we were overextending, that there was an excess. And look at where we are now, right? The internet is prevalent. AI will resemble that.”
Friar said equity is too expensive, so OpenAI is preparing to take on debt for the first time to fund its expansion. The company has scouted more than 800 potential sites across North America, weighing land availability, substations, and transmission capacity.
Like most of the industry, OpenAI is weighing every available energy source, from renewables to gas to nuclear, as utilities and technology firms chase the around-the-clock power that wind and solar cannot reliably supply on their own.
“The true constraint isn’t financial resources,” she said. “It’s energy.”
The demand shows no sign of letting up. In late December, SoftBank’s Masayoshi Son agreed to pay $4 billion for DigitalBridge, a firm that invests in data centers. To help finance the deal, along with his $40 billion pledge to OpenAI, Son sold SoftBank’s entire stake in Nvidia. He later said at a Tokyo forum that he “was crying” at having to sell those shares.
The coveted asset now is real estate with power, along with the ability to scale it up. Electricity on that scale is regulated and must be permitted, which means the expansion also hinges on decisions made in Washington.
OpenAI has lobbied the Trump administration to extend the CHIPS Act tax credit to cover AI data centers, although when its CFO floated the idea of a government “backstop” for infrastructure loans at a Wall Street Journal event in November, the backlash was immediate, and she walked the comment back within hours. Altman took to X to insist that the company does not seek or need government guarantees.
The companies are not waiting on Washington. They are borrowing, building, and betting that the economics will eventually work out, because so far, every time they have scaled up, the results have gotten better. That pattern is the industry’s core belief: more computing yields more capable systems. It explains why startups that have yet to turn a profit can still command valuations in the hundreds of billions.
The gamble is that training ever-larger models will keep yielding transformative intelligence. It is also a bet that the payoff is now spreading beyond the lab as the models are put to work across industries: assisting customers, generating code, processing claims, drafting contracts, compressing long workloads into hours. This is inference, the everyday running of trained models rather than the training itself, and it is what turns them into products.
Inference is where the enthusiasm has to translate into profit margins, and also where the demand for computing power is relentless: every new user, workflow, or agent adds a continuous load, unlike a one-time training run. It is why the expansion looks less like a speculative bet and more like a utility race, with firms scrambling to secure the energy and capacity for what they believe will be constant demand for intelligence.
“We continue to be astonished, even as the frontrunners of this scaling belief,” Daniela Amodei, Anthropic’s president and co-founder, told CNBC in an interview at the company’s headquarters in San Francisco. “Every year we’ve thought, ‘Well, this can’t possibly keep up with exponential growth,’ yet it has every single year.”
Anthropic’s revenue has grown roughly tenfold a year for the past three years. In 2025 alone, the startup’s valuation has jumped from $60 billion, and a funding round now underway could value it at more than $300 billion.
The reckoning
Dario Amodei, Daniela’s brother, posits that we are nearing a scenario akin to “a country of geniuses confined in a data center” — AI systems capable of performing at the level of Nobel laureates across every sector. He believes this threshold could be reached as soon as next year.
Yet he is also raising alarms.
“Take a look at entry-level consultants, lawyers, and financial professionals; many roles within white-collar service sectors can be efficiently managed by AI models without supervision,” he told 60 Minutes. “My concern is that it will be widespread, and the pace will be quicker than prior technological advances.”
That conviction is driving the industry’s spending spree, but skeptics fear the buildout could become a debt-fueled overreach with a familiar aftermath: bankruptcies, fire sales, and wiped-out equity.
Matt Murphy, a venture capitalist at Menlo Ventures and an early investor in Anthropic, frames the scenario differently.
“I have been in the venture capital business for 25 years,” Murphy said. “I have witnessed the waves of cloud computing, mobile technology, and semiconductors. This is the greatest wave of all.”
Step back far enough, and a new landscape comes into view.
Zuckerberg’s Hyperion. Musk’s Colossus. Altman’s Stargate. Amazon’s Rainier. Google’s network of compute clusters. Each one a testament to a distinct perspective on the future — and all tethered to a singular limitation: energy.
Data centers are sprouting up near power plants and transmission lines, in regions with affordable land, cooperative governments, and grids that can be expanded. The communities around them are now appearing in investor presentations, on earnings calls, and in trillion-dollar projections.
Analysts tell CNBC that the stakes go beyond stock prices. Either this year marks the start of a transformation as significant as electrification and the internet, or it marks the peak of a bubble that future generations will study as a cautionary tale.
Altman acknowledges the skepticism — but he dismisses the idea that the expansion has become excessive.
“People will suffer losses for overinvesting,” he told CNBC in September. “Conversely, people can also incur losses from underinvesting and lacking adequate capability.”
“Intelligent individuals will become overly enthusiastic, and many will lose substantial amounts of money. Yet, I remain confident that in the long run, the significance of this technology will be immense for society,” Altman concluded.
For now, the building continues. Trucks kick up clouds of dust. Transformers hum. And across the American heartland, the factories of a new era are taking shape.