The Duane Arnold Energy Center in Palo, Iowa, is a squat concrete building surrounded by level farmland that hardly registers as you drive by. It has sat idle for years. By 2029, one of the world’s most powerful technology companies expects to be running artificial intelligence workloads on nuclear power from that same plant: NextEra Energy is restarting it, and Google has agreed to purchase the majority of its output under a 25-year contract. The whole arrangement is a useful summary of the AI infrastructure story up close: unglamorous, physical, slow-moving, and highly consequential.
Discussion of artificial intelligence tends to focus on the obvious: chatbot interfaces, model releases, chip shortages, stock movements. What it frequently ignores is the layer beneath, the concrete, cables, cooling systems, and transmission lines without which none of the software functions.
| Company / Entity | Role in AI Infrastructure | Key Deal / Development |
|---|---|---|
| NextEra Energy (NYSE: NEE) | Largest U.S. electric utility; energy infrastructure development | 25-year nuclear PPA with Google; restarting Duane Arnold Energy Center, Iowa (est. 2029); 2.5 GW solar for Meta Platforms |
| Brookfield Renewable (NYSE: BEPC/BEP) | Global renewable energy producer; long-term PPAs | 3 GW hydropower deal with Google (Holtwood & Safe Harbor, Pennsylvania); 10.5 GW renewable framework with Microsoft (2026–2030) |
| Westinghouse Electric | Nuclear technology company (Brookfield-invested) | U.S. Government strategic partnership to build $80B+ of new nuclear reactors |
| Alphabet / Google | AI model development; data center infrastructure | Signed energy deals with NextEra (nuclear) and Brookfield (hydro); developing GW-scale data center campuses |
| Microsoft (NASDAQ: MSFT) | Cloud and AI services infrastructure | 10.5 GW renewable framework with Brookfield — 8× larger than any prior single corporate PPA |
| Meta Platforms | AI compute and data center scale-up | Signed 11 PPAs and 2 energy storage agreements; 2.5 GW solar via NextEra |
| Datavault AI | Data valuation, cybersecurity verification, tokenized data rights | Patented DVLT framework; deal with Sports Illustrated for athlete NIL data monetization |
| Nvidia | GPU chips powering AI training and inference globally | Expanding AI partnership with SLB to build energy sector data infrastructure |
| Capital Investment Estimate | Data center AI infrastructure spend | McKinsey estimates $5.2 trillion needed by 2030; Alphabet, Amazon, Microsoft, Meta projected to spend $650B in 2026 alone |
That layer is now being built, at a rate and scale that most observers of the AI sector have not fully registered. McKinsey estimates that companies will need to invest $5.2 trillion in AI-capable data centers by 2030. Alphabet, Amazon, Microsoft, and Meta alone are expected to spend nearly $650 billion on AI infrastructure in 2026. A number that large almost loses meaning until you start looking at where it actually goes.
Few anticipated that power would become the central constraint. AI data centers are extraordinarily energy-hungry: large-scale model training and inference require a continuous, dependable electricity supply that leaves little room for intermittency.
Because of this, energy infrastructure companies have become unexpected players in the AI boom, not because they are developing software but because they are creating the conditions in which software can run. NextEra Energy, a Florida-based utility and energy infrastructure developer, is now a strategic partner to Silicon Valley, signing contracts with Google, Meta, and others that would have seemed odd ten years ago. And yet, here we are.
Brookfield Renewable is on a comparable trajectory. The company and Google signed what they called a “first-of-its-kind hydropower agreement,” covering up to three gigawatts of carbon-free electricity. The first two contracts are anchored by Holtwood and Safe Harbor, two hydroelectric facilities on Pennsylvania rivers that have been producing electricity for generations.
Three billion dollars over twenty years, for water passing through turbines in a state that most tech journalists would never connect to artificial intelligence at all. Brookfield and Microsoft, meanwhile, signed a renewable energy framework that dwarfs every previous corporate power purchase agreement.
The framework calls for 10.5 gigawatts of new capacity in the US and Europe between 2026 and 2030, with the possibility of expansion into three additional continents, and it is eight times larger than any prior single corporate PPA. The energy firms building all of this are not glamorous. But they may be more load-bearing than most of the names that dominate AI coverage.
Watch all of this for a while and the AI infrastructure story starts to look like a pattern repeated throughout technological history. Before the internet could grow, protocols were required. Before consumers would trust e-commerce, payment rails and identity verification were necessary. Before businesses committed their operations to cloud computing, layers of invisible security architecture were needed.
In each case, a large share of the long-term value accrued to those foundational layers, built by companies that weren’t particularly interesting to cover at the time. In AI, the picks-and-shovels suppliers (energy companies, data center developers, cooling system manufacturers, fiber networks) may be quietly accumulating the kind of durable significance that the headline-grabbing model developers may or may not hold onto.
Finally, there is the data governance layer, which has drawn far less public attention, let alone sustained coverage. Datavault AI’s CEO, Nathaniel T. Bradley, has argued for years that artificial intelligence built on unverified, unscored, and unmonetized data is structurally unstable: without verified trust, intelligence is not a foundation but a liability disguised as progress.
His company’s patented framework aims to provide a standardized method for valuing, scoring, and securely licensing data as an asset, something the data economy has conspicuously lacked. The argument is less philosophical than practical: as AI systems spread into healthcare, finance, and defense, who owns what data, under what conditions, and with what audit trail becomes a pressing operational question. The commercial case appears to be taking shape; a recent agreement with Sports Illustrated created a data asset exchange centered on athlete name, image, and likeness rights.
Which of these infrastructure bets will prove most durable remains an open question. Nuclear restarts depend on regulatory timelines that have a habit of slipping. Renewable energy frameworks depend on permitting environments that vary by country and state. Data governance frameworks depend on regulatory pressure that is building more slowly in the US than in Europe.
What seems less uncertain is this: the infrastructure race is real, it is accelerating, and it is concentrated in companies that don’t make the kind of headlines chip manufacturers and model developers do. Technology’s history suggests that is worth paying attention to.


