Enhanced geothermal is generating commercial power in Utah right now. Fusion is decades away. Thermodynamic computing is one chip. Knowing which is which isn't pessimism — it's the prerequisite for demanding the right technology under the right conditions in your community.
Energy technology claims exist on a spectrum from commercially proven to theoretically possible to physically implausible. The spectrum is rarely labeled. Understanding how to locate a claim on it is the foundational skill for evaluating everything in this section.
The questions that locate a claim accurately: Is it generating power or promising to? Has it connected to a grid and delivered electrons — or only produced a demonstration? How does its cost per kilowatt compare to alternatives? Who is paying for it and why? What is the realistic deployment timeline, and what has to go right for that timeline to hold?
Commercially proven: Enhanced geothermal · AI-optimized grid management · Waste heat district heating · Closed-loop liquid cooling · Grid-scale battery storage
Credible but unproven at scale: Small modular reactors · Advanced nuclear (TerraPower Natrium) · Long-duration grid storage · Offshore floating wind
Long-horizon research: Fusion power · Thermodynamic computing · Photonic computing
Speculative: Space-based solar · Hydrogen for grid power at scale · Orbital data centers for AI training
"The most underreported energy story in America is commercial enhanced geothermal — already generating power in Utah, available almost anywhere."
Every efficiency gain in computing history has been consumed by more computation, not less total energy. When data centers become 30% more energy-efficient, the industry builds more data centers. This is not speculation — it is the documented pattern in computing since 1965, and an instance of the Jevons paradox, named for the 19th-century economist William Stanley Jevons, who observed the same dynamic with coal and steam engines in 1865. Efficiency gains are necessary — but they must be paired with demand accountability and equitable cost structures to produce public benefit rather than simply enabling more growth.
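The rebound dynamic can be made concrete with toy numbers — a 30% efficiency gain swamped by a 60% growth in computation. Both figures are illustrative assumptions, not data from this text:

```python
def total_energy(workload, energy_per_unit):
    """Total energy = units of computation x energy per unit of computation."""
    return workload * energy_per_unit

baseline = total_energy(workload=100, energy_per_unit=1.0)

# A 30% efficiency gain cuts the energy cost of each unit of work...
efficient = 1.0 * (1 - 0.30)
# ...but if the industry responds by doing 60% more computation,
# total consumption rises despite the per-unit improvement.
rebound = total_energy(workload=160, energy_per_unit=efficient)

print(baseline)           # 100.0
print(round(rebound, 1))  # 112.0 -> more total energy than before the gain
```

The numbers are arbitrary, but the structure is the point: efficiency lowers the cost of computation, and lower cost invites more of it.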
Every major energy transition has followed a similar pattern: a new technology that looks marginal becomes dominant faster than incumbents predicted, while promising technologies that seemed imminent take far longer than advocates claimed. Nuclear power in the 1950s was going to be "too cheap to meter." It became the most expensive source of electricity in many markets. Solar panels were curiosities until they weren't — costs dropped 90% in a decade and deployment accelerated faster than any energy model predicted.
The most important lesson from energy transition history: the technologies that win are usually not the most technically elegant. They are the ones that achieve cost curves that make deployment self-sustaining. Enhanced geothermal has that cost curve potential. Fusion is still decades from it. Understanding that distinction is more useful than tracking which technology has the best physics.
Microsoft signed a 20-year deal to restart Three Mile Island Unit 1. Google contracted with Kairos Power for small modular reactors. Amazon anchored a $500M raise for X-energy. The capital commitments are real. The timelines require scrutiny. Restarting an existing plant: 3–5 years, feasible. Gen IV small modular reactors: design certification (3–5 years) + site licensing (3–5 years) + construction (3–5 years) = mid-2030s at realistic earliest. NuScale — the only SMR with NRC design certification — cancelled its flagship Utah project in 2023 due to cost overruns reaching $14,600/kW, 5× its 2020 projections. The critical bottleneck: HALEU fuel. DOE projects 40+ metric tons needed by 2030; current U.S. capacity is under 1 ton per year. The announcements are not hype. The stated timelines require more scrutiny than they typically receive.
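The timeline arithmetic quoted above can be laid out explicitly. A sketch; the 2025 start year and the strictly sequential stacking of stages are illustrative assumptions (in practice some licensing stages can overlap):

```python
# Rough SMR deployment timeline, using the 3-5 year stage ranges quoted above.
stages = {
    "design certification": (3, 5),
    "site licensing": (3, 5),
    "construction": (3, 5),
}
start_year = 2025  # illustrative; actual program start dates vary

best = start_year + sum(lo for lo, hi in stages.values())
worst = start_year + sum(hi for lo, hi in stages.values())
print(best, worst)  # 2034 2040 -> "mid-2030s at realistic earliest"

# HALEU fuel bottleneck: tons needed by 2030 vs. current annual output
need_by_2030_t = 40       # DOE projection, metric tons
capacity_t_per_yr = 1     # generous upper bound on current U.S. capacity
years_at_capacity = need_by_2030_t / capacity_t_per_yr
print(years_at_capacity)  # 40.0 -> four decades of output needed in ~5 years
```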
Grid-scale battery storage costs dropped 90% since 2010. This is a completed cost curve transition, not a promise. Battery storage doesn't generate power — but it transforms how intermittent renewables function at grid scale by absorbing generation during peaks and delivering it during gaps. This is the technology that makes wind and solar reliable as baseload. It's commercially deployed, cost-competitive, and scaling rapidly. It receives a fraction of the attention that fusion announcements generate.
"The technologies that win are usually not the most technically elegant. They are the ones that achieve cost curves that make deployment self-sustaining."
TerraPower — backed by Bill Gates — broke ground on its Natrium reactor in Kemmerer, Wyoming in 2024. This is a real project at a real site. The Natrium design uses a sodium-cooled fast reactor with a molten salt thermal storage system, allowing it to flex output between 345 MW and 500 MW depending on grid demand. DOE provided $2 billion in matched funding. Target operational date: 2030. This is the most advanced U.S. advanced nuclear project by construction progress. It is not yet a proven design at commercial scale — but it is not vaporware. Watching this project is the most useful signal available on whether advanced nuclear can actually hit 2030s timelines.
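The flexing between 345 MW and 500 MW implies a rough size for the molten salt store. A sketch, assuming a boost duration of about 5.5 hours (TerraPower's published figure) and a notional 40% thermal-to-electric conversion efficiency — both assumptions, not figures from this text:

```python
base_mw = 345       # continuous reactor output
boost_mw = 500      # peak output while discharging thermal storage
boost_hours = 5.5   # assumed boost duration (TerraPower's published figure)

# Electrical energy the molten-salt store must deliver to sustain the boost
extra_mw = boost_mw - base_mw            # 155 MW of extra output
storage_mwh_e = extra_mw * boost_hours
print(storage_mwh_e)  # 852.5 MWh electric

# The thermal store must be larger by the inverse of conversion efficiency.
eta_thermal_to_electric = 0.40           # assumed steam-cycle efficiency
storage_mwh_th = storage_mwh_e / eta_thermal_to_electric
print(round(storage_mwh_th))  # 2131 MWh thermal, order-of-magnitude only
```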
Fervo Energy's Cape Station in Utah is producing commercial power right now. 500 MW under development, first deliveries to the grid in 2026. This is not conventional geothermal limited to volcanic regions — it uses directional drilling technology borrowed from oil and gas to fracture hot dry rock anywhere subsurface temperatures are sufficient, which covers vast portions of the American West. 24/7 clean baseload. No fuel cost. No emissions. Scalable anywhere with the right geology. Google has a power purchase agreement from this facility.
The reason you haven't heard much about it: it doesn't generate dramatic announcements. It just works. That is its most important characteristic.
Google DeepMind's system achieves a consistent 30% reduction in cooling energy across Google's own facilities — deployed at scale since 2018, not a pilot. Google's partnership with PJM Interconnection aims to cut interconnection approval timelines from years to months. Stanford spinout GridCARE freed up 80 MW of incremental capacity for Portland General Electric using generative AI to identify hidden grid flexibility. Open Climate Fix achieved a 10% reduction in large forecasting errors for the UK's National Grid.
The critical caveat: most of this optimization is proprietary. Whether these capabilities get embedded into the public grid — regulated, accessible, universal — is a policy question not yet answered.
Every joule of electricity entering a data center exits as heat — a thermodynamic guarantee. The question is whether that heat is wasted or used. Microsoft's partnership with Finnish utility Fortum recovers 350 MW of waste heat serving approximately 250,000 people. Stockholm integrates 30+ data centers into its city heating network, recovering 100+ GWh annually. Meta's Odense campus in Denmark heats 11,000+ homes. Deep Green Technologies in the UK heats public swimming pools with immersion-cooled servers — £22,000/year saved at Exmouth Leisure Centre — backed by £200M from Octopus Energy.
This is commercially proven at city scale. The barrier is institutional: data centers sit in industrial zones far from residential heat demand, and no standardized contracts exist in most U.S. markets.
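The scale of the opportunity follows from a simple energy balance. A sketch with assumed figures — a hypothetical 100 MW facility, 5 kW average home heating demand, 80% heat recovery — none of which come from this text, but which land in the same order of magnitude as the deployments above:

```python
it_load_mw = 100                  # hypothetical data center IT load
heat_out_mw = it_load_mw * 1.0    # essentially every joule in exits as heat

avg_home_heat_kw = 5    # assumed average heating demand per home (climate-dependent)
recovery_fraction = 0.8 # assumed share capturable via heat exchangers / heat pumps

homes_heated = heat_out_mw * 1000 * recovery_fraction / avg_home_heat_kw
print(int(homes_heated))  # 16000 homes from one mid-size facility
```

For comparison, Meta's Odense campus heats 11,000+ homes — the same ballpark, which is what makes the institutional barrier rather than the physics the binding constraint.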
As noted above, grid-scale battery storage costs have dropped 90% since 2010 — a completed cost transition, not a future projection. Beyond lithium-ion, Form Energy's iron-air batteries provide 100-hour storage at costs competitive with natural gas peakers; first commercial installations are underway. This technology receives a fraction of the attention fusion announcements generate. It is more consequential for grid reliability over the next decade than any other emerging technology.
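Cost declines like this are commonly modeled with Wright's law: cost falls by a fixed fraction with each doubling of cumulative production. A sketch, assuming an 18% learning rate — a commonly cited figure for lithium-ion, not a number from this text:

```python
import math

def wrights_law(initial_cost, learning_rate, doublings):
    """Cost after n doublings of cumulative production.
    Each doubling cuts cost by `learning_rate` (e.g. 0.18 = 18%)."""
    return initial_cost * (1 - learning_rate) ** doublings

# How many doublings of cumulative capacity does a 90% cost drop imply?
lr = 0.18       # assumed learning rate
target = 0.10   # 10% of the starting cost
doublings_needed = math.log(target) / math.log(1 - lr)
print(round(doublings_needed, 1))  # 11.6 doublings of cumulative capacity
```

This is why deployment itself is the engine: each doubling of installed capacity buys the next round of cost reduction, which is the "self-sustaining cost curve" the section keeps returning to.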
$10B+ in commitments from Amazon, Google, Microsoft, Oracle. NuScale holds the only NRC design certification but cancelled its Utah pilot due to costs reaching $14,600/kW — 5× 2020 projections. Realistic first deployment: mid-2030s at earliest, assuming no further regulatory, technical, or cost setbacks. The capital commitments are real. The timelines require realistic scrutiny.
Broke ground in Kemmerer, Wyoming in 2024. Sodium-cooled fast reactor with molten salt thermal storage, flexing between 345–500 MW output. DOE provided $2B in matched funding. Target operational: 2030. This is the most advanced U.S. advanced nuclear project by construction progress. Not vaporware — but not yet a proven design at commercial scale. Watch this project as the best available signal on whether 2030s nuclear timelines are achievable.
Commonwealth Fusion Systems' SPARC magnet achieved its design target — a genuine physics milestone, not vaporware. Helion has a Microsoft PPA targeting 50 MW by 2028; most analysts view that timeline skeptically. Commercial fusion power for data centers: 2040s at realistic earliest. Commonwealth Fusion's commercial ARC facility is planned for the early 2030s — which would put commercial power in the late 2030s if everything goes right. In fusion's history, everything has rarely gone right. Worthy of continued investment. Not a near-term solution to anything.
Normal Computing taped out the world's first thermodynamic semiconductor — the CN101 — in August 2025. Extropic has presented at major global forums. Both claim up to 1,000× energy efficiency for specific stochastic workloads: Monte Carlo simulations, diffusion models, probabilistic inference. The underlying physics — Landauer's principle — is real and well-established. One chip taped out is very early. No commercial deployment exists. The efficiency claims are theoretical for current architectures. This is frontier research worth watching, not a near-term infrastructure solution.
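Landauer's principle puts a hard floor under computation's energy cost, and the gap between that floor and current hardware is what makes large efficiency claims physically conceivable. A sketch; the ~10⁻¹⁵ J per conventional operation is an assumed order-of-magnitude figure, not a number from this text:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300              # room temperature, K

# Landauer limit: minimum energy to erase one bit of information
landauer_j = k_B * T * math.log(2)
print(f"{landauer_j:.2e}")  # 2.87e-21 J per bit

# Assumed order-of-magnitude energy per operation in current digital hardware
conventional_j = 1e-15
headroom = conventional_j / landauer_j
print(f"{headroom:.1e}")    # 3.5e+05 -> vast gap between practice and the floor
```

The headroom is real physics; whether any particular architecture can harvest a meaningful fraction of it is exactly what "one chip taped out" has not yet demonstrated.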
Space is cold (~−270°C), but cooling in vacuum is harder than on Earth — there is no convection, only radiation. The ISS requires 422 m² of radiators for just 70 kW. A 1 GW orbital data center would need nearly 1 km² of radiator area. Launch costs: Falcon Heavy runs ~$1,400/kg; Google's own analysis states orbital data centers require launch costs below $200/kg, a threshold it does not project before the mid-2030s. Hardware upgrade cycles are fatal to sealed systems — a constraint Microsoft's underwater Project Natick data centers ran up against. Near-term viable: satellite edge computing for Earth observation data. Orbital data centers for AI training: speculative before the mid-2030s at earliest.
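The radiator figure comes straight from the Stefan–Boltzmann law. A sketch, assuming an emissivity of 0.9, two-sided radiators at ~300 K coolant temperature, and no absorbed sunlight or Earth infrared — all optimistic simplifications:

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def radiator_area_m2(power_w, temp_k, emissivity=0.9, sides=2):
    """Area needed to radiate `power_w` of waste heat at `temp_k`,
    ignoring absorbed sunlight and Earth IR (optimistic)."""
    flux = emissivity * SIGMA * temp_k**4   # W/m^2 per radiating face
    return power_w / (flux * sides)

area = radiator_area_m2(1e9, temp_k=300)    # 1 GW at room-temperature coolant
print(f"{area / 1e6:.2f} km^2")  # 1.21 km^2 -> same ballpark as the ~1 km^2 above
```

Raising the coolant temperature shrinks the radiators (flux scales as T⁴) but makes the chips harder to cool — the engineering trade at the heart of the concept.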
Green hydrogen has a fundamental round-trip efficiency problem: you lose 60–70% of the input energy in the conversion chain (electricity to hydrogen to electricity). This makes it significantly more expensive than direct battery storage or direct renewable use for grid applications. Hydrogen has genuine value for hard-to-electrify industrial processes and long-distance transport. As a grid storage medium competing with batteries, the economics are very difficult. This is not a dismissal of hydrogen — it's a precise assessment of where the physics and economics favor it and where they don't.
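The 60–70% loss falls out of multiplying the stage efficiencies in the conversion chain. A sketch with assumed stage values within commonly cited ranges — none of these specific numbers come from this text:

```python
# Hydrogen round trip: electricity -> H2 -> stored -> electricity
electrolysis = 0.70   # assumed electrolyzer efficiency
storage      = 0.90   # assumed compression / storage losses
fuel_cell    = 0.55   # assumed fuel cell conversion efficiency

h2_round_trip = electrolysis * storage * fuel_cell
battery_round_trip = 0.90   # typical lithium-ion round-trip efficiency

print(round(h2_round_trip, 2))  # 0.35 -> ~65% of input energy lost
print(round(battery_round_trip / h2_round_trip, 1))  # 2.6x more delivered per kWh in
```

Because the losses multiply, no single stage improvement rescues the chain — which is why the economics favor hydrogen where batteries can't compete at all, not where they can.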
Innovation doesn't automatically become public benefit. It becomes public benefit when policy conditions require it, when communities negotiate for it, and when citizens understand enough about it to demand it. The gap between "this technology exists" and "this technology serves my community" is a policy gap — and policy gaps are closed by organized citizens.
The pattern of technology benefit distribution matters as much as the technology itself. Battery storage costs dropped 90% in a decade — but the communities that host battery storage facilities don't automatically benefit from that cost curve. The benefit flows to whoever owns the storage assets and whoever negotiated the power purchase agreements. Grid-scale storage can reduce costs for all ratepayers — or it can be developed as private infrastructure that captures its value for shareholders. The technology is the same. The distribution of benefit is a policy choice.
The same is true for enhanced geothermal, for waste heat district heating, for AI grid optimization. Every technology that could benefit communities will benefit communities only if the conditions under which it's deployed require that outcome. Waiting for companies to voluntarily share the benefit of innovations they developed and own is not a strategy. It is an abdication of the civic responsibility that makes shared benefit possible.
"Innovation becomes public benefit when policy conditions require it. Waiting for companies to voluntarily share what they own is not a strategy."
The Nordic model provides the clearest example: data centers required to integrate waste heat into district heating networks, to use renewable energy, to report publicly on consumption. The result — communities receiving heat from data center operations, near-zero operational water use, grid reliability improving — is not a coincidence. It is the designed outcome of specific policy conditions applied consistently. Ethical innovation deployment is designed, not hoped for.
This section applies the AI Thinking Model™ — a framework for critical thinking, wisdom, innovation, strategy, and ethics developed by Liz B. Baker, Global Institute for AI & Humanity.