Full-Scale Nuclear Reactors and AI Data Centers: Restarts, Uprates, and the Return of Big Nuclear
The nuclear conversation around AI data centers is no longer limited to SMRs. Full-scale reactors are back in play through three pathways that matter now: restarting shuttered plants, uprating and extending the existing fleet, and pursuing new large-reactor builds where gigawatt-scale digital infrastructure can justify them.
The nuclear conversation around AI data centers is usually dominated by small modular reactors. That is understandable. SMRs are easier to imagine next to a future hyperscale campus than a traditional thousand-megawatt-class nuclear station.
But the market is changing quickly, and the most immediate full-scale nuclear story is not actually about small reactors. It is about large ones.
That is because AI infrastructure needs huge amounts of dependable power, and it needs it on timelines that make developers, utilities, and hyperscalers look first at what can be restarted, expanded, or repurposed from the existing system. In practice, that has put full-scale nuclear back in play through three pathways: bringing retired plants back, extending and uprating the operating fleet, and in a smaller number of cases, pursuing entirely new large-reactor projects where the scale of digital demand can justify them.
That is the real large-nuclear story now. Not theory, but reactivation.
Why full-scale nuclear is back in the AI conversation
The reason is straightforward. AI data centers want power that is clean, firm, and available at very large scale.
Natural gas can provide dispatchability, but it creates emissions complications. Wind and solar remain essential, but they do not solve the constant-load problem by themselves. Storage helps, but not indefinitely. SMRs are promising, but most are not yet near-term commercial answers at the scale the biggest AI campuses may eventually require.
Large nuclear reactors already exist, already produce around-the-clock electricity, and in some cases already sit in places with transmission, water, workforce, and industrial infrastructure that can support major digital load.
That is why they matter again. The question is no longer whether full-scale nuclear belongs in the AI conversation. It is which pathway can move first.
The fastest path is restarting shuttered plants
Restarting an existing reactor is now one of the most important nuclear themes in the data center market because it offers something greenfield development often cannot: a potentially shorter route to real gigawatts.
The clearest example is Constellation's plan to restart Three Mile Island Unit 1, now renamed the Crane Clean Energy Center, after signing a 20-year power purchase agreement with Microsoft. Under the arrangement, Microsoft will purchase the output as part of its effort to match data-center power use in PJM with carbon-free energy. The plant is expected to add about 835 megawatts back to the grid.
That matters for two reasons. First, it is one of the clearest signs that a hyperscaler-scale buyer is willing to anchor the economics of a full reactor restart. Second, it shows that an old plant is no longer being valued only as a decommissioning story. It is being repriced as strategic infrastructure for the AI era.
Palisades in Michigan is also important, even though it is not tied to one named AI offtaker in the same way. Holtec is preparing the 800-megawatt plant for restart, and DOE has highlighted Palisades alongside Crane as part of the broader national effort to increase near-term nuclear output. In other words, Palisades shows that the restart pathway is bigger than any single corporate contract.
The existing fleet may offer the biggest near-term opportunity
For the next several years, the most scalable large-nuclear opportunity may not be restart alone. It may be extension and expansion of plants that are already running.
Meta's January nuclear announcement is one of the clearest signs of this. The company said its agreements with Vistra will support the continued operation and increased energy production of Perry and Davis-Besse in Ohio and Beaver Valley in Pennsylvania, while helping power the grids that support its operations, including the Prometheus supercluster in New Albany, Ohio. Meta framed the broader package as supporting up to 6.6 gigawatts of new and existing clean energy by 2035.
That is a very important development because it shows the market moving beyond the idea that tech companies only want new reactors. In reality, many of the fastest and most bankable nuclear gains may come from protecting, extending, and incrementally expanding what already exists.
Amazon's relationship with Talen Energy points in the same direction. Talen's expanded agreement will provide Amazon up to 1,920 megawatts of carbon-free electricity from Susquehanna through 2042, with the full volume expected no later than 2032. Talen and Amazon are also exploring plant uprates to add net-new energy to PJM.
This may be the most practical large-reactor strategy in the market. Instead of waiting for entirely new plants, companies can secure output from operating stations and help justify license extensions, uprates, and grid investments that preserve or expand nuclear's role.
New large-reactor builds are the boldest and hardest path
Greenfield large-reactor construction is the most ambitious pathway, and it is also the hardest to execute.
Project Matador in Texas is the clearest current example tied directly to AI infrastructure. The NRC is now in environmental scoping for Fermi America's proposal to build four AP1000 reactors at its Advanced Energy and Intelligence Campus. The filing states that the four units would provide 4 gigawatts of carbon-free baseload power directly integrated into the campus's behind-the-meter data infrastructure, with the broader site targeting up to 30 million square feet of hyperscale computing space and about 11 gigawatts of total generation capacity.
This is a very different proposition from restart or life-extension. It is not about reclaiming an existing nuclear asset. It is about building new gigawatt-class nuclear capacity specifically for large-scale digital infrastructure.
That is why the project is so important symbolically. If it advances, it will show that the market is willing to pair AI demand with utility-scale nuclear new build, not just experimental nuclear.
But it is also why caution is necessary.
The market still has to reckon with Vogtle
Any serious discussion of new large reactors in the United States has to pass through Vogtle.
Vogtle Units 3 and 4 are the first and only U.S. deployments of the AP1000, and their completion proved that America can still finish large nuclear reactors. But they also showed how hard the path can be. According to EIA, the project suffered major construction delays and cost overruns, with Georgia Power estimating the total cost at more than $30 billion compared with the original expectation of $14 billion.
That does not mean all future large-reactor builds will fail economically or operationally. It does mean that every AI-linked nuclear new build will be judged against a very recent and very expensive benchmark.
For developers and investors, that is the key lesson. Big nuclear is real, but it is not a quick fix.
Why uprates and restarts may matter more than greenfield builds
That history is why the DOE's UPRISE initiative is so important.
Rather than assuming the fastest route to more nuclear is always new construction, DOE is explicitly targeting restarts, power uprates, and efficiency improvements in the existing fleet. The stated goal is 2.5 gigawatts of additional nuclear capacity by 2027 and 5 gigawatts by 2029, driven in part by rising industrial and AI-related electricity demand.
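As a rough illustration of how those targets could be met from the pathways discussed here, the sketch below stacks the two restart capacities named in this article (Crane at about 835 MW, Palisades at about 800 MW) against the 2027 and 2029 UPRISE goals. The remainder shown for uprates and other gains is simple arithmetic, not a list of announced projects.

```python
# Back-of-envelope: how the named restarts stack toward DOE's UPRISE
# targets. Restart figures are from this article; the remainder is
# what uprates and other fleet gains would have to supply.

restarts_mw = {
    "Crane (ex-TMI Unit 1)": 835,
    "Palisades": 800,
}

uprise_targets_mw = {2027: 2_500, 2029: 5_000}

restart_total = sum(restarts_mw.values())  # 1,635 MW from the two restarts

for year, target in sorted(uprise_targets_mw.items()):
    gap = target - restart_total
    print(f"{year}: target {target:,} MW, named restarts {restart_total:,} MW, "
          f"remaining for uprates/other {gap:,} MW")
```

Even under this crude framing, the two restarts alone cover well under half of the 2029 goal, which is why uprates and efficiency gains across the operating fleet carry so much of the load in the DOE plan.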
That logic aligns well with what the market actually needs. AI developers are not only looking for elegant long-term supply. They are looking for the fastest credible path to clean, round-the-clock megawatts. Restarts and uprates may not have the same narrative appeal as a brand-new reactor complex, but they often have a much better chance of delivering earlier.
For the next several years, that may matter more than novelty.
Why full-scale reactors appeal to hyperscalers and developers
The appeal of full-scale nuclear is not hard to understand.
Large reactors can support grid-scale, data-center-scale, and portfolio-scale power strategies all at once. They can run continuously. They already fit within utility and industrial planning frameworks. They bring more familiar operating histories than many emerging technologies. And for the biggest AI loads, they deliver power in very large blocks.
That matters because the largest campuses are increasingly being discussed in the hundreds of megawatts and even gigawatts. At that scale, the market starts to think less like traditional commercial real estate and more like heavy industry. Full-scale nuclear belongs naturally in that conversation.
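To make the block-sizing point concrete, here is a small illustrative calculation. The campus loads below are hypothetical examples, not announced projects; the 835 MW block size is the Crane restart figure cited earlier in this article.

```python
import math

# Illustrative sizing: how many full-scale reactor "blocks" a large
# campus implies. Campus loads are hypothetical; the block size is
# the ~835 MW Crane restart figure cited in this article.
BLOCK_MW = 835

hypothetical_campus_loads_mw = [300, 900, 2_000]  # assumed examples only

for load in hypothetical_campus_loads_mw:
    blocks = math.ceil(load / BLOCK_MW)
    print(f"{load} MW campus -> {blocks} reactor block(s) of {BLOCK_MW} MW")
```

The point of the sketch is simply that once campuses reach gigawatt scale, a single full-scale reactor stops looking oversized and starts looking like the natural unit of supply.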
It also helps that nuclear is politically easier to defend in some contexts than new fossil generation, especially for buyers trying to secure reliable power without abandoning long-term decarbonization goals.
The biggest pitfalls are still time, capital, and integration
The nuclear opportunity is real, but so are the risks.
Restarting a retired reactor still requires regulatory approval, equipment rehabilitation, workforce mobilization, capital spending, and local political durability. Uprates and license extensions are often easier than a new build, but they are not automatic. Greenfield reactors require the longest timelines, the deepest capital stacks, the strongest supply-chain support, and the highest tolerance for execution risk.
There is also the integration problem. A reactor and a data center may be aligned on paper, but the actual delivery pathway still has to run through transmission, interconnection, water, permitting, and commercial structuring. Power supply is not the same thing as usable campus power.
That is why the best nuclear-linked projects are the ones that treat energy, siting, and data infrastructure as one integrated development problem.
Bottom Line
Full-scale nuclear reactors are back in the AI data center conversation because the market needs clean, firm power at a scale that few other technologies can match.
The restart pathway is already real, as shown by Crane and the broader Palisades effort. The expansion pathway may prove even more important, with Meta, Vistra, Talen, and Amazon all pointing toward a future where existing nuclear plants are extended, uprated, and more tightly linked to digital infrastructure demand. New large-reactor builds, such as Project Matador, are the most ambitious option of all, but they will have to prove they can beat the cost and schedule problems that still define recent U.S. large-nuclear experience.
That is the real market picture. Big nuclear is not a fantasy for AI infrastructure. But in the near term, the winners are more likely to come from reactivating and expanding the fleet we already have than from assuming a sudden wave of easy greenfield reactor construction.
Jay Sivam
Expert insights from the Nistar team on energy infrastructure and hyperscale development.