Galaxy Interactive didn’t start by investing in energy, but we’ve recently made two investments in the energy space from our latest fund, Galaxy Interactive Fund II. Both reflect the evolution of our thinking with respect to Interactive’s underlying core thesis: that our digital and physical lives are converging at an unprecedented rate. Over time, work, play, communication, commerce, and creativity have all moved into hybrid systems, each part software, part hardware, part real-world infrastructure. As these systems became more real-time, immersive, and intelligent, our attention moved steadily down the stack. From applications to platforms. From platforms to hardware. From hardware to the infrastructure that supports it all. Eventually, every path led to the same place: compute is ultimately physical, and every layer of the digital stack returns to energy. Models don’t train in the cloud - they train in data centers. Data centers aren’t powered by ideas - they’re powered by electricity. Every digital action has a real, physical footprint. If the future is defined by AI, spatial computing, and persistent digital environments, then power may be a first-order constraint. Latency, scale, cost, reliability, and geography all collapse into a single question: where will the energy come from?
That’s what led us here. In the markets where compute actually needs to live, getting a few extra megawatts can take years. Getting hundreds of megawatts or several gigawatts is nearly impossible. At first glance, global electricity generation may look like it’s kept pace with demand. But on the ground, in Northern Virginia, Dallas, Phoenix, Dublin, and other key data center markets, the substations are at capacity, transformers have multi-year lead times, transmission interconnection queues are long, transmission authorities and system operators are inundated with requests, and the pace at which transmission infrastructure and generation assets are being built is not keeping up with the demands of AI. In other words, we’re not heading toward a world where “AI needs more electricity”; we’re heading toward a world where energy and electricity generation could become the binding constraint on the deployment of compute supporting AI. That insight pushed us to treat energy as a core part of the compute stack.
To address it, over the past year we’ve invested in Commonwealth Fusion Systems (CFS) and Last Energy. CFS is building nuclear fusion technology to provide the world with abundant, clean, always-on power at utility scale, an approach aimed at changing the long-run global supply curve. Last Energy, a nuclear fission company, is focused on addressing the nearer-term delivery challenges with a factory-built, modular plant designed to be deployed in smaller increments and sited closer to load. Both companies are developing promising technologies with different timelines and risk profiles, both in support of the same underlying thesis: if compute is the new industrial base, then firm, scalable, dispatchable power is the bottleneck we have to solve.
Other builders and investors are arriving at the same conclusions and offering other solutions. Several startups are now explicitly framing energy, not algorithms, as the binding constraint. Unconventional AI recently reported that it raised $475 million at a $4.5 billion valuation, one of the largest seed rounds in history, to develop chips designed to radically reduce the energy cost of AI workloads. In announcing the round, the company framed its timing bluntly: "if projections hold, computation could become constrained by global energy supply within the next three to four years." The same article describes a chip "designed to dramatically improve the energy efficiency of artificial intelligence, arguing that current AI scaling trends are on a collision course with global power constraints."
Whether you’re building new computing architectures, new data centers, or new AI systems, you eventually run into the same hard limit: you can’t scale intelligence without scaling energy. This post is our attempt to make that math explicit and to show, step by step, why power is becoming the defining infrastructure problem of the AI era, and why nuclear, in particular, is re-emerging as a necessary part of the solution.
Why Does Power Matter?
Every digital action has a physical footprint. Queries hit GPUs and CPUs; networks move bits through switches and fiber; uninterruptible power supplies (UPS) and mechanical cooling infrastructure keep power clean and chips operating within their recommended temperature ranges. The electrons that power all of these devices come from real generation resources, are transmitted through real wires, and turn into real heat and water use at both the generation facilities and the data centers.
Energy Per Digital Action (Order-of-Magnitude)
| Action | Energy Per Action | Energy Equivalence | Peloton Strokes (at 85 RPM) |
|---|---|---|---|
| Web Search | ~0.3 Wh | A 10 W LED bulb for 2 min | 7 pedal strokes |
| LLM Chat (single answer) | ~9.0 Wh | A 10 W LED bulb for 60 min | 230 pedal strokes, or about 3 minutes of cycling |
| Agentic AI Task | ~1.0 kWh | A 1,000 W microwave for 60 min | 25,500 pedal strokes, or about 5 hours of continuous cycling |
These figures are order-of-magnitude estimates intended to illustrate scale, not precise measurements of any single system. They are based on publicly available sources and may vary by system, workload, and configuration. Data is for illustrative purposes only.
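For readers who want to check the equivalences, here is a minimal sketch of the arithmetic behind the table. It assumes a rider producing a steady ~200 W at 85 RPM, which is the assumption implied by the stroke counts above (both values are illustrative, not measurements):

```python
# Back-of-envelope equivalences for the table above.
# Assumptions (illustrative): a 10 W LED bulb, a rider producing 200 W at 85 RPM.
RIDER_WATTS = 200
CADENCE_RPM = 85
LED_WATTS = 10

def equivalences(energy_wh: float) -> tuple[float, float, float]:
    """Return (LED minutes, cycling minutes, pedal strokes) for a given energy in Wh."""
    led_minutes = energy_wh / LED_WATTS * 60
    cycling_minutes = energy_wh / RIDER_WATTS * 60
    strokes = cycling_minutes * CADENCE_RPM
    return led_minutes, cycling_minutes, strokes

for action, wh in [("Web search", 0.3), ("LLM chat", 9.0), ("Agentic AI task", 1000.0)]:
    led, mins, strokes = equivalences(wh)
    print(f"{action}: ~{led:.0f} LED-min, ~{strokes:,.0f} strokes (~{mins:.0f} min of cycling)")
```

Running it reproduces the table’s rough figures: ~7 strokes for a web search, ~230 strokes (about 3 minutes) for an LLM answer, and ~25,500 strokes (about 5 hours) for a 1 kWh agentic task.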
The Baseline: How Big Is the World's "Power Pool"?
As of 2025, humanity runs on about 8.5 TW (8,500 GW) of installed power generation capacity (with actual usable output materially lower due to capacity factors). The United States has ~1,300 GW installed, which accounts for roughly 15% of the world total.
Basic units we'll be using:
MW = megawatt
GW = gigawatt
TW = terawatt
1,000,000 watts = 1 MW
1,000 MW = 1 GW
1,000 GW or 1,000,000 MW = 1 TW
Translating into the context of powering an average U.S. home:
1 MW of continuous power can supply roughly 1,000 U.S. homes
1 GW of continuous power can supply roughly 1,000,000 U.S. homes
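A minimal conversion helper makes these ratios concrete. Note that the rule of thumb used throughout this post (~1,000 homes per continuous MW) implies an average draw of about 1 kW per home; real U.S. household averages run slightly higher, so treat the outputs as rough:

```python
MW_PER_GW = 1_000
HOMES_PER_MW = 1_000  # rule of thumb used throughout this post (~1 kW average per home)

def homes_powered(megawatts: float) -> float:
    """Approximate number of U.S. homes served by a given continuous load."""
    return megawatts * HOMES_PER_MW

print(f"{homes_powered(1):,.0f}")              # 1 MW -> ~1,000 homes
print(f"{homes_powered(1 * MW_PER_GW):,.0f}")  # 1 GW -> ~1,000,000 homes
print(f"{homes_powered(5 * MW_PER_GW):,.0f}")  # 5 GW (Stargate UAE) -> ~5,000,000 homes
```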
How Much Power Does It Take to Run a Data Center?
Not all data centers are created equal. The power required depends on what the facility is doing, how dense the compute is, and whether it’s serving traditional enterprise workloads or cutting-edge AI systems. Over the past decade, the industry has moved rapidly up the power curve.
At the smaller end are enterprise data centers, the kind historically run by large companies for internal IT. These facilities typically draw 1–5 megawatts, roughly the same electricity as 1,000 to 5,000 homes. They support email, databases, internal applications, and modest analytics. For years, this was the “normal” scale utilities expected.
Next are large colocation and hyperscale buildings, where cloud providers and data-center operators lease the capacity. A single building at this scale can require around 50 megawatts, comparable to 50,000 homes. This is where modern cloud computing lives: storage, web services, enterprise SaaS, and, increasingly, AI inference.
Beyond that are the AI campuses: purpose-built facilities designed to support tens of thousands of GPUs. These campuses operate as tightly integrated systems, or “clusters,” and commonly demand between 300-800 megawatts, the equivalent of powering 300,000-800,000 homes. At this scale, power is no longer a line item; it is the defining design constraint.
At the extreme end are emerging gigawatt-scale campuses like Galaxy’s AI and HPC data center campus, Helios, a purpose-built facility designed to support tens of thousands of GPUs that now has a total approved capacity of 1.6+ GW. These are typically multi-building, multi-phase developments designed to scale over time toward or beyond 1,000 megawatts (1 gigawatt) of continuous electrical load, the equivalent of more than 1 million homes’ worth of electricity. Stargate UAE, a JV between OpenAI, G42, NVIDIA, Cisco, Oracle, and SoftBank, will “eventually host 5 gigawatts worth of data centers.” Gigawatt-scale campuses aren’t just data centers; they are effectively industrial power projects built to serve the next generation of AI and HPC workloads.
Quick summary:
Enterprise Data Center: requires 1–5 MW (1,000–5,000 homes)
Large Colocation or Hyperscale Data Center: requires 50 MW (50,000 homes)
AI Data Center Campus: requires 300-800 MW (300,000-800,000 homes)
Gigawatt-scale AI Data Center Campus: 1,000 MW / 1 GW (1,000,000 homes)
Stargate UAE: 5,000 MW / 5 GW (5,000,000 homes)
The Delivery Math and What Models Are Missing
When a data center requests 300 MW of critical IT load to power AI workloads, that doesn’t mean the electrical grid only needs to generate 300 MW of electricity. In most modern AI campuses, delivering that much reliable 24/7 power can require building closer to 390-450 MW of total generation capacity. Here’s why: cooling systems (CRAHs, chillers, pumps, etc.) draw power of their own; UPS systems aren’t perfectly efficient (most incur 3-4% losses, plus battery-charging requirements); and other supporting infrastructure overhead, captured in the Power Usage Effectiveness (PUE) metric, adds 30-50% more electrical demand depending on the data center’s design, equipment specifications, and the ambient temperature and weather conditions local to the campus. That pushes the campus’s real peak power requirement to between 390-450 MW, a peak PUE of 1.3-1.5 during the hottest summer days.
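A back-of-envelope sketch of that gross-up, using the PUE range above (the specific PUE values are design- and climate-dependent, so these are illustrative):

```python
def total_site_power_mw(it_load_mw: float, pue: float) -> float:
    """Total facility power = critical IT load x Power Usage Effectiveness (PUE)."""
    return it_load_mw * pue

IT_LOAD_MW = 300
for pue in (1.3, 1.5):
    print(f"PUE {pue}: {total_site_power_mw(IT_LOAD_MW, pue):.0f} MW at the campus meter")
# PUE 1.3 -> 390 MW; PUE 1.5 -> 450 MW, before reserve margins and contingency planning
```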
Additionally, planning engineers, grid operators, and regulated utilities must plan for resource adequacy across seasons, grid conditions, and various contingencies. When you combine those factors, the math quickly escalates. This “300 MW” data center ends up needing transmission towers erected, wires strung, and power continuously routed, switched, and delivered: a much larger operational footprint, and an important reminder that digital infrastructure is only as scalable as the power systems behind it.
Why We Weren't Already Building for This
Until recently, data centers were a steady but predictable small slice of global power demand. Utilities and planning engineers treated them like any other large commercial customer - big, but not transformative. That changed almost overnight with the rise of AI, hyperscalers, neoclouds, and training and inference workloads, and the arms race for compute, which together have redefined the scale, timing, and location of demand. Here’s what’s changed:
Data center site sizes have exploded. Ten years ago, a large data center might draw 10–50 MW, roughly the load of a small town. Today, new AI campuses request 300–1,000+ MW each, with gigawatt-scale campuses now a common topic of conversation for utilities. Utilities weren’t designed to energize hundreds of megawatts (or gigawatts!) for a single customer at a single point-of-interconnection, let alone in months instead of years.
Training created a new kind of baseload, while inference is often bursty, with higher variability in its ramp rates. We got used to thinking of “AI” as one bucket, but the electrical load profile matters, and training and inference don’t look the same at scale. Training often behaves like a big, flat 24/7 industrial load. When a model training run kicks off, GPU clusters tend to run continuously for long stretches, with only occasional pauses for checkpointing and operational resets. In power terms, training can look a lot like Bitcoin mining: a steady, flat load profile.
Inference looks more like internet traffic. User and enterprise requests come in waves, and the compute serving those requests can spike during peak hours and soften during off-peak hours, when user requests are lower. Operators can smooth some of this by routing and load balancing across data centers and GPU clusters, but the underlying electrical load profile is driven by user demand, in contrast to training runs. The implication for power systems engineers and transmission planners is uncomfortable: it requires re-learning how large, dynamic loads interact with an increasingly mixed set of generation and storage resources on a modern, rapidly changing grid. AI doesn’t just add more load; it adds a large new baseload component via training and higher ramp-rate variability via inference, which may increase the need for dampening and power-quality management at the data center, campus, and grid level.
Location and latency lock demand into a few metros. The internet still “meets” in the same handful of metros where fiber networks converge, points-of-presence (PoPs) are established in major network interconnection hubs or carrier hotels, and technical talent remains concentrated. Instead of spreading demand evenly, we’re stacking enormous loads in already congested regions like Northern Virginia (NOVA), Dallas, and Dublin. Even if the world has enough electricity being generated to satisfy the overall demand of AI data centers, the challenge is often transmitting the electricity to these hubs at the scale required for today’s data center campuses.
Global Data Center Power Consumption Today vs 2030 Projections
Data centers are no longer a rounding error on the grid. They already represent a large, always-on block of demand, and AI is pushing that block up fast enough that it shows up in national planning, not just in corporate energy bills. Sam Altman’s comment that OpenAI alone could need on the order of 250 GW by 2033, within roughly eight years, is a strong signal: leading AI builders are sending up warning flares that they will need power at the scale of countries. For context, 250 GW is roughly the entire installed power generation capacity of Brazil... for ONE company.
Today, global data centers draw roughly 47 GW of continuous power, about 1.5% of global electricity in 2024 by the IEA’s accounting. Most of that is still traditional cloud and enterprise compute, but the growth is coming from AI infrastructure.
“While conventional servers and supporting infrastructure contribute to overall data center electricity consumption, the rapid rise of AI-optimized servers is fueling the increase in data center power consumption,” said Linglan Wang, Research Director at Gartner. “Their electricity usage is set to rise nearly fivefold, from 93 TWh in 2025 to 432 TWh in 2030.”
Gartner also estimates that total global data-center electricity consumption will rise from roughly 448 TWh in 2025 to about 980 TWh, or roughly 111 GW of continuous load, by 2030. Estimates across the industry are being revised constantly: BloombergNEF’s (BNEF) new forecast raises its outlook by 36% over the previous one, published seven months earlier.
“The massive growth rate in data center power demand reflects more than a surge in the number of data centers in the pipeline; it also highlights the new centers’ size. Of the nearly 150 new data center projects BNEF added to its tracker in the last year, nearly a quarter exceed 500 megawatts. That’s more than double last year’s share,” notes BloombergNEF.
As we analyze various reports on data center energy consumption in 2030, the projections vary widely. On the low end, the IEA’s base case puts U.S. data center consumption at 48 GW by 2030, roughly 133% growth. A Boston Consulting Group (BCG) report projects that data center power demand in the U.S. alone will reach 100-130 GW (800-1,050 TWh) by 2030, growing 15-20% annually. TechCrunch reports that “Data center energy demand forecasted to soar nearly 300% through 2035.” Whether these numbers materialize globally or in the U.S. alone, the consensus points to continued rapid growth.
Source: World Resources Institute
To get a sense of scale: when data centers and their supporting infrastructure consume ~170 GW of continuous load (about 1,500 TWh/year), that’s roughly the entire power demand of Japan plus the UK; ~230 GW (about 2,000 TWh/year) effectively adds Germany; and ~285 GW (about 2,500 TWh/year) adds France. The headline is not one exact number. It is that compute is starting to map onto the same unit of measure as major economies, and our power systems have to respond on that scale.
“This boom in data center demand is colliding with grid realities,” notes BloombergNEF.
Quick Explanation of TWh/year:
You’ll often see two metrics for energy: GW and TWh/year. Let’s break this down. Think of gigawatts (GW) as the instantaneous production rate of electricity (like miles per hour on your speedometer), and terawatt-hours (TWh) as how much you produced over time (like miles on your odometer, or how much energy flowed over the wires). To turn 50 GW into TWh per year, just multiply by the number of hours in a year:
There are 8,760 hours in a year.
Running at 50 GW all year: 50 GW × 8,760 hours = 438,000 GWh = 438 TWh per year.
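In code, the conversion is a single multiplication. A minimal sketch, using figures that appear elsewhere in this post:

```python
HOURS_PER_YEAR = 8_760  # 24 hours x 365 days

def gw_to_twh_per_year(gw: float) -> float:
    """Annual energy (TWh) from running a continuous load or output of `gw` gigawatts."""
    return gw * HOURS_PER_YEAR / 1_000  # GWh -> TWh

print(gw_to_twh_per_year(50))   # 438.0 TWh, the worked example above
print(gw_to_twh_per_year(170))  # ~1,489 TWh, roughly Japan plus the UK
print(gw_to_twh_per_year(111))  # ~972 TWh, close to Gartner's ~980 TWh 2030 estimate
```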
Forecasts vary because interconnection data is noisy and often speculative – but the direction and pace of load growth are unmistakable.
A Closer Look at the U.S.
Here's what the jump really means in plain English:
In 2024, a report by Lawrence Berkeley National Laboratory concluded that U.S. data centers used about 20 GW of continuous power - roughly 4% of total U.S. electricity generation (~176 TWh out of ~4,400 TWh). A DOE report projects that data centers will consume "approximately 6.7%-12% of total U.S. electricity by 2028." The report also cites that total data center electricity usage climbed from 58 TWh in 2014 to 176 TWh in 2023, and it estimates that data center electricity usage will climb to between 37 and 66 GW by 2028.
The Boston Consulting Group (BCG) projects that total U.S. data center power demand will increase by 15-20% annually to reach 100-130 GW (800-1,050 TWh) by 2030. That’s the equivalent of the electricity used by about 100 million U.S. homes, roughly two-thirds of all homes in the country.
Even if we use a number more conservative than BCG’s, say the upper end of the DOE’s 2028 range (~66 GW), we’re looking at roughly a +50 GW step-up in under a decade. In percentage terms, that’s +250% total growth on a grid historically planned to grow at ~2-3% per year. Doing the math, keeping up would require about 10 GW of new, always-on load-serving capacity every year. Even before you factor in operating reserve margins, resource adequacy planning, and construction and interconnection delays, the pace alone breaks old growth assumptions.
Bottom Line: No matter how you slice the numbers, that’s roughly 10x faster than the ~2–3%/yr grid planners typically model.
In summary:
In 2022: U.S. data centers required ~17 GW → ~3% of all U.S. power.
In 2024: U.S. data centers required ~20 GW → ~4% of all U.S. power.
By 2026: U.S. data centers may require ~35 GW → ~6.5% of all U.S. power.
By 2028: U.S. data centers may require ~65 GW → ~9% of all U.S. power.
By 2030: U.S. data centers may require ~100-130 GW → climbing to over 15% of U.S. power.
For illustrative purposes. Forward looking data based on estimates from third-party sources.
Why It Matters:
We’ve moved from a world of slow, predictable growth to one compounding at 15-20% per year and that demands an entirely new strategy and playbook. It means rethinking how we generate electricity, where we build the power plants, and how quickly we can connect them to the grid. The challenge isn’t just generating more electricity - it’s delivering gigawatts exactly where the large compute loads need to live, at a speed the old system wasn’t built for.
And for anyone watching policy closely, it’s clear that some people in government and the energy sector have already done this math. The recent wave of federal attention, new permitting reforms, and a revived focus on nuclear energy make perfect sense in this context. The numbers are too large to ignore and regulation is finally catching up to the scale of what’s coming.
How Do We Build Our Way Out of This?
Vivian Lee, Managing Director & Partner at BCG, says "The U.S. may face a shortfall of up to 80 GW of firm power to meet this demand by 2030". Let's be more conservative and say U.S. data centers require an additional 50 GW of continuous, 24/7 load; that has to come from somewhere. Turning that into annual energy, we need roughly 438 TWh/year of new generation (50 GW × 8,760 hours), the equivalent of powering about 45 million new homes, out of roughly 140 million homes in the U.S. total.
So what does it take to build our way out of this?
If we tried to meet all 438 TWh with one power source (counting nameplate capacity and, for simplicity, ignoring capacity factors):
Nuclear: 50 new 1 GW reactors
Natural gas (CCGT): 83 new 600 MW plants
Coal: 100 new 500 MW units
Wind: 1,000 new 50 MW wind farms
Solar: 500 new 100 MW solar plants
A blended path is more realistic:
40% nuclear/SMR → 20 new 1 GW reactors, or roughly 1,000 20 MW SMRs
40% solar → 200 new 100 MW solar plants
15% gas → 13 new 600 MW CCGT plants
5% coal → 5 new 500 MW units
+ BESS: ample gigawatt-scale battery energy storage systems (BESS) to store and shift the electricity generated by nuclear, solar, gas, and coal, and to provide fast frequency response that keeps the system at its target frequency as the number of generation and load resources grows
Coal is included for illustrative arithmetic only.
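One caveat worth making explicit: the counts above compare nameplate capacity, and real plants don’t run flat out all year. A sketch that redoes the single-source math with illustrative capacity factors (assumed values, not fleet measurements) shows how much the intermittent counts grow once you account for actual output:

```python
TARGET_TWH = 438
HOURS_PER_YEAR = 8_760

# (unit size in MW, assumed capacity factor) -- illustrative assumptions only
sources = {
    "Nuclear, 1 GW reactors":  (1_000, 0.90),
    "Gas CCGT, 600 MW plants": (600,   0.60),
    "Wind, 50 MW farms":       (50,    0.35),
    "Solar, 100 MW plants":    (100,   0.25),
}

for name, (unit_mw, cf) in sources.items():
    twh_per_unit = unit_mw / 1_000 * HOURS_PER_YEAR * cf / 1_000  # MW -> GW -> GWh -> TWh
    print(f"{name}: ~{TARGET_TWH / twh_per_unit:,.0f} units needed")
# Nuclear: ~56, gas: ~139, wind: ~2,857, solar: ~2,000 -- versus the nameplate counts above
```

The ordering of the conclusion doesn’t change, but the gap between firm and intermittent sources widens considerably once capacity factors are applied.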
Reality Check
Natural gas might fill part of the gap, but new turbine orders are already on four-year backlogs, and midstream pipeline capacity is limited. Coal remains both politically and environmentally untenable. Solar is vital, but the intermittent output can’t provide the 24/7 reliability that AI workloads demand without being coupled with large scale battery energy storage systems (BESS). That leaves an opening for nuclear, especially the new generation of Small Modular Reactors (SMRs) and microreactors now in development.
And for the first time in decades, the political winds seem to favor nuclear development. Support for nuclear power is now both broad-based and top-down. Recent federal policy changes have reduced incentives for new wind and solar while expanding tax credits for nuclear with credits now worth up to 40% of project costs. The administration is also overhauling the Nuclear Regulatory Commission (NRC), aiming to speed up approvals of next-generation reactor designs and simplifying permitting rules so startups can collocate reactors on military bases or legacy sites like the Idaho National Laboratory, which have hosted nuclear programs since the Manhattan Project.
Why Nuclear is Viewed as a Necessity
The U.S. is quietly clearing the runway for a nuclear comeback at exactly the moment the data center boom demands a different kind of power system. The challenge ahead isn’t whether the world has enough energy in aggregate. It’s whether we can deliver large amounts of reliable, round-the-clock power to very specific places, on timelines that match the pace of AI deployment. On that dimension, nuclear stands apart. It is the only large-scale, zero-carbon source that combines steady output, a compact footprint, and genuine location flexibility.
The nuclear advantage is energy density. Nothing else packs as much usable energy into so little material. A single uranium fuel pellet, roughly the size of a fingertip, contains as much energy as a ton of coal or about 149 gallons of oil. On a per-kilogram basis, nuclear fission releases one to three million times more energy than burning fossil fuels. That extreme density changes everything. Because so much energy is concentrated in such a small amount of fuel, nuclear requires far less land, transport, and physical infrastructure. A few truckloads of uranium can power an entire city for a year, without coal trains, sprawling solar fields, or thousand-mile gas pipelines.
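As a rough sanity check on that “one to three million times” claim, here is the per-kilogram comparison using textbook heat-content values (approximate constants; practical reactor fuel releases far less per kilogram, since only a fraction of the uranium is actually fissioned):

```python
# Approximate energy released per kilogram, in joules (textbook values)
FISSION_U235_J_PER_KG = 8.2e13  # complete fission of U-235 (~200 MeV per atom)
COAL_J_PER_KG = 2.4e7           # typical bituminous coal heat content
OIL_J_PER_KG = 4.2e7            # typical crude oil heat content

print(f"Fission vs coal: ~{FISSION_U235_J_PER_KG / COAL_J_PER_KG:,.0f}x")  # ~3,400,000x
print(f"Fission vs oil:  ~{FISSION_U235_J_PER_KG / OIL_J_PER_KG:,.0f}x")   # ~2,000,000x
```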
That density directly translates into deliverability. When the bottleneck is getting power into congested metros, the most effective solution is to bring dense generation to the load itself. A one-gigawatt nuclear plant occupies a few hundred acres and has a capacity factor or annual runtime average of close to 90 percent. Accounting for intermittency, it delivers the same usable energy as three to four gigawatts of inverter-based resources like wind or solar. Small Modular Reactors (SMRs) extend that logic even further. Built in factories, cooled without massive water requirements, and deployable in 20 to 300 megawatt blocks, SMRs can be co-located with data centers, industrial parks, or grid-strained regions where new transmission infrastructure is slow to be built or politically tenuous. This matters now because the constraints are shifting. As power demand triples in key U.S. hubs, fuel supply is not the limiting factor. Land availability, transmission capacity, and delivery timelines are. Nuclear allows firm baseload power to sit in the same metros where fiber, data gravity, and latency already anchor compute. It provides a clean, compact, always-on foundation that complements non-dispatchable renewables rather than competing with them.
One important nuance is that all AI workloads are not (electrically) equal. In many real deployments, training behaves like a large, steady 24/7 load (similar to Bitcoin mining) with periodic checkpointing, while inference is often more variable and traffic-driven, peaking and troughing with user demand. That distinction matters for nuclear. Training-heavy campuses are a strong match for firm baseload, because they benefit most from a generator that can run continuously at high utilization with stable operating economics. This is exactly where SMRs become strategically important: they can deliver clean, always-on power in modular blocks sized to the campus, and they can be sited closer to load when transmission is the bottleneck.
The other nuance is that modern AI campuses are dominated by power electronics (rectifiers, inverters, fast-switching GPU power supplies). When you tightly couple a large, power-electronics-heavy load to rotating thermal generation (including nuclear), grid stability and power quality become part of the solution set, not an afterthought. In practice, this is why SMRs are often discussed alongside grid-forming battery energy storage systems (BESS): the SMR provides the steady baseload, while grid-forming BESS acts as a fast “shock absorber” that can smooth sudden load steps, buffer bursty inference demand, and help dampen stability issues such as sub-synchronous oscillation (SSO) that can arise if the system is poorly integrated. This is the same general logic behind configurations where on-site generation is paired with utility-scale batteries on the medium-voltage system to stabilize the power delivered to GPU clusters and buffer the impact of load swings on the generator’s rotating synchronous machinery.
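To make the “shock absorber” role concrete, here is a toy model (not a description of any vendor’s control scheme): the generator ramps slowly toward the load within an assumed per-interval ramp limit, while the battery injects or absorbs the residual, so the turbine-generator never sees the sharp steps:

```python
# Toy model: grid-forming BESS buffering a bursty inference load.
# Assumed values throughout; intervals are arbitrary dispatch steps.
bursty_load_mw = [300, 300, 420, 420, 310, 480, 480, 320, 320, 300]
MAX_GEN_RAMP_MW = 20  # assumed generator ramp limit per interval

gen_mw = bursty_load_mw[0]
for load in bursty_load_mw:
    # Generator moves toward the load, but only within its ramp limit
    gen_mw += max(-MAX_GEN_RAMP_MW, min(MAX_GEN_RAMP_MW, load - gen_mw))
    battery_mw = load - gen_mw  # + = discharging, - = charging
    print(f"load {load:>3} MW | generator {gen_mw:>3} MW | BESS {battery_mw:+4d} MW")
```

Even in this crude sketch, the generator’s output moves in gentle 20 MW steps while the battery absorbs the 100+ MW swings, which is exactly the division of labor described above.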
Major technology companies are making moves in the space. Google signed an offtake agreement for 200 megawatts of fusion power from Commonwealth Fusion Systems’ ARC power plant in Virginia, years before the facility is even built. At the same time, momentum is building on the fission side as well. Last Energy recently reported it closed an oversubscribed $100 million financing round, underscoring growing investor confidence in modular, factory-built nuclear as a near-term solution to the power delivery problem. Then, on January 9, Meta announced agreements with three nuclear energy companies, Vistra, TerraPower, and Oklo, to secure up to 6.6 gigawatts (GW) of nuclear energy by 2035. The signal is hard to miss. As AI, data, and modern industry scale, energy density is becoming a strategic asset. And in the near to medium term, energy density means nuclear.
Conclusion: The Math is Becoming Increasingly Clear and the Dashboards Are Flashing Red
At this point, the question isn’t whether humanity can produce enough electricity; it’s whether we can build, permit, and deliver it fast enough to meet the exponential demands of an AI-driven economy. In the U.S. alone, data-center power demand could triple by 2030, adding the equivalent of 45 million new homes to the grid in less than five years. If Sam Altman’s projection of a 250 GW power need for OpenAI by 2033 comes to fruition, we’ll need an additional 250 million homes’ worth of electricity online by then. Regardless of which ‘up and to the right’ projection you use, energy consumption is outpacing capacity growth, and that’s before we really factor in AI’s second act: full enterprise adoption across the board, which is only just beginning.
To meet this new demand with nuclear, at current reactor sizes and timelines, America would need to bring one new nuclear reactor online every month for the next five years.
Let that sink in.
Instead, we’ve completed just two large reactors in the last half-decade (Vogtle Units 3 & 4), both years late and billions over budget. Meanwhile, China has added over 34 GW of nuclear capacity in ten years, with 33 reactors under construction and 58 currently operating.
The market signals couldn’t be clearer. Demand is exploding, and the supply chain, from copper to transformers to permitting offices, is already blinking red. This is the math behind the dashboards that investors are watching. Capital is now racing toward every part of the value chain: not just data-center developers and power producers, but also the infrastructure and materials that will inevitably be stretched thin - steel, concrete, fuel, chips, and grid infrastructure.
And let’s not forget the math many of the models might be missing. In the real world, powering a single 300 MW data center requires utilities to build and connect approximately 390-450 MW of total generation capacity, accounting for cooling, redundancy, and reliability margins. In practice, every new “300 MW” AI campus is closer to a half-gigawatt infrastructure project.
This means that if we look only at the IT load our data centers will demand, we may be underestimating the real need by roughly 30-50% from cooling and electrical overhead alone, and by closer to 60-70% once reserve margins and contingency planning are layered on. In practice, total generation and delivery capacity must grow far faster than the load itself to keep up with the physical infrastructure required to actually serve it.
Closing: From Optimization to Infrastructure
There is a bright side to this story. As AI systems scale, efficiency breakthroughs are already emerging throughout the stack, especially in inference, which now accounts for the majority of AI’s energy use.
One example is Galaxy portfolio company Peridio, which operates in the infrastructure layer of the stack and streamlines the deployment of complex Physical AI packages to edge targets. As inference moves to the edge, operational inefficiencies like failed updates, underutilized hardware, and redundant workloads quietly translate into wasted power. Inference is becoming a critical component of scaled Physical AI, allowing real-time workloads to run at the endpoint where the data is generated. Peridio helps ensure that this edge compute runs efficiently, reducing unnecessary cycles, idle infrastructure, and back-and-forth data movement, resulting in more energy-efficient systems.
Another example further down the stack is Galaxy portfolio company TetraMem, which is tackling the memory bottleneck that limits efficiency across modern AI workloads. TetraMem’s analog in-memory compute architecture, built on RRAM (memristor) technology, enables ultra-efficient, real-time AI processing at scale, from edge devices to full data centers. By performing computation directly where the data resides, this approach aims to dramatically reduce the power wasted moving data between memory and processors, cutting total energy use per AI task by orders of magnitude.
But the math doesn’t lie. Efficiency stretches the curve; it does not eliminate the constraint. We can’t optimize our way out of this alone. We also have to build our way out. Meeting the demands of an AI-driven economy will require a massive expansion of reliable, always-on power, paired with renewables, storage, and smarter transmission. Nuclear, in particular, must anchor that effort.
Which returns us to how we began this article. At Galaxy Interactive, and our parent company, Galaxy, more broadly, we’re not approaching this as a theoretical problem. We are also living this reality operationally. Just this week, Galaxy announced that it completed ERCOT Large Load Interconnection Studies and secured approval for an additional 830 megawatts of power at its Helios data center campus in West Texas, bringing total ERCOT-approved capacity to over 1.6 gigawatts. Through Galaxy’s AI and HPC data center business, Helios, we see firsthand how power availability, delivery timelines, and reliability now shape what compute can be built and where it can live.
On the supply side, we’re investing in firm, scalable energy solutions that match those realities. Our investment in Last Energy reflects a near-term focus on deployable, factory-built nuclear fission designed to deliver clean, always-on power closer to load, on commissioning timelines that align with modern data-center development. In parallel, our investment in Commonwealth Fusion Systems is aimed further out the curve, pursuing a path to fundamentally expand the world’s supply of abundant, clean power over the long run.
Different timelines, different risk profiles, same underlying thesis: if compute is becoming the new industrial base, then energy is no longer a background input, it is the critical infrastructure layer supporting it. In upcoming posts, we’ll go deeper into the investments we’ve made and the specific technologies, teams, and delivery models we believe are best positioned to relieve the power bottleneck on relevant timelines.
Legal Disclosure:
This document, and the information contained herein, has been provided to you by Galaxy Digital Inc. and its affiliates (“Galaxy Digital”) solely for informational purposes. This document may not be reproduced or redistributed in whole or in part, in any format, without the express written approval of Galaxy Digital. Neither the information, nor any opinion contained in this document, constitutes an offer to buy or sell, or a solicitation of an offer to buy or sell, any advisory services, securities, futures, options or other financial instruments or to participate in any advisory services or trading strategy. Nothing contained in this document constitutes investment, legal or tax advice or is an endorsement of any of the assets mentioned herein. You should make your own investigations and evaluations of the information herein. Any decisions based on information contained in this document are the sole responsibility of the reader. Readers should consult with their own advisors and rely on their independent judgement when making financial or investment decisions.
Participants, along with Galaxy Digital, may hold financial interests in certain assets referenced in this content. Galaxy Digital regularly engages in buying and selling financial instruments, including through hedging transactions, for its own proprietary accounts and on behalf of its counterparties. Galaxy Digital also provides services to vehicles that invest in various asset classes. If the value of such assets increases, those vehicles may benefit, and Galaxy Digital’s service fees may increase accordingly. The information and analysis in this communication are based on technical, fundamental, and market considerations and do not represent a formal valuation. For more information, please refer to Galaxy’s public filings and statements. Certain asset classes discussed, including digital assets, may be volatile and involve risk, and actual market outcomes may differ materially from perspectives expressed here.
For additional risks related to digital assets, please refer to the risk factors contained in filings Galaxy Digital Inc. makes with the Securities and Exchange Commission (the “SEC”) from time to time, including in its Quarterly Report on Form 10-Q for the quarter ended September 30, 2025, filed with the SEC on November 10, 2025, available at www.sec.gov.
Certain statements in this document reflect Galaxy Digital’s views, estimates, opinions or predictions (which may be based on proprietary models and assumptions, including, in particular, Galaxy Digital’s views on the current and future market for certain digital assets), and there is no guarantee that these views, estimates, opinions or predictions are currently accurate or that they will be ultimately realized. To the extent these assumptions or models are not correct or circumstances change, the actual performance may vary substantially from, and be less than, the estimates included herein. None of Galaxy Digital nor any of its affiliates, shareholders, partners, members, directors, officers, management, employees or representatives makes any representation or warranty, express or implied, as to the accuracy or completeness of any of the information or any other information (whether communicated in written or oral form) transmitted or made available to you. Each of the aforementioned parties expressly disclaims any and all liability relating to or resulting from the use of this information. Certain information contained herein (including financial information) has been obtained from published and non-published sources. Such information has not been independently verified by Galaxy Digital and, Galaxy Digital, does not assume responsibility for the accuracy of such information. Affiliates of Galaxy Digital may have owned, hedged and sold or may own, hedge and sell investments in some of the digital assets, protocols, equities, or other financial instruments discussed in this document. Affiliates of Galaxy Digital may also lend to some of the protocols discussed in this document, the underlying collateral of which could be the native token subject to liquidation in the event of a margin call or closeout. The economic result of closing out the protocol loan could directly conflict with other Galaxy affiliates that hold investments in, and support, such token. Except where otherwise indicated, the information in this document is based on matters as they exist as of the date of preparation and not as of any future date, and will not be updated or otherwise revised to reflect information that subsequently becomes available, or circumstances existing or changes occurring after the date hereof. This document provides links to other Websites that we think might be of interest to you. Please note that when you click on one of these links, you may be moving to a provider’s website that is not associated with Galaxy Digital. These linked sites and their providers are not controlled by us, and we are not responsible for the contents or the proper operation of any linked site. The inclusion of any link does not imply our endorsement or our adoption of the statements therein. We encourage you to read the terms of use and privacy statements of these linked sites as their policies may differ from ours. The foregoing does not constitute a “research report” as defined by FINRA Rule 2241 or a “debt research report” as defined by FINRA Rule 2242 and was not prepared by Galaxy Digital Partners LLC. Similarly, the foregoing does not constitute a “research report” as defined by CFTC Regulation 23.605(a)(9) and was not prepared by Galaxy Derivatives LLC. For all inquiries, please email [email protected].
©Copyright Galaxy Digital Inc. 2026. All rights reserved.