
Research • February 13, 2026

Weekly Top Stories - 02/13/26

BlackRock's DeFi debut?; AI backlash; yet another new L1

In this week's newsletter, Alex Thorn looks at BlackRock’s first foray into DeFi; Thad Pinakiewicz sizes up the latest “built for TradFi” L1; and Lucas Tcheyan and Jianing Wu analyze the growing headwinds facing AI, from grid capacity to political backlash to engineer disillusionment.

🪨 BlackRock Dips a Toe in DeFi

BlackRock allowed its onchain money-market fund BUIDL to trade on UniswapX. The world’s largest asset manager by AUM made BUIDL available through a request for quote (RFQ) module built by Uniswap, one of the oldest and largest DeFi applications. The integration is the result of a partnership between Securitize, BlackRock’s transfer and tokenization agent for BUIDL, and Uniswap Labs, the developer of UniswapX, and will allow “whitelisted participants” to trade in and out of BUIDL via a DeFi front end. As part of the integration, BlackRock acquired an undisclosed amount of Uniswap’s governance token, briefly sending UNI up 40%, though the move fully retraced and by Thursday the token traded below pre-announcement levels.

BlackRock launched the BUIDL money-market fund almost two years ago on Ethereum. The TradFi giant manages the underlying assets (composed entirely of short-dated U.S. Treasury bills and overnight repurchase agreements) and Securitize provides onchain and transfer agent services. BUIDL peaked at $2.9 billion AUM in May 2025 and currently has $1.8 billion AUM across 82 unique holding addresses. Users can hold BUIDL tokens on eight networks, with 32% of supply currently on Ethereum, 30% on Solana, and 27.4% on Binance’s BNB Chain. BUIDL has a $5 million investment minimum.

UniswapX is an intent-based, auction-style RFQ protocol that lets users swap using signed offchain orders that are then executed and settled onchain, not dissimilar to JupiterZ on Solana. “Fillers” compete to fill those orders with liquidity sourced however they can (i.e., on- or offchain). Usually, the set of fillers is permissionless, allowing anyone to compete to offer the best prices. In the BlackRock setup, however, according to the announcement, the set of fillers will consist of “whitelisted participants” including Wintermute, Flow Traders, and Tokka.
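To make the mechanics concrete, here is a toy sketch of a whitelisted RFQ auction in Python. Every name, type, and the whitelist itself are illustrative stand-ins, not UniswapX’s actual contracts or API: a real implementation would involve EIP-712 signatures, onchain settlement, and Securitize’s KYC checks.

```python
from dataclasses import dataclass

# Hypothetical toy model of a whitelisted RFQ auction. Nothing here is
# UniswapX's real interface; it only illustrates the flow: a signed
# offchain order goes out, fillers quote, and the best eligible quote wins.

@dataclass(frozen=True)
class SignedOrder:
    maker: str          # address trading in or out of BUIDL
    sell_token: str
    buy_token: str
    sell_amount: float
    signature: str      # stands in for an offchain EIP-712 signature

# Illustrative filler IDs; in the real setup, "whitelisted participants"
# are addresses that completed KYC onboarding with Securitize.
WHITELIST = {"wintermute", "flowtraders", "tokka"}

def run_rfq_auction(order: SignedOrder, quotes: dict[str, float]) -> tuple[str, float]:
    """Pick the best buy-side quote among whitelisted fillers.

    `quotes` maps a filler ID to the amount of `buy_token` it offers
    for the order. Non-whitelisted fillers are ignored, mirroring the
    permissioned setup; settlement itself would happen onchain.
    """
    eligible = {f: q for f, q in quotes.items() if f in WHITELIST}
    if not eligible:
        raise ValueError("no whitelisted filler quoted this order")
    winner = max(eligible, key=eligible.get)  # best price for the maker
    return winner, eligible[winner]
```

Note that in this sketch an anonymous searcher quoting the best price still loses the auction, which is exactly the trade-off the integration makes: compliance over open competition.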

Our take

The world’s biggest asset manager dipping its toes in onchain finance is a strong signal. Until this module existed, buyers and sellers could only trade BUIDL directly through Securitize. Now they can conceivably access liquidity via the auction-based trading protocol built by Uniswap Labs directly through a Uniswap-branded front end.

But there are several important caveats. As far as we can tell, the investment minimum has not been reduced from $5 million. All addresses, whether investors and holders or fillers and traders, must be “whitelisted participants,” meaning they must complete know-your-customer (KYC) onboarding with Securitize. (This is essentially identical to how tokenized GLXY works with our transfer agent, Superstate.) So, everyday DeFi users still can’t buy BUIDL, and everyday DeFi traders can’t provide liquidity. The assets cannot sit in a passive liquidity pool and instead never leave the “regulated perimeter of Securitize Markets.”

Big regulatory questions lie ahead for enhanced functionality to become possible in the future. BUIDL is a security – an actively managed money-market fund. Whether securities can trade in a truly decentralized way, such as in automated market makers like Uniswap v4, is a massive and ongoing regulatory question that many believe is at the top of the Securities and Exchange Commission’s (SEC) “innovation exemption” agenda. Finally, selling tokens directly to others, if they are securities, could also run afoul of dealer rules, depending on the nature of the direct sales. The dealer rule issue likely will limit the extensibility of this new setup absent further regulatory clarity.

These regulatory questions need to be answered by the SEC before BUIDL, or other issuer-sponsored securities (read the SEC’s guidance on that taxonomy and our explanation), can trade widely in DeFi. All of that said, BlackRock and Securitize deserve a lot of credit for moving the ball forward. Alex Thorn

0️⃣ Zero Permission? LayerZero’s New Blockchain Comes With a Catch

Interoperability protocol LayerZero unveiled plans for a layer-one blockchain, touting high throughput, zero-knowledge virtual machine (zkVM) support, and a low barrier to entry for validators, and boasting powerhouse launch partners Citadel, the DTCC, and Ark. Billed as “designed for TradFi,” the yet-to-launch chain, known as Zero, is best explained in contrast to Ethereum.

Ethereum has spent the better part of a decade making a very specific bargain with users: if you want a chain that is credibly neutral, where anyone can verify what happened without asking permission, then you accept that the base layer will move at the pace of what regular people’s machines can handle. It’s not that Ethereum is indifferent to throughput; it’s that Ethereum is allergic to throughput that arrives by restricting who gets to participate. That’s why Ethereum’s scaling story became one where layer-2 rollups kept the base layer boring, slow-ish, and universally verifiable, and more specialized systems took on the messy work while inheriting the same security and legitimacy.

But that story has been shifting. As Ethereum’s own layer-1 roadmap delivered real throughput and capacity gains, co-founder Vitalik Buterin has become more willing to say the quiet part out loud: scaling the base layer helps, and the more you can do on layer-1, the less you have to lean on a complicated menagerie of L2s, each introducing its own governance, trust boundaries, and weird edge cases. The strongest version of Ethereum’s endgame is not “everything is fragmented into a thousand micro-sovereignties,” it’s “the base layer gets strong enough that you don’t need fragmentation to get basic throughput.” Ethereum, in other words, is trying to scale without giving up the moral posture that makes it Ethereum: the idea that verification is not a privilege.

Zero is, at least on paper, a different bargain, and it’s packaged as a way to keep decentralization while escaping the performance ceiling imposed by hardware constraints. The blockchain’s core design is a settlement layer paired with “atomicity zones,” asynchronously executed shards that can support different kinds of applications and different execution engines, while still inheriting the guarantees of the shared validator set. Application execution happens inside these zones, and the settlement layer is supposed to be “ultra-lightweight,” providing finality and a unified security foundation without doing the heavy computation.

Zero’s most important structural move is splitting the network into block producers and block validators. Producers do the expensive work—constructing blocks, executing state transitions, and generating the ZK proofs—while validators do the cheap work, ingesting headers and verifying proofs. The official line is that validators still have final authority over what gets accepted.
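The economics of that split rest on verification being far cheaper than production. Real ZK proof systems are vastly more sophisticated than anything sketchable here, but the underlying asymmetry can be illustrated with a classic toy example that assumes nothing about Zero’s actual proof system: recovering a factorization is expensive, while checking a claimed factorization is cheap.

```python
# Toy illustration of the producer/validator asymmetry. This is NOT how
# ZK proofs work; the point is only that verifying a claim can be much
# cheaper than producing it. "Producing" here means factoring n by trial
# division (heavy); "validating" means multiplying the claimed factors
# back together (light), analogous to a validator checking a proof
# rather than re-executing a zone's state transitions.

def produce(n: int) -> list[int]:
    """Heavy work: recover the prime factorization of n by trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def validate(n: int, claimed_factors: list[int]) -> bool:
    """Cheap work: check a claimed factorization without redoing the search."""
    product = 1
    for f in claimed_factors:
        product *= f
    return product == n
```

The cost gap is the whole pitch: validators stay cheap and numerous because they only run `validate`, while whoever runs `produce` needs real hardware, and that is where the centralization pressure discussed below comes from.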

Our take

Zero is making a “have your cake and eat it too” pitch: you get enormous throughput because execution and proving happen on specialized machines, but you get decentralization because verification is so lightweight that almost anyone can participate. That is genuinely interesting, and there are plausible scenarios where this design could be the cleanest path to scaling verifiable public computation without requiring every participant to run a data center.

But the story gets more complicated when you ask the question that Ethereum and R&D shop Flashbots have trained users to ask, which is not “can a validator reject an invalid block?” but “who can realistically make the blocks in the first place?” A decentralized verification layer is not the same thing as a decentralized production layer, and production is where liveness and censorship happen.

Zero’s documentation doesn't explicitly say block producers would be permissioned, but it does concede that producers would have much higher performance requirements. That’s a polite way of saying that, at scale, producers will be fewer in number, richer, and more operationally sophisticated than validators. In other words, Zero’s block producers will tend to look like Solana’s validator set of “institutions,” even if the Zero team never describes them that way.

A validator set can be huge, censorship-resistant, and morally reassuring, but it can’t validate blocks that do not exist, and that’s the key risk with Zero: if a zone’s producer goes offline or simply doesn’t produce, the zone can halt, leaving validators with nothing to do except verify the absence of activity. Ethereum makes decentralization a constraint and accepts the throughput penalty. Zero tries to make decentralization a downstream consequence of proof verification, hoping the production market remains competitive enough that liveness doesn’t centralize.

This is where the governance design matters, not as a footnote but as the central political economy. Zero’s materials emphasize that Atomicity Zones are not sovereign chains, and that creation and modification of zones must pass a governance process that is open to all governance token holders and built into the chain. That requirement is pitched as an antidote to the rollup world, where each rollup can be its own little nation-state with its own constitution, sequencer, and upgrade key. But the anti-sovereign rollup stance is also, paradoxically, a way to centralize sovereignty at the meta layer. The zones may not be “sovereign” in the strict sense, because chain governance has the ultimate say to remove them. But they can still be operationally sovereign in the sense that a zone can have its own execution logic, privacy model, and—crucially—permissioned block builder sets. The chain gets to say, with a straight face, that it has prevented sovereign chains, while still enabling the practical reality of sovereign operations so long as the token-weighted legislature says yes.

So, is Zero doing anything other than setting up another corpo chain? The honest answer is that it is doing something technically valid — separation of producers and validators with ZK proof verification is not a marketing gimmick — but the political economy of that design is exactly where “corpo chain” emerges as an equilibrium rather than an intention. If the throughput story depends on specialized producers, then the question that matters is not whether validators can reject censorship, but whether producers can be numerous enough that censorship is expensive and liveness is resilient. If the zone story depends on governance approvals, then the question that matters is not whether zones are sovereign in the whitepaper taxonomy, but whether they can become practically sovereign once governance has been captured by, or simply concentrated in, the kinds of actors who tend to win token-weighted politics.

Ethereum’s scaling debates are ultimately about how to scale without limiting who gets to verify. Zero’s scaling pitch, whether it acknowledges it or not, is about how much asymmetry we should tolerate between the actors who can verify blocks and those who can produce them. And if the answer is “a lot,” then yes, you may have reinvented the corporate chain, except this time it comes with ZK proofs, a governance process, and a very soothing story about censorship resistance. – Thad Pinakiewicz

🤖 The Clank-lash

So much happened in AI this week, you’d almost need an AI to keep track of it all, but the through-line was political and cultural backlash against “clankers.”

OpenAI and Anthropic released new state-of-the-art coding models within hours of each other. Both are significant for more than intelligence improvements. OpenAI’s Codex 5.3 was “instrumental in creating itself,” a milestone in AI development. Opus 4.6, on the other hand, prompted one of the most detailed safety disclosures an AI lab has ever released. The risk of catastrophic outcomes from Anthropic’s latest model is “low, but not negligible,” the report warned.

Meanwhile, employees from OpenAI, Anthropic, and xAI quit, and to varying degrees gave unsettling reasons. Most dramatically, the Anthropic research lead who resigned declared that “the world is in peril.” A now-former OpenAI researcher said the company “is building an economic engine that creates strong incentives to override its own rules.” And one of the two departing xAI co-founders predicted that “[r]ecursive self-improvement loops likely go live in the next 12 months,” an ominous sign for jobs.

On the ground, the physical infrastructure AI depends on is hitting political walls: New York became the sixth state in the past few weeks to introduce a bill to limit issuance of data center building permits. Meanwhile, the White House reportedly is encouraging AI companies to sign a “voluntary” pact designed to mitigate data center development’s impact on water supplies, electricity prices, and grid reliability.

Our take

In the past three years, AI models have achieved gold-medal math performance, PhD-level science benchmarks, and near-autonomous operation. Rapid innovation, however, is starting to sow the seeds of disruption. The S&P Software & Services Select Industry Index is down 20% YTD, with over $2 trillion wiped from the market caps of software-as-a-service (SaaS) companies. (Overheard on a trading floor recently: “SaaS is the new Europe.”)

Disruption leads to fears of destabilization, especially on issues like model safety and societal impact. The concern is not simply whether models are safe. It is whether the incentives of the organizations building them are aligned with broader societal interests. Competitive pressure, capital intensity, and market dominance can all shape deployment decisions in subtle ways.

This is why policy matters, not to slow innovation for its own sake, but to ensure that no small cluster of firms becomes the de facto steering committee for a general-purpose technology. The challenge is threading the needle: preventing excessive concentration of power without choking off the experimentation and iteration that drive progress. Getting that policy right is a daunting task, but the tides are beginning to shift.

Just a few months ago, placing a moratorium on data centers might have seemed like a heterodox idea promoted by the likes of Senator Bernie Sanders. Now, with midterm elections looming, the Trump administration appears to be trying to preempt state crackdowns and assuage consumer frustration with rising utility rates tied to AI infrastructure. According to Politico, the White House’s voluntary draft pact would have hyperscalers commit to covering 100% of the cost of power generation and transmission, allow for load curtailment during grid stress events, and make other concessions. As a carrot, Washington would commit to supporting faster interconnection for data centers. A pro-AI administration has arguably gone from playing offense, exemplified by its moves last year to override state regulation, to active defense.

At its core, the debate over AI is also one of fear versus optimism. Fear spreads virally; optimism rarely does. Fear is visible and visceral: layoffs, disrupted industries, rising power demand. Optimism is quieter and forward-looking. We can easily point to the jobs that may be threatened or the business models under pressure. It is much harder to visualize the industries that do not yet exist, the new forms of work that will emerge, or the ways everyday life could improve as intelligence becomes cheaper and more abundant. Lucas Tcheyan & Jianing Wu

Chart of the Week

Application fees on the Solana network rebounded in January, rising 43% month-over-month to $385 million and snapping a five-month streak of consecutive declines. The prior slowdown in fee generation closely mirrored the broader pullback in crypto asset prices, as diminished speculative appetite weighed on onchain activity. While onchain activity has begun to diversify beyond meme-centric trading, meme-related applications such as trading terminals, decentralized exchanges (DEXs), and token launchpads remain the dominant contributors to application fees. For more insights, read Galaxy Research’s 4Q update on Solana.
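As a quick sanity check on the reported figures, a 43% month-over-month rise to $385 million implies December application fees of roughly $269 million (assuming the percentage is measured against the prior month’s total, which the text implies but does not state):

```python
# Reported: January Solana application fees of $385M, up 43% MoM.
# Back out the implied prior-month (December) figure.
jan_fees_musd = 385
mom_change = 0.43
dec_fees_musd = jan_fees_musd / (1 + mom_change)
print(round(dec_fees_musd))  # prints 269 ($ millions, implied)
```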


In Other News

🏦Aave Labs proposes new revenue split with DAO

🏹Robinhood launches test version of its own blockchain

🤦‍♂️Bithumb accidentally gives away $40b of bitcoin

🔮Jump Trading to get Kalshi, Polymarket stakes in exchange for market-making

🎚️Levl raises $7m to provide stablecoin infrastructure for fintechs

🚨Israel charges two over Polymarket bets on military operations
