
OKX OnchainOS AI Toolkit: When Exchanges Become Agent Operating Systems

· 12 min read
Dora Noda
Software Engineer

On March 3, 2026, while most exchanges were still figuring out how to add chatbots to customer support, OKX launched something fundamentally different: an entire operating system for autonomous AI agents. The OnchainOS AI Toolkit isn't about making trading faster for humans—it's about making it possible for machines.

With infrastructure already processing 1.2 billion daily API calls and $300 million in daily trading volume, OKX just transformed from an exchange into what might be the most ambitious bet on the agent economy. The question isn't whether AI agents will trade crypto autonomously. It's which infrastructure will dominate when they do.

The Agent-First Exchange Architecture

Traditional crypto exchanges optimize for human decision-making: charts, order books, buttons. OKX's OnchainOS flips this entirely. Instead of humans clicking through interfaces, AI agents issue natural language commands that execute across 60+ blockchains and 500+ DEXs simultaneously.

This architectural shift mirrors a broader industry transformation. Coinbase announced Agentic Wallets on February 11, 2026, with the x402 protocol for autonomous spending. Binance's CZ promised a "Binance-level brain" for AI agents. Even Bitget is retrofitting non-custodial wallets with autonomous decision-making.

But OKX's approach is distinctly infrastructure-focused. Rather than building agent personalities or trading strategies, they've created the operating system layer—unifying wallet functionality, liquidity routing, and market data into a single framework that any AI model can access.

Three Paths to Agent Integration

OnchainOS offers developers three integration methods, each targeting different use cases:

AI Skills provide natural language interfaces where agents can say "swap 100 USDC to ETH on the best available DEX" without knowing how routing works. For developers building conversational agents or customer-facing bots, this removes API complexity entirely.

Model Context Protocol (MCP) integration means OnchainOS plugs directly into LLM frameworks like Claude, Cursor, and OpenClaw. An AI coding assistant can now autonomously interact with blockchain state, execute trades, and verify on-chain data as part of its normal reasoning loop—no custom integration required.

REST APIs give scripted control for traditional developers building programmatic strategies. While less innovative than natural language commands, this ensures backward compatibility with existing trading infrastructure and allows gradual migration to agent-based systems.

The practical implication: whether you're building a fully autonomous trading bot, enhancing an existing AI assistant with crypto capabilities, or just want API access with intelligent routing, OnchainOS provides the appropriate abstraction layer.
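To make the REST path concrete, here is a minimal sketch of what a signed programmatic swap request might look like. The endpoint path, payload fields, and signing scheme are illustrative assumptions in the style of common exchange APIs, not the documented OnchainOS interface.

```python
import hashlib
import hmac
import json
import time
from base64 import b64encode

# Hypothetical credentials; a real integration would load these securely.
API_SECRET = b"demo-secret"

def build_swap_request(from_token: str, to_token: str, amount: str) -> dict:
    """Assemble a signed swap request (endpoint and fields are invented)."""
    body = json.dumps({
        "fromToken": from_token,
        "toToken": to_token,
        "amount": amount,
        "routing": "best-price",   # let the platform pick the DEX route
    }, separators=(",", ":"))
    timestamp = str(int(time.time() * 1000))
    # HMAC-SHA256 over timestamp + method + path + body is a common
    # exchange signing pattern; the real scheme may differ.
    prehash = timestamp + "POST" + "/api/v1/agent/swap" + body
    signature = b64encode(
        hmac.new(API_SECRET, prehash.encode(), hashlib.sha256).digest()
    ).decode()
    return {"path": "/api/v1/agent/swap", "body": body,
            "timestamp": timestamp, "signature": signature}

req = build_swap_request("USDC", "ETH", "100")
```

The same swap expressed as a natural-language Skill would reduce to the single instruction quoted above; the REST layer simply exposes the underlying parameters explicitly.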

The Economics of Agent Infrastructure

The numbers reveal production-scale deployment, not a pilot program. Processing 1.2 billion API calls daily with sub-100ms response times and 99.9% uptime requires infrastructure that most exchanges couldn't replicate overnight.

OKX's liquidity aggregation across 500+ DEXs creates economic advantages for agents that humans can't match manually. When an agent needs to execute a large swap, the system automatically:

  1. Queries real-time pricing across hundreds of liquidity pools
  2. Calculates optimal routing to minimize slippage
  3. Splits orders across multiple DEXs if needed
  4. Executes transactions in parallel across chains
  5. Verifies settlement and updates agent state

All of this happens in milliseconds. For human traders, this level of cross-DEX optimization requires running multiple interfaces simultaneously, manually comparing rates, and accepting that by the time you've checked five options, prices have moved.
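The order-splitting step can be illustrated with a toy greedy router over constant-product (x·y=k) pools. This is a simplified model for intuition only; the production router's pricing and splitting logic is certainly more sophisticated.

```python
# Toy illustration of step 3 above: splitting a large order across pools
# to reduce price impact. Pools are constant-product AMMs, fees ignored.

def quote_out(pool: dict, amount_in: float) -> float:
    """Output of a constant-product AMM swap for a given input amount."""
    x, y = pool["reserve_in"], pool["reserve_out"]
    return y - (x * y) / (x + amount_in)

def split_order(pools: list[dict], total_in: float, chunks: int = 100) -> list[float]:
    """Greedily route the order in small chunks to whichever pool
    currently offers the best marginal price."""
    fills = [0.0] * len(pools)
    chunk = total_in / chunks
    state = [dict(p) for p in pools]  # copies, so we can simulate depletion
    for _ in range(chunks):
        best = max(range(len(state)), key=lambda i: quote_out(state[i], chunk))
        out = quote_out(state[best], chunk)
        state[best]["reserve_in"] += chunk
        state[best]["reserve_out"] -= out
        fills[best] += chunk
    return fills

pools = [
    {"reserve_in": 1_000_000, "reserve_out": 500},  # deep pool
    {"reserve_in": 100_000, "reserve_out": 52},     # shallow, better spot price
]
fills = split_order(pools, 50_000)
```

Note how the shallow pool captures the first chunks at its better price, then the router shifts flow to the deep pool as slippage mounts, which is exactly why splitting beats sending the whole order to the best-quoted venue.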

The $300 million daily trading volume processed through OnchainOS suggests meaningful early adoption. More tellingly, that volume runs through infrastructure supporting over 12 million monthly wallet users—meaning the agent layer sits on top of battle-tested systems handling real user funds.

Unified Wallet Infrastructure vs Specialized Agent Wallets

Coinbase's Agentic Wallets take a purpose-built approach: wallets designed specifically for autonomous spending with security guardrails baked in. OKX went the opposite direction: integrate agent capabilities into existing wallet infrastructure that already supports 60+ chains.

The trade-offs are architectural. Purpose-built agent wallets can optimize for autonomous operation from the start—built-in spending limits, risk parameters, and recovery mechanisms designed for machines making decisions without human oversight. Unified infrastructure inherits complexity from supporting diverse chains and use cases but offers broader reach and battle-tested security.

OKX's bet is that agents will need access to the full crypto ecosystem, not a sandboxed environment. If an autonomous agent is managing a DAO's treasury, arbitraging across chains, or rebalancing a portfolio dynamically, it needs native access to wherever liquidity lives—not a specialized wallet that only works on three chains.

The market hasn't decided which approach wins. What's clear is that both OKX and Coinbase recognize the same shift: autonomous agents need infrastructure designed for them, not retrofitted human tools.

On-Chain Data Feeds: The Agent Information Layer

Trading decisions require data. For AI agents, OnchainOS provides real-time feeds covering tokens, transfers, trades, and account states across all supported networks.

This solves a problem that anyone building multi-chain applications knows intimately: querying blockchain state from dozens of networks is slow, requires running infrastructure for each chain, and introduces failure points when nodes go down or lag behind.

OnchainOS abstracts this entirely. An agent queries "get all recent trades for token X across networks Y and Z" and receives normalized, real-time data without knowing which RPC endpoints to call or how different chains structure transaction logs.
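The normalization work such a feed performs can be sketched as follows: two chains report trades in different shapes, and the client sees one schema. All field names here are invented for illustration, not OnchainOS's actual response format.

```python
# Sketch of multi-chain trade normalization. Field names are hypothetical.
from datetime import datetime, timezone

def normalize_evm(log: dict) -> dict:
    """EVM-style log: amounts in 18-decimal base units, unix timestamp."""
    return {
        "chain": log["chainId"],
        "token": log["token"].lower(),
        "amount": int(log["value"]) / 10**18,
        "time": datetime.fromtimestamp(log["ts"], tz=timezone.utc),
    }

def normalize_solana(event: dict) -> dict:
    """Solana-style event: 9-decimal base units, block time."""
    return {
        "chain": "solana",
        "token": event["mint"],
        "amount": event["amount"] / 10**9,
        "time": datetime.fromtimestamp(event["blockTime"], tz=timezone.utc),
    }

def merged_feed(evm_logs, sol_events):
    """Merge heterogeneous sources into one time-ordered, uniform stream."""
    trades = [normalize_evm(l) for l in evm_logs] + \
             [normalize_solana(e) for e in sol_events]
    return sorted(trades, key=lambda t: t["time"])

feed = merged_feed(
    [{"chainId": "ethereum", "token": "0xABC",
      "value": "2000000000000000000", "ts": 1767225600}],
    [{"mint": "So111", "amount": 5_000_000_000, "blockTime": 1767225500}],
)
```

An agent consuming `feed` never touches chain-specific decimals or log formats, which is the abstraction the paragraph above describes.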

The competitive edge isn't just convenience. Agents making sub-second trading decisions need data latency measured in milliseconds. Running your own nodes for 60 blockchains to achieve similar performance requires infrastructure investment that most developers can't justify. Cloud RPC providers add latency and costs that kill the economics of high-frequency agent strategies.

By unifying data feeds as part of the platform, OKX turns infrastructure costs into a distributed shared resource—making sophisticated agent strategies accessible to independent developers, not just well-funded firms.

The x402 Protocol and Zero-Gas Execution

Autonomous payments run on the x402 pay-per-use protocol, which addresses a fundamental agent economy problem: how do machines pay each other without manual intervention?

When an AI agent needs to access a paid API, purchase data, or compensate another agent for services, x402 enables automatic settlement. Combined with zero-gas transactions on OKX's X Layer, agents can make micropayments economically—something impossible when each payment costs more in gas than the service itself.

This matters more as agent-to-agent interactions increase. A single high-level agent task might involve:

  • Querying market data from a specialized analytics agent
  • Calling a sentiment analysis API agent
  • Purchasing on-chain position data
  • Executing trades through a routing agent
  • Verifying results through an oracle agent

If each step requires manual approval or gas costs that exceed the value transferred, the agent economy never scales beyond human-supervised operations. x402 and zero-gas execution remove these friction points.
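The challenge-pay-retry pattern behind pay-per-use protocols like x402 can be simulated in a few lines. This is a deliberately simplified in-memory model: real x402 settles on-chain, and every payload field below is invented for illustration.

```python
# Simplified simulation of an HTTP 402 pay-per-use exchange between agents.
# Settlement here is an in-memory ledger, not an on-chain transfer.

PRICE = 0.001  # dollars per call

class DataAgent:
    """Seller: responds 402 until the caller attaches a settled payment."""
    def __init__(self):
        self.ledger = {}  # payment_id -> amount settled

    def handle(self, request: dict) -> dict:
        pid = request.get("payment_id")
        if pid not in self.ledger or self.ledger[pid] < PRICE:
            return {"status": 402, "price": PRICE, "pay_to": "data-agent"}
        return {"status": 200, "body": {"sentiment": "bullish"}}

class TradingAgent:
    """Buyer: on 402, settles the quoted price and retries automatically."""
    def __init__(self, balance: float):
        self.balance = balance

    def call(self, seller: DataAgent, request: dict) -> dict:
        resp = seller.handle(request)
        if resp["status"] == 402 and self.balance >= resp["price"]:
            self.balance -= resp["price"]
            pid = "pay-001"
            seller.ledger[pid] = resp["price"]  # the settlement step
            resp = seller.handle({**request, "payment_id": pid})
        return resp

buyer = TradingAgent(balance=1.0)
result = buyer.call(DataAgent(), {"query": "ETH sentiment"})
```

The key property is that no human approves the sub-cent payment: the buyer's policy (sufficient balance, quoted price) decides, which is what makes multi-step agent pipelines economically viable.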

Market Context: The $50 Billion Agent Economy

OnchainOS arrives as the AI-crypto convergence accelerates. The blockchain AI market is projected to grow from $6 billion in 2024 to $50 billion by 2030. More immediately, 282 crypto × AI projects secured venture funding in 2025, with 2026 showing strong momentum.

Virtuals Protocol reports 23,514 active wallets generating $479 million in AI-generated GDP (aGDP) as of February 2026. These aren't theoretical metrics—they represent agents actively managing value, executing trades, and participating in on-chain economies.

Transaction infrastructure has fundamentally improved. Blockchain throughput increased more than 100x in five years, from roughly 25 TPS to 3,400 TPS. Ethereum L2 transaction costs dropped from $24 to under one cent. High-frequency agent strategies that were economically impossible in 2023 are now routine.

Stablecoins processed $46 trillion in volume last year ($9 trillion adjusted), with projections showing AI "machine customers" controlling up to $30 trillion in annual purchases by 2030. When machines become primary transactors, they need infrastructure optimized for autonomous operation.

Developer Adoption Signals

OnchainOS launched with comprehensive documentation and starter guides, targeting builders deploying their first AI agents. The Model Context Protocol integration is particularly strategic—by plugging into frameworks developers already use (Claude, Cursor), OKX removes the "learn a new platform" barrier.

For developers already building trading bots or automation scripts, the REST API provides migration paths. For AI researchers experimenting with autonomous agents, natural language Skills offer the fastest path to on-chain capabilities.

What OKX hasn't provided: proprietary agent personalities, pre-built trading strategies, or "click here for autonomous trading" consumer products. This is infrastructure, not an end-user application. The bet is that thousands of developers building specialized agents will create more value than OKX could by building a single agent trading product.

This mirrors successful platform strategies in other markets. AWS didn't try to build every application—they provided compute, storage, and networking primitives that millions of developers used to build diverse applications. OnchainOS positions OKX as the AWS of agent infrastructure.

Competitive Dynamics and Market Evolution

The exchange industry is bifurcating. Traditional exchanges optimize for retail traders clicking buttons and institutions running regulated operations. Agent-first exchanges optimize for autonomous systems executing programmatic strategies across fragmented liquidity.

Coinbase's approach emphasizes purpose-built agent wallets with regulatory compliance considerations. OKX emphasizes breadth—60+ chains, 500+ DEXs, massive existing user base. Binance promises AI but hasn't shipped infrastructure. Smaller exchanges lack the resources to compete on infrastructure at this scale.

Network effects favor early movers. If OnchainOS becomes the standard way developers build trading agents, liquidity concentrates there because that's where the agents are. More liquidity attracts more agents. This is the same dynamic that made Ethereum the default smart contract platform despite technical limitations—developers were already there.

But it's early. Coinbase has regulatory relationships and institutional trust that matter for compliant agent deployment. Decentralized protocols might offer agent infrastructure without exchange dependency. The market could fragment by use case—Coinbase for institutional agents, OKX for DeFi-native operations, Solana's ecosystem for high-frequency strategies.

What "Agent-First" Really Means

The OnchainOS launch clarifies what "agent-first" infrastructure actually requires:

Natural language interfaces so non-specialist developers can build agents without learning complex blockchain APIs.

Unified cross-chain access because agents don't care about chain tribalism—they optimize for execution quality wherever liquidity exists.

Real-time data aggregation packaged as queryable feeds rather than requiring infrastructure operations.

Autonomous payment rails that let agents transact with each other economically.

Production-scale infrastructure with millisecond latency and high uptime because agents making autonomous decisions can't wait for slow API responses.

What's notable is what's missing: OKX didn't build AI models, train specialized trading agents, or create consumer-facing "autonomous trading" products. They built the layer beneath all of that.

This suggests confidence that the agent economy will be diverse—many specialized agents built by different developers for different strategies, not a few dominant trading bots. If you believe in that future, infrastructure positioning makes strategic sense.

Open Questions and Risk Factors

Several uncertainties remain. Regulatory treatment of autonomous trading systems is unresolved. When an agent executes trades violating market manipulation rules, who's liable—the developer, the exchange, the model provider?

Security risks scale differently. A bug in human-facing trading interfaces affects users who click compromised buttons. A bug in agent APIs could trigger cascading autonomous failures across thousands of agents simultaneously.

Centralization concerns persist. OnchainOS is infrastructure controlled by OKX. If agents depend on this platform for critical functionality, OKX gains enormous leverage over the agent economy—exactly the dependency crypto supposedly eliminates.

Technical risks include agent unpredictability. LLMs make probabilistic decisions. An agent optimized for yield farming might, through unexpected prompt interpretation, execute strategies its operator never intended. When that agent controls significant capital, unpredictability becomes systemic risk.

Market adoption remains unproven beyond early metrics. 1.2 billion API calls sounds impressive but could represent a small number of high-frequency bots rather than broad developer adoption. $300 million daily volume is meaningful but tiny compared to centralized exchange totals.

The Infrastructure Thesis

OKX's OnchainOS represents a specific thesis about crypto's evolution: that autonomous agents will become primary users of blockchain infrastructure, and exchanges that provide optimal agent tooling will capture disproportionate value.

This thesis is either visionary or premature. If agents do become dominant blockchain users, building this infrastructure in early 2026 positions OKX as the platform of choice before competitive dynamics lock in. If adoption lags or takes different forms, significant engineering resources go toward supporting a market that never materializes at scale.

What's clear is that OKX isn't waiting to find out. By shipping production infrastructure processing billions of API calls and hundreds of millions in trading volume, they're not pitching a vision—they're deploying a platform and learning from real usage.

The exchanges that emerge as winners in 2028 probably won't be the ones with the best trading interfaces for humans. They'll be the ones where autonomous agents found the infrastructure that made machine-to-machine crypto economies actually work.

OnchainOS is OKX's bet that infrastructure wins in the end. The next 12-24 months will reveal whether the agent economy grows fast enough to justify that conviction.



Japan's Datachain Launches First Enterprise Web3 Wallet with Privacy-Preserving Architecture

· 10 min read
Dora Noda
Software Engineer

Every corporate blockchain transaction tells a story—and that's exactly the problem.

When enterprises deploy stablecoins for cross-border payments or treasury operations, public blockchain transparency creates a dilemma. Every transaction becomes permanently visible: payment amounts, counterparties, timing patterns, and business relationships. For corporations, this isn't just uncomfortable—it's a competitive intelligence leak that makes blockchain adoption a non-starter.

Japan's Datachain has built a solution. In Spring 2026, the company will launch the country's first corporate-focused Web3 wallet, delivering what long seemed impossible: complete transaction privacy alongside stringent regulatory compliance. The announcement signals a critical evolution in enterprise blockchain infrastructure, moving beyond the binary choice between transparency and privacy.

The Corporate Privacy Problem

Traditional finance operates on privacy by default. When Toyota wires payment to a supplier, competitors don't see the amount, timing, or counterparty. Banking infrastructure enforces confidentiality through institutional silos, with regulators granted selective access for compliance.

Public blockchains invert this model. Every transaction creates a permanent, public record. While wallet addresses provide pseudonymity, blockchain analytics firms can de-anonymize participants through pattern analysis. Transaction volumes reveal business relationships. Timing patterns expose operational rhythms. Payment amounts telegraph commercial terms.

For enterprises considering blockchain adoption, this transparency creates untenable risks. A manufacturer using stablecoins for supplier payments inadvertently broadcasts their entire supply chain to competitors. A treasury department moving assets between wallets reveals liquidity positions to market observers. Cross-border payment flows expose geographic expansion plans before public announcements.

Japan's regulatory environment compounds the challenge. The country's Payment Services Act requires crypto asset exchange service providers (CAESPs) to implement comprehensive know-your-customer (KYC) and anti-money laundering (AML) procedures. The Travel Rule, effective since June 2023, mandates that providers share originator and beneficiary information when transferring crypto assets or stablecoins. Service providers must obtain and record counterparty details—even for transactions not subject to the Travel Rule—and investigate unhosted wallet attributes to assess associated risks.

This regulatory framework leaves enterprises caught between two incompatible requirements: blockchain transparency that regulators can audit, and commercial confidentiality that competitive business demands.

Datachain's Privacy-by-Design Architecture

Datachain's solution—branded as "Datachain Privacy" infrastructure with the "Datachain Wallet" interface—implements what the company describes as a "triple-layer privacy model": anonymity, confidentiality, and unlinkability.

Anonymity means transaction participants' identities remain hidden from public view. Unlike pseudonymous blockchain addresses that can be de-anonymized through pattern analysis, Datachain's architecture prevents correlation between wallet addresses and corporate identities without explicit disclosure.

Confidentiality ensures transaction details—amounts, counterparties, timestamps—remain private between participating parties. Public blockchain observers cannot determine payment values or business relationships by analyzing on-chain data.

Unlinkability prevents observers from connecting multiple transactions to the same entity. Even if an enterprise conducts thousands of stablecoin transfers, blockchain analytics cannot cluster these activities into a coherent profile.

The system achieves this privacy through what appears to be zero-knowledge proof technology and selective disclosure mechanisms. Zero-knowledge proofs enable one party to prove statement validity—like "this transaction meets regulatory requirements"—without revealing the underlying data. Selective disclosure allows enterprises to demonstrate compliance to regulators while maintaining commercial privacy from competitors.
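The selective-disclosure pattern can be illustrated with a toy hash-commitment scheme: commit to every field of a transaction, publish only the commitments, then reveal a single field (with its nonce) to a regulator. This illustrates the disclosure pattern only; it is not a zero-knowledge proof system, and the transaction fields are invented.

```python
# Toy hash-commitment sketch of selective disclosure. Not ZK, not
# Datachain's actual construction -- purely illustrative.
import hashlib
import secrets

def commit(value: str, nonce: bytes) -> str:
    """Binding, hiding commitment: hash of nonce plus value."""
    return hashlib.sha256(nonce + value.encode()).hexdigest()

tx = {"amount": "1,500,000 JPYC",
      "counterparty": "Supplier K.K.",
      "memo": "Q2 parts"}

# Enterprise side: one commitment per field goes to public observers.
nonces = {k: secrets.token_bytes(16) for k in tx}
public_record = {k: commit(v, nonces[k]) for k, v in tx.items()}

# A regulator asks about the amount only; the enterprise reveals that
# field and its nonce, keeping counterparty and memo private.
disclosed = ("amount", tx["amount"], nonces["amount"])

def verify(record: dict, field: str, value: str, nonce: bytes) -> bool:
    """Regulator checks the revealed value against the public commitment."""
    return record[field] == commit(value, nonce)

ok = verify(public_record, *disclosed)
```

A production system would replace the bare hash with zero-knowledge proofs so the enterprise can prove properties (say, "amount is below the reporting threshold") without revealing the value at all, but the disclosure flow is the same.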

Crucially, Datachain implements Passkey-based key management, leveraging WebAuthn and FIDO2 standards. Traditional blockchain wallets rely on seed phrases or private keys—cryptographic secrets that, if compromised or lost, mean irrecoverable fund loss. Enterprise users struggle with this model: seed phrases create custody nightmares, while hardware security modules add complexity and cost.

Passkeys solve this through public-key cryptography backed by device biometrics. When an enterprise user creates a wallet, their device generates a key pair. The private key never leaves the device's secure hardware (such as Apple's Secure Enclave or Android's Trusted Execution Environment). Authentication happens through biometric verification—Face ID, Touch ID, or Android biometrics—instead of remembering 12- or 24-word seed phrases.

For enterprises, this dramatically simplifies key management while enhancing security. IT departments no longer need to design seed phrase custody procedures or manage hardware security modules. Employee turnover doesn't create key handoff vulnerabilities. Lost or stolen devices don't compromise wallets, as the private key cannot be extracted from the secure enclave.

Spring 2026 Launch and Enterprise Adoption

Datachain has commenced pre-registration for the Spring 2026 launch, targeting corporate stablecoin use cases. The wallet will support EVM-compatible blockchains and integrate with major stablecoins including JPYC (Japan's leading yen-backed stablecoin), USDC, USDT, and native tokens like ETH.

The timing aligns with Japan's accelerating stablecoin adoption. Following regulatory clarification that classified stablecoins as "electronic payment instruments" rather than crypto assets, major financial institutions have launched yen-backed offerings. MUFG's Progmat Coin, SBI Holdings' SBIUSDT, and JPYC have created a regulated stablecoin ecosystem targeting enterprise payment use cases.

However, stablecoin infrastructure without privacy-preserving architecture creates adoption friction. Enterprises need blockchain's benefits—24/7 settlement, programmability, reduced intermediary costs—without blockchain's transparency drawbacks. Datachain's wallet addresses this gap.

The company is accepting implementation and collaboration inquiries from enterprises through a dedicated landing page. Early adopters likely include:

  • Cross-border payment operations: Corporations using stablecoins for international supplier payments, where transaction privacy prevents competitors from analyzing supply chain relationships
  • Treasury management: CFOs moving assets between wallets or chains without broadcasting liquidity positions to market observers
  • Inter-company settlements: Conglomerates conducting internal transfers across subsidiaries without creating public transaction trails
  • B2B payment platforms: Enterprise payment processors requiring privacy for their corporate clients

Japan's regulatory environment positions Datachain uniquely. While Western jurisdictions grapple with evolving frameworks, Japan has established clear rules: stablecoins require licensing, AML/CFT compliance is mandatory, and the Travel Rule applies. Datachain's selective disclosure model demonstrates compliance without sacrificing commercial confidentiality.

The Enterprise Wallet Infrastructure Race

Datachain enters a rapidly evolving enterprise wallet infrastructure market. In 2026, the category has fragmented into specialized offerings:

Embedded wallet platforms like Privy, Portal, and Dynamic provide developers with SDKs for seamless onboarding through email, social login, and passkeys while maintaining non-custodial security. These solutions bundle account abstraction, gas sponsorship, and orchestration, targeting consumer applications rather than enterprise compliance.

Institutional custody solutions from Fireblocks, Copper, and Anchorage emphasize multi-party computation (MPC) wallet infrastructure for high-value asset protection. These platforms power hardware-secured, SOC 2-compliant wallets across EVM, Solana, Bitcoin, and other chains, but typically lack the privacy-preserving features that corporate stablecoin payments demand.

Enterprise payment platforms like BVNK and AlphaPoint focus on multi-chain stablecoin payment infrastructure, integrating Travel Rule compliance, transaction monitoring, and sanctions screening. However, these systems generally operate on public blockchain transparency, making corporate transaction details visible to blockchain observers.

Datachain's positioning combines elements from all three categories: Passkey authentication from embedded wallets, enterprise-grade security from institutional custody, and payment infrastructure from stablecoin platforms—wrapped in privacy-preserving architecture that existing solutions lack.

The market opportunity is substantial. As stablecoins transition from crypto-native applications to mainstream corporate treasury tools, enterprises need infrastructure that matches traditional finance's confidentiality expectations while meeting blockchain's transparency requirements for compliance.

Broader Implications for Enterprise Blockchain

Datachain's launch highlights a critical gap in current blockchain infrastructure: the privacy-compliance dilemma.

Public blockchains were designed for transparency. Bitcoin's breakthrough was creating a system where anyone could verify transaction validity without trusted intermediaries. Ethereum extended this to programmable smart contracts, enabling decentralized applications built on transparent state transitions.

This transparency serves essential purposes. It enables trustless verification, allowing participants to independently confirm network rules without intermediaries. It creates auditability, letting regulators and compliance officers trace fund flows. It prevents double-spending and ensures network integrity.

But transparency was never intended for corporate financial operations. When enterprises adopt blockchain for payments, they're not seeking transparency—they're seeking efficiency, programmability, and reduced intermediary costs. Transparency becomes a bug, not a feature.

Privacy-preserving technologies are maturing to address this gap. Zero-knowledge proofs, pioneered by Zcash and advanced by protocols like Aztec and Polygon zkEVM, enable transaction validity verification without revealing transaction details. Fully homomorphic encryption (FHE), commercialized by platforms like Zama Protocol, allows computation on encrypted data without decryption. Trusted execution environments (TEEs) create hardware-isolated computation zones where sensitive operations occur without external visibility.

Datachain's implementation appears to combine these approaches: zero-knowledge proofs for transaction privacy, selective disclosure for regulatory compliance, and potentially TEEs for secure key operations within the Passkey framework.

The selective disclosure model represents a particularly important innovation for regulatory compliance. Rather than choosing between "fully public for compliance" or "fully private and non-compliant," enterprises can maintain commercial privacy while demonstrating regulatory adherence through cryptographic proofs or controlled disclosures to authorized parties.

This approach aligns with Japan's "privacy-by-design" regulatory philosophy, enshrined in the country's Act on the Protection of Personal Information (APPI). Japanese regulators emphasize accountability and purpose limitation: organizations must clearly define data usage purposes and limit processing accordingly. Selective disclosure architectures make disclosure explicit and limited, aligning with APPI principles better than blanket transparency or total privacy.

The Road to Enterprise Blockchain Adoption

For blockchain to transition from crypto-native applications to mainstream enterprise infrastructure, privacy must become a standard feature, not an exception.

The current paradigm—where corporate blockchain adoption requires accepting total transaction transparency—artificially limits the technology's addressable market. Enterprises won't sacrifice competitive intelligence for marginally better settlement speed. Treasury departments won't broadcast liquidity positions to save basis points on international transfers. Supply chain managers won't expose supplier networks for programmable payment automation.

Datachain's launch, alongside similar efforts from ZKsync's Prividium banking stack (targeting Deutsche Bank and UBS) and JPMorgan's Canton Network (providing privacy for institutional applications), suggests the market is converging toward privacy-preserving enterprise blockchain infrastructure.

The Spring 2026 timeline is ambitious but achievable. Passkey authentication is production-ready, with widespread adoption across consumer applications. Zero-knowledge proof systems have matured from research curiosities to production-grade infrastructure powering Ethereum L2 networks processing billions in daily value. Selective disclosure frameworks exist in both academic literature and enterprise implementations.

The harder challenge is market education. Enterprises accustomed to traditional banking privacy must understand that blockchain privacy requires explicit architecture, not institutional silos. Regulators familiar with bank examination processes need frameworks for auditing privacy-preserving systems through cryptographic proofs rather than direct data access. Blockchain developers focused on transparency maximization must recognize that privacy is essential for institutional adoption, not antithetical to blockchain principles.

If Datachain succeeds, the template extends beyond Japan. European enterprises operating under MiCA stablecoin regulations face similar privacy-compliance tension. Singapore's Payment Services Act creates comparable requirements. U.S. state-level stablecoin licensing frameworks emerging in 2026 will likely incorporate Travel Rule obligations similar to Japan's.

BlockEden.xyz provides enterprise-grade blockchain infrastructure for developers building the next generation of Web3 applications. Explore our API services for reliable, scalable access to 40+ blockchain networks, enabling you to focus on building privacy-preserving solutions like Datachain's wallet without managing node infrastructure.

Conclusion

Japan's Datachain is solving a problem that has constrained enterprise blockchain adoption since Bitcoin's launch: public transaction transparency that conflicts with corporate confidentiality requirements.

By combining privacy-preserving cryptography with regulatory-compliant selective disclosure, wrapped in Passkey authentication that eliminates seed phrase custody nightmares, Datachain's Spring 2026 wallet launch demonstrates that enterprises can have both blockchain efficiency and traditional finance privacy.

For blockchain infrastructure to fulfill its promise beyond crypto-native applications, privacy cannot remain a specialized feature available only through complex implementations. It must become standard architecture, as fundamental as consensus mechanisms or network protocols.

Datachain's launch suggests that future is arriving. Whether building cross-border payment platforms, treasury management systems, or B2B settlement networks, enterprises will increasingly demand infrastructure that delivers blockchain's benefits without sacrificing commercial confidentiality.

The question isn't whether privacy-preserving enterprise blockchain will emerge. The question is whether incumbents will adapt or whether nimble challengers like Datachain will define the next decade of institutional Web3 infrastructure.

When Visa Settles in USDC: How Payment Giants Are Rewiring Finance for Stablecoins

· 16 min read
Dora Noda
Software Engineer

In December 2025, a quiet revolution began in the global payments industry. Visa, the network that processes over $14 trillion in annual payment volume, announced it would settle transactions in USDC stablecoin on the Solana blockchain. For the first time, a major card network was moving billions of dollars not through correspondent banks or ACH rails, but through public blockchain infrastructure.

This wasn't a pilot program confined to a press release. Cross River Bank and Lead Bank were already settling with Visa in USDC. By November 2025, Visa's monthly stablecoin settlement volume had hit a $3.5 billion annualized run rate. The bridge between traditional finance and crypto rails wasn't coming—it had arrived.

The Payment Rails Transformation: From T+1 to Seconds

For decades, the payment industry operated on a simple truth: moving money takes time. Cross-border wire transfers settled in T+1 to T+3 days. Card network settlement happened overnight or next-day. Weekends and holidays meant financial infrastructure went dark.

Stablecoins obliterate these constraints. Settlement finality on Solana occurs in seconds. Ethereum Layer 2 networks like Base settle in under a minute. The blockchain doesn't close for weekends. There's no "business day" concept when you're running on a global, 24/7 distributed ledger.

This shift from days to seconds isn't just faster—it's a fundamental redesign of how payment networks operate. According to enterprise payment infrastructure providers, traditional payment rails face hard limitations: T+1 to T+3 settlement windows, business hours constraints, and multi-intermediary routing that introduces counterparty risk at each hop. Blockchain-based settlement eliminates these intermediaries entirely.

The market has responded decisively. On-chain stablecoin transaction volume exceeded $8.9 trillion in the first half of 2025 alone. The total stablecoin market cap surpassed $300 billion. And according to EY-Parthenon research conducted after the GENIUS Act passage, 54% of non-users expect to adopt stablecoins within 6-12 months, with 77% citing cross-border supplier payments as their top use case.

Visa's Stablecoin Strategy: VTAP and the Arc Partnership

Visa's approach centers on the Visa Tokenized Asset Platform (VTAP), released in October 2024. VTAP allows banks to issue and manage bank-issued stablecoins while retaining Visa's established risk, compliance, and authentication frameworks. This isn't Visa abandoning its traditional network—it's Visa extending that network onto blockchain rails.

The December 2025 U.S. launch focused on Circle's USDC, a fully reserved, dollar-denominated stablecoin. Participating issuer and acquirer clients can now settle with Visa in USDC delivered over the Solana blockchain. Benefits include:

  • Faster funds movement: Near-instant settlement vs. T+1 for traditional ACH
  • Seven-day availability: Blockchain settlement doesn't observe weekends or bank holidays
  • Enhanced operational resilience: No single point of failure in a distributed ledger system

Visa isn't stopping at Solana. The company is a design partner for Arc, Circle's new Layer 1 blockchain, and plans to operate a validator node once Arc goes live. This positions Visa not just as a user of blockchain infrastructure, but as an active participant in its security and governance.

Broader availability in the U.S. is planned through 2026, with active stablecoin settlement pilots already running in Europe, Latin America and the Caribbean (LAC), Asia-Pacific (AP), and Central Europe, Middle East, and Africa (CEMEA).

Mastercard's Infrastructure Play: Multi-Token Network and Crypto Credential

Where Visa moved quickly on USDC settlement, Mastercard has taken a broader, more modular approach. The company's strategy centers on two key products:

  1. Mastercard Multi-Token Network: A proprietary platform designed to manage settlement, enhance safety, and ensure regulatory compliance while preserving the programmability of stablecoins.

  2. Mastercard Crypto Credential: A compliance and identity layer that standardizes how entities interact with crypto assets across the Mastercard network.

Mastercard's pivot toward infrastructure rather than direct settlement reflects a different strategic bet. Instead of committing to specific blockchains or stablecoins, Mastercard is building the middleware layer that enables banks, fintechs, and enterprises to plug into multiple chains and token standards. This positions Mastercard as the compliance-as-a-service provider for a multi-chain future.

The company has also focused heavily on merchant-facing options, recognizing that stablecoin utility depends on where and how users can spend them. By creating standardized compliance frameworks, Mastercard aims to accelerate merchant adoption without requiring each merchant to build blockchain expertise in-house.

The GENIUS Act: Regulatory Clarity at Last

For years, stablecoins existed in regulatory limbo. Were they securities? Commodities? Money transmitter instruments? The answer varied by jurisdiction and regulator.

The GENIUS Act, signed into law in July 2025, ended that ambiguity in the United States. The legislation established that permitted payment stablecoins are not securities, commodities, or deposits, but instead fall under a separate regulatory regime administered by the Office of the Comptroller of the Currency (OCC), Federal Deposit Insurance Corporation (FDIC), Federal Reserve Board, Secretary of the Treasury, and state banking regulators.

Key requirements include:

  • One-to-one reserve requirements: Stablecoin issuers must hold high-quality liquid assets equal to 100% of outstanding stablecoins.
  • Mandatory audits: Regular third-party attestations of reserve adequacy.
  • Federal oversight: Dual-chartering system allowing both federal and state-chartered issuers.
  • AML/KYC compliance: Full integration with Bank Secrecy Act requirements.

The OCC and Federal Reserve have until July 2026 to finalize technical standards for reserve audits and cybersecurity. Regulations take full effect by January 18, 2027, giving issuers a clear timeline to achieve compliance.

Globally, similar frameworks have emerged. The EU's Markets in Crypto-Assets (MiCA) regulation is now fully applicable. Hong Kong enacted its Stablecoin Bill. Singapore, the UAE, and other financial hubs have introduced rules for these assets. For the first time, stablecoin issuers have clarity on what compliance looks like.

Settlement Finality: The Technical Architecture Behind Instant Settlement

Settlement finality—the point at which a transaction becomes irreversible—is the bedrock of payment network trust. In traditional systems, finality can take hours or days as transactions clear through multiple intermediaries.

Blockchain-based settlement operates on fundamentally different principles:

  • Solana: Near-instant finality (approximately 400 milliseconds for block confirmation, with economic finality in under 3 seconds).
  • Ethereum Layer 2s (Base, Arbitrum, Optimism): Settlement finality in seconds to minutes, with final security guaranteed by Ethereum mainnet.
  • Traditional rails (ACH, SWIFT): T+1 to T+3 settlement, with intraday finality unavailable in many cases.

This speed advantage isn't theoretical. When Visa settles in USDC on Solana, funds move between counterparties in seconds. Liquidity that would be locked for days in correspondent banking relationships becomes immediately available for redeployment.

However, settlement finality on public blockchains introduces new technical requirements:

  1. Blockchain confirmations: How many block confirmations constitute "final" settlement? This varies by chain and risk tolerance.
  2. Reorg risk: The possibility that blockchain state could be rewritten (though extremely rare on major chains).
  3. Smart contract risk: Settlement routed through smart contracts introduces code execution risk not present in traditional systems.
  4. Bridge security: If settlement requires moving assets between chains, bridge vulnerabilities become a critical attack vector.

Payment networks integrating stablecoins must architect systems that account for these blockchain-specific risks while maintaining the reliability standards that financial institutions demand.
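In practice, the first of those requirements reduces to a per-chain confirmation policy. The sketch below illustrates the idea; the confirmation thresholds are hypothetical examples of a risk policy, not official network parameters:

```python
# Minimal finality-check sketch. The thresholds below are hypothetical
# risk-policy choices, not official network parameters.
CONFIRMATION_POLICY = {
    "solana": 32,      # e.g. wait for finalized commitment (~32 slots)
    "base": 1,         # L2 soft confirmation; L1 settlement tracked separately
    "ethereum": 12,    # a common conservative mainnet threshold
}

def is_settlement_final(chain: str, tx_block: int, current_block: int) -> bool:
    """Treat a settlement as final once it has enough confirmations
    under the chain-specific risk policy."""
    required = CONFIRMATION_POLICY[chain]
    confirmations = current_block - tx_block + 1  # the including block counts as 1
    return confirmations >= required

# A transaction included at block 1000, with the chain now at block 1011,
# has 12 confirmations: final under the Ethereum policy above.
print(is_settlement_final("ethereum", 1000, 1011))  # True
print(is_settlement_final("ethereum", 1000, 1005))  # False
```

The real engineering work is choosing those thresholds per chain and per settlement size, and deciding what to do while a transaction sits below the bar.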

Compliance Architecture: Bridging Blockchain and Regulatory Requirements

Integrating public blockchain stablecoins with traditional payment networks creates a compliance architecture challenge unlike anything the industry has faced before.

Traditional payment networks operate within well-defined regulatory perimeters. They have KYC at onboarding, transaction monitoring for suspicious activity, sanctions screening against OFAC lists, and chargeback mechanisms for dispute resolution.

Blockchain transactions work differently. They're pseudonymous, irreversible, and don't natively include customer identity data.

Payment networks have developed multi-layered compliance architectures to bridge this gap:

Identity and Onboarding Layer

  • KYB (Know Your Business) screening: Verifying corporate entities before allowing stablecoin settlement.
  • Beneficiary screening: Identifying ultimate beneficial owners in settlement transactions.
  • Wallet whitelisting: Only allowing settlement to/from pre-approved blockchain addresses.

Transaction Monitoring Layer

  • Sanctions screening: Real-time checking of blockchain addresses against OFAC and international sanctions lists.
  • Chain analysis: Using blockchain forensics tools to trace transaction history and flag high-risk counterparties.
  • KYT (Know Your Transaction) pattern monitoring: Identifying suspicious activity patterns like rapid movement through multiple addresses, structuring, or mixing services.

Governance and Control Layer

  • Approval workflows: Multi-signature requirements for large stablecoin settlements.
  • Velocity limits: Maximum settlement amounts per time period.
  • Circuit breakers: Automatic suspension of stablecoin settlement if anomalous activity is detected.

According to enterprise stablecoin infrastructure guides, secure payment platforms must integrate all three layers to meet regulatory requirements. This is far more complex than simply enabling blockchain transactions—it requires building entire compliance stacks that map traditional regulatory obligations onto pseudonymous blockchain activity.
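Conceptually, the three layers compose into a single pre-settlement gate. The sketch below is illustrative only: the addresses, sanctions list, and velocity limit are all made up for the example.

```python
# Illustrative pre-settlement compliance gate combining the three layers.
# All addresses, list contents, and limits here are hypothetical.
WHITELISTED = {"0xaaa1", "0xbbb2"}      # identity layer: pre-approved wallets
SANCTIONED = {"0xbad9"}                  # monitoring layer: screened addresses
VELOCITY_LIMIT_USD = 5_000_000           # governance layer: per-day cap

def check_settlement(address: str, amount_usd: float,
                     settled_today_usd: float) -> tuple[bool, str]:
    # Identity and onboarding: only whitelisted wallets may settle.
    if address not in WHITELISTED:
        return False, "address not whitelisted"
    # Transaction monitoring: screen against the sanctions list.
    if address in SANCTIONED:
        return False, "address sanctioned"
    # Governance and control: enforce the daily velocity limit.
    if settled_today_usd + amount_usd > VELOCITY_LIMIT_USD:
        return False, "velocity limit exceeded"
    return True, "approved"

ok, reason = check_settlement("0xaaa1", 1_000_000, 4_500_000)
print(ok, reason)  # False velocity limit exceeded
```

A production gate would add chain-analysis scoring and multi-signature approval, but the control flow is the same: every layer must pass before funds move.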

The Regulatory Gaps: What the Rules Don't Cover Yet

Despite the GENIUS Act and global regulatory frameworks, significant gaps remain between traditional payment network regulation and blockchain reality.

Cross-Jurisdictional Settlement

Stablecoins are global by nature. A USDC transfer from a U.S. business to a European supplier settles identically whether the parties are in different time zones or across the street. But payment network regulations remain jurisdictional. If Visa settles a transaction in USDC between parties in different regulatory regimes, which rules apply? The answer is often unclear.

Smart Contract Governance

Traditional payment networks have clear governance: disputes go through arbitration processes, chargebacks follow defined rules, and systemic failures trigger regulatory intervention. Smart contracts that automate settlement have no such governance layer. If a smart contract bug causes incorrect settlement, who bears liability? The payment network? The smart contract developer? The blockchain validator? Current regulations don't specify.

MEV and Transaction Ordering

Maximal Extractable Value (MEV)—the practice of reordering or front-running blockchain transactions for profit—has no parallel in traditional payment systems. If a payment network's stablecoin settlement is front-run by MEV bots, causing price slippage or settlement failures, existing fraud and dispute regulations don't clearly apply.

Stablecoin De-Pegging Risk

Payment networks assume the dollar-denominated instruments they settle are actually worth one dollar. But stablecoins can de-peg during market stress. If Visa settles $1 million in USDC and the peg breaks to $0.95 before final settlement, who absorbs the loss? Traditional payment networks don't have frameworks for currency-like assets that can fluctuate in value mid-transaction.
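Working through the arithmetic of that example makes the stakes concrete:

```python
# De-peg loss on the example above: $1M notional settled in USDC while
# the peg slips from $1.00 to $0.95 before final settlement.
notional_usd = 1_000_000
peg_cents = 95                                    # $0.95, in integer cents to avoid float error

realized_usd = notional_usd * peg_cents // 100    # $950,000 actually received
loss_usd = notional_usd - realized_usd            # $50,000 someone must absorb
print(f"${loss_usd:,}")  # $50,000
```

A 5% de-peg on a single $1 million settlement leaves a $50,000 hole, and current frameworks don't say whose balance sheet it lands on.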

The compliance gaps are real. According to payment service provider research, 85% of respondents identified lack of regulatory clarity and potential changes in regulatory posture as major concerns when dealing with digital asset payments.

While the GENIUS Act provides clarity on stablecoin issuance, it doesn't fully address the operational complexities of integrating stablecoins into payment network settlement.

Interoperability Standards

Traditional payment rails have decades of interoperability standards: ISO 20022 for messaging, EMV for card payments, SWIFT for international transfers. Blockchain ecosystems lack equivalent universal standards. How does a transaction initiated on Ethereum settle with a recipient on Solana? Payment networks must either build custom bridges, rely on third-party interoperability protocols, or limit settlement to specific chains—all of which introduce new risks and complexities.

American Express: The Silence Is Strategic

Notably absent from stablecoin settlement announcements is American Express. While Visa and Mastercard have rolled out blockchain integration initiatives, AmEx has remained publicly silent on stablecoin settlement plans.

This may reflect AmEx's fundamentally different business model. Unlike Visa and Mastercard, which operate as networks connecting issuing banks and merchants, AmEx is primarily a closed-loop system where the company acts as both issuer and acquirer. This gives AmEx more control over its payment flows but also less incentive to integrate external settlement rails.

Additionally, AmEx's customer base skews toward high-net-worth individuals and large corporations—segments that may not yet see stablecoin settlement as a compelling value proposition. For a multinational corporation with sophisticated treasury operations, the speed advantage of blockchain settlement may be less critical than for small businesses or cross-border remittance users.

That said, AmEx's silence likely won't last. As stablecoin adoption grows and regulatory frameworks mature, the competitive pressure to offer blockchain settlement options will intensify.

The Adoption Curve: From Pilots to Production Scale

Stablecoin payment network integration is no longer theoretical. Real volume is flowing through these systems today.

Visa's $3.5 billion annualized settlement run rate as of November 2025 represents actual payments moving through USDC on Solana. Cross River Bank and Lead Bank aren't testing the technology—they're using it for production settlement.

But this is still early innings. For context, Visa's total annual payment volume exceeds $14 trillion. Stablecoin settlement currently represents roughly 0.025% of Visa's total flow. The question isn't whether stablecoins will scale on payment networks—it's how fast.

Several catalysts could accelerate adoption:

  1. Merchant acceptance: As more merchants accept stablecoin payments directly, payment networks will integrate stablecoin settlement to capture that flow.
  2. Corporate treasury optimization: Companies are beginning to hold stablecoins on balance sheets for working capital efficiency. Payment networks that enable seamless conversion between stablecoin treasuries and fiat settlement will capture this market.
  3. Cross-border remittances: The $900 billion global remittance market remains dominated by high-fee intermediaries. Stablecoin settlement could reduce costs by 75% or more.
  4. Embedded finance: Fintech platforms embedding payment capabilities increasingly prefer stablecoin rails for their speed and programmability.

According to post-GENIUS Act research, 54% of current non-users expect to adopt stablecoins within 6-12 months. If even a fraction of this demand materializes, payment network stablecoin settlement could grow from billions to hundreds of billions in annual volume by 2027.

What This Means for Blockchain Infrastructure

The integration of payment giants into blockchain settlement has profound implications for crypto infrastructure providers.

Node operators and validators become critical financial infrastructure. When Visa commits to operating a validator node on Circle's Arc, it's not a symbolic gesture—it's Visa taking responsibility for network security and uptime for a system that will settle billions in payment volume.

RPC providers and API infrastructure face new reliability requirements. A payment network can't settle transactions if its RPC endpoint is down or rate-limited. Enterprises need institutional-grade blockchain API access with guaranteed uptime SLAs.

Blockchain analytics and compliance tools become mandatory vendor relationships. Payment networks must screen every settlement address against sanctions lists, trace transaction history for AML compliance, and monitor for suspicious patterns—all in real time.

Interoperability protocols (LayerZero, Wormhole, Axelar) could become the backbone of multi-chain settlement. If payment networks want to settle on multiple blockchains without maintaining separate infrastructure for each, cross-chain messaging protocols become critical infrastructure.

BlockEden.xyz provides institutional-grade API access for blockchain networks including Ethereum, Solana, Sui, and Aptos—the same infrastructure that payment networks and financial institutions rely on for production settlement. Explore our API marketplace to build on the same foundations powering the future of finance.

The 2026 Roadmap: What Comes Next

As we move deeper into 2026, several milestones will define the payment network stablecoin integration landscape:

July 2026: GENIUS Act Technical Standards Finalization

The OCC and Federal Reserve must publish final rules on reserve audits and cybersecurity. These standards will define exactly what compliance looks like for stablecoin issuers and payment networks.

Q2-Q3 2026: Visa's Broader U.S. Rollout

Visa has committed to expanding USDC settlement access to more U.S. partners throughout 2026. The scale of this rollout will indicate whether stablecoin settlement moves from niche to mainstream.

Circle's Arc Launch

Circle's Arc Layer 1 blockchain is expected to launch with Visa as a validator. This represents the first time a major payment network will help secure a blockchain's consensus mechanism.

Mastercard Multi-Token Network Expansion

Mastercard's infrastructure-first approach should begin showing results as banks and fintechs plug into the Multi-Token Network. Watch for announcements of major financial institutions launching stablecoin products on Mastercard rails.

Global Regulatory Harmonization (or Fragmentation)

As the U.S., EU, Hong Kong, Singapore, and other jurisdictions finalize stablecoin rules, a key question emerges: Will these frameworks align, creating a globally interoperable stablecoin payment system? Or will regulatory fragmentation force payment networks to maintain separate compliance architectures for each region?

American Express's First Move

It would be surprising if AmEx remains silent on stablecoins through all of 2026. When AmEx does announce blockchain integration, it will likely reflect a different strategic approach than Visa and Mastercard—possibly focusing on closed-loop treasury optimization for corporate clients.

Conclusion: The Payment Rails Have Split

We're witnessing a permanent bifurcation of global payment infrastructure.

On one track, traditional rails—ACH, SWIFT, card networks—will continue operating much as they have for decades. These systems are deeply embedded in financial infrastructure, regulated to exhaustion, and trusted by institutions that value stability above all else.

On the parallel track, blockchain-based payment rails are rapidly maturing. Stablecoin settlement is faster, cheaper, and available 24/7. The GENIUS Act and global regulatory frameworks have provided the clarity that institutions demanded. And now, the largest payment networks on Earth are integrating these rails into production systems.

The question for financial institutions is no longer whether to integrate stablecoin settlement, but how fast they can do so without falling behind competitors who are already settling billions on-chain.

For Visa, Mastercard, and eventually American Express, this isn't a choice between blockchain and traditional finance. It's a recognition that both will coexist, and payment networks must operate seamlessly across both worlds.

The card networks built the 20th century's payment infrastructure. Now they're rewiring it for the 21st—one USDC transaction at a time.



The Graph's 2026 Transformation: Redefining Blockchain Data Infrastructure

· 13 min read
Dora Noda
Software Engineer

When 37% of your new users aren't human, you know something fundamental has shifted.

That's the reality The Graph faced in early 2026 when analyzing Token API adoption: more than one in three new accounts belonged to AI agents, not developers. These autonomous programs — querying DeFi liquidity pools, tracking tokenized real-world assets, and executing institutional trades — now consume blockchain data at a scale that would be impossible for human operators to match.

This isn't a future scenario. It's happening now, and it's forcing a complete rethinking of how blockchain data infrastructure works.

From Subgraph Pioneer to Multi-Service Data Backbone

The Graph built its reputation on a single elegant solution: subgraphs. Developers create custom schemas that index on-chain events and smart contract states, enabling dApps to fetch precise, real-time data without running their own nodes.

It's the reason you can check your DeFi portfolio balance instantly or browse NFT metadata without waiting for blockchain queries to complete.

By late 2025, The Graph had processed over 1.5 trillion queries since inception — a milestone that positions it as the largest decentralized data infrastructure in Web3. But raw query volume only tells part of the story.

The more revealing metric emerged in Q4 2025: 6.4 billion queries per quarter, with active subgraphs reaching an all-time high of 15,500. Yet new subgraph creation had slowed dramatically.

The interpretation? The Graph's existing infrastructure serves its current users exceptionally well, but the next wave of adoption requires something fundamentally different.

Enter Horizon, the protocol upgrade that went live in December 2025 and sets the stage for The Graph's 2026 transformation.

The Horizon Architecture: Multi-Service Infrastructure for the On-Chain Economy

Horizon isn't a feature update. It's a complete architectural redesign that transforms The Graph from a subgraph-focused platform into a multi-service data infrastructure capable of serving three distinct customer segments simultaneously: developers, AI agents, and institutions.

The architecture introduces three foundational components:

A core staking protocol that extends economic security to any data service, not just subgraphs. This allows new data products to inherit The Graph's existing network of 167,000+ delegators and active indexers without building separate security models.

A unified payments layer that handles fees across all services, enabling seamless cross-service billing and reducing friction for users who need multiple types of blockchain data.

A permissionless framework allowing new data services to integrate without requiring protocol governance votes. Any team can build on The Graph's infrastructure, as long as they meet technical standards and stake GRT tokens for security.

This modular approach solves a critical problem: different use cases require different data architectures.

A DeFi trading bot needs millisecond-level liquidity updates. An institutional compliance team needs SQL-queryable audit trails. A wallet app needs pre-indexed token balances across dozens of chains. Before Horizon, these use cases would have required separate infrastructure providers.

Now, they can all run on The Graph.

Four Services, Four Distinct Markets

The Graph's 2026 roadmap introduces four specialized data services, each targeting a specific market need:

Token API: Pre-Indexed Data for Common Queries

The Token API eliminates the need for custom indexing when you just need standard token data — balances, transfer histories, contract addresses across 10 chains. Wallets, explorers, and analytics platforms no longer need to deploy their own subgraphs for basic queries.

This is where AI agents have shown up in force. The 37% non-human user adoption rate reflects a simple reality: AI agents don't want to configure indexers or write GraphQL queries. They want an API that speaks natural language and returns structured data instantly.

The integration with Model Context Protocol (MCP) enables AI agents to query blockchain data through tools like Claude, Cursor, and ChatGPT without API keys or manual setup. The x402 protocol adds autonomous payment capabilities, letting agents pay per query without human intervention.
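The x402 flow is built around HTTP status 402 (Payment Required): request, receive a price quote, pay, retry. The sketch below simulates that handshake in-process; the header name, price, balances, and `sign_payment` helper are stand-ins for illustration, not the protocol's actual wire format.

```python
# Simulated x402-style pay-per-query handshake. The header name, price,
# balances, and sign_payment() are illustrative stand-ins; consult the
# actual x402 specification for the real wire format.
PRICE_PER_QUERY_USDC = 0.001

def server(request_headers: dict) -> tuple[int, dict]:
    """Stand-in data service: demand payment, then serve the query."""
    if request_headers.get("X-Payment") is None:
        return 402, {"error": "payment required", "price_usdc": PRICE_PER_QUERY_USDC}
    return 200, {"balances": {"USDC": "1523.40", "ETH": "0.87"}}

def sign_payment(amount_usdc: float) -> str:
    # Placeholder for constructing a real on-chain payment proof.
    return f"signed-payment:{amount_usdc}"

def agent_query() -> dict:
    # 1. First attempt, no payment attached: expect a 402 quote.
    status, body = server({})
    if status == 402:
        # 2. Autonomously authorize the quoted amount and retry.
        proof = sign_payment(body["price_usdc"])
        status, body = server({"X-Payment": proof})
    assert status == 200
    return body

print(agent_query())  # {'balances': {'USDC': '1523.40', 'ETH': '0.87'}}
```

The key property for agents is that no human ever provisions a key or approves an invoice: the quote-pay-retry loop is fully machine-driven.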

Tycho: Real-Time Liquidity Tracking for DeFi

Tycho streams live liquidity changes across decentralized exchanges — exactly what trading systems, solvers, and MEV bots need. Instead of polling subgraphs every few seconds, Tycho pushes updates as they happen on-chain.

For DeFi infrastructure providers, this reduces latency from seconds to milliseconds. In high-frequency trading environments where a 100ms delay can mean the difference between profit and loss, Tycho's streaming architecture becomes mission-critical.

Amp: SQL Database for Institutional Analytics

Amp represents The Graph's most explicit play for traditional finance adoption: an enterprise-grade blockchain database with SQL access, built-in audit trails, lineage tracking, and on-premises deployment options.

This isn't for DeFi degens. It's for treasury oversight teams, risk management divisions, and regulated payment systems that need compliance-ready data infrastructure.

The DTCC's Great Collateral Experiment — a pilot program exploring tokenized securities settlement — already uses Graph technology, validating the institutional use case.

SQL compatibility is crucial. Financial institutions have decades of tooling, reporting systems, and analyst expertise built around SQL. Asking them to learn GraphQL is a non-starter. Amp meets them where they are.

Subgraphs: The Foundation That Still Matters

Despite the new services, subgraphs remain central to The Graph's value proposition. The 50,000+ subgraphs deployed to date, powering virtually every major DeFi protocol, represent an installed base that competitors cannot easily replicate.

In 2026, subgraphs deepen in two ways: expanded multi-chain coverage (now spanning 40+ blockchains) and tighter integration with the new services.

A developer can use a subgraph for custom logic while pulling pre-indexed token data from Token API — best of both worlds.

Cross-Chain Expansion: GRT Utility Beyond Ethereum

For years, The Graph's GRT token existed primarily on Ethereum mainnet, creating friction for users on other chains. That changed with Chainlink's Cross-Chain Interoperability Protocol (CCIP) integration, which bridged GRT to Arbitrum, Base, and Avalanche in late 2025, with Solana planned for 2026.

This isn't just about token availability. Cross-chain GRT utility enables developers on any chain to pay for Graph services using their native tokens, stake GRT to secure data services, and delegate to indexers without moving assets to Ethereum.

The network effects compound quickly: Base processed 1.23 billion queries in Q4 2025 (up 11% quarter-over-quarter), while Arbitrum posted the strongest growth among major networks at 31% QoQ. As L2s continue absorbing transaction volume from Ethereum mainnet, The Graph's cross-chain strategy positions it to serve the entire multi-chain ecosystem.

The AI Agent Data Problem: Why Indexing Becomes Critical

AI agents represent a fundamentally different class of blockchain user. Unlike human developers who write queries once and deploy them, agents generate thousands of unique queries per day across dozens of data sources.

Consider an autonomous DeFi yield optimizer:

  1. It queries current APYs across lending protocols (Aave, Compound, Morpho)
  2. Checks gas prices and transaction congestion
  3. Monitors token price feeds from oracles
  4. Tracks historical volatility to assess risk
  5. Verifies smart contract security audits
  6. Executes rebalancing transactions when conditions are met
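Stripped to a skeleton, that loop looks something like the sketch below. Every feed, protocol name, APY figure, and threshold is a hypothetical stand-in for the indexed data an agent would actually query:

```python
# Skeleton of the autonomous yield optimizer described above.
# All feeds, APYs, and thresholds are hypothetical stand-ins.
def fetch_apys() -> dict:
    # Step 1: stand-in for indexed lending-rate feeds.
    return {"aave": 0.048, "compound": 0.038, "morpho": 0.052}

def gas_acceptable() -> bool:
    return True  # Step 2: stand-in for a gas/congestion feed.

def risk_acceptable(protocol: str) -> bool:
    # Steps 3-5: stand-in for volatility and audit checks;
    # here "morpho" is arbitrarily flagged as too risky.
    return protocol in {"aave", "compound"}

def rebalance(current: str, apys: dict, min_gain: float = 0.005) -> str:
    """Step 6: move funds only if the best risk-approved venue
    beats the current one by at least min_gain."""
    candidates = {p: r for p, r in apys.items() if risk_acceptable(p)}
    best = max(candidates, key=candidates.get)
    if best != current and candidates[best] - apys[current] >= min_gain and gas_acceptable():
        return best
    return current

print(rebalance("compound", fetch_apys()))  # aave
```

Even this toy version touches three distinct data sources per decision, which is exactly why per-agent node infrastructure doesn't scale.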

Each step requires structured, indexed data. Running a full node for every protocol is economically infeasible. APIs from centralized providers introduce single points of failure and censorship risk.

The Graph solves this by providing a decentralized, censorship-resistant data layer that AI agents can query programmatically. The economic model works because agents pay per query via x402 protocol — no monthly subscriptions, no API keys to manage, just usage-based billing settled on-chain.

This is why Cookie DAO, a decentralized data network indexing AI agent activity across Solana, Base, and BNB Chain, builds on The Graph's infrastructure. The fragmented on-chain actions and social signals generated by thousands of agents need structured data feeds to be useful.

DeFi and RWA: The Data Demands of Tokenized Finance

DeFi's data requirements have matured dramatically. In 2021, a DEX aggregator might query basic token prices and liquidity pool reserves. In 2026, institutional DeFi platforms need:

  • Real-time collateralization ratios for lending protocols
  • Historical volatility data for risk modeling
  • Cross-chain asset pricing with oracle verification
  • Transaction provenance for compliance audits
  • Liquidity depth across multiple venues for trade execution

Tokenized real-world assets add another layer of complexity. When a tokenized U.S. Treasury fund integrates with a DeFi lending protocol (as BlackRock's BUIDL did with Uniswap), the data infrastructure must track:

  • On-chain ownership records
  • Redemption requests and settlement status
  • Regulatory compliance events
  • Yield distribution to token holders
  • Cross-chain bridge activity

The Graph's multi-service architecture addresses this by allowing RWA platforms to use Amp for institutional-grade SQL analytics while simultaneously streaming real-time updates via Tycho for DeFi integrations.

The market opportunity is staggering: Ripple and BCG forecast tokenized RWAs expanding from $0.6 trillion in 2025 to $18.9 trillion by 2033 — a 53% compound annual growth rate. Every dollar tokenized on-chain generates data that needs indexing, querying, and reporting.
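As a sanity check, the implied growth rate can be recomputed from the endpoints of that forecast:

```python
# Implied CAGR of the Ripple/BCG forecast: $0.6T (2025) -> $18.9T (2033).
start_trillions, end_trillions, years = 0.6, 18.9, 8
cagr = (end_trillions / start_trillions) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 53.9%, consistent with the cited ~53% figure
```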

Network Economics: The Indexer and Delegator Model

The Graph's decentralized architecture relies on economic incentives aligning three stakeholder groups:

Indexers run infrastructure to process and serve queries, earning query fees and indexing rewards in GRT tokens. The number of active indexers increased modestly in Q4 2025, suggesting operators remained committed despite lower near-term profitability from reduced query fees.

Delegators stake GRT tokens with indexers to earn a portion of rewards without running infrastructure themselves. The network's 167,000+ delegators represent distributed economic security that makes data censorship prohibitively expensive.

Curators signal which subgraphs are valuable by staking GRT, earning a portion of query fees when their curated subgraphs are used. This creates a self-organizing quality filter: high-quality subgraphs attract curation, which attracts indexers, which improves query performance.

The Horizon upgrade extends this model to all data services, not just subgraphs. An indexer can now serve Token API queries, stream Tycho liquidity updates, and provide Amp database access — all secured by the same GRT stake.

This multi-service revenue model matters because it diversifies indexer income beyond subgraph queries. If AI agent query volume scales as projected, indexers serving Token API could see significant revenue growth, even if traditional subgraph usage plateaus.
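To make the incentive split concrete, here is a toy calculation of how a reward pool divides between an indexer and its delegators. The 15% reward cut and the stake amounts are illustrative figures, not live network parameters:

```python
# Toy reward split between an indexer and its delegators.
# The 15% reward cut and all stake amounts are illustrative.
def split_rewards(pool_grt: float, indexer_cut: float,
                  delegator_stake: float, total_delegated: float) -> tuple[float, float]:
    indexer_share = pool_grt * indexer_cut
    delegator_pool = pool_grt - indexer_share
    # Each delegator earns pro rata to their share of total delegation.
    delegator_share = delegator_pool * (delegator_stake / total_delegated)
    return indexer_share, delegator_share

indexer_grt, one_delegator_grt = split_rewards(
    pool_grt=10_000, indexer_cut=0.15,
    delegator_stake=50_000, total_delegated=1_000_000)
print(f"{indexer_grt:.0f} GRT to indexer, {one_delegator_grt:.0f} GRT to this delegator")
```

Under Horizon, the same staked GRT can back multiple services, so `pool_grt` grows with each new revenue stream an indexer serves rather than with subgraph queries alone.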

The Institutional Wedge: From DeFi to TradFi

The DTCC pilot program represents something bigger than a single use case. It's proof that major financial institutions — in this case, the organization that settles $2.5 quadrillion in securities transactions annually — will build on public blockchain data infrastructure when it meets regulatory requirements.

Amp's feature set directly targets this segment:

  • Lineage tracking: Every data point traces back to its on-chain source, creating an immutable audit trail.
  • Compliance features: Role-based access controls, data retention policies, and privacy controls meet regulatory standards.
  • On-premises deployment: Regulated entities can run Graph infrastructure inside their security perimeter while still participating in the decentralized network.

The playbook mirrors how enterprise blockchain adoption played out: start with private/permissioned chains, gradually integrate with public chains as compliance frameworks mature. The Graph positions itself as the data layer that works across both environments.

If major banks adopt Amp for tokenized securities settlement, blockchain analytics for AML compliance, or real-time risk monitoring, the query volume could dwarf current DeFi usage. A single large institution running hourly compliance queries across multiple chains generates more sustainable revenue than thousands of individual developers.

The 2026 Inflection Point: Is This The Graph's Year?

The Graph's 2026 roadmap presents a clear thesis: the current token price fundamentally misprices the network's position in the emerging AI agent economy and institutional blockchain adoption.

The bull case rests on three assumptions:

  1. AI agent query volume scales meaningfully. If the 37% adoption rate among Token API users reflects a broader trend, and autonomous agents become the primary consumers of blockchain data, query fees could surge beyond historical levels.

  2. Horizon's multi-service architecture drives fee revenue growth. By serving developers, agents, and institutions simultaneously, The Graph captures revenue from multiple customer segments instead of relying solely on DeFi developers.

  3. Cross-chain GRT utility via Chainlink CCIP generates sustained demand. As users on Arbitrum, Base, Avalanche, and Solana pay for Graph services using bridged GRT, token velocity increases while supply remains capped.

The bear case argues that the infrastructure moat is narrower than it appears. Alternative indexing solutions like Chainstack, BlockXs, and Goldsky offer hosted subgraph services with simpler pricing and faster setup. Centralized API providers like Alchemy and Infura bundle data access with node infrastructure, creating switching costs.

The counterargument: The Graph's decentralized architecture matters precisely because AI agents and institutions cannot rely on centralized data providers. AI agents need censorship resistance to ensure uptime during adversarial conditions. Institutions need verifiable data provenance that centralized APIs cannot provide.

The 50,000+ active subgraphs, 167,000+ delegators, and ecosystem integrations with virtually every major DeFi protocol create a network effect that competitors must overcome, not just match.

Why Data Infrastructure Becomes the AI Economy Backbone

The blockchain industry spent 2021-2023 obsessing over execution layers: faster Layer 1s, cheaper Layer 2s, more scalable consensus mechanisms.

The result? Transactions that cost fractions of a penny and settle in seconds. The bottleneck shifted.

Execution is solved. Data is the new constraint.

AI agents can execute trades, rebalance portfolios, and settle payments autonomously. What they cannot do is operate without high-quality, indexed, queryable data about on-chain state. The Graph's trillion-query milestone reflects this reality: as blockchain applications grow more sophisticated, data infrastructure becomes more critical than transaction throughput.

This mirrors the evolution of traditional tech infrastructure. Amazon didn't win e-commerce because it had the fastest servers — it won because it built the best data infrastructure for inventory management, personalization, and logistics optimization. Google didn't win search because it had the most storage — it won because it indexed the web better than anyone else.

The Graph is positioning itself as the Google of blockchain data: not the only indexing solution, but the default infrastructure that everything else builds on top of.

Whether that vision materializes depends on execution in the next 12-24 months. If Horizon's multi-service architecture attracts institutional clients, if AI agent query volume justifies the infrastructure investment, and if cross-chain expansion drives sustainable GRT demand, 2026 could be the year The Graph transitions from "important DeFi infrastructure" to "essential backbone of the on-chain economy."

The 1.5 trillion queries are just the beginning.


Building applications that rely on robust blockchain data infrastructure? BlockEden.xyz provides high-performance API access across 40+ chains, complementing decentralized indexing with enterprise-grade reliability for production Web3 applications.

Visa and Mastercard's Stablecoin Pivot: When Traditional Payment Rails Meet Blockchain Infrastructure

· 13 min read
Dora Noda
Software Engineer

When Visa announced in late 2024 that its stablecoin settlement volume had surpassed a $3.5 billion annualized run rate, it wasn't just another blockchain pilot. It was a signal that the world's largest payment networks are fundamentally rearchitecting how money moves across borders. Galaxy Digital's bold prediction—that at least one major card network will route over 10% of cross-border settlement volume through public-chain stablecoins in 2026—is no longer a speculative bet. It's becoming infrastructure reality.

The convergence is happening faster than most expected. Visa is settling actual transactions in USDC on Solana. Mastercard is running live credit card settlements on the XRP Ledger with Ripple. And both networks are racing to make blockchain-based payments invisible to end users while capturing the efficiency gains that traditional rails can't match.

This isn't about replacing the existing payment infrastructure. It's about embedding stablecoins directly into the settlement layer of the world's most trusted payment brands—and the implications stretch far beyond crypto.

Visa's Infrastructure Play: From Pilot to Production

Visa's approach represents the most aggressive stablecoin integration by a traditional payment network to date. In January 2025, the company launched USDC settlement in the United States, allowing issuer and acquirer partners to settle with Visa using Circle's dollar-backed stablecoin.

The technical architecture is deceptively simple but strategically profound. Cross River Bank and Lead Bank are settling transactions with Visa in USDC over the Solana blockchain—not a private permissioned ledger, but a public Layer 1 blockchain capable of processing thousands of transactions per second. The settlement framework offers seven-day availability, meaning banks can move funds 24/7 including weekends and holidays, a dramatic improvement over traditional ACH rails that operate only on business days.

But Visa isn't stopping at Solana. The company is a design partner for Arc, Circle's new purpose-built Layer 1 blockchain currently in public testnet. Arc's architecture is optimized for the performance and scalability needed to support Visa's global commercial activity on-chain. Once Arc launches, Visa plans to operate a validator node—making one of the world's largest payment processors an active participant in blockchain consensus.

This dual-chain strategy signals Visa's long-term commitment. Solana provides immediate production capabilities with proven throughput. Arc offers a tailored environment where Visa can influence protocol development and ensure the blockchain meets institutional requirements for reliability, compliance, and interoperability with existing payment infrastructure.

The benefits for issuers are tangible:

  • Faster funds movement eliminates multi-day settlement delays
  • Automated treasury operations reduce manual reconciliation overhead
  • Interoperability between blockchain-based payments and traditional rails creates optionality—banks can route transactions through whichever system offers the best economics for a given use case

Mastercard's Multi-Pronged Stablecoin Strategy

While Visa focuses on settlement infrastructure, Mastercard is building a three-layer payments stack that touches consumers, merchants, and institutional settlement simultaneously.

At the consumer layer, Mastercard announced in April 2025 that it would enable end-to-end stablecoin capabilities "from wallets to checkouts." Partnerships with crypto-native platforms like MetaMask, Crypto.com, OKX, and Kraken now let millions of people spend stablecoin balances at over 150 million Mastercard merchant locations worldwide. The OKX Card, launched in collaboration with Mastercard, links crypto trading and Web3 spending directly to the merchant network—no intermediary conversion step required for the user.

On the merchant side, Mastercard is enabling direct settlement in stablecoins like USDC, allowing businesses to receive payments in digital dollars without touching fiat. This eliminates foreign exchange friction and settlement delays, particularly valuable for cross-border e-commerce where traditional card settlements can take days and incur 2-3% currency conversion fees.

But the most technically ambitious initiative is Mastercard's live pilot with Ripple, which went operational on November 6, 2025. Real credit card transactions are settling on the XRP Ledger using RLUSD—Ripple's USD-backed stablecoin. Unlike Visa's settlement-layer integration, this pilot tests whether blockchain can handle real-time authorization and clearing, not just end-of-day settlement. If successful, it proves public blockchains can meet the sub-second response times required for point-of-sale transactions.

Underpinning these initiatives is Mastercard's Multi-Token Network, a regulated blockchain environment where banks can transact with tokenized deposits and stablecoins under existing compliance frameworks. The network also includes Crypto Credential, an identity and compliance layer that binds blockchain addresses to verified entities—solving the "who are you transacting with" problem that has long plagued permissionless networks.

Mastercard's strategy is hedged. It's supporting multiple stablecoins (USDC, PYUSD, USDG, FIUSD), multiple blockchains (Ethereum, Solana, XRP Ledger), and multiple use cases (consumer spending, merchant settlement, wallet payouts). The bet is that stablecoins will become ubiquitous, but the winning chains and form factors remain uncertain.

Galaxy Digital's 10% Threshold: Why It Matters

Galaxy Digital's prediction that a major card network will route over 10% of cross-border settlement volume through public-chain stablecoins in 2026 is significant for three reasons:

1. It establishes a quantifiable benchmark. "Exploring blockchain" has been a common refrain for payment networks since 2015. A 10% threshold represents material adoption—not a pilot, but a production use case handling billions of dollars in real transaction volume.

2. The prediction specifically references public-chain stablecoins, not private permissioned networks. This distinction matters. Private blockchains controlled by consortiums offer incremental efficiency gains but don't fundamentally change the trust model or interoperability dynamics. Public chains introduce permissionless access, programmability, and composability—properties that enable entirely new financial primitives.

3. Galaxy expects "most end users will never see a crypto interface." This is the critical usability threshold. If blockchain infrastructure remains visible to consumers, adoption stays limited to crypto-native users. If it becomes invisible—users swipe a Mastercard, merchants receive dollars, but the settlement layer runs on Solana—then the addressable market expands to every cardholder and merchant globally.

EY-Parthenon's projection supports Galaxy's thesis from a different angle. The consultancy estimates that 5-10% of cross-border payments will use stablecoins by 2030, representing $2.1 trillion to $4.2 trillion in value. Cross-border payments are particularly ripe for disruption because legacy rails are slowest and most expensive for these transactions. SWIFT transfers can take 2-5 business days and cost $25-50 per transaction. Stablecoin settlement on Solana costs fractions of a penny and settles in seconds.

Visa's $3.5 billion annualized run rate (as of November 2024) shows the trajectory is real. If that volume doubles every six months—an aggressive assumption, but one consistent with early-stage crypto adoption curves—Visa alone could hit $50 billion in annual stablecoin settlement by late 2026. For context, Visa's total payment volume exceeded $10 trillion in 2023. A 10% cross-border threshold would require roughly $150-200 billion in stablecoin settlement, an ambitious but achievable target if institutional adoption accelerates.
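The doubling arithmetic behind that projection is simple to check. This sketch assumes, as the text does, a fixed six-month doubling period from the $3.5 billion baseline:

```python
# Back-of-the-envelope projection of an annualized settlement run rate,
# assuming a doubling every six months from a $3.5B baseline (Nov 2024).

def projected_run_rate(base_billions: float, months_elapsed: int,
                       doubling_period_months: int = 6) -> float:
    """Exponential growth: base * 2^(elapsed / doubling period)."""
    return base_billions * 2 ** (months_elapsed / doubling_period_months)

# November 2024 -> late 2026 is roughly 24 months, i.e. four doublings:
print(projected_run_rate(3.5, 24))  # -> 56.0 (billions, annualized)
```

Four doublings of $3.5 billion yields $56 billion, which is where the "$50 billion by late 2026" figure comes from.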

Technical Architecture: How Blockchain Meets Payment Rails

The technical integration between traditional payment networks and blockchain stablecoins involves three layers: the settlement layer, the compliance layer, and the user interface layer.

Settlement Layer: This is where blockchain offers the clearest advantages. Traditional payment networks settle transactions through a complex web of correspondent banks, clearinghouses, and central bank systems. Settlement can take 1-3 business days, requires pre-funded nostro accounts in multiple currencies, and operates only during banking hours.

Blockchain settlement is radically simpler. A stablecoin like USDC exists as a smart contract on Ethereum, Solana, or other chains. Transactions are atomic—either both parties receive their funds or the transaction fails entirely. Settlement is final within seconds to minutes depending on the blockchain. And because blockchains operate 24/7, there are no weekend delays or holiday closures.

Visa's integration with Solana demonstrates this architecture. When Cross River Bank settles with Visa in USDC, the bank sends USDC tokens to Visa's blockchain address. Visa receives the tokens, updates internal ledgers, and credits the acquiring bank. The entire process happens on-chain with cryptographic proof, eliminating the reconciliation mismatches common in traditional correspondent banking.
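The atomicity property can be illustrated with a minimal in-memory ledger. This is a stand-in for the on-chain token transfer, not actual USDC contract or Solana code; the entity names echo the example above and are purely illustrative.

```python
# Minimal sketch of atomic settlement semantics: the transfer either
# applies both legs (debit and credit) or changes nothing at all.

class Ledger:
    def __init__(self, balances: dict):
        self.balances = dict(balances)

    def transfer(self, sender: str, receiver: str, amount: float) -> None:
        """Validate first, then apply both legs together."""
        if amount <= 0:
            raise ValueError("amount must be positive")
        if self.balances.get(sender, 0) < amount:
            # Fail before any state change: no partial settlement.
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

ledger = Ledger({"cross_river": 1_000_000, "visa": 0})
ledger.transfer("cross_river", "visa", 250_000)
print(ledger.balances)  # debit and credit applied together
```

A failed validation leaves every balance untouched, which is the property that eliminates the reconciliation mismatches described above: there is never a state where one side's books reflect a transfer the other side's do not.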

Compliance Layer: The biggest blocker to mainstream blockchain adoption has been compliance uncertainty. Payment networks operate under strict regulatory frameworks—KYC, AML, sanctions screening, transaction monitoring. Public blockchains are pseudonymous and permissionless, creating friction with regulatory requirements.

Mastercard's Crypto Credential solves this problem by creating a compliance overlay. Users prove identity off-chain through traditional KYC processes. Once verified, they receive a blockchain credential that cryptographically proves their identity meets regulatory standards without exposing personal data on-chain. Merchants and payment processors can verify the credential in real-time, ensuring all parties meet compliance requirements.

Similarly, Circle's USDC is issued only to verified entities that pass KYC checks. While USDC can be freely transferred on public blockchains, the on-ramp (converting fiat to USDC) and off-ramp (redeeming USDC for fiat) remain gated by traditional financial compliance. This hybrid model preserves blockchain's efficiency while satisfying regulatory obligations.
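The hybrid model can be sketched as two functions with different gates. Everything here is a hypothetical illustration: the function names, allowlist, and 1:1 issuance logic are assumptions for this sketch, not Circle's or Mastercard's actual APIs.

```python
# Hypothetical sketch of the hybrid compliance model: on-chain transfers
# are permissionless, but the on-ramp (minting) is gated by off-chain KYC.

VERIFIED_ENTITIES = {"acme_bank", "cross_river"}  # assumed KYC allowlist

def mint_usdc(entity: str, usd_amount: float) -> float:
    """On-ramp: only KYC-verified entities may convert fiat to stablecoin."""
    if entity not in VERIFIED_ENTITIES:
        raise PermissionError(f"{entity} has not passed KYC")
    return usd_amount  # 1:1 issuance against deposited fiat

def transfer_usdc(sender: str, receiver: str, amount: float) -> float:
    """On-chain transfer: permissionless, no identity check required."""
    return amount

print(mint_usdc("acme_bank", 10_000.0))                        # allowed
print(transfer_usdc("anon_wallet_1", "anon_wallet_2", 500.0))  # allowed
```

The asymmetry is the point: regulators get a chokepoint at issuance and redemption, while the transfer layer keeps blockchain's open, 24/7 settlement properties.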

User Interface Layer: The final piece is making blockchain invisible to end users. Visa and Mastercard's core competency is user experience—consumers swipe cards without thinking about ACH networks, correspondent banks, or foreign exchange settlement. The same principle applies to stablecoin integration.

When a consumer spends with a Mastercard-linked crypto wallet, the transaction appears identical to a traditional card payment. Behind the scenes, the wallet converts stablecoins to fiat (or merchants accept stablecoins directly), but the checkout experience is unchanged. This abstraction is critical. Asking consumers to manage blockchain addresses, gas fees, and wallet private keys creates friction. Making it automatic removes adoption barriers.

Visa's partnership with Circle on Arc blockchain includes plans for this level of integration. Arc is designed with "performance and scalability needed to support Visa's global commercial activity onchain"—implying transaction throughput, finality times, and reliability that match or exceed traditional payment systems. If Arc delivers, Visa can route transactions through blockchain infrastructure without degrading the user experience.

The Broader Implications for Financial Infrastructure

The Visa-Mastercard stablecoin pivot is more than a payment network upgrade. It's a signal that blockchain is transitioning from speculative asset class to institutional infrastructure.

For banks, stablecoin settlement offers immediate cost savings. Nostro account funding ties up billions in dormant capital. Blockchain settlement eliminates pre-funding requirements—funds move only when transactions execute. For international payments, this liquidity efficiency translates to lower costs and better treasury management.

For merchants, particularly cross-border e-commerce businesses, stablecoin settlement reduces foreign exchange risk and settlement delays. A European merchant accepting USD payments from American customers can receive USDC instantly, convert to euros on-demand, and avoid the 2-5 day settlement windows that constrain cash flow.

For fintech platforms, the integration creates new infrastructure primitives. Once Visa and Mastercard support stablecoin settlement, any fintech with card issuing capabilities can offer crypto-linked spending. This eliminates the need for proprietary blockchain integrations—fintechs can leverage Visa and Mastercard's infrastructure as a blockchain abstraction layer.

The regulatory dimension is equally important. Visa and Mastercard operate under the most stringent compliance regimes in global finance. Their endorsement of public-chain stablecoins signals to regulators that these systems can meet institutional standards. The GENIUS Act in the U.S., MiCA regulations in the EU, and stablecoin frameworks in Singapore and Hong Kong are all converging toward clear rules that treat compliant stablecoins as payment instruments rather than speculative crypto assets.

This regulatory clarity, combined with major payment network adoption, creates a positive feedback loop. As compliance frameworks solidify, more institutions adopt stablecoins. As adoption grows, regulators gain confidence in the technology's safety and stability. And as stablecoins prove themselves in production, the economic incentives to migrate from legacy rails increase.

What Happens to Traditional Payment Infrastructure?

The rise of stablecoin settlement doesn't spell the end of SWIFT, ACH, or correspondent banking—at least not immediately. What it does is create a parallel infrastructure that handles transactions traditional rails do poorly: cross-border payments, 24/7 settlement, micropayments, and programmable money.

Think of it as optionality. A bank settling with Visa can choose USDC for international transactions requiring instant settlement, while using traditional ACH for domestic payroll disbursements where speed matters less. Over time, as blockchain infrastructure matures, the efficiency gains compound, and the default shifts toward stablecoin settlement for an increasing share of transactions.

The real disruption isn't consumer-facing. Most cardholders won't know whether their transaction settled via ACH or blockchain. The disruption is institutional—banks, payment processors, and treasury operations reallocating capital from nostro accounts and correspondent banking fees into blockchain infrastructure. McKinsey estimates that blockchain-based cross-border payments could save financial institutions $10-15 billion annually in settlement costs alone.

For blockchain infrastructure, this represents validation at the highest levels. Solana, Ethereum, and emerging chains like Circle's Arc are no longer experimental networks—they're processing billions in settlement volume for Fortune 500 payment companies. This institutional usage drives network effects, attracting developers, liquidity, and applications that further entrench blockchain as critical financial infrastructure.

The 2026 Inflection Point

If Galaxy Digital's prediction holds—and current trajectories suggest it will—2026 marks the year stablecoins cross from "emerging technology" to "mainstream settlement infrastructure."

The pieces are in place. Visa and Mastercard have moved beyond pilots to production systems processing real transaction volume. Regulatory frameworks in major jurisdictions are clarifying the legal status of stablecoins as payment instruments. And the economic case is undeniable—faster settlement, lower costs, better liquidity management, and 24/7 availability.

For consumers, the change will be invisible. Cards will still swipe, apps will still process payments, and money will still move. But underneath, the infrastructure powering those transactions will increasingly run on public blockchains, settling in stablecoins, and leveraging cryptographic proof instead of correspondent bank trust.

For the blockchain industry, this is the legitimacy milestone that has long been promised but rarely delivered. Not another white paper or roadmap—actual Fortune 500 companies embedding public-chain infrastructure into trillion-dollar payment networks.

The traditional finance and crypto divide is closing. Not because one side won, but because the most valuable properties of each—blockchain's efficiency and transparency, traditional finance's trust and user experience—are merging into hybrid infrastructure that neither ecosystem could build alone.

Visa and Mastercard's stablecoin pivot isn't the end of that convergence. It's the beginning.



The Custody Architecture Divide: Why Most Crypto Custodians Can't Meet U.S. Banking Standards

· 13 min read
Dora Noda
Software Engineer

Here's a paradox that should concern every institution entering crypto: some of the industry's most prominent custody providers — Fireblocks and Copper among them — cannot legally serve as qualified custodians under U.S. banking regulations, despite protecting billions in digital assets.

The reason? A fundamental architectural choice that seemed cutting-edge in 2018 now creates an insurmountable regulatory barrier in 2026.

The Technology That Divided the Industry

The institutional custody market split into two camps years ago, each betting on a different cryptographic approach to securing private keys.

Multi-Party Computation (MPC) splits a private key into encrypted "shards" distributed across multiple parties. No single shard ever contains the complete key. When transactions require signing, the parties coordinate through a distributed protocol to generate valid signatures without ever reconstructing the full key. The appeal is obvious: eliminate the "single point of failure" by ensuring no entity ever holds complete control.
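The sharding idea can be shown with a toy additive secret sharing scheme over a prime field. Each shard alone reveals nothing about the key; only all shards together determine it. Note the caveat in the comments: production MPC signing protocols (such as threshold ECDSA) never reconstruct the key at all, so this sketch illustrates only the splitting step.

```python
# Toy additive secret sharing, illustrating the key-splitting idea
# behind MPC custody. Real MPC signing never reconstructs the key;
# `combine` exists here only to demonstrate the math.
import secrets

PRIME = 2**255 - 19  # field modulus (illustrative choice)

def split_key(key: int, n_shards: int) -> list:
    """Split `key` into n additive shards: each shard is uniformly
    random, and they sum to the key modulo PRIME."""
    shards = [secrets.randbelow(PRIME) for _ in range(n_shards - 1)]
    last = (key - sum(shards)) % PRIME
    return shards + [last]

def combine(shards: list) -> int:
    return sum(shards) % PRIME

key = secrets.randbelow(PRIME)
shards = split_key(key, 3)
assert combine(shards) == key      # all 3 shards recover the key
assert combine(shards[:2]) != key  # any proper subset is useless
                                   # (with overwhelming probability)
```

Because any proper subset of shards is statistically independent of the key, compromising one party, or even all but one, yields nothing. That is the "no single point of failure" appeal described above.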

Hardware Security Modules (HSMs), by contrast, store complete private keys inside FIPS 140-2 Level 3 or Level 4 certified physical devices. These aren't just tamper-resistant — they're tamper-responsive. When sensors detect drilling, voltage manipulation, or temperature extremes, the HSM instantly self-erases all cryptographic material before an attacker can extract keys. The entire cryptographic lifecycle — generation, storage, signing, destruction — occurs within a certified boundary that meets strict federal standards.

For years, both approaches coexisted. MPC providers emphasized the theoretical impossibility of key compromise through single-point attacks. HSM advocates pointed to decades of proven security in banking infrastructure and unambiguous regulatory compliance. The market treated them as equally viable alternatives for institutional custody.

Then regulators clarified what "qualified custodian" actually means.

FIPS 140-3: The Standard That Changed Everything

The Federal Information Processing Standards don't exist to make engineers' lives difficult. They exist because the U.S. government learned — through painful, classified incidents — exactly how cryptographic modules fail under adversarial conditions.

FIPS 140-3, which superseded FIPS 140-2 in March 2019, establishes four security levels for cryptographic modules:

Level 1 requires production-grade equipment and externally tested algorithms. It's the baseline — necessary but insufficient for protecting high-value assets.

Level 2 adds requirements for physical tamper-evidence and role-based authentication. Attackers might successfully compromise a Level 2 module, but they'll leave detectable traces.

Level 3 demands physical tamper-resistance and identity-based authentication. Private keys can only enter or exit in encrypted form. This is where the requirements become expensive to implement and impossible to fake. Level 3 modules must detect and respond to physical intrusion attempts — not just log them for later review.

Level 4 enforces tamper-active protections: the module must detect environmental attacks (voltage glitches, temperature manipulation, electromagnetic interference) and immediately destroy sensitive data. Multi-factor authentication becomes mandatory. At this level, the security boundary can resist nation-state attackers with physical access to the device.

For qualified custodian status under U.S. banking regulations, HSM infrastructure must demonstrate at minimum FIPS 140-2 Level 3 certification. This isn't a suggestion or best practice. It's a hard requirement enforced by the Office of the Comptroller of the Currency (OCC), Federal Reserve, and state banking regulators.

Software-based MPC systems, by definition, cannot achieve FIPS 140-2 or 140-3 certification at Level 3 or above. The certification applies to physical cryptographic modules with hardware tamper-resistance — a category that MPC architectures fundamentally don't fit.

The Fireblocks and Copper Compliance Gap

Fireblocks Trust Company operates under a New York State trust charter regulated by the New York Department of Financial Services (NYDFS). The company's infrastructure protects over $10 trillion in digital assets across 300 million wallets — a genuinely impressive achievement that demonstrates operational excellence and market confidence.

But "qualified custodian" under federal banking law is a specific term of art with precise requirements. National banks, federal savings associations, and state banks that are members of the Federal Reserve system are presumptively qualified custodians. State trust companies can achieve qualified custodian status if they meet the same requirements — including HSM-backed key management that satisfies FIPS standards.

Fireblocks' architecture relies on MPC technology on the backend. The company's security model splits keys across multiple parties and uses advanced cryptographic protocols to enable signing without key reconstruction. For many use cases — especially high-velocity trading, cross-exchange arbitrage, and DeFi protocol interactions — this architecture offers compelling advantages over HSM-based systems.

But it doesn't meet the federal qualified custodian standard for digital asset custody.

Copper faces the same fundamental constraint. The platform excels at providing fintech companies and exchanges with fast asset movement and trading infrastructure. The technology works. The operations are professional. The security model is defensible for its intended use cases.

Neither company uses HSMs on the backend. Both rely on MPC technology. Under current regulatory interpretations, that architectural choice disqualifies them from serving as qualified custodians for institutional clients subject to federal banking oversight.

The SEC confirmed in recent guidance that it will not recommend enforcement action against registered advisers or regulated funds that use state trust companies as qualified custodians for crypto assets — but only if the state trust company is authorized by its regulator to provide custody services and meets the same requirements that apply to traditional qualified custodians. That includes FIPS-certified HSM infrastructure.

This isn't about one technology being "better" than another in absolute terms. It's about regulatory definitions that were written when cryptographic custody meant HSMs in physically secured facilities, and haven't been updated to accommodate software-based alternatives.

Anchorage Digital's Federal Charter Moat

In January 2021, Anchorage Digital Bank became the first crypto-native company to receive a national trust bank charter from the OCC. Five years later, it remains the only federally chartered bank focused primarily on digital asset custody.

The OCC charter isn't just a regulatory achievement. It's a competitive moat that becomes more valuable as institutional adoption accelerates.

Clients using Anchorage Digital Bank have their assets custodied under the same federal regulatory framework that governs JPMorgan Chase and Bank of New York Mellon. This includes:

  • Capital requirements designed to ensure the bank can absorb losses without threatening customer assets
  • Comprehensive compliance standards enforced through regular OCC examinations
  • Security protocols subject to federal banking oversight, including FIPS-certified HSM infrastructure
  • SOC 1 and SOC 2 Type II certification confirming effective internal controls

The operational performance metrics matter too. Anchorage processes 90% of transactions in under 20 minutes — competitive with MPC-based systems that theoretically should be faster due to distributed signing. The company has built custody infrastructure that institutions including BlackRock selected for spot crypto ETF operations, a vote of confidence from the world's largest asset manager launching regulated products.

For regulated entities — pension funds, endowments, insurance companies, registered investment advisers — the federal charter resolves a compliance problem that no amount of innovative cryptography can solve. When regulations require qualified custodian status, and qualified custodian status requires HSM infrastructure validated under FIPS standards, and only one crypto-native bank operates under direct OCC supervision, the custody decision becomes straightforward.

The Hybrid Architecture Opportunity

The custody technology landscape isn't static. As institutions recognize the regulatory constraints on pure MPC solutions, a new generation of hybrid architectures is emerging.

These systems combine FIPS 140-2 validated HSMs with MPC protocols and biometric controls for multi-layered protection. The HSM provides the regulatory compliance foundation and physical tamper-resistance. MPC adds distributed signing capabilities and eliminates single points of compromise. Biometrics ensure that even with valid credentials, transactions require human verification from authorized personnel.

Some advanced custody platforms now operate as "temperature agnostic" — able to dynamically allocate assets across cold storage (HSMs in physically secured facilities), warm storage (HSMs with faster access for operational needs), and hot wallets (for high-velocity trading where milliseconds matter and regulatory requirements are less stringent).

This architectural flexibility matters because different asset types and use cases have different security-versus-accessibility trade-offs:

  • Long-term treasury holdings: Maximum security in cold storage HSMs at FIPS Level 4 facilities, with multi-day withdrawal processes and multiple approval layers
  • ETF creation/redemption: Warm storage HSMs that can process institutional-scale transactions within hours while maintaining FIPS compliance
  • Trading operations: Hot wallets with MPC signing for sub-second execution where the custody provider operates under different regulatory frameworks than qualified custodians

The key insight is that regulatory compliance isn't binary. It's context-dependent based on the type of institution, the assets being held, and the regulatory regime that applies.
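The tiered allocation described above can be sketched as a simple routing policy. The tier names, SLAs, and function below are hypothetical illustrations, not any custody vendor's actual API:

```python
# Hypothetical sketch of a "temperature agnostic" allocation policy.
# Tier names, SLAs, and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Tranche:
    tier: str            # "hot", "warm", or "cold"
    withdrawal_sla: str  # how fast assets can move out of this tier
    amount: float        # units of the asset allocated to the tier

def allocate(total: float, trading_need: float, ops_need: float) -> list[Tranche]:
    """Route assets to storage tiers by accessibility requirement."""
    hot = min(total, trading_need)        # high-velocity trading balance
    warm = min(total - hot, ops_need)     # operational flows (e.g. ETF creation)
    cold = total - hot - warm             # everything else: long-term treasury
    return [
        Tranche("hot", "sub-second (MPC signing)", hot),
        Tranche("warm", "hours (HSM, FIPS-compliant)", warm),
        Tranche("cold", "days (HSM, Level 4 facility)", cold),
    ]

for t in allocate(total=1000.0, trading_need=50.0, ops_need=200.0):
    print(f"{t.tier:>4}: {t.amount:>6.1f} ({t.withdrawal_sla})")
```

In practice the thresholds would come from treasury policy and the SLAs from the custodian's contractual terms; the point is that allocation becomes a policy decision rather than an all-or-nothing architecture choice.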

NIST Standards and 2026's Evolving Landscape

Beyond FIPS certification, the National Institute of Standards and Technology (NIST) has emerged as the cybersecurity benchmark for digital asset custody in 2026.

Financial institutions offering custody services increasingly must meet operational requirements aligned with the NIST Cybersecurity Framework 2.0. This includes:

  • Continuous monitoring and threat detection across custody infrastructure
  • Incident response playbooks tested through regular tabletop exercises
  • Supply chain security for hardware and software components in custody systems
  • Identity and access management with least-privilege principles

Fireblocks' framework aligns with NIST CSF 2.0 and provides a model for banks operationalizing custody governance. The challenge is that NIST compliance, while necessary, isn't sufficient for qualified custodian status under federal banking law. It's a cybersecurity baseline that applies across custody providers — but doesn't resolve the underlying FIPS certification requirement for HSM infrastructure.

As crypto custody regulations mature in 2026, we're seeing clearer delineation between different regulatory tiers:

  • OCC-chartered banks: Full federal banking oversight, qualified custodian status, HSM requirements
  • State-chartered trust companies: NYDFS or equivalent state regulation, potential qualified custodian status if HSM-backed
  • Licensed custody providers: Meet state licensing requirements but don't claim qualified custodian status
  • Technology platforms: Provide custody infrastructure without directly holding customer assets in their own name

The regulatory evolution isn't making custody simpler. It's creating more specialized categories that match security requirements to institutional risk profiles.

What This Means for Institutional Adoption

The custody architecture divide has direct implications for institutions allocating to digital assets in 2026:

For registered investment advisers (RIAs), the SEC's custody rule requires client assets to be held by qualified custodians. If your fund structure requires qualified custodian status, MPC-based providers — regardless of their security properties or operational track record — cannot satisfy that regulatory requirement.

For public pension funds and endowments, fiduciary standards often require custody at institutions that meet the same security and oversight standards as traditional asset custodians. State banking charters or federal OCC charters become prerequisites, which dramatically narrows the field of viable providers.

For corporate treasuries accumulating Bitcoin or stablecoins, the qualified custodian requirement may not apply — but insurance coverage does. Many institutional-grade custody insurance policies now require FIPS-certified HSM infrastructure as a condition of coverage. The insurance market is effectively enforcing hardware security module requirements even where regulators haven't mandated them.

For crypto-native firms — exchanges, DeFi protocols, trading desks — the calculus differs. Speed matters more than regulatory classification. The ability to move assets across chains and integrate with smart contracts matters more than FIPS certification. MPC-based custody platforms excel in these environments.

The mistake is treating custody as a one-size-fits-all decision. The right architecture depends entirely on who you are, what you're holding, and which regulatory framework applies.

The Path Forward

By 2030, the custody market will likely have bifurcated into distinct categories:

Qualified custodians operating under OCC federal charters or equivalent state trust charters, using HSM infrastructure, serving institutions subject to strict fiduciary standards and custody regulations.

Technology platforms leveraging MPC and other advanced cryptographic techniques, serving use cases where speed and flexibility matter more than qualified custodian status, operating under money transmission or other licensing frameworks.

Hybrid providers offering both HSM-backed qualified custody for regulated products and MPC-based solutions for operational needs, allowing institutions to allocate assets across security models based on specific requirements.

The question for institutions entering crypto in 2026 isn't "which custody provider is best?" It's "which custody architecture matches our regulatory obligations, risk tolerance, and operational needs?"

For many institutions, that answer points toward federally regulated custodians with FIPS-certified HSM infrastructure. For others, the flexibility and speed of MPC-based platforms outweigh the qualified custodian classification.

The industry's maturation means acknowledging these trade-offs rather than pretending they don't exist.

As blockchain infrastructure continues evolving toward institutional standards, reliable API access to diverse networks becomes essential for builders. BlockEden.xyz provides enterprise-grade RPC endpoints across major chains, enabling developers to focus on applications rather than node operations.


Filecoin's Onchain Cloud Transformation: From Cold Storage to Programmable Infrastructure

· 11 min read
Dora Noda
Software Engineer

While AWS charges $23 per terabyte monthly for standard storage, Filecoin costs $0.19 for the same capacity. But cost alone never wins infrastructure wars. The real question is whether decentralized storage can match centralized cloud providers in the metrics that actually matter: speed, reliability, and developer experience. On November 18, 2025, Filecoin made its answer clear with the launch of Onchain Cloud—a fundamental transformation that turns 2.1 exbibytes of archival storage into programmable, verifiable infrastructure designed for AI workloads and real-time applications.

This isn't incremental improvement. It's Filecoin's pivot from "blockchain storage network" to "decentralized cloud platform," complete with automated payments, cryptographic verification, and performance guarantees. After months of testing with over 100 developer teams, the mainnet launched in January 2026, positioning Filecoin to capture a meaningful share of the $12 billion AI infrastructure market.

The Onchain Cloud Architecture: Three Pillars of Programmable Storage

Filecoin Onchain Cloud introduces three core services that collectively enable developers to build on verifiable, decentralized infrastructure without the complexity traditionally associated with blockchain storage.

Filecoin Warm Storage Service keeps data online and provably available through continuous onchain proofs. Unlike cold archival storage that requires retrieval delays, warm storage maintains data in an accessible state while still leveraging Filecoin's cryptographic verification. This addresses the primary limitation that kept Filecoin confined to backup and archival use cases—data wasn't fast enough for active workloads.

Filecoin Pay automates usage-based payments through smart contracts, settling transactions only when delivery is confirmed onchain. This is fundamental infrastructure for pay-as-you-go cloud services: payments flow automatically as services are proven, eliminating manual invoicing, credit systems, and trust assumptions. Thousands of payment channels have already processed transactions through the testnet phase.
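The proof-gated settlement flow can be modeled as a small state machine: client funds sit in escrow and are released only when delivery is confirmed. This is an illustrative sketch of the pattern, not the actual Filecoin Pay contract interface; the class name and rate are hypothetical:

```python
# Illustrative model of proof-gated, usage-based settlement
# (not the actual Filecoin Pay smart-contract interface).
class PaymentChannel:
    def __init__(self, escrow: float, rate_per_gib: float):
        self.escrow = escrow    # client funds locked up front
        self.rate = rate_per_gib
        self.settled = 0.0      # amount already released to the provider

    def settle(self, gib_delivered: float, proof_confirmed: bool) -> float:
        """Release payment only for deliveries proven onchain."""
        if not proof_confirmed:
            return 0.0          # no proof, no payment
        due = min(gib_delivered * self.rate, self.escrow - self.settled)
        self.settled += due
        return due

ch = PaymentChannel(escrow=100.0, rate_per_gib=0.05)
print(ch.settle(200, proof_confirmed=True))    # 10.0 released
print(ch.settle(50, proof_confirmed=False))    # 0.0: unproven delivery
```

The design choice worth noting is that trust moves from invoicing and credit checks into the proof system itself: if delivery cannot be proven, no value moves.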

Filecoin Beam enables measured data retrievals with performance-based incentives. Storage providers compete not just on storage capacity but on retrieval speed and reliability. This creates a retrieval market where providers are rewarded for performance, directly addressing the historical weakness of decentralized storage: unpredictable retrieval times.

Developers access these services through the Synapse SDK, which abstracts the complexity of direct Filecoin protocol interaction. Early integrations come from the ERC-8004 community, Ethereum Name Service (ENS), KYVE, Monad, Safe, Akave, and Storacha—projects that need verifiable storage for everything from blockchain state to decentralized identity.

Cryptographic Proofs: The Technical Foundation of Verifiable Storage

What differentiates Filecoin from centralized cloud providers isn't just decentralization—it's cryptographic proof that storage commitments are being honored. This matters for AI training datasets that need provenance guarantees, compliance-heavy industries that require audit trails, and any application where data integrity is non-negotiable.

Proof-of-Replication (PoRep) generates a unique copy of a sector's original data through a computationally intensive sealing process. This proves that a storage provider is storing a physically unique copy of the client's data, not just pretending to store it or storing a single copy for multiple clients. The sealed sector undergoes slow encoding, making it infeasible for dishonest providers to regenerate data on-demand to fake storage.

The sealing process produces a Multi-SNARK proof and a set of commitments (CommR) that link the sealed sector to the original unsealed data. These commitments are publicly verifiable on the blockchain, creating an immutable record of storage deals.

Proof-of-Spacetime (PoSt) proves continuous storage over time through regular cryptographic challenges. Storage providers face a 30-minute deadline to respond to WindowPoSt challenges by submitting zk-SNARK proofs that verify they still possess the exact bytes they committed to storing. This happens continuously—not just at the initiation of a storage deal, but throughout its entire duration.

The verification process randomly selects leaf nodes from the encoded replica and runs Merkle inclusion proofs to show that the provider has the specific bytes that should be there. Providers then use the privately stored CommRLast to prove they know a root for the replica that both agrees with the inclusion proofs and can derive the publicly-known CommR. The final stage compresses these proofs into a single zk-SNARK for efficient onchain verification.

Failure to submit WindowPoSt proofs within the 30-minute window triggers slashing: the storage provider loses a portion of their collateral (burned to the f099 address), and their storage power is reduced. This creates economic consequences for storage failures, aligning provider incentives with network reliability.

This two-layer proof system—PoRep for initial verification, PoSt for continuous validation—creates verifiable storage that centralized clouds simply cannot offer. When AWS says they're storing your data, you trust their infrastructure and legal agreements. When Filecoin says it, you have cryptographic proof updated every 30 minutes.
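The Merkle inclusion check underlying these challenges can be sketched in a few lines. This is a toy version using SHA-256 over an 8-leaf tree; the production protocol operates over sealed sector commitments and compresses the result into a zk-SNARK:

```python
# Minimal Merkle inclusion proof, the primitive behind WindowPoSt
# challenges. SHA-256 and the 8-chunk "sector" are illustrative only.
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(l) for l in leaves]
    while len(level) > 1:
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def prove(leaves: list[bytes], idx: int) -> list[bytes]:
    """Collect sibling hashes from the challenged leaf up to the root."""
    level, path = [h(l) for l in leaves], []
    while len(level) > 1:
        path.append(level[idx ^ 1])  # sibling at this level
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return path

def verify(root: bytes, leaf: bytes, idx: int, path: list[bytes]) -> bool:
    node = h(leaf)
    for sibling in path:
        node = h(node, sibling) if idx % 2 == 0 else h(sibling, node)
        idx //= 2
    return node == root

sector = [b"chunk-%d" % i for i in range(8)]  # toy sealed sector
root = merkle_root(sector)                    # analogous to a commitment like CommR
challenge = 5                                 # randomly selected leaf index
assert verify(root, sector[challenge], challenge, prove(sector, challenge))
assert not verify(root, b"tampered", challenge, prove(sector, challenge))
```

A provider that discarded the data cannot answer a random challenge, because producing the correct leaf and sibling path requires actually holding the bytes at the challenged position.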

AI Infrastructure Market: Where Decentralized Storage Meets Real Demand

The timing of Filecoin Onchain Cloud's launch aligns with a fundamental shift in AI infrastructure requirements. As artificial intelligence transitions from research curiosity to production infrastructure reshaping entire industries, the storage needs become clear and massive.

AI models require massive datasets for training. Modern large language models train on hundreds of billions of tokens. Computer vision models need millions of labeled images. Recommendation systems ingest user behavior data at scale. These datasets don't fit in local storage—they need cloud infrastructure. But they also need provenance guarantees: poisoned training data creates poisoned models, and there's no cryptographic way to verify data integrity on AWS.

Continuous data access for inference. Once trained, AI models need constant access to reference data for serving predictions. Retrieval-augmented generation (RAG) systems query knowledge bases to ground language model outputs. Real-time recommendation engines pull user profiles and item catalogs. These aren't one-time retrievals—they're continuous, high-frequency access patterns that demand fast, reliable storage.

Verifiable data provenance to prevent model poisoning. When a financial institution trains a fraud detection model, they need to know the training data wasn't tampered with. When a healthcare AI analyzes patient records, provenance matters for compliance and liability. Filecoin's PoRep and PoSt proofs create an audit trail that centralized storage can't replicate without introducing trusted intermediaries.

Decentralized storage to avoid concentration risks. Relying on a single cloud provider creates systemic risk. AWS outages have taken down significant portions of the internet. Google Cloud disruptions impact millions of services. For AI infrastructure that underpins critical systems, geographic and organizational distribution isn't a philosophical preference—it's a risk management requirement.

Filecoin's network holds 2.1 exbibytes of committed storage with an additional 7.6 EiB of raw capacity available. Network utilization has grown to 36% (up from 32% in Q2 2025), with active stored data near 1,110 petabytes. Around 2,500 datasets were onboarded in 2025, showing steady enterprise adoption.

The economic case is compelling: Filecoin averages $0.19 per terabyte monthly versus AWS's roughly $23 for the same capacity—a 99% cost reduction. But the real value proposition isn't just cheaper storage. It's verifiable storage at scale with programmable infrastructure, delivered through developer-friendly tools.
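A back-of-envelope calculation using the per-terabyte figures above shows the scale of the difference for a realistic workload (the 500 TB corpus size is an arbitrary example):

```python
# Back-of-envelope comparison using the per-TB figures cited above.
aws_per_tb_month = 23.00   # AWS S3 Standard, approximate
fil_per_tb_month = 0.19    # Filecoin network average, approximate

dataset_tb = 500           # e.g. an AI training corpus (arbitrary example)
aws_annual = aws_per_tb_month * dataset_tb * 12
fil_annual = fil_per_tb_month * dataset_tb * 12
savings_pct = (1 - fil_per_tb_month / aws_per_tb_month) * 100

print(f"AWS:      ${aws_annual:,.0f}/yr")  # $138,000/yr
print(f"Filecoin: ${fil_annual:,.0f}/yr")  # $1,140/yr
print(f"Savings:  {savings_pct:.1f}%")     # 99.2%
```

The comparison ignores egress, retrieval, and operational overhead on both sides, so treat it as an order-of-magnitude illustration rather than a procurement analysis.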

Competing Against Centralized Cloud: Where Filecoin Stands in 2026

The question isn't whether decentralized storage has advantages—the benefits of verifiable proofs, censorship resistance, and cost efficiency are clear. The question is whether those advantages matter enough to overcome the remaining disadvantages: primarily that Filecoin storage and retrieval are still slower and more complex than centralized alternatives.

Performance gap narrowing but not closed. AWS S3 delivers single-digit millisecond latency for reads. Filecoin Warm Storage and Beam retrievals can't match that—yet. But many workloads don't need millisecond latency. AI training runs access large datasets in sequential batch reads. Archival storage for compliance doesn't prioritize speed. Content distribution networks cache frequently accessed data regardless of origin storage speed.

The Onchain Cloud upgrade introduces sub-minute finality for storage commitments, a significant improvement over previous multi-hour sealing times. This doesn't compete with AWS for latency-critical applications, but it opens up new use cases that were previously impractical on Filecoin.

Developer experience improving through abstraction. Direct Filecoin protocol interaction requires understanding sectors, sealing, WindowPoSt challenges, and payment channels—concepts foreign to developers accustomed to AWS's simple API: create bucket, upload object, set permissions. The Synapse SDK abstracts this complexity, providing familiar interfaces while handling cryptographic proof verification in the background.

Early adoption from ENS, KYVE, Monad, and Safe suggests the developer experience has crossed a usability threshold. These aren't blockchain-native storage projects experimenting with Filecoin for ideological reasons—they're infrastructure projects with real storage needs choosing verifiable decentralized storage over centralized alternatives.

Reliability through economic incentives versus contractual SLAs. AWS offers 99.999999999% (11 nines) durability for S3 Standard through multi-region replication and contractual service level agreements. Filecoin achieves reliability through economic incentives: storage providers who fail WindowPoSt challenges lose collateral and storage power. This creates different risk profiles—one backed by corporate guarantees, the other by cryptographic proofs and financial penalties.

For applications that need both cryptographic verification and high availability, the optimal architecture likely involves Filecoin for verifiable storage of record plus CDN caching for fast retrieval. This hybrid approach leverages Filecoin's strengths (verifiability, cost, decentralization) while mitigating its weaknesses (retrieval speed) through edge caching.

Market positioning: not replacing AWS, but serving different needs. Filecoin isn't going to replace AWS for general-purpose cloud computing. But it doesn't need to. The addressable market is applications where verifiable storage, censorship resistance, or decentralization provide value beyond cost savings: AI training datasets with provenance requirements, blockchain state that needs permanent availability, scientific research data that requires long-term integrity guarantees, compliance-heavy industries that need cryptographic audit trails.

The $12 billion AI infrastructure market represents a subset of total cloud spending where Filecoin's value proposition is strongest. Capturing even 5% of that market would represent $600 million in annual storage demand—meaningful growth from current utilization levels.

From 2.1 EiB to the Future of Verifiable Infrastructure

Filecoin's total committed storage capacity has actually declined through 2025—from 3.8 exbibytes in Q1 to 3.3 EiB in Q2 to 3.0 EiB by Q3—as inefficient storage providers exited following the Network v27 "Golden Week" upgrade. This capacity decline while utilization increased (from 30% to 36%) suggests a maturing market: lower total capacity but higher paid storage as a percentage.

The network expects over 1 exbibyte in paid storage deals by the end of 2025, representing a transition from speculative capacity provisioning to actual customer demand. This matters more than raw capacity numbers—utilization indicates real value delivery, not just miners onboarding storage hoping for future demand.

The Onchain Cloud transformation positions Filecoin for a different growth trajectory: not maximizing total storage capacity, but maximizing storage utilization through services that developers actually need. Warm storage, verifiable retrieval, and automated payments address the barriers that kept Filecoin confined to niche archival use cases.

Early mainnet adoption will be the critical test. Developer teams have tested on testnet, but production deployments with real data and real payments will reveal whether the performance, reliability, and developer experience meet the standards required for infrastructure decisions. The projects already experimenting—ENS for decentralized identity storage, KYVE for blockchain data archives, Safe for multi-signature wallet infrastructure—suggest cautious optimism.

The AI infrastructure market opportunity is real, but not guaranteed. Filecoin faces competition from centralized cloud providers with massive head starts in performance and developer ecosystems, plus decentralized storage competitors like Arweave (permanent storage) and Storj (performance-focused S3 alternative). Winning requires execution: delivering reliability that meets production standards, maintaining competitive pricing as the network scales, and continuing to improve developer tools and documentation.

Filecoin's transformation from "blockchain storage" to "programmable onchain cloud" represents a necessary evolution. The question in 2026 isn't whether decentralized storage has theoretical advantages—it clearly does. The question is whether those advantages translate into developer adoption and customer demand at scale. The cryptographic proofs are in place. The economic incentives are aligned. Now comes the hard part: building a cloud platform that developers trust with production workloads.

BlockEden.xyz provides enterprise-grade infrastructure for blockchain developers building on verifiable foundations. Explore our API marketplace to access the infrastructure you need for applications designed to last.


The 2026 Data Availability Race: Celestia, EigenDA, and Avail's Battle for Blockchain Scalability

· 13 min read
Dora Noda
Software Engineer

Every Layer 2 you use relies on hidden infrastructure that most users never think about: data availability layers. But in 2026, this quiet battlefield has become the most critical piece of blockchain scalability, with three giants—Celestia, EigenDA, and Avail—racing to process terabits of rollup data per second. The winner doesn't just capture market share; they define which rollups survive, how much transactions cost, and whether blockchain can scale to billions of users.

The stakes couldn't be higher. Celestia commands roughly 50% of the data availability market after processing over 160 gigabytes of rollup data. Its upcoming Matcha upgrade in Q1 2026 will double block sizes to 128MB, while the experimental Fibre Blockspace protocol promises a staggering 1 terabit per second throughput—1,500 times their previous roadmap target. Meanwhile, EigenDA has achieved 100MB/s throughput using a Data Availability Committee model, and Avail has secured integrations with Arbitrum, Optimism, Polygon, StarkWare, and zkSync for its mainnet launch.

This isn't just infrastructure competition—it's a battle over the fundamental economics of Layer 2 networks. Choosing the wrong data availability layer can increase costs by 55 times, making the difference between a thriving rollup ecosystem and one strangled by data fees.

The Data Availability Bottleneck: Why This Layer Matters

To understand why data availability has become blockchain's most important battlefield, you need to grasp what rollups actually do. Layer 2 rollups like Arbitrum, Optimism, and Base execute transactions off-chain to achieve faster speeds and lower costs, then post transaction data somewhere secure so anyone can verify the chain's state. That "somewhere secure" is the data availability layer.

For years, Ethereum's mainnet served as the default DA layer. But as rollup usage exploded, Ethereum's limited block space created a bottleneck. Data availability fees spiked during periods of high demand, eating into the cost savings that made rollups attractive in the first place. The solution? Modular data availability layers purpose-built to handle massive throughput at minimal cost.

Data availability sampling (DAS) is the breakthrough technology enabling this transformation. Instead of requiring every node to download entire blocks to verify availability, DAS allows light nodes to probabilistically confirm data is available by sampling small random chunks. More light nodes sampling means the network can safely increase block sizes without sacrificing security.
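The security intuition behind DAS is easy to quantify. If a block producer withholds a fraction f of a block's chunks and a light node draws k random samples, the withholding is detected with probability roughly 1 − (1 − f)^k. This treats draws as independent for simplicity; erasure coding is what forces an attacker to withhold a large fraction of chunks in the first place, which is why small sample counts suffice:

```python
# Probability that a light node catches withheld data after k random
# samples, assuming a fraction f of chunks is missing. Independent
# draws are a simplification; real erasure-coded DAS is stronger,
# since an adversary must withhold ~half the chunks to make a block
# unrecoverable.
def detection_probability(f: float, k: int) -> float:
    return 1 - (1 - f) ** k

# With erasure coding, an attacker must hide roughly 50% of chunks:
for k in (5, 10, 20, 30):
    print(f"{k:>2} samples -> {detection_probability(0.5, k):.6f}")
```

After just 20 samples a single light node is all but certain to notice missing data, and thousands of independent light nodes sampling in parallel make withholding practically impossible to hide.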

Celestia pioneered this approach as the first modular data availability network, separating data ordering and availability from execution and settlement. The architecture is elegant: Celestia orders transaction data into "blobs" and guarantees their availability for a configurable period, while execution and settlement happen on layers above. This separation allows each layer to optimize for its specific function rather than compromising on all fronts like monolithic blockchains.

By mid-2025, more than 56 rollups were using Celestia, including 37 on mainnet and 19 on testnet. Eclipse alone has posted over 83 gigabytes through the network. Every major rollup framework—Arbitrum Orbit, OP Stack, Polygon CDK—now supports Celestia as a data availability option, creating switching costs and network effects that compound Celestia's early-mover advantage.

Celestia's Two-Pronged Attack: Matcha Upgrade and Fibre Blockspace

Celestia isn't resting on its market share. The project is executing a two-phase strategy to cement dominance: the near-term Matcha upgrade bringing production-ready scalability improvements, and the experimental Fibre Blockspace protocol targeting 1 terabit per second of future throughput.

Matcha Upgrade: Doubling Down on Production Scale

The Matcha upgrade (Celestia v6) is currently live on the Arabica testnet with mainnet deployment expected in Q1 2026. It represents the largest single capacity increase in Celestia's history.

Core improvements include:

  • 128MB block size: CIP-38 introduces a new high-throughput block propagation mechanism, increasing maximum block size from 8MB to 128MB—a 16x jump. The data square size expands from 128 to 512, and maximum transaction size grows from 2MB to 8MB.

  • Reduced storage requirements: CIP-34 cuts Celestia's minimum data pruning window from 30 days to 7 days plus 1 hour, slashing storage costs for bridge nodes from 30TB to 7TB at projected throughput levels. For rollups running high-volume applications, this storage reduction translates directly to lower operational costs.

  • Light node optimization: CIP-35 introduces pruning for Celestia light nodes, allowing them to retain only recent headers rather than the entire chain history. Light node storage requirements drop to approximately 10GB, making it feasible to run verification nodes on consumer hardware and mobile devices.

  • Inflation cut and interoperability: Beyond scalability, Matcha cuts protocol inflation from 5% to 2.5%, potentially making TIA deflationary if network usage grows. It also removes the token filter for IBC and Hyperlane, positioning Celestia as a routing layer for any asset across multiple ecosystems.

In testing environments, Celestia achieved approximately 27 MB/s throughput with 88 MB blocks in the Mammoth Mini devnet, and 21.33 MB/s sustained throughput with 128 MB blocks in the mamo-1 testnet. These aren't theoretical maximums—they're production-proven benchmarks that rollups can rely on when architecting for scale.
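Simple arithmetic puts those benchmark numbers in perspective:

```python
# What the cited mamo-1 testnet benchmark implies per day.
mb_per_s = 21.33                  # sustained throughput with 128 MB blocks
seconds_per_day = 86_400
tb_per_day = mb_per_s * seconds_per_day / 1_000_000
print(f"{tb_per_day:.2f} TB of blobspace per day")  # ~1.84 TB/day
```

At that rate a single day of sustained operation carries more rollup data than the 160 GB Celestia has processed over its entire mainnet history to date, which is why the Matcha capacity jump matters for high-volume rollups.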

Fibre Blockspace: The 1 Tb/s Future

While Matcha focuses on near-term production readiness, Fibre Blockspace represents Celestia's moonshot vision for blockchain throughput. The protocol is capable of sustaining 1 terabit per second of blockspace across 500 nodes—a throughput level 1,500 times the goal set in Celestia's previous roadmap.

The core innovation is ZODA, a new encoding protocol that Celestia claims processes data 881 times faster than KZG commitment-based alternatives used by competing DA protocols. During large-scale network tests using 498 GCP machines distributed across North America (each with 48-64 vCPUs, 90-128GB RAM, and 34-45Gbps network links), the team successfully demonstrated terabit-scale throughput.

Fibre targets power users with a minimum blob size of 256KB and maximum of 128MB, optimized for high-volume rollups and institutional applications requiring guaranteed throughput. The rollout plan is incremental: Fibre will first deploy to the Arabica testnet for developer experimentation, then graduate to mainnet with progressive throughput increases as the protocol undergoes real-world stress testing.

What does 1 Tb/s actually mean in practice? At that throughput level, Celestia could theoretically handle the data needs of thousands of high-activity rollups simultaneously, supporting everything from high-frequency trading venues to real-time gaming worlds to AI model training coordination—all without the data availability layer becoming a bottleneck.

EigenDA and Avail: Different Philosophies, Different Trade-offs

While Celestia dominates market share, EigenDA and Avail are carving out distinct positioning with alternative architectural approaches that appeal to different use cases.

EigenDA: Speed Through Restaking

EigenDA, built by the EigenLayer team, has released V2 software achieving 100MB per second throughput—significantly higher than Celestia's current mainnet performance. The protocol leverages EigenLayer's restaking infrastructure, where Ethereum validators reuse their staked ETH to secure additional services including data availability.

The key architectural difference: EigenDA operates as a Data Availability Committee (DAC) rather than a publicly verified blockchain. This design choice removes certain verification requirements that blockchain-based solutions implement, enabling DACs like EigenDA to reach higher raw throughput while introducing trust assumptions that validators in the committee will honestly attest to data availability.

For Ethereum-native projects prioritizing seamless integration with the Ethereum ecosystem and willing to accept DAC trust assumptions, EigenDA offers a compelling value proposition. The shared security model with Ethereum mainnet creates a natural alignment for rollups already relying on Ethereum for settlement. However, this same dependency becomes a limitation for projects seeking sovereignty beyond the Ethereum ecosystem or requiring the strongest possible data availability guarantees.

Avail: Multichain Flexibility

Avail launched its mainnet in 2025 with a different focus: optimizing data availability for highly scalable and customizable rollups across multiple ecosystems, not just Ethereum. The protocol combines validity proofs, data availability sampling, and erasure coding with KZG polynomial commitments to deliver what the team calls "world-class data availability guarantees."

Avail's current mainnet throughput stands at 4MB per block, with benchmarks demonstrating successful increases to 128MB per block—a 32x improvement—without sacrificing network liveness or block propagation speed. The roadmap includes progressive throughput increases as the network matures.

The project's major achievement in 2026 has been securing integration commitments from five major Layer 2 projects: Arbitrum, Optimism, Polygon, StarkWare, and zkSync. Avail claims over 70 partnerships total, spanning application-specific blockchains, DeFi protocols, and Web3 gaming chains. This ecosystem breadth positions Avail as the data availability layer for multichain infrastructure that needs to coordinate across different settlement environments.

Avail DA represents the first component of a three-part architecture. The team is developing Nexus (an interoperability layer) and Fusion (a security network layer) to create a full-stack modular infrastructure. This vertical integration strategy mirrors Celestia's vision of being more than just data availability—becoming fundamental infrastructure for the entire modular stack.

Market Position and Adoption: Who's Winning in 2026?

The data availability market in 2026 is shaping up as a "winner takes most" dynamic, with Celestia holding commanding early-stage market share but facing credible competition from EigenDA and Avail in specific niches.

Celestia's Market Dominance:

  • ~50% market share in data availability services
  • 160+ gigabytes of rollup data processed through the network
  • 56+ rollups using the platform (37 mainnet, 19 testnet)
  • Universal rollup framework support: Arbitrum Orbit, OP Stack, and Polygon CDK all integrate Celestia as a DA option

This adoption creates powerful network effects. As more rollups choose Celestia, developer tooling, documentation, and ecosystem expertise concentrate around the platform.

Switching costs increase as teams build Celestia-specific optimizations into their rollup architecture. The result is a flywheel where market share begets more market share.

EigenDA's Ethereum Alignment:

EigenDA's strength lies in its tight integration with Ethereum's restaking ecosystem. For projects already committed to Ethereum for settlement and security, adding EigenDA as a data availability layer creates a vertically integrated stack entirely within the Ethereum universe.

The 100MB/s throughput also positions EigenDA well for high-frequency applications willing to accept DAC trust assumptions in exchange for raw speed.

However, EigenDA's reliance on Ethereum validators limits its appeal for rollups seeking sovereignty or multichain flexibility. Projects building on Solana, Cosmos, or other non-EVM ecosystems have little incentive to depend on Ethereum restaking for data availability.

Avail's Multichain Play:

Avail's integrations with Arbitrum, Optimism, Polygon, StarkWare, and zkSync represent major partnership wins, but the protocol's actual mainnet usage lags behind announcements.

The 4MB per block throughput (versus Celestia's current 8MB and Matcha's upcoming 128MB) creates a performance gap that limits Avail's competitiveness for high-volume rollups.

Avail's true differentiator is multichain flexibility. As blockchain infrastructure fragments across Ethereum L2s, alternative L1s, and application-specific chains, the need for a neutral data availability layer that doesn't favor one ecosystem grows. Avail positions itself as that neutral infrastructure, with partnerships spanning multiple settlement layers and execution environments.

The Economics of DA Layer Choice:

Choosing the wrong data availability layer can increase rollup costs by as much as 55x, according to industry analysis. This cost differential stems from three factors:

  1. Throughput limitations creating data fee spikes during demand peaks
  2. Storage requirements forcing rollups to maintain expensive archive infrastructure
  3. Switching costs making it expensive to migrate once integrated

For gaming-focused Layer 3 rollups generating massive state updates, the choice between Celestia's low-cost modular DA (especially post-Matcha) versus more expensive alternatives can mean the difference between sustainable economics and bleeding capital on data fees. This explains why Celestia is projected to dominate gaming L3 adoption in 2026.
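To make the arithmetic concrete, here is a minimal Python sketch of how a higher per-MB price on a throughput-constrained DA layer, compounded by demand-peak surcharges, produces a cost gap of roughly this magnitude. All volumes, prices, and the surcharge multiplier below are hypothetical placeholders, not quoted rates from any provider.

```python
# Illustrative DA cost comparison for a rollup. All numbers are
# hypothetical placeholders chosen to show how a ~55x gap can arise.

def monthly_da_cost(mb_per_day: float, price_per_mb: float,
                    peak_multiplier: float = 1.0) -> float:
    """Monthly data-availability spend, optionally scaled by a
    demand-peak surcharge on throughput-constrained layers."""
    return mb_per_day * 30 * price_per_mb * peak_multiplier

# A hypothetical gaming L3 posting 5 GB of state updates per day.
daily_mb = 5 * 1024

cheap = monthly_da_cost(daily_mb, price_per_mb=0.001)    # low-cost modular DA
pricey = monthly_da_cost(daily_mb, price_per_mb=0.04,    # constrained layer...
                         peak_multiplier=1.375)          # ...with fee spikes

print(f"low-cost DA:  ${cheap:,.0f}/mo")
print(f"expensive DA: ${pricey:,.0f}/mo ({pricey / cheap:.0f}x)")
```

The point of the sketch is that the multiplier comes from two compounding terms (base price gap and peak surcharge), which is why the differential can dwarf the headline per-MB price difference.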

The Path Forward: Implications for Rollup Economics and Blockchain Architecture

The data availability wars of 2026 represent more than infrastructure competition—they're reshaping fundamental assumptions about how blockchains scale and how rollup economics work.

Celestia's Matcha upgrade and Fibre Blockspace roadmap make it clear that data availability is no longer the bottleneck for blockchain scalability. With 128MB blocks shipping via Matcha and 1 Tb/s demonstrated in testing, the constraint shifts elsewhere—to execution layer optimization, state growth management, and cross-rollup interoperability. This is a profound shift. For years, the assumption was that data availability would limit how many rollups could scale simultaneously. Celestia is systematically invalidating that assumption.

The modular architecture philosophy is winning. Every major rollup framework now supports pluggable data availability layers rather than forcing dependence on Ethereum mainnet. This architectural choice validates the core insight behind Celestia's founding: that monolithic blockchains forcing every node to do everything create unnecessary trade-offs, while modular separation allows each layer to optimize independently.

Different DA layers are crystallizing around distinct use cases rather than competing head-to-head. Celestia serves rollups prioritizing cost efficiency, maximum decentralization, and proven production scale. EigenDA appeals to Ethereum-native projects willing to accept DAC trust assumptions for higher throughput. Avail targets multichain infrastructure needing neutral coordination across ecosystems. Rather than a single winner, the market is segmenting by architectural priorities.

Data availability costs are trending toward zero, which changes rollup business models. As Celestia's block sizes grow and competition intensifies, the marginal cost of posting data approaches negligible levels. This removes one of the largest variable costs in rollup operations, shifting economics toward fixed infrastructure costs (sequencers, provers, state storage) rather than per-transaction DA fees. Rollups can increasingly focus on execution innovation rather than worrying about data bottlenecks.

The next chapter of blockchain scaling isn't about whether rollups can access affordable data availability—Celestia's Matcha upgrade and Fibre roadmap make that inevitable. The question is what applications become possible when data is no longer the constraint. High-frequency trading venues running entirely on-chain. Massive multiplayer gaming worlds with persistent state. AI model coordination across decentralized compute networks. These applications were economically infeasible when data availability limited throughput and spiked costs unpredictably. Now the infrastructure exists to support them at scale.

For blockchain developers in 2026, the data availability layer choice has become as critical as choosing which L1 to build on was in 2020. Celestia's market position, production-proven scalability roadmap, and ecosystem integrations make it the safe default. EigenDA offers higher throughput for Ethereum-aligned projects accepting DAC trust models. Avail provides multichain flexibility for teams coordinating across ecosystems. All three have viable paths forward—but Celestia's 50% market share, Matcha upgrade, and Fibre vision position it to define what "data availability at scale" means for the next generation of blockchain infrastructure.


Nillion's Blacklight Goes Live: How ERC-8004 is Building the Trust Layer for Autonomous AI Agents

· 12 min read
Dora Noda
Software Engineer

On February 2, 2026, the AI agent economy took a critical step forward. Nillion launched Blacklight, a verification layer implementing the ERC-8004 standard to solve one of blockchain's most pressing questions: how do you trust an AI agent you've never met?

The answer isn't a simple reputation score or a centralized registry. It's a five-step verification process backed by cryptographic proofs, programmable audits, and a network of community-operated nodes. As autonomous agents increasingly execute trades, manage treasuries, and coordinate cross-chain activities, Blacklight represents the infrastructure enabling trustless AI coordination at scale.

The Trust Problem AI Agents Can't Solve Alone

The numbers tell the story. AI agents now contribute 30% of Polymarket's trading volume, handle DeFi yield strategies across multiple protocols, and autonomously execute complex workflows. But there's a fundamental bottleneck: how do agents verify each other's trustworthiness without pre-existing relationships?

Traditional systems rely on centralized authorities issuing credentials. Web3's promise is different—trustless verification through cryptography and consensus. Yet until ERC-8004, there was no standardized way for agents to prove their authenticity, track their behavior, or validate their decision-making logic on-chain.

This isn't just a theoretical problem. As Davide Crapis explains, "ERC-8004 enables decentralized AI agent interactions, establishes trustless commerce, and enhances reputation systems on Ethereum." Without it, agent-to-agent commerce remains confined to walled gardens or requires manual oversight—defeating the purpose of autonomy.

ERC-8004: The Three-Registry Trust Infrastructure

The ERC-8004 standard, which went live on Ethereum mainnet on January 29, 2026, establishes a modular trust layer through three on-chain registries:

Identity Registry: Uses ERC-721 to provide portable agent identifiers. Each agent receives a non-fungible token representing its unique on-chain identity, enabling cross-platform recognition and preventing identity spoofing.

Reputation Registry: Collects standardized feedback and ratings. Unlike centralized review systems, feedback is recorded on-chain with cryptographic signatures, creating an immutable audit trail. Anyone can crawl this history and build custom reputation algorithms.

Validation Registry: Supports cryptographic and economic verification of agent work. This is where programmable audits happen—validators can re-execute computations, verify zero-knowledge proofs, or leverage Trusted Execution Environments (TEEs) to confirm an agent acted correctly.

The brilliance of ERC-8004 is its unopinionated design. As the technical specification notes, the standard supports various validation techniques: "stake-secured re-execution of tasks (inspired by systems like EigenLayer), verification of zero-knowledge machine learning (zkML) proofs, and attestations from Trusted Execution Environments."

This flexibility matters. A DeFi arbitrage agent might use zkML proofs to verify its trading logic without revealing alpha. A supply chain agent might use TEE attestations to prove it accessed real-world data correctly. A cross-chain bridge agent might rely on crypto-economic validation with slashing to ensure honest execution.
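A minimal in-memory model of the three registries makes the division of labor concrete. The class and method names below are illustrative Python stand-ins under my own naming, not the standard's actual Solidity interfaces:

```python
# Minimal in-memory sketch of ERC-8004's three registries. The real
# standard defines Solidity interfaces; all names here are illustrative.
from dataclasses import dataclass, field

@dataclass
class AgentRegistries:
    identities: dict = field(default_factory=dict)   # token_id -> agent pubkey
    reputation: dict = field(default_factory=dict)   # token_id -> [(rater, score)]
    validations: dict = field(default_factory=dict)  # task_id -> verdict

    def register(self, token_id: int, pubkey: str) -> None:
        # Identity Registry: one ERC-721-style token per agent,
        # preventing identity spoofing across platforms.
        assert token_id not in self.identities, "identity already minted"
        self.identities[token_id] = pubkey
        self.reputation[token_id] = []

    def rate(self, token_id: int, rater: str, score: int) -> None:
        # Reputation Registry: append-only feedback, creating an
        # immutable audit trail rather than an editable review page.
        self.reputation[token_id].append((rater, score))

    def record_validation(self, task_id: str, verdict: bool) -> None:
        # Validation Registry: permanent verdict on a unit of agent work.
        self.validations[task_id] = verdict

    def score(self, token_id: int) -> float:
        # Anyone can crawl the feedback history and compute their own
        # reputation metric; a plain average is just one choice.
        ratings = [s for _, s in self.reputation[token_id]]
        return sum(ratings) / len(ratings) if ratings else 0.0
```

The separation matters: identity is minted once, reputation accretes, and validation verdicts are per-task, so each registry can evolve its own proof systems independently.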

Blacklight's Five-Step Verification Process

Nillion's implementation of ERC-8004 on Blacklight adds a crucial layer: community-operated verification nodes. Here's how the process works:

1. Agent Registration: An agent registers its identity in the Identity Registry, receiving an ERC-721 NFT. This creates a unique on-chain identifier tied to the agent's public key.

2. Verification Request Initiation: When an agent performs an action requiring validation (e.g., executing a trade, transferring funds, or updating state), it submits a verification request to Blacklight.

3. Committee Assignment: Blacklight's protocol randomly assigns a committee of verification nodes to audit the request. These nodes are operated by community members who stake 70,000 NIL tokens, aligning incentives for network integrity.

4. Node Checks: Committee members re-execute the computation or validate cryptographic proofs. If validators detect incorrect behavior, they can slash the agent's stake (in systems using crypto-economic validation) or flag the identity in the Reputation Registry.

5. On-Chain Reporting: Results are posted on-chain. The Validation Registry records whether the agent's work was verified, creating permanent proof of execution. The Reputation Registry updates accordingly.

This process is asynchronous and non-blocking, meaning agents don't wait for verification to complete routine tasks—but high-stakes actions (large transfers, cross-chain operations) can require upfront validation.
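The steps above can be sketched in a few lines of Python. The committee-selection seed, the toy re-executed task, and the simple-majority rule are my own simplifying assumptions, not Nillion's actual protocol parameters:

```python
# Sketch of Blacklight-style committee verification (steps 2-5).
# Committee size, the task, and the vote rule are illustrative only.
import hashlib
import random

STAKE_NIL = 70_000  # per the article: NIL staked by each verification node

def assign_committee(nodes: list[str], request_id: str, k: int = 3) -> list[str]:
    # Step 3: deterministic pseudo-random committee seeded by the request
    # id, so any observer can recompute which nodes were assigned.
    seed = int.from_bytes(hashlib.sha256(request_id.encode()).digest()[:8], "big")
    return random.Random(seed).sample(nodes, k)

def verify(claimed_output: int, task_input: int, committee: list[str]) -> dict:
    # Step 4: each node independently re-executes the computation (here a
    # toy "double the input" task) and votes on the agent's claim.
    votes = {node: (task_input * 2 == claimed_output) for node in committee}
    verdict = sum(votes.values()) * 2 > len(votes)  # simple majority
    # Step 5: in the real system this verdict would be posted on-chain
    # to the Validation Registry, and the Reputation Registry updated.
    return {"votes": votes, "verified": verdict}

nodes = [f"node-{i}" for i in range(10)]
committee = assign_committee(nodes, "req-42")
print(verify(claimed_output=84, task_input=42, committee=committee))
```

Because committee assignment is derived from the request id, a dishonest agent cannot shop for a friendly committee, and a dishonest node's dissenting vote is visible to the rest.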

Programmable Audits: Beyond Binary Trust

Blacklight's most ambitious feature is "programmable verification"—the ability to audit how an agent makes decisions, not just what it does.

Consider a DeFi agent managing a treasury. Traditional audits verify that funds moved correctly. Programmable audits verify:

  • Decision-making logic consistency: Did the agent follow its stated investment strategy, or did it deviate?
  • Multi-step workflow execution: If the agent was supposed to rebalance portfolios across three chains, did it complete all steps?
  • Security constraints: Did the agent respect gas limits, slippage tolerances, and exposure caps?

This is possible because ERC-8004's Validation Registry supports arbitrary proof systems. An agent can commit to a decision-making algorithm on-chain (e.g., a hash of its neural network weights or a zk-SNARK circuit representing its logic), then prove each action conforms to that algorithm without revealing proprietary details.

Nillion's roadmap explicitly targets these use cases: "Nillion plans to expand Blacklight's capabilities to 'programmable verification,' enabling decentralized audits of complex behaviors such as agent decision-making logic consistency, multi-step workflow execution, and security constraints."

This shifts verification from reactive (catching errors after the fact) to proactive (enforcing correct behavior by design).
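A toy hash-commitment sketch shows the shape of the idea. A production system would use zkML proofs or TEE attestations so the strategy stays private during the audit; in this simplified version the auditor re-executes a revealed strategy, and all parameter names are hypothetical:

```python
# Toy sketch of committing to decision-making logic on-chain and
# auditing actions against it. Real deployments would prove conformance
# with zkML/TEEs instead of revealing the strategy as done here.
import hashlib
import json

def commit(strategy: dict) -> str:
    # The agent publishes only this hash on-chain; the parameters stay
    # private unless an audit requires them.
    return hashlib.sha256(json.dumps(strategy, sort_keys=True).encode()).hexdigest()

def decide(strategy: dict, price: float) -> str:
    # The agent's stated policy: buy below a floor, sell above a cap.
    if price < strategy["buy_below"]:
        return "buy"
    if price > strategy["sell_above"]:
        return "sell"
    return "hold"

def audit(commitment: str, revealed: dict, price: float, action: str) -> bool:
    # Two checks: the revealed strategy matches the on-chain commitment,
    # AND the recorded action is what that strategy dictates at the price.
    return commit(revealed) == commitment and decide(revealed, price) == action

strategy = {"buy_below": 1800.0, "sell_above": 2200.0}
c = commit(strategy)
print(audit(c, strategy, price=1750.0, action="buy"))   # consistent behavior
print(audit(c, strategy, price=1750.0, action="hold"))  # deviation is caught
```

The key property is that the commitment is made before any action, so an agent cannot retroactively invent a strategy that happens to justify whatever it did.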

Blind Computation: Privacy Meets Verification

Nillion's underlying technology—Nil Message Compute (NMC)—adds a privacy dimension to agent verification. Unlike traditional blockchains where all data is public, Nillion's "blind computation" enables operations on encrypted data without decryption.

Here's why this matters for agents: an AI agent might need to verify its trading strategy without revealing alpha to competitors. Or prove it accessed confidential medical records correctly without exposing patient data. Or demonstrate compliance with regulatory constraints without disclosing proprietary business logic.

Nillion's NMC achieves this through multi-party computation (MPC), where nodes collaboratively generate "blinding factors"—correlated randomness used to encrypt data. As DAIC Capital explains, "Nodes generate the key network resource needed to process data—a type of correlated randomness referred to as a blinding factor—with each node storing its share of the blinding factor securely, distributing trust across the network in a quantum-safe way."

This architecture is quantum-resistant by design. Even if a quantum computer breaks today's elliptic curve cryptography, distributed blinding factors remain secure because no single node possesses enough information to decrypt data.

For AI agents, this means verification doesn't require sacrificing confidentiality. An agent can prove it executed a task correctly while keeping its methods, data sources, and decision-making logic private.
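The core trick can be illustrated with additive secret sharing, the simplest form of such correlated randomness. Nillion's actual NMC protocol is more sophisticated; the modulus and sharing scheme below are purely illustrative:

```python
# Toy additive secret sharing of a blinding factor across MPC nodes.
# Illustrative only: real NMC uses structured correlated randomness,
# but the "no single node learns anything" property is the same.
import secrets

P = 2**61 - 1  # prime modulus for arithmetic shares (illustrative size)

def share_blinding_factor(n_nodes: int) -> tuple[int, list[int]]:
    # Each node holds one share; the shares sum to the factor r mod P.
    shares = [secrets.randbelow(P) for _ in range(n_nodes - 1)]
    r = secrets.randbelow(P)
    shares.append((r - sum(shares)) % P)
    return r, shares

def blind(value: int, r: int) -> int:
    # Nodes operate on the masked value, never on the plaintext.
    return (value + r) % P

r, shares = share_blinding_factor(5)
masked = blind(123_456, r)

# Any 4 of the 5 shares are uniformly random and reveal nothing about r;
# only the full set reconstructs it and unmasks the value.
assert sum(shares) % P == r
assert (masked - r) % P == 123_456
```

This is also why the quantum-resistance claim holds in this model: breaking the masking requires collecting every node's share, not breaking any public-key hardness assumption.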

The $4.3 Billion Agent Economy Infrastructure Play

Blacklight's launch comes as the blockchain-AI sector enters hypergrowth. The market is projected to grow from $680 million (2025) to $4.3 billion (2034) at a 22.9% CAGR, while the broader confidential computing market reaches $350 billion by 2032.

But Nillion isn't just betting on market expansion—it's positioning itself as critical infrastructure. The agent economy's bottleneck isn't compute or storage; it's trust at scale. As KuCoin's 2026 outlook notes, three key trends are reshaping AI identity and value flow:

Agent-Wrapping-Agent systems: Agents coordinating with other agents to execute complex multi-step tasks. This requires standardized identity and verification—exactly what ERC-8004 provides.

KYA (Know Your Agent): Financial infrastructure demanding agent credentials. Regulators won't approve autonomous agents managing funds without proof of correct behavior. Blacklight's programmable audits directly address this.

Nano-payments: Agents need to settle micropayments efficiently. The x402 payment protocol, which processed over 20 million transactions in January 2026, complements ERC-8004 by handling settlement while Blacklight handles trust.

Together, these standards reached production readiness within weeks of each other—a coordination breakthrough signaling infrastructure maturation.

Ethereum's Agent-First Future

ERC-8004's adoption extends far beyond Nillion. As of early 2026, multiple projects have integrated the standard:

  • Oasis Network: Implementing ERC-8004 for confidential computing with TEE-based validation
  • The Graph: Supporting ERC-8004 and x402 to enable verifiable agent interactions in decentralized indexing
  • MetaMask: Exploring agent wallets with built-in ERC-8004 identity
  • Coinbase: Integrating ERC-8004 for institutional agent custody solutions

This rapid adoption reflects a broader shift in Ethereum's roadmap. Vitalik Buterin has repeatedly emphasized that blockchain's role is becoming "just the plumbing" for AI agents—not the consumer-facing layer, but the trust infrastructure enabling autonomous coordination.

Nillion's Blacklight accelerates this vision by making verification programmable, privacy-preserving, and decentralized. Instead of relying on centralized oracles or human reviewers, agents can prove their correctness cryptographically.

What Comes Next: Mainnet Integration and Ecosystem Expansion

Nillion's 2026 roadmap prioritizes Ethereum compatibility and sustainable decentralization. The Ethereum bridge went live in February 2026, followed by native smart contracts for staking and private computation.

Community members staking 70,000 NIL tokens can operate Blacklight verification nodes, earning rewards while maintaining network integrity. This design mirrors Ethereum's validator economics but adds a verification-specific role.

The next milestones include:

  • Expanded zkML support: Integrating with projects like Modulus Labs to verify AI inference on-chain
  • Cross-chain verification: Enabling Blacklight to verify agents operating across Ethereum, Cosmos, and Solana
  • Institutional partnerships: Collaborations with Coinbase and Alibaba Cloud for enterprise agent deployment
  • Regulatory compliance tools: Building KYA frameworks for financial services adoption

Perhaps most importantly, Nillion is developing nilGPT—a fully private AI chatbot demonstrating how blind computation enables confidential agent interactions. This isn't just a demo; it's a blueprint for agents handling sensitive data in healthcare, finance, and government.

The Trustless Coordination Endgame

Blacklight's launch marks a pivot point for the agent economy. Before ERC-8004, agents operated in silos—trusted within their own ecosystems but unable to coordinate across platforms without human intermediaries. After ERC-8004, agents can verify each other's identity, audit each other's behavior, and settle payments autonomously.

This unlocks entirely new categories of applications:

  • Decentralized hedge funds: Agents managing portfolios across chains, with verifiable investment strategies and transparent performance audits
  • Autonomous supply chains: Agents coordinating logistics, payments, and compliance without centralized oversight
  • AI-powered DAOs: Organizations governed by agents that vote, propose, and execute based on cryptographically verified decision-making logic
  • Cross-protocol liquidity management: Agents rebalancing assets across DeFi protocols with programmable risk constraints

The common thread? All require trustless coordination—the ability for agents to work together without pre-existing relationships or centralized trust anchors.

Nillion's Blacklight provides exactly that. By combining ERC-8004's identity and reputation infrastructure with programmable verification and blind computation, it creates a trust layer scalable enough for the trillion-agent economy on the horizon.

As blockchain becomes the plumbing for AI agents and global finance, the question isn't whether we need verification infrastructure—it's who builds it, and whether it's decentralized or controlled by a few gatekeepers. Blacklight's community-operated nodes and open standard make the case for the former.

The age of autonomous on-chain actors is here. The infrastructure is live. The only question left is what gets built on top.


Sources: