
60 posts tagged with "Decentralized Computing"

Decentralized computing and cloud


DePIN's Revenue Reckoning: How Akash, io.net, and Aethir Are Replacing Token Mining with Real Business Cash Flow

· 9 min read
Dora Noda
Software Engineer

Aethir quietly crossed $127 million in annual revenue in 2025. Not in token emissions. Not in speculative incentive programs. In actual enterprise spending on GPU compute. That single data point may mark the moment decentralized compute stopped being a crypto experiment and started becoming a cloud business.

For years, the knock against Decentralized Physical Infrastructure Networks (DePIN) was simple: their economics ran on token printing, not customer invoices. Providers earned rewards denominated in volatile native tokens, demand was often synthetic, and the gap between "network activity" and "revenue" could be measured in orders of magnitude. But across 2025 and into early 2026, the leading GPU compute networks — Akash, io.net, Aethir, and Render — have been executing a pivot that the broader market hasn't fully priced in: the shift from token-subsidized supply to demand-driven cash flow.

ASI Alliance Chain Launch: The $2B Decentralized AI Mega-Merger Goes Live

· 8 min read
Dora Noda
Software Engineer

When four of crypto's most ambitious AI projects — Fetch.ai, SingularityNET, Ocean Protocol, and CUDOS — merged into a single entity in 2024, skeptics dismissed it as token consolidation theater. Two years later, the Artificial Superintelligence (ASI) Alliance is shipping production infrastructure that challenges the centralized AI establishment at its core: a purpose-built Layer-1 blockchain, enterprise-grade GPU inference at half the cost of AWS, and an AGI programming framework that treats autonomous agents as first-class citizens.

With ASI:Chain's DevNet live, ASI:Cloud processing real workloads, and NVIDIA GPU allocations sold out through 2026, the Alliance's bet on decentralized AI infrastructure is looking less like idealism and more like inevitability.

The DAO Governance Crisis: Why 12,000 Organizations Managing $28 Billion Are Quietly Breaking Down

· 8 min read
Dora Noda
Software Engineer

One percent of token holders control ninety percent of voting power across major DAOs. Over 12,000 decentralized autonomous organizations now manage roughly $28 billion in treasury assets — yet average voter turnout hovers around 20%, and in many cases, fewer than one in ten eligible participants actually cast a vote. What was supposed to be the most democratic form of organizational governance is starting to look like its most dysfunctional.

In early 2026, several high-profile DAOs effectively admitted defeat. Jupiter DAO froze all governance voting and locked its treasury until 2027. Scroll DAO paused operations entirely after its leadership resigned in confusion over which proposals were even active. Yuga Labs walked away from its DAO structure with a blunt statement about dysfunction. These aren't fringe experiments — they represent some of the most well-funded projects in crypto.

The question is no longer whether DAO governance has a problem. It's whether the model can be saved.

AI Agents as Primary Blockchain Users: The Invisible Revolution of 2026

· 14 min read
Dora Noda
Software Engineer

"In a few years, it's going to be just AI, like the operating system," declared Illia Polosukhin, co-founder of NEAR Protocol, in a statement that crystallizes the most profound shift happening in blockchain technology today. His prediction is simple yet transformative: AI agents will become the primary users of blockchain, not humans.

This isn't a distant science fiction scenario. It's happening right now, in March 2026, as billions of transactions are being executed by autonomous AI agents across dozens of blockchains. While human users still dominate headline statistics, the infrastructure being built today reveals a future where blockchain becomes the invisible backend to AI-driven interactions.

The Paradigm Shift: From Human-Centric to Agent-Centric Blockchain

Polosukhin's vision articulates what many infrastructure builders already know: "AI is going to be on the front-end, and blockchain is going to be the back-end." This reversal of roles transforms blockchain from a direct user interface to a coordination layer for autonomous systems.

The numbers support this trajectory. By the end of 2026, 40% of enterprise applications are expected to embed task-specific AI agents, up from less than 5% in 2025. Meanwhile, prediction markets like Polymarket already see AI agents contributing 30% or more of trading volume, demonstrating that autonomous systems are not just theoretical—they're active market participants.

NEAR's February 2026 launch of Near.com exemplifies this shift. The super app positions itself at the intersection of crypto and AI, described by Polosukhin as part of the "agentic era," where AI systems don't just provide answers, but take action on behalf of users.

The Infrastructure Enabling Autonomous Agents

The emergence of AI agents as primary blockchain users required fundamental infrastructure breakthroughs across wallets, execution layers, and payment protocols.

Agentic Wallets: Financial Autonomy for AI

In February 2026, Coinbase launched Agentic Wallets, the first wallet infrastructure designed specifically for AI agents. These wallets allow AI systems to hold funds and execute on-chain transactions independently within defined limits, giving agents the power to spend, earn, and trade autonomously while maintaining enterprise-grade security.

The security architecture is critical. Agentic Wallets include programmable guardrails that allow users to set session caps and transaction limits, defining how much an AI agent can spend and under what circumstances. Additional controls include operation allowlists, anomaly detection, real-time alerts, multi-party approvals, and detailed audit logs, all configurable via API.
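The guardrail logic described above can be sketched as a simple policy object. This is an illustrative toy, not Coinbase's actual API: the class, field names, and thresholds are all invented for this example.

```python
from dataclasses import dataclass, field

@dataclass
class GuardrailPolicy:
    """Hypothetical spend policy for one agent wallet session."""
    session_cap: float                          # max total spend per session (USD)
    per_tx_limit: float                         # max spend per transaction (USD)
    allowed_ops: set = field(default_factory=set)  # operation allowlist
    spent: float = 0.0                          # running total for this session

    def authorize(self, op: str, amount: float) -> bool:
        """Approve only allowlisted operations that fit both limits."""
        if op not in self.allowed_ops:
            return False
        if amount > self.per_tx_limit:
            return False
        if self.spent + amount > self.session_cap:
            return False
        self.spent += amount
        return True

policy = GuardrailPolicy(session_cap=100.0, per_tx_limit=25.0,
                         allowed_ops={"swap", "transfer"})
assert policy.authorize("swap", 20.0)        # within both limits
assert not policy.authorize("swap", 30.0)    # exceeds per-transaction limit
assert not policy.authorize("stake", 5.0)    # operation not allowlisted
```

Real deployments layer anomaly detection, alerts, and multi-party approvals on top of checks like these; the core idea is that every agent action passes through a deterministic policy gate before it touches funds.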

OKX followed suit in early March 2026 with an AI-focused upgrade to its OnchainOS developer platform, positioning it as infrastructure for autonomous crypto trading agents. The platform provides unified wallet infrastructure, liquidity routing, and on-chain data feeds enabling agents to execute high-level trading instructions across more than 60 blockchains and 500-plus decentralized exchanges. The system already handles 1.2 billion daily API calls and about $300 million in trading volume.

Circle's integration of blockchain infrastructure for AI agents emphasizes stablecoin-based autonomous payments, while the x402 protocol has been battle-tested with over 50 million transactions, enabling machine-to-machine payments, API paywalls, and programmatic resource access without human intervention.

Natural Language Intent-Based Execution

Perhaps the most transformative development is the integration of natural language processing with blockchain execution. As of 2026, most major crypto wallets are introducing natural language, intent-based transaction execution: a user can say "maximize my yield across Aave, Compound, and Morpho" and their agent will execute the strategy autonomously.

This shift from explicit transaction signing to declarative intent represents a fundamental change in blockchain interaction patterns. A Transaction Intent is a high-level, declarative representation of a user's desired outcome (the "what"), which is compiled into one or more concrete, chain-specific transactions (the "how").

The AI agent layer performs several critical functions: natural language understanding to parse user intent, context maintenance for conversational continuity, planning and reasoning to decompose complex tasks into executable steps, safety validation to prevent harmful or unintended actions, and tool orchestration to coordinate interactions with external systems.

AI agents parse natural language instructions such as "Swap 1 ETH for USDC on Uniswap," transforming them into structured operations that interact with smart contracts. By integrating agents with intent-centric systems, we ensure users fully control their data and assets, while generalized intents enable agents to solve any user request, including complicated multi-step operations and cross-chain transactions.
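As a toy illustration of that parsing step, a rule-based matcher can turn one fixed intent shape into a structured operation. Production systems use LLMs for natural language understanding; the pattern and output schema below are invented purely for illustration.

```python
import re
from typing import Optional

def parse_swap_intent(text: str) -> Optional[dict]:
    """Tiny rule-based parser for one intent shape:
    'Swap <amount> <tokenA> for <tokenB> on <venue>'."""
    pattern = r"swap\s+([\d.]+)\s+(\w+)\s+for\s+(\w+)\s+on\s+(\w+)"
    m = re.match(pattern, text.strip(), re.IGNORECASE)
    if not m:
        return None
    amount, token_in, token_out, venue = m.groups()
    return {
        "action": "swap",
        "amount": float(amount),
        "token_in": token_in.upper(),
        "token_out": token_out.upper(),
        "venue": venue,
    }

intent = parse_swap_intent("Swap 1 ETH for USDC on Uniswap")
# → {'action': 'swap', 'amount': 1.0, 'token_in': 'ETH',
#    'token_out': 'USDC', 'venue': 'Uniswap'}
```

The structured operation is the hand-off point: everything downstream (routing, signing, settlement) works from this machine-readable form, not from the user's sentence.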

Real-World Applications Already Live

The applications enabled by these infrastructure advances are already generating measurable economic activity.

Autonomous DeFi applications allow agents to monitor yields across protocols, execute trades on Base, and manage liquidity positions 24/7. Agents rebalance automatically when they detect better yield opportunities, with no human approval required. With programmable safeguards in place, AI agents monitor DeFi yields, rebalance portfolios, pay for APIs or computing resources, and participate in digital economies without direct human confirmation.

This represents a significant shift toward AI agents becoming active financial participants in blockchain ecosystems rather than just advisory tools.

The Infrastructure Gap: Challenges Ahead

Despite rapid progress, significant infrastructure gaps remain between AI capabilities and blockchain tooling requirements.

Scalability and Performance Bottlenecks

AI workloads are computationally heavy, while blockchain networks are throughput-constrained. Integrating AI agents with blockchains therefore runs into significant scalability and performance limits: the computational overhead of consensus mechanisms and the latency of transaction validation both impede real-time operation.

AI decisions require fast responses, but public blockchains may introduce delays, and on-chain computation can be expensive. This tension has led to hybrid architectures in which heavy computation occurs off-chain while verification and settlement occur on-chain: "offchain service" designs let agents run heavy machine learning models off-chain and then verify the results on-chain.
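The off-chain compute / on-chain verify pattern can be sketched with a simple hash commitment. The "model" here is a placeholder computation, and a real system would anchor the commitment in a smart contract rather than a Python function; the shape of the flow is what matters.

```python
import hashlib
import json

def run_model_offchain(inputs: dict) -> dict:
    """Stand-in for a heavy ML inference run performed off-chain."""
    score = sum(inputs.values()) / len(inputs)   # placeholder computation
    return {"score": score}

def commit(result: dict) -> str:
    """Hash the result so only 32 bytes need to be anchored on-chain."""
    payload = json.dumps(result, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def verify_onchain(result: dict, committed_hash: str) -> bool:
    """Contract-side check: recompute the hash, compare to the commitment."""
    return commit(result) == committed_hash

result = run_model_offchain({"signal_a": 0.4, "signal_b": 0.8})
h = commit(result)                  # this digest is what gets posted on-chain
assert verify_onchain(result, h)    # anyone can cheaply verify the claim
```

The expensive work never touches the chain; the chain only stores and checks a digest, which keeps verification costs constant no matter how heavy the model is.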

Tooling and Interface Standards

Researchers have organized the most consequential gaps into a 2026 roadmap prioritizing missing interface layers, verifiable policy enforcement, and reproducible evaluation practices. The roadmap centers on two interface abstractions: a Transaction Intent Schema for portable goal specification, and a Policy Decision Record for auditable policy enforcement.

Privacy and Security Challenges

A key challenge is balancing transparency with privacy. Developing advanced privacy-preserving mechanisms suited for natural language interactions is essential, along with establishing secure on-chain and off-chain data transfer protocols.

Ethereum implemented EIP-7702 to address security concerns, allowing a standard account to serve as a smart contract for a single transaction where a human user grants temporary, highly restricted permission to an AI agent.

Payment Infrastructure at Scale

AI agents require payment infrastructure that traditional processors cannot provide. When a single agent conversation triggers hundreds of micro-activities with sub-cent costs, legacy systems become economically unviable.

Blockchain throughput has increased more than 100x in five years, from roughly 25 transactions per second to 3,400 TPS as of late 2025. Transaction costs on Ethereum L2s have dropped from $24 to under one cent, making the high-frequency micropayments and autonomous transactions that AI agents depend on economically feasible.

Stablecoin transaction volume reached $46 trillion annually, up 106% year-over-year, while adjusted transaction volume (filtering out automated trading) reached $9 trillion, representing 87% year-over-year growth.

The Economic Magnitude of the Shift

The scale of this transformation is staggering when you examine forward-looking projections.

Gartner estimates that AI "machine customers" could influence or control up to $30 trillion in annual purchases by 2030, while McKinsey research suggests agentic commerce could generate $3 to $5 trillion globally by 2030.

Looking at specific blockchain use cases, consumer behavior varies significantly: 70% of consumers are willing to let AI agents book flights independently, and 65% trust them for hotel selections. Additionally, 81% of US consumers expect to use agentic AI for shopping, shaping over half of all online purchases.

However, the current reality is more cautious. Only 24% of consumers trust AI to make routine purchases on their behalf, suggesting that B2B adoption rather than consumer-facing use will drive early transaction volumes.

The enterprise trajectory supports this assessment. It's projected that by late 2026, 60% of crypto wallets will use agentic AI to manage portfolios, track transactions, and improve security.

Why Blockchain Is the Perfect Backend for AI Agents

The convergence of AI and blockchain isn't accidental—it's architecturally necessary for autonomous agent economies.

Blockchain provides three critical capabilities that AI agents require:

  1. Trustless Coordination: Advances in large language models have enabled agentic AI systems that can reason, plan, and interact with external tools to execute multi-step workflows, while public blockchains have evolved into a programmable substrate for value transfer, access control, and verifiable state transitions. When agents from different providers need to transact, blockchain provides neutral settlement infrastructure.

  2. Verifiable State: AI agents need to verify the state of assets, permissions, and commitments without trusting centralized intermediaries. Blockchain's transparency enables this verification at scale.

  3. Programmable Money: Autonomous agents require programmable payment rails that can execute conditional logic, time-locks, and multi-party settlements—exactly what smart contracts provide.
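The third capability above, conditional logic plus time-locks plus multi-party settlement, can be modeled in a few lines. This is plain Python standing in for a smart-contract language, with invented names, purely to make the mechanics concrete.

```python
class TimeLockedEscrow:
    """Toy model of a contract escrow: funds release only after a
    time-lock expires AND every required party has approved."""

    def __init__(self, amount: float, unlock_at: float, approvers: set):
        self.amount = amount
        self.unlock_at = unlock_at          # e.g. a block timestamp
        self.required = set(approvers)
        self.approvals = set()
        self.released = False

    def approve(self, who: str):
        """Record an approval from one of the required parties."""
        if who in self.required:
            self.approvals.add(who)

    def release(self, now: float) -> bool:
        """Conditional release: both the time condition and the
        multi-party condition must hold."""
        if now >= self.unlock_at and self.approvals == self.required:
            self.released = True
        return self.released

escrow = TimeLockedEscrow(50.0, unlock_at=1_000, approvers={"agent", "operator"})
escrow.approve("agent")
assert not escrow.release(now=2_000)   # operator has not approved yet
escrow.approve("operator")
assert escrow.release(now=2_000)       # lock expired and both approved
assert not TimeLockedEscrow(1.0, 1_000, {"a"}).release(now=500)  # too early
```

On a real chain this logic lives in a contract and `now` comes from block time, but the point stands: an agent can rely on these conditions being enforced by the settlement layer itself, not by a counterparty's goodwill.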

This architecture explains why Polosukhin frames AI as the frontend and blockchain as the backend. Users interact with intelligent interfaces that understand natural language and user goals, while blockchain handles the coordination, settlement, and verification layer invisibly.

The Existential Questions for 2026 and Beyond

The rapid advancement of AI agent infrastructure raises profound questions about the future direction of this convergence.

By late 2026, we'll know whether crypto AI converges with mainstream AI as essential plumbing or diverges as a parallel ecosystem, which will determine whether autonomous agent economies become a trillion-dollar market or remain an ambitious experiment.

Capital constraints, scalability gaps, and regulatory uncertainty threaten to relegate crypto AI to niche use cases. The challenge is whether blockchain infrastructure can scale fast enough to match the exponential growth in AI capabilities.

Regulatory frameworks remain undefined. How will governments treat autonomous agents with financial autonomy? What liability structures apply when an AI agent makes a harmful transaction? These questions lack clear answers in March 2026.

Building for the Agent Economy

For developers and infrastructure providers, the implications are clear: the next generation of blockchain infrastructure must be designed for autonomous agents first, humans second.

This means:

  • Intent-first interfaces that accept natural language or high-level goals rather than explicit transaction parameters
  • Hybrid architectures that balance on-chain verification with off-chain computation
  • Privacy-preserving mechanisms that enable agents to transact without exposing sensitive business logic
  • Interoperability standards that allow agents to coordinate across chains and protocols seamlessly

The 282 crypto×AI projects funded in 2025 with $4.3 billion in valuations represent early bets on this infrastructure layer. The survivors will be those that solve the practical challenges of scalability, privacy, and interoperability.

For developers building AI agent applications that require reliable, high-performance blockchain infrastructure, BlockEden.xyz provides enterprise-grade API access across NEAR, Ethereum, Solana, and 10+ chains—enabling the multi-chain coordination that autonomous agents demand.

Conclusion: The Invisible Future

Polosukhin's prediction that "blockchain is going to be the back-end" suggests a future where blockchain technology becomes so ubiquitous that it disappears from conscious awareness—much like TCP/IP protocols underpin the internet without users thinking about packet routing.

This is the ultimate success metric for blockchain: not mass adoption through direct user interfaces, but invisibility as the coordination layer for autonomous AI systems.

The infrastructure being built in 2026 is not for today's crypto users who manually sign transactions and monitor gas prices. It's for tomorrow's AI agents that will execute billions of transactions daily, coordinating economic activity across chains, protocols, and jurisdictions without human intervention.

The question is not whether AI agents will become primary blockchain users. They already are in specific verticals like prediction markets and DeFi yield optimization. The question is how fast the infrastructure can scale to support the next three orders of magnitude of growth.

As enterprise applications embed AI agents at exponential rates and blockchain throughput continues its 100x trajectory, 2026 marks the inflection point where the agent economy transitions from experiment to infrastructure.

Polosukhin's vision is becoming reality: AI on the front end, blockchain on the back end, and humans enjoying the benefits without seeing the complexity underneath.


DEX Perpetuals Hit 10.2% Market Share: Inside the 800% Volume Surge Reshaping Crypto Derivatives

· 7 min read
Dora Noda
Software Engineer

When silver prices surged past $120 per ounce during January 2026's geopolitical turmoil, something remarkable happened: over $1.25 billion in silver perpetual futures traded on Hyperliquid in a single day—not on the CME, not on Binance, but on a decentralized exchange that did not exist three years ago. This was not an anomaly. It was a signal that the $80 trillion derivatives market is undergoing a structural transformation.

OKX OnchainOS AI Toolkit: When Exchanges Become Agent Operating Systems

· 12 min read
Dora Noda
Software Engineer

On March 3, 2026, while most exchanges were still figuring out how to add chatbots to customer support, OKX launched something fundamentally different: an entire operating system for autonomous AI agents. The OnchainOS AI Toolkit isn't about making trading faster for humans—it's about making it possible for machines.

With infrastructure already processing 1.2 billion daily API calls and $300 million in trading volume, OKX just transformed from an exchange into what might be the most ambitious bet on the agent economy. The question isn't whether AI agents will trade crypto autonomously. It's which infrastructure will dominate when they do.

The Agent-First Exchange Architecture

Traditional crypto exchanges optimize for human decision-making: charts, order books, buttons. OKX's OnchainOS flips this entirely. Instead of humans clicking through interfaces, AI agents issue natural language commands that execute across 60+ blockchains and 500+ DEXs simultaneously.

This architectural shift mirrors a broader industry transformation. Coinbase announced Agentic Wallets on February 11, 2026, with the x402 protocol for autonomous spending. Binance's CZ promised a "Binance-level brain" for AI agents. Even Bitget is retrofitting non-custodial wallets with autonomous decision-making.

But OKX's approach is distinctly infrastructure-focused. Rather than building agent personalities or trading strategies, they've created the operating system layer—unifying wallet functionality, liquidity routing, and market data into a single framework that any AI model can access.

Three Paths to Agent Integration

OnchainOS offers developers three integration methods, each targeting different use cases:

AI Skills provide natural language interfaces where agents can say "swap 100 USDC to ETH on the best available DEX" without knowing how routing works. For developers building conversational agents or customer-facing bots, this removes API complexity entirely.

Model Context Protocol (MCP) integration means OnchainOS plugs directly into LLM frameworks like Claude, Cursor, and OpenClaw. An AI coding assistant can now autonomously interact with blockchain state, execute trades, and verify on-chain data as part of its normal reasoning loop—no custom integration required.

REST APIs give scripted control for traditional developers building programmatic strategies. While less innovative than natural language commands, this ensures backward compatibility with existing trading infrastructure and allows gradual migration to agent-based systems.

The practical implication: whether you're building a fully autonomous trading bot, enhancing an existing AI assistant with crypto capabilities, or just want API access with intelligent routing, OnchainOS provides the appropriate abstraction layer.

The Economics of Agent Infrastructure

The numbers reveal production-scale deployment, not a pilot program. Processing 1.2 billion API calls daily with sub-100ms response times and 99.9% uptime requires infrastructure that most exchanges couldn't replicate overnight.

OKX's liquidity aggregation across 500+ DEXs creates economic advantages for agents that humans can't match manually. When an agent needs to execute a large swap, the system automatically:

  1. Queries real-time pricing across hundreds of liquidity pools
  2. Calculates optimal routing to minimize slippage
  3. Splits orders across multiple DEXs if needed
  4. Executes transactions in parallel across chains
  5. Verifies settlement and updates agent state

All of this happens in milliseconds. For human traders, this level of cross-DEX optimization requires running multiple interfaces simultaneously, manually comparing rates, and accepting that by the time you've checked five options, prices have moved.
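The routing loop above can be sketched with a constant-product AMM model and a greedy chunk-by-chunk split. This is an illustrative toy, not OKX's actual router: it ignores fees and gas, and the pool figures are made up.

```python
def swap_out(x_reserve: float, y_reserve: float, dx: float) -> float:
    """Constant-product AMM output for input dx (fees ignored)."""
    return y_reserve * dx / (x_reserve + dx)

def split_order(pools: list, total_in: float, chunks: int = 100) -> list:
    """Greedily route small chunks to whichever pool currently quotes
    the best marginal price, updating reserves after each fill."""
    pools = [dict(p) for p in pools]          # local copies we can mutate
    fills = [0.0] * len(pools)
    dx = total_in / chunks
    for _ in range(chunks):
        # pick the pool with the best output for this chunk
        best = max(range(len(pools)),
                   key=lambda i: swap_out(pools[i]["x"], pools[i]["y"], dx))
        out = swap_out(pools[best]["x"], pools[best]["y"], dx)
        pools[best]["x"] += dx                # reserves shift after the fill
        pools[best]["y"] -= out
        fills[best] += dx
    return fills

# A shallow pool and a 10x deeper pool at the same spot price.
pools = [{"x": 1_000.0, "y": 1_000.0}, {"x": 10_000.0, "y": 10_000.0}]
fills = split_order(pools, total_in=500.0)
assert fills[1] > fills[0]   # the deeper pool absorbs most of the order
```

The greedy split naturally equalizes marginal prices across venues, which is why cross-DEX routing beats dumping the whole order into any single pool.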

The $300 million daily trading volume processed through OnchainOS suggests meaningful early adoption. More tellingly, that volume runs through infrastructure supporting over 12 million monthly wallet users—meaning the agent layer sits on top of battle-tested systems handling real user funds.

Unified Wallet Infrastructure vs Specialized Agent Wallets

Coinbase's Agentic Wallets take a purpose-built approach: wallets designed specifically for autonomous spending with security guardrails baked in. OKX went the opposite direction: integrate agent capabilities into existing wallet infrastructure that already supports 60+ chains.

The trade-offs are architectural. Purpose-built agent wallets can optimize for autonomous operation from the start—built-in spending limits, risk parameters, and recovery mechanisms designed for machines making decisions without human oversight. Unified infrastructure inherits complexity from supporting diverse chains and use cases but offers broader reach and battle-tested security.

OKX's bet is that agents will need access to the full crypto ecosystem, not a sandboxed environment. If an autonomous agent is managing a DAO's treasury, arbitraging across chains, or rebalancing a portfolio dynamically, it needs native access to wherever liquidity lives—not a specialized wallet that only works on three chains.

The market hasn't decided which approach wins. What's clear is that both OKX and Coinbase recognize the same shift: autonomous agents need infrastructure designed for them, not retrofitted human tools.

On-Chain Data Feeds: The Agent Information Layer

Trading decisions require data. For AI agents, OnchainOS provides real-time feeds covering tokens, transfers, trades, and account states across all supported networks.

This solves a problem that anyone building multi-chain applications knows intimately: querying blockchain state from dozens of networks is slow, requires running infrastructure for each chain, and introduces failure points when nodes go down or lag behind.

OnchainOS abstracts this entirely. An agent queries "get all recent trades for token X across networks Y and Z" and receives normalized, real-time data without knowing which RPC endpoints to call or how different chains structure transaction logs.

The competitive edge isn't just convenience. Agents making sub-second trading decisions need data latency measured in milliseconds. Running your own nodes for 60 blockchains to achieve similar performance requires infrastructure investment that most developers can't justify. Cloud RPC providers add latency and costs that kill the economics of high-frequency agent strategies.

By unifying data feeds as part of the platform, OKX turns infrastructure costs into a distributed shared resource—making sophisticated agent strategies accessible to independent developers, not just well-funded firms.
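A minimal sketch of that normalization layer follows, with invented adapter classes standing in for per-chain clients; a real adapter would call RPC endpoints and decode logs in each chain's own format, but the agent only ever sees one record shape.

```python
class EthAdapter:
    """Stand-in for an Ethereum client returning chain-native records."""
    def recent_trades(self, token):
        return [{"sym": token, "qty": 2.0, "px": 3100.0}]

class SolAdapter:
    """Stand-in for a Solana client with its own record layout."""
    def recent_trades(self, token):
        return [{"sym": token, "qty": 5.0, "px": 3099.5}]

def trades_across_chains(adapters: dict, token: str) -> list:
    """Fan out to every chain and return one normalized record shape,
    so the agent never deals with per-chain formats."""
    out = []
    for chain, adapter in adapters.items():
        for t in adapter.recent_trades(token):
            out.append({"chain": chain, "token": t["sym"],
                        "qty": t["qty"], "price": t["px"]})
    return out

feed = trades_across_chains(
    {"ethereum": EthAdapter(), "solana": SolAdapter()}, "ETH")
assert {t["chain"] for t in feed} == {"ethereum", "solana"}
```

Adding a chain means adding an adapter; agent code querying the feed never changes, which is the whole value of the abstraction.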

The x402 Protocol and Zero-Gas Execution

Autonomous payments run on the x402 pay-per-use protocol, which addresses a fundamental agent economy problem: how do machines pay each other without manual intervention?

When an AI agent needs to access a paid API, purchase data, or compensate another agent for services, x402 enables automatic settlement. Combined with zero-gas transactions on OKX's X Layer, agents can make micropayments economically—something impossible when each payment costs more in gas than the service itself.

This matters more as agent-to-agent interactions increase. A single high-level agent task might involve:

  • Querying market data from a specialized analytics agent
  • Calling a sentiment analysis API agent
  • Purchasing on-chain position data
  • Executing trades through a routing agent
  • Verifying results through an oracle agent

If each step requires manual approval or gas costs that exceed the value transferred, the agent economy never scales beyond human-supervised operations. x402 and zero-gas execution remove these friction points.
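The pay-per-use loop can be sketched as an HTTP 402 retry pattern, which is the idea x402 builds on. The header and field names below are illustrative, not the exact x402 wire format, and the server and payment functions are stand-ins.

```python
def call_paid_api(request, pay):
    """Agent-side loop: request the resource; if the server answers
    402 with payment terms, settle and retry with proof attached."""
    resp = request(headers={})
    if resp["status"] == 402:
        terms = resp["payment_terms"]            # amount + pay-to address
        receipt = pay(terms["to"], terms["amount"])
        resp = request(headers={"X-Payment": receipt})
    return resp

# Toy server: demands a payment proof, then serves the data.
def fake_server(headers):
    if "X-Payment" not in headers:
        return {"status": 402,
                "payment_terms": {"to": "0xabc", "amount": 0.001}}
    return {"status": 200, "body": "premium data"}

def fake_pay(to, amount):
    """Stand-in for on-chain settlement; returns a payment proof."""
    return f"receipt:{to}:{amount}"

resp = call_paid_api(fake_server, fake_pay)
assert resp["status"] == 200
```

No human ever sees the 402; the agent settles sub-cent amounts inline, which is exactly the step that is uneconomical when each payment carries gas costs larger than the service itself.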

Market Context: The $50 Billion Agent Economy

OnchainOS arrives as the AI-crypto convergence accelerates. The blockchain AI market is projected to grow from $6 billion in 2024 to $50 billion by 2030. More immediately, 282 crypto × AI projects secured venture funding in 2025, with 2026 showing strong momentum.

Virtuals Protocol reports 23,514 active wallets generating $479 million in AI-generated GDP (aGDP) as of February 2026. These aren't theoretical metrics—they represent agents actively managing value, executing trades, and participating in on-chain economies.

Transaction infrastructure has fundamentally improved. Blockchain throughput increased more than 100x in five years, from roughly 25 TPS to 3,400 TPS. Ethereum L2 transaction costs dropped from $24 to under one cent. High-frequency agent strategies that were economically impossible in 2023 are now routine.

Stablecoins processed $46 trillion in volume last year ($9 trillion adjusted), with projections showing AI "machine customers" controlling up to $30 trillion in annual purchases by 2030. When machines become primary transactors, they need infrastructure optimized for autonomous operation.

Developer Adoption Signals

OnchainOS launched with comprehensive documentation and starter guides, targeting builders deploying their first AI agents. The Model Context Protocol integration is particularly strategic—by plugging into frameworks developers already use (Claude, Cursor), OKX removes the "learn a new platform" barrier.

For developers already building trading bots or automation scripts, the REST API provides migration paths. For AI researchers experimenting with autonomous agents, natural language Skills offer the fastest path to on-chain capabilities.

What OKX hasn't provided: proprietary agent personalities, pre-built trading strategies, or "click here for autonomous trading" consumer products. This is infrastructure, not an end-user application. The bet is that thousands of developers building specialized agents will create more value than OKX could by building a single agent trading product.

This mirrors successful platform strategies in other markets. AWS didn't try to build every application—they provided compute, storage, and networking primitives that millions of developers used to build diverse applications. OnchainOS positions OKX as the AWS of agent infrastructure.

Competitive Dynamics and Market Evolution

The exchange industry is bifurcating. Traditional exchanges optimize for retail traders clicking buttons and institutions running regulated operations. Agent-first exchanges optimize for autonomous systems executing programmatic strategies across fragmented liquidity.

Coinbase's approach emphasizes purpose-built agent wallets with regulatory compliance considerations. OKX emphasizes breadth—60+ chains, 500+ DEXs, massive existing user base. Binance promises AI but hasn't shipped infrastructure. Smaller exchanges lack the resources to compete on infrastructure at this scale.

Network effects favor early movers. If OnchainOS becomes the standard way developers build trading agents, liquidity concentrates there because that's where the agents are. More liquidity attracts more agents. This is the same dynamic that made Ethereum the default smart contract platform despite technical limitations—developers were already there.

But it's early. Coinbase has regulatory relationships and institutional trust that matter for compliant agent deployment. Decentralized protocols might offer agent infrastructure without exchange dependency. The market could fragment by use case: Coinbase for institutional agents, OKX for DeFi-native operations, Solana's ecosystem for high-frequency strategies.

What "Agent-First" Really Means

The OnchainOS launch clarifies what "agent-first" infrastructure actually requires:

Natural language interfaces so non-specialist developers can build agents without learning complex blockchain APIs.

Unified cross-chain access because agents don't care about chain tribalism—they optimize for execution quality wherever liquidity exists.

Real-time data aggregation packaged as queryable feeds rather than requiring infrastructure operations.

Autonomous payment rails that let agents transact with each other economically.

Production-scale infrastructure with millisecond latency and high uptime because agents making autonomous decisions can't wait for slow API responses.

What's notable is what's missing: OKX didn't build AI models, train specialized trading agents, or create consumer-facing "autonomous trading" products. They built the layer beneath all of that.

This suggests confidence that the agent economy will be diverse—many specialized agents built by different developers for different strategies, not a few dominant trading bots. If you believe in that future, infrastructure positioning makes strategic sense.

Open Questions and Risk Factors

Several uncertainties remain. Regulatory treatment of autonomous trading systems is unresolved. When an agent executes trades violating market manipulation rules, who's liable—the developer, the exchange, the model provider?

Security risks scale differently. A bug in human-facing trading interfaces affects users who click compromised buttons. A bug in agent APIs could trigger cascading autonomous failures across thousands of agents simultaneously.

Centralization concerns persist. OnchainOS is infrastructure controlled by OKX. If agents depend on this platform for critical functionality, OKX gains enormous leverage over the agent economy—exactly the dependency crypto supposedly eliminates.

Technical risks include agent unpredictability. LLMs make probabilistic decisions. An agent optimized for yield farming might, through unexpected prompt interpretation, execute strategies its operator never intended. When that agent controls significant capital, unpredictability becomes systemic risk.

Market adoption remains unproven beyond early metrics. 1.2 billion API calls sounds impressive but could represent a small number of high-frequency bots rather than broad developer adoption. $300 million daily volume is meaningful but tiny compared to centralized exchange totals.

The Infrastructure Thesis

OKX's OnchainOS represents a specific thesis about crypto's evolution: that autonomous agents will become primary users of blockchain infrastructure, and exchanges that provide optimal agent tooling will capture disproportionate value.

This thesis is either visionary or premature. If agents do become dominant blockchain users, building this infrastructure in early 2026 positions OKX as the platform of choice before competitive dynamics lock in. If adoption lags or takes different forms, significant engineering resources go toward supporting a market that never materializes at scale.

What's clear is that OKX isn't waiting to find out. By shipping production infrastructure processing billions of API calls and hundreds of millions in trading volume, they're not pitching a vision—they're deploying a platform and learning from real usage.

The exchanges that emerge as winners in 2028 probably won't be the ones with the best trading interfaces for humans. They'll be the ones where autonomous agents found the infrastructure that made machine-to-machine crypto economies actually work.

OnchainOS is OKX's bet that infrastructure wins in the end. The next 12-24 months will reveal whether the agent economy grows fast enough to justify that conviction.



OpenClaw: Revolutionizing AI Agent Frameworks with Blockchain Integration

· 11 min read
Dora Noda
Software Engineer

In just 60 days, an open-source project transformed from a weekend experiment into GitHub's most-starred repository, surpassing React's decade-long dominance. OpenClaw, an AI agent framework that runs locally and integrates seamlessly with blockchain infrastructure, has achieved 250,000 GitHub stars while reshaping expectations for what autonomous AI assistants can accomplish in the Web3 era.

But behind the viral growth lies a more compelling story: OpenClaw represents a fundamental shift in how developers are building the infrastructure layer for autonomous agents in decentralized ecosystems. What started as one developer's weekend hack has evolved into a community-driven platform where blockchain integration, local-first architecture, and AI autonomy converge to solve problems that traditional centralized AI assistants cannot address.

From Weekend Project to Infrastructure Standard

Peter Steinberger published the first version of Clawdbot in November 2025 as a weekend hack. Within three months, what began as a personal experiment became the fastest-growing repository in GitHub history, gaining 190,000 stars in its first 14 days.

The project was renamed to "Moltbot" on January 27, 2026, following trademark complaints by Anthropic, and again to "OpenClaw" three days later.

By late January the project was viral, and by mid-February, Steinberger had joined OpenAI and the codebase was transitioning to an independent foundation. This transition from individual developer project to community-governed infrastructure mirrors the evolution patterns seen in successful blockchain protocols—from centralized innovation to decentralized maintenance.

The numbers tell part of the story: OpenClaw achieved 100,000 GitHub stars within a week of its late January 2026 release, making it one of the fastest-growing open-source AI projects in history. Within just a few days of launch, more than 36,000 agents had been spun up on the platform.

But what makes this growth remarkable isn't just velocity—it's the architectural decisions that enabled a community to build an entirely new category of blockchain-integrated AI infrastructure.

The Architecture That Enables Blockchain Integration

While most AI assistants rely on cloud infrastructure and centralized control, OpenClaw's architecture was designed for a fundamentally different paradigm. At its core, OpenClaw follows a modular, plugin-first design where even model providers are external packages loaded dynamically, keeping the core lightweight at approximately 8MB after the 2026 refactor.

This modular approach consists of five key components:

The Gateway Layer: A long-living WebSocket server (default: localhost:18789) that accepts inputs from any channel, enabling the headless architecture that connects to WhatsApp, Telegram, Discord, and other platforms through existing interfaces.

Local-First Memory: Unlike traditional LLM tools that abstract memory into vector spaces, OpenClaw puts long-term memory back into the local file system. An agent's memory is not hidden in abstract representations but stored as clearly visible Markdown files: summaries, logs, and user profiles are all on disk in the form of structured text.

The Skills System: With the ClawHub registry hosting 5,700+ community-built skills, OpenClaw's extensibility enables blockchain-specific capabilities to emerge organically from the community rather than being dictated by a central development team.

Multi-Model Support: OpenClaw supports Claude, GPT-4o, DeepSeek, Gemini, and local models via Ollama, running entirely on your hardware with full data sovereignty—a critical feature for users managing private keys and sensitive blockchain transactions.

Virtual Device Interface (VDI): OpenClaw achieves hardware and OS independence through adapters for Windows, Linux, and macOS that normalize system calls, while communication protocols are standardized via a ProtocolAdapter interface, enabling deployment flexibility on bare metal, Docker, or even serverless environments like Cloudflare Moltworker.
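The local-first memory model described above can be sketched as plain file operations. This is a minimal illustration in Python, assuming a hypothetical `memory/` directory and file names (the actual OpenClaw layout may differ):

```python
from pathlib import Path
from datetime import date

# Hypothetical local-first memory layout: plain Markdown on disk,
# readable and editable by the user -- no opaque vector store.
MEMORY_DIR = Path("memory")

def append_log(entry: str) -> Path:
    """Append an entry to today's daily log file and return its path."""
    MEMORY_DIR.mkdir(exist_ok=True)
    log_file = MEMORY_DIR / f"log-{date.today().isoformat()}.md"
    with log_file.open("a", encoding="utf-8") as f:
        f.write(f"- {entry}\n")
    return log_file

def load_profile() -> str:
    """Read the user profile; an empty string if none exists yet."""
    profile = MEMORY_DIR / "profile.md"
    return profile.read_text(encoding="utf-8") if profile.exists() else ""

log = append_log("Checked wallet balance for user")
print(log.read_text(encoding="utf-8"))
```

Because memory is just structured text on disk, the agent's state can be inspected, versioned, or edited with ordinary tools rather than a proprietary dashboard.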

This architecture creates something uniquely suited for blockchain integration. On the Base platform, an "OpenClaw × Blockchain" ecosystem is forming, centered around infrastructure like Bankr, Clanker, and XMTP and extending to SNS, job markets, launchpads, trading, games, and more.

Community-Driven Development at Scale

Version 2026.2.2 includes 169 commits from 25 contributors, demonstrating the active community participation that has become OpenClaw's defining characteristic.

This wasn't organic growth alone—strategic community cultivation accelerated adoption.

BNB Chain launched the Good Vibes Hackathon: The OpenClaw Edition, a two-week sprint with nearly 300 project submissions from over 600 hackers. The results reveal both the promise and current limitations of blockchain integration: several community projects—such as 4claw, lobchanai, and starkbotai—are experimenting with agents that can initiate and manage blockchain transactions autonomously.

According to user examples shared on social media, OpenClaw is being used for tasks such as monitoring wallet activity and automating airdrop-related workflows. The community has built some of the most comprehensive on-chain trading automation available in any open-source AI agent framework, making it a powerful option for crypto traders who want natural language control over their positions.

However, the gap between potential and reality remains significant. Despite the proliferation of tokens and agent-branded experiments, there is still relatively little deep, native crypto interaction, with most agents not actively managing complex DeFi positions or generating sustained on-chain cash flows.

The March 2026 Technical Maturity Inflection

The OpenClaw 2026.3.1 release marks a critical transition from experimental tool to production-grade infrastructure. The update added:

  • OpenAI WebSocket streaming for low-latency token delivery, enabling real-time inference UX that can cut perceived response time and improve agent handoffs
  • Claude 4.6 adaptive thinking for improved multi-step reasoning, presenting a route to higher-quality tool-use chains in enterprise agents
  • Native Kubernetes support for production deployment, signaling readiness for enterprise-scale blockchain infrastructure
  • Discord threads and Telegram DM topics integration for structured chat workflows

Perhaps more significantly, the February 2026.2.19 release represented a maturity inflection point with 40+ security hardenings, authentication infrastructure, and observability upgrades.

Previous releases focused on feature expansion; this release prioritized production readiness.

For blockchain applications, this evolution matters. Managing private keys, executing smart contract interactions, and handling financial transactions require not just capability but security guarantees.

Security firms like Cisco and BitSight warn that OpenClaw presents risks from prompt injection and compromised skills, advising users to run it in isolated environments such as Docker or virtual machines. Even so, the project is rapidly closing the gap between experimental tool and institutional-grade infrastructure.

What Makes OpenClaw Different in the AI Agent Market

The AI agent landscape in 2026 is crowded, but OpenClaw occupies a unique position compared to alternatives like Claude Code, Anthropic's terminal-based coding agent focused exclusively on helping developers write, understand, and maintain software.

Claude Code operates in a sandboxed environment where permissions are explicit and granular, with dedicated security infrastructure and regular audits. It excels at complex code refactoring, using the reasoning ability of Opus 4.6 coupled with Context Compaction to minimize the likelihood of breaking code.

In contrast, OpenClaw is designed to be an always-on, 24/7 personal assistant that you communicate with via standard messaging apps.

While Claude Code wins at coding tasks, OpenClaw dominates in day-to-day automation because of its integration with numerous tools and platforms.

The two tools are complementary, not competing. Claude Code handles your codebase. OpenClaw handles your life. But for blockchain developers and Web3 users, OpenClaw offers something Claude Code cannot: the ability to integrate autonomous AI decision-making with on-chain actions, wallet management, and decentralized protocol interactions.

The Blockchain Integration Challenge

Despite rapid technical progress, OpenClaw's blockchain integration reveals a fundamental tension in the AI × crypto convergence. The technical standards are emerging: ERC-8004, x402, Layer-2 networks, and stablecoins provide the building blocks for agent identity, permissions, credentials, evaluation, and payments.

The Base platform ecosystem centered around OpenClaw demonstrates what's possible. Infrastructure components like Bankr handle financial rails, Clanker manages token operations, and XMTP enables decentralized messaging. The full stack is being assembled.

Yet the gap between infrastructure capability and application reality persists. Most OpenClaw blockchain experiments focus on monitoring, simple wallet operations, and airdrop automation. The vision of agents autonomously managing complex DeFi positions, executing sophisticated trading strategies, or coordinating multi-protocol interactions remains largely unrealized.

This isn't a failure of OpenClaw's architecture—it's a reflection of broader challenges in the AI × blockchain convergence:

Trust and Verification: How do you verify that an AI agent's on-chain actions align with user intent when the agent operates autonomously? Traditional permission systems don't map cleanly to the nuanced decision-making required for DeFi strategies.

Economic Incentives: Most current integrations are experimental. Agents don't yet generate sustained on-chain cash flows that would justify their existence beyond novelty value.

Security Trade-offs: The local-first, always-on architecture that makes OpenClaw powerful for general automation creates attack surfaces when managing private keys and executing financial transactions.
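The trust-and-verification gap is easy to make concrete. Below is a hedged sketch of a deliberately coarse policy guard, with an invented allowlist and spend cap, showing why such rules can approve each action in isolation while saying nothing about whether the overall sequence of trades matches user intent:

```python
# A deliberately coarse policy guard for agent transactions.
# The protocol names and cap are invented for illustration.
ALLOWED_PROTOCOLS = {"uniswap", "aave"}   # hypothetical allowlist
DAILY_SPEND_CAP = 500.0                   # USD

def check_action(protocol: str, amount_usd: float, spent_today: float) -> bool:
    """Approve an action only if the protocol is allowlisted and the cap holds."""
    if protocol not in ALLOWED_PROTOCOLS:
        return False
    return spent_today + amount_usd <= DAILY_SPEND_CAP

# Each call looks safe on its own, yet the guard cannot tell whether
# the *sequence* of approved trades is the strategy the user intended.
assert check_action("uniswap", 200.0, spent_today=0.0)
assert not check_action("unknown-bridge", 50.0, spent_today=0.0)
```

Richer policies (intent attestation, simulation before signing, per-strategy budgets) are active research areas precisely because this simple model breaks down for multi-step DeFi strategies.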

The community is aware of these limitations. Rather than premature claims of solving Web3's UX problems, the ecosystem is methodically building the infrastructure layer—wallets integrated with AI decision-making, protocols designed for agent interaction, and security frameworks that balance autonomy with user control.

The Web3 Infrastructure Implications

OpenClaw's emergence signals several important shifts in how Web3 infrastructure is being built:

From Centralized AI to Local-First Agents: The success of OpenClaw's architecture validates the demand for AI assistants that don't send your data to centralized servers—particularly important when those conversations involve private keys, transaction strategies, and financial information.

Community-Driven vs Corporate-Led: While companies like Anthropic and OpenAI control their AI assistant roadmaps, OpenClaw demonstrates an alternative model where 25 contributors can ship 169 commits and the community determines which features matter. This parallels the governance evolution in successful blockchain protocols.

Skills as Composable Primitives: The ClawHub registry with 5,700+ skills creates a marketplace of capabilities that can be mixed and matched. This composability mirrors the building blocks approach of DeFi protocols, where smaller components combine to create complex functionality.

Open Standards for AI × Blockchain: The emergence of ERC-8004 for agent identity, x402 for agent payments, and standardized wallet integrations suggests the industry is converging on shared infrastructure rather than fragmented proprietary solutions.

The fact that OpenClaw has no token, no cryptocurrency, and no blockchain component is perhaps its greatest strength in the blockchain space. Any token claiming to be associated with the project is a scam. This clarity prevents the financialization from corrupting the technical development, allowing the infrastructure to mature before economic incentives shape the ecosystem.

The Path Forward: Infrastructure Before Applications

March 2026 represents a critical moment for OpenClaw in the blockchain ecosystem. The technical foundations are solidifying: production-ready security, Kubernetes deployment, enterprise-grade observability. The community infrastructure is growing: 25 active contributors, 300 hackathon submissions, 5,700+ skills.

But the most important developments are the ones that haven't happened yet. The killer applications for AI agents in Web3 aren't simple wallet monitors or airdrop farmers. They're likely to emerge from use cases we haven't fully imagined—perhaps agents that coordinate cross-chain liquidity provision, autonomously manage treasuries for DAOs, or execute sophisticated MEV strategies across multiple protocols.

For these applications to emerge, the infrastructure layer must mature first. OpenClaw's community-driven development model, local-first architecture, and blockchain-native design make it a strong candidate to become foundational infrastructure for this next phase.

The question isn't whether AI agents will transform how we interact with blockchain protocols. The question is whether the infrastructure being built today—exemplified by OpenClaw's approach—will be robust enough to handle the complexity, secure enough to manage real financial value, and flexible enough to enable innovations we can't yet anticipate.

Based on the architectural decisions, community momentum, and technical trajectory visible in March 2026, OpenClaw is positioning itself as the infrastructure layer that enables that future. Whether it succeeds depends not just on code quality or GitHub stars, but on the community's ability to navigate the complex trade-offs between autonomy and security, decentralization and usability, innovation and stability.

For blockchain developers and Web3 infrastructure teams, OpenClaw offers a glimpse of what's possible when AI agent architecture is designed from first principles for decentralized systems rather than adapted from centralized paradigms. That makes it worth paying attention to—not because it's solved all the problems, but because it's asking the right questions about how autonomous agents should integrate with blockchain infrastructure in a post-cloud, local-first, community-governed world.

DePAI: When Physical Robots Meet Decentralized AI Infrastructure

· 13 min read
Dora Noda
Software Engineer

When robots start earning their own paychecks, who controls their wallets? That's the trillion-dollar question driving DePAI—Decentralized Physical AI—a paradigm shift that's moving physical robots and AI systems from corporate data centers to community-owned infrastructure. While Web3 has spent years promising to decentralize the digital world, 2026 marks the year this vision collides with the physical realm: autonomous vehicles, humanoid robots, and AI-powered IoT devices operating on blockchain rails.

The numbers tell a compelling story. The World Economic Forum projects the DePIN (Decentralized Physical Infrastructure Networks) market will explode from $20 billion today to $3.5 trillion by 2028—a roughly 175-fold increase. What's driving this growth? The convergence of AI and blockchain is creating what industry insiders now call "DePAI"—infrastructure that enables distributed machine learning, autonomous economic agents, and community-owned robotics networks at unprecedented scale.

This isn't speculative tokenomics anymore. Real revenue is flowing through decentralized networks: Aethir posted $166 million in annualized revenue serving 150+ enterprise AI clients, Helium's decentralized wireless network hit $13.3 million in annualized revenue through partnerships with T-Mobile and AT&T, and Grass is generating approximately $33-85 million annually selling web-scraped data to AI companies. The shift from "token speculation" to "business revenue models" has arrived.

From DePIN to DePAI: The Evolution of Decentralized Infrastructure

To understand DePAI, you need to grasp its foundation: DePIN (Decentralized Physical Infrastructure Networks). DePIN uses blockchain and token incentives to crowdsource physical infrastructure—wireless networks, GPU compute, storage, sensors—that traditionally required massive capital expenditure from corporations. Think Uber, but for infrastructure: individuals contribute resources (bandwidth, GPUs, storage) and earn tokens in return.

DePAI takes this concept further by adding autonomous AI agents into the mix. It's not just about decentralizing infrastructure ownership—it's about enabling AI systems and physical robots to interact with that infrastructure autonomously, transact in decentralized markets, and execute complex tasks without centralized cloud dependencies.

The seven-layer DePAI stack illustrates this evolution:

  1. AI Agents - Autonomous software entities that make decisions and execute transactions
  2. Robotics - Physical embodiments (humanoid robots, drones, autonomous vehicles)
  3. Decentralized Data Streams - Real-time sensor data, location data, environmental inputs
  4. Spatial Intelligence - Mapping, navigation, and environmental understanding
  5. Infrastructure Networks - DePIN for compute, storage, connectivity
  6. The Machine Economy - Peer-to-peer markets where machines transact directly
  7. DePAI DAOs - Governance layers enabling community ownership and decision-making

This stack transforms robots from isolated corporate assets into economically autonomous actors in a decentralized ecosystem. Imagine a delivery drone that autonomously books GPU compute for route optimization, purchases bandwidth access through a DePIN marketplace, and settles payments via smart contracts—all without human intervention.
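That drone scenario reduces to an agent comparing marketplace quotes and settling on the cheapest offer. Here is a minimal sketch; every provider name and price is invented for illustration, not drawn from any real marketplace:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    provider: str
    resource: str
    price_usd: float

def cheapest(quotes: list[Quote]) -> Quote:
    """An agent picks the lowest-priced offer for a given resource."""
    return min(quotes, key=lambda q: q.price_usd)

# Hypothetical marketplace offers for a single delivery run.
gpu_quotes = [Quote("provider-a", "gpu-hour", 0.90),
              Quote("provider-b", "gpu-hour", 0.55)]
bandwidth_quotes = [Quote("hotspot-1", "mb-transfer", 0.002),
                    Quote("hotspot-2", "mb-transfer", 0.003)]

plan = [cheapest(gpu_quotes), cheapest(bandwidth_quotes)]
total = sum(q.price_usd for q in plan)  # settled on-chain in the full vision
print([q.provider for q in plan], round(total, 3))
```

In the DePAI vision, the final settlement step would be a smart contract call rather than a `print`, but the economic logic the agent runs is exactly this kind of quote comparison.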

The Enterprise Revenue Breakout: Aethir's $166M Lesson

For years, DePIN projects struggled with the "chicken-and-egg" problem: how do you bootstrap supply (people contributing resources) without demand (paying customers), and vice versa? Aethir cracked this problem with a laser focus on enterprise clients rather than retail speculators.

In Q3 2025 alone, Aethir generated $39.8 million in revenue, reaching a $147+ million annual recurring revenue (ARR) run rate. By early 2026, this figure hit $166 million ARR. The key differentiator? These revenues came from 150+ enterprise clients across AI, gaming, and Web3—not from token emissions or subsidies.

With over 435,000 enterprise-grade GPUs distributed across 200+ locations in 93 countries, Aethir provides more than $400 million worth of compute capacity while maintaining an exceptional 98.92% uptime. That's infrastructure reliability comparable to AWS or Google Cloud, but delivered through a decentralized network where GPU owners earn yield and customers pay 50-85% less than hyperscaler prices.

The business model is straightforward: AI companies need massive compute for training and inference. Centralized cloud providers like AWS charge premium rates and face GPU scarcity (SK Hynix and Micron have announced their entire 2026 output is sold out). Aethir aggregates idle GPU capacity from data centers, mining operations, and enterprise partners, making it available through a decentralized marketplace at fractional costs.
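The claimed 50-85% savings are easy to translate into per-hour numbers. A quick back-of-the-envelope, using an illustrative $2.00/GPU-hour hyperscaler rate (an assumption for the arithmetic, not a quoted price from any provider):

```python
hyperscaler_rate = 2.00  # illustrative USD per GPU-hour, not a quoted price

def depin_rate(savings_pct: float) -> float:
    """GPU-hour price after the claimed DePIN discount."""
    return hyperscaler_rate * (1 - savings_pct / 100)

for pct in (50, 85):
    print(f"{pct}% savings -> ${depin_rate(pct):.2f}/GPU-hour")

# At scale: 1,000 GPUs running 24/7 for a month at the midpoint discount.
monthly_savings = 1000 * 24 * 30 * (hyperscaler_rate - depin_rate(67.5))
print(f"~${monthly_savings:,.0f}/month saved")
```

Even at the conservative end of the range, the per-hour delta compounds quickly across a fleet, which is why the pitch lands with AI companies facing GPU scarcity.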

For 2026, Aethir is doubling down on agentic AI—enabling autonomous AI agents to book, pay for, and optimize GPU usage in real-time without human operators. This positions DePAI infrastructure not just as a cost-efficient alternative to centralized cloud, but as the native rails for the emerging machine economy.

Helium's Hybrid Model: Carrier Offload Meets Community Networks

While Aethir focuses on compute, Helium tackles connectivity. What started in 2019 as a community-driven IoT network has evolved into a full-stack wireless DePIN supporting both IoT and 5G mobile services. By Q3 2025, the Helium Network had transferred over 5,452 terabytes of data offloaded from major U.S. mobile carriers, representing significant quarter-over-quarter growth.

The "carrier offload" model is where DePAI meets real-world telecommunications. Major carriers like T-Mobile, AT&T, Movistar, and Google Orion partner with Helium to offload customer data to community-run hotspots in high-traffic urban areas. The carrier pays the network a fee, and that revenue flows to hotspot operators who provide the physical infrastructure.

Despite some confusion in media reports, Helium does not have a formal telecom-to-telecom carrier offload agreement with T-Mobile. Instead, T-Mobile subscribers can connect to Helium's network at select locations through third-party arrangements, and carriers benefit from reduced congestion by offloading traffic to Helium's 26,000+ Wi-Fi sites.

Helium Mobile, the network's MVNO (Mobile Virtual Network Operator) service, exemplifies the "Hybrid MNO" model: users get unlimited mobile plans for $20/month by seamlessly switching between Helium's community network and T-Mobile's backbone. When you're near a Helium hotspot, your traffic gets routed through DePIN infrastructure. When you're not, T-Mobile's network serves as backup.

This hybrid approach proves DePAI doesn't need to replace centralized infrastructure entirely—it can augment it, capturing high-margin use cases (urban density, IoT sensors, stationary devices) while leaving low-margin scenarios to traditional providers. The result: $13.3 million in annualized revenue for a network bootstrapped by retail participants, not telecom giants.
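The hybrid routing decision amounts to a simple fallback rule. It is sketched here with invented signal values; the real handoff happens at the SIM/eSIM and carrier level, not in application code:

```python
from typing import Optional

def select_network(helium_signal: Optional[float], min_quality: float = 0.6) -> str:
    """Prefer a nearby community hotspot; fall back to the carrier backbone."""
    if helium_signal is not None and helium_signal >= min_quality:
        return "helium-hotspot"   # traffic rides DePIN infrastructure
    return "carrier-backbone"     # e.g. T-Mobile's network as backup

assert select_network(0.8) == "helium-hotspot"
assert select_network(0.3) == "carrier-backbone"
assert select_network(None) == "carrier-backbone"
```

The economics follow the same rule: revenue flows to hotspot operators only on the first branch, which is why the model concentrates on high-traffic urban coverage.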

Grass: Monetizing Idle Bandwidth for AI Training Data

If Aethir is selling compute and Helium is selling connectivity, Grass is selling data—specifically, web data scraped by a decentralized network of 2.5 million+ users who contribute their unused internet bandwidth.

AI companies face a critical bottleneck: they need massive, diverse datasets to train large language models (LLMs), but scraping the public web at scale requires enormous bandwidth and IP diversity to avoid rate limits and geographic blocks. Grass solved this by crowdsourcing bandwidth from everyday internet users, turning their home connections into a distributed web-scraping network.

The revenue model is straightforward: AI labs purchase structured datasets through the Grass network for model training, paying the Grass Foundation in fiat or crypto. The GRASS token serves as the "primary vehicle for value accrual," distributing revenue back to node operators and stakers who provide the underlying infrastructure.

While exact revenue figures vary across sources, Grass monetizes less than 1% of its 2.5M+ user base and already generates substantial early revenue, with estimates ranging from $33 million to $85 million annually. The founder casually mentioned a "mid-8 figure revenue" in a recent demo, suggesting the network is generating $50+ million per year. With 8.5 million monthly active users and growing commercial deals with AI labs, Grass is scaling network capacity for both training datasets and live context retrieval data to serve AI clients through 2026-2027.

What makes Grass a DePAI case study rather than just a data marketplace? The network enables autonomous AI agents to access real-time, decentralized web data without relying on centralized APIs that can be censored, rate-limited, or shut down. As AI agents become more autonomous and economically active, they'll need infrastructure that's as permissionless and decentralized as they are.

The Robotics Revolution: When Machines Need DePAI Infrastructure

DePAI's ultimate vision extends beyond compute, connectivity, and data—it's about enabling physical robots to operate as autonomous economic agents. Morgan Stanley analysts predict the humanoid robotics industry could generate up to $4.7 trillion in annual revenue by 2050. But here's the critical question: will these robots be controlled by a handful of corporations (Boston Dynamics under Hyundai, Tesla's Optimus, Google's robotics division), or will they operate on decentralized infrastructure owned by communities?

Projects like peaq, XMAQUINA, and elizaOS are pioneering the DePAI approach to robotics:

  • peaq functions as the "Machine Economy operating system," enabling robots, sensors, and IoT devices to interact via self-sovereign IDs, transact peer-to-peer, and offer data and services through decentralized marketplaces. Think of it as the Ethereum for machines.

  • XMAQUINA advances DePAI through a DAO structure, giving a global community liquid exposure to leading private robotics companies developing next-generation humanoids. Instead of robots being corporate assets, investors pool resources and democratize ownership in robotics companies via blockchain-based governance.

  • elizaOS bridges decentralized AI agents and robotics by turning autonomous intelligence into real-world workflows. It extends naturally into robotics where systems must process data locally and coordinate tasks without relying on fragile centralized clouds.

The core idea is "universal basic ownership" as an alternative to universal basic income (UBI). If robots displace human labor at scale, DePAI offers a model where everyday people profit from machine labor as owners and stakeholders in the networks, not just passive recipients of government transfers.

By 2030, industry forecasts suggest more than half of all AI-driven robots will run workloads on decentralized GPU networks like Aethir, not on AWS, Azure, or Google Cloud. They'll use DePIN wireless networks like Helium for connectivity, access real-time data through networks like Grass, and settle transactions via smart contracts. The vision is a machine economy where autonomous agents and physical robots interact in permissionless markets, owned and governed by DAOs rather than monopolies.

Why 2026 Marks the Shift from Speculation to Revenue

For years, DePIN and Web3 infrastructure projects were funded by token emissions and venture capital, not paying customers. That model worked during bull markets but collapsed spectacularly when crypto entered bear markets. Projects with no real revenue but high token inflation saw their networks and valuations evaporate.

2026 marks a paradigm shift. The metrics that matter now are:

  • Network revenue - How much fiat or stablecoin revenue is the network generating from actual customers?
  • Utilization rates - What percentage of the network's capacity is being actively used by paying users?
  • Enterprise adoption - Are real businesses (not just crypto-native protocols) using the infrastructure?

Aethir, Helium, and Grass demonstrate this shift in action:

  • Aethir's $166M ARR comes from 150+ enterprise clients, not token incentives.
  • Helium's $13.3M annual revenue comes from carrier offload partnerships and MVNO subscribers, not speculative hotspot purchases.
  • Grass's $33-85M revenue comes from AI companies buying datasets, not airdrop farmers.

The GPU-as-a-service market alone is estimated to be worth $35-70 billion by 2030, with accelerated compute workloads growing at more than 30% CAGR. Decentralized services are competing on cost (50-85% savings vs. AWS/GCP), flexibility (global distribution, no vendor lock-in), and resistance to centralized control—values that resonate especially with AI developers concerned about censorship and platform risk.

Compare this to traditional DePIN tokens that collapsed when incentives dried up. The difference is sustainable unit economics: if the network earns more revenue from customers than it spends on token emissions and operations, it can survive indefinitely without bull market bailouts.
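That sustainability condition is a single inequality: customer revenue must exceed token emissions plus operating costs. The figures below are invented purely for illustration:

```python
def is_sustainable(customer_revenue: float, emissions_cost: float, opex: float) -> bool:
    """A network survives without bull-market bailouts only if this holds."""
    return customer_revenue > emissions_cost + opex

# Invented illustrative figures (annual, USD millions).
assert is_sustainable(customer_revenue=166, emissions_cost=60, opex=80)     # revenue-led
assert not is_sustainable(customer_revenue=5, emissions_cost=120, opex=40)  # emissions-led
```

The inequality looks trivial, but most pre-2025 DePIN networks sat firmly on the failing side of it, which is why token-price drawdowns translated directly into network collapse.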

The $3.5 Trillion Question: Can DePAI Actually Scale?

The World Economic Forum's $3.5 trillion projection by 2028 sounds audacious, but it hinges on three critical factors:

1. Regulatory Clarity

Physical infrastructure—wireless networks, data centers, transportation systems—operates under heavy regulation. Can DePIN and DePAI networks navigate telecom licensing, data privacy laws (GDPR, CCPA), and robotics safety standards while maintaining decentralization? Helium's carrier partnerships suggest yes, but regulatory risk remains high.

2. Enterprise Adoption

AI companies and robotics firms need infrastructure that's reliable, compliant, and cost-effective. Aethir's 98.92% uptime and enterprise-grade SLAs prove decentralized networks can compete on reliability. But will Fortune 500 companies trust critical workloads to community-owned infrastructure? The next 12-24 months will be telling.

3. Technological Maturation

DePAI requires seamless integration across blockchain (payments, identity, governance), AI (autonomous agents, machine learning), and physical systems (robotics, sensors, edge compute). Many pieces still need interoperability standards, better developer tools, and reduced latency for real-time applications.

The bullish case is compelling: global AI infrastructure spending is projected to hit $5-8 trillion through 2030, and decentralized networks are capturing an increasing share by offering cost, flexibility, and sovereignty advantages. The bearish case warns of centralization creep (a few large node operators dominating networks), regulatory crackdowns, and competition from hyperscalers who could match DePIN pricing through economies of scale.

What Comes Next: The Machine Economy Goes Live

As we move deeper into 2026, several trends will accelerate DePAI's evolution:

Agentic AI proliferation - AI agents are moving from chatbots to autonomous economic actors. They'll need DePAI infrastructure for permissionless access to compute, data, and connectivity.

Open-source model adoption - As more companies run open-source LLMs (Llama, Mistral, etc.) instead of relying on OpenAI/Anthropic APIs, demand for decentralized inference will surge.

Robotics commercialization - Humanoid robots entering warehouses, factories, and service industries will need decentralized infrastructure to avoid vendor lock-in and enable interoperability.

Tokenized incentives for edge nodes - The next wave of DePIN projects will focus on edge compute (processing data close to where it's generated) rather than centralized data centers. This fits perfectly with latency-sensitive robotics and IoT applications.

For developers and investors, the playbook is shifting: look for projects with real revenue, sustainable unit economics, and enterprise traction. Avoid networks sustained purely by token emissions or speculative NFT sales. The DePAI winners will be those bridging Web3's permissionless ethos with the reliability and compliance standards enterprise customers demand.

For builders developing AI applications that require reliable, cost-efficient infrastructure, BlockEden.xyz offers enterprise-grade API access to leading blockchain networks. Explore our services to build on infrastructure designed for the decentralized future.

The Graph's 2026 Transformation: Redefining Blockchain Data Infrastructure

· 13 min read
Dora Noda
Software Engineer

When 37% of your new users aren't human, you know something fundamental has shifted.

That's the reality The Graph faced in early 2026 when analyzing Token API adoption: more than one in three new accounts belonged to AI agents, not developers. These autonomous programs — querying DeFi liquidity pools, tracking tokenized real-world assets, and executing institutional trades — now consume blockchain data at a scale that would be impossible for human operators to match.

This isn't a future scenario. It's happening now, and it's forcing a complete rethinking of how blockchain data infrastructure works.

From Subgraph Pioneer to Multi-Service Data Backbone

The Graph built its reputation on a single elegant solution: subgraphs. Developers create custom schemas that index on-chain events and smart contract states, enabling dApps to fetch precise, real-time data without running their own nodes.

It's the reason you can check your DeFi portfolio balance instantly or browse NFT metadata without waiting for blockchain queries to complete.
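To make the subgraph model concrete, here is a minimal sketch of what consuming one looks like. The entity and field names (`pairs`, `reserveUSD`) are illustrative stand-ins for a DEX-style subgraph schema — real schemas are defined per-subgraph — and the response parsing assumes the standard Graph gateway payload shape:

```python
import json

# A typical subgraph query: fetch the five largest liquidity pools from a
# hypothetical DEX subgraph, ordered by USD reserves.
QUERY = """
{
  pairs(first: 5, orderBy: reserveUSD, orderDirection: desc) {
    id
    token0 { symbol }
    token1 { symbol }
    reserveUSD
  }
}
"""

def top_pairs(response_json: str) -> list[str]:
    """Extract 'TOKEN0/TOKEN1' labels from a Graph-gateway-shaped response."""
    data = json.loads(response_json)
    return [
        f"{p['token0']['symbol']}/{p['token1']['symbol']}"
        for p in data["data"]["pairs"]
    ]

# Sample payload shaped like a gateway response to QUERY.
sample = json.dumps({
    "data": {"pairs": [
        {"id": "0xabc", "token0": {"symbol": "WETH"},
         "token1": {"symbol": "USDC"}, "reserveUSD": "412000000"},
    ]}
})
print(top_pairs(sample))  # ['WETH/USDC']
```

The point of the design is visible even in this toy: the dApp declares *what* it wants and the network's indexers handle *how* it gets served.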

By late 2025, The Graph had processed over 1.5 trillion queries since inception — a milestone that positions it as the largest decentralized data infrastructure in Web3. But raw query volume only tells part of the story.

The more revealing metric emerged in Q4 2025: 6.4 billion queries per quarter, with active subgraphs reaching an all-time high of 15,500. Yet new subgraph creation had slowed dramatically.

The interpretation? The Graph's existing infrastructure serves its current users exceptionally well, but the next wave of adoption requires something fundamentally different.

Enter Horizon, the protocol upgrade that went live in December 2025 and sets the stage for The Graph's 2026 transformation.

The Horizon Architecture: Multi-Service Infrastructure for the On-Chain Economy

Horizon isn't a feature update. It's a complete architectural redesign that transforms The Graph from a subgraph-focused platform into a multi-service data infrastructure capable of serving three distinct customer segments simultaneously: developers, AI agents, and institutions.

The architecture introduces three foundational components:

A core staking protocol that extends economic security to any data service, not just subgraphs. This allows new data products to inherit The Graph's existing network of 167,000+ delegators and active indexers without building separate security models.

A unified payments layer that handles fees across all services, enabling seamless cross-service billing and reducing friction for users who need multiple types of blockchain data.

A permissionless framework allowing new data services to integrate without requiring protocol governance votes. Any team can build on The Graph's infrastructure, as long as they meet technical standards and stake GRT tokens for security.

This modular approach solves a critical problem: different use cases require different data architectures.

A DeFi trading bot needs millisecond-level liquidity updates. An institutional compliance team needs SQL-queryable audit trails. A wallet app needs pre-indexed token balances across dozens of chains. Before Horizon, each of these use cases would have required a separate infrastructure provider.

Now, they can all run on The Graph.

Four Services, Four Distinct Markets

The Graph's 2026 roadmap introduces four specialized data services, each targeting a specific market need:

Token API: Pre-Indexed Data for Common Queries

The Token API eliminates the need for custom indexing when you just need standard token data — balances, transfer histories, contract addresses across 10 chains. Wallets, explorers, and analytics platforms no longer need to deploy their own subgraphs for basic queries.

This is where AI agents have shown up in force. The 37% non-human adoption rate reflects a simple reality: AI agents don't want to configure indexers or write GraphQL queries. They want a ready-made API that returns structured data instantly.

The integration with Model Context Protocol (MCP) enables AI agents to query blockchain data through tools like Claude, Cursor, and ChatGPT without setup keys. The x402 protocol adds autonomous payment capabilities, letting agents pay per query without human intervention.
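The pay-per-query model is the interesting economic primitive here. The sketch below illustrates the *idea* of x402-style usage-based billing — an agent spending a fixed budget with no API key or subscription — not the actual wire protocol; prices, the endpoint path, and the metering logic are all hypothetical, and on-chain settlement is omitted:

```python
from dataclasses import dataclass

@dataclass
class PayPerQueryAgent:
    """Toy model of usage-based billing: pay per request from a fixed budget.
    Amounts are integer micro-dollars to avoid floating-point drift."""
    budget_microusd: int
    price_microusd: int = 100  # hypothetical $0.0001 per query
    spent_microusd: int = 0

    def query(self, endpoint: str) -> bool:
        # Refuse once the next query would exceed the budget.
        if self.spent_microusd + self.price_microusd > self.budget_microusd:
            return False
        self.spent_microusd += self.price_microusd
        # A real client would attach a payment proof and call the Token API
        # endpoint here; this sketch only meters the spend.
        return True

agent = PayPerQueryAgent(budget_microusd=500)
results = [agent.query("/balances/0xabc") for _ in range(10)]
print(sum(results), agent.spent_microusd)  # 5 500
```

The budget cap matters for autonomy: an agent can be handed spending authority without a human approving each request, and it degrades gracefully when the budget runs out.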

Tycho: Real-Time Liquidity Tracking for DeFi

Tycho streams live liquidity changes across decentralized exchanges — exactly what trading systems, solvers, and MEV bots need. Instead of polling subgraphs every few seconds, Tycho pushes updates as they happen on-chain.

For DeFi infrastructure providers, this reduces latency from seconds to milliseconds. In high-frequency trading environments where a 100ms delay can mean the difference between profit and loss, Tycho's streaming architecture becomes mission-critical.
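The push-versus-poll distinction is easy to show in miniature. This sketch simulates a Tycho-style feed with a Python generator and applies each pushed liquidity delta to local state instead of re-fetching full pool snapshots; the message field names are illustrative, not Tycho's actual schema:

```python
from typing import Iterator

def simulated_stream() -> Iterator[dict]:
    """Stand-in for a push feed: each message is a liquidity delta for one pool."""
    yield {"pool": "WETH/USDC", "reserve0_delta": -2.0, "reserve1_delta": 5000.0}
    yield {"pool": "WETH/USDC", "reserve0_delta": 1.0, "reserve1_delta": -2400.0}

def apply_updates(state: dict, stream: Iterator[dict]) -> dict:
    # Apply each pushed delta in place -- no polling, no full refetch.
    for msg in stream:
        pool = state.setdefault(msg["pool"], {"reserve0": 0.0, "reserve1": 0.0})
        pool["reserve0"] += msg["reserve0_delta"]
        pool["reserve1"] += msg["reserve1_delta"]
    return state

state = apply_updates(
    {"WETH/USDC": {"reserve0": 100.0, "reserve1": 250000.0}},
    simulated_stream(),
)
print(state["WETH/USDC"])  # {'reserve0': 99.0, 'reserve1': 252600.0}
```

A solver maintaining state this way reacts the moment a delta arrives, rather than discovering it on the next polling interval.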

Amp: SQL Database for Institutional Analytics

Amp represents The Graph's most explicit play for traditional finance adoption: an enterprise-grade blockchain database with SQL access, built-in audit trails, lineage tracking, and on-premises deployment options.

This isn't for DeFi degens. It's for treasury oversight teams, risk management divisions, and regulated payment systems that need compliance-ready data infrastructure.

The DTCC's Great Collateral Experiment — a pilot program exploring tokenized securities settlement — already uses Graph technology, validating the institutional use case.

SQL compatibility is crucial. Financial institutions have decades of tooling, reporting systems, and analyst expertise built around SQL.

Asking them to learn GraphQL is a non-starter. Amp meets them where they are.
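To illustrate why SQL access matters, here is the kind of query a compliance analyst would write on day one. SQLite stands in for Amp, and the `transfers` schema is invented for the example — the point is that the query language, aggregation, and audit columns (tx hash, block number) are all things existing financial tooling already understands:

```python
import sqlite3

# Toy table shaped like an indexed transfer log; schema is illustrative.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE transfers (
        tx_hash TEXT, block_number INTEGER,
        sender TEXT, recipient TEXT, amount_usd REAL
    )
""")
con.executemany(
    "INSERT INTO transfers VALUES (?, ?, ?, ?, ?)",
    [
        ("0xa1", 100, "0xAlice", "0xBob",   12000.0),
        ("0xa2", 101, "0xAlice", "0xCarol",  9000.0),
        ("0xa3", 102, "0xBob",   "0xAlice", 15000.0),
    ],
)
# Flag senders whose total outflows exceed a $10k reporting threshold.
# tx_hash and block_number in the base table are the lineage back to chain.
rows = con.execute("""
    SELECT sender, SUM(amount_usd) AS total_out, COUNT(*) AS n_txs
    FROM transfers
    GROUP BY sender
    HAVING total_out > 10000
    ORDER BY total_out DESC
""").fetchall()
print(rows)  # [('0xAlice', 21000.0, 2), ('0xBob', 15000.0, 1)]
```

No new query language, no new mental model — which is precisely the adoption argument.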

Subgraphs: The Foundation That Still Matters

Despite the new services, subgraphs remain central to The Graph's value proposition. The 50,000+ subgraphs deployed to date, powering virtually every major DeFi protocol, represent an installed base that competitors cannot easily replicate.

In 2026, the subgraph offering deepens in two ways: expanded multi-chain coverage (now spanning 40+ blockchains) and tighter integration with the new services.

A developer can use a subgraph for custom logic while pulling pre-indexed token data from Token API — best of both worlds.

Cross-Chain Expansion: GRT Utility Beyond Ethereum

For years, The Graph's GRT token existed primarily on Ethereum mainnet, creating friction for users on other chains. That changed with Chainlink's Cross-Chain Interoperability Protocol (CCIP) integration, which bridged GRT to Arbitrum, Base, and Avalanche in late 2025, with Solana planned for 2026.

This isn't just about token availability. Cross-chain GRT utility enables developers on any chain to pay for Graph services using their native tokens, stake GRT to secure data services, and delegate to indexers without moving assets to Ethereum.

The network effects compound quickly: Base processed 1.23 billion queries in Q4 2025 (up 11% quarter-over-quarter), while Arbitrum posted the strongest growth among major networks at 31% QoQ. As L2s continue absorbing transaction volume from Ethereum mainnet, The Graph's cross-chain strategy positions it to serve the entire multi-chain ecosystem.

The AI Agent Data Problem: Why Indexing Becomes Critical

AI agents represent a fundamentally different class of blockchain user. Unlike human developers who write queries once and deploy them, agents generate thousands of unique queries per day across dozens of data sources.

Consider an autonomous DeFi yield optimizer:

  1. It queries current APYs across lending protocols (Aave, Compound, Morpho)
  2. Checks gas prices and transaction congestion
  3. Monitors token price feeds from oracles
  4. Tracks historical volatility to assess risk
  5. Verifies smart contract security audits
  6. Executes rebalancing transactions when conditions are met

Each step requires structured, indexed data. Running a full node for every protocol is economically infeasible. APIs from centralized providers introduce single points of failure and censorship risk.

The Graph solves this by providing a decentralized, censorship-resistant data layer that AI agents can query programmatically. The economic model works because agents pay per query via x402 protocol — no monthly subscriptions, no API keys to manage, just usage-based billing settled on-chain.
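The final step of the optimizer above — deciding whether to rebalance once the data is in hand — can be sketched as a pure function. Every number, protocol name, and threshold here is hypothetical; the point is that the decision consumes indexed data (APYs, gas prices) that the agent must fetch fresh on every cycle:

```python
def should_rebalance(current_apy: float, candidate_apys: dict[str, float],
                     position_usd: float, gas_cost_usd: float,
                     horizon_days: int = 30, min_edge: float = 0.001):
    """Move only if the best candidate's extra yield over the holding
    horizon beats gas costs plus a minimum APY edge. Returns the target
    protocol name, or None to stay put."""
    best_protocol, best_apy = max(candidate_apys.items(), key=lambda kv: kv[1])
    extra_yield_usd = position_usd * (best_apy - current_apy) * horizon_days / 365
    if best_apy - current_apy > min_edge and extra_yield_usd > gas_cost_usd:
        return best_protocol
    return None

# A $100k position at 3.1%: Morpho's 4.0% clears $25 of gas over 30 days.
move = should_rebalance(
    current_apy=0.031,
    candidate_apys={"aave": 0.032, "compound": 0.029, "morpho": 0.040},
    position_usd=100_000, gas_cost_usd=25.0,
)
print(move)  # morpho
```

The logic is trivial; the hard part is the inputs — which is exactly where the data layer earns its fees.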

This is why Cookie DAO, a decentralized data network indexing AI agent activity across Solana, Base, and BNB Chain, builds on The Graph's infrastructure. The fragmented on-chain actions and social signals generated by thousands of agents need structured data feeds to be useful.

DeFi and RWA: The Data Demands of Tokenized Finance

DeFi's data requirements have matured dramatically. In 2021, a DEX aggregator might query basic token prices and liquidity pool reserves. In 2026, institutional DeFi platforms need:

  • Real-time collateralization ratios for lending protocols
  • Historical volatility data for risk modeling
  • Cross-chain asset pricing with oracle verification
  • Transaction provenance for compliance audits
  • Liquidity depth across multiple venues for trade execution

Tokenized real-world assets add another layer of complexity. When a tokenized U.S. Treasury fund integrates with a DeFi lending protocol (as BlackRock's BUIDL did with Uniswap), the data infrastructure must track:

  • On-chain ownership records
  • Redemption requests and settlement status
  • Regulatory compliance events
  • Yield distribution to token holders
  • Cross-chain bridge activity

The Graph's multi-service architecture addresses this by allowing RWA platforms to use Amp for institutional-grade SQL analytics while simultaneously streaming real-time updates via Tycho for DeFi integrations.

The market opportunity is staggering: Ripple and BCG forecast tokenized RWAs expanding from $0.6 trillion in 2025 to $18.9 trillion by 2033 — a 53% compound annual growth rate. Every dollar tokenized on-chain generates data that needs indexing, querying, and reporting.
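The cited growth rate checks out arithmetically — compounding $0.6 trillion over the eight years from 2025 to 2033 at the implied rate lands on $18.9 trillion:

```python
# Implied CAGR of the Ripple/BCG forecast: $0.6T (2025) -> $18.9T (2033).
start, end, years = 0.6, 18.9, 8
cagr = (end / start) ** (1 / years) - 1
print(round(cagr * 100, 1))  # 53.9 -- consistent with the ~53% figure cited
```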

Network Economics: The Indexer and Delegator Model

The Graph's decentralized architecture relies on economic incentives aligning three stakeholder groups:

Indexers run infrastructure to process and serve queries, earning query fees and indexing rewards in GRT tokens. The number of active indexers increased modestly in Q4 2025, suggesting operators remained committed despite lower near-term profitability from reduced query fees.

Delegators stake GRT tokens with indexers to earn a portion of rewards without running infrastructure themselves. The network's 167,000+ delegators represent distributed economic security that makes data censorship prohibitively expensive.

Curators signal which subgraphs are valuable by staking GRT, earning a portion of query fees when their curated subgraphs are used. This creates a self-organizing quality filter: high-quality subgraphs attract curation, which attracts indexers, which improves query performance.

The Horizon upgrade extends this model to all data services, not just subgraphs. An indexer can now serve Token API queries, stream Tycho liquidity updates, and provide Amp database access — all secured by the same GRT stake.
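The indexer/delegator incentive split can be sketched numerically. This is a simplified illustration of the reward-cut idea, not the protocol's exact mechanics (cooldowns, delegation ratios, and curation shares are omitted), and all stake amounts are invented:

```python
def split_rewards(total_rewards: float, indexer_cut: float,
                  delegations: dict[str, float]):
    """Split an allocation's rewards: the indexer keeps its cut, and the
    remainder is distributed to delegators pro rata by delegated stake."""
    indexer_share = total_rewards * indexer_cut
    delegator_pool = total_rewards - indexer_share
    total_delegated = sum(delegations.values())
    payouts = {
        who: delegator_pool * stake / total_delegated
        for who, stake in delegations.items()
    }
    return indexer_share, payouts

indexer_share, payouts = split_rewards(
    total_rewards=1000.0, indexer_cut=0.10,
    delegations={"alice": 60_000, "bob": 30_000, "carol": 10_000},
)
print(indexer_share, payouts)  # 100.0 {'alice': 540.0, 'bob': 270.0, 'carol': 90.0}
```

Under Horizon, the `total_rewards` input can now aggregate fees from multiple services on a single stake — which is the diversification argument in the paragraph above.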

This multi-service revenue model matters because it diversifies indexer income beyond subgraph queries. If AI agent query volume scales as projected, indexers serving Token API could see significant revenue growth, even if traditional subgraph usage plateaus.

The Institutional Wedge: From DeFi to TradFi

The DTCC pilot program represents something bigger than a single use case. It's proof that major financial institutions — in this case, the organization that settles $2.5 quadrillion in securities transactions annually — will build on public blockchain data infrastructure when it meets regulatory requirements.

Amp's feature set directly targets this segment:

  • Lineage tracking: Every data point traces back to its on-chain source, creating an immutable audit trail.
  • Compliance features: Role-based access controls, data retention policies, and privacy controls meet regulatory standards.
  • On-premises deployment: Regulated entities can run Graph infrastructure inside their security perimeter while still participating in the decentralized network.

The playbook mirrors how enterprise blockchain adoption played out: start with private/permissioned chains, gradually integrate with public chains as compliance frameworks mature. The Graph positions itself as the data layer that works across both environments.

If major banks adopt Amp for tokenized securities settlement, blockchain analytics for AML compliance, or real-time risk monitoring, the query volume could dwarf current DeFi usage. A single large institution running hourly compliance queries across multiple chains generates more sustainable revenue than thousands of individual developers.

The 2026 Inflection Point: Is This The Graph's Year?

The Graph's 2026 roadmap presents a clear thesis: the market currently misprices the network's position in the emerging AI agent economy and institutional blockchain adoption.

The bull case rests on three assumptions:

  1. AI agent query volume scales meaningfully. If the 37% adoption rate among Token API users reflects a broader trend, and autonomous agents become the primary consumers of blockchain data, query fees could surge beyond historical levels.

  2. Horizon's multi-service architecture drives fee revenue growth. By serving developers, agents, and institutions simultaneously, The Graph captures revenue from multiple customer segments instead of relying solely on DeFi developers.

  3. Cross-chain GRT utility via Chainlink CCIP generates sustained demand. As users on Arbitrum, Base, Avalanche, and Solana pay for Graph services using bridged GRT, demand for the token broadens well beyond Ethereum mainnet.

The bear case argues that the infrastructure moat is narrower than it appears. Alternative indexing solutions like Chainstack, BlockXs, and Goldsky offer hosted subgraph services with simpler pricing and faster setup. Centralized API providers like Alchemy and Infura bundle data access with node infrastructure, creating switching costs.

The counterargument: The Graph's decentralized architecture matters precisely because AI agents and institutions cannot rely on centralized data providers. AI agents need censorship resistance to ensure uptime during adversarial conditions. Institutions need verifiable data provenance that centralized APIs cannot provide.

The 50,000+ deployed subgraphs, 167,000+ delegators, and ecosystem integrations with virtually every major DeFi protocol create a network effect that competitors must overcome, not just match.

Why Data Infrastructure Becomes the AI Economy Backbone

The blockchain industry spent 2021-2023 obsessing over execution layers: faster Layer 1s, cheaper Layer 2s, more scalable consensus mechanisms.

The result? Transactions that cost fractions of a penny and settle in milliseconds. The bottleneck shifted.

Execution is solved. Data is the new constraint.

AI agents can execute trades, rebalance portfolios, and settle payments autonomously. What they cannot do is operate without high-quality, indexed, queryable data about on-chain state. The Graph's trillion-query milestone reflects this reality: as blockchain applications grow more sophisticated, data infrastructure becomes more critical than transaction throughput.

This mirrors the evolution of traditional tech infrastructure. Amazon didn't win e-commerce because it had the fastest servers — it won because it built the best data infrastructure for inventory management, personalization, and logistics optimization. Google didn't win search because it had the most storage — it won because it indexed the web better than anyone else.

The Graph is positioning itself as the Google of blockchain data: not the only indexing solution, but the default infrastructure that everything else builds on top of.

Whether that vision materializes depends on execution in the next 12-24 months. If Horizon's multi-service architecture attracts institutional clients, if AI agent query volume justifies the infrastructure investment, and if cross-chain expansion drives sustainable GRT demand, 2026 could be the year The Graph transitions from "important DeFi infrastructure" to "essential backbone of the on-chain economy."

The 1.5 trillion queries are just the beginning.


Building applications that rely on robust blockchain data infrastructure? BlockEden.xyz provides high-performance API access across 40+ chains, complementing decentralized indexing with enterprise-grade reliability for production Web3 applications.