
588 posts tagged with "Blockchain"

General blockchain technology and innovation


Bittensor's DeepSeek Moment: Can TAO Become the Second Pole of Global AI?

· 9 min read
Dora Noda
Software Engineer

When 70 strangers scattered across the world — armed with consumer GPUs and home internet connections — collectively trained a 72-billion-parameter language model that outperformed Meta's LLaMA-2-70B, something shifted in the AI narrative. No corporate whitelist. No $100 million data center. No centralized lab pulling the strings. Just Bittensor's Subnet 3, a cryptoeconomic incentive system, and a technical trick called SparseLoCo that made it all possible.

The AI world spent early 2025 obsessing over DeepSeek's proof that frontier-quality models don't require OpenAI-scale budgets. Bittensor's community calls what happened on March 10, 2026 their own "DeepSeek moment" — evidence that large language models can now emerge entirely outside centralized institutions. The question worth asking: is Bittensor genuinely building the second pole of global AI infrastructure, or is it a compelling story wrapped around an elegant but fragile experiment?

How Celestia's Data Availability Sampling Hits 1 Terabit Per Second: The Technical Deep Dive

· 13 min read
Dora Noda
Software Engineer

On January 13, 2026, Celestia shattered expectations with a single benchmark: 1 terabit per second of data throughput across 498 distributed nodes. For context, that's enough bandwidth to process the entire daily transaction volume of Ethereum's largest Layer 2 rollups—in less than a second.

But the real story isn't the headline number. It's the cryptographic infrastructure that makes it possible: Data Availability Sampling (DAS), a breakthrough that allows resource-constrained light nodes to verify blockchain data availability without downloading entire blocks. As rollups race to scale beyond Ethereum's native blob storage, understanding how Celestia achieves this throughput—and why it matters for rollup economics—has never been more critical.

The Data Availability Bottleneck: Why Rollups Need a Better Solution

Blockchain scalability has long been constrained by a fundamental trade-off: how do you verify that transaction data is actually available without requiring every node to download and store everything? This is the data availability problem, and it's the primary bottleneck for rollup scaling.

Ethereum's approach—requiring every full node to download complete blocks—creates an accessibility barrier. As block sizes grow, fewer participants can afford the bandwidth and storage to run full nodes, threatening decentralization. Rollups posting data to Ethereum L1 face prohibitive costs: at peak demand, a single batch can cost thousands of dollars in gas fees.

Enter modular data availability layers. By separating data availability from execution and consensus, protocols like Celestia, EigenDA, and Avail promise to slash rollup costs while maintaining security guarantees. Celestia's innovation? A sampling technique that inverts the verification model: instead of downloading everything to verify availability, light nodes randomly sample tiny fragments and achieve statistical confidence that the full dataset exists.

Data Availability Sampling Explained: How Light Nodes Verify Without Downloading

At its core, DAS is a probabilistic verification mechanism. Here's how it works:

Random Sampling and Confidence Building

Light nodes don't download entire blocks. Instead, they conduct multiple rounds of random sampling for small portions of block data. Each successful sample increases confidence that the complete block is available.

The math is elegant: if a malicious validator withholds even a small percentage of block data, honest light nodes will detect the unavailability with high probability after just a few sampling rounds. This creates a security model where even resource-limited devices can participate in data availability verification.

Specifically, every light node randomly chooses a set of unique coordinates in the extended data matrix and queries bridge nodes for the corresponding data shares plus Merkle proofs. If the light node receives valid responses for every query, it gains high statistical confidence that the whole block's data is available.
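
To make those statistics concrete, here is a minimal sketch (with illustrative numbers) of how a light node's confidence grows with each sample. It assumes the 2D erasure coding described in the next section, under which an attacker must withhold more than 25% of the extended shares to make a block unrecoverable:

```python
# Sketch: how confidence grows with each uniformly random sample.
# Assumption (from the 2D Reed-Solomon construction): to make a block
# unrecoverable, an attacker must withhold at least (k+1)^2 of the (2k)^2
# extended shares, i.e. just over 25% of them.

def undetected_probability(samples: int, withheld_fraction: float = 0.25) -> float:
    """Chance that every one of `samples` random queries happens to hit an
    available share even though `withheld_fraction` of shares are missing."""
    return (1.0 - withheld_fraction) ** samples

for s in (5, 10, 20, 30):
    p = undetected_probability(s)
    print(f"{s:>2} samples -> confidence {1 - p:.6f} (miss probability {p:.2e})")

# After ~20 samples the chance of missing a withheld block is (0.75)^20 ≈ 0.3%,
# which is why even resource-limited devices can meaningfully verify availability.
```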

2D Reed-Solomon Encoding: The Mathematical Foundation

Celestia employs a 2-dimensional Reed-Solomon encoding scheme to make sampling both efficient and fraud-resistant. Here's the technical flow:

  1. Block data is split into k × k chunks, forming a data square
  2. Reed-Solomon erasure coding extends this to a 2k × 2k matrix (adding redundancy)
  3. Merkle roots are computed for each row and column of the extended matrix
  4. The Merkle root of these roots becomes the block data commitment in the block header

This approach has a critical property: if any portion of the extended matrix is missing, the encoding breaks down, and light nodes will detect inconsistencies when verifying Merkle proofs. An attacker can't withhold data selectively without being caught.
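
A simplified sketch of that commitment construction follows. The Reed-Solomon extension is stubbed out (real parity shares come from polynomial erasure coding, not hashing), and Celestia's production trees are namespaced rather than plain Merkle trees; the point is only the row-root/column-root/data-root layering:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Plain binary Merkle root (stand-in for Celestia's namespaced trees)."""
    nodes = [h(leaf) for leaf in leaves]
    while len(nodes) > 1:
        if len(nodes) % 2:                 # duplicate the last node on odd levels
            nodes.append(nodes[-1])
        nodes = [h(nodes[i] + nodes[i + 1]) for i in range(0, len(nodes), 2)]
    return nodes[0]

def extend_square(square: list[list[bytes]]) -> list[list[bytes]]:
    """Placeholder for extending a k x k data square to 2k x 2k.
    Real Reed-Solomon coding interpolates parity shares; hashing is used here
    only to keep the sketch self-contained."""
    k = len(square)
    return [[square[r][c] if r < k and c < k else h(f"parity:{r}:{c}".encode())
             for c in range(2 * k)] for r in range(2 * k)]

def data_root(square: list[list[bytes]]) -> bytes:
    ext = extend_square(square)
    row_roots = [merkle_root(row) for row in ext]
    col_roots = [merkle_root([ext[r][c] for r in range(len(ext))]) for c in range(len(ext))]
    return merkle_root(row_roots + col_roots)   # the commitment placed in the block header
```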

Namespaced Merkle Trees: Rollup-Specific Data Isolation

Here's where Celestia's architecture shines for multi-rollup environments: Namespaced Merkle Trees (NMTs).

A standard Merkle tree groups data arbitrarily. An NMT, however, tags every node with the minimum and maximum namespace identifiers of its children, and orders leaves by namespace. This enables rollups to:

  • Download only their own data from the DA layer
  • Prove completeness of their namespace's data with a Merkle proof
  • Ignore irrelevant data from other rollups entirely

For a rollup operator, this means you're not paying bandwidth costs to download data from competing chains. You fetch exactly what you need, verify it with cryptographic proofs, and move on. This is a massive efficiency gain compared to monolithic chains where all participants must process all data.
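
A toy sketch of the namespace-range tagging that makes this possible (real NMTs hash the namespace bounds into each node's digest; this only tracks the ranges and uses placeholder hashes):

```python
from dataclasses import dataclass

@dataclass
class NmtNode:
    min_ns: int    # smallest namespace ID under this node
    max_ns: int    # largest namespace ID under this node
    digest: str    # stand-in for the node hash

def parent(left: NmtNode, right: NmtNode) -> NmtNode:
    # Parents inherit their children's namespace range, so a completeness proof
    # for namespace N only needs the subtrees whose range could contain N.
    return NmtNode(
        min_ns=min(left.min_ns, right.min_ns),
        max_ns=max(left.max_ns, right.max_ns),
        digest=f"H({left.digest}|{right.digest})",
    )

# Leaves must be ordered by namespace. A rollup reading namespace 7 can skip any
# subtree where max_ns < 7 or min_ns > 7 and still prove it saw all of its data.
leaves = [NmtNode(ns, ns, f"leaf{i}") for i, ns in enumerate([3, 3, 7, 7, 9, 12])]
```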

The Matcha Upgrade: Scaling to 128MB Blocks

In 2025, Celestia activated the Matcha upgrade, a watershed moment for modular data availability. Here's what changed:

Block Size Expansion

Matcha increases maximum block size from 8MB to 128MB—a 16x capacity boost. This translates to:

  • Data square size: 128 → 512 shares per side
  • Maximum transaction size: 2MB → 8MB
  • Sustained throughput: 21.33 MB/s in testnet (April 2025)

To put this in perspective, Ethereum's target blob count is 6 per block (roughly 0.75 MB), expandable to 9 blobs. Celestia's 128MB blocks dwarf this capacity by over 100x.

High-Throughput Block Propagation

The constraint wasn't just block size—it was block propagation speed. Matcha introduces a new propagation mechanism (CIP-38) that safely disseminates 128MB blocks across the network without causing validator desynchronization.

In testnet, the network sustained 6-second block times with 128MB blocks, achieving 21.33 MB/s throughput. This represents 16x the current mainnet capacity.

Storage Cost Reduction

One of the most overlooked economic changes: Matcha reduced the minimum data pruning window from 30 days to 7 days + 1 hour (CIP-34).

For bridge nodes, this slashes storage requirements from 30TB to 7TB at projected throughput levels. Lower operational costs for infrastructure providers translate to cheaper data availability for rollups.

Token Economics Overhaul

Matcha also improved TIA token economics:

  • Inflation cut: From 5% to 2.5% annually
  • Validator commission increase: Max raised from 10% to 20%
  • Improved collateral properties: Making TIA more suitable for DeFi use cases

Combined, these changes position Celestia for the next phase: scaling toward 1 GB/s throughput and beyond.

Rollup Economics: Why 50% DA Market Share Matters

As of early 2026, Celestia holds approximately 50% of the data availability market, having processed over 160 GB of rollup data. This dominance reflects real-world adoption by rollup developers who prioritize cost and scalability.

Cost Comparison: Celestia vs Ethereum Blobs

Celestia's fee model is straightforward: rollups pay per blob based on size and current gas prices. Unlike execution layers where computation dominates, data availability is fundamentally about bandwidth and storage—resources that scale more predictably with hardware improvements.

For rollup operators, the math is compelling:

  • Ethereum L1 posting: At peak demand, batch submission can cost $1,000–$10,000+ in gas
  • Celestia DA: Sub-dollar costs per batch for equivalent data

This 100x+ cost reduction is why rollups are migrating to modular DA solutions. Cheaper data availability directly translates to lower transaction fees for end users.

The Rollup Incentive Structure

Celestia's economic model aligns incentives:

  1. Rollups pay for blob storage proportional to data size
  2. Validators earn fees for securing the DA layer
  3. Bridge nodes serve data to light nodes and earn service fees
  4. Light nodes sample data for free, contributing to security

This creates a flywheel: as more rollups adopt Celestia, validator revenue increases, attracting more stakers, which strengthens security, which attracts more rollups.

The Competition: EigenDA, Avail, and Ethereum Blobs

Celestia's 50% market share is under siege. Three major competitors are scaling aggressively:

EigenDA: Ethereum-Native Restaking

EigenDA leverages EigenLayer's restaking infrastructure to offer high-throughput data availability for Ethereum rollups. Key advantages:

  • Economic security: Secured by restaked ETH (currently 93.9% of restaking market)
  • Tight Ethereum integration: Native compatibility with Ethereum's blob market
  • High advertised throughput: EigenDA claims the highest throughput among DA providers, though previous versions lacked active economic security

Critics point out that EigenDA's reliance on restaking introduces cascade risk: if an AVS experiences slashing, it could propagate to Lido stETH holders and destabilize the broader LST market.

Avail: Universal DA for All Chains

Unlike Celestia's Cosmos focus and EigenDA's Ethereum orientation, Avail positions itself as a universal DA layer compatible with any blockchain architecture:

  • UTXO, Account, and Object model support: Works with Bitcoin L2s, EVM chains, and Move-based systems
  • Modular design: Separates DA from consensus entirely
  • Cross-ecosystem vision: Aims to serve as the neutral DA layer for all blockchains

Avail's challenge? It's the newest entrant, lagging in live rollup integrations compared to Celestia and EigenDA.

Ethereum Native Blobs: EIP-4844 and Beyond

Ethereum's EIP-4844 (Dencun upgrade) introduced blob-carrying transactions, offering rollups a cheaper data posting alternative to calldata. Current capacity:

  • Target: 6 blobs per block (~0.75 MB)
  • Maximum: 9 blobs per block (~1.125 MB)
  • Future expansion: PeerDAS and zkEVM upgrades targeting 10,000+ TPS

However, Ethereum blobs come with trade-offs:

  • Short retention window: Data is pruned after ~18 days
  • Shared resource contention: All rollups compete for the same blob space
  • Limited scalability: Even with PeerDAS, blob capacity maxes out far below Celestia's roadmap

For rollups prioritizing Ethereum alignment, blobs are attractive. For those needing massive throughput and long-term data retention, Celestia remains the better fit.

Fibre Blockspace: The 1 Terabit Vision

On January 14, 2026, Celestia co-founder Mustafa Al-Bassam unveiled Fibre Blockspace—a new protocol targeting 1 terabit per second of throughput with millisecond latency. This represents a 1,500x improvement over the original roadmap targets from just a year prior.

Benchmark Details

The team achieved the 1 Tbps benchmark using:

  • 498 nodes distributed across North America
  • GCP instances with 48-64 vCPUs and 90-128GB RAM each
  • 34-45 Gbps network links per instance

Under these controlled conditions, the protocol sustained 1 terabit per second data throughput—a staggering leap in blockchain performance.

ZODA Encoding: 881x Faster Than KZG

At Fibre's core is ZODA, a novel encoding protocol that Celestia claims processes data 881x faster than KZG commitment-based alternatives used by EigenDA and Ethereum blobs.

KZG commitments (Kate-Zaverucha-Goldberg polynomial commitments) are cryptographically elegant but computationally expensive. ZODA trades some cryptographic properties for massive speed gains, making terabit-scale throughput achievable on commodity hardware.

The Vision: Every Market Comes Onchain

Al-Bassam's roadmap statement captures Celestia's ambition:

"If 10KB/s enabled AMMs, and 10MB/s enabled onchain orderbooks, then 1 Tbps is the leap that enables every market to come onchain."

The implication: with sufficient data availability bandwidth, financial markets currently dominated by centralized exchanges—spot, derivatives, options, prediction markets—could migrate to transparent, permissionless blockchain infrastructure.

Reality Check: Benchmarks vs. Production

Benchmark conditions rarely match real-world chaos. The 1 Tbps result was achieved in a controlled testnet environment with high-performance cloud instances. The real test comes when:

  • Actual rollups push production workloads
  • Network conditions vary (latency spikes, packet loss, asymmetric bandwidth)
  • Adversarial validators attempt data withholding attacks

Celestia's team acknowledges this: Fibre runs parallel to the existing L1 DA layer, giving users a choice between battle-tested infrastructure and cutting-edge experimental throughput.

What This Means for Rollup Developers

If you're building a rollup, Celestia's DAS architecture offers compelling advantages:

When to Choose Celestia

  • High-throughput applications: Gaming, social networks, micropayments
  • Cost-sensitive use cases: Rollups targeting sub-cent transaction fees
  • Data-intensive workflows: AI inference, decentralized storage integrations
  • Multi-rollup ecosystems: Projects launching multiple specialized rollups

When to Stick with Ethereum Blobs

  • Ethereum alignment: If your rollup values Ethereum's social consensus and security
  • Simplified architecture: Blobs offer tighter integration with Ethereum tooling
  • Lower complexity: Less infrastructure to manage (no separate DA layer)

Integration Considerations

Celestia's DA layer integrates with major rollup frameworks:

  • Polygon CDK: Easily pluggable DA component
  • OP Stack: Custom DA adapters available
  • Arbitrum Orbit: Community-built integrations
  • Rollkit: Native Celestia support

For developers, adopting Celestia often means swapping out the data availability module in your rollup stack—minimal changes to execution or settlement logic.
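
A hypothetical sketch of what that swap-friendly abstraction can look like (the class and method names below are illustrative, not any framework's actual API):

```python
from abc import ABC, abstractmethod

class DALayer(ABC):
    """Minimal interface a rollup node could code against, so Celestia, EigenDA,
    Avail, or Ethereum blobs can be swapped via configuration."""

    @abstractmethod
    def submit(self, batch: bytes) -> str:
        """Post a batch, returning a pointer (height/commitment) for the settlement layer."""

    @abstractmethod
    def retrieve(self, pointer: str) -> bytes:
        """Fetch a batch back and verify it against the DA layer's commitment."""

class CelestiaDA(DALayer):
    def __init__(self, rpc_url: str, namespace: bytes):
        self.rpc_url = rpc_url        # light/bridge node endpoint
        self.namespace = namespace    # this rollup's namespace ID

    def submit(self, batch: bytes) -> str:
        raise NotImplementedError("call your Celestia node's blob submission RPC here")

    def retrieve(self, pointer: str) -> bytes:
        raise NotImplementedError("fetch the blob by height and namespace, then verify proofs")
```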

The Data Availability Wars: What Comes Next

The modular blockchain thesis is being stress-tested in real time. Celestia's 50% market share, EigenDA's restaking momentum, and Avail's universal positioning set up a three-way competition for rollup mindshare.

  1. Throughput escalation: Celestia targets 1 GB/s → 1 Tbps; EigenDA and Avail will respond
  2. Economic security models: Will restaking risks catch up to EigenDA? Can Celestia's validator set scale?
  3. Ethereum blob expansion: PeerDAS and zkEVM upgrades could shift cost dynamics
  4. Cross-chain DA: Avail's universal vision vs. ecosystem-specific solutions

The BlockEden.xyz Angle

For infrastructure providers, supporting multiple DA layers is becoming table stakes. Rollup developers need reliable RPC access not just to Ethereum, but to Celestia, EigenDA, and Avail.

BlockEden.xyz offers high-performance RPC infrastructure for Celestia and 10+ blockchain ecosystems, enabling rollup teams to build on modular stacks without managing node infrastructure. Explore our data availability APIs to accelerate your rollup deployment.

Conclusion: Data Availability as the New Competitive Moat

Celestia's Data Availability Sampling isn't just an incremental improvement—it's a paradigm shift in how blockchains verify state. By enabling light nodes to participate in security through probabilistic sampling, Celestia democratizes verification in a way monolithic chains cannot.

The Matcha upgrade's 128MB blocks and the Fibre vision's 1 Tbps throughput represent inflection points for rollup economics. When data availability costs drop 100x, entirely new application categories become viable: high-frequency trading onchain, real-time multiplayer gaming, AI agent coordination at scale.

But technology alone doesn't determine winners. The DA wars will be decided by three factors:

  1. Rollup adoption: Which chains actually commit to production deployments?
  2. Economic sustainability: Can these protocols maintain low costs as usage scales?
  3. Security resilience: How well do sampling-based systems resist sophisticated attacks?

Celestia's 50% market share and 160 GB of processed rollup data prove the concept works. Now the question shifts from "can modular DA scale?" to "which DA layer will dominate the rollup economy?"

For builders navigating this landscape, the advice is clear: abstract your DA layer. Design rollups to swap between Celestia, EigenDA, Ethereum blobs, and Avail without re-architecting. The data availability wars are just beginning, and the winners may not be who we expect.



Consensus Model Trade-offs for Interoperability: PoW, PoS, DPoS, and BFT in Cross-Chain Bridge Security

· 11 min read
Dora Noda
Software Engineer

Over $2.3 billion was drained from cross-chain bridges in the first half of 2025 alone—already surpassing the full-year total from 2024. While much of the industry conversation focuses on smart contract audits and multisig key management, a quieter but equally critical vulnerability often goes unexamined: the mismatch between how different blockchains reach consensus and how bridges assume they do.

Every cross-chain bridge makes implicit assumptions about finality. When those assumptions collide with the actual consensus model of a source or destination chain, attackers find windows to exploit. Understanding how PoW, PoS, DPoS, and BFT consensus mechanisms differ—and how those differences cascade into bridge design choices and messaging protocol selection—is one of the most important topics in Web3 infrastructure today.

Data Markets Meet AI Training: How Blockchain Solves the $23 Billion Data Pricing Crisis

· 12 min read
Dora Noda
Software Engineer

The AI industry faces a paradox: global data production is exploding from 33 zettabytes to a projected 175 zettabytes by 2025, yet AI model quality stagnates. The problem isn't data scarcity—it's that data providers have no way to capture value from their contributions. Enter blockchain-based data markets like Ocean Protocol, LazAI, and ZENi, which are transforming AI training data from a free resource into a monetizable asset class worth $23.18 billion by 2034.

The $23 Billion Data Pricing Problem

AI training costs surged 89% from 2023 to 2025, with data acquisition and annotation consuming up to 80% of machine learning project budgets. Yet data creators—individuals generating search queries, social media interactions, and behavioral patterns—receive nothing while tech giants harvest billions in value.

The AI training dataset market reveals this disconnect. Valued at $3.59 billion in 2025, the market is projected to hit $23.18 billion by 2034 at a 22.9% CAGR. Another forecast pegs 2026 at $7.48 billion, reaching $52.41 billion by 2035 with 24.16% annual growth.

But who captures this value? Currently, centralized platforms extract profit while data creators get zero compensation. Label noise, inconsistent tagging, and missing context drive costs, yet contributors lack incentives to improve quality. Data privacy concerns impact 28% of companies, limiting dataset accessibility precisely when AI needs diverse, high-quality inputs.

Ocean Protocol: Tokenizing the $100 Million Data Economy

Ocean Protocol addresses ownership by allowing data providers to tokenize datasets and make them available for AI training without relinquishing control. Since launching Ocean Nodes in August 2024, the network has grown to over 1.4 million nodes across 70+ countries, onboarded 35,000+ datasets, and facilitated more than $100 million in AI-related data transactions.

The 2025 product roadmap includes three critical components:

Inference Pipelines enable end-to-end AI model training and deployment directly on Ocean's infrastructure. Data providers tokenize proprietary datasets, set pricing, and earn revenue every time an AI model consumes their data for training or inference.

Ocean Enterprise Onboarding moves ecosystem businesses from pilot to production. Ocean Enterprise v1, launching Q3 2025, delivers a compliant, production-ready data platform targeting institutional clients who need auditable, privacy-preserving data exchanges.

Node Analytics introduces dashboards tracking performance, usage, and ROI. Partners like NetMind contribute 2,000 GPUs while Aethir helps scale Ocean Nodes to support large AI workloads, creating a decentralized compute layer for AI training.

Ocean's revenue-sharing mechanism works through smart contracts: data providers set access terms, AI developers pay per usage, and blockchain automatically distributes payments to all contributors. This transforms data from a one-time sale into a continuous revenue stream tied to model performance.

LazAI: Verifiable AI Interaction Data on Metis

LazAI introduces a fundamentally different approach—monetizing AI interaction data, not just static datasets. Every conversation with LazAI's flagship agents (Lazbubu, SoulTarot) generates Data Anchoring Tokens (DATs), which function as traceable, verifiable records of AI-generated output.

The Alpha Mainnet launched in December 2025 on enterprise-grade infrastructure using QBFT consensus and $METIS-based settlement. DATs tokenize and monetize AI datasets and models as verifiable assets with transparent ownership and revenue attribution.

Why does this matter? Traditional AI training uses static datasets frozen at collection time. LazAI captures dynamic interaction data—user queries, model responses, refinement loops—creating training datasets that reflect real-world usage patterns. This data is exponentially more valuable for fine-tuning models because it contains human feedback signals embedded in conversation flow.

The system includes three key innovations:

Proof-of-Stake Validator Staking secures AI data pipelines. Validators stake tokens to verify data integrity, earning rewards for accurate validation and facing penalties for approving fraudulent data.

DAT Minting with Revenue Sharing allows users who generate valuable interaction data to mint DATs representing their contributions. When AI companies purchase these datasets for model training, revenue flows automatically to all DAT holders based on their proportional contribution.

iDAO Governance establishes decentralized AI collectives where data contributors collectively govern dataset curation, pricing strategies, and quality standards through on-chain voting.

The 2026 roadmap adds ZK-based privacy (users can monetize interaction data without exposing personal information), decentralized computing markets (training happens on distributed infrastructure rather than centralized clouds), and multimodal data evaluation (video, audio, image interactions beyond text).

ZENi: The Intelligence Data Layer for AI Agents

ZENi operates at the intersection of Web3 and AI by powering the "InfoFi Economy"—a decentralized network bridging traditional and blockchain-based commerce through AI-powered intelligence. The company raised $1.5 million in seed funding led by Waterdrip Capital and Mindfulness Capital.

At its core sits the InfoFi Data Layer, a high-throughput behavioral-intelligence engine processing 1 million+ daily signals across X/Twitter, Telegram, Discord, and on-chain activity. ZENi identifies patterns in user behavior, sentiment shifts, and community engagement—data that's critical for training AI agents but difficult to collect at scale.

The platform operates as a three-part system:

AI Data Analytic Agent identifies high-intent audiences and influence clusters by analyzing social graphs, on-chain transactions, and engagement metrics. This creates behavioral datasets showing not just what users do but why they make decisions.

AIGC (AI-Generated Content) Agent crafts personalized campaigns using insights from the data layer. By understanding user preferences and community dynamics, the agent generates content optimized for specific audience segments.

AI Execution Agent activates outreach through the ZENi dApp, closing the loop from data collection to monetization. Users receive compensation when their behavioral data contributes to successful campaigns.

ZENi already serves partners in e-commerce, gaming, and Web3, with 480,000 registered users and 80,000 daily active users. The business model monetizes behavioral intelligence: companies pay to access ZENi's AI-processed datasets, and revenue flows to users whose data powered those insights.

Blockchain's Competitive Advantage in Data Markets

Why does blockchain matter for data monetization? Three technical capabilities make decentralized data markets superior to centralized alternatives:

Granular Revenue Attribution

Smart contracts enable sophisticated revenue-sharing where multiple contributors to an AI model automatically receive proportional compensation based on usage. A single training dataset might aggregate inputs from 10,000 users—blockchain tracks each contribution and distributes micropayments per model inference.

Traditional systems can't handle this complexity. Payment processors charge fees of 2-3% plus fixed per-transaction minimums, unsuitable for micropayments, and centralized platforms lack transparency about who contributed what. Blockchain solves both: near-zero transaction costs via Layer 2 solutions and immutable attribution via on-chain provenance.
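
A minimal sketch of that proportional split, assuming contribution weights are already recorded on-chain (addresses and amounts below are purely illustrative):

```python
def distribute_payment(payment: int, contributions: dict[str, int]) -> dict[str, int]:
    """Split one usage payment across contributors, proportional to each
    recorded contribution weight (bytes of data, number of samples, etc.)."""
    total = sum(contributions.values())
    payouts = {addr: payment * weight // total for addr, weight in contributions.items()}
    # Integer division leaves a remainder ("dust"); a real contract must decide
    # where it goes, e.g. rolled into a protocol treasury.
    return payouts

# Three hypothetical providers splitting a 1,000,000-unit inference payment 60/30/10.
print(distribute_payment(1_000_000, {"0xA11ce": 6_000, "0xB0b": 3_000, "0xCaro1": 1_000}))
# -> {'0xA11ce': 600000, '0xB0b': 300000, '0xCaro1': 100000}
```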

Verifiable Data Provenance

LazAI's Data Anchoring Tokens prove data origin without exposing underlying content. AI companies training models can verify they're using licensed, high-quality data rather than scraped web content of questionable legality.

This addresses a critical risk: data privacy regulations impact 28% of companies, limiting dataset accessibility. Blockchain-based data markets implement privacy-preserving verification—proving data quality and licensing without revealing personal information.

Decentralized AI Training

Ocean Protocol's node network demonstrates how distributed infrastructure reduces costs. Rather than paying cloud providers $2-5 per GPU hour, decentralized networks match unused compute capacity (gaming PCs, data centers with spare capacity) with AI training demand at 50-85% cost reduction.

Blockchain coordinates this complexity through smart contracts governing job allocation, payment distribution, and quality verification. Contributors stake tokens to participate, earning rewards for honest computation and facing slashing penalties for delivering incorrect results.

The Path to $52 Billion: Market Forces Driving Adoption

Three converging trends accelerate blockchain data market growth toward the $52.41 billion 2035 projection:

AI Model Diversification

The era of massive foundation models (GPT-4, Claude, Gemini) trained on all internet text is ending. Specialized models for healthcare, finance, legal services, and vertical applications require domain-specific datasets that centralized platforms don't curate.

Blockchain data markets excel at niche datasets. A medical imaging provider can tokenize radiology scans with diagnostic annotations, set usage terms requiring patient consent, and earn revenue from every AI model trained on their data. This is impossible to implement with centralized platforms that lack granular access control and attribution.

Regulatory Pressure

Data privacy regulations (GDPR, CCPA, China's Personal Information Protection Law) mandate consent-based data collection. Blockchain-based markets implement consent as programmable logic—users cryptographically sign permissions, data can only be accessed under specified terms, and smart contracts enforce compliance automatically.

Ocean Enterprise v1's focus on compliance addresses this directly. Financial institutions and healthcare providers need auditable data lineage proving every dataset used for model training had proper licensing. Blockchain provides immutable audit trails satisfying regulatory requirements.

Quality Over Quantity

Recent research shows AI doesn't need endless training data when systems better resemble biological brains. This shifts incentives from collecting maximum data to curating highest-quality inputs.

Decentralized data markets align incentives properly: data creators earn more for high-quality contributions because models pay premium prices for datasets improving performance. LazAI's interaction data captures human feedback signals (which queries get refined, which responses satisfy users) that static datasets miss—making it inherently more valuable per byte.

Challenges: Privacy, Pricing, and Protocol Wars

Despite momentum, blockchain data markets face structural challenges:

Privacy Paradox

Training AI requires data transparency (models need access to actual content), but privacy regulations demand data minimization. Current solutions like federated learning (training across distributed devices without centralizing the raw data) increase costs 3-5x compared to centralized training.

Zero-knowledge proofs offer a path forward—proving data quality without exposing content—but add computational overhead. LazAI's 2026 ZK roadmap addresses this, though production-ready implementations remain 12-18 months away.

Price Discovery

What's a social media interaction worth? A medical image with diagnostic annotation? Blockchain markets lack established pricing mechanisms for novel data types.

Ocean Protocol's approach—letting providers set prices and market dynamics determine value—works for commoditized datasets but struggles with one-of-a-kind proprietary data. Prediction markets or AI-driven dynamic pricing may solve this, though both introduce oracle dependencies (external price feeds) that undermine decentralization.

Interoperability Fragmentation

Ocean Protocol runs on Ethereum, LazAI on Metis, ZENi integrates with multiple chains. Data tokenized on one platform can't easily transfer to another, fragmenting liquidity.

Cross-chain bridges and universal data standards (like decentralized identifiers for datasets) could solve this, but the ecosystem remains early. With the blockchain AI market valued at $680.89 million in 2025 and projected to reach $4.338 billion by 2034, consolidation around winning protocols is likely years away.

What This Means for Developers

For teams building AI applications, blockchain data markets offer three immediate advantages:

Access to Proprietary Datasets

Ocean Protocol's 35,000+ datasets include proprietary training data unavailable through traditional channels. Medical imaging, financial transactions, behavioral analytics from Web3 applications—specialized datasets that centralized platforms don't curate.

Compliance-Ready Infrastructure

Ocean Enterprise v1's built-in licensing, consent management, and audit trails solve regulatory headaches. Rather than building custom data governance systems, developers inherit compliance by design through smart contracts enforcing data usage terms.

Cost Reduction

Decentralized compute networks undercut cloud providers by 50-85% for batch training workloads. Ocean's partnership with NetMind (2,000 GPUs) and Aethir demonstrates how tokenized GPU marketplaces match supply with demand at lower cost than AWS/GCP/Azure.

BlockEden.xyz provides enterprise-grade RPC infrastructure for blockchain-based AI applications. Whether you're building on Ethereum (Ocean Protocol), Metis (LazAI), or multi-chain platforms, our reliable node services ensure your AI data pipelines remain online and performant. Explore our API marketplace to connect your AI systems with blockchain networks built for scale.

The 2026 Inflection Point

Three catalysts position 2026 as the inflection year for blockchain data markets:

Ocean Enterprise v1 Production Launch (Q3 2025)

The first compliant, institutional-grade data marketplace goes live. If Ocean captures even 5% of the $7.48 billion 2026 AI training dataset market, that's $374 million in data transactions flowing through blockchain-based infrastructure.

LazAI ZK Privacy Implementation (2026)

Zero-knowledge proofs enable users to monetize interaction data without privacy compromise. This unlocks consumer-scale adoption—hundreds of millions of social media users, search engine queries, and e-commerce sessions becoming monetizable through DATs.

Federated Learning Integration

AI federated learning allows model training without centralizing data. Blockchain adds value attribution: rather than Google training models on Android user data without compensation, federated systems running on blockchain distribute revenue to all data contributors.

The convergence means AI training shifts from "collect all data, train centrally, pay nothing" to "train on distributed data, compensate contributors, verify provenance." Blockchain doesn't just enable this transition—it's the only technology stack capable of coordinating millions of data providers with automatic revenue distribution and cryptographic verification.

Conclusion: Data Becomes Programmable

The AI training data market's growth from $3.59 billion in 2025 to $23-52 billion by 2034 represents more than market expansion. It's a fundamental shift in how we value information.

Ocean Protocol proves data can be tokenized, priced, and traded like financial assets while preserving provider control. LazAI demonstrates AI interaction data—previously discarded as ephemeral—becomes valuable training inputs when properly captured and verified. ZENi shows behavioral intelligence can be extracted, processed by AI, and monetized through decentralized markets.

Together, these platforms transform data from raw material extracted by tech giants into a programmable asset class where creators capture value. The global data explosion from 33 to 175 zettabytes matters only if quality beats quantity—and blockchain-based markets align incentives to reward quality contributions.

When data creators earn revenue proportional to their contributions, when AI companies pay fair prices for quality inputs, and when smart contracts automate attribution across millions of participants, we don't just fix the data pricing problem. We build an economy where information has intrinsic value, provenance is verifiable, and contributors finally capture the wealth their data generates.

That's not a market trend. It's a paradigm shift—and it's already live on-chain.

The Rise of Pragmatic Privacy: Balancing Compliance and Confidentiality in Blockchain

· 16 min read
Dora Noda
Software Engineer

The blockchain industry stands at a crossroads where privacy is no longer a binary choice. Throughout crypto's early years, the narrative was clear: absolute privacy at all costs, transparency only when necessary, and resistance to any form of surveillance. But in 2026, a profound shift is underway. The rise of Decentralized Pragmatic AI (DePAI) infrastructure signals a new era where compliance-friendly privacy tools are not just accepted—they're becoming the standard.

This isn't a retreat from privacy principles. It's an evolution toward a more sophisticated understanding: privacy and regulatory compliance can coexist, and in fact, must coexist if blockchain and AI are to achieve institutional adoption at scale.

The End of "Privacy at All Costs"

For years, privacy maximalism dominated blockchain discourse. Projects like Monero and early versions of privacy-focused protocols championed absolute anonymity. The philosophy was straightforward: users deserve complete financial privacy, and any compromise represented a betrayal of crypto's founding principles.

But this absolutist stance created a critical problem. While privacy is essential for protecting honest users from surveillance and front-running, it also became a shield for illicit activity. Regulators worldwide began treating privacy coins with suspicion, leading to delistings from major exchanges and outright bans in several jurisdictions.

As Cointelegraph reports, 2026 is the year pragmatic privacy takes off, with new projects tackling compliant forms of privacy for institutions and growing interest in existing privacy coins like Zcash. The key insight: privacy isn't binary. Neither full transparency nor absolute privacy is workable in the real world, because while privacy is essential for honest users, it can also be used by criminals to evade law enforcement.

People are starting to accept making tradeoffs that curtail privacy in limited contexts to make protocols more threat-resistant. This represents a fundamental shift in the blockchain community's approach to privacy.

Defining Pragmatic Privacy

So what exactly is pragmatic privacy? According to Anaptyss, pragmatic privacy refers to the strategic implementation of privacy measures that protect user and business data without breaching regulatory requirements, ensuring that financial operations are both secure and compliant.

This approach recognizes that different participants in the blockchain ecosystem have different privacy needs:

  • Retail users need protection from mass surveillance and data harvesting
  • Institutional investors require confidentiality to prevent front-running of their trading strategies
  • Enterprises must satisfy strict AML/KYC mandates while protecting sensitive business information
  • AI agents need verifiable computation without exposing proprietary algorithms or training data

The solution lies not in choosing between privacy and compliance, but in building infrastructure that enables both simultaneously.

zkKYC: Privacy-Preserving Identity Verification

One of the most promising developments in pragmatic privacy is the emergence of zero-knowledge Know Your Customer (zkKYC) solutions. Traditional KYC processes require users to repeatedly submit sensitive personal documents to multiple platforms, creating numerous honeypots of personal data vulnerable to breaches.

zkKYC flips this model. As zkMe explains, their zkKYC service combines Zero-Knowledge Proof (ZKP) technology with full FATF compliance. A regulated KYC provider verifies the user off-chain following standard AML and identity verification procedures, but protocols do not collect identity data. Instead, they verify compliance cryptographically.

The mechanism is elegant: smart contracts automatically check a zero-knowledge proof before allowing access to certain services or processing large transactions. Users prove they meet compliance requirements—age, residency, non-sanctioned status—without revealing any actual identity data to the protocol or other users.
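
A hypothetical sketch of such a gate, with the actual proof verification stubbed out (the field names and thresholds below are illustrative, not any provider's real API):

```python
from dataclasses import dataclass

@dataclass
class ComplianceProof:
    proof: bytes          # zero-knowledge proof produced client-side
    issuer: str           # identifier of the regulated KYC provider that attested off-chain
    public_claims: dict   # e.g. {"over_18": True, "allowed_jurisdiction": True, "not_sanctioned": True}

TRUSTED_ISSUERS = {"example-kyc-provider"}   # hypothetical allowlist
LARGE_TRANSFER_THRESHOLD = 10_000            # hypothetical limit in protocol units

def verify_zk_proof(p: ComplianceProof) -> bool:
    """Placeholder: a real contract checks the proof against the issuer's
    verification key. No identity documents ever reach the chain."""
    raise NotImplementedError

def can_process(amount: int, p: ComplianceProof | None) -> bool:
    if amount < LARGE_TRANSFER_THRESHOLD:
        return True                          # small transactions need no gate
    if p is None or p.issuer not in TRUSTED_ISSUERS:
        return False
    return verify_zk_proof(p) and all(p.public_claims.values())
```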

According to Studio AM, this is already happening in some blockchain ecosystems: users prove age or residency with a ZKP before accessing certain decentralized finance (DeFi) services. Major financial institutions are taking notice. Deutsche Bank and Privado ID have conducted proofs of concept demonstrating blockchain-based identity verification using zero-knowledge credentials.

Perhaps most significantly, in July 2025, Google open-sourced its zero-knowledge proof libraries following work with Germany's Sparkasse group, signaling growing institutional investment in privacy-preserving identity infrastructure.

zkTLS: Making the Web Verifiable

While zkKYC addresses identity verification, another technology is solving an equally critical problem: how to bring verifiable Web2 data into blockchain systems without compromising privacy or security. Enter zkTLS (Zero-Knowledge Transport Layer Security).

Traditional TLS—the encryption that secures every HTTPS connection—has a critical limitation: it provides confidentiality but not verifiability. In other words, while TLS ensures that information is encrypted during transmission, it does not create a proof that the encrypted interaction happened in a way that can be independently verified.

zkTLS solves this by integrating Zero-Knowledge Proofs with the TLS encryption system. Using MPC-TLS and zero-knowledge techniques, zkTLS allows a client to produce cryptographically verifiable proofs and attestations of real HTTPS sessions.

As zkPass describes it, zkTLS generates a zero-knowledge proof (e.g., zk-SNARK) confirming that data was fetched from a specific server (identified by its public key and domain) via a legitimate TLS session, without exposing the session key or plaintext data.

The implications are profound. Traditional APIs can be easily disabled or censored, whereas zkTLS ensures that as long as users have an HTTPS connection, they can continue to access their data. This allows virtually any Web2 data to be used on a blockchain in a verifiable and permissionless way.
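
A simplified sketch of what such an attestation carries (field names are illustrative; real systems such as TLSNotary or zkPass define their own formats):

```python
from dataclasses import dataclass

@dataclass
class ZkTlsAttestation:
    server_domain: str         # which HTTPS endpoint was queried, e.g. "api.bank.example"
    server_key_hash: bytes     # binds the proof to the server's TLS certificate key
    request_commitment: bytes  # commitment to the request made, without revealing it
    disclosed_claim: dict      # the fact being proven, e.g. {"balance_at_least": 1_000}
    proof: bytes               # zk proof that the claim follows from the real TLS transcript

def verify_proof(att: ZkTlsAttestation) -> bool:
    raise NotImplementedError("stand-in for the actual zero-knowledge verifier")

def accept(att: ZkTlsAttestation, trusted_domains: set[str]) -> bool:
    """A verifier checks the domain binding and the proof; it never sees the
    session key or the full response body."""
    return att.server_domain in trusted_domains and verify_proof(att)
```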

Recent implementations demonstrate the technology's maturity. Brevis's zkTLS Coprocessor, when fetching data from a web source, proves that the content was retrieved through a genuine TLS session from the authentic domain and that the data hasn't been tampered with.

At FOSDEM 2026, the TLSNotary project presented on liberating user data with zkTLS, demonstrating how users can prove facts about their private data—bank balances, credit scores, transaction histories—without exposing the underlying information.

Verifiable AI Computation: The Missing Piece for Institutional Adoption

Privacy-preserving identity and data verification set the stage, but the most transformative element of DePAI infrastructure is verifiable AI computation. As AI agents become economically active participants in blockchain ecosystems, the question shifts from "Can AI do this?" to "Can you prove the AI did this correctly?"

This verification requirement isn't academic. According to DecentralGPT, as AI becomes part of finance, automation, and agent workflows, performance alone isn't enough. In Web3, the question is also: Can you prove what happened? In late December 2025, Cysic and Inference Labs partnered to build scalable infrastructure for verifiable AI applications, combining decentralized compute with verification frameworks designed for real-world uses.

The institutional imperative for verifiable computation is clear. As noted in analysis by Alexis M. Adams, the transition to deterministic AI infrastructure is the only viable pathway for organizations to meet the multi-jurisdictional demands of the EU AI Act, US state-level frontier laws, and the rising expectations of the cyber insurance market.

The global AI governance market reflects this urgency: valued at approximately $429.8 million in 2026, it's projected to reach $4.2 billion by 2033, according to the same analysis.

But verification faces a critical gap. As Keyrus identifies, AI deployment requires trusting digital identities, but enterprises cannot validate who—or what—is actually operating AI systems. When organizations cannot reliably distinguish legitimate AI agents from adversary-controlled imposters, they cannot confidently grant AI systems access to sensitive data or decision authority.

This is where the convergence of zkKYC, zkTLS, and verifiable computation creates a complete solution. AI agents can prove their identity (zkKYC), prove they retrieved data correctly from authorized sources (zkTLS), and prove they computed results correctly (verifiable computation)—all without exposing sensitive business logic or training data.

The Institutional Push Toward Compliance

These technologies aren't emerging in a vacuum. Institutional demand for compliant privacy infrastructure is accelerating, driven by regulatory pressures and business necessity.

Large financial institutions recognize that without privacy, their blockchain strategies will stall. According to WEEX Crypto News, institutional investors require confidentiality to prevent front-running of their strategies, yet they must satisfy strict AML/KYC mandates. Zero-Knowledge Proofs are gaining traction as a solution, allowing institutions to prove compliance without revealing sensitive underlying data to the public blockchain.

The regulatory landscape of 2026 leaves no room for ambiguity. The EU AI Act reaches general application in 2026, and regulators across jurisdictions expect documented governance programs, not just policies, according to SecurePrivacy.ai. Full enforcement applies to high-risk AI systems used in critical infrastructure, education, employment, essential services, and law enforcement.

In the United States, by the end of 2025, 19 states enforced comprehensive privacy laws, with several new statutes taking effect in 2026, complicating multi-state privacy compliance obligations. Colorado and California have added "neural data" (and Colorado also added "biological data") to "sensitive" data definitions, as reported by Nixon Peabody.

This regulatory convergence creates a powerful incentive: organizations that build on compliant, verifiable infrastructure gain competitive advantage, while those clinging to privacy maximalism find themselves shut out of institutional markets.

Data Integrity as the Operating System for AI

Beyond compliance, verifiable computation enables something more fundamental: data integrity as the operating system for responsible AI.

As Precisely notes, in 2026, governance won't be something organizations layer on after deployment—it will be built into how data is structured, interpreted, and monitored from the start. Data integrity will serve as the operating system for responsible AI. From semantic clarity and explainability to compliance, auditability, and control over AI-generated data, integrity will determine whether AI can scale safely and deliver lasting value.

This shift has profound implications for how AI agents operate on blockchain networks. Rather than opaque black boxes, AI systems become auditable, verifiable, and governable by design. Smart contracts can enforce constraints on AI behavior, verify computational correctness, and create immutable audit trails—all while preserving the privacy of proprietary algorithms and training data.

The MIT Sloan Management Review identifies this as one of five key trends in AI and data science for 2026, noting that trustworthy AI requires verifiable provenance and explainable decision-making processes.

Decentralized Identity: The Foundation Layer

Underlying these technologies is a broader shift toward decentralized identity and Verifiable Credentials. As Indicio explains, decentralized identity changes the equation—instead of verifying personal data in a central location, individuals hold their data and share it with consent that can be independently verified using cryptography.

This model inverts traditional identity systems. Rather than creating numerous copies of identity documents scattered across databases, users maintain a single verifiable credential and selectively disclose only the specific attributes required for each interaction.

For AI agents, this model extends beyond human identity. Agents can possess verifiable credentials attesting to their training provenance, operational parameters, audit history, and authorization scope. This creates a trust framework where agents can interact autonomously while remaining accountable.

From Experimentation to Deployment

The key transformation in 2026 is the transition from theoretical frameworks to production deployments. According to XT Exchange's analysis, by 2026, decentralized AI is moving beyond experimentation and into practical deployment. However, key constraints remain, including scaling AI workloads, preserving data privacy, and governing open AI systems.

These constraints are precisely what DePAI infrastructure addresses. By combining zkKYC for identity, zkTLS for data verification, and verifiable computation for AI operations, the infrastructure creates a complete stack for deploying AI agents that are simultaneously:

  • Privacy-preserving for users and businesses
  • Compliant with regulatory requirements
  • Verifiable and auditable by design
  • Scalable for institutional workloads

The Road Ahead: Building Composable Privacy

The final piece of the DePAI puzzle is composability. As Blockmanity reports, 2026 marks the moment when blockchain becomes "just the plumbing" for AI agents and global finance. The infrastructure must be modular, interoperable, and invisible to end users.

Pragmatic privacy tools excel at composability. An AI agent can:

  1. Authenticate using zkKYC credentials
  2. Fetch verified external data via zkTLS
  3. Perform computations with verifiable inference
  4. Submit results on-chain with zero-knowledge proofs of correctness
  5. Maintain audit trails without exposing sensitive logic

Each layer operates independently, allowing developers to mix and match privacy-preserving technologies based on specific requirements. A DeFi protocol might require zkKYC for user onboarding, zkTLS for fetching price feeds, and verifiable computation for complex financial calculations—all working seamlessly together.

This composability extends across chains. Privacy infrastructure built with interoperability standards can function across Ethereum, Solana, Sui, Aptos, and other blockchain networks, creating a universal layer for compliant, private, verifiable computation.

Why This Matters for Builders

For developers building the next generation of blockchain applications, DePAI infrastructure represents both an opportunity and a requirement.

The opportunity: First-mover advantage in building applications that institutions actually want to use. Financial institutions, healthcare providers, government agencies, and enterprises all need blockchain solutions, but they cannot compromise on compliance or privacy. Applications built on pragmatic privacy infrastructure can serve these markets.

The requirement: Regulatory environments are converging on mandates for verifiable, governable AI systems. Applications that cannot demonstrate compliance, auditability, and user privacy protection will find themselves excluded from regulated markets.

The technical capabilities are maturing rapidly. zkKYC solutions are production-ready with major financial institutions conducting pilots. zkTLS implementations are processing real-world data. Verifiable computation frameworks are scaling to handle institutional workloads.

What's needed now is developer adoption. The transition from experimental privacy tools to production infrastructure requires builders to integrate these technologies into applications, test them in real-world scenarios, and provide feedback to infrastructure teams.

BlockEden.xyz provides enterprise-grade RPC infrastructure for blockchain networks implementing privacy-preserving technologies. Explore our services to build on foundations designed for the DePAI era.

Conclusion: Privacy's Pragmatic Future

The DePAI explosion in 2026 represents more than technological progress. It signals a maturation of blockchain's relationship with privacy, compliance, and institutional adoption.

The industry is moving beyond ideological battles between privacy maximalists and transparency absolutists. Pragmatic privacy acknowledges that different contexts demand different privacy guarantees, and that regulatory compliance and user privacy can coexist through thoughtful cryptographic design.

zkKYC proves identity without exposing it. zkTLS verifies data without trusting intermediaries. Verifiable computation proves correctness without revealing algorithms. Together, these technologies create an infrastructure layer where AI agents can operate autonomously, enterprises can adopt blockchain confidently, and users retain control over their data.

This isn't a compromise on privacy principles. It's a recognition that privacy, to be meaningful, must be sustainable within the regulatory and business realities of global finance. Absolute privacy that gets banned, delisted, and excluded from institutional use doesn't protect anyone. Pragmatic privacy that enables both confidentiality and compliance actually delivers on blockchain's promise.

The builders who recognize this shift and build on DePAI infrastructure today will define the next era of decentralized applications. The tools are ready. The institutional demand is clear. The regulatory environment is crystallizing. 2026 is the year pragmatic privacy goes from theory to deployment—and the blockchain industry will be stronger for it.



DePIN's Enterprise Pivot: From Token Speculation to $166M ARR Reality

· 13 min read
Dora Noda
Software Engineer

When the World Economic Forum projects a sector will grow from $19 billion to $3.5 trillion by 2028, you should pay attention. When that same sector generates $166 million in annual recurring revenue from real enterprise customers—not token emissions—it's time to stop dismissing it as crypto hype.

Decentralized Physical Infrastructure Networks (DePIN) have quietly undergone a fundamental transformation. While speculators chase memecoins, a handful of DePIN projects are building billion-dollar businesses by delivering what centralized cloud providers cannot: 60-80% cost savings with production-grade reliability. The shift from tokenomics theater to enterprise infrastructure is rewriting blockchain's value proposition—and traditional cloud giants are taking notice.

The $3.5 Trillion Opportunity Hidden in Plain Sight

The numbers tell a story that most crypto investors have missed. The DePIN ecosystem expanded from $5.2 billion in market cap (September 2024) to $19.2 billion by September 2025—a 269% surge that barely made headlines in an industry obsessed with layer-1 narratives. Nearly 250 tracked projects now span six verticals: compute, storage, wireless, energy, sensors, and bandwidth.

But market cap is a distraction. The real story is revenue density. DePIN projects now generate an estimated $72 million in annual on-chain revenue across the sector, trading at 10-25x revenue multiples—a dramatic compression from the 1,000x+ valuations of the 2021 cycle. This isn't just valuation discipline; it's evidence of fundamental business model maturation.

The World Economic Forum's $3.5 trillion projection for 2028 isn't based on token price dreams. It reflects the convergence of three massive infrastructure shifts:

  1. AI compute demand explosion: Machine learning workloads are projected to consume 24% of U.S. electricity by 2030, creating insatiable demand for distributed GPU networks.
  2. 5G/6G buildout economics: Telecom operators need to deploy edge infrastructure at 10x the density of 4G networks, but at lower capital expenditure per site.
  3. Cloud cost rebellion: Enterprises are finally questioning why AWS, Azure, and Google Cloud impose 30-70% markups on commodity compute and storage.

DePIN isn't replacing centralized infrastructure tomorrow. But when Aethir delivers 1.5 billion compute hours to 150+ enterprise clients, and Helium signs partnerships with T-Mobile, AT&T, and Telefónica, the "experimental technology" narrative collapses.

From Airdrops to Annual Recurring Revenue

The DePIN sector's transformation is best understood through the lens of actual businesses generating eight-figure revenue, not token inflation schemes masquerading as economic activity.

Aethir: The GPU Powerhouse

Aethir isn't just the largest DePIN revenue generator; it's rewriting the economics of cloud computing. It reached $166 million ARR by Q3 2025, derived from 150+ paying enterprise customers across AI training, inference, gaming, and Web3 infrastructure. This isn't theoretical throughput; it's billing from customers like AI model training operations, gaming studios, and AI agent platforms that require guaranteed compute availability.

The scale is staggering: 440,000+ GPU containers deployed across 94 countries, delivering over 1.5 billion compute hours. For context, Aethir generates more revenue than Filecoin (135x larger by market cap), Render (455x), and Bittensor (14x) combined, a gap that stands out when measured by revenue-to-market-cap efficiency.

Aethir's enterprise strategy reveals why DePIN can win against centralized clouds: 70% cost reduction versus AWS while maintaining SLA guarantees that would make traditional infrastructure providers jealous. By aggregating idle GPUs from data centers, gaming cafes, and enterprise hardware, Aethir creates a supply-side marketplace that undercuts hyperscalers on price while matching them on performance.

Q1 2026 targets are even more ambitious: doubling the global compute footprint to capture accelerating AI infrastructure demand. Partnerships with Filecoin Foundation (for perpetual storage integration) and major cloud gaming platforms position Aethir as the first DePIN project to achieve true enterprise stickiness—recurring contracts, not one-time protocol interactions.

Grass: The Data Scraping Network

While Aethir monetizes compute, Grass proves DePIN's flexibility across infrastructure categories. $33 million ARR from a fundamentally different value proposition: decentralized web scraping and data collection for AI training pipelines.

Grass turned consumer bandwidth into a tradeable commodity. Users install a lightweight client that routes AI training data requests through their residential IP addresses, solving the "anti-bot detection" problem that plagues centralized scraping services. AI companies pay premium rates to access clean, geographically diverse training data without triggering rate limits or CAPTCHA walls.

The economics work because Grass captures margin that would otherwise flow to proxy service providers (Bright Data, Smartproxy) while offering better coverage. For users, it's passive income from unutilized bandwidth. For AI labs, it's reliable access to web-scale data at 50-60% cost savings.
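To make the mechanism concrete, the sketch below shows the general pattern of routing a request through a residential proxy gateway in Python. The gateway URL and credentials are hypothetical placeholders, and this is not Grass's actual client or API; it only illustrates why residential exit IPs sidestep datacenter-IP blocking.

```python
import requests

# Hypothetical residential proxy gateway; real networks publish their own
# endpoints and authentication. This only illustrates the general pattern.
PROXY_GATEWAY = "http://user:pass@residential-gateway.example.com:8000"

def fetch_via_residential_ip(url: str) -> str:
    """Route a scraping request through a residential exit IP so the target
    site sees ordinary consumer traffic instead of a datacenter address."""
    proxies = {"http": PROXY_GATEWAY, "https": PROXY_GATEWAY}
    response = requests.get(url, proxies=proxies, timeout=30)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    html = fetch_via_residential_ip("https://example.com")
    print(len(html), "bytes fetched")
```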

Bittensor: Decentralized Intelligence Markets

Bittensor's approach differs fundamentally from infrastructure-as-a-service models. Instead of selling compute or bandwidth, it monetizes AI model outputs through a marketplace of specialized "subnets"—each focused on specific machine learning tasks like image generation, text completion, or predictive analytics.

By September 2025, over 128 active subnets collectively generate approximately $20 million in annual revenue, with the leading inference-as-a-service subnet projected to hit $10.4 million individually. Developers access Bittensor-powered models through OpenAI-compatible APIs, abstracting away the decentralized infrastructure while delivering cost-competitive inference.
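Because the interface is OpenAI-compatible, calling a subnet-backed model looks like any other chat-completion request: swap the client's base URL for the subnet gateway's endpoint. The snippet below is a minimal sketch; the gateway URL, API key, and model name are assumptions, so check the specific subnet's documentation for real values.

```python
from openai import OpenAI

# Hypothetical values: real subnet gateways publish their own base URL,
# authentication scheme, and model identifiers.
client = OpenAI(
    base_url="https://api.example-subnet-gateway.xyz/v1",
    api_key="YOUR_GATEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="example-subnet-model",
    messages=[{"role": "user", "content": "Summarize what a Bittensor subnet is."}],
)
print(response.choices[0].message.content)
```

The point of the compatibility layer is that switching from a centralized provider to a decentralized backend is, in principle, a one-line change to the base URL.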

Institutional validation arrived with Grayscale's Bittensor Trust (GTAO) in December 2025, followed by public companies like xTAO and TAO Synergies accumulating over 70,000 TAO tokens (~$26 million). Custody providers including BitGo, Copper, and Crypto.com integrated Bittensor through Yuma's validator, signaling that DePIN is no longer too "exotic" for traditional finance infrastructure.

Render Network: From 3D Rendering to Enterprise AI

Render's trajectory shows how DePIN projects evolve beyond initial use cases. Originally focused on distributed 3D rendering for artists and studios, Render pivoted toward AI compute as demand shifted.

July 2025 metrics: 1.49 million frames rendered, $207,900 in USDC fees burned—with 35% of all-time frames rendered in 2025 alone, demonstrating accelerating adoption. Q4 2025 brought enterprise GPU onboarding through RNP-021, integrating NVIDIA H200 and AMD MI300X chips to serve AI inference and training workloads alongside rendering tasks.

Render's economic model burns fee revenue (207,900 USDC in a single month), creating deflationary tokenomics that contrast sharply with inflationary DePIN projects. As enterprise GPU onboarding scales, Render positions itself as the premium-tier option: higher performance, audited hardware, curated supply—targeting enterprises that need guaranteed compute SLAs, not hobbyist node operators.

Helium: Telecom's Decentralized Disruption

Helium's wireless networks prove DePIN can infiltrate trillion-dollar incumbent industries. Partnerships with T-Mobile, AT&T, and Telefónica aren't pilot programs—they're production deployments where Helium's decentralized hotspots augment macro cell coverage in hard-to-reach areas.

The economics are compelling for telecom operators: Helium's community-deployed hotspots cost a fraction of traditional cell tower buildouts, solving the "last-mile coverage" problem without capital-intensive infrastructure investments. For hotspot operators, it's recurring revenue from real data usage, not token speculation.

Messari's Q3 2025 State of Helium report highlights sustained network growth and data transfer volume, with the blockchain-in-telecom sector projected to grow from $1.07 billion (2024) to $7.25 billion by 2030. Helium is capturing meaningful market share in a segment that traditionally resisted disruption.

The 60-80% Cost Advantage: Economics That Force Adoption

DePIN's value proposition isn't ideological decentralization—it's brutal cost efficiency. When Fluence Network claims 60-80% savings versus centralized clouds, the comparison is apples to apples: equivalent compute capacity, SLA guarantees, and availability zones.

The cost advantage stems from structural differences:

  1. Elimination of platform margin: AWS, Azure, and Google Cloud impose 30-70% markups on underlying infrastructure costs. DePIN protocols replace these markups with algorithmic matching and transparent fee structures.

  2. Utilization of stranded capacity: Centralized clouds must provision for peak demand, leaving capacity idle during off-hours. DePIN aggregates globally distributed resources that operate at higher average utilization rates.

  3. Geographic arbitrage: DePIN networks tap into regions with lower energy costs and underutilized hardware, routing workloads dynamically to optimize price-performance ratios.

  4. Open market competition: Fluence's protocol, for example, fosters competition among independent compute providers, driving prices down without requiring multi-year reserved instance commitments.

Traditional cloud providers offer comparable discounts—AWS Reserved Instances save up to 72%, Azure Reserved VM Instances hit 72%, Azure Hybrid Benefit reaches 85%—but these require 1-3 year commitments with upfront payment. DePIN delivers similar savings on-demand, with spot pricing that adjusts in real-time.

For enterprises managing variable workloads (AI model experimentation, rendering farms, scientific computing), the flexibility is game-changing. Launch 10,000 GPUs for a weekend, pay spot rates 70% below AWS, and shut down infrastructure Monday morning—no capacity planning, no wasted reserved capacity.
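Here is that weekend burst as back-of-the-envelope arithmetic. The hourly rates are illustrative assumptions rather than quotes from AWS or any DePIN provider; only the 70% discount mirrors the figure cited above.

```python
# Rough weekend-burst cost comparison. All rates are illustrative assumptions,
# not published prices from AWS or any DePIN network.
gpus = 10_000
hours = 48  # one weekend

on_demand_rate = 2.50                     # $/GPU-hour, assumed centralized on-demand price
depin_spot_rate = on_demand_rate * 0.30   # assumed 70% discount on DePIN spot pricing

on_demand_cost = gpus * hours * on_demand_rate
depin_cost = gpus * hours * depin_spot_rate

print(f"Centralized on-demand: ${on_demand_cost:,.0f}")
print(f"DePIN spot:            ${depin_cost:,.0f}")
print(f"Savings:               ${on_demand_cost - depin_cost:,.0f} "
      f"({1 - depin_cost / on_demand_cost:.0%})")
```

At these assumed rates the weekend run drops from $1.2 million to $360,000, with no reserved-capacity commitment left idle afterward.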

Institutional Capital Follows Real Revenue

The shift from retail speculation to institutional allocation is quantifiable. DePIN startups raised approximately $1 billion in 2025, with $744 million invested across 165+ projects between January 2024 and July 2025 (plus 89+ undisclosed deals). This isn't dumb money chasing airdrops—it's calculated deployment from infrastructure-focused VCs.

Two funds signal institutional seriousness:

  • Borderless Capital's $100M DePIN Fund III (September 2024): Backed by peaq, Solana Foundation, Jump Crypto, and IoTeX, targeting projects with demonstrated product-market fit and revenue traction.

  • Entrée Capital's $300M Fund (December 2025): Explicitly focused on AI agents and DePIN infrastructure at pre-seed through Series A, betting on the convergence of autonomous systems and decentralized infrastructure.

Importantly, these aren't crypto-native funds hedging into infrastructure; they're traditional infrastructure investors recognizing that DePIN offers superior risk-adjusted returns compared to centralized cloud competitors. When an early-stage project like Aethir trades at 15x revenue while hyperscalers command roughly 10x revenue on the strength of monopolistic moats that are already fully priced in, the asymmetry in potential upside becomes obvious.

Newer DePIN projects are also learning from 2021's tokenomics mistakes. Protocols launched in the past 12 months achieved average fully diluted valuations of $760 million—nearly double the valuations of projects launched two years ago—because they've avoided the emission death spirals that plagued early networks. Tighter token supply, revenue-based unlocks, and burn mechanisms create sustainable economics that attract long-term capital.

From Speculation to Infrastructure: What Changes Now

January 2026 marked a turning point: DePIN sector revenue hit $150 million in a single month, driven by enterprise demand for computing power, mapping data, and wireless bandwidth. This wasn't a token price pump—it was billed usage from customers solving real problems.

The implications cascade across the crypto ecosystem:

For developers: DePIN infrastructure finally offers production-grade alternatives to AWS. Aethir's 440,000 GPUs can train LLMs, Filecoin can store petabytes of data with cryptographic verification, Helium can deliver IoT connectivity without AT&T contracts. The blockchain stack is complete.

For enterprises: Cost optimization is no longer a choice between performance and price. DePIN delivers both, with transparent pricing, no vendor lock-in, and geographic flexibility that centralized clouds can't match. CFOs will notice.

For investors: Revenue multiples are compressing toward tech sector norms (10-25x), creating entry points that were impossible during 2021's speculative mania. Aethir at 15x revenue is cheaper than most SaaS companies, with faster growth rates.

For tokenomics: Projects that generate real revenue can burn tokens (Render), distribute protocol fees (Bittensor), or fund ecosystem growth (Helium) without relying on inflationary emissions. Sustainable economic loops replace Ponzi reflexivity.

The World Economic Forum's $3.5 trillion projection suddenly seems conservative. If DePIN captures just 10% of cloud infrastructure spending by 2028 (~$60 billion annually at current cloud growth rates), and projects trade at 15x revenue, you're looking at $900 billion in sector market cap—46x from today's $19.2 billion base.
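For anyone who wants to check the arithmetic, the sketch below simply restates the assumptions in the paragraph above (10% capture of roughly $600 billion in annual cloud spend, a 15x revenue multiple) and confirms the implied upside.

```python
# Restating the projection above; every input is the article's assumption,
# not an independent forecast.
cloud_spend_2028 = 600e9       # assumed annual cloud infrastructure spend
depin_capture = 0.10           # assumed DePIN share (10%)
revenue_multiple = 15          # assumed sector revenue multiple
current_market_cap = 19.2e9    # DePIN market cap cited earlier

depin_revenue = cloud_spend_2028 * depin_capture        # ~$60B annually
implied_market_cap = depin_revenue * revenue_multiple   # ~$900B
upside = implied_market_cap / current_market_cap        # roughly 46-47x

print(f"Implied sector market cap: ${implied_market_cap/1e9:,.0f}B "
      f"(~{upside:.0f}x today's base)")
```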

What BlockEden.xyz Builders Should Know

The DePIN revolution isn't happening in isolation—it's creating infrastructure dependencies that Web3 developers will increasingly rely on. When you're building on Sui, Aptos, or Ethereum, your dApp's off-chain compute requirements (AI inference, data indexing, IPFS storage) will increasingly route through DePIN providers instead of AWS.

Why it matters: Cost efficiency. If your dApp serves AI-generated content (NFT creation, game assets, trading signals), running inference through Bittensor or Aethir could cut your AWS bill by 70%. For projects operating on tight margins, that's the difference between sustainability and burn rate death.

BlockEden.xyz provides enterprise-grade API infrastructure for Sui, Aptos, Ethereum, and 15+ blockchain networks. As DePIN protocols mature into production-ready infrastructure, our multichain approach ensures developers can integrate decentralized compute, storage, and bandwidth alongside reliable RPC access. Explore our API marketplace to build on foundations designed to last.

The Enterprise Pivot Is Already Complete

DePIN isn't coming—it's here. When Aethir generates $166 million ARR from 150 enterprise customers, when Helium partners with T-Mobile and AT&T, when Bittensor serves AI inference through OpenAI-compatible APIs, the "experimental technology" label no longer applies.

The sector has crossed the chasm from crypto-native adoption to enterprise validation. Institutional capital is no longer funding potential—it's funding proven revenue models with cost structures that centralized competitors can't match.

For blockchain infrastructure, the implications are profound. DePIN proves that decentralization isn't just an ideological preference—it's a competitive advantage. When you can deliver 70% cost savings with SLA guarantees, you don't need to convince enterprises about the philosophy of Web3. You just need to show them the invoice.

The $3.5 trillion opportunity isn't a prediction. It's math. And the projects building real businesses—not token casinos—are positioning themselves to capture it.



Beyond Monolithic vs. Modular: How LayerZero's Zero Network Rewrites the Blockchain Scaling Playbook

· 9 min read
Dora Noda
Software Engineer

Every blockchain that has ever achieved scale has done so by making every validator repeat the same work. That single design choice — call it the replication requirement — has capped throughput for decades. LayerZero's Zero Network proposes to eliminate it entirely, and the institutional partners signing on suggest the industry may be taking that claim seriously.

InfoFi's $381M Market Decoded: How Four Verticals Are Turning Information Into Tradeable Assets

· 11 min read
Dora Noda
Software Engineer

What if your ability to spot an emerging crypto trend before the crowd was worth money? Not in a vague "knowledge is power" sense, but literally — with a token price attached to your insight and a market ready to bid on it?

That's the promise of Information Finance, or InfoFi. Coined as a concept by Vitalik Buterin in his November 2024 essay "From prediction markets to info finance," InfoFi describes a class of protocols that use financial mechanisms to extract, aggregate, and price information as a public good. By early 2025, the sector had grown to a $381 million market cap. By late 2025, it had become one of the most hotly contested battlegrounds in Web3.

But InfoFi is not one thing. Beneath the umbrella term live four distinct verticals, each with its own mechanics, power players, and competitive dynamics. Understanding where each vertical stands — and where the lines blur — is essential for anyone trying to navigate this space intelligently.

The Layer 2 Paradox: How $0.001 Fees Are Breaking Ethereum's Scaling Business Model

· 11 min read
Dora Noda
Software Engineer

Ethereum's Layer 2 networks have accomplished something extraordinary in 2025: they've reduced transaction costs by over 90%, making blockchain interactions nearly free. But this triumph of engineering has created an unexpected crisis—the very business model that funds these networks is collapsing beneath the weight of its own success.

As transaction fees plummet toward $0.001 per operation, Layer 2 operators face a stark question: how do you sustain a billion-dollar infrastructure when your primary revenue stream is evaporating?

The Great Fee Collapse of 2025

The numbers tell a dramatic story. Between January 2025 and January 2026, average gas prices on Ethereum Layer 2 networks plummeted from 7.141 gwei to approximately 0.50 gwei—a staggering 93% reduction. Today, transactions on Base average $0.01, while Arbitrum and Optimism hover around $0.15-0.20, with many operations now costing mere fractions of a cent.

The catalyst? EIP-4844, Ethereum's Dencun upgrade launched in March 2024, which introduced "blobs"—temporary data packets that Layer 2 networks can use for cost-effective settlement. Unlike traditional calldata stored permanently on Ethereum, blobs remain available for approximately 18 days, enabling them to be priced dramatically lower.

The impact was immediate and devastating to the traditional revenue model. Optimism, Arbitrum, and Base all experienced 90-99% fee reductions for many transaction types. Median blob fees dropped to as low as $0.0000000005, making user interactions almost negligibly cheap. Over 950,000 blobs have been posted to Ethereum since EIP-4844's launch, fundamentally reshaping the economics of Layer 2 operations.
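Those near-zero figures come from EIP-4844's exponential blob fee market: the blob base fee floats on "excess blob gas," the cumulative amount by which blob demand has exceeded the per-block target. The sketch below follows the formulas in the EIP-4844 specification with its original parameter values; later upgrades that raise the blob target adjust these constants.

```python
# Blob base fee calculation as specified in EIP-4844. Constants are the EIP's
# original values; upgrades that raise the blob target change the target and
# update fraction.
MIN_BLOB_BASE_FEE = 1                          # wei
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477
GAS_PER_BLOB = 2**17                           # 131,072 blob gas per blob
TARGET_BLOB_GAS_PER_BLOCK = 3 * GAS_PER_BLOB   # original 3-blob target

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer approximation of factor * e**(numerator / denominator)."""
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (denominator * i)
        i += 1
    return output // denominator

def next_excess_blob_gas(parent_excess: int, parent_blob_gas_used: int) -> int:
    """Excess blob gas carried into the next block: demand above target accumulates."""
    if parent_excess + parent_blob_gas_used < TARGET_BLOB_GAS_PER_BLOCK:
        return 0
    return parent_excess + parent_blob_gas_used - TARGET_BLOB_GAS_PER_BLOCK

def blob_base_fee(excess_blob_gas: int) -> int:
    """Base fee per unit of blob gas, in wei."""
    return fake_exponential(MIN_BLOB_BASE_FEE, excess_blob_gas,
                            BLOB_BASE_FEE_UPDATE_FRACTION)

# With no sustained excess demand the fee sits at its 1-wei floor, which is
# why blob costs collapsed to tiny fractions of a cent.
print(blob_base_fee(0))            # 1 wei
print(blob_base_fee(10_000_000))   # rises exponentially once demand exceeds target
```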

For users and developers, this is paradise. For Layer 2 operators counting on sequencer revenue, it's an existential threat.

Sequencer Revenue: The Endangered Revenue Stream

Traditionally, Layer 2 networks have made money through a straightforward model: they collect fees from users for processing transactions, then pay a portion of those fees to Ethereum for data availability and settlement. The difference between what they collect and what they pay becomes their profit—sequencer revenue.

This model worked brilliantly when Layer 2 fees were substantial. But with transaction costs approaching zero, the margin has become razor-thin.
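In accounting terms, the model reduces to a one-line margin calculation: fees collected from users minus what the sequencer pays Ethereum for data availability and settlement, minus operating costs. The numbers in the sketch below are invented purely to show how fast the margin compresses when per-transaction fees fall; they are not figures from any specific network.

```python
def daily_sequencer_margin(txs_per_day: int,
                           avg_fee_usd: float,
                           l1_data_cost_usd: float,
                           fixed_ops_cost_usd: float = 0.0) -> float:
    """Sequencer profit = user fees collected - L1 data/settlement cost - ops."""
    revenue = txs_per_day * avg_fee_usd
    return revenue - l1_data_cost_usd - fixed_ops_cost_usd

# Illustrative only: the same transaction volume at $0.20 vs $0.01 fees,
# with blob posting assumed far cheaper than pre-4844 calldata.
high_fee_era = daily_sequencer_margin(2_000_000, 0.20, l1_data_cost_usd=50_000)
post_blob_era = daily_sequencer_margin(2_000_000, 0.01, l1_data_cost_usd=2_000)

print(f"Pre-blob margin:  ${high_fee_era:,.0f}/day")   # $350,000
print(f"Post-blob margin: ${post_blob_era:,.0f}/day")  # $18,000
```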

The economics reveal the challenge starkly. Base, despite leading the pack, averages only $185,291 in daily revenue over the past 180 days. Arbitrum pulls in approximately $55,025 per day. These numbers, while not insignificant, must support extensive infrastructure, development teams, and ongoing operations for networks processing hundreds of thousands of transactions daily.

The situation becomes more precarious when examining annual gross profits. Base leads with nearly $30 million for the year, while both Arbitrum and Optimism have grossed around $9.5 million each. These figures must sustain networks that collectively process 60-70% of Ethereum's total transaction volume—a massive operational burden for relatively modest returns.

The fundamental tension is clear: Layer 2 networks must find a niche that justifies their existence off Ethereum mainnet and generate sufficient revenue to sustain themselves. As one industry analysis noted, "profitability lies in the difference between what L2s earn from users and what they pay to Ethereum"—but that difference is shrinking daily.

The MEV Divergence: Different Paths to Value Capture

Facing the sequencer revenue squeeze, Layer 2 networks are exploring Maximal Extractable Value (MEV) as an alternative revenue source. But their approaches differ dramatically, creating distinct competitive advantages and challenges.

Arbitrum's Fair Ordering Philosophy

Arbitrum employs a First-Come First-Serve (FCFS) ordering system designed to reduce user harm from MEV extraction. This philosophy prioritizes user experience over revenue maximization, resulting in significantly lower MEV activity—only 7% of on-chain gas usage compared to over 50% on competing networks.

However, Arbitrum isn't abandoning MEV entirely. The network is exploring future decentralized sequencer implementations that might introduce auctions for MEV opportunities, potentially returning some value to users or the protocol treasury. This represents a middle path: preserving fairness while still capturing economic value.

Base and Optimism's Auction Approach

In contrast, Base and Optimism utilize Priority Gas Auctions (PGA), where users can bid higher fees for transaction priority. This design inherently enables more MEV activity—Optimistic MEV accounts for 51-55% of total on-chain gas usage on these networks.

The catch? Success rates for actual arbitrage remain exceedingly low on OP-Stack rollups, hovering around 1%—far lower than on Arbitrum. The majority of gas is spent on "interaction probes"—on-chain computations searching for arbitrage opportunities that rarely materialize. This creates a peculiar situation where MEV activity consumes resources without generating proportional value.

Despite lower success rates, the sheer volume of MEV-related activity on Base contributes to its revenue leadership. The network processes over 1,000 transactions per second at minimal cost, turning volume into a competitive advantage.
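A toy example makes the two ordering policies concrete. The code below is not either network's sequencer implementation; it simply contrasts first-come-first-serve ordering with priority-fee ordering over the same three pending transactions.

```python
from dataclasses import dataclass

@dataclass
class Tx:
    sender: str
    arrival_ms: int          # when the sequencer received the transaction
    priority_fee_gwei: float

pending = [
    Tx("alice", arrival_ms=10, priority_fee_gwei=0.1),
    Tx("searcher", arrival_ms=12, priority_fee_gwei=50.0),  # MEV bot bidding high
    Tx("bob", arrival_ms=11, priority_fee_gwei=0.1),
]

# Arbitrum-style FCFS: order purely by arrival time, so bidding more for
# placement gains nothing and MEV auctions are blunted.
fcfs_order = sorted(pending, key=lambda tx: tx.arrival_ms)

# Priority-gas-auction style: highest priority fee first, so searchers can
# buy placement ahead of ordinary users.
pga_order = sorted(pending, key=lambda tx: tx.priority_fee_gwei, reverse=True)

print("FCFS:", [tx.sender for tx in fcfs_order])   # ['alice', 'bob', 'searcher']
print("PGA: ", [tx.sender for tx in pga_order])    # ['searcher', 'alice', 'bob']
```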

Alternative Revenue Models: Beyond Transaction Fees

As traditional sequencer revenue proves insufficient, Layer 2 networks are pioneering alternative business models that could reshape blockchain infrastructure economics.

The Licensing Divergence

Arbitrum and Optimism have taken dramatically different approaches to monetizing their technology stacks.

Arbitrum's Orbit Revenue Share: Arbitrum adopts a "community source code" model, requiring chains built on its Orbit framework to contribute 10% of protocol revenue if they settle outside the Arbitrum ecosystem. This creates a royalty-like structure that generates income even when chains don't directly use Arbitrum for settlement.

Optimism's Open Source Gambit: Optimism's OP Stack is completely open source under the MIT license, allowing anyone to obtain the code, modify it freely, and build custom Layer 2 chains with no royalties or upfront fees. Revenue sharing only activates when a chain joins Optimism's official ecosystem, the "Superchain."

This creates an interesting dynamic: Optimism is betting on ecosystem growth and voluntary participation, while Arbitrum enforces economic alignment through licensing requirements. Time will tell which approach better balances growth with sustainability.

Enterprise Rollups and Professional Services

Perhaps the most promising alternative emerged in 2025: the rise of the "enterprise rollup." Major institutions are launching custom Layer 2 networks, and they're willing to pay for professional deployment, maintenance, and support services.

This mirrors traditional open-source business models—the code is free, but operational expertise commands premium pricing. Optimism's recently launched OP Enterprise exemplifies this approach, offering white-glove service to institutions building customized blockchain infrastructure.

The value proposition is compelling for enterprises. They gain access to the liquidity and network effects of the Ethereum economy while maintaining customized security, privacy, and compliance capabilities. As one industry report notes, "institutions can have their own customized institutional L2 which plugs into the liquidity and network effects of the Ethereum economy."

Layer 3s and App-Specific Chains

High-performance DeFi protocols increasingly demand capabilities that generic Layer 2 networks can't efficiently provide: predictable execution, flexible liquidation logic, granular control over transaction ordering, and the ability to capture MEV internally.

Enter Layer 3s and app-specific chains built on frameworks like Arbitrum Orbit. These specialized networks allow protocols to internalize MEV, customize economics, and optimize for specific use cases. For Layer 2 operators, providing the infrastructure and tooling for these specialized chains represents a new revenue stream that doesn't depend on low-margin transaction processing.

The strategic insight is clear: Layer 2 networks win by distributing their infrastructure outward and partnering with large platforms, not by competing solely on transaction costs.

The Sustainability Question: Can L2s Survive the Fee War?

The fundamental tension facing Layer 2 networks in 2026 is whether any combination of alternative revenue models can compensate for vanishing transaction fees.

Consider the math: if transaction fees continue trending toward $0.001 and blob costs remain near zero, even processing millions of transactions daily generates minimal revenue. Base, despite its volume leadership, must find additional revenue sources to justify ongoing operations at scale.

The situation is complicated by persistent centralization concerns. Most Layer 2 networks remain far more centralized than they appear, with decentralization treated as a long-term goal rather than an immediate priority. This creates regulatory risk and questions about long-term value accrual—if a network is centralized, why should users trust it over traditional databases with "clever cryptography"?

Recent structural changes suggest Ethereum itself recognizes the problem. The Fusaka upgrade aims to "repair" the value capture chain between Layer 1 and Layer 2, requiring L2s to pay increased "tribute" to Ethereum mainnet. This redistribution helps Ethereum but further squeezes already-thin Layer 2 margins.

Revenue Models for 2026 and Beyond

Looking forward, successful Layer 2 networks will likely adopt hybrid revenue strategies:

  1. Volume Over Margin: Base's approach—processing massive transaction volumes at minimal per-transaction profit—can work if scale is achieved. Even at roughly $0.01 per transaction versus Arbitrum's $0.15-0.20, Base's far greater throughput translates into more than three times Arbitrum's daily revenue.

  2. Selective MEV Capture: Networks must balance MEV extraction with user experience. Arbitrum's exploration of MEV auctions that return value to users represents a middle path that generates revenue without alienating the community.

  3. Enterprise Services: Professional support, deployment assistance, and customization services for institutional clients offer high-margin revenue that scales with client value rather than transaction count.

  4. Ecosystem Revenue Sharing: Both mandatory (Arbitrum Orbit) and voluntary (Optimism Superchain) revenue-sharing models create network effects where Layer 2 success compounds through ecosystem participation.

  5. Data Availability Markets: As blob pricing evolves, Layer 2 networks might introduce tiered data availability offerings—premium settlement guarantees for institutions, budget options for consumer applications.

By 2026, networks are expected to introduce revenue-sharing models, sequencer profit distribution, and yield tied to actual network usage, fundamentally shifting from transaction fees to participation economics.

The Path Forward

The Layer 2 economic crisis is, paradoxically, a sign of technological success. Ethereum's scaling solutions have achieved their primary goal: making blockchain transactions affordable and accessible. But technological triumph doesn't automatically translate to business sustainability.

The networks that survive and thrive will be those that:

  • Accept that transaction fees alone cannot sustain operations at $0.001 per operation
  • Develop diversified revenue streams that align with actual value creation
  • Balance centralization concerns with operational efficiency
  • Build ecosystem network effects that compound value beyond individual transactions
  • Serve institutional and enterprise clients willing to pay for infrastructure reliability

Base, Arbitrum, and Optimism are all experimenting with different combinations of these strategies. Base leads in gross revenue through volume, Arbitrum enforces economic alignment through licensing, and Optimism bets on open-source ecosystem growth.

The ultimate winners will likely be those that recognize the fundamental shift: Layer 2 networks are no longer just transaction processors. They're becoming infrastructure platforms, enterprise service providers, and ecosystem orchestrators. Revenue models must evolve accordingly—or risk becoming unsustainably cheap commodity services in a race to zero that nobody can afford to win.

For developers building on Layer 2 infrastructure, reliable node access and data indexing remain critical as these networks evolve their business models. BlockEden.xyz provides enterprise-grade API access across major Layer 2 networks, offering consistent performance regardless of underlying economic shifts.

