
Balaji's Vision for Cryptoidentity: From Keys to Network States

· 10 min read
Dora Noda
Software Engineer

1) What Balaji means by “cryptoidentity”

In Balaji’s vocabulary, cryptoidentity is identity that is rooted in cryptography—specifically public–private keypairs—and then extended with on‑chain names, verifiable credentials/attestations, and interfaces to legacy (“fiat”) identity. In his words and work:

  • Keys as identity. The bedrock is the idea that, in Bitcoin and web3, your keypair is your identity; authentication and authorization flow from control of private keys rather than from accounts in a corporate database. (balajis.com)
  • Names and reputation on-chain. Naming systems like ENS/SNS anchor human‑readable identities to addresses; credentials (NFTs, “soulbound” tokens, on‑chain “cryptocredentials”) and attestations layer reputation and history onto those identities.
  • On‑chain, auditable “census.” For societies and network states, identity participates in a cryptographically auditable census (proof‑of‑human/unique person, proof‑of‑income, proof‑of‑real‑estate) to demonstrate real population and economic activity.
  • Bridging legacy ID ↔ crypto ID. He explicitly argues we need a “fiat identity ↔ crypto identity exchange”—akin to fiat↔crypto exchanges—so “digital passports follow digital currency.” He highlights “crypto passports” as the next interface after stablecoins. (Circle)
  • Identity for a “web3 of trust” in the AI era. To counter deepfakes and bots, he promotes content signed by on‑chain identities (e.g., ENS) so provenance and authorship are cryptographically verifiable across the open web. (Chainlink Today)
  • Civic protection. In his shorthand: “Cryptocurrency partially protects you from debanking. Cryptoidentity partially protects you from denaturalization.” (X (formerly Twitter))

2) How his view evolved (a short chronology)

  • 2019–2020 – cryptographic identity & pseudonymity. Balaji’s writings emphasize public‑key cryptography as identity (keys-as-ID) and forecast decentralized identity + reputation growing through the 2020s. At the same time, his “pseudonymous economy” talk argues for persistent, reputation‑bearing pseudonyms to protect speech and experiment with new kinds of work and organization. (balajis.com)
  • 2022 – The Network State. He formalizes identity’s job in a network state: on‑chain census; ENS‑style identity; cryptographic proofs (of personhood/income/real‑estate); and crypto‑credentials/soulbounds. Identity is infrastructural—what the society counts and what the world can verify.
  • 2022–2024 – bridges to legacy systems. In public interviews and his podcast, he calls for fiat↔crypto identity bridges (e.g., Palau’s RNS.ID digital residency) and stresses moving “paper” records to code. (Circle)
  • 2023–present – identity as defense against AI fakes. He frames cryptoidentity as the backbone of a “web3 of trust”: signed content, on‑chain provenance, and economic friction (staking, payments) to separate humans from bots. (Chainlink Today)

3) The technical stack Balaji gestures toward

Root primitive: keys & wallets

  • Control of a private key = control of an identity; rotate/partition keys for different personas and risk profiles. (balajis.com)

Resolution & login

  • ENS/SNS map human‑readable names to addresses; Sign‑In with Ethereum (EIP‑4361) turns those addresses into a standard way to authenticate to off‑chain apps.
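
For concreteness, here is a minimal sketch of what an SIWE login could look like, assuming the open-source `siwe` JavaScript library and ethers; the domain, URI, and nonce handling are placeholders rather than a production flow.

```typescript
import { SiweMessage, generateNonce } from "siwe";
import { ethers } from "ethers";

// Demo wallet standing in for the user's persona key.
const wallet = ethers.Wallet.createRandom();

// 1) Server issues a nonce; the client builds and signs an EIP-4361 message.
const nonce = generateNonce();
const message = new SiweMessage({
  domain: "app.example.org",            // placeholder domain
  address: wallet.address,
  statement: "Sign in with Ethereum to app.example.org",
  uri: "https://app.example.org/login", // placeholder URI
  version: "1",
  chainId: 1,
  nonce,
});
const signature = await wallet.signMessage(message.prepareMessage());

// 2) Server verifies the signature against the original message and nonce.
const { success, data } = await message.verify({ signature, nonce });
if (success) {
  console.log(`Authenticated ${data.address} via SIWE`);
}
```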

Credentials & attestations (reputation layer)

  • W3C Verifiable Credentials (VC 2.0) define an interoperable way to issue/hold/verify claims (e.g., KYC checks, diplomas).
  • Ethereum Attestation Service (EAS) provides a public good layer for on‑ or off‑chain attestations to build identity, reputation, and registries that applications can verify. (W3C)
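
As an illustration of the attestation layer, the sketch below follows the EAS SDK's documented attest flow to publish a simple reputation claim; the schema UID, contract address, and RPC endpoint are placeholders, and a real deployment would register its own schema first. VC issuance follows an analogous issue/hold/verify pattern off-chain.

```typescript
import { EAS, SchemaEncoder } from "@ethereum-attestation-service/eas-sdk";
import { ethers } from "ethers";

// Placeholder endpoints and identifiers — substitute your own deployment details.
const provider = new ethers.JsonRpcProvider("https://rpc.example.org");
const signer = new ethers.Wallet(process.env.ISSUER_KEY!, provider);
const EAS_CONTRACT = "0x0000000000000000000000000000000000000000"; // EAS address on your chain
const SCHEMA_UID = "0x0000000000000000000000000000000000000000000000000000000000000000";

const eas = new EAS(EAS_CONTRACT);
eas.connect(signer);

// Encode the claim according to the registered schema, e.g. "bool verifiedContributor".
const encoder = new SchemaEncoder("bool verifiedContributor");
const encoded = encoder.encodeData([
  { name: "verifiedContributor", value: true, type: "bool" },
]);

// Issue the attestation to the subject's address; only the signed receipt lives on-chain.
const tx = await eas.attest({
  schema: SCHEMA_UID,
  data: {
    recipient: "0x0000000000000000000000000000000000000000", // subject address
    expirationTime: 0n, // no expiry
    revocable: true,
    data: encoded,
  },
});
const attestationUID = await tx.wait();
console.log("New attestation UID:", attestationUID);
```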

Proof‑of‑personhood & uniqueness

  • In The Network State, Balaji sketches “proof‑of‑human” techniques for the on‑chain census; outside his work, approaches like World ID try to verify humanness/uniqueness, which has also raised data‑protection concerns—illustrating the trade‑offs of biometric PoP.

Bridges to legacy identity

  • Palau RNS.ID is a prominent example of a sovereign issuing legal ID with on‑chain components; acceptance is uneven across platforms, underscoring the “bridge” problem Balaji highlights. (Biometric Update)

Provenance & anti‑deepfake

  • He advocates signing content from ENS‑linked addresses so every image/post/video can be traced to a cryptographic identity in a “web3 of trust.” (Chainlink Today)
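
A minimal sketch of that signing pattern, using ethers: the publisher signs the content with an ENS-linked key, and any verifier recovers the address and reverse-resolves it to a name. The RPC URL is a placeholder, and real deployments would typically sign a content hash rather than the raw text.

```typescript
import { ethers } from "ethers";

// Publisher side: sign the artifact (or its hash) with the ENS-linked key.
const publisher = ethers.Wallet.createRandom(); // stand-in for the ENS-linked key
const content = "Post body, image hash, or release checksum";
const signature = await publisher.signMessage(content);

// Verifier side: recover the signer and reverse-resolve it to an ENS name.
const provider = new ethers.JsonRpcProvider("https://rpc.example.org"); // placeholder RPC
const recovered = ethers.verifyMessage(content, signature);
const ensName = await provider.lookupAddress(recovered); // null if no reverse record
console.log(`Signed by ${ensName ?? recovered}`);
```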

4) Why it matters (Balaji’s strategic claims)

  1. Censorship & deplatforming resistance: Keys and decentralized naming reduce reliance on centralized ID providers. (Keys are bearer‑style identities.) (balajis.com)
  2. Auditability for societies: Network states require verifiable population/income/footprint; auditability is impossible without identity that can be proven on‑chain.
  3. AI resilience: A cryptographic identity layer (plus signatures/attestations) underpins authenticity online, reversing AI‑driven fakery. (Chainlink Today)
  4. Interoperability & composability: Standards (ENS, SIWE, VC/EAS) make identity portable across apps and jurisdictions.

5) How it connects to The Network State

Balaji’s book repeatedly pairs identity with a real‑time, on‑chain census—including proof‑of‑human, proof‑of‑income, and proof‑of‑real‑estate—and highlights naming (ENS) and crypto‑credentials as core primitives. He also describes “ENS‑login‑to‑physical‑world” patterns (digital keys to doors/services) embedded in a social smart contract, pointing to cryptoidentity as the access layer for both digital and (eventually) physical governance.


6) Implementation blueprint (a practical path you can execute today)

A. Establish the base identities

  1. Generate separate keypairs for: (i) legal/“real name”, (ii) work/professional pseudonym, (iii) public‑speech pseudonym. Store each in a different wallet configuration (hardware, MPC, or smart accounts with guardians). (balajis.com)
  2. Register ENS names for each persona; publish minimal public profile metadata.
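
A minimal sketch of step 1, assuming ethers for key generation; in practice each persona's key would live in its own custody setup (hardware wallet, MPC, or smart account with guardians) rather than in memory like this.

```typescript
import { ethers } from "ethers";

// One fresh keypair per persona; keep them unlinkable (separate funding, separate custody).
const personas = ["legal", "professional", "public-speech"] as const;

for (const persona of personas) {
  const wallet = ethers.Wallet.createRandom();
  console.log(`${persona}: ${wallet.address}`);
  // Back up wallet.mnemonic?.phrase offline; never co-locate the three backups.
}
```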

B. Add authentication & content provenance

  3. Enable SIWE (EIP‑4361) for app logins; phase out passwords/social logins. (Ethereum Improvement Proposals)
  4. Sign public artifacts (posts, images, code releases) from your ENS‑linked address; publish a simple “signed‑content” feed others can verify. (Chainlink Today)

C. Layer credentials and attestations

  5. Issue/collect VCs for legal facts (company role, licenses) and EAS attestations for soft signals (reputation, verified contributions, attendance). Keep sensitive claims off‑chain with only hashes/receipts on‑chain. (W3C)

D. Bridge to legacy identity when needed

  6. Where lawful and useful, link a sovereign/enterprise ID (e.g., Palau RNS.ID) to your cryptoidentity for KYC‑gated venues. Expect heterogeneous acceptance and maintain alternates. (Biometric Update)

E. Deploy for groups/societies

  7. For a startup society or DAO:

  • Gate membership with ENS + a proof‑of‑human method you deem acceptable.
  • Maintain a public, auditable census (counts of members/income/holdings) using oracles plus signed attestations, not raw PII.

7) Risks, critiques, and open questions

  • Privacy/pseudonymity erosion. Blockchain analysis can cluster wallets; Balaji’s own pseudonymity framing warns how a handful of data “bits” can re‑identify you. Use mixers/privacy tech carefully and lawfully—but recognize limits. (blog.blockstack.org)
  • Proof‑of‑personhood trade‑offs. Biometric PoP (e.g., iris) invites significant data‑protection scrutiny; alternative PoP methods reduce risk but may increase Sybil vulnerability. (law.kuleuven.be)
  • Bridge brittleness. Palau‑style IDs are not a universal KYC pass; acceptance varies by platform and jurisdiction and can change. Build for graceful degradation. (Malakouti Law)
  • Key loss & coercion. Keys can be stolen/coerced; use multi‑sig/guardians and incident‑response policies. (Balaji’s model assumes cryptography + consent, which must be engineered socially.) (balajis.com)
  • Name/registry centralization. ENS or any naming authority becomes a policy chokepoint; mitigate via multi‑persona design and exportable proofs.

8) How Balaji’s cryptoidentity maps to standards (and where it differs)

  • Alignment:

    • DIDs + VCs (W3C) = portable, interoperable identity/claims; SIWE = wallet‑native authentication; EAS = attestations for reputation/registries. These are the components he points to—even if he uses plain language (ENS, credentials) rather than standards acronyms. (W3C)
  • Differences/emphasis:

    • He elevates societal auditability (on‑chain census) and AI‑era provenance (signed content) more than many DID/VC discussions, and he explicitly pushes fiat↔crypto identity bridges and crypto passports as a near‑term priority.

9) If you’re building: a minimal viable “cryptoidentity” rollout (90 days)

  1. Week 1–2: Keys, ENS, SIWE enabled; publish your signing policy and start signing public posts/releases. (Ethereum Improvement Proposals)
  2. Week 3–6: Integrate VCs/EAS for role/membership/participation; build a public “trust page” that verifies these programmatically. (W3C)
  3. Week 7–10: Stand up a basic census dashboard (aggregate member count, on‑chain treasury/income proofs) with clear privacy posture.
  4. Week 11–13: Pilot a legacy bridge (e.g., RNS.ID where appropriate) for one compliance‑intensive flow; publish results (what worked/failed). (Biometric Update)

Selected sources (primary and load‑bearing)

  • The Network State (on‑chain census; ENS/identity; crypto‑credentials) and “ENS‑login‑to‑physical‑world” examples.
  • Public‑Key Cryptography (keys as identity). (balajis.com)
  • Circle – The Money Movement (Ep. 74) (fiat↔crypto identity bridge; “crypto passports”). (Circle)
  • The Network State podcast, Ep. 10 (fiat‑identity→crypto‑identity exchange; Palau RNS.ID). (thenetworkstate.com)
  • Chainlink Today (signed content/ENS to fight deepfakes; “web3 of trust”). (Chainlink Today)
  • Balaji on X (“Cryptoidentity…denaturalization”). (X (formerly Twitter))
  • Standards: W3C DID Core, VC 2.0; EIP‑4361 (SIWE); EAS docs. (W3C)
  • RNS.ID / Palau (real‑world bridge; mixed acceptance). (Biometric Update)
  • Pseudonymous Economy (identity & 33‑bits re‑identification intuition). (blog.blockstack.org)

Bottom line

For Balaji, cryptoidentity is not just “DID tech.” It’s a civilizational primitive: keys and signatures at the base; names and credentials on top; bridges to legacy identity; and a verifiable public record that scales from individuals to network societies. It’s how you get authentic people and authentic records in an AI‑flooded internet—and how a startup society can prove it’s real without asking the world to trust its word. (Chainlink Today)


DeFi’s Next Chapter: Perspectives from Leading Builders and Investors (2024 – 2025)

· 11 min read
Dora Noda
Software Engineer

Decentralized Finance (DeFi) matured considerably from the summer‑2020 speculation boom to the 2024‑2025 cycle. Higher interest rates slowed DeFi’s growth in 2022‑2023, but the emergence of high‑throughput chains, token‑driven incentives and a clearer regulatory environment are creating conditions for a new phase of on‑chain finance. Leaders from Hyperliquid, Aave, Ethena and Dragonfly share a common expectation that the next chapter will be driven by genuine utility: efficient market infrastructure, yield‑bearing stablecoins, real‑world asset tokenization and AI‑assisted user experiences. The following sections analyze DeFi’s future through the voices of Jeff Yan (Hyperliquid Labs), Stani Kulechov (Aave Labs), Guy Young (Ethena Labs) and Haseeb Qureshi (Dragonfly).

Jeff Yan – Hyperliquid Labs

Background

Jeff Yan is co‑founder and CEO of Hyperliquid, a decentralized exchange (DEX) that operates a high‑throughput orderbook for perpetuals and spot trading. Hyperliquid gained prominence in 2024 for its community‑driven airdrop and refusal to sell equity to venture capitalists; Yan kept the team small and self‑funded to maintain product focus. Hyperliquid’s vision is to become a decentralized base layer for other financial products, such as tokenized assets and stablecoins.

Vision for DeFi’s Next Chapter

  • Efficiency over hype. At a Token 2049 panel, Yan compared DeFi to a math problem; he argued that markets should be efficient, with users obtaining the best prices without hidden spreads. Hyperliquid’s high‑throughput orderbook aims to deliver this efficiency.
  • Community ownership and anti‑VC stance. Yan believes DeFi success should be measured by value delivered to users rather than investor exits. Hyperliquid rejected private market‑maker partnerships and centralized exchange listings to avoid compromising decentralization. This approach resonates with DeFi’s ethos: protocols should be owned by their communities and built for long‑term utility.
  • Focus on infrastructure, not token price. Yan stresses that Hyperliquid’s purpose is to build robust technology; product improvements, such as HIP‑3, aim to mitigate dApp risks through automated audits and better integrations. He avoids setting rigid roadmaps, preferring to adapt to user feedback and technological changes. This adaptability reflects a broader shift from speculation toward mature infrastructure.
  • Vision for a permissionless financial stack. Yan sees Hyperliquid evolving into a foundational layer on which others can build stablecoins, RWAs and new financial instruments. By remaining decentralized and capital‑efficient, he hopes to establish a neutral layer akin to a decentralized Nasdaq.

Takeaways

Jeff Yan’s perspective emphasizes market efficiency, community‑driven ownership and modular infrastructure. He sees DeFi’s next chapter as a consolidation phase in which high‑performance DEXs become the backbone for tokenized assets and yield products. His refusal to take venture funding signals a pushback against excessive speculation; in the next chapter, protocols may prioritize sustainability over headline‑grabbing valuations.

Stani Kulechov – Aave Labs

Background

Stani Kulechov founded Aave, one of the first money‑market protocols and a leader in decentralized lending. Aave’s liquidity markets allow users to earn yield or borrow assets without intermediaries. By 2025, Aave’s TVL had grown and its product suite had expanded to include stablecoins and a newly launched Family Wallet—a fiat–crypto on‑ramp that debuted at the Blockchain Ireland Summit.

Vision for DeFi’s Next Chapter

  • Rate‑cut catalyst for “DeFi summer 2.0.” At Token 2049, Kulechov argued that falling interest rates would ignite a new DeFi boom similar to 2020. Lower rates create arbitrage opportunities as on‑chain yields remain attractive relative to TradFi, drawing capital into DeFi protocols. He recalls that DeFi's TVL jumped from less than $1 billion to $10 billion during the 2020 rate cuts and expects a similar dynamic when monetary policy loosens.
  • Integration with fintech. Kulechov envisions DeFi embedding into mainstream fintech infrastructure. He plans to distribute on‑chain yields through consumer‑friendly apps and institutional channels, turning DeFi into a back‑end for savings products. The Family Wallet exemplifies this by offering seamless fiat–stablecoin conversions and everyday payments.
  • Real‑world assets (RWAs) and stablecoins. He regards tokenized real‑world assets and stablecoins as pillars of blockchain’s future. Aave’s GHO stablecoin and RWA initiatives aim to connect DeFi yields to real‑economy collateral, bridging the gap between crypto and traditional finance.
  • Community‑driven innovation. Kulechov credits Aave’s success to its community and expects user‑governed innovation to drive the next phase. He suggests that DeFi will focus on consumer applications that abstract complexity while preserving decentralization.

Takeaways

Stani Kulechov foresees a return of the DeFi bull cycle fueled by lower rates and improved user experience. He stresses integration with fintech and real‑world assets, predicting that stablecoins and tokenized treasuries will embed DeFi yields into everyday financial products. This reflects a maturation from speculative yield farming to infrastructure that coexists with traditional finance.

Guy Young – Ethena Labs

Background

Guy Young is the CEO of Ethena Labs, creator of USDe, a synthetic dollar that uses delta‑neutral strategies, and sUSDe, its staked, yield‑bearing version. Ethena gained attention for providing attractive yields while using USDT collateral and short perpetual positions to hedge price risk. In 2025, Ethena announced initiatives like iUSDe, a compliant wrapped version for traditional institutions.

Vision for DeFi’s Next Chapter

  • Stablecoins for savings and trading collateral. Young categorizes stablecoin use cases into trading collateral, savings for developing countries, payments and speculation. Ethena focuses on savings and trading because yield makes the dollar attractive and exchange integration drives adoption. He believes a yield‑bearing dollar will become the world’s most important savings asset.
  • Neutral, platform‑agnostic stablecoins. Young argues that stablecoins must be neutral and widely accepted across venues; attempts by exchanges to push proprietary stablecoins harm user experience. Ethena’s use of USDT increases demand for Tether rather than competing with it, illustrating synergy between DeFi stablecoins and incumbents.
  • Integration with TradFi and messaging apps. Ethena plans to issue iUSDe with transfer restrictions to satisfy regulatory requirements and to integrate sUSDe into Telegram and Apple Pay, enabling users to save and spend yield‑bearing dollars like sending messages. Young imagines delivering a neobank‑like experience to a billion users through mobile apps.
  • Shift toward fundamentals and RWAs. He notes that crypto speculation appears saturated—altcoin market caps peaked at $1.2 trillion in both 2021 and 2024—so investors will focus on projects with real revenue and tokenized real‑world assets. Ethena’s strategy of providing yield from off‑chain assets positions it for this transition.

Takeaways

Guy Young’s perspective centers on yield‑bearing stablecoins as DeFi’s killer app. He argues that DeFi’s next chapter involves making dollars productive and embedding them into mainstream payments and messaging, drawing billions of users. Ethena’s platform‑agnostic approach reflects a belief that DeFi stablecoins should complement rather than compete with existing systems. He also anticipates a rotation from speculative altcoins to revenue‑generating tokens and RWAs.

Haseeb Qureshi – Dragonfly

Background

Haseeb Qureshi is managing partner at Dragonfly, a venture capital firm focusing on crypto and DeFi. Qureshi is known for his analytical writing and participation on the Chopping Block podcast. In late 2024 and early 2025, he released a series of predictions outlining how AI, stablecoins and regulatory changes will shape crypto.

Vision for DeFi’s Next Chapter

  • AI‑powered wallets and agents. Qureshi predicts that AI agents will revolutionize crypto by automating bridging, optimizing trade routes, minimizing fees and steering users away from scams. He expects AI‑driven wallets to handle cross‑chain operations seamlessly, reducing the complexity that currently deters mainstream users. AI‑assisted development tools will also make it easier to build smart contracts, solidifying the EVM’s dominance.
  • AI agent tokens vs. meme coins. Qureshi believes that tokens associated with AI agents will outperform meme coins in 2025 but warns that the novelty will fade and real value will come from AI’s impact on software engineering and trading. He views the current excitement as a shift from “financial nihilism to financial over‑optimism,” cautioning against overhyping chat‑bot coins.
  • Convergence of stablecoins and AI. In his 2025 predictions, Qureshi outlines six major themes: (1) the distinction between layer‑1 and layer‑2 chains will blur as AI tools expand EVM share; (2) token distributions will shift from large airdrops to metric‑driven or crowdfunding models; (3) stablecoin adoption will surge, with banks issuing their own stablecoins while Tether retains dominance; (4) AI agents will dominate crypto interactions but their novelty may fade by 2026; (5) AI tools will drastically lower development costs, enabling a wave of dApp innovation and stronger security; and (6) regulatory clarity, particularly in the U.S., will accelerate mainstream adoption.
  • Institutional adoption and regulatory shifts. Qureshi expects Fortune 100 companies to offer crypto to consumers under a Trump administration and believes U.S. stablecoin legislation will pass, unlocking institutional participation. The Gate.io research summary echoes this, noting that AI agents will adopt stablecoins for peer‑to‑peer transactions and that decentralized AI training will accelerate.
  • DeFi as infrastructure for AI‑assisted finance. On The Chopping Block, Qureshi named Hyperliquid as the “biggest winner” of 2024’s cycle and predicted DeFi tokens would see explosive growth in 2025. He attributes this to innovations like liquidity‑guidance pools that make decentralized perpetual trading competitive. His bullishness on DeFi stems from the belief that AI‑powered UX and regulatory clarity will drive capital into on‑chain protocols.

Takeaways

Haseeb Qureshi views DeFi’s next chapter as convergence of AI and on‑chain finance. He anticipates a surge in AI‑powered wallets and autonomous agents, which will simplify user interactions and attract new participants. Yet he cautions that the AI hype may fade; sustainable value will come from AI tools lowering development costs and improving security. He expects stablecoin legislation, institutional adoption and metric‑driven token distributions to professionalize the industry. Overall, he sees DeFi evolving into the foundation for AI‑assisted, regulatory‑compliant financial services.

Comparative Analysis

| Dimension | Jeff Yan (Hyperliquid) | Stani Kulechov (Aave) | Guy Young (Ethena) | Haseeb Qureshi (Dragonfly) |
| --- | --- | --- | --- | --- |
| Core Focus | High‑performance DEX infrastructure; community ownership; efficiency | Decentralized lending; fintech integration; real‑world assets | Yield‑bearing stablecoins; trading collateral; payments integration | Investment perspective; AI agents; institutional adoption |
| Key Drivers for Next Chapter | Efficient order‑book markets; modular protocol layer for RWAs & stablecoins | Rate cuts spurring capital inflow and “DeFi summer 2.0”; integration with fintech & RWAs | Neutral stablecoins generating yield; integration with messaging apps and TradFi | AI‑powered wallets and agents; regulatory clarity; metric‑driven token distributions |
| Role of Stablecoins | Underpins future DeFi layers; encourages decentralized issuers | GHO stablecoin & tokenized treasuries integrate DeFi yields into mainstream financial products | sUSDe turns dollars into yield‑bearing savings; iUSDe targets institutions | Banks to issue stablecoins by late 2025; AI agents to use stablecoins for transactions |
| View on Token Incentives | Rejects venture funding & private market‑maker deals to prioritize community | Emphasizes community‑driven innovation; sees DeFi tokens as infrastructure for fintech | Advocates platform‑agnostic stablecoins that complement existing ecosystems | Predicts shift from large airdrops to KPI‑driven or crowdfunding distributions |
| Outlook on Regulation & Institutions | Minimal focus on regulation; stresses decentralization & self‑funding | Sees regulatory clarity enabling RWA tokenization and institutional use | Working on transfer‑restricted iUSDe to meet regulatory requirements | Anticipates U.S. stablecoin legislation & pro‑crypto administration accelerating adoption |
| On AI & Automation | N/A | N/A | Not central (though Ethena may use AI risk systems) | AI agents will dominate user experience; novelty will fade by 2026 |

Conclusion

The next chapter of DeFi will likely be shaped by efficient infrastructure, yield‑bearing assets, integration with traditional finance and AI‑driven user experiences. Jeff Yan focuses on building high‑throughput, community‑owned DEX infrastructure that can serve as a neutral base layer for tokenized assets. Stani Kulechov expects lower interest rates, fintech integration and real‑world assets to catalyze a new DeFi boom. Guy Young prioritizes yield‑bearing stablecoins and seamless payments, pushing DeFi into messaging apps and traditional banks. Haseeb Qureshi anticipates AI agents transforming wallets and regulatory clarity unlocking institutional capital, while cautioning against over‑hyped AI token narratives.

Collectively, these perspectives suggest that DeFi’s future will move beyond speculative farming toward mature, user‑centric financial products. Protocols must deliver real economic value, integrate with existing financial rails, and harness technological advances like AI and high‑performance blockchains. As these trends converge, DeFi may evolve from a niche ecosystem into a global, permissionless financial infrastructure.

MCP in the Web3 Ecosystem: A Comprehensive Review

· 49 min read
Dora Noda
Software Engineer

1. Definition and Origin of MCP in Web3 Context

The Model Context Protocol (MCP) is an open standard that connects AI assistants (like large language models) to external data sources, tools, and environments. Often described as a "USB-C port for AI" due to its universal plug-and-play nature, MCP was developed by Anthropic and first introduced in late November 2024. It emerged as a solution to break AI models out of isolation by securely bridging them with the “systems where data lives” – from databases and APIs to development environments and blockchains.

Originally an experimental side project at Anthropic, MCP quickly gained traction. Open-source reference implementations appeared within weeks of its release, and by early 2025 it had become the de facto standard for agentic AI integration, with leading AI labs (OpenAI, Google DeepMind, Meta AI) adopting it natively. This rapid uptake was especially notable in the Web3 community. Blockchain developers saw MCP as a way to infuse AI capabilities into decentralized applications, leading to a proliferation of community-built MCP connectors for on-chain data and services. In fact, some analysts argue MCP may fulfill Web3’s original vision of a decentralized, user-centric internet in a more practical way than blockchain alone, by using natural language interfaces to empower users.

In summary, MCP is not a blockchain or token, but an open protocol born in the AI world that has rapidly been embraced within the Web3 ecosystem as a bridge between AI agents and decentralized data sources. Anthropic open-sourced the standard (with an initial GitHub spec and SDKs) and cultivated an open community around it. This community-driven approach set the stage for MCP’s integration into Web3, where it is now viewed as foundational infrastructure for AI-enabled decentralized applications.

2. Technical Architecture and Core Protocols

MCP operates on a lightweight client–server architecture with three principal roles:

  • MCP Host: The AI application or agent itself, which orchestrates requests. This could be a chatbot (Claude, ChatGPT) or an AI-powered app that needs external data. The host initiates interactions, asking for tools or information via MCP.
  • MCP Client: A connector component that the host uses to communicate with servers. The client maintains the connection, manages request/response messaging, and can handle multiple servers in parallel. For example, a developer tool like Cursor or VS Code’s agent mode can act as an MCP client bridging the local AI environment with various MCP servers.
  • MCP Server: A service that exposes some contextual data or functionality to the AI. Servers provide tools, resources, or prompts that the AI can use. In practice, an MCP server could interface with a database, a cloud app, or a blockchain node, and present a standardized set of operations to the AI. Each client-server pair communicates over its own channel, so an AI agent can tap multiple servers concurrently for different needs.

Core Primitives: MCP defines a set of standard message types and primitives that structure the AI-tool interaction. The three fundamental primitives are:

  • Tools: Discrete operations or functions the AI can invoke on a server. For instance, a “searchDocuments” tool or an “eth_call” tool. Tools encapsulate actions like querying an API, performing a calculation, or calling a smart contract function. The MCP client can request a list of available tools from a server and call them as needed.
  • Resources: Data endpoints that the AI can read from (or sometimes write to) via the server. These could be files, database entries, blockchain state (blocks, transactions), or any contextual data. The AI can list resources and retrieve their content through standard MCP messages (e.g. ListResources and ReadResource requests).
  • Prompts: Structured prompt templates or instructions that servers can provide to guide the AI’s reasoning. For example, a server might supply a formatting template or a pre-defined query prompt. The AI can request a list of prompt templates and use them to maintain consistency in how it interacts with that server.

Under the hood, MCP communications are typically JSON-based and follow a request-response pattern similar to RPC (Remote Procedure Call). The protocol’s specification defines messages like InitializeRequest, ListTools, CallTool, ListResources, etc., which ensure that any MCP-compliant client can talk to any MCP server in a uniform way. This standardization is what allows an AI agent to discover what it can do: upon connecting to a new server, it can inquire “what tools and data do you offer?” and then dynamically decide how to use them.
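
To make the discovery flow concrete, the snippet below sketches what such an exchange might look like on the wire, assuming the JSON-RPC 2.0 framing MCP uses (method names such as tools/list and tools/call); the tool name and payload shapes are illustrative rather than copied from the spec.

```typescript
// Client asks a newly connected server what tools it offers.
const listToolsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
};

// Client invokes one of the advertised tools (hypothetical blockchain example).
const callToolRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "eth_getBalance",
    arguments: {
      address: "0x0000000000000000000000000000000000000000",
      blockTag: "latest",
    },
  },
};

// Server performs the operation and returns the result as structured content.
const callToolResponse = {
  jsonrpc: "2.0",
  id: 2,
  result: {
    content: [{ type: "text", text: "Balance: 1.2345 ETH" }],
  },
};
```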

Security and Execution Model: MCP was designed with secure, controlled interactions in mind. The AI model itself doesn’t execute arbitrary code; it sends high-level intents (via the client) to the server, which then performs the actual operation (e.g., fetching data or calling an API) and returns results. This separation means sensitive actions (like blockchain transactions or database writes) can be sandboxed or require explicit user approval. For example, there are messages like Ping (to keep connections alive) and even a CreateMessageRequest which allows an MCP server to ask the client’s AI to generate a sub-response, typically gated by user confirmation. Features like authentication, access control, and audit logging are being actively developed to ensure MCP can be used safely in enterprise and decentralized environments (more on this in the Roadmap section).

In summary, MCP’s architecture relies on a standardized message protocol (with JSON-RPC style calls) that connects AI agents (hosts) to a flexible array of servers providing tools, data, and actions. This open architecture is model-agnostic and platform-agnostic – any AI agent can use MCP to talk to any resource, and any developer can create a new MCP server for a data source without needing to modify the AI’s core code. This plug-and-play extensibility is what makes MCP powerful in Web3: one can build servers for blockchain nodes, smart contracts, wallets, or oracles and have AI agents seamlessly integrate those capabilities alongside web2 APIs.
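
As a sketch of what such a blockchain-facing server could look like, the example below follows the official TypeScript MCP SDK's quickstart pattern (an McpServer over a stdio transport) together with ethers to expose a single balance-lookup tool; the RPC endpoint is a placeholder and error handling is omitted.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { ethers } from "ethers";
import { z } from "zod";

const provider = new ethers.JsonRpcProvider("https://rpc.example.org"); // placeholder RPC

const server = new McpServer({ name: "evm-tools", version: "0.1.0" });

// Expose one tool: fetch the ETH balance of an address from the live chain.
server.tool(
  "eth_getBalance",
  { address: z.string().describe("0x-prefixed Ethereum address") },
  async ({ address }) => {
    const wei = await provider.getBalance(address);
    return {
      content: [{ type: "text", text: `${ethers.formatEther(wei)} ETH` }],
    };
  }
);

// Any MCP-compliant host (Claude, an IDE agent, etc.) can now discover and call the tool.
const transport = new StdioServerTransport();
await server.connect(transport);
```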

3. Use Cases and Applications of MCP in Web3

MCP unlocks a wide range of use cases by enabling AI-driven applications to access blockchain data and execute on-chain or off-chain actions in a secure, high-level way. Here are some key applications and problems it helps solve in the Web3 domain:

  • On-Chain Data Analysis and Querying: AI agents can query live blockchain state in real-time to provide insights or trigger actions. For example, an MCP server connected to an Ethereum node allows an AI to fetch account balances, read smart contract storage, trace transactions, or retrieve event logs on demand. This turns a chatbot or coding assistant into a blockchain explorer. Developers can ask an AI assistant questions like “What’s the current liquidity in Uniswap pool X?” or “Simulate this Ethereum transaction’s gas cost,” and the AI will use MCP tools to call an RPC node and get the answer from the live chain. This is far more powerful than relying on the AI’s training data or static snapshots.
  • Automated DeFi Portfolio Management: By combining data access and action tools, AI agents can manage crypto portfolios or DeFi positions. For instance, an “AI Vault Optimizer” could monitor a user’s positions across yield farms and automatically suggest or execute rebalancing strategies based on real-time market conditions. Similarly, an AI could act as a DeFi portfolio manager, adjusting allocations between protocols when risk or rates change. MCP provides the standard interface for the AI to read on-chain metrics (prices, liquidity, collateral ratios) and then invoke tools to execute transactions (like moving funds or swapping assets) if permitted. This can help users maximize yield or manage risk 24/7 in a way that would be hard to do manually.
  • AI-Powered User Agents for Transactions: Think of a personal AI assistant that can handle blockchain interactions for a user. With MCP, such an agent can integrate with wallets and DApps to perform tasks via natural language commands. For example, a user could say, "AI, send 0.5 ETH from my wallet to Alice" or "Stake my tokens in the highest-APY pool." The AI, through MCP, would use a secure wallet server (holding the user’s private key) to create and sign the transaction, and a blockchain MCP server to broadcast it. This scenario turns complex command-line or Metamask interactions into a conversational experience. It’s crucial that secure wallet MCP servers are used here, enforcing permissions and confirmations, but the end result is streamlining on-chain transactions through AI assistance.
  • Developer Assistants and Smart Contract Debugging: Web3 developers can leverage MCP-based AI assistants that are context-aware of blockchain infrastructure. For example, Chainstack’s MCP servers for EVM and Solana give AI coding copilots deep visibility into the developer’s blockchain environment. A smart contract engineer using an AI assistant (in VS Code or an IDE) can have the AI fetch the current state of a contract on a testnet, run a simulation of a transaction, or check logs – all via MCP calls to local blockchain nodes. This helps in debugging and testing contracts. The AI is no longer coding “blindly”; it can actually verify how code behaves on-chain in real time. This use case solves a major pain point by allowing AI to continuously ingest up-to-date docs (via a documentation MCP server) and to query the blockchain directly, reducing hallucinations and making suggestions far more accurate.
  • Cross-Protocol Coordination: Because MCP is a unified interface, a single AI agent can coordinate across multiple protocols and services simultaneously – something extremely powerful in Web3’s interconnected landscape. Imagine an autonomous trading agent that monitors various DeFi platforms for arbitrage. Through MCP, one agent could concurrently interface with Aave’s lending markets, a LayerZero cross-chain bridge, and an MEV (Miner Extractable Value) analytics service, all through a coherent interface. The AI could, in one “thought process,” gather liquidity data from Ethereum (via an MCP server on an Ethereum node), get price info or oracle data (via another server), and even invoke bridging or swapping operations. Previously, such multi-platform coordination would require complex custom-coded bots, but MCP gives a generalizable way for an AI to navigate the entire Web3 ecosystem as if it were one big data/resource pool. This could enable advanced use cases like cross-chain yield optimization or automated liquidation protection, where an AI moves assets or collateral across chains proactively.
  • AI Advisory and Support Bots: Another category is user-facing advisors in crypto applications. For instance, a DeFi help chatbot integrated into a platform like Uniswap or Compound could use MCP to pull in real-time info for the user. If a user asks, “What’s the best way to hedge my position?”, the AI can fetch current rates, volatility data, and the user’s portfolio details via MCP, then give a context-aware answer. Platforms are exploring AI-powered assistants embedded in wallets or dApps that can guide users through complex transactions, explain risks, and even execute sequences of steps with approval. These AI agents effectively sit on top of multiple Web3 services (DEXes, lending pools, insurance protocols), using MCP to query and command them as needed, thereby simplifying the user experience.
  • Beyond Web3 – Multi-Domain Workflows: Although our focus is Web3, it's worth noting MCP’s use cases extend to any domain where AI needs external data. It’s already being used to connect AI to things like Google Drive, Slack, GitHub, Figma, and more. In practice, a single AI agent could straddle Web3 and Web2: e.g., analyzing an Excel financial model from Google Drive, then suggesting on-chain trades based on that analysis, all in one workflow. MCP’s flexibility allows cross-domain automation (e.g., "schedule my meeting if my DAO vote passes, and email the results") that blends blockchain actions with everyday tools.

Problems Solved: The overarching problem MCP addresses is the lack of a unified interface for AI to interact with live data and services. Before MCP, if you wanted an AI to use a new service, you had to hand-code a plugin or integration for that specific service’s API, often in an ad-hoc way. In Web3 this was especially cumbersome – every blockchain or protocol has its own interfaces, and no AI could hope to support them all. MCP solves this by standardizing how the AI describes what it wants (natural language mapped to tool calls) and how services describe what they offer. This drastically reduces integration work. For example, instead of writing a custom plugin for each DeFi protocol, a developer can write one MCP server for that protocol (essentially annotating its functions in natural language). Any MCP-enabled AI (whether Claude, ChatGPT, or open-source models) can then immediately utilize it. This makes AI extensible in a plug-and-play fashion, much like how adding a new device via a universal port is easier than installing a new interface card.

In sum, MCP in Web3 enables AI agents to become first-class citizens of the blockchain world – querying, analyzing, and even transacting across decentralized systems, all through safe, standardized channels. This opens the door to more autonomous dApps, smarter user agents, and seamless integration of on-chain and off-chain intelligence.

4. Tokenomics and Governance Model

Unlike typical Web3 protocols, MCP does not have a native token or cryptocurrency. It is not a blockchain or a decentralized network on its own, but rather an open protocol specification (more akin to HTTP or JSON-RPC in spirit). Thus, there is no built-in tokenomics – no token issuance, staking, or fee model inherent to using MCP. AI applications and servers communicate via MCP without any cryptocurrency involved; for instance, an AI calling a blockchain via MCP might pay gas fees for the blockchain transaction, but MCP itself adds no extra token fee. This design reflects MCP’s origin in the AI community: it was introduced as a technical standard to improve AI-tool interactions, not as a tokenized project.

Governance of MCP is carried out in an open-source, community-driven fashion. After releasing MCP as an open standard, Anthropic signaled a commitment to collaborative development. A broad steering committee and working groups have formed to shepherd the protocol’s evolution. Notably, by mid-2025, major stakeholders like Microsoft and GitHub joined the MCP steering committee alongside Anthropic. This was announced at Microsoft Build 2025, indicating a coalition of industry players guiding MCP’s roadmap and standards decisions. The committee and maintainers work via an open governance process: proposals to change or extend MCP are typically discussed publicly (e.g. via GitHub issues and “SEP” – Standard Enhancement Proposal – guidelines). There is also an MCP Registry working group (with maintainers from companies like Block, PulseMCP, GitHub, and Anthropic) which exemplifies the multi-party governance. In early 2025, contributors from at least 9 different organizations collaborated to build a unified MCP server registry for discovery, demonstrating how development is decentralized across community members rather than controlled by one entity.

Since there is no token, governance incentives rely on the common interests of stakeholders (AI companies, cloud providers, blockchain developers, etc.) to improve the protocol for all. This is somewhat analogous to how W3C or IETF standards are governed, but with a faster-moving GitHub-centric process. For example, Microsoft and Anthropic worked together to design an improved authorization spec for MCP (integrating things like OAuth and single sign-on), and GitHub collaborated on the official MCP Registry service for listing available servers. These enhancements were contributed back to the MCP spec for everyone’s benefit.

It’s worth noting that while MCP itself is not tokenized, there are forward-looking ideas about layering economic incentives and decentralization on top of MCP. Some researchers and thought leaders in Web3 foresee the emergence of “MCP networks” – essentially decentralized networks of MCP servers and agents that use blockchain-like mechanisms for discovery, trust, and rewards. In such a scenario, one could imagine a token being used to reward those who run high-quality MCP servers (similar to how miners or node operators are incentivized). Capabilities like reputation ratings, verifiable computation, and node discovery could be facilitated by smart contracts or a blockchain, with a token driving honest behavior. This is still conceptual, but projects like MIT’s Namda (discussed later) are experimenting with token-based incentive mechanisms for networks of AI agents using MCP. If these ideas mature, MCP might intersect with on-chain tokenomics more directly, but as of 2025 the core MCP standard remains token-free.

In summary, MCP’s “governance model” is that of an open technology standard: collaboratively maintained by a community and a steering committee of experts, with no on-chain governance token. Decisions are guided by technical merit and broad consensus rather than coin-weighted voting. This distinguishes MCP from many Web3 protocols – it aims to fulfill Web3’s ideals (decentralization, interoperability, user empowerment) through open software and standards, not through a proprietary blockchain or token. In the words of one analysis, “the promise of Web3... can finally be realized not through blockchain and cryptocurrency, but through natural language and AI agents”, positioning MCP as a key enabler of that vision. That said, as MCP networks grow, we may see hybrid models where blockchain-based governance or incentive mechanisms augment the ecosystem – a space to watch closely.

5. Community and Ecosystem

The MCP ecosystem has grown explosively in a short time, spanning AI developers, open-source contributors, Web3 engineers, and major tech companies. It’s a vibrant community effort, with key contributors and partnerships including:

  • Anthropic: As the creator, Anthropic seeded the ecosystem by open-sourcing the MCP spec and several reference servers (for Google Drive, Slack, GitHub, etc.). Anthropic continues to lead development (for example, staff like Theodora Chu serve as MCP product managers, and Anthropic’s team contributes heavily to spec updates and community support). Anthropic’s openness attracted others to build on MCP rather than see it as a single-company tool.

  • Early Adopters (Block, Apollo, Zed, Replit, Codeium, Sourcegraph): In the first months after release, a wave of early adopters implemented MCP in their products. Block (formerly Square) integrated MCP to explore AI agentic systems in fintech – Block’s CTO praised MCP as an open bridge connecting AI to real-world applications. Apollo (likely Apollo GraphQL) also integrated MCP to allow AI access to internal data. Developer tool companies like Zed (code editor), Replit (cloud IDE), Codeium (AI coding assistant), and Sourcegraph (code search) each worked to add MCP support. For instance, Sourcegraph uses MCP so an AI coding assistant can retrieve relevant code from a repository in response to a question, and Replit’s IDE agents can pull in project-specific context. These early adopters gave MCP credibility and visibility.

  • Big Tech Endorsement – OpenAI, Microsoft, Google: In a notable turn, companies that are otherwise competitors aligned on MCP. OpenAI’s CEO Sam Altman publicly announced in March 2025 that OpenAI would add MCP support across its products (including ChatGPT’s desktop app), saying “People love MCP and we are excited to add support across our products”. This meant OpenAI’s Agent API and ChatGPT plugins would speak MCP, ensuring interoperability. Just weeks later, Google DeepMind’s CEO Demis Hassabis revealed that Google’s upcoming Gemini models and tools would support MCP, calling it a good protocol and an open standard for the “AI agentic era”. Microsoft not only joined the steering committee but partnered with Anthropic to build an official C# SDK for MCP to serve the enterprise developer community. Microsoft’s GitHub unit integrated MCP into GitHub Copilot (VS Code’s ‘Copilot Labs/Agents’ mode), enabling Copilot to use MCP servers for things like repository searching and running test cases. Additionally, Microsoft announced Windows 11 would expose certain OS functions (like file system access) as MCP servers so AI agents can interact with the operating system securely. The collaboration among OpenAI, Microsoft, Google, and Anthropic – all rallying around MCP – is extraordinary and underscores the community-over-competition ethos of this standard.

  • Web3 Developer Community: A number of blockchain developers and startups have embraced MCP. Several community-driven MCP servers have been created to serve blockchain use cases:

    • The team at Alchemy (a leading blockchain infrastructure provider) built an Alchemy MCP Server that offers on-demand blockchain analytics tools via MCP. This likely lets an AI get blockchain stats (like historical transactions, address activity) through Alchemy’s APIs using natural language.
    • Contributors developed a Bitcoin & Lightning Network MCP Server to interact with Bitcoin nodes and the Lightning payment network, enabling AI agents to read Bitcoin block data or even create Lightning invoices via standard tools.
    • The crypto media and education group Bankless created an Onchain MCP Server focused on Web3 financial interactions, possibly providing an interface to DeFi protocols (sending transactions, querying DeFi positions, etc.) for AI assistants.
    • Projects like Rollup.codes (a knowledge base for Ethereum Layer 2s) made an MCP server for rollup ecosystem info, so an AI can answer technical questions about rollups by querying this server.
    • Chainstack, a blockchain node provider, launched a suite of MCP servers (covered earlier) for documentation, EVM chain data, and Solana, explicitly marketing it as “putting your AI on blockchain steroids” for Web3 builders.

    Additionally, Web3-focused communities have sprung up around MCP. For example, PulseMCP and Goose are community initiatives referenced as helping build the MCP registry. We’re also seeing cross-pollination with AI agent frameworks: the LangChain community integrated adapters so that all MCP servers can be used as tools in LangChain-powered agents, and open-source AI platforms like Hugging Face TGI (text-generation-inference) are exploring MCP compatibility. The result is a rich ecosystem where new MCP servers are announced almost daily, serving everything from databases to IoT devices.

  • Scale of Adoption: The traction can be quantified to some extent. By February 2025 – barely three months after launch – over 1,000 MCP servers/connectors had been built by the community. This number has only grown, indicating thousands of integrations across industries. Mike Krieger (Anthropic’s Chief Product Officer) noted by spring 2025 that MCP had become a “thriving open standard with thousands of integrations and growing”. The official MCP Registry (launched in preview in Sept 2025) is cataloging publicly available servers, making it easier to discover tools; the registry’s open API allows anyone to search for, say, “Ethereum” or “Notion” and find relevant MCP connectors. This lowers the barrier for new entrants and further fuels growth.

  • Partnerships: We’ve touched on many implicit partnerships (Anthropic with Microsoft, etc.). To highlight a few more:

    • Anthropic & Slack: Anthropic partnered with Slack to integrate Claude with Slack’s data via MCP (Slack has an official MCP server, enabling AI to retrieve Slack messages or post alerts).
    • Cloud Providers: Amazon (AWS) and Google Cloud have worked with Anthropic to host Claude, and it’s likely they support MCP in those environments (e.g., AWS Bedrock might allow MCP connectors for enterprise data). While not explicitly in citations, these cloud partnerships are important for enterprise adoption.
    • Academic collaborations: The MIT and IBM research project Namda (discussed next) represents a partnership between academia and industry to push MCP’s limits in decentralized settings.
    • GitHub & VS Code: Partnership to enhance developer experience – e.g., VS Code’s team actively contributed to MCP (one of the registry maintainers is from VS Code team).
    • Numerous startups: Many AI startups (agent startups, workflow automation startups) are building on MCP instead of reinventing the wheel. This includes emerging Web3 AI startups looking to offer “AI as a DAO” or autonomous economic agents.

Overall, the MCP community is diverse and rapidly expanding. It includes core tech companies (for standards and base tooling), Web3 specialists (bringing blockchain knowledge and use cases), and independent developers (who often contribute connectors for their favorite apps or protocols). The ethos is collaborative. For example, security concerns about third-party MCP servers have prompted community discussions and contributions of best practices (e.g., Stacklok contributors working on security tooling for MCP servers). The community’s ability to iterate quickly (MCP saw several spec upgrades within months, adding features like streaming responses and better auth) is a testament to broad engagement.

In the Web3 ecosystem specifically, MCP has fostered a mini-ecosystem of “AI + Web3” projects. It’s not just a protocol to use; it’s catalyzing new ideas like AI-driven DAOs, on-chain governance aided by AI analysis, and cross-domain automation (like linking on-chain events to off-chain actions through AI). The presence of key Web3 figures – e.g., Zhivko Todorov of LimeChain stating “MCP represents the inevitable integration of AI and blockchain” – shows that blockchain veterans are actively championing it. Partnerships between AI and blockchain companies (such as the one between Anthropic and Block, or Microsoft’s Azure cloud making MCP easy to deploy alongside its blockchain services) hint at a future where AI agents and smart contracts work hand-in-hand.

One could say MCP has ignited the first genuine convergence of the AI developer community with the Web3 developer community. Hackathons and meetups now feature MCP tracks. As a concrete measure of ecosystem adoption: by mid-2025, OpenAI, Google, and Anthropic – collectively representing the majority of advanced AI models – all support MCP, and on the other side, leading blockchain infrastructure providers (Alchemy, Chainstack), crypto companies (Block, etc.), and decentralized projects are building MCP hooks. This two-sided network effect bodes well for MCP becoming a lasting standard.

6. Roadmap and Development Milestones

MCP’s development has been fast-paced. Here we outline the major milestones so far and the roadmap ahead as gleaned from official sources and community updates:

  • Late 2024 – Initial Release: On Nov 25, 2024, Anthropic officially announced MCP and open-sourced the specification and initial SDKs. Alongside the spec, they released a handful of MCP server implementations for common tools (Google Drive, Slack, GitHub, etc.) and added support in the Claude AI assistant (Claude Desktop app) to connect to local MCP servers. This marked the 1.0 launch of MCP. Early proof-of-concept integrations at Anthropic showed how Claude could use MCP to read files or query a SQL database in natural language, validating the concept.
  • Q1 2025 – Rapid Adoption and Iteration: In the first few months of 2025, MCP saw widespread industry adoption. By March 2025, OpenAI and other AI providers announced support (as described above). This period also saw spec evolution: Anthropic updated MCP to include streaming capabilities (allowing large results or continuous data streams to be sent incrementally). This update was noted in April 2025 with the C# SDK news, indicating MCP now supported features like chunked responses or real-time feed integration. The community also built reference implementations in various languages (Python, JavaScript, etc.) beyond Anthropic’s SDK, ensuring polyglot support.
  • Q2 2025 – Ecosystem Tooling and Governance: In May 2025, with Microsoft and GitHub joining the effort, there was a push for formalizing governance and enhancing security. At Build 2025, Microsoft unveiled plans for Windows 11 MCP integration and detailed a collaboration to improve authorization flows in MCP. Around the same time, the idea of an MCP Registry was introduced to index available servers (the initial brainstorming started in March 2025 according to the registry blog). The “standards track” process (SEP – Standard Enhancement Proposals) was established on GitHub, similar to Ethereum’s EIPs or Python’s PEPs, to manage contributions in an orderly way. Community calls and working groups (for security, registry, SDKs) started convening.
  • Mid 2025 – Feature Expansion: By mid-2025, the roadmap prioritized several key improvements:
    • Asynchronous and Long-Running Task Support: Plans to allow MCP to handle long operations without blocking the connection. For example, if an AI triggers a cloud job that takes minutes, the MCP protocol would support async responses or reconnection to fetch results.
    • Authentication & Fine-Grained Security: Developing fine-grained authorization mechanisms for sensitive actions. This includes possibly integrating OAuth flows, API keys, and enterprise SSO into MCP servers so that AI access can be safely managed. By mid-2025, guides and best practices for MCP security were in progress, given the security risks of allowing AI to invoke powerful tools. The goal is that, for instance, if an AI is to access a user’s private database via MCP, it should follow a secure authorization flow (with user consent) rather than just an open endpoint.
    • Validation and Compliance Testing: Recognizing the need for reliability, the community prioritized building compliance test suites and reference implementations. By ensuring all MCP clients/servers adhere to the spec (through automated testing), they aimed to prevent fragmentation. A reference server (likely an example with best practices for remote deployment and auth) was on the roadmap, as was a reference client application demonstrating full MCP usage with an AI.
    • Multimodality Support: Extending MCP beyond text to support modalities like image, audio, video data in the context. For example, an AI might request an image from an MCP server (say, a design asset or a diagram) or output an image. The spec discussion included adding support for streaming and chunked messages to handle large multimedia content interactively. Early work on “MCP Streaming” was already underway (to support things like live audio feeds or continuous sensor data to AI).
    • Central Registry & Discovery: The plan to implement a central MCP Registry service for server discovery was executed in mid-2025. By September 2025, the official MCP Registry was launched in preview. This registry provides a single source of truth for publicly available MCP servers, allowing clients to find servers by name, category, or capabilities. It’s essentially like an app store (but open) for AI tools. The design allows for public registries (a global index) and private ones (enterprise-specific), all interoperable via a shared API. The Registry also introduced a moderation mechanism to flag or delist malicious servers, with a community moderation model to maintain quality.
  • Late 2025 and Beyond – Toward Decentralized MCP Networks: While not “official” roadmap items yet, the trajectory points toward more decentralization and Web3 synergy:
    • Researchers are actively exploring how to add decentralized discovery, reputation, and incentive layers to MCP. The concept of an MCP Network (or “marketplace of MCP endpoints”) is being incubated. This might involve smart contract-based registries (so no single point of failure for server listings), reputation systems where servers/clients have on-chain identities and stake for good behavior, and possibly token rewards for running reliable MCP nodes.
    • Project Namda at MIT, which started in 2024, is a concrete step in this direction. By 2025, Namda had built a prototype distributed agent framework on MCP’s foundations, including features like dynamic node discovery, load balancing across agent clusters, and a decentralized registry using blockchain techniques. They even have experimental token-based incentives and provenance tracking for multi-agent collaborations. Milestones from Namda show that it’s feasible to have a network of MCP agents running across many machines with trustless coordination. If Namda’s concepts are adopted, we might see MCP evolve to incorporate some of these ideas (possibly through optional extensions or separate protocols layered on top).
    • Enterprise Hardening: On the enterprise side, by late 2025 we expect MCP to be integrated into major enterprise software offerings (Microsoft’s inclusion in Windows and Azure is one example). The roadmap includes enterprise-friendly features like SSO integration for MCP servers and robust access controls. The general availability of the MCP Registry and toolkits for deploying MCP at scale (e.g., within a corporate network) is likely by end of 2025.
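
To make the registry/discovery idea above concrete, here is a minimal client-side sketch of querying a server index and filtering entries by keyword. It is only a sketch under assumptions: the endpoint URL, the JSON shape, and the field names are hypothetical placeholders, not the official MCP Registry API.

```python
"""Minimal sketch of server discovery against an MCP-style registry.

Assumptions (hypothetical, not the official Registry spec): the endpoint URL,
the JSON shape {"servers": [{"name", "description", "url"}]}, and the filter
logic are placeholders for illustration only.
"""
import json
import urllib.request

REGISTRY_URL = "https://registry.example.com/v0/servers"  # hypothetical endpoint


def discover_servers(keyword: str) -> list[dict]:
    """Fetch the public server index and keep entries mentioning `keyword`."""
    with urllib.request.urlopen(REGISTRY_URL) as resp:
        index = json.load(resp)
    keyword = keyword.lower()
    return [
        entry for entry in index.get("servers", [])
        if keyword in entry.get("name", "").lower()
        or keyword in entry.get("description", "").lower()
    ]


if __name__ == "__main__":
    # e.g. look for servers that expose Solana-related tools
    for server in discover_servers("solana"):
        print(server["name"], "->", server["url"])
```

A private enterprise registry could expose the same shape behind its own URL, which is how the public/private split described above would stay interoperable from a client’s point of view.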

To recap some key development milestones so far (timeline format for clarity):

  • Nov 2024: MCP 1.0 released (Anthropic).
  • Dec 2024 – Jan 2025: Community builds first wave of MCP servers; Anthropic releases Claude Desktop with MCP support; small-scale pilots by Block, Apollo, etc.
  • Feb 2025: 1000+ community MCP connectors achieved; Anthropic hosts workshops (e.g., at an AI summit, driving education).
  • Mar 2025: OpenAI announces support (ChatGPT Agents SDK).
  • Apr 2025: Google DeepMind announces support (Gemini will support MCP); Microsoft releases preview of C# SDK.
  • May 2025: Steering Committee expanded (Microsoft/GitHub); Build 2025 demos (Windows MCP integration).
  • Jun 2025: Chainstack launches Web3 MCP servers (EVM/Solana) for public use.
  • Jul 2025: MCP spec version updates (streaming, authentication improvements); official Roadmap published on MCP site.
  • Sep 2025: MCP Registry (preview) launched; likely MCP hits general availability in more products (Claude for Work, etc.).
  • Late 2025 (projected): Registry v1.0 live; security best-practice guides released; possibly initial experiments with decentralized discovery (Namda results).

The vision forward is that MCP becomes as ubiquitous and invisible as HTTP or JSON – a common layer that many apps use under the hood. For Web3, the roadmap suggests deeper fusion: where not only will AI agents use Web3 (blockchains) as sources or sinks of information, but Web3 infrastructure itself might start to incorporate AI agents (via MCP) as part of its operation (for example, a DAO might run an MCP-compatible AI to manage certain tasks, or oracles might publish data via MCP endpoints). The roadmap’s emphasis on things like verifiability and authentication hints that down the line, trust-minimized MCP interactions could be a reality – imagine AI outputs that come with cryptographic proofs, or an on-chain log of what tools an AI invoked for audit purposes. These possibilities blur the line between AI and blockchain networks, and MCP is at the heart of that convergence.

In conclusion, MCP’s development is highly dynamic. It has hit major early milestones (broad adoption and standardization within a year of launch) and continues to evolve rapidly with a clear roadmap emphasizing security, scalability, and discovery. The milestones achieved and planned ensure MCP will remain robust as it scales: addressing challenges like long-running tasks, secure permissions, and the sheer discoverability of thousands of tools. This forward momentum indicates that MCP is not a static spec but a growing standard, likely to incorporate more Web3-flavored features (decentralized governance of servers, incentive alignment) as those needs arise. The community is poised to adapt MCP to new use cases (multimodal AI, IoT, etc.), all while keeping an eye on the core promise: making AI more connected, context-aware, and user-empowering in the Web3 era.

7. Comparison with Similar Web3 Projects or Protocols

MCP’s unique blend of AI and connectivity means there aren’t many direct apples-to-apples equivalents, but it’s illuminating to compare it with other projects at the intersection of Web3 and AI or with analogous goals:

  • SingularityNET (AGIX) – Decentralized AI Marketplace: SingularityNET, launched in 2017 by Dr. Ben Goertzel and others, is a blockchain-based marketplace for AI services. It allows developers to monetize AI algorithms as services and users to consume those services, all facilitated by a token (AGIX) which is used for payments and governance. In essence, SingularityNET is trying to decentralize the supply of AI models by hosting them on a network where anyone can call an AI service in exchange for tokens. This differs from MCP fundamentally. MCP does not host or monetize AI models; instead, it provides a standard interface for AI (wherever it’s running) to access data/tools. One could imagine using MCP to connect an AI to services listed on SingularityNET, but SingularityNET itself focuses on the economic layer (who provides an AI service and how they get paid). Another key difference: Governance – SingularityNET has on-chain governance (via SingularityNET Enhancement Proposals (SNEPs) and AGIX token voting) to evolve its platform. MCP’s governance, by contrast, is off-chain and collaborative without a token. In summary, SingularityNET and MCP both strive for a more open AI ecosystem, but SingularityNET is about a tokenized network of AI algorithms, whereas MCP is about a protocol standard for AI-tool interoperability. They could complement: for example, an AI on SingularityNET could use MCP to fetch external data it needs. But SingularityNET doesn’t attempt to standardize tool use; it uses blockchain to coordinate AI services, while MCP uses software standards to let AI work with any service.
  • Fetch.ai (FET) – Agent-Based Decentralized Platform: Fetch.ai is another project blending AI and blockchain. It launched its own proof-of-stake blockchain and framework for building autonomous agents that perform tasks and interact on a decentralized network. In Fetch’s vision, millions of “software agents” (representing people, devices, or organizations) can negotiate and exchange value, using FET tokens for transactions. Fetch.ai provides an agent framework (uAgents) and infrastructure for discovery and communication between agents on its ledger. For example, a Fetch agent might help optimize traffic in a city by interacting with other agents for parking and transport, or manage a supply chain workflow autonomously. How does this compare to MCP? Both deal with the concept of agents, but Fetch.ai’s agents are strongly tied to its blockchain and token economy – they live on the Fetch network and use on-chain logic. MCP agents (AI hosts) are model-driven (like an LLM) and not tied to any single network; MCP is content to operate over the internet or within a cloud setup, without requiring a blockchain. Fetch.ai tries to build a new decentralized AI economy from the ground up (with its own ledger for trust and transactions), whereas MCP is layer-agnostic – it piggybacks on existing networks (could be used over HTTPS, or even on top of a blockchain if needed) to enable AI interactions. One might say Fetch is more about autonomous economic agents and MCP about smart tool-using agents. Interestingly, these could intersect: an autonomous agent on Fetch.ai might use MCP to interface with off-chain resources or other blockchains. Conversely, one could use MCP to build multi-agent systems that leverage different blockchains (not just one). In practice, MCP has seen faster adoption because it didn’t require its own network – it works with Ethereum, Solana, Web2 APIs, etc., out of the box. Fetch.ai’s approach is more heavyweight, creating an entire ecosystem that participants must join (and acquire tokens) to use. In sum, Fetch.ai vs MCP: Fetch is a platform with its own token/blockchain for AI agents, focusing on interoperability and economic exchanges between agents, while MCP is a protocol that AI agents (in any environment) can use to plug into tools and data. Their goals overlap in enabling AI-driven automation, but they tackle different layers of the stack and have very different architectural philosophies (closed ecosystem vs open standard).
  • Chainlink and Decentralized Oracles – Connecting Blockchains to Off-Chain Data: Chainlink is not an AI project, but it’s highly relevant as a Web3 protocol solving a complementary problem: how to connect blockchains with external data and computation. Chainlink is a decentralized network of nodes (oracles) that fetch, verify, and deliver off-chain data to smart contracts in a trust-minimized way. For example, Chainlink oracles provide price feeds to DeFi protocols or call external APIs on behalf of smart contracts via Chainlink Functions. Comparatively, MCP connects AI models to external data/tools (some of which might be blockchains). One could say Chainlink brings data into blockchains, while MCP brings data into AI. There is a conceptual parallel: both establish a bridge between otherwise siloed systems. Chainlink focuses on reliability, decentralization, and security of data fed on-chain (solving the “oracle problem” of single point of failure). MCP focuses on flexibility and standardization of how AI can access data (solving the “integration problem” for AI agents). They operate in different domains (smart contracts vs AI assistants), but one might compare MCP servers to oracles: an MCP server for price data might call the same APIs a Chainlink node does. The difference is the consumer – in MCP’s case, the consumer is an AI or user-facing assistant, not a deterministic smart contract. Also, MCP does not inherently provide the trust guarantees that Chainlink does (MCP servers can be centralized or community-run, with trust managed at the application level). However, as mentioned earlier, ideas to decentralize MCP networks could borrow from oracle networks – e.g., multiple MCP servers could be queried and results cross-checked to ensure an AI isn’t fed bad data, similar to how multiple Chainlink nodes aggregate a price (a minimal sketch of this cross-checking idea follows this list). In short, Chainlink vs MCP: Chainlink is Web3 middleware for blockchains to consume external data, MCP is AI middleware for models to consume external data (which could include blockchain data). They address analogous needs in different realms and could even complement: an AI using MCP might fetch a Chainlink-provided data feed as a reliable resource, and conversely, an AI could serve as a source of analysis that a Chainlink oracle brings on-chain (though that latter scenario would raise questions of verifiability).
  • ChatGPT Plugins / OpenAI Functions vs MCP – AI Tool Integration Approaches: While not Web3 projects, a quick comparison is warranted because ChatGPT plugins and OpenAI’s function calling feature also connect AI to external tools. ChatGPT plugins use an OpenAPI specification provided by a service, and the model can then call those APIs following the spec. The limitations are that it’s a closed ecosystem (OpenAI-approved plugins running on OpenAI’s servers) and each plugin is a siloed integration. OpenAI’s newer “Agents” SDK is closer to MCP in concept, letting developers define tools/functions that an AI can use, but initially it was specific to OpenAI’s ecosystem. LangChain similarly provided a framework to give LLMs tools in code. MCP differs by offering an open, model-agnostic standard for this. As one analysis put it, LangChain created a developer-facing standard (a Python interface) for tools, whereas MCP creates a model-facing standard – an AI agent can discover and use any MCP-defined tool at runtime without custom code. In practical terms, MCP’s ecosystem of servers grew larger and more diverse than the ChatGPT plugin store within months. And rather than each model having its own plugin format (OpenAI had theirs, others had different ones), many are coalescing around MCP. OpenAI itself signaled support for MCP, essentially aligning their function approach with the broader standard. So, comparing OpenAI Plugins to MCP: plugins are a curated, centralized approach, while MCP is a decentralized, community-driven approach. In a Web3 mindset, MCP is more “open source and permissionless” whereas proprietary plugin ecosystems are more closed. This makes MCP analogous to the ethos of Web3 even though it’s not a blockchain – it enables interoperability and user control (you could run your own MCP server for your data, instead of giving it all to one AI provider). This comparison shows why many consider MCP as having more long-term potential: it’s not locked to one vendor or one model.
  • Project Namda and Decentralized Agent Frameworks: Namda deserves a separate note because it explicitly combines MCP with Web3 concepts. As described earlier, Namda (Networked Agent Modular Distributed Architecture) is an MIT/IBM initiative started in 2024 to build a scalable, distributed network of AI agents using MCP as the communication layer. It treats MCP as the messaging backbone (since MCP uses standard JSON-RPC-like messages, it fit well for inter-agent comms), and then adds layers for dynamic discovery, fault tolerance, and verifiable identities using blockchain-inspired techniques. Namda’s agents can be anywhere (cloud, edge devices, etc.), but a decentralized registry (somewhat like a DHT or blockchain) keeps track of them and their capabilities in a tamper-proof way. They even explore giving agents tokens to incentivize cooperation or resource sharing. In essence, Namda is an experiment in what a “Web3 version of MCP” might look like. It’s not a widely deployed project yet, but it’s one of the closest “similar protocols” in spirit. If we view Namda vs MCP: Namda uses MCP (so it’s not competing standards), but extends it with a protocol for networking and coordinating multiple agents in a trust-minimized manner. One could compare Namda to frameworks like Autonolas or Multi-Agent Systems (MAS) that the crypto community has seen, but those often lacked a powerful AI component or a common protocol. Namda + MCP together showcase how a decentralized agent network could function, with blockchain providing identity, reputation, and possibly token incentives, and MCP providing the agent communication and tool-use.
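
The oracle-style cross-checking idea referenced in the Chainlink comparison can be sketched very simply: query several independent MCP-style data servers, take the median, and reject answers that disagree too widely. The server URLs, the JSON-RPC method name, the tool name, and the result shape below are assumptions for illustration; a real client would use an MCP SDK and the servers’ actual tool schemas.

```python
"""Sketch of oracle-style aggregation across several MCP-style data servers.

Assumptions for illustration only: the server URLs, the JSON-RPC method name
("tools/call"), the tool name ("get_price"), and the result shape are
hypothetical; a real client would follow the servers' actual tool schemas.
"""
import json
import statistics
import urllib.request

# Hypothetical endpoints that each expose some price-lookup tool.
SERVERS = [
    "https://mcp-a.example.com/rpc",
    "https://mcp-b.example.com/rpc",
    "https://mcp-c.example.com/rpc",
]


def ask_price(endpoint: str, symbol: str) -> float:
    """Ask one server for a price via a JSON-RPC style request."""
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",  # assumed method name
        "params": {"name": "get_price", "arguments": {"symbol": symbol}},
    }
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return float(json.load(resp)["result"]["price"])  # assumed result shape


def cross_checked_price(symbol: str, max_spread: float = 0.02) -> float:
    """Take the median of independent answers and reject a suspiciously wide spread."""
    quotes = [ask_price(url, symbol) for url in SERVERS]
    mid = statistics.median(quotes)
    if max(quotes) - min(quotes) > max_spread * mid:
        raise ValueError(f"Servers disagree too much: {quotes}")
    return mid
```

Nothing here is trust-minimized in the Chainlink sense; it simply shows how an AI-side client could reduce its exposure to any single misbehaving server.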

In summary, MCP stands apart from most prior Web3 projects: it did not start as a crypto project at all, yet it rapidly intersects with Web3 because it solves complementary problems. Projects like SingularityNET and Fetch.ai aimed to decentralize AI compute or services using blockchain; MCP instead standardizes AI integration with services, which can enhance decentralization by avoiding platform lock-in. Oracle networks like Chainlink solved data delivery to blockchain; MCP solves data delivery to AI (including blockchain data). If Web3’s core ideals are decentralization, interoperability, and user empowerment, MCP is attacking the interoperability piece in the AI realm. It’s even influencing those older projects – for instance, there is nothing stopping SingularityNET from making its AI services available via MCP servers, or Fetch agents from using MCP to talk to external systems. We might well see a convergence where token-driven AI networks use MCP as their lingua franca, marrying the incentive structure of Web3 with the flexibility of MCP.

Finally, if we consider market perception: MCP is often touted as doing for AI what Web3 hoped to do for the internet – break silos and empower users. This has led some to nickname MCP informally as “Web3 for AI” (even when no blockchain is involved). However, it’s important to recognize MCP is a protocol standard, whereas most Web3 projects are full-stack platforms with economic layers. In comparisons, MCP usually comes out as a more lightweight, universal solution, while blockchain projects are heavier, specialized solutions. Depending on use case, they can complement rather than strictly compete. As the ecosystem matures, we might see MCP integrated into many Web3 projects as a module (much like how HTTP or JSON are ubiquitous), rather than as a rival project.

8. Public Perception, Market Traction, and Media Coverage

Public sentiment toward MCP has been overwhelmingly positive in both the AI and Web3 communities, often bordering on enthusiastic. Many see it as a game-changer that arrived quietly but then took the industry by storm. Let’s break down the perception, traction, and notable media narratives:

Market Traction and Adoption Metrics: By mid-2025, MCP achieved a level of adoption rare for a new protocol. It’s backed by virtually all major AI model providers (Anthropic, OpenAI, Google, Meta) and supported by big tech infrastructure (Microsoft, GitHub, AWS etc.), as detailed earlier. This alone signals to the market that MCP is likely here to stay (akin to how broad backing propelled TCP/IP or HTTP in early internet days). On the Web3 side, the traction is evident in developer behavior: hackathons started featuring MCP projects, and many blockchain dev tools now mention MCP integration as a selling point. The stat of “1000+ connectors in a few months” and Mike Krieger’s “thousands of integrations” quote are often cited to illustrate how rapidly MCP caught on. This suggests strong network effects – the more tools available via MCP, the more useful it is, prompting more adoption (a positive feedback loop). VCs and analysts have noted that MCP achieved in under a year what earlier “AI interoperability” attempts failed to do over several years, largely due to timing (riding the wave of interest in AI agents) and being open-source. In Web3 media, traction is sometimes measured in terms of developer mindshare and integration into projects, and MCP scores high on both now.

Public Perception in AI and Web3 Communities: Initially, MCP flew under the radar when first announced (late 2024). But by early 2025, as success stories emerged, perception shifted to excitement. AI practitioners saw MCP as the “missing puzzle piece” for making AI agents truly useful beyond toy examples. Web3 builders, on the other hand, saw it as a bridge to finally incorporate AI into dApps without throwing away decentralization – an AI can use on-chain data without needing a centralized oracle, for instance. Thought leaders have been singing praises: for example, Jesus Rodriguez (a prominent Web3 AI writer) wrote in CoinDesk that MCP may be “one of the most transformative protocols for the AI era and a great fit for Web3 architectures”. Rares Crisan in a Notable Capital blog argued that MCP could deliver on Web3’s promise where blockchain alone struggled, by making the internet more user-centric and natural to interact with. These narratives frame MCP as revolutionary yet practical – not just hype.

To be fair, not all commentary is uncritical. Some AI developers on forums like Reddit have pointed out that MCP “doesn’t do everything” – it’s a communication protocol, not an out-of-the-box agent or reasoning engine. For instance, one Reddit discussion titled “MCP is a Dead-End Trap” argued that MCP by itself doesn’t manage agent cognition or guarantee quality; it still requires good agent design and safety controls. This view suggests MCP could be overhyped as a silver bullet. However, these criticisms are more about tempering expectations than rejecting MCP’s usefulness. They emphasize that MCP solves tool connectivity but one must still build robust agent logic (i.e., MCP doesn’t magically create an intelligent agent, it equips one with tools). The consensus though is that MCP is a big step forward, even among cautious voices. Hugging Face’s community blog noted that while MCP isn’t a solve-it-all, it is a major enabler for integrated, context-aware AI, and developers are rallying around it for that reason.

Media Coverage: MCP has received significant coverage across both mainstream tech media and niche blockchain media:

  • TechCrunch has run multiple stories. They covered the initial concept (“Anthropic proposes a new way to connect data to AI chatbots”) around launch in 2024. In 2025, TechCrunch highlighted each big adoption moment: OpenAI’s support, Google’s embrace, Microsoft/GitHub’s involvement. These articles often emphasize the industry unity around MCP. For example, TechCrunch quoted Sam Altman’s endorsement and noted the rapid shift from rival standards to MCP. In doing so, they portrayed MCP as the emerging standard similar to how no one wanted to be left out of the internet protocols in the 90s. Such coverage in a prominent outlet signaled to the broader tech world that MCP is important and real, not just a fringe open-source project.
  • CoinDesk and other crypto publications latched onto the Web3 angle. CoinDesk’s opinion piece by Rodriguez (July 2025) is often cited; it painted a futuristic picture where every blockchain could be an MCP server and new MCP networks might run on blockchains. It connected MCP to concepts like decentralized identity, authentication, and verifiability – speaking the language of the blockchain audience and suggesting MCP could be the protocol that truly melds AI with decentralized frameworks. Cointelegraph, Bankless, and others have also discussed MCP in context of “AI agents & DeFi” and similar topics, usually optimistic about the possibilities (e.g., Bankless had a piece on using MCP to let an AI manage on-chain trades, and included a how-to for their own MCP server).
  • Notable VC Blogs / Analyst Reports: The Notable Capital blog post (July 2025) is an example of venture analysis drawing parallels between MCP and the evolution of web protocols. It essentially argues MCP could do for Web3 what HTTP did for Web1 – providing a new interface layer (natural language interface) that doesn’t replace underlying infrastructure but makes it usable. This kind of narrative is compelling and has been echoed in panels and podcasts. It positions MCP not as competing with blockchain, but as the next layer of abstraction that finally allows normal users (via AI) to harness blockchain and web services easily.
  • Developer Community Buzz: Outside formal articles, MCP’s rise can be gauged by its presence in developer discourse – conference talks, YouTube channels, newsletters. For instance, there have been popular blog posts like “MCP: The missing link for agentic AI?” on sites like Runtime.news, and newsletters (e.g., one by AI researcher Nathan Lambert) discussing practical experiments with MCP and how it compares to other tool-use frameworks. The general tone is curiosity and excitement: developers share demos of hooking up AI to their home automation or crypto wallet with just a few lines using MCP servers, something that felt sci-fi not long ago. This grassroots excitement is important because it shows MCP has mindshare beyond just corporate endorsements.
  • Enterprise Perspective: Media and analysts focusing on enterprise AI also note MCP as a key development. For example, The New Stack covered how Anthropic added support for remote MCP servers in Claude for enterprise use. The angle here is that enterprises can use MCP to connect their internal knowledge bases and systems to AI safely. This matters for Web3 too as many blockchain companies are enterprises themselves and can leverage MCP internally (for instance, a crypto exchange could use MCP to let an AI analyze internal transaction logs for fraud detection).

Notable Quotes and Reactions: A few are worth highlighting as encapsulating public perception:

  • “Much like HTTP revolutionized web communications, MCP provides a universal framework... replacing fragmented integrations with a single protocol.” – CoinDesk. This comparison to HTTP is powerful; it frames MCP as infrastructure-level innovation.
  • “MCP has [become a] thriving open standard with thousands of integrations and growing. LLMs are most useful when connecting to the data you already have...” – Mike Krieger (Anthropic). This is an official confirmation of both traction and the core value proposition, which has been widely shared on social media.
  • “The promise of Web3... can finally be realized... through natural language and AI agents. ...MCP is the closest thing we've seen to a real Web3 for the masses.” – Notable Capital. This bold statement resonates with those frustrated by the slow UX improvements in crypto; it suggests AI might crack the code of mainstream adoption by abstracting complexity.

Challenges and Skepticism: While enthusiasm is high, the media has also discussed challenges:

  • Security Concerns: Outlets like The New Stack or security blogs have raised that allowing AI to execute tools can be dangerous if not sandboxed. What if a malicious MCP server tried to get an AI to perform a harmful action? The LimeChain blog explicitly warns of “significant security risks” with community-developed MCP servers (e.g., a server that handles private keys must be extremely secure). These concerns have been echoed in discussions: essentially, MCP expands AI’s capabilities, but with power comes risk. The community’s response (guides, auth mechanisms) has been covered as well, generally reassuring that mitigations are being built. Still, any high-profile misuse of MCP (say an AI triggered an unintended crypto transfer) would affect perception, so media is watchful on this front.
  • Performance and Cost: Some analysts note that using AI agents with tools could be slower or more costly than directly calling an API (because the AI might need multiple back-and-forth steps to get what it needs). In high-frequency trading or on-chain execution contexts, that latency could be problematic. For now, these are seen as technical hurdles to optimize (through better agent design or streaming), rather than deal-breakers.
  • Hype management: As with any trending tech, there’s a bit of hype. A few voices caution not to declare MCP the solution to everything. For instance, the Hugging Face article asks “Is MCP a silver bullet?” and answers no – developers still need to handle context management, and MCP works best in combination with good prompting and memory strategies. Such balanced takes are healthy in the discourse.

Overall Media Sentiment: The narrative that emerges is largely hopeful and forward-looking:

  • MCP is seen as a practical tool delivering real improvements now (so not vaporware), which media underscore by citing working examples: Claude reading files, Copilot using MCP in VSCode, an AI completing a Solana transaction in a demo, etc.
  • It’s also portrayed as a strategic linchpin for the future of both AI and Web3. Media often conclude that MCP or things like it will be essential for “decentralized AI” or “Web4” or whatever term one uses for the next-gen web. There’s a sense that MCP opened a door, and now innovation is flowing through – whether it's Namda’s decentralized agents or enterprises connecting legacy systems to AI, many future storylines trace back to MCP’s introduction.

In the market, one could gauge traction by the formation of startups and funding around the MCP ecosystem. Indeed, there are rumors/reports of startups focusing on “MCP marketplaces” or managed MCP platforms getting funding (Notable Capital writing about it suggests VC interest). We can expect media to start covering those tangentially – e.g., “Startup X uses MCP to let your AI manage your crypto portfolio – raises $Y million”.

Conclusion of Perception: By late 2025, MCP enjoys a reputation as a breakthrough enabling technology. It has strong advocacy from influential figures in both AI and crypto. The public narrative has evolved from “here’s a neat tool” to “this could be foundational for the next web”. Meanwhile, practical coverage confirms it’s working and being adopted, lending credibility. Provided the community continues addressing challenges (security, governance at scale) and no major disasters occur, MCP’s public image is likely to remain positive or even become iconic as “the protocol that made AI and Web3 play nice together.”

Media will likely keep a close eye on:

  • Success stories (e.g., if a major DAO implements an AI treasurer via MCP, or a government uses MCP for open data AI systems).
  • Any security incidents (to evaluate risk).
  • The evolution of MCP networks and whether any token or blockchain component officially enters the picture (which would be big news bridging AI and crypto even more tightly).

As of now, however, the coverage can be summed up by a line from CoinDesk: “The combination of Web3 and MCP might just be a new foundation for decentralized AI.” – a sentiment that captures both the promise and the excitement surrounding MCP in the public eye.

References:

  • Anthropic News: "Introducing the Model Context Protocol," Nov 2024
  • LimeChain Blog: "What is MCP and How Does It Apply to Blockchains?" May 2025
  • Chainstack Blog: "MCP for Web3 Builders: Solana, EVM and Documentation," June 2025
  • CoinDesk Op-Ed: "The Protocol of Agents: Web3’s MCP Potential," Jul 2025
  • Notable Capital: "Why MCP Represents the Real Web3 Opportunity," Jul 2025
  • TechCrunch: "OpenAI adopts Anthropic’s standard…", Mar 26, 2025
  • TechCrunch: "Google to embrace Anthropic’s standard…", Apr 9, 2025
  • TechCrunch: "GitHub, Microsoft embrace… (MCP steering committee)", May 19, 2025
  • Microsoft Dev Blog: "Official C# SDK for MCP," Apr 2025
  • Hugging Face Blog: "#14: What Is MCP, and Why Is Everyone Talking About It?" Mar 2025
  • Messari Research: "Fetch.ai Profile," 2023
  • Medium (Nu FinTimes): "Unveiling SingularityNET," Mar 2024

World Liberty Financial: The Future of Money, Backed by USD1

· 11 min read
Dora Noda
Software Engineer

Overview of World Liberty Financial

World Liberty Financial (WLFI) is a decentralized‑finance (DeFi) platform created by members of the Trump family and their partners. According to the Trump Organization’s site, the platform aims to bridge traditional banking and blockchain technology by combining the stability of legacy finance with the transparency and accessibility of decentralized systems. Its mission is to provide modern services for money movement, lending and digital‑asset management while supporting dollar‑backed stability, making capital accessible to individuals and institutions, and simplifying DeFi for mainstream users.

WLFI launched its governance token ($WLFI) in September 2025 and introduced a dollar‑pegged stablecoin called USD1 in March 2025. The platform describes USD1 as a “future of money” stablecoin designed to serve as the base pair for tokenized assets and to promote U.S. dollar dominance in the digital economy. Co‑founder Donald Trump Jr. has framed WLFI as a non‑political venture intended to empower everyday people and strengthen the U.S. dollar’s global role.

History and Founding

  • Origins (2024–2025). WLFI was announced in September 2024 as a crypto venture led by members of the Trump family. The company launched its governance token $WLFI later that year. According to Reuters, the enterprise’s initial $WLFI token sale raised only about $2.7 million, but sales surged after Donald Trump’s 2024 election victory (information referenced in widely cited reports, though not directly available in our sources). WLFI is majority‑owned by a Trump business entity and has nine co‑founders, including Donald Trump Jr., Eric Trump and Barron Trump.
  • Management. The Trump Organization describes WLFI’s leadership roles as: Donald Trump (Chief Crypto Advocate), Eric Trump and Donald Trump Jr. (Web3 Ambassadors), Barron Trump (DeFi visionary), and Zach Witkoff (CEO and co‑founder). The company’s daily operations are managed by Zach Witkoff and partners such as Zachary Folkman and Chase Herro.
  • Stablecoin initiative. WLFI announced the USD1 stablecoin in March 2025. USD1 was described as a dollar‑pegged stablecoin backed by U.S. Treasuries, U.S. dollar deposits and other cash equivalents. The coin’s reserves are custodied by BitGo Trust Company, a regulated digital‑asset custodian. USD1 launched on Binance’s BNB Chain and later expanded to Ethereum, Solana and Tron.

USD1 Stablecoin: Design and Features

Reserve model and stability mechanism

USD1 is designed as a fiat‑backed stablecoin with a 1:1 redemption mechanism. Each USD1 token is redeemable for one U.S. dollar, and the stablecoin’s reserves are held in short‑term U.S. Treasury bills, dollar deposits and cash equivalents. These assets are custodied by BitGo Trust, a regulated entity known for institutional digital‑asset custody. WLFI advertises that USD1 offers:

  1. Full collateralization and audits. The reserves are fully collateralized and subject to monthly third‑party attestations, providing transparency over backing assets. In May 2025, Binance Academy noted that regular reserve breakdowns were not yet publicly available and that WLFI had pledged third‑party audits.
  2. Institutional orientation. WLFI positions USD1 as an “institutional‑ready” stablecoin aimed at banks, funds and large companies, though it is also accessible to retail users.
  3. Zero mint/redeem fees. USD1 reportedly charges no fees for minting or redemption, reducing friction for users handling large volumes.
  4. Cross‑chain interoperability. The stablecoin uses Chainlink’s Cross‑Chain Interoperability Protocol (CCIP) to enable secure transfers across Ethereum, BNB Chain and Tron. Plans to expand to additional blockchains were confirmed through partnerships with networks like Aptos and Tron.

Market performance

  • Rapid growth. Within a month of launch, USD1’s market capitalization reached about **$2.1 billion**, driven by high‑profile institutional deals such as a $2 billion investment by Abu Dhabi’s MGX fund into Binance using USD1. By early October 2025 the supply had grown to roughly $2.68 billion, with most tokens issued on BNB Chain (79 %), followed by Ethereum, Solana and Tron.
  • Listing and adoption. Binance listed USD1 on its spot market in May 2025. WLFI touts widespread integration across DeFi protocols and centralized exchanges. DeFi platforms like ListaDAO, Venus Protocol and Aster support lending, borrowing and liquidity pools using USD1. WLFI emphasizes that users can redeem USD1 for U.S. dollars through BitGo within one to two business days.

Institutional uses and tokenized asset plans

WLFI envisions USD1 as the default settlement asset for tokenized real‑world assets (RWAs). CEO Zach Witkoff has said that commodities such as oil, gas, cotton and timber should be traded on‑chain and that WLFI is actively working to tokenize these assets and pair them with USD1 because they require a trustworthy, transparent stablecoin. He described USD1 as “the most trustworthy and transparent stablecoin on Earth”.

Products and Services

Debit card and retail apps

At the TOKEN2049 conference in Singapore, Zach Witkoff announced that WLFI will release a crypto debit card that allows users to spend digital assets in everyday transactions. The company planned to launch a pilot program in the next quarter, with a full rollout expected in Q4 2025 or Q1 2026. CoinLaw summarized key details:

  • The card will link crypto balances to consumer purchases and is expected to integrate with services like Apple Pay.
  • WLFI is also developing a consumer‑facing retail app to complement the card.

Tokenization and investment products

Beyond payments, WLFI aims to tokenize real‑world commodities. Witkoff said they are exploring tokenization of oil, gas, timber and real estate to create blockchain‑based trading instruments. WLFI’s governance token ($WLFI), launched in September 2025, grants holders the ability to vote on certain corporate decisions. The project has also formed strategic partnerships, including ALT5 Sigma’s agreement to purchase $750 million of WLFI tokens as part of its treasury strategy.

Donald Trump Jr.’s Perspective

Co‑founder Donald Trump Jr. is a prominent public face of WLFI. His remarks at industry events and interviews reveal the motivations behind the project and his views on traditional finance, regulation and the U.S. dollar’s role.

Critique of traditional finance

  • “Broken” and undemocratic system. During a panel titled World Liberty Financial: The Future of Money, Backed by USD1 at the Token2049 conference, Trump Jr. argued that traditional finance is undemocratic and “broken.” He recounted that when his family entered politics, 300 of their bank accounts were eliminated overnight, citing this as an example of how financial institutions can punish individuals for political reasons. He said the family moved from being at the top of the financial “pyramid” to the bottom, and claimed that the system favors insiders and functions like a Ponzi scheme.
  • Inefficiency and lack of value. He criticised the traditional financial industry for being mired in inefficiencies, where people “making seven figures a year” merely push paperwork without adding real value.

Advocating for stablecoins and the dollar

  • Preserving dollar hegemony. Trump Jr. asserts that stablecoins like USD1 will backfill the role previously played by countries purchasing U.S. Treasuries. He told the Business Times that stablecoins could create “dollar hegemony” allowing the U.S. to lead globally and keep many places safe and sound. Speaking to Cryptopolitan, he argued that stablecoins actually preserve U.S. dollar dominance because demand for dollar‑backed tokens supports Treasuries at a time when conventional buyers (e.g., China and Japan) are reducing exposure.
  • Future of finance and DeFi. Trump Jr. described WLFI as the future of finance and emphasized that blockchain and DeFi technologies can democratize access to capital. At an ETH Denver event covered by Panews, he argued that clear regulatory frameworks are needed to prevent companies from moving offshore and to protect investors. He urged the U.S. to lead global crypto innovation and criticized excessive regulation for stifling growth.
  • Financial democratization. He believes combining traditional and decentralized finance through WLFI will provide liquidity, transparency and stability to underserved populations. He also highlights blockchain’s potential to eliminate corruption by making transactions transparent and on‑chain.
  • Advice to newcomers. Trump Jr. advises new investors to start with small amounts, avoid excessive leverage and engage in continuous learning about DeFi.

Political neutrality and media criticism

Trump Jr. stresses that WLFI is “100 % not a political organization” despite the Trump family’s deep involvement. He frames the venture as a platform to benefit Americans and the world rather than a political vehicle. During the Token2049 panel he criticized mainstream media outlets, saying they had discredited themselves, and Zach Witkoff asked the audience whether they considered The New York Times trustworthy.

Partnerships and Ecosystem Integration

MGX–Binance investment

In May 2025, WLFI announced that USD1 would facilitate a **$2 billion investment** by Abu Dhabi‑based MGX into crypto exchange **Binance**. The announcement highlighted WLFI’s growing influence and was touted as evidence of USD1’s institutional appeal. However, U.S. Senator Elizabeth Warren criticized the deal, calling it “corruption” because pending stablecoin legislation (the GENIUS Act) could benefit the president’s family. CoinMarketCap data cited by Reuters showed USD1’s circulating value reaching about $2.1 billion at that time.

Aptos partnership

At the TOKEN2049 conference in October 2025, WLFI and layer‑1 blockchain Aptos announced a partnership to deploy USD1 on the Aptos network. Brave New Coin reports that WLFI selected Aptos because of its high throughput (transactions settle in under half a second) and fees under one‑hundredth of a cent. The collaboration aims to challenge dominant stablecoin networks by providing cheaper, faster rails for institutional transactions. CryptoSlate notes that USD1’s integration will make Aptos the fifth network to mint the stablecoin, with day‑one support from DeFi protocols such as Echelon Market and Hyperion as well as wallets and exchanges like Petra, Backpack and OKX. WLFI executives view the expansion as part of a broader strategy to grow DeFi adoption and to position USD1 as a settlement layer for tokenized assets.

Debit‑card and Apple Pay integration

Reuters and CoinLaw report that WLFI will launch a crypto debit card bridging crypto assets with everyday spending. Witkoff told Reuters that the company expects to roll out a pilot program within the next quarter, with a full launch by late 2025 or early 2026. The card will integrate with Apple Pay, and WLFI will release a retail app to simplify crypto payments.

Controversies and Criticisms

Reserve transparency. Binance Academy highlighted that, as of May 2025, USD1 lacked publicly available reserve breakdowns. WLFI promised third‑party audits, but the absence of detailed disclosures raised investor concerns.

Political conflicts of interest. WLFI’s deep ties to the Trump family have drawn scrutiny. A Reuters investigation reported that an anonymous wallet holding $2 billion in USD1 received funds shortly before the MGX investment, and the owners of the wallet could not be identified. Critics argue that the venture could allow the Trump family to benefit financially from regulatory decisions. Senator Elizabeth Warren warned that the stablecoin legislation being considered by Congress would make it easier for the president and his family to “line their own pockets”. Media outlets like The New York Times and The New Yorker have described WLFI as eroding the boundary between private enterprise and public policy.

Market concentration and liquidity concerns. CoinLaw reported that more than half of USD1’s liquidity came from just three wallets as of June 2025. Such concentration raises questions about the organic demand for USD1 and its resilience in stressed markets.

Regulatory uncertainty. Trump Jr. himself acknowledges that U.S. crypto regulation remains unclear and calls for comprehensive rules to prevent companies from moving offshore. Critics argue that WLFI benefits from deregulatory moves by the Trump administration while shaping policy that could favour its own financial interests.

Conclusion

World Liberty Financial positions itself as a pioneer at the intersection of traditional finance and decentralized technology, using the USD1 stablecoin as the backbone for payments, tokenization and DeFi products. The platform’s emphasis on institutional backing, cross‑chain interoperability and zero‑fee minting distinguishes USD1 from other stablecoins. Partnerships with networks like Aptos and major deals such as the MGX‑Binance investment underscore WLFI’s ambition to become a global settlement layer for tokenized assets.

From Donald Trump Jr.’s perspective, WLFI is not merely a commercial venture but a mission to democratize finance, preserve U.S. dollar hegemony and challenge what he sees as a broken and elitist traditional‑finance system. He champions regulatory clarity while criticizing excessive oversight, reflecting broader debates within the crypto industry. However, WLFI’s political associations, opaque reserve disclosures and concentration of liquidity invite skepticism. The company’s success will depend on balancing innovation with transparency and navigating the complex interplay between private interests and public policy.

Directed Acyclic Graph (DAG) in Blockchain

· 47 min read
Dora Noda
Software Engineer

What is a DAG and How Does it Differ from a Blockchain?

A Directed Acyclic Graph (DAG) is a type of data structure consisting of vertices (nodes) connected by directed edges that never form a cycle. In the context of distributed ledgers, a DAG-based ledger organizes transactions or events in a web-like graph rather than a single sequential chain. This means that unlike a traditional blockchain where each new block references only one predecessor (forming a linear chain), a node in a DAG may reference multiple previous transactions or blocks. As a result, many transactions can be confirmed in parallel, rather than strictly one-by-one in chronological blocks.

To illustrate the difference, if a blockchain looks like a long chain of blocks (each block containing many transactions), a DAG-based ledger looks more like a tree or web of individual transactions. Every new transaction in a DAG can attach to (and thereby validate) one or more earlier transactions, instead of waiting to be packaged into the next single block. This structural difference leads to several key distinctions:

  • Parallel Validation: In blockchains, miners/validators add one block at a time to the chain, so transactions are confirmed in batches per new block. In DAGs, multiple transactions (or small “blocks” of transactions) can be added concurrently, since each can attach to different parts of the graph. This parallelization means DAG networks don’t have to wait for a single long chain to grow one block at a time.
  • No Global Sequential Order: A blockchain inherently creates a total order of transactions (every block has a definite place in one sequence). A DAG ledger, by contrast, forms a partial order of transactions. There is no single “latest block” that all transactions queue for; instead, many tips of the graph can coexist and be extended simultaneously. Consensus protocols are then needed to eventually sort out or agree on the order or validity of transactions in the DAG.
  • Transaction Confirmation: In a blockchain, transactions are confirmed when they are included in a mined/validated block and that block becomes part of the accepted chain (often after more blocks are added on top). In DAG systems, a new transaction itself helps confirm previous transactions by referencing them. For example, in IOTA’s Tangle (a DAG), each transaction must approve two previous transactions, effectively having users collaboratively validate each other’s transactions. This removes the strict division between “transaction creators” and “validators” that exists in blockchain mining – every participant issuing a transaction also does a bit of validation work.

Importantly, a blockchain is actually a special case of a DAG – a DAG that has been constrained to a single chain of blocks. Both are forms of distributed ledger technology (DLT) and share goals like immutability and decentralization. However, DAG-based ledgers are “blockless” or multi-parent in structure, which gives them different properties in practice. Traditional blockchains like Bitcoin and Ethereum use sequential blocks and often discard any competing blocks (forks), whereas DAG ledgers attempt to incorporate and arrange all transactions without discarding any, as long as they’re not conflicting. This fundamental difference lays the groundwork for the contrasts in performance and design detailed below.
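
A toy sketch (illustrative only, tied to no particular project’s data format) makes the structural difference concrete: a chain block carries exactly one parent reference, while a DAG transaction can carry several, and the “acyclic” property is what guarantees the ledger can still be ordered. The names and the cycle check below are my own placeholders, not any real implementation.

```python
"""Toy data structures contrasting a chain block with a DAG transaction."""
from dataclasses import dataclass, field


@dataclass
class ChainBlock:
    block_id: str
    parent_id: str | None                  # exactly one predecessor -> a linear chain
    transactions: list[str] = field(default_factory=list)


@dataclass
class DagTransaction:
    tx_id: str
    parents: list[str]                     # several predecessors -> a web of approvals


def is_acyclic(txs: dict[str, DagTransaction]) -> bool:
    """Depth-first check that following parent links never loops back (no cycles)."""
    WHITE, GREY, BLACK = 0, 1, 2           # unvisited / in progress / done
    color = {tx_id: WHITE for tx_id in txs}

    def visit(tx_id: str) -> bool:
        color[tx_id] = GREY
        for parent in txs[tx_id].parents:
            if parent not in txs:          # reference to genesis or unknown data
                continue
            if color[parent] == GREY:      # back-edge means a cycle
                return False
            if color[parent] == WHITE and not visit(parent):
                return False
        color[tx_id] = BLACK
        return True

    return all(visit(t) for t in txs if color[t] == WHITE)


# A tiny DAG: tx3 attaches to (and thereby helps confirm) two earlier transactions.
ledger = {
    "tx1": DagTransaction("tx1", parents=[]),
    "tx2": DagTransaction("tx2", parents=["tx1"]),
    "tx3": DagTransaction("tx3", parents=["tx1", "tx2"]),
}
print(is_acyclic(ledger))  # True
```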

Technical Comparison: DAG vs. Blockchain Architecture

To better understand DAGs vs blockchains, we can compare their architectures and validation processes:

  • Data Structure: Blockchains store data in blocks linked in a linear sequence (each block contains many transactions and points to a single previous block, forming one long chain). DAG ledgers use a graph structure: each node in the graph represents a transaction or an event block, and it can link to multiple previous nodes. This directed graph has no cycles, meaning if you follow the links “backwards” you can never loop back to a transaction you started from. The lack of cycles allows a topological ordering of transactions (a way to sort them so that every reference comes after the referenced transaction). In short, blockchains = one-dimensional chain, DAGs = multi-dimensional graph.
  • Throughput and Concurrency: Because of the structural differences, blockchains and DAGs handle throughput differently. A blockchain, even under optimal conditions, adds blocks one by one (often waiting for each block to be validated and propagated network-wide before the next one). This inherently limits transaction throughput – for example, Bitcoin averages 5–7 transactions per second (TPS) and Ethereum ~15–30 TPS under the classic proof-of-work design. DAG-based systems, by contrast, allow many new transactions/blocks to enter the ledger concurrently. Multiple branches of transactions can grow simultaneously and later mesh together, dramatically increasing potential throughput. Some modern DAG networks claim throughput in the thousands of TPS, approaching or exceeding traditional payment networks in capacity.
  • Transaction Validation Process: In blockchain networks, transactions wait in a mempool and are validated when a miner or validator packages them into a new block, then other nodes verify that block against the history. In DAG networks, validation is often more continuous and decentralized: each new transaction carries out a validation action by referencing (approving) earlier transactions. For example, each transaction in IOTA’s Tangle must confirm two previous transactions by checking their validity and doing a small proof-of-work, thereby “voting” for those transactions. In Nano’s block-lattice DAG, each account’s transactions form their own chain and are validated via votes by representative nodes (more on this later). The net effect is that DAGs spread out the work of validation: rather than a single block producer validating a batch of transactions, every participant or many validators concurrently validate different transactions. (A toy sketch of this approval idea follows this list.)
  • Consensus Mechanism: Both blockchains and DAGs need a way for the network to agree on the state of the ledger (which transactions are confirmed and in what order). In blockchains, consensus often comes from Proof of Work or Proof of Stake producing the next block and the rule of “longest (or heaviest) chain wins”. In DAG ledgers, consensus can be more complex since there isn’t a single chain. Different DAG projects use different approaches: some use gossip protocols and virtual voting (as in Hedera Hashgraph) to come to agreement on transaction order, others use Markov Chain Monte Carlo tip selection (IOTA’s early approach) or other voting schemes to decide which branches of the graph are preferred. We will discuss specific consensus methods in DAG systems in a later section. Generally, reaching network-wide agreement in a DAG can be faster in terms of throughput, but it requires careful design to handle conflicts (like double-spend attempts) since multiple transactions can exist in parallel before final ordering.
  • Fork Handling: In a blockchain, a “fork” (two blocks mined at nearly the same time) results in one branch eventually winning (longest chain) and the other being orphaned (discarded), which wastes any work done on the orphan. In a DAG, the philosophy is to accept forks as additional branches of the graph rather than waste them. The DAG will incorporate both forks; the consensus algorithm then determines which transactions end up confirmed (or how conflicting transactions are resolved) without throwing away all of one branch. This means no mining power or effort is wasted on stale blocks, contributing to efficiency. For example, Conflux’s Tree-Graph (a PoW DAG) attempts to include all blocks in the ledger and orders them, rather than orphaning any, thereby utilizing 100% of produced blocks.
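
As referenced in the validation bullet above, here is a toy sketch of how referencing earlier transactions doubles as validation: a new transaction attached to two tips implicitly approves all of their ancestors, and counting those direct and indirect approvals gives a crude stand-in for IOTA-style cumulative weight. The structures and numbers are illustrative only, not any project’s actual algorithm.

```python
"""Toy approval propagation in a DAG ledger (illustrative, not IOTA's algorithm)."""

# Each transaction lists the earlier transactions it approves (its parents).
dag: dict[str, list[str]] = {
    "genesis": [],
    "tx1": ["genesis"],
    "tx2": ["genesis"],
    "tx3": ["tx1", "tx2"],   # a new transaction attaching to two tips
}


def ancestors(tx: str) -> set[str]:
    """Everything a transaction directly or indirectly approves."""
    seen: set[str] = set()
    stack = list(dag[tx])
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.add(parent)
            stack.extend(dag[parent])
    return seen


def approval_count(target: str) -> int:
    """How many other transactions (directly or transitively) approve `target`."""
    return sum(1 for tx in dag if tx != target and target in ancestors(tx))


print(approval_count("genesis"))  # 3: approved by tx1, tx2 and tx3
print(approval_count("tx1"))      # 1: approved (so far) only by tx3
```

The more future transactions build on top of a given one, the higher its approval count grows, which is the intuition behind treating accumulated approvals as increasing confidence of confirmation.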

In summary, blockchains offer a simpler, strictly ordered structure where validation is block-by-block, whereas DAGs provide a more complex graph structure allowing asynchronous and parallel transaction processing. DAG-based ledgers must employ additional consensus logic to manage this complexity, but they promise significantly higher throughput and efficiency by utilizing the network’s full capacity rather than forcing a single-file queue of blocks.

Benefits of DAG-Based Blockchain Systems

DAG architectures were introduced primarily to overcome the limitations of traditional blockchains in scalability, speed, and cost. Here are the key benefits of DAG-based distributed ledgers:

  • High Scalability & Throughput: DAG networks can achieve high transaction throughput because they handle many transactions in parallel. Since there is no single chain bottleneck, the TPS (transactions per second) can scale with network activity. In fact, some DAG protocols have demonstrated throughput on the order of thousands of TPS. For example, Hedera Hashgraph has the capacity to process 10,000+ transactions per second in the base layer, far outpacing Bitcoin or Ethereum. In practice, Hedera has demonstrated finalizing transactions in about 3–5 seconds, compared to the minutes or longer confirmation times on PoW blockchains. Even DAG-based smart contract platforms like Fantom have achieved near-instant finality (~1–2 seconds) for transactions under normal loads. This scalability makes DAGs attractive for applications requiring high volume, such as IoT microtransactions or real-time data streams.
  • Low Transaction Costs (Feeless or Minimal Fees): Many DAG-based ledgers boast negligible fees or even feeless transactions. By design, they often don’t rely on miners expecting block rewards or fees; for instance, in IOTA and Nano, there are no mandatory transaction fees – a crucial property for micro-payments in IoT and everyday use. Where fees exist (e.g., Hedera or Fantom), they tend to be very low and predictable, since the network can handle load without bidding wars for limited block space. Hedera transactions cost around $0.0001 (a ten-thousandth of a dollar) in fees, a tiny fraction of typical blockchain fees. Such low costs open the door to use cases like high-frequency transactions or tiny payments which would be infeasible on fee-heavy chains. Also, because DAGs include all valid transactions rather than dropping some in case of forks, there’s less “wasted” work – which indirectly helps keep costs down by utilizing resources efficiently.
  • Fast Confirmation and Low Latency: In DAG ledgers, transactions don’t need to wait for inclusion in a global block, so confirmation can be faster. Many DAG systems achieve quick finality – the point at which a transaction is considered permanently confirmed. For example, Hedera Hashgraph's consensus typically finalizes transactions within a few seconds with 100% certainty (ABFT finality). Nano's network often sees transactions confirmed in <1 second thanks to its lightweight voting process. This low latency enhances user experience, making transactions appear nearly instant, which is important for real-world payments and interactive applications.
  • Energy Efficiency: DAG-based networks often do not require the intensive proof-of-work mining that many blockchains use, making them far more energy-efficient. Even compared to proof-of-stake blockchains, some DAG networks use minimal energy per transaction. For instance, a single Hedera Hashgraph transaction consumes on the order of 0.0001 kWh (kilowatt-hour) of energy. This is several orders of magnitude less than Bitcoin (which can be hundreds of kWh per transaction) or even many PoS chains. The efficiency comes from eliminating wasteful computations (no mining race) and from not discarding any transaction attempts. If blockchain networks were to switch to DAG-based models universally, the energy savings could be monumental. The carbon footprint of DAG networks like Hedera is so low that its overall network is carbon-negative when offsets are considered. Such energy efficiency is increasingly crucial for sustainable Web3 infrastructure.
  • No Mining & Democratized Validation: In many DAG models, there is no distinct miner/validator role that ordinary users can’t perform. For example, every IOTA user who issues a transaction is also helping validate two others, essentially decentralizing the validation work to the edges of the network. This can reduce the need for powerful mining hardware or staking large amounts of capital to participate in consensus, potentially making the network more accessible. (However, some DAG networks do still use validators or coordinators – see the discussion on consensus and decentralization later.)
  • Smooth Handling of High Traffic: Blockchains often suffer from mempool backlogs and fee spikes under high load (since only one block at a time can clear transactions). DAG networks, due to their parallel nature, generally handle traffic spikes more gracefully. As more transactions flood the network, they simply create more parallel branches in the DAG, which the system can process concurrently. There is less of a hard cap on throughput (scalability is more “horizontal”). This leads to better scalability under load, with fewer delays and only modest increases in confirmation times or fees, up to the capacity of the nodes’ network and processing power. In essence, a DAG can absorb bursts of transactions without congesting as quickly, making it suitable for use cases that involve bursts of activity (e.g., IoT devices all sending data at once, or a viral DApp event).

In summary, DAG-based ledgers promise faster, cheaper, and more scalable transactions than the classical blockchain approach. They aim to support mass adoption scenarios (micropayments, IoT, high-frequency trading, etc.) that current mainstream blockchains struggle with due to throughput and cost constraints. These benefits, however, come with certain trade-offs and implementation challenges, which we will address in later sections.
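To make the parallel-attachment idea behind these benefits concrete, here is a minimal Python sketch (not any particular project’s protocol) of a toy DAG in which transactions arriving at the same instant each approve existing tips. The `TangleSketch` class, the batch sizes, and the two-parent rule are purely illustrative assumptions:

```python
import random

class TangleSketch:
    """Toy DAG ledger: every new transaction approves up to two existing tips."""

    def __init__(self):
        self.approvals = {"genesis": []}   # tx_id -> approved parent tx_ids
        self.tips = {"genesis"}            # transactions not yet approved by anyone

    def attach_batch(self, tx_ids):
        # Transactions arriving in the same instant see the same tip set,
        # so they attach concurrently instead of queuing behind one block.
        snapshot = sorted(self.tips)
        new_tips = set()
        for tx_id in tx_ids:
            parents = random.sample(snapshot, k=min(2, len(snapshot)))
            self.approvals[tx_id] = parents
            self.tips.difference_update(parents)
            new_tips.add(tx_id)
        self.tips |= new_tips

ledger = TangleSketch()
for burst in range(100):
    ledger.attach_batch([f"tx{burst}-{i}" for i in range(50)])  # 50 concurrent txs

# A linear chain always has exactly one tip; the DAG's frontier widens with load,
# which is what lets throughput scale with activity rather than a block interval.
print("open tips after the bursts:", len(ledger.tips))
```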

Consensus Mechanisms in DAG-Based Platforms

Because DAG ledgers don’t naturally produce a single chain of blocks, they require innovative consensus mechanisms to validate transactions and ensure everyone agrees on the ledger state. Different projects have developed different solutions tailored to their DAG architecture. Here we outline some notable consensus approaches used by DAG-based platforms:

  • IOTA’s Tangle – Tip Selection and Weighted Voting: IOTA’s Tangle is a DAG of transactions designed for the Internet of Things (IoT). In IOTA’s original model, there are no miners; instead, every new transaction must do a small Proof of Work and approve two previous transactions (these are the “tips” of the graph). This tip selection is often done via a Markov Chain Monte Carlo (MCMC) algorithm that probabilistically chooses which tips to approve, favoring the heaviest subtangle to prevent fragmentation (a toy sketch of this weighted walk appears after this list). Consensus in early IOTA was partly achieved by this cumulative weight of approvals – the more future transactions indirectly approve yours, the more “confirmed” it becomes. However, to secure the network in its infancy, IOTA relied on a temporary centralized Coordinator node that issued periodic milestone transactions to finalize the Tangle. This was a major criticism (centralization) and is being removed in the upgrade known as “Coordicide” (IOTA 2.0). In IOTA 2.0, a new consensus model applies a leaderless Nakamoto-style consensus on a DAG. Essentially, nodes perform on-tangle voting: when a node attaches a new block, that block implicitly votes on the validity of the transactions it references. A committee of validator nodes (chosen via a staking mechanism) issues validation blocks as votes, and a transaction is confirmed when it accumulates enough weighted approvals (a concept called approval weight). This approach combines the idea of the heaviest DAG (similar to longest chain) with explicit voting to achieve consensus without a coordinator. In short, IOTA’s consensus evolved from tip selection + Coordinator to fully decentralized voting on DAG branches by nodes, aiming for security and quick agreement on the ledger state.
  • Hedera Hashgraph – Gossip and Virtual Voting (aBFT): Hedera Hashgraph uses a DAG of events coupled with an asynchronous Byzantine Fault-Tolerant (aBFT) consensus algorithm. The core idea is “gossip about gossip”: each node rapidly gossips signed information about transactions and about its gossip history to other nodes. This creates a Hashgraph (the DAG of events) where every node eventually knows what every other node has gossiped, including the structure of who heard what and when. Using this DAG of events, Hedera implements virtual voting. Instead of sending out actual vote messages for ordering transactions, nodes simulate a voting algorithm locally by analyzing the graph of gossip connections. Leemon Baird’s Hashgraph algorithm can deterministically calculate how a theoretical round of votes on transaction order would go, by looking at the “gossip network” history recorded in the DAG. This yields a consensus timestamp and a total order of transactions that is fair and final (transactions are ordered by the median time they were received by the network). Hashgraph’s consensus is leaderless and achieves aBFT, meaning it can tolerate up to 1/3 of nodes being malicious without compromising consensus. In practice, Hedera’s network is governed by a set of 39 known organization-run nodes (the Hedera Council), so it’s permissioned but geographically distributed. The benefit is extremely fast and secure consensus: Hedera can reach finality in seconds with guaranteed consistency. The Hashgraph consensus mechanism is patented but has been open-sourced as of 2024, and it showcases how DAG + innovative consensus (gossip & virtual voting) can replace a traditional blockchain protocol.
  • Fantom’s Lachesis – Leaderless PoS aBFT: Fantom is a smart contract platform that uses a DAG-based consensus called Lachesis. Lachesis is an aBFT Proof-of-Stake protocol inspired by Hashgraph. In Fantom, each validator node assembles received transactions into an event block and adds it to its own local DAG of events. These event blocks contain transactions and references to earlier events. Validators gossip these event blocks to each other asynchronously – there’s no single sequence in which blocks must be produced or agreed upon. As event blocks propagate, the validators periodically identify certain events as milestones (or “root event blocks”) once a supermajority of nodes have seen them. Lachesis then orders these finalized events and commits them to a final Opera Chain (a traditional blockchain data structure) that acts as the ledger of confirmed blocks. In essence, the DAG of event blocks allows Fantom to achieve consensus asynchronously and very fast, then the final outcome is a linear chain for compatibility. This yields about 1–2 second finality for transactions on Fantom. Lachesis has no miners or leaders proposing blocks; all validators contribute event blocks and the protocol deterministically orders them. The consensus is secured by a Proof-of-Stake model (validators must stake FTM tokens and are weighted by stake). Lachesis is also aBFT, tolerating up to 1/3 faulty nodes. By combining DAG concurrency with a final chain output, Fantom achieves high throughput (several thousand TPS in tests) while remaining EVM-compatible for smart contracts. It’s a good example of using a DAG internally to boost performance, without exposing a DAG’s complexity to the application layer (developers still see a normal chain of transactions in the end).
  • Nano’s Open Representative Voting (ORV): Nano is a payment cryptocurrency that uses a unique DAG structure called a block-lattice. In Nano, each account has its own blockchain (account-chain) that only the account owner can update. All these individual chains form a DAG, since transactions from different accounts link asynchronously (a send in one account-chain references a receive in another, etc.). Consensus in Nano is achieved via a mechanism called Open Representative Voting (ORV). Users designate a representative node for their account (this is a weight delegation, not locking up funds), and these representatives vote on the validity of transactions. Every transaction is settled individually (there are no blocks bundling multiple txns) and is considered confirmed when a supermajority (e.g. >67%) of the voting weight (from representatives) agrees on it. Since honest account owners won’t double-spend their own funds, forks are rare and usually only caused by malicious attempts, which reps can quickly vote to reject. Finality is typically achieved in under a second for each transaction. ORV is similar to Proof-of-Stake in that voting weight is based on account balances (stake), but there is no staking reward or fee – representatives are voluntary nodes. The lack of mining and block production means Nano can operate feelessly and efficiently. However, it relies on a set of trusted representatives being online to vote, and there’s an implicit centralization in which nodes accumulate large voting weight (though users can switch reps anytime, maintaining decentralization control in the hands of users). Nano’s consensus is lightweight and optimized for speed and energy efficiency, aligning with its goal of being a fast, feeless digital cash.
  • Other Notable Approaches: Several other DAG-based consensus protocols exist. We have already covered Hedera Hashgraph and Fantom’s Lachesis; beyond those:
    • Avalanche Consensus (Avalanche/X-Chain): Avalanche uses a DAG-based consensus where validators repeatedly sample each other in a randomized process to decide which transactions or blocks to prefer. The Avalanche X-Chain (exchange chain) is a DAG of transactions (UTXOs) and achieves consensus via this network sampling method. Avalanche’s protocol is probabilistic but extremely fast and scalable – it can finalize transactions in ~1 second and reportedly handle up to 4,500 TPS per subnet. Avalanche’s approach is unique in combining DAG data structures with a metastable consensus (Snowball protocol), and it’s secured by Proof-of-Stake (anyone can be a validator with sufficient stake).
    • Conflux Tree-Graph: Conflux is a platform that extended Bitcoin’s PoW into a DAG of blocks. It uses a Tree-Graph structure where blocks reference not just one parent but all known previous blocks (no orphaning). This allows Conflux to use Proof-of-Work mining but keep all forks as part of the ledger, leading to much higher throughput than a typical chain. Conflux can thus achieve on the order of 3–6k TPS in theory, using PoW, by having miners produce blocks continually without waiting for a single chain. Its consensus then orders these blocks and resolves conflicts by a heaviest subtree rule. This is an example of a hybrid PoW DAG.
    • Hashgraph Variants and Academic Protocols: There are numerous academic DAG protocols (some implemented in newer projects): SPECTRE and PHANTOM (blockDAG protocols aimed at high throughput and fast confirmation, from DAGlabs), Aleph Zero (a DAG aBFT consensus used in Aleph Zero blockchain), Parallel Chains / Prism (research projects splitting transaction confirmation into parallel subchains and DAGs), and recent advancements like Sui’s Narwhal & Bullshark which use a DAG mempool for high throughput and a separate consensus for finality. While not all of these have large-scale deployments, they indicate a rich field of research. Many of these protocols differentiate between availability (writing lots of data fast to a DAG) and consistency (agreeing on one history), trying to get the best of both.
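As referenced in the IOTA bullet above, here is a toy sketch of a weighted random-walk (“MCMC-style”) tip selection over a hard-coded miniature DAG. The graph, the `alpha` bias parameter, and the `select_tip` helper are illustrative assumptions rather than IOTA’s actual implementation, but they show the core idea: the walk is biased toward branches with higher cumulative weight, so the heaviest subtangle keeps attracting new approvals.

```python
import math
import random
from collections import defaultdict

# Toy DAG: child -> the parents (earlier transactions) it approves.
approves = {
    "b": ["genesis"], "c": ["genesis"],
    "d": ["b", "c"], "e": ["b"],
    "f": ["d", "e"], "g": ["c"],
}
approved_by = defaultdict(list)          # parent -> children that approve it
for child, parents in approves.items():
    for p in parents:
        approved_by[p].append(child)

def cumulative_weight(tx):
    """Number of transactions that directly or indirectly approve `tx` (plus itself)."""
    seen, stack = set(), [tx]
    while stack:
        cur = stack.pop()
        if cur not in seen:
            seen.add(cur)
            stack.extend(approved_by[cur])
    return len(seen)

def select_tip(alpha=0.5, start="genesis"):
    """Biased random walk from `start` toward the tips (MCMC-style tip selection)."""
    current = start
    while approved_by[current]:                      # stop when we reach a tip
        children = approved_by[current]
        weights = [math.exp(alpha * cumulative_weight(c)) for c in children]
        current = random.choices(children, weights=weights)[0]
    return current

print("selected tip:", select_tip())
```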

Each DAG platform tailors its consensus to its needs – whether it’s feeless microtransactions, smart contract execution, or interoperability. A common theme, however, is avoiding a single serial bottleneck: DAG consensus mechanisms strive to allow lots of concurrent activity and then use clever algorithms (gossip, voting, sampling, etc.) to sort things out, rather than constraining the network to a single block producer at a time.
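To illustrate the “repeated sampling” family of DAG consensus mentioned above (Avalanche’s Snowball being the best-known example), here is a heavily simplified single-decision loop. The parameter names (`k`, `alpha`, `beta`) mirror the usual description of the protocol, but the code is a toy model under simplifying assumptions, not Avalanche’s production algorithm:

```python
import random

def snowball_vote(my_pref, peers, k=10, alpha=7, beta=15):
    """Toy Snowball loop: repeatedly sample k random peers, adopt the preference of
    any supermajority (>= alpha), and finalize after beta consecutive agreeing polls."""
    confidence = 0
    while confidence < beta:
        sample = random.sample(peers, k)
        counts = {}
        for peer_pref in sample:
            counts[peer_pref] = counts.get(peer_pref, 0) + 1
        winner, votes = max(counts.items(), key=lambda kv: kv[1])
        if votes >= alpha:
            if winner == my_pref:
                confidence += 1          # another consecutive successful poll
            else:
                my_pref, confidence = winner, 0
        else:
            confidence = 0               # inconclusive poll resets the streak
    return my_pref

# 1,000 peers, 80% initially preferring transaction "A" over conflicting "B".
peers = ["A"] * 800 + ["B"] * 200
print("decided:", snowball_vote("B", peers))
```

With an honest supermajority initially preferring one transaction, a node’s preference quickly snaps to the majority and finalizes after `beta` consecutive confirming polls, which is why this style of consensus converges in roughly a second on a well-connected network.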

Case Studies: Examples of DAG-Based Blockchain Projects

Several projects have implemented DAG-based ledgers, each with unique design choices and target use cases. Below we examine some prominent DAG-based platforms:

  • IOTA (The Tangle): IOTA is one of the first DAG-based cryptocurrencies, designed for the Internet of Things. Its ledger, called the Tangle, is a DAG of transactions where each new transaction confirms two previous ones. IOTA’s goal is to enable feeless microtransactions between IoT devices (paying tiny amounts for data or services). It launched in 2016, and to bootstrap security it used a Coordinator node (run by the IOTA Foundation) to prevent attacks on the early network. IOTA has been working on “Coordicide” to fully decentralize the network by introducing a voting consensus (as described earlier) where nodes vote on conflicting transactions using a leaderless Nakamoto consensus on the heaviest DAG. In terms of performance, IOTA can, in theory, achieve very high throughput (the protocol doesn’t set a hard TPS limit; more activity actually helps it confirm transactions faster). In practice, testnets have demonstrated hundreds of TPS, and the upcoming IOTA 2.0 is expected to scale well for IoT demand. Use cases for IOTA revolve around IoT and data integrity: e.g., sensor data streaming with integrity proofs, vehicle-to-vehicle payments, supply chain tracking, and even decentralized identity (the IOTA Identity framework allows issuing and verifying digital credentials/DIDs on the Tangle). IOTA does not natively support smart contracts on its base layer, but the project has introduced a parallel Smart Contracts framework and tokens on a secondary layer to enable more complex DApp functionality. A notable feature of IOTA is its zero fees, which is enabled by requiring a small PoW by the sender instead of charging a fee – this makes it attractive for high-volume, low-value transactions (e.g., a sensor sending data every few seconds for a negligible cost).
  • Hedera Hashgraph (HBAR): Hedera is a public distributed ledger that uses the Hashgraph consensus algorithm (invented by Dr. Leemon Baird). Hedera started in 2018 and is governed by a council of large organizations (Google, IBM, Boeing, and others) who run the initial set of nodes. Unlike most others, Hedera is permissioned in governance (only approved council members run consensus nodes currently, up to 39 nodes) though anyone can use the network. Its Hashgraph DAG enables very high throughput and fast finality – Hedera can process over 10,000 TPS with finality in 3-5 seconds under optimal conditions. It achieves this with the aBFT gossip-based consensus described earlier. Hedera emphasizes enterprise and Web3 use cases that need reliability at scale: its network offers services for tokenization (Hedera Token Service), a Consensus Service for tamper-proof event logging, and a Smart Contract service (which is EVM-compatible). Notable applications on Hedera include supply chain provenance (e.g., Avery Dennison’s apparel tracking), high-volume NFT minting (low fees make minting NFTs inexpensive), payments and micropayments (like ad tech micropayments), and even decentralized identity solutions. Hedera has a DID method registered with W3C and frameworks like Hedera Guardian to support verifiable credentials and regulatory compliance (for example, tracking carbon credits). A key feature is Hedera’s strong performance combined with claimed stability (the Hashgraph algorithm guarantees no forks and mathematically proven fairness in ordering). The trade-off is that Hedera is less decentralized in node count than open networks (by design, with its governance model), though the council nodes are located globally and the plan is to eventually increase openness. In summary, Hedera Hashgraph is a prime example of a DAG-based DLT targeting enterprise-grade applications, with an emphasis on high throughput, security, and governance.
  • Fantom (FTM): Fantom is a smart contract platform (Layer-1 blockchain) that employs a DAG-based consensus called Lachesis. Launched in 2019, Fantom gained popularity especially in the DeFi boom of 2021-2022 as an Ethereum-compatible chain with much higher performance. Fantom’s Opera network runs the Lachesis aBFT consensus (detailed above), where validators keep a local DAG of event blocks and achieve consensus asynchronously, then finalize transactions in a main chain. This gives Fantom a typical time-to-finality of ~1 second for transactions and the ability to handle thousands of transactions per second in throughput. Fantom is EVM-compatible, meaning developers can deploy Solidity smart contracts and use the same tooling as Ethereum, which greatly helped its adoption in DeFi. Indeed, Fantom became home to numerous DeFi projects (DEXes, lending protocols, yield farms) attracted by its speed and low fees. It also hosts NFT projects and gaming DApps – essentially any Web3 application that benefits from fast, cheap transactions. A noteworthy point is that Fantom achieved a high level of decentralization for a DAG platform: it has dozens of independent validators securing the network (permissionless, anyone can run a validator with the minimum stake), unlike some DAG networks that restrict validators. This positions Fantom as a credible alternative to more traditional blockchains for decentralized applications, leveraging DAG tech under the hood to break the performance bottleneck. The network’s FTM token is used for staking, governance and fees (which are only a few cents per transaction, much lower than Ethereum gas fees). Fantom demonstrated that DAG-based consensus can be integrated with smart contract platforms to achieve both speed and compatibility.
  • Nano (XNO): Nano is a lightweight cryptocurrency launched in 2015 (originally as RaiBlocks) that uses a DAG block-lattice structure. Nano’s primary focus is peer-to-peer digital cash: instant, feeless transactions with minimal resource usage. In Nano, each account has its own chain of transactions, and transfers between accounts are handled via a send block on the sender’s chain and a receive block on the recipient’s chain. This asynchronous design means the network can process transactions independently and in parallel. Consensus is achieved by Open Representative Voting (ORV), where the community appoints representative nodes by delegation of balance weight. Representatives vote on conflicting transactions (which are rare, usually only in double-spend attempts), and once a quorum (67% weight) agrees, the transaction is cemented (irreversibly confirmed). Nano’s typical confirmation times are well below a second, making it feel instantaneous in everyday use. Because there are no mining rewards or fees, running a Nano node or representative is a voluntary effort, but the network’s design minimizes load (each transaction is only 200 bytes and can be processed quickly). Nano’s DAG approach and consensus allow it to be extremely energy-efficient – there is a tiny PoW performed by senders (mainly as an anti-spam measure), but it’s trivial compared to PoW blockchains. The use cases for Nano are simple by design: it’s meant for currency transfers, from everyday purchases to remittances, where speed and zero fees are the selling points. Nano does not support smart contracts or complex scripting; it focuses on doing one thing very well. A challenge for Nano’s model is that it relies on the honest majority of representatives; since there are no monetary incentives, the security model is based on the assumption that large token holders will act in the network’s best interest. So far, Nano has maintained a fairly decentralized set of principal representatives and has seen use in merchant payments, tipping, and other micropayment scenarios online.
  • Hedera vs IOTA vs Fantom vs Nano (At a Glance): The table below summarizes some key characteristics of these DAG-based projects:
| Project (Year) | Data Structure & Consensus | Performance (Throughput & Finality) | Notable Features / Use Cases |
|---|---|---|---|
| IOTA (2016) | DAG of transactions (“Tangle”); each tx approves 2 others. Originally coordinator-secured; moving to decentralized leaderless consensus (vote on heaviest DAG, no miners). | Theoretically high TPS (scales with activity); ~10 s confirmation in an active network (faster as load increases). Ongoing research to improve finality. Feeless transactions. | IoT micropayments and data integrity (feeless microtransactions), supply chain, sensor data, automotive, decentralized identity (IOTA Identity DID method). No base-layer smart contracts (separate layers for that). |
| Hedera Hashgraph (2018) | DAG of events (Hashgraph); gossip-about-gossip + virtual voting consensus (aBFT), run by ~29–39 council nodes (PoS weighted). No miners; consensus timestamps for ordering. | ~10,000 TPS max; finality in 3–5 seconds. Extremely low energy per tx (~0.0001 kWh). Very low fixed fees (~$0.0001 per transfer). | Enterprise and Web3 applications: tokenization (HTS), NFTs and content services, payments, supply chain tracking, healthcare data, gaming, etc. Council governance by large corporations; EVM-compatible smart contracts (Solidity). Focus on high throughput and security for businesses. |
| Fantom (FTM) (2019) | DAG of validator event blocks; Lachesis aBFT PoS consensus (leaderless). Each validator builds a DAG of events, which are confirmed and stitched into a final blockchain (Opera chain). | Empirically a few hundred TPS in DeFi usage; 1–2 second finality typical. Capable of thousands of TPS in benchmarks. Low fees (fractions of a cent). | DeFi and smart contracts on a high-speed L1. EVM-compatible (runs Solidity DApps). Supports DEXes, lending, NFT marketplaces (fast trading, cheap minting). DAG consensus hidden behind a developer-friendly blockchain interface. Permissionless staking (decentralized validator set). |
| Nano (XNO) (2015) | DAG of account-chains (block-lattice); each tx is its own block. Open Representative Voting for consensus (dPoS-like voting on conflicts). No mining, no fees. | Hundreds of TPS feasible (limited mainly by network I/O). <1 s confirmation for typical transactions. Entirely feeless. Extremely low resource usage (suitable for IoT/mobile). | Digital currency for instant payments. Ideal for micropayments, tipping, and retail transactions where fees and latency must be minimal. Not designed for smart contracts – focuses on simple transfers. Very low power consumption (green cryptocurrency). Community-run representatives (no central authority). |

(Table: Comparison of selected DAG-based ledger projects and their characteristics. TPS = transactions per second.)

Other DAG-based projects not detailed above include Obyte (Byteball) – a DAG ledger for conditional payments and data storage, IoT Chain (ITC) – an IoT-focused DAG project, Avalanche – which we discussed as using DAG in its consensus and has become a major DeFi platform, Conflux – a high-throughput PoW DAG in China, and academic prototypes like SPECTRE/PHANTOM. Each explores the design space of DAG ledgers in different ways, but the four examples above (IOTA, Hedera, Fantom, Nano) illustrate the diversity: from feeless IoT transactions to enterprise networks and DeFi smart contract chains, all leveraging DAG structures.

Use Cases of DAG Technology in the Web3 Ecosystem

DAG-based blockchain systems unlock certain use cases particularly well, thanks to their high performance and unique properties. Here are some current and potential use cases where DAGs are making an impact in Web3:

  • Internet of Things (IoT): IoT involves millions of devices transmitting data and potentially transacting with each other (machine-to-machine payments). DAG ledgers like IOTA were explicitly designed for this scenario. With feeless microtransactions and the ability to handle high frequencies of small payments, a DAG ledger can enable IoT devices to pay for services or bandwidth on the fly. For example, a smart electric car might automatically pay a charging station a few cents worth of electricity, or sensors could sell data to a platform in real time. IOTA’s Tangle has been used in smart city pilots, supply chain IoT integrations (tracking goods and environmental conditions), and decentralized data marketplaces where sensor data is immutably logged and traded. The scalability of DAGs addresses the huge volume that widespread IoT networks generate, and their low cost suits micropayment economics.
  • Decentralized Finance (DeFi): DeFi applications like decentralized exchanges (DEXs), lending platforms, and payment networks benefit from high throughput and low latency. DAG-based smart contract platforms (e.g. Fantom, and to an extent Avalanche’s X-Chain for simple asset transfers) offer an advantage in that trades can settle faster and fees remain low even during high demand. In 2021, Fantom saw a surge of DeFi activity (yield farming, automated market makers, etc.) and was able to handle it with much lower congestion than Ethereum at the time. Additionally, DAG networks’ quick finality reduces the risk of trade execution uncertainty (on slow chains, users wait many blocks for finality which can introduce risk in fast-paced trading). Another angle is decentralized payment networks – Nano, for example, can be viewed as part of the DeFi spectrum, enabling peer-to-peer transfers and potentially being a micropayment rail for layer-2 of other systems. DAG’s performance could also support high-frequency trading or complex multi-step DeFi transactions executing more smoothly.
  • Non-Fungible Tokens (NFTs) and Gaming: The NFT boom has highlighted the need for low-cost minting and transfers. On Ethereum, minting NFTs became costly when gas fees spiked. DAG networks like Hedera and Fantom have been pitched as alternatives where minting an NFT costs a tiny fraction of a cent, making them viable for in-game assets, collectibles, or large-scale drops. Hedera’s Token Service allows native token and NFT issuance with the network’s low, predictable fees, and has been used by content platforms and even enterprises (e.g., music artists issuing tokens or universities tracking degrees). In gaming, where micro-transactions are common, a fast DAG ledger could handle frequent asset trades or reward distributions without slowing the game or bankrupting players in fees. The high throughput ensures that even if a popular game or NFT collection draws in millions of users, the network can handle the load (whereas we’ve seen games on Ethereum clog the network in the past). For instance, an NFT-based game on Fantom can update state quickly enough to provide near-real-time responsiveness.
  • Decentralized Identity (DID) and Credentials: Identity systems benefit from an immutable ledger to anchor identities, credentials, and attestations. DAG networks are being explored for this because they offer scalability for potentially billions of identity transactions (every login, certificate issuance, etc.) and low cost, which is crucial if, say, each citizen’s ID interactions were recorded. IOTA Identity is one example: it provides a DID method did:iota where identity documents are referenced on the Tangle. This can be used for self-sovereign identity: users control their identity documents, and verifiers can retrieve proofs from the DAG. Hedera is also active in the DID space – it has a DID specification and has been used in projects like securing tamper-proof logs of college degrees, COVID vaccination certificates, or supply chain compliance documents (via the Hedera Consensus Service as an ID anchoring service). The advantages of DAGs here are that writing data is cheap and fast, so updating an identity state (like rotating keys, adding a credential) doesn’t face the cost or delay hurdles of a busy blockchain. Additionally, the finality and ordering guarantees can be important for audits (Hashgraph, for example, provides a trusted timestamp order of events which is useful in compliance logging).
  • Supply Chain and Data Integrity: Beyond identity, any use case that involves logging a high volume of data entries can leverage DAG DLTs. Supply chain tracking is a notable one – products moving through a supply chain generate many events (manufactured, shipped, inspected, etc.). Projects have used Hedera and IOTA to log these events on a DAG ledger for immutability and transparency. The high throughput ensures the ledger won’t become a bottleneck even if every item in a large supply network is being scanned and recorded. Moreover, the low or zero fees mean you can record even low-value events on-chain without incurring major costs. Another example is IoT data integrity: energy grids or telecommunications might log device readings on a DAG ledger to later prove that data wasn’t tampered with (a minimal hash-anchoring sketch of this pattern follows this list). Constellation Network’s DAG (another DAG project) focuses on big data validation for enterprises and government (like US Air Force drone data integrity) – highlighting how a scalable DAG can handle big data streams in a trusted way.
  • Payments and Remittances: Fast and feeless transactions make DAG cryptocurrencies like Nano and IOTA well-suited for payment use cases. Nano has seen adoption in scenarios like online tipping (where a user can send a few cents to a content creator instantly) and international remittances (where speed and zero fees make a big difference compared to waiting hours and paying percent-level fees). DAG networks can serve as high-speed payment rails for integrating with point-of-sale systems or mobile payment apps. For instance, a coffee shop could use a DAG-based crypto for payments and not worry about latency or cost (the user experience can rival contactless credit card payments). Hedera’s HBAR is also used in some payment trials (due to its fast finality and low fee, some fintech applications consider it for settlement). Additionally, because DAG networks often have higher capacity, they can maintain performance even during global shopping events or spikes in usage, which is valuable for payment reliability.
  • Real-time Datafeeds and Oracles: Oracles (services that feed external data to blockchain smart contracts) require writing many data points to a ledger. A DAG ledger could act as a high-throughput oracle network, recording price feeds, weather data, IoT sensor readings, etc., with a guarantee of ordering and timestamp. The Hedera Consensus Service, for example, is used by some oracle providers to timestamp data before feeding it into other chains. The speed ensures that data is fresh, and the throughput means even rapid data streams can be handled. In decentralized Web3 analytics or advertising, where every click or impression might be logged for transparency, a DAG backend can cope with the event volume.
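The data-integrity pattern referenced in the supply-chain bullet above usually boils down to anchoring a hash of the raw data on the ledger and re-deriving it at audit time. Below is a minimal, ledger-agnostic sketch; the record fields and the idea of submitting the digest in a feeless DAG transaction are illustrative assumptions:

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Canonical hash of a data record; only this digest needs to go on-ledger."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# A device keeps its raw readings off-chain and anchors only the digests.
reading = {"sensor": "grid-42", "kwh": 3.17, "ts": 1_700_000_000}
anchored_digest = fingerprint(reading)        # e.g. submitted in a feeless DAG tx

# Later, an auditor recomputes the digest from the raw data they were handed
# and checks it against the digest recorded on the ledger.
assert fingerprint(reading) == anchored_digest
tampered = dict(reading, kwh=0.17)
assert fingerprint(tampered) != anchored_digest
print("integrity check passed")
```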

In all these use cases, the common thread is that DAG networks aim to provide the scalability, speed, and cost-efficiency that broaden the scope of what we can decentralize. They are particularly useful where high frequency or high volume transactions occur (IoT, microtransactions, machine data) or where user experience demands fast, seamless interactions (gaming, payments). That said, not every use case will migrate to DAG-based ledgers – sometimes the maturity and security of traditional blockchains, or simply network effects (e.g. Ethereum’s huge developer base), outweigh raw performance needs. Nonetheless, DAGs are carving out a niche in the Web3 stack for scenarios that strain conventional chains.

Limitations and Challenges of DAG-Based Networks

While DAG-based distributed ledgers offer enticing advantages, they also come with trade-offs and challenges. It’s important to critically examine these limitations:

  • Maturity and Security: The majority of DAG consensus algorithms are relatively new and less battle-tested compared to Bitcoin or Ethereum’s well-studied blockchain protocols. This can mean unknown security vulnerabilities or attack vectors might exist. The complexity of DAG systems potentially opens new avenues for attacks – for example, an attacker might try to spam or bloat the DAG with conflicting subtangles, or take advantage of the parallel structure to double-spend before the network reaches consensus. Academic analyses note that increased complexity introduces a broader range of vulnerabilities compared to simpler linear chains. Some DAG networks have suffered issues: e.g., early on, IOTA’s network experienced a few instances where it had to be paused due to irregularities/hacks (one incident in 2020 involved stolen funds and the Coordinator was shut off temporarily to resolve it). These incidents underline that the security models are still being refined. Moreover, finality in some DAGs is probabilistic – e.g., pre-Coordicide IOTA had no absolute finality, only increasing confirmation confidence – which can be tricky for certain applications (though newer DAGs like Hashgraph and Fantom provide instant finality with aBFT guarantees).
  • Consensus Complexity: Achieving consensus in a DAG often involves complicated algorithms (gossip protocols, virtual voting, random sampling, etc.). This complexity can translate to larger codebases and more complicated implementations, increasing the risk of software bugs. It also makes the system harder for developers to understand. A blockchain’s longest-chain rule is conceptually simple, whereas, say, Hashgraph’s virtual voting or Avalanche’s repeated random sampling are not immediately intuitive. The complexity can slow down adoption: developers and enterprises may be hesitant to trust a system they find harder to comprehend or audit. As one study pointed out, partial-order based systems (DAGs) require more effort to integrate with existing infrastructure and developer mindsets. Tools and libraries for DAG networks are also less mature in many cases, meaning the developer experience might be rougher than on Ethereum or Bitcoin.
  • Decentralization Trade-offs: Some current DAG implementations sacrifice some degree of decentralization to achieve their performance. For instance, Hedera’s reliance on a fixed council of 39 nodes means the network is not open to anyone to participate in consensus, which has drawn criticism despite its technical strengths. IOTA, for a long time, relied on a central Coordinator to prevent attacks, which was a single point of failure/control. Nano’s consensus relies on a small number of principal representatives holding most voting weight (as of 2023, the top few reps often control a large portion of online voting weight), which could be seen as a concentration of power – though this is somewhat analogous to mining pools in PoW. In general, blockchains are currently perceived as easier to decentralize widely (thousands of nodes) than some DAG networks. The reasons are varied: some DAG algorithms might have higher node bandwidth requirements (making it harder for many nodes to participate fully), or the project’s design might intentionally keep a permissioned structure initially. This isn’t an inherent limitation of DAGs per se, but rather of specific implementations. It’s possible to have a highly decentralized DAG network, but in practice many haven’t reached the node counts of major blockchains yet.
  • Need for Volume (Security vs Throughput): Some DAG networks paradoxically require high transaction volume to function optimally. For example, IOTA’s security model becomes robust when lots of honest transactions are constantly confirming each other (raising the cumulative weight of honest subtangles). If the network activity is very low, the DAG can suffer from laziness – tips not getting approved quickly, or an attacker finding it easier to try and override parts of the DAG. In contrast, a traditional blockchain like Bitcoin doesn’t require a minimum number of transactions to remain secure (even if few transactions occur, miners are still competing to extend the chain). Thus, DAGs often thrive under load but might stagnate under sparse usage, unless special measures are taken (like IOTA’s coordinator or background “maintenance” transactions). This means performance can be inconsistent – great when usage is high, but possibly slower confirmation in off-peak times or low-use scenarios.
  • Ordering and Compatibility: Because DAGs produce a partial order of events that eventually needs to be consistent, the consensus algorithms can be quite intricate. In smart contract contexts, total ordering of transactions is required to avoid double-spending and to maintain deterministic execution. DAG systems like Fantom solve this by building an ordering layer (the final Opera Chain), but not all DAG systems support complex smart contracts easily. The state management and programming model can be challenging on a pure DAG. For example, if two transactions are non-conflicting, they can confirm in parallel on a DAG – that’s fine. But if they do conflict (say, two txns spending the same output or two trades on the same order), the network must decide one and drop the other. Ensuring that all nodes make the same decision in a decentralized way is harder without a single chain ordering everything. This is why many DAG projects initially avoided smart contracts or global state and focused on payments (where conflicts are simpler to detect via UTXOs or account balances). Interfacing DAG ledgers with existing blockchain ecosystems can also be non-trivial; for example, connecting an EVM to a DAG required Fantom to create a mechanism to linearize the DAG for the EVM execution. These complexities mean that not every use case can be immediately implemented on a DAG without careful design.
  • Storage and Sync: A potential issue is that if a DAG ledger allows a high volume of parallel transactions, the ledger can grow quickly. Efficient algorithms for pruning the DAG (removing old transactions that are no longer needed for security) are important, as well as for letting light nodes operate (light clients need ways to confirm transactions without storing the entire DAG). Research has identified the reachability challenge: ensuring new transactions can reach and reference earlier ones efficiently, and figuring out how to truncate history safely in a DAG. While blockchains also face growth issues, the DAG’s structure might complicate things like calculating balances or proofs for partial state, since the ledger isn’t a simple list of blocks. This is largely a technical challenge that can be addressed, but it adds to the overhead of designing a robust DAG system.
  • Perception and Network Effects: Outside of pure technical issues, DAG projects face the challenge of proving themselves in a blockchain-dominated space. Many developers and users are simply more comfortable with blockchain L1s, and network effects (more users, more dApps, more tooling on existing chains) can be hard to overcome. DAGs are sometimes marketed with bold claims (“blockchain killer”, etc.), which can invite skepticism. For example, a project might claim unlimited scalability – but users will wait to see it demonstrated under real conditions. Until DAG networks host “killer apps” or large user bases, they may be seen as experimental. Additionally, getting listed on exchanges, custody solutions, wallets – the whole infrastructure that already supports major blockchains – is an ongoing effort for each new DAG platform. So there’s a bootstrapping challenge: despite technical merits, adoption can lag due to ecosystem inertia.

In summary, DAG-based ledgers trade simplicity for performance, and that comes with growing pains. The complexity of consensus, potential centralization in some implementations, and the need to gain trust equivalent to older blockchain systems are hurdles to overcome. The research community is actively studying these issues – for instance, a 2024 systematization-of-knowledge paper on DAG protocols notes the increasing variety of designs and the need for holistic understanding of their trade-offs. As DAG projects mature, we can expect many of these challenges (like removal of coordinators, open participation, better dev tools) to be addressed, but they are important to consider when evaluating DAG vs blockchain for a given application.

Adoption Trends and Future Outlook

The adoption of DAG-based blockchain technology is still in its early stages relative to the widespread use of traditional blockchains. As of 2025, only a handful of public distributed ledgers use DAGs at scale – notable ones being Hedera Hashgraph, IOTA, Fantom, Nano, Avalanche (for part of its system), and a few others. Blockchains (linear chains) remain the dominant architecture in deployed systems. However, interest in DAGs has been steadily increasing in both industry and academia. We can identify a few trends and the outlook for DAG in blockchain:

  • Growing Number of DAG Projects and Research: There is a visible uptick in the number of new projects exploring DAG or hybrid architectures. For example, recent platforms like Aleph Zero (a privacy-focused network) use a DAG consensus for fast ordering, and Sui and Aptos (Move-language chains) incorporate DAG-based mempool or parallel execution engines to scale performance. Academic research into DAG-based consensus is flourishing – protocols like SPECTRE, PHANTOM, GhostDAG, and newer ones are pushing the boundaries, and comprehensive analyses (SoK papers) are being published to classify and evaluate DAG approaches. This indicates a healthy exploration and the emergence of best practices. As research identifies solutions to earlier weaknesses (for instance, how to achieve fairness, how to prune DAGs, how to secure DAGs under dynamic conditions), we’ll likely see these innovations trickle into implementations.
  • Hybrid Models in Mainstream Use: An interesting trend is that even traditional blockchains are adopting DAG concepts to improve performance. Avalanche is a prime example of a hybrid: it presents itself as a blockchain platform, but at its core uses a DAG consensus. It has gained significant adoption in DeFi and NFT circles, showing that users sometimes adopt a DAG-based system without even realizing it, as long as it meets their needs (fast and cheap). This trend may continue: DAG as an internal engine while exposing a familiar blockchain interface could be a winning strategy, easing developers in. Fantom did this with its Opera chain, and other projects might follow suit, effectively making DAG tech an unseen backbone for next-gen chains.
  • Enterprise and Niche Adoption: Enterprises that require high throughput, predictable costs, and are comfortable with more permissioned networks have been inclined to explore DAG ledgers. Hedera’s Governing Council model attracted big companies; they in turn drive use cases like asset tokenization for financial services, or tracking software licenses, etc., on Hedera. We’re seeing consortia consider DAG-based DLT for things like telecommunications settlements, advertising impression tracking, or interbank transfers, where the volume is high and they need finality. IOTA has been involved in European Union funded projects for infrastructure, digital identity pilots, and industrial IoT – these are more long-term adoption paths, but they show that DAGs are on the radar beyond just the crypto community. If some of these trials prove successful and scalable, we could see sector-specific adoption of DAG networks (e.g., an IoT consortium all using a DAG ledger to share and monetize data).
  • Community and Decentralization Progress: Early criticisms of DAG networks (central coordinators, permissioned validators) are gradually being addressed. IOTA’s Coordicide will, if successful, remove the central coordinator and transition IOTA to a fully decentralized network with a form of staking and community-run validators. Hedera has open-sourced its code and hinted at plans to further decentralize governance in the long run (beyond the initial council). Nano’s community continuously works on decentralizing representative distribution (encouraging more users to run reps or split their delegations). These moves are important for the credibility and trust in DAG networks, aligning them more with the ethos of blockchain. As decentralization increases, it’s likely that more crypto-native users and developers will be willing to build on or contribute to DAG projects, which can accelerate growth.
  • Interoperability and Layer-2 Use: We might also see DAGs being used as scaling layers or interoperable networks rather than standalone ecosystems. For example, a DAG ledger could serve as a high-speed layer-2 for Ethereum, periodically anchoring batched results to Ethereum for security. Alternatively, DAG networks could be linked via bridges to existing blockchains, allowing assets to flow where it’s cheapest to transact. If the UX can be made seamless, users might transact on a DAG network (enjoying high speed) while still relying on a base blockchain for settlement or security – getting the best of both worlds. Some projects consider this kind of layered approach.
  • Future Outlook – Complement, not Replacement (for now): It’s telling that even proponents often say DAG is an “alternative” or complement to blockchain rather than an outright replacement. In the near future, we can expect heterogeneous networks: some will be blockchain-based, some DAG-based, each optimized for different scenarios. DAGs might power the high-frequency backbone of Web3 (handling the grunt work of microtransactions and data logging), while blockchains might remain preferred for settlement, extremely high-value transactions, or where simplicity and robustness are paramount. Over a longer horizon, if DAG-based systems continue to prove themselves and if they can demonstrate equal or greater security and decentralization, it’s conceivable they could become the dominant paradigm for distributed ledgers. The energy efficiency angle also aligns DAGs well with global sustainability pressures, potentially making them more politically and socially acceptable in the long run. The carbon footprint benefits of DAG networks, combined with their performance advantages, could be a major driver if regulatory environments emphasize green technology.
  • Community Sentiment: There is a segment of the crypto community that is very excited about DAGs – seeing them as the next evolutionary step of DLT. You’ll often hear phrases like “DAGs are the future; blockchains will eventually be seen as the dial-up internet compared to DAG’s broadband.” This enthusiasm has to be balanced with practical results, but it suggests that talent and investments are flowing into this area. On the other hand, skeptics remain, pointing out that decentralization and security shouldn’t be compromised for speed – so DAG projects will have to demonstrate that they can have the best of both worlds.

In conclusion, the future outlook for DAG in blockchain is cautiously optimistic. Right now, blockchains still dominate, but DAG-based platforms are carving out their space and proving their capabilities in specific domains. As research resolves current challenges, we’ll likely see more convergence of ideas – with blockchains adopting DAG-inspired improvements and DAG networks adopting the lessons of blockchains on governance and security. Web3 researchers and developers would do well to keep an eye on DAG advancements, as they represent a significant branch of the DLT evolution tree. The coming years may see a diverse ecosystem of interoperable ledgers where DAGs play a vital role in scaling and special-purpose applications, moving us closer to the vision of a scalable, decentralized web.

In the words of one Hedera publication: DAG-based ledgers are “a promising step forward” in the evolution of digital currencies and decentralized tech – not a silver bullet to replace blockchains outright, but an important innovation that will work alongside and inspire improvements in the distributed ledger landscape as a whole.

Sources: The information in this report is drawn from a variety of credible sources, including academic research on DAG-based consensus, official documentation and whitepapers from projects like IOTA, Hedera Hashgraph, Fantom, and Nano, as well as technical blogs and articles that provide insights into DAG vs blockchain differences. These references support the comparative analysis, benefits, and case studies discussed above. The continued dialogue in the Web3 research community suggests that DAGs will remain a hot topic as we seek to solve the trilemma of scalability, security, and decentralization in the next generation of blockchain technology.

Somnia Layer-1 Blockchain Deep Dive: 1M TPS and sub-second finality

· 65 min read
Dora Noda
Software Engineer

Somnia is an EVM-compatible Layer-1 blockchain built for extreme performance, capable of over 1,000,000 transactions per second (TPS) with sub-second finality. To achieve this, Somnia reimagines core blockchain design with four key technical innovations:

  • MultiStream Consensus: Somnia’s consensus is a novel proof-of-stake BFT protocol where each validator maintains its own “data chain” of transactions, producing blocks independently. A separate consensus chain periodically confirms the latest block of every validator’s data chain and orders them into one global blockchain. This allows parallel transaction ingestion: multiple validators can propagate transactions concurrently on their data streams, which are later merged into a single ordered log. The consensus chain (inspired by the Autobahn BFT research) ensures security by preventing any validator from forking or altering its own stream once the global block is finalized. In this architecture, validator-specific chains feed into a global consensus block.

  • Accelerated Sequential Execution: Instead of relying on multi-threaded execution, Somnia opts to make a single core extremely fast. The Somnia client compiles EVM smart contracts to native x86 machine code (just-in-time or ahead-of-time). Frequently-used contracts are translated into optimized machine instructions, eliminating the typical interpretation overhead and achieving near-native C++ speed for execution. In benchmarks this yields hundreds of nanoseconds per ERC-20 transfer, supporting millions of TX/sec on one core. Less-called contracts can still run in the standard EVM interpreter, balancing compilation cost. Additionally, Somnia leverages modern CPU out-of-order execution and pipelining (“hardware-level parallelism”) to speed up individual transactions. By compiling to native code, the CPU can execute instructions in parallel at the chip level (e.g. overlapping memory fetches and computations), further accelerating sequential logic like token transfers. This design choice recognizes that software parallelism often fails under highly correlated workload spikes (e.g. a hot NFT mint where all transactions hit the same contract). Somnia’s single-thread optimizations ensure even “hot” contract scenarios achieve high throughput where naive parallel execution would stall.
  • IceDB (Deterministic Storage Engine): Somnia includes a custom blockchain database called IceDB to maximize state access performance and predictability. Unlike typical LevelDB/RocksDB backends, IceDB provides deterministic read/write costs: every operation returns a “performance report” of exactly how many RAM cache lines and disk pages were accessed. This allows Somnia to charge gas fees based on actual resource usage in a consistent, consensus-deterministic way. For example, reads served from memory can cost less gas than cold reads hitting disk, without nondeterminism. IceDB also uses an improved caching layer optimized for both read and write, yielding extremely low latency (15–100 nanoseconds per operation on average). Additionally, IceDB features built-in state snapshotting: it exploits the internal structure of the log-structured storage to maintain and update global state hashes efficiently, instead of building a separate Merkle tree at the application level. This reduces overhead for computing state roots and proofs. Overall, IceDB’s design ensures predictable, high-speed state access and gas metering fairness, which are critical at Somnia’s scale.
  • Advanced Compression & Networking: Pushing millions of TPS means nodes must exchange huge volumes of transaction data (e.g. 1M ERC-20 transfers/sec ~ 1.5 Gbps of raw data). Somnia addresses this via compression and networking optimizations:
    • Streaming Compression: Because each validator publishes a continuous data stream, Somnia can use stateful stream compression across blocks. Common patterns (like repetitive addresses, contract calls, parameters) are compressed by referencing prior occurrences in the stream, achieving far better ratios than independent block compression. This leverages the power-law distribution of blockchain activity – a small subset of addresses or calls accounts for a large fraction of transactions, so encoding them with short symbols yields massive compression (e.g. an address used in 10% of TX can be coded in ~3 bits instead of 20 bytes). Traditional chains can’t easily use stream compression because block producers rotate; Somnia’s fixed per-validator streams unlock this capability. (A small, generic illustration of streaming versus per-block compression appears after this list.)
    • BLS Signature Aggregation: To eliminate the biggest incompressible parts of transactions (signatures and hashes), Somnia uses BLS signatures for transactions and supports aggregating many signatures into one. This means a block of hundreds of transactions can carry a single combined signature, drastically cutting data size (and verification cost) compared to having 64 bytes of ECDSA signature per transaction. Transaction hashes are likewise not transmitted (peers recompute them as needed). Together, compression and BLS aggregation reduce bandwidth requirements enough to sustain Somnia’s high throughput without “choking” the network.
    • Bandwidth Symmetry: In Somnia’s multi-leader design, every validator continuously shares its fraction of new data each block, rather than one leader blasting the entire block to others. Consequently, network load is symmetrically distributed – each of N validators uploads roughly 1/N of total data to N-1 peers (and downloads the other portions) every block, instead of a single leader uploading N-1 copies. No node ever needs outbound bandwidth higher than the overall chain throughput, avoiding the bottleneck where a single leader must have an enormous upload pipe. This even utilization allows Somnia to approach the physical bandwidth limits of nodes without centralizing on a few supernodes. In short, Somnia’s networking stack is designed so that all validators share the work of propagating transactions, enabling near gigabit-level throughput across the decentralized network.
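The streaming-compression point above can be illustrated with a general-purpose compressor: keeping one stateful compressor per validator stream lets later blocks reference patterns from earlier ones. The sketch below uses Python’s zlib purely as a stand-in – Somnia’s actual codec and framing are not specified here – and the synthetic “hot address” workload is an assumption meant to mimic the power-law traffic described above:

```python
import json
import random
import zlib

# Synthetic "blocks" of transfers in which a few hot addresses dominate.
hot = [f"0xhot{i}" for i in range(8)]
def make_block(n=500):
    return json.dumps([
        {"from": random.choice(hot), "to": random.choice(hot),
         "amount": random.randint(1, 10**6)}
        for _ in range(n)
    ]).encode()

blocks = [make_block() for _ in range(20)]

# Independent compression: every block starts with an empty dictionary.
independent = sum(len(zlib.compress(b)) for b in blocks)

# Streaming compression: one compressor per validator stream keeps its history,
# so repeated addresses/calls in later blocks become short back-references.
stream = zlib.compressobj()
streamed = 0
for b in blocks:
    streamed += len(stream.compress(b)) + len(stream.flush(zlib.Z_SYNC_FLUSH))

print(f"independent: {independent} bytes, streaming: {streamed} bytes")
```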

Consensus and Security: The consensus chain uses a modified PBFT (Practical Byzantine Fault Tolerance) proof-of-stake protocol with a partially synchronous assumption. Somnia launched with 60–100 validators globally distributed (the mainnet started with ~60 and targets 100). Validators are required to run powerful hardware (spec roughly between a Solana and Aptos node in performance) to handle the load. This validator count balances performance with sufficient decentralization – the team’s philosophy is “sufficient decentralization” (enough to ensure security and censorship-resistance, but not so extreme that it cripples performance). Notably, Google Cloud participated as a validator at launch, alongside other professional node operators.

Somnia implements standard PoS security measures like staking deposits and slashing for malicious behavior. To bolster safety in its novel execution engine, Somnia uses a unique “Cuthbert” system – an alternative reference implementation (unoptimized) that runs in parallel with the main client on each node. Every transaction is executed on both engines; if a divergence or bug is detected in the optimized client’s results, the validator will halt and refuse to finalize, preventing consensus errors. This dual execution acts as a real-time audit, ensuring the aggressive performance optimizations never produce incorrect state transitions. Over time, as confidence in the primary client grows, Cuthbert can be phased out, but during early stages it adds an extra layer of security.
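A minimal sketch of the dual-execution (“Cuthbert”) idea: run every transaction through both an optimized engine and an unoptimized reference engine, and halt on any divergence. `ToyEngine` and the balance-transfer transactions are stand-ins; Somnia’s real clients execute EVM state transitions and compare far richer results:

```python
class DivergenceError(Exception):
    pass

class ToyEngine:
    """Stand-in for an execution client: applies balance transfers to a state dict."""
    def __init__(self):
        self.balances = {"alice": 100, "bob": 0}

    def apply(self, tx):
        sender, recipient, amount = tx
        self.balances[sender] -= amount
        self.balances[recipient] += amount
        return dict(self.balances)          # stand-in for a state root

def dual_execute(tx, fast_engine, reference_engine):
    """Execute the same transaction on both engines and compare results;
    a mismatch means the optimized client has a bug, so the node must halt."""
    fast_state = fast_engine.apply(tx)
    ref_state = reference_engine.apply(tx)
    if fast_state != ref_state:
        raise DivergenceError(f"engines disagree on {tx}: {fast_state} vs {ref_state}")
    return fast_state

fast, reference = ToyEngine(), ToyEngine()
for tx in [("alice", "bob", 30), ("bob", "alice", 5)]:
    dual_execute(tx, fast, reference)
print("all transactions matched:", fast.balances)
```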

In summary, Somnia’s architecture is tailored to real-time, mass-user applications. By decoupling transaction propagation from finalization (MultiStream), supercharging single-core execution (EVM compilation and CPU-level parallelism), optimizing the data layer (IceDB) and minimizing bandwidth per transaction (compression + aggregation), Somnia achieves performance orders of magnitude beyond traditional L1s. Improbable CEO Herman Narula claims it’s “the most advanced layer-one… able to handle thousands of times the throughput of Ethereum or Solana” – built specifically for the speed, scale, and responsiveness needed by next-gen games, social networks, and immersive metaverse experiences.
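As a rough illustration of the MultiStream idea summarized above – parallel per-validator data chains stitched into one global log by a consensus chain – here is a toy Python sketch. The deterministic merge rule used here (sort streams by validator ID and append each stream’s new blocks) is a simplifying assumption; in the real protocol the ordering is determined by the consensus chain itself:

```python
from dataclasses import dataclass, field

@dataclass
class DataChain:
    """One validator's stream: it appends its own blocks independently."""
    validator: str
    blocks: list = field(default_factory=list)

    def produce(self, txs):
        self.blocks.append(txs)

def consensus_round(chains, cursors):
    """Toy 'consensus block': take every validator's newly produced blocks since
    the last round and merge them into one deterministic global ordering."""
    ordered = []
    for chain in sorted(chains, key=lambda c: c.validator):   # deterministic tie-break
        new = chain.blocks[cursors[chain.validator]:]
        for block in new:
            ordered.extend(block)
        cursors[chain.validator] = len(chain.blocks)
    return ordered

chains = [DataChain("v1"), DataChain("v2"), DataChain("v3")]
cursors = {c.validator: 0 for c in chains}

# Validators ingest transactions in parallel on their own streams...
chains[0].produce(["tx-a1", "tx-a2"])
chains[1].produce(["tx-b1"])
chains[2].produce(["tx-c1", "tx-c2", "tx-c3"])

# ...and the consensus chain periodically stitches them into one global log.
print(consensus_round(chains, cursors))
```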

Tokenomics – Supply, Utility, and Economic Design

Supply and Distribution: Somnia’s native token, SOMI, has a fixed maximum supply of 1,000,000,000 tokens (1 billion). There is no ongoing inflation – the supply is capped and tokens were allocated upfront to various stakeholders with vesting schedules. The allocation breakdown is as follows:

| Allocation Category | Percentage | Token Amount | Release Schedule |
|---|---|---|---|
| Team | 11.0% | 110,000,000 | 0% at launch; 12-month cliff, then vesting over 48 months. |
| Launch Partners | 15.0% | 150,000,000 | 0% at launch; 12-month cliff, then vesting over 48 months (includes early ecosystem contributors like Improbable). |
| Investors (Seed) | 15.15% | 151,500,000 | 0% at launch; 12-month cliff, then vesting over 36 months. |
| Advisors | 3.58% | 35,800,000 | 0% at launch; 12-month cliff, then vesting over 36 months. |
| Ecosystem Fund | 27.345% | 273,450,000 | 5.075% unlocked at launch; remainder vests linearly over 48 months. Used to fund ecosystem development and the Somnia Foundation. |
| Community & Rewards | 27.925% | 279,250,000 | 10.945% unlocked at launch, plus additional releases at 1 and 2 months post-launch, then linear vesting over 36 months. Used for community incentives, airdrops, liquidity, and validator staking rewards. |
| **Total** | 100% | 1,000,000,000 | ~16% circulating at TGE (Token Generation Event); the remainder vests over 3–4 years. |

At mainnet launch (TGE in Q3 2025), around 16% of the supply entered circulation (mostly from the initial unlocks of the Community and Ecosystem allocations). The majority of tokens (team, partners, investors) are locked for the first year and then released gradually, aligning incentives for long-term development. This structured vesting helps prevent immediate large sell-offs and ensures the foundation and core contributors have resources over time to grow the network.
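
For intuition, here is a back-of-the-envelope sketch of a “12-month cliff, then linear vest” allocation like the Team bucket in the table above. The cliff and duration figures come from the table; the assumption that vesting proceeds strictly linearly per month (with nothing released at the cliff itself) is ours, since the exact contract terms are not reproduced in this document.

```python
# Vesting sketch: cliff + linear monthly vest (assumed schedule, see note above).
def unlocked(total: float, months_elapsed: int, cliff: int, vest_months: int) -> float:
    """Tokens unlocked after `months_elapsed`, given a cliff and a linear vest."""
    if months_elapsed < cliff:
        return 0.0
    vested_months = min(months_elapsed - cliff, vest_months)
    return total * vested_months / vest_months

TEAM_ALLOCATION = 110_000_000  # from the allocation table

for month in (0, 12, 24, 36, 60):
    amount = unlocked(TEAM_ALLOCATION, month, cliff=12, vest_months=48)
    print(f"month {month:>2}: {amount:>13,.0f} SOMI unlocked")
# month  0:             0 SOMI unlocked
# month 24:    27,500,000 SOMI unlocked
# month 60:   110,000,000 SOMI unlocked
```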

Token Utility: SOMI is central to Somnia’s ecosystem and follows a Delegated Proof of Stake (DPoS) model. Its main uses include:

  • Staking and Security: Validators must stake 5,000,000 SOMI each to run a node and participate in consensus. This significant stake (~0.5% of total supply per validator) provides economic security; malicious actors risk losing their bond. Somnia initially targets 100 validators, meaning up to 500 million SOMI could be staked for node operation (some of which may come from delegation, see below). In addition, delegators (any token holders) can stake SOMI by delegating to validators to help them meet the 5M requirement. Delegators earn a share of rewards in return. This opens staking yields to non-validators and helps decentralize stake among many token holders. Only staked tokens (either by validators or via delegation) are eligible for network rewards – simply holding tokens without staking does not earn rewards.
  • Gas Fees: All on-chain transactions and smart contract executions require SOMI for gas fees. This means every interaction (transfers, mints, DApp use) creates demand for the token. Somnia’s gas model is based on Ethereum’s (same unit definitions) but with adjustments and much lower base costs. As detailed later, Somnia has sub-cent fees and even dynamic discounts for high-volume DApps, but fees are still paid in SOMI. Thus, if the network sees heavy usage (e.g. a popular game or social app), users and developers will need SOMI to fuel their transactions, driving utility.
  • Validator/Delegator Rewards: Block rewards on Somnia come from transaction fees and a community treasury, not inflation. Specifically, 50% of all gas fees are distributed to validators (and their delegators) as rewards; the other 50% is burned (removed from circulation) as a deflationary mechanism. This fee split resembles Ethereum’s EIP-1559 burn, except it is a fixed 50/50 split in Somnia’s current design. In practice, validators’ earnings derive from the network’s fee volume – as usage grows, fee rewards grow. To bootstrap security before fees are significant, Somnia also has treasury incentives for validators: the Community allocation includes tokens earmarked for staking rewards and liquidity, which the foundation can distribute as needed (likely as staking-yield supplements in the early years). Importantly, only staked tokens earn rewards, which encourages active participation and locks up supply. Delegators share in the fee rewards of their chosen validator proportionally to their stake, minus the validator’s commission: each validator sets a “delegation rate” (e.g., at 80%, that validator shares 80% of its rewards with its delegators). Somnia offers two delegation options: delegate to a specific validator’s pool (subject to a 28-day unbonding period, or an immediate emergency unstake with a steep 50% slash penalty), or delegate to a general pool that auto-distributes across all under-staked validators (no lockup period, but likely a blended, lower yield). This flexible DPoS design incentivizes token holders to secure the network for rewards while providing an easy exit (the general pool) for those who want liquidity; a worked example of the fee split and delegator math follows this list.
  • Governance: As Somnia matures, SOMI will govern network decisions. Token holders will eventually vote on proposals affecting protocol upgrades, use of treasury funds, economic parameters, etc. The project envisions a multi-faceted governance (see “Tokens Governance” below) where SOMI holders (the “Token House”) mainly control allocations of foundation and community funds, while validators, developers, and users have councils for technical and policy decisions. In early mainnet, governance is mostly handled by the Somnia Foundation (for agility and safety), but over 1–2 years it will progressively decentralize to the token community and councils. Thus, holding SOMI will confer influence over the ecosystem’s direction, making it a governance token in addition to a utility token.
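
The fee and delegation math above is simple arithmetic. The sketch below encodes it: 50% of fees burned, 50% paid to validators, with each validator passing a “delegation rate” share through to its delegators pro rata. Only the 50/50 split and the delegation-rate concept come from the text; the concrete figures (fee volume, an even spread across 100 validators, stakes, the 80% rate) are invented for illustration.

```python
# Hedged sketch of Somnia's fee split and delegator rewards (figures invented).
def split_block_fees(total_fees_somi: float) -> tuple[float, float]:
    burned = total_fees_somi * 0.50          # 50% burned
    to_validators = total_fees_somi * 0.50   # 50% distributed as rewards
    return burned, to_validators

def delegator_reward(validator_reward: float, delegation_rate: float,
                     delegator_stake: float, total_delegated: float) -> float:
    """A delegator's pro-rata share of the portion the validator passes through."""
    shared_pool = validator_reward * delegation_rate
    return shared_pool * (delegator_stake / total_delegated)

burned, validator_pool = split_block_fees(1_000_000)   # assume 1M SOMI of fees
per_validator = validator_pool / 100                    # assume an even spread over 100 validators

# A delegator who supplied 1M of the 4M SOMI delegated behind one validator,
# with that validator sharing 80% of its rewards:
reward = delegator_reward(per_validator, delegation_rate=0.80,
                          delegator_stake=1_000_000, total_delegated=4_000_000)
print(f"burned: {burned:,.0f} SOMI | validator pool: {validator_pool:,.0f} SOMI | "
      f"delegator reward: {reward:,.0f} SOMI")   # 500,000 | 500,000 | 1,000
```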

Deflationary Mechanics: Because supply is fixed, Somnia relies on fee burning to introduce deflationary pressure. As noted, 50% of every gas fee is burned permanently, so if network usage is high, SOMI’s circulating supply decreases over time, potentially increasing scarcity. For example, if 1 million SOMI worth of fees is generated in a month, 500k SOMI is destroyed. This burn mechanism can offset token unlocks or selling, and it aligns long-term token value with network usage (more activity means more burn). Somnia does not support user-specified tips (priority fees) at launch – the base-fee model is sufficient given the high throughput – though tips may be introduced later if congestion arises. With ultra-low fees, the burn per transaction is tiny, but at scale (billions of transactions) it accumulates. Somnia’s economic model therefore combines zero inflation, scheduled unlocks, and fee burning, aiming for long-term sustainability. If the network achieves mainstream volume, SOMI could become deflationary, benefiting stakers and holders as supply diminishes.

Gas Model Highlights: Somnia’s gas pricing is generally much cheaper than Ethereum’s, but with some novel twists for fairness and scalability. Most opcode costs are adjusted downward (since Somnia’s throughput and efficiency are higher) but storage costs were recalibrated upward per unit (to avoid abuse given low fee per gas). Two especially noteworthy features planned for 2025 are:

  • Dynamic Volume Discounts: Somnia introduces a tiered gas-price discount for accounts or applications that sustain high TPS usage. In effect, the more transactions an app or user executes per hour, the lower the effective gas price they pay (up to 90% off at ~400 TPS). This volume-based pricing is meant to incentivize large-scale DApps to run on Somnia by dramatically reducing their costs at scale. It is implemented as a stepwise decrease in gas price once certain per-account TPS thresholds are exceeded (0.1, 1, 10, 100, 400 TPS, etc.); a minimal sketch of this tiering follows this list. This model (expected to roll out after mainnet launch) rewards projects that bring heavy load, ensuring Somnia remains affordable even when powering real-time games or social feeds with hundreds of transactions per second. It is an unusual mechanism (most chains have a flat fee market), signaling Somnia’s prioritization of mass-throughput use cases.
  • Transient Storage: Somnia plans to offer time-bounded storage options where a developer can choose to store data on-chain only temporarily (for hours or days) at much lower gas cost than permanent storage. For example, an on-chain variable that only needs to persist for an hour (like a game lobby status or a player’s ephemeral position) can be stored with ~90% less gas than a normal permanent write. The gas schedule for a 32-byte SSTORE might be 20k gas for 1-hour retention vs 200k for indefinite. This concept of “transient state” is explicitly aimed at gaming and entertainment applications that generate lots of temporary data (leaderboards, game state) which doesn’t need to live forever on-chain. By providing an expiration-based storage with discounts, Somnia can support such real-time applications more efficiently. The implementation likely involves automatically discarding the state after the chosen duration (or moving it to a separate store), though details are to be rolled out. This feature, combined with Somnia’s compression, is geared towards on-chain games managing large volumes of state updates without bloating the chain or incurring huge costs.
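
To make the volume-discount mechanics concrete, here is a minimal tier-lookup sketch. The TPS thresholds (0.1, 1, 10, 100, 400) and the 90% ceiling come from the text above; the discount assigned to each intermediate tier is an invented placeholder, since the actual schedule has not been published in this document.

```python
# Stepwise volume-discount sketch (intermediate tier percentages are assumptions).
DISCOUNT_TIERS = [          # (sustained TPS threshold, hypothetical discount)
    (400.0, 0.90),
    (100.0, 0.70),
    (10.0,  0.50),
    (1.0,   0.25),
    (0.1,   0.10),
]

def effective_gas_price(base_gas_price: float, sustained_tps: float) -> float:
    """Apply the largest tier whose threshold the account sustains."""
    for threshold, discount in DISCOUNT_TIERS:
        if sustained_tps >= threshold:
            return base_gas_price * (1.0 - discount)
    return base_gas_price

base = 10.0  # arbitrary base gas price, in whatever unit the chain quotes
for tps in (0.05, 0.5, 5, 50, 500):
    print(f"{tps:>6} TPS -> effective gas price {effective_gas_price(base, tps):.2f}")
```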

Overall, Somnia’s tokenomics align with its goal of powering Web3 at Web2 scale. A large initial token pool funded development and ecosystem growth (with reputable backers and long locks signaling commitment), while the ongoing economic design uses market-driven rewards (via fees) and deflation to maintain value. SOMI holders are incentivized to stake and participate, as all network benefits (fee revenue, governance power) accrue to active stakers. With a capped supply and usage-proportional burn, SOMI’s value is tightly coupled to the success of the network: as more users and apps join, demand for tokens (for gas and staking) rises and supply diminishes from burns, creating a feedback loop supporting the token’s long-term sustainability.

Ecosystem and Partnerships

Despite only launching its mainnet in late 2025, Somnia entered the scene with a robust ecosystem of projects and strategic partners thanks to an extensive testnet phase and support from industry heavyweights.

Ecosystem dApps and Protocols: By mainnet launch, over 70 projects and dApps were already building on or integrating with Somnia. The initial ecosystem skews heavily toward gaming and social applications, reflecting Somnia’s target market of immersive, real-time apps. Notable projects include:

  • Sparkball: A flagship Web3 game on Somnia, Sparkball is a fast-paced 4v4 sports MOBA/brawler developed by Opti Games. It joined Somnia as a launch title, introducing on-chain gameplay and NFT-based team assets. Sparkball showcases Somnia’s ability to handle quick matchmaking and in-game transactions (for example, minting/trading players or items) with negligible latency.
  • Variance: An anime-themed roguelite RPG with rich story and no pay-to-win mechanics. Variance’s developers (veterans from Pokémon GO and Axie Infinity) chose Somnia for its capacity to handle large-scale game economies and transactions cheaply. After discussions with Somnia’s founder, the team was convinced Somnia understood game developers’ needs and the vision for Web3 gaming. Variance moved its in-game token ($VOID) and NFT logic onto Somnia, enabling features like on-chain loot drops and player-owned assets at scale. The game’s community grew significantly after announcing the switch to Somnia. Variance held playtests and community quests on Somnia’s testnet, demonstrating multi-player on-chain combat and rewarding players with NFTs and tokens.
  • Maelstrom Rise: A naval battle-royale game (think Fortnite at sea) by Uprising Labs. Maelstrom features real-time ship combat and an integrated on-chain economy for upgrades and collectibles. Already available off-chain (on Steam), Maelstrom is transitioning to Somnia to give players true ownership of warships and items. It’s one of the more accessible Web3 games, aiming to onboard traditional gamers by blending familiar gameplay with blockchain perks.
  • Dark Table CCG: An on-chain collectible card game supporting up to 4 players per match. It offers free-to-play deck building, with all cards as NFTs that players own and trade freely. Dark Table leverages Somnia to run a cross-platform card economy without central servers, letting players truly own their decks. It’s designed to be easy-entry (no crypto purchase needed to start) to attract both casual and competitive card gamers to Web3.
  • Netherak Demons: A dark fantasy action RPG backed by Somnia’s Dream Catalyst accelerator. Players customize demon characters and engage in real-time PvE and PvP battles, with an NFT collection that ties into game progress. Netherak uses Somnia’s tech to allow persistent character progression on-chain – players’ achievements and loot are recorded as assets they control, adding meaningful stakes to the gameplay.
  • Masks of the Void: A roguelite action-adventure game with procedurally generated levels, also supported by Uprising Labs. It planned a closed playtest where minting a free NFT grants early access, showcasing how Somnia can integrate NFT gating for game content. Masks of the Void emphasizes replayability and blockchain-enhanced progression (e.g. meta-game rewards that persist run-to-run as NFTs).

These are just a few highlights. The Somnia gaming ecosystem spans many genres – from naval shooters to card battlers to RPGs – indicating the platform’s broad appeal to developers. All these games leverage on-chain features (ownership of items, tokens for rewards, NFT characters, etc.) that require a high-performance chain to be enjoyable for players. Early results are promising: for instance, Somnia’s testnet ran a fully on-chain sandbox MMO demo called “Chunked” (built by Improbable) where thousands of players interacted in real time, generating 250 million transactions in 5 days – a record-breaking load that validated Somnia’s capabilities.

Beyond gaming, Somnia’s initial ecosystem includes other Web3 domains:

  • Social and Metaverse: Somnia is meant to power decentralized social networks and virtual worlds, though specific apps are early. However, hints of social platforms are present. For example, Somnia partnered with Yuga Labs to integrate Otherside NFTs (from Bored Ape Yacht Club’s metaverse) into Somnia’s world, allowing those assets to be used across immersive experiences. Community-driven events like BoredElon Musk’s Edison “gamevents” were run with Improbable tech in 2023, and Somnia is poised to bring such metaversal events fully on-chain going forward. There is also a Somnia Metaverse Browser application – essentially a custom Web3 browser/wallet geared for virtual world interaction, making it easy for users to access DApps and metaverse experiences in one interface. As the network matures, expect social dApps (decentralized Twitter/Reddit analogues, community hubs) and metaverse platforms to launch on Somnia, leveraging its identity portability features (Somnia natively supports MSquared’s open standards for avatar and asset interoperability across worlds).
  • DeFi and Others: At launch Somnia wasn’t primarily DeFi-focused, but some infrastructure is in place. There are integrations with price oracles like DIA (for on-chain price feeds) and Chainlink VRF via Protofire adapters (for randomness in games). A few DeFi-style use cases were discussed, such as fully on-chain order book exchanges (Somnia’s low latency could enable order-matching on-chain similar to a centralized exchange). We can expect an AMM or DEX to appear (the docs even include a guide to build a DEX on Somnia), and perhaps novel protocols blending gaming and finance (e.g. NFT lending or tokenized game asset markets). The presence of custody providers BitGo and Fireblocks as partners also indicates an eye towards supporting institutional and financial use-cases (they make holding tokens secure for exchanges and funds). Furthermore, Somnia’s tech can support AI and data-heavy apps (the Dreamthon program explicitly calls for AI and InfoFi projects), so we may see innovations like decentralized AI agents or data marketplaces on the chain.

Strategic Partnerships: Somnia is backed by an impressive roster of partners and backers:

  • Improbable and MSquared: Improbable – a leading metaverse technology company – is the primary development partner of Somnia. Improbable actually built the Somnia blockchain under contract for the Somnia Foundation, contributing its decade of distributed-systems expertise. MSquared (M²), a metaverse network initiative backed by Improbable, is also closely involved. Together, Improbable and MSquared committed **up to $270 million** to support Somnia’s development and ecosystem. This enormous investment pool (announced in early 2025) came partly from M²’s $150M raise in 2022 (which included Andreessen Horowitz, SoftBank Vision Fund 2, Mirana, and others as investors) and $120M from Improbable’s venture allocation. The funding supports grants, marketing, and onboarding projects. Improbable’s involvement also brings technical integrations: Somnia is designed to work with Improbable’s Morpheus technology for massive virtual events. In 2023, Improbable powered virtual experiences like MLB’s Virtual Ballpark and K-pop concerts with tens of thousands of concurrent users – those users could soon be onboarded into Somnia so that event interactions yield on-chain assets or tokens. Improbable and MSquared essentially ensure Somnia has both the financial runway and real use cases (metaverse events, games) to jump-start adoption.
  • Infrastructure & Web3 Services: Somnia integrated with many major blockchain service providers from day one:
    • OpenSea: The world’s largest NFT marketplace is integrated with Somnia, meaning Somnia-based NFTs can be traded on OpenSea. This is a big win for game developers on Somnia – their in-game NFTs (characters, skins, etc.) have immediate liquidity and visibility on a popular marketplace.
    • LayerZero: Somnia is connected to other chains via LayerZero’s Stargate protocol, enabling omnichain asset transfers and bridges. For example, users can bridge USDC or other stablecoins from Ethereum to Somnia easily through Stargate. This interoperability is crucial for onboarding liquidity into Somnia’s ecosystem.
    • Ankr: Ankr provides RPC nodes and global node infrastructure. It’s likely used to offer public RPC endpoints, node hosting, and API services for Somnia, making it easier for developers to access the network without running their own full nodes.
    • Sequence (Horizon): Sequence is a smart contract wallet and developer platform tailored for games (by Horizon). Integration with Sequence suggests Somnia can leverage smart wallet features (e.g. gas abstractions, login with email/social) to onboard mainstream users. Sequence’s multi-chain wallet likely added support for Somnia, so players can sign transactions with a user-friendly interface.
    • Thirdweb: Thirdweb’s Web3 SDKs and tools are fully compatible with Somnia. Thirdweb provides plug-and-play modules for NFT drops, marketplaces, tokens, and especially Account Abstraction. Indeed, Somnia’s docs have guides on gasless transactions and account abstraction via Thirdweb. This partnership means developers on Somnia can quickly build DApps using Thirdweb’s libraries and users can benefit from features like one-click walletless onboarding (gas fees sponsored by the DApp, etc.).
    • DIA & Oracles: DIA is a decentralized oracle provider; Somnia uses DIA price feeds for DeFi or in-game economy data. Additionally, Somnia worked with Protofire to adapt Chainlink VRF (verifiable random function) for random number generation in Somnia smart contracts. This ensures games can get secure randomness (for loot drops, etc.). We can expect further oracle integrations (perhaps Chainlink full price feeds in the future) as needed by DeFi projects.
  • Cloud and Enterprise Partners: Google Cloud not only invested but also runs a validator, providing credibility and cloud infrastructure expertise. Having a tech giant’s cloud division actively validate the network helps with reliability and opens doors to enterprise collaborations (e.g. Google Cloud might offer blockchain node services for Somnia or include Somnia in its marketplace). There were also partnerships with Fireblocks and BitGo – these are top digital asset custody and wallet providers. Their involvement means exchanges and institutions can safely custody SOMI and Somnia-based assets from day one, smoothing the path for SOMI listings and institutional adoption. Indeed, shortly after mainnet, Binance listed SOMI and featured it in a promotional airdrop campaign, likely facilitated by such custody readiness.
  • Ecosystem Growth Programs: The Somnia Foundation established a **$10 million Grant Program** to fund developers building on Somnia. The grant program launched alongside mainnet to incentivize tool development, DApps, research, and community initiatives. Complementing it is **Dream Catalyst**, Somnia’s accelerator specifically for Web3 gaming startups. Dream Catalyst (run with Uprising Labs) provides funding, infrastructure credits, mentorship, and go-to-market support to game studios that build on Somnia. At least a half-dozen games (like Netherak Demons and others) were part of the first Dream Catalyst cohort, receiving portions of that $10M fund. There’s also Dreamthon, an upcoming accelerator program for other verticals – focusing on DeFi, AI, “InfoFi” (information markets), and SocialFi projects in the Somnia ecosystem. Additionally, Somnia organized online hackathons and quests throughout testnet: for example, a 60-day Somnia Odyssey event rewarded users for completing tasks and likely culminated in an airdrop. Early users could earn “points” and NFTs for testing dApps (a Points Program), and mini-hackathons are planned to continuously engage devs. This multi-pronged approach – grants, accelerators, hackathons, community quests – shows Somnia’s strong commitment to building a vibrant ecosystem quickly, by lowering barriers and funding experimenters.

In summary, Somnia launched not in isolation but backed by a powerful alliance of tech companies, investors, and service providers. Improbable’s support gives it cutting-edge tech and a pipeline of massive virtual events. Partnerships with the likes of Google Cloud, Binance, LayerZero, OpenSea, and others ensure Somnia is plugged into the broader crypto infrastructure from the start, enhancing its appeal to developers (who want reliable tools and liquidity) and to users (who demand easy bridging and trading of assets). Meanwhile, an array of Web3 games – Sparkball, Variance, Maelstrom, and more – are actively building on Somnia, aiming to be the first wave of fully on-chain entertainment that showcases the network’s capabilities. With dozens of projects live or in development, Somnia’s ecosystem at mainnet was already richer than some chains years into launch. This strong momentum is likely to grow as the grants and partnerships continue to bear fruit, potentially positioning Somnia as a central hub for on-chain gaming and metaverse applications in the coming years.

Developer & User Infrastructure

Somnia was built to be developer-friendly and to onboard potentially millions of users who may not be crypto-savvy. As an EVM-compatible chain, it supports the familiar Ethereum toolchain out of the box, while also offering custom SDKs and services to enhance the developer experience and user onboarding.

Developer Tooling and Compatibility: Somnia maintains full Ethereum Virtual Machine compatibility, meaning developers can write smart contracts in Solidity or Vyper and deploy with minimal changes. The network supports standard Ethereum RPC interfaces and chain ID, so tools like Hardhat, Truffle, Foundry, and libraries like Web3.js or ethers.js work seamlessly (the Somnia docs even provide specific how-tos for deploying with Hardhat and Foundry). This lowers the learning curve significantly – any Solidity developer can become a Somnia developer without learning a new language or VM.
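
Because Somnia exposes a standard Ethereum JSON-RPC interface, any Ethereum client library should work against it. The minimal connectivity check below uses web3.py; the RPC URL is a placeholder (the real endpoint and chain ID live in Somnia’s documentation and are not reproduced here), so treat this as a sketch of the workflow rather than copy-paste configuration.

```python
# Minimal Somnia connectivity check via web3.py (placeholder RPC endpoint).
from web3 import Web3

SOMNIA_RPC_URL = "https://rpc.somnia.example"   # placeholder, not the real endpoint

w3 = Web3(Web3.HTTPProvider(SOMNIA_RPC_URL))
if not w3.is_connected():
    raise SystemExit("Could not reach the RPC endpoint")

# Standard Ethereum JSON-RPC calls work unchanged on an EVM-compatible chain.
print("chain id:    ", w3.eth.chain_id)
print("latest block:", w3.eth.block_number)
print("gas price:   ", w3.eth.gas_price)
```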

To accelerate development and testing, Somnia launched an interactive Playground environment. The Playground allows teams (especially gaming/metaverse teams) to prototype on-chain logic in a low-friction way, using templates for NFTs, mini-games, social tokens, etc. It likely provides a sandbox network or developer portal for quick iterations. Additionally, Somnia’s GitBook documentation is comprehensive, covering everything from deploying contracts to using advanced features (like Ormi APIs, see below).

Somnia SDKs and APIs: Recognizing that querying on-chain data efficiently is as important as writing contracts, Somnia partnered with Ormi Labs to provide robust data indexing and API services. Ormi is essentially Somnia’s answer to The Graph: it offers subgraphs and GraphQL APIs for indexing contract events and state. Developers can create custom subgraphs for their DApps (e.g. to index all game item NFTs or social posts) via Ormi, and then query that data easily. The Ormi Data APIs deliver structured on-chain data with high availability, so front-end applications don’t need to run their own indexer nodes. This significantly simplifies building rich user interfaces on Somnia. Somnia has run Codelabs and tutorials showing how to build dApp UIs with Ormi’s GraphQL endpoints, indicating strong support for this tooling. In short, Somnia provides first-class indexing support, which is crucial for things like leaderboards in games or feeds in social apps – data that needs to be filtered and fetched quickly.
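
In practice, querying a subgraph like this is just an HTTP POST carrying a GraphQL document. The example below illustrates the shape of such a query; the endpoint URL and the entity/field names (`gameItems`, `owner`, `mintedAt`) are invented for illustration, since real schemas are defined by each DApp’s own subgraph.

```python
# Hypothetical GraphQL query against a Somnia subgraph endpoint.
import requests

SUBGRAPH_URL = "https://api.ormi.example/subgraphs/my-game"   # placeholder

query = """
{
  gameItems(first: 5, orderBy: mintedAt, orderDirection: desc) {
    id
    owner
    mintedAt
  }
}
"""

resp = requests.post(SUBGRAPH_URL, json={"query": query}, timeout=10)
resp.raise_for_status()
for item in resp.json()["data"]["gameItems"]:
    print(item["id"], item["owner"], item["mintedAt"])
```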

In addition to Ormi, Somnia’s infrastructure page lists multiple public RPC endpoints and explorer services:

  • RPC endpoints by providers like Ankr (for public access to the network).
  • Block Explorers: It appears Somnia had a testnet explorer (“Shannon”) and presumably a mainnet explorer for tracking transactions and accounts. Explorers are vital for developers and users to debug transactions and verify on-chain activity.
  • Safes (Multisig): The docs mention “Safes”, likely integration with Safe (formerly Gnosis Safe) for multi-signature wallets. This means DAOs or game studios on Somnia can use secure multisig wallets to manage their treasury or in-game assets. Safe integration is another piece of infrastructure that makes Somnia enterprise- and DAO-ready.
  • Wallet Adapters: Many popular Web3 wallets are supported. MetaMask can connect to Somnia by configuring the network RPC (the docs guide users through adding Somnia’s network to MetaMask). For a more seamless user experience, Somnia worked with RainbowKit and ConnectKit (React libraries for wallet connections), ensuring DApp developers can easily let users connect with a variety of wallets. There's also a guide for using Privy (a wallet solution focusing on user-friendly login).
  • Account Abstraction: Through Thirdweb’s SDK, Somnia supports account abstraction features. For instance, Thirdweb’s Smart Wallet or Account Abstraction SDK can be used on Somnia, enabling meta-transactions (gasless UX) or social login wallets. The docs explicitly describe gasless transactions with Thirdweb, meaning DApps can pay gas on behalf of users – a critical capability for mainstream adoption, as end-users might not even need to hold SOMI to play a game initially.

User Onboarding and Community Engagement: Somnia’s team has been proactive in growing a community of both developers and end-users:

  • The Somnia Discord is the central hub for developers (with a dedicated dev-chat and support from the core team). During testnet, developers could request test tokens (STT) via Discord to deploy and test their contracts. This direct support channel helped onboard many projects.
  • For end-users, Somnia organized events like the Somnia Quest and Somnia Odyssey. The Quest was a campaign in June 2025 where users completed social and testnet tasks (like following on X, joining Discord, trying DApps) to earn rewards and climb a leaderboard. The Odyssey (mentioned in a blog on Sep 9, 2025) was a 60-day adventure likely leading up to mainnet, where users who consistently interacted with testnet apps or learned about Somnia could unlock an airdrop. Indeed, Binance’s HODLer Airdrop on Sep 1, 2025, distributed 30 million SOMI (3% of supply) to Binance users who met certain criteria. This was a major user acquisition event, effectively giving thousands of crypto users a stake in Somnia and an incentive to try the network. The airdrop and various quests have helped Somnia build an initial user base and social media presence (Somnia’s Twitter – now X – and other channels have grown quickly).
  • Metaverse Browser: As mentioned, Somnia introduced a specialized Metaverse Browser application. This likely serves as a user-friendly gateway where someone can create a wallet, browse Somnia DApps, and enter virtual events seamlessly. It has an integrated Web3 wallet and a simple interface for accessing DApps. This kind of curated experience could ease non-crypto users into blockchain (for example, a gamer could download the Somnia browser, and join a virtual concert where the browser handles wallet creation and token transactions under the hood).
  • Developer Accelerator Programs: We covered Dream Catalyst and Dreamthon under ecosystem, but from a developer infrastructure perspective, these programs also ensure that new devs have guidance and resources. Dream Catalyst provided not just funding but also infrastructure tooling and community building support. That means participating teams likely got help with integrating Somnia’s SDKs, optimizing their contracts for Somnia’s architecture, etc.

In terms of documentation and resources:

  • Somnia offers a Lightpaper and OnePager for quick overviews (linked on their site), and a more detailed Litepaper/whitepaper in the docs (the Concepts section we referenced serves that purpose).
  • They have example repositories and code templates (for instance, how to build a DEX, how to use Subgraphs, how to integrate wallets – all provided in their official GitBook). By providing these, Somnia lowers the barrier to entry for developers from other chains who want to quickly get something running.
  • Audits: The docs mention an Audits section, implying the Somnia code has undergone third-party security audits. While details aren’t provided in our sources, this is important infrastructure – ensuring the node software and key contracts (like the staking or token contracts) are audited to protect developers and users.

Overall, Somnia’s developer infrastructure appears well-thought-out: EVM compatibility for familiarity, enhanced with custom data APIs, built-in account abstraction, and strong dev support. For users, the combination of ultra-low fees, possible gasless transactions, and specialized applications (Metaverse Browser, quests, etc.) aims to provide a Web2-level user experience on a Web3 platform. Somnia’s early focus on community engagement (airdrops, quests) shows a growth-hacking mentality – seeding the network with content and users so that developers have a reason to build, and vice versa. As Somnia grows, we can expect even more refined SDKs (perhaps plugins for Unity/Unreal for game devs) and continued improvements to user wallets (maybe native mobile wallets or social logins). The foundation’s substantial funding ensures that both devs and users will be supported with the tools they need to thrive on Somnia.

Use Cases and Applications

Somnia is purpose-built to enable a new class of decentralized applications that were previously infeasible due to blockchain limitations. Its high throughput and low latency open the door to fully on-chain, real-time experiences across various domains:

  • Gaming (GameFi): This is Somnia’s primary focus. With Somnia, developers can build games where every game action (movement, combat, item drops, trades) can be recorded or executed on-chain in real time. This means true ownership of in-game assets – players hold their characters, skins, cards, or loot as NFTs/tokens in their own wallets, not in a game company’s database. Entire game economies can run on-chain, enabling features like play-to-earn rewards, player-to-player trading without intermediaries, and community-driven game modifications. Crucially, Somnia’s capacity (1M+ TPS) and fast finality make on-chain games responsive. For example, an action RPG on Somnia can execute thousands of player actions per second without lag, or a trading card game can have instant moves and shuffles on-chain. Somnia’s account abstraction and low fees also allow games to potentially cover gas for players, making the experience seamless (players may not even realize blockchain is under the hood). The platform specifically envisions “fully on-chain games at internet scale” – persistent virtual worlds or MMOs where game state lives on Somnia and continues as long as the community keeps it alive. Because assets are on-chain, a game on Somnia could even continue evolving under community control if the original developer leaves – a concept impossible in Web2. Current examples: Sparkball demonstrates an on-chain multiplayer sports brawler; Chunked (the Improbable tech demo) showed a Minecraft-like sandbox entirely on-chain with real user interactions; Variance and Maelstrom will show how richer RPG and battle royale experiences translate to blockchain. The ultimate promise is games where hundreds of thousands of players play simultaneously in a shared on-chain world – something Somnia is uniquely positioned to handle.
  • Social Networks and Web3 Social Media: With Somnia, one could build a decentralized social platform where user profiles, posts, followers, and likes are all on-chain data under user control. For instance, a Twitter-like DApp on Somnia might store each tweet as an on-chain message NFT and each follow as an on-chain relationship. In such a network, users truly own their content and social graph, which could be ported to other apps easily. Somnia’s scale means a social feed could handle viral activity (millions of posts and comments) without crashing. And sub-second finality means interactions (posting, commenting) appear nearly instantly, as users expect in Web2. One benefit of on-chain social is censorship resistance – no single company can delete your content or ban your account – and data portability – you could move to a different frontend or client and keep your followers/content because it’s on a public ledger. The Somnia team explicitly mentions decentralized social networks built on self-sovereign identity and portable social graphs as a core use case. They also foresee user assembly governance where key users have a say (this could tie into how social networks moderate content in a decentralized way). A concrete early example is likely community forums within games – e.g., a game on Somnia might have an on-chain guild chat or an event board that is decentralized. But in the long term, Somnia could host full-fledged alternatives to Facebook or Twitter, especially for communities that value freedom and ownership. Another interesting angle is creator-owned platforms: imagine a YouTube-like service on Somnia where video NFTs represent content and creators earn directly via microtransactions or tokenized engagement. Somnia’s throughput could handle the metadata and interactions (though video storage would be off-chain), and its cheap transactions enable micro-tipping and token rewards for content creation.
  • Metaverse and Virtual Worlds: Somnia provides the identity and economic infrastructure for metaverses. In practice, this means virtual world platforms can use Somnia for avatar identities, cross-world assets, and transactions within virtual experiences. MSquared’s open standards for avatars/assets are supported on Somnia, so a user’s 3D avatar or digital fashion items can be represented as tokens on Somnia and ported across different worlds. For example, you might have a single avatar NFT that you use in a virtual concert, a sports meetup, and a game – all on Somnia-based platforms. As Improbable orchestrates massive events (like virtual sports watch parties, music festivals, etc.), Somnia can handle the economy layer: minting POAPs (proof of attendance tokens), selling virtual merchandise as NFTs, rewarding participants with tokens, and allowing peer-to-peer trading in real time during events. Somnia’s ability to support tens of thousands of concurrent users in one shared state (through multi-stream consensus) is crucial for metaverse scenarios where a large crowd might transact or interact simultaneously. The MLB Virtual Ballpark and K-pop events in 2023 (pre-Somnia) reached thousands of users; with Somnia, those users could each have wallets and assets, enabling things like a live NFT drop to everyone in the “stadium” or a real-time token scoreboard for event participation. Essentially, Somnia can underpin a persistent, interoperable metaverse economy: think of it as the ledger that records who owns what across many interconnected virtual worlds. This supports use cases like virtual real estate (land NFTs) that can be traded or borrowed against, cross-world quest rewards (complete an objective in game A, get an item usable in world B), or even identity reputation (on-chain records of a user’s achievements or credentials across platforms).
  • Decentralized Finance (DeFi): While Somnia is mainly positioned as a consumer app chain, its high performance opens some intriguing DeFi possibilities. For one, Somnia can host high-frequency trading and complex financial instruments on-chain. The team specifically mentions fully on-chain limit order books. On Ethereum, order-book exchanges are impractical (too slow and expensive), which is why DeFi uses AMMs. But on Somnia, a DEX could maintain an order-book smart contract and match orders in real time, just like a centralized exchange, because the chain can handle thousands of operations per second (a toy matching sketch follows this list). This could bring CEX-like functionality and liquidity on-chain with transparency and self-custody. Another area is real-time risk management: Somnia’s speed could allow on-chain derivatives that update margin requirements every second, or live options order books. Moreover, with its transient storage feature, Somnia could support things like ephemeral insurance contracts or streaming payments that exist only for a short period. DeFi protocols on Somnia might also leverage its deterministic gas for more predictable costs. For instance, a micro-loan platform on Somnia could feasibly process tiny transactions (like $0.01 interest payments every minute) because fees are fractions of a cent. So Somnia could power Web3 microtransactions and payment streams in DeFi and beyond (something Ethereum can’t economically do at scale). Additionally, Somnia’s ability to compress data and aggregate signatures might allow batching of thousands of transfers or trades in one block, further boosting throughput for DeFi use cases like airdrops or mass payouts. While DeFi isn’t the marketing focus, an efficient financial ecosystem is likely to emerge on Somnia to support the games and metaverses (e.g., DEXes for game tokens, lending markets for NFTs, etc.). We might see specialized protocols, for example an NFT fractionalization exchange where gaming items can be fractionally traded – Somnia can handle the bursty demand if a popular item suddenly pumps.
  • Identity and Credentials: Somnia’s combination of self-sovereign identity and high capacity enables on-chain identity systems that could be used for authentication, reputation, and credentials in Web3. For example, a user could have an identity NFT or soulbound token on Somnia that attests to their achievements (like “completed X game quests” or “attended Y events” or even off-chain credentials like degrees or memberships). These could be used across multiple applications. A user’s portable social graph – who their friends are, which communities they belong to – can be stored on Somnia and taken from one game or social platform to another. This is powerful for breaking the silos of Web2: imagine switching a social app but keeping your followers, or a gamer profile that carries your history into new games (maybe earning you veteran perks). With Somnia’s governance model incorporating a User Assembly (key users providing oversight), we might also see identity-based governance where users with proven participation get more say in certain decisions (all enforceable on-chain via those credentials). Another use case is content creator economies – a creator could issue their own token or NFT series on Somnia to their fanbase, and those could unlock access across various platforms (videos, chats, virtual events). Since Somnia can handle large volumes, a popular creator with millions of fans could airdrop badges to all of them or handle micro-tipping in real time during a live stream.
  • Real-Time Web Services: Broadly, Somnia can act as a decentralized backend for services that require instant responses. Consider a decentralized messaging app where messages are events on-chain – with sub-second finality, two users could chat via Somnia and see messages appear almost instantly and immutably (perhaps with encryption on content, but timestamps and proofs on-chain). Or an online marketplace where orders and listings are smart contracts – Somnia could update inventory and sales in real time, preventing double-spending of items and enabling atomic swaps of goods for payment. Even streaming platforms could integrate blockchain for rights management: e.g., a music streaming service on Somnia might manage song play counts and license micropayments to artists every few seconds of play (because it can handle high-frequency small transactions). In essence, Somnia enables Web2-level interactivity with Web3 trust and ownership. Any application where many users interact simultaneously (auctions, multiplayer collaboration tools, live data feeds) could be decentralized on Somnia without sacrificing performance.
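
To ground the on-chain order-book idea from the DeFi bullet above, here is a toy price-time-priority matching step in Python. It illustrates the kind of per-transaction logic an order-book DEX contract would run; it is not Somnia code, and on-chain it would of course be written as a Solidity contract rather than Python.

```python
# Toy price-time-priority matching step for an order-book DEX (illustrative only).
import heapq

class OrderBook:
    def __init__(self) -> None:
        self._asks: list[tuple[float, int, float]] = []  # (price, seq, qty) min-heap
        self._seq = 0

    def add_ask(self, price: float, qty: float) -> None:
        heapq.heappush(self._asks, (price, self._seq, qty))
        self._seq += 1

    def match_buy(self, limit_price: float, qty: float) -> list[tuple[float, float]]:
        """Fill a buy order against resting asks priced at or below the limit."""
        fills = []
        while qty > 0 and self._asks and self._asks[0][0] <= limit_price:
            ask_price, seq, ask_qty = heapq.heappop(self._asks)
            traded = min(qty, ask_qty)
            fills.append((ask_price, traded))
            qty -= traded
            if ask_qty > traded:                     # partially filled ask rests again
                heapq.heappush(self._asks, (ask_price, seq, ask_qty - traded))
        return fills

book = OrderBook()
book.add_ask(1.00, 50)
book.add_ask(0.99, 30)
print(book.match_buy(limit_price=1.00, qty=60))   # [(0.99, 30), (1.0, 30)]
```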

Current Status of Use Cases: As of late 2025, the most tangible use cases live on Somnia revolve around gaming and collectibles – several games are in testing or early access phases on mainnet, and NFT collections (avatars, game assets) are being minted on Somnia. The network has successfully facilitated huge test events (billions of testnet tx, large-scale demos) proving that these use cases aren’t just theoretical. The next step is converting those tests into continuous live applications with real users. Early adopters like Sparkball and Variance will be important litmus tests: if they can attract thousands of daily players on Somnia, we’ll see the chain truly flex its muscles and perhaps attract even more game developers.

Potential future applications are exciting to consider. For example, national or enterprise-scale projects: a government could use Somnia to issue a digital ID or handle an election on-chain (millions of votes in seconds, with transparency), or a stock exchange could use it for trading tokenized securities at high frequency. The InfoFi part mentioned for Dreamthon hints at things like decentralized Reddit or prediction markets (massive number of small bets and votes) that Somnia could power.

In summary, Somnia’s use cases span gaming, social, metaverse, DeFi, identity, and beyond, all tied by a common thread: real-time, massive-scale transactions with full on-chain trust. It aims to bring experiences usually reserved for centralized servers into the decentralized realm. If Ethereum pioneered decentralized finance, Somnia’s ambition is to pioneer decentralized life – from entertainment to social connections – by finally delivering the performance needed for mainstream-style apps. As the network matures, we’ll likely see new innovations that leverage its unique features (e.g., games using transient state for physics simulations, or social apps using streaming compression to handle millions of tiny actions). The next year or two will reveal which of these potential applications gain traction and prove out Somnia’s promise in the wild.

Competitive Landscape

Somnia enters a crowded Layer-1 arena, but it differentiates itself with its extreme throughput and focus on fully on-chain consumer applications. Here’s how Somnia compares to some other prominent L1 blockchains:

| Aspect | Somnia (SOMI) | Ethereum (ETH) | Solana (SOL) | Avalanche (AVAX) | Sui (SUI) |
|---|---|---|---|---|---|
| Launch (Mainnet) | 2025 (Q3) – new entrant backed by Improbable | 2015 (front-runner, now L1 + L2 ecosystem) | 2020 (high-performance monolithic L1) | 2020 (multi-chain platform: P-Chain, C-Chain, subnets) | 2023 (Move-based L1) |
| Consensus Mechanism | MultiStream PoS-BFT: many parallel validator chains plus a PBFT consensus chain (inspired by Autobahn). PoS with ~100 validators. | Proof-of-Stake with Gasper (Casper FFG + LMD-GHOST): ~700k validators (permissionless). ~12-second blocks, finalized after ~2 epochs. | Tower BFT PoS using Proof-of-History for timing. ~2,200 validators. Rotating leader, parallel block processing. | Snowman (Avalanche) consensus on the P-Chain, with leaderless repeated subsampling. ~1,000 validators. The C-Chain (EVM) also uses Snowman. Subnets can use custom consensus mechanisms. | Narwhal & Bullshark DAG-based PoS with rapid leader rotation. ~100 validators (permissionless, growing set). Uses the Move VM. |
| Throughput | 1,000,000+ TPS demonstrated in tests (1.05M ERC-20 transfers/sec on 100 nodes). Aims for internet scale (million+ TPS sustained). | ~15–30 TPS on mainnet L1. Scales via L2 rollups (much higher in aggregate, but each rollup is a separate domain). | ~2,000–3,000 TPS typical; tested up to ~50k TPS on testnets (theoretical 65k+ TPS). Highly parallel for non-overlapping transactions. | ~4,500 TPS on the C-Chain (EVM) under ideal conditions. Subnets allow horizontal scaling by adding more chains. | ~20,000+ TPS in testing (a Sui devnet benchmark hit 297k TPS). Real-world TPS is lower (hundreds to low thousands). Uses parallel execution for independent transactions. |
| Transaction Finality | ~0.1–0.5 seconds (sub-second deterministic finality) – essentially real time. | ~12-second block time; ~13 minutes to full finality (~2 epochs under PoS). Future upgrades (e.g., single-slot finality) aim to reduce this. | ~0.4-second block time on average; finality usually within ~1–2 seconds barring forks. | ~1–2 seconds to finality on the C-Chain (Avalanche consensus finalizes quickly). Subnet finality varies but is generally 1–3 s. | ~1 second typical finality under good network conditions. |
| Scalability Model | Scale-up (vertical) plus parallel streams: a single chain with massive throughput via optimized execution and multi-leader consensus. No sharding; one global state. Plans to grow the validator set as the tech matures. | Layer-2 scaling and (future) sharding: the L1 stays decentralized but low-TPS; scaling happens on rollups (Arbitrum, Optimism, etc.). Danksharding is on the roadmap to raise L1 data capacity. | Monolithic chain: all state on one chain. Relies on high node performance and parallel execution. No sharding (Solana trades some decentralization for raw TPS). | Subnets and multiple chains: the P-Chain manages validators; the C-Chain (EVM) is one chain (~4.5k TPS). Additional subnets can be launched per app, each with its own throughput – horizontal scaling by adding chains, each with separate state. | Multi-lane execution: Sui uses object-based execution to parallelize transactions. Like Solana, a single chain whose throughput comes from parallelism and high hardware requirements. No sharding; one global state (with internal object partitioning). |
| Programming and VM | EVM-compatible (Solidity, Vyper). Smart contracts compiled to x86 for performance. Supports all Ethereum tooling. | EVM (Solidity, Vyper) on mainnet. Enormous, mature ecosystem of dev tools and frameworks. | Custom runtime (Sealevel) with programs in Rust or C/C++, compiled via LLVM to BPF/SBF bytecode. Not EVM-compatible; steeper learning curve but high performance. | Multiple VMs: the default C-Chain is EVM (Solidity) – dev-friendly but lower performance. Other subnets can run custom VMs (e.g., a WASM-based testnet VM) for specific needs. | Move VM: uses Move, a Rust-inspired language designed for safe asset handling. Not EVM-compatible, so a new ecosystem is needed. Focus on asset-oriented programming (resources). |
| Unique Innovations | Compiled EVM, IceDB, multi-stream consensus, BLS aggregation, transient storage – enabling extreme TPS and large state. Deterministic gas costs per storage access. Compression for bandwidth. Emphasis on real-time dApps (games/metaverse). | Security and decentralization – Ethereum prioritizes maximum decentralization and economic security (hundreds of thousands of validators, $20B+ staked). Pioneering features like Account Abstraction (ERC-4337) and the leading smart-contract ecosystem. The base layer has limited performance by design (scaling pushed to L2s). | Proof-of-History (a clock before consensus) to speed ordering; a highly optimized validator client. Parallel runtime for non-conflicting transactions. Solana’s differentiator is raw speed on a monolithic chain, but it requires powerful hardware (128+ GB RAM, high-end CPUs). Not EVM, which limits easy adoption by Ethereum devs. | Subnet flexibility – the ability to launch custom blockchains under Avalanche’s validator set, tailored to specific apps (e.g., their own gas token or rules). Fast finality via Avalanche consensus. However, C-Chain (EVM) performance is much lower than Somnia’s, and using multiple subnets sacrifices composability between apps. | Object-centric parallelism – Sui’s object model lets independent transactions execute concurrently, improving throughput when many transactions are unrelated. Also features transaction batching and causal ordering for certain transaction types. Move ensures asset safety (no accidental loss of tokens). Lower throughput than Somnia, but also focused on gaming (Sui emphasizes NFTs and lightweight games built in Move). |
| Decentralization Trade-offs | Starting with ~60–100 validators (foundation-selected initially, later token-holder elected). Hardware requirements are relatively high (comparable to a Solana/Aptos node), so it is not as permissionless as Ethereum, but sufficient for its use cases (the goal is to grow the validator set over time). Embraces “sufficient decentralization” for performance. | Very high decentralization (anyone can stake 32 ETH to run a validator; hundreds of thousands of independent validators). Security and censorship resistance are top-notch, but performance suffers; it needs L2s for scaling, which add complexity. | More centralized in practice: <2,500 validators, with a small number often producing most blocks. High hardware costs mean many participants run in data centers or on cloud providers (few home nodes). The network has experienced outages under high load in the past. | Fairly decentralized: ~1,000 validators; anyone can join by staking a minimum of ~2,000 AVAX. Avalanche consensus scales in validator count without slowing much. However, each subnet may form its own smaller validator set, possibly trading some security for performance. | Moderate decentralization: about 100 validators (similar scale to Somnia). Permissionless, but at genesis heavily backed by a few entities. Also uses delegated PoS. Sui’s approach resembles Somnia/Aptos: a new, relatively small validator set intended to grow. |
| Ecosystem & Adoption | Emerging – ~70 projects at launch, mainly gaming (Sparkball, Variance, etc.). Strong support from Improbable (metaverse events) and funding ($270M). Needs to prove itself with real user adoption post-launch. Integrated with major services (OpenSea, LayerZero) for a jump start. | Mature and vast – thousands of dApps, $20B+ TVL in DeFi, an established NFT market, and the largest developer pool. However, high-throughput games don’t run on Ethereum L1 – those projects use sidechains or L2s. Ethereum is the safe choice for general-purpose dApps, but not for real-time apps without L2s. | Growing (especially DeFi/NFTs) – Solana has a strong DeFi ecosystem (e.g., Raydium), an active NFT scene, and consumer Web3 efforts (the Saga phone). Some gaming projects run on Solana as well. It has real users (tens of millions of addresses) but has also had stability hiccups. Solana appeals to those who want L1 speed without sharding, at the cost of more centralized infrastructure. | Mature (especially in enterprise and niche verticals) – Avalanche has DeFi (Trader Joe, etc.) and gaming subnets (e.g., DeFi Kingdoms moved to an Avalanche subnet). Its strength is flexibility: projects can get their own chain. However, the primary C-Chain is limited by EVM performance; Somnia’s single chain can outpace it by orders of magnitude, while Avalanche scales by adding parallel chains whose composability relies on bridges. | New and focused on gaming/NFTs – Sui, like Somnia, positions itself for games and next-gen apps (it has demoed on-chain games too). Move is a barrier for some devs (not Solidity) but offers safety features. Its ecosystem in 2023 was in its infancy – a few game demos, NFTs, and basic DeFi. Somnia competes most directly with Sui/Aptos for Web3-gaming mindshare; Somnia has the EVM advantage (easier adoption), whereas Sui bets on Move’s safety and parallel design. |

In essence, Somnia’s closest analogs are Solana, Sui/Aptos, and perhaps specialized app-chains like certain Avalanche subnets or Polygon’s upcoming high-performance chains. Like Solana, Somnia forgoes extreme decentralization in favor of performance, but it differentiates itself by sticking with the EVM (helping it piggyback on Ethereum’s developer base) and by introducing a multi-stream consensus rather than a single leader at a time. Solana’s approach to parallelism (many runtime threads executing non-conflicting transactions concurrently) contrasts with Somnia’s approach (multiple validators each producing independent data streams, with execution accelerated on a very fast single core). Under correlated load (one hot contract), Somnia’s single-core optimization shines, whereas Solana’s parallelism throttles because all threads contend on the same state.

Compared to Ethereum mainnet, Somnia is orders of magnitude faster but sacrifices decentralization (100 validators vs Ethereum’s hundreds of thousands). Ethereum also has a far larger and battle-tested ecosystem. However, Ethereum cannot directly handle games or social apps at scale – those end up on L2s or sidechains. Somnia essentially positions itself as an alternative to an Ethereum rollup, one that is its own L1 with higher performance than any current rollup and without needing fraud proofs or separate security assumptions (aside from its smaller validator set). In the long run, Ethereum’s roadmap (sharding, danksharding, etc.) will increase throughput but likely not into the millions of TPS on L1. Instead, Ethereum bets on rollups; Somnia bets on scaling L1 itself with advanced engineering. They may not compete for the exact same use cases initially (DeFi might stay on Ethereum/L2, while games go to Somnia or similar chains). Interoperability (via LayerZero or others) might allow them to complement each other, with assets moving between Ethereum and Somnia as needed.

Avalanche offers subnets which, like Somnia, can be dedicated to games with high throughput. The difference is each Avalanche subnet is a separate instance (you’d need to spin up your own validators or recruit some validators to join it). Somnia instead provides a shared high-capacity chain, which makes interoperability between apps easier (all Somnia apps live on one chain, composable, like on Ethereum or Solana). Avalanche’s primary subnet (C-Chain) is EVM but much slower than Somnia. So Somnia outperforms Avalanche’s common chain by far, though Avalanche can scale if a project makes a custom subnet (but then that subnet might not have the full general composability or user base). For a developer, deploying on Somnia might be simpler than managing an Avalanche subnet, and you immediately tap into Somnia’s shared user pool and liquidity.

Sui (and Aptos) are often cited as next-gen high-TPS chains, using Move and parallel consensus. Somnia’s advantage over Sui is throughput (Sui hasn’t demonstrated millions TPS; their design is perhaps in the low hundreds of thousands at best) and EVM-compatibility. Sui’s advantage might be Move’s safety for complex asset logic and possibly a more decentralized roadmap (although at launch Sui also had around 100 validators). If Somnia captures the game studios that prefer using Solidity (maybe porting Solidity contracts from Ethereum game prototypes), it could outpace Sui in ecosystem quickly, given how large the Solidity developer community is.

Somnia also compares to Solana in aiming for consumer Web3 (both have emphasized social and phone integrations – Solana had a Saga phone, Somnia a browser, etc.). Herman Narula’s bold claim that Somnia can do “thousands of times the throughput of Solana” sets the tone that Somnia sees itself not just as another fast chain, but the fastest EVM chain where Solana is the fastest non-EVM chain. If Somnia delivers even an order of magnitude better sustained TPS than Solana in practice (say Solana does 5k TPS average and Somnia could do 50k or more average with peaks in the millions), it will genuinely carve a niche for applications that even Solana can’t handle (for example, a Fortnite-scale blockchain game or a global-scale social network).

One more competitor to note is Polygon 2.0 or zkEVMs – while not L1s, they offer scaling for EVM. Polygon is working on an array of ZK-rollups and high-performance chains. Those could potentially match some of Somnia’s performance while benefiting from Ethereum security. However, ZK-rollups with 1M TPS are not here yet, and even then, they might face data availability limits. Somnia’s approach is an all-in-one solution with its own security. It will have to prove that its security (100 validators PoS) is robust enough for big money applications, something Ethereum’s rollups inherently inherit from ETH. But for gaming and social, where security requirements are slightly different (stealing a game sword NFT isn’t as catastrophic as stealing billions in DeFi TVL), Somnia’s trade-off could be perfectly acceptable and even preferable due to user experience.

In conclusion, Somnia stands out by pushing the performance envelope further than any current general-purpose L1, while keeping the familiarity of EVM. It aims to occupy a space in the market for “Web3 at Web2 scale” that others have only partially addressed:

  • Ethereum will dominate trust and DeFi, but will offload high-frequency tasks to L2 (which add complexity and fragmentation).
  • Solana showed high TPS for DeFi and NFTs, but is not EVM and had stability issues; Somnia could attract projects that want Solana-like speed with Ethereum tooling.
  • Avalanche offers customizability and EVM comfort, but hasn’t demonstrated near Somnia’s single-chain performance.
  • Sui/Aptos are in the same generation as Somnia, competing for game developers, but Somnia’s early partnerships (Improbable, big brands) and EVM compatibility give it a strong edge if executed well.

As Narula said, Somnia is arguably the first chain built specifically for real-time virtual experiences at massive scale. If those experiences (games, events, social worlds) become the next big wave of blockchain adoption, Somnia’s competition might actually be traditional cloud infrastructure (AWS, etc.) as much as other blockchains – because it’s trying to replace centralized game servers and social databases, not just compete for existing blockchain apps. In that light, Somnia’s success will be measured by whether it can host applications that attract millions of users who perhaps don’t even know (or care) that a blockchain is running underneath. No current L1 has yet achieved that level of mainstream user app (even Solana’s biggest apps have hundreds of thousands, not millions of active users). That is the bar Somnia has set for itself, and against which its innovative architecture will be tested in the coming years.

Roadmap and Current Status

Somnia’s journey has rapidly progressed from concept to reality in a short time, and it continues to evolve post-mainnet with clear goals:

Recent Developments (2024–2025):

  • Funding and Testnet (2024): The project emerged from stealth backed by significant funding. In early 2024, Improbable announced the $270M commitment to Somnia and MSquared’s ecosystem. This provided a huge runway. Somnia ran a Devnet in late 2024 (Nov) where it broke records: achieving 1.05 million TPS and other benchmarks across a 100-node global setup. Those results (including 50k Uniswap trades/sec, 300k NFT mints/sec) were publicized to build credibility. Following Devnet, a fully public Testnet launched on Feb 20, 2025. The testnet (codenamed Shannon) ran for about 6 months. During that time, Somnia claims to have processed over 10 billion transactions and onboarded 118 million test wallet addresses – staggering figures. These numbers likely include scripted load tests and community participation. The testnet also saw a peak of 1.9 billion transactions in a single day (a record for any EVM context). CoinDesk reported these figures but noted that the public explorer was offline at the time, so they could not be independently verified, implying that some were internal metrics. Nonetheless, the testnet demonstrated stability under unprecedented load.

    Throughout testnet, Somnia ran engagement programs: a Points incentive program where early users completing tasks could earn points (likely convertible to future tokens or rewards), and collaborated with partners (game developers did playtests, hackathons were held). The testnet phase was also when 70+ ecosystem partners/projects were onboarded. This indicates that by mainnet, a lot of integrations and apps were ready or near-ready.

  • Mainnet Launch (Q3 2025): Somnia launched mainnet on September 2, 2025. The launch included the release of the SOMI token and the enabling of staking. Notably, at mainnet:

    • 60 validators came online (with big names like Google Cloud among them).
    • The Somnia Foundation is operational, overseeing the chain as a neutral steward. Improbable delivered the tech and now the Foundation (also referred to as the Virtual Society Foundation) is in charge of governance and development forward.
    • SOMI listing and distribution: Within a day of launch, Binance revealed SOMI as part of its “Seed Tag” listings and did the HODLer airdrop. This was a huge boost – effectively a top exchange endorsement. Many new L1s struggle to get exchange traction, but Somnia immediately got SOMI into users’ hands via Binance.
    • On social media, Somnia’s team and partners touted the mainnet’s capabilities. A press release from Improbable and coverage in outlets like CoinDesk, Yahoo Finance, etc., spread the word that “the fastest EVM chain” is live.
    • Initial ecosystem dApps began deployment. For example, the NFT bridging via LayerZero was active (one could bridge stablecoins as per docs), and some of the testnet games started moving to mainnet (Sparkball’s launch, etc., around September as indicated by blogs and updates).
    • Community airdrop events (the Somnia Odyssey) likely culminated around launch, distributing some of that Community token allocation to early supporters.

In summary, mainnet launch was successful and positioned Somnia with live validators, a live token, and >70 projects either live or imminently launching. Importantly, they hit the market exactly as interest in Web3 gaming and metaverse was picking up again in late 2025, leveraging that trend.

Current Status (Late 2025): Somnia mainnet is operational with sub-second blocks. The network is still in a bootstrap phase where the Somnia Foundation and core team maintain significant control to ensure stability. For example, governance proposals are likely not fully open yet; the foundation is probably managing upgrades and parameter tweaks while the community is being educated on governance processes. The token distribution is still very concentrated (since only ~16% is circulating and investors/team tokens won’t start unlocking until late 2026). This means the Foundation has ample token reserves to support the ecosystem (via grants, liquidity provision, etc.).

On the technical front, Somnia is likely monitoring and fine-tuning performance under real conditions. Are real dApps pushing it to its limits? Possibly not yet – initial user counts are probably in the thousands, not millions. So there may not be 1M TPS happening on mainnet regularly, but the capacity is there. The team might use this period to optimize the client software, incorporate any feedback from Cuthbert (if any divergences were found, those would be fixed promptly), and harden security. The security audit results (if not already released) might be published around this time or early 2026 to assure developers of safety.

Near-Term Roadmap (2026): The Somnia documentation and communications hint at several near-term goals:

  • Feature Rollouts: Some features were planned to activate after launch:
    • The Dynamic Gas Pricing & Volume Discounts are slated to roll out by end of 2025. This requires some testing and perhaps governance approval to turn on. Once enabled, high-throughput dApps will start enjoying cheaper gas, which could be a selling point to attract enterprise or big Web2 partners.
    • The Transient Storage feature is also scheduled for late 2025. The implementation likely needs to be carefully tested (ensuring data deletion works correctly and doesn’t introduce consensus issues). When this goes live, Somnia will be one of the first chains to offer expirable on-chain data, which will be huge for game devs (imagine temporary game sessions on-chain).
    • Tipping (priority fees): They noted tipping might be introduced later if needed. If network usage increases to where blocks are consistently full, by 2026 they might enable optional tips to prioritize transactions (just like Ethereum’s base fee & tip model). This would be a sign of healthy congestion if it happens.
    • Validator Set Expansion: Initially ~60, the goal is to increase the number of validators over time to improve decentralization without hurting performance. They mentioned expecting growth beyond 100 as the network matures. The timeline might depend on how well the consensus scales with more validators (PBFT tends to get slower as validators increase, but maybe their Autobahn-inspired variant can handle a few hundred). In 2026, they might onboard additional validators, possibly from their community or new partners. This could be done through governance votes (token holders approving new validators) or automatically if enough stake is backing new entrants.
    • Decentralizing Governance: Somnia laid out a Progressive Decentralization roadmap in governance. In the first 6 months (bootstrap phase), the Foundation board is fully in control. So roughly until Q1/Q2 2026, we’ll be in bootstrap – during which they likely refine processes and onboard members to councils. Then from 6–24 months (mid-2026 to late 2027), they enter Transition phase where the Token House (token holders) can start voting on proposals, though the Foundation can veto if needed. We might see the first on-chain votes in 2026 for things like grant allocations or minor parameter changes. By year 2 (2027), the aim is Mature phase where token holder decisions mostly stand and Foundation only does emergency interventions. So for 2026, one key goal is establishing those governance bodies: possibly electing members to the Validator Council, Developer Council, User Assembly that were described. This will involve community organization – likely something the Foundation will facilitate by selecting reputable members initially (for example, inviting top game devs to a dev council, or big community guild leaders to a user assembly).
  • Ecosystem Growth: On the adoption front, 2026 will be about turning pilot projects into mainstream successes:
    • We expect full game releases: Sparkball and Variance might go from beta to official launch on Somnia mainnet in 2026, aiming to attract tens of thousands of players. Other games from the Dream Catalyst cohort (Maelstrom, Netherak, Dark Table, etc.) will likely roll out to the public. Somnia’s team will support these launches, possibly via marketing campaigns, tournaments, and incentive programs (like play-to-earn or airdrops) to draw gamers in.
    • New partnerships: Improbable/MSquared planned to scale from 30 events in 2023 to 300+ metaverse events in 2024. In 2024 they did many events off-chain; in 2025/2026, we expect those events to integrate Somnia. For example, perhaps a major sports event or music festival in 2026 will use Somnia for ticketing or fan rewards. Google Cloud’s involvement suggests possible enterprise events or showcases via Google’s cloud clients. Also, given Mirana (associated with Bybit/BitDAO) and others invested, Somnia might see collaboration with exchanges or big Web3 brands to utilize the network.
    • MSquared Integration: The chainwire release noted M² plans to integrate Somnia into its network of metaverses. That means any virtual world using MSquared’s tech could adopt Somnia as its transaction layer. By 2026, we might see MSquared formally launch its metaverse network with Somnia underpinning avatar identity, item trading, etc. If Yuga Labs’ Otherside is still on track, perhaps an interoperability demonstration with Somnia will occur (e.g., use your Otherside NFT in a Somnia-powered world).
    • Developer Community Expansion: The $10M grants will be distributed over time – by 2026, likely dozens of projects will have received funding. The output of that could be more tools (say, Unity SDK for Somnia, or more Ormi improvements), more apps (maybe someone builds a Somnia-based decentralized Twitter or a new DeFi platform). Somnia will probably hold more hackathons (potentially some in-person at conferences, etc.) and continue aggressive devrel to attract talent. They might especially target developers from Ethereum who are hitting scaling limits with their dApps, offering them an easy port to Somnia.
    • Interoperability and Bridges: Already integrated with LayerZero, Somnia will likely expand bridges to other ecosystems for broader asset support. For instance, integration with Polygon or Cosmos IBC could be on the table. Also, cross-chain standards for NFTs (maybe letting Ethereum NFTs mirror onto Somnia for usage in games) could be pursued. Since Somnia is EVM, deploying bridge contracts for popular tokens (USDC, USDT, WETH) is straightforward – 2026 could see deeper liquidity as more of these cross-chain assets flow in.
    • Performance Monitoring: As more real usage comes, the team will monitor for any stability issues. Are there any attack vectors (spamming many data chains, etc.)? They might implement refinements like rate-limits per data chain or further optimizations if needed. The Cuthbert dual execution will likely run until at least 2026 to catch any divergence; if the system proves very stable, they might consider turning it off to reduce overhead after a year or two, but that is contingent on full confidence.
  • Marketing and Outreach: With mainnet and initial apps live, Somnia’s challenge for 2026 is building a user base. Expect heavy marketing aimed at gamers and crypto users alike:
    • We might see partnerships with gaming guilds or esports teams, to drive players to Somnia games.
    • Perhaps celebrity collaborations for virtual events (given they did K-Pop and sports legends in test events, they could escalate that – imagine a famous musician releasing an album through a Somnia metaverse show with NFT merch).
    • Also, attending and sponsoring major conferences (GDC for game devs, Consensus for crypto, etc.) to promote the platform.
    • By late 2025, they already had significant press (Binance Academy article, CoinDesk coverage, etc.). In 2026, more independent analyses (Messari profiles, etc.) will come out, and Somnia will want to showcase usage metrics to prove traction (like “X daily active users, Y transactions processed”).

Longer-Term Vision: Though not explicitly asked, it’s worth noting Somnia’s trajectory:

  • In a few years, they imagine Somnia as a widely-used base layer for Web3 entertainment, with billions of transactions as routine, and a decentralized governance run by its community and councils. They also likely foresee continuous technical improvement – e.g., exploring sharding if needed, or adopting new cryptography (maybe zk-proofs to compress data even more, or post-quantum crypto eventually).
  • Another long-term goal might be carbon neutrality or efficiency: high TPS chains often worry about energy usage. If Somnia reaches millions of TPS, ensuring nodes can handle it efficiently (maybe through hardware acceleration or cloud scaling) will be important. With Google Cloud in the mix, perhaps green data center initiatives or special hardware (like GPUs or FPGAs for compression) could be considered.
  • By then, competition will also step up (Ethereum 2.0 with sharding, zkEVMs, Solana improvements, etc.). Somnia will have to keep its edge through innovation and network effects (if it captures a large player base early, that momentum can carry it).

In summary, the roadmap for the next 1-2 years focuses on:

  1. Activating key protocol features (gas discounts, transient storage) to fully deliver promised functionality.
  2. Decentralizing governance gradually – moving from foundation-led to community-led without jeopardizing progress.
  3. Driving ecosystem growth – ensuring the funded projects launch and attract users, forging new partnerships (with content creators, game studios, maybe even Web2 companies interested in Web3), and possibly expanding into more regions and communities.
  4. Maintaining performance and security as usage scales – watching for any issues when, say, a game drives a spike of 10k TPS of real traffic, and responding accordingly (this might include running more public test events, maybe a “Mainnet stress test” event where they encourage tons of transactions to test limits).

Somnia has made a splashy debut, but 2026 will be the proving ground: It needs to convert its impressive technology and well-funded ecosystem into real adoption and a sustainable, decentralized network. The foundation’s large token treasury (Ecosystem and Community ~55% of supply) gives it the means to bootstrap activity for years, so in the near-term we’ll see those tokens put to use – via airdrops, rewards (possibly liquidity mining if a DEX launches), developer bounties, and user acquisition campaigns. The mainnet launch slogan from Improbable was that Somnia “marks the foundation of an open digital asset economy, where billions of people can interact across immersive experiences”. The next steps on the roadmap are all about laying the bricks of that foundation: getting the first millions of people and first killer apps to engage with Somnia’s “dream computer” (as they dub it), and thereby validating that Web3 can indeed operate at internet scale.

If Somnia continues on its current trajectory, by the end of 2026 we could see dozens of fully on-chain games and social platforms running, a flourishing community-run network with hundreds of validators, and SOMI being used daily by mainstream users (often unknowingly, under the hood of games). Achieving that would mark a significant milestone not just for Somnia but for the blockchain industry’s push into mainstream, real-time applications. The pieces are in place; now it’s about execution and adoption in this critical phase of the project’s roadmap.

Sources:

  • Somnia Official Documentation (Litepaper & Technical Concepts)
  • Somnia Tokenomics and Governance Docs
  • Improbable Press Release (Mainnet Launch)
  • CoinDesk Coverage of Somnia Launch
  • Binance Academy – What is Somnia (SOMI)
  • Gam3s.gg – Coverage of Somnia Games (Variance, Sparkball, etc.)
  • Stakin Research – Introduction to Somnia
  • Chainwire Press Release – $270M Investment & Devnet results
  • Somnia Blog – Improbable & MSquared Events, Mainnet News
  • Official Somnia Docs – Developer Guides (bridging, wallets, etc.)

Building Decentralized Encryption with @mysten/seal: A Developer's Tutorial

· 13 min read
Dora Noda
Software Engineer

Privacy is becoming public infrastructure. In 2025, developers need tools that make encryption as easy as storing data. Mysten Labs' Seal provides exactly that—decentralized secrets management with onchain access control. This tutorial will teach you how to build secure Web3 applications using identity-based encryption, threshold security, and programmable access policies.


Introduction: Why Seal Matters for Web3

Traditional cloud applications rely on centralized key management systems where a single provider controls access to encrypted data. While convenient, this creates dangerous single points of failure. If the provider is compromised, goes offline, or decides to restrict access, your data becomes inaccessible or vulnerable.

Seal changes this paradigm entirely. Built by Mysten Labs for the Sui blockchain, Seal is a decentralized secrets management (DSM) service that enables:

  • Identity-based encryption where content is protected before it leaves your environment
  • Threshold encryption that distributes key access across multiple independent nodes
  • Onchain access control with time locks, token-gating, and custom authorization logic
  • Storage agnostic design that works with Walrus, IPFS, or any storage solution

Whether you're building secure messaging apps, gated content platforms, or time-locked asset transfers, Seal provides the cryptographic primitives and access control infrastructure you need.


Getting Started

Prerequisites

Before diving in, ensure you have:

  • Node.js 18+ installed
  • Basic familiarity with TypeScript/JavaScript
  • A Sui wallet for testing (like Sui Wallet)
  • Understanding of blockchain concepts

Installation

Install the Seal SDK via npm:

npm install @mysten/seal

You'll also want the Sui SDK for blockchain interactions:

npm install @mysten/sui

Project Setup

Create a new project and initialize it:

mkdir seal-tutorial
cd seal-tutorial
npm init -y
npm install @mysten/seal @mysten/sui typescript @types/node

Create a simple TypeScript configuration:

// tsconfig.json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  }
}

Core Concepts: How Seal Works

Before writing code, let's understand Seal's architecture:

1. Identity-Based Encryption (IBE)

Unlike traditional encryption where you encrypt to a public key, IBE lets you encrypt to an identity (like an email address or Sui address). The recipient can only decrypt if they can prove they control that identity.

2. Threshold Encryption

Instead of trusting a single key server, Seal uses t-of-n threshold schemes. You might configure 3-of-5 key servers, meaning any 3 servers can cooperate to provide decryption keys, but 2 or fewer cannot.

3. Onchain Access Control

Access policies are enforced by Sui smart contracts. Before a key server provides decryption keys, it verifies that the requestor meets the onchain policy requirements (token ownership, time constraints, etc.).

4. Key Server Network

Distributed key servers validate access policies and generate decryption keys. These servers are operated by different parties to ensure no single point of control.


Basic Implementation: Your First Seal Application

Let's build a simple application that encrypts sensitive data and controls access through Sui blockchain policies.

Step 1: Initialize the Seal Client

// src/seal-client.ts
import { SealClient } from '@mysten/seal';
import { SuiClient } from '@mysten/sui/client';

export async function createSealClient() {
// Initialize Sui client for testnet
const suiClient = new SuiClient({
url: 'https://fullnode.testnet.sui.io'
});

// Configure Seal client with testnet key servers
const sealClient = new SealClient({
suiClient,
keyServers: [
'https://keyserver1.seal-testnet.com',
'https://keyserver2.seal-testnet.com',
'https://keyserver3.seal-testnet.com'
],
threshold: 2, // 2-of-3 threshold
network: 'testnet'
});

return { sealClient, suiClient };
}

Step 2: Simple Encryption/Decryption

// src/basic-encryption.ts
import { createSealClient } from './seal-client';

async function basicExample() {
const { sealClient } = await createSealClient();

// Data to encrypt
const sensitiveData = "This is my secret message!";
const recipientAddress = "0x742d35cc6d4c0c08c0f9bf3c9b2b6c64b3b4f5c6d7e8f9a0b1c2d3e4f5a6b7c8";

try {
// Encrypt data for a specific Sui address
const encryptedData = await sealClient.encrypt({
data: Buffer.from(sensitiveData, 'utf-8'),
recipientId: recipientAddress,
// Optional: add metadata
metadata: {
contentType: 'text/plain',
timestamp: Date.now()
}
});

console.log('Encrypted data:', {
ciphertext: encryptedData.ciphertext.toString('base64'),
encryptionId: encryptedData.encryptionId
});

// Later, decrypt the data (requires proper authorization)
const decryptedData = await sealClient.decrypt({
ciphertext: encryptedData.ciphertext,
encryptionId: encryptedData.encryptionId,
recipientId: recipientAddress
});

console.log('Decrypted data:', decryptedData.toString('utf-8'));

} catch (error) {
console.error('Encryption/decryption failed:', error);
}
}

basicExample();

Access Control with Sui Smart Contracts

The real power of Seal comes from programmable access control. Let's create a time-locked encryption example where data can only be decrypted after a specific time.

Step 1: Deploy Access Control Contract

First, we need a Move smart contract that defines our access policy:

// contracts/time_lock.move
module time_lock::policy {
use sui::clock::{Self, Clock};
use sui::object::{Self, UID};
use sui::tx_context::{Self, TxContext};

public struct TimeLockPolicy has key, store {
id: UID,
unlock_time: u64,
authorized_user: address,
}

public fun create_time_lock(
unlock_time: u64,
authorized_user: address,
ctx: &mut TxContext
): TimeLockPolicy {
TimeLockPolicy {
id: object::new(ctx),
unlock_time,
authorized_user,
}
}

public fun can_decrypt(
policy: &TimeLockPolicy,
user: address,
clock: &Clock
): bool {
let current_time = clock::timestamp_ms(clock);
policy.authorized_user == user && current_time >= policy.unlock_time
}
}

Step 2: Integrate with Seal

// src/time-locked-encryption.ts
import { createSealClient } from './seal-client';
import { TransactionBlock } from '@mysten/sui/transactions';

async function createTimeLocked() {
const { sealClient, suiClient } = await createSealClient();

// Create access policy on Sui
const txb = new TransactionBlock();

const unlockTime = Date.now() + 60000; // Unlock in 1 minute
const authorizedUser = "0x742d35cc6d4c0c08c0f9bf3c9b2b6c64b3b4f5c6d7e8f9a0b1c2d3e4f5a6b7c8";

txb.moveCall({
target: 'time_lock::policy::create_time_lock',
arguments: [
txb.pure(unlockTime),
txb.pure(authorizedUser)
]
});

// Execute transaction to create policy
const result = await suiClient.signAndExecuteTransactionBlock({
transactionBlock: txb,
signer: yourKeypair, // Your Sui keypair
});

const policyId = result.objectChanges?.find(
change => change.type === 'created'
)?.objectId;

// Now encrypt with this policy
const sensitiveData = "This will unlock in 1 minute!";

const encryptedData = await sealClient.encrypt({
data: Buffer.from(sensitiveData, 'utf-8'),
recipientId: authorizedUser,
accessPolicy: {
policyId,
policyType: 'time_lock'
}
});

console.log('Time-locked data created. Try decrypting after 1 minute.');

return {
encryptedData,
policyId,
unlockTime
};
}

Practical Examples

Example 1: Secure Messaging Application

// src/secure-messaging.ts
import { createSealClient } from './seal-client';

class SecureMessenger {
private sealClient: any;

constructor(sealClient: any) {
this.sealClient = sealClient;
}

async sendMessage(
message: string,
recipientAddress: string,
senderKeypair: any
) {
const messageData = {
content: message,
timestamp: Date.now(),
sender: senderKeypair.toSuiAddress(),
messageId: crypto.randomUUID()
};

const encryptedMessage = await this.sealClient.encrypt({
data: Buffer.from(JSON.stringify(messageData), 'utf-8'),
recipientId: recipientAddress,
metadata: {
type: 'secure_message',
sender: senderKeypair.toSuiAddress()
}
});

// Store encrypted message on decentralized storage (Walrus)
return this.storeOnWalrus(encryptedMessage);
}

async readMessage(encryptionId: string, recipientKeypair: any) {
// Retrieve from storage
const encryptedData = await this.retrieveFromWalrus(encryptionId);

// Decrypt with Seal
const decryptedData = await this.sealClient.decrypt({
ciphertext: encryptedData.ciphertext,
encryptionId: encryptedData.encryptionId,
recipientId: recipientKeypair.toSuiAddress()
});

return JSON.parse(decryptedData.toString('utf-8'));
}

private async storeOnWalrus(data: any) {
// Integration with Walrus storage
// This would upload the encrypted data to Walrus
// and return the blob ID for retrieval
}

private async retrieveFromWalrus(blobId: string) {
// Retrieve encrypted data from Walrus using blob ID
}
}
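
The storeOnWalrus and retrieveFromWalrus methods above are intentionally left as stubs. Below is a minimal sketch of how they might be filled in, assuming a Walrus-style HTTP publisher/aggregator pair; the endpoint URLs, environment variable names, and response shape are placeholders rather than the official Walrus API, so adapt them to whatever storage gateway you actually use.

// src/walrus-helpers.ts (illustrative sketch only)
const WALRUS_PUBLISHER = process.env.WALRUS_PUBLISHER_URL ?? 'https://publisher.example.com';
const WALRUS_AGGREGATOR = process.env.WALRUS_AGGREGATOR_URL ?? 'https://aggregator.example.com';

// Upload encrypted bytes; the gateway's response includes a blob ID, but field
// names vary by version, so the raw JSON is returned for the caller to inspect.
export async function storeBlobSketch(ciphertext: Uint8Array): Promise<unknown> {
  const response = await fetch(`${WALRUS_PUBLISHER}/v1/blobs`, {
    method: 'PUT',
    body: ciphertext
  });
  if (!response.ok) {
    throw new Error(`Walrus store failed with status ${response.status}`);
  }
  return response.json();
}

// Fetch the encrypted bytes back by blob ID.
export async function retrieveBlobSketch(blobId: string): Promise<Uint8Array> {
  const response = await fetch(`${WALRUS_AGGREGATOR}/v1/blobs/${blobId}`);
  if (!response.ok) {
    throw new Error(`Walrus retrieve failed with status ${response.status}`);
  }
  return new Uint8Array(await response.arrayBuffer());
}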

Example 2: Token-Gated Content Platform

// src/gated-content.ts
import { createSealClient } from './seal-client';

class ContentGating {
private sealClient: any;
private suiClient: any;

constructor(sealClient: any, suiClient: any) {
this.sealClient = sealClient;
this.suiClient = suiClient;
}

async createGatedContent(
content: string,
requiredNftCollection: string,
creatorKeypair: any
) {
// Create NFT ownership policy
const accessPolicy = await this.createNftPolicy(
requiredNftCollection,
creatorKeypair
);

// Encrypt content with NFT access requirement
const encryptedContent = await this.sealClient.encrypt({
data: Buffer.from(content, 'utf-8'),
recipientId: 'nft_holders', // Special recipient for NFT holders
accessPolicy: {
policyId: accessPolicy.policyId,
policyType: 'nft_ownership'
}
});

return {
contentId: encryptedContent.encryptionId,
accessPolicy: accessPolicy.policyId
};
}

async accessGatedContent(
contentId: string,
userAddress: string,
userKeypair: any
) {
// Verify NFT ownership first
const hasAccess = await this.verifyNftOwnership(
userAddress,
contentId
);

if (!hasAccess) {
throw new Error('Access denied: Required NFT not found');
}

// Decrypt content
const decryptedContent = await this.sealClient.decrypt({
encryptionId: contentId,
recipientId: userAddress
});

return decryptedContent.toString('utf-8');
}

private async createNftPolicy(collection: string, creator: any) {
// Create Move contract that checks NFT ownership
// Returns policy object ID
}

private async verifyNftOwnership(user: string, contentId: string) {
// Check if user owns required NFT
// Query Sui for NFT ownership
}
}
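
The verifyNftOwnership stub above can be implemented with a standard owned-objects query against Sui. The sketch below assumes the required collection is identified by its Move struct type string (the type shown is a placeholder) and, for brevity, only inspects the first page of results.

// Illustrative sketch of an NFT ownership check (the struct type is a placeholder).
import { SuiClient } from '@mysten/sui/client';

async function ownsNftFromCollection(
  suiClient: SuiClient,
  owner: string,
  collectionStructType: string // e.g. '0xPACKAGE::collection::Nft'
): Promise<boolean> {
  // Query objects owned by the user, filtered to the collection's struct type.
  const owned = await suiClient.getOwnedObjects({
    owner,
    filter: { StructType: collectionStructType },
    options: { showType: true }
  });

  // Only the first page is checked here; paginate with owned.nextCursor if needed.
  return owned.data.length > 0;
}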

Example 3: Time-Locked Asset Transfer

// src/time-locked-transfer.ts
import { createSealClient } from './seal-client';

async function createTimeLockTransfer(
assetData: any,
recipientAddress: string,
unlockTimestamp: number,
senderKeypair: any
) {
const { sealClient, suiClient } = await createSealClient();

// Create time-lock policy on Sui
const timeLockPolicy = await createTimeLockPolicy(
unlockTimestamp,
recipientAddress,
senderKeypair,
suiClient
);

// Encrypt asset transfer data
const transferData = {
asset: assetData,
recipient: recipientAddress,
unlockTime: unlockTimestamp,
transferId: crypto.randomUUID()
};

const encryptedTransfer = await sealClient.encrypt({
data: Buffer.from(JSON.stringify(transferData), 'utf-8'),
recipientId: recipientAddress,
accessPolicy: {
policyId: timeLockPolicy.policyId,
policyType: 'time_lock'
}
});

console.log(`Asset locked until ${new Date(unlockTimestamp)}`);

return {
transferId: encryptedTransfer.encryptionId,
unlockTime: unlockTimestamp,
policyId: timeLockPolicy.policyId
};
}

async function claimTimeLockTransfer(
transferId: string,
recipientKeypair: any
) {
const { sealClient } = await createSealClient();

try {
const decryptedData = await sealClient.decrypt({
encryptionId: transferId,
recipientId: recipientKeypair.toSuiAddress()
});

const transferData = JSON.parse(decryptedData.toString('utf-8'));

// Process the asset transfer
console.log('Asset transfer unlocked:', transferData);

return transferData;
} catch (error) {
console.error('Transfer not yet unlocked or access denied:', error);
throw error;
}
}

Integration with Walrus Decentralized Storage

Seal works seamlessly with Walrus, Sui's decentralized storage solution. Here's how to integrate both:

// src/walrus-integration.ts
import { createSealClient } from './seal-client';

class SealWalrusIntegration {
private sealClient: any;
private walrusClient: any;

constructor(sealClient: any, walrusClient: any) {
this.sealClient = sealClient;
this.walrusClient = walrusClient;
}

async storeEncryptedData(
data: Buffer,
recipientAddress: string,
accessPolicy?: any
) {
// Encrypt with Seal
const encryptedData = await this.sealClient.encrypt({
data,
recipientId: recipientAddress,
accessPolicy
});

// Store encrypted data on Walrus
const blobId = await this.walrusClient.store(
encryptedData.ciphertext
);

// Return reference that includes both Seal and Walrus info
return {
blobId,
encryptionId: encryptedData.encryptionId,
accessPolicy: encryptedData.accessPolicy
};
}

async retrieveAndDecrypt(
blobId: string,
encryptionId: string,
userKeypair: any
) {
// Retrieve from Walrus
const encryptedData = await this.walrusClient.retrieve(blobId);

// Decrypt with Seal
const decryptedData = await this.sealClient.decrypt({
ciphertext: encryptedData,
encryptionId,
recipientId: userKeypair.toSuiAddress()
});

return decryptedData;
}
}

// Usage example
async function walrusExample() {
const { sealClient } = await createSealClient();
const walrusClient = new WalrusClient('https://walrus-testnet.sui.io');

const integration = new SealWalrusIntegration(sealClient, walrusClient);

const fileData = Buffer.from('Important document content');
const recipientAddress = '0x...';

// Store encrypted
const result = await integration.storeEncryptedData(
fileData,
recipientAddress
);

console.log('Stored with Blob ID:', result.blobId);

// Later, retrieve and decrypt
const decrypted = await integration.retrieveAndDecrypt(
result.blobId,
result.encryptionId,
recipientKeypair
);

console.log('Retrieved data:', decrypted.toString());
}

Threshold Encryption Advanced Configuration

For production applications, you'll want to configure custom threshold encryption with multiple key servers:

// src/advanced-threshold.ts
import { SealClient } from '@mysten/seal';

async function setupProductionSeal() {
// Configure with multiple independent key servers
const keyServers = [
'https://keyserver-1.your-org.com',
'https://keyserver-2.partner-org.com',
'https://keyserver-3.third-party.com',
'https://keyserver-4.backup-provider.com',
'https://keyserver-5.fallback.com'
];

const sealClient = new SealClient({
keyServers,
threshold: 3, // 3-of-5 threshold
network: 'mainnet',
// Advanced options
retryAttempts: 3,
timeoutMs: 10000,
backupKeyServers: [
'https://backup-1.emergency.com',
'https://backup-2.emergency.com'
]
});

return sealClient;
}

async function robustEncryption() {
const sealClient = await setupProductionSeal();

const criticalData = "Mission critical encrypted data";

// Encrypt with high security guarantees
const encrypted = await sealClient.encrypt({
data: Buffer.from(criticalData, 'utf-8'),
recipientId: '0x...',
// Require all 5 servers for maximum security
customThreshold: 5,
// Add redundancy
redundancy: 2,
accessPolicy: {
// Multi-factor requirements
requirements: ['nft_ownership', 'time_lock', 'multisig_approval']
}
});

return encrypted;
}

Security Best Practices

1. Key Management

// src/security-practices.ts

// GOOD: Generate keys with the Sui SDK rather than rolling your own derivation
import { Ed25519Keypair } from '@mysten/sui/keypairs/ed25519';

const generatedKeypair = new Ed25519Keypair();

// GOOD: Load keys from secure storage (example with environment variables)
const keypairFromEnv = Ed25519Keypair.fromSecretKey(
  process.env.PRIVATE_KEY!
);

// BAD: Never hardcode keys
const badKeypair = Ed25519Keypair.fromSecretKey(
  'hardcoded-secret-key-12345' // Don't do this!
);

2. Access Policy Validation

// Always validate access policies before encryption
import { isValidSuiAddress } from '@mysten/sui/utils';
import { createSealClient } from './seal-client';

async function secureEncrypt(data: Buffer, recipient: string, policyId: string) {
  const { sealClient } = await createSealClient();

  // Validate the recipient address format
  if (!isValidSuiAddress(recipient)) {
    throw new Error('Invalid recipient address');
  }

  // Check the policy exists and is valid (validateAccessPolicy is an app-defined helper)
  const policy = await validateAccessPolicy(policyId);
  if (!policy.isValid) {
    throw new Error('Invalid access policy');
  }

  return sealClient.encrypt({
    data,
    recipientId: recipient,
    accessPolicy: policy
  });
}

3. Error Handling and Fallbacks

// Robust error handling
async function resilientDecrypt(encryptionId: string, userKeypair: any) {
const { sealClient } = await createSealClient();

try {
return await sealClient.decrypt({
encryptionId,
recipientId: userKeypair.toSuiAddress()
});
} catch (error) {
if (error.code === 'ACCESS_DENIED') {
throw new Error('Access denied: Check your permissions');
} else if (error.code === 'KEY_SERVER_UNAVAILABLE') {
// Try with backup configuration
return await retryWithBackupServers(encryptionId, userKeypair);
} else if (error.code === 'THRESHOLD_NOT_MET') {
throw new Error('Insufficient key servers available');
} else {
throw new Error(`Decryption failed: ${error.message}`);
}
}
}
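
The retryWithBackupServers helper referenced above is not defined in this tutorial. One possible shape, reusing the same (assumed) client configuration style as the earlier createSealClient example, is sketched below; the backup key server URLs are placeholders.

import { SealClient } from '@mysten/seal';
import { SuiClient } from '@mysten/sui/client';

// Rebuild a client against an alternate set of key servers and retry the decrypt.
async function retryWithBackupServers(encryptionId: string, userKeypair: any) {
  const suiClient = new SuiClient({
    url: 'https://fullnode.testnet.sui.io'
  });

  const backupClient = new SealClient({
    suiClient,
    keyServers: [
      'https://backup-keyserver-1.example.com',
      'https://backup-keyserver-2.example.com',
      'https://backup-keyserver-3.example.com'
    ],
    threshold: 2, // 2-of-3 across the backup set
    network: 'testnet'
  });

  return backupClient.decrypt({
    encryptionId,
    recipientId: userKeypair.toSuiAddress()
  });
}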

4. Data Validation

// Validate data before encryption
function validateDataForEncryption(data: Buffer): boolean {
// Check size limits
if (data.length > 1024 * 1024) { // 1MB limit
throw new Error('Data too large for encryption');
}

// Check for sensitive patterns (optional)
const dataStr = data.toString();
if (containsSensitivePatterns(dataStr)) {
console.warn('Warning: Data contains potentially sensitive patterns');
}

return true;
}
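
The containsSensitivePatterns helper used above is left undefined. A minimal sketch is shown below; the patterns are purely illustrative and should be extended to match your own data-handling policies.

// Illustrative-only pattern check for obviously sensitive material.
function containsSensitivePatterns(text: string): boolean {
  const patterns = [
    /-----BEGIN (RSA |EC )?PRIVATE KEY-----/, // PEM-encoded private keys
    /\b\d{3}-\d{2}-\d{4}\b/, // US SSN-like numbers
    /\b(?:\d[ -]*?){13,16}\b/ // card-number-like digit runs
  ];
  return patterns.some((pattern) => pattern.test(text));
}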

Performance Optimization

1. Batching Operations

// Batch multiple encryptions for efficiency
async function batchEncrypt(dataItems: Buffer[], recipients: string[]) {
const { sealClient } = await createSealClient();

const promises = dataItems.map((data, index) =>
sealClient.encrypt({
data,
recipientId: recipients[index]
})
);

return Promise.all(promises);
}

2. Caching Key Server Responses

// Cache key server sessions to reduce latency
class OptimizedSealClient {
private sessionCache = new Map();

async encryptWithCaching(data: Buffer, recipient: string) {
let session = this.sessionCache.get(recipient);

if (!session || this.isSessionExpired(session)) {
session = await this.createNewSession(recipient);
this.sessionCache.set(recipient, session);
}

return this.encryptWithSession(data, session);
}
}
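
The class above assumes a few helpers (isSessionExpired, createNewSession, encryptWithSession) that are not shown. The sketch below illustrates the simplest of them, a time-to-live check on cached sessions; the session shape and the one-hour TTL are illustrative choices, not part of the Seal SDK.

// Minimal TTL bookkeeping for cached key-server sessions (illustrative only).
interface CachedSession {
  createdAt: number; // epoch milliseconds when the session was established
  // ...plus whatever per-recipient state you decide to reuse between calls
}

const SESSION_TTL_MS = 60 * 60 * 1000; // expire sessions after one hour

function isSessionExpired(session: CachedSession): boolean {
  return Date.now() - session.createdAt > SESSION_TTL_MS;
}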

Testing Your Seal Integration

Unit Testing

// tests/seal-integration.test.ts
import { describe, it, expect } from '@jest/globals';
import { createSealClient } from '../src/seal-client';

describe('Seal Integration', () => {
it('should encrypt and decrypt data successfully', async () => {
const { sealClient } = await createSealClient();
const testData = Buffer.from('test message');
const recipient = '0x742d35cc6d4c0c08c0f9bf3c9b2b6c64b3b4f5c6d7e8f9a0b1c2d3e4f5a6b7c8';

const encrypted = await sealClient.encrypt({
data: testData,
recipientId: recipient
});

expect(encrypted.encryptionId).toBeDefined();
expect(encrypted.ciphertext).toBeDefined();

const decrypted = await sealClient.decrypt({
ciphertext: encrypted.ciphertext,
encryptionId: encrypted.encryptionId,
recipientId: recipient
});

expect(decrypted.toString()).toBe('test message');
});

it('should enforce access control policies', async () => {
// Test that unauthorized users cannot decrypt
const { sealClient } = await createSealClient();

const encrypted = await sealClient.encrypt({
data: Buffer.from('secret'),
recipientId: 'authorized-user'
});

await expect(
sealClient.decrypt({
ciphertext: encrypted.ciphertext,
encryptionId: encrypted.encryptionId,
recipientId: 'unauthorized-user'
})
).rejects.toThrow('Access denied');
});
});

Deployment to Production

Environment Configuration

// config/production.ts
export const productionConfig = {
keyServers: [
process.env.KEY_SERVER_1,
process.env.KEY_SERVER_2,
process.env.KEY_SERVER_3,
process.env.KEY_SERVER_4,
process.env.KEY_SERVER_5
],
threshold: 3,
network: 'mainnet',
suiRpc: process.env.SUI_RPC_URL,
walrusGateway: process.env.WALRUS_GATEWAY,
// Security settings
maxDataSize: 1024 * 1024, // 1MB
sessionTimeout: 3600000, // 1 hour
retryAttempts: 3
};

Monitoring and Logging

// utils/monitoring.ts
export class SealMonitoring {
static logEncryption(encryptionId: string, recipient: string) {
console.log(`[SEAL] Encrypted data ${encryptionId} for ${recipient}`);
// Send to your monitoring service
}

static logDecryption(encryptionId: string, success: boolean) {
console.log(`[SEAL] Decryption ${encryptionId}: ${success ? 'SUCCESS' : 'FAILED'}`);
}

static logKeyServerHealth(serverUrl: string, status: string) {
console.log(`[SEAL] Key server ${serverUrl}: ${status}`);
}
}

Resources and Next Steps

Official Documentation

Community and Support

  • Sui Discord: Join the #seal channel for community support
  • GitHub Issues: Report bugs and request features
  • Developer Forums: Sui community forums for discussions

Advanced Topics to Explore

  1. Custom Access Policies: Build complex authorization logic with Move contracts
  2. Cross-Chain Integration: Use Seal with other blockchain networks
  3. Enterprise Key Management: Set up your own key server infrastructure
  4. Audit and Compliance: Implement logging and monitoring for regulated environments

Sample Applications

  • Secure Chat App: End-to-end encrypted messaging with Seal
  • Document Management: Enterprise document sharing with access controls
  • Digital Rights Management: Content distribution with usage policies
  • Privacy-Preserving Analytics: Encrypted data processing workflows

Conclusion

Seal represents a fundamental shift toward making privacy and encryption infrastructure-level concerns in Web3. By combining identity-based encryption, threshold security, and programmable access control, it provides developers with powerful tools to build truly secure and decentralized applications.

The key advantages of building with Seal include:

  • No Single Point of Failure: Distributed key servers eliminate central authorities
  • Programmable Security: Smart contract-based access policies provide flexible authorization
  • Developer-Friendly: TypeScript SDK integrates seamlessly with existing Web3 tooling
  • Storage Agnostic: Works with Walrus, IPFS, or any storage solution
  • Production Ready: Built by Mysten Labs with enterprise security standards

Whether you're securing user data, implementing subscription models, or building complex multi-party applications, Seal provides the cryptographic primitives and access control infrastructure you need to build with confidence.

Start building today, and join the growing ecosystem of developers making privacy a fundamental part of public infrastructure.


Ready to start building? Install @mysten/seal and begin experimenting with the examples in this tutorial. The decentralized web is waiting for applications that put privacy and security first.

The Crypto Endgame: Insights from Industry Visionaries

· 12 min read
Dora Noda
Software Engineer

Visions from Mert Mumtaz (Helius), Udi Wertheimer (Taproot Wizards), Jordi Alexander (Selini Capital) and Alexander Good (Post Fiat)

Overview

Token2049 hosted a panel called “The Crypto Endgame” featuring Mert Mumtaz (CEO of Helius), Udi Wertheimer (Taproot Wizards), Jordi Alexander (Founder of Selini Capital) and Alexander Good (creator of Post Fiat). While there is no publicly available transcript of the panel, each speaker has expressed distinct visions for the long‑term trajectory of the crypto industry. This report synthesizes their public statements and writings—spanning blog posts, articles, news interviews and whitepapers—to explore how each person envisions the “endgame” for crypto.

Mert Mumtaz – Crypto as “Capitalism 2.0”

Core vision

Mert Mumtaz rejects the idea that cryptocurrencies simply represent “Web 3.0.” Instead, he argues that the endgame for crypto is to upgrade capitalism itself. In his view:

  • Crypto supercharges capitalism’s ingredients: Mumtaz notes that capitalism depends on the free flow of information, secure property rights, aligned incentives, transparency and frictionless capital flows. He argues that decentralized networks, public blockchains and tokenization make these features more efficient, turning crypto into “Capitalism 2.0”.
  • Always‑on markets & tokenized assets: He points to regulatory proposals for 24/7 financial markets and the tokenization of stocks, bonds and other real‑world assets. Allowing markets to run continuously and settle via blockchain rails will modernize the legacy financial system. Tokenization creates always‑on liquidity and frictionless trading of assets that previously required clearing houses and intermediaries.
  • Decentralization & transparency: By using open ledgers, crypto removes some of the gate‑keeping and information asymmetries found in traditional finance. Mumtaz views this as an opportunity to democratize finance, align incentives and reduce middlemen.

Implications

Mumtaz’s “Capitalism 2.0” thesis suggests that the industry’s endgame is not limited to digital collectibles or “Web3 apps.” Instead, he envisions a future where nation‑state regulators embrace 24/7 markets, asset tokenization and transparency. In that world, blockchain infrastructure becomes a core component of the global economy, blending crypto with regulated finance. He also warns that the transition will face challenges—such as Sybil attacks, concentration of governance and regulatory uncertainty—but believes these obstacles can be addressed through better protocol design and collaboration with regulators.

Udi Wertheimer – Bitcoin as a “generational rotation” and the altcoin reckoning

Generational rotation & Bitcoin “retire your bloodline” thesis

Udi Wertheimer, co‑founder of Taproot Wizards, is known for provocatively defending Bitcoin and mocking altcoins. In mid‑2025 he posted a viral thesis called “This Bitcoin Thesis Will Retire Your Bloodline.” According to his argument:

  • Generational rotation: Wertheimer argues that the early Bitcoin “whales” who accumulated at low prices have largely sold or transferred their coins. Institutional buyers—ETFs, treasuries and sovereign wealth funds—have replaced them. He calls this process a “full‑scale rotation of ownership”, similar to Dogecoin’s 2019‑21 rally where a shift from whales to retail demand fueled explosive returns.
  • Price‑insensitive demand: Institutions allocate capital without caring about unit price. Using BlackRock’s IBIT ETF as an example, he notes that new investors see a US$40 increase as trivial and are willing to buy at any price. This supply shock combined with limited float means Bitcoin could accelerate far beyond consensus expectations.
  • $400K+ target and altcoin collapse: He projects that Bitcoin could exceed US$400,000 per BTC by the end of 2025 and warns that altcoins will underperform or even collapse, with Ethereum singled out as the “biggest loser”. According to Wertheimer, once institutional FOMO sets in, altcoins will “get one‑shotted” and Bitcoin will absorb most of the capital.

Implications

Wertheimer’s endgame thesis portrays Bitcoin as entering its final parabolic phase. The “generational rotation” means that supply is moving into strong hands (ETFs and treasuries) while retail interest is just starting. If correct, this would create a severe supply shock, pushing BTC price well beyond current valuations. Meanwhile, he believes altcoins offer asymmetric downside because they lack institutional bid support and face regulatory scrutiny. His message to investors is clear: load up on Bitcoin now before Wall Street buys it all.

Jordi Alexander – Macro pragmatism, AI & crypto as twin revolutions

Investing in AI and crypto – two key industries

Jordi Alexander, founder of Selini Capital and a known game theorist, argues that AI and blockchain are the two most important industries of this century. In an interview summarised by Bitget he makes several points:

  • The twin revolutions: Alexander believes the only ways to achieve real wealth growth are to invest in technological innovation (particularly AI) or to participate early in emerging markets like cryptocurrency. He notes that AI development and crypto infrastructure will be the foundational modules for intelligence and coordination this century.
  • End of the four‑year cycle: He asserts that the traditional four‑year crypto cycle driven by Bitcoin halvings is over; instead the market now experiences liquidity‑driven “mini‑cycles.” Future up‑moves will occur when “real capital” fully enters the space. He encourages traders to see inefficiencies as opportunity and to develop both technical and psychological skills to thrive in this environment.
  • Risk‑taking & skill development: Alexander advises investors to keep most funds in safe assets but allocate a small portion for risk‑taking. He emphasizes building judgment and staying adaptable, as there is “no such thing as retirement” in a rapidly evolving field.

Critique of centralized strategies and macro views

  • MicroStrategy’s zero‑sum game: In a flash note he cautions that MicroStrategy’s strategy of buying BTC may be a zero‑sum game. While participants might feel like they are winning, the dynamic could hide risks and lead to volatility. This underscores his belief that crypto markets are often driven by negative‑sum or zero‑sum dynamics, so traders must understand the motivations of large players.
  • Endgame of U.S. monetary policy: Alexander’s analysis of U.S. macro policy highlights that the Federal Reserve’s control over the bond market may be waning. He notes that long‑term bonds have fallen sharply since 2020 and believes the Fed may soon pivot back to quantitative easing. He warns that such policy shifts could cause “gradually at first … then all at once” market moves and calls this a key catalyst for Bitcoin and crypto.

Implications

Jordi Alexander’s endgame vision is nuanced and macro‑oriented. Rather than forecasting a singular price target, he highlights structural changes: the shift to liquidity‑driven cycles, the importance of AI‑driven coordination and the interplay between government policy and crypto markets. He encourages investors to develop deep understanding and adaptability rather than blindly following narratives.

Alexander Good – Web 4, AI agents and the Post Fiat L1

Web 3’s failure and the rise of AI agents

Alexander Good (also known by his pseudonym “goodalexander”) argues that Web 3 has largely failed because users care more about convenience and trading than owning their data. In his essay “Web 4” he notes that consumer app adoption depends on seamless UX; requiring users to bridge assets or manage wallets kills growth. However, he sees an existential threat emerging: AI agents that can generate realistic video, control computers via protocols (such as Anthropic’s “Computer Control” framework) and hook into major platforms like Instagram or YouTube. Because AI models are improving rapidly and the cost of generating content is collapsing, he predicts that AI agents will create the majority of online content.

Web 4: AI agents negotiating on the blockchain

Good proposes Web 4 as a solution. Its key ideas are:

  • Economic system with AI agents: Web 4 envisions AI agents representing users as “Hollywood agents” negotiate on their behalf. These agents will use blockchains for data sharing, dispute resolution and governance. Users provide content or expertise to agents, and the agents extract value—often by interacting with other AI agents across the world—and then distribute payments back to the user in crypto.
  • AI agents handle complexity: Good argues that humans will not suddenly start bridging assets to blockchains, so AI agents must handle these interactions. Users will simply talk to chatbots (via Telegram, Discord, etc.), and AI agents will manage wallets, licensing deals and token swaps behind the scenes. He predicts a near‑future where there are endless protocols, tokens and computer‑to‑computer configurations that will be unintelligible to humans, making AI assistance essential.
  • Inevitable trends: Good lists several trends supporting Web 4: governments’ fiscal crises encourage alternatives; AI agents will cannibalize content profits; people are getting “dumber” by relying on machines; and the largest companies bet on user‑generated content. He concludes that it is inevitable that users will talk to AI systems, those systems will negotiate on their behalf, and users will receive crypto payments while interacting primarily through chat apps.

Mapping the ecosystem and introducing Post Fiat

Good categorizes existing projects into Web 4 infrastructure or composability plays. He notes that protocols like Story, which create on‑chain governance for IP claims, will become two‑sided marketplaces between AI agents. Meanwhile, Akash and Render sell compute services and could adapt to license to AI agents. He argues that exchanges like Hyperliquid will benefit because endless token swaps will be needed to make these systems user‑friendly.

His own project, Post Fiat, is positioned as a “kingmaker in Web 4.” Post Fiat is a Layer‑1 blockchain built on XRP’s core technology but with improved decentralization and tokenomics. Key features include:

  • AI‑driven validator selection: Instead of relying on human-run staking, Post Fiat uses large language models (LLMs) to score validators on credibility and transaction quality. The network distributes 55% of tokens to validators through a process managed by an AI agent, with the goal of “objectivity, fairness and no humans involved”. The system’s monthly cycle—publish, score, submit, verify and select & reward—ensures transparent selection.
  • Focus on investing & expert networks: Unlike XRP’s transaction‑bank focus, Post Fiat targets financial markets, using blockchains for compliance, indexing and operating an expert network composed of community members and AI agents. AGTI (Post Fiat’s development arm) sells products to financial institutions and may launch an ETF, with revenues funding network development.
  • New use cases: The project aims to disrupt the indexing industry by creating decentralized ETFs, provide compliant encrypted memos and support expert networks where members earn tokens for insights. The whitepaper details technical measures—such as statistical fingerprinting and encryption—to prevent Sybil attacks and gaming.

Web 4 as survival mechanism

Good concludes that Web 4 is a survival mechanism, not just a cool ideology. He argues that a “complexity bomb” is coming within six months as AI agents proliferate. Users will have to give up some upside to AI systems because participating in agentic economies will be the only way to thrive. In his view, Web 3’s dream of decentralized ownership and user privacy is insufficient; Web 4 will blend AI agents, crypto incentives and governance to navigate an increasingly automated economy.

Comparative analysis

Converging themes

  1. Institutional & technological shifts drive the endgame.
    • Mumtaz foresees regulators enabling 24/7 markets and tokenization, which will mainstream crypto.
    • Wertheimer highlights institutional adoption via ETFs as the catalyst for Bitcoin’s parabolic phase.
    • Alexander notes that the next crypto boom will be liquidity‑driven rather than cycle‑driven and that macro policies (like the Fed’s pivot) will provide powerful tailwinds.
  2. AI becomes central.
    • Alexander emphasises investing in AI alongside crypto as twin pillars of future wealth.
    • Good builds Web 4 around AI agents that transact on blockchains, manage content and negotiate deals.
    • Post Fiat’s validator selection and governance rely on LLMs to ensure objectivity. Together these visions imply that the endgame for crypto will involve synergy between AI and blockchain, where AI handles complexity and blockchains provide transparent settlement.
  3. Need for better governance and fairness.
    • Mumtaz warns that centralization of governance remains a challenge.
    • Alexander encourages understanding game‑theoretic incentives, pointing out that strategies like MicroStrategy’s can be zero‑sum.
    • Good proposes AI‑driven validator scoring to remove human biases and create fair token distribution, addressing governance issues in existing networks like XRP.

Diverging visions

  1. Role of altcoins. Wertheimer sees altcoins as doomed and believes Bitcoin will capture most capital. Mumtaz focuses on the overall crypto market including tokenized assets and DeFi, while Alexander invests across chains and believes inefficiencies create opportunity. Good is building an alt‑L1 (Post Fiat) specialized for AI finance, implying he sees room for specialized networks.
  2. Human agency vs AI agency. Mumtaz and Alexander emphasize human investors and regulators, whereas Good envisions a future where AI agents become the primary economic actors and humans interact through chatbots. This shift implies fundamentally different user experiences and raises questions about autonomy, fairness and control.
  3. Optimism vs caution. Wertheimer’s thesis is aggressively bullish on Bitcoin with little concern for downside. Mumtaz is optimistic about crypto improving capitalism but acknowledges regulatory and governance challenges. Alexander is cautious—highlighting inefficiencies, zero‑sum dynamics and the need for skill development—while still believing in crypto’s long‑term promise. Good sees Web 4 as inevitable but warns of the complexity bomb, urging preparation rather than blind optimism.

Conclusion

The Token2049 “Crypto Endgame” panel brought together thinkers with very different perspectives. Mert Mumtaz views crypto as an upgrade to capitalism, emphasizing decentralization, transparency and 24/7 markets. Udi Wertheimer sees Bitcoin entering a supply‑shocked generational rally that will leave altcoins behind. Jordi Alexander adopts a more macro‑pragmatic stance, urging investment in both AI and crypto while understanding liquidity cycles and game‑theoretic dynamics. Alexander Good envisions a Web 4 era where AI agents negotiate on blockchains and Post Fiat becomes the infrastructure for AI‑driven finance.

Although their visions differ, a common theme is the evolution of economic coordination. Whether through tokenized assets, institutional rotation, AI‑driven governance or autonomous agents, each speaker believes crypto will fundamentally reshape how value is created and exchanged. The endgame therefore seems less like an endpoint and more like a transition into a new system where capital, computation and coordination converge.

Tokenization: Redefining Capital Markets

· 12 min read
Dora Noda
Software Engineer

Introduction

Tokenization refers to representing ownership of an asset on a blockchain through digital tokens. These tokens can represent financial assets (equities, bonds, money‑market funds), real‑world assets (real estate, art, invoices) or even cash itself (stablecoins or deposit tokens). By moving assets onto programmable, always‑on blockchains, tokenization promises to reduce settlement friction, improve transparency and allow 24/7, global access to capital markets. During TOKEN2049 and subsequent discussions in 2024‑2025, leaders from crypto and traditional finance explored how tokenization could reshape capital markets.

Below is a deep dive into the visions and predictions of key participants from the “Tokenization: Redefining Capital Markets” panel and related interviews: Diogo Mónica (General Partner, Haun Ventures), Cynthia Lo Bessette (Head of Digital Asset Management, Fidelity Investments), Shan Aggarwal (Chief Business Officer, Coinbase), Alex Thorn (Head of Research, Galaxy), and Arjun Sethi (Co‑CEO, Kraken). The report also situates their views within broader developments such as tokenized treasury funds, stablecoins, deposit tokens and tokenized equities.

1. Diogo Mónica – General Partner, Haun Ventures

1.1 Vision: Stablecoins Are the “Starting Gun” for Tokenization

Diogo Mónica argues that well‑regulated stablecoins are the prerequisite for tokenizing capital markets. In an opinion piece for American Banker he wrote that stablecoins turn money into programmable digital tokens, unlocking 24/7 trading and enabling tokenization of many asset classes. Once money is on‑chain, “you open the door to tokenize everything else – equities, bonds, real estate, invoices, art”. Mónica notes that a few technologically advanced stablecoins already facilitate near‑instant, cheap cross‑border transfers, but regulatory clarity is needed to ensure wide adoption. He emphasizes that stablecoin regulations should be strict—modeled on the regulatory regime for money‑market funds—to ensure consumer protection.

1.2 Tokenization Will Revive Capital Formation and Globalize Markets

Mónica contends that tokenization could “fix” broken capital‑formation mechanisms. Traditional IPOs are expensive and restricted to certain markets; however, issuing tokenized securities could let companies raise capital on‑chain, with global access and lower costs. Transparent, always‑open markets could allow investors worldwide to trade tokens representing equity or other assets regardless of geographic boundaries. For Mónica, the goal is not to circumvent regulation but to create new regulatory frameworks that enable on‑chain capital markets. He argues that tokenized markets could boost liquidity for traditionally illiquid assets (e.g., real estate, small‑business shares) and democratize investment opportunities. He stresses that regulators need to build consistent rules for issuing, trading and transferring tokenized securities so that investors and issuers gain confidence in on‑chain markets.

1.3 Encouraging Startups and Institutional Adoption

As a venture capitalist at Haun Ventures, Mónica encourages startups working on infrastructure for tokenized assets. He highlights the importance of compliant digital identity and custody solutions, on‑chain governance and interoperable blockchains that can support large volumes. Mónica sees stablecoins as the first step, but he believes the next phase will be tokenized money‑market funds and on‑chain treasuries—building blocks for full‑scale capital markets.

2. Cynthia Lo Bessette – Head of Digital Asset Management, Fidelity Investments

2.1 Tokenization Delivers Transactional Efficiency and Access

Cynthia Lo Bessette leads Fidelity’s digital asset management business and is responsible for developing tokenization initiatives. She argues that tokenization improves settlement efficiency and broadens access to markets. In interviews about Fidelity’s planned tokenized money‑market fund, Lo Bessette stated that tokenizing assets can “drive transactional efficiencies” and improve access and allocation of capital across markets. She noted that tokenized assets could be used as non‑cash collateral to enhance capital efficiency, and said that Fidelity wants to “be an innovator… [and] leverage technology to provide better access”.

2.2 Fidelity’s Tokenized Money‑Market Fund

In 2025, Fidelity filed with the SEC to launch the Fidelity Treasury Digital Fund, a tokenized money‑market fund on the Ethereum blockchain. The fund issues shares as ERC‑20 tokens that represent fractional interests in a pool of government treasuries. The goal is to provide 24‑hour subscription and redemption, atomic settlement and programmable compliance. Lo Bessette explained that tokenizing treasuries can improve operational infrastructure, reduce the need for intermediaries and open the fund to a wider audience, including firms seeking on‑chain collateral. By offering a tokenized version of a core money‑market instrument, Fidelity wants to attract institutions exploring on‑chain financing.
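
Fidelity has not published implementation details, but the mechanics described above (KYC‑gated holders, round‑the‑clock subscription and redemption at net asset value, and compliance checks enforced when shares move) can be illustrated with a small, purely hypothetical ledger model:

```python
from decimal import Decimal

class TokenizedMMF:
    """Toy model of a tokenized money-market fund share ledger.

    This is not Fidelity's implementation; it only illustrates the idea of
    'programmable compliance': the KYC rule is enforced by the ledger itself.
    """

    def __init__(self, nav_per_share: Decimal = Decimal("1.00")):
        self.nav = nav_per_share
        self.balances: dict[str, Decimal] = {}
        self.kyc_approved: set[str] = set()

    def approve_kyc(self, wallet: str) -> None:
        self.kyc_approved.add(wallet)

    def _check_compliance(self, wallet: str) -> None:
        if wallet not in self.kyc_approved:
            raise PermissionError(f"{wallet} is not KYC-approved")

    def subscribe(self, wallet: str, cash: Decimal) -> Decimal:
        """Mint shares against cash at NAV; available around the clock in this model."""
        self._check_compliance(wallet)
        shares = cash / self.nav
        self.balances[wallet] = self.balances.get(wallet, Decimal("0")) + shares
        return shares

    def redeem(self, wallet: str, shares: Decimal) -> Decimal:
        """Burn shares and pay out cash at NAV."""
        self._check_compliance(wallet)
        if self.balances.get(wallet, Decimal("0")) < shares:
            raise ValueError("insufficient shares")
        self.balances[wallet] -= shares
        return shares * self.nav
```

Calling `subscribe()` from a wallet that has not passed KYC raises an error immediately; that is the essence of programmable compliance, with the rule living in the token itself rather than in a back‑office process.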

2.3 Regulatory Engagement

Lo Bessette cautions that regulation is critical. Fidelity is working with regulators to ensure investor protections and compliance. She believes that close collaboration with the SEC and industry bodies will be necessary to gain approval for tokenized mutual funds and other regulated products. Fidelity also participates in industry initiatives such as the Tokenized Asset Coalition to develop standards for custody, disclosure and investor protection.

3. Shan Aggarwal – Chief Business Officer, Coinbase

3.1 Expanding Beyond Crypto Trading to On‑Chain Finance

As Coinbase’s first CBO, Shan Aggarwal is responsible for strategy and new business lines. He has articulated a vision where Coinbase becomes the “AWS of crypto infrastructure”, providing custody, staking, compliance and tokenization services for institutions and developers. In an interview (translated from Forbes), Aggarwal said he sees Coinbase’s role as supporting the on‑chain economy by building the infrastructure to tokenize real‑world assets, bridge traditional finance with Web3 and offer financial services like lending, payments and remittances. He notes that Coinbase wants to define the future of money rather than just participate in it.

3.2 Stablecoins Are the Native Payment Rail for AI Agents and Global Commerce

Aggarwal believes stablecoins will become the native settlement layer for both humans and AI. In a 2024 interview, he said that stablecoins enable global payments without intermediaries; as AI agents proliferate in commerce, “stablecoins are the native payment rails for AI agents”. He predicts that stablecoin payments will become so embedded in commerce that consumers and machines will use them without noticing, unlocking digital commerce for billions.
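
As a concrete, entirely hypothetical illustration of that pattern, the sketch below shows an agent settling a purchase directly in a dollar stablecoin. `send_stablecoin` is a stand‑in function invented for the example, not a real Coinbase or USDC API.

```python
from decimal import Decimal

def send_stablecoin(from_wallet: str, to_wallet: str, amount: Decimal, token: str = "USDC") -> str:
    """Stand-in for an on-chain transfer; a real call would broadcast a transaction."""
    print(f"transfer {amount} {token}: {from_wallet} -> {to_wallet}")
    return "0x_txhash_placeholder"

class ShoppingAgent:
    """An AI agent that pays merchants in stablecoin within a fixed budget."""

    def __init__(self, wallet: str, budget: Decimal):
        self.wallet = wallet
        self.budget = budget

    def buy(self, merchant_wallet: str, price: Decimal) -> str | None:
        if price > self.budget:
            return None  # agent declines; no human in the loop
        self.budget -= price
        return send_stablecoin(self.wallet, merchant_wallet, price)

agent = ShoppingAgent(wallet="0xAgentWallet", budget=Decimal("50"))
agent.buy(merchant_wallet="0xMerchant", price=Decimal("12.99"))
```

The point of the example is the settlement path: the agent never touches card networks or bank rails, which is why Aggarwal expects stablecoins to become the default medium for machine‑to‑machine commerce.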

Aggarwal contends that all asset classes will eventually come on‑chain. He points out that tokenizing assets such as equities, treasuries or real estate allows them to be settled instantaneously and traded globally. He acknowledges that regulatory clarity and robust infrastructure are prerequisites, but he sees an inevitable shift from legacy clearing systems to blockchains.

3.3 Building Institutional Adoption and Compliance

Aggarwal emphasizes that institutions need secure custody, compliance services and reliable infrastructure to adopt tokenization. Coinbase has invested in Coinbase International Exchange, Base (its L2 network), and partnerships with stablecoin issuers (e.g., Circle, the issuer of USDC). He suggests that as more assets become tokenized, Coinbase will provide “one‑stop‑shop” infrastructure for trading, financing and on‑chain operations. Importantly, Aggarwal works closely with policymakers to ensure regulation enables innovation without stifling growth.

4. Alex Thorn – Head of Research, Galaxy

4.1 Tokenized Equities: A First Step in a New Capital Markets Infrastructure

Alex Thorn leads research at Galaxy and has been instrumental in the firm’s decision to tokenize its own shares. In September 2025, Galaxy announced it would allow shareholders to move their Galaxy Class A shares onto the Solana blockchain via a tokenization partnership with Superstate. Thorn explained that tokenized shares confer the same legal and economic rights as traditional shares, but they can be transferred peer‑to‑peer and settle in minutes rather than days. He said that tokenized equities are “a new method of building faster, more efficient, more inclusive capital markets”.

4.2 Working Within Existing Regulation and with the SEC

Thorn stresses the importance of compliance. Galaxy built its tokenized share program to comply with U.S. securities laws: the tokenized shares are issued through a transfer agent, the tokens can only be transferred among KYC‑approved wallets, and redemptions occur via a regulated broker. Thorn said Galaxy wants to “work within existing rules” and will collaborate with the SEC to develop frameworks for on‑chain equities. He views this process as vital to convincing regulators that tokenization can protect investors while delivering efficiency gains.
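
The transfer restriction Thorn describes is straightforward to express as an allow‑list rule. The snippet below is only an illustration of that rule (the wallet identifiers are invented), not Galaxy's or Superstate's actual contract logic.

```python
# Assumed allow-list of wallets that have completed KYC with the transfer agent.
KYC_WALLETS = {"0xGalaxyTreasury", "0xInvestorA", "0xInvestorB"}

def can_transfer(sender: str, receiver: str) -> bool:
    """A tokenized-share transfer is valid only if both counterparties are KYC-approved."""
    return sender in KYC_WALLETS and receiver in KYC_WALLETS

assert can_transfer("0xInvestorA", "0xInvestorB")          # peer-to-peer between approved wallets
assert not can_transfer("0xInvestorA", "0xUnknownWallet")  # blocked: receiver not on the list
```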

4.3 Critical Perspective on Deposit Tokens and Unapproved Offerings

Thorn has expressed caution about other forms of tokenization. Discussing bank‑issued deposit tokens, he compared the current landscape to the 1830s “wildcat banking” era and warned that deposit tokens may not be widely adopted if each bank issues its own token. He argued that regulators might treat deposit tokens as regulated stablecoins and require a single, rigid federal standard to make them fungible.

Similarly, he criticized pre‑IPO token offerings launched without issuer consent. In an interview about Jupiter’s pre‑IPO token of Robinhood stock, Thorn noted that many pre‑IPO tokens are unauthorized and “don’t offer clean share ownership”. For Thorn, tokenization must occur with issuer approval and regulatory compliance; unauthorized tokenization undermines investor protections and could harm public perception.

5. Arjun Sethi – Co‑CEO, Kraken

5.1 Tokenized Equities Will Outgrow Stablecoins and Democratize Ownership

Arjun Sethi, co‑CEO of Kraken, is an ardent proponent of tokenized equities. He predicts that tokenized equities will eventually surpass stablecoins in market size because they provide real economic rights and global accessibility. Sethi envisions a world where anyone with an internet connection can buy a fraction of any stock 24/7, without geographic restrictions. He argues that tokenized stocks shift power back to individuals by removing barriers imposed by geography or institutional gatekeepers; for the first time, people around the world can own and use a share of a stock like money.

5.2 Kraken’s xStocks and Partnerships

In 2025 Kraken launched xStocks, a platform for trading tokenized U.S. equities on Solana. Sethi explained that the goal is to meet people where they are—by embedding tokenized stock trading into widely used apps. When Kraken integrated xStocks into the Telegram Wallet, Sethi said the integration aimed to “give hundreds of millions of users access to tokenized equities inside familiar apps”. He stressed that this is not just about novelty; it represents a paradigm shift toward borderless markets that operate 24/7.

Kraken also acquired the futures platform NinjaTrader and launched an Ethereum Layer 2 network (Ink), signaling its intent to expand beyond crypto into a full‑stack financial services platform. Partnerships with Apollo Global and Securitize allow Kraken to work on tokenizing private assets and corporate shares.

5.3 Regulatory Engagement and Public Listing

Sethi believes that a borderless, always‑on trading platform will require regulatory cooperation. In a Reuters interview he said that expanding into equities is a natural step and paves the way for asset tokenization; the future of trading will be borderless, always on, and built on crypto rails. Kraken engages with regulators globally to ensure its tokenized products comply with securities laws. Sethi has also said Kraken might consider a public listing in the future if it supports its mission.

6. Comparative Analysis and Emerging Themes

6.1 Tokenization as the Next Phase of Market Infrastructure

All panelists agree that tokenization is a fundamental infrastructure shift. Mónica describes stablecoins as the catalyst that enables tokenizing every other asset class. Lo Bessette sees tokenization as a way to improve settlement efficiency and open access. Aggarwal predicts that all assets will eventually come on‑chain and that Coinbase will provide the infrastructure. Thorn emphasizes that tokenized equities create faster, more inclusive capital markets, while Sethi foresees tokenized equities surpassing stablecoins and democratizing ownership.

6.2 Necessity of Regulatory Clarity

A recurring theme is the need for clear, consistent regulation. Mónica and Thorn insist that tokenized assets must comply with securities laws and that stablecoins and deposit tokens require strong regulation. Lo Bessette notes that Fidelity works closely with regulators, and its tokenized money‑market fund is designed to fit within existing regulatory frameworks. Aggarwal and Sethi highlight engagement with policymakers to ensure that their on‑chain products meet compliance requirements. Without regulatory clarity, tokenization risks replicating the fragmentation and opacity that blockchain seeks to solve.

6.3 Integration of Stablecoins and Tokenized Assets

Stablecoins and tokenized treasuries are seen as foundational. Aggarwal views stablecoins as the native rail for AI and global commerce. Mónica sees well‑regulated stablecoins as the “starting gun” for tokenizing other assets. Lo Bessette’s tokenized money‑market fund and Thorn’s caution about deposit tokens highlight different approaches to tokenizing cash equivalents. As stablecoins become widely adopted, they will likely be used for settling trades of tokenized securities and RWAs.

6.4 Democratization and Global Accessibility

Tokenization promises to democratize access to capital markets. Sethi’s enthusiasm for giving “hundreds of millions of users” access to tokenized equities through familiar apps captures this vision. Aggarwal sees tokenization enabling billions of people and AI agents to participate in digital commerce. Mónica’s view of 24/7 markets accessible globally aligns with these predictions. All emphasize that tokenization will remove barriers and bring inclusion to financial services.

6.5 Cautious Optimism and Challenges

While optimistic, the panelists also recognize challenges. Thorn warns against unauthorized pre‑IPO tokenization and stresses that deposit tokens might replicate “wildcat banking” if each bank issues its own. Lo Bessette and Mónica call for careful regulatory design. Aggarwal and Sethi highlight infrastructure demands such as compliance, custody and user experience. Balancing innovation with investor protection will be key to realizing the full potential of tokenized capital markets.

Conclusion

The visions expressed at TOKEN2049 and in subsequent interviews illustrate a shared belief that tokenization will redefine capital markets. Leaders from Haun Ventures, Fidelity, Coinbase, Galaxy and Kraken see tokenization as an inevitable evolution of financial infrastructure, driven by stablecoins, tokenized treasuries and tokenized equities. They anticipate that on‑chain markets will operate 24/7, enable global participation, reduce settlement friction and democratize access. However, these benefits depend on robust regulation, compliance and infrastructure. As regulators and industry participants collaborate, tokenization could unlock new forms of capital formation, democratize ownership and usher in a more inclusive financial system.