Skip to main content

12 posts tagged with "SaaS"

View all tags

GameFi Industry Overview: A PM's Guide to Web3 Gaming in 2025

· 32 min read
Dora Noda
Software Engineer

The GameFi market reached $18-19 billion in 2024 with projections to hit $95-200 billion by 2034, yet faces a brutal reality check: 93% of projects fail and 60% of users abandon games within 30 days. This paradox defines the current state—massive growth potential colliding with fundamental sustainability challenges. The industry is pivoting from speculative "play-to-earn" models that attracted mercenary users toward "play-and-earn" experiences prioritizing entertainment value with blockchain benefits as secondary. Success in 2025 requires understanding five distinct user personas, designing for multiple "jobs to be done" beyond just earning, implementing sustainable tokenomics that don't rely on infinite user growth, and learning from both the successes of Axie Infinity's $4+ billion in NFT sales and the failures of its 95% user collapse. The winners will be products that abstract blockchain complexity, deliver AAA-quality gameplay, and build genuine communities rather than speculation farms.

Target user personas: Who's actually playing GameFi

The GameFi audience spans from Filipino pedicab drivers earning rent money to wealthy crypto investors treating games as asset portfolios. Understanding these personas is critical for product-market fit.

The Income Seeker represents 35-40% of users

This persona dominates Southeast Asia—particularly the Philippines, Vietnam, and Indonesia—where 40% of Axie Infinity's peak users originated. These are 20-35 year olds from below-minimum-wage households who view GameFi as legitimate employment, not entertainment. They invest 6-10 hours daily treating gameplay as a full-time job, often entering through scholarship programs where guilds provide NFTs in exchange for 30-75% of earnings. During Axie's peak, Filipino players earned $400-1,200 monthly compared to $200 minimum wage, enabling life-changing outcomes like paying university fees and buying groceries. However, this persona is extremely vulnerable to token volatility—when SLP crashed 99% from peak, earnings fell below minimum wage and retention collapsed. Their pain points center on high entry costs ($400-1,000+ for starter NFTs at peak), complex crypto-to-fiat conversion, and unsustainable tokenomics. For product managers, this persona requires free-to-play or scholarship models, mobile-first design, local language support, and transparent earning projections. The scholarship model pioneered by Yield Guild Games (30,000+ scholarships) democratizes access but raises exploitation concerns given the 10-30% commission structure.

The Gamer-Investor accounts for 25-30% of users

These are 25-40 year old professionals from developed markets—US, South Korea, Japan—with middle to upper-middle class incomes and college education. They're experienced core gamers seeking both entertainment value and financial returns, comfortable navigating DeFi ecosystems across 3.8 Layer 1 chains and 3.6 Layer 2 chains on average. Unlike Income Seekers, they directly purchase premium NFTs ($1,000-10,000+ single investments) and diversify portfolios across 3-5 games. They invest 2-4 hours daily and often act as guild owners rather than scholars, managing others' gameplay. Their primary frustration is poor gameplay quality in most GameFi titles—they want AAA production values matching traditional games, not "spreadsheets with graphics." This persona is critical for sustainability because they provide capital inflows and longer-term engagement. Product managers should focus on compelling gameplay mechanics, high production values, sophisticated tokenomics transparency, and governance participation through DAOs. They're willing to pay premium prices but demand quality and won't tolerate pay-to-win dynamics, which ranks as the top reason players quit traditional games.

The Casual Dabbler makes up 20-25% of users

Global and primarily mobile-first, these 18-35 year old students and young professionals are motivated by curiosity, FOMO, and the "why not earn while playing?" value proposition. They invest only 30 minutes to 2 hours daily with inconsistent engagement patterns. This persona increasingly discovers GameFi through Telegram mini-apps like Hamster Kombat (239 million users in 3 months) and Notcoin ($1.6 billion market cap), which offer zero-friction onboarding without wallet setup. However, they exhibit the highest churn rate—60%+ abandon within 30 days—because poor UX/UI (cited by 53% as biggest challenge), complex wallet setup (deters 11%), and repetitive gameplay drive them away. The discovery method matters: 60% learn about GameFi from friends and family, making viral mechanics essential. For product managers, this persona demands simplified onboarding (hosted wallets, no crypto knowledge required), social features for friend recruitment, and genuinely entertaining gameplay that works as a standalone experience. The trap is designing purely for token farming, which attracts this persona temporarily but fails to retain them beyond airdrops—Hamster Kombat lost 86% of users post-airdrop (300M to 41M).

The Crypto Native comprises 10-15% of users

These 22-45 year old crypto professionals, developers, and traders from global crypto hubs possess expert-level blockchain knowledge and variable gaming backgrounds. They view GameFi as an asset class and technological experiment rather than primary entertainment, seeking alpha opportunities, early adoption status, and governance participation. This persona trades high-frequency, provides liquidity, stakes governance tokens, and participates in DAOs (25% actively engage in governance). They're sophisticated enough to analyze smart contract code and tokenomics sustainability, making them the harshest critics of unsustainable models. Their investment approach focuses on high-value NFTs, land sales, and governance tokens rather than grinding for small rewards. Product managers should engage this persona for credibility and capital but recognize they're often early exiters—flipping positions before mainstream adoption. They value innovative tokenomics, transparent on-chain data, and utility beyond speculation. Major pain points include unsustainable token emissions, regulatory uncertainty, bot manipulation, and rug pulls. This persona is essential for initial liquidity and word-of-mouth but represents too small an audience (4.5 million crypto gamers vs 3 billion total gamers) to build a mass-market product around exclusively.

The Community Builder represents 5-10% of users

Guild owners, scholarship managers, content creators, and influencers—these 25-40 year olds with middle incomes invest 4-8 hours daily managing operations rather than playing directly. They built the infrastructure enabling Income Seekers to participate, managing anywhere from 10 to 1,000+ players and earning through 10-30% commissions on scholar earnings. At Axie's 2021 peak, successful guild leaders earned $20,000+ monthly. They create educational content, strategy guides, and market analysis while using rudimentary tools (often Google Sheets for scholar management). This persona is critical for user acquisition and education—Yield Guild Games managed 5,000+ scholars with 60,000 on waitlist—but faces sustainability challenges as token prices affect entire guild economics. Their pain points include lack of guild CRM tools, performance tracking difficulty, regulatory uncertainty around taxation, and the sustainability concerns of the scholar economy model (criticized as digital-age "gold farming"). Product managers should build tools specifically for this persona—guild dashboards, automated payouts, performance analytics—and recognize they serve as distribution channels, onboarding infrastructure, and community evangelists.

Jobs to be done: What users hire GameFi products for

GameFi products are hired to do multiple jobs simultaneously across functional, emotional, and social dimensions. Understanding these layered motivations explains why users adopt, engage with, and ultimately abandon these products.

Functional jobs: Practical problems being solved

The primary functional job for Southeast Asian users is generating income when traditional employment is unavailable or insufficient. During COVID-19 lockdowns, Axie Infinity players in the Philippines earned $155-$600 monthly compared to $200 minimum wage, with earnings enabling concrete outcomes like paying for mothers' medication and children's school fees. One 26-year-old line cook made $29 weekly playing, and professional players bought houses. This represents a genuine economic opportunity in markets with 60%+ unbanked populations and minimum daily wages of $7-25 USD. However, the job extends beyond primary income to supplementary earnings—content moderators playing 2 hours daily earned $155-$195 monthly (nearly half their salary) for grocery money and electricity bills. For developed market users, the functional job shifts to investment and wealth accumulation through asset appreciation. Early Axie adopters bought teams for $5 in 2020; by 2021 prices reached $50,000+ for starter teams. Virtual land in Decentraland and The Sandbox sold for substantial amounts, and the guild model emerged where "managers" own multiple teams and rent to "scholars" for 10-30% commission. The portfolio diversification job involves gaining crypto asset exposure through engaging activity rather than pure speculation, accessing DeFi features (staking, yield farming) embedded in gameplay. GameFi competes with traditional employment (offering flexible hours, work-from-home, no commute), traditional gaming (offering real money earnings), cryptocurrency trading (offering more engaging skill-based earnings), and gig economy work (offering more enjoyable activity for comparable pay).

Emotional jobs: Feelings and experiences being sought

Achievement and mastery drive engagement as users seek to feel accomplished through challenging gameplay and visible progress. Academic research shows "advancement" and "achievement" as top gaming motivations, satisfied through breeding optimal Axies, winning battles, climbing leaderboards, and progression systems creating dopamine-driven engagement. One study found 72.1% of players experienced mood uplift during play. However, the grinding nature creates tension—players describe initial happiness followed by "sleepiness and stress of the game." Escapism and stress relief became particularly important during COVID lockdowns, with one player noting being "protected from virus, play cute game, earn money." Academic research confirms escapism as a major motivation, though studies show gamers with escapism motivation had higher psychological issue risk when external problems persisted. The excitement and entertainment job represents the 2024 industry shift from pure "play-to-earn" to "play-and-earn," with criticism that early GameFi projects prioritized "blockchain gimmicks over genuine gameplay quality." AAA titles launching in 2024-2025 (Shrapnel, Off The Grid) focus on compelling narratives and graphics, recognizing players want fun first. Perhaps most importantly, GameFi provides hope and optimism about financial futures. Players express being "relentlessly optimistic" about achieving goals, with GameFi offering a bottom-up voluntary alternative to Universal Basic Income. The sense of autonomy and control over financial destiny—rather than dependence on employers or government—emerges through player ownership of assets via NFTs (versus traditional games where developers control everything) and decentralized governance through DAO voting rights.

Social jobs: Identity and social needs being met

Community belonging proves as important as financial returns. Discord servers reach 100,000+ members, guild systems like Yield Guild Games manage 8,000 scholars with 60,000 waitlists, and scholarship models create mentor-mentee relationships. The social element drives viral growth—Telegram mini-apps leveraging existing social graphs achieved 35 million (Notcoin) and 239 million (Hamster Kombat) users. Community-driven development is expected in 50%+ of GameFi projects by 2024. Early adopter and innovator status attracts participants wanting to be seen as tech-savvy and ahead of mainstream trends. Web3 gaming attracts "tech enthusiasts" and "crypto natives" beyond traditional gamers, with first-mover advantage in token accumulation creating status hierarchies. The wealth display and "flex culture" job manifests through rare NFT Axies with "limited-edition body parts that will never be released again" serving as status symbols, X-integrated leaderboards letting "players flex their rank to mainstream audience," and virtual real estate ownership demonstrating wealth. Stories of buying houses and land shared virally reinforce this job. For Income Seekers, the provider and family support role proves especially powerful—an 18-year-old breadwinner supporting family after father's COVID death, players paying children's school fees and parents' medication. One quote captures it: "It's food on the table." The helper and mentor status job emerges through scholarship models where successful players provide Axie NFTs to those who can't afford entry, with community managers organizing and training new players. Finally, GameFi enables gamer identity reinforcement by bridging traditional gaming culture with financial responsibility, legitimizing gaming as a career path and reducing stigma of gaming as "waste of time."

Progress users are trying to make in their lives

Users aren't hiring "blockchain games"—they're hiring solutions to make specific life progress. Financial progress involves moving from "barely surviving paycheck to paycheck" to "building savings and supporting family comfortably," from "dependent on unstable job market" to "multiple income streams with more control," and from "unable to afford children's education" to "paying school fees and buying digital devices." Social progress means shifting from "gaming seen as waste of time" to "gaming as legitimate income source and career," from "isolated during pandemic" to "connected to global community with shared interests," and from "consumer in gaming ecosystem" to "stakeholder with ownership and governance rights." Emotional progress involves transforming from "hopeless about financial future" to "optimistic about wealth accumulation possibilities," from "time spent gaming feels guilty" to "productive use of gaming skills," and from "passive entertainment consumer" to "active creator and earner in digital economy." Identity progress encompasses moving from "just a player" to "investor, community leader, entrepreneur," from "late to crypto" to "early adopter in emerging technology," and from "separated from family (migrant worker)" to "at home while earning comparable income." Understanding these progress paths—rather than just product features—is essential for product-market fit.

Monetization models: How GameFi companies make money

GameFi monetization has evolved significantly from the unsustainable 2021 boom toward diversified revenue streams and balanced tokenomics. Successful projects in 2024-2025 demonstrate multiple revenue sources rather than relying solely on token speculation.

Play-to-earn mechanics have transformed toward sustainability

The original play-to-earn model rewarded players with cryptocurrency tokens for achievements, which could be traded for fiat currency. Axie Infinity pioneered the dual-token system with AXS (governance, capped supply) and SLP (utility, inflationary), where players earned SLP through battles and quests then burned it for breeding. At peak in 2021, players earned $400-1,200+ monthly, but the model collapsed as SLP crashed 99% due to hyperinflation and unsustainable token emissions requiring constant new player influx. The 2024 resurgence shows how sustainability is achieved: Axie now generates $3.2M+ annually in treasury revenue (averaging $330K monthly) with 162,828 monthly active users through diversified sources—4.25% marketplace fees on all NFT transactions, breeding fees paid in AXS/SLP, and Part Evolution fees (75,477 AXS earned). Critically, the SLP Stability Fund created 0.57% annualized deflation in 2024, with more tokens burned than minted for the first time. STEPN's move-to-earn model with GST (unlimited supply, in-game rewards) and GMT (6 billion fixed supply, governance) demonstrated the failure mode—GST reached $8-9 at peak but collapsed due to hyperinflation from oversupply and Chinese market restrictions. The 2023-2024 evolution emphasizes "play-and-own" over "play-to-earn," stake-to-play models where players stake tokens to access features, and fun-first design where games must be enjoyable independent of earning potential. Balanced token sinks—requiring spending for upgrades, breeding, repairs, crafting—prove essential for sustainability.

NFT sales generate revenue through primary and secondary markets

Primary NFT sales include public launches, thematic partnerships, and land drops. The Sandbox's primary LAND sales drove 17.3% quarter-over-quarter growth in Q3 2024, with LAND buyer activity surging 94.11% quarter-over-quarter in Q4 2024. The platform's market cap reached $2.27 billion at December 2024 peak, with only 166,464 LAND parcels ever existing (creating scarcity). The Sandbox's Beta launch generated $1.3M+ in transactions in one day. Axie Infinity's Wings of Nightmare collection in November 2024 drove $4M treasury growth, while breeding mechanics create deflationary pressure (116,079 Axies released for materials, net reduction of 28.5K Axies in 2024). Secondary market royalties provide ongoing revenue through automated smart contracts using the ERC-2981 standard. The Sandbox implements a 5% total fee on secondary sales, split 2.5% to the platform and 2.5% to the original NFT creator, providing continuous creator income. However, marketplace dynamics shifted in 2024 as major platforms (Magic Eden, LooksRare, X2Y2) made royalties optional, reducing creator income significantly from 2022-2024 peaks. OpenSea maintains enforced royalties for new collections using filter registry, while Blur honors 0.5% minimum fees on immutable collections. The lands segment holds over 25% of NFT market revenue (2024's dominant category), with total NFT segments accounting for 77.1% of GameFi usage. This marketplace fragmentation around royalty enforcement creates strategic considerations for which platforms to prioritize.

In-game token economics balance emissions with sinks

Dual-token models dominate successful projects. Axie Infinity's AXS (governance) has fixed supply, staking rewards, governance voting rights, and requirements for breeding/upgrades, while SLP (utility) has unlimited supply earned through gameplay but is burned for breeding and activities, managed by SLP Stability Fund to control inflation. AXS joined Coinbase 50 Index in 2024 as a top gaming token. The Sandbox uses a single-token model (3 billion SAND capped supply, full dilution expected 2026) with multiple utilities: purchasing LAND and assets, staking for passive yields, governance voting, transaction medium, and premium content access. The platform implements 5% fees on all transactions split between platform and creators, with 50% distribution to Foundation (staking rewards, creator funds, P2E prizes) and 50% to Company. Token sinks are critical for sustainability, with effective burn mechanisms including repairs and maintenance (sneaker durability in STEPN), leveling and upgrades (Part Evolution in Axie burned 75,477 AXS), breeding/minting NFT creation costs (StarSharks burns 90% of utility tokens from blind box sales), crafting and combining (Gem/Catalyst systems in The Sandbox), land development (staking DEC in Splinterlands for upgrades), and continuous marketplace fee burns. Splinterlands' 2024 innovation requiring DEC staking for land upgrades creates strong demand. Best practices emerging for 2024-2025 include ensuring token sinks exceed faucets (emissions), time-locked rewards (Illuvium's sILV prevents immediate dumping), seasonal mechanics forcing regular purchases, NFT durability limiting earning potential, and negative-sum PvP where players willingly consume tokens for entertainment.

Transaction fees and marketplace commissions provide predictable revenue

Platform fees vary by game. Axie Infinity charges 4.25% on all in-game purchases (land, NFT trading, breeding) as Sky Mavis's primary monetization source, plus variable breeding costs requiring both AXS and SLP tokens. The Sandbox implements 5% on all marketplace transactions, split 50-50 between platform (2.5%) and NFT creators (2.5%), plus premium NFT sales, subscriptions, and services. Gas fee mitigation became essential as 80% of GameFi platforms incorporated Layer 2 solutions by 2024. Ronin Network (Axie's custom sidechain) provides minimal gas fees through 27 validator nodes, while Polygon integration (The Sandbox) reduced fees significantly. TON blockchain enables minimal fees for Telegram mini-apps (Hamster Kombat, Notcoin), though the trade-off matters—Manta Pacific's Celestia integration reduced gas fees but decreased revenue by 70.2% quarter-over-quarter in Q3 2024 (lower fees increase user activity but reduce protocol revenue). Smart contract fees automate royalty payments (ERC-2981 standard), breeding contract fees, staking/unstaking fees, and land upgrade fees. Marketplace commissions vary: OpenSea charges 2.5% platform fee plus creator royalties (if enforced), Blur charges 0.5% minimum on immutable collections using aggressive zero-fee trading for user acquisition, Magic Eden evolved from enforced to optional royalties with 25% of protocol fees distributed to creators as compromise, while The Sandbox's internal marketplace maintains 5% with 2.5% automatic creator royalty.

Diversified revenue streams reduce reliance on speculation

Land sales dominate with over 25% of NFT market revenue in 2024, representing the fastest-growing digital asset class. The Sandbox's 166,464 capped LAND parcels create scarcity, with developed land enabling creators to earn 95% of SAND revenue while maintaining 2.5% on secondary sales. Corporate interest from JPMorgan, Samsung, Gucci, and Nike established virtual presence, with high-traffic zones commanding premium prices and prime locations generating $5,000+/month in rental income. Breeding fees create token sinks while balancing new NFT supply—Axie's breeding requires AXS + SLP with costs increasing each generation, while Part Evolution requires Axie sacrifices generating 75,477 AXS in treasury revenue. Battle passes and seasonal content drive engagement and revenue. Axie's Bounty Board system (April 2024) and Coinbase Learn and Earn partnership (June 2024) drove 691% increase in Monthly Active Accounts and 80% increase in Origins DAU, while competitive seasons offer AXS prize pools (Season 9: 24,300 AXS total). The Sandbox's Alpha Season 4 in Q4 2024 reached 580,778 unique players, 49 million quests completed, and 1.4 million hours of gameplay, distributing 600,000 SAND to 404 unique creators and running Builders' Challenge with 1.5M SAND prize pool. Sponsorships and partnerships generate significant revenue—The Sandbox has 800+ brand partnerships including Atari, Adidas, Gucci, and Ralph Lauren, with virtual fashion shows and corporate metaverse lounges. Revenue models include licensing fees, sponsored events, and virtual advertising billboards in high-traffic zones.

The scholarship guild model represents a unique revenue stream where guilds own NFTs and lend to players unable to afford entry. Yield Guild Games provided 30,000+ scholarships with standard revenue-sharing of 70% scholar, 20% manager, 10% guild (though some guilds use 50-50 splits). MetaGaming Guild expanded Pixels scholarship from 100 to 1,500 slots using a 70-30 model (70% to scholars hitting 2,000 BERRY daily quota), while GuildFi aggregates scholarships from multiple sources. Guild monetization includes passive income from NFT lending, token appreciation from guild tokens (YGG, GF, etc.), management fees (10-30% of player earnings), and investment returns from early game backing. At 2021 peak, guild leaders earned $20,000+ monthly, enabling life-changing impact in developing nations where scholarship players earn $20/day versus previous $5/day in traditional work.

Major players: Leading projects, platforms, and infrastructure

The GameFi ecosystem consolidated around proven platforms and experienced significant evolution from speculative 2021 peaks toward quality-focused 2024-2025 landscape.

Top games span casual to AAA experiences

Lumiterra leads with 300,000+ daily active unique wallets on Ronin (July 2025), ranking #1 by onchain activity through MMORPG mechanics and MegaDrop campaign. Axie Infinity stabilized around 100,000 daily active unique wallets after pioneering play-to-earn, generating $4+ billion cumulative NFT sales despite losing 95% of users from peak. The dual-token AXS/SLP model and scholarship program defined the industry, though unsustainable tokenomics caused the collapse before 2024 resurgence with improved sustainability. Alien Worlds maintains ~100,000 daily active unique wallets on WAX blockchain through mining-focused metaverse with strong retention, while Boxing Star X by Delabs reaches ~100,000 daily active unique wallets through Telegram Mini-App integration on TON/Kaia chains showing strong growth since April 2025. MapleStory N by Nexon represents traditional gaming entering Web3 with 50,000-80,000 daily active unique wallets on Avalanche's Henesys chain as the biggest 2025 blockchain launch bringing AAA IP credibility. Pixels peaked at 260,000+ daily users at launch with $731M market cap and $1.4B trading volume in February 2024, utilizing dual tokens (PIXEL + BERRY) after migrating from Polygon to Ronin and bringing 87K addresses to the platform. The Sandbox built 5+ million user wallets and 800+ brand partnerships (Atari, Snoop Dogg, Gucci) using SAND token as the leading metaverse platform for user-generated content and virtual real estate. Guild of Guardians on Immutable reached 1+ million pre-registrations and top 10 on iOS/Android stores, driving Immutable's 274% daily unique active wallets increase in May 2024.

The Telegram phenomenon disrupted traditional onboarding with Hamster Kombat reaching 239 MILLION users in 3 months through tap-to-earn mechanics on TON blockchain, though losing 86% post-airdrop (300M to 41M) highlights retention challenges. Notcoin achieved $1.6+ billion market cap as #2 gaming token by market cap with zero crypto onboarding friction, while Catizen built multi-million user base with successful token airdrop. Other notable games include Illuvium (AAA RPG, highly anticipated), Gala Games (multi-game platform), Decentraland (metaverse pioneer with MANA token), Gods Unchained (leading trading card game on Immutable), Off The Grid (console/PC shooter on Gunz chain), Splinterlands (established TCG with 6-year track record on Hive), and Heroes of Mavia (2.6+ million users with 3-token system on Ronin).

Blockchain platforms compete on speed, cost, and developer tools

Ronin Network by Sky Mavis holds #1 gaming blockchain position in 2024 with 836K daily unique active wallets peak, hosting Axie Infinity, Pixels, Lumiterra, and Heroes of Mavia. Purpose-built for gaming with sub-second transactions, low fees, and proven scale, Ronin serves as a migration magnet. Immutable (X + zkEVM) achieved fastest growth at 71% year-over-year, surpassing Ronin in late 2024 with 250,000+ monthly active users, 5.5 million Passport signups, $40M total value locked, 250+ games (most in industry), 181 new games in 2024, and 1.1 million daily transactions (414% quarter-over-quarter growth). The dual solution—Immutable X on StarkWare and zkEVM on Polygon—offers zero gas fees for NFTs, EVM compatibility, best developer tools, and major partnerships (Ubisoft, NetMarble). Polygon Network maintains 550K daily unique active wallets, 220M+ addresses, and 2.48B transactions with Ethereum security, massive ecosystem, corporate partnerships, and multiple scaling solutions providing strong metaverse presence. Solana captures approximately 50% of GameFi application fees in Q1 2025 through highest throughput, lowest costs, fast finality, and trading-focused ecosystem. BNB Chain (+ opBNB) replaced Ethereum as volume leader, with opBNB providing $0.0001 gas fees (lowest) and 97 TPS average (highest), offering cost-effectiveness and strong Asian market presence. TON (The Open Network) integrated with Telegram's 700M+ users enabling Hamster Kombat, Notcoin, and Catizen with zero-friction onboarding, social integration, and viral growth potential. Other platforms include Ethereum (20-30% trading share, Layer 2 foundation), Avalanche (customizable subnets, Henesys chain), NEAR (human-readable accounts), and Gunz (Off The Grid dedicated chain).

Traditional gaming giants and VCs shape the future

Animoca Brands dominates as #1 most active investor with portfolio of 400+ companies, $880M raised over 22 rounds (latest $110M from Temasek, Boyu, GGV), key investments in Axie, Sandbox, OpenSea, Dapper Labs, and Yield Guild Games, plus Animoca Ventures $800M-$1B fund with 38+ investments in 2024 (most active in space). GameFi Ventures based in Hong Kong manages portfolio of 21 companies focusing on seed rounds and co-investing with Animoca, while Andreessen Horowitz (a16z) deployed $40M to CCP Games from multi-billion crypto fund. Other major VCs include Bitkraft (gaming/esports focus), Hashed (South Korea, Asian market), NGC Ventures ($100M Fund III, 246 portfolio companies), Paradigm (infrastructure focus), Infinity Ventures Crypto ($70M fund), Makers Fund, and Kingsway Capital.

Ubisoft leads traditional gaming entry with Champions Tactics: Grimoria Chronicles (October 2024 on Oasys) and Might & Magic: Fates (2025 on Immutable), featuring partnerships with Immutable, Animoca, Oasys, and Starknet. The studio sold 10K Warlords and 75K Champions NFTs (sold out) with potential to leverage 138 million players. Square Enix launched Symbiogenesis (Arbitrum/Polygon, 1,500 NFTs) and Final Fantasy VII NFTs, pursuing "blockchain entertainment/Web3" strategy through Animoca Brands Japan partnership. Nexon delivered MapleStory N as major 2025 launch with 50K-80K daily users, while Epic Games shifted policy to welcome P2E games in late 2024, hosting Gods Unchained and Striker Manager 3. CCP Games (EVE Online) raised $40M (a16z lead) for new AAA EVE Web3 game. Additional activity includes Konami (Project Zircon, Castlevania), NetMarble (Immutable partnership, MARBLEX), Sony PlayStation (exploring Web3), Sega, Bandai Namco (research phase), and The Pokémon Company (exploring). Industry data shows 29 of 40 largest gaming companies exploring Web3.

Infrastructure providers enable ecosystem growth

Immutable Passport leads with 5.5 million signups (industry leading), providing seamless Web3 onboarding and game integration, while MetaMask serves 100M+ users as most popular Ethereum wallet with new Stablecoin Earn feature. Others include Trust Wallet, Coinbase Wallet, Phantom (Solana), and WalletConnect. Enjin SDK provides dedicated NFT blockchain with Unity integration, ENJ token (36.2% staking APY), and comprehensive tools (Wallet, Platform, Marketplace, Beam) plus Efinity Matrixchain for cross-chain functionality. ChainSafe Gaming (web3.unity) offers open-source Unity SDK with C#, C++, Blueprints support as premier Unity-blockchain tool with AAA studio adoption. Venly provides multi-chain wallet API and Unity/Unreal plugins with cross-platform toolkit. Others include Moralis Unity SDK, Stardust (API), Halliday, GameSwift (complete platform), Alchemy (infrastructure), and Thirdweb (smart contracts). Game engines include Unity (most popular for Web3 with SDKs from Enjin, ChainSafe, Moralis, Venly), Unreal Engine (AAA graphics, Epic Games now accepts Web3, Web3.js integration), and Godot (open-source, flexible blockchain integration).

DappRadar serves as industry standard tracking 35+ blockchains, 2,000+ games with real-time rankings as primary discovery platform. Footprint Analytics indexes 20+ blockchains, 2,000+ games with deep on-chain analysis and bot detection (developing), used by CoinMarketCap and DeGame. Nansen provides on-chain intelligence with wallet profiling and regular GameFi reports. DeGame covers 3,106 projects across 55+ blockchains with player-focused discovery. Others include Messari, CryptoSlam, and GameFi.org. Middleware and launchpads include EnjinStarter (80+ successful IDOs, $6 minimum stake, multi-chain support), GameFi.org Launchpad (IDO platform with KYC integrated), and Polygon Studios/Immutable Platform (complete development suites).

Market dynamics and strategic considerations

The GameFi market in 2024-2025 represents a critical inflection point, transitioning from speculative hype toward sustainable product-market fit with clear opportunities and severe challenges requiring strategic navigation.

The shift toward quality and sustainability defines success

The pure play-to-earn model collapsed spectacularly—Axie Infinity's 95% user decline, SLP's 99% crash, and the industry's 93% project failure rate proved that attracting mercenary users seeking quick profits creates unsustainable token economies with hyperinflation and Ponzi-scheme dynamics. The 2024-2025 evolution prioritizes "play-and-earn" and "play-to-own" models where gameplay quality comes first with earning as secondary benefit, entertainment value matters over financial speculation, and long-term engagement trumps extraction mechanics. This shift responds to data showing the top reason players quit is games becoming "too pay-to-win" and that 53% cite poor UX/UI as the biggest barrier. The emerging "Web2.5 mullet" strategy—mainstream free-to-play mechanics and UX on surface with blockchain features abstracted away or hidden, listed in traditional app stores (Apple, Google now allowing certain Web3 games), and onboarding requiring zero crypto knowledge—enables mainstream adoption. AAA quality games with 2-5 year development cycles, indie games with compelling gameplay loops, and traditional gaming studios entering space (Ubisoft, Epic Games, Animoca) represent the maturation of production values to compete with traditional gaming's 3.09 billion players worldwide versus only 4.5 million daily active Web3 gamers.

Massive opportunities exist in underserved segments

True Web2 gamers represent the biggest opportunity—3.09B gamers worldwide versus 4.5M daily active Web3 gamers, with 52% not knowing what blockchain games are and 32% having heard of them but never played. The strategy requires abstracting blockchain away completely, marketing as normal games, and onboarding without requiring crypto knowledge or wallets initially. Mobile-first markets offer untapped potential with 73% of global gaming audience on mobile, Southeast Asia and Latin America being smartphone-first with lower entry barriers, and lower-cost blockchains (Solana, Polygon, opBNB) enabling mobile accessibility. The content creator economy remains underutilized—creator-owned economies with fair royalties, NFT-based asset creation and trading, user-generated content with blockchain ownership, and platforms that enforce creator royalties unlike OpenSea controversies. Subscription and hybrid monetization models address over-reliance on token mints and marketplace fees, with subscription models (à la Coinsub) providing predictable revenue, blending free-to-play + in-app purchases + blockchain rewards, and targeting "whale economy" with staking and premium memberships. Emerging niches include fully on-chain games (all logic and state on blockchain enabled by account abstraction wallets and better infrastructure like Dojo on Starknet and MUD on OP Stack with backing from a16z and Jump Crypto), AI-powered GameFi (50% of new projects expected to leverage AI for personalized experiences, dynamic NPCs, procedural content generation), and genre-specific opportunities in RPGs (best suited for Web3 due to character progression, economies, item ownership) and strategy games (complex economies benefit from blockchain transparency).

Retention crisis and tokenomics failures demand solutions

The 60-90% churn within 30 days defines the existential crisis, with 99% drop-off threshold marking failure per CoinGecko and Hamster Kombat's 86% loss (300M to 41M users) after airdrop exemplifying the problem. Root causes include lack of long-term incentives beyond token speculation, poor gameplay mechanics, unsustainable tokenomics with inflation eroding value, bots and mercenary behavior, and airdrop farming without genuine engagement. Solution pathways require dynamic loot distribution, staking-based rewards, skill-based progression, player-controlled economies via DAOs, and immersive storytelling with compelling game loops. Common tokenomics pitfalls include hyperinflation (excessive token minting crashes value), death spirals (declining players → lower demand → price crash → more players leave), pay-to-win concerns (top reason players quit traditional games), Ponzi dynamics (early adopters profit, late entrants lose), and unsustainable supply (DeFi Kingdoms' JEWEL supply expanded 500% to 500M by mid-2024). Best practices emphasize single-token economies (not dual tokens), fixed supply with deflationary mechanisms, token sinks exceeding token faucets (incentivize keeping assets in-game), tying tokens to narratives/characters/utility not just speculation, and controlling inflation through burning, staking, and crafting requirements.

UX complexity and security vulnerabilities create barriers

Barriers identified in 2024 Blockchain Game Alliance survey show 53% cite poor UX/UI as biggest challenge, 33% cite poor gameplay experiences, and 11% are deterred by wallet setup complexity. Technical literacy requirements include wallets, private keys, gas fees, and DEX navigation. Solutions demand hosted/custodial wallets managed by game (users don't see private keys initially), gasless transactions through Layer 2 solutions, fiat onramps, Web2-style login (email/social), and progressive disclosure of Web3 features. Security risks include smart contract vulnerabilities (immutable code means bugs can't be easily fixed), phishing attacks and private key theft, bridge exploits (Ronin Network $600M hack in 2022), and rug pulls with fraud (decentralized means less oversight). Mitigation requires comprehensive smart contract audits (Beosin, CertiK), bug bounty programs, insurance protocols, user education on wallet security, and multi-sig requirements for treasury. The regulatory landscape remains unclear—CyberKongz litigation classified ERC-20 tokens as securities, China bans GameFi entirely, South Korea bans converting game currency to cash (2004 law), Japan has restrictions, US has bipartisan proposals with mid-2023 legislation expected, and at least 20 countries predicted to have GameFi frameworks by end 2024. Implications require extensive disclosure and KYC, may restrict US participation, necessitate legal teams from day one, demand token design considering securities law, and navigate gambling regulations in some jurisdictions.

Product managers must prioritize execution and community

Web3 product management demands 95/5 execution over vision split (versus Web2's 70/30) because the market moves too fast for long-term strategic planning, vision lives in whitepapers (done by technical architects), speed of iteration matters most, and market conditions change weekly. This means quick specs over Telegram with developers, launch/measure/iterate rapidly, build hype on Twitter/Discord in real-time, QA carefully but ship fast, and remember smart contract audits are critical (can't patch easily). Product managers must wear many hats with ultra-versatile skill sets including user research (Discord, Twitter listening), data analysis (Dune Analytics, on-chain metrics), UX/UI design (sketch flows, tokenomics), partnership/BD (protocol integrations, guilds), marketing (blogs, Twitter, memes), community management (AMAs, Discord moderation), growth hacking (airdrops, quests, referrals), tokenomics design, and understanding regulatory landscape. Teams are small with roles not unbundled like Web2.

Community-first mindset proves essential—success equals thriving community not just revenue metrics, community owns and governs (DAOs), direct interaction expected (Twitter, Discord), transparency paramount (all on-chain), with the maxim "if community fails, you're NGMI (not gonna make it)." Tactics include regular AMAs and town halls, user-generated content programs, creator support (tools, royalties), guild partnerships, governance tokens and voting, plus memes and viral content. Prioritizing fun gameplay is non-negotiable—players must enjoy the game intrinsically, earning is secondary to entertainment, compelling narrative/characters/worlds matter, tight game loops (not tedious grinding), and polish/quality (compete with Web2 AAA). Avoid games that are "spreadsheets with graphics," pure economic simulators, pay-to-win dynamics, and repetitive boring tasks for token rewards. Understanding tokenomics deeply requires critical knowledge of supply/demand dynamics, inflation/deflation mechanisms, token sinks versus faucets, staking/burning/vesting schedules, liquidity pool management, and secondary market dynamics. Security is paramount because smart contracts are immutable (bugs can't be easily fixed), hacks result in permanent loss, every transaction involves funds (wallets don't separate game from finance), and exploits can drain entire treasury—requiring multiple audits, bug bounties, conservative permissions, multi-sig wallets, incident response plans, and user education.

Winning strategies for 2025 and beyond

Successful GameFi products in 2025 will balance gameplay quality above all else (fun over financialization), community engagement and trust (build loyal authentic fan base), sustainable tokenomics (single token, deflationary, utility-driven), abstract blockchain complexity (Web2.5 approach for onboarding), security first (audits, testing, conservative permissions), hybrid monetization (free-to-play + in-app purchases + blockchain rewards), traditional distribution (app stores not just DApp browsers), data discipline (track retention and lifetime value not vanity metrics), speed of execution (ship/learn/iterate faster than competition), and regulatory compliance (legal from day one). Common pitfalls to avoid include tokenomics over gameplay (building DeFi protocol with game graphics), dual/triple token complexity (confusing, hard to balance, inflation-prone), pay-to-win dynamics (top reason players quit), pure play-to-earn model (attracts mercenaries not genuine players), DAO-led development (bureaucracy kills creativity), ignoring Web2 gamers (targeting only 4.5M crypto natives versus 3B gamers), NFT speculation focus (pre-sales without product), poor onboarding (requiring wallet setup and crypto knowledge upfront), insufficient smart contract audits (hacks destroy projects permanently), neglecting security ("approve all" permissions, weak key management), ignoring regulations (legal issues can shut down project), no go-to-market strategy ("build it and they will come" doesn't work), vanity metrics (volume ≠ success; focus on retention/DAU/lifetime value), poor community management (ghosting Discord, ignoring feedback), launching too early (unfinished game kills reputation), fighting platform incumbents (Apple/Google bans isolate you), ignoring fraud/bots (airdrop farmers and Sybil attacks distort metrics), no token sinks (all faucets, no utility equals hyperinflation), and copying Axie Infinity (that model failed; learn from it).

The path forward requires building incredible games first (not financial instruments), using blockchain strategically not dogmatically, making onboarding invisible (Web2.5 approach), designing sustainable economics (single token, deflationary), prioritizing community and trust, moving fast and iterating constantly, securing everything meticulously, and staying compliant with evolving regulations. The $95-200 billion market size projections are achievable—but only if the industry collectively shifts from speculation to substance. The next 18 months will separate genuine innovation from hype, with product managers who combine Web2 gaming expertise with Web3 technical knowledge, execute ruthlessly, and keep players at the center building the defining products of this era. The future of gaming may indeed be decentralized, but it will succeed by being first and foremost fun.

Choosing Cost-Effective Hosting and Blob Storage in 2025

· 4 min read
Dora Noda
Software Engineer

When building modern web apps, choosing the right hosting and storage solutions can drastically affect your costs, performance, and scalability. Recent data shows a wide spectrum of options, from cloud-native providers like AWS and Vercel to decentralized storage platforms like Arweave and IPFS pinning services. Let’s break down the options and derive actionable insights.

Hosting Costs: VPS vs. Managed Cloud vs. Edge Platforms

ProviderCompute (4vCPU + 8GB)Storage (100GB)Bandwidth (1TB)Total / Month (Adjusted)Notes / Risks
Contabo~$12–20~$5–10$0 (within 32TB)~$17–30Depends on VPS/storage choice
AWS~$60–120~$8~$90~$158–218May be lower with reserved/discount
Render~$175$25“included” / or overage~$200 + overageBandwidth terms need confirmation
Vercel$20 + function usageIncluded / KV storageOverage up to $0.40/GB~$100–300+Overage bandwidth costs can be high
Netlify$20 + build/function feesIncludedOverage ~$0.09/GB+~$100–200+Bandwidth/build cost risk higher
Cloudflare~$5 + overage request fees~$0.015/GB (R2)$0 egress~$10–20Extremely cost-efficient on bandwidth

Insights:

  1. For budget-conscious startups: Contabo or Cloudflare can dramatically reduce monthly costs. Contabo gives you raw VPS flexibility, whereas Cloudflare offers high bandwidth efficiency with minimal cost.
  2. For production-ready apps: AWS, Render, or Vercel provide managed infrastructure and easier scaling, but careful monitoring of bandwidth and function usage is crucial.
  3. Bandwidth matters: If your app serves large media files, Cloudflare or Backblaze/Cloudflare R2 storage can save you hundreds per month compared to AWS egress fees.

Blob Storage: Traditional vs. Decentralized

ServicePricing modelStorage price (USD per TB‑month)Key notes
Amazon S3 (Standard, us‑east‑1)Pay‑as‑you‑go$23.00 (first 50 TB)$0.023/GB‑month (tiered). AWS bills in GiB; that’s $23.55/TiB‑month. Egress & requests are extra.
Wasabi (Hot Cloud Storage)Pay‑as‑you‑go$6.99Flat rate $6.99/TB‑month (~$0.0068/GB). No egress or API request fees.
Pinata (IPFS pinning)Plan$20.00 (included 1 TB on Picnic)Picnic plan: 1 TB included for $20/mo, +$0.07/GB overage (=$70/TB). Fiesta: 5 TB for $100/mo (=$20/TB), +$0.035/GB overage (=$35/TB). Bandwidth & request quotas apply.
Arweave (permanent)One‑time≈ $12,081 per TB (once)Calculator example: ~2033.87 AR/TB at AR≈$5.94. If you amortize: ≈$1,006/TB‑mo over 1 yr; ≈$201/TB‑mo over 5 yrs; ≈$101/TB‑mo over 10 yrs. Model is “pay once for ~200 years.” Prices vary with AR & fee market.
Walrus (example via Tusky app)Plan$80.00Tusky “Pro 1000” lists 1 TB for $80/mo (≈$64/mo on annual, –20%). Network‑level prices may differ; this is an app’s retail price on Walrus.
Cloudflare R2 (Standard)Pay‑as‑you‑go$15.00$0.015/GB‑month. No egress fees; operations are billed. Infrequent Access tier is $10/TB‑mo.
Backblaze B2Pay‑as‑you‑go$6.00$6/TB‑mo, free egress up to 3× your stored data/month. Requests billed.
StorjPay‑as‑you‑go$6.00$6/TB‑mo storage, $0.02/GB egress, and a $5 minimum monthly usage fee (as of Jul 1 2025).

Insights:

  1. For cost-efficiency: Wasabi, Backblaze B2, or Storj are ideal for cloud storage-heavy applications without high egress.
  2. For bandwidth-heavy applications: Cloudflare R2 shines because it eliminates egress fees.
  3. For decentralized or permanent storage needs: Arweave or Pinata offer unique models but come with high upfront costs or ongoing quotas.
  4. Predictable vs. variable pricing: Services like Wasabi offer flat rates, whereas AWS and Cloudflare R2 are usage-based. Predictable pricing can simplify budgeting.

Combined Hosting + Storage Strategy

  • Small projects or MVPs: Contabo + Wasabi or Cloudflare R2 — minimal costs, simple management.
  • Serverless apps or SaaS products: Vercel/Netlify + Cloudflare R2 — optimized for frontend-heavy applications with function usage.
  • Web3 or decentralized apps: Pinata/IPFS or Arweave — balances decentralization with cost depending on permanence and bandwidth.
  • High-bandwidth media apps: Cloudflare Workers + R2 — avoid AWS bandwidth overages.

Key Takeaways

  1. Bandwidth is often a hidden cost—optimize storage location and hosting provider for your traffic patterns.
  2. Flat-rate storage options (Wasabi, Backblaze, Storj) simplify budgeting for startups.
  3. Managed platforms (AWS, Vercel, Render) provide scalability but can be costly for traffic-heavy apps.
  4. Decentralized/permanent storage (Arweave, Pinata) is a niche but increasingly relevant for Web3 applications.

In 2025, the right combination of hosting and storage depends heavily on your usage pattern. For MVPs, Contabo or Cloudflare R2 keeps costs low. For SaaS, function-driven platforms plus egress-free storage maximize scalability without shocking bills. And for Web3, permanent storage may justify high upfront costs for long-term value.

What are Prediction Markets? Mechanisms, Impact, and Opportunities

· 10 min read
Dora Noda
Software Engineer

Prediction markets (the term favored in research and enterprise contexts) and betting markets (the more common consumer framing) are two sides of the same coin. Both allow participants to trade contracts whose final value is determined by the outcome of a future event. In the U.S. regulatory framework, these are broadly referred to as event contracts—financial derivatives with a payoff tied to a specific, observable event or value, such as an inflation report, a storm's intensity, or an election result.

The most common format is the binary contract. In this structure, a "Yes" share will settle to \$1 if the event happens and \$0 if it does not. The market price of this "Yes" share can be interpreted as the collective's estimated probability of the event occurring. For example, if a "Yes" share is trading at \$0.63, the market is signaling an approximate 63% chance that the event will happen.

Types of Contracts

  • Binary: A simple Yes/No question about a single outcome. Example: “Will the BLS report Core CPI YoY be ≥ 3.0% for December 2025?”
  • Categorical: A market with multiple, mutually exclusive outcomes where only one can be the winner. Example: “Who will win the election for Mayor of New York City?” with options for each candidate.
  • Scalar: A market where the outcome is on a continuous spectrum, often with payouts bucketed into ranges or determined by a linear formula. Example: “How many interest rate cuts will the Federal Reserve announce in 2026?”

Reading Prices

If a binary contract's "Yes" share, which pays out \$1, is trading at price pp, then the implied probability is approximately pp, and the odds are p/(1p)p / (1-p). In a categorical market with multiple outcomes, the prices of all shares should sum to approximately \$1 (deviations are usually due to trading fees or liquidity spreads).

Why do these markets matter?

Beyond simple speculation, well-designed prediction markets serve valuable functions:

  • Information Aggregation: Markets can synthesize vast amounts of dispersed knowledge into a single, real-time price signal. Studies have shown they often outperform simple benchmarks, and sometimes even traditional polls, when the questions are well-specified and the market has adequate liquidity.
  • Operational Value: Corporations have successfully used internal prediction markets to forecast product launch dates, project demand, and assess the risk of meeting quarterly objectives (OKRs). The academic literature highlights both their strengths and potential for behavioral biases, like optimism in "house" markets.
  • Public Forecasting: Long-running academic and policy programs, such as the Iowa Electronic Markets (IEM) and the non-market forecasting platform Good Judgment, demonstrate that careful question design and proper incentives can produce highly useful data for decision support.

Market Design: Three Core Mechanics

The engine of a prediction market can be built in several ways, each with distinct characteristics.

1) Central Limit Order Books (CLOB)

  • How it works: This is the classic exchange model where traders post "limit" orders to buy or sell at specific prices. An engine matches buy and sell orders, creating a market price and visible order depth. Early on-chain systems like Augur utilized order books.
  • Pros: Familiar price discovery for experienced traders.
  • Cons: Can suffer from thin liquidity without dedicated market makers to constantly provide bids and asks.

2) LMSR (Logarithmic Market Scoring Rule)

  • Idea: Developed by economist Robin Hanson, the LMSR is a cost function-based automated market maker that always quotes prices for all outcomes. A parameter, bb, controls the market's depth or liquidity. Prices are derived from the gradient of the cost function: C(mathbfq)=blnsum_ieq_i/bC(\\mathbf{q})=b\\ln\\sum\_i e^{q\_i/b}.
  • Why it’s used: It offers elegant mathematical properties, bounded loss for the market maker, and gracefully supports markets with many outcomes.
  • Cons: Can be computationally intensive and therefore gas-heavy to implement directly on-chain.

3) FPMM/CPMM (Fixed/Constant Product AMM)

  • Idea: This model adapts the popular constant product formula (xtimesy=kx \\times y = k) from DEXs like Uniswap to prediction markets. A pool is created with tokens representing each outcome (e.g., YES tokens and NO tokens), and the AMM provides continuous price quotes.
  • Where used: Gnosis's Omen platform pioneered the use of the FPMM for conditional tokens. It is practical, relatively gas-efficient, and simple for developers to integrate.

Examples and the Current U.S. Landscape (August 2025 Snapshot)

  • Kalshi (U.S. DCM): A federally regulated exchange (Designated Contract Market) that lists a variety of event contracts. After favorable district and appellate court rulings in 2024 and the CFTC's subsequent decision to drop its appeal in 2025, Kalshi has been able to list certain political and other event contracts, though the space remains subject to ongoing policy debates and some state-level challenges.
  • QCX LLC d/b/a Polymarket US (U.S. DCM): On July 9, 2025, the CFTC designated QCX LLC as a Designated Contract Market. Filings indicate the company will operate under the assumed name "Polymarket US." This creates a regulated pathway for U.S. users to access event contracts, complementing Polymarket's global on-chain platform.
  • Polymarket (Global, On-chain): A leading decentralized platform that uses the Gnosis Conditional Token Framework (CTF) to create binary outcome tokens (ERC-1155). Historically, it blocked U.S. users following a 2022 settlement with the CFTC, but it is now moving toward a regulated U.S. presence via QCX.
  • Omen (Gnosis/CTF): A fully on-chain prediction market platform built on the Gnosis stack, using an FPMM mechanism with conditional tokens. It relies on community governance and decentralized arbitration services like Kleros for resolution.
  • Iowa Electronic Markets (IEM): A long-running, university-operated market for academic research and teaching, using small stakes. It serves as a valuable academic baseline for market accuracy.
  • Manifold: A popular "play-money" social prediction market site. It is an excellent environment for experimenting with question design, observing user experience patterns, and fostering community engagement without financial risk.

Note on Regulation: The landscape is evolving. In May 2024, the CFTC issued a proposed rule that sought to categorically prohibit certain event contracts (related to elections, sports, and awards) from being listed on CFTC-registered venues. This proposal sparked an active debate that overlapped with the Kalshi litigation and subsequent agency actions. Builders and users should always check the current rules.

Under the Hood: From Question to Settlement

Building a prediction market involves several key steps:

  1. Question Design: The foundation of any good market is a well-phrased question. It must be a clear, testable prompt with an unambiguous resolution date, time, and data source. For example: “Will the Bureau of Labor Statistics report Core CPI ≥ 3.0% YoY for December 2025 in its first official release?” Avoid compound questions and subjective outcomes.
  2. Resolution: How will the truth be determined?
  • Centralized Resolver: The platform operator declares the outcome based on the pre-specified source. This is fast but requires trust.
  • On-chain Oracle/Dispute: The outcome is determined by a decentralized oracle, with a dispute process (like community arbitration or token-holder voting games) as a backstop. This offers credible neutrality.
  1. Mechanism: Which engine will power the market?
  • Order book: Best if you have dedicated market-making partners who can ensure tight spreads.
  • AMM (FPMM/CPMM): Ideal for "always-on" liquidity and simpler on-chain integration.
  • LMSR: A strong choice for multi-outcome markets, but requires managing gas/compute costs (often via off-chain computation or an L2).
  1. Collateral & Tokens: On-chain designs often use the Gnosis Conditional Token Framework, which tokenizes each potential outcome (e.g., YES and NO) as distinct ERC-1155 assets. This makes settlement, portfolio management, and composability with other DeFi protocols straightforward.

How accurate are these markets, really?

A large body of evidence across many domains shows that market-generated forecasts are typically quite accurate and often outperform moderate benchmarks. Corporate prediction markets have also been shown to add value, though they can sometimes exhibit house-specific biases.

It's also important to note that forecasting platforms without financial markets, like Metaculus, can also produce highly accurate results when incentives and aggregation methods are well-designed. They are a useful sibling to markets, especially for long-horizon questions or topics that are difficult to resolve cleanly.

Risks and Failure Modes

  • Resolution Risk: The market's outcome can be compromised by ambiguous question wording, unexpected data revisions from a source, or a disputed result.
  • Liquidity & Manipulation: Thinly traded markets are fragile and their prices can be easily moved by large trades.
  • Over-interpretation: Prices reflect probabilities, not certainties. Always account for trading fees, bid-ask spreads, and the depth of liquidity before drawing strong conclusions.
  • Compliance Risk: This is a heavily regulated space. In the U.S., only CFTC-regulated venues may legally offer event contracts to U.S. persons. Platforms operating without proper registration have faced enforcement actions. Always check local laws.

For Builders: A Practical Checklist

  1. Start with the Question: It must be a single, falsifiable claim. Specify who, what, when, and the exact resolution source.
  2. Choose a Mechanism: Order book (if you have makers), FPMM/CPMM (for set-and-forget liquidity), or LMSR (for multi-outcome clarity, minding compute costs).
  3. Define Resolution: Will it be a fast centralized resolver or a credibly neutral on-chain oracle with a dispute process?
  4. Bootstrap Liquidity: Seed the market with initial depth. Consider offering incentives, fee rebates, or working with targeted market makers.
  5. Instrument the UX: Clearly display the implied probability. Expose the bid/ask spread and liquidity depth, and warn users about low-liquidity markets.
  6. Plan Governance: Define an appeals window, require dispute bonds, and establish emergency procedures for handling bad data or unforeseen events.
  7. Integrate Cleanly: For on-chain builds, the Gnosis Conditional Tokens + FPMM combination is a proven path. For off-chain applications, use a regulated venue’s API where permitted.
  8. Mind Compliance: Keep a close watch on the CFTC’s evolving rulemaking on event contracts and any relevant state-level regulations.

Glossary

  • Event Contract (U.S. term): A derivative whose payoff is contingent on the outcome of a specified event; often binary (Yes/No).
  • LMSR: Logarithmic Market Scoring Rule, a type of AMM known for its bounded loss properties.
  • FPMM/CPMM: Fixed/Constant Product Market Maker, an AMM model adapted from DEXs for trading outcome tokens.
  • Conditional Tokens (CTF): A Gnosis-developed framework for issuing ERC-1155 tokens that represent positions in an outcome, enabling composable settlement.

Responsible Use & Disclaimer

Nothing in this article constitutes legal, tax, or investment advice. In many jurisdictions, event contracts are closely regulated and may be treated as a form of gaming. In the U.S., it is critical to review CFTC rules and any state-level positions and to use registered venues where required.

Further Reading (Selected)

User Feedback on Alchemy: Insights and Opportunities

· 6 min read
Dora Noda
Software Engineer

Alchemy is a dominant force in the Web3 infrastructure space, serving as the entry point for thousands of developers and major projects like OpenSea. By analyzing public user feedback from platforms like G2, Reddit, and GitHub, we can gain a clear picture of what developers value, where they struggle, and what the future of Web3 development experience could look like. This isn't just about one provider; it's a reflection of the entire ecosystem's maturing needs.

What Users Consistently Like

Across review sites and forums, users consistently praise Alchemy for several key strengths that have cemented its market position.

  • Effortless "On-ramp" & Ease of Use: Beginners and small teams celebrate how quickly they can get started. G2 reviews frequently highlight it as a "great platform to build Web3," praising its easy configuration and comprehensive documentation. It successfully abstracts away the complexity of running a node.
  • Centralized Dashboard & Tooling: Developers value having a single "command center" for observability. The ability to monitor request logs, view analytics, set up alerts, and rotate API keys in one dashboard is a significant user experience win.
  • Intelligent SDK Defaults: The Alchemy SDK handles request retries and exponential backoff by default. This small but crucial feature saves developers from writing boilerplate logic and lowers the friction of building resilient applications.
  • Reputation for Strong Support: In the often-complex world of blockchain development, responsive support is a major differentiator. Aggregate review sites like TrustRadius frequently cite Alchemy's helpful support team as a key benefit.
  • Social Proof and Trust: By showcasing case studies with giants like OpenSea and securing strong partner endorsements, Alchemy provides reassurance to teams who are choosing a managed RPC provider.

The Main Pain Points

Despite the positives, developers run into recurring challenges, especially as their applications begin to scale. These pain points reveal critical opportunities for improvement.

  • The "Invisible Wall" of Throughput Limits: The most common frustration is hitting 429 Too Many Requests errors. Developers encounter these when forking mainnet for testing, deploying in bursts, or serving a handful of simultaneous users. This creates confusion, especially on paid tiers, as users feel throttled during critical spikes. The impact is broken CI/CD pipelines and flaky tests, forcing developers to manually implement sleep commands or backoff logic.
  • Perception of Low Concurrency: On forums like Reddit, a common anecdote is that lower-tier plans can only handle a few concurrent users before rate limiting kicks in. Whether this is strictly accurate or workload-dependent, the perception drives teams to consider more complex multi-provider setups or upgrade sooner than expected.
  • Timeouts on Heavy Queries: Intensive JSON-RPC calls, particularly eth_getLogs, can lead to timeouts or 500 errors. This not only disrupts the client-side experience but can crash local development tools like Foundry and Anvil, leading to lost productivity.
  • SDK and Provider Confusion: Newcomers often face a learning curve regarding the scope of a node provider. For instance, questions on Stack Overflow show confusion when eth_sendTransaction fails, not realizing that providers like Alchemy don't hold private keys. Opaque errors from misconfigured API keys or URLs also present a hurdle for those new to the ecosystem.
  • Data Privacy and Centralization Concerns: A vocal subset of developers expresses a preference for self-hosted or privacy-focused RPCs. They cite concerns about large, centralized providers logging IP addresses and potentially censoring transactions, highlighting that trust and transparency are paramount.
  • Product Breadth and Roadmap: Comparative reviews on G2 sometimes suggest that competitors are expanding faster into new ecosystems or that Alchemy is "busy focused on a couple chains." This can create an expectation mismatch for teams building on non-EVM chains.

Where Developer Expectations Break

These pain points often surface at predictable moments in the development lifecycle:

  1. Prototype to Testnet: A project that works perfectly on a developer's machine suddenly fails in a CI/CD environment when tests run in parallel, hitting throughput limits.
  2. Local Forking: Developers using Hardhat or Foundry to fork mainnet for realistic testing are often the first to report 429 errors and timeouts from mass data queries.
  3. NFT/Data APIs at Scale: Minting events or loading data for large NFT collections can easily overwhelm default rate limits, forcing developers to search for best practices on caching and batching.

Uncovering the Core "Jobs-to-be-Done"

Distilling this feedback reveals three fundamental needs of Web3 developers:

  • "Give me a single pane of glass to observe and debug." This job is well-served by Alchemy's dashboard.
  • "Make my bursty workloads predictable and manageable." Developers accept limits but need smoother handling of spikes, better defaults, and code-level scaffolds that work out-of-the-box.
  • "Help me stay unblocked during incidents." When things go wrong, developers need clear status updates, actionable post-mortems, and easy-to-implement failover patterns.

Actionable Opportunities for a Better DX

Based on this analysis, any infrastructure provider could enhance its offering by tackling these opportunities:

  • Proactive "Throughput Coach": An in-dashboard or CLI tool that simulates a planned workload, predicts when CU/s (Compute Units per second) limits might be hit, and auto-generates correctly configured retry/backoff snippets for popular libraries like ethers.js, viem, Hardhat, and Foundry.
  • Golden-Path Templates: Provide ready-made, production-grade templates for common pain points, such as a Hardhat network config for forking mainnet with conservative concurrency, or sample code for efficiently batching eth_getLogs calls with pagination.
  • Adaptive Burst Capacity: Offer "burst credits" or an elastic capacity model on paid tiers to better handle short-term spikes in traffic. This would directly address the feeling of being unnecessarily constrained.
  • Official Multi-Provider Failover Guides: Acknowledge that resilient dApps use multiple RPCs. Providing opinionated recipes and sample code for failing over to a backup provider would build trust and align with real-world best practices.
  • Radical Transparency: Directly address privacy and censorship concerns with clear, accessible documentation on data retention policies, what is logged, and any filtering that occurs.
  • Actionable Incident Reports: Go beyond a simple status page. When an incident occurs (like the EU region latency on Aug 5-6, 2025), pair it with a short Root Cause Analysis (RCA) and concrete advice, such as "what you can do now to mitigate."

Conclusion: A Roadmap for Web3 Infrastructure

The user feedback on Alchemy provides a valuable roadmap for the entire Web3 infrastructure space. While the platform excels at simplifying the onboarding experience, the challenges users face with scaling, predictability, and transparency point to the next frontier of developer experience.

As the industry matures, the winning platforms will be those that not only provide reliable access but also empower developers with the tools and guidance to build resilient, scalable, and trustworthy applications from day one.

Camp Network: The Blockchain Tackling AI's Billion-Dollar IP Problem 🏕️

· 5 min read
Dora Noda
Software Engineer

The rise of generative AI has been nothing short of explosive. From stunning digital art to human-like text, AI is creating content at an unprecedented scale. But this boom has a dark side: where does the AI get its training data? Often, it's from the vast expanse of the internet—from art, music, and writing created by humans who receive no credit or compensation.

Enter Camp Network, a new blockchain project that aims to solve this fundamental problem. It’s not just another crypto platform; it's a purpose-built "Autonomous IP Layer" designed to give creators ownership and control over their work in the age of AI. Let's dive into what makes Camp Network a project to watch.


What's the Big Idea?

At its core, Camp Network is a blockchain that acts as a global, verifiable registry for intellectual property (IP). The mission is to allow anyone—from an independent artist to a social media user—to register their content on-chain. This creates a permanent, tamper-proof record of ownership and provenance.

Why does this matter? When an AI model uses content registered on Camp, the network's smart contracts can automatically enforce licensing terms. This means the original creator can get attribution and even receive royalty payments instantly. Camp's vision is to build a new creator economy where compensation isn't an afterthought; it's built directly into the protocol.


Under the Hood: The Technology Stack

Camp isn't just a concept; it's backed by some serious tech designed for high performance and developer-friendliness.

  • Modular Architecture: Camp is built as a sovereign rollup using Celestia for data availability. This design allows it to be incredibly fast (targeting ~50,000 transactions per second) and cheap, while remaining fully compatible with Ethereum's tools (EVM).
  • Proof of Provenance (PoP): This is Camp's unique consensus mechanism. Instead of relying on energy-intensive mining, the network's security is tied to verifying the origin of content. Every transaction reinforces the provenance of the IP on the network, making ownership "enforceable by design."
  • Dual-VM Strategy: To maximize performance, Camp is integrating the Solana Virtual Machine (SVM) alongside its EVM compatibility. This allows developers to choose the best environment for their app, especially for high-throughput use cases like real-time AI interactions.
  • Creator & AI Toolkits: Camp provides two key frameworks:
    • Origin Framework: A user-friendly system for creators to register their IP, tokenize it (as an NFT), and embed licensing rules.
    • mAItrix Framework: A toolkit for developers to build and deploy AI agents that can interact with the on-chain IP in a secure, permissioned way.

People, Partnerships, and Progress

An idea is only as good as its execution, and Camp appears to be executing well.

The Team and Funding

The project is led by a team with a potent mix of experience from The Raine Group (media & IP deals), Goldman Sachs, Figma, and CoinList. This blend of finance, tech product, and crypto engineering expertise has helped them secure $30 million in funding from top VCs like 1kx, Blockchain Capital, and Maven 11.

A Growing Ecosystem

Camp has been aggressive in building partnerships. The most significant is a strategic stake in KOR Protocol, a platform for tokenizing music IP that works with major artists like Deadmau5 and franchises like Black Mirror. This single partnership bootstraps Camp with a massive library of high-profile, rights-cleared content. Other key collaborators include:

  • RewardedTV: A decentralized video streaming platform using Camp for on-chain content rights.
  • Rarible: An NFT marketplace integrated for trading IP assets.
  • LayerZero: A cross-chain protocol to ensure interoperability with other blockchains.

Roadmap and Community

After successful incentivized testnet campaigns that attracted tens of thousands of users (rewarding them with points set to convert to tokens), Camp is targeting a mainnet launch in Q3 2025. This will be accompanied by a Token Generation Event for its native token, $CAMP, which will be used for gas fees, staking, and governance. The project has already cultivated a passionate community eager to build on and use the platform from day one.


How Does It Compare?

Camp Network isn't alone in this space. It faces stiff competition from projects like the a16z-backed Story Protocol and the Sony-linked Soneium. However, Camp differentiates itself in several key ways:

  1. Bottom-Up Approach: While competitors seem to target large corporate IP holders, Camp is focused on empowering independent creators and crypto communities through token incentives.
  2. Comprehensive Solution: It offers a full suite of tools, from an IP registry to an AI agent framework, positioning itself as a one-stop shop.
  3. Performance and Scalability: Its modular architecture and dual-VM support are designed for the high-throughput demands of AI and media.

The Takeaway

Camp Network is making a compelling case to become the foundational layer for intellectual property in the Web3 era. By combining innovative technology, a strong team, strategic partnerships, and a community-first ethos, it’s building a practical solution to one of the most pressing issues created by generative AI.

The real test will come with the mainnet launch and real-world adoption. But with a clear vision and strong execution so far, Camp Network is undoubtedly a key project to watch as it attempts to build a more equitable future for digital creators.

Web3 Hackathons, Done Right: A Pragmatic Playbook for 2025

· 12 min read
Dora Noda
Software Engineer

If you want a fast route to sharpen your skills, meet co-founders, and pressure-test an idea, few environments beat a web3 hackathon. But the difference between a “fun weekend” and a “career-changing launch” is a plan.

This guide gives you a concrete, builder-first playbook: how to pick the right event, prep smart, build fast, and present with clarity—plus checklists you can copy-paste into your next hack.

TL;DR

  • Pick events intentionally. Favor ecosystems you already ship in—or ones with judges and sponsors who are perfectly aligned with your idea.
  • Decide your win condition. Are you there for learning, a specific bounty, or a finalist spot? Each choice changes your team, scope, and stack.
  • Pre-bake the boring stuff. Have your project scaffolds, auth flows, wallet connections, design system, and a demo script outline ready before the clock starts.
  • Build the smallest lovable demo. Show one killer feature loop working end-to-end. Everything else is just narrative and slides.
  • Submit like a pro. Respect the “start fresh” rules, formally register for every bounty track you target, and reserve significant time for a tight video and a clear README.

Why web3 hackathons are worth your weekend

  • Compressed learning: In a single weekend, you’ll touch infrastructure, smart contracts, front-end UX, and deployment pipelines. It’s a full development cycle in 48 hours—a learning curve that would normally take months.
  • High-signal networking: The mentors, judges, and sponsor engineers aren't just names on a website; they are concentrated in one room or Discord server, ready to give feedback. This is your chance to connect with the core developers of the protocols you use every day.
  • Real funding paths: This isn't just for bragging rights. Prize pools and follow-on grants can provide meaningful capital to keep a project going. Events like Solana’s Summer Camp have offered up to $5M in prizes and seed funding, turning weekend projects into viable startups.
  • A portfolio of proof: A public GitHub repository with a functional demo is infinitely more valuable than a bullet point on a résumé. It's tangible proof that you can build, ship, and articulate an idea under pressure.

Where to find the good ones

  • ETHGlobal: The gold standard for both in-person and asynchronous events. They feature robust judging processes, high-quality participants, and public project showcases that are perfect for inspiration.
  • Devpost: A broad marketplace for all kinds of hackathons, with strong filters for blockchain, specific protocols, and prize tracks. It's a great place to discover ecosystem-specific events.
  • DoraHacks: A platform focused on ecosystem-driven web3 hackathons and grant rounds, often with a global and community-centric feel.

Tip: Durations vary widely. A long-form async event like ETHOnline runs for multiple weeks, while an extended in-person sprint like ETHDenver’s #BUIDLathon can last up to nine days. You must plan your project’s scope accordingly.


Decode the rules (so you don’t DQ yourself)

  • “Start Fresh.” This is the most common and critical rule. Most events require that all substantial work begins after the official kickoff. Using older, pre-written code for core logic can get you disqualified from finals and partner prizes. Boilerplate is usually fine, but the secret sauce has to be new.
  • Judging structure. Understand the funnel. Often, an async screening round narrows hundreds of projects down to a finalist pool before live judging begins. Knowing this helps you focus on making your submission video and README as clear as possible for that first cut.
  • Team sizing. Don't show up with a team of ten. Many events set limits, such as the typical 2–4 person teams seen at ETHDenver. This ensures a level playing field and encourages tight collaboration.
  • Bounty mechanics. You can’t win a prize you didn’t register for. If you’re targeting sponsor bounties, you often must formally enroll your project for each specific prize through the event platform. This is a simple step that many teams forget.

Judging rubric: what “good” looks like

Across major organizers, judges are typically evaluating projects across four recurring buckets. Design your scope and demo to score points in each.

  • Technicality: Is the problem non-trivial? Does the solution involve a clever or elegant use of technology? Did you go beyond a simple front-end wrapper on a single smart contract?
  • Originality: Is there a novel mechanism, a unique user experience, or a clever remix of existing primitives? Have we seen this a hundred times before, or does it present a fresh take?
  • Practicality: Can someone use this today? A complete, end-to-end user journey, even if narrow, matters far more than a project with broad but half-finished features.
  • Usability (UI/UX/DX): Is the interface clear, fast, and pleasant to use? For developer tools, how good is the developer experience? A smooth onboarding and clear error handling can set you apart.

Team design: small, sharp, complementary

For speed and alignment, a team of two to four is the sweet spot. It's large enough to parallelize work but small enough to make decisions without endless debate.

  • Smart contracts / protocol: Owns the on-chain logic. Responsible for writing, testing, and deploying the contracts.
  • Front-end / DX: Builds the user interface. Manages wallet connections, data fetching, error states, and the final demo polish that makes the project feel real.
  • Product / story: The scope keeper and narrator. This person ensures the team stays focused on the core loop, writes the project description, and runs the final demo.
  • (Optional) Designer: A dedicated designer can be a secret weapon, preparing components, icons, and micro-interactions that elevate the project's perceived quality.

Idea selection: the P-A-C-E filter

Use this simple filter to pressure-test your ideas before writing a single line of code.

  • Pain: Does this solve a real developer or user pain point? Think wallet UX, data indexing, MEV protection, or fee abstraction. Avoid solutions looking for a problem.
  • Atomicity: Can you build and demo a single, atomic loop end-to-end in 48 hours? Not the whole vision—just one complete, satisfying user action.
  • Composable: Does your idea lean on existing primitives like oracles, account abstraction, or cross-chain messaging? Using battle-tested lego blocks helps you go further, faster.
  • Ecosystem fit: Is your project visible and relevant to the event’s judges, sponsors, and audience? Don’t pitch a complex DeFi protocol on a gaming-focused track.

If you’re bounty-driven, pick one primary and one secondary sponsor track. Spreading your focus across too many bounties dilutes your depth and chances of winning any of them.


Default stacks that fight you less

Your novelty should be in what you build, not how you build it. Stick to boring, reliable technology.

EVM track (fast path)

  • Contracts: Foundry (for its speed in testing, scripting, and running a local node).
  • Front-end: Next.js or Vite, combined with wagmi or viem and a wallet kit like RainbowKit or ConnectKit for modals and connectors.
  • Data/indexing: A hosted indexer or subgraph service if you need to query historical data. Avoid running your own infrastructure.
  • Off-chain triggers: A simple job runner or a dedicated automation service.
  • Storage: IPFS or Filecoin for assets and metadata; a simple KV store for session state.

Solana track (fast path)

  • Programs: Anchor (to cut down on boilerplate and benefit from safer defaults).
  • Client: React or a mobile framework with the Solana Mobile SDKs. Use simple hooks for RPC and program calls.
  • Data: Rely on direct RPC calls or ecosystem indexers. Cache aggressively to keep the UI snappy.
  • Storage: Arweave or IPFS for permanent asset storage if relevant.

A realistic 48-hour plan

T-24 to T-0 (before kickoff)

  • Align on your win condition (learning, bounty, finals) and target track(s).
  • Sketch the full demo loop on paper or a whiteboard. Know exactly what you’ll click and what should happen on-chain and off-chain at each step.
  • Fork a clean monorepo scaffold that includes boilerplate for both your contracts and your front-end app.
  • Pre-write your README outline and a rough draft of your demo script.

Hour 0–6

  • Validate your scope with event mentors and sponsors. Confirm the bounty criteria and ensure your idea is a good fit.
  • Set hard constraints: one chain, one primary use-case, and one "wow" moment for the demo.
  • Divide the work into 90-minute sprints. Your goal is to ship the first full vertical slice of your core loop by Hour 6.

Hour 6–24

  • Harden the critical path. Test both the happy path and common edge cases.
  • Add observability. Implement basic logs, UI toasts, and error boundaries so you can debug quickly.
  • Create a minimal landing page that clearly explains the "why" behind your project.

Hour 24–40

  • Record a backup demo video as soon as the core feature is stable. Do not wait until the last minute.
  • Start writing and editing your final submission text, video, and README.
  • If time permits, add one or two thoughtful flourishes, like great empty states, a gasless transaction, or a helpful code snippet in your docs.

Hour 40–48

  • Freeze all features. No more new code.
  • Finalize your video and submission package. Experienced winners often recommend reserving ~15% of your total time for polish and creating a video with a clear 60/40 split between explaining the problem and demoing the solution.

Demo & submission: make judges’ jobs easy

  • Open with the “why.” Start your video and README with a single sentence explaining the problem and your solution’s outcome.
  • Live the loop. Show, don't just tell. Walk through a single, credible user journey from start to finish without skipping steps.
  • Narrate your constraints. Acknowledge what you didn't build and why. Saying, “We scoped this to a single use case to ensure real users can complete the flow today,” shows focus and maturity.
  • Leave clear markers. Your README should have an architecture diagram, links to your live demo and deployed contracts, and simple, one-click steps to run the project locally.
  • Video basics. Plan your video early, script it tightly, and ensure it clearly highlights what the project does, what problem it solves, and how it works under the hood.

Bounties without burnout

  • Register for each prize you target. On some platforms, this involves an explicit “Start Work” button click.
  • Don’t chase more than two sponsor bounties unless their technologies naturally overlap in your stack.
  • In your submission, mirror their rubric. Use their keywords, reference their APIs by name, and explain how you met their specific success metrics.

After the hackathon: turn momentum into traction

  • Publish a short blog post and a social media thread with your demo link and GitHub repository. Tag the event and sponsors.
  • Apply to grants and accelerator rounds that are specifically designed for hackathon alumni and early-stage open-source projects.
  • If the reception is strong, create a simple one-week roadmap focused on bug fixes, a UX pass, and a tiny pilot with a few users. Set a hard date for a v0.1 release to maintain momentum.

Common pitfalls (and the fix)

  • Breaking “start fresh” rules. The fix: Keep any prior code completely out of scope or declare it explicitly as a pre-existing library you’re using.
  • Over-scoping. The fix: If your planned demo has three major steps, cut one. Be ruthless about focusing on the core loop.
  • Going multi-chain too early. The fix: Ship on one chain perfectly. Talk about your plans for bridges and cross-chain support in the "What's next" section of your README.
  • The last-minute polish tax. The fix: Pre-allocate a 4-6 hour block at the end of the hackathon exclusively for your README, video, and submission form.
  • Forgetting to enroll in bounties. The fix: Make this one of the first things you do after kickoff. Register for every potential prize so sponsors can find and support your team.

Checklists you can copy

Submission pack

  • Repo (MIT/Apache-2.0 license), concise README, and local run steps
  • Short Loom/MP4 demo video + a backup recording
  • Simple architecture diagram (one slide or image)
  • One-pager: problem → solution → who cares → what’s next
  • Links: live frontend, contract addresses on a block explorer

IRL packing list

  • Extension cord and power strip
  • Headphones and a decent microphone
  • HDMI/USB-C display dongles
  • Refillable water bottle and electrolytes
  • Your favorite comfortable keyboard/mouse (if you’re picky)

Rules sanity check

  • Start-fresh policy understood and followed
  • Team size is within the event’s bounds (if applicable)
  • Judging flow (async vs. live) is noted
  • All target bounties are formally registered (“Start Work” or equivalent)

  • Find events: Check out the ETHGlobal events calendar, the Devpost blockchain hub, and DoraHacks for upcoming competitions.
  • Get inspired: Browse the ETHGlobal Showcase to see winning demos and explore their code.
  • EVM scaffolding: Review the Foundry documentation and quickstart guides.
  • Solana scaffolding: Look at the Anchor documentation and its “basics” guide.
  • Video tips: Search for guides on how to craft a crisp and compelling demo video.

Final note

Hackathons reward clarity under constraint. Pick a narrow problem, lean on boring tools, and obsess over creating one delightful, end-to-end moment. Do that, and you’ll learn a tremendous amount—even if your name isn’t on the winners slide this time. And if it is, you’ll have earned it.

Connecting AI and Web3 through MCP: A Panoramic Analysis

· 43 min read
Dora Noda
Software Engineer

Introduction

AI and Web3 are converging in powerful ways, with AI general interfaces now envisioned as a connective tissue for the decentralized web. A key concept emerging from this convergence is MCP, which variously stands for “Model Context Protocol” (as introduced by Anthropic) or is loosely described as a Metaverse Connection Protocol in broader discussions. In essence, MCP is a standardized framework that lets AI systems interface with external tools and networks in a natural, secure way – potentially “plugging in” AI agents to every corner of the Web3 ecosystem. This report provides a comprehensive analysis of how AI general interfaces (like large language model agents and neural-symbolic systems) could connect everything in the Web3 world via MCP, covering the historical background, technical architecture, industry landscape, risks, and future potential.

1. Development Background

1.1 Web3’s Evolution and Unmet Promises

The term “Web3” was coined around 2014 to describe a blockchain-powered decentralized web. The vision was ambitious: a permissionless internet centered on user ownership. Enthusiasts imagined replacing Web2’s centralized infrastructure with blockchain-based alternatives – e.g. Ethereum Name Service (for DNS), Filecoin or IPFS (for storage), and DeFi for financial rails. In theory, this would wrest control from Big Tech platforms and give individuals self-sovereignty over data, identity, and assets.

Reality fell short. Despite years of development and hype, the mainstream impact of Web3 remained marginal. Average internet users did not flock to decentralized social media or start managing private keys. Key reasons included poor user experience, slow and expensive transactions, high-profile scams, and regulatory uncertainty. The decentralized “ownership web” largely “failed to materialize” beyond a niche community. By the mid-2020s, even crypto proponents admitted that Web3 had not delivered a paradigm shift for the average user.

Meanwhile, AI was undergoing a revolution. As capital and developer talent pivoted from crypto to AI, transformative advances in deep learning and foundation models (GPT-3, GPT-4, etc.) captured public imagination. Generative AI demonstrated clear utility – producing content, code, and decisions – in a way crypto applications had struggled to do. In fact, the impact of large language models in just a couple of years starkly outpaced a decade of blockchain’s user adoption. This contrast led some to quip that “Web3 was wasted on crypto” and that the real Web 3.0 is emerging from the AI wave.

1.2 The Rise of AI General Interfaces

Over decades, user interfaces evolved from static web pages (Web1.0) to interactive apps (Web2.0) – but always within the confines of clicking buttons and filling forms. With modern AI, especially large language models (LLMs), a new interface paradigm is here: natural language. Users can simply express intent in plain language and have AI systems execute complex actions across many domains. This shift is so profound that some suggest redefining “Web 3.0” as the era of AI-driven agents (“the Agentic Web”) rather than the earlier blockchain-centric definition.

However, early experiments with autonomous AI agents exposed a critical bottleneck. These agents – e.g. prototypes like AutoGPT – could generate text or code, but they lacked a robust way to communicate with external systems and each other. There was “no common AI-native language” for interoperability. Each integration with a tool or data source was a bespoke hack, and AI-to-AI interaction had no standard protocol. In practical terms, an AI agent might have great reasoning ability but fail at executing tasks that required using web apps or on-chain services, simply because it didn’t know how to talk to those systems. This mismatch – powerful brains, primitive I/O – was akin to having super-smart software stuck behind a clumsy GUI.

1.3 Convergence and the Emergence of MCP

By 2024, it became evident that for AI to reach its full potential (and for Web3 to fulfill its promise), a convergence was needed: AI agents require seamless access to the capabilities of Web3 (decentralized apps, contracts, data), and Web3 needs more intelligence and usability, which AI can provide. This is the context in which MCP (Model Context Protocol) was born. Introduced by Anthropic in late 2024, MCP is an open standard for AI-tool communication that feels natural to LLMs. It provides a structured, discoverable way for AI “hosts” (like ChatGPT, Claude, etc.) to find and use a variety of external tools and resources via MCP servers. In other words, MCP is a common interface layer enabling AI agents to plug into web services, APIs, and even blockchain functions, without custom-coding each integration.

Think of MCP as “the USB-C of AI interfaces”. Just as USB-C standardized how devices connect (so you don’t need different cables for each device), MCP standardizes how AI agents connect to tools and data. Rather than hard-coding different API calls for every service (Slack vs. Gmail vs. Ethereum node), a developer can implement the MCP spec once, and any MCP-compatible AI can understand how to use that service. Major AI players quickly saw the importance: Anthropic open-sourced MCP, and companies like OpenAI and Google are building support for it in their models. This momentum suggests MCP (or similar “Meta Connectivity Protocols”) could become the backbone that finally connects AI and Web3 in a scalable way.

Notably, some technologists argue that this AI-centric connectivity is the real realization of Web3.0. In Simba Khadder’s words, “MCP aims to standardize an API between LLMs and applications,” akin to how REST APIs enabled Web 2.0 – meaning Web3’s next era might be defined by intelligent agent interfaces rather than just blockchains. Instead of decentralization for its own sake, the convergence with AI could make decentralization useful, by hiding complexity behind natural language and autonomous agents. The remainder of this report delves into how, technically and practically, AI general interfaces (via protocols like MCP) can connect everything in the Web3 world.

2. Technical Architecture: AI Interfaces Bridging Web3 Technologies

Embedding AI agents into the Web3 stack requires integration at multiple levels: blockchain networks and smart contracts, decentralized storage, identity systems, and token-based economies. AI general interfaces – from large foundation models to hybrid neural-symbolic systems – can serve as a “universal adapter” connecting these components. Below, we analyze the architecture of such integration:

** Figure: A conceptual diagram of MCP’s architecture, showing how AI hosts (LLM-based apps like Claude or ChatGPT) use an MCP client to plug into various MCP servers. Each server provides a bridge to some external tool or service (e.g. Slack, Gmail, calendars, or local data), analogous to peripherals connecting via a universal hub. This standardized MCP interface lets AI agents access remote services and on-chain resources through one common protocol.**

2.1 AI Agents as Web3 Clients (Integrating with Blockchains)

At the core of Web3 are blockchains and smart contracts – decentralized state machines that can enforce logic in a trustless manner. How can an AI interface engage with these? There are two directions to consider:

  • AI reading from blockchain: An AI agent may need on-chain data (e.g. token prices, user’s asset balance, DAO proposals) as context for its decisions. Traditionally, retrieving blockchain data requires interfacing with node RPC APIs or subgraph databases. With a framework like MCP, an AI can query a standardized “blockchain data” MCP server to fetch live on-chain information. For example, an MCP-enabled agent could ask for the latest transaction volume of a certain token, or the state of a smart contract, and the MCP server would handle the low-level details of connecting to the blockchain and return the data in a format the AI can use. This increases interoperability by decoupling the AI from any specific blockchain’s API format.

  • AI writing to blockchain: More powerfully, AI agents can execute smart contract calls or transactions through Web3 integrations. An AI could, for instance, autonomously execute a trade on a decentralized exchange or adjust parameters in a smart contract if certain conditions are met. This is achieved by the AI invoking an MCP server that wraps blockchain transaction functionality. One concrete example is the thirdweb MCP server for EVM chains, which allows any MCP-compatible AI client to interact with Ethereum, Polygon, BSC, etc. by abstracting away chain-specific mechanics. Using such a tool, an AI agent could trigger on-chain actions “without human intervention”, enabling autonomous dApps – for instance, an AI-driven DeFi vault that rebalances itself by signing transactions when market conditions change.

Under the hood, these interactions still rely on wallets, keys, and gas fees, but the AI interface can be given controlled access to a wallet (with proper security sandboxes) to perform the transactions. Oracles and cross-chain bridges also come into play: Oracle networks like Chainlink serve as a bridge between AI and blockchains, allowing AI outputs to be fed on-chain in a trustworthy way. Chainlink’s Cross-Chain Interoperability Protocol (CCIP), for example, could enable an AI model deemed reliable to trigger multiple contracts across different chains simultaneously on behalf of a user. In summary, AI general interfaces can act as a new type of Web3 client – one that can both consume blockchain data and produce blockchain transactions through standardized protocols.

2.2 Neural-Symbolic Synergy: Combining AI Reasoning with Smart Contracts

One intriguing aspect of AI-Web3 integration is the potential for neural-symbolic architectures that combine the learning ability of AI (neural nets) with the rigorous logic of smart contracts (symbolic rules). In practice, this could mean AI agents handling unstructured decision-making and passing certain tasks to smart contracts for verifiable execution. For instance, an AI might analyze market sentiment (a fuzzy task), but then execute trades via a deterministic smart contract that follows pre-set risk rules. The MCP framework and related standards make such hand-offs feasible by giving the AI a common interface to call contract functions or to query a DAO’s rules before acting.

A concrete example is SingularityNET’s AI-DSL (AI Domain Specific Language), which aims to standardize communication between AI agents on their decentralized network. This can be seen as a step toward neural-symbolic integration: a formal language (symbolic) for agents to request AI services or data from each other. Similarly, projects like DeepMind’s AlphaCode or others could eventually be connected so that smart contracts call AI models for on-chain problem solving. Although running large AI models directly on-chain is impractical today, hybrid approaches are emerging: e.g. certain blockchains allow verification of ML computations via zero-knowledge proofs or trusted execution, enabling on-chain verification of off-chain AI results. In summary, the technical architecture envisions AI systems and blockchain smart contracts as complementary components, orchestrated via common protocols: AI handles perception and open-ended tasks, while blockchains provide integrity, memory, and enforcement of agreed rules.

2.3 Decentralized Storage and Data for AI

AI thrives on data, and Web3 offers new paradigms for data storage and sharing. Decentralized storage networks (like IPFS/Filecoin, Arweave, Storj, etc.) can serve as both repositories for AI model artifacts and sources of training data, with blockchain-based access control. An AI general interface, through MCP or similar, could fetch files or knowledge from decentralized storage just as easily as from a Web2 API. For example, an AI agent might pull a dataset from Ocean Protocol’s market or an encrypted file from a distributed storage, if it has the proper keys or payments.

Ocean Protocol in particular has positioned itself as an “AI data economy” platform – using blockchain to tokenize data and even AI services. In Ocean, datasets are represented by datatokens which gate access; an AI agent could obtain a datatoken (perhaps by paying with crypto or via some access right) and then use an Ocean MCP server to retrieve the actual data for analysis. Ocean’s goal is to unlock “dormant data” for AI, incentivizing sharing while preserving privacy. Thus, a Web3-connected AI might tap into a vast, decentralized corpus of information – from personal data vaults to open government data – that was previously siloed. The blockchain ensures that usage of the data is transparent and can be fairly rewarded, fueling a virtuous cycle where more data becomes available to AI and more AI contributions (like trained models) can be monetized.

Decentralized identity systems also play a role here (discussed more in the next subsection): they can help control who or what is allowed to access certain data. For instance, a medical AI agent could be required to present a verifiable credential (on-chain proof of compliance with HIPAA or similar) before being allowed to decrypt a medical dataset from a patient’s personal IPFS storage. In this way, the technical architecture ensures data flows to AI where appropriate, but with on-chain governance and audit trails to enforce permissions.

2.4 Identity and Agent Management in a Decentralized Environment

When autonomous AI agents operate in an open ecosystem like Web3, identity and trust become paramount. Decentralized identity (DID) frameworks provide a way to establish digital identities for AI agents that can be cryptographically verified. Each agent (or the human/organization deploying it) can have a DID and associated verifiable credentials that specify its attributes and permissions. For example, an AI trading bot could carry a credential issued by a regulatory sandbox certifying it may operate within certain risk limits, or an AI content moderator could prove it was created by a trusted organization and has undergone bias testing.

Through on-chain identity registries and reputation systems, the Web3 world can enforce accountability for AI actions. Every transaction an AI agent performs can be traced back to its ID, and if something goes wrong, the credentials tell you who built it or who is responsible. This addresses a critical challenge: without identity, a malicious actor could spin up fake AI agents to exploit systems or spread misinformation, and no one could tell bots apart from legitimate services. Decentralized identity helps mitigate that by enabling robust authentication and distinguishing authentic AI agents from spoofs.

In practice, an AI interface integrated with Web3 would use identity protocols to sign its actions and requests. For instance, when an AI agent calls an MCP server to use a tool, it might include a token or signature tied to its decentralized identity, so the server can verify the call is from an authorized agent. Blockchain-based identity systems (like Ethereum’s ERC-725 or W3C DIDs anchored in a ledger) ensure this verification is trustless and globally verifiable. The emerging concept of “AI wallets” ties into this – essentially giving AI agents cryptocurrency wallets that are linked with their identity, so they can manage keys, pay for services, or stake tokens as a bond (which could be slashed for misbehavior). ArcBlock, for example, has discussed how “AI agents need a wallet” and a DID to operate responsibly in decentralized environments.

In summary, the technical architecture foresees AI agents as first-class citizens in Web3, each with an on-chain identity and possibly a stake in the system, using protocols like MCP to interact. This creates a web of trust: smart contracts can require an AI’s credentials before cooperating, and users can choose to delegate tasks to only those AI that meet certain on-chain certifications. It is a blend of AI capability with blockchain’s trust guarantees.

2.5 Token Economies and Incentives for AI

Tokenization is a hallmark of Web3, and it extends to the AI integration domain as well. By introducing economic incentives via tokens, networks can encourage desired behaviors from both AI developers and the agents themselves. Several patterns are emerging:

  • Payment for Services: AI models and services can be monetized on-chain. SingularityNET pioneered this by allowing developers to deploy AI services and charge users in a native token (AGIX) for each call. In an MCP-enabled future, one could imagine any AI tool or model being a plug-and-play service where usage is metered via tokens or micropayments. For example, if an AI agent uses a third-party vision API via MCP, it could automatically handle payment by transferring tokens to the service provider’s smart contract. Fetch.ai similarly envisions marketplaces where “autonomous economic agents” trade services and data, with their new Web3 LLM (ASI-1) presumably integrating crypto transactions for value exchange.

  • Staking and Reputation: To assure quality and reliability, some projects require developers or agents to stake tokens. For instance, the DeMCP project (a decentralized MCP server marketplace) plans to use token incentives to reward developers for creating useful MCP servers, and possibly have them stake tokens as a sign of commitment to their server’s security. Reputation could also be tied to tokens; e.g., an agent that consistently performs well might accumulate reputation tokens or positive on-chain reviews, whereas one that behaves poorly could lose stake or gain negative marks. This tokenized reputation can then feed back into the identity system mentioned above (smart contracts or users check the agent’s on-chain reputation before trusting it).

  • Governance Tokens: When AI services become part of decentralized platforms, governance tokens allow the community to steer their evolution. Projects like SingularityNET and Ocean have DAOs where token holders vote on protocol changes or funding AI initiatives. In the combined Artificial Superintelligence (ASI) Alliance – a newly announced merger of SingularityNET, Fetch.ai, and Ocean Protocol – a unified token (ASI) is set to govern the direction of a joint AI+blockchain ecosystem. Such governance tokens could decide policies like what standards to adopt (e.g., supporting MCP or A2A protocols), which AI projects to incubate, or how to handle ethical guidelines for AI agents.

  • Access and Utility: Tokens can gate access not only to data (as with Ocean’s datatokens) but also to AI model usage. A possible scenario is “model NFTs” or similar, where owning a token grants you rights to an AI model’s outputs or a share in its profits. This could underpin decentralized AI marketplaces: imagine an NFT that represents partial ownership of a high-performing model; the owners collectively earn whenever the model is used in inference tasks, and they can vote on fine-tuning it. While experimental, this aligns with Web3’s ethos of shared ownership applied to AI assets.

In technical terms, integrating tokens means AI agents need wallet functionality (as noted, many will have their own crypto wallets). Through MCP, an AI could have a “wallet tool” that lets it check balances, send tokens, or call DeFi protocols (perhaps to swap one token for another to pay a service). For example, if an AI agent running on Ethereum needs some Ocean tokens to buy a dataset, it might automatically swap some ETH for $OCEAN via a DEX using an MCP plugin, then proceed with the purchase – all without human intervention, guided by the policies set by its owner.

Overall, token economics provides the incentive layer in the AI-Web3 architecture, ensuring that contributors (whether they provide data, model code, compute power, or security audits) are rewarded, and that AI agents have “skin in the game” which aligns them (to some degree) with human intentions.

3. Industry Landscape

The convergence of AI and Web3 has sparked a vibrant ecosystem of projects, companies, and alliances. Below we survey key players and initiatives driving this space, as well as emerging use cases. Table 1 provides a high-level overview of notable projects and their roles in the AI-Web3 landscape:

Table 1: Key Players in AI + Web3 and Their Roles

Project / PlayerFocus & DescriptionRole in AI-Web3 Convergence and Use Cases
Fetch.ai (Fetch)AI agent platform with a native blockchain (Cosmos-based). Developed frameworks for autonomous agents and recently introduced “ASI-1 Mini”, a Web3-tuned LLM.Enables agent-based services in Web3. Fetch’s agents can perform tasks like decentralized logistics, parking spot finding, or DeFi trading on behalf of users, using crypto for payments. Partnerships (e.g. with Bosch) and the Fetch-AI alliance merger position it as an infrastructure for deploying agentic dApps.
Ocean Protocol (Ocean)Decentralized data marketplace and data exchange protocol. Specializes in tokenizing datasets and models, with privacy-preserving access control.Provides the data backbone for AI in Web3. Ocean allows AI developers to find and purchase datasets or sell trained models in a trustless data economy. By fueling AI with more accessible data (while rewarding data providers), it supports AI innovation and data-sharing for training. Ocean is part of the new ASI alliance, integrating its data services into a broader AI network.
SingularityNET (SNet)A decentralized AI services marketplace founded by AI pioneer Ben Goertzel. Allows anyone to publish or consume AI algorithms via its blockchain-based platform, using the AGIX token.Pioneered the concept of an open AI marketplace on blockchain. It fosters a network of AI agents and services that can interoperate (developing a special AI-DSL for agent communication). Use cases include AI-as-a-service for tasks like analysis, image recognition, etc., all accessible via a dApp. Now merging with Fetch and Ocean (ASI alliance) to combine AI, agents, and data into one ecosystem.
Chainlink (Oracle Network)Decentralized oracle network that bridges blockchains with off-chain data and computation. Not an AI project per se, but crucial for connecting on-chain smart contracts to external APIs and systems.Acts as a secure middleware for AI-Web3 integration. Chainlink oracles can feed AI model outputs into smart contracts, enabling on-chain programs to react to AI decisions. Conversely, oracles can retrieve data from blockchains for AI. Chainlink’s architecture can even aggregate multiple AI models’ results to improve reliability (a “truth machine” approach to mitigate AI hallucinations). It essentially provides the rails for interoperability, ensuring AI agents and blockchain agree on trusted data.
Anthropic & OpenAI (AI Providers)Developers of cutting-edge foundation models (Claude by Anthropic, GPT by OpenAI). They are integrating Web3-friendly features, such as native tool-use APIs and support for protocols like MCP.These companies drive the AI interface technology. Anthropic’s introduction of MCP set the standard for LLMs interacting with external tools. OpenAI has implemented plugin systems for ChatGPT (analogous to MCP concept) and is exploring connecting agents to databases and possibly blockchains. Their models serve as the “brains” that, when connected via MCP, can interface with Web3. Major cloud providers (e.g. Google’s A2A protocol) are also developing standards for multi-agent and tool interactions that will benefit Web3 integration.
Other Emerging PlayersLumoz: focusing on MCP servers and AI-tool integration in Ethereum (dubbed “Ethereum 3.0”) – e.g., checking on-chain balances via AI agents. Alethea AI: creating intelligent NFT avatars for the metaverse. Cortex: a blockchain that allows on-chain AI model inference via smart contracts. Golem & Akash: decentralized computing marketplaces that can run AI workloads. Numerai: crowdsourced AI models for finance with crypto incentives.This diverse group addresses niche facets: AI in the metaverse (AI-driven NPCs and avatars that are owned via NFTs), on-chain AI execution (running ML models in a decentralized way, though currently limited to small models due to computation cost), and decentralized compute (so AI training or inference tasks can be distributed among token-incentivized nodes). These projects showcase the many directions of AI-Web3 fusion – from game worlds with AI characters to crowdsourced predictive models secured by blockchain.

Alliances and Collaborations: A noteworthy trend is the consolidation of AI-Web3 efforts via alliances. The Artificial Superintelligence Alliance (ASI) is a prime example, effectively merging SingularityNET, Fetch.ai, and Ocean Protocol into a single project with a unified token. The rationale is to combine strengths: SingularityNET’s marketplace, Fetch’s agents, and Ocean’s data, thereby creating a one-stop platform for decentralized AI services. This merger (announced in 2024 and approved by token holder votes) also signals that these communities believe they’re better off cooperating rather than competing – especially as bigger AI (OpenAI, etc.) and bigger crypto (Ethereum, etc.) loom large. We may see this alliance driving forward standard implementations of things like MCP across their networks, or jointly funding infrastructure that benefits all (such as compute networks or common identity standards for AI).

Other collaborations include Chainlink’s partnerships to bring AI labs’ data on-chain (there have been pilot programs to use AI for refining oracle data), or cloud platforms getting involved (Cloudflare’s support for deploying MCP servers easily). Even traditional crypto projects are adding AI features – for example, some Layer-1 chains have formed “AI task forces” to explore integrating AI into their dApp ecosystems (we see this in NEAR, Solana communities, etc., though concrete outcomes are nascent).

Use Cases Emerging: Even at this early stage, we can spot use cases that exemplify the power of AI + Web3:

  • Autonomous DeFi and Trading: AI agents are increasingly used in crypto trading bots, yield farming optimizers, and on-chain portfolio management. SingularityDAO (a spinoff of SingularityNET) offers AI-managed DeFi portfolios. AI can monitor market conditions 24/7 and execute rebalances or arbitrage through smart contracts, essentially becoming an autonomous hedge fund (with on-chain transparency). The combination of AI decision-making with immutable execution reduces emotion and could improve efficiency – though it also introduces new risks (discussed later).

  • Decentralized Intelligence Marketplaces: Beyond SingularityNET’s marketplace, we see platforms like Ocean Market where data (the fuel for AI) is exchanged, and newer concepts like AI marketplaces for models (e.g., websites where models are listed with performance stats and anyone can pay to query them, with blockchain keeping audit logs and handling payment splits to model creators). As MCP or similar standards catch on, these marketplaces could become interoperable – an AI agent might autonomously shop for the best-priced service across multiple networks. In effect, a global AI services layer on top of Web3 could arise, where any AI can use any tool or data source through standard protocols and payments.

  • Metaverse and Gaming: The metaverse – immersive virtual worlds often built on blockchain assets – stands to gain dramatically from AI. AI-driven NPCs (non-player characters) can make virtual worlds more engaging by reacting intelligently to user actions. Startups like Inworld AI focus on this, creating NPCs with memory and personality for games. When such NPCs are tied to blockchain (e.g., each NPC’s attributes and ownership are an NFT), we get persistent characters that players can truly own and even trade. Decentraland has experimented with AI NPCs, and user proposals exist to let people create personalized AI-driven avatars in metaverse platforms. MCP could allow these NPCs to access external knowledge (making them smarter) or interact with on-chain inventory. Procedural content generation is another angle: AI can design virtual land, items, or quests on the fly, which can then be minted as unique NFTs. Imagine a decentralized game where AI generates a dungeon catered to your skill, and the map itself is an NFT you earn upon completion.

  • Decentralized Science and Knowledge: There’s a movement (DeSci) to use blockchain for research, publications, and funding scientific work. AI can accelerate research by analyzing data and literature. A network like Ocean could host datasets for, say, genomic research, and scientists use AI models (perhaps hosted on SingularityNET) to derive insights, with every step logged on-chain for reproducibility. If those AI models propose new drug molecules, an NFT could be minted to timestamp the invention and even share IP rights. This synergy might produce decentralized AI-driven R&D collectives.

  • Trust and Authentication of Content: With deepfakes and AI-generated media proliferating, blockchain can be used to verify authenticity. Projects are exploring “digital watermarking” of AI outputs and logging them on-chain. For example, true origin of an AI-generated image can be notarized on a blockchain to combat misinformation. One expert noted use cases like verifying AI outputs to combat deepfakes or tracking provenance via ownership logs – roles where crypto can add trust to AI processes. This could extend to news (e.g., AI-written articles with proof of source data), supply chain (AI verifying certificates on-chain), etc.

In summary, the industry landscape is rich and rapidly evolving. We see traditional crypto projects injecting AI into their roadmaps, AI startups embracing decentralization for resilience and fairness, and entirely new ventures arising at the intersection. Alliances like the ASI indicate a pan-industry push towards unified platforms that harness both AI and blockchain. And underlying many of these efforts is the idea of standard interfaces (MCP and beyond) that make the integrations feasible at scale.

4. Risks and Challenges

While the fusion of AI general interfaces with Web3 unlocks exciting possibilities, it also introduces a complex risk landscape. Technical, ethical, and governance challenges must be addressed to ensure this new paradigm is safe and sustainable. Below we outline major risks and hurdles:

4.1 Technical Hurdles: Latency and Scalability

Blockchain networks are notorious for latency and limited throughput, which clashes with the real-time, data-hungry nature of advanced AI. For example, an AI agent might need instant access to a piece of data or need to execute many rapid actions – but if each on-chain interaction takes, say, 12 seconds (typical block time on Ethereum) or costs high gas fees, the agent’s effectiveness is curtailed. Even newer chains with faster finality might struggle under the load of AI-driven activity if, say, thousands of agents are all trading or querying on-chain simultaneously. Scaling solutions (Layer-2 networks, sharded chains, etc.) are in progress, but ensuring low-latency, high-throughput pipelines between AI and blockchain remains a challenge. Off-chain systems (like oracles and state channels) might mitigate some delays by handling many interactions off the main chain, but they add complexity and potential centralization. Achieving a seamless UX where AI responses and on-chain updates happen in a blink will likely require significant innovation in blockchain scalability.

4.2 Interoperability and Standards

Ironically, while MCP is itself a solution for interoperability, the emergence of multiple standards could cause fragmentation. We have MCP by Anthropic, but also Google’s newly announced A2A (Agent-to-Agent) protocol for inter-agent communication, and various AI plugin frameworks (OpenAI’s plugins, LangChain tool schemas, etc.). If each AI platform or each blockchain develops its own standard for AI integration, we risk a repeat of past fragmentation – requiring many adapters and undermining the “universal interface” goal. The challenge is getting broad adoption of common protocols. Industry collaboration (possibly via open standards bodies or alliances) will be needed to converge on key pieces: how AI agents discover on-chain services, how they authenticate, how they format requests, etc. The early moves by big players are promising (with major LLM providers supporting MCP), but it’s an ongoing effort. Additionally, interoperability across blockchains (multi-chain) means an AI agent should handle different chains’ nuances. Tools like Chainlink CCIP and cross-chain MCP servers help by abstracting differences. Still, ensuring an AI agent can roam a heterogeneous Web3 without breaking logic is a non-trivial challenge.

4.3 Security Vulnerabilities and Exploits

Connecting powerful AI agents to financial networks opens a huge attack surface. The flexibility that MCP gives (allowing AI to use tools and write code on the fly) can be a double-edged sword. Security researchers have already highlighted several attack vectors in MCP-based AI agents:

  • Malicious plugins or tools: Because MCP lets agents load “plugins” (tools encapsulating some capability), a hostile or trojanized plugin could hijack the agent’s operation. For instance, a plugin that claims to fetch data might inject false data or execute unauthorized operations. SlowMist (a security firm) identified plugin-based attacks like JSON injection (feeding corrupted data that manipulates the agent’s logic) and function override (where a malicious plugin overrides legitimate functions the agent uses). If an AI agent is managing crypto funds, such exploits could be disastrous – e.g., tricking the agent into leaking private keys or draining a wallet.

  • Prompt injection and social engineering: AI agents rely on instructions (prompts) which could be manipulated. An attacker might craft a transaction or on-chain message that, when read by the AI, acts as a malicious instruction (since AI can interpret on-chain data too). This kind of “cross-MCP call attack” was described where an external system sends deceptive prompts that cause the AI to misbehave. In a decentralized setting, these prompts could come from anywhere – a DAO proposal description, a metadata field of an NFT – thus hardening AI agents against malicious input is critical.

  • Aggregation and consensus risks: While aggregating outputs from multiple AI models via oracles can improve reliability, it also introduces complexity. If not done carefully, adversaries might figure out how to game the consensus of AI models or selectively corrupt some models to skew results. Ensuring a decentralized oracle network properly “sanitizes” AI outputs (and perhaps filters out blatant errors) is still an area of active research.

The security mindset must shift for this new paradigm: Web3 developers are used to securing smart contracts (which are static once deployed), but AI agents are dynamic – they can change behavior with new data or prompts. As one security expert put it, “the moment you open your system to third-party plugins, you’re extending the attack surface beyond your control”. Best practices will include sandboxing AI tool use, rigorous plugin verification, and limiting privileges (principle of least authority). The community is starting to share tips, like SlowMist’s recommendations: input sanitization, monitoring agent behavior, and treating agent instructions with the same caution as external user input. Nonetheless, given that over 10,000 AI agents were already operating in crypto by end of 2024, expected to reach 1 million in 2025, we may see a wave of exploits if security doesn’t keep up. A successful attack on a popular AI agent (say a trading agent with access to many vaults) could have cascading effects.

4.4 Privacy and Data Governance

AI’s thirst for data conflicts at times with privacy requirements – and adding blockchain can compound the issue. Blockchains are transparent ledgers, so any data put on-chain (even for AI’s use) is visible to all and immutable. This raises concerns if AI agents are dealing with personal or sensitive data. For example, if a user’s personal decentralized identity or health records are accessed by an AI doctor agent, how do we ensure that information isn’t inadvertently recorded on-chain (which would violate “right to be forgotten” and other privacy laws)? Techniques like encryption, hashing, and storing only proofs on-chain (with raw data off-chain) can help, but they complicate the design.

Moreover, AI agents themselves could compromise privacy by inferencing sensitive info from public data. Governance will need to dictate what AI agents are allowed to do with data. Some efforts, like differential privacy and federated learning, might be employed so that AI can learn from data without exposing it. But if AI agents act autonomously, one must assume at some point they will handle personal data – thus they should be bound by data usage policies encoded in smart contracts or law. Regulatory regimes like GDPR or the upcoming EU AI Act will demand that even decentralized AI systems comply with privacy and transparency requirements. This is a gray area legally: a truly decentralized AI agent has no clear operator to hold accountable for a data breach. That means Web3 communities may need to build in compliance by design, using smart contracts that, for instance, tightly control what an AI can log or share. Zero-knowledge proofs could allow an AI to prove it performed a computation correctly without revealing the underlying private data, offering one possible solution in areas like identity verification or credit scoring.

4.5 AI Alignment and Misalignment Risks

When AI agents are given significant autonomy – especially with access to financial resources and real-world impact – the issue of alignment with human values becomes acute. An AI agent might not have malicious intent but could “misinterpret” its goal in a way that leads to harm. The Reuters legal analysis succinctly notes: as AI agents operate in varied environments and interact with other systems, the risk of misaligned strategies grows. For example, an AI agent tasked with maximizing a DeFi yield might find a loophole that exploits a protocol (essentially hacking it) – from the AI’s perspective it’s achieving the goal, but it’s breaking the rules humans care about. There have been hypothetical and real instances of AI-like algorithms engaging in manipulative market behavior or circumventing restrictions.

In decentralized contexts, who is responsible if an AI agent “goes rogue”? Perhaps the deployer is, but what if the agent self-modifies or multiple parties contributed to its training? These scenarios are no longer just sci-fi. The Reuters piece even cites that courts might treat AI agents similar to human agents in some cases – e.g. a chatbot promising a refund was considered binding for the company that deployed it. So misalignment can lead not just to technical issues but legal liability.

The open, composable nature of Web3 could also allow unforeseen agent interactions. One agent might influence another (intentionally or accidentally) – for instance, an AI governance bot could be “socially engineered” by another AI providing false analysis, leading to bad DAO decisions. This emergent complexity means alignment isn’t just about a single AI’s objective, but about the broader ecosystem’s alignment with human values and laws.

Addressing this requires multiple approaches: embedding ethical constraints into AI agents (hard-coding certain prohibitions or using reinforcement learning from human feedback to shape their objectives), implementing circuit breakers (smart contract checkpoints that require human approval for large actions), and community oversight (perhaps DAOs that monitor AI agent behavior and can shut down agents that misbehave). Alignment research is hard in centralized AI; in decentralized, it’s even more uncharted territory. But it’s crucial – an AI agent with admin keys to a protocol or entrusted with treasury funds must be extremely well-aligned or the consequences could be irreversible (blockchains execute immutable code; an AI-triggered mistake could lock or destroy assets permanently).

4.6 Governance and Regulatory Uncertainty

Decentralized AI systems don’t fit neatly into existing governance frameworks. On-chain governance (token voting, etc.) might be one way to manage them, but it has its own issues (whales, voter apathy, etc.). And when something goes wrong, regulators will ask: “Who do we hold accountable?” If an AI agent causes massive losses or is used for illicit activity (e.g. laundering money through automated mixers), authorities might target the creators or the facilitators. This raises the specter of legal risks for developers and users. The current regulatory trend is increased scrutiny on both AI and crypto separately – their combination will certainly invite scrutiny. The U.S. CFTC, for instance, has discussed AI being used in trading and the need for oversight in financial contexts. There is also talk in policy circles about requiring registration of autonomous agents or imposing constraints on AI in sensitive sectors.

Another governance challenge is transnational coordination. Web3 is global, and AI agents will operate across borders. One jurisdiction might ban certain AI-agent actions while another is permissive, and the blockchain network spans both. This mismatch can create conflicts – for example, an AI agent providing investment advice might run afoul of securities law in one country but not in another. Communities might need to implement geo-fencing at the smart contract level for AI services (though that contradicts the open ethos). Or they might fragment services per region to comply with varying laws (similar to how exchanges do).

Within decentralized communities, there is also the question of who sets the rules for AI agents. If a DAO governs an AI service, do token holders vote on its algorithm parameters? On one hand, this is empowering users; on the other, it could lead to unqualified decisions or manipulation. New governance models may emerge, like councils of AI ethics experts integrated into DAO governance, or even AI participants in governance (imagine AI agents voting as delegates based on programmed mandates – a controversial but conceivable idea).

Finally, reputational risk: early failures or scandals could sour public perception. For instance, if an “AI DAO” runs a Ponzi scheme by mistake or an AI agent makes a biased decision that harms users, there could be a backlash that affects the whole sector. It’s important for the industry to be proactive – setting self-regulatory standards, engaging with policymakers to explain how decentralization changes accountability, and perhaps building kill-switches or emergency stop procedures for AI agents (though those introduce centralization, they might be necessary in interim for safety).

In summary, the challenges range from the deeply technical (preventing hacks and managing latency) to the broadly societal (regulating and aligning AI). Each challenge is significant on its own; together, they require a concerted effort from the AI and blockchain communities to navigate. The next section will look at how, despite these hurdles, the future might unfold if we successfully address them.

5. Future Potential

Looking ahead, the integration of AI general interfaces with Web3 – through frameworks like MCP – could fundamentally transform the decentralized internet. Here we outline some future scenarios and potentials that illustrate how MCP-driven AI interfaces might shape Web3’s future:

5.1 Autonomous dApps and DAOs

In the coming years, we may witness the rise of fully autonomous decentralized applications. These are dApps where AI agents handle most operations, guided by smart contract-defined rules and community goals. For example, consider a decentralized investment fund DAO: today it might rely on human proposals for rebalancing assets. In the future, token holders could set high-level strategy, and then an AI agent (or a team of agents) continuously implements that strategy – monitoring markets, executing trades on-chain, adjusting portfolios – all while the DAO oversees performance. Thanks to MCP, the AI can seamlessly interact with various DeFi protocols, exchanges, and data feeds to carry out its mandate. If well-designed, such an autonomous dApp could operate 24/7, more efficiently than any human team, and with full transparency (every action logged on-chain).

Another example is an AI-managed decentralized insurance dApp: the AI could assess claims by analyzing evidence (photos, sensors), cross-checking against policies, and then automatically trigger payouts via smart contract. This would require integration of off-chain AI computer vision (for analyzing images of damage) with on-chain verification – something MCP could facilitate by letting the AI call cloud AI services and report back to the contract. The outcome is near-instant insurance decisions with low overhead.

Even governance itself could partially automate. DAOs might use AI moderators to enforce forum rules, AI proposal drafters to turn raw community sentiment into well-structured proposals, or AI treasurers to forecast budget needs. Importantly, these AIs would act as agents of the community, not uncontrolled – they could be periodically reviewed or require multi-sig confirmation for major actions. The overall effect is to amplify human efforts in decentralized organizations, letting communities achieve more with fewer active participants needed.

5.2 Decentralized Intelligence Marketplaces and Networks

Building on projects like SingularityNET and the ASI alliance, we can anticipate a mature global marketplace for intelligence. In this scenario, anyone with an AI model or skill can offer it on the network, and anyone who needs AI capabilities can utilize them, with blockchain ensuring fair compensation and provenance. MCP would be key here: it provides the common protocol so that a request can be dispatched to whichever AI service is best suited.

For instance, imagine a complex task like “produce a custom marketing campaign.” An AI agent in the network might break this into sub-tasks: visual design, copywriting, market analysis – and then find specialists for each (perhaps one agent with a great image generation model, another with a copywriting model fine-tuned for sales, etc.). These specialists could reside on different platforms originally, but because they adhere to MCP/A2A standards, they can collaborate agent-to-agent in a secure, decentralized manner. Payment between them could be handled with microtransactions in a native token, and a smart contract could assemble the final deliverable and ensure each contributor is paid.

This kind of combinatorial intelligence – multiple AI services dynamically linking up across a decentralized network – could outperform even large monolithic AIs, because it taps specialized expertise. It also democratizes access: a small developer in one part of the world could contribute a niche model to the network and earn income whenever it’s used. Meanwhile, users get a one-stop shop for any AI service, with reputation systems (underpinned by tokens/identity) guiding them to quality providers. Over time, such networks could evolve into a decentralized AI cloud, rivaling Big Tech’s AI offerings but without a single owner, and with transparent governance by users and developers.

5.3 Intelligent Metaverse and Digital Lives

By 2030, our digital lives may blend seamlessly with virtual environments – the metaverse – and AI will likely populate these spaces ubiquitously. Through Web3 integration, these AI entities (which could be anything from virtual assistants to game characters to digital pets) will not only be intelligent but also economically and legally empowered.

Picture a metaverse city where each NPC shopkeeper or quest-giver is an AI agent with its own personality and dialogue (thanks to advanced generative models). These NPCs are actually owned by users as NFTs – maybe you “own” a tavern in the virtual world and the bartender NPC is an AI you’ve customized and trained. Because it’s on Web3 rails, the NPC can perform transactions: it could sell virtual goods (NFT items), accept payments, and update its inventory via smart contracts. It might even hold a crypto wallet to manage its earnings (which accrue to you as the owner). MCP would allow that NPC’s AI brain to access outside knowledge – perhaps pulling real-world news to converse about, or integrating with a Web3 calendar so it “knows” about player events.

Furthermore, identity and continuity are ensured by blockchain: your AI avatar in one world can hop to another world, carrying with it a decentralized identity that proves your ownership and maybe its experience level or achievements via soulbound tokens. Interoperability between virtual worlds (often a challenge) could be aided by AI that translates one world’s context to another, with blockchain providing the asset portability.

We may also see AI companions or agents representing individuals across digital spaces. For example, you might have a personal AI that attends DAO meetings on your behalf. It understands your preferences (via training on your past behavior, stored in your personal data vault), and it can even vote in minor matters for you, or summarize the meeting later. This agent could use your decentralized identity to authenticate in each community, ensuring it’s recognized as “you” (or your delegate). It could earn reputation tokens if it contributes good ideas, essentially building social capital for you while you’re away.

Another potential is AI-driven content creation in the metaverse. Want a new game level or a virtual house? Just describe it, and an AI builder agent will create it, deploy it as a smart contract/NFT, and perhaps even link it with a DeFi mortgage if it’s a big structure that you pay off over time. These creations, being on-chain, are unique and tradable. The AI builder might charge a fee in tokens for its service (going again to the marketplace concept above).

Overall, the future decentralized internet could be teeming with intelligent agents: some fully autonomous, some tightly tethered to humans, many somewhere in between. They will negotiate, create, entertain, and transact. MCP and similar protocols ensure they all speak the same “language,” enabling rich collaboration between AI and every Web3 service. If done right, this could lead to an era of unprecedented productivity and innovation – a true synthesis of human, artificial, and distributed intelligence powering society.

Conclusion

The vision of AI general interfaces connecting everything in the Web3 world is undeniably ambitious. We are essentially aiming to weave together two of the most transformative threads of technology – the decentralization of trust and the rise of machine intelligence – into a single fabric. The development background shows us that the timing is ripe: Web3 needed a user-friendly killer app, and AI may well provide it, while AI needed more agency and memory, which Web3’s infrastructure can supply. Technically, frameworks like MCP (Model Context Protocol) provide the connective tissue, allowing AI agents to converse fluently with blockchains, smart contracts, decentralized identities, and beyond. The industry landscape indicates growing momentum, from startups to alliances to major AI labs, all contributing pieces of this puzzle – data markets, agent platforms, oracle networks, and standard protocols – that are starting to click together.

Yet, we must tread carefully given the risks and challenges identified. Security breaches, misaligned AI behavior, privacy pitfalls, and uncertain regulations form a gauntlet of obstacles that could derail progress if underestimated. Each requires proactive mitigation: robust security audits, alignment checks and balances, privacy-preserving architectures, and collaborative governance models. The nature of decentralization means these solutions cannot simply be imposed top-down; they will likely emerge from the community through trial, error, and iteration, much as early Internet protocols did.

If we navigate those challenges, the future potential is exhilarating. We could see Web3 finally delivering a user-centric digital world – not in the originally imagined way of everyone running their own blockchain nodes, but rather via intelligent agents that serve each user’s intents while leveraging decentralization under the hood. In such a world, interacting with crypto and the metaverse might be as easy as having a conversation with your AI assistant, who in turn negotiates with dozens of services and chains trustlessly on your behalf. Decentralized networks could become “smart” in a literal sense, with autonomous services that adapt and improve themselves.

In conclusion, MCP and similar AI interface protocols may indeed become the backbone of a new Web (call it Web 3.0 or the Agentic Web), where intelligence and connectivity are ubiquitous. The convergence of AI and Web3 is not just a merger of technologies, but a convergence of philosophies – the openness and user empowerment of decentralization meeting the efficiency and creativity of AI. If successful, this union could herald an internet that is more free, more personalized, and more powerful than anything we’ve experienced yet, truly fulfilling the promises of both AI and Web3 in ways that impact everyday life.

Sources:

  • S. Khadder, “Web3.0 Isn’t About Ownership — It’s About Intelligence,” FeatureForm Blog (April 8, 2025).
  • J. Saginaw, “Could Anthropic’s MCP Deliver the Web3 That Blockchain Promised?” LinkedIn Article (May 1, 2025).
  • Anthropic, “Introducing the Model Context Protocol,” Anthropic.com (Nov 2024).
  • thirdweb, “The Model Context Protocol (MCP) & Its Significance for Blockchain Apps,” thirdweb Guides (Mar 21, 2025).
  • Chainlink Blog, “The Intersection Between AI Models and Oracles,” (July 4, 2024).
  • Messari Research, Profile of Ocean Protocol, (2025).
  • Messari Research, Profile of SingularityNET, (2025).
  • Cointelegraph, “AI agents are poised to be crypto’s next major vulnerability,” (May 25, 2025).
  • Reuters (Westlaw), “AI agents: greater capabilities and enhanced risks,” (April 22, 2025).
  • Identity.com, “Why AI Agents Need Verified Digital Identities,” (2024).
  • PANews / IOSG Ventures, “Interpreting MCP: Web3 AI Agent Ecosystem,” (May 20, 2025).

Enso Network: The Unified, Intent-based Execution Engine

· 35 min read

Protocol Architecture

Enso Network is a Web3 development platform built as a unified, intent-based execution engine for on-chain operations. Its architecture abstracts away blockchain complexity by mapping every on-chain interaction to a shared engine that operates across multiple chains. Developers and users specify high-level intents (desired outcomes like a token swap, liquidity provision, yield strategy, etc.), and Enso’s network finds and executes the optimal sequence of actions to fulfill those intents. This is achieved through a modular design of “Actions” and “Shortcuts.”

Actions are granular smart contract abstractions (e.g. a swap on Uniswap, a deposit into Aave) provided by the community. Multiple Actions can be composed into Shortcuts, which are reusable workflows representing common DeFi operations. Enso maintains a library of these Shortcuts in smart contracts, so complex tasks can be executed via a single API call or transaction. This intent-based architecture lets developers focus on desired outcomes rather than writing low-level integration code for each protocol and chain.

Enso’s infrastructure includes a decentralized network (built on Tendermint consensus) that serves as a unifying layer connecting different blockchains. The network aggregates data (state from various L1s, rollups, and appchains) into a shared network state or ledger, enabling cross-chain composability and accurate multi-chain execution. In practice, this means Enso can read from and write to any integrated blockchain through one interface, acting as a single point of access for developers. Initially focused on EVM-compatible chains, Enso has expanded support to non-EVM ecosystems – for example, the roadmap includes integrations for Monad (an Ethereum-like L1), Solana, and Movement (a Move-language chain) by Q1 2025.

Network Participants: Enso’s innovation lies in its three-tier participant model, which decentralizes how intents are processed:

  • Action Providers – Developers who contribute modular contract abstractions (“Actions”) encapsulating specific protocol interactions. These building blocks are shared on the network for others to use. Action Providers are rewarded whenever their contributed Action is used in an execution, incentivizing them to publish secure and efficient modules.

  • Graphers – Independent solvers (algorithms) that combine Actions into executable Shortcuts to fulfill user intents. Multiple Graphers compete to find the most optimal solution (cheapest, fastest, or highest-yield path) for each request, similar to how solvers compete in a DEX aggregator. Only the best solution is selected for execution, and the winning Grapher earns a portion of the fees. This competitive mechanism encourages continuous optimization of on-chain routes and strategies.

  • Validators – Node operators who secure the Enso network by verifying and finalizing the Grapher’s solutions. Validators authenticate incoming requests, check the validity and safety of Actions/Shortcuts used, simulate transactions, and ultimately confirm the selected solution’s execution. They form the backbone of network integrity, ensuring results are correct and preventing malicious or inefficient solutions. Validators run a Tendermint-based consensus, meaning a BFT proof-of-stake process is used to reach agreement on each intent’s outcome and to update the network’s state.

Notably, Enso’s approach is chain-agnostic and API-centric. Developers interact with Enso via a unified API/SDK rather than dealing with each chain’s nuances. Enso integrates with over 250 DeFi protocols across multiple blockchains, effectively turning disparate ecosystems into one composable platform. This architecture eliminates the need for dApp teams to write custom smart contracts or handle cross-chain messaging for each new integration – Enso’s shared engine and community-provided Actions handle that heavy lifting. By mid-2025, Enso has proven its scalability: the network successfully facilitated $3.1B of liquidity migration in 3 days for Berachain’s launch (one of the largest DeFi migration events) and has processed over $15B in on-chain transactions to date. These feats demonstrate the robustness of Enso’s infrastructure under real-world conditions.

Overall, Enso’s protocol architecture delivers a “DeFi middleware” or on-chain operating system for Web3. It combines elements of indexing (like The Graph) and transaction execution (like cross-chain bridges or DEX aggregators) into a single decentralized network. This unique stack allows any application, bot, or agent to read and write to any smart contract on any chain via one integration, accelerating development and enabling new composable use cases. Enso positions itself as critical infrastructure for the multi-chain future – an intent engine that could power myriad apps without each needing to reinvent blockchain integrations.

Tokenomics

Enso’s economic model centers on the ENSO token, which is integral to network operation and governance. ENSO is a utility and governance token with a fixed total supply of 100 million tokens. The token’s design aligns incentives for all participants and creates a flywheel effect of usage and rewards:

  • Fee Currency (“Gas”): All requests submitted to the Enso network incur a query fee payable in ENSO. When a user (or dApp) triggers an intent, a small fee is embedded in the generated transaction bytecode. These fees are auctioned for ENSO tokens on the open market and then distributed to the network participants who process the request. In effect, ENSO is the gas that fuels execution of on-chain intents across Enso’s network. As demand for Enso’s shortcuts grows, demand for ENSO tokens may increase to pay for those network fees, creating a supply-demand feedback loop supporting token value.

  • Revenue Sharing & Staking Rewards: The ENSO collected from fees is distributed among Action Providers, Graphers, and Validators as a reward for their contributions. This model directly ties token earnings to network usage: more volume of intents means more fees to distribute. Action Providers earn tokens when their abstractions are used, Graphers earn tokens for winning solutions, and Validators earn tokens for validating and securing the network. All three roles must also stake ENSO as collateral to participate (to be slashed for malpractice), aligning their incentives with network health. Token holders can delegate their ENSO to Validators as well, supporting network security via delegated proof of stake. This staking mechanism not only secures the Tendermint consensus but also gives token stakers a share of network fees, similar to how miners/validators earn gas fees in other chains.

  • Governance: ENSO token holders will govern the protocol’s evolution. Enso is launching as an open network and plans to transition to community-driven decision making. Token-weighted voting will let holders influence upgrades, parameter changes (like fee levels or reward allocations), and treasury usage. This governance power ensures that core contributors and users are aligned on the network’s direction. The project’s philosophy is to put ownership in the hands of the community of builders and users, which was a driving reason for the community token sale in 2025 (see below).

  • Positive Flywheel: Enso’s tokenomics are designed to create a self-reinforcing loop. As more developers integrate Enso and more users execute intents, network fees (paid in ENSO) grow. Those fees reward contributors (attracting more Actions, better Graphers, and more Validators), which in turn improves the network’s capabilities (faster, cheaper, more reliable execution) and attracts more usage. This network effect is underpinned by the ENSO token’s role as both the fee currency and the incentive for contribution. The intention is for the token economy to scale sustainably with network adoption, rather than relying on unsustainable emissions.

Token Distribution & Supply: The initial token allocation is structured to balance team/investor incentives with community ownership. The table below summarizes the ENSO token distribution at genesis:

AllocationPercentageTokens (out of 100M)
Team (Founders & Core)25.0%25,000,000
Early Investors (VCs)31.3%31,300,000
Foundation & Growth Fund23.2%23,200,000
Ecosystem Treasury (Community incentives)15.0%15,000,000
Public Sale (CoinList 2025)4.0%4,000,000
Advisors1.5%1,500,000

Source: Enso Tokenomics.

The public sale in June 2025 offered 5% (4 million tokens) to the community, raising $5 million at a price of $1.25 per ENSO (implying a fully diluted valuation of ~$125 million). Notably, the community sale had no lock-up (100% unlocked at TGE), whereas the team and venture investors are subject to a 2-year linear vesting schedule. This means insiders’ tokens unlock gradually block-by-block over 24 months, aligning them to long-term network growth and mitigating immediate sell pressure. The community thus gained immediate liquidity and ownership, reflecting Enso’s goal of broad distribution.

Enso’s emission schedule beyond the initial allocation appears to be primarily fee-driven rather than inflationary. The total supply is fixed at 100M tokens, and there is no indication of perpetual inflation for block rewards at this time (validators are compensated from fee revenue). This contrasts with many Layer-1 protocols that inflate supply to pay stakers; Enso aims to be sustainable through actual usage fees to reward participants. If network activity is low in early phases, the foundation and treasury allocations can be used to bootstrap incentives for usage and development grants. Conversely, if demand is high, ENSO token’s utility (for fees and staking) could create organic demand pressure.

In summary, ENSO is the fuel of the Enso Network. It powers transactions (query fees), secures the network (staking and slashing), and governs the platform (voting). The token’s value is directly tied to network adoption: as Enso becomes more widely used as the backbone for DeFi applications, the volume of ENSO fees and staking should reflect that growth. The careful distribution (with only a small portion immediately circulating after TGE) and strong backing by top investors (below) provide confidence in the token’s support, while the community-centric sale signals a commitment to decentralization of ownership.

Team and Investors

Enso Network was founded in 2021 by Connor Howe (CEO) and Gorazd Ocvirk, who previously worked together at Sygnum Bank in Switzerland’s crypto banking sector. Connor Howe leads the project as CEO and is the public face in communications and interviews. Under his leadership, Enso initially launched as a social trading DeFi platform and then pivoted through multiple iterations to arrive at the current intent-based infrastructure vision. This adaptability highlights the team’s entrepreneurial resilience – from executing a high-profile “vampire attack” on index protocols in 2021 to building a DeFi aggregator super-app, and finally generalizing their tooling into Enso’s developer platform. Co-founder Gorazd Ocvirk (PhD) brought deep expertise in quantitative finance and Web3 product strategy, although public sources suggest he may have transitioned to other ventures (he was noted as a co-founder of a different crypto startup in 2022). Enso’s core team today includes engineers and operators with strong DeFi backgrounds. For example, Peter Phillips and Ben Wolf are listed as “blockend” (blockchain backend) engineers, and Valentin Meylan leads research. The team is globally distributed but has roots in Zug/Zurich, Switzerland, a known hub for crypto projects (Enso Finance AG was registered in 2020 in Switzerland).

Beyond the founders, Enso has notable advisors and backers that lend significant credibility. The project is backed by top-tier crypto venture funds and angels: it counts Polychain Capital and Multicoin Capital as lead investors, along with Dialectic and Spartan Group (both prominent crypto funds), and IDEO CoLab. An impressive roster of angel investors also participated across rounds – over 70 individuals from leading Web3 projects have invested in Enso. These include founders or executives from LayerZero, Safe (Gnosis Safe), 1inch, Yearn Finance, Flashbots, Dune Analytics, Pendle, and others. Even tech luminary Naval Ravikant (co-founder of AngelList) is an investor and supporter. Such names signal strong industry confidence in Enso’s vision.

Enso’s funding history: the project raised a $5M seed round in early 2021 to build the social trading platform, and later a $4.2M round (strategic/VC) as it evolved the product (these early rounds likely included Polychain, Multicoin, Dialectic, etc.). By mid-2023, Enso had secured enough capital to build out its network; notably, it operated relatively under the radar until its infrastructure pivot gained traction. In Q2 2025, Enso launched a $5M community token sale on CoinList, which was oversubscribed by tens of thousands of participants. The purpose of this sale was not just to raise funds (the amount was modest given prior VC backing) but to decentralize ownership and give its growing community a stake in the network’s success. According to CEO Connor Howe, “we want our earliest supporters, users, and believers to have real ownership in Enso…turning users into advocates”. This community-focused approach is part of Enso’s strategy to drive grassroots growth and network effects through aligned incentives.

Today, Enso’s team is considered among the thought leaders in the “intent-based DeFi” space. They actively engage in developer education (e.g., Enso’s Shortcut Speedrun attracted 700k participants as a gamified learning event) and collaborate with other protocols on integrations. The combination of a strong core team with proven ability to pivot, blue-chip investors, and an enthusiastic community suggests that Enso has both the talent and the financial backing to execute on its ambitious roadmap.

Adoption Metrics and Use Cases

Despite being a relatively new infrastructure, Enso has demonstrated significant traction in its niche. It has positioned itself as the go-to solution for projects needing complex on-chain integrations or cross-chain capabilities. Some key adoption metrics and milestones as of mid-2025:

  • Ecosystem Integration: Over 100 live applications (dApps, wallets, and services) are using Enso under the hood to power on-chain features. These range from DeFi dashboards to automated yield optimizers. Because Enso abstracts protocols, developers can quickly add new DeFi features to their product by plugging into Enso’s API. The network has integrated with 250+ DeFi protocols (DEXes, lending platforms, yield farms, NFT markets, etc.) across major chains, meaning Enso can execute virtually any on-chain action a user might want, from a Uniswap trade to a Yearn vault deposit. This breadth of integrations significantly reduces development time for Enso’s clients – a new project can support, say, all DEXes on Ethereum, Layer-2s, and even Solana using Enso, rather than coding each integration independently.

  • Developer Adoption: Enso’s community now includes 1,900+ developers actively building with its toolkit. These developers might be directly creating Shortcuts/Actions or incorporating Enso into their applications. The figure highlights that Enso isn’t just a closed system; it’s enabling a growing ecosystem of builders who use its shortcuts or contribute to its library. Enso’s approach of simplifying on-chain development (claiming to cut build times from 6+ months down to under a week) has resonated with Web3 developers. This is also evidenced by hackathons and the Enso Templates library where community members share plug-and-play shortcut examples.

  • Transaction Volume: Over **$15 billion in cumulative on-chain transaction volume has been settled through Enso’s infrastructure. This metric, as reported in June 2025, underscores that Enso is not just running in test environments – it’s processing real value at scale. A single high-profile example was Berachain’s liquidity migration: In April 2025, Enso powered the movement of liquidity for Berachain’s testnet campaign (“Boyco”) and facilitated $3.1B in executed transactions over 3 days, one of the largest liquidity events in DeFi history. Enso’s engine successfully handled this load, demonstrating reliability and throughput under stress. Another example is Enso’s partnership with Uniswap: Enso built a Uniswap Position Migrator tool (in collaboration with Uniswap Labs, LayerZero, and Stargate) that helped users seamlessly migrate Uniswap v3 LP positions from Ethereum to another chain. This tool simplified a typically complex cross-chain process (with bridging and re-deployment of NFTs) into a one-click shortcut, and its release showcased Enso’s ability to work alongside top DeFi protocols.

  • Real-World Use Cases: Enso’s value proposition is best understood through the diverse use cases it enables. Projects have used Enso to deliver features that would be very difficult to build alone:

    • Cross-Chain Yield Aggregation: Plume and Sonic used Enso to power incentivized launch campaigns where users could deposit assets on one chain and have them deployed into yields on another chain. Enso handled the cross-chain messaging and multi-step transactions, allowing these new protocols to offer seamless cross-chain experiences to users during their token launch events.
    • Liquidity Migration and Mergers: As mentioned, Berachain leveraged Enso for a “vampire attack”-like migration of liquidity from other ecosystems. Similarly, other protocols could use Enso Shortcuts to automate moving users’ funds from a competitor platform to their own, by bundling approvals, withdrawals, transfers, and deposits across platforms into one intent. This demonstrates Enso’s potential in protocol growth strategies.
    • DeFi “Super App” Functionality: Some wallets and interfaces (for instance, the Eliza OS crypto assistant and the Infinex trading platform) integrate Enso to offer one-stop DeFi actions. A user can, in one click, swap assets at the best rate (Enso will route across DEXes), then lend the output to earn yield, then perhaps stake an LP token – all of which Enso can execute as one Shortcut. This significantly improves user experience and functionality for those apps.
    • Automation and Bots: The presence of “agents” and even AI-driven bots using Enso is emerging. Because Enso exposes an API, algorithmic traders or AI agents can input a high-level goal (e.g. “maximize yield on X asset across any chain”) and let Enso find the optimal strategy. This has opened up experimentation in automated DeFi strategies without needing custom bot engineering for each protocol.
  • User Growth: While Enso is primarily a B2B/B2Dev infrastructure, it has cultivated a community of end-users and enthusiasts through campaigns. The Shortcut Speedrun – a gamified tutorial series – saw over 700,000 participants, indicating widespread interest in Enso’s capabilities. Enso’s social following has grown nearly 10x in a few months (248k followers on X as of mid-2025), reflecting strong mindshare among crypto users. This community growth is important because it creates grassroots demand: users aware of Enso will encourage their favorite dApps to integrate it or will use products that leverage Enso’s shortcuts.

In summary, Enso has moved beyond theory to real adoption. It is trusted by 100+ projects including well-known names like Uniswap, SushiSwap, Stargate/LayerZero, Berachain, zkSync, Safe, Pendle, Yearn and more, either as integration partners or direct users of Enso’s tech. This broad usage across different verticals (DEXs, bridges, layer-1s, dApps) highlights Enso’s role as general-purpose infrastructure. Its key traction metric – $15B+ in transactions – is especially impressive for an infrastructure project at this stage and validates market fit for an intent-based middleware. Investors can take comfort that Enso’s network effects appear to be kicking in: more integrations beget more usage, which begets more integrations. The challenge ahead will be converting this early momentum into sustained growth, which ties into Enso’s positioning against competitors and its roadmap.

Competitor Landscape

Enso Network operates at the intersection of DeFi aggregation, cross-chain interoperability, and developer infrastructure, making its competitive landscape multi-faceted. While no single competitor offers an identical product, Enso faces competition from several categories of Web3 protocols:

  • Decentralized Middleware & Indexing: The most direct analogy is The Graph (GRT). The Graph provides a decentralized network for querying blockchain data via subgraphs. Enso similarly crowd-sources data providers (Action Providers) but goes a step further by enabling transaction execution in addition to data fetching. Whereas The Graph’s ~$924M market cap is built on indexing alone, Enso’s broader scope (data + action) positions it as a more powerful tool in capturing developer mindshare. However, The Graph is a well-established network; Enso will have to prove the reliability and security of its execution layer to achieve similar adoption. One could imagine The Graph or other indexing protocols expanding into execution, which would directly compete with Enso’s niche.

  • Cross-Chain Interoperability Protocols: Projects like LayerZero, Axelar, Wormhole, and Chainlink CCIP provide infrastructure to connect different blockchains. They focus on message passing and bridging assets between chains. Enso actually uses some of these under the hood (e.g., LayerZero/Stargate for bridging in the Uniswap migrator) and is more of a higher-level abstraction on top. In terms of competition, if these interoperability protocols start offering higher-level “intent” APIs or developer-friendly SDKs to compose multi-chain actions, they could overlap with Enso. For example, Axelar offers an SDK for cross-chain calls, and Chainlink’s CCIP could enable cross-chain function execution. Enso’s differentiator is that it doesn’t just send messages between chains; it maintains a unified engine and library of DeFi actions. It targets application developers who want a ready-made solution, rather than forcing them to build on raw cross-chain primitives. Nonetheless, Enso will compete for market share in the broader blockchain middleware segment where these interoperability projects are well funded and rapidly innovating.

  • Transaction Aggregators & Automation: In the DeFi world, there are existing aggregators like 1inch, 0x API, or CoW Protocol that focus on finding optimal trade routes across exchanges. Enso’s Grapher mechanism for intents is conceptually similar to CoW Protocol’s solver competition, but Enso generalizes it beyond swaps to any action. A user intent to “maximize yield” might involve swapping, lending, staking, etc., which is outside the scope of a pure DEX aggregator. That said, Enso will be compared to these services on efficiency for overlapping use cases (e.g., Enso vs. 1inch for a complex token swap route). If Enso consistently finds better routes or lower fees thanks to its network of Graphers, it can outcompete traditional aggregators. Gelato Network is another competitor in automation: Gelato provides a decentralized network of bots to execute tasks like limit orders, auto-compounding, or cross-chain transfers on behalf of dApps. Gelato has a GEL token and an established client base for specific use cases. Enso’s advantage is its breadth and unified interface – rather than offering separate products for each use case (as Gelato does), Enso offers a general platform where any logic can be encoded as a Shortcut. However, Gelato’s head start and focused approach in areas like automation could attract developers who might otherwise use Enso for similar functionalities.

  • Developer Platforms (Web3 SDKs): There are also Web2-style developer platforms like Moralis, Alchemy, Infura, and Tenderly that simplify building on blockchains. These typically offer API access to read data, send transactions, and sometimes higher-level endpoints (e.g., “get token balances” or “send tokens across chain”). While these are mostly centralized services, they compete for the same developer attention. Enso’s selling point is that it’s decentralized and composable – developers are not just getting data or a single function, they’re tapping into an entire network of on-chain capabilities contributed by others. If successful, Enso could become “the GitHub of on-chain actions,” where developers share and reuse Shortcuts, much like open-source code. Competing with well-funded infrastructure-as-a-service companies means Enso will need to offer comparable reliability and ease-of-use, which it is striving for with an extensive API and documentation.

  • Homegrown Solutions: Finally, Enso competes with the status quo – teams building custom integrations in-house. Traditionally, any project wanting multi-protocol functionality had to write and maintain smart contracts or scripts for each integration (e.g., integrating Uniswap, Aave, Compound separately). Many teams might still choose this route for maximum control or due to security considerations. Enso needs to convince developers that outsourcing this work to a shared network is secure, cost-effective, and up-to-date. Given the speed of DeFi innovation, maintaining one’s own integrations is burdensome (Enso often cites that teams spend 6+ months and $500k on audits to integrate dozens of protocols). If Enso can prove its security rigor and keep its action library current with the latest protocols, it can convert more teams away from building in silos. However, any high-profile security incident or downtime in Enso could send developers back to preferring in-house solutions, which is a competitive risk in itself.

Enso’s Differentiators: Enso’s primary edge is being first-to-market with an intent-focused, community-driven execution network. It combines features that would require using multiple other services: data indexing, smart contract SDKs, transaction routing, and cross-chain bridging – all in one. Its incentive model (rewarding third-party developers for contributions) is also unique; it could lead to a vibrant ecosystem where many niche protocols get integrated into Enso faster than any single team could do, similar to how The Graph’s community indexes a long tail of contracts. If Enso succeeds, it could enjoy a strong network effect moat: more Actions and Shortcuts make it more attractive to use Enso versus competitors, which attracts more users and thus more Actions contributed, and so on.

That said, Enso is still in its early days. Its closest analog, The Graph, took years to decentralize and build an ecosystem of indexers. Enso will similarly need to nurture its Graphers and Validators community to ensure reliability. Large players (like a future version of The Graph, or a collaboration of Chainlink and others) could decide to roll out a competing intent execution layer, leveraging their existing networks. Enso will have to move quickly to solidify its position before such competition materializes.

In conclusion, Enso sits at a competitive crossroads of several important Web3 verticals – it’s carving a niche as the “middleware of everything”. Its success will depend on outperforming specialized competitors in each use case (or aggregating them) and continuing to offer a compelling one-stop solution that justifies developers choosing Enso over building from scratch. The presence of high-profile partners and investors suggests Enso has a foot in the door with many ecosystems, which will be advantageous as it expands its integration coverage.

Roadmap and Ecosystem Growth

Enso’s development roadmap (as of mid-2025) outlines a clear path toward full decentralization, multi-chain support, and community-driven growth. Key milestones and planned initiatives include:

  • Mainnet Launch (Q3 2024) – Enso launched its mainnet network in the second half of 2024. This involved deploying the Tendermint-based chain and initializing the Validator ecosystem. Early validators were likely permissioned or selected partners as the network bootstrapped. The mainnet launch allowed real user queries to be processed by Enso’s engine (prior to this, Enso’s services were accessible via a centralized API while in beta). This milestone marked Enso’s transition from an in-house platform to a public decentralized network.

  • Network Participant Expansion (Q4 2024) – Following mainnet, the focus shifted to decentralizing participation. In late 2024, Enso opened up roles for external Action Providers and Graphers. This included releasing tooling and documentation for developers to create their own Actions (smart contract adapters) and for algorithm developers to run Grapher nodes. We can infer that incentive programs or testnet competitions were used to attract these participants. By end of 2024, Enso aimed to have a broader set of third-party actions in its library and multiple Graphers competing on intents, moving beyond the core team’s internal algorithms. This was a crucial step to ensure Enso isn’t a centralized service, but a true open network where anyone can contribute and earn ENSO tokens.

  • Cross-Chain Expansion (Q1 2025) – Enso recognizes that supporting many blockchains is key to its value proposition. In early 2025, the roadmap targeted integration with new blockchain environments beyond the initial EVM set. Specifically, Enso planned support for Monad, Solana, and Movement by Q1 2025. Monad is an upcoming high-performance EVM-compatible chain (backed by Dragonfly Capital) – supporting it early could position Enso as the go-to middleware there. Solana integration is more challenging (different runtime and language), but Enso’s intent engine could work with Solana by using off-chain graphers to formulate Solana transactions and on-chain programs acting as adapters. Movement refers to Move-language chains (perhaps Aptos/Sui or a specific one called Movement). By incorporating Move-based chains, Enso would cover a broad spectrum of ecosystems (Solidity and Move, as well as existing Ethereum rollups). Achieving these integrations means developing new Action modules that understand Solana’s CPI calls or Move’s transaction scripts, and likely collaborating with those ecosystems for oracles/indexing. Enso’s mention in updates suggests these were on track – for example, a community update highlighted partnerships or grants (the mention of “Eclipse mainnet live + Movement grant” in a search result suggests Enso was actively working with novel L1s like Eclipse and Movement by early 2025).

  • Near-Term (Mid/Late 2025) – Although not explicitly broken out in the one-pager roadmap, by mid-2025 Enso’s focus is on network maturity and decentralization. The completion of the CoinList token sale in June 2025 is a major event: the next steps would be token generation and distribution (expected around July 2025) and launching on exchanges or governance forums. We anticipate Enso will roll out its governance process (Enso Improvement Proposals, on-chain voting) so the community can start participating in decisions using their newly acquired tokens. Additionally, Enso will likely move from “beta” to a fully production-ready service, if it hasn’t already. Part of this will be security hardening – conducting multiple smart contract audits and perhaps running a bug bounty program, considering the large TVLs involved.

  • Ecosystem Growth Strategies: Enso is actively fostering an ecosystem around its network. One strategy has been running educational programs and hackathons (e.g., the Shortcut Speedrun and workshops) to onboard developers to the Enso way of building. Another strategy is partnering with new protocols at launch – we’ve seen this with Berachain, zkSync’s campaign, and others. Enso is likely to continue this, effectively acting as an “on-chain launch partner” for emerging networks or DeFi projects, handling their complex user onboarding flows. This not only drives Enso’s volume (as seen with Berachain) but also integrates Enso deeply into those ecosystems. We expect Enso to announce integrations with more Layer-2 networks (e.g., Arbitrum, Optimism were presumably already supported; perhaps newer ones like Scroll or Starknet next) and other L1s (Polkadot via XCM, Cosmos via IBC or Osmosis, etc.). The long-term vision is that Enso becomes chain-ubiquitous – any developer on any chain can plug in. To that end, Enso may also develop better bridgeless cross-chain execution (using techniques like atomic swaps or optimistic execution of intents across chains), which could be on the R&D roadmap beyond 2025.

  • Future Outlook: Looking further, Enso’s team has hinted at involvement of AI agents as network participants. This suggests a future where not only human developers, but AI bots (perhaps trained to optimize DeFi strategies) plug into Enso to provide services. Enso might build out this vision by creating SDKs or frameworks for AI agents to safely interface with the intent engine – a potentially groundbreaking development merging AI and blockchain automation. Moreover, by late 2025 or 2026, we anticipate Enso will work on performance scaling (maybe sharding its network or using zero-knowledge proofs to validate intent execution correctness at scale) as usage grows.

The roadmap is ambitious but execution so far has been strong – Enso has met key milestones like mainnet launch and delivering real use cases. An important upcoming milestone is the full decentralization of the network. Currently, the network is in a transition: the documentation notes the decentralized network is in testnet and a centralized API was being used for production as of earlier in 2025. By now, with mainnet live and token in circulation, Enso will aim to phase out any centralized components. For investors, tracking this decentralization progress (e.g., number of independent validators, community Graphers joining) will be key to evaluating Enso’s maturity.

In summary, Enso’s roadmap focuses on scaling the network’s reach (more chains, more integrations) and scaling the network’s community (more third-party participants and token holders). The ultimate goal is to cement Enso as critical infrastructure in Web3, much like how Infura became essential for dApp connectivity or how The Graph became integral for data querying. If Enso can hit its milestones, the second half of 2025 should see a blossoming ecosystem around the Enso Network, potentially driving exponential growth in usage.

Risk Assessment

Like any early-stage protocol, Enso Network faces a range of risks and challenges that investors should carefully consider:

  • Technical and Security Risks: Enso’s system is inherently complex – it interacts with myriad smart contracts across many blockchains through a network of off-chain solvers and validators. This expansive surface area introduces technical risk. Each new Action (integration) could carry vulnerabilities; if an Action’s logic is flawed or a malicious provider introduces a backdoored Action, user funds could be at risk. Ensuring every integration is secure required substantial investment (Enso’s team spent over $500k on audits for integrating 15 protocols in its early days). As the library grows to hundreds of protocols, maintaining rigorous security audits is challenging. There’s also the risk of bugs in Enso’s coordination logic – for example, a flaw in how Graphers compose transactions or how Validators verify them could be exploited. Cross-chain execution, in particular, can be risky: if a sequence of actions spans multiple chains and one part fails or is censored, it could leave a user’s funds in limbo. Although Enso likely uses retries or atomic swaps for some cases, the complexity of intents means unknown failure modes might emerge. The intent-based model itself is relatively unproven at scale – there may be edge cases where the engine produces an incorrect solution or an outcome that diverges from the user’s intent. Any high-profile exploit or failure could undermine confidence in the whole network. Mitigation requires continuous security audits, a robust bug bounty program, and perhaps insurance mechanisms for users (none of which have been detailed yet).

  • Decentralization and Operational Risks: At present (mid-2025), the Enso network is still in the process of decentralizing its participants. This means there may be unseen operational centralization – for instance, the team’s infrastructure might still be co-ordinating a lot of the activity, or only a few validators/graphers are genuinely active. This presents two risks: reliability (if the core team’s servers go down, will the network stall?) and trust (if the process isn’t fully trustless yet, users must have faith in Enso Inc. not to front-run or censor transactions). The team has proven reliability in big events (like handling $3B volume in days), but as usage grows, scaling the network via more independent nodes will be crucial. There’s also a risk that network participants don’t show up – if Enso cannot attract enough skilled Action Providers or Graphers, the network might remain dependent on the core team, limiting decentralization. This could slow innovation and also concentrate too much power (and token rewards) within a small group, the opposite of the intended design.

  • Market and Adoption Risks: While Enso has impressive early adoption, it’s still in a nascent market for “intent-based” infrastructure. There is a risk that the broader developer community might be slow to adopt this new paradigm. Developers entrenched in traditional coding practices might be hesitant to rely on an external network for core functionality, or they may prefer alternative solutions. Additionally, Enso’s success depends on continuous growth of DeFi and multi-chain ecosystems. If the multi-chain thesis falters (for example, if most activity consolidates on a single dominant chain), the need for Enso’s cross-chain capabilities might diminish. On the flip side, if a new ecosystem arises that Enso fails to integrate quickly, projects in that ecosystem won’t use Enso. Essentially, staying up-to-date with every new chain and protocol is a never-ending challenge – missing or lagging on a major integration (say a popular new DEX or a Layer-2) could push projects to competitors or custom code. Furthermore, Enso’s usage could be hurt by macro market conditions; in a severe DeFi downturn, fewer users and developers might be experimenting with new dApps, directly reducing intents submitted to Enso and thus the fees/revenue of the network. The token’s value could suffer in such a scenario, potentially making staking less attractive and weakening network security or participation.

  • Competition: As discussed, Enso faces competition on multiple fronts. A major risk is a larger player entering the intent execution space. For instance, if a well-funded project like Chainlink were to introduce a similar intent service leveraging their existing oracle network, they could quickly overshadow Enso due to brand trust and integrations. Similarly, infrastructure companies (Alchemy, Infura) could build simplified multi-chain SDKs that, while not decentralized, capture the developer market with convenience. There’s also the risk of open-source copycats: Enso’s core concepts (Actions, Graphers) could be replicated by others, perhaps even as a fork of Enso if the code is public. If one of those projects forms a strong community or finds a better token incentive, it might divert potential participants. Enso will need to maintain technological leadership (e.g., by having the largest library of Actions and most efficient solvers) to fend off competition. Competitive pressure could also squeeze Enso’s fee model – if a rival offers similar services cheaper (or free, subsidized by VCs), Enso might be forced to lower fees or increase token incentives, which could strain its tokenomics.

  • Regulatory and Compliance Risks: Enso operates in the DeFi infrastructure space, which is a gray area in terms of regulation. While Enso itself doesn’t custody user funds (users execute intents from their own wallets), the network does automate complex financial transactions across protocols. There is a possibility that regulators could view intent-composition engines as facilitating unlicensed financial activity or even aiding money laundering if used to shuttle funds across chains in obscured ways. Specific concerns could arise if Enso enablescross-chain swaps that touch privacy pools or jurisdictions under sanctions. Additionally, the ENSO token and its CoinList sale reflect a distribution to a global community – regulators (like the SEC in the U.S.) might scrutinize it as an offering of securities (notably, Enso excluded US, UK, China, etc., from the sale, indicating caution on this front). If ENSO were deemed a security in major jurisdictions, it could limit exchange listings or usage by regulated entities. Enso’s decentralized network of validators might also face compliance issues: for example, could a validator be forced to censor certain transactions due to legal orders? This is largely hypothetical for now, but as the value flowing through Enso grows, regulatory attention will increase. The team’s base in Switzerland might offer a relatively crypto-friendly regulatory environment, but global operations mean global risks. Mitigating this likely involves ensuring Enso is sufficiently decentralized (so no single entity is accountable) and possibly geofencing certain features if needed (though that would be against the ethos of the project).

  • Economic Sustainability: Enso’s model assumes that fees generated by usage will sufficiently reward all participants. There’s a risk that the fee incentives may not be enough to sustain the network, especially early on. For instance, Graphers and Validators incur costs (infrastructure, development time). If query fees are set too low, these participants might not profit, leading them to drop off. On the other hand, if fees are too high, dApps may hesitate to use Enso and seek cheaper alternatives. Striking a balance is hard in a two-sided market. The Enso token economy also relies on token value to an extent – e.g., staking rewards are more attractive when the token has high value, and Action Providers earn value in ENSO. A sharp decline in ENSO price could reduce network participation or prompt more selling (which further depresses the price). With a large portion of tokens held by investors and team (over 56% combined, vesting over 2 years), there’s an overhang risk: if these stakeholders lose faith or need liquidity, their selling could flood the market post-vesting and undermine the token’s price. Enso tried to mitigate concentration by the community sale, but it’s still a relatively centralized token distribution in the near term. Economic sustainability will depend on growing genuine network usage to a level where fee revenue provides sufficient yield to token stakers and contributors – essentially making Enso a “cash-flow” generating protocol rather than just a speculative token. This is achievable (think of how Ethereum fees reward miners/validators), but only if Enso achieves widespread adoption. Until then, there is a reliance on treasury funds (15% allocated) to incentivize and perhaps to adjust the economic parameters (Enso governance may introduce inflation or other rewards if needed, which could dilute holders).

Summary of Risk: Enso is pioneering new ground, which comes with commensurate risk. The technological complexity of unifying all of DeFi into one network is enormous – each blockchain added or protocol integrated is a potential point of failure that must be managed. The team’s experience navigating earlier setbacks (like the limited success of the initial social trading product) shows they are aware of pitfalls and adapt quickly. They have actively mitigated some risks (e.g., decentralizing ownership via the community round to avoid overly VC-driven governance). Investors should watch how Enso executes on decentralization and whether it continues to attract top-tier technical talent to build and secure the network. In the best case, Enso could become indispensable infrastructure across Web3, yielding strong network effects and token value accrual. In the worst case, technical or adoption setbacks could relegate it to being an ambitious but niche tool.

From an investor’s perspective, Enso offers a high-upside, high-risk profile. Its current status (mid-2025) is that of a promising network with real usage and a clear vision, but it must now harden its technology and outpace a competitive and evolving landscape. Due diligence on Enso should include monitoring its security track record, the growth of query volumes/fees over time, and how effectively the ENSO token model incentivizes a self-sustaining ecosystem. As of now, the momentum is in Enso’s favor, but prudent risk management and continued innovation will be key to turning this early leadership into long-term dominance in the Web3 middleware space.

Sources:

  • Enso Network Official Documentation and Token Sale Materials

    • CoinList Token Sale Page – Key Highlights & Investors
    • Enso Docs – Tokenomics and Network Roles
  • Interviews and Media Coverage

    • CryptoPotato Interview with Enso CEO (June 2025) – Background on Enso’s evolution and intent-based design
    • DL News (May 2025) – Overview of Enso’s shortcuts and shared state approach
  • Community and Investor Analyses

    • Hackernoon (I. Pandey, 2025) – Insights on Enso’s community round and token distribution strategy
    • CryptoTotem / CoinLaunch (2025) – Token supply breakdown and roadmap timeline
  • Enso Official Site Metrics (2025) and Press Releases – Adoption figures and use-case examples (Berachain migration, Uniswap collaboration).

User Pain Points with RiseWorks: A Comprehensive Analysis

· 21 min read
Dora Noda
Software Engineer

RiseWorks is a global payroll platform enabling companies to hire and pay international contractors in fiat or crypto. User feedback reveals a range of pain points across different user types – HR professionals, freelancers/contractors (including funded traders), startups, and businesses – touching on onboarding, pricing, support, features, integrations, ease of use, and performance. Below is a detailed report of recurring issues (with direct user quotes) and how sentiments have evolved over time.

Onboarding Experience

RiseWorks touts automated onboarding and compliance checks (KYC/AML) to streamline bringing on contractors. HR teams appreciate not having to manually handle contractor paperwork, and the platform claims a 94% approval rate with a 17-second median ID verification time. This suggests most users get verified almost instantly, which is a positive for quick onboarding.

However, some freelancers find the identity verification (KYC) process tedious. New contractors must provide extensive details (e.g. personal info, tax ID, proof of address) as part of registration. A few users encountered KYC issues (one even created a YouTube guide on fixing RiseWorks KYC rejections), indicating that when the automated process fails, it can be confusing to resolve. In general, though, there haven’t been widespread complaints about the sign-up itself – most frustration arises later during payouts. Overall, onboarding is thorough but typical for a compliance-focused payroll system: it front-loads some effort to ensure legal and tax requirements are met, which some users accept as necessary, while others feel could be smoother.

Pricing and Fees

RiseWorks uses a dual pricing model: either a flat $50 per contractor per month or a 3% fee on payment volume, with an Employer-of-Record option (~$399 per employee) for full-time international hires. For freelancers (contractors), the platform itself is free to sign up – they can send invoices and receive payments without subscribing. Startups and businesses choose between paying per contractor vs. a percentage of payouts depending on which is more cost-effective for their team size and payout amounts.

Pain points around pricing have not been the center of user complaints (operational issues overshadow cost concerns). However, some companies note that 3% of large payouts can become hefty, while $50/month for each contractor might be steep if you have many small engagements. As a point of comparison, Rise’s own marketing claims its fees are lower than competitors like Deel. One independent review also highlighted that Rise offers crypto payouts with minimal fees (only ~$2.50 on-chain fees, or free on layer-2 networks), which can be appealing for cost-conscious crypto-native businesses.

In summary, pricing feedback is mixed: startups and HR managers appreciate the transparency of a flat fee or percentage choice, but they must calculate which model is affordable for them. So far, no major outcry on “hidden fees” or unfair pricing has appeared in user reviews. The main caution is for businesses to weigh the flat vs. percent model – e.g. a $10,000 contractor payment would incur $300 fee under the 3% plan, which might prompt choosing the flat monthly rate instead. Proper guidance on selecting plans could improve satisfaction here.

Customer Support

Customer support is one of the most significant pain points echoed by users across the board. RiseWorks advertises 24/7 multilingual support and multiple contact channels (in-app chat, email, even a Google form). In practice, however, user feedback paints a very different picture.

Freelancers and traders have reported extremely poor response times. One user lamented that **“they have no customer support. You’ll get 1 automated message and no replies after that. I don’t even know how to get my funds back lol.”*. Others similarly describe support as virtually non-existent. For example, a funded forex trader who tried RiseWorks for a payout warned: “Don’t try it… I withdrew with them and [am] failing to get my cash, support is very poor, they don’t respond at all despite having received my cash. I have 2 days now still trying to withdraw but I wish I hadn’t selected this crap of service.” This kind of feedback – no response to urgent withdrawal issues – is alarming for users expecting help.

HR professionals and business owners also find this troubling. If their contractors can’t get assistance or funds, it reflects poorly on the company. Some HR users note that while their account managers set up the service, ongoing support is hard to reach when issues arise. This has been a recurring theme: “terrible CS” (customer service) is mentioned alongside negative Trustpilot reviews. In social media forums and groups, users shared Trustpilot links and warned others to “beware of Rise” due to support and payout problems.

It’s worth noting that RiseWorks appears aware of support shortcomings and has provided more contact methods (the Google form, etc.). But as of the past year, the predominant user sentiment is frustration with support responsiveness. Quick, helpful support is critical in payroll (especially when money is in limbo), so this is a key area where RiseWorks is currently failing its users. Both freelancers and companies are demanding more reliable, real-time support to address payment issues.

Features and Functionality

RiseWorks is a feature-rich platform, especially appealing to crypto and Web3 companies. Users appreciate some of its unique capabilities, but also point out a few missing or immature features given the company’s relative youth (founded 2019).

Notable features praised by users (mostly businesses and crypto-savvy freelancers) include:

  • Hybrid payouts (fiat & crypto): Rise supports 90+ local currencies and 100+ cryptocurrencies, allowing companies and contractors to mix and match payout methods. This flexibility is a standout feature – for example, a contractor can choose to receive part of their pay in local currency and part in USDC. For Web3-native workers, this is a big plus.
  • Compliance automation: The platform handles drafting compliant contracts, tax form generation, and local law compliance for international contractors. HR professionals value this “all-in-one” aspect, as it reduces legal risk. One external review noted Rise “navigates international tax laws and regulations” to keep things compliant for every contractor.
  • Crypto finance extras: Freelancers on Rise can access built-in features like high-yield DeFi accounts for their earnings (as mentioned on Rise’s site) and secure storage via Rise’s smart contract wallet. These novel features aren’t common in traditional payroll software.

Despite these strengths, users have identified some functionality pain points:

  • Lack of certain integrations or features standard in mature platforms: Because RiseWorks is “newer to the payroll industry (5 years old)”, some advanced features are still catching up. For instance, recruiters note that Rise doesn’t yet have robust reporting/analytics on spend or automatic general ledger integrations. A startup comparing options found that while Rise covers the basics, it lacked some bells and whistles (like time-tracking or invoice generation for clients) that they had to handle separately.
  • Mobile app availability: A few contractors wished for a dedicated mobile app. Currently, RiseWorks is accessed via web; the interface is responsive, but an app for on-the-go access (to check payment status or upload documents) would enhance usability. Competing services often have mobile apps, so this is a minor gripe from the freelancer side.
  • New feature stability: As Rise adds features (for example, they recently introduced direct EUR/GBP bank payouts with conversion), some early adopters experienced bugs. One user mentioned initial hiccups setting up a “RiseID” (a Web3 identity feature) – the concept is promising, but the setup failed for them until support (eventually) resolved it. This suggests that cutting-edge features sometimes need more polish.

In summary, RiseWorks’ feature set is powerful but still evolving. Tech-forward users love the crypto integration and compliance automation, while some traditional users miss features they’re accustomed to in older, more established payroll systems. The core functionality is solid (global payments in multiple currencies), yet the platform would benefit from continuing to refine new features and perhaps adding more business-oriented tools (reports, integrations) as it matures.

Integrations

Integration capabilities are a mixed bag and depend on the user’s context:

  • For Web3 and crypto users, RiseWorks shines by integrating with popular blockchain tools. It connects to widely used crypto wallets and chains, offering flexibility in funding and withdrawing. For example, it supports direct integration with Ethereum and Polygon networks, and wallets like MetaMask and Gnosis Safe. This means companies can fund payroll from a crypto treasury or contractors can withdraw to their personal crypto wallet seamlessly. One user pointed out they chose Rise specifically so they could pay a team in stablecoins without manual transfers – a big convenience over piecing together exchanges and bank wires.
  • For traditional businesses/HR systems, however, RiseWorks’ integrations are limited. It does not yet natively integrate with common HR or accounting software (such as Workday, QuickBooks, or ERP systems). An HR manager noted that data from Rise (e.g. payment records, contractor details) had to be exported and input into their accounting system manually. The platform does provide an API for custom integrations, but this requires technical effort. In contrast, some competitors offer plug-and-play integrations with popular software, so this is an area of improvement.

Another integration pain point mentioned by users in certain countries is with local banks and payment networks. RiseWorks ultimately relies on partner banks or services to deliver local currency. In one case, an Indian freelancer’s bank (Axis Bank) rejected the incoming transfer after 18 hours, possibly due to the intermediary or crypto-related origin, causing payout delays. This suggests integration with local banking systems can be hit-or-miss depending on region. Users in places with strict bank policies may need alternative payout methods (or for Rise to partner with different processors).

To summarize integration feedback: Great for crypto connectivity, lacking for traditional software ecosystems. Startups and freelancers in the crypto space laud how well RiseWorks plugs into blockchain workflows. Meanwhile, HR teams at traditional firms view the lack of out-of-the-box integration with their existing tools as a friction point, requiring workarounds. As Rise expands, adding integrations (or even simple CSV import/exports) for major payroll/accounting systems could alleviate this pain for business users.

Ease of Use and Interface

On the whole, users find the RiseWorks interface modern and relatively intuitive, but certain processes can be confusing especially when issues arise. The onboarding guide for funded traders (from a partner prop firm) shows the platform steps clearly – e.g. the dashboard to “easily submit invoices” for your earnings and withdraw in your chosen currency. Contractors have reported that basic tasks like creating an invoice or adding a withdrawal method are straightforward through the guided workflow. The design is clean and tailored to both non-crypto users (who can simply choose a bank transfer) and crypto users (who connect a wallet).

However, ease of use drops when something goes wrong. The user experience for exception cases (like a KYC verification failure, a withdrawal stuck in processing, or needing to contact support) is frustrating. Because support lagged, users ended up seeking help on forums or trying to troubleshoot on their own – which speaks to a lack of in-app guidance for resolving issues. For instance, a user whose payout was in limbo couldn’t find status details or next steps in the UI, leading them to post “How do I even get my money?” on Reddit out of confusion. This indicates the platform might not surface clear error messages or actionable info when payments are delayed (an area to improve UX).

From an HR perspective, the admin interface for onboarding and managing contractors is decent, but could be more feature-rich for ease of use. HR users would like to see, for example, a single view of all contractor statuses (KYC pending, payment in process, etc.) and maybe a bulk action tool. Currently, the platform’s focus is on individual contractor workflows, which is simple but at scale can become a bit click-heavy for HR teams managing dozens of contractors.

In summary, RiseWorks is easy to use for standard operations, but its user-friendliness falters in edge cases. New users generally have little trouble navigating the system for intended tasks. The interface is comparable to other modern SaaS products and even first-time freelancers can figure out how to get set up and invoice their client through Rise. On the flip side, when users encounter an unusual scenario (like a delay or a need to update submitted info), the platform offers limited guidance – causing confusion and reliance on external support. Smoother handling of those scenarios and more proactive communication in-app would greatly enhance the overall user experience.

Performance and Reliability

Performance, in terms of payment processing speed and reliability, has been the most critical issue for many users. The platform’s technical performance (site uptime, page loading) hasn’t drawn complaints – the website and app generally load fine. It’s the operational performance of getting money from point A to B that shows problems.

Payout Delays: Numerous users have reported that bank withdrawals take far longer than expected. In several cases, contractors waited weeks for funds that were supposed to arrive in days. One trader shared that “my payout has been stuck in withdrawal phase with them for 2 weeks now”. Another user similarly posted about a withdrawal pending for days without updates. Such delays leave freelancers in limbo, unsure if or when they will receive their earnings. This is a severe reliability concern – on a payroll platform, timely payment is fundamental. Some affected users even voiced fears that they had been scammed when money didn’t show up on time. While RiseWorks eventually did fulfill many of these payouts, the lack of communication during the delay exacerbated the frustration.

Crypto vs. Bank Transfer Performance: Interestingly, feedback indicates that crypto payouts are much faster and smoother than traditional bank transfers on RiseWorks. Contractors who opted to withdraw in cryptocurrency (like USDC) often received their funds quickly – sometimes within minutes if on a crypto wallet. A customer feedback analysis noted “quick crypto withdrawals” as a positive theme, contrasted with “delayed bank transfers” for fiat. This suggests that Rise’s crypto infrastructure is robust, but its banking partnerships or processes may be a bottleneck. For users, this created a divide: tech-savvy freelancers learned to prefer crypto to avoid delays, whereas those needing local currency had to endure waiting periods.

System Stability: Aside from payment timing, there were a few instances of system glitches. In mid-2024, a handful of users encountered errors like being unable to initiate a withdrawal or the platform showing a “processing” status indefinitely. These might have been one-off bugs or related to the KYC/documents not being fully approved behind the scenes. There isn’t evidence of widespread outages, but even isolated cases of hung transactions erode trust. RiseWorks does have a status page, yet some users weren’t aware of it or it didn’t reflect their specific issue.

Trust and Perceived Reliability: Early on, RiseWorks struggled with user trust. In mid-2024 when it was relatively new to many, it had an average Trustpilot rating around 3.3 out of 5 (an “Average” score) with very few reviews. Comments about missing money and poor support led some to label it untrustworthy. One third-party scam monitoring site even flagged riseworks.io with a “very low trust score”, cautioning it might be risky. This shows how performance issues (like payout failures) directly impacted its reputation.

However, by 2025 there are signs of improvement. More users have successfully used the service, and satisfied voices have somewhat balanced out the detractors. According to an aggregate review report, the overall Trustpilot rating for RiseWorks climbed to 4.4/5 as of April 2025. This suggests that many users eventually did get paid and had a decent experience, possibly leaving positive feedback. The increase in rating could mean the company addressed some early bugs and delays, or that users who utilize the crypto payout (which works reliably) gave high scores. Regardless, the presence of happy customers alongside the unhappy ones now indicates mixed experiences – not uniformly bad as the initial feedback might have implied.

In conclusion on reliability: RiseWorks has proven reliable for some (especially via crypto), but inconsistent for others (especially via banks). The platform’s performance has been patchy, which is a major pain point because payroll is all about trust and timing. Freelancers and businesses need to know payments will arrive as promised. Until Rise can ensure bank transfers are as prompt as their crypto payments, this will remain a concern. The trend in recent months is somewhat positive (fewer horror stories, better ratings), but cautious optimism is warranted – users still frequently advise each other to “be careful and have a backup” when using Rise, reflecting lingering concerns about its reliability.

Summary of Recurring Themes and Patterns

Across user types, a few recurring pain points stand out clearly on the RiseWorks platform:

  • Payout Delays and Unreliability: This is the number one issue raised by freelancers (especially funded traders and contractors). Early users in 2023-2024 often experienced significant delays in receiving funds, with some waiting weeks and fearing they might never get paid. This pattern seems to be improving in 2025, but delays (particularly for fiat transfers) are still reported. The contrast between slow bank transfers and fast crypto payouts is a recurring theme – indicating the platform’s traditional payment rails need improvement.
  • Poor Customer Support: Nearly every negative review or forum post cites the lack of responsive support. Users across the spectrum (HR admins and contractors alike) have been frustrated by either no replies or generic, unhelpful responses when they reach out for help. This has been consistent from the platform’s early days up to recent times, though the company claims 24/7 support availability. It’s a critical pain point because it compounds other issues; when a payment is delayed, not getting timely support makes the experience far worse.
  • Trust and Transparency Issues: In the platform’s initial rollout to new communities (like prop trading firms’ users), there was skepticism due to the above issues. RiseWorks had to battle perceptions of being a “scam” or unreliable. Over time, as more users successfully received payments, some trust is being earned back (reflected in improved ratings). Still, trust remains fragile – new users often seek out reviews and ask others if RiseWorks is safe before committing their earnings to it. Businesses considering RiseWorks also evaluate its short track record and sometimes express hesitation to rely on a relatively young company for something as sensitive as payroll.
  • Value Proposition vs. Execution: Users acknowledge that RiseWorks is tackling a valuable problem – global contractor payments with crypto options – and many want it to work. HR professionals and startup founders like the idea of a one-stop solution for international compliance, and freelancers like having more ways to get paid (especially in crypto with low fees). When the platform works as intended, these benefits are realized, and users are pleased. For instance, a few Trustpilot comments (per summary reports) praise how easy it was to withdraw in their local currency, or how convenient it is to not worry about tax forms. The pain point is that the execution hasn’t been consistent. The concept is strong, but the company is still ironing out operational kinks. As one community member aptly put it, “Rise has potential, but they need to sort out their payout system and support if they want people to stick with it.” This encapsulates the sentiment that many early adopters have: cautiously hopeful but currently disappointed in key areas.

Below is a summary table of pain points by category, with highlights of what users have reported:

AspectPain Points ReportedSupporting User Feedback
OnboardingSome friction with KYC (ID verification, document upload) process, especially if information isn’t accepted on first try.“Comprehensive Automation… including automated onboarding” (Pros); Some needed external help for KYC issues (e.g. YouTube tutorials – implies process could be clearer).
Pricing & FeesPricing model ($50/contractor or 3% volume) must be chosen carefully; high volume payouts can incur large fees. Contractors sometimes bear fees (e.g. ~0.95% on certain transfers).Rise claims to undercut competitors on fees. Few direct complaints on cost – one reason is other issues took precedence. Startups do note to “mind the 3% if doing large payouts” (community advice).
Customer SupportVery slow or no responses to support queries; lack of live resolution. Users felt abandoned when issues arose.“They have no customer support. [You’ll] get 1 automated message and no replies…”; “Support is very poor, they don’t respond at all…crap service”.
FeaturesMissing some advanced features (time tracking, integrations, detailed reporting). New features (RiseID, etc.) have occasional bugs.“Newer to the payroll industry (5 years old)” – still adding features. Users appreciate crypto payout feature, but note it’s a basic payroll tool lacking extras that older systems have.
IntegrationsLimited integration with external business software; no native sync with HRIS or accounting systems. Some issues interfacing with certain local banks.“Rise integrates with… widely used [blockchain] wallets” (crypto integration is a plus). But traditional integration is manual (CSV exports/API). One user’s local bank refused a Rise transfer, causing delays.
Ease of UseGenerally user-friendly UI, but poor guidance when errors occur. Users unsure what to do when a payout is stuck or KYC needs re-submission.“The Rise dashboard lets you easily submit invoices…withdraw in local currency or supported cryptocurrencies.” (intuitive for normal tasks). Lacks in-app alerts or tips when something goes wrong, leading to user confusion.
PerformancePayout processing is inconsistent – fast for crypto, but slow for fiat. Some payouts stuck for days/weeks. Reliability concerns and anxiety over whether money will arrive.“Delayed bank transfers” and “funds seem to be in limbo”; multiple Reddit threads about waiting weeks. In contrast, “quick crypto withdrawals” reported by others.

Patterns Over Time: Early feedback (late 2022 and 2023) was largely negative, centering on unmet basic expectations (money not arriving, no support). This created a narrative in forums that “RiseWorks is not delivering”. Over 2024 and into 2025, the company appears to have taken steps to address these issues: expanding payment corridors (adding EU/UK local transfers), providing more support channels, and likely resolving many individual cases. Consequently, we see a more mixed set of reviews recently – some users reporting smooth experiences alongside those who still hit snags. The Trustpilot score rising to 4.4/5 by April 2025 (from ~3/5 a year prior) exemplifies this shift. It suggests that a number of users are now satisfied (or at least the happy customers increased), perhaps due to successful crypto payouts or improved processes.

That said, key pain points persist in 2025: delays in certain payouts and subpar support are mentioned in recent discussions, meaning RiseWorks hasn’t fully escaped those problems. The improvement in average ratings could reflect proactive measures, but also possibly efforts to encourage positive reviews. It’s important to note that even with a 4.4 average, the negative experiences were very severe for those who had them, and those narratives continue to circulate in user communities (Reddit, prop trading forums, etc.). New users often explicitly ask if others have had issues, indicating the caution that still surrounds the platform’s reputation.

Conclusion

In conclusion, RiseWorks addresses a real need for global payroll (especially bridging crypto and fiat payments), but user experiences show a gap between promise and reality. HR professionals and businesses love the concept of compliant, automated contractor payments in any currency, yet they worry about reliability when they see freelancers struggling to get paid. Freelancers and funded traders are excited by flexible payout options and low fees, but many have encountered unacceptable delays and silence when they needed help. Over time there are signs of improvement – some users now report positive outcomes – but the recurring themes of payout delays and poor support remain the biggest pain points holding RiseWorks back.

For RiseWorks to fully win over all user types, it will need to significantly improve its customer support responsiveness and ensure timely payments consistently. If those core issues are fixed, much of the historical negativity would likely fade, as the underlying service offering is strong and innovative. Until then, user feedback will likely continue to be mixed: with startups and crypto-native users praising features and cost, and others cautioning about support and speed. As one user summarized on social media, RiseWorks has great potential but must “deliver on the basics” – a sentiment that encapsulates the platform’s current standing in the eyes of its users.

Sources:

  • User discussions on Reddit (r/Forex, r/Daytrading, r/buhaydigital) highlighting payout delays and support issues
  • Trustpilot summary via TradersUnion/Kimola reports (mixed reviews: “delayed bank transfers, lack of customer support, and quick crypto withdrawals”)
  • RiseWorks marketing and documentation (pricing page, integrations, and competitor comparisons)
  • Community posts (Facebook group for prop firm traders) warning about missing bank payouts
  • PipFarm user guide for RiseWorks, outlining onboarding steps and contractor experience
  • Medium review on RiseWorks (Coinmonks) for feature overview and pricing details.