Covenant-72B: The Largest Collaboratively Trained AI Model in Crypto History
What if the next frontier AI model wasn't trained in a billion-dollar data center owned by a single corporation — but by dozens of anonymous contributors scattered across the globe, coordinated by a blockchain, communicating over ordinary internet connections?
That's exactly what just happened. Templar's Covenant-72B, a 72.7-billion-parameter large language model pre-trained entirely on Bittensor's Subnet 3, has become the largest collaboratively trained AI model in crypto history, and one of the first to reach performance competitive with centralized baselines while keeping participation fully permissionless. No whitelists. No corporate gatekeepers. Just GPUs, compressed gradients, and a token-incentive mechanism that kept everyone honest.
Anthropic co-founder Jack Clark highlighted the achievement in his influential Import AI newsletter, noting that decentralized training compute is growing at 20x per year, four times the 5x annual growth rate of centralized frontier training.
Here's why this matters far beyond the Bittensor ecosystem.