Gradients


Gradients is a decentralized artificial intelligence (AI) training platform operating as Subnet 56 within the Bittensor ecosystem. The project aims to provide enterprise-grade AI model training services by creating a transparent and competitive environment for developing and validating training methodologies for both text and image-based models. [1] [2]

Overview

Gradients is designed to address a key challenge in the adoption of decentralized AI: the lack of transparency and trust. Many enterprise clients are hesitant to use decentralized networks for training proprietary models due to the "black-box" nature of the process, where the methods used by anonymous network participants (miners) are not verifiable. An enterprise client was quoted as stating, "If there was more transparency and trust with what was happening with the data, there would be a clear path to using Gradients." To solve this, Gradients shifted its model to one of "competitive transparency." [1]

The platform functions as an Automated Large Language Model (AutoLLM) system that leverages a decentralized network of compute providers. Its core innovation, introduced in the Gradients 5.0 update, is a tournament-based system where developers compete to create the most effective AI training scripts. The winning scripts from these competitions are made open-source, providing full visibility into the methodology. This allows Gradients to offer validated, high-performance, and transparent training solutions to enterprise customers. The project's development is associated with an entity named "rayonlabs," as indicated by its code repository. [1] [3]

The project's vision is to create a commercial platform that offers not only performance but also verifiable trust. By open-sourcing the best methodologies and planning for future integration with trusted compute services, Gradients aims to provide a value proposition that combines the innovation of a competitive ecosystem with the security and transparency required by enterprise clients. As stated by the project's spokesperson, "The path to commercial viability runs through openness. Gradients 5.0 is that path." [1]

History

The public discourse around Gradients began in early 2025, with a series of announcements outlining its capabilities and progress. In an April 6, 2025 post, the platform was described as a high-performing AutoLLM system where decentralized miners compete to train models. By May 9, 2025, the project claimed to have achieved leadership in training both text and image models, highlighting its multi-modal capabilities. [4]

A significant development occurred in June 2025 with the release of a research paper titled "G.O.D.: Training Foundation Models as a Differentiable Game," which reportedly detailed "world-leading results" achieved by the platform. This was followed by the announcement of Gradients 5.0 on July 6, 2025. This major update was framed as a strategic pivot to address enterprise concerns about transparency, with the tagline "Opening the Black Box — Unlocking Enterprise AI Training." The rollout of the first stage of Gradients 5.0 was scheduled to begin on July 21, 2025. [1] [4]

Technology

The technological foundation of Gradients is its decentralized network for AI model training, which evolved significantly with the introduction of Gradients 5.0. This update shifted the platform from a conventional miner-based system to a structured, tournament-based competition designed to foster transparency and identify superior training methods. [1]

Gradients 5.0 Tournament System

The core of Gradients 5.0 is a bi-weekly tournament where AutoML practitioners compete to produce the best training scripts. The winning script from each tournament is released as an open-source asset, which then becomes a validated commercial product that Gradients can offer to its clients. This model is designed to replace the opaque system of individual miners with a transparent framework where the best methodologies are proven through open competition. [1]

Tournament Structure

Each tournament cycle runs for two weeks and is divided into two main categories to address different AI domains:

  • Image Tournament: This competition focuses on diffusion models such as SDXL. Participants are tasked with training models on specific person or style datasets.
  • Text Tournament: This competition is centered on Instruct/Chat models. Tasks are selected from various types, including GRPO, DPO, and Instruct, and are applied to a range of datasets.

The tournament progresses through several stages to identify a definitive winner:

  1. Group Stage: This initial stage is implemented when there are 16 or more participants. Miners are divided into groups of 6-8, and the top performers from each group advance to the next stage.
  2. Knockout Stage: When fewer than 16 miners remain, the competition shifts to a head-to-head elimination format until a single challenger emerges.
  3. Boss Round: The tournament winner does not automatically become the new champion. Instead, they must challenge the defending champion from the previous tournament. To be crowned the new champion, the challenger's script must outperform the defender's by a margin of at least 5%. This rule is in place to ensure that new winning methods represent a significant improvement and are not just the result of random variance. [1]

Miner Participation and Evaluation

The process for participation is designed to be straightforward and secure. Miners do not submit their proprietary code directly to the system. Instead, they register their existing GitHub repositories by providing the repository URL and a specific commit hash. The tournament's automated system then clones the specified repository version and executes the training script on standardized GPU infrastructure. This ensures that the competition is based on the quality of the methodology, not on hardware advantages, as the system dynamically allocates GPU resources based on model complexity.

The evaluation process is managed entirely by network validators. They are responsible for running the scripts, evaluating the performance of the resulting trained models, and scoring the participants. This creates a trustless environment where results are verifiable and the competition is fair. After evaluation, the trained models are uploaded to HuggingFace. [1]
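The registration flow described above — a repository URL plus a pinned commit hash, checked out and run by the tournament system — can be sketched as follows. All names here (`Submission`, `checkout_commands`, the example URL and hash) are hypothetical illustrations, not taken from the Gradients codebase.

```python
# Hypothetical sketch of how a tournament runner might fetch a miner's
# submission: miners register only a repository URL and a commit hash, and the
# system checks out exactly that revision before executing the training script
# on standardized GPU infrastructure.
from dataclasses import dataclass

@dataclass(frozen=True)
class Submission:
    repo_url: str     # public GitHub repository registered by the miner
    commit_hash: str  # pins the exact code version to be evaluated

def checkout_commands(sub: Submission, workdir: str = "workspace") -> list[list[str]]:
    """Git commands that reproduce the registered revision exactly."""
    return [
        ["git", "clone", sub.repo_url, workdir],
        ["git", "-C", workdir, "checkout", sub.commit_hash],
    ]

sub = Submission("https://github.com/example/train-scripts.git", "a1b2c3d")
for cmd in checkout_commands(sub):
    print(" ".join(cmd))
```

Pinning a commit hash, rather than a branch, is what makes the evaluation reproducible: validators and miners can both verify exactly which code was scored.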

Three-Stage Rollout Plan

The transition to the Gradients 5.0 model was planned in three distinct stages to ensure a stable and validated migration.

  • Stage 1: Hybrid Competition: Beginning July 21, 2025, the new tournament system was set to run in parallel with the existing miner system. The weight of the tournament results in the network's emission distribution was designed to increase weekly, allowing the new system's performance to be validated against established benchmarks.
  • Stage 2: Organic Customer Hosting: Once the tournament-winning scripts consistently demonstrated performance that met or exceeded the previous system, Gradients planned to begin using them to serve real enterprise customers. This stage marks a shift in the economic model, focusing on deploying a few proven, winning scripts at scale rather than funding numerous individual operations.
  • Stage 3: Trusted Compute Integration: The final stage involves integrating Gradients with other specialized Bittensor subnets. The plan includes using the Chutes hosting infrastructure in conjunction with a Trusted Execution Service (TES). This integration is intended to provide cryptographic proof of training integrity, offering enterprise-grade security and verifiable certainty about how models are trained. [1]
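The Stage 1 hybrid period can be pictured as a weight schedule: the tournament system's share of emissions rises week by week while the legacy miner system's share shrinks. The announcement only says the weight increases weekly, so the linear ramp and the eight-week horizon below are assumptions for illustration.

```python
# Illustrative sketch of Stage 1's hybrid emission split. The linear ramp and
# the 8-week horizon are assumptions; the source only states that the
# tournament's weight in emission distribution increases weekly.

def tournament_weight(week: int, ramp_weeks: int = 8) -> float:
    """Fraction of emissions allocated to tournament results in a given week."""
    if week < 0:
        raise ValueError("week must be non-negative")
    return min(1.0, week / ramp_weeks)

for week in range(0, 9, 2):
    t = tournament_weight(week)
    print(f"week {week}: tournament={t:.2f}, legacy miners={1 - t:.2f}")
```

A gradual ramp like this lets the new system's results be compared against the established miner benchmarks before it carries the full emission weight.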

Tokenomics

Gradients has a native token with the ticker symbol SN56, corresponding to its designation as Subnet 56 on the Bittensor network. The token is categorized under the AI and Bittensor Ecosystem tags on market data platforms. [2] [3]

Key token metrics include:

  • Name: Gradients
  • Ticker: SN56
  • Network: Bittensor
  • Maximum Supply: 21,000,000 SN56
  • Total Supply: Data aggregators report conflicting figures; as of late 2025, one source reported a total supply of 3,181,838 SN56, while another reported 1,260,000 SN56.
  • Circulating Supply: Reported figures vary similarly, with one aggregator listing a circulating supply of 3,181,838 SN56 and another providing a self-reported figure of 1,260,000 SN56.

The token is primarily traded on decentralized exchanges within the Bittensor ecosystem, such as Subnet Tokens. The most active trading pairs include SN56/SN0 and SN56/TAO. These details reflect the token's integration within its native network. [2] [3]

Use Cases and Commercial Offering

Gradients is positioned as a commercial platform for enterprise AI, with a value proposition centered on transparency, performance, and security. The open-sourcing of tournament-winning training scripts is a key differentiator, allowing clients to inspect and trust the methodologies used. A project representative noted, "When we tell enterprises 'this approach won against 24 competitors, and we’ll run it for you with cryptographic proof of integrity', that’s a fundamentally different value proposition than any competitor can offer." [1]

The platform's commercial offerings are designed to cater to enterprise needs:

  • Transparency as a Service: Clients gain full visibility into the training methodologies, which have been validated through a competitive, public process.
  • Premium Training-as-a-Service: Gradients offers a managed service to execute the proven, open-source training scripts on enterprise-scale infrastructure, removing the implementation burden from the client.
  • API Access: Enterprises can access the winning AutoML approaches through the Gradients API, allowing for easy integration into their existing workflows.
  • Cryptographically Verified Training: The planned integration with a Trusted Execution Service (TES) will provide cryptographic proof of training integrity, a critical feature for security-conscious organizations in regulated industries. [1]

Team

Specific details about the founding team or corporate structure behind Gradients are not widely publicized. However, some entities are publicly associated with the project. The GitHub repository for the project is maintained under the name "rayonlabs." A key public voice for the project is an individual or group operating under the pseudonym "WanderingWeights," who authored the detailed announcements regarding the Gradients 5.0 update and the project's strategic direction. [3] [1]

REFERENCES
