
Quality Control

Verification flow

Quality control is performed at two levels:

  • Task level: ensuring each individual task has been done properly

  • Job level: ensuring all tasks have been completed consistently with the project requirements
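The two-level check described above can be sketched as follows. This is a minimal illustration, not the platform's actual API: `Task`, `job_is_valid`, and `required_pass_rate` are hypothetical names, and the real job-level rules are configurable per project.

```python
from dataclasses import dataclass

@dataclass
class Task:
    task_id: str
    passed_check: bool  # outcome of the task-level review


def job_is_valid(tasks: list[Task], required_pass_rate: float = 1.0) -> bool:
    """Job-level check: the job passes only if enough of its tasks
    passed their task-level checks (by default, all of them)."""
    if not tasks:
        return False
    passed = sum(1 for t in tasks if t.passed_check)
    return passed / len(tasks) >= required_pass_rate
```

With the default `required_pass_rate=1.0`, a single failed task invalidates the whole job; a client could relax this threshold for fault-tolerant datasets.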

Our quality checks rely on three levels of validation, each fully customizable to clients' needs:

  • Consensus-based verification: verification is done by "community verifiers", and validation is based on consensus; incorrect productions and colluding verifiers are "slashed" to incentivize good behaviour

  • AI-powered review: specifically trained LLMs help screen compliant tasks to increase verification throughput

  • Curator validation: verification is done by specifically trained curators

Depending on clients' needs and project complexity, any of these three verification mechanisms can be combined to provide stronger quality checks.
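Consensus-based verification with slashing could be sketched like this. It is an illustrative model only, assuming a simple vote-share threshold; the names `consensus_verdict` and the 2/3 threshold are assumptions, not the platform's documented mechanism.

```python
def consensus_verdict(votes: dict[str, bool], threshold: float = 2 / 3):
    """Tally community-verifier votes on one task.

    Returns (accepted, dissenters): the task is accepted if the share
    of "valid" votes meets the threshold, and dissenters (verifiers who
    voted against the winning side) are candidates for slashing.
    """
    yes = [v for v, ok in votes.items() if ok]
    no = [v for v, ok in votes.items() if not ok]
    accepted = len(yes) / len(votes) >= threshold
    dissenters = no if accepted else yes
    return accepted, dissenters
```

Slashing the minority side makes lazy or colluding voting costly in expectation, which is what aligns verifier incentives with honest review.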

Job validation triggers the distribution of rewards, unless specified otherwise in the instructions.
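The validation-then-reward flow could be sketched as below, assuming an equal split of a per-job reward pool; the function name and the split rule are hypothetical, since the actual reward schedule is defined per project.

```python
def distribute_rewards(job_valid: bool, workers: list[str], pool: float) -> dict[str, float]:
    """Pay out the job's reward pool only once the job is validated.

    Assumes an equal split among contributors; real projects may weight
    rewards by task count or quality.
    """
    if not job_valid or not workers:
        return {}
    share = pool / len(workers)
    return {w: share for w in workers}
```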

QC core principles

Ta-da's platform is built to enable quality control at scale, guaranteeing better data quality. Traditional approaches have well-known limits:

  • Redundancy models (e.g., consensus, majority voting) can be expensive and slow.

  • Automated quality checks often struggle with edge cases or subtle errors.

To address these limits, the platform provides:

  • Data injection and real-time feedback loops.

  • Integration with ML pipelines or client APIs to close the loop between data needs and data supply.

  • Analytics dashboards to help data consumers measure dataset evolution, annotation variance, and coverage gaps.
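One common form of data injection is mixing known-answer ("gold") tasks into the task stream so worker accuracy can be estimated continuously. This sketch assumes that interpretation; `inject_gold_tasks` and its parameters are illustrative names, not the platform's API.

```python
import random


def inject_gold_tasks(real_tasks: list[str], gold_tasks: list[str],
                      gold_rate: float = 0.1, seed: int = 0) -> list[str]:
    """Mix known-answer gold tasks into the task stream.

    A worker's answers on gold tasks can be scored immediately,
    giving a real-time accuracy signal without extra verification.
    """
    rng = random.Random(seed)
    n_gold = max(1, int(len(real_tasks) * gold_rate))
    mixed = real_tasks + rng.sample(gold_tasks, n_gold)
    rng.shuffle(mixed)  # workers cannot tell gold tasks from real ones
    return mixed
```

Because gold tasks are indistinguishable from real ones, accuracy on them is an unbiased estimate of accuracy overall, which can feed the real-time feedback loops mentioned above.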
