Overview

Thets is a revolutionary Sovereign Data Rollup that streamlines data sourcing and transformation. Through a globally distributed network of Thets nodes, it gives AI systems universal access to structured web data.

Validator

Validators play a crucial role in the Thets network by receiving, verifying, and batching web transactions from Routers. They generate Zero-Knowledge (ZK) proofs to record session data on-chain, ensuring data integrity. These on-chain proofs can be referenced in datasets to verify the origin of the data and track its lineage throughout its lifecycle. Initially, the validator system will operate with a single centralized validator but will gradually evolve into a decentralized committee of multiple validators.
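The batching step above can be sketched as follows. This is a minimal illustration, not the actual validator implementation: the session fields, the `batch_sessions` helper, and the use of a plain SHA-256 digest in place of a real ZK proof are all assumptions made for the example.

```python
import hashlib
import json

def session_commitment(session: dict) -> str:
    """Deterministic commitment for one web session record.

    In the real system a ZK proof would attest to this data; here a
    SHA-256 digest over a canonical JSON encoding stands in for it.
    """
    canonical = json.dumps(session, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

def batch_sessions(sessions: list) -> dict:
    """Batch verified session records into one entry for on-chain recording."""
    commitments = [session_commitment(s) for s in sessions]
    root = hashlib.sha256("".join(commitments).encode()).hexdigest()
    return {"count": len(sessions), "commitments": commitments, "root": root}

# Hypothetical session records relayed by Routers
batch = batch_sessions([
    {"url": "https://example.com", "router": "r1", "bytes": 2048},
    {"url": "https://example.org", "router": "r2", "bytes": 512},
])
```

Because the commitment is deterministic, anyone holding the original session data can recompute it and check it against the on-chain record, which is what makes later lineage verification possible.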

Router

Thets Routers serve as the bridge between Thets Nodes and the Validators. They ensure the accountability of the node network and manage the relay of bandwidth. Routers are incentivized to operate efficiently and earn rewards based on the total amount of validated bandwidth they relay through the network.
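A pro-rata reward split like the one described could look like this. The function name, the per-epoch pool, and the byte-count inputs are illustrative assumptions; the actual reward formula is not specified here.

```python
def router_rewards(validated_bytes: dict, epoch_pool: float) -> dict:
    """Split an epoch's reward pool among routers, pro rata by the
    validated bandwidth (in bytes) each one relayed."""
    total = sum(validated_bytes.values())
    if total == 0:
        return {router: 0.0 for router in validated_bytes}
    return {router: epoch_pool * b / total
            for router, b in validated_bytes.items()}

# Hypothetical epoch: r1 relayed 75% of the validated bandwidth
rewards = router_rewards({"r1": 750_000, "r2": 250_000}, epoch_pool=100.0)
# → {"r1": 75.0, "r2": 25.0}
```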

Thets Node

Thets Nodes use the operator's unused bandwidth to relay traffic, allowing the network to gather public web data without accessing personal data. Running a node is straightforward and free, and operators are rewarded in proportion to the data traffic relayed through their nodes.

Proof Processor

The Proof Processor compiles validity proofs for session data from all web requests and submits these proofs to a layer 1 blockchain. This process ensures a permanent record of every data scraping activity conducted on the network. Additionally, it provides complete transparency into the origin and history of the AI training data, laying a solid foundation for data integrity and trustworthiness.
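One common way to compile many session hashes into a single compact on-chain record is a Merkle root, sketched below. This is an assumed construction for illustration; the document does not specify the Proof Processor's actual proof format, and `l1_submission` is a hypothetical helper showing what an L1 payload might carry.

```python
import hashlib

def merkle_root(leaves: list) -> bytes:
    """Fold a list of byte-string leaves into a single Merkle root."""
    if not leaves:
        return hashlib.sha256(b"").digest()
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

def l1_submission(session_data: list, block_height: int) -> dict:
    """Package a batch's Merkle root as a hypothetical L1 transaction payload."""
    return {"root": merkle_root(session_data).hex(), "height": block_height}
```

A Merkle root keeps the on-chain footprint constant regardless of batch size, while still letting any single session prove its membership later via a short inclusion path, which supports the lineage tracking described above.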

Thets Data Vault

The Thets Data Vault acts as the bridge between the collected data and the Layer 1 Settlement Layer. This immutable data structure securely stores complete datasets and links them to their corresponding on-chain proofs. It ensures data provenance, maintaining the integrity and traceability of data throughout its lifecycle.
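The linking of a dataset to its on-chain proof can be modeled as a content-addressed, append-only map, as in this minimal sketch. The `DataVault` class and its methods are hypothetical names invented for the example, under the assumption that datasets are keyed by their content hash.

```python
import hashlib

class DataVault:
    """Append-only store mapping dataset content hashes to on-chain proof
    references, sketching the provenance link described above."""

    def __init__(self):
        self._entries = {}

    def store(self, dataset: bytes, proof_ref: str) -> str:
        key = hashlib.sha256(dataset).hexdigest()   # content address
        self._entries.setdefault(key, proof_ref)    # immutable: first write wins
        return key

    def provenance(self, dataset: bytes):
        """Return the proof reference for exactly these bytes, else None."""
        return self._entries.get(hashlib.sha256(dataset).hexdigest())
```

Keying by content hash means any modification to the dataset changes its address, so a tampered copy simply fails to resolve to a proof, which is the traceability property the vault is meant to provide.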

Data Structuring Models

Data structuring models transform unstructured web data into organized formats. This process involves various pre-processing steps to clean, normalize, and structure the raw data, ensuring it meets the standards required for AI models. This conversion is essential for turning raw web data into valuable, structured datasets for AI applications.
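The cleaning and normalization steps might look like the toy pipeline below. It is only a sketch of the pre-processing idea (tag stripping, entity decoding, whitespace normalization); the actual models and output schema are not described in this document, and `structure_record` is an invented name.

```python
import html
import re

def structure_record(raw: str) -> dict:
    """Clean one raw scraped snippet and emit a structured record."""
    text = html.unescape(raw)                  # decode HTML entities
    text = re.sub(r"<[^>]+>", " ", text)       # strip HTML tags
    text = re.sub(r"\s+", " ", text).strip()   # normalize whitespace
    return {"text": text, "word_count": len(text.split())}

# Hypothetical raw input straight from a web scrape
record = structure_record("<p>Hello,  <b>structured</b> data</p>")
```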
