Architecture
The diagram above illustrates the architecture of Entangle's Universal Data Feeds (UDF), a system designed to deliver verified data securely, efficiently, and at scale across blockchains. At its core, the system integrates data aggregation, attestation, and updating with a convenient data delivery method, supporting diverse use cases while ensuring reliability and accuracy.
The process begins with data collection, where raw data is fetched from reliable, high-quality Data Providers at fixed intervals. This data is encapsulated into a structured format containing key attributes such as the feedKey, value, and timestamp. To ensure its integrity and authenticity, the data is signed with ECDSA signatures, securing it from tampering. Once packaged, the signed data is sent as an update transaction to the Oracle Chain, where it undergoes aggregation and attestation (validation).
The Oracle Chain serves as the processing hub and consists of two key modules. To keep data feeds real-time, it offloads data collection and processing to auxiliary L2 blockchains, enabling fast, distributed collection and parallel processing. The Aggregation module consolidates inputs from multiple publishers into a coherent dataset, and the Attestation module then creates on-chain verifiable proofs certifying the accuracy and reliability of the aggregated data on the target chain.
Users can then rely on the Update Provider Service, which generates snapshots of the data maintained on the Oracle Chain for consistency and exposes them through a Delivery Method (e.g., an API) for accessibility. It provides a unified reference point, eliminating the need to interact directly with the Oracle Chain.
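A minimal sketch of such an Update Provider is shown below. The class name, the callable-based Oracle Chain interface, and the in-memory snapshot are all assumptions standing in for the real service and its API.

```python
import time

class UpdateProvider:
    """Hypothetical Update Provider: periodically snapshots feed values
    so consumers read from one reference point instead of querying the
    Oracle Chain directly."""

    def __init__(self, fetch_from_oracle):
        # fetch_from_oracle: callable returning {feedKey: (value, timestamp)};
        # a stand-in for reading aggregated state from the Oracle Chain.
        self._fetch = fetch_from_oracle
        self._snapshot = {}
        self._snapshot_time = None

    def refresh(self):
        """Generate a fresh snapshot from the current Oracle Chain state."""
        self._snapshot = dict(self._fetch())
        self._snapshot_time = time.time()

    def get(self, feed_key):
        """Serve a feed from the snapshot (stand-in for an API endpoint)."""
        return self._snapshot[feed_key]

# Usage with a stubbed Oracle Chain source:
provider = UpdateProvider(lambda: {"ETH/USD": (3150.42, 1700000000)})
provider.refresh()
value, ts = provider.get("ETH/USD")
```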
On the Target Chain, UDF plays a pivotal role in managing delivered data. It enables decentralized applications to access and utilize verified data directly in their logic and facilitates verification of data, ensuring that all signatures are valid, votes are unique, and the data has met consensus thresholds. For numerical feeds, the verification process aggregates the data into a final state, often using statistical methods like medians to derive a reliable result.
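The verification step above can be sketched as follows: reject votes from unrecognized publishers, reject duplicate votes, require a consensus threshold, and reduce numerical votes to a final value via the median. Function and parameter names are assumptions; signature checking is presumed to have happened upstream.

```python
import statistics

def finalize_feed(votes, trusted_publishers, threshold):
    """Illustrative target-chain verification for a numerical feed.

    votes: list of (publisher_id, reported_value) pairs, one per
    signature that already passed ECDSA verification.
    trusted_publishers: set of recognized publisher ids.
    threshold: minimum number of unique votes required for consensus."""
    seen = set()
    values = []
    for publisher, value in votes:
        if publisher not in trusted_publishers:
            raise ValueError(f"unknown publisher {publisher}")
        if publisher in seen:
            raise ValueError(f"duplicate vote from {publisher}")
        seen.add(publisher)
        values.append(value)
    if len(values) < threshold:
        raise ValueError("consensus threshold not met")
    # Median is robust to a minority of outlier or malicious values.
    return statistics.median(values)

result = finalize_feed(
    [("pub-a", 100.0), ("pub-b", 101.0), ("pub-c", 99.5)],
    trusted_publishers={"pub-a", "pub-b", "pub-c"},
    threshold=2,
)
```

The median (rather than the mean) is the natural choice here because a single publisher reporting an extreme value shifts the mean arbitrarily but leaves the median almost unchanged.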
UDF integrates L2 solutions for data collection and scalability. Messages from source chains are distributed across multiple L2 networks through agent groups, enabling parallel processing and faster execution. Unlike traditional L2 processes, UIP's internal utility networks bypass full block finalization for messages that do not involve value transfers, relying solely on source network finalization to speed up delivery.
UDF implements a range of features, outlined below, which enable it to excel in delivering accurate, timely, and secure data to decentralized applications.
Flexibility - The system supports two primary delivery models, Push and Pull. In the Push Model, updates are actively sent by the Push Publisher (i.e., Push Node) to the target chain, while in the Pull Model, users initiate requests for the latest data snapshots and append verification details to their transactions.
Update Provider - A service that generates snapshots from periodically collected data, providing a unified reference point that eliminates the need to interact directly with the Oracle Chain and makes it convenient to retrieve reliable, up-to-date data.
Scalability - UDF offloads computationally intensive tasks such as aggregation and validation to the Oracle Chain, allowing the system to minimize on-chain costs while ensuring rapid updates.
Security - A robust verification process based on ECDSA signatures guarantees that data remains tamper-proof throughout its lifecycle, enabling cross-chain deployments without compromising data integrity.
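The Push and Pull delivery models above can be contrasted in a minimal sketch. The in-memory store, snapshot format, and function names are hypothetical stand-ins for on-chain storage and the Update Provider API.

```python
# Stand-in for feed storage on the target chain.
LATEST = {}

def push_update(feed_key, value, proof):
    """Push Model: the Push Publisher (Push Node) proactively writes
    verified updates to the target chain."""
    LATEST[feed_key] = (value, proof)

def pull_update(feed_key, snapshot):
    """Pull Model: the user fetches the latest snapshot from the Update
    Provider and appends the value plus its verification details to
    their own transaction."""
    value, proof = snapshot[feed_key]
    return value, proof

# Push: publisher-initiated write.
push_update("ETH/USD", 3150.42, "0xproof")

# Pull: user-initiated read from a (stubbed) Update Provider snapshot.
value, proof = pull_update("ETH/USD", {"ETH/USD": (3151.0, "0xproof2")})
```

The trade-off sketched here: Push keeps on-chain data fresh at the publisher's expense, while Pull shifts the update cost to the user who actually needs the value in their transaction.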
These features position UDF as a leading solution for cross-chain data delivery and make it an essential pillar in infrastructure for modern blockchain ecosystems.