Architecture
The diagram above illustrates the architecture of Entangle's Universal Data Feeds (UDF), a system designed to ensure secure, efficient, and scalable delivery of verified data across blockchains. At its core, UDF integrates data Aggregation, Attestation, and Updating with a convenient Data Delivery method, supporting diverse use cases while ensuring reliability and accuracy.
The process begins with data collection, where raw data is fetched from reliable, high-quality Data Providers at fixed intervals. This data is encapsulated into a structured format containing key attributes such as the feedKey, value, and timestamp. To ensure the integrity and authenticity of the data, it is signed using ECDSA signatures, effectively securing it from tampering. Once packaged, the signed data is sent as an update transaction to the Oracle Chain, where it undergoes aggregation and attestation (validation).
The Oracle Chain serves as the processing hub and consists of two key modules. It offloads data collection and processing to L2 auxiliary blockchains, enabling fast, distributed data collection and parallel processing for real-time data feeds. The Aggregation module consolidates inputs from multiple publishers into a coherent dataset, and the Attestation module then produces on-chain verifiable proofs, certifying the accuracy and reliability of the aggregated data on the target chain.
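Conceptually, the Aggregation module consolidates per-publisher submissions into one dataset per feed before attestation. A simplified sketch follows, with illustrative types that are not the module's real interface:

```typescript
// A signed update as submitted by one publisher (illustrative shape).
interface SignedUpdate {
  feedKey: string;
  value: bigint;
  timestamp: number;
  publisher: string;  // address of the submitting publisher
  signature: string;  // ECDSA signature over the update fields
}

// Consolidate submissions into one dataset per feed, keeping only the
// most recent update from each publisher.
function consolidate(updates: SignedUpdate[]): Map<string, SignedUpdate[]> {
  const byFeed = new Map<string, Map<string, SignedUpdate>>();
  for (const u of updates) {
    const perPublisher = byFeed.get(u.feedKey) ?? new Map<string, SignedUpdate>();
    const prev = perPublisher.get(u.publisher);
    if (!prev || u.timestamp > prev.timestamp) perPublisher.set(u.publisher, u);
    byFeed.set(u.feedKey, perPublisher);
  }
  return new Map([...byFeed].map(([feedKey, perPublisher]) => [feedKey, [...perPublisher.values()]]));
}
```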
Users can then rely on the Update Provider Service, which generates snapshots of the data maintained on the Oracle Chain for consistency and exposes them through a Delivery Method (e.g., an API) for accessibility. It provides a unified reference point, eliminating the need to interact directly with the Oracle Chain.
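Consuming that unified reference point could then be a single request to the Delivery Method. The endpoint URL and response shape below are hypothetical placeholders, not a documented UDF API:

```typescript
// Hypothetical response shape for a latest-snapshot query.
interface SnapshotResponse {
  feedKey: string;
  value: string;        // decimal-encoded aggregated value
  timestamp: number;
  signatures: string[]; // transmitter signatures backing the snapshot
}

// Fetch the latest snapshot for a feed from a placeholder Update Provider endpoint.
async function fetchLatestSnapshot(feedKey: string): Promise<SnapshotResponse> {
  const res = await fetch(
    `https://update-provider.example/v1/feeds/${encodeURIComponent(feedKey)}/latest`
  );
  if (!res.ok) throw new Error(`snapshot request failed: ${res.status}`);
  return (await res.json()) as SnapshotResponse;
}
```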
On the Target Chain, UDF plays a pivotal role in managing delivered data. It enables decentralized applications to access and utilize verified data directly in their logic and facilitates verification of data, ensuring that all signatures are valid, votes are unique, and the data has met consensus thresholds. For numerical feeds, the verification process aggregates the data into a final state, often using statistical methods like medians to derive a reliable result.
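A simplified off-chain model of that verification flow for a numerical feed is sketched below, assuming ethers v6 for signature recovery; the vote encoding, transmitter allowlist, and threshold are assumptions rather than the on-chain VerificationLib itself:

```typescript
import { verifyMessage, solidityPackedKeccak256, getBytes } from "ethers";

// One oracle vote attached to a transaction (illustrative shape).
interface Vote {
  feedKey: string;
  value: bigint;
  timestamp: number;
  signature: string;
}

// Recover each signer, discard unknown transmitters and duplicate votes,
// require a minimum vote count, then take the median value.
function verifyAndAggregate(votes: Vote[], transmitters: Set<string>, threshold: number): bigint {
  const seen = new Set<string>();
  const values: bigint[] = [];
  for (const v of votes) {
    const digest = solidityPackedKeccak256(
      ["string", "uint256", "uint256"],
      [v.feedKey, v.value, v.timestamp]
    );
    const signer = verifyMessage(getBytes(digest), v.signature);
    if (!transmitters.has(signer) || seen.has(signer)) continue; // invalid or duplicate vote
    seen.add(signer);
    values.push(v.value);
  }
  if (values.length < threshold) throw new Error("consensus threshold not met");
  values.sort((a, b) => (a < b ? -1 : a > b ? 1 : 0));
  return values[Math.floor(values.length / 2)]; // median (upper median for even counts)
}
```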
UDF integrates the Entangle Interoperable Blockchain (EIB) with L2 solutions for scalability, performance, and cost optimization. This approach allows relaying aggregated data snapshots to each target chain. The L2 network is parallelized and synced with EIB to handle specific tasks (aggregator rollups) and offload mainchain overhead while enabling horizontal scaling.
The L2 solutions leverage EIB's robust consensus mechanism (Tendermint BFT), which finalizes data more rapidly than many L1 networks, as a trusted layer to which they periodically commit their state. This lets the L2 solutions inherit EIB's security while achieving greater compute bandwidth and lower costs, enabling near real-time data updates.
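Conceptually, each L2 aggregator periodically anchors its state on EIB, something like the loop below; the client interface, commit cadence, and method names are placeholders rather than the actual L2/EIB protocol:

```typescript
// Placeholder client interface; the real commit path to EIB is chain-specific.
interface EibClient {
  commitStateRoot(rollupId: string, stateRoot: string, height: number): Promise<void>;
}

// Periodically anchor the L2 aggregator's state root on EIB so the rollup
// inherits EIB finality while doing the heavy feed processing off the main chain.
function startCommitLoop(
  client: EibClient,
  rollupId: string,
  getState: () => { root: string; height: number },
  intervalMs = 5_000 // assumed cadence
): ReturnType<typeof setInterval> {
  return setInterval(() => {
    const { root, height } = getState();
    client.commitStateRoot(rollupId, root, height).catch(console.error);
  }, intervalMs);
}
```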
Additionally, unlike traditional L2 processes, UIP's internal utility networks bypass full block finalization for messages that do not involve value transfers, relying solely on source network finalization to speed up delivery.
In short, UDF leverages the EIB, a specialized blockchain, for orchestrating:
Cross-Chain Consensus
Validator Set & Finality
Performance & Lower Costs
A decentralized network of lightweight transmitter nodes retrieves raw data from various sources and cryptographically signs each update before submission to the oracle chain. These signatures collectively form a Byzantine-tolerant proof, ensuring the integrity and validity of the data. Signatures are merged into a final proof, which is used for validation on external blockchains and off-chain applications.
Transmitters must stake the native EIB tokens to participate in the network, aligning incentives with honest operation. Staked assets act as a security deposit: misbehavior, downtime, or incorrect data submissions result in slashing penalties. This mechanism prevents Sybil attacks while ensuring that only committed, well-incentivized nodes contribute to UDF's data integrity.
Transmitters generate signatures on data updates, contributing to a trust-minimized consensus, and are monitored and rotated, with penalties for downtime or producing inconsistent signatures.
Once finalized, the data snapshot is posted on the EIB/L2 oracle chain, with its accuracy backed by multiple layers of signed votes. The Finalized Data Snap service continuously synchronizes with the oracle chain, allowing users and push nodes to retrieve validated, real-time oracle data efficiently.
The final snapshot is a cryptographically verifiable object (sketched in code after this list) that includes:
Aggregated Value (e.g., BTC/USD = 100,000)
Timestamp
Transmitter Signatures
Optional Extra Metadata (standard deviation, volume data, etc.)
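Expressed as a type, a snapshot with those fields might look like the following; the names are illustrative only:

```typescript
interface FinalizedSnapshot {
  feedKey: string;        // e.g. "BTC/USD"
  value: bigint;          // aggregated value, e.g. 100,000 in scaled fixed-point form
  timestamp: number;      // time the snapshot was finalized
  signatures: string[];   // transmitter signatures forming the Byzantine-tolerant proof
  metadata?: {            // optional extra metadata
    stdDev?: number;
    volume?: bigint;
  };
}
```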
UDF supports two primary delivery models, Push and Pull. In the Push Model, updates are actively sent by the Push Publisher (i.e., Push Node) to the target chain, while in the Pull Model, users initiate requests for the latest data snapshots and append verification details to their transactions.
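A Pull Model consumer might therefore combine the snapshot fetch from the earlier sketch with its own transaction, passing the votes along for on-chain verification. The consumer contract, its ABI, and the RPC endpoint below are hypothetical:

```typescript
import { Contract, JsonRpcProvider, Wallet } from "ethers";

// Hypothetical consumer contract that verifies the attached votes before using the value.
const consumerAbi = [
  "function consumePrice(string feedKey, uint256 value, uint256 timestamp, bytes[] signatures)"
];

async function submitWithProof(feedKey: string): Promise<void> {
  const snapshot = await fetchLatestSnapshot(feedKey); // from the Update Provider sketch above
  const provider = new JsonRpcProvider("https://rpc.target-chain.example");
  const signer = new Wallet(process.env.PRIVATE_KEY!, provider);
  const consumer = new Contract(
    "0x0000000000000000000000000000000000000000", // replace with the deployed consumer address
    consumerAbi,
    signer
  );
  const tx = await consumer.consumePrice(
    snapshot.feedKey,
    BigInt(snapshot.value),
    snapshot.timestamp,
    snapshot.signatures
  );
  await tx.wait();
}
```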
UDF offloads computationally intensive tasks such as aggregation and validation to the Oracle Chain, allowing the system to minimize on-chain costs while ensuring rapid updates. Additionally, a robust verification process guarantees that the data remains tamper-proof throughout its lifecycle, allowing cross-chain deployments without compromising data integrity.
UDF is an essential infrastructure pillar for modern blockchain ecosystems. It excels in delivering accurate, timely, and secure data to decentralized applications, positioning it as a leading solution for cross-chain data delivery.
UDF Oracle
Core system for data aggregation, signature verification, and on-chain publication.
Pull Model
Data delivery method in which each user transaction carries the oracle votes needed for verification.
Push Model
Data publication method where nodes regularly push updated values on-chain.
Transmitters
Decentralized agents that sign and attest to data, forming a security layer for aggregated values.
Entangle Interoperable Blockchain (EIB)
A specialized oracle chain that coordinates data availability, finalization, and bridging logic.
Layer 2 (L2)
Layer 2 rollups that further boost throughput and reduce the latency of the EIB.
Finalized Snapshot
A canonical data set aggregated from various sources & attested by transmitters.
VerificationLib
A smart contract library that validates oracle votes for secure data feeds on target chains.
Byzantine Fault Tolerance (BFT)
An algorithm for fault tolerance used by transmitters to sign data updates.