Data Streaming Consensus

Unlike static data, streaming data presents unique challenges due to its inherent variability and the rapid pace at which it changes.

Adapting to Streaming Data Challenges:

  • Custom Data Finalization Libraries: External developers can create their own libraries with algorithms designed specifically for the complexities of streaming data, helping preserve data integrity and reliability despite fluctuations and timing differences among data transmitters (a sketch of such a library follows this list).

  • Tailored Consensus Mechanisms: These libraries let developers tailor the consensus mechanism to the specific requirements of each type of data transmitted, making the system more adaptable.

  • Voting Mechanisms: The libraries include voting mechanisms that enable external developers to reward transmitters for providing accurate and timely data, thus incentivizing high-quality data transmission within the network.
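
The shape of such a library might look like the Python sketch below. All names here (FinalizationLibrary, register_rule, finalize, Vote) are hypothetical illustrations rather than an actual protocol API; the point is only that a developer can plug in a different consensus rule for each stream type.

```python
# Hypothetical sketch of a custom data finalization library.
# Class, method, and field names are illustrative, not a published API.
from dataclasses import dataclass
from statistics import median, mode
from typing import Callable, Dict, List


@dataclass
class Vote:
    transmitter: str   # transmitter address or identifier
    value: float       # value reported for this round
    timestamp: float   # when the value was observed


class FinalizationLibrary:
    """Lets a developer attach a consensus rule per data-stream type."""

    def __init__(self) -> None:
        self._rules: Dict[str, Callable[[List[Vote]], float]] = {}

    def register_rule(self, data_type: str,
                      rule: Callable[[List[Vote]], float]) -> None:
        # Tailor the consensus mechanism to each kind of streamed data.
        self._rules[data_type] = rule

    def finalize(self, data_type: str, votes: List[Vote]) -> float:
        # Apply the registered rule; fall back to a plain median otherwise.
        rule = self._rules.get(data_type,
                               lambda vs: median(v.value for v in vs))
        return rule(votes)


# Example: prices use a median, while a discrete status feed uses the
# most frequently reported value.
lib = FinalizationLibrary()
lib.register_rule("price", lambda vs: median(v.value for v in vs))
lib.register_rule("status", lambda vs: mode(v.value for v in vs))
```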

Data Finalization Process:

  • Algorithmic Consensus: The finalization library contains algorithms that derive the result closest to reality from transmitter votes. It then classifies transmitters into those that earn rewards and those that fall short of the required accuracy (see the first sketch after this list).

  • Transmitters Bets Mechanism: After data finalization, this mechanism processes bets and pays out rewards. Transmitters that consistently fail to earn rewards can be slashed, which enforces active participation and integrity within the protocol (see the second sketch after this list).

  • MasterStreamDataSpotter Contract: Finalized data is recorded in this contract, serving as a central repository for all verified data.
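
One simple way to picture the consensus and classification step, shown purely as an illustrative sketch (the 0.5% tolerance, function name, and return shape are assumptions, not the protocol's actual algorithm), is to take the median of the reported values as the finalized result and reward only the transmitters whose votes land within a tolerance of it:

```python
# Illustrative finalization rule: the median of reported values stands in
# for the result "closest to reality", and transmitters are classified by
# whether their vote falls within a relative accuracy band.
from statistics import median
from typing import Dict, List, Tuple


def finalize_and_classify(
    votes: Dict[str, float],      # transmitter -> reported value
    tolerance: float = 0.005,     # assumed 0.5% accuracy band
) -> Tuple[float, List[str], List[str]]:
    finalized = median(votes.values())
    rewarded, penalized = [], []
    for transmitter, value in votes.items():
        deviation = (abs(value - finalized) / abs(finalized)
                     if finalized else abs(value))
        # Transmitters inside the accuracy band earn rewards; the rest do not.
        (rewarded if deviation <= tolerance else penalized).append(transmitter)
    return finalized, rewarded, penalized


value, ok, bad = finalize_and_classify({"t1": 100.2, "t2": 100.1, "t3": 97.0})
# value == 100.1; t1 and t2 are rewarded, t3 misses the accuracy threshold.
```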

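The bets step can then be pictured as a small ledger that resets a transmitter's miss counter whenever it is rewarded and slashes it after a run of consecutive misses. The BetsLedger class and the three-round threshold below are assumptions for illustration, not the protocol's implementation:

```python
# Illustrative bookkeeping for the bets/slashing step: a transmitter that
# misses the reward list in several consecutive rounds is slashed.
from typing import Dict, List


class BetsLedger:
    def __init__(self, slash_after: int = 3) -> None:
        self.slash_after = slash_after        # assumed miss threshold
        self.misses: Dict[str, int] = {}

    def settle_round(self, rewarded: List[str],
                     penalized: List[str]) -> List[str]:
        """Track consecutive misses and return the transmitters to slash."""
        slashed = []
        for t in rewarded:
            self.misses[t] = 0                # a reward resets the miss counter
        for t in penalized:
            self.misses[t] = self.misses.get(t, 0) + 1
            if self.misses[t] >= self.slash_after:
                slashed.append(t)             # persistent inaccuracy triggers slashing
        return slashed
```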