UDF Integration Guides for Custom Feeds

This section outlines the process of integrating a new data feed into the Entangle Unified Data Feed system. The default integration flow ensures smooth onboarding of new data sources and utilizes UDF’s capabilities for fetching, parsing, and distributing data on-chain.

Key Components:

  1. REST URL Template: Defines how data is fetched from external sources.

  2. Parsing Pipelines: Parse relevant information from the fetched data, such as price, timestamp, and volume.

Steps for Integration

1. Reach out to Entangle

Before integration begins, the protocol should provide Entangle with:

  • Initial Data with Examples: Real sample data that the integration will process.

  • Data Processing Requirements: Outline any data transformation or formatting needed for the final output to be distributed on-chain.

  • Update Frequency Suggestions: Suggest how frequently updates should occur, allowing customization to meet the protocol’s needs.

Entangle UDF offers a generic REST interface solution for asset feeds. The required components include:

  • REST URL Template: A template for fetching data using REST, leveraging variables such as DataKey and SourceID.

  • Parsing Pipelines: For extracting key data (e.g., price, timestamp) from the response JSON using GJSON syntax.

2. Define REST URL Template

Create a URL template that points to the data provider's endpoint. The template uses Go's text/template syntax, with placeholders such as DataKey and SourceID. Example for the Binance API:

https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}

This URL template is inserted into the configuration file, allowing Entangle's backend to fetch the necessary data.
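
To illustrate how the placeholders expand, here is a minimal Go sketch using the standard text/template package. The templateVars struct and the BTCUSDT example key are illustrative; Entangle's backend supplies its own template context.

package main

import (
	"os"
	"text/template"
)

// templateVars mirrors the variables available in the URL template.
// This struct is illustrative, not Entangle's actual type.
type templateVars struct {
	DataKey  string
	SourceID string
}

func main() {
	tmpl := template.Must(template.New("url").Parse(
		"https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}"))

	// For the BTC/USDT pair on Binance, DataKey would be "BTCUSDT".
	if err := tmpl.Execute(os.Stdout, templateVars{DataKey: "BTCUSDT"}); err != nil {
		panic(err)
	}
	// Output: https://api.binance.com/api/v3/avgPrice?symbol=BTCUSDT
}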

3. Set Up Parsing Pipelines

In the configuration file, set up pipelines to parse data from the response JSON. Below is an example JSON response structure:

{
  "asset": {
    "price": 1000000000000000000,
    "timestamp": 1721636387,
    "volume": 10000000000
  }
}

  • Pipeline for Price: asset.price

  • Pipeline for Timestamp: asset.timestamp

  • Pipeline for Volume: asset.volume

These pipelines should be defined using GJSON syntax to extract the required fields.
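
The paths can be tested locally with the gjson Go library, which implements this syntax. A minimal sketch against the sample response above:

package main

import (
	"fmt"

	"github.com/tidwall/gjson"
)

func main() {
	// Sample response from the data provider, as shown above.
	body := `{"asset":{"price":1000000000000000000,"timestamp":1721636387,"volume":10000000000}}`

	// Each pipeline is a GJSON path evaluated against the response body.
	price := gjson.Get(body, "asset.price")
	timestamp := gjson.Get(body, "asset.timestamp")
	volume := gjson.Get(body, "asset.volume")

	fmt.Println(price.Uint(), timestamp.Int(), volume.Uint())
	// Output: 1000000000000000000 1721636387 10000000000
}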

4. Configure the System

The next step is to implement both the URL template and parsing pipelines in the YAML configuration file. This connects the data feed with Entangle UDF.

Example configuration snippet:

urlTemplate: "https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}"
pipelines:
  - value: "asset.price"
  - timestamp: "asset.timestamp"
  - volume: "asset.volume"
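
Before deploying, you can sanity-check that the snippet parses as expected. Below is a minimal Go sketch, assuming the snippet is saved as feed.yaml (the feedConfig struct and filename are illustrative; Entangle's real configuration schema may differ):

package main

import (
	"fmt"
	"os"

	"gopkg.in/yaml.v3"
)

// feedConfig mirrors the shape of the snippet above; illustrative only.
type feedConfig struct {
	URLTemplate string              `yaml:"urlTemplate"`
	Pipelines   []map[string]string `yaml:"pipelines"`
}

func main() {
	raw, err := os.ReadFile("feed.yaml") // hypothetical path to the snippet above
	if err != nil {
		panic(err)
	}
	var cfg feedConfig
	if err := yaml.Unmarshal(raw, &cfg); err != nil {
		panic(err)
	}
	fmt.Printf("template: %s (%d pipelines)\n", cfg.URLTemplate, len(cfg.Pipelines))
}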

5. Fetch Data

To retrieve data updates, use the Finalized Data API. For example, to fetch the NGL/USD price feed, execute the following curl command:

curl 'https://pricefeed.entangle.fi/spotters/prices-feed1?assets=NGL/USD'

Sample response:

{
  "calldata": {
    "merkleRoot": "0xa4d6d7594888f1fe8675d033737baf6106819f7b0e59f3d794f5e1adcec70022",
    "signatures": [
      { "R": "...", "S": "...", "V": 28 },
      { "R": "...", "S": "...", "V": 28 },
      { "R": "...", "S": "...", "V": 28 }
    ],
    "feeds": [
      {
        "key": "NGL/USD",
        "value": { "data": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAcHqjZobCo0=", "timestamp": 1722866387 },
        "merkleProofs": ["wg/sd9GqECY5zFKdsHRThUdPda0Parv24Npcj15EqJ4=", "CK5qUUFgAaZBYE/Aq3x+Y61HHAQnW7q6A2K1obw55ZE="]
      }
    ]
  },
  "error": ""
}

This response contains the Merkle root, signatures, and data required to verify the update.
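
The same request can be made programmatically. A minimal Go sketch that decodes only the fields shown above:

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// Structs mirroring the response above; only the fields used here are declared.
type feedValue struct {
	Data      string `json:"data"`
	Timestamp int64  `json:"timestamp"`
}

type feedEntry struct {
	Key   string    `json:"key"`
	Value feedValue `json:"value"`
}

type feedResponse struct {
	Calldata struct {
		MerkleRoot string      `json:"merkleRoot"`
		Feeds      []feedEntry `json:"feeds"`
	} `json:"calldata"`
	Error string `json:"error"`
}

func main() {
	resp, err := http.Get("https://pricefeed.entangle.fi/spotters/prices-feed1?assets=NGL/USD")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var r feedResponse
	if err := json.NewDecoder(resp.Body).Decode(&r); err != nil {
		panic(err)
	}
	if r.Error != "" {
		panic(r.Error)
	}
	for _, f := range r.Calldata.Feeds {
		fmt.Printf("%s: data=%s timestamp=%d\n", f.Key, f.Value.Data, f.Value.Timestamp)
	}
}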

6. Verify Data

Once you have fetched the data, use the PullOracle smart contract to verify the updates on-chain.
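
Before submitting on-chain, you can inspect the payload locally. The Go sketch below decodes the base64 data field, assuming it encodes the price as a big-endian unsigned integer with 18 decimals; this encoding is an assumption, and the authoritative encoding and verification method are defined by the PullOracle contract.

package main

import (
	"encoding/base64"
	"fmt"
	"math/big"
)

func main() {
	// "data" field from the NGL/USD feed entry above.
	raw, err := base64.StdEncoding.DecodeString(
		"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAcHqjZobCo0=")
	if err != nil {
		panic(err)
	}

	// Assumption: big-endian unsigned integer, 18-decimal fixed point.
	price := new(big.Int).SetBytes(raw)
	fmt.Println("raw price (18 decimals):", price)

	// The on-chain step submits the full calldata (Merkle root, signatures,
	// proofs) to the PullOracle contract; consult Entangle's contract
	// documentation for the exact method and ABI.
}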

7. Monitor and Update

Ongoing monitoring ensures that the integration functions correctly. Periodically review the configuration and update it as needed to accommodate changes in the data provider's API or Entangle’s system.

Use Cases

  1. Integrating a New Crypto Asset to UDF: Easily onboard new cryptocurrency price feeds for decentralized applications.

  2. RWA Integration: Onboard real-world assets into the decentralized ecosystem.

  3. Proof of Reserves Integration: Verify asset reserves with external data feeds.

By following this integration process, you can smoothly onboard new data feeds into the Entangle UDF and leverage its decentralized infrastructure for distributing reliable and accurate data.

FAQ

Q1: How often should partners update data on their server?

A1: The frequency of data updates should be tailored to the specific needs and characteristics of each asset. Factors to consider include data volatility, the criticality of having the most up-to-date information, and the nature of the application using the data. Partners should aim for a balance between providing timely updates and minimizing unnecessary load on their servers.

Q2: What should I do if the data provided by an external API does not meet the required format?

A2: If the data from an external API does not meet the required format (e.g., price as an 18-decimal integer, timestamp in UNIX seconds), you may need to preprocess the data before feeding it into the Entangle UDF system. This could involve converting data types, rounding floating-point numbers, or formatting timestamps.
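
As an example of such preprocessing, the Go sketch below converts a decimal price string into the 18-decimal integer format and takes the current UNIX timestamp (toWad is an illustrative helper, not part of UDF):

package main

import (
	"fmt"
	"math/big"
	"time"
)

// toWad converts a decimal price string into an 18-decimal integer,
// using big.Rat to avoid floating-point rounding error.
func toWad(price string) (*big.Int, error) {
	r, ok := new(big.Rat).SetString(price)
	if !ok {
		return nil, fmt.Errorf("invalid price %q", price)
	}
	wad := new(big.Rat).Mul(r, new(big.Rat).SetInt64(1e18))
	// Truncate any remainder below 18 decimals.
	return new(big.Int).Quo(wad.Num(), wad.Denom()), nil
}

func main() {
	p, err := toWad("1234.5678")
	if err != nil {
		panic(err)
	}
	fmt.Println(p)                 // 1234567800000000000000
	fmt.Println(time.Now().Unix()) // timestamp in UNIX seconds
}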

Q3: Can Entangle UDF support custom modifiers for data parsing?

A3: Currently, the data parsing pipelines use GJSON syntax, which supports basic JSON parsing and querying. If you require custom modifiers for more complex parsing, please contact the Entangle team with your requirements, as there is potential for future support for custom modifiers.

Q4: How do I handle optional fields like timestamp and volume in the server response?

A4: If the timestamp or volume is not provided in the server response:

  • Timestamp: The system assigns the current system time to the new update.

  • Volume: VWAP (Volume-Weighted Average Price) is disabled.

Ensure your configuration accounts for these defaults if the data source does not include these fields.

Q5: Can I use public data providers for integration?

A5: Yes, you can use public data providers for integration. You will need to provide the URL specification for the public data provider's API. From there, you can create a URL template and pipelines for parsing the necessary data.

Integration Process Summary

  • Define REST URL Template: Create a URL template in text/template syntax that points to the data provider's endpoint. The `DataKey` and `SourceID` variables are available in the template.

  • Set Up Parsing Pipelines: Configure pipelines in the YAML configuration file to parse the price, timestamp, and volume (if available) from the JSON response. Use GJSON syntax for querying the JSON structure.

  • Configure the System: Implement the URL template and parsing pipelines in the configuration file to integrate the data feed with Entangle UDF.

  • Fetch Data: Use the Finalized Data API to retrieve finalized updates, including the Merkle root, signatures, and Merkle proofs.

  • Verify Data: Verify the fetched updates on-chain using the `PullOracle` smart contract.

  • Monitor and Update: Regularly monitor the integration to ensure data accuracy and make adjustments to the configuration as needed based on any changes from the data provider or updates to the Entangle UDF system.
