Integration Process

This section outlines the process of integrating a new data feed into the Entangle Unified Data Feed system. The default integration flow ensures smooth onboarding of new data sources and utilizes UDF’s capabilities for fetching, parsing, and distributing data on-chain.

Key Components:

  1. REST URL Template: Defines how data is fetched from external sources.

  2. Pipelines for Parsing: Parse relevant information from the fetched data, such as price, timestamp, and volume.

Steps for Integration

1. Reach out to Entangle

Before integration begins, the protocol should provide Entangle with:

  • Initial Data with Examples: Real sample data that the integration will process.

  • Data Processing Requirements: Outline any data transformation or formatting needed for the final output to be distributed on-chain.

  • Update Frequency Suggestions: Provide suggestions on how frequently updates should occur, allowing customization to meet the protocol’s needs.

Entangle UDF offers a generic REST interface solution for asset feeds. The required components include:

  • REST URL Template: A template for fetching data using REST, leveraging variables such as DataKey and SourceID.

  • Parsing Pipelines: For extracting key data (e.g., price, timestamp) from the response JSON using GJSON syntax.

2. Define REST URL Template

Create a URL template that points to the data provider's endpoint. The template uses Go text/template syntax, with placeholders such as DataKey and SourceID. Example for the Binance API:

https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}

This URL template is inserted into the configuration file, allowing Entangle's backend to fetch the necessary data.
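How the backend resolves such a template can be sketched with Go's standard text/template package. The struct and function names below are illustrative, and "NGLUSDT" is a hypothetical Binance symbol, not a value taken from this document:

```go
package main

import (
	"fmt"
	"strings"
	"text/template"
)

// urlTmpl is the Binance template from this step, parsed with Go's
// text/template package.
var urlTmpl = template.Must(template.New("url").Parse(
	"https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}"))

// FeedParams carries the variables mentioned in this section; the exact
// struct used by Entangle's backend is an assumption.
type FeedParams struct {
	DataKey  string
	SourceID string
}

// renderURL fills the template with the feed's parameters.
func renderURL(p FeedParams) string {
	var b strings.Builder
	if err := urlTmpl.Execute(&b, p); err != nil {
		panic(err)
	}
	return b.String()
}

func main() {
	// "NGLUSDT" is a hypothetical symbol used only for illustration.
	fmt.Println(renderURL(FeedParams{DataKey: "NGLUSDT", SourceID: "binance"}))
	// → https://api.binance.com/api/v3/avgPrice?symbol=NGLUSDT
}
```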

3. Set Up Parsing Pipelines

In the configuration file, set up pipelines to parse data from the response JSON. Below is an example JSON response structure:

{
  "asset": {
    "price": 1000000000000000000,
    "timestamp": 1721636387,
    "volume": 10000000000
  }
}

  • Pipeline for Price: asset.price

  • Pipeline for Timestamp: asset.timestamp

  • Pipeline for Volume: asset.volume

These pipelines should be defined using GJSON syntax to extract the required fields.

4. Configure the System

The next step is to add both the URL template and the parsing pipelines to the YAML configuration file. This connects the data feed to Entangle UDF.

Example configuration snippet:

urlTemplate: "https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}"
pipelines:
  - value: "asset.price"
  - timestamp: "asset.timestamp"
  - volume: "asset.volume"

5. Fetch Data

To retrieve data updates, use the Finalized Data API. For example, to fetch the NGL/USD price feed, execute the following curl command:

curl https://pricefeed.entangle.fi/spotters/prices-feed1?assets=NGL/USD

Sample response:

{
  "calldata": {
    "merkleRoot": "0xa4d6d7594888f1fe8675d033737baf6106819f7b0e59f3d794f5e1adcec70022",
    "signatures": [
      { "R": "...", "S": "...", "V": 28 },
      { "R": "...", "S": "...", "V": 28 },
      { "R": "...", "S": "...", "V": 28 }
    ],
    "feeds": [
      {
        "key": "NGL/USD",
        "value": { "data": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAcHqjZobCo0=", "timestamp": 1722866387 },
        "merkleProofs": ["wg/sd9GqECY5zFKdsHRThUdPda0Parv24Npcj15EqJ4=", "CK5qUUFgAaZBYE/Aq3x+Y61HHAQnW7q6A2K1obw55ZE="]
      }
    ]
  },
  "error": ""
}

This response contains the Merkle root, signatures, and data required to verify the update.
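In Go, the response can be decoded into structs whose fields mirror the sample above (only the fields shown in this section are modeled; a production client would fetch the JSON over HTTP from the endpoint in the curl example rather than embedding it):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Structs modeling only the fields shown in the sample response above.
type Signature struct {
	R string `json:"R"`
	S string `json:"S"`
	V int    `json:"V"`
}

type FeedValue struct {
	Data      string `json:"data"`
	Timestamp int64  `json:"timestamp"`
}

type Feed struct {
	Key          string    `json:"key"`
	Value        FeedValue `json:"value"`
	MerkleProofs []string  `json:"merkleProofs"`
}

type Calldata struct {
	MerkleRoot string      `json:"merkleRoot"`
	Signatures []Signature `json:"signatures"`
	Feeds      []Feed      `json:"feeds"`
}

type Response struct {
	Calldata Calldata `json:"calldata"`
	Error    string   `json:"error"`
}

// sample is a condensed copy of the response shown above.
const sample = `{
  "calldata": {
    "merkleRoot": "0xa4d6d7594888f1fe8675d033737baf6106819f7b0e59f3d794f5e1adcec70022",
    "signatures": [{ "R": "...", "S": "...", "V": 28 }],
    "feeds": [{
      "key": "NGL/USD",
      "value": { "data": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAcHqjZobCo0=", "timestamp": 1722866387 },
      "merkleProofs": ["wg/sd9GqECY5zFKdsHRThUdPda0Parv24Npcj15EqJ4="]
    }]
  },
  "error": ""
}`

// parseResponse decodes a Finalized Data API payload.
func parseResponse(raw []byte) (Response, error) {
	var r Response
	err := json.Unmarshal(raw, &r)
	return r, err
}

func main() {
	r, err := parseResponse([]byte(sample))
	if err != nil {
		panic(err)
	}
	fmt.Println(r.Calldata.Feeds[0].Key, r.Calldata.Feeds[0].Value.Timestamp)
	// → NGL/USD 1722866387
}
```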

6. Verify Data

Once you have fetched the data, use the PullOracle smart contract to verify the updates on-chain.

7. Monitor and Update

Ongoing monitoring ensures that the integration functions correctly. Periodically review the configuration and update it as needed to accommodate changes in the data provider's API or Entangle’s system.
