Custom Data Feeds

If you need a data type that isn't currently supported, feel free to contact us and we can discuss adding it.

Custom data in data feeds refers to user-defined information that is integrated into a data feed to meet the specific needs of a dApp or smart contract. This enables developers to pull tailored data from both on-chain and off-chain sources, facilitating more accurate and relevant decision-making and operations. With UDF, users can easily expand the data available to their applications by incorporating new data types into the Entangle UDF infrastructure.


When Is Custom Data Needed?

  • Integrating New Crypto Assets: Add new cryptocurrency price feeds to ensure decentralized exchanges, lending platforms, or wallets always have the latest market prices.

  • Real-World Asset (RWA) Integration: Introduce real-world assets like commodities, real estate, or physical goods into the decentralized ecosystem, enabling their tokenization and trading.

  • Proof of Reserves Integration: Add external data feeds to verify asset reserves, providing transparency and confidence in the backing of digital assets or financial platforms.


Requesting New Data Types

To integrate new data into UDF, contact us and provide the following information:

  • Initial Data with Examples: Share real sample data that will be processed during the integration.

  • Data Processing Requirements: Outline any necessary transformations or formatting for the data to be delivered on-chain.

  • Update Frequency Suggestions: Recommend the desired update frequency to tailor the integration to the protocol’s needs.


Integration Process

This section outlines the process of integrating a new data feed into UDF. The default integration flow ensures smooth onboarding of new data sources and utilizes UDF’s capabilities for fetching, parsing, and distributing data on-chain.

1. Define REST URL Template

The REST URL template defines how data is fetched from external sources. You will need to create a URL template that points to the endpoint of the data provider. The template uses Go text/template syntax, with placeholders such as DataKey and SourceID. Example for the Binance API:

https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}

This URL template is inserted into the configuration file, allowing Entangle's backend to fetch the necessary data.
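
For illustration, the snippet below shows how such a template expands with Go's text/template package. The TemplateData struct and the ETHUSDT symbol are only assumptions for this example; in practice you only supply the template string in the configuration, and Entangle's backend performs the substitution.

package main

import (
	"os"
	"text/template"
)

// TemplateData mirrors the placeholders available in the URL template.
// DataKey and SourceID are the placeholder names mentioned above; this
// struct exists only for the sake of this illustration.
type TemplateData struct {
	DataKey  string
	SourceID string
}

func main() {
	tmpl := template.Must(template.New("url").Parse(
		"https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}",
	))
	// Prints: https://api.binance.com/api/v3/avgPrice?symbol=ETHUSDT
	_ = tmpl.Execute(os.Stdout, TemplateData{DataKey: "ETHUSDT", SourceID: "binance"})
}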

2. Set Up Parsing Pipelines

In the configuration file, set up pipelines to parse data from the response JSON. Below is an example JSON response structure:

{
  "asset": {
    "price": 1000000000000000000,
    "timestamp": 1721636387,
    "volume": 10000000000
  }
}

  • Pipeline for Price: asset.price

  • Pipeline for Timestamp: asset.timestamp

  • Pipeline for Volume: asset.volume

These pipelines should be defined using GJSON syntax to extract the required fields.
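
As a rough illustration, the following Go sketch evaluates those same GJSON paths against the sample response using the tidwall/gjson library. The actual parsing is performed by Entangle's backend; this code only shows how the pipeline paths resolve.

package main

import (
	"fmt"

	"github.com/tidwall/gjson"
)

func main() {
	// Sample response from the data provider (same structure as above).
	response := `{"asset":{"price":1000000000000000000,"timestamp":1721636387,"volume":10000000000}}`

	// Each pipeline is a GJSON path evaluated against the response body.
	price := gjson.Get(response, "asset.price")
	timestamp := gjson.Get(response, "asset.timestamp")
	volume := gjson.Get(response, "asset.volume")

	fmt.Println(price.String(), timestamp.Int(), volume.String())
}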

3. Configure the System

The next step is to implement both the URL template and parsing pipelines in the YAML configuration file. This connects the data feed with Entangle UDF.

Example configuration snippet:

urlTemplate: "https://api.binance.com/api/v3/avgPrice?symbol={{.DataKey}}"
pipelines:
  - value: "asset.price"
  - timestamp: "asset.timestamp"
  - volume: "asset.volume"

4. Fetch Data

To retrieve data updates, use the Finalized Data API. For example, to fetch the NGL/USD price feed, execute the following curl command:

curl https://udfsnaptest.ent-dx.com/last_votes?feedKeys=NGL/USD

Below is a sample response containing publisher votes with their signatures, as well as data encoded as bytes in the update_call_data property, ready to be used on-chain.

{
  "update_call_data": "0x..",
  "feeds": [
    {
      "feed_key": "NGL/USD",
      "votes": [
        {
          "Value": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAS4fbe+zAAA=",
          "Timestamp": 1737093395,
          "Signature": {
            "R": "0xfb2e89fb52f1d832d5437f4b773c0f67749b914b25c592af9cbc8ffe58e9e928",
            "S": "0x3c7e717a67b8a0c072f0271f35f3994c865c12b930f20b57149b623cade43d43",
            "V": 27
          }
        },
        {
          "Value": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAS4fbe+zAAA=",
          "Timestamp": 1737093402,
          "Signature": {
            "R": "0xe6a1448feac17a219c63a426f978cd395f8be55a74ebc717a88385033c7c5432",
            "S": "0x1a16ba5138e5293b03c093097c8564c55a0f054537a6c665d3b4bc9cf716d46f",
            "V": 28
          }
        },
        {
          "Value": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAS4fbe+zAAA=",
          "Timestamp": 1737093400,
          "Signature": {
            "R": "0x8308dc0d3b5a32a12374fbc8a2852cb14af4e31cc015b4d463e4cfb41d5aa469",
            "S": "0x12d5746c9499e7dc5edbbc436f696478ab4ad5a96247fb357d3d0c418769778d",
            "V": 28
          }
        }
      ]
    }
  ]
}
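
The sketch below shows one way a Go client could consume this response: it calls the endpoint from the example above, decodes the publisher votes, and interprets each base64-encoded Value as a big-endian 256-bit integer. The struct definitions and the integer interpretation of Value are assumptions made for this illustration.

package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"math/big"
	"net/http"
)

// Minimal structs covering only the fields used in this example.
type vote struct {
	Value     string `json:"Value"`
	Timestamp int64  `json:"Timestamp"`
}

type feed struct {
	FeedKey string `json:"feed_key"`
	Votes   []vote `json:"votes"`
}

type lastVotesResponse struct {
	UpdateCallData string `json:"update_call_data"`
	Feeds          []feed `json:"feeds"`
}

func main() {
	resp, err := http.Get("https://udfsnaptest.ent-dx.com/last_votes?feedKeys=NGL/USD")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out lastVotesResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}

	for _, f := range out.Feeds {
		for _, v := range f.Votes {
			raw, err := base64.StdEncoding.DecodeString(v.Value)
			if err != nil {
				panic(err)
			}
			// Assumption: Value encodes a big-endian 256-bit integer.
			value := new(big.Int).SetBytes(raw)
			fmt.Printf("%s @ %d: %s\n", f.FeedKey, v.Timestamp, value.String())
		}
	}
}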

5. Monitor and Update

Ongoing monitoring ensures that the integration functions correctly. Periodically review the configuration and update it as needed to accommodate changes in the data provider's API or Entangle’s system.

Once you have fetched the data, use the UDFOracle smart contract to verify the updates on-chain.
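
As a very rough sketch of that last step, the go-ethereum snippet below submits the update_call_data bytes to the oracle contract as raw calldata. The RPC URL, contract address, private key, and gas limit are placeholders, and the assumption that update_call_data is complete calldata for the UDFOracle update entry point should be confirmed against the UDFOracle documentation for your target chain.

package main

import (
	"context"
	"encoding/hex"
	"math/big"
	"strings"

	"github.com/ethereum/go-ethereum/common"
	"github.com/ethereum/go-ethereum/core/types"
	"github.com/ethereum/go-ethereum/crypto"
	"github.com/ethereum/go-ethereum/ethclient"
)

func main() {
	// Placeholders: replace with real values for your deployment.
	rpcURL := "https://rpc.example.org"
	oracleAddr := common.HexToAddress("0x0000000000000000000000000000000000000000") // deployed UDFOracle address
	updateCallData := "0x.." // update_call_data from the Finalized Data API response

	client, err := ethclient.Dial(rpcURL)
	if err != nil {
		panic(err)
	}

	key, err := crypto.HexToECDSA("<sender private key hex, without 0x prefix>")
	if err != nil {
		panic(err)
	}
	from := crypto.PubkeyToAddress(key.PublicKey)

	ctx := context.Background()
	nonce, err := client.PendingNonceAt(ctx, from)
	if err != nil {
		panic(err)
	}
	gasPrice, err := client.SuggestGasPrice(ctx)
	if err != nil {
		panic(err)
	}
	chainID, err := client.ChainID(ctx)
	if err != nil {
		panic(err)
	}

	// Assumption: update_call_data already contains the full calldata
	// (selector plus arguments) expected by the oracle contract.
	data, err := hex.DecodeString(strings.TrimPrefix(updateCallData, "0x"))
	if err != nil {
		panic(err)
	}

	// Gas limit of 500,000 is an arbitrary placeholder for this sketch.
	tx := types.NewTransaction(nonce, oracleAddr, big.NewInt(0), 500_000, gasPrice, data)
	signed, err := types.SignTx(tx, types.LatestSignerForChainID(chainID), key)
	if err != nil {
		panic(err)
	}
	if err := client.SendTransaction(ctx, signed); err != nil {
		panic(err)
	}
}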
