
bfx-terminal-examples

v4.0.0 · Bitfinex Terminal Examples

Downloads: 14

Bitfinex Terminal

Bitfinex Terminal offers market data as live data streams and is built on top of Dazaar. It offers first class support for Algo Traders by offering market data over a distributed database to everyone.

The market data streams are free and give algo traders and data scientists fast, easy access to our historical trading data. Trading data is easy to download and replicate. It is shared on a P2P network - like BitTorrent - but with a database interface. Unlike BitTorrent, you don't have to download everything before you can use it: streaming time ranges is supported. And if you later decide to download more data, e.g. from an earlier timeframe, the data is of course kept ordered, in a nice B-Tree, for fast access.
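To see why ordered keys make time-range streaming cheap, here is a small conceptual sketch (not the actual Dazaar or Hyperbee internals): with sorted timestamps, the start of a range can be found in O(log n) steps, so only the entries inside the range ever need to be fetched.

```javascript
// Illustrative sketch: keys stored in order (as in a B-Tree) let us
// locate the start of a time range by binary search instead of scanning.
function lowerBound (timestamps, target) {
  let lo = 0
  let hi = timestamps.length
  while (lo < hi) {
    const mid = (lo + hi) >> 1
    if (timestamps[mid] < target) lo = mid + 1
    else hi = mid
  }
  return lo // index of the first timestamp >= target
}

const ts = [100, 200, 300, 400, 500]
console.log(lowerBound(ts, 250)) // 2 -> only entries from index 2 onward are needed
```

A real B-Tree does the same kind of descent per node, which is what lets the terminal serve a requested time range without shipping the whole dataset first.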

Table of Contents

  1. How to use it?
  2. Support for Algo Traders
  3. Example: Get a Full Snapshot of the BTCUSD trades, and keep following new updates
  4. Example: Query Candles Data
  5. Tutorial: Backtest your Trading Strategies with Bitfinex Terminal & Honey Framework
  6. Tutorial: Execute your Trading Strategy with the Honey Framework and Bitfinex Terminal
  7. Article: Learn more about Dazaar

How to use it?

Every data stream in Bitfinex Terminal and Dazaar has a unique id on the network. Think of it as the URL of the data stream. You can use ids directly, or you can use Dazaar Cards, which are available for download. We have prepared a few examples that show how to use them.

First class support for Algo Traders

Bitfinex Terminal is also supported by the Honey Framework. You can easily backtest your Honey Framework Strategies on Bitfinex Terminal with the Honey Framework backtesting tools. And you can also decide to trade on your own or to sell the trading signals on the Dazaar network with bfx-hf-strategy-dazaar.

Examples

Also make sure to check our tutorial material :)

Example: Get a Full Snapshot of the BTCUSD trades, and keep following new updates

In our first example we will prepare a full copy of the Bitfinex trades data. We also want to keep following new updates that come in. We will walk through a small set of code, as if we were writing it. The full example can be found at examples/trades-full-copy.js.

As a first step we have to require our dependencies, one of them being the terms of use, which can be read at https://github.com/bitfinexcom/bitfinex-terminal/tree/master/terms-of-use:

const dazaar = require('dazaar')
const swarm = require('dazaar/swarm')

const Hyperbee = require('hyperbee')
const keyEncoding = require('bitfinex-terminal-key-encoding')
const terms = require('bitfinex-terminal-terms-of-use')

Next we create a Dazaar market. Our data will be stored at dbs/full-trades:

const market = dazaar('dbs/full-trades')

Then we download the Dazaar Card for the stream and load it. The option live: true keeps the connection open, so we keep listening for updates after the data is fully synced. If we set sparse to true, only the data we request in a query would be downloaded; with sparse: false we make a full copy. We also accept the Bitfinex Terminal terms of use by loading them into Dazaar, after we have read them:

const card = require('../cards/bitfinex.terminal.btcusd.trades.json')
const buyer = market.buy(card, { live: true, sparse: false, terms })

In the next step a lot happens: we register an event listener for the feed event. When it is emitted, we know that the feed is ready for consumption. We also set up a Hyperbee instance, which gives us a convenient interface to access the data. We pass the db instance to a function called doQuery, which we will look at next.

buyer.on('feed', function () {
  console.log('got feed')

  const db = new Hyperbee(buyer.feed, {
    keyEncoding,
    valueEncoding: 'json'
  })

  doQuery(db)
})

The function doQuery makes the actual request. It asks for the required data with priority and prints it to the console. We select all trades with timestamps between October 10, 2018, 09:00 UTC and October 10, 2019, 09:00 UTC, and limit the results to 10:

function doQuery (db) {
  db.createReadStream({
    gte: { timestamp: new Date('2018-10-10T09:00:00.000Z') },
    lte: { timestamp: new Date('2019-10-10T09:00:00.000Z') },
    limit: 10
  }).on('data', (d) => {
    console.log(d)
  })
}
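The gte/lte/limit semantics of that read stream can be illustrated with plain JavaScript (a conceptual sketch of the query semantics only; the real Hyperbee stream fetches matching blocks lazily over the swarm):

```javascript
// Sketch of the range-query semantics: `trades` stands in for the
// ordered key space of the database.
const trades = [
  { timestamp: new Date('2018-01-01T00:00:00.000Z'), price: 13000 },
  { timestamp: new Date('2018-10-10T09:00:01.000Z'), price: 6500 },
  { timestamp: new Date('2019-05-01T12:00:00.000Z'), price: 5300 },
  { timestamp: new Date('2019-10-10T09:00:00.000Z'), price: 8500 },
  { timestamp: new Date('2020-01-01T00:00:00.000Z'), price: 7200 }
]

// Keys are ordered, so a range query means: start at `gte`, stop after
// `lte`, and cut off after `limit` results.
function rangeQuery (rows, { gte, lte, limit = Infinity }) {
  return rows
    .filter((r) => r.timestamp >= gte.timestamp && r.timestamp <= lte.timestamp)
    .slice(0, limit)
}

const result = rangeQuery(trades, {
  gte: { timestamp: new Date('2018-10-10T09:00:00.000Z') },
  lte: { timestamp: new Date('2019-10-10T09:00:00.000Z') },
  limit: 10
})
console.log(result.length) // 3 trades fall inside the range
```

Note that both range bounds are inclusive: a trade whose timestamp equals the lte bound is still returned.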

To start everything, we have to join the P2P swarm:

swarm(buyer)

And for demonstration purposes we also log the downloaded elements, so we can see the progress:

setInterval(() => {
  if (!buyer.feed) return

  console.log('data feed length is', buyer.feed.length, 'elements')
}, 1000)

This example code will download the whole BTCUSD trades dataset from Bitfinex Terminal. If the connection is reset, it simply resumes where it stopped. We also make a request for a time range and print it to the console; that time range is prioritized and downloaded first. When the download is finished, the socket stays open to receive the latest updates. A lot is done for us behind the scenes.

Example: Query Candles Data

Querying candle data is as easy as querying historical trade data. All candle data is consolidated into one database; we just have to select our candle type.

The full example can be found at examples/candles-sparse-select.js. Our setup is similar to the trades example: we require our dependencies and set up a database that is stored on disk:

const dazaar = require('dazaar')
const swarm = require('dazaar/swarm')

const Hyperbee = require('hyperbee')
const keyEncoding = require('bitfinex-terminal-key-encoding')

const market = dazaar('dbs/sparse-candles')

This time we load a Dazaar Card for candles. We also enable sparse mode, which means only the data we directly request is downloaded. We again accept the Bitfinex Terminal terms of use by loading the module for them into Dazaar:

const card = require('../cards/bitfinex.terminal.btcusd.candles.json')
const terms = require('bitfinex-terminal-terms-of-use')
const buyer = market.buy(card, { sparse: true, terms })

Our event listener looks exactly the same as the one from the previous example:

buyer.on('feed', function () {
  console.log('got feed')

  const db = new Hyperbee(buyer.feed, {
    keyEncoding,
    valueEncoding: 'json'
  })

  doQuery(db)
})

For our query we define a time range and the candle type. We also reverse the results, so we get the newest entries first:

function doQuery (db) {
  db.createReadStream({
    gte: { candle: '5m', timestamp: new Date('2018-10-10T09:00:00.000Z') },
    lte: { candle: '5m', timestamp: new Date('2019-10-10T09:00:00.000Z') },
    limit: 10,
    reverse: true
  }).on('data', (d) => {
    console.log(d)
  })
}
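What reverse: true means for the result order can be sketched in plain JavaScript (this models the semantics only, not the real Hyperbee internals, which walk the B-Tree from the top of the range):

```javascript
// Illustrative sketch: with reverse: true the range is walked from the
// newest key down, so the first results are the most recent candles.
const candles = [
  { timestamp: new Date('2019-01-01T00:00:00.000Z'), close: 3700 },
  { timestamp: new Date('2019-06-01T00:00:00.000Z'), close: 8500 },
  { timestamp: new Date('2019-10-01T00:00:00.000Z'), close: 8300 }
]

// Collect matching keys newest-first and cut off after `limit` entries.
function reverseRange (rows, { gte, lte, limit }) {
  return rows
    .filter((r) => r.timestamp >= gte && r.timestamp <= lte)
    .sort((a, b) => b.timestamp - a.timestamp)
    .slice(0, limit)
}

const newest = reverseRange(candles, {
  gte: new Date('2018-10-10T09:00:00.000Z'),
  lte: new Date('2019-10-10T09:00:00.000Z'),
  limit: 2
})
console.log(newest[0].close) // newest candle comes first
```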

In this example we did a sparse select on the data, so only the data we selected was downloaded.

Tutorials

Our tutorials can be found in the articles folder.

Articles