The Node.js tardis-dev library provides convenient access to tick-level real-time and historical cryptocurrency market data, both in exchange-native and normalized formats. Instead of callbacks it relies on async iteration (for await ...of), enabling composability features like seamless switching between real-time data streaming and historical data replay, or computing derived data locally.
const { replayNormalized, normalizeTrades, normalizeBookChanges } = require('@radar/tardis-dev')
const messages = replayNormalized(
{
exchange: 'bitmex',
symbols: ['XBTUSD', 'ETHUSD'],
from: '2019-05-01',
to: '2019-05-02'
},
normalizeTrades,
normalizeBookChanges
)
// note: top-level await/for await requires an async context
// (e.g., wrap the examples in an async function)
for await (const message of messages) {
console.log(message)
}
- historical tick-level market data replay backed by the tardis.dev HTTP API — includes full order book depth snapshots plus incremental updates, tick-by-tick trades, historical open interest, funding, index, mark prices, liquidations and more
- consolidated real-time data streaming API connecting directly to exchanges' public WebSocket APIs
- support for both exchange-native and normalized market data formats (unified format for accessing market data across all supported exchanges — normalized trades, order book and ticker data)
- seamless switching between real-time streaming and historical market data replay thanks to async iterables providing a unified way of consuming data messages
- transparent historical local data caching (cached data is stored on disk in compressed GZIP format and decompressed on demand when reading the data)
- support for top cryptocurrency exchanges: BitMEX, Deribit, Binance, FTX, OKEx, Huobi Futures, Huobi Global, Bitfinex, Coinbase Pro, Kraken Futures, Kraken, Bitstamp, Gemini, Poloniex, Bybit, Phemex, Delta Exchange, FTX US, Binance US, Gate.io, OKCoin, bitFlyer, HitBTC, CoinFLEX (2.0), Binance Jersey and more
- automatic reconnection logic for closed and stale real-time stream connections
- combining multiple exchanges' feeds into a single one via the combine helper function — synchronized historical market data replay and consolidated real-time data streaming from multiple exchanges
- computing derived data locally, like order book imbalance, custom trade bars, book snapshots and more, via the compute helper function and computables, e.g., volume-based bars, top 20 levels order book snapshots taken every 10 ms, etc.
- full limit order book reconstruction, both for real-time and historical data, via the OrderBook object (see the sketch after this list)
- fast and lightweight architecture — low memory footprint and no heavy in-memory buffering
- extensible mapping logic that allows adjusting normalized formats for specific needs
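For example, full book reconstruction boils down to applying every normalized book_change message to an OrderBook instance. Below is a minimal sketch, assuming the OrderBook object exposes an update() method accepting book_change messages and bestBid()/bestAsk() accessors — verify the exact method names against the API reference.
const { streamNormalized, normalizeBookChanges, OrderBook } = require('@radar/tardis-dev')
// locally reconstructed order book for a single symbol
const orderBook = new OrderBook()
const bookMessages = streamNormalized({ exchange: 'bitmex', symbols: ['XBTUSD'] }, normalizeBookChanges)
for await (const message of bookMessages) {
  if (message.type === 'book_change') {
    // apply the snapshot or incremental update to the local book
    orderBook.update(message)
    console.log(orderBook.bestBid(), orderBook.bestAsk())
  }
}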
Requires Node.js v12+ installed.
npm install @radar/tardis-dev --save
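If you have a tardis.dev API key for full historical data access, it can be set once up front. The sketch below assumes the init helper accepts apiKey and cacheDir options as shown — check the docs for the exact option names.
const { init } = require('@radar/tardis-dev')
// optional one-time setup: API key for full historical data access
// and a custom location for the on-disk GZIP cache (option names assumed here)
init({
  apiKey: 'YOUR_API_KEY',
  cacheDir: './tardis-cache'
})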
Example showing how to quickly display real-time spread and best bid/ask info across multiple exchanges at once. It can be easily adapted to do the same for historical data (replayNormalized instead of streamNormalized).
const tardis = require('@radar/tardis-dev')
const { streamNormalized, normalizeBookChanges, combine, compute, computeBookSnapshots } = tardis
const exchangesToStream = [
{ exchange: 'bitmex', symbols: ['XBTUSD'] },
{ exchange: 'deribit', symbols: ['BTC-PERPETUAL'] },
{ exchange: 'cryptofacilities', symbols: ['PI_XBTUSD'] }
]
// call streamNormalized for each specified exchange
// so we have a real-time stream for every one of them
const realTimeStreams = exchangesToStream.map((e) => {
return streamNormalized(e, normalizeBookChanges)
})
// combine all real-time message streams into one
const messages = combine(...realTimeStreams)
// create book snapshots with depth of 1 that are produced
// every time the best bid/ask info changes,
// effectively computing real-time quotes
const realTimeQuoteComputable = computeBookSnapshots({
depth: 1,
interval: 0,
name: 'realtime_quote'
})
// compute real-time quotes for the combined real-time messages
const messagesWithQuotes = compute(messages, realTimeQuoteComputable)
const spreads = {}
// print spreads info every 100ms
setInterval(() => {
console.clear()
console.log(spreads)
}, 100)
// update spreads info in real time
for await (const message of messagesWithQuotes) {
if (message.type === 'book_snapshot') {
spreads[message.exchange] = {
spread: message.asks[0].price - message.bids[0].price,
bestBid: message.bids[0],
bestAsk: message.asks[0]
}
}
}
Example showing a simple pattern of providing an async iterable of market data messages to a function that can process them no matter if it's real-time or historical market data. That effectively enables having the same 'data pipeline' for backtesting and live trading.
const tardis = require('@radar/tardis-dev')
const { replayNormalized, streamNormalized, normalizeTrades, compute, computeTradeBars } = tardis
const historicalMessages = replayNormalized(
{
exchange: 'bitmex',
symbols: ['XBTUSD'],
from: '2019-08-01',
to: '2019-08-02'
},
normalizeTrades
)
const realTimeMessages = streamNormalized(
{
exchange: 'bitmex',
symbols: ['XBTUSD']
},
normalizeTrades
)
async function produceVolumeBasedTradeBars(messages) {
const withVolumeTradeBars = compute(
messages,
computeTradeBars({
kind: 'volume',
interval: 100 * 1000 // aggregate by 100k contracts volume
})
)
for await (const message of withVolumeTradeBars) {
if (message.type === 'trade_bar') {
console.log(message.name, message)
}
}
}
await produceVolumeBasedTradeBars(historicalMessages)
// or for real time data
// await produceVolumeBasedTradeBars(realTimeMessages)
Example of streaming real-time market data in exchange-native format via the stream function — raw messages as provided by the exchange's WebSocket API, filtered by channels and symbols.
const { stream } = require('@radar/tardis-dev')
const messages = stream({
exchange: 'bitmex',
filters: [
{ channel: 'trade', symbols: ['XBTUSD'] },
{ channel: 'orderBookL2', symbols: ['XBTUSD'] }
]
})
for await (const message of messages) {
console.log(message)
}
Example of replaying historical market data in exchange-native format via the replay function, using the same filters plus a from/to date range.
const { replay } = require('@radar/tardis-dev')
const messages = replay({
exchange: 'bitmex',
filters: [
{ channel: 'trade', symbols: ['XBTUSD'] },
{ channel: 'orderBookL2', symbols: ['XBTUSD'] }
],
from: '2019-05-01',
to: '2019-05-02'
})
for await (const message of messages) {
console.log(message)
}