Reading Dukascopy bi5 Tick History with the TradingData Stream Library for Java

This Java library reads the publicly available binary-format bi5 Dukascopy Bank tick history files and converts them to a Java InputStream for use in your applications.

https://bitbucket.org/limemojito/trading-data-stream

TradingDataStream FX Data model library

This library supports:

  • High-level search APIs for Tick and Bar streams, backed by cached Dukascopy files.
  • On-demand fetch from Dukascopy.
  • Local filesystem caching.
  • Amazon Web Services S3 caching.
  • Bar aggregation from the tick data.
  • Bar search queries by barCount or date time range (UTC).
  • Stream -> CSV file conversion.
  • Stream -> JSON file conversion.
  • “Standalone” configuration for quick scripts.
  • Spring bean configurations and customisation for use in large applications.

Provided under the Apache 2.0 License; please refer to LICENSE.txt and DATA_DISCLAIMER.txt in our software code repository. This software is supplied as-is, use it at your own risk, and information obtained from using this software does NOT constitute financial advice.

Please note we are not affiliated with Dukascopy in any way. This project was a clean-room engineering effort to read the Dukascopy files. This library was inspired by the C++ binding at https://github.com/ninety47/dukascopy.

Fetching tick data using Dukascopy's publicly available bi5 history data

Using TradingDataStream with a Maven project

Add the following to the dependencies section of your pom.xml

<dependency>
  <groupId>com.limemojito.oss.trading.trading-data-stream</groupId>
  <artifactId>model</artifactId>
  <version>2.1.2</version>
</dependency>
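If you build with Gradle instead, the equivalent dependency declaration (assuming the same coordinates are resolvable from your configured repositories) would be:

implementation 'com.limemojito.oss.trading.trading-data-stream:model:2.1.2'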

TradingDataStream: Using the high level TradingSearch API for Tick data

This high-level API allows you to query by time range to retrieve ticks. The appropriate bi5 files are retrieved from Dukascopy to answer the query, and the tick data is trimmed so that the results fit within the query parameters.

The standalone setup here caches the retrieved bi5 files on the local filesystem in your user’s home directory under .dukascopy-cache, which speeds up repeated searches.

TradingSearch search = TradingDataStreamConfiguration.standaloneSetup();
try (TradingInputStream<Tick> ticks = search.search("EURUSD", "2020-01-02T00:00:00Z", "2020-01-02T00:59:59Z")) {
    ticks.stream()
         .forEach(t -> log.info("{} {} bid: {}", t.getMillisecondsUtc(), t.getSymbol(), t.getBid()));
}

TradingDataStream: Reading an existing Dukascopy bi5 FX Tick History file with Java

We recommend using the TradingSearch APIs, as these work with configured caches to reduce the load on the Dukascopy servers. Our low-level APIs can read individual file data streams, as shown below.

The separation of “path” and the file data is due to the naming convention of the data in the Dukascopy repository.

Symbol/Year/Month (0 indexed)/DayOfMonth/{24hourOfDay}h_ticks.bi5

String path = "EURUSD/2018/06/05/05h_ticks.bi5";
try (FileInputStream fileStream = new FileInputStream(path);
     TradingInputStream<Tick> ticks = new DukascopyTickInputStream(VALIDATOR, path, fileStream)) {
    ticks.stream().forEach(t -> log.info("{} {} bid: {}", t.getMillisecondsUtc(), t.getSymbol(), t.getBid()));
}
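As an illustration of the naming convention above, here is a minimal sketch (not part of the library API) that builds a Dukascopy-style path for a given UTC hour using java.time. Note the zero-indexed month directory.

import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

// Builds a Dukascopy bi5 path for the hour containing the supplied UTC instant.
// The month directory is zero-indexed, so July (7) becomes "06".
static String bi5Path(String symbol, Instant utcInstant) {
    ZonedDateTime utc = utcInstant.atZone(ZoneOffset.UTC);
    return String.format("%s/%04d/%02d/%02d/%02dh_ticks.bi5",
                         symbol,
                         utc.getYear(),
                         utc.getMonthValue() - 1,
                         utc.getDayOfMonth(),
                         utc.getHour());
}

// bi5Path("EURUSD", Instant.parse("2018-07-05T05:00:00Z")) returns "EURUSD/2018/06/05/05h_ticks.bi5"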

Tick Dukascopy File Format

Note that Dukascopy data uses a UTC+0 offset, so no time adjustment is necessary.

The files I downloaded are named something like ‘00h_ticks.bi5’. These ‘bi5’ files are LZMA-compressed binary data files. The binary data files are formatted into 20-byte rows.

  • 32-bit integer: milliseconds since epoch
  • 32-bit integer: Ask price (scaled by the point value, see below)
  • 32-bit integer: Bid price (scaled by the point value, see below)
  • 32-bit float: Ask volume
  • 32-bit float: Bid volume

The ask and bid prices need to be multiplied by the point value for the symbol/currency pair. The epoch is extracted from the URL (and the folder structure I’ve used to store the files on disk). It represents the point in time that the file starts from, e.g. 2013/01/14/00h_ticks.bi5 has an epoch of midnight on 14 January 2013. The C++ binding linked above provides an example of working with the file format, including the computation of this “epoch time”.

LZMA compression/decompression can be done with Apache Commons Compress:

https://commons.apache.org/proper/commons-compress/
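For example, a minimal sketch of decompressing a locally saved bi5 file with Commons Compress (this assumes commons-compress is on the classpath; its LZMA support also requires the XZ for Java library at runtime):

import org.apache.commons.compress.compressors.lzma.LZMACompressorInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

// Decompresses a bi5 file into its raw binary form (a multiple of 20 bytes).
static byte[] decompressBi5(String path) throws IOException {
    try (InputStream lzma = new LZMACompressorInputStream(new FileInputStream(path))) {
        return lzma.readAllBytes();
    }
}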

This format was verified by experimentation; a decoding sketch follows the list below.

[   TIME  ] [   ASKP  ] [   BIDP  ] [   ASKV  ] [   BIDV  ]
[0000 0800] [0002 2f51] [0002 2f47] [4096 6666] [4013 3333]
  • TIME is a 32-bit big-endian integer representing the number of milliseconds that have passed since the beginning of this hour.
  • ASKP is a 32-bit big-endian integer representing the asking price of the pair, multiplied by 100,000.
  • BIDP is a 32-bit big-endian integer representing the bidding price of the pair, multiplied by 100,000.
  • ASKV is a 32-bit big-endian floating point number representing the asking volume, divided by 1,000,000.
  • BIDV is a 32-bit big-endian floating point number representing the bidding volume, divided by 1,000,000.
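Putting the pieces together, here is a minimal decoding sketch for one 20-byte row. It assumes EURUSD’s point value of 0.00001 and that hourStartMillisUtc has already been derived from the file path.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Decodes the 20-byte row starting at 'offset' within the decompressed bi5 data.
static void decodeRow(byte[] raw, int offset, long hourStartMillisUtc) {
    ByteBuffer row = ByteBuffer.wrap(raw, offset, 20).order(ByteOrder.BIG_ENDIAN);
    long timeMillisUtc = hourStartMillisUtc + row.getInt();  // TIME: ms offset within the hour
    double ask = row.getInt() * 0.00001;                     // ASKP: 0x00022f51 = 143185 -> 1.43185
    double bid = row.getInt() * 0.00001;                     // BIDP: 0x00022f47 = 143175 -> 1.43175
    float askVolume = row.getFloat();                        // ASKV: volume in millions of units
    float bidVolume = row.getFloat();                        // BIDV: volume in millions of units
    System.out.printf("%d ask=%.5f bid=%.5f askVol=%s bidVol=%s%n",
                      timeMillisUtc, ask, bid, askVolume, bidVolume);
}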

Tick Data JSON Format

Note that epochMilliseconds is relative to the UTC timezone. The source field is either live or historical.

{
   "epochMilliseconds": 94875945798,
   "symbol": "EURUSD",
   "bid": 134567,
   "ask": 134520,
   "source": "live",
   "streamId": "00000000-0000-0000-0000-000000000000"
}
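As a sketch of how this JSON could be consumed, here is a hypothetical Jackson-friendly record matching the fields above. The field names come from the example document, not from the library’s own Tick class, and the bid/ask values appear to be expressed in points.

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.UUID;

// Hypothetical record mirroring the example JSON; bid and ask are integer point values.
public record TickJson(long epochMilliseconds,
                       String symbol,
                       int bid,
                       int ask,
                       String source,
                       UUID streamId) {
}

// Usage (requires jackson-databind 2.12+ for record support):
// TickJson tick = new ObjectMapper().readValue(json, TickJson.class);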