# Orderbooks

Crypto order books scraper: keeps a limit order book copy updated in real time for various crypto pairs.
## Supported exchanges
- BitMEX (perpetual)
- Bitstamp (spot)
- Coinbase Pro (spot)
- Kraken (spot)
- OKEx (spot)
## Installation

- Clone the repo

  ```shell
  git clone ...
  ```

- Install dependencies

  ```shell
  go get ./...
  ```

- Customize configuration

  Copy `config.json` and edit it according to your needs:

  - `pairs`: pairs to scrape (if an exchange doesn't support a pair, it is ignored for that specific exchange, with a warning on startup)
  - `snapshot_interval`: interval between full snapshots (in seconds)
  - `storage_path`: where snapshot and events files are saved
  - `exchanges`: exchanges to scrape
  - `save_Events`: save events files

- (optional) Install the systemd service (tested on Ubuntu 18.04)

  ```shell
  go build -o scripts/orderbooks
  ./scripts/install.sh
  ```

  ⚠️ The systemd service is enabled on startup by default.
## Usage

### Help

Run `orderbooks --help` for the command list and `orderbooks [COMMAND] --help` for detailed command help.
### Run scraper

Run the order book scraper and save data to CSV files:

```shell
orderbooks run --log_level 1 --config your_config_file.json
```

⚠️ Uncompressed files for all the exchanges and supported pairs may take more than 5 GB a day; use the built-in `compress [FILE_TYPE]` command (`FILE_TYPE`: `events`|`snapshots`) to save 80% of the space.
### Compress files

Compress stored files up to today-n days (default n = 5):

```shell
orderbooks compress [n]
```
### Extract files

Extract compressed files for `[EXCHANGE]` and `[PAIR]` into `[TARGET PATH]`:

```shell
orderbooks extract --extract_events=[0|1] [EXCHANGE] [PAIR] [TARGET PATH]
```

Optional parameters:

- `--extract_events` (default 0): extract snapshot files if 0, event files if 1
### Remove stored files

Remove old stored files up to today-n days (default n = 5):

```shell
orderbooks clean [n]
```
### Upload to Elasticsearch

Upload data stored in files to Elasticsearch:

```shell
orderbooks to_elasticsearch --storage_path [STORAGE_PATH] --interval=10 --snapshot_mode=1 [PAIR] [EXCHANGE] [FROM_DATE] [TO_DATE]
```

All positional arguments are required. Dates are in `yyyy-mm-dd` format.

Optional parameters:

- `--storage_path` (default 0)
- `--interval`
- `--snapshot_mode`
### Live feed

Publish real-time snapshots on a NATS channel and optionally save snapshots to Elasticsearch:

```shell
data_feed --interval 1 [PAIR] [EXCHANGE]
```
### gRPC server

Run a gRPC server to query stored files and/or stream live data (see `server/proto/orderbooks.proto` for available endpoints and API docs):

```shell
orderbooks server --storage_path [STORAGE_PATH]
```
## Contributing

This project was meant as a learning project, so it is not regularly maintained, but any contributions you make are greatly appreciated.
- Fork the project
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a pull request