
Project description


Short overview

binance_historical_data is a Python package (Python >= 3.8) that makes downloading historical crypto data (prices and volumes) from the Binance servers as simple as possible. You don't even need a binance.com account to download the full history of crypto data.

The data is dumped locally and then unzipped, so you end up with an identical, ready-to-use local copy. With this package you can obtain the full history of prices and volumes in only 3 lines of Python code, and updating already downloaded data takes another 3 lines (see the sketch below).

Limitation: the previous day's data appears on the Binance server a few minutes after 0 a.m. UTC, so there is a short delay before the latest data becomes available.
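
For illustration, this is what those 3 lines look like, using the constructor arguments documented in the walkthrough below (a minimal sketch, not additional API):

from binance_historical_data import BinanceDataDumper

data_dumper = BinanceDataDumper(path_dir_where_to_dump=".", asset_class="spot", data_type="klines", data_frequency="1m")
data_dumper.dump_data()  # dump the full history for all USDT pairs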

Installation via pip:

pip install binance_historical_data

How to use it

Initialize the main object: data_dumper

from binance_historical_data import BinanceDataDumper

data_dumper = BinanceDataDumper(
    path_dir_where_to_dump=".",
    asset_class="spot",  # spot, um, cm
    data_type="klines",  # aggTrades, klines, trades
    data_frequency="1m",
)

Arguments:

  1. path_dir_where_to_dump:
    (string) Path to folder where to dump the data
  2. asset_class:
    (string) Source of data: [spot, um, cm]; um: USD(T)-margined futures, cm: coin-margined futures (see the futures example after this list)
  3. data_type="klines":
    (string) data type to dump:
    [aggTrades, klines, trades] for spot
    [aggTrades, klines, trades, indexPriceKlines, markPriceKlines, premiumIndexKlines, metrics] for futures (metrics only supported for um)
    Refer to the Binance documentation for additional info: https://github.com/binance/binance-public-data
  4. data_frequency="1m":
    (string) One of [1m, 3m, 5m, 15m, 30m, 1h, 2h, 4h, 6h, 8h, 12h]
    Frequency of price-volume data candles to get
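
As a hedged example, initializing a dumper for USD(T)-margined futures instead of spot uses the same constructor, only with different argument values (the path here is illustrative):

from binance_historical_data import BinanceDataDumper

# USD(T)-margined futures ("um"), hourly klines
futures_dumper = BinanceDataDumper(
    path_dir_where_to_dump="./futures_data",  # example path
    asset_class="um",
    data_type="klines",
    data_frequency="1h",
)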

1) The only method needed to dump the data

data_dumper.dump_data(
    tickers=None,
    date_start=None,
    date_end=None,
    is_to_update_existing=False,
    tickers_to_exclude=["UST"],
)

Arguments:

  1. tickers=None:
    (list) Trading pairs for which to dump data
    if None, all USDT pairs will be used
  2. date_start=None:
    (datetime.date) The date from which to start the dump
    if None, every trading pair will be dumped from the very beginning (the earliest available date is 2017-01-01)
  3. date_end=None:
    (datetime.date) The last date for which to dump data
    if None, today's date will be used
  4. is_to_update_existing=False:
    (bool) Whether to re-download and update data that already exists locally
  5. tickers_to_exclude=None:
    (list) Tickers to exclude from the dump
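
For example, a sketch restricting the dump to two pairs and one calendar year (the ticker symbols and dates here are illustrative):

import datetime

data_dumper.dump_data(
    tickers=["BTCUSDT", "ETHUSDT"],  # example tickers
    date_start=datetime.date(year=2022, month=1, day=1),
    date_end=datetime.date(year=2022, month=12, day=31),
    is_to_update_existing=False,
)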

2) Delete outdated daily results

Delete all daily data for which complete monthly data has already been dumped

data_dumper.delete_outdated_daily_results()

.csv klines (candles) file columns

"Open time" - Timestamp
"Open"
"High"
"Low"
"Close"
"Volume"
"Close time" - Timestamp
"Quote asset volume"
"Number of trades"
"Taker buy base asset volume"
"Taker buy quote asset volume"
"Ignore"
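
The raw Binance .csv files are typically shipped without a header row, so you can attach these column names yourself when loading, e.g. with pandas (a sketch; the file path is illustrative):

import pandas as pd

KLINES_COLUMNS = [
    "Open time", "Open", "High", "Low", "Close", "Volume",
    "Close time", "Quote asset volume", "Number of trades",
    "Taker buy base asset volume", "Taker buy quote asset volume", "Ignore",
]

df = pd.read_csv("BTCUSDT-1m-2022-01.csv", header=None, names=KLINES_COLUMNS)
df["Open time"] = pd.to_datetime(df["Open time"], unit="ms")  # kline timestamps are Unix milliseconds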

Examples

How to dump all data for all USDT trading pairs

Please be advised that the first data dump for all trading pairs might take some time (~40 minutes)

data_dumper.dump_data()

How to update data (get all new data)

It's as easy as running exactly the same dump_data method once again.
The data_dumper will find all the dates for which data already exists
and will try to dump only the new data.

data_dumper.dump_data()

How to update (reload) data for a given time period

import datetime

data_dumper.dump_data(
    date_start=datetime.date(year=2021, month=1, day=1),
    date_end=datetime.date(year=2022, month=1, day=1),
    is_to_update_existing=True,
)

Other useful methods

Get all trading pairs (tickers) from Binance

print(data_dumper.get_list_all_trading_pairs())

Get the first date for which data for a ticker can be found

print(data_dumper.get_min_start_date_for_ticker())

Get all tickers with locally saved data

print(
    data_dumper.get_all_tickers_with_data(timeperiod_per_file="daily")
)

Get all dates for which there is locally saved data

ticker = "BTCUSDT"  # example ticker, used in the snippets below

print(
    data_dumper.get_all_dates_with_data_for_ticker(
        ticker,
        timeperiod_per_file="monthly",
    )
)
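
Combining the two methods above, a sketch that inventories all locally saved monthly data:

for ticker in data_dumper.get_all_tickers_with_data(timeperiod_per_file="monthly"):
    dates = data_dumper.get_all_dates_with_data_for_ticker(
        ticker,
        timeperiod_per_file="monthly",
    )
    print(ticker, len(dates), "monthly files")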

Get the directory where the local data for a given ticker is stored

print(
    data_dumper.get_local_dir_to_data(
        ticker,
        timeperiod_per_file="monthly",
    )
)

Create the file name for a local data file

import datetime

date_obj = datetime.date(2022, 1, 1)  # example date

print(
    data_dumper.create_filename(
        ticker,
        date_obj,
        timeperiod_per_file="monthly",
    )
)
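
Putting the last two methods together, a sketch that builds the full path to one local file (assuming both methods return strings; os.path.join is ordinary standard-library usage, not part of the package):

import os

full_path = os.path.join(
    data_dumper.get_local_dir_to_data(ticker, timeperiod_per_file="monthly"),
    data_dumper.create_filename(ticker, date_obj, timeperiod_per_file="monthly"),
)
print(full_path)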

Contacts

License

This project is licensed under the MIT License.
