Multiplexed chunked file downloader

Project description

StreamShatter

Ever wondered where all the internet bandwidth you get from speedtest websites goes when you're actually trying to download something? Or does your WiFi constantly cut out and ruin your in-progress downloads? Well, either way, here's a tool that may or may not help!

Originally a very basic script for reliably downloading files from servers with inconsistent connections, this project has been revisited and modernised to use niquests (https://github.com/jawah/niquests) to greatly improve multiplexing performance, for those who still have use for such a tool.

StreamShatter takes advantage of the Range HTTP header to dynamically allocate multiple chunks: it starts with a single streaming request and gradually bisects it while bandwidth permits, all without restarting the download. This allows single, large files to be downloaded from hosts that, whether intentionally or not, serve degraded per-connection throughput. The individual chunks also act as checkpoints if/when a connection is broken.
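The bisection idea can be sketched roughly like this (a simplified illustration, not StreamShatter's actual implementation; all function names here are made up):

```python
# Illustrative sketch: how one streamed byte range can be bisected into
# multiple Range-header chunks, up to a concurrency limit.

def bisect_largest(chunks, concurrent_limit):
    """Repeatedly split the largest (start, end) byte range in half
    until the number of chunks reaches concurrent_limit."""
    chunks = list(chunks)
    while len(chunks) < concurrent_limit:
        i = max(range(len(chunks)), key=lambda j: chunks[j][1] - chunks[j][0])
        start, end = chunks[i]
        if end - start < 2:          # too small to split further
            break
        mid = (start + end) // 2
        chunks[i] = (start, mid)     # first half keeps its existing stream
        chunks.append((mid, end))    # second half becomes a new request
    return sorted(chunks)

def range_header(start, end):
    # HTTP Range is inclusive on both ends (RFC 9110),
    # so a half-open [start, end) range maps to start..end-1.
    return f"bytes={start}-{end - 1}"

# A 1000-byte file, split for up to 4 concurrent requests:
parts = bisect_largest([(0, 1000)], 4)
headers = [range_header(s, e) for s, e in parts]
```

Each surviving chunk boundary doubles as a checkpoint: if one request dies, only that chunk's remaining bytes need to be re-requested.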

Installation

  • Install Python and pip
  • Install StreamShatter as a package: pip install streamshatter

Usage

usage: streamshatter [-h] [-V] [-H HEADERS] [-l CONCURRENT_LIMIT] [-sl SIZE_LIMIT] [-t TIMEOUT] [-s | --ssl | --no-ssl]
                     [-d | --debug | --no-debug] [-lp | --log-progress | --no-log-progress]
                     url [filename]

Multiplexed chunked file downloader

positional arguments:
  url                   Target URL
  filename              Output filename; use "-" for stdout pipe

options:
  -h, --help            show this help message and exit
  -V, --version         show program's version number and exit
  -H, --headers HEADERS
                        HTTP headers, interpreted as JSON
  -l, -cl, --concurrent-limit CONCURRENT_LIMIT
                        Limits the amount of concurrent requests; defaults to 64
  -sl, --size-limit SIZE_LIMIT
                        Limits the amount of data to download; defaults to 1099511627776 (1 TiB)
  -t, --timeout TIMEOUT
                        Limits the amount of time allowed for the initial request to succeed
  -s, --ssl, --no-ssl   Enforces SSL verification; defaults to TRUE
  -d, --debug, --no-debug
                        Terminates immediately upon non-timeout errors, and writes the response data for errored chunks; defaults to
                        FALSE
  -lp, --log-progress, --no-log-progress
                        Continually updates a progress bar in the standard output; defaults to TRUE

Download files

Download the file for your platform.

Source Distribution

streamshatter-1.1.11.tar.gz (9.7 kB)

Uploaded Source

Built Distribution

streamshatter-1.1.11-py3-none-any.whl (10.5 kB)

Uploaded Python 3

File details

Details for the file streamshatter-1.1.11.tar.gz.

File metadata

  • Download URL: streamshatter-1.1.11.tar.gz
  • Upload date:
  • Size: 9.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for streamshatter-1.1.11.tar.gz
  • SHA256: ebd7805c7479c75d4a1aea1e0c68554fbe9c662eb38b6548742e39a05aa70594
  • MD5: dc3b51e4e9f95e9cb3a7804cb8631ebd
  • BLAKE2b-256: bc9be97a5d855868c9ff78a00f88efb46695cc1359fd862c7092ef51bf5e97f1

File details

Details for the file streamshatter-1.1.11-py3-none-any.whl.

File metadata

  • Download URL: streamshatter-1.1.11-py3-none-any.whl
  • Upload date:
  • Size: 10.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for streamshatter-1.1.11-py3-none-any.whl
  • SHA256: 979a0080133c044630c3e4d39ee55039cba39c142a0a98f0d9ca0581e6656721
  • MD5: 20f8daca373e8a7e50b86f3a17684cf4
  • BLAKE2b-256: 342ba82ff2cd9f93454cf4ef50a4b5f36ff38e3c503a356831820b282caba317
