Multiplexed chunked file downloader

Project description

StreamShatter

Ever wondered where all the internet bandwidth you get from speedtest websites goes when you're actually trying to download something? Or does your WiFi constantly cut out and ruin your in-progress downloads? Well, either way, here's a tool that may or may not help!

Originally a very basic script for reliably downloading files from servers with inconsistent connections, this project has been revisited and modernised to use niquests (https://github.com/jawah/niquests), greatly improving multiplexing performance for those who still have use for such a tool.

StreamShatter takes advantage of the Range HTTP header to dynamically allocate multiple chunks: it starts with one streaming request and gradually bisects it while bandwidth permits, all without restarting the download. This allows single, large file downloads from hosts whose throughput is degraded, whether intentionally or not. The individual chunks also serve as checkpoints if/when connections are broken.
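
The bisection idea can be sketched as pure byte-range arithmetic. This is a simplified model, not StreamShatter's actual implementation (the function name and fixed split count are illustrative; the real tool reacts to live bandwidth): start with one range covering the whole file, then repeatedly halve the largest unfinished range until the concurrency limit is reached.

```python
def bisect_ranges(total_size: int, concurrent_limit: int) -> list[tuple[int, int]]:
    """Split [0, total_size) into inclusive byte ranges by repeatedly
    halving the largest range, mimicking how a downloader might grow
    its Range requests (Range: bytes=start-end uses inclusive ends)."""
    ranges = [(0, total_size - 1)]
    while len(ranges) < concurrent_limit:
        # Pick the largest remaining range to split next.
        i = max(range(len(ranges)), key=lambda k: ranges[k][1] - ranges[k][0])
        start, end = ranges[i]
        if end - start < 1:  # single byte left; nothing to split
            break
        mid = (start + end) // 2
        ranges[i] = (start, mid)
        ranges.append((mid + 1, end))
    return sorted(ranges)

# e.g. a 100-byte file split for 4 concurrent requests:
print(bisect_ranges(100, 4))
```

Because each split only subdivides an existing range, the union of the ranges always covers the file exactly once, which is what makes finished chunks usable as resume checkpoints.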

Installation

  • Install Python and pip
  • Install StreamShatter as a package: pip install streamshatter

Usage

usage: streamshatter [-h] [-V] [-H HEADERS] [-l CONCURRENT_LIMIT] [-sl SIZE_LIMIT] [-t TIMEOUT] [-s | --ssl | --no-ssl]
                     [-d | --debug | --no-debug] [-lp | --log-progress | --no-log-progress]
                     url [filename]

Multiplexed chunked file downloader

positional arguments:
  url                   Target URL
  filename              Output filename; use "-" for stdout pipe

options:
  -h, --help            show this help message and exit
  -V, --version         show program's version number and exit
  -H, --headers HEADERS
                        HTTP headers, interpreted as JSON
  -l, -cl, --concurrent-limit CONCURRENT_LIMIT
                        Limits the amount of concurrent requests; defaults to 64
  -sl, --size-limit SIZE_LIMIT
                        Limits the amount of data to download; defaults to 1099511627776
  -t, --timeout TIMEOUT
                        Limits the amount of time allowed for the initial request to succeed
  -s, --ssl, --no-ssl   Enforces SSL verification; defaults to TRUE
  -d, --debug, --no-debug
                        Terminates immediately upon non-timeout errors, and writes the response data for errored chunks; defaults to
                        FALSE
  -lp, --log-progress, --no-log-progress
                        Continually updates a progress bar in the standard output; defaults to TRUE
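
The -H flag expects headers as a JSON object, which is easiest to produce programmatically rather than hand-quoting in the shell. A small sketch (the token, URL, and filename below are placeholders, not real values):

```python
import json
import shlex

# Build the JSON value for StreamShatter's -H flag; the token is a placeholder.
headers = {"Authorization": "Bearer TOKEN", "Referer": "https://example.com/"}
header_json = json.dumps(headers)

# Quote it safely for a shell invocation (URL and output filename are hypothetical):
cmd = f"streamshatter -H {shlex.quote(header_json)} https://example.com/big.iso big.iso"
print(cmd)
```

Using json.dumps plus shlex.quote avoids the usual single-versus-double-quote mistakes when embedding JSON in a shell command.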

Project details


Download files

Download the file for your platform.

Source Distribution

streamshatter-1.1.9.tar.gz (9.4 kB)

Uploaded Source

Built Distribution

streamshatter-1.1.9-py3-none-any.whl (10.1 kB)

Uploaded Python 3

File details

Details for the file streamshatter-1.1.9.tar.gz.

File metadata

  • Download URL: streamshatter-1.1.9.tar.gz
  • Upload date:
  • Size: 9.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.8

File hashes

Hashes for streamshatter-1.1.9.tar.gz

  • SHA256: e362e00d0f030b76bc0e312f4d73b5d4325d90faa60778a9b68b765638ac43ef
  • MD5: 06d40f39fba1a75b3c8446e32a1e3cf2
  • BLAKE2b-256: 23360866efbd57158f91bae71fb3ef4156b3e32dcac9fe22c169605acb87ef43

File details

Details for the file streamshatter-1.1.9-py3-none-any.whl.

File metadata

  • Download URL: streamshatter-1.1.9-py3-none-any.whl
  • Upload date:
  • Size: 10.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.8

File hashes

Hashes for streamshatter-1.1.9-py3-none-any.whl

  • SHA256: 955f319e2fe129fc999c8396e00ce52a05420093481928d3ddbab5910039be7b
  • MD5: fbf12367b980bb200bb3c145c552d915
  • BLAKE2b-256: 39ac367b169ecc6065258255e179369bc40f20a610055d33f58dab2e93dde860
