
Multiplexed chunked file downloader

Project description

StreamShatter

Ever wondered where all the internet bandwidth you get from speedtest websites goes when you're actually trying to download something? Or does your WiFi constantly cut out and ruin your in-progress downloads? Well, either way, here's a tool that may or may not help!

Originally a very basic script for reliably downloading files from servers with inconsistent connections, this project has since been revisited and modernised to use niquests (https://github.com/jawah/niquests), greatly improving multiplexing performance, for those who still have use for such a tool.

StreamShatter takes advantage of the Range HTTP header to dynamically allocate multiple chunks: it starts with one streaming request and gradually bisects it while bandwidth permits, all without restarting the download. This allows large single-file downloads from hosts that, whether intentionally or not, have degraded throughput. The individual chunks also serve as checkpoints if/when a connection is broken.
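The bisection scheme can be sketched roughly as follows. This is a minimal illustration of the idea, not StreamShatter's actual implementation (the real tool splits live streaming requests on the fly rather than precomputing ranges, and the function names here are made up):

```python
# Illustrative sketch: start with one byte range covering the whole file,
# then repeatedly split the largest remaining range in half until the
# concurrency budget is reached. Each range maps directly onto a
# "Range: bytes=start-end" HTTP request header.

def bisect_ranges(total_size: int, max_chunks: int) -> list[tuple[int, int]]:
    """Return inclusive (start, end) byte ranges covering total_size bytes."""
    ranges = [(0, total_size - 1)]
    while len(ranges) < max_chunks:
        # Pick the largest range still worth splitting.
        i = max(range(len(ranges)), key=lambda k: ranges[k][1] - ranges[k][0])
        start, end = ranges[i]
        if end - start < 1:  # single byte left; nothing to split
            break
        mid = (start + end) // 2
        ranges[i] = (start, mid)          # first half replaces the original
        ranges.append((mid + 1, end))     # second half becomes a new chunk
    return sorted(ranges)

def range_header(start: int, end: int) -> dict[str, str]:
    """Build the Range header for one chunk."""
    return {"Range": f"bytes={start}-{end}"}
```

For example, `bisect_ranges(100, 4)` yields four contiguous ranges that together cover all 100 bytes, each of which could be fetched by an independent request and retried on its own if the connection drops.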

Installation

  • Install Python and pip
  • Install StreamShatter as a package: pip install streamshatter

Usage

usage: streamshatter [-h] [-V] [-H HEADERS] [-l CONCURRENT_LIMIT] [-sl SIZE_LIMIT] [-t TIMEOUT] [-s | --ssl | --no-ssl]
                     [-d | --debug | --no-debug] [-lp | --log-progress | --no-log-progress]
                     url [filename]

Multiplexed chunked file downloader

positional arguments:
  url                   Target URL
  filename              Output filename; use "-" for stdout pipe

options:
  -h, --help            show this help message and exit
  -V, --version         show program's version number and exit
  -H, --headers HEADERS
                        HTTP headers, interpreted as JSON
  -l, -cl, --concurrent-limit CONCURRENT_LIMIT
                        Limits the amount of concurrent requests; defaults to 64
  -sl, --size-limit SIZE_LIMIT
                        Limits the amount of data to download; defaults to 1099511627776
  -t, --timeout TIMEOUT
                        Limits the amount of time allowed for the initial request to succeed
  -s, --ssl, --no-ssl   Enforces SSL verification; defaults to TRUE
  -d, --debug, --no-debug
                        Terminates immediately upon non-timeout errors, and writes the response data for errored chunks; defaults to
                        FALSE
  -lp, --log-progress, --no-log-progress
                        Continually updates a progress bar in the standard output; defaults to TRUE



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

streamshatter-1.1.8.tar.gz (9.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

streamshatter-1.1.8-py3-none-any.whl (9.8 kB)

Uploaded Python 3

File details

Details for the file streamshatter-1.1.8.tar.gz.

File metadata

  • Download URL: streamshatter-1.1.8.tar.gz
  • Upload date:
  • Size: 9.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.8

File hashes

Hashes for streamshatter-1.1.8.tar.gz
  • SHA256: 39604c610d3f7ceaf57a41b62d699838896343e5eeca79b0e2968d98c8dd89f4
  • MD5: c08c04622dd5dfea3725af4803e901de
  • BLAKE2b-256: 9e3790bc38b98554c9426bd338aec386d6e38020a5639ca519ad10d590c69b33

See more details on using hashes here.

File details

Details for the file streamshatter-1.1.8-py3-none-any.whl.

File metadata

  • Download URL: streamshatter-1.1.8-py3-none-any.whl
  • Upload date:
  • Size: 9.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.8

File hashes

Hashes for streamshatter-1.1.8-py3-none-any.whl
  • SHA256: f340ce49254a6a4bfdbb10afb7c599e30904af2ead70fce43df4bbc698800b45
  • MD5: 3f72b44ff453b18086430554d213cc28
  • BLAKE2b-256: 21983111478748be72db33d611bb7e43fa3e7afdfb1933d53084d3df6b4b8194

