
Multiplexed chunked file downloader

Project description

StreamShatter

Ever wondered where all the internet bandwidth you get from speedtest websites goes when you're actually trying to download something? Or does your WiFi constantly cut out and ruin your in-progress downloads? Well, either way, here's a tool that may or may not help!

Originally a very basic script for reliably downloading files from servers with inconsistent connections, this project has been revisited and modernised to use niquests (https://github.com/jawah/niquests), greatly improving multiplexing performance for those who still have use for such a tool.

StreamShatter takes advantage of the Range HTTP header to dynamically allocate multiple chunks: it starts with one streaming request and gradually bisects it while bandwidth permits, all without restarting the download. This enables single, large file downloads from hosts that, whether intentionally or unintentionally, have degraded throughput. The individual chunks also serve as checkpoints if/when a connection is broken.
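The bisection idea can be sketched roughly like this. This is a simplified illustration of the technique, not StreamShatter's actual code; the function names and byte offsets are made up:

```python
def split_range(start: int, end: int) -> tuple[tuple[int, int], tuple[int, int]]:
    """Bisect an inclusive byte range into two halves.

    Mirrors the idea of splitting the unfetched tail of an in-flight
    request so a second connection can fetch it in parallel.
    """
    mid = (start + end) // 2
    return (start, mid), (mid + 1, end)


def range_header(start: int, end: int) -> dict[str, str]:
    # RFC 9110 byte-range request, e.g. "Range: bytes=0-499"
    return {"Range": f"bytes={start}-{end}"}


# Example: first split of a 1 MiB file
left, right = split_range(0, 1_048_575)
print(left, right)           # (0, 524287) (524288, 1048575)
print(range_header(*right))  # {'Range': 'bytes=524288-1048575'}
```

Each half can be split again while spare bandwidth remains, and a completed half never needs re-downloading, which is what makes the chunks usable as checkpoints.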

Installation

  • Install Python and pip
  • Install StreamShatter as a package: pip install streamshatter

Usage

usage: streamshatter [-h] [-V] [-H HEADERS] [-l CONCURRENT_LIMIT] [-sl SIZE_LIMIT] [-t TIMEOUT] [-s | --ssl | --no-ssl]
                     [-d | --debug | --no-debug] [-lp | --log-progress | --no-log-progress]
                     url [filename]

Multiplexed chunked file downloader

positional arguments:
  url                   Target URL
  filename              Output filename; use "-" for stdout pipe

options:
  -h, --help            show this help message and exit
  -V, --version         show program's version number and exit
  -H, --headers HEADERS
                        HTTP headers, interpreted as JSON
  -l, -cl, --concurrent-limit CONCURRENT_LIMIT
                        Limits the amount of concurrent requests; defaults to 64
  -sl, --size-limit SIZE_LIMIT
                        Limits the amount of data to download; defaults to 1099511627776
  -t, --timeout TIMEOUT
                        Limits the amount of time allowed for the initial request to succeed
  -s, --ssl, --no-ssl   Enforces SSL verification; defaults to TRUE
  -d, --debug, --no-debug
                        Terminates immediately upon non-timeout errors, and writes the response data for errored chunks; defaults to
                        FALSE
  -lp, --log-progress, --no-log-progress
                        Continually updates a progress bar in the standard output; defaults to TRUE



Download files

Download the file for your platform.

Source Distribution

streamshatter-1.1.7.tar.gz (9.0 kB)


Built Distribution


streamshatter-1.1.7-py3-none-any.whl (9.7 kB)


File details

Details for the file streamshatter-1.1.7.tar.gz.

File metadata

  • Download URL: streamshatter-1.1.7.tar.gz
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.8

File hashes

Hashes for streamshatter-1.1.7.tar.gz:

  • SHA256: 9be13fe916d27542cd37efdf18de83170b9b7ccd78cef55ba0a7a145339df165
  • MD5: 4a28df2eb6d1796856db9a373cce53b4
  • BLAKE2b-256: 08eadb924da76666f741e4e8d06cad51f8b0209651b6f496ed7a782e944e83fc


File details

Details for the file streamshatter-1.1.7-py3-none-any.whl.

File metadata

  • Download URL: streamshatter-1.1.7-py3-none-any.whl
  • Size: 9.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.8

File hashes

Hashes for streamshatter-1.1.7-py3-none-any.whl:

  • SHA256: c34ab4ddd7dcc1affc47bb110dc56bc4436af3e5af0bd9a2c82631ba86459571
  • MD5: 70947cc42830d4b4f01bcc98b210762f
  • BLAKE2b-256: d2d38c67184298d841d05ce2f3a2b76839d4c78aa672c49bb2d074f3560d838c

