
Multiplexed chunked file downloader

Project description

StreamShatter

Ever wondered where all the internet bandwidth you get from speedtest websites goes when you're actually trying to download something? Or does your WiFi constantly cut out and ruin your in-progress downloads? Well, either way, here's a tool that may or may not help!

Originally a very basic script for reliably downloading files from servers with inconsistent connections, this project has been revisited and modernised to use niquests (https://github.com/jawah/niquests), which greatly improves multiplexing performance, for those who still have use for such a tool.

StreamShatter takes advantage of the Range HTTP header to dynamically allocate multiple chunks: it starts with one streaming request and gradually bisects it while bandwidth permits, all without restarting the download. This allows large single-file downloads from hosts whose throughput is degraded, whether intentionally or not. The individual chunks also serve as checkpoints if/when a connection is broken.
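
The bisection mechanic can be sketched in plain Python. This is an illustrative simplification, not the package's actual implementation: the names `split_range` and `range_header` are invented here, and a real downloader would issue the requests concurrently and stream each body to disk.

```python
def split_range(start: int, end: int):
    """Bisect the inclusive byte span [start, end] into two halves.

    Returns None when the span is a single byte and cannot be split.
    """
    if end <= start:
        return None
    mid = start + (end - start) // 2
    return (start, mid), (mid + 1, end)


def range_header(start: int, end: int) -> dict:
    """Build the HTTP Range header for an inclusive byte span."""
    return {"Range": f"bytes={start}-{end}"}


# Splitting a 100-byte span yields two contiguous halves:
first, second = split_range(0, 99)
print(range_header(*first))   # {'Range': 'bytes=0-49'}
print(range_header(*second))  # {'Range': 'bytes=50-99'}
```

Repeatedly applying the split to whichever chunk has the most bytes remaining is what lets the download fan out without ever re-fetching completed bytes.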

Installation

  • Install Python and pip
  • Install StreamShatter as a package: pip install streamshatter

Usage

usage: streamshatter [-h] [-V] [-H HEADERS] [-l CONCURRENT_LIMIT] [-sl SIZE_LIMIT] [-t TIMEOUT] [-s | --ssl | --no-ssl]
                     [-d | --debug | --no-debug] [-lp | --log-progress | --no-log-progress]
                     url [filename]

Multiplexed chunked file downloader

positional arguments:
  url                   Target URL
  filename              Output filename; use "-" for stdout pipe

options:
  -h, --help            show this help message and exit
  -V, --version         show program's version number and exit
  -H, --headers HEADERS
                        HTTP headers, interpreted as JSON
  -l, -cl, --concurrent-limit CONCURRENT_LIMIT
                        Limits the amount of concurrent requests; defaults to 64
  -sl, --size-limit SIZE_LIMIT
                        Limits the amount of data to download; defaults to 1099511627776
  -t, --timeout TIMEOUT
                        Limits the amount of time allowed for the initial request to succeed
  -s, --ssl, --no-ssl   Enforces SSL verification; defaults to TRUE
  -d, --debug, --no-debug
                        Terminates immediately upon non-timeout errors, and writes the response data for errored chunks; defaults to
                        FALSE
  -lp, --log-progress, --no-log-progress
                        Continually updates a progress bar in the standard output; defaults to TRUE
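
A typical invocation might look like the following; the URL, header value, and output filename are placeholders, not part of the tool's output:

```shell
# Download with at most 16 concurrent range requests, a 30-second
# timeout on the initial request, and a custom header passed as JSON:
streamshatter -l 16 -t 30 -H '{"User-Agent": "streamshatter-example"}' \
    https://example.com/large-file.bin large-file.bin

# Pipe the download to stdout instead of writing a file:
streamshatter https://example.com/large-file.bin -
```

Note that `-H` takes a single JSON object, so the whole value should be quoted for the shell.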

Download files

Download the file for your platform.

Source Distribution

streamshatter-1.1.10.tar.gz (9.7 kB)

Uploaded Source

Built Distribution


streamshatter-1.1.10-py3-none-any.whl (10.5 kB)

Uploaded Python 3

File details

Details for the file streamshatter-1.1.10.tar.gz.

File metadata

  • Download URL: streamshatter-1.1.10.tar.gz
  • Upload date:
  • Size: 9.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for streamshatter-1.1.10.tar.gz
  • SHA256: 2b8b6e6c4328230749200c37572473613a02f54cc6d37dc7ed6729783aa24474
  • MD5: 8e98d3b13b2026f20a2b4cff2950487a
  • BLAKE2b-256: 1fbec93adde142498608418a4803d759f8b6f5ecf16fdbd4eaf5984d5378e06e
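
The published digests can be checked locally after downloading. A minimal sketch using only the standard library; the sdist filename in the commented line is a placeholder for wherever the file was saved:

```python
import hashlib


def file_sha256(path: str) -> str:
    """Compute the hex SHA256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(8192), b""):
            digest.update(block)
    return digest.hexdigest()


# Compare against the digest published above:
expected = "2b8b6e6c4328230749200c37572473613a02f54cc6d37dc7ed6729783aa24474"
# assert file_sha256("streamshatter-1.1.10.tar.gz") == expected
```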


File details

Details for the file streamshatter-1.1.10-py3-none-any.whl.

File metadata

  • Download URL: streamshatter-1.1.10-py3-none-any.whl
  • Upload date:
  • Size: 10.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for streamshatter-1.1.10-py3-none-any.whl
  • SHA256: 557335e31fd7d9f2fb5de1f7aadeb042547d674069a98656731ba2a4f6a2447a
  • MD5: fed75f524b8b6178eb1265025a42874e
  • BLAKE2b-256: b51a5d24775a212db0779f63e360441fb276f765d6d100f135af86c4c06bedb4

