
Content consumption analyzer CLI

Project description

consumo: Content Consumption Analyzer


Demo GIF: consumo reveals it would take 21 minutes and 18 seconds to read the program's entire license at the standard 265 words per minute. (Recorded with VHS.)

Introduction

consumo is a command-line interface (CLI) built with Typer that calculates how long it takes to consume online or offline media. Use it to sort media by duration for later consumption, or to decide whether something can be viewed today or should wait for another day.

It's designed with broad support in mind. For online media, it supports:

  • Video platforms, by directly retrieving the duration of the linked video.
  • Online hosted files, by extracting the duration from their metadata.
  • Articles and text in general, by applying the Medium formula: total consumption time is computed from the text (at a customizable words per minute (WPM) rate), the image count, and the duration of any videos on the page. For further details, see: How Medium Calculates Read Time.
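To make the Medium formula concrete, here is a minimal sketch. It assumes the commonly cited rule that the first image adds 12 seconds, each subsequent image one second less, with a 3-second floor; the function name is illustrative, not consumo's actual API:

```python
def read_time_seconds(word_count: int, image_count: int = 0,
                      words_per_minute: int = 265) -> float:
    """Estimate consumption time: text at a fixed WPM, plus time per image.

    Medium counts the first image as 12 seconds, the second as 11, and so
    on, down to a floor of 3 seconds from the tenth image onward.
    """
    text_seconds = word_count / words_per_minute * 60
    image_seconds = sum(max(12 - i, 3) for i in range(image_count))
    return text_seconds + image_seconds
```

Video durations on the page would simply be added on top of this estimate.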

For offline media, multiple backends are used to calculate the reading time. Local HTML files, by design, have full feature parity with online pages.

CLI

Content Consumption Analyzer.

Usage:

$ consumo [OPTIONS] COMMAND [ARGS]...

Options:

  • --version: Print the program's version and exit.
  • --install-completion: Install completion for the current shell.
  • --show-completion: Show completion for the current shell, to copy it or customize the installation.
  • --help: Show this message and exit.

Commands:

  • file: Calculate the consumption time of files...
  • list: Calculate the consumption time of all the links in a link list file...
  • url: Calculate the consumption time of URLs...

file

Calculate the consumption time of files concurrently in a *h *m *s format.

Usage:

$ consumo file [OPTIONS] FILES...

Arguments:

  • FILES...: [required]

Options:

  • --sort / --no-sort: Sort the output by duration in ascending order. [default: no-sort]
  • --words-per-minute INTEGER: Reading speed in words per minute. [default: 265]
  • --help: Show this message and exit.
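The `*h *m *s` output format mentioned above can be sketched as follows (an illustrative helper, not consumo's actual implementation):

```python
def format_duration(total_seconds: int) -> str:
    """Render a duration in the `*h *m *s` style used in consumo's output."""
    hours, remainder = divmod(int(total_seconds), 3600)
    minutes, seconds = divmod(remainder, 60)
    return f"{hours}h {minutes}m {seconds}s"
```

For example, the 21 minutes and 18 seconds from the demo GIF correspond to 1278 total seconds.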

list

Calculate the consumption time of all the links in a link list file in a *h *m *s format.

Example: A "file with a list of links" is a plain text file that looks like this:

```text
https://en.wikipedia.org/wiki/Python_(programming_language)
https://en.wikipedia.org/wiki/High-level_programming_language
https://en.wikipedia.org/wiki/General-purpose_programming_language
https://en.wikipedia.org/wiki/Code_readability
https://en.wikipedia.org/wiki/Significant_indentation
https://en.wikipedia.org/wiki/Type_system#DYNAMIC
https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)
https://en.wikipedia.org/wiki/Programming_paradigm
https://en.wikipedia.org/wiki/Structured_programming
https://en.wikipedia.org/wiki/Procedural_programming
https://en.wikipedia.org/wiki/Object-oriented_programming
https://en.wikipedia.org/wiki/Functional_programming
...
```

Usage:

$ consumo list [OPTIONS] FILE

Arguments:

  • FILE: [required]

Options:

  • --sort / --no-sort: Sort the output by duration in ascending order. [default: no-sort]
  • --words-per-minute INTEGER: Reading speed in words per minute. [default: 265]
  • --help: Show this message and exit.
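A link-list file like the one above could be parsed, and its results sorted, roughly like this (the function names and the duration mapping are illustrative stand-ins, not consumo's internals):

```python
def parse_link_list(text: str) -> list[str]:
    """Return one URL per non-empty line of a link-list file's contents."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def sort_by_duration(durations: dict[str, int]) -> list[tuple[str, int]]:
    """Sort (url, seconds) pairs ascending by duration, as --sort does."""
    return sorted(durations.items(), key=lambda pair: pair[1])
```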

url

Calculate the consumption time of URLs concurrently in a *h *m *s format.

Usage:

$ consumo url [OPTIONS] URLS...

Arguments:

  • URLS...: [required]

Options:

  • --sort / --no-sort: Sort the output by duration in ascending order. [default: no-sort]
  • --words-per-minute INTEGER: Reading speed in words per minute. [default: 265]
  • --help: Show this message and exit.
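The "concurrently" in the command's description can be pictured with a standard-library sketch like the one below; `consumption_time` is a placeholder stand-in here, not consumo's real per-URL lookup:

```python
from concurrent.futures import ThreadPoolExecutor

def consumption_time(url: str) -> int:
    """Placeholder for the real per-URL duration lookup (seconds)."""
    return len(url)  # stand-in value; the real tool inspects the page

def analyze_urls(urls: list[str]) -> dict[str, int]:
    """Compute every URL's consumption time concurrently."""
    with ThreadPoolExecutor() as pool:
        return dict(zip(urls, pool.map(consumption_time, urls)))
```

Threads suit this kind of I/O-bound work, since each lookup mostly waits on the network.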

Context

I'm pretty unorganized. No matter how much I try to tidy things up, I always manage to make a mess somewhere else. In this case, I host a FreshRSS container on my own machine, which should ideally be my only source of online content, with everything saved there. However, after hoarding 30+ tabs on my phone with random links from the web, I decided to make a file like this on my computer:

```text
https://en.wikipedia.org/wiki/Python_(programming_language)
https://en.wikipedia.org/wiki/High-level_programming_language
https://en.wikipedia.org/wiki/General-purpose_programming_language
https://en.wikipedia.org/wiki/Code_readability
https://en.wikipedia.org/wiki/Significant_indentation
https://en.wikipedia.org/wiki/Type_system#DYNAMIC
https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)
https://en.wikipedia.org/wiki/Programming_paradigm
https://en.wikipedia.org/wiki/Structured_programming
https://en.wikipedia.org/wiki/Procedural_programming
https://en.wikipedia.org/wiki/Object-oriented_programming
https://en.wikipedia.org/wiki/Functional_programming
...
```

Repeat until you get over a hundred links (and multiple websites other than Wikipedia). Needless to say, I felt overwhelmed and thought: "LLMs can view webpages. Maybe I can give this list of links to one so it can sort them by duration for a better experience?"

I tried multiple models, but none were able to do that. Maybe something like this already exists, but I forgot to search for it. Thankfully, though, that sparked a great idea for a project: consumo!

Configuration file

consumo supports a TOML configuration file under your system's default configuration directory (on Linux, $XDG_CONFIG_HOME/consumo/config.toml). It has these default values:

```toml
[general]
sort = false
words_per_minute = 265
```

Philosophies

  • Dependency Injection.
  • Parse, don't validate[^1].
  • Test Driven Development[^2].
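As a small illustration of "parse, don't validate": rather than a boolean check that downstream code must trust, a parser returns a structured value, so invalid input can never travel further. This helper is hypothetical, not part of consumo:

```python
from urllib.parse import ParseResult, urlparse

def parse_http_url(raw: str) -> ParseResult:
    """Return a structured URL only if it is HTTP(S); raise otherwise.

    Callers receive a ParseResult, so they never re-validate the raw string.
    """
    parts = urlparse(raw)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        raise ValueError(f"not an HTTP(S) URL: {raw!r}")
    return parts
```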

[^1]: King, A. (2019) Parse, don’t validate. Alexis King’s Blog. Available at: https://lexi-lambda.github.io/blog/2019/11/05/parse-don-t-validate/ (Accessed: September 29, 2025).

[^2]: Beck, K. (2003) Test-driven development: By example. Boston: Addison-Wesley (The Addison-Wesley signature series).

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

consumo-0.2.0.tar.gz (11.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

consumo-0.2.0-py3-none-any.whl (19.5 kB)

Uploaded Python 3

File details

Details for the file consumo-0.2.0.tar.gz.

File metadata

  • Download URL: consumo-0.2.0.tar.gz
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.9.14 {"installer":{"name":"uv","version":"0.9.14","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for consumo-0.2.0.tar.gz
Algorithm Hash digest
SHA256 52ea59347dc86ed0ac9f618fd020401c71fd57be3fafd111f3685f4b16b28819
MD5 0962d13477c9fb7dc46e705e2db35f3b
BLAKE2b-256 0536d8353e62a416e4d109fabb2a1c15a680f3a7c81f460dde57d1cb4e25d9a9

See more details on using hashes here.

File details

Details for the file consumo-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: consumo-0.2.0-py3-none-any.whl
  • Size: 19.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.9.14 {"installer":{"name":"uv","version":"0.9.14","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for consumo-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 9fdc4eceaec19712784be7d515effce1b0d1e0db58e9291320997f44171fd861
MD5 13adb383192478415b40dd8f13c7c04c
BLAKE2b-256 4210867fd290388acfd3e89a5d5e9698250cb57fe6f140e2eb6a9d09c5ded856

