A command-line interface for interacting with various Large Language Models, with streaming markdown output

Project description

StreamLM


A command-line interface for interacting with various Large Language Models, with beautifully formatted markdown responses.

Installation

uv (recommended)

uv tool install streamlm

PyPI

pip install streamlm

Homebrew (macOS/Linux)

brew install jeffmylife/streamlm/streamlm

Usage

After installation, you can use the lm command:

lm explain quantum computing
lm -m gpt-4o "write a Python function"
lm -m claude-3-5-sonnet "analyze this data"

Raw Markdown Output

StreamLM includes beautiful built-in markdown formatting, but you can also output raw markdown for piping to other tools:

# Output raw markdown without Rich formatting
lm --md "explain machine learning" > output.md

# Pipe to your favorite markdown formatter (like glow)
lm --md "write a Python tutorial" | glow

# Use with other markdown tools
lm --raw "create documentation" | pandoc -f markdown -t html
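
Raw output also lends itself to scripting. The sketch below shows one way to capture `lm --md` output from Python via the standard library's subprocess module; the helper names are our own, and it assumes `lm` is on your PATH:

```python
import subprocess

def build_cmd(prompt, model=None):
    """Assemble an `lm --md` invocation; the model flag is optional."""
    cmd = ["lm"]
    if model:
        cmd += ["-m", model]
    return cmd + ["--md", prompt]

def capture_markdown(prompt, model=None):
    """Run lm and return its raw markdown output as a string."""
    result = subprocess.run(
        build_cmd(prompt, model),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

From there the returned string can be written to a file or handed to any markdown library, just as the shell pipelines above do with glow and pandoc.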

Supported Models

StreamLM provides access to various Large Language Models including:

  • OpenAI: GPT-4o, o1, o3-mini, GPT-4o-mini
  • Anthropic: Claude-3-7-sonnet, Claude-3-5-sonnet, Claude-3-5-haiku
  • Google: Gemini-2.5-flash, Gemini-2.5-pro, Gemini-2.0-flash-thinking
  • DeepSeek: DeepSeek-R1, DeepSeek-V3
  • xAI: Grok-4, Grok-3-beta, Grok-3-mini-beta
  • Local models: Via Ollama (Llama3.3, Qwen2.5, DeepSeek-Coder, etc.)

Options

  • --model / -m: Choose the LLM model
  • --image / -i: Include image files for vision models
  • --context / -c: Add context from a file
  • --max-tokens / -t: Set maximum response length
  • --temperature / -temp: Control response creativity (0.0-1.0)
  • --think: Show reasoning process (for reasoning models)
  • --debug / -d: Enable debug mode
  • --raw / --md: Output raw markdown without Rich formatting
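
These flags can be combined in a single invocation. A sketch of two common combinations (the file names and model identifiers here are placeholders; check `lm --help` for the exact spellings your installed version accepts):

```shell
# Vision model with an image input, a context file, and a capped response length
lm -m gpt-4o -i chart.png -c notes.txt -t 500 "summarize this chart against my notes"

# Reasoning model with its thinking process shown
lm -m deepseek-r1 --think "prove that the square root of 2 is irrational"
```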

Features

  • 🎨 Beautiful markdown-formatted responses
  • 🖼️ Image input support for compatible models
  • 📁 Context file support
  • 🧠 Reasoning model support (DeepSeek, OpenAI o1, etc.)
  • 🔧 Extensive model support across providers
  • ⚡ Fast and lightweight
  • 🛠️ Easy configuration

License

MIT License - see LICENSE file for details.

Development

Setup

# Clone the repository
git clone https://github.com/jeffmylife/streamlm.git
cd streamlm

# Install with dev dependencies
uv pip install -e ".[dev]"

Running Tests

All tests use uv run for consistency:

# Run all tests
uv run pytest tests/ -v

# Run with coverage
uv run pytest tests/ -v --cov=src --cov-report=term-missing

# Run specific test types
uv run pytest tests/test_cli.py -v                    # Unit tests only
uv run pytest tests/test_integration.py -v            # Integration tests only

Release Process

# Make your changes
uv version --bump patch
git add .
git commit -m "feat: your changes"
git push

# Create GitHub release (this triggers everything automatically)
gh release create v0.1.11 --generate-notes

Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

streamlm-0.1.11.tar.gz (147.7 kB)

Built Distribution

If you're not sure about the file name format, see the Python packaging specifications on wheel file names.

streamlm-0.1.11-py3-none-any.whl (21.1 kB)

File details

Details for the file streamlm-0.1.11.tar.gz.

File metadata

  • Download URL: streamlm-0.1.11.tar.gz
  • Upload date:
  • Size: 147.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for streamlm-0.1.11.tar.gz:

  • SHA256: 663b558b173e7c8a001af14e313a8b5f490c1bca61461fd43838dcb7a2ffbef5
  • MD5: a9565e924343ac731bd852b4eee94872
  • BLAKE2b-256: c2e3602a17ff193c4ce534bcd0b26985a2a8bed603aa593e39a1b98ce75a68ed

See the pip documentation on hash-checking mode for more details on using hashes.
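
A downloaded archive can be checked against the published digest with the standard library alone. A minimal sketch (the function name is our own):

```python
import hashlib

def verify_sha256(path, expected_hex):
    """Return True if the file's SHA256 digest matches the published hex value."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large archives don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.lower()
```

For example, `verify_sha256("streamlm-0.1.11.tar.gz", "663b558b…")` with the full SHA256 value above should return True for an intact download.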

Provenance

The following attestation bundles were made for streamlm-0.1.11.tar.gz:

Publisher: publish.yml on jeffmylife/streamlm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file streamlm-0.1.11-py3-none-any.whl.

File metadata

  • Download URL: streamlm-0.1.11-py3-none-any.whl
  • Upload date:
  • Size: 21.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for streamlm-0.1.11-py3-none-any.whl:

  • SHA256: 05db3463fdc3a1c1acfe72d2271dbabe5989443e98bf18138de38ab8dae1fe42
  • MD5: 915ba80526ee85b456fd4fbbcd9645ea
  • BLAKE2b-256: fe0f4a66765b3a24e1e4673e8ba3eb2b632350f94c6f17ba3fdb39334e2a2f8a

See the pip documentation on hash-checking mode for more details on using hashes.

Provenance

The following attestation bundles were made for streamlm-0.1.11-py3-none-any.whl:

Publisher: publish.yml on jeffmylife/streamlm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
