CLI: HTTP(S) or local file/dir → Readability → Markdown; optional LiteLLM summary; optional --max-tokens truncation.

ai-context-cli

HTTP(S) → clean Markdown for LLM context: httpx fetch → Readability → markdownify; optional LiteLLM summary with --summary (lazy-loaded: the LLM stack is not imported until you opt in).
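The lazy-load behavior is the standard deferred-import pattern: the heavy dependency is imported inside the code path that needs it, so plain Markdown runs never pay the cost. A minimal sketch of the pattern (using a stdlib module as a stand-in for litellm; this is not the tool's actual code):

```python
def summarize(text: str, model: str = "gpt-4o-mini") -> str:
    # Deferred import: the (stand-in) dependency loads only when this code
    # path runs, mirroring how --summary lazily loads the LLM stack.
    import statistics  # stand-in for a heavy optional dependency

    words = text.split()
    avg = statistics.mean(len(w) for w in words) if words else 0.0
    return f"{len(words)} words, avg length {avg:.1f}"
```

Commands that never call `summarize` never trigger the import, which keeps CLI startup fast.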

# Markdown only (stdout stays pipe-clean; errors → stderr)
ai-context-cli "https://example.com/article"

# Same + LLM summary (loads .env from cwd for API keys)
ai-context-cli "https://example.com/article" --summary

# Verbose pipeline logs on stderr
ai-context-cli "https://example.com/article" --summary -v
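The pipe-clean convention shown above (payload on stdout, diagnostics on stderr) is easy to honor in your own wrappers. A minimal sketch of the pattern (illustrative only, not the tool's code):

```python
import sys


def emit(markdown: str, *, verbose: bool = False) -> None:
    # Payload goes to stdout so `ai-context-cli URL | other-tool` stays clean;
    # logs go to stderr, matching the -v behavior described above.
    if verbose:
        print("pipeline: fetch -> readability -> markdownify", file=sys.stderr)
    sys.stdout.write(markdown + "\n")
```

Redirecting stdout to a file or pipe leaves the stderr diagnostics visible on the terminal.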

Architecture and conventions: SPEC.md.

⚡ Why

Raw HTML wastes tokens. ai-context-cli is a small preprocessing step: one URL in, readable Markdown (and optionally a short summary) out.

🧩 What you get

  • Readability-based main content extraction (readability-lxml)
  • Markdown via markdownify
  • Optional LLM summary via LiteLLM (--summary, --model)
  • Typed errors → stable CLI exit codes (see SPEC.md §7)

Scope today: SOURCE is an http(s) URL or a local file or directory (UTF-8 text; see SPEC §5). --structure, --max-tokens, and --version are also available.
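The typed-error → exit-code design means shell scripts can branch on the return status. The code values below are hypothetical placeholders (the real, stable codes are defined in SPEC.md §7); the sketch only illustrates the pattern:

```python
import sys


class FetchError(Exception):
    """Network/HTTP failure (hypothetical name)."""


class ExtractError(Exception):
    """Main-content extraction failure (hypothetical name)."""


# Hypothetical mapping; see SPEC.md §7 for the actual stable codes.
EXIT_CODES = {FetchError: 2, ExtractError: 3}


def run(main) -> None:
    try:
        main()
    except tuple(EXIT_CODES) as exc:
        print(f"error: {exc}", file=sys.stderr)  # errors go to stderr
        sys.exit(EXIT_CODES[type(exc)])
```

A caller can then do `ai-context-cli URL || handle-failure` and inspect `$?` for the specific code.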

Quick Start

# 1) Install
pip install ai-context-cli

# 2) Convert a page to Markdown
ai-context-cli "https://example.com/article"

# 3) Show installed version
ai-context-cli --version

# 4) Add a summary block
ai-context-cli "https://example.com/article" --summary

📦 Install

From a clone (dev):

python3 -m venv .venv
source .venv/bin/activate   # Windows: .\.venv\Scripts\Activate.ps1
pip install -U pip
pip install -e ".[dev]"
ai-context-cli --help

From PyPI: pip install ai-context-cli

Requires Python 3.11+.

🤖 LLM providers (LiteLLM)

--summary uses LiteLLM model ids. Set the provider’s native env vars (also read from a .env in the working directory when --summary is used).

Provider         Example --model                                               Credentials
OpenAI           gpt-4o-mini (default if --model omitted)                      OPENAI_API_KEY
Anthropic        anthropic/claude-3-5-sonnet-latest                            ANTHROPIC_API_KEY
OpenRouter       openrouter/openai/gpt-4o-mini (any slug OpenRouter exposes)   OPENROUTER_API_KEY
Ollama (local)   ollama/llama3                                                 Optional: OLLAMA_API_BASE (default http://localhost:11434)

ai-context-cli "https://example.com/article" --summary --model "openrouter/openai/gpt-4o-mini"
ai-context-cli "https://example.com/article" --summary --model "ollama/llama3"

See .env.example for other keys (MISTRAL_API_KEY, timeouts, etc.).
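Loading credentials from a cwd .env (as --summary does) amounts to reading KEY=VALUE lines into the environment. A stdlib-only sketch of that behavior (the tool itself may use a library such as python-dotenv; that detail, and the exact precedence rules, are assumptions):

```python
import os
from pathlib import Path


def load_dotenv(path: str = ".env") -> dict[str, str]:
    # Parse simple KEY=VALUE lines; skip blanks and comments.
    # Variables already set in the environment are not overridden.
    loaded: dict[str, str] = {}
    env_file = Path(path)
    if not env_file.exists():
        return loaded
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip().strip('"')
        loaded[key] = value
        os.environ.setdefault(key, value)
    return loaded
```

With a `.env` containing `OPENAI_API_KEY=...` next to where you run the CLI, the key becomes visible to the summary step without exporting it in your shell profile.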

🧪 Tests & lint

pytest
ruff check .
ruff format --check .
mypy src

🤝 Contributing

See CONTRIBUTING.md (layers, ports/adapters, where to put code).

📝 Changelog

Release notes are tracked in CHANGELOG.md.

📄 License

MIT — see LICENSE.
