
LLM-ify: create LLM-ready markdown and text from websites.


LLM-ify turns any documentation website into local, readable docs in seconds. It also generates llms.txt and llms-full.txt, plus full markdown/text captures of single pages or entire sites, so your LLM has clean, structured context.

Install + Run (Recommended)

pip install llmify-cli
llmify

Run Crawl4AI setup (after install):

llmify setup

Update

pip install -U llmify-cli

What is llms.txt?

llms.txt is a standardized format for making website content more accessible to Large Language Models (LLMs). It provides:

  • llms.txt: A concise index of all pages with titles and descriptions
  • llms-full.txt: Complete content of all pages for comprehensive access

Features

  • Turn documentation websites into local, searchable markdown
  • Full website or single-page capture
  • LLM-friendly index (llms.txt) + full corpus (llms-full.txt)
  • Per-page doc files + glossary for fast navigation
  • OpenAI, OpenRouter, or local Ollama support
  • Interactive terminal UI

Prerequisites

  • Python 3.7+
  • OpenAI API key
  • Crawl4AI browser dependencies (run llmify setup after install)

Developer Quick Run

git clone https://github.com/Chillbruhhh/LLM-ify.git
cd LLM-ify
python -m venv venv
venv\Scripts\activate  # Windows
# or: source venv/bin/activate (macOS/Linux)
pip install -r requirements.txt
crawl4ai-setup
python main.py

Build packages (wheel + sdist):

python -m build

API Key Setup

Set up your OpenAI API key:

Option A: Using .env file (recommended)

cp .env.example .env
# Edit .env and configure:
# - Add OPENAI_API_KEY (required)

Option B: Using environment variables

export OPENAI_API_KEY="your-openai-api-key"

Option C: Entering keys directly in the TUI's input fields

OpenRouter (Optional)

LLM-ify can also use OpenRouter. Set the key and choose the provider in the TUI settings:

OPENROUTER_API_KEY="your-openrouter-api-key"

Ollama (Optional)

LLM-ify can use a local Ollama server. Select ollama in the TUI and set the model name (for example llama3.1:8b). Ollama runs at http://localhost:11434/v1 by default.

Usage (TUI)

Launch the terminal UI:

python main.py

Enter a URL, choose a mode (full website or single page), and run. Settings are saved in config.json automatically.

Model Provider + Model Name

Choose the provider in Settings, then set the model name for that provider:

  • OpenAI default: gpt-4.1-nano
  • OpenRouter default: openai/gpt-4.1-nano
  • Ollama: set your local model name (for example llama3.1:8b)

Output Format

llms.txt

# https://example.com llms.txt

- [Page Title](https://example.com/page1): Brief description of the page content here
- [Another Page](https://example.com/page2): Another concise description of page content

llms-full.txt

# https://example.com llms-full.txt

<|llm-ify-page-1-lllmstxt|>
## Page Title
Full markdown content of the page...

<|llm-ify-page-2-lllmstxt|>
## Another Page
Full markdown content of another page...
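
The corpus above delimits pages with `<|llm-ify-page-N-lllmstxt|>` markers. As a sketch (marker format taken from the example above, not from the tool's source), you could split the file into per-page chunks like this:

```python
import re

# Page delimiters look like <|llm-ify-page-1-lllmstxt|> in the example output
MARKER_RE = re.compile(r"<\|llm-ify-page-\d+-lllmstxt\|>")

def split_pages(full_text):
    """Split an llms-full.txt string into per-page markdown chunks."""
    parts = MARKER_RE.split(full_text)
    # parts[0] is whatever precedes the first marker (the file header); skip it
    return [p.strip() for p in parts[1:]]

sample = """# https://example.com llms-full.txt

<|llm-ify-page-1-lllmstxt|>
## Page Title
Full markdown content of the page...

<|llm-ify-page-2-lllmstxt|>
## Another Page
Full markdown content of another page...
"""
print(len(split_pages(sample)))  # 2
```

Chunking on these markers keeps each page intact, which is handy when embedding the corpus or paging it into a context window.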

Output Locations

Output files are written under collected-texts/llmify-<domain>/ by default. Example:

collected-texts/llmify-docs.example.com/GLOSSARY.md
collected-texts/llmify-docs.example.com/docs/<page-title>.md
collected-texts/llmify-docs.example.com/llms-files/llms.md
collected-texts/llmify-docs.example.com/llms-files/llms-full.md
collected-texts/llmify-docs.example.com/seeds.json

Agent Instructions

See INSTRUCTIONS.md for guidance on how LLM agents should navigate the generated documentation and glossary.

Contributing

See CONTRIBUTING.md for setup, workflow, and PR guidelines.

Changelog

See CHANGELOG.md for release notes.

License

PolyForm Noncommercial - see LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

llmify_cli-0.1.6.tar.gz (28.7 kB)


Built Distribution


llmify_cli-0.1.6-py3-none-any.whl (27.6 kB)


File details

Details for the file llmify_cli-0.1.6.tar.gz.

File metadata

  • Download URL: llmify_cli-0.1.6.tar.gz
  • Size: 28.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for llmify_cli-0.1.6.tar.gz:

  • SHA256: 954caa04432616aba2cd5d8b07d0c034957525d169ce7b2f5ef2c933adad8d28
  • MD5: d19c2398b5546eecde62faeaa8dc5c8d
  • BLAKE2b-256: 1b3b6f6e68d910abe40d114eea268cb9992d6aec3997ce33a474a23035080a6f


File details

Details for the file llmify_cli-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: llmify_cli-0.1.6-py3-none-any.whl
  • Size: 27.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for llmify_cli-0.1.6-py3-none-any.whl:

  • SHA256: cb50ad3b1b39fddf9ec1fc00ae9fb0ac2d484127d9cd5e94d63593ddf65ddff8
  • MD5: cddea2dfd6b9152ae91372063d5fb0fc
  • BLAKE2b-256: c16f7896cb88bf642a84ee38b1a612611d43ca00091b604559453626609ece12

