
LLM-ify: create LLM-ready markdown and text from websites.


LLM-ify generates llms.txt and llms-full.txt plus full markdown/text captures for single pages or entire websites, so your LLM has clean, readable context.

Install + Run

pip install llmify-cli
llmify

Run LLM-ify setup (after install):

llmify setup

Update

pip install -U llmify-cli

What is llms.txt?

llms.txt is a proposed standard for making website content more accessible to Large Language Models (LLMs). It defines two files:

  • llms.txt: A concise index of all pages with titles and descriptions
  • llms-full.txt: Complete content of all pages for comprehensive access

Features

  • Full website or single-page capture
  • Text + Markdown output
  • LLM-friendly index (llms.txt) + full corpus (llms-full.txt)
  • OpenAI or OpenRouter support
  • Interactive terminal UI

Prerequisites

  • Python 3.7+
  • OpenAI API key
  • Crawl4AI browser dependencies (run llmify setup after install)

Quick Setup (Recommended)

Other launch commands (all open the TUI):

llm-ify
llmify-cli

Run the quickstart script for your OS:

# Windows (PowerShell)
scripts\quickstart.ps1

# macOS/Linux
./scripts/quickstart.sh

This creates a venv, installs deps, runs crawl4ai-setup, and launches the TUI.

On subsequent runs, just activate your venv and run:

python main.py

Manual Setup

  1. Clone the repository:
git clone https://github.com/Chillbruhhh/LLM-ify.git
cd LLM-ify
  2. Create and activate a virtual environment:
# Windows
python -m venv venv
venv\Scripts\activate

# macOS/Linux
python3 -m venv venv
source venv/bin/activate
  3. Install dependencies:
pip install -r requirements.txt
  4. Run the setup command:
crawl4ai-setup

This installs browser dependencies needed for Crawl4AI.

  5. Set up your OpenAI API key:

    Option A: Using .env file (recommended)

    cp .env.example .env
    # Edit .env and configure:
    # - Add OPENAI_API_KEY (required)
    

    Option B: Using environment variables

    export OPENAI_API_KEY="your-openai-api-key"
    

    Option C: Entering the key directly in the TUI's input fields
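As a rough sketch of how Options A and B resolve, a tool might check the environment first and fall back to a minimal .env parse. This is a hypothetical helper, not LLM-ify's actual code (real projects often use python-dotenv instead):

```python
import os

def load_api_key(env_path=".env"):
    """Return OPENAI_API_KEY from the environment, else from a .env file."""
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        return key
    # Minimal KEY=value fallback parse of a .env file (Option A).
    if os.path.exists(env_path):
        with open(env_path) as f:
            for line in f:
                line = line.strip()
                if line.startswith("OPENAI_API_KEY="):
                    return line.split("=", 1)[1].strip().strip('"')
    raise RuntimeError("OPENAI_API_KEY not set; see Options A/B above")
```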

OpenRouter (Optional)

LLM-ify can also use OpenRouter. Set the key and choose the provider in the TUI settings:

OPENROUTER_API_KEY="your-openrouter-api-key"

Ollama (Optional)

LLM-ify can use a local Ollama server. Select ollama in the TUI and set the model name (for example llama3.1:8b). Ollama runs at http://localhost:11434/v1 by default.
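For illustration, a chat request against Ollama's OpenAI-compatible endpoint looks roughly like this. The helper is hypothetical; the base URL is Ollama's default mentioned above, and the model name is just the example:

```python
import json

OLLAMA_BASE_URL = "http://localhost:11434/v1"

def chat_request(model, prompt):
    """Build the URL and JSON body for an OpenAI-style chat completion."""
    return {
        "url": f"{OLLAMA_BASE_URL}/chat/completions",
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = chat_request("llama3.1:8b", "Summarize this page.")
print(req["url"])
```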

Usage (TUI)

Launch the terminal UI:

python main.py

Enter a URL, choose a mode (full website or single page), and run. Settings are saved in config.json automatically.

Model Provider + Model Name

Choose the provider in Settings, then set the model name for that provider:

  • OpenAI default: gpt-4.1-nano
  • OpenRouter default: openai/gpt-4.1-nano
  • Ollama: set your local model name (for example llama3.1:8b)
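The selection logic above can be sketched as a small resolver. This is hypothetical code, not LLM-ify's implementation; the defaults are only the names listed in this README:

```python
# Defaults taken from the provider list above.
DEFAULT_MODELS = {
    "openai": "gpt-4.1-nano",
    "openrouter": "openai/gpt-4.1-nano",
}

def resolve_model(provider, model=None):
    """Return the explicit model if given, else the provider default."""
    if model:
        return model
    if provider == "ollama":
        # Ollama has no default here: a local model name must be set.
        raise ValueError("Set a local model name, e.g. llama3.1:8b")
    return DEFAULT_MODELS[provider]

print(resolve_model("openrouter"))  # openai/gpt-4.1-nano
```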

Output Format

llms.txt

# https://example.com llms.txt

- [Page Title](https://example.com/page1): Brief description of the page content here
- [Another Page](https://example.com/page2): Another concise description of page content
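Since each index entry follows the `- [Title](url): description` pattern shown above, downstream tools can parse it with a simple regex. A hypothetical consumer sketch:

```python
import re

# Matches "- [Title](url): description" lines from an llms.txt index.
LINE_RE = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\):\s*(?P<desc>.+)$")

def parse_llms_txt(text):
    """Return one dict per index entry, skipping headers and blank lines."""
    entries = []
    for line in text.splitlines():
        m = LINE_RE.match(line.strip())
        if m:
            entries.append(m.groupdict())
    return entries

sample = """# https://example.com llms.txt

- [Page Title](https://example.com/page1): Brief description of the page content here
- [Another Page](https://example.com/page2): Another concise description of page content
"""
entries = parse_llms_txt(sample)
print(len(entries))  # 2
```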

llms-full.txt

# https://example.com llms-full.txt

<|llm-ify-page-1-lllmstxt|>
## Page Title
Full markdown content of the page...

<|llm-ify-page-2-lllmstxt|>
## Another Page
Full markdown content of another page...
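The `<|llm-ify-page-N-lllmstxt|>` delimiters make the full corpus easy to split back into per-page chunks. A hypothetical consumer sketch, assuming only the format shown above:

```python
import re

# Split on the per-page delimiters used in llms-full.txt.
PAGE_DELIM = re.compile(r"<\|llm-ify-page-\d+-lllmstxt\|>")

def split_pages(text):
    parts = PAGE_DELIM.split(text)
    # parts[0] is the "# <site> llms-full.txt" header; the rest are pages.
    return [p.strip() for p in parts[1:]]

sample = """# https://example.com llms-full.txt

<|llm-ify-page-1-lllmstxt|>
## Page Title
Full markdown content of the page...

<|llm-ify-page-2-lllmstxt|>
## Another Page
Full markdown content of another page...
"""
pages = split_pages(sample)
print(len(pages))  # 2
```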

Output Locations

Output files are written under collected-texts/llmify-<domain>/ by default. Example:

collected-texts/llmify-docs.example.com/GLOSSARY.md
collected-texts/llmify-docs.example.com/docs/<page-title>.md
collected-texts/llmify-docs.example.com/llms-files/llms.md
collected-texts/llmify-docs.example.com/llms-files/llms-full.md
collected-texts/llmify-docs.example.com/seeds.json
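The `collected-texts/llmify-<domain>/` layout can be derived from the crawled URL's host. A hypothetical sketch of that mapping (the path convention is taken from the example above):

```python
from pathlib import Path
from urllib.parse import urlparse

def output_root(url, base="collected-texts"):
    """Map a crawled URL to its default output directory."""
    domain = urlparse(url).netloc
    return Path(base) / f"llmify-{domain}"

print(output_root("https://docs.example.com/guide").as_posix())
# collected-texts/llmify-docs.example.com
```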

Agent Instructions

See INSTRUCTIONS.md for guidance on how LLM agents should navigate the generated documentation and glossary.

Contributing

See CONTRIBUTING.md for setup, workflow, and PR guidelines.

Changelog

See CHANGELOG.md for release notes.

License

PolyForm Noncommercial - see LICENSE for details.
