llmstxt2skill

Convert llms.txt files to Claude Code Skill format.

Installation

pip install llmstxt2skill

For development:

git clone https://github.com/akuwano/llmstxt2skill.git
cd llmstxt2skill
uv pip install -e ".[dev]"

Usage

# Basic usage
llmstxt2skill https://docs.databricks.com/llms.txt

# Preview without writing (dry-run)
llmstxt2skill https://docs.databricks.com/llms.txt --dry-run

# Custom skill name
llmstxt2skill https://docs.databricks.com/llms.txt --name databricks

# Custom output directory
llmstxt2skill https://docs.databricks.com/llms.txt -o ./skills

# Overwrite existing skill
llmstxt2skill https://docs.databricks.com/llms.txt --force

# Check version
llmstxt2skill --version

Options

Option         Description
-v, --version  Show version and exit
--name         Custom skill name (defaults to kebab-case of title)
-o, --output   Output directory (defaults to ~/.claude/skills)
--dry-run      Preview output without writing files
--force        Overwrite existing skill
--enrich       Use an LLM to generate an enriched skill with translation and structuring
--provider     LLM provider: databricks, openai, anthropic, openai-compatible (default: databricks)
--lang         Target language for enriched skill (default: ja)
--model        Model identifier (defaults to the provider's default model)
--max-tokens   Maximum tokens for LLM output (default: 16000)
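
The default skill name used when --name is omitted is the kebab-cased llms.txt title. A minimal sketch of that convention (illustrative only, not the package's actual code):

```shell
# Hypothetical kebab-casing of an llms.txt title (not the tool's real implementation)
title="Databricks Documentation"
name=$(printf '%s' "$title" \
  | tr '[:upper:]' '[:lower:]' \
  | tr -cs 'a-z0-9' '-' \
  | sed 's/^-//; s/-$//')
echo "$name"   # databricks-documentation
```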

LLM Enrichment

Use --enrich to generate high-quality skills with:

  • Localized content (Japanese by default)
  • Translated and structured documentation links (all links preserved)
  • Trigger conditions for when to use the skill
  • Capabilities and limitations
  • Usage instructions

Note: The --enrich option calls external LLM APIs, which may incur costs depending on your provider and usage.

Supported Providers

Provider           Environment Variables                        Default Model
databricks         DATABRICKS_HOST, DATABRICKS_TOKEN            databricks-gemini-3-pro
openai             OPENAI_API_KEY                               gpt-4o-mini
anthropic          ANTHROPIC_API_KEY                            claude-3-5-sonnet-20241022
openai-compatible  OPENAI_BASE_URL, OPENAI_API_KEY (optional)   default

Note: The openai-compatible provider is intended for local LLM servers (vLLM, Ollama, llama.cpp) and uses HTTP by default. For production use with remote endpoints, ensure HTTPS is configured.
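
Before an --enrich run it can help to fail fast when a provider's variables are missing. A small helper sketch (require_env and the placeholder credential values are invented here, not part of llmstxt2skill):

```shell
# Hypothetical pre-flight check for provider credentials (not part of llmstxt2skill)
require_env() {
  for var in "$@"; do
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "error: $var is not set" >&2
      return 1
    fi
  done
}

# Example: the databricks provider needs both variables (placeholder values shown)
DATABRICKS_HOST="https://example.cloud.databricks.com"
DATABRICKS_TOKEN="dapi-example"
require_env DATABRICKS_HOST DATABRICKS_TOKEN && echo "credentials present"
```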

Examples

# Databricks (default provider)
export DATABRICKS_HOST="https://your-workspace.cloud.databricks.com"
export DATABRICKS_TOKEN="your-token"
llmstxt2skill https://docs.databricks.com/llms.txt --enrich

# OpenAI
export OPENAI_API_KEY="sk-..."
llmstxt2skill https://example.com/llms.txt --enrich --provider openai

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
llmstxt2skill https://example.com/llms.txt --enrich --provider anthropic

# Local LLM (vLLM, Ollama, llama.cpp server)
export OPENAI_BASE_URL="http://localhost:8000"
llmstxt2skill https://example.com/llms.txt --enrich --provider openai-compatible --model llama3

# Specify model explicitly
llmstxt2skill https://example.com/llms.txt --enrich --provider openai --model gpt-4o

# English output
llmstxt2skill https://example.com/llms.txt --enrich --lang en

# Increase max tokens for large llms.txt files
llmstxt2skill https://docs.databricks.com/llms.txt --enrich --max-tokens 20000

Output

By default, skills are written to ~/.claude/skills/{skill-name}/SKILL.md.

Use -o or --output to specify a custom output directory:

llmstxt2skill https://example.com/llms.txt -o ./my-skills
# Output: ./my-skills/{skill-name}/SKILL.md

Example

Input (llms.txt):

# Databricks Documentation

> Comprehensive documentation for the Databricks platform.

## Overview
- [Main docs](https://docs.databricks.com/) - How-to guides

Output (SKILL.md):

---
name: databricks-documentation
description: Comprehensive documentation for the Databricks platform.
---

# Databricks Documentation

Comprehensive documentation for the Databricks platform.

## Overview
- [Main docs](https://docs.databricks.com/) - How-to guides
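
The mapping shown above (title becomes the name and heading, the blockquote becomes the description, links are preserved) can be sketched in a few lines of shell. This illustrates the transformation only; the package's real parser handles more cases:

```shell
# Rough sketch: derive SKILL.md frontmatter from an llms.txt header (illustrative only)
cat > /tmp/llms-example.txt <<'EOF'
# Databricks Documentation

> Comprehensive documentation for the Databricks platform.
EOF

title=$(sed -n 's/^# //p' /tmp/llms-example.txt | head -n 1)
summary=$(sed -n 's/^> //p' /tmp/llms-example.txt | head -n 1)
name=$(printf '%s' "$title" | tr '[:upper:]' '[:lower:]' | tr ' ' '-')

printf -- '---\nname: %s\ndescription: %s\n---\n# %s\n' "$name" "$summary" "$title"
```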

Development

# Install dev dependencies
uv pip install -e ".[dev]"

# Run tests
uv run pytest tests/ -v

# Run linter
uv run ruff check src/ tests/

License

Apache License 2.0 - see LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

llmstxt2skill-0.2.4.tar.gz (15.8 kB)

Built Distribution

llmstxt2skill-0.2.4-py3-none-any.whl (22.8 kB)

File details

Details for the file llmstxt2skill-0.2.4.tar.gz.

File metadata

  • Download URL: llmstxt2skill-0.2.4.tar.gz
  • Size: 15.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.1

File hashes

Hashes for llmstxt2skill-0.2.4.tar.gz
Algorithm Hash digest
SHA256 c83cb0e92a86d0912e1cae5acdca556e3805730dc06fe41a9867799f19bc0bb4
MD5 8e2496c21a00fc665f0571e44e2b0728
BLAKE2b-256 67ea4905337cb5abdf5c4cbbdb4514af2fed4598f4a7c8a213db75af1fe8ace2

File details

Details for the file llmstxt2skill-0.2.4-py3-none-any.whl.

File metadata

  • Download URL: llmstxt2skill-0.2.4-py3-none-any.whl
  • Size: 22.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.1

File hashes

Hashes for llmstxt2skill-0.2.4-py3-none-any.whl
Algorithm Hash digest
SHA256 7d63d30c000080ff7dbf2fc41aebf430daa8c2e0e4495ebfdfe3be4635b2f584
MD5 3957cd112c53dea342afd6c718e5eb99
BLAKE2b-256 6e95c31344349fce5f974a4f5e8fb882d6d257a1e005f6280e8fb43e763ae353
