# llmstxt2skill

Convert llms.txt files to Claude Code Skill format.
## Installation

```bash
pip install llmstxt2skill
```

For development:

```bash
git clone https://github.com/akuwano/llmstxt2skill.git
cd llmstxt2skill
uv pip install -e ".[dev]"
```
## Usage

```bash
# Basic usage
llmstxt2skill https://docs.databricks.com/llms.txt

# Preview without writing (dry-run)
llmstxt2skill https://docs.databricks.com/llms.txt --dry-run

# Custom skill name
llmstxt2skill https://docs.databricks.com/llms.txt --name databricks

# Custom output directory
llmstxt2skill https://docs.databricks.com/llms.txt -o ./skills

# Overwrite existing skill
llmstxt2skill https://docs.databricks.com/llms.txt --force
```
## Options

| Option | Description |
|---|---|
| `--name` | Custom skill name (defaults to kebab-case of the title) |
| `-o, --output` | Output directory (defaults to `~/.claude/skills`) |
| `--dry-run` | Preview output without writing files |
| `--force` | Overwrite an existing skill |
| `--enrich` | Use an LLM to generate an enriched skill with curation and localization |
| `--provider` | LLM provider: `databricks`, `openai`, `anthropic`, `openai-compatible` (default: `databricks`) |
| `--lang` | Target language for the enriched skill (default: `ja`) |
| `--model` | Model identifier (defaults to the provider's default model) |
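The default skill name ("kebab-case of the title") can be sketched as a small helper. This is an illustrative reimplementation of the documented behavior, not the tool's actual code; the function name `kebab_case` is hypothetical:

```python
import re

def kebab_case(title: str) -> str:
    """Lowercase the title and join alphanumeric word runs with hyphens."""
    words = re.findall(r"[A-Za-z0-9]+", title)
    return "-".join(w.lower() for w in words)

print(kebab_case("Databricks Documentation"))  # databricks-documentation
```

Punctuation and extra whitespace are dropped, so any title yields a filesystem-safe directory name.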
## LLM Enrichment

Use `--enrich` to generate high-quality skills with:

- Localized content (Japanese by default)
- Curated and categorized links
- Trigger conditions for when to use the skill
- Capabilities and limitations
- Usage instructions
Note: The `--enrich` option calls external LLM APIs, which may incur costs depending on your provider and usage.
### Supported Providers

| Provider | Environment Variables | Default Model |
|---|---|---|
| `databricks` | `DATABRICKS_HOST`, `DATABRICKS_TOKEN` | `databricks-gemini-3-pro` |
| `openai` | `OPENAI_API_KEY` | `gpt-4o-mini` |
| `anthropic` | `ANTHROPIC_API_KEY` | `claude-3-5-sonnet-20241022` |
| `openai-compatible` | `OPENAI_BASE_URL`, `OPENAI_API_KEY` (optional) | `default` |
Note: The `openai-compatible` provider is intended for local LLM servers (vLLM, Ollama, llama.cpp) and uses HTTP by default. For production use with remote endpoints, ensure HTTPS is configured.
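The per-provider credential requirements in the table above can be expressed as a small lookup. This is a hypothetical sketch of how a preflight check might work, not the package's actual validation logic:

```python
import os

# Required environment variables per provider, mirroring the table above.
# OPENAI_API_KEY is optional for openai-compatible, so it is not listed there.
REQUIRED_ENV = {
    "databricks": ["DATABRICKS_HOST", "DATABRICKS_TOKEN"],
    "openai": ["OPENAI_API_KEY"],
    "anthropic": ["ANTHROPIC_API_KEY"],
    "openai-compatible": ["OPENAI_BASE_URL"],
}

def missing_env(provider: str) -> list[str]:
    """Return the required variables that are unset for the given provider."""
    return [v for v in REQUIRED_ENV.get(provider, []) if not os.environ.get(v)]
```

Running such a check before any API call gives a clear error message instead of a failed HTTP request.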
### Examples

```bash
# Databricks (default provider)
export DATABRICKS_HOST="https://your-workspace.cloud.databricks.com"
export DATABRICKS_TOKEN="your-token"
llmstxt2skill https://docs.databricks.com/llms.txt --enrich

# OpenAI
export OPENAI_API_KEY="sk-..."
llmstxt2skill https://example.com/llms.txt --enrich --provider openai

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
llmstxt2skill https://example.com/llms.txt --enrich --provider anthropic

# Local LLM (vLLM, Ollama, llama.cpp server)
export OPENAI_BASE_URL="http://localhost:8000"
llmstxt2skill https://example.com/llms.txt --enrich --provider openai-compatible --model llama3

# Specify model explicitly
llmstxt2skill https://example.com/llms.txt --enrich --provider openai --model gpt-4o

# English output
llmstxt2skill https://example.com/llms.txt --enrich --lang en
```
## Output

By default, skills are written to `~/.claude/skills/{skill-name}/SKILL.md`.

Use `-o` or `--output` to specify a custom output directory:

```bash
llmstxt2skill https://example.com/llms.txt -o ./my-skills
# Output: ./my-skills/{skill-name}/SKILL.md
```
## Example

Input (`llms.txt`):

```markdown
# Databricks Documentation

> Comprehensive documentation for the Databricks platform.

## Overview

- [Main docs](https://docs.databricks.com/) - How-to guides
```

Output (`SKILL.md`):

```markdown
---
name: databricks-documentation
description: Comprehensive documentation for the Databricks platform.
---

# Databricks Documentation

Comprehensive documentation for the Databricks platform.

## Overview

- [Main docs](https://docs.databricks.com/) - How-to guides
```
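The transformation shown above can be sketched in a few lines: pull the H1 title and blockquote summary from the llms.txt file, then emit YAML frontmatter ahead of the content. This is a simplified illustration of the conversion, not the package's actual code; `llmstxt_to_skill` is a hypothetical name:

```python
import re

def llmstxt_to_skill(text: str) -> str:
    """Derive SKILL.md frontmatter from an llms.txt title and summary."""
    title_m = re.search(r"^# (.+)$", text, re.M)     # H1 title line
    summary_m = re.search(r"^> (.+)$", text, re.M)   # blockquote summary line
    title = title_m.group(1).strip() if title_m else "untitled"
    summary = summary_m.group(1).strip() if summary_m else ""
    name = "-".join(re.findall(r"[a-z0-9]+", title.lower()))  # kebab-case name
    return f"---\nname: {name}\ndescription: {summary}\n---\n\n{text}"
```

The real tool additionally rewrites the body (e.g. the summary blockquote becomes a plain paragraph), which is omitted here for brevity.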
## Development

```bash
# Install dev dependencies
uv pip install -e ".[dev]"

# Run tests
uv run pytest tests/ -v

# Run linter
uv run ruff check src/ tests/
```
## License

Apache License 2.0 - see LICENSE for details.
## Download files
### Source Distribution

Details for the file `llmstxt2skill-0.2.1.tar.gz`.

- Download URL: llmstxt2skill-0.2.1.tar.gz
- Size: 15.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.1

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4d94a71407cfadeececa5d13b4c10e597595cd36cc73e28231102c235c8cc201` |
| MD5 | `530abd415bcdbdc213a241159fad1dfd` |
| BLAKE2b-256 | `0cb7324b48f2499fd9e536c23aafa2067e8ccb693742cda7ec1432d2fef11455` |
### Built Distribution

Details for the file `llmstxt2skill-0.2.1-py3-none-any.whl`.

- Download URL: llmstxt2skill-0.2.1-py3-none-any.whl
- Size: 22.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.1

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `12f4211a8fb2e9da1953383f13a03b4519aa8aa4d1a583519fc101ce5d68ad44` |
| MD5 | `4378a261fa4cffe69b1f86ea3d86f27a` |
| BLAKE2b-256 | `34cfe181d0280d5270e836f6d5f2182b6641de924fd5734bb7b5a3359d701f4f` |