Official Python SDK for SkillNet: Create, Evaluate, and Connect AI Skills.


skillnet-ai

The official Python SDK & CLI for SkillNet — search, install, create, evaluate, and connect AI agent skills.

License: MIT · Requires Python 3.9+

Website · GitHub · PyPI


Quick Start

pip install skillnet-ai

from skillnet_ai import SkillNetClient

client = SkillNetClient()  # No API key needed for search & download

# Find a skill
results = client.search(q="pdf", limit=5)
print(results[0].skill_name, results[0].stars)

# Install it
client.download(url=results[0].skill_url, target_dir="./my_skills")

That's it. Search and download are free and require no API key (GitHub's own rate limits may still apply to downloads).

For create, evaluate, and analyze, set API_KEY (any OpenAI-compatible key). See Configuration.


Features

| Feature | What it does |
|---|---|
| 🔍 Search | Keyword match or AI semantic search across 500+ community skills |
| 📦 Install | One-line download from any GitHub skill directory |
| Create | Auto-convert repos, PDFs, conversation logs, or text prompts → structured skill packages |
| 📊 Evaluate | Score skills on 5 dimensions: Safety · Completeness · Executability · Maintainability · Cost-Awareness |
| 🕸️ Analyze | Map similar_to · belong_to · compose_with · depend_on relationships between skills |

Python SDK

Initialize

from skillnet_ai import SkillNetClient

client = SkillNetClient(
    api_key="sk-...",          # Required for create / evaluate / analyze
    # base_url="...",          # Optional: custom LLM endpoint (default: OpenAI)
    # github_token="ghp_...",  # Optional: for private repos or higher rate limits
)

Credentials can also be set via environment variables: API_KEY, BASE_URL, GITHUB_TOKEN.

Search

# Keyword search
results = client.search(q="pdf", limit=10, min_stars=5, sort_by="stars")

# Semantic search — find skills by meaning, not just keywords
results = client.search(q="analyze financial PDF reports", mode="vector", threshold=0.85)

if results:
    print(f"{results[0].skill_name} ({results[0].stars} stars)")
    print(results[0].skill_url)

Search Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| q | str | required | Search query (keywords or natural language) |
| mode | str | "keyword" | "keyword" or "vector" |
| category | str | None | Filter by category |
| limit | int | 20 | Max results per request |
| page | int | 1 | Page number (keyword only) |
| min_stars | int | 0 | Minimum star count (keyword only) |
| sort_by | str | "stars" | "stars" or "recent" (keyword only) |
| threshold | float | 0.8 | Similarity threshold 0.0–1.0 (vector only) |
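
To make the keyword-only parameters concrete, here is a purely illustrative client-side sketch of what min_stars and sort_by do. The real filtering and ordering happen on the server, and the result objects are simplified here to plain dicts with assumed field names (skill_name, stars, updated_at):

```python
def filter_and_sort(results, min_stars=0, sort_by="stars"):
    """Mimic server-side min_stars filtering and sort_by ordering.

    results: list of dicts with "stars" and "updated_at" keys (simplified).
    """
    kept = [r for r in results if r["stars"] >= min_stars]
    # "stars" sorts by popularity; "recent" sorts by last update, newest first.
    key = "stars" if sort_by == "stars" else "updated_at"
    return sorted(kept, key=lambda r: r[key], reverse=True)

demo = [
    {"skill_name": "pdf-parser", "stars": 12, "updated_at": "2025-01-10"},
    {"skill_name": "pdf-tools", "stars": 3, "updated_at": "2025-03-02"},
]
print(filter_and_sort(demo, min_stars=5))
```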

Install

local_path = client.download(
    url="https://github.com/anthropics/skills/tree/main/skills/skill-creator",
    target_dir="./my_skills"
)
print(f"Installed at: {local_path}")

Create

Convert diverse sources into structured skill packages:

# From conversation logs / execution traces
client.create(trajectory_content="User: rename .jpg→.png\nAgent: Done.", output_dir="./skills")

# From a GitHub repository
client.create(github_url="https://github.com/zjunlp/DeepKE", output_dir="./skills")

# From office documents (PDF / PPT / Word)
client.create(office_file="./guide.pdf", output_dir="./skills")

# From a natural language description
client.create(prompt="A skill for web scraping article titles", output_dir="./skills")

All modes auto-generate a complete skill package: SKILL.md + optional scripts/, references/, assets/.

Evaluate

Score any skill on 5 quality dimensions. Accepts local paths or GitHub URLs:

result = client.evaluate(
    target="https://github.com/anthropics/skills/tree/main/skills/algorithmic-art"
)
# {
#   "safety":          {"level": "Good", "reason": "..."},
#   "completeness":    {"level": "Good", "reason": "..."},
#   "executability":   {"level": "Average", "reason": "..."},
#   "maintainability": {"level": "Good", "reason": "..."},
#   "cost_awareness":  {"level": "Good", "reason": "..."}
# }

Analyze Relationships

Discover connections between skills in a local directory:

relationships = client.analyze(skills_dir="./my_skills")

for rel in relationships:
    print(f"{rel['source']} --[{rel['type']}]--> {rel['target']}")
# PDF_Parser --[compose_with]--> Text_Summarizer
# Web_Scraper --[similar_to]--> Data_Extractor

Detects four relationship types: similar_to · belong_to · compose_with · depend_on. Results are saved to relationships.json by default.
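
Since each relationship is a flat dict with source, type, and target keys, turning the list into a per-skill adjacency map is straightforward. This helper is illustrative only, not an SDK function:

```python
from collections import defaultdict

def build_graph(relationships):
    """Group relationship edges by source skill for quick lookup."""
    graph = defaultdict(list)
    for rel in relationships:
        graph[rel["source"]].append((rel["type"], rel["target"]))
    return dict(graph)

rels = [
    {"source": "PDF_Parser", "type": "compose_with", "target": "Text_Summarizer"},
    {"source": "Web_Scraper", "type": "similar_to", "target": "Data_Extractor"},
]
print(build_graph(rels))
```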


CLI

The CLI ships automatically with pip install skillnet-ai — powered by Typer + Rich for beautiful terminal output.

skillnet <command> --help    # Full options for any command

Commands at a Glance

| Command | What it does | Example |
|---|---|---|
| search | Find skills | `skillnet search "pdf" --mode vector` |
| download | Install a skill | `skillnet download <url> -d ./skills` |
| create | Create from any source | `skillnet create log.txt -d ./skills` |
| evaluate | Quality report | `skillnet evaluate ./my_skill` |
| analyze | Relationship graph | `skillnet analyze ./my_skills` |

Search

skillnet search "pdf"
skillnet search "analyze financial reports" --mode vector --threshold 0.85
skillnet search "visualization" --category "Development" --sort-by stars --limit 10

Install

skillnet download https://github.com/anthropics/skills/tree/main/skills/algorithmic-art
skillnet download <url> -d ./my_agent/skills
skillnet download <private_url> --token <your_github_token>

# Use a mirror for faster downloads in restricted networks
skillnet download <url> --mirror https://ghfast.top/

Create

skillnet create ./logs/trajectory.txt -d ./skills          # from trajectory
skillnet create --github https://github.com/owner/repo      # from GitHub repo
skillnet create --office ./docs/guide.pdf                    # from PDF/PPT/Word
skillnet create --prompt "A skill for table extraction"      # from prompt
skillnet create --office report.pdf --model gpt-4o           # custom model

Evaluate

skillnet evaluate ./my_skills/web_search
skillnet evaluate https://github.com/anthropics/skills/tree/main/skills/algorithmic-art
skillnet evaluate ./my_skill --category "Development" --model gpt-4o

Analyze

skillnet analyze ./my_skills
skillnet analyze ./my_skills --no-save     # print only, don't write file
skillnet analyze ./my_skills --model gpt-4o

⚙️ Configuration

Environment Variables

| Variable | Required For | Default |
|---|---|---|
| API_KEY | create · evaluate · analyze | (none) |
| BASE_URL | Custom LLM endpoint | https://api.openai.com/v1 |
| GITHUB_TOKEN | Private repos / higher rate limits | (none) |
| SKILLNET_MODEL | Default LLM model for all commands | gpt-4o |
| GITHUB_MIRROR | Faster downloads in restricted networks | (none) |

search and download (public repos) require no credentials at all.

Recommended mirror: https://ghfast.top/ — set GITHUB_MIRROR or pass --mirror to speed up downloads in restricted networks.
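
Prefix-style mirrors such as ghfast.top typically proxy a GitHub URL by prepending the mirror host. Whether the SDK composes GITHUB_MIRROR exactly this way is an assumption; the sketch below only illustrates the prefixing idea:

```python
def apply_mirror(url, mirror="https://ghfast.top/"):
    """Prepend a mirror proxy to GitHub URLs (illustrative; the SDK's
    actual handling of GITHUB_MIRROR may differ)."""
    if mirror and url.startswith("https://github.com/"):
        return mirror.rstrip("/") + "/" + url
    return url  # non-GitHub URLs pass through unchanged

print(apply_mirror("https://github.com/anthropics/skills"))
# → https://ghfast.top/https://github.com/anthropics/skills
```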

Linux / macOS:

export API_KEY="sk-..."
export BASE_URL="https://..."   # optional

Windows PowerShell:

$env:API_KEY = "sk-..."
$env:BASE_URL = "https://..."   # optional

Or pass credentials directly in code:

client = SkillNetClient(api_key="sk-...", base_url="https://...")

📂 Skill Structure

Every created or downloaded skill follows a standardized layout:

skill-name/
├── SKILL.md          # [Required] YAML metadata + markdown instructions
├── scripts/          # [Optional] Executable Python / Bash scripts
├── references/       # [Optional] Static docs, API specs, schemas
└── assets/           # [Optional] Templates, icons, examples
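
The layout above is easy to sanity-check locally. The helper below is hypothetical (not part of the SDK) and simply verifies the required file and reports which optional directories are present:

```python
from pathlib import Path

REQUIRED_FILES = ["SKILL.md"]
OPTIONAL_DIRS = ["scripts", "references", "assets"]

def check_skill_layout(skill_dir):
    """Check a skill directory against the standardized layout.

    Returns validity, any missing required files, and which
    optional directories exist.
    """
    root = Path(skill_dir)
    missing = [f for f in REQUIRED_FILES if not (root / f).is_file()]
    present = [d for d in OPTIONAL_DIRS if (root / d).is_dir()]
    return {"valid": not missing, "missing": missing, "optional_present": present}
```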

🤝 Contributing

Contributions are welcome! Feel free to open an Issue or submit a Pull Request.

📄 License

MIT
