
skillnet-ai

The official Python SDK & CLI for SkillNet — search, install, create, evaluate, and connect AI agent skills.

License: MIT · Python 3.9+

Website · GitHub · PyPI


Quick Start

pip install skillnet-ai

from skillnet_ai import SkillNetClient

client = SkillNetClient()  # No API key needed for search & download

# Find a skill
results = client.search(q="pdf", limit=5)
print(results[0].skill_name, results[0].stars)

# Install it
client.download(url=results[0].skill_url, target_dir="./my_skills")

That's it. Search and download are free and need no API key.

For create, evaluate, and analyze, set API_KEY (any OpenAI-compatible key). See Configuration.


Features

| Feature | What it does |
| --- | --- |
| 🔍 Search | Keyword match or AI semantic search across 500+ community skills |
| 📦 Install | One-line download from any GitHub skill directory |
| Create | Auto-convert repos, PDFs, conversation logs, or text prompts → structured skill packages |
| 📊 Evaluate | Score skills on 5 dimensions: Safety · Completeness · Executability · Maintainability · Cost-Awareness |
| 🕸️ Analyze | Map similar_to · belong_to · compose_with · depend_on relationships between skills |

Python SDK

Initialize

from skillnet_ai import SkillNetClient

client = SkillNetClient(
    api_key="sk-...",          # Required for create / evaluate / analyze
    # base_url="...",          # Optional: custom LLM endpoint (default: OpenAI)
    # github_token="ghp-...",  # Optional: for private repos or higher rate limits
)

Credentials can also be set via environment variables: API_KEY, BASE_URL, GITHUB_TOKEN.

Search

# Keyword search
results = client.search(q="pdf", limit=10, min_stars=5, sort_by="stars")

# Semantic search — find skills by meaning, not just keywords
results = client.search(q="analyze financial PDF reports", mode="vector", threshold=0.85)

if results:
    print(results[0].skill_name, results[0].stars)
    print(results[0].skill_url)

Search Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| q | str | required | Search query (keywords or natural language) |
| mode | str | "keyword" | "keyword" or "vector" |
| category | str | None | Filter by category |
| limit | int | 20 | Max results per request |
| page | int | 1 | Page number (keyword only) |
| min_stars | int | 0 | Minimum star count (keyword only) |
| sort_by | str | "stars" | "stars" or "recent" (keyword only) |
| threshold | float | 0.8 | Similarity threshold 0.0–1.0 (vector only) |
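Because keyword search is paginated, collecting every hit takes a short loop over pages. A minimal sketch under the parameter table above; `search_fn` stands for any callable with the `q` / `page` / `limit` signature (e.g. `client.search`), and the stub below is a stand-in, not the real client:

```python
def iter_all_results(search_fn, q, limit=20, max_pages=10):
    """Yield results across pages until a page comes back short."""
    for page in range(1, max_pages + 1):
        batch = search_fn(q=q, page=page, limit=limit)
        yield from batch
        if len(batch) < limit:  # a short page means we've hit the end
            break

# Stub standing in for client.search: pretends there are 45 matches.
def fake_search(q, page, limit):
    start = (page - 1) * limit
    return [f"{q}-skill-{i}" for i in range(start, min(start + limit, 45))]

hits = list(iter_all_results(fake_search, q="pdf", limit=20))
```

Swapping `fake_search` for `client.search` gives the same loop against the live index.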

Install

local_path = client.download(
    url="https://github.com/anthropics/skills/tree/main/skills/skill-creator",
    target_dir="./my_skills"
)
print(f"Installed at: {local_path}")

Create

Convert diverse sources into structured skill packages:

# From conversation logs / execution traces
client.create(trajectory_content="User: rename .jpg→.png\nAgent: Done.", output_dir="./skills")

# From a GitHub repository
client.create(github_url="https://github.com/zjunlp/DeepKE", output_dir="./skills")

# From office documents (PDF / PPT / Word)
client.create(office_file="./guide.pdf", output_dir="./skills")

# From a natural language description
client.create(prompt="A skill for web scraping article titles", output_dir="./skills")

All modes auto-generate a complete skill package: SKILL.md + optional scripts/, references/, assets/.
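Since SKILL.md pairs YAML metadata with markdown instructions, the two halves can be split apart for inspection. A minimal sketch, assuming the conventional `---`-delimited front-matter layout; the field names in the sample are illustrative, not the SDK's schema:

```python
def split_skill_md(text):
    """Split a SKILL.md string into (yaml_front_matter, markdown_body)."""
    if text.startswith("---"):
        # Parts: text before first ---, the front matter, the body.
        _, front, body = text.split("---", 2)
        return front.strip(), body.strip()
    return "", text.strip()

sample = """---
name: pdf-parser
description: Extract text from PDF files
---
# Instructions

Run the extraction script on each input file."""

meta, body = split_skill_md(sample)
```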

Evaluate

Score any skill on 5 quality dimensions. Accepts local paths or GitHub URLs:

result = client.evaluate(
    target="https://github.com/anthropics/skills/tree/main/skills/algorithmic-art"
)
# {
#   "safety":          {"level": "Good", "reason": "..."},
#   "completeness":    {"level": "Good", "reason": "..."},
#   "executability":   {"level": "Average", "reason": "..."},
#   "maintainability": {"level": "Good", "reason": "..."},
#   "cost_awareness":  {"level": "Good", "reason": "..."}
# }
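The returned dict maps each dimension to a level and a reason, so a quick pass can flag weak spots. A minimal sketch over that shape; treating only "Good" as passing is an assumption for illustration, not SDK behavior:

```python
def weak_dimensions(report, passing=("Good",)):
    """Return the dimensions whose level falls outside the passing set."""
    return [dim for dim, info in report.items() if info["level"] not in passing]

report = {
    "safety":          {"level": "Good",    "reason": "..."},
    "completeness":    {"level": "Good",    "reason": "..."},
    "executability":   {"level": "Average", "reason": "..."},
    "maintainability": {"level": "Good",    "reason": "..."},
    "cost_awareness":  {"level": "Good",    "reason": "..."},
}
flagged = weak_dimensions(report)
```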

Analyze Relationships

Discover connections between skills in a local directory:

relationships = client.analyze(skills_dir="./my_skills")

for rel in relationships:
    print(f"{rel['source']} --[{rel['type']}]--> {rel['target']}")
# PDF_Parser --[compose_with]--> Text_Summarizer
# Web_Scraper --[similar_to]--> Data_Extractor

Detects four relationship types: similar_to · belong_to · compose_with · depend_on. Results are saved to relationships.json by default.
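The relationship list folds naturally into an adjacency map for graph-style queries. A minimal sketch over the `source` / `type` / `target` dicts shown above:

```python
from collections import defaultdict

def build_graph(relationships):
    """Group targets by (source skill, relationship type)."""
    graph = defaultdict(list)
    for rel in relationships:
        graph[(rel["source"], rel["type"])].append(rel["target"])
    return graph

rels = [
    {"source": "PDF_Parser",  "type": "compose_with", "target": "Text_Summarizer"},
    {"source": "Web_Scraper", "type": "similar_to",   "target": "Data_Extractor"},
]
graph = build_graph(rels)
```

The same shape loads straight from the saved relationships.json via `json.load`.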


CLI

The CLI ships automatically with pip install skillnet-ai — powered by Typer + Rich for beautiful terminal output.

skillnet <command> --help    # Full options for any command

Commands at a Glance

| Command | What it does | Example |
| --- | --- | --- |
| search | Find skills | skillnet search "pdf" --mode vector |
| download | Install a skill | skillnet download &lt;url&gt; -d ./skills |
| create | Create from any source | skillnet create log.txt -d ./skills |
| evaluate | Quality report | skillnet evaluate ./my_skill |
| analyze | Relationship graph | skillnet analyze ./my_skills |

Search

skillnet search "pdf"
skillnet search "analyze financial reports" --mode vector --threshold 0.85
skillnet search "visualization" --category "Development" --sort-by stars --limit 10

Install

skillnet download https://github.com/anthropics/skills/tree/main/skills/algorithmic-art
skillnet download <url> -d ./my_agent/skills
skillnet download <private_url> --token <your_github_token>

Create

skillnet create ./logs/trajectory.txt -d ./skills          # from trajectory
skillnet create --github https://github.com/owner/repo      # from GitHub repo
skillnet create --office ./docs/guide.pdf                    # from PDF/PPT/Word
skillnet create --prompt "A skill for table extraction"      # from prompt
skillnet create --office report.pdf --model gpt-4o           # custom model

Evaluate

skillnet evaluate ./my_skills/web_search
skillnet evaluate https://github.com/anthropics/skills/tree/main/skills/algorithmic-art
skillnet evaluate ./my_skill --category "Development" --model gpt-4o

Analyze

skillnet analyze ./my_skills
skillnet analyze ./my_skills --no-save     # print only, don't write file
skillnet analyze ./my_skills --model gpt-4o

⚙️ Configuration

Environment Variables

| Variable | Required for | Default |
| --- | --- | --- |
| API_KEY | create · evaluate · analyze | (none) |
| BASE_URL | Custom LLM endpoint | https://api.openai.com/v1 |
| GITHUB_TOKEN | Private repos / higher rate limits | (none) |

search and download (public repos) require no credentials at all.

Linux / macOS:

export API_KEY="sk-..."
export BASE_URL="https://..."   # optional

Windows PowerShell:

$env:API_KEY = "sk-..."
$env:BASE_URL = "https://..."   # optional

Or pass credentials directly in code:

client = SkillNetClient(api_key="sk-...", base_url="https://...")
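Explicit constructor arguments and environment variables reconcile with a simple precedence rule: an explicit value wins, then the environment, then a default. A minimal sketch of that lookup; it mirrors the documented behavior but is not the SDK's internal code:

```python
import os

def resolve(explicit, env_var, default=None):
    """Pick a setting: explicit argument > environment variable > default."""
    if explicit is not None:
        return explicit
    return os.environ.get(env_var, default)

# Environment supplies BASE_URL; API_KEY is passed explicitly.
os.environ["BASE_URL"] = "https://example.com/v1"
base_url = resolve(None, "BASE_URL", default="https://api.openai.com/v1")
api_key = resolve("sk-explicit", "API_KEY")
```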

📂 Skill Structure

Every created or downloaded skill follows a standardized layout:

skill-name/
├── SKILL.md          # [Required] YAML metadata + markdown instructions
├── scripts/          # [Optional] Executable Python / Bash scripts
├── references/       # [Optional] Static docs, API specs, schemas
└── assets/           # [Optional] Templates, icons, examples
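Since only SKILL.md is required, a created or downloaded directory can be sanity-checked with a few pathlib calls. A minimal sketch; the validation rules are an assumption drawn from the layout above, not an official validator:

```python
import tempfile
from pathlib import Path

def check_skill(skill_dir):
    """Return (has_required_manifest, list_of_present_optional_dirs)."""
    root = Path(skill_dir)
    has_manifest = (root / "SKILL.md").is_file()
    optional = [d for d in ("scripts", "references", "assets") if (root / d).is_dir()]
    return has_manifest, optional

# Build a throwaway example skill and check it.
tmp = Path(tempfile.mkdtemp())
(tmp / "SKILL.md").write_text("# demo skill")
(tmp / "scripts").mkdir()
ok, extras = check_skill(tmp)
```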

🤝 Contributing

Contributions are welcome! Feel free to open an Issue or submit a Pull Request.

📄 License

MIT
