
woyn

The open source AI capability substrate.

Status: v0.1.0a6 — early alpha. Seven capability verbs (chat, retrieve, transcribe, crawl, generate_image, synthesize, extract), a five-store memory layer, a policy-driven router, a frugality engine (cache + cascade + optional compression), and seven default adapters. See CHANGELOG.md for what's in each release.

What's new in v0.1.0a6 (Plan 3)

The frugality engine and policy-driven router are live, along with two new capability verbs and a cache command:

# Text-to-speech
woyn synthesize "Welcome to woyn" -o welcome.wav

# Document extraction (PDF, docx, html, …)
woyn extract paper.pdf

# Inspect or clear the frugality cache
woyn cache clear

The router now scores adapters per call against your active policy (balanced, frugal, quality, or regulated). With policy = "frugal" in woyn.toml, woyn picks the cheapest adapter that satisfies hard constraints; with quality, it ignores cost and picks the strongest.
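Per-call scoring can be pictured as a constrained selection problem. The sketch below is illustrative only: the adapter records, fields, and `pick` function are our assumptions, not woyn's internal API.

```python
# Hypothetical adapter records — names, fields, and scores are invented for
# illustration; woyn's real scoring inputs are not documented here.
adapters = [
    {"name": "phi3:mini", "cost": 1, "quality": 5, "local": True},
    {"name": "llama3:70b", "cost": 8, "quality": 9, "local": True},
    {"name": "remote-api", "cost": 3, "quality": 8, "local": False},
]

def pick(policy: str) -> dict:
    # Hard constraints first (here: must run locally), then score per policy.
    candidates = [a for a in adapters if a["local"]]
    if policy == "frugal":
        return min(candidates, key=lambda a: a["cost"])
    if policy == "quality":
        return max(candidates, key=lambda a: a["quality"])
    # "balanced": trade quality against cost.
    return max(candidates, key=lambda a: a["quality"] / a["cost"])

frugal_choice = pick("frugal")["name"]    # cheapest adapter that satisfies constraints
quality_choice = pick("quality")["name"]  # strongest adapter, cost ignored
```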

Optional prompt compression via pip install woyn[frugal] (LLMLingua).
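The caching half of the frugality engine can be pictured as a prompt-keyed lookup that skips repeat model calls. This is a minimal sketch of the idea, not woyn's actual code; `cached_call` and `fake_model` are made up here.

```python
import hashlib

_cache: dict[str, str] = {}

def cached_call(prompt: str, call_model) -> str:
    """Reuse a prior answer for an identical prompt instead of re-invoking the model."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_model(prompt)
    return _cache[key]

calls = []
def fake_model(prompt: str) -> str:  # stand-in for a real adapter call
    calls.append(prompt)
    return prompt.upper()

first = cached_call("hello", fake_model)
second = cached_call("hello", fake_model)  # served from cache; fake_model runs once
```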

What's new in v0.1.0a4 (Plan 2.1)

woyn install now bootstraps everything woyn needs in one shot:

# Lightweight default — chat, retrieve, transcribe, crawl all working
woyn install                  # installs Ollama, pulls phi3:mini, installs Chromium for crawl

# Heavy opt-ins (explicit consent each)
woyn install --with image     # adds torch + diffusers for woyn image (~3GB)

# See what's set up
woyn doctor                   # reports adapter health AND bootstrap component status

Friendlier errors:

  • woyn crawl without a browser → clear message pointing to woyn install
  • woyn image without [image] extra → clear message pointing to woyn install --with image
  • The diffusers adapter no longer appears in woyn list adapters until torch + diffusers are actually installed.
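Gating an adapter on installed extras is typically done with an import probe before registration; a minimal sketch of the pattern, assuming nothing about woyn's registry (the function name here is ours):

```python
from importlib.util import find_spec

def image_extras_installed() -> bool:
    """True only when both torch and diffusers are importable."""
    return all(find_spec(name) is not None for name in ("torch", "diffusers"))

# A registry can consult this before listing the diffusers adapter:
show_diffusers_adapter = image_extras_installed()
```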

What's new in v0.1.0a3

  • Three new capability verbs
    • transcribe — audio → text via faster-whisper (CTranslate2-backed Whisper).
    • crawl — URL → Markdown via crawl4ai.
    • generate_image — prompt → PNG bytes via Hugging Face diffusers (optional, install with pip install woyn[image]).
  • Memory layer (woyn.memory) with five stores behind one MemoryAPI façade:
    • episodic (SQLite, time-ordered events)
    • documental (Chroma + blob storage, semantic search over docs)
    • semantic (SQLite entity/relation graph)
    • procedural (versioned prompt templates on disk)
    • personal (LoRA adapter registry stub)
    • Plus memory_aware decorator for transparent context injection / persistence.
  • CLI: woyn transcribe FILE, woyn crawl URL, woyn image PROMPT, and woyn memory inspect|forget|export|import.
  • 132 unit tests passing, 3 gated integration tests skipped by default.
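To make the episodic store concrete: it is described above as time-ordered events in SQLite, which can be pictured roughly like this (the schema and helper are illustrative, not woyn's actual tables):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE episodes (ts REAL, role TEXT, content TEXT)")

def record(role: str, content: str) -> None:
    """Append one time-stamped event."""
    conn.execute("INSERT INTO episodes VALUES (?, ?, ?)", (time.time(), role, content))

record("user", "Why is the sky blue?")
record("assistant", "Rayleigh scattering.")
# rowid breaks ties when two events share a timestamp
events = conn.execute("SELECT role, content FROM episodes ORDER BY ts, rowid").fetchall()
```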

What is this?

woyn is the layer between an application or agent's intent ("summarize this document") and the open-source AI tools that fulfill it: one unified Python SDK and CLI fronting a growing set of pluggable adapters (Ollama, Chroma, and more in upcoming plans). No required subscriptions, no required API keys, and it is designed to run on any laptop with 16 GB of RAM or more.

Install

pip install woyn

For development:

git clone https://github.com/your-fork/woyn
cd woyn
python -m venv .venv && source .venv/bin/activate
pip install -e ".[dev]"

Quickstart

# 1. Install woyn
pip install woyn

# 2. Bootstrap Ollama and pull a default model (one-time, ~3 min on first run)
woyn install        # macOS: requires Homebrew. Linux: runs the official Ollama installer.

# 3. Use it
woyn chat "Why is the sky blue?"
woyn chat "Write a haiku about winter" --stream

Discovery

woyn version
woyn list capabilities
woyn list adapters
woyn doctor          # health-check every registered adapter

Retrieve

# Ingest a directory of .md / .txt files into a named collection:
woyn retrieve ingest ./my-docs --collection notes

# Search:
woyn retrieve search "what did I write about elasticity?" --collection notes --k 5

Python SDK

import asyncio
from woyn.capabilities import chat, retrieve
from woyn.adapters import register_default_adapters

register_default_adapters()

async def main():
    out = await chat.complete("Why is the sky blue?")
    print(out.text)

    await retrieve.ingest("notes", [("d1", "rayleigh scattering")])
    found = await retrieve.search("why blue", "notes", k=3)
    print(found.documents)

asyncio.run(main())

Manual setup (alternative to woyn install)

If you'd rather install Ollama yourself:

# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh

# Then
ollama serve &
ollama pull phi3:mini

woyn chat "hello"

Tests

pytest -q                   # unit tests
pytest --integration -q     # also run real-service tests (needs Ollama)

License

Apache 2.0
