PersonaSpec toolkit for LLM agents

larva

larva is a PersonaSpec toolkit for LLM agent systems. It gives you one place to validate, assemble, normalize, register, resolve, clone, update, and export canonical persona definitions.

What larva is for

Use larva when you want a stable authority for agent persona definitions instead of ad hoc prompt files scattered across tools and repos.

  • Validate PersonaSpec JSON before it reaches runtime
  • Assemble personas from reusable components
  • Store canonical personas in a local registry under ~/.larva/
  • Resolve, clone, update, delete, and export personas across tools
  • Expose the same operations through MCP, CLI, Python, and a small web UI

larva does not run agents, call LLMs, enforce gateway policy, or manage memory.

Install

Install into your Python environment:

pip install larva

Or run larva without a persistent install:

uvx larva --help

Quick start

The example below creates a minimal persona, validates it, stores it in the local registry, and resolves the canonical persona back out.

Create a minimal persona:

cat <<'EOF' > code-reviewer.json
{
  "id": "code-reviewer",
  "description": "Reviews code for correctness and style",
  "prompt": "You are a senior code reviewer.",
  "model": "openai/gpt-5.4",
  "capabilities": {"shell": "read_only"}
}
EOF

Validate, register, and resolve it:

larva validate code-reviewer.json
larva register code-reviewer.json
larva resolve code-reviewer

Clone and modify it for experimentation:

larva clone code-reviewer code-reviewer-exp
larva update code-reviewer-exp --set model=openai/gpt-5.4-pro
larva list --json

Core concepts

PersonaSpec

The main larva artifact is a flat JSON object called PersonaSpec.

The canonical PersonaSpec schema is defined by opifex. larva validates, assembles, and normalizes PersonaSpec as a downstream admission and projection layer, not the contract authority.

{
  "spec_version": "0.1.0",
  "id": "code-reviewer",
  "description": "Reviews code changes with read-focused tooling.",
  "prompt": "You are a senior code reviewer...",
  "model": "openai/gpt-5.4",
  "capabilities": {
    "shell": "read_only",
    "filesystem": "read_write"
  },
  "spec_digest": "sha256:..."
}

Key rules:

  • id is required and must be flat kebab-case
  • spec_version is schema identity, not persona revisioning
  • v1 pins spec_version to "0.1.0"
  • spec_digest is recomputed by larva from canonical content
  • there is no inheritance or base: field in canonical output
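
The id and spec_digest rules above can be sketched in Python. This is an illustrative approximation only: the canonical schema and digest algorithm are defined by opifex, and the exact canonical JSON form larva hashes may differ from the one assumed here.

```python
import hashlib
import json
import re

# Assumed reading of "flat kebab-case": lowercase alphanumeric words
# joined by single hyphens, no nesting separators.
KEBAB_CASE = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def check_id(persona_id: str) -> bool:
    """Return True if the id looks like flat kebab-case."""
    return bool(KEBAB_CASE.match(persona_id))

def recompute_digest(spec: dict) -> str:
    """Sketch of a content digest over canonical JSON.

    Assumes canonical form = sorted keys, compact separators, with the
    spec_digest field itself excluded before hashing. larva's real
    canonicalization may differ.
    """
    body = {k: v for k, v in spec.items() if k != "spec_digest"}
    canonical = json.dumps(body, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

print(check_id("code-reviewer"))   # True
print(check_id("Code_Reviewer"))   # False
```

Because the stored spec_digest is excluded before hashing, recomputing the digest of a registered spec is stable regardless of what digest value it already carries.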

Components

larva can also assemble personas from reusable components stored in ~/.larva/components/:

~/.larva/
  components/
    prompts/
    toolsets/
    constraints/
    models/
  registry/

Example assembly command:

larva assemble --id code-reviewer \
  --prompt code-reviewer \
  --prompt careful-reasoning \
  --toolset read-only \
  --constraints strict \
  --model gpt-5

Components are read from the user-managed shell boundary at ~/.larva/components/. Those files are local input, not canonical larva state; only the assembled and validated PersonaSpec is authoritative at runtime.
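
Conceptually, assembly folds the selected components into a single flat PersonaSpec: prompt components concatenate in order, a toolset component supplies the capabilities map, and a model component supplies the model id. The sketch below is a hypothetical illustration of that merge (the `assemble_spec` helper and the join rules are assumptions), not larva's actual algorithm.

```python
# Hypothetical sketch of component assembly. Assumes the named components
# have already been loaded from ~/.larva/components/; larva's real merge
# rules and component file formats may differ.

def assemble_spec(persona_id, prompts, toolset, model):
    """Fold loaded components into one flat PersonaSpec dict."""
    return {
        "spec_version": "0.1.0",          # v1 pins the schema version
        "id": persona_id,
        "prompt": "\n\n".join(prompts),   # prompt components join in the order given
        "model": model,
        "capabilities": dict(toolset),    # toolset component becomes the capabilities map
    }

spec = assemble_spec(
    "code-reviewer",
    prompts=["You are a senior code reviewer.", "Reason carefully before concluding."],
    toolset={"shell": "read_only", "filesystem": "read_only"},
    model="openai/gpt-5.4",
)
print(spec["id"], spec["capabilities"]["shell"])  # code-reviewer read_only
```

The output is flat by design: once assembled and validated, the PersonaSpec carries no reference back to the components it came from.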

Interfaces

MCP

Primary programmatic surface:

larva.validate(spec)              -> ValidationReport
larva.assemble(components)        -> PersonaSpec
larva.register(spec)              -> {id, registered}
larva.resolve(id, overrides?)     -> PersonaSpec
larva.list()                      -> [{id, description, spec_digest, model}]
larva.update(id, patches)         -> PersonaSpec
larva.clone(source_id, new_id)    -> PersonaSpec
larva.delete(id)                  -> {id, deleted}
larva.clear(confirm)              -> {cleared, count}
larva.export(all?, ids?)          -> [PersonaSpec, ...]
larva.component_list()            -> {prompts, toolsets, constraints, models}
larva.component_show(type, name)  -> component content

Start larva as an MCP server over stdio:

larva mcp

Or with uvx:

uvx larva mcp

If you want the packaged local web UI/runtime instead of stdio, start:

larva serve

Or with uvx:

uvx larva serve

CLI

larva validate <spec.json> [--json]
larva register <spec.json> [--json]
larva resolve <id> [--override key=value]... [--json]
larva list [--json]
larva update <id> --set key=value [--set ...] [--json]
larva clone <source-id> <new-id> [--json]
larva delete <id> [--json]
larva clear --confirm "CLEAR REGISTRY" [--json]
larva export --all [--json]
larva export --id <id> [--id <id>]... [--json]
larva assemble --id <id> [--prompt <name>]... [--toolset <name>]... [--constraints <name>]... [--model <name>] [--override key=value]... [--var key=value]... [-o output.json]
larva component list [--json]
larva component show <type>/<name> [--json]
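
The --set and --override flags take key=value pairs. One plausible reading, sketched below with a hypothetical `apply_set` helper, is that dotted keys address nested fields; larva's actual patch semantics live in src/larva/core/patch.py and may differ.

```python
# Hypothetical sketch of key=value patching with dotted paths, e.g.
#   --set capabilities.shell=read_only  ->  spec["capabilities"]["shell"] = "read_only"
# larva's real semantics (src/larva/core/patch.py) may differ.

def apply_set(spec: dict, assignment: str) -> dict:
    """Apply one key=value assignment, treating dots as nesting."""
    path, _, value = assignment.partition("=")
    parts = path.split(".")
    node = spec
    for part in parts[:-1]:
        node = node.setdefault(part, {})  # create intermediate objects as needed
    node[parts[-1]] = value
    return spec

spec = {"id": "code-reviewer-exp", "model": "openai/gpt-5.4"}
apply_set(spec, "model=openai/gpt-5.4-pro")
apply_set(spec, "capabilities.shell=read_only")
print(spec["model"])                  # openai/gpt-5.4-pro
print(spec["capabilities"]["shell"])  # read_only
```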

Python API

from larva.shell.python_api import (
    assemble,
    clear,
    clone,
    component_list,
    component_show,
    delete,
    export_all,
    export_ids,
    list,
    register,
    resolve,
    update,
    validate,
)

The Python API mirrors the main CLI and MCP operations and returns the same canonical PersonaSpec shapes.

The package root is not the authoritative Python API surface: import from larva.shell.python_api. larva.__init__ remains metadata-only (__version__) unless the guard policy and architecture docs are updated together.

Other surfaces

Web UI

The authoritative packaged startup path is:

larva serve

larva serve binds 127.0.0.1:7400 by default, accepts --port and --no-open, and serves the packaged single-file UI plus the normative REST surface documented in INTERFACES.md.

The repository also includes a supported contributor convenience entrypoint for local review work:

pip install fastapi uvicorn
python contrib/web/server.py

Scope note:

  • larva serve is the canonical packaged web runtime users should target
  • python contrib/web/server.py is supported for contributor/local-review use, not the canonical packaged entrypoint
  • documented REST endpoints are the verified contract surface
  • the prompt copy button is documented only as browser convenience UI behavior
  • batch update is documented only for the contrib runtime, not for larva serve
  • component query semantics are shared across transports and should be centralized outside adapter-local envelopes
  • CLI, MCP, Web, and Python API keep their own rendering, error envelopes, and runtime hooks
  • preserved runnable liveness proof for both entrypoints lives in tests/shell/artifacts/web_runtime_liveness.md

OpenCode plugin

larva also ships an OpenCode plugin that exposes larva personas as agents. See contrib/opencode-plugin/README.md.

Architecture

larva uses a strict layered structure enforced by Invar.

  • Core - src/larva/core/ - pure logic, contracts, no I/O
  • App - src/larva/app/ - use-case orchestration
  • Shell - src/larva/shell/ - CLI, MCP, filesystem, web adapters

Structural guardrails frozen for the remediation campaign:

  • src/larva/shell/web.py is the authoritative packaged REST surface
  • contrib/web/server.py is an extension consumer, not the contract owner
  • src/larva/core/patch.py dotted-path patch semantics stay separate from src/larva/app/facade.py dotted lookup semantics unless later evidence says otherwise

Read next

If you are just getting started, read README.md then USER_GUIDE.md.

  • USER_GUIDE.md - detailed human-oriented usage guide
  • USAGE.md - agent-oriented operational guide
  • INTERFACES.md - public interface specification
  • ARCHITECTURE.md - module boundaries and dependency design
  • ADR-001-spec-version-boundary.md - spec_version design decision
  • ADR-002-capability-intent-without-runtime-policy.md - capability intent model

License

AGPL-3.0-or-later

Download files

Download the file for your platform.

Source Distribution

larva-0.4.5.tar.gz (704.2 kB)

Built Distribution

larva-0.4.5-py3-none-any.whl (98.3 kB)

File details

Details for the file larva-0.4.5.tar.gz.

File metadata

  • Download URL: larva-0.4.5.tar.gz
  • Size: 704.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for larva-0.4.5.tar.gz
  • SHA256: 47dd3c471cc73a79088905484f0eb7fd398528ec63d37c71139b4d0364f280cb
  • MD5: b9bb050c0649cfc69d3f1f28bd397a79
  • BLAKE2b-256: fb0bef02b44ef6fc2d335a249a1769ad2e6cbe7fd145a4705bd0d34c129f7e34

Provenance

The following attestation bundles were made for larva-0.4.5.tar.gz:

Publisher: publish.yml on Tefx/larva

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file larva-0.4.5-py3-none-any.whl.

File metadata

  • Download URL: larva-0.4.5-py3-none-any.whl
  • Size: 98.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for larva-0.4.5-py3-none-any.whl
  • SHA256: 5db01accab04f57a786e07896887d94fab0fb10ea4ea271a1633229fef1f4a77
  • MD5: 7c39f5627249adcb511102ca59407798
  • BLAKE2b-256: e4b4ac4857920549ab9731ca35c1c6e5c91cc1f6b18662d3c8c55db599044b86

Provenance

The following attestation bundles were made for larva-0.4.5-py3-none-any.whl:

Publisher: publish.yml on Tefx/larva

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
