LLMProc

A Unix-inspired framework for building powerful LLM applications that lets you spawn specialized models, manage large outputs, and enhance context with file preloading.

LLMProc treats language models as processes: spawn them, fork them, link them together, and handle their I/O with a familiar Unix-like approach.

Installation

# Install from a local checkout with uv (recommended)
uv pip install -e .

# Or with pip
pip install -e .

# Set environment variables
export OPENAI_API_KEY="your-key"    # For OpenAI models
export ANTHROPIC_API_KEY="your-key"  # For Claude models

The package supports .env files for environment variables.
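
For example, a .env file in the project root might look like this (the keys shown are placeholders):

```
# .env (loaded automatically)
OPENAI_API_KEY=your-key
ANTHROPIC_API_KEY=your-key
```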

Quick Start

Python usage

import asyncio
from llmproc import LLMProgram, register_tool

@register_tool()
def calculate(expression: str) -> dict:
    """Evaluate an arithmetic expression and return the result."""
    # eval with an empty __builtins__ mapping blocks built-in functions
    return {"result": eval(expression, {"__builtins__": {}})}

async def main():
    # Option 1: load a program from a TOML file
    program = LLMProgram.from_toml('examples/anthropic/claude-3-5-haiku.toml')

    # Option 2: build the same program with the Python API
    program = (
        LLMProgram(
            model_name="claude-3-7-sonnet-20250219",
            provider="anthropic",
            system_prompt="You are a helpful assistant."
        )
        .add_tool(calculate)
    )

    # Start the process and run a prompt
    process = await program.start()
    await process.run('What is 125 * 48?')
    print(process.get_last_message())

asyncio.run(main())
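
The calculate tool above evaluates expressions with an empty __builtins__ mapping, a pattern that allows plain arithmetic while blocking name lookups of built-in functions. A standalone sketch of that behavior:

```python
# Restricted eval: arithmetic works, built-in functions are unreachable.
def safe_eval(expression: str):
    return eval(expression, {"__builtins__": {}})

print(safe_eval("125 * 48"))  # 6000

try:
    safe_eval("__import__('os')")
except NameError as exc:
    print("blocked:", exc)
```

Note that this only blocks name lookups of built-ins; it is a demo of the pattern, not a full sandbox.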

CLI usage

# Start interactive session
llmproc-demo ./examples/anthropic/claude-3-5-haiku.toml

# Single prompt
llmproc-demo ./examples/anthropic/claude-3-5-sonnet.toml -p "What is Python?"

# Read from stdin
cat questions.txt | llmproc-demo ./examples/anthropic/claude-3-7-sonnet.toml -n

Features

LLMProc offers a complete toolkit for building sophisticated LLM applications:

Developer Experience

  • Python SDK - Create programs with intuitive method chaining
  • Function-Based Tools - Register Python functions as tools, with type safety and automatic schema conversion

Process Management

  • Program Linking - Spawn and delegate tasks to specialized LLM processes
  • Fork Tool - Create process copies with shared conversation state
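
As a rough illustration of program linking, a coordinator program might reference a specialized child program in its TOML file. The section and key names below are hypothetical; consult the bundled examples for the actual schema:

```toml
[model]
name = "claude-3-7-sonnet-20250219"
provider = "anthropic"

[prompt]
system_prompt = "You are a coordinator. Delegate arithmetic to the calculator program."

# Hypothetical section: expose another program as a spawn target
[linked_programs]
calculator = "./calculator.toml"
```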

More Features

  • Prompt Caching - Automatic caching for Claude models (enabled by default) that can cut input-token costs by up to 90% on cached content
  • Reasoning/Thinking models - Claude 3.7 Thinking and OpenAI Reasoning models
  • MCP Protocol - Standardized interface for tool usage
  • Cross-provider support - Currently supports Anthropic, OpenAI, and Anthropic on Vertex AI
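
To put the caching figure in perspective, Anthropic bills cache reads at roughly one tenth of the base input-token rate, so a large, repeatedly reused system prompt approaches the quoted 90% saving. A back-of-the-envelope sketch (the $3/MTok price is illustrative):

```python
# Compare input-token cost with and without a prompt-cache hit,
# assuming cache reads are billed at 10% of the base input rate.
def input_cost(tokens: int, price_per_mtok: float, cache_read: bool = False) -> float:
    rate = price_per_mtok * (0.1 if cache_read else 1.0)
    return tokens / 1_000_000 * rate

full = input_cost(50_000, 3.00)                  # normal input tokens
hit = input_cost(50_000, 3.00, cache_read=True)  # served from the cache
print(f"saving on cached tokens: {1 - hit / full:.0%}")  # saving on cached tokens: 90%
```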

Demo Tools

LLMProc includes demo command-line tools for quick experimentation:

llmproc-demo

Interactive CLI for testing LLM configurations:

llmproc-demo ./examples/anthropic/claude-3-5-haiku.toml  # Interactive session
llmproc-demo ./config.toml -p "What is Python?"          # Single prompt
cat questions.txt | llmproc-demo ./config.toml -n        # Pipe mode

Commands: exit or quit to end the session

llmproc-prompt

View the compiled system prompt without making API calls:

llmproc-prompt ./config.toml                 # Display to stdout
llmproc-prompt ./config.toml -o prompt.txt   # Save to file
llmproc-prompt ./config.toml -E              # Without environment info

Use Cases

  • Claude Code - A minimal Claude Code implementation with support for preloading CLAUDE.md, spawning child processes, and MCP tools

Documentation

For advanced usage and implementation details, see MISC.md.

Design Philosophy

LLMProc treats LLMs as computing processes:

  • Each model is a process defined by a program (TOML file)
  • It maintains state between executions
  • It interacts with the system through defined interfaces

The library functions as a kernel:

  • Implements system calls for LLM processes
  • Manages resources across processes
  • Creates a standardized interface with the environment

Roadmap

Future development plans:

  1. Exec System Call for process replacement
  2. Process State Serialization & Restoration
  3. Retry mechanism with exponential backoff
  4. Enhanced error handling and reporting
  5. Support for streaming
  6. File Descriptor System Phase 3 enhancements
  7. Gemini models support

License

Apache License 2.0
