A simple framework for LLM-powered applications
LLMProc
LLMProc: A Unix-inspired operating system for language models. Like processes in an OS, LLMs execute instructions, make system calls, manage resources, and communicate with each other - enabling powerful multi-model applications with sophisticated I/O management.
Installation
```bash
# Install with uv (recommended)
uv pip install llmproc              # Base package
uv pip install "llmproc[openai]"    # For OpenAI models
uv pip install "llmproc[anthropic]" # For Anthropic models
uv pip install "llmproc[all]"       # All providers
```
See MISC.md for additional installation options and provider configurations.
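Before running any example, you will need credentials for your chosen provider. The environment variables below are the ones conventionally read by the providers' official SDKs; this is an assumption about llmproc's behavior, so check its documentation for the exact variables it uses.

```shell
# Conventional provider API-key variables (assumed; verify in the llmproc docs).
export OPENAI_API_KEY="sk-..."         # for OpenAI models
export ANTHROPIC_API_KEY="sk-ant-..."  # for Anthropic models
export GEMINI_API_KEY="..."            # for Google Gemini models
```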
Quick Start
Python usage
```python
# Full example: examples/multiply_example.py
import asyncio

from llmproc import LLMProgram, register_tool


@register_tool()
def multiply(a: float, b: float) -> dict:
    """Multiply two numbers and return the result."""
    return {"result": a * b}  # Expected: π * e = 8.539734222677128


async def main():
    program = LLMProgram(
        model_name="claude-3-7-sonnet-20250219",
        provider="anthropic",
        system_prompt="You're a helpful assistant.",
        parameters={"max_tokens": 1024},
    )
    program.set_enabled_tools([multiply])
    process = await program.start()
    await process.run("Can you multiply 3.14159265359 by 2.71828182846?")
    print(process.get_last_message())


if __name__ == "__main__":
    asyncio.run(main())
```
CLI usage
```bash
# Start interactive session
llmproc-demo ./examples/anthropic/claude-3-5-haiku.toml

# Single prompt
llmproc-demo ./examples/anthropic/claude-3-5-sonnet.toml -p "What is Python?"

# Read from stdin
cat questions.txt | llmproc-demo ./examples/anthropic/claude-3-7-sonnet.toml -n

# Use Gemini models
llmproc-demo ./examples/gemini/gemini-2.0-flash-direct.toml
```
Features
Supported Model Providers
- OpenAI: GPT-4o, GPT-4o-mini, GPT-4.5
- Anthropic: Claude 3 Haiku, Claude 3.5/3.7 Sonnet (direct API and Vertex AI)
- Google: Gemini 1.5 Flash/Pro, Gemini 2.0 Flash, Gemini 2.5 Pro (direct API and Vertex AI)
LLMProc offers a Unix-inspired toolkit for building sophisticated LLM applications:
Process Management - Unix-like LLM Orchestration
- Program Linking - Spawn specialized LLM processes for delegated tasks
- Fork Tool - Create process copies with shared conversation state
- GOTO (Time Travel) - Reset conversations to earlier points, with a context-compaction demo
Large Content Handling - Sophisticated I/O Management
- File Descriptor System - Unix-like pagination for large outputs
- Reference ID System - Mark up and reference specific pieces of content
- Smart Content Pagination - Optimized line-aware chunking for content too large for context windows
Usage Examples
- See the Python SDK documentation for the fluent API
- Use Function-Based Tools to register Python functions as tools
- Start with a simple configuration for quick experimentation
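As a sketch of what a function-based tool looks like, here is a second tool alongside the Quick Start's multiply. The word_count function is hypothetical (not part of llmproc); only the registration pattern (@register_tool() plus set_enabled_tools) comes from the Quick Start example above.

```python
# A function-based tool is an ordinary Python function with type hints
# and a docstring; llmproc derives the tool schema from them.
# In a program you would decorate it with @register_tool() and pass it
# to program.set_enabled_tools([...]) as in the Quick Start example.

def word_count(text: str) -> dict:
    """Count the words in a piece of text and return the result."""
    return {"words": len(text.split())}

print(word_count("LLMProc treats LLMs as processes"))  # {'words': 5}
```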
Additional Features
- File Preloading - Enhance context by loading files into system prompts
- Environment Info - Add runtime context like working directory
- Prompt Caching - Automatic prompt caching for Claude models, enabled by default (up to 90% token savings)
- Reasoning/Thinking models - Claude 3.7 Thinking and OpenAI Reasoning models
- MCP Protocol - Standardized interface for tool usage
- Tool Aliases - Provide simpler, intuitive names for tools
- Cross-provider support - Currently supports Anthropic, OpenAI, and Google Gemini
Demo Tools
LLMProc includes demo command-line tools for quick experimentation:
llmproc-demo
Interactive CLI for testing LLM configurations:
```bash
llmproc-demo ./examples/anthropic/claude-3-5-haiku.toml   # Interactive session
llmproc-demo ./config.toml -p "What is Python?"           # Single prompt
cat questions.txt | llmproc-demo ./config.toml -n         # Pipe mode
```
Commands: exit or quit to end the session
llmproc-prompt
View the compiled system prompt without making API calls:
```bash
llmproc-prompt ./config.toml                 # Display to stdout
llmproc-prompt ./config.toml -o prompt.txt   # Save to file
llmproc-prompt ./config.toml -E              # Without environment info
```
Use Cases
- Claude Code - A minimal Claude Code implementation, with support for preloading CLAUDE.md, spawning, and MCP
Documentation
Documentation Index: Start here for guided learning paths
- Examples: Sample configurations and use cases
- API Docs: Detailed API documentation
- Python SDK: Fluent API and program creation
- Function-Based Tools: Python function tools with type hints
- File Descriptor System: Handling large outputs
- Program Linking: LLM-to-LLM communication
- GOTO (Time Travel): Conversation time travel
- MCP Feature: Model Context Protocol for tools
- Tool Aliases: Using simpler names for tools
- Gemini Integration: Google Gemini models usage guide
- Testing Guide: Testing and validation
- For complete reference, see reference.toml
For advanced usage and implementation details, see MISC.md. For design rationales and API decisions, see FAQ.md.
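The CLI tools above load programs from TOML files. The fragment below is only a guess at the file's shape, with section and field names assumed from the LLMProgram constructor arguments in the Quick Start; consult reference.toml for the actual schema.

```toml
# Hypothetical minimal program file (schema assumed; see reference.toml).
[model]
name = "claude-3-7-sonnet-20250219"
provider = "anthropic"

[prompt]
system_prompt = "You're a helpful assistant."

[parameters]
max_tokens = 1024
```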
Design Philosophy
LLMProc treats LLMs as processes in a Unix-inspired operating system framework:
- LLMs function as processes that execute prompts and make tool calls
- Tools operate at both user and kernel levels, with system tools able to modify process state
- The Process abstraction naturally maps to Unix concepts like spawn, fork, goto, and IPC
- This architecture provides a foundation for evolving toward a more complete LLM operating system
For in-depth explanations of these design decisions, see our API Design FAQ.
Roadmap
- Persistent children & inter-process communication
- llmproc MCP server
- Streaming API support
- Process state serialization & restoration
- Feature parity for OpenAI/Gemini models
License
Apache License 2.0
File details
Details for the file llmproc-0.5.0.tar.gz.
File metadata
- Download URL: llmproc-0.5.0.tar.gz
- Upload date:
- Size: 160.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.17
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 628f4bc4065cae81737573257cd1bd13e9f7bf902445b097d813edb27cc060ba |
| MD5 | c3f77450c26fdcea75ada3a1d01ed3c3 |
| BLAKE2b-256 | 5ddaa4e1cf71f18195b1278f46f128a86fc5581f63e6b4ebc47546583bad3e1a |
File details
Details for the file llmproc-0.5.0-py3-none-any.whl.
File metadata
- Download URL: llmproc-0.5.0-py3-none-any.whl
- Upload date:
- Size: 28.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.17
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | eb3f9dd7539c5eba7c87b68523ab1470ab826f54e04b8008c88bfb8c555c94a4 |
| MD5 | 55f76a9e3b3ebc5d04de6a0e4483e284 |
| BLAKE2b-256 | 3506f99944665f23949aea31d23fadedf7607c0065fce9b2a7b33587c5dc0e16 |