A framework to orchestrate long-running LLM workflows with a persistent task tracker.
🧵 python-weaver
Transform complex, multi-stage workflows into reliable, resumable AI orchestration
hey! i'm a grad student building this framework for everyone who works with LLMs. it all started when LLMs kept missing important pieces of context in long conversations: tasks like research writing, market analysis, and code migration were stunted. to tackle this, i made this framework so LLMs can take notes, plan in advance, and maintain a task tracker (the blueprint). i hope to make it production-ready one day :D
This framework enables LLMs to execute sophisticated, long-duration projects through intelligent task orchestration. By combining built-in connectors, human oversight, and multi-provider LLM support, Weaver bridges the gap between simple AI interactions and complex, real-world automation.
Architecture Diagram
✨ Why Weaver?
Never Lose Progress Again
- Stateful execution with SQLite persistence
- Resume interrupted workflows from any point
- Comprehensive error handling and retry logic
Human-AI Collaboration
- Review and edit AI-generated task plans
- Approve results before proceeding to next steps
- Export/import workflows via intuitive CSV interface
Production-Ready Architecture
- Local-first security - your data never leaves your machine
- Multi-provider LLM support through litellm integration
- Modular, extensible design for custom workflows
Intelligent Task Management
- Automatic dependency resolution
- Cost tracking across all LLM providers
- Detailed execution logs and timestamps
Quick Start
Installation
```bash
pip install python-weaver
```
Basic Usage
```python
from weaver.project import Project

# Initialize a new project
project = Project(
    project_name="market_analysis",
    project_goal="Generate a comprehensive market analysis report for renewable energy trends"
)

# Ingest your data sources
project.ingest([
    "data/market_report.pdf",
    "data/competitor_analysis.txt",
    "https://example.com/industry-trends"
])

# Generate an AI-powered execution plan
project.plan()

# Review and edit the generated blueprint.csv, then
# execute with human oversight
project.run(human_feedback=True)
```
CLI Workflow
```bash
# Initialize a project
weaver init my_project "Analyze customer feedback and generate improvement recommendations"

# Add data sources
weaver ingest my_project reports/*.pdf feedback_data.csv

# Generate an execution plan
weaver plan my_project

# Execute with human feedback
weaver run my_project

# Or run fully automated
weaver run my_project --no-human-feedback --steps 5
```
Workflow
Weaver's architecture ensures reliability and transparency:
- Blueprint Database: SQLite-backed task tracking with complete state persistence
- Agent System: Intelligent task execution with automatic retry and error handling
- Connector Framework: Extensible ingestion system for PDFs, URLs, and custom sources
- Multi-Provider Support: Seamless integration with 10+ LLM providers via litellm
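The automatic retry mentioned above typically amounts to exponential backoff. A minimal sketch, assuming a simplified interface (`execute_with_retry` and its parameters are hypothetical illustrations, not Weaver's actual API):

```python
import time

def execute_with_retry(task, max_retries=3, base_delay=1.0, sleep=time.sleep):
    """Run a task callable, retrying with exponential backoff on failure."""
    for attempt in range(max_retries):
        try:
            return task()
        except Exception:
            if attempt == max_retries - 1:
                raise  # retries exhausted: surface the error to the caller
            sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...
```

Injecting `sleep` as a parameter keeps the backoff testable without real delays.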
Core Features
Intelligent Task Planning
- Automated Decomposition: Break complex goals into manageable, sequential tasks
- Dependency Management: Automatic resolution of task dependencies and prerequisites
- Resource Optimization: Smart LLM selection and load balancing across providers
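Dependency resolution of this kind boils down to a topological sort. A minimal sketch using the standard library's `graphlib` (Python 3.9+); the task names here are made up for illustration:

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the set of tasks it depends on
dependencies = {
    "summarize_sources": set(),
    "analyze_trends": {"summarize_sources"},
    "write_report": {"summarize_sources", "analyze_trends"},
}

# static_order() yields tasks so every task comes after its prerequisites
order = list(TopologicalSorter(dependencies).static_order())
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is exactly the failure mode a planner needs to detect.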
Stateful Execution
- Persistent State: All progress saved to local SQLite database
- Resume Capability: Pick up exactly where you left off after interruptions
- Audit Trail: Complete execution history with timestamps and cost tracking
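A sketch of how SQLite-backed resumability can work in principle (the schema below is illustrative, not Weaver's actual blueprint schema):

```python
import sqlite3

def open_blueprint(path=":memory:"):
    """Open (or create) a blueprint database with a minimal task table."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS tasks (
            id        INTEGER PRIMARY KEY,
            task_name TEXT NOT NULL,
            status    TEXT NOT NULL DEFAULT 'pending'  -- pending | done | failed
        )
    """)
    return conn

def pending_tasks(conn):
    """Tasks still to run -- this query is what makes resuming possible."""
    rows = conn.execute("SELECT task_name FROM tasks WHERE status = 'pending'")
    return [name for (name,) in rows]
```

Because status lives on disk, a crashed or interrupted run restarts by simply re-querying the pending tasks.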
Human-in-the-Loop
- Plan Review: Edit AI-generated task plans before execution
- Result Approval: Review and modify outputs at each step
- CSV Interface: Intuitive spreadsheet-based workflow management
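The CSV round trip can be sketched like this. The column names are an assumption, loosely inferred from the `add_task()` arguments shown under Advanced Usage; the real export format may differ:

```python
import csv
import io

# Assumed blueprint columns (hypothetical; the real export format may differ)
FIELDS = ["task_id", "task_name", "llm_config_key", "prompt_template", "dependencies"]

def export_blueprint(tasks):
    """Serialize task dicts to CSV text for spreadsheet-based review."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(tasks)
    return buf.getvalue()

def import_blueprint(text):
    """Load the (possibly human-edited) CSV back into task dicts."""
    return list(csv.DictReader(io.StringIO(text)))
```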
Multi-Provider Support
- OpenAI: GPT-4, GPT-4 Turbo, GPT-3.5
- Anthropic: Claude 3 Haiku, Sonnet, Opus
- Google: Gemini 1.5 Pro, Flash
- Azure OpenAI: Enterprise-grade OpenAI models
- Local Models: Ollama, LM Studio integration
- Custom Providers: Easy integration via litellm
Installation & Setup
Prerequisites
- Python 3.8+
- At least one LLM provider API key
Quick Setup
```bash
# Install from PyPI
pip install python-weaver

# Verify the installation
weaver check
```
Provider Configuration
Set your API keys as environment variables:
```bash
# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# Google
export GOOGLE_API_KEY="AIza..."

# Azure OpenAI
export AZURE_API_KEY="..."
export AZURE_API_BASE="https://your-resource.openai.azure.com/"
```
Advanced Configuration
Create a weaver.toml file for advanced settings:
```toml
[litellm]
set_verbose = true
drop_params = true

[api_bases]
openai = "https://api.openai.com/v1"
anthropic = "https://api.anthropic.com"

[models]
default_orchestrator = "gpt-4o-mini"
```
Advanced Usage
Custom Connectors
Extend Weaver with custom data sources:
```python
from weaver.connectors.base_connector import BaseConnector

class DatabaseConnector(BaseConnector):
    def ingest(self, source: str) -> str:
        # Your custom ingestion logic: pull the data behind `source`
        # and return it as plain text for the planner to use
        extracted_text = ...  # replace with real extraction
        return extracted_text

# Register and use
project.register_connector("db", DatabaseConnector())
project.ingest(["db://my_database/table"])
```
Programmatic Workflow Management
```python
from weaver.blueprint import Blueprint
from weaver.agent import Agent

# Direct blueprint manipulation
blueprint = Blueprint("my_project.db")
task_id = blueprint.add_task(
    task_name="Data Analysis",
    llm_config_key="claude-3-sonnet",
    prompt_template="Analyze the following data: {data}",
    dependencies="1,2"
)

# Execute a specific task
agent = Agent(blueprint, "My project goal")
agent.execute_task(task_id)
```
Cost Optimization
```python
# Configure cost-aware execution
project = Project("cost_optimized", "Generate report")

# Use different models for different task types
project.blueprint.add_task(
    task_name="Quick Summary",
    llm_config_key="gpt-4o-mini",  # Cost-effective for simple tasks
    prompt_template="Summarize: {content}"
)
project.blueprint.add_task(
    task_name="Deep Analysis",
    llm_config_key="gpt-4o",  # High-quality for complex analysis
    prompt_template="Provide a detailed analysis: {content}"
)
```
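Behind the scenes, cost tracking amounts to multiplying token counts by per-model prices. A toy sketch; the prices and routing rule below are illustrative assumptions, not current provider rates or Weaver's actual logic:

```python
# Illustrative input-token prices in USD per 1K tokens (assumed, not real rates)
PRICES_PER_1K = {"gpt-4o-mini": 0.00015, "gpt-4o": 0.0025}

def estimate_cost(model, tokens):
    """Rough cost of sending `tokens` input tokens to `model`."""
    return PRICES_PER_1K[model] * tokens / 1000

def pick_model(task_complexity):
    """Route complex tasks to the stronger model, everything else to the cheap one."""
    return "gpt-4o" if task_complexity == "high" else "gpt-4o-mini"
```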
Real-World Examples
Market Research Automation
```python
project = Project("market_research", "Comprehensive renewable energy market analysis")
project.ingest([
    "reports/iea_renewable_2024.pdf",
    "data/market_data.csv",
    "https://irena.org/statistics"
])
project.plan()
project.run()
```
Academic Research Pipeline
```python
project = Project("literature_review", "Systematic review of AI ethics literature")
project.ingest([
    "papers/*.pdf",
    "abstracts.txt",
    "https://arxiv.org/search/cs.AI"
])
project.plan()
project.run(steps=3)  # Process in batches
```
Business Process Automation
```python
project = Project("quarterly_report", "Generate Q4 business performance report")
project.ingest([
    "financials/q4_data.xlsx",
    "metrics/kpi_dashboard.csv",
    "feedback/customer_surveys.txt"
])
project.plan()
project.run(human_feedback=True)
```
Development
Setup Development Environment
```bash
# Clone the repository
git clone https://github.com/python-weaver/python-weaver.git
cd python-weaver

# Install in development mode (quotes keep the extras spec shell-safe)
pip install -e ".[dev]"

# Run the tests
pytest

# Run with coverage
pytest --cov=weaver --cov-report=html
```
Project Structure
```
python-weaver/
├── weaver/
│   ├── __init__.py
│   ├── project.py       # Main user interface
│   ├── blueprint.py     # SQLite task management
│   ├── agent.py         # LLM task execution
│   ├── config.py        # Provider configurations
│   ├── cli.py           # Command-line interface
│   └── connectors/      # Data ingestion modules
├── tests/               # Comprehensive test suite
├── examples/            # Usage examples
└── docs/                # Documentation
```
Contributing
could use all the help i can get! a contributing guide is coming soon to help you navigate and develop easily. in the meantime:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Roadmap

Upcoming Features
- Async/await execution patterns
- Entirely customizable connectors
- Web interface: browser-based project management dashboard
Current Focus
- Documentation and official website
- Database optimization
- Better memory management
- RAG for long contexts
- Addressing security concerns
Documentation

Coming soon!
License
This project is licensed under the MIT License - see the LICENSE file for details.
Download files
File details

Details for the file `python_weaver-0.1.6.tar.gz`.

File metadata

- Download URL: python_weaver-0.1.6.tar.gz
- Size: 20.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `84821210ce6f6c3d39eb21c67c18b86c68f062824e4c4a0af4faaf0db5442276` |
| MD5 | `e353fd43e3f1a28b1413480dd215ec03` |
| BLAKE2b-256 | `21e9a072f71d80b661d105084d53994c9ef29d5a459b8a30993b40204b0a14d6` |
Provenance

The following attestation bundle was made for `python_weaver-0.1.6.tar.gz`:

Publisher: `python-publish.yml` on adv-11/python-weaver

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: python_weaver-0.1.6.tar.gz
- Subject digest: `84821210ce6f6c3d39eb21c67c18b86c68f062824e4c4a0af4faaf0db5442276`
- Sigstore transparency entry: 255863847
- Permalink: adv-11/python-weaver@e14036ba48d781b536a1ff6b12a93c9f5790136a
- Branch / Tag: refs/tags/v0.1.6
- Owner: https://github.com/adv-11
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@e14036ba48d781b536a1ff6b12a93c9f5790136a
- Trigger Event: release
File details

Details for the file `python_weaver-0.1.6-py3-none-any.whl`.

File metadata

- Download URL: python_weaver-0.1.6-py3-none-any.whl
- Size: 21.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `bf610b7d026ef1657b77c4cc1653135233d00ed26fbe1097e679b4cad578958c` |
| MD5 | `47070abb6e7a3a1f3bc3ef6735e59bf2` |
| BLAKE2b-256 | `eda50d8292425244b0e280c124db0ec9c152f81e01bb85accc4317ff107ceb1f` |
Provenance

The following attestation bundle was made for `python_weaver-0.1.6-py3-none-any.whl`:

Publisher: `python-publish.yml` on adv-11/python-weaver

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: python_weaver-0.1.6-py3-none-any.whl
- Subject digest: `bf610b7d026ef1657b77c4cc1653135233d00ed26fbe1097e679b4cad578958c`
- Sigstore transparency entry: 255863850
- Permalink: adv-11/python-weaver@e14036ba48d781b536a1ff6b12a93c9f5790136a
- Branch / Tag: refs/tags/v0.1.6
- Owner: https://github.com/adv-11
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@e14036ba48d781b536a1ff6b12a93c9f5790136a
- Trigger Event: release