# Agent Zero Lite

Lightweight Python AI agent framework with Web UI.
A lightweight, cross-platform implementation of Agent Zero that maintains core functionality while reducing complexity and dependencies.
## Features

- ✅ Full LiteLLM support - 100+ AI providers (OpenAI, Anthropic, Google, local models, etc.)
- ✅ Web UI - complete interface at localhost:50001
- ✅ Vector memory - FAISS-based persistent memory
- ✅ Document RAG - PDF, text, and document processing
- ✅ Multi-agent - superior/subordinate agent hierarchy
- ✅ MCP client - Model Context Protocol integration
- ✅ Local execution - Python, Node.js, and terminal
- ✅ Tunneling - remote access support
- ✅ File management - work directory browser
## Removed from the Full Version

- ❌ Browser automation (Playwright)
- ❌ Docker/SSH execution
- ❌ Speech processing (STT/TTS)
- ❌ Task scheduling
- ❌ Backup/restore system
- ❌ Web search tools
## Quick Start

1. Install (CPU-only by default):

   ```shell
   pip install agent-zero-lite
   ```

   Optional extras:

   ```shell
   pip install "agent-zero-lite[cpu]"    # CPU ML helpers (additional ONNX/Transformers utilities)
   pip install "agent-zero-lite[ml]"    # Transformers stack (CPU) and ONNX runtime (sentence-transformers included by default)
   pip install "agent-zero-lite[audio]" # Audio transcription (Whisper, CPU)
   pip install "agent-zero-lite[gpu]"   # GPU stack (advanced); for PyTorch CUDA builds, see https://pytorch.org/get-started/locally/
   ```

2. Configure the environment:

   ```shell
   cp .env.example .env
   # Edit .env with your API keys
   ```

3. Start the Web UI:

   ```shell
   python run_ui.py
   ```

4. Open http://localhost:50001 in your browser.
## Configuration

### Minimal Setup

Set at least one LLM provider in `.env`:

```shell
# OpenAI
OPENAI_API_KEY=sk-...

# Or Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Or a local Ollama model
CHAT_MODEL_PROVIDER=ollama
CHAT_MODEL_NAME=llama3.1:8b
OLLAMA_API_BASE=http://localhost:11434
```
### Full Configuration

See `.env.example` for all available options, including:

- All 100+ LiteLLM providers
- Model configurations
- Rate-limiting settings
- MCP server integration
- Memory and knowledge settings
## Supported Models

Agent Zero Lite supports all LiteLLM providers:

### Commercial APIs

- OpenAI: GPT-4o, GPT-4, GPT-3.5, etc.
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, etc.
- Google: Gemini 1.5 Pro, Gemini 1.5 Flash, etc.
- Groq: Llama 3.1, Mixtral, etc. (fast inference)
- Together AI: Llama, Mistral, etc.
- Mistral AI: Mistral Large, Mistral 7B, etc.
- Cohere: Command R+, Command Light, etc.

### Local Models

- Ollama: any local model (llama3.1, mistral, etc.)
- LM Studio: local model server
- Text Generation WebUI: local inference
- vLLM: high-performance inference server

### Enterprise

- Azure OpenAI: enterprise GPT models
- AWS Bedrock: Claude, Titan, etc.
- Google Vertex AI: enterprise Gemini
- Hugging Face: hosted models
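Across all of these, LiteLLM addresses models with a `provider/model` string (for example `ollama/llama3.1:8b`), with bare names such as `gpt-4o` defaulting to OpenAI. A minimal sketch of that naming convention (the helper function is illustrative, not part of Agent Zero Lite's API):

```python
def litellm_model_id(provider: str, model: str) -> str:
    """Build a LiteLLM-style model identifier.

    LiteLLM routes requests based on a "provider/model" prefix;
    an empty provider leaves the bare model name, which LiteLLM
    treats as OpenAI. Model names here are examples only.
    """
    return f"{provider}/{model}" if provider else model

print(litellm_model_id("ollama", "llama3.1:8b"))  # ollama/llama3.1:8b
print(litellm_model_id("", "gpt-4o"))             # gpt-4o
```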
## Usage Examples

### Basic Chat

```python
from agent import AgentContext
import initialize

# Initialize the agent
config = initialize.initialize_agent()
context = AgentContext(config)

# Send a message
response = context.communicate("Hello, what can you help me with?")
```
### Code Execution

The agent can execute Python, Node.js, and terminal commands:

> User: "Create a Python script that calculates Fibonacci numbers"
> Agent: uses the `code_execution` tool to write and run Python code

### Document Processing

> User: "Analyze this PDF document and summarize the key points"
> Agent: uses the `document_query` tool to process and analyze documents

### Multi-Agent Collaboration

> User: "Create a complex analysis using multiple specialized agents"
> Agent: uses `call_subordinate` to delegate tasks to specialized sub-agents
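The superior/subordinate pattern behind `call_subordinate` can be sketched in plain Python. This is a hypothetical illustration of the delegation idea; the class and method names are not Agent Zero Lite's actual API:

```python
# Hypothetical sketch of superior/subordinate delegation.
class LiteAgent:
    def __init__(self, role: str, superior: "LiteAgent | None" = None):
        self.role = role
        self.superior = superior
        self.subordinates: list["LiteAgent"] = []

    def call_subordinate(self, role: str, task: str) -> str:
        # Spawn a specialized sub-agent and hand it a focused task.
        sub = LiteAgent(role, superior=self)
        self.subordinates.append(sub)
        return sub.handle(task)

    def handle(self, task: str) -> str:
        # A real agent would run its reasoning loop here.
        return f"[{self.role}] done: {task}"

root = LiteAgent("coordinator")
result = root.call_subordinate("data-analyst", "summarize sales.csv")
print(result)  # [data-analyst] done: summarize sales.csv
```

The superior keeps references to its subordinates, which is what makes a hierarchy (rather than a flat pool of agents) possible.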
## Architecture

Agent Zero Lite maintains the core Agent Zero architecture:

- Agent loop: Reason → Tool Use → Response cycle
- Tool system: extensible plugin architecture
- Memory: FAISS vector database for persistent memory
- Extensions: hook-based system for customization
- Prompts: template-based prompt management
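The Reason → Tool Use → Response cycle can be sketched as follows. This is a toy illustration of the loop's shape, not the framework's actual code; the `decide` policy and `calculate` tool are stand-ins for an LLM call and a real tool:

```python
# Minimal illustration of the Reason -> Tool Use -> Response cycle.
def run_agent(task: str, tools: dict, max_steps: int = 5) -> str:
    observations: list[str] = []
    for _ in range(max_steps):
        # Reason: decide the next action from the task and observations so far.
        action, arg = decide(task, observations)
        if action == "respond":
            return arg
        # Tool use: execute the chosen tool and record its output.
        observations.append(tools[action](arg))
    return "step limit reached"

def decide(task: str, observations: list[str]):
    # Toy policy: run the calculator once, then respond with its result.
    # A real agent would have an LLM make this decision.
    if not observations:
        return "calculate", task
    return "respond", observations[-1]

tools = {"calculate": lambda expr: str(eval(expr))}  # toy calculator tool
print(run_agent("2 + 3", tools))  # 5
```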
## Development

### Adding Tools

Create new tools in `python/tools/`:

```python
from python.helpers.tool import Tool, Response

class MyTool(Tool):
    async def execute(self, **kwargs):
        # Tool logic here
        return Response(message="result", break_loop=False)
```
### Adding Extensions

Create extensions in `python/extensions/`:

```python
from python.helpers.extension import Extension

class MyExtension(Extension):
    async def execute(self, **kwargs):
        # Extension logic here
        pass
```
## Troubleshooting

### Common Issues

- Model not responding: check the API keys in `.env`
- Port in use: change `PORT` in `.env`
- Memory issues: reduce the context-length settings
- Missing dependencies: run `pip install -r requirements.txt`
### Debugging

Enable debug logging by setting:

```shell
LITELLM_LOG=DEBUG
```
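If you launch the app from your own Python entry point instead of the shell, one way to set this is via `os.environ` before the framework (and LiteLLM) are imported, since LiteLLM typically reads the variable at import time:

```python
import os

# Set LITELLM_LOG before importing the framework, so LiteLLM picks it
# up when it initializes. (Setting it afterwards may have no effect.)
os.environ["LITELLM_LOG"] = "DEBUG"
print(os.environ["LITELLM_LOG"])  # DEBUG
```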
## Migration

### From Full Agent Zero

- Copy your `.env` settings
- Copy the `memory/` and `knowledge/` folders
- Copy the `work_dir/` contents
- Remove Docker/SSH configurations

### To Full Agent Zero

- Install the additional dependencies
- Add Docker/SSH configurations
- No data migration needed
## Performance

Agent Zero Lite is optimized for:

- Startup: ~3 seconds vs. 15+ seconds
- Memory: ~200 MB vs. 1 GB+ RAM usage
- Dependencies: ~30 packages vs. 45+ packages
- Installation: under 2 minutes vs. 10+ minutes
## License

Same as Agent Zero; check the original repository for license terms.
## Support

For issues and questions:

- Check this README
- Review the `.env.example` configuration
- See the original Agent Zero documentation
- Report issues to the Agent Zero repository
## Download files
### agent_zero_lite-1.0.16.tar.gz (source distribution)

- Size: 4.4 MB
- Tags: Source
- Uploaded using Trusted Publishing: No
- Uploaded via: twine/6.1.0, CPython/3.12.11

| Algorithm | Hash digest |
|---|---|
| SHA256 | `be7e4573d8849e1c6f0a7436a336d597303046c5e32fc77dd1e8e50919c1c586` |
| MD5 | `c484abdedab22c56fb412dd41afe47b9` |
| BLAKE2b-256 | `2602808d82dfb50dc00ece4022d3d3aa23283850432046f521103e64c96e2e1f` |
### agent_zero_lite-1.0.16-py3-none-any.whl (built distribution)

- Size: 145.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: No
- Uploaded via: twine/6.1.0, CPython/3.12.11

| Algorithm | Hash digest |
|---|---|
| SHA256 | `b8b4ad573f8286befcaaaf9b769d24287a5c6df019a87c4b112cc92990f2b9a7` |
| MD5 | `5eae183c625883e3f6cc46a25362bed3` |
| BLAKE2b-256 | `aeab101adabd6a167be45e1fdabea6c9cbbf9cbca59585a3def9e8cada37a1a1` |