# ⚡ z007 🤖: Nimble AI Agent

*pronounced: "zee-double-oh-seven"*

A lightweight, readable micro agent for interacting with LLMs on AWS Bedrock, with tool and MCP (Model Context Protocol) support.
## Features

- 🟢 **Ultra Readable**: Clean, maintainable codebase in about 600 lines - easy to understand, modify, and extend
- ⚡ **Super Easy**: Just run `uvx z007@latest` with `AWS_PROFILE=<your profile>` set in the environment and start chatting instantly
- ⚡ **Simple Install**: Install once with `uv tool install --upgrade z007`, then run `z007` with `AWS_PROFILE=<your profile>` set in the environment
- 🔧 **Tool Support**: Built-in calculator, plus plain Python functions as tools
- 🔌 **MCP Integration**: Connect to Model Context Protocol servers
- 🐍 **Python API**: Easy integration into your Python projects
- 🚀 **Async**: Concurrent tool execution
## Quick Start

### Install and run with uvx (recommended)

```bash
# Install and run directly with AWS_PROFILE configured - fastest way to start!
AWS_PROFILE=your-profile uvx z007@latest

# Or install globally
uv tool install z007
AWS_PROFILE=your-profile z007
```

### Install as a Python package

```bash
pip install z007
```
## Usage

### Command Line

```bash
# Start interactive chat
z007

# With a custom model (AWS Bedrock)
AWS_PROFILE=your-profile z007 --model-id "openai.gpt-oss-120b-1:0"

# With MCP configuration
z007 --mcp-config ./mcp.json
```
### Python API

#### Simple usage

```python
import asyncio

from z007 import Agent, create_calculator_tool

async def main():
    calculator = create_calculator_tool()
    async with Agent(model_id="openai.gpt-oss-20b-1:0", tools=[calculator]) as agent:
        response = await agent.run("What is 2+2?")
        print(response)

asyncio.run(main())
```
#### Using the Agent class

```python
import asyncio

from z007 import Agent, create_calculator_tool

async def main():
    calculator = create_calculator_tool()
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        system_prompt="You are a helpful coding assistant.",
        tools=[calculator],
    ) as agent:
        response = await agent.run("Write a Python function to reverse a string")
        print(response)

asyncio.run(main())
```
### Custom Tools

Create your own tools by writing plain Python functions:

```python
import asyncio

from z007 import Agent

def weather_tool(city: str) -> str:
    """Get weather information for a city"""
    # In a real implementation, call a weather API
    return f"The weather in {city} is sunny, 25°C"

def file_reader_tool(filename: str) -> str:
    """Read contents of a file"""
    try:
        with open(filename, 'r') as f:
            return f.read()
    except Exception as e:
        return f"Error reading file: {e}"

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        tools=[weather_tool, file_reader_tool],
    ) as agent:
        response = await agent.run("What's the weather like in Paris?")
        print(response)

asyncio.run(main())
```
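Agent frameworks like this typically derive each tool's name, description, and parameter schema from the function's signature and docstring. A minimal sketch of that general mechanism using only the standard library - illustrative only, not z007's actual internals (`describe_tool` is a hypothetical helper):

```python
import inspect

def describe_tool(fn):
    """Build a simple tool spec from a plain Python function's
    signature and docstring (illustrative only, not z007 internals)."""
    params = {}
    for name, p in inspect.signature(fn).parameters.items():
        ann = p.annotation
        # Fall back to "any" when a parameter has no type annotation
        params[name] = ann.__name__ if ann is not inspect.Parameter.empty else "any"
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": params,
    }

def weather_tool(city: str) -> str:
    """Get weather information for a city"""
    return f"The weather in {city} is sunny, 25°C"

print(describe_tool(weather_tool))
# {'name': 'weather_tool', 'description': 'Get weather information for a city', 'parameters': {'city': 'str'}}
```

This is why type annotations and docstrings matter on your tool functions: they are the only information the agent can forward to the model.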
## MCP Integration

Connect to Model Context Protocol servers for advanced capabilities:

1. Create `mcp.json`:

```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "${env:BRAVE_API_KEY}"
      }
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```

2. Use it with z007:

```bash
z007 --mcp-config mcp.json
```
Or in Python (the `async with` block must run inside a coroutine):

```python
import asyncio
import json

from z007 import Agent

async def main():
    # Load MCP config
    with open("mcp.json") as f:
        mcp_config = json.load(f)

    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        mcp_config=mcp_config,
    ) as agent:
        response = await agent.run("Search for recent news about AI")
        print(response)

asyncio.run(main())
```
## Configuration

### Environment Variables

For AWS Bedrock (the default provider):

- `AWS_PROFILE`: AWS profile name (e.g., `AWS_PROFILE=codemobs`), or
- `AWS_ACCESS_KEY_ID`: AWS access key
- `AWS_SECRET_ACCESS_KEY`: AWS secret key
- `AWS_REGION`: AWS region (default: `us-east-1`)
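To illustrate how these variables interact: a named profile and static keys are alternatives, and the actual resolution order is implemented by boto3/the AWS SDK, not z007. A hedged sketch with a hypothetical helper:

```python
import os

def credential_source(env=os.environ):
    """Report which AWS credential source appears to be configured.
    Illustrative only - real resolution is done by boto3/the AWS SDK."""
    if env.get("AWS_PROFILE"):
        return f"named profile: {env['AWS_PROFILE']}"
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return "static access keys"
    return "default provider chain (instance role, SSO cache, etc.)"

print(credential_source({"AWS_PROFILE": "codemobs"}))
# named profile: codemobs
```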
### Supported Models

AWS Bedrock models with verified access:

- `openai.gpt-oss-20b-1:0` (default)
- Any AWS Bedrock model with tool support

Note: Model availability depends on your AWS account's Bedrock access permissions. Use `AWS_PROFILE=your-profile` to specify credentials.
## Interactive Commands

When running z007 in interactive mode:

- `/help` - Show help
- `/tools` - List available tools
- `/clear` - Clear conversation history
- `/exit` - Exit
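A slash-command loop like this is commonly implemented as a small dispatch function that intercepts commands before they reach the model. A minimal sketch of the pattern - not z007's actual code, and the replies shown are made up:

```python
def handle_command(line, history):
    """Dispatch a slash command; return a reply string, or None to exit.
    Illustrative sketch only - not z007's actual implementation."""
    cmd = line.strip().split()[0]
    if cmd == "/help":
        return "Commands: /help /tools /clear /exit"
    if cmd == "/tools":
        return "Available tools: calculator"
    if cmd == "/clear":
        history.clear()
        return "Conversation history cleared."
    if cmd == "/exit":
        return None
    return f"Unknown command: {cmd}"

history = ["user: hi", "assistant: hello"]
print(handle_command("/clear", history))  # Conversation history cleared.
print(history)                            # []
```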
## Requirements

- Python 3.9+
- LLM provider credentials (AWS for Bedrock)

## License

MIT License
## Download files
### File z007-0.2.1.tar.gz (source distribution)

- Size: 379.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.17

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d89562c747ed071eb7ae0c51c9201b165e9557d4ac5b93fdfa565ee755938a0c` |
| MD5 | `2d70884140fb9476f9a50d6ba955ec35` |
| BLAKE2b-256 | `60f14788cd10047d38b984497984586fd3af3dcb4a544f01f552bb275a05a56c` |
### File z007-0.2.1-py3-none-any.whl (built distribution)

- Size: 15.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.17

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e394793b1b0fec4bb94d2ba8b07eee2b0a9e5ce2e4eb523da5ca44d5d8cfc55e` |
| MD5 | `7d9c14fa822d44825e68acaa1b7e73f6` |
| BLAKE2b-256 | `13bef1d44e1597a046e71c5fde1fe654858676bc0fd5501abcf94c7ac5af7827` |