AI agent framework with Chief and Chen, a conversational AI psychologist with web search and multi-provider support in your terminal.
Chief & Chen AI Agents
A sophisticated AI agent framework built with Pydantic AI, featuring two main applications: Chief and Chen. Both agents provide conversational AI interfaces with web search capabilities and configurable model providers.
Chen is written as an AI psychologist with an extensive software engineering background. Think of her as a Wendy Rhoades who is always on your side.
Chief, on the other hand, is a bare-bones agent with a minimal system prompt. It serves as an entry point for writing your own agents using the same patterns Chen uses.
Features
- Dual Agent System: Chief and Chen applications with distinct personalities and capabilities
- Multi-Provider Support: Anthropic, OpenAI, and OpenRouter integration
- Web Search Integration: Powered by Tavily API for real-time information
- Smart Configuration: Interactive onboarding with JSON-based settings management
- MongoDB Persistence: Async document storage with Beanie ODM (not implemented yet)
- Rich CLI Interface: Beautiful terminal UI with Typer and Rich
- MCP Support: Coming soon
Quick Start
Installation
# Clone and setup
git clone git@github.com:tistaharahap/chief-ai.git
cd chief-ai
# Install dependencies (creates .venv)
rye sync
# Activate the virtual environment
source .venv/bin/activate
First Run (Chen)
# Launch Chen with automatic onboarding
rye run chen
Chen will guide you through interactive setup to configure your API keys and preferences.
First Run (Chief)
# Launch Chief
rye run chief
LLM Providers & Models Priority
The provider priority is defined in src/libagentic/providers.py, with Anthropic's Claude 4 Sonnet as the first choice.
If anthropic_api_key is set in ~/.chen/settings.json, Claude 4 Sonnet is used first. When all providers are configured, the fallback order is:
1. `claude-sonnet-4-20250514` via Anthropic
2. `gpt-5` via OpenAI
3. `deepseek/deepseek-chat-v3.1:free` via OpenRouter
To use free models, simply set the OpenRouter API key and leave the others unset.
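The priority order above can be sketched as a simple selection over whichever keys are present in ~/.chen/settings.json. This is a minimal illustration only; the actual logic lives in src/libagentic/providers.py, and the function name here is hypothetical:

```python
# Hypothetical sketch of provider fallback selection; the real
# implementation lives in src/libagentic/providers.py.

PRIORITY = [
    ("anthropic_api_key", "claude-sonnet-4-20250514"),
    ("openai_api_key", "gpt-5"),
    ("openrouter_api_key", "deepseek/deepseek-chat-v3.1:free"),
]

def model_fallback_order(settings: dict) -> list[str]:
    """Return model names for every provider whose key is set, in priority order."""
    return [model for key, model in PRIORITY if settings.get(key)]

# Only OpenRouter configured -> only the free model is available.
print(model_fallback_order({"openrouter_api_key": "sk-or-..."}))
# All providers configured -> Claude first, then GPT-5, then DeepSeek.
print(model_fallback_order({
    "anthropic_api_key": "sk-ant-...",
    "openai_api_key": "sk-...",
    "openrouter_api_key": "sk-or-...",
}))
```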
Configuration
Chen Configuration System
Chen uses a JSON-based configuration system with interactive onboarding:
# View current settings
rye run chen config
# Set individual values
rye run chen config set anthropic_api_key "sk-ant-..."
rye run chen config set context_window 150000
# Manual onboarding
rye run chen onboard
# Reset all settings
rye run chen reset
Settings are stored in ~/.chen/settings.json with support for:
- Anthropic, OpenAI, OpenRouter, and Tavily API keys
- Configurable context window size
- Automatic validation and type conversion
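A minimal sketch of how such a JSON settings store with type coercion might look. The schema below is an assumption based on the keys mentioned in this README, not the actual implementation:

```python
import json
from pathlib import Path

# Assumed types per settings key, used to coerce `chen config set` string input.
SCHEMA = {
    "anthropic_api_key": str,
    "openai_api_key": str,
    "openrouter_api_key": str,
    "tavily_api_key": str,
    "context_window": int,
}

def set_value(path: Path, key: str, raw: str) -> dict:
    """Load settings.json, coerce the value to its schema type, and save."""
    if key not in SCHEMA:
        raise KeyError(f"unknown setting: {key}")
    settings = json.loads(path.read_text()) if path.exists() else {}
    settings[key] = SCHEMA[key](raw)  # e.g. "150000" -> 150000
    path.write_text(json.dumps(settings, indent=2))
    return settings

settings_path = Path.home() / ".chen" / "settings.json"  # as described above
```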
Environment Variables (Optional)
Environment variables serve as defaults during onboarding:
export ANTHROPIC_API_KEY="your-key-here"
export OPENAI_API_KEY="your-key-here"
export OPENROUTER_API_KEY="your-key-here"
export TAVILY_API_KEY="your-key-here"
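During onboarding, these variables can be picked up as prefilled prompt defaults, along the lines of this hypothetical helper (not the actual implementation):

```python
import os

def onboarding_default(key: str) -> str:
    """Use the matching environment variable, if set, as the prompt default."""
    return os.environ.get(key.upper(), "")

# If ANTHROPIC_API_KEY is exported, the onboarding prompt for
# anthropic_api_key starts out prefilled with its value.
default = onboarding_default("anthropic_api_key")
```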
Architecture
Core Structure
src/
├── appclis/ # CLI applications
│ ├── chen.py # Chen agent with config system
│ ├── chief.py # Chief agent
│ └── settings/ # Configuration management
├── libagentic/ # Core agent framework
│ ├── agents.py # Agent factory functions
│ ├── providers.py # Model provider configs
│ ├── prompts.py # System prompts
│ └── tools/ # Agent tools and capabilities
└── libshared/ # Shared utilities
└── mongo.py # MongoDB base classes
Technology Stack
- Pydantic AI: Core agent framework with Logfire integration
- Typer + Rich: Beautiful CLI interfaces with command groups
- Pydantic Settings: Type-safe configuration management
- Beanie: Async MongoDB ODM for persistence
- Tavily: Web search API integration
- OpenRouter: Multi-provider LLM access
Development
Code Quality
# Lint and format
ruff check . --fix
ruff format .
# Final lint check
ruff check .
Testing
# Run tests (when implemented)
pytest
# With coverage
pytest --cov=src
Model Providers
- Default: `deepseek/deepseek-chat-v3.1:free` via OpenRouter, `gpt-5` via OpenAI, and `claude-sonnet-4-20250514` via Anthropic
- Supported: All Anthropic, OpenAI, and OpenRouter models
- Configuration: Interactive setup or individual config commands
- Fallbacks: Automatic provider selection based on availability
License
Contributing
Contributions are welcome! Please open issues or pull requests for enhancements and bug fixes.
Project details
Download files
File details
Details for the file chief_ai-0.1.0.tar.gz.
File metadata
- Download URL: chief_ai-0.1.0.tar.gz
- Upload date:
- Size: 98.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `51d57bfba0bc238d7181ea69955094e6388f40240f643a973d71e6c2749c7899` |
| MD5 | `dfa406d5db35056866f8daa6eee6e608` |
| BLAKE2b-256 | `4f4e6800b4bac8a650229ed056dc029c5631922ae1cac892017fe3a1ef028928` |
File details
Details for the file chief_ai-0.1.0-py3-none-any.whl.
File metadata
- Download URL: chief_ai-0.1.0-py3-none-any.whl
- Upload date:
- Size: 40.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c5eee7c383a54f6b1a0363db3c9cd01992545fe1221160edfecba2a15e17d60b` |
| MD5 | `eb33aacf9f67ab8b4564b79b8c08181c` |
| BLAKE2b-256 | `ffe635c7a24115acfd1845ce16201a9f9d95ecf66433512fb2d721ac998f0fe2` |