
Sensei

Intelligent documentation agent for AI coding assistants

Sensei tackles the context pollution problem: it is a specialized agent that orchestrates multiple knowledge sources and returns curated, accurate documentation with code examples.

Features

  • Multi-Source Documentation: Searches Context7, nia.ai, deepcon.ai, Sourcegraph, and Tavily
  • Intelligent Synthesis: Uses Claude Sonnet via DSPy to synthesize information from multiple sources
  • FastAPI REST API: Query Sensei via HTTP endpoints
  • MCP Server: Expose Sensei as an MCP tool for AI coding agents
  • Feedback Loop: Rate responses to improve future results
  • Self-Hostable: Run locally with SQLite or deploy with PostgreSQL

Quick Start

Installation

  1. Clone the repository
git clone https://github.com/yourusername/sensei.git
cd sensei
  2. Install dependencies
# Make sure you have Python 3.13 installed
uv sync
  3. Configure environment
cp .env.example .env
# Edit .env and add your API keys
  4. Run Sensei
# Run both API and MCP servers (default)
uv run sensei

# Or run just the API server
uv run sensei api

# Or run just the MCP server
uv run sensei mcp

Configuration

Sensei requires several API keys to function. Copy .env.example to .env and configure:

# Required: Claude API for the reasoning engine
ANTHROPIC_API_KEY=sk-ant-...

# Documentation sources (at least one recommended)
CONTEXT7_API_KEY=...
NIA_API_KEY=...
DEEPCON_API_KEY=...
TAVILY_API_KEY=...

# Database (default: SQLite)
DATABASE_URL=sqlite+aiosqlite:///./sensei.db

# API Server
API_HOST=0.0.0.0
API_PORT=8000

Usage

REST API

Once the API server is running, you can query Sensei via HTTP:

# Query for documentation
curl -X POST http://localhost:8000/query \
  -H "Content-Type: application/json" \
  -d '{"query": "How do I authenticate with OAuth in FastAPI?"}'

# Health check
curl http://localhost:8000/health

# Rate a response
curl -X POST http://localhost:8000/rate \
  -H "Content-Type: application/json" \
  -d '{"query_id": "abc-123", "rating": 5, "feedback": "Very helpful!"}'
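The same calls can be made from Python. The sketch below uses only the standard library and assumes the request shape shown in the curl examples above; the response schema is not documented here, so the decoded JSON is returned as-is:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # matches API_HOST / API_PORT above

def build_payload(query: str) -> bytes:
    """JSON body for POST /query, as in the curl example above."""
    return json.dumps({"query": query}).encode()

def query_sensei(query: str, base_url: str = BASE_URL) -> dict:
    """POST a query to a running Sensei API server and decode the reply."""
    req = urllib.request.Request(
        f"{base_url}/query",
        data=build_payload(query),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```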

MCP Server

When running in MCP mode, Sensei exposes two tools:

  • sensei_query: Query for documentation and code examples
  • sensei_rate: Rate a previous response

AI coding agents can use these tools via the Model Context Protocol.
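Under the Model Context Protocol, a client invokes a tool with a JSON-RPC 2.0 `tools/call` request. The argument names below are illustrative assumptions; the authoritative parameter names come from the tool schemas Sensei advertises via `tools/list`:

```python
import json

def make_tool_call(call_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# "query" / "query_id" / "rating" are assumed argument names, not
# confirmed by Sensei's documentation.
request = make_tool_call(1, "sensei_query", {"query": "FastAPI OAuth example"})
```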

Docker Deployment

Using Docker

# Build the image
docker build -t sensei .

# Run API server only
docker run -p 8000:8000 --env-file .env sensei api

# Run with custom environment variables
docker run -p 8000:8000 \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e DATABASE_URL=sqlite+aiosqlite:///./sensei.db \
  sensei api

Using Docker Compose

# Start both API and MCP servers
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

Development

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=sensei --cov-report=html

# Run specific test file
uv run pytest tests/test_api.py -v

Project Structure

sensei/
├── sensei/              # Main package
│   ├── agent/           # DSPy agent and tools
│   ├── api/             # FastAPI server
│   ├── database/        # SQLAlchemy models and storage
│   ├── mcp/             # MCP server implementation
│   ├── config.py        # Configuration management
│   └── main.py          # Entry point
├── tests/               # Test suite
├── docs/                # Documentation
├── Dockerfile           # Docker image definition
├── docker-compose.yml   # Docker Compose configuration
└── pyproject.toml       # Project dependencies

Architecture

Sensei uses a multi-layer architecture:

  1. Agent Layer (DSPy): Orchestrates tool selection and response synthesis
  2. Tool Layer: Wraps external APIs (Context7, Sourcegraph, etc.)
  3. API Layer (FastAPI): Exposes HTTP endpoints
  4. MCP Layer: Provides MCP protocol interface
  5. Storage Layer (SQLAlchemy): Stores queries and ratings
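The layering above can be sketched in miniature. Everything here is hypothetical (the names are not Sensei's actual classes): a tool wraps one external API, and the agent orchestrates tools and combines their results — in the real system a DSPy agent would select tools and synthesize with Claude rather than concatenate:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    search: Callable[[str], str]  # Tool Layer: wraps one external API

class Agent:
    """Agent Layer: picks tools and synthesizes their results."""
    def __init__(self, tools: list[Tool]):
        self.tools = tools

    def answer(self, query: str) -> str:
        # A real DSPy agent selects tools and synthesizes with an LLM;
        # this sketch simply concatenates every tool's result.
        results = [t.search(query) for t in self.tools]
        return "\n".join(results)

# The API and MCP layers would call agent.answer(), and the Storage
# Layer would persist the query and any later rating.
agent = Agent([Tool("docs", lambda q: f"docs result for {q!r}")])
```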

Roadmap

  • Implement actual Context7 and Tavily MCP integrations
  • Add nia.ai and deepcon.ai API endpoints
  • Implement streaming responses
  • Add query caching
  • Support additional LLM models (OpenAI, Gemini)
  • Build feedback-based prompt optimization
  • Define open .codemap format

License

MIT

Contributing

Contributions are welcome! Please read the contributing guidelines and submit a pull request to the repository.

Download files

Download the file for your platform.

Source Distribution

sensei_ai-0.1.1.tar.gz (52.4 kB)


Built Distribution


sensei_ai-0.1.1-py3-none-any.whl (49.2 kB)


File details

Details for the file sensei_ai-0.1.1.tar.gz.

File metadata

  • Download URL: sensei_ai-0.1.1.tar.gz
  • Size: 52.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11

File hashes

Hashes for sensei_ai-0.1.1.tar.gz:

  • SHA256: 1d889aadea7cd71407fe48a2a1f3e57af44eb8f8b0e7cf02674b4b32af513fef
  • MD5: 2a484b1326f69d18032feb1085db8e31
  • BLAKE2b-256: d7412756bb273fd74383174d16bd94e9abef25351f338246947e37052d193bd2


File details

Details for the file sensei_ai-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: sensei_ai-0.1.1-py3-none-any.whl
  • Size: 49.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11

File hashes

Hashes for sensei_ai-0.1.1-py3-none-any.whl:

  • SHA256: e2da9e3020697dd02896960f614d3758947d43922101dd8e26ba1d286b950c87
  • MD5: c53cb0d1d91842964e0224685aaeafa7
  • BLAKE2b-256: 97653477b85c3519c46dafa8a457b39515a12231a800455188c0aef1aa4c346b

