

Sensei

Intelligent documentation agent for AI coding assistants

Sensei addresses the context pollution problem: instead of each coding assistant pulling raw, possibly stale documentation into its context window, a specialized agent orchestrates multiple knowledge sources and returns curated, accurate documentation with code examples.

Features

  • Multi-Source Documentation: Searches Context7, nia.ai, deepcon.ai, Sourcegraph, and Tavily
  • Intelligent Synthesis: Uses Claude Sonnet via DSPy to synthesize information from multiple sources
  • FastAPI REST API: Query Sensei via HTTP endpoints
  • MCP Server: Expose Sensei as an MCP tool for AI coding agents
  • Feedback Loop: Rate responses to improve future results
  • Self-Hostable: Run locally with SQLite or deploy with PostgreSQL

Quick Start

Installation

  1. Clone the repository
git clone https://github.com/yourusername/sensei.git
cd sensei
  2. Install dependencies
# Make sure you have Python 3.13 installed
uv sync
  3. Configure environment
cp .env.example .env
# Edit .env and add your API keys
  4. Run Sensei
# Run both API and MCP servers (default)
uv run sensei

# Or run just the API server
uv run sensei api

# Or run just the MCP server
uv run sensei mcp

Configuration

Sensei requires several API keys to function. Copy .env.example to .env and configure:

# Required: Claude API for the reasoning engine
ANTHROPIC_API_KEY=sk-ant-...

# Documentation sources (at least one recommended)
CONTEXT7_API_KEY=...
NIA_API_KEY=...
DEEPCON_API_KEY=...
TAVILY_API_KEY=...

# Database (default: SQLite)
DATABASE_URL=sqlite+aiosqlite:///./sensei.db

# API Server
API_HOST=0.0.0.0
API_PORT=8000

Usage

REST API

Once the API server is running, you can query Sensei via HTTP:

# Query for documentation
curl -X POST http://localhost:8000/query \
  -H "Content-Type: application/json" \
  -d '{"query": "How do I authenticate with OAuth in FastAPI?"}'

# Health check
curl http://localhost:8000/health

# Rate a response
curl -X POST http://localhost:8000/rate \
  -H "Content-Type: application/json" \
  -d '{"query_id": "abc-123", "rating": 5, "feedback": "Very helpful!"}'

MCP Server

When running in MCP mode, Sensei exposes two tools:

  • sensei_query: Query for documentation and code examples
  • sensei_rate: Rate a previous response

AI coding agents can use these tools via the Model Context Protocol.
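
MCP is a JSON-RPC 2.0 protocol, so an agent invokes a tool by sending a `tools/call` request. A sketch of what that request might look like for `sensei_query` (the `"query"` argument name is an assumption, mirroring the REST API):

```python
import json

# JSON-RPC 2.0 request an MCP client would send to call the
# sensei_query tool; argument names are assumed, not confirmed.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "sensei_query",
        "arguments": {"query": "How do I authenticate with OAuth in FastAPI?"},
    },
}
print(json.dumps(request, indent=2))
```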

Docker Deployment

Using Docker

# Build the image
docker build -t sensei .

# Run API server only
docker run -p 8000:8000 --env-file .env sensei api

# Run with custom environment variables
docker run -p 8000:8000 \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e DATABASE_URL=sqlite+aiosqlite:///./sensei.db \
  sensei api

Using Docker Compose

# Start both API and MCP servers
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

Development

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=sensei --cov-report=html

# Run specific test file
uv run pytest tests/test_api.py -v

Project Structure

sensei/
├── sensei/              # Main package
│   ├── agent/           # DSPy agent and tools
│   ├── api/             # FastAPI server
│   ├── database/        # SQLAlchemy models and storage
│   ├── mcp/             # MCP server implementation
│   ├── config.py        # Configuration management
│   └── main.py          # Entry point
├── tests/               # Test suite
├── docs/                # Documentation
├── Dockerfile           # Docker image definition
├── docker-compose.yml   # Docker Compose configuration
└── pyproject.toml       # Project dependencies

Architecture

Sensei uses a multi-layer architecture:

  1. Agent Layer (DSPy): Orchestrates tool selection and response synthesis
  2. Tool Layer: Wraps external APIs (Context7, Sourcegraph, etc.)
  3. API Layer (FastAPI): Exposes HTTP endpoints
  4. MCP Layer: Provides MCP protocol interface
  5. Storage Layer (SQLAlchemy): Stores queries and ratings
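
The flow through these layers can be sketched in a few stub functions. All names below are hypothetical, not Sensei's actual API; the tool and synthesis steps are stubbed where the real system calls external sources and Claude via DSPy:

```python
from dataclasses import dataclass

@dataclass
class ToolResult:
    source: str
    snippet: str

def search_tools(query: str) -> list[ToolResult]:
    # Tool layer: fan the query out to documentation sources (stubbed).
    return [
        ToolResult("context7", f"docs for: {query}"),
        ToolResult("tavily", f"web results for: {query}"),
    ]

def synthesize(query: str, results: list[ToolResult]) -> str:
    # Agent layer: in Sensei this is Claude via DSPy; here a plain join.
    return "\n".join(f"[{r.source}] {r.snippet}" for r in results)

def handle_query(query: str) -> str:
    # The API and MCP layers both call into this path; the storage
    # layer would persist the query and its eventual rating.
    return synthesize(query, search_tools(query))
```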

Roadmap

  • Implement actual Context7 and Tavily MCP integrations
  • Add nia.ai and deepcon.ai API endpoints
  • Implement streaming responses
  • Add query caching
  • Support additional LLM models (OpenAI, Gemini)
  • Build feedback-based prompt optimization
  • Define open .codemap format

License

MIT

Contributing

Contributions are welcome! Please read the contributing guidelines and open a pull request.

Project details


Download files

Download the file for your platform.

Source Distribution

sensei_ai-0.1.0.tar.gz (51.2 kB)

Uploaded Source

Built Distribution


sensei_ai-0.1.0-py3-none-any.whl (48.1 kB)

Uploaded Python 3

File details

Details for the file sensei_ai-0.1.0.tar.gz.

File metadata

  • Download URL: sensei_ai-0.1.0.tar.gz
  • Size: 51.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11 on macOS

File hashes

Hashes for sensei_ai-0.1.0.tar.gz:

  • SHA256: 68931b081e5b063e4a6fca09ef3a8edda17cd55ab10f1a68ef162b44a6cd1c03
  • MD5: 714f0844ad40995bc0302002f5fdcb79
  • BLAKE2b-256: 56c7d2b34df75d712488d4eca463de3ed6fbf4a97204114387dbb5ea117365c7


File details

Details for the file sensei_ai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: sensei_ai-0.1.0-py3-none-any.whl
  • Size: 48.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11 on macOS

File hashes

Hashes for sensei_ai-0.1.0-py3-none-any.whl:

  • SHA256: c9d290cb3f0d6c5bc542b7759cbbb732512d70362894903c5377e7e7cb6a7135
  • MD5: dce7e1c100a5ddaa4baeefea26e7295b
  • BLAKE2b-256: 924e0a2c5bb2ab3581cea641495e34bad2b4f925dca43da7a3b7516c1a78f422

