Sensei

Intelligent documentation agent for AI coding assistants

Sensei tackles the context pollution problem: instead of an AI coding assistant flooding its context window with raw, unfiltered documentation, a specialized agent orchestrates multiple knowledge sources and returns curated, accurate documentation with code examples.

Features

  • Multi-Source Documentation: Searches Context7, nia.ai, deepcon.ai, Sourcegraph, and Tavily
  • Intelligent Synthesis: Uses Claude Sonnet via DSPy to synthesize information from multiple sources
  • FastAPI REST API: Query Sensei via HTTP endpoints
  • MCP Server: Expose Sensei as an MCP tool for AI coding agents
  • Feedback Loop: Rate responses to improve future results
  • Self-Hostable: Run locally with SQLite or deploy with PostgreSQL

Quick Start

Installation

  1. Clone the repository
git clone https://github.com/yourusername/sensei.git
cd sensei
  2. Install dependencies
# Make sure you have Python 3.13 installed
uv sync
  3. Configure environment
cp .env.example .env
# Edit .env and add your API keys
  4. Run Sensei
# Run both API and MCP servers (default)
uv run sensei

# Or run just the API server
uv run sensei api

# Or run just the MCP server
uv run sensei mcp

Configuration

Sensei requires several API keys to function. Copy .env.example to .env and configure:

# Required: Claude API for the reasoning engine
ANTHROPIC_API_KEY=sk-ant-...

# Documentation sources (at least one recommended)
CONTEXT7_API_KEY=...
NIA_API_KEY=...
DEEPCON_API_KEY=...
TAVILY_API_KEY=...

# Database (default: SQLite)
DATABASE_URL=sqlite+aiosqlite:///./sensei.db

# API Server
API_HOST=0.0.0.0
API_PORT=8000
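Before starting the servers, a short script along these lines can sanity-check that the environment is usable. This is an illustrative sketch only: `check_config` is not part of Sensei, and the key names simply mirror the `.env` variables above.

```python
# Illustrative sanity check for a Sensei-style .env setup (not part of Sensei itself).
REQUIRED = ["ANTHROPIC_API_KEY"]
OPTIONAL_SOURCES = ["CONTEXT7_API_KEY", "NIA_API_KEY", "DEEPCON_API_KEY", "TAVILY_API_KEY"]

def check_config(env: dict) -> list[str]:
    """Return a list of human-readable problems with the given environment mapping."""
    problems = [f"missing required key: {key}" for key in REQUIRED if not env.get(key)]
    if not any(env.get(key) for key in OPTIONAL_SOURCES):
        problems.append("no documentation-source key set (at least one is recommended)")
    return problems
```

Pass it `dict(os.environ)` to check the live environment; an empty list means the required Claude key is present and at least one documentation source is configured.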

Usage

REST API

Once the API server is running, you can query Sensei via HTTP:

# Query for documentation
curl -X POST http://localhost:8000/query \
  -H "Content-Type: application/json" \
  -d '{"query": "How do I authenticate with OAuth in FastAPI?"}'

# Health check
curl http://localhost:8000/health

# Rate a response
curl -X POST http://localhost:8000/rate \
  -H "Content-Type: application/json" \
  -d '{"query_id": "abc-123", "rating": 5, "feedback": "Very helpful!"}'

MCP Server

When running in MCP mode, Sensei exposes two tools:

  • sensei_query: Query for documentation and code examples
  • sensei_rate: Rate a previous response

AI coding agents can use these tools via the Model Context Protocol.
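For clients that read an MCP server configuration file (for example, Claude Desktop's `claude_desktop_config.json`), an entry along these lines would launch Sensei in MCP mode. The exact file location and schema depend on the client, so check its documentation:

```json
{
  "mcpServers": {
    "sensei": {
      "command": "uv",
      "args": ["run", "sensei", "mcp"]
    }
  }
}
```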

Docker Deployment

Using Docker

# Build the image
docker build -t sensei .

# Run API server only
docker run -p 8000:8000 --env-file .env sensei api

# Run with custom environment variables
docker run -p 8000:8000 \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e DATABASE_URL=sqlite+aiosqlite:///./sensei.db \
  sensei api

Using Docker Compose

# Start both API and MCP servers
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

Development

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=sensei --cov-report=html

# Run specific test file
uv run pytest tests/test_api.py -v

Project Structure

sensei/
├── sensei/              # Main package
│   ├── agent/           # DSPy agent and tools
│   ├── api/             # FastAPI server
│   ├── database/        # SQLAlchemy models and storage
│   ├── mcp/             # MCP server implementation
│   ├── config.py        # Configuration management
│   └── main.py          # Entry point
├── tests/               # Test suite
├── docs/                # Documentation
├── Dockerfile           # Docker image definition
├── docker-compose.yml   # Docker Compose configuration
└── pyproject.toml       # Project dependencies

Architecture

Sensei uses a multi-layer architecture:

  1. Agent Layer (DSPy): Orchestrates tool selection and response synthesis
  2. Tool Layer: Wraps external APIs (Context7, Sourcegraph, etc.)
  3. API Layer (FastAPI): Exposes HTTP endpoints
  4. MCP Layer: Provides MCP protocol interface
  5. Storage Layer (SQLAlchemy): Stores queries and ratings
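A stripped-down sketch of how the agent and tool layers fit together is shown below. All names here are illustrative, not Sensei's actual classes; in the real system the synthesis step is performed by Claude Sonnet via DSPy, and the API, MCP, and storage layers sit around this core.

```python
from dataclasses import dataclass

@dataclass
class ToolResult:
    source: str   # which documentation backend answered
    content: str  # raw text returned by that backend

def tool_layer(query: str) -> list[ToolResult]:
    # Tool Layer: each wrapper calls one external API (Context7, Sourcegraph, ...).
    # Stubbed here with a canned result so the flow is visible.
    return [ToolResult(source="context7", content=f"docs for: {query}")]

def agent_layer(query: str) -> str:
    # Agent Layer: decide which tools to call, then synthesize their results
    # into a single curated answer (an LLM does this in the real system).
    results = tool_layer(query)
    return "\n\n".join(f"[{r.source}] {r.content}" for r in results)
```

The API and MCP layers are thin transports that forward queries to the agent layer, while the storage layer records each query and its eventual rating.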

Roadmap

  • Implement actual Context7 and Tavily MCP integrations
  • Add nia.ai and deepcon.ai API endpoints
  • Implement streaming responses
  • Add query caching
  • Support additional LLM models (OpenAI, Gemini)
  • Build feedback-based prompt optimization
  • Define open .codemap format

License

MIT

Contributing

Contributions welcome! Please read our contributing guidelines and submit pull requests to our repository.
