Chat with local Ollama models that can explore your codebase

Local Brain — Claude Code Plugin Marketplace

A Claude Code plugin marketplace that extends Claude with local capabilities. The first skill lets Claude delegate codebase exploration to local Ollama models.

Install Marketplace

Add this marketplace to Claude Code:

/plugin marketplace add IsmaelMartinez/local-brain

Then install the plugin:

/plugin install local-brain@local-brain-marketplace

Available Plugins

local-brain

Delegate codebase exploration to local Ollama models. Claude offloads read-only tasks to your machine—no cloud round-trips, full privacy.

┌─────────────┐     delegates      ┌─────────────┐     calls      ┌─────────┐
│ Claude Code │ ──────────────────►│ Local Brain │ ──────────────►│ Ollama  │
│   (Cloud)   │                    │ (Smolagents)│                │ (Local) │
│             │◄────────────────── │             │◄────────────── │         │
└─────────────┘     returns        └─────────────┘    responds    └─────────┘
                    results                        with code execution

What Claude can delegate:

  • "Review the code changes"
  • "Explain how the auth module works"
  • "Generate a commit message"
  • "Find all TODO comments"

Marketplace Structure

This repo follows the Claude Code plugin structure:

local-brain/                          # MARKETPLACE ROOT
├── .claude-plugin/
│   └── marketplace.json              # Marketplace manifest
└── local-brain/                      # PLUGIN
    ├── plugin.json                   # Plugin manifest
    └── skills/
        └── local-brain/
            └── SKILL.md              # Skill documentation

local-brain Plugin Details

Prerequisites

  1. Install the CLI:

     uv pip install local-brain

     Or with pipx:

     pipx install local-brain

  2. Install Ollama from ollama.ai and pull a model:

     ollama pull qwen3

CLI Usage

local-brain "What files changed recently?"
local-brain "Review the code in src/"
local-brain "Generate a commit message"
local-brain "Explain how auth works"
local-brain "prompt"                       # Ask anything (auto-selects best model)
local-brain -v "prompt"                    # Verbose (show tool calls)
local-brain -m qwen2.5-coder:7b "prompt"   # Specific model
local-brain --list-models                  # Show available models
local-brain --root /path/to/project "prompt"  # Set project root

Model Discovery

Local Brain automatically detects installed Ollama models and picks the best one:

local-brain --list-models

Recommended models:

  • qwen3:latest — General purpose (default)
  • qwen2.5-coder:7b — Code-focused
  • llama3.2:3b — Fast, lightweight
  • mistral:7b — Balanced
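
As a rough illustration of the auto-selection idea, preference-ordered model picking could be sketched like this (a hypothetical simplification; the real discovery logic lives in local_brain/models.py and queries Ollama directly):

```python
# Hypothetical preference order, based on the recommended models above.
PREFERRED = ["qwen3", "qwen2.5-coder", "llama3.2", "mistral"]

def pick_model(installed: list[str]) -> str:
    """Return the first installed model matching the preference order."""
    for family in PREFERRED:
        for model in installed:
            # Ollama tags look like "qwen3:latest" or "qwen2.5-coder:7b";
            # compare only the family name before the colon.
            if model.split(":")[0] == family:
                return model
    # Fall back to whatever is installed first.
    if installed:
        return installed[0]
    raise RuntimeError("No Ollama models installed; run `ollama pull qwen3`")

print(pick_model(["mistral:7b", "qwen2.5-coder:7b"]))  # qwen2.5-coder:7b
```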

Tools

Custom read-only tools registered with Smolagents' @tool decorator:

Tool                What it does
read_file           Read file contents (path-jailed)
list_directory      List files with glob patterns (path-jailed)
file_info           Get file metadata (path-jailed)
git_diff            Show git changes (staged or unstaged)
git_status          Show current branch and changes
git_log             View recent commit history
git_changed_files   List modified/staged files
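
The path-jailing behind the file tools can be sketched as follows. This is a simplified stand-in, not the shipped implementation: the real tools are registered with Smolagents' @tool decorator and also block sensitive files such as .env.

```python
from pathlib import Path

def read_file(path: str, root: str = ".") -> str:
    """Read a file, refusing any path that escapes the project root.

    Simplified sketch of a path-jailed tool; the real version in
    local_brain also filters sensitive files (.env, keys).
    """
    root_dir = Path(root).resolve()
    target = (root_dir / path).resolve()
    # resolve() collapses ".." segments, so a traversal attempt like
    # "../../etc/passwd" lands outside root_dir and is rejected here.
    if not target.is_relative_to(root_dir):
        raise PermissionError(f"{path} is outside the project root")
    return target.read_text()
```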

Architecture

Local Brain uses Smolagents as the agent framework:

local_brain/
├── __init__.py      # Version
├── cli.py           # Click CLI entry point
├── models.py        # Ollama model discovery & selection
├── security.py      # Path jailing utilities
└── smolagent.py     # CodeAgent + custom tools

What comes from Smolagents:

  • CodeAgent — Agent that executes tasks via code generation
  • LiteLLMModel — Connects to Ollama via LiteLLM
  • @tool decorator — Registers our custom tools with the agent

What we implement:

  • All 7 tools (read_file, git_diff, etc.) — our code, registered via @tool
  • Path jailing security — restricts file access to project root
  • Model discovery — detects installed Ollama models

Security

Two-layer security model:

  1. Tool layer — Our pre-defined tools are trusted code:

    • ✅ Read files within project directory (path-jailed)
    • ✅ Execute git commands (read-only via subprocess)
    • ❌ File I/O outside project root blocked
    • ❌ Sensitive files blocked (.env, keys)
  2. LLM sandbox — Code generated by the LLM runs in LocalPythonExecutor:

    • ❌ Cannot import subprocess, socket, os.system, etc.
    • ❌ Cannot access network directly
    • ✅ Can only call our pre-defined tools

The LLM writes Python code that calls our tools—it cannot bypass them to run arbitrary shell commands.
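
LocalPythonExecutor's actual enforcement is richer than any single check, but the core idea of blocking dangerous imports in LLM-generated code can be illustrated with an AST walk (an assumption-laden sketch, not Smolagents' implementation):

```python
import ast

# Illustrative deny-list; the real executor's policy is broader.
FORBIDDEN = {"subprocess", "socket", "os", "sys"}

def check_imports(code: str) -> None:
    """Reject generated code that imports a forbidden module.

    Rough illustration only: it inspects the code's AST for import
    statements before execution and raises if any hit the deny-list.
    """
    for node in ast.walk(ast.parse(code)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [node.module or ""]
        else:
            continue
        for name in names:
            if name.split(".")[0] in FORBIDDEN:
                raise ImportError(f"import of {name!r} is blocked")

check_imports("print(read_file('src/main.py'))")  # calling a tool is allowed
```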

Why no web access? Claude Code already has web access—delegate web research to Claude, local codebase work to Local Brain. This separation prevents data exfiltration and prompt injection from fetched content.

Future Ideas

  • MCP Bridge — Ollama ↔ Model Context Protocol bridge when MCP adoption increases
  • Docker Sandbox — Stronger isolation via container when Docker is available
  • CLI Wrappers — Wrap existing tools (ripgrep, gh, tokei) instead of custom implementations
  • Observability — Add tracing and logging for debugging agent behavior

Architecture Decisions

See docs/adrs/ for Architecture Decision Records:

  • ADR-001 — Why custom implementation over frameworks
  • ADR-002 — Why Smolagents for code execution
  • ADR-003 — Why no web tools

Adding New Plugins

Want to add a plugin to this marketplace?

  1. Create a new directory at the root:

     your-plugin/
     ├── plugin.json
     └── skills/
         └── your-skill/
             └── SKILL.md

  2. Register it in .claude-plugin/marketplace.json:
{
  "plugins": [
    { "name": "local-brain", "source": "./local-brain", "description": "..." },
    { "name": "your-plugin", "source": "./your-plugin", "description": "..." }
  ]
}
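
The plugin.json manifest from step 1 might look like this minimal sketch (field names assumed from common Claude Code plugin conventions; consult the plugin docs below for the authoritative schema):

```json
{
  "name": "your-plugin",
  "description": "What your plugin does",
  "version": "0.1.0"
}
```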

See the Claude Code plugin docs for full specifications.


Development

git clone https://github.com/IsmaelMartinez/local-brain.git
cd local-brain
uv sync
uv run local-brain "Hello!"

Note: Requires Python 3.10-3.13 (Python 3.14 is not yet supported because grpcio fails to build on it).

Run Tests

uv run pytest tests/ -v

Test Plugin Locally

# In Claude Code
/plugin marketplace add ./path/to/local-brain
/plugin install local-brain@local-brain-marketplace

License

MIT
