
🚀 Locopilot


Locopilot is an open-source, local-first, agentic coding assistant built for developers. It leverages local LLMs (via Ollama) and advanced memory management (via LangGraph) to plan, edit, and automate work on codebases, all inside an interactive shell.

  • Private: All code and prompts stay on your machine.
  • Agentic: Locopilot plans, edits, iterates, and manages your coding tasks.
  • Interactive: Drop into a shell, enter tasks or slash commands, and steer the agent in real time.
  • Memory-Efficient: Advanced memory compression via LangGraph for "infinite" context.
  • Extensible: Change models, modes, and add custom tools or plugins on the fly.


✨ Features

  • Local LLM Backend: Bring your own Ollama server and code with any open-source LLM.
  • LangGraph Agent Workflow: Plans, executes, edits, and compresses memory as a stateful, extensible graph.
  • Interactive Shell/REPL: After init, drop into a chat-like agent terminal; just type coding tasks or slash commands.
  • Slash Command Support: /model, /change-mode, /concise, /clear, /new, /end, /help, and more.
  • Smart Memory Compression: Automatically summarizes previous context using the LLM itself, supporting ultra-long sessions.
  • Configurable: Models, modes, and summarization thresholds are all runtime-editable.
  • Pluggable Nodes: Add file tools, planning modules, git ops, and vector-based retrieval easily.
  • (Planned) Git Integration: Auto-commit, rollback, and view code diffs per agent step.

โšก๏ธ How It Works

1. Initialization

Run locopilot init in your project root.

  • Locopilot checks Ollama, prompts for model, sets up .locopilot/config.yaml.
  • You're dropped into an interactive agent shell (REPL).
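
A generated config might look like the following sketch (all keys here are illustrative; check the `.locopilot/config.yaml` that `locopilot init` actually writes for the real schema):

```yaml
# .locopilot/config.yaml -- illustrative example; actual keys may differ
model: codellama:latest
backend: ollama
mode: do
summarization:
  threshold_tokens: 4000   # compress context once it grows past this
```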

2. Agentic Workflow (via LangGraph)

Each user input is parsed:

  • Slash command (/model, etc.) → runs as a dedicated graph branch.
  • Normal prompt (task) → planned, edited, and summarized via the workflow graph:
    User Task → [Planning Node] → [File Edit Node] → [Memory Summarizer Node] → (Repeat)

  • Memory is managed with a LangGraph memory node that summarizes, chunks, and compresses context as needed.
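
The loop above can be sketched in plain Python (a toy stand-in for the LangGraph graph; the node and field names are hypothetical, and real nodes would call the LLM):

```python
from dataclasses import dataclass, field

@dataclass
class SessionState:
    task: str
    plan: list = field(default_factory=list)
    edits: list = field(default_factory=list)
    memory: list = field(default_factory=list)

def plan_node(state):
    # A real planning node would prompt the LLM for a step list.
    state.plan = [f"step: {state.task}"]
    return state

def edit_node(state):
    # A real edit node would apply LLM-generated changes to files.
    state.edits = [f"edited per {p}" for p in state.plan]
    return state

def summarize_node(state):
    # Compress the whole turn into one memory entry instead of keeping everything.
    state.memory.append(f"summary of '{state.task}' with {len(state.edits)} edit(s)")
    return state

def run_turn(state):
    # User Task -> [Planning Node] -> [File Edit Node] -> [Memory Summarizer Node]
    for node in (plan_node, edit_node, summarize_node):
        state = node(state)
    return state

state = run_turn(SessionState(task="Add OAuth login"))
print(state.memory)  # one compressed entry per completed turn
```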

3. Session Management

  • Change models, modes, or reset memory on the fly with slash commands.
  • All state (memory, model, mode) persists during the session.

๐Ÿ—๏ธ Architecture

Key components:

  • CLI Layer: Typer-based CLI, launches shell (REPL), parses slash commands.
  • LangGraph Workflow:
    • Nodes: Planning, file edit, summarization, slash command handler, etc.
    • Edges: Control session flow, branching between commands and prompts.
  • LLM Backend:
    • Ollama: For running CodeLlama, DeepSeek, etc.
  • Memory Layer:
    • LangChain/LangGraph memory objects (buffer, summary, vector, hybrid).
    • Summarizes old context using the LLM to avoid hitting token/window limits.
  • Config/Project Layer:
    • .locopilot/config.yaml stores model/backend/session preferences.

Stateful Graph Example:

               [User Input]
                      |
      +---------------+---------------+
      |                               |
 [Slash Command]              [Prompt/Task]
      |                               |
[Command Handler]   [Plan]->[Edit]->[Summarize]->[Memory]
      |                               |
     END                             Loop

🛠 Getting Started

Requirements

  • Python 3.8+
  • Ollama running locally
  • pip

Install Locopilot

Option 1: Install from PyPI (Recommended)

pip install locopilot

Option 2: Install from Source

git clone https://github.com/Ripan-Roy/locopilot-ai.git
cd locopilot-ai
pip install -e .

Start Your Local LLM

Ollama:

ollama serve
ollama pull codellama:latest
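
Before running `locopilot init`, you can sanity-check that the server is reachable; this standalone sketch queries Ollama's standard `/api/tags` endpoint (Locopilot's own check lives in `locopilot/llm/connection.py`; this helper is just an illustration):

```python
import json
import urllib.error
import urllib.request

def installed_models(base_url="http://localhost:11434"):
    """Return the list of locally installed model names, or None if Ollama is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = installed_models()
if models is None:
    print("Ollama not reachable -- start it with `ollama serve`")
elif "codellama:latest" not in models:
    print("Model missing -- run `ollama pull codellama:latest`")
```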

Initialize and Enter the Agent Shell

locopilot init

This checks LLM backend, prompts for config, scans for project context, and launches the interactive shell.

๐Ÿ–ฅ๏ธ Usage: Interactive Shell & Commands

After init, Locopilot enters a shell where you can type prompts and commands:

Example Session

$ locopilot init
[✓] Ollama running. Model: codellama:latest
[✓] Project context initialized.

Locopilot Shell (mode: do):
> Add OAuth login to my Django app
[PLANNING] ...
[EDITING] ...
[MEMORY] ...

> /model
Current model: codellama:latest
Enter new model: deepseek-coder:latest
[✓] Model switched to deepseek-coder:latest

> /change-mode
Current mode: do
Available modes: do, refactor, explain, chat
Enter new mode: refactor
[✓] Mode set to refactor.

> Refactor the payment logic for clarity
...

> /concise
[✓] Context summarized and compressed.

> /clear
[✓] Session memory cleared.

> /new
[✓] New session started.

> /end
[✓] Session ended. Bye!

Supported Slash Commands

Command       Purpose
/model        Change the LLM model/backend for the current session
/change-mode  Switch between do, refactor, explain, and chat modes
/clear        Clear all current context/memory
/new          Start a new session/project
/end          End the agent shell and exit
/concise      Force summarization/compression of the current context
/help         Show help and the command list

Anything not starting with / is treated as a task in the current mode!
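
Routing boils down to a split on the leading `/`; here is a hypothetical sketch of that dispatch (the real handling happens in the LangGraph slash-command branch):

```python
COMMANDS = {"/model", "/change-mode", "/clear", "/new", "/end", "/concise", "/help"}

def route(line):
    """Classify a shell line as a slash command, an unknown command, or a task."""
    line = line.strip()
    if line.startswith("/"):
        cmd, _, arg = line.partition(" ")
        if cmd in COMMANDS:
            return ("command", cmd, arg)
        return ("error", cmd, "unknown command -- try /help")
    return ("task", line, None)

print(route("/model deepseek-coder:latest"))  # ('command', '/model', 'deepseek-coder:latest')
print(route("Add OAuth login"))               # ('task', 'Add OAuth login', None)
```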

๐Ÿ—‚๏ธ Project Structure

locopilot-backend/
├── locopilot/                  # Main package directory
│   ├── __init__.py
│   ├── core/                   # Core functionality
│   │   ├── __init__.py
│   │   ├── agent.py            # LangGraph workflow and nodes
│   │   ├── memory.py           # Session/context memory management
│   │   └── executor.py         # Plan execution engine
│   ├── llm/                    # LLM backend handling
│   │   ├── __init__.py
│   │   ├── connection.py       # Ollama connection helpers
│   │   └── backends/           # Backend-specific implementations
│   ├── cli/                    # CLI components
│   │   ├── __init__.py
│   │   ├── app.py              # CLI entrypoint, shell/REPL logic
│   │   └── commands/           # CLI command implementations
│   └── utils/                  # Utility functions
│       ├── __init__.py
│       └── file_ops.py         # File operations, config helpers
├── tests/                      # Test suite
│   ├── conftest.py
│   ├── test_agent.py
│   ├── test_basic.py
│   ├── test_connection.py
│   └── test_plan_executor.py
├── scripts/                    # Setup and utility scripts
│   └── setup.sh
├── docs/                       # Documentation
├── assets/                     # Static assets
│   └── locopilot-demo.png
├── pyproject.toml              # Package configuration
├── requirements.txt            # Dependencies
├── README.md
└── LICENSE

🧠 Memory Management (with LangGraph)

  • ConversationBufferMemory or ConversationSummaryBufferMemory is attached to the agent graph.
  • As session context grows, old steps are summarized using the LLM and replaced in memory.
  • This ensures Locopilot "remembers" key tasks, design decisions, and context for long sessions.
  • Slash command /concise lets you summarize on demand.
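
The compression step reduces to: once the buffer outgrows a threshold, fold the older turns into a single LLM-written summary and keep the recent ones verbatim. A minimal sketch, with a stub standing in for the LLM call and illustrative threshold values:

```python
def compress(history, summarize, keep_recent=4, max_len=8):
    """Fold older turns into one summary entry once history grows past max_len."""
    if len(history) <= max_len:
        return history
    old, recent = history[:-keep_recent], history[-keep_recent:]
    return [f"[summary] {summarize(old)}"] + recent

# Stub summarizer: a real implementation would prompt the LLM to condense `old`.
stub = lambda msgs: f"{len(msgs)} earlier turns condensed"

history = [f"turn {i}" for i in range(10)]
history = compress(history, stub)
print(history)  # the six oldest turns collapse into one summary entry
```

Calling `compress` again on the already-compressed history is a no-op until it grows past the threshold again, which is what keeps long sessions bounded.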

โšก๏ธ Extensibility & Roadmap

  • Editor Plugins: VSCode, Vim, JetBrains, etc.
  • Project-Aware RAG: Integrate vector DBs (Chroma, Qdrant) for smart codebase retrieval.
  • (Planned) Git Integration: Auto-commit, diff, and rollback per step.
  • Save/Load Sessions: /save, /load, /history commands.
  • Custom Plugins/Nodes: Add your own LangGraph nodes for tools or workflows.
  • Web/GUI Frontends: Same agent core, different interface.

๐Ÿค Contributing

  • Fork and PRs are welcome!
  • Open issues for bugs or feature requests.
  • For major features (graph nodes, memory backends), see CONTRIBUTING.md (coming soon).

๐Ÿ“ License

MIT License. Use, fork, and extend as you wish!

💡 Inspiration

Locopilot is inspired by Copilot, Claude Code, Dev-GPT, OpenDevin, and the emerging open-source agentic ecosystem, aiming to empower developers with private, supercharged, customizable AI tools.

🚦 Quickstart

# Install from PyPI
pip install locopilot

# Initialize in your project
locopilot init

# ... then just type your coding tasks and manage the session with slash commands!



Download files

Download the file for your platform.

Source Distribution

locopilot-0.1.10.tar.gz (29.4 kB)

Built Distribution

locopilot-0.1.10-py3-none-any.whl (25.5 kB)

File details

Details for the file locopilot-0.1.10.tar.gz.

File metadata

  • Download URL: locopilot-0.1.10.tar.gz
  • Size: 29.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.13

File hashes

Algorithm     Hash digest
SHA256        7934f2afd0ec6a1b9a9cdc1d0577c2ed0f294a73bba668d0bf58f6c10059de44
MD5           59fcb52e2db0da659994ef307a57471d
BLAKE2b-256   189afe4b41e34ff082e62b445a24d9870edf3b6b86e4d917af743043a800495a

File details

Details for the file locopilot-0.1.10-py3-none-any.whl.

File metadata

  • Download URL: locopilot-0.1.10-py3-none-any.whl
  • Size: 25.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.13

File hashes

Algorithm     Hash digest
SHA256        f3b1fbcd7609cf159bbe72c93d6e92c77c34b0d26dc40317c3c4f55fe54ca68d
MD5           bf34e6ea2504c63e3e956b3265d7f4ed
BLAKE2b-256   34393e5b8150ade2262e9f4463fe8bbc99b0200bbcc633e9548f04a5616f104b
