# 🚀 Locopilot

*An open-source, local-first, agentic coding assistant.*
Locopilot is an open-source, local-first, agentic coding assistant built for developers. It leverages local LLMs (via Ollama) and advanced memory management using LangGraph to plan, edit, and automate codebases, all inside an interactive shell.
- Private: All code and prompts stay on your machine.
- Agentic: Locopilot plans, edits, iterates, and manages your coding tasks.
- Interactive: Drop into a shell, enter tasks or slash commands, and steer the agent in real time.
- Memory-Efficient: Advanced memory compression via LangGraph for "infinite" context.
- Extensible: Change models, modes, and add custom tools or plugins on the fly.
## Table of Contents
- Features
- How It Works
- Architecture
- Getting Started
- Usage: Interactive Shell & Commands
- Project Structure
- Extensibility & Roadmap
- Contributing
- License
## ✨ Features
- Local LLM Backend: Bring your own Ollama server and code with any open-source LLM.
- LangGraph Agent Workflow: Plans, executes, edits, and compresses memory as a stateful, extensible graph.
- Interactive Shell/REPL: After init, drop into a chat-like agent terminal and just type coding tasks or slash commands.
- Slash Command Support: `/model`, `/change-mode`, `/concise`, `/clear`, `/new`, `/end`, `/help`, and more.
- Smart Memory Compression: Automatically summarizes previous context using the LLM itself, supporting ultra-long sessions.
- Configurable: Models, modes, and summarization thresholds are all runtime-editable.
- Pluggable Nodes: Add file tools, planning modules, git ops, and vector-based retrieval easily.
- (Planned) Git Integration: Auto-commit, rollback, and view code diffs per agent step.
## ⚡️ How It Works
1. Initialization

Run `locopilot init` in your project root.
- Locopilot checks Ollama, prompts for a model, and sets up `.locopilot/config.yaml`.
- You're dropped into an interactive agent shell (REPL).
2. Agentic Workflow (via LangGraph)

Each user input is parsed:
- Slash command (`/model`, etc.) → runs as a graph branch.
- Normal prompt (task) → plans, edits, and summarizes via a workflow graph:

  User Task → [Planning Node] → [File Edit Node] → [Memory Summarizer Node] → (Repeat)

- Memory is managed with a LangGraph memory node, summarizing, chunking, and compressing context as needed.
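The loop above can be sketched in plain Python. This is a dependency-free illustration of the plan → edit → summarize cycle; the node functions are hypothetical stand-ins, not Locopilot's actual API:

```python
# Hypothetical sketch of the plan -> edit -> summarize loop.
# Real nodes would call the LLM and modify files; these are stand-ins.

def plan_node(task: str) -> list[str]:
    # A real planner would ask the LLM for steps; here we fake a two-step plan.
    return [f"inspect files relevant to: {task}", f"apply edits for: {task}"]

def edit_node(step: str) -> str:
    # A real editor would modify files on disk.
    return f"edited ({step})"

def summarize_node(history: list[str], max_items: int = 4) -> list[str]:
    # A real summarizer would have the LLM compress old context.
    if len(history) > max_items:
        return [f"<summary of {len(history) - max_items} earlier steps>"] + history[-max_items:]
    return history

def run_task(task: str, history: list[str]) -> list[str]:
    for step in plan_node(task):
        history.append(edit_node(step))
    return summarize_node(history)
```

Each completed task grows the history; once it passes the threshold, older steps collapse into a single summary entry, which is the same idea LangGraph's memory node applies to the session transcript.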
3. Session Management
- Change models, modes, or reset memory on the fly with slash commands.
- All state (memory, model, mode) persists during the session.
## 🏗️ Architecture
Key components:
- CLI Layer: Typer-based CLI, launches shell (REPL), parses slash commands.
- LangGraph Workflow:
- Nodes: Planning, file edit, summarization, slash command handler, etc.
- Edges: Control session flow, branching between commands and prompts.
- LLM Backend:
- Ollama: For running CodeLlama, DeepSeek, etc.
- Memory Layer:
- LangChain/LangGraph memory objects (buffer, summary, vector, hybrid).
- Summarizes old context using the LLM to avoid hitting token/window limits.
- Config/Project Layer:
  - `.locopilot/config.yaml` stores model/backend/session preferences.
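For reference, the generated config might look like the following. The keys shown here are illustrative assumptions, not Locopilot's guaranteed schema; check the file `locopilot init` actually writes:

```yaml
# .locopilot/config.yaml (illustrative example; actual keys may differ)
backend: ollama
model: codellama:latest
mode: do
memory:
  summarize_threshold: 4000   # approx. context size before old steps are compressed
```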
Stateful Graph Example:

```text
            [User Input]
                 |
        +--------+--------+
        |                 |
 [Slash Command]    [Prompt/Task]
        |                 |
 [Command Handler]  [Plan]->[Edit]->[Summarize]->[Memory]
        |                 |
       END              Loop
```
## 🚀 Getting Started
Requirements
- Python 3.8+
- Ollama running locally
- pip
Install Locopilot

Option 1: Install from PyPI (Recommended)

```bash
pip install locopilot
```

Option 2: Install from Source

```bash
git clone https://github.com/Ripan-Roy/locopilot-ai.git
cd locopilot-backend
pip install -e .
```
Start Your Local LLM

```bash
ollama serve
ollama pull codellama:latest
```
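Before initializing, you can verify the Ollama server is reachable. The helper below is a small sketch (not part of Locopilot) that queries Ollama's `/api/tags` endpoint, which lists locally pulled models:

```python
import json
import urllib.request

def ollama_status(base_url: str = "http://localhost:11434"):
    """Return (is_up, model_names) by querying Ollama's /api/tags endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return True, [m["name"] for m in data.get("models", [])]
    except OSError:
        # Server not running, wrong port, or connection refused.
        return False, []

if __name__ == "__main__":
    up, models = ollama_status()
    print("Ollama up:", up, "| models:", models)
```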
Initialize and Enter the Agent Shell

```bash
locopilot init
```

This checks the LLM backend, prompts for config, scans for project context, and launches the interactive shell.
## 🖥️ Usage: Interactive Shell & Commands
After init, Locopilot enters a shell where you can type prompts and commands:
Example Session

```text
$ locopilot init
[✓] Ollama running. Model: codellama:latest
[✓] Project context initialized.
Locopilot Shell (mode: do):
> Add OAuth login to my Django app
[PLANNING] ...
[EDITING] ...
[MEMORY] ...
> /model
Current model: codellama:latest
Enter new model: deepseek-coder:latest
[✓] Model switched to deepseek-coder:latest
> /change-mode
Current mode: do
Available modes: do, refactor, explain, chat
Enter new mode: refactor
[✓] Mode set to refactor.
> Refactor the payment logic for clarity
...
> /concise
[✓] Context summarized and compressed.
> /clear
[✓] Session memory cleared.
> /new
[✓] New session started.
> /end
[✓] Session ended. Bye!
```
Supported Slash Commands

| Command | Purpose |
|---|---|
| `/model` | Change LLM model/backend for the current session |
| `/change-mode` | Switch between do, refactor, explain, chat modes |
| `/clear` | Clear all current context/memory |
| `/new` | Start a new session/project |
| `/end` | End the agent shell and exit |
| `/concise` | Force summarization/compression of the current context |
| `/help` | Show help and command list |

Anything not starting with `/` is treated as a task in the current mode!
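The routing rule above (slash prefix → command branch, anything else → task) can be sketched as a small dispatcher. Handler names and the session dict are illustrative, not Locopilot's internals:

```python
# Sketch of shell-input routing: slash commands hit a handler table,
# everything else becomes a task in the current mode.

def handle_clear(session, arg):
    session["memory"] = []
    return "[✓] Session memory cleared."

def handle_help(session, arg):
    return "Commands: " + ", ".join(sorted(COMMANDS))

COMMANDS = {"/clear": handle_clear, "/help": handle_help}

def dispatch(line: str, session: dict) -> str:
    line = line.strip()
    if line.startswith("/"):
        cmd, _, arg = line.partition(" ")
        handler = COMMANDS.get(cmd)
        return handler(session, arg) if handler else f"Unknown command: {cmd}"
    # Anything else is a task executed in the current mode.
    return f"[{session['mode'].upper()}] running task: {line}"
```

In the real agent each command maps to a branch of the LangGraph workflow rather than a plain function, but the dispatch shape is the same.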
## 🗂️ Project Structure
```text
locopilot-backend/
├── locopilot/                 # Main package directory
│   ├── __init__.py
│   ├── core/                  # Core functionality
│   │   ├── __init__.py
│   │   ├── agent.py           # LangGraph workflow and nodes
│   │   ├── memory.py          # Session/context memory management
│   │   └── executor.py        # Plan execution engine
│   ├── llm/                   # LLM backend handling
│   │   ├── __init__.py
│   │   ├── connection.py      # Ollama connection helpers
│   │   └── backends/          # Backend-specific implementations
│   ├── cli/                   # CLI components
│   │   ├── __init__.py
│   │   ├── app.py             # CLI entrypoint, shell/REPL logic
│   │   └── commands/          # CLI command implementations
│   └── utils/                 # Utility functions
│       ├── __init__.py
│       └── file_ops.py        # File operations, config helpers
├── tests/                     # Test suite
│   ├── conftest.py
│   ├── test_agent.py
│   ├── test_basic.py
│   ├── test_connection.py
│   └── test_plan_executor.py
├── scripts/                   # Setup and utility scripts
│   └── setup.sh
├── docs/                      # Documentation
├── assets/                    # Static assets
│   └── locopilot-demo.png
├── pyproject.toml             # Package configuration
├── requirements.txt           # Dependencies
├── README.md
└── LICENSE
```
## 🧠 Memory Management (with LangGraph)
- `ConversationBufferMemory` or `ConversationSummaryBufferMemory` is attached to the agent graph.
- As session context grows, old steps are summarized using the LLM and replaced in memory.
- This ensures Locopilot "remembers" key tasks, design decisions, and context for long sessions.
- The `/concise` slash command lets you summarize on demand.
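The compression step boils down to a threshold check: once the transcript exceeds a budget, older messages are replaced by an LLM-written summary. A minimal sketch, where the `summarize` callable stands in for a real LLM call:

```python
# Threshold-based context compression: keep the most recent messages verbatim
# and fold everything older into a single summary entry.

def compress_history(messages, summarize, max_chars=4000, keep_last=3):
    total = sum(len(m) for m in messages)
    if total <= max_chars or len(messages) <= keep_last:
        return messages  # under budget, nothing to do
    old, recent = messages[:-keep_last], messages[-keep_last:]
    return [summarize(old)] + recent

# Usage with a fake summarizer standing in for the LLM:
fake_summarize = lambda msgs: f"<summary of {len(msgs)} messages>"
```

`ConversationSummaryBufferMemory` in LangChain implements this pattern with a token limit instead of a character count, summarizing incrementally as messages are pruned.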
## ⚡️ Extensibility & Roadmap
- Editor Plugins: VSCode, Vim, JetBrains, etc.
- Project-Aware RAG: Integrate vector DBs (Chroma, Qdrant) for smart codebase retrieval.
- (Planned) Git Integration: Auto-commit, diff, and rollback per step.
- Save/Load Sessions: `/save`, `/load`, `/history` commands.
- Custom Plugins/Nodes: Add your own LangGraph nodes for tools or workflows.
- Web/GUI Frontends: Same agent core, different interface.
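One plausible shape for custom plugin nodes is a decorator-based registry that the graph builder consults. Everything here (`register_node`, `NODE_REGISTRY`, the `git_commit` node) is a hypothetical sketch, not Locopilot's published API:

```python
# Illustrative plugin-node registry for custom workflow steps.

NODE_REGISTRY = {}

def register_node(name):
    """Decorator that registers a state -> state function under a node name."""
    def wrap(fn):
        NODE_REGISTRY[name] = fn
        return fn
    return wrap

@register_node("git_commit")
def git_commit_node(state):
    # A real node would shell out to git; here we just record the intent.
    state.setdefault("log", []).append("committed: " + state.get("last_edit", "?"))
    return state
```

Nodes written this way take the session state dict and return it modified, matching how LangGraph nodes transform graph state.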
## 🤝 Contributing
- Fork and PRs are welcome!
- Open issues for bugs or feature requests.
- For major features (graph nodes, memory backends), see CONTRIBUTING.md (coming soon).
## 📄 License
MIT License. Use, fork, and extend as you wish!
## 💡 Inspiration
Locopilot is inspired by Copilot, Claude Code, Dev-GPT, OpenDevin, and the emerging open-source agentic ecosystem, aiming to empower developers with private, supercharged, customizable AI tools.
## 📦 Quickstart

```bash
# Install from PyPI
pip install locopilot

# Initialize in your project
locopilot init
```

Then just type your coding tasks and manage the session with slash commands!