Eidetic memory for your terminal – local AI search over your command history and output
engram silently logs every command you run – and its output – into a local SQLite database on your machine. When you need to remember what happened, just ask in plain English.
```
$ engram ask "what was the docker error I got yesterday building the frontend?"

The build failed at 14:32 in /home/titus/projects/acme-app with:

    Error response from daemon: failed to solve: failed to read
    dockerfile: open Dockerfile: no such file or directory

You were in the wrong directory. The Dockerfile lives one level up.
Fix: cd .. && docker build -f acme-app/Dockerfile .
```
Everything stays on your machine. No cloud. No subscriptions. No data leaves your computer (unless you explicitly set an Anthropic API key, in which case only the relevant context snippets are sent – never your full history).
The problem
Standard terminal history (Ctrl+R, `history`) only saves the commands you typed. The output – the error message, the API response, the build log – disappears the moment it scrolls off screen.
engram fixes that.
Features
| Feature | Details |
|---|---|
| Full capture | Command, stdout, stderr, exit code, working directory, timestamp |
| AI-powered search | Ask questions in plain English, get answers grounded in your actual history |
| 100% local by default | SQLite in `~/.engram/`, works with Ollama |
| Zero latency | Logging is async – your shell prompt never slows down |
| Secret redaction | API keys, tokens, and passwords are scrubbed before storage (15+ patterns) |
| bash, zsh, fish | Shell hooks for all three with automatic error handling |
| Linux + macOS | Full support for both |
| Robust | Connection pooling, graceful degradation, automatic model fallback |
Quick start
1. Install
```
# One-liner (recommended):
curl -sSL https://raw.githubusercontent.com/TLJQ/engram/main/scripts/install.sh | bash
```

Or with pip:

```
pip install engram-shell
engram install   # adds the shell hook to your RC file automatically
```
2. Restart your terminal
```
# Or source the hook manually:
source ~/.engram/engram.zsh   # zsh
source ~/.engram/engram.bash  # bash
```
3. Install Ollama for local AI
```
# Install from https://ollama.com, then:
ollama pull nomic-embed-text  # for semantic search
ollama pull llama3            # for answering questions
ollama serve                  # start the server (or it starts automatically)
```
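For context, engram talks to Ollama over its local HTTP API. A minimal sketch of what an embedding request could look like, assuming Ollama's standard `/api/embeddings` endpoint (engram's actual internal client may differ):

```python
import json
import urllib.request

OLLAMA = "http://localhost:11434"

def build_embed_request(text, model="nomic-embed-text"):
    """Request body for Ollama's embeddings endpoint."""
    return {"model": model, "prompt": text}

def embed(text, host=OLLAMA):
    """Fetch an embedding vector from a local Ollama server."""
    req = urllib.request.Request(
        f"{host}/api/embeddings",
        data=json.dumps(build_embed_request(text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns {"embedding": [floats...]}
        return json.load(resp)["embedding"]
```

This is why `ollama serve` needs to be running before `engram index` or `engram ask` can do semantic search.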
4. Start capturing
```
# Lightweight mode (commands only, no output):
# Just use your terminal normally – the hook is already active.

# Full mode (commands + output, including interactive programs):
engram shell
```
5. Ask questions
```
engram ask "what curl command did I use to hit the auth API last week?"
engram ask "show me the last time a pip install failed and why"
engram ask "what was the JSON response I got from the Stripe API this morning?"
engram ask "what docker containers did I start yesterday?"
```
All commands
| Command | What it does |
|---|---|
| `engram shell` | Launch your shell inside the PTY wrapper – captures full output |
| `engram ask "<question>"` | Ask a natural-language question about your history |
| `engram install` | Add the shell hook to your RC file |
| `engram history` | Print recent logged commands |
| `engram search "<query>"` | Full-text search across commands and output |
| `engram index` | Embed un-indexed commands (run after first install or after `ollama serve`) |
| `engram status` | Show DB stats and configuration |
| `engram clear` | Delete all stored history (with confirmation) |
Flags
```
engram ask "..." --top-k 10   # use more context chunks (default: 5)
engram ask "..." --verbose    # show which context was retrieved
engram history --limit 100    # show last 100 commands (default: 50)
engram index --reindex        # re-embed everything, not just new commands
engram clear --yes            # skip the confirmation prompt
```
Two capture modes
Lightweight hook (default after `engram install`)

Captures: commands + exit codes + working directory.
Does NOT capture output. Works everywhere. Zero risk of breaking interactive programs.

Add to your `~/.zshrc` / `~/.bashrc`:

```
source ~/.engram/engram.zsh   # or engram.bash
```
PTY wrapper (`engram shell`) – recommended
Captures: everything, including stdout, stderr, and output from interactive programs.
Uses OSC 633 shell integration (same standard as VS Code and iTerm2).
```
engram shell        # wraps your $SHELL
engram shell bash   # or a specific shell
```
Add this to your shell RC to always start in engram shell:
```
# At the bottom of ~/.zshrc – only activates in interactive shells
[[ -z "$ENGRAM_PTY" && $- == *i* ]] && exec engram shell
```
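For the curious: OSC 633 is a small family of escape sequences that mark where the prompt ends and where command output begins, which is what lets a PTY wrapper separate output from prompts. A sketch of the markers involved, using the sequence names from the VS Code shell-integration convention (engram's own emitter and parser are internal details):

```python
ESC, BEL = "\x1b", "\x07"

def osc633(code, *args):
    """Build an OSC 633 escape sequence (VS Code-style shell integration)."""
    body = ";".join([str(code), *map(str, args)])
    return f"{ESC}]633;{body}{BEL}"

# Markers a shell integration emits around one command:
prompt_start  = osc633("A")     # about to print the prompt
command_start = osc633("B")     # prompt done, user input begins
pre_exec      = osc633("C")     # command about to run; output follows
finished      = osc633("D", 0)  # command finished, exit code 0
```

Everything between the `C` and `D` markers is command output, which is how the wrapper knows what to log.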
Using Anthropic Claude instead of Ollama
```
export ANTHROPIC_API_KEY="sk-ant-..."
engram ask "what went wrong with my deploy this morning?"
```
Only the relevant context snippets (not your full history) are sent to Anthropic. Embeddings still run locally via Ollama – only the final Q&A step uses the API.
Configuration
All config via environment variables. Add to your ~/.zshrc or ~/.bashrc.
| Variable | Default | Description |
|---|---|---|
| `ENGRAM_DIR` | `~/.engram` | Where the database lives |
| `OLLAMA_HOST` | `http://localhost:11434` | Ollama server address |
| `ENGRAM_LLM_MODEL` | `llama3` | Ollama model for answering questions |
| `ENGRAM_EMBED_MODEL` | `nomic-embed-text` | Ollama model for embeddings |
| `ANTHROPIC_API_KEY` | (unset) | Set to use Claude instead of Ollama |
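Resolving these variables amounts to plain environment lookups with the defaults above. An illustrative sketch (the `load_config` helper is hypothetical; engram's real loader lives inside the package):

```python
import os

def load_config():
    """Resolve engram settings from the environment, falling back to defaults."""
    return {
        "dir": os.environ.get("ENGRAM_DIR", os.path.expanduser("~/.engram")),
        "ollama_host": os.environ.get("OLLAMA_HOST", "http://localhost:11434"),
        "llm_model": os.environ.get("ENGRAM_LLM_MODEL", "llama3"),
        "embed_model": os.environ.get("ENGRAM_EMBED_MODEL", "nomic-embed-text"),
        "anthropic_key": os.environ.get("ANTHROPIC_API_KEY"),  # None = use Ollama
    }
```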
How it works
```
┌────────────────────────────────────────────────────────────────────────┐
│ Your terminal                                                          │
│                                                                        │
│  $ docker build .                                                      │
│  Error: no Dockerfile found  ← output captured                         │
│  $                           ← prompt returns instantly (async logging)│
└──────────────────────┬─────────────────────────────────────────────────┘
                       │ shell hook (bash/zsh/fish)
                       │ or PTY wrapper (engram shell)
                       ▼
┌────────────────────────────────────────────────────────────────────────┐
│ ~/.engram/engram.db (SQLite)                                           │
│                                                                        │
│  commands:   command | output | exit_code | cwd | timestamp            │
│  embeddings: vector per command+output chunk                           │
└──────────────────────┬─────────────────────────────────────────────────┘
                       │ engram ask "..."
                       ▼
┌────────────────────────────────────────────────────────────────────────┐
│ 1. Embed the question (Ollama nomic-embed-text)                        │
│ 2. Cosine similarity search over stored embeddings                     │
│    ↳ falls back to full-text search if Ollama is offline               │
│ 3. Top-K chunks → context window                                       │
│ 4. LLM generates answer (Ollama llama3 or Claude)                      │
│ 5. Streams tokens to your terminal as they arrive                      │
└────────────────────────────────────────────────────────────────────────┘
```
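Step 2 above is plain cosine similarity over the stored vectors. A self-contained sketch of that retrieval core (illustrative only; engram's actual chunking, storage, and scoring may differ, and the two-dimensional vectors here are toy data):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec, chunks, k=5):
    """Return the k stored chunks most similar to the query embedding."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c["vec"]), reverse=True)
    return ranked[:k]

# Toy example: two stored command+output chunks with fake embeddings.
chunks = [
    {"text": "docker build . -> Error: no Dockerfile found", "vec": [0.9, 0.1]},
    {"text": "pip install requests -> ok",                   "vec": [0.1, 0.9]},
]
best = top_k([1.0, 0.0], chunks, k=1)  # query vector close to the docker chunk
```

The retrieved chunks are then pasted into the LLM's context window (step 3) so the answer is grounded in your real history rather than the model's guesses.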
Privacy
- Your terminal history never leaves your machine by default.
- If you set `ANTHROPIC_API_KEY`, only the top-K retrieved context snippets (not your full history) are sent to Anthropic for the final Q&A step.
- Secret redaction runs automatically before anything is stored. API keys, tokens, passwords, and connection strings are replaced with `[REDACTED]`.
- Add custom redaction patterns to `~/.engram/redact_patterns.txt` (one Python regex per line).
- The database lives at `~/.engram/engram.db`. You own it. `engram clear` deletes everything.
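The redaction step described above amounts to running a list of regexes over each record before it is inserted. A minimal sketch of the idea – the two patterns below are illustrative examples, not engram's actual 15+ rule set:

```python
import re

# Illustrative patterns only; engram ships its own, larger set.
PATTERNS = [
    re.compile(r"sk-ant-[A-Za-z0-9_-]+"),   # Anthropic-style API keys
    re.compile(r"(?i)password=\S+"),        # password=... assignments
]

def redact(text):
    """Replace anything matching a known secret pattern with [REDACTED]."""
    for pat in PATTERNS:
        text = pat.sub("[REDACTED]", text)
    return text
```

Custom patterns from `~/.engram/redact_patterns.txt` would simply extend such a list, one compiled regex per line.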
Roadmap
- Demo GIF in README
- Fish shell full output capture
- `engram export` – export history to markdown or JSON
- `engram tui` – interactive TUI browser with fzf-style fuzzy search
- Automatic sensitive-value redaction for more patterns
- Rust PTY core for lower overhead
- Windows support (ConPTY)
- Opt-in end-to-end encrypted multi-machine sync
Contributing
See CONTRIBUTING.md. PRs and issues welcome.
```
git clone https://github.com/TLJQ/engram
cd engram
pip install -e ".[dev]"
pytest tests/ -v
```
Acknowledgements
Inspired by atuin and the frustration of watching important terminal output scroll away forever.
License
MIT โ see LICENSE.