# acatome-chat

AI research assistant: local Ollama or Claude/GPT-4o, switchable. Paper library, doc editing, web search.

A local-first AI research assistant for scientific literature.

Runs with a local Ollama model by default, or switches to Claude, GPT-4o, or any litellm-compatible provider with a single flag. One install gives you an interactive shell with a searchable paper library, document editing, live web search, and domain-specific databases, all wired together through the Model Context Protocol (MCP).
## What you get

| Capability | Powered by | What it does |
|---|---|---|
| Paper library | acatome-mcp + acatome-store | Semantic search over your papers. Navigate by slug, DOI, or arXiv ID. Read abstracts, TOCs, full chunks, figures. Add notes. |
| PDF extraction | acatome-extract | Drop a PDF, get structured text with metadata lookup (CrossRef + Semantic Scholar), RAKE keywords, and optional LLM summaries. Supports articles, datasheets, tech reports. |
| Document writing | precis-mcp | Open, navigate, and edit Word (.docx) and LaTeX (.tex) documents. Tracked changes in Word. Auto-numbered headings. Citation support. |
| Web search | perplexity-sonar-mcp | Live web queries via Perplexity Sonar: quick lookups, deep research with citations, academic and finance focus modes. |
| Catalysis DB | catapult-mcp | Query DFT reaction energies, activation barriers, and catalyst comparisons from CatHub and Materials Project. |
| MOF DB | grandmofty-mcp | Search metal-organic frameworks by pore size, surface area, void fraction, and gas isotherms. Data from CoRE, hMOF, QMOF. |
| LLM shell | acatome-lambic | Provider-agnostic chat with tool use, thinking mode, and MCP server management. Works with Ollama, OpenAI, Anthropic, and any litellm-compatible provider. |
## Install

```
pip install acatome-chat
# or
uv add acatome-chat
```

That's it. All MCP servers and the paper store are included. The default backend is SQLite + Chroma, so no external services are needed.

For heavier setups:

```
uv add "acatome-chat[postgres]"    # PostgreSQL + pgvector
uv add "acatome-chat[embeddings]"  # sentence-transformers
```
## Quick start

### 1. Build your paper library

Extract PDFs and ingest them into the searchable store:

```
# Extract a single PDF (or a whole directory)
acatome-extract extract paper.pdf
acatome-extract extract ~/Downloads/papers/

# Ingest extracted bundles into the searchable store
acatome-store ingest ~/.acatome/papers/
```

Optional enrichment steps:

```
# Watch a folder for new PDFs (auto-extracts on arrival)
acatome-extract watch ~/Downloads/papers/

# Add LLM-generated summaries to your bundles
acatome-extract enrich ~/.acatome/papers/
```
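Conceptually, the `watch` command polls a folder and hands each newly arrived PDF to the extraction pipeline. A minimal sketch of the detection step (`new_pdfs` is a hypothetical helper for illustration, not part of acatome-extract):

```python
from pathlib import Path

def new_pdfs(folder: Path, seen: set[str]) -> list[Path]:
    """Return the PDFs in `folder` not yet in `seen`, recording them.

    A polling watch loop would call this periodically and pass each
    newly found file to the extractor.
    """
    found = []
    for pdf in sorted(folder.glob("*.pdf")):
        if pdf.name not in seen:
            seen.add(pdf.name)
            found.append(pdf)
    return found
```

Each call returns only files that appeared since the previous scan, so already-extracted papers are not reprocessed.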
### 2. Start the chat

```
# Default: local Ollama model (ollama/qwen3.5:9b)
acatome-chat

# Or use Claude / GPT-4o / any litellm provider
acatome-chat --model anthropic/claude-sonnet-4-20250514
acatome-chat --model openai/gpt-4o
acatome-chat --model ollama/llama3.1:8b

# Disable thinking/reasoning mode
acatome-chat --no-think
```
### 3. Use slash commands

The shell has `/` commands with tab autocomplete:

| Command | What it does |
|---|---|
| `/tools` | List all available MCP tools |
| `/status` | Show the connected model and servers |
| `/model <spec>` | Switch LLM provider on the fly |
| `/think` / `/nothink` | Toggle reasoning mode |
| `/quit` | Exit the shell |
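Tab autocomplete over slash commands is ordinary prefix matching. A minimal sketch (the command set mirrors the table above; the function is illustrative, not the shell's implementation):

```python
# Known slash commands, as listed in the table above.
COMMANDS = {"/tools", "/status", "/model", "/think", "/nothink", "/quit"}

def complete(prefix: str) -> list[str]:
    """Return the commands matching a typed prefix, sorted as a
    tab-completion menu would show them."""
    return sorted(c for c in COMMANDS if c.startswith(prefix))
```

Typing `/th` then Tab would complete uniquely; `/t` would offer both `/think` and `/tools`.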
### 4. Talk to your papers

After ingesting papers, the LLM has direct access to your library. Example prompts:

```
› Find papers about CO2 conversion and write a summary with citations into co2review.docx
› Search for MOFs with high CO2 uptake and compare their pore sizes
› Read the abstract of li2024mof and summarize the key findings
› Open my draft.docx and add a new section about these results
› Search the web for recent advances in direct air capture
```

The assistant can read your papers, search the web, query chemistry databases, and write results directly into .docx or .tex files, all in one conversation.

Note: Document editing supports .docx and .tex formats. For Word files, changes are written as tracked changes. Word may overwrite the file if it is open, so save and close Word before asking the assistant to edit.
## Architecture

```
acatome-chat                    (you are here)
├── acatome-lambic              LLM shell engine (MCP client)
├── acatome-mcp                 Paper query MCP server
│   └── acatome-store           SQLite/Postgres + Chroma/pgvector storage
│       └── acatome-meta        Shared config and metadata
├── acatome-extract             PDF → structured bundle pipeline
│   └── precis-summary          RAKE keyword extraction
├── precis-mcp                  Document editor MCP server
├── perplexity-sonar-mcp        Web search MCP server
├── catapult-mcp                Catalysis database MCP server
│   └── chemdb-common           Shared chemistry DB utilities
└── grandmofty-mcp              MOF database MCP server
    └── chemdb-common
```
## Environment variables

| Variable | Required | Purpose |
|---|---|---|
| `PERPLEXITY_API_KEY` | For web search | Perplexity Sonar API key |
| `OPENAI_API_KEY` | For OpenAI models | OpenAI API key |
| `ANTHROPIC_API_KEY` | For Anthropic models | Anthropic API key |

For local models via Ollama, no API keys are needed.
## License
GPL-3.0-or-later
## Project details
### File details: acatome_chat-0.2.6.tar.gz

- Size: 7.2 kB
- Tags: Source
- Uploaded using Trusted Publishing: yes
- Uploaded via: twine/6.1.0, CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | d8b850c4e4770369dcab02c69c8891d1a0561109c36837e38e81528e5fa08cef |
| MD5 | 7f292012a86bd15f9b2f136b6295aa5d |
| BLAKE2b-256 | b779e2b2b40f93dda62a070e518176d3088c22839c2d9fab0a51bae7b2e7ac31 |
#### Provenance

The following attestation bundle was made for acatome_chat-0.2.6.tar.gz:

- Publisher: publish.yml on retospect/acatome-chat
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: acatome_chat-0.2.6.tar.gz
- Subject digest: d8b850c4e4770369dcab02c69c8891d1a0561109c36837e38e81528e5fa08cef
- Sigstore transparency entry: 1087424178
- Permalink: retospect/acatome-chat@b98af358637d571da20704f943be48df33a6db92
- Branch / Tag: refs/tags/v0.2.6
- Owner: https://github.com/retospect
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b98af358637d571da20704f943be48df33a6db92
- Trigger Event: release
### File details: acatome_chat-0.2.6-py3-none-any.whl

- Size: 8.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: yes
- Uploaded via: twine/6.1.0, CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | d373b997ca6d6ae38a432bb0dd5a816830698d6bed7f45a2be0b7a32702d071f |
| MD5 | 6d705181de52b3746cf26fdf9789900b |
| BLAKE2b-256 | 41406c3be384f26cb4229c0cfb6053e20faa712a2981af78294c9eb11153071c |
#### Provenance

The following attestation bundle was made for acatome_chat-0.2.6-py3-none-any.whl:

- Publisher: publish.yml on retospect/acatome-chat
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: acatome_chat-0.2.6-py3-none-any.whl
- Subject digest: d373b997ca6d6ae38a432bb0dd5a816830698d6bed7f45a2be0b7a32702d071f
- Sigstore transparency entry: 1087424248
- Permalink: retospect/acatome-chat@b98af358637d571da20704f943be48df33a6db92
- Branch / Tag: refs/tags/v0.2.6
- Owner: https://github.com/retospect
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b98af358637d571da20704f943be48df33a6db92
- Trigger Event: release