
acatome-chat

A local-first AI research assistant for scientific literature.

Runs with a local Ollama model by default — or switch to Claude, GPT-4o, or any litellm-compatible provider with a single flag. One install gives you an interactive shell with a searchable paper library, document editing, live web search, and domain-specific databases — all wired together through the Model Context Protocol (MCP).
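The `provider/model` spec strings follow the litellm convention. As a rough illustration (the function and default names here are hypothetical, not acatome-chat's actual API), such a spec might be parsed like this:

```python
# Hypothetical sketch: splitting a litellm-style "provider/model" spec string.
# Names are illustrative; acatome-chat's real parsing may differ.

def parse_model_spec(spec: str, default_provider: str = "ollama") -> tuple[str, str]:
    """Split 'provider/model' into (provider, model); bare names default to Ollama."""
    provider, sep, model = spec.partition("/")
    if not sep:  # no slash: treat the whole spec as a local Ollama model name
        return default_provider, spec
    return provider, model

print(parse_model_spec("anthropic/claude-sonnet-4-20250514"))
# ('anthropic', 'claude-sonnet-4-20250514')
```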

What you get

| Capability | Powered by | What it does |
| --- | --- | --- |
| Paper library | acatome-mcp + acatome-store | Semantic search over your papers. Navigate by slug, DOI, or arXiv ID. Read abstracts, TOCs, full chunks, figures. Add notes. |
| PDF extraction | acatome-extract | Drop in a PDF, get structured text with metadata lookup (CrossRef + Semantic Scholar), RAKE keywords, and optional LLM summaries. Supports articles, datasheets, and tech reports. |
| Document writing | precis-mcp | Open, navigate, and edit Word (.docx) and LaTeX (.tex) documents. Tracked changes in Word. Auto-numbered headings. Citation support. |
| Web search | perplexity-sonar-mcp | Live web queries via Perplexity Sonar: quick lookups, deep research with citations, academic and finance focus modes. |
| Catalysis DB | catapult-mcp | Query DFT reaction energies, activation barriers, and catalyst comparisons from CatHub and the Materials Project. |
| MOF DB | grandmofty-mcp | Search metal-organic frameworks by pore size, surface area, void fraction, and gas isotherms. Data from CoRE, hMOF, and QMOF. |
| LLM shell | acatome-lambic | Provider-agnostic chat with tool use, thinking mode, and MCP server management. Works with Ollama, OpenAI, Anthropic, and any litellm-compatible provider. |

Install

pip install acatome-chat
# or
uv add acatome-chat

That's it. All MCP servers and the paper store are included. Default backend is SQLite + Chroma — no external services needed.
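To make the zero-dependency default concrete, here is an illustrative sketch of a SQLite paper-metadata table in the spirit of acatome-store's default backend. The schema and column names are assumptions for illustration, not the real store layout:

```python
# Illustrative only: a minimal SQLite paper-metadata table. The real
# acatome-store schema is not documented in this README.
import sqlite3

conn = sqlite3.connect(":memory:")  # the real store lives under ~/.acatome/
conn.execute(
    """CREATE TABLE papers (
           slug TEXT PRIMARY KEY,   -- e.g. 'li2024mof'
           doi TEXT, arxiv_id TEXT, title TEXT
       )"""
)
conn.execute(
    "INSERT INTO papers VALUES (?, ?, ?, ?)",
    ("li2024mof", "10.0000/example", None, "Example MOF paper"),
)
row = conn.execute(
    "SELECT title FROM papers WHERE slug = ?", ("li2024mof",)
).fetchone()
print(row[0])  # Example MOF paper
```

Vector search over chunk embeddings is layered on top via Chroma (or pgvector with the postgres extra).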

For heavier setups:

uv add "acatome-chat[postgres]"     # PostgreSQL + pgvector
uv add "acatome-chat[embeddings]"   # sentence-transformers

Quick start

1. Build your paper library

Extract PDFs and ingest them into the searchable store:

# Extract a single PDF (or a whole directory)
acatome-extract extract paper.pdf
acatome-extract extract ~/Downloads/papers/

# Ingest extracted bundles into the searchable store
acatome-store ingest ~/.acatome/papers/

Optional enrichment steps:

# Watch a folder for new PDFs (auto-extracts on arrival)
acatome-extract watch ~/Downloads/papers/

# Add LLM-generated summaries to your bundles
acatome-extract enrich ~/.acatome/papers/
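The extract-then-ingest flow above turns each PDF into a "bundle" of structured text plus metadata. The on-disk bundle format is not documented in this README; the sketch below (file names and fields hypothetical) just illustrates the idea of a metadata file alongside extracted chunks:

```python
# Hypothetical bundle layout: a directory with JSON metadata and text chunks.
# File names and fields are assumptions, not acatome-extract's real format.
import json
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    bundle = Path(tmp) / "smith2023co2"
    bundle.mkdir()
    (bundle / "meta.json").write_text(json.dumps({
        "title": "CO2 conversion example",
        "doi": "10.0000/example",          # looked up via CrossRef in the real tool
        "keywords": ["co2", "catalysis"],  # RAKE keywords in the real tool
    }))
    (bundle / "chunks.txt").write_text("Abstract...\n---\nSection 1...\n")

    # An ingest step would walk every bundle directory and index its chunks:
    meta = json.loads((bundle / "meta.json").read_text())
    print(meta["title"])  # CO2 conversion example
```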

2. Start the chat

# Default: local Ollama model (ollama/qwen3.5:9b)
acatome-chat

# Or use Claude / GPT-4o / any litellm provider
acatome-chat --model anthropic/claude-sonnet-4-20250514
acatome-chat --model openai/gpt-4o
acatome-chat --model ollama/llama3.1:8b

# Disable thinking/reasoning mode
acatome-chat --no-think
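A CLI surface like the one above could be wired with argparse. The flag names match the README; the actual acatome-chat implementation may differ:

```python
# Sketch of the CLI flags shown above, using argparse. Flag names come from
# the README; defaults and wiring are assumptions.
import argparse

parser = argparse.ArgumentParser(prog="acatome-chat")
parser.add_argument("--model", default="ollama/qwen3.5:9b",
                    help="litellm-style provider/model spec")
parser.add_argument("--no-think", action="store_true",
                    help="disable thinking/reasoning mode")

args = parser.parse_args(["--model", "openai/gpt-4o", "--no-think"])
print(args.model, args.no_think)  # openai/gpt-4o True
```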

3. Use slash commands

The shell provides slash commands with tab autocomplete:

| Command | What it does |
| --- | --- |
| /tools | List all available MCP tools |
| /status | Show connected model and servers |
| /model <spec> | Switch LLM provider on the fly |
| /think, /nothink | Toggle reasoning mode |
| /quit | Exit the shell |
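A shell like this typically routes slash commands through a dispatch table. A minimal sketch, with handler names that are assumptions rather than acatome-lambic's real internals:

```python
# Illustrative slash-command dispatch; handler names are hypothetical.
def cmd_tools(arg: str) -> str:
    return "listing tools"

def cmd_status(arg: str) -> str:
    return "model + servers"

def cmd_model(arg: str) -> str:
    return f"switched to {arg}"

COMMANDS = {"/tools": cmd_tools, "/status": cmd_status, "/model": cmd_model}

def dispatch(line: str) -> str:
    name, _, arg = line.partition(" ")
    handler = COMMANDS.get(name)
    return handler(arg) if handler else f"unknown command: {name}"

print(dispatch("/model openai/gpt-4o"))  # switched to openai/gpt-4o
```

Tab completion then only has to match the prefix of the user's input against the keys of the dispatch table.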

4. Talk to your papers

After ingesting papers, the LLM has direct access to your library. Example prompts:

› Find papers about CO2 conversion and write a summary with citations into co2review.docx
› Search for MOFs with high CO2 uptake and compare their pore sizes
› Read the abstract of li2024mof and summarize the key findings
› Open my draft.docx and add a new section about these results
› Search the web for recent advances in direct air capture

The assistant can read your papers, search the web, query chemistry databases, and write results directly into .docx or .tex files — all in one conversation.
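Under the hood, each of those tool invocations travels over MCP as a JSON-RPC 2.0 `tools/call` request. The tool name and arguments below are hypothetical, not acatome-mcp's actual schema:

```python
# Shape of an MCP tool call: a JSON-RPC 2.0 request with method "tools/call".
# "search_papers" and its arguments are illustrative assumptions.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_papers",                      # hypothetical tool name
        "arguments": {"query": "direct air capture"},
    },
}
wire = json.dumps(request)  # serialized and sent to the MCP server
print(json.loads(wire)["params"]["name"])  # search_papers
```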

Note: Document editing supports .docx and .tex formats. For Word files, changes are written as tracked changes. Be aware that Word may overwrite the file if it's open — save and close Word before asking the assistant to edit.

Architecture

acatome-chat (you are here)
├── acatome-lambic        LLM shell engine (MCP client)
├── acatome-mcp           Paper query MCP server
│   └── acatome-store     SQLite/Postgres + Chroma/pgvector storage
│       └── acatome-meta  Shared config and metadata
├── acatome-extract       PDF → structured bundle pipeline
│   └── precis-summary    RAKE keyword extraction
├── precis-mcp            Document editor MCP server
├── perplexity-sonar-mcp  Web search MCP server
├── catapult-mcp          Catalysis database MCP server
│   └── chemdb-common     Shared chemistry DB utilities
└── grandmofty-mcp        MOF database MCP server
    └── chemdb-common

Environment variables

| Variable | Required | Purpose |
| --- | --- | --- |
| PERPLEXITY_API_KEY | For web search | Perplexity Sonar API key |
| OPENAI_API_KEY | For OpenAI models | OpenAI API key |
| ANTHROPIC_API_KEY | For Anthropic models | Anthropic API key |

For local models via Ollama, no API keys are needed.

License

GPL-3.0-or-later
