acatome-chat

AI research assistant — local Ollama or Claude/GPT-4o, switchable. Paper library, doc editing, web search.

A local-first AI research assistant for scientific literature.

Runs with a local Ollama model by default — or switch to Claude, GPT-4o, or any litellm-compatible provider with a single flag. One install gives you an interactive shell with a searchable paper library, document editing, live web search, and domain-specific databases — all wired together through the Model Context Protocol (MCP).

What you get

| Capability | Powered by | What it does |
| --- | --- | --- |
| Paper library | acatome-mcp + acatome-store | Semantic search over your papers. Navigate by slug, DOI, or arXiv ID. Read abstracts, TOCs, full chunks, figures. Add notes. |
| PDF extraction | acatome-extract | Drop a PDF, get structured text with metadata lookup (CrossRef + Semantic Scholar), RAKE keywords, and optional LLM summaries. Supports articles, datasheets, tech reports. |
| Document writing | precis-mcp | Open, navigate, and edit Word (.docx) and LaTeX (.tex) documents. Tracked changes in Word. Auto-numbered headings. Citation support. |
| Web search | perplexity-sonar-mcp | Live web queries via Perplexity Sonar — quick lookups, deep research with citations, academic and finance focus modes. |
| Catalysis DB | catapult-mcp | Query DFT reaction energies, activation barriers, and catalyst comparisons from CatHub and Materials Project. |
| MOF DB | grandmofty-mcp | Search metal-organic frameworks by pore size, surface area, void fraction, and gas isotherms. Data from CoRE, hMOF, QMOF. |
| LLM shell | acatome-lambic | Provider-agnostic chat with tool use, thinking mode, and MCP server management. Works with Ollama, OpenAI, Anthropic, and any litellm-compatible provider. |
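The RAKE keyword step in the extraction pipeline can be illustrated with a minimal, self-contained sketch. This is the textbook RAKE algorithm (candidate phrases split at stopwords, words scored by degree/frequency), not the package's actual implementation, and the tiny stopword list is a placeholder:

```python
import re

# Tiny stopword list for illustration; a real extractor ships a fuller one.
STOPWORDS = {"a", "an", "and", "the", "of", "for", "with", "in", "on", "to", "is"}

def rake(text: str) -> list[str]:
    """Rank candidate keyphrases by RAKE score (word degree / word frequency)."""
    words = re.findall(r"[a-z]+", text.lower())
    # Split into candidate phrases at stopwords.
    phrases, current = [], []
    for w in words:
        if w in STOPWORDS:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    # Word scores: degree (total length of phrases containing the word)
    # divided by frequency (number of occurrences).
    freq, degree = {}, {}
    for p in phrases:
        for w in p:
            freq[w] = freq.get(w, 0) + 1
            degree[w] = degree.get(w, 0) + len(p)
    # A phrase scores the sum of its word scores.
    scores = {" ".join(p): sum(degree[w] / freq[w] for w in p) for p in phrases}
    return sorted(scores, key=scores.get, reverse=True)

print(rake("metal organic framework materials for carbon capture and gas storage")[0])
# → metal organic framework materials
```

Longer multi-word phrases score highest because each member word inherits the phrase's length as degree — which is why RAKE tends to surface domain terms like "metal organic framework materials" over single common words.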

Install

pip install acatome-chat
# or
uv add acatome-chat

That's it. All MCP servers and the paper store are included. Default backend is SQLite + Chroma — no external services needed.

For heavier setups:

uv add "acatome-chat[postgres]"     # PostgreSQL + pgvector
uv add "acatome-chat[embeddings]"   # sentence-transformers
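The "semantic search" that the default Chroma backend provides boils down to nearest-neighbour lookup over embedding vectors. A toy sketch of that idea, using bag-of-words counts in place of a real embedding model (the paper slugs and texts here are made up; the actual store uses proper embeddings):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": word counts. Real backends use dense model embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical library contents, keyed by slug.
papers = {
    "li2024mof": "metal organic frameworks for co2 capture",
    "smith2023dft": "dft activation barriers for ammonia synthesis",
}
query = embed("co2 capture with mofs")
best = max(papers, key=lambda slug: cosine(query, embed(papers[slug])))
print(best)  # → li2024mof
```

Swapping SQLite + Chroma for PostgreSQL + pgvector changes where the vectors live and how the neighbour search is executed, not this basic retrieval model.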

Quick start

1. Build your paper library

Extract PDFs and ingest them into the searchable store:

# Extract a single PDF (or a whole directory)
acatome-extract extract paper.pdf
acatome-extract extract ~/Downloads/papers/

# Ingest extracted bundles into the searchable store
acatome-store ingest ~/.acatome/papers/

Optional enrichment steps:

# Watch a folder for new PDFs (auto-extracts on arrival)
acatome-extract watch ~/Downloads/papers/

# Add LLM-generated summaries to your bundles
acatome-extract enrich ~/.acatome/papers/
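The two steps above can be wrapped into a one-shot refresh script. A hedged sketch using only the commands and default paths shown above; it skips gracefully when the tools are not on PATH:

```shell
# Refresh the paper library: extract any new PDFs, then re-index the bundles.
if command -v acatome-extract >/dev/null 2>&1; then
  acatome-extract extract ~/Downloads/papers/   # extract new PDFs
  acatome-store ingest ~/.acatome/papers/       # ingest into the store
  refreshed=yes
else
  echo "acatome tools not on PATH; run: pip install acatome-chat"
  refreshed=no
fi
```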

2. Start the chat

# Default: local Ollama model (ollama/qwen3.5:9b)
acatome-chat

# Or use Claude / GPT-4o / any litellm provider
acatome-chat --model anthropic/claude-sonnet-4-20250514
acatome-chat --model openai/gpt-4o
acatome-chat --model ollama/llama3.1:8b

# Disable thinking/reasoning mode
acatome-chat --no-think
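The model specs above follow litellm's "provider/model" naming convention. A small sketch of how such a spec splits into provider and model name (this illustrates the convention, not the CLI's actual parser; the bare-name fallback to OpenAI mirrors litellm's default routing):

```python
def parse_model_spec(spec: str) -> tuple[str, str]:
    """Split a litellm-style model spec into (provider, model)."""
    provider, sep, model = spec.partition("/")
    if not sep:
        # Bare model names are routed to OpenAI by default in litellm.
        return "openai", spec
    return provider, model

print(parse_model_spec("anthropic/claude-sonnet-4-20250514"))
# → ('anthropic', 'claude-sonnet-4-20250514')
```

Note that only the first "/" separates provider from model, so tags with colons (e.g. ollama/llama3.1:8b) pass through intact.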

3. Use slash commands

The shell provides slash commands with tab autocomplete:

| Command | What it does |
| --- | --- |
| /tools | List all available MCP tools |
| /status | Show connected model and servers |
| /model <spec> | Switch LLM provider on the fly |
| /think, /nothink | Toggle reasoning mode |
| /db | Show paper library connection info and stats |
| /review <prompt> | Review the active document (see below) |
| /help | Show command help |
| /quit | Exit the shell |

4. Review documents

› /review check scientific rigor and paragraph structure
› /review fix grammar and clarity --fix
› /review verify citations --comments-only
› /review improve transitions --section S2.3

The assistant reads each section, applies fixes as tracked changes, and adds margin comments for issues needing human judgment.

5. Talk to your papers

After ingesting papers, the LLM has direct access to your library. Example prompts:

› Find papers about CO2 conversion and write a summary with citations into co2review.docx
› Search for MOFs with high CO2 uptake and compare their pore sizes
› Read the abstract of li2024mof and summarize the key findings
› Open my draft.docx and add a new section about these results
› Search the web for recent advances in direct air capture

The assistant can read your papers, search the web, query chemistry databases, and write results directly into .docx or .tex files — all in one conversation.

Note: Document editing supports .docx and .tex formats. For Word files, changes are written as tracked changes. Be aware that Word may overwrite the file if it's open — save and close Word before asking the assistant to edit.

Architecture

acatome-chat (you are here)
├── acatome-lambic        LLM shell engine (MCP client)
├── acatome-mcp           Paper query MCP server
│   └── acatome-store     SQLite/Postgres + Chroma/pgvector storage
│       └── acatome-meta  Shared config and metadata
├── acatome-extract       PDF → structured bundle pipeline
│   └── precis-summary    RAKE keyword extraction
├── precis-mcp            Document editor MCP server
├── perplexity-sonar-mcp  Web search MCP server
├── catapult-mcp          Catalysis database MCP server
│   └── chemdb-common     Shared chemistry DB utilities
└── grandmofty-mcp        MOF database MCP server
    └── chemdb-common
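Each edge from acatome-lambic to an MCP server is a JSON-RPC 2.0 connection following the Model Context Protocol: the client lists a server's tools, then invokes them by name. A sketch of the two message shapes (per the MCP spec; the tool name and arguments here are hypothetical):

```python
import json

# Discover what a server offers.
list_tools = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Invoke one tool by name with arguments (tool name is hypothetical).
call_tool = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "search_papers", "arguments": {"query": "co2 capture"}},
}

print(json.dumps(list_tools))
```

This is why any litellm-compatible LLM can drive the same toolset: the servers speak one protocol, and the shell translates between the model's tool-call format and these messages.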

Environment variables

| Variable | Required | Purpose |
| --- | --- | --- |
| PERPLEXITY_API_KEY | For web search | Perplexity Sonar API key |
| OPENAI_API_KEY | For OpenAI models | OpenAI API key |
| ANTHROPIC_API_KEY | For Anthropic models | Anthropic API key |

For local models via Ollama, no API keys are needed.
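A typical shell profile fragment, exporting only the keys for the providers you use (the key values shown are placeholders):

```shell
# ~/.bashrc or ~/.zshrc — set only what you need
export PERPLEXITY_API_KEY="pplx-..."    # web search via Perplexity Sonar
export ANTHROPIC_API_KEY="sk-ant-..."   # Claude models
export OPENAI_API_KEY="sk-..."          # GPT models
```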

License

GPL-3.0-or-later
