
acatome-chat

A local-first AI research assistant for scientific literature.

Runs with a local Ollama model by default — or switch to Claude, GPT-4o, or any litellm-compatible provider with a single flag. One install gives you an interactive shell with a searchable paper library, document editing, live web search, and domain-specific databases — all wired together through the Model Context Protocol (MCP).

What you get

| Capability | Powered by | What it does |
|---|---|---|
| Paper library | acatome-mcp + acatome-store | Semantic search over your papers. Navigate by slug, DOI, or arXiv ID. Read abstracts, TOCs, full chunks, and figures. Add notes. |
| PDF extraction | acatome-extract | Drop in a PDF, get structured text with metadata lookup (CrossRef + Semantic Scholar), RAKE keywords, and optional LLM summaries. Supports articles, datasheets, and tech reports. |
| Document writing | precis-mcp | Open, navigate, and edit Word (.docx) and LaTeX (.tex) documents. Tracked changes in Word. Auto-numbered headings. Citation support. |
| Web search | perplexity-sonar-mcp | Live web queries via Perplexity Sonar: quick lookups, deep research with citations, academic and finance focus modes. |
| Catalysis DB | catapult-mcp | Query DFT reaction energies, activation barriers, and catalyst comparisons from CatHub and the Materials Project. |
| MOF DB | grandmofty-mcp | Search metal-organic frameworks by pore size, surface area, void fraction, and gas isotherms. Data from CoRE, hMOF, and QMOF. |
| LLM shell | acatome-lambic | Provider-agnostic chat with tool use, thinking mode, and MCP server management. Works with Ollama, OpenAI, Anthropic, and any litellm-compatible provider. |
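To make "semantic search" concrete: the store embeds each paper chunk as a vector and retrieves the nearest neighbours of an embedded query. A toy sketch of that idea with hand-made 3-dimensional vectors (illustrative only — real embeddings come from an embedding model, the nearest-neighbour search happens inside Chroma or pgvector, and none of the names below are acatome APIs):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" keyed by paper slug; in practice these come from an
# embedding model and live in the vector store.
papers = {
    "li2024mof":   [0.9, 0.1, 0.2],
    "chen2023co2": [0.1, 0.8, 0.3],
}
query_vec = [0.2, 0.9, 0.2]  # the embedded search query

# Rank papers by similarity to the query and keep the best match.
best = max(papers, key=lambda slug: cosine(papers[slug], query_vec))
```

Because similarity is measured in embedding space rather than by keyword overlap, a query about "CO2 conversion" can surface a paper that never uses that exact phrase.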

Install

```shell
pip install acatome-chat
# or
uv add acatome-chat
```

That's it. All MCP servers and the paper store are included. Default backend is SQLite + Chroma — no external services needed.

For heavier setups:

```shell
uv add "acatome-chat[postgres]"     # PostgreSQL + pgvector
uv add "acatome-chat[embeddings]"   # sentence-transformers
```

Quick start

1. Build your paper library

Extract PDFs and ingest them into the searchable store:

```shell
# Extract a single PDF (or a whole directory)
acatome-extract extract paper.pdf
acatome-extract extract ~/Downloads/papers/

# Ingest extracted bundles into the searchable store
acatome-store ingest ~/.acatome/papers/
```

Optional enrichment steps:

```shell
# Watch a folder for new PDFs (auto-extracts on arrival)
acatome-extract watch ~/Downloads/papers/

# Add LLM-generated summaries to your bundles
acatome-extract enrich ~/.acatome/papers/
```
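The RAKE keywords mentioned earlier rest on a simple observation: keywords rarely contain stopwords, so splitting text at stopwords and punctuation yields candidate key phrases. A toy sketch of that first step (illustrative only — acatome-extract's actual implementation, stopword list, and scoring will differ):

```python
import re

STOPWORDS = {"a", "an", "the", "of", "for", "and", "with", "in", "on", "is"}

def candidate_phrases(text):
    """Split text at stopwords and punctuation -- the first step of
    RAKE (Rapid Automatic Keyword Extraction)."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    phrases, current = [], []
    for word in words:
        if word in STOPWORDS:
            # A stopword ends the current candidate phrase.
            if current:
                phrases.append(" ".join(current))
            current = []
        else:
            current.append(word)
    if current:
        phrases.append(" ".join(current))
    return phrases

candidate_phrases("Catalytic conversion of CO2 with a copper catalyst")
```

Full RAKE then scores each phrase by the degree and frequency of its member words, keeping the top-scoring phrases as keywords.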

2. Start the chat

```shell
# Default: local Ollama model (ollama/qwen3.5:9b)
acatome-chat

# Or use Claude / GPT-4o / any litellm provider
acatome-chat --model anthropic/claude-sonnet-4-20250514
acatome-chat --model openai/gpt-4o
acatome-chat --model ollama/llama3.1:8b

# Disable thinking/reasoning mode
acatome-chat --no-think
```
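Model specs follow litellm's `provider/model` convention: everything before the first slash names the provider, the rest is the model name. A hypothetical helper showing how such a spec decomposes (`split_model_spec` is an illustrative name, not part of acatome-chat):

```python
def split_model_spec(spec):
    """Split a litellm-style "provider/model" spec into its parts.

    Only the first "/" separates provider from model, so Ollama tags
    like "ollama/llama3.1:8b" keep their colon-suffixed tag intact.
    """
    provider, _, model = spec.partition("/")
    if not model:
        raise ValueError(f"expected 'provider/model', got {spec!r}")
    return provider, model
```

For example, `split_model_spec("openai/gpt-4o")` yields the provider `openai` and the model `gpt-4o`.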

3. Use slash commands

The shell provides slash commands with tab completion:

| Command | What it does |
|---|---|
| `/tools` | List all available MCP tools |
| `/status` | Show the connected model and servers |
| `/model <spec>` | Switch LLM provider on the fly |
| `/think` / `/nothink` | Toggle reasoning mode |
| `/quit` | Exit the shell |
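A shell like this can route commands through a simple name-to-handler map. A hedged sketch of that pattern (not acatome-lambic's actual code; every name below is invented for illustration):

```python
def make_dispatcher(commands):
    """Route "/name arg..." input lines to handler functions."""
    def dispatch(line):
        name, _, arg = line.strip().partition(" ")
        handler = commands.get(name)
        if handler is None:
            return f"unknown command: {name}"
        return handler(arg)
    return dispatch

# Toy session state and handlers standing in for a real shell.
state = {"model": "ollama/qwen3.5:9b"}

def set_model(spec):
    state["model"] = spec
    return f"model set to {spec}"

dispatch = make_dispatcher({
    "/model": set_model,
    "/status": lambda _arg: f"model: {state['model']}",
})

dispatch("/model openai/gpt-4o")
```

Unknown names fall through to an error message rather than reaching the LLM, which is what makes tab completion over the command map straightforward.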

4. Talk to your papers

After ingesting papers, the LLM has direct access to your library. Example prompts:

› Find papers about CO2 conversion and write a summary with citations into co2review.docx
› Search for MOFs with high CO2 uptake and compare their pore sizes
› Read the abstract of li2024mof and summarize the key findings
› Open my draft.docx and add a new section about these results
› Search the web for recent advances in direct air capture

The assistant can read your papers, search the web, query chemistry databases, and write results directly into .docx or .tex files — all in one conversation.

Note: Document editing supports the .docx and .tex formats. For Word files, edits are written as tracked changes. Word may overwrite the file if it is open, so save and close Word before asking the assistant to edit it.

Architecture

```
acatome-chat (you are here)
├── acatome-lambic        LLM shell engine (MCP client)
├── acatome-mcp           Paper query MCP server
│   └── acatome-store     SQLite/Postgres + Chroma/pgvector storage
│       └── acatome-meta  Shared config and metadata
├── acatome-extract       PDF → structured bundle pipeline
│   └── precis-summary    RAKE keyword extraction
├── precis-mcp            Document editor MCP server
├── perplexity-sonar-mcp  Web search MCP server
├── catapult-mcp          Catalysis database MCP server
│   └── chemdb-common     Shared chemistry DB utilities
└── grandmofty-mcp        MOF database MCP server
    └── chemdb-common
```

Environment variables

| Variable | Needed for | Purpose |
|---|---|---|
| `PERPLEXITY_API_KEY` | Web search | Perplexity Sonar API key |
| `OPENAI_API_KEY` | OpenAI models | OpenAI API key |
| `ANTHROPIC_API_KEY` | Anthropic models | Anthropic API key |

For local models via Ollama, no API keys are needed.
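A small pre-flight check you might run before launching, covering just the model-provider keys (a hypothetical helper, not part of acatome-chat; the provider-to-key mapping mirrors the table above):

```python
import os

# Which API key each provider prefix needs; local Ollama needs none.
REQUIRED_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
}

def missing_key(model_spec, env=None):
    """Return the name of a required-but-unset API key, or None."""
    env = os.environ if env is None else env
    provider = model_spec.split("/", 1)[0]
    key = REQUIRED_KEYS.get(provider)
    if key is not None and key not in env:
        return key
    return None
```

For example, `missing_key("openai/gpt-4o")` reports `OPENAI_API_KEY` when that variable is unset, while any `ollama/...` spec passes unconditionally.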

License

GPL-3.0-or-later
