
Acatome research assistant — acatome-lambic shell with MCP tools

Project description

acatome-chat

A local-first AI research assistant for scientific literature.

One install gives you an interactive shell backed by a local LLM, a searchable paper library, document editing, live web search, and domain-specific databases — all wired together through the Model Context Protocol (MCP).

What you get

| Capability | Powered by | What it does |
| --- | --- | --- |
| Paper library | acatome-mcp + acatome-store | Semantic search over your papers. Navigate by slug, DOI, or arXiv ID. Read abstracts, TOCs, full chunks, and figures. Add notes. |
| PDF extraction | acatome-extract | Drop in a PDF, get structured text with metadata lookup (CrossRef + Semantic Scholar), RAKE keywords, and optional LLM summaries. Supports articles, datasheets, and tech reports. |
| Document writing | precis-mcp | Open, navigate, and edit Word (.docx) and LaTeX (.tex) documents. Tracked changes in Word. Auto-numbered headings. Citation support. |
| Web search | perplexity-sonar-mcp | Live web queries via Perplexity Sonar: quick lookups, deep research with citations, academic and finance focus modes. |
| Catalysis DB | catapult-mcp | Query DFT reaction energies, activation barriers, and catalyst comparisons from CatHub and Materials Project. |
| MOF DB | grandmofty-mcp | Search Metal-Organic Frameworks by pore size, surface area, void fraction, and gas isotherms. Data from CoRE, hMOF, QMOF. |
| LLM shell | acatome-lambic | Provider-agnostic chat with tool use, thinking mode, and MCP server management. Works with Ollama, OpenAI, Anthropic, and any litellm-compatible provider. |

Install

```shell
pip install acatome-chat
# or
uv add acatome-chat
```

That's it. All MCP servers and the paper store are included. Default backend is SQLite + Chroma — no external services needed.

For heavier setups:

```shell
uv add "acatome-chat[postgres]"     # PostgreSQL + pgvector
uv add "acatome-chat[embeddings]"   # sentence-transformers
```

Quick start

```shell
# Start the shell (default: ollama/qwen3.5:9b)
acatome-chat

# Use a different model
acatome-chat --model ollama/llama3.1:8b
acatome-chat --model anthropic/claude-sonnet-4-20250514
acatome-chat --model openai/gpt-4o

# Disable thinking/reasoning mode
acatome-chat --no-think
```
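If you keep reaching for the same flags, a small wrapper function keeps invocations short. This is a hypothetical convenience function, not part of acatome-chat; the only thing assumed to exist is the `--model` flag shown above:

```shell
# Hypothetical wrapper: launch acatome-chat with a per-project default model.
# Override the default by setting ACATOME_MODEL in the environment.
achat() {
  acatome-chat --model "${ACATOME_MODEL:-ollama/llama3.1:8b}" "$@"
}
```

Then `ACATOME_MODEL=openai/gpt-4o achat --no-think` starts a session against a different provider without retyping the full command.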

Build your paper library

```shell
# Extract a single PDF
acatome-extract extract paper.pdf

# Watch a folder for new PDFs
acatome-extract watch ~/Downloads/papers/

# Enrich with LLM summaries
acatome-extract enrich ~/.acatome/papers/

# Ingest into the searchable store
acatome-store ingest ~/.acatome/papers/
```
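The steps above can be chained into a single batch script. A sketch, assuming only the subcommands shown above and the default `~/.acatome/papers/` bundle directory; the `command -v` guard simply skips the run on machines where acatome-chat isn't installed:

```shell
# Batch sketch: extract every PDF in a source folder, enrich, then ingest.
PAPER_SRC="${1:-$HOME/Downloads/papers}"
BUNDLE_DIR="$HOME/.acatome/papers"

if command -v acatome-extract >/dev/null 2>&1; then
  for pdf in "$PAPER_SRC"/*.pdf; do
    [ -e "$pdf" ] || continue          # glob matched nothing: skip
    acatome-extract extract "$pdf"     # PDF -> structured bundle
  done
  acatome-extract enrich "$BUNDLE_DIR/"  # optional LLM summaries
  acatome-store ingest "$BUNDLE_DIR/"    # index into the searchable store
fi
```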

Talk to your papers

Once inside the shell, the LLM has direct access to your library:

```
> Find papers about CO2 capture in MOFs from 2020 onwards
> Read the abstract of li2024mof
> Summarize the key findings from chunks 12-18
> Search the web for recent advances in direct air capture
> Open my review draft and add a paragraph about these results
```

Architecture

```
acatome-chat (you are here)
├── acatome-lambic        LLM shell engine (MCP client)
├── acatome-mcp           Paper query MCP server
│   └── acatome-store     SQLite/Postgres + Chroma/pgvector storage
│       └── acatome-meta  Shared config and metadata
├── acatome-extract       PDF → structured bundle pipeline
│   └── precis-summary    RAKE keyword extraction
├── precis-mcp            Document editor MCP server
├── perplexity-sonar-mcp  Web search MCP server
├── catapult-mcp          Catalysis database MCP server
│   └── chemdb-common     Shared chemistry DB utilities
└── grandmofty-mcp        MOF database MCP server
    └── chemdb-common
```
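Since each box in the tree is its own package (assuming the names above are also the published PyPI package names), you can quickly check which components are present in a given environment:

```shell
# Tally which acatome components are installed in the active environment.
found=0
missing=0
for pkg in acatome-lambic acatome-mcp acatome-store acatome-extract \
           precis-mcp perplexity-sonar-mcp catapult-mcp grandmofty-mcp; do
  if pip show "$pkg" >/dev/null 2>&1; then
    found=$((found + 1))
  else
    missing=$((missing + 1))
  fi
done
echo "found=$found missing=$missing"
```

In a normal `pip install acatome-chat` setup you would expect all eight to be present, since the README says every server ships with the one install.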

Environment variables

| Variable | Required | Purpose |
| --- | --- | --- |
| PERPLEXITY_API_KEY | For web search | Perplexity Sonar API key |
| OPENAI_API_KEY | For OpenAI models | OpenAI API key |
| ANTHROPIC_API_KEY | For Anthropic models | Anthropic API key |

For local models via Ollama, no API keys are needed.
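A typical setup for a hosted-model session exports the relevant keys before launching the shell. The key values below are placeholders, not real credentials:

```shell
# Placeholders -- substitute your real keys. Export only what you use;
# an Ollama-only setup needs none of these.
export PERPLEXITY_API_KEY="pplx-xxxxxxxx"   # web search via Sonar
export ANTHROPIC_API_KEY="sk-ant-xxxxxxxx"  # Anthropic models
export OPENAI_API_KEY="sk-xxxxxxxx"         # OpenAI models
```

With the keys in place, `acatome-chat --model anthropic/claude-sonnet-4-20250514` (or any other provider-prefixed model) picks them up from the environment.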

License

MIT

Download files

Download the file for your platform.

Source Distribution

acatome_chat-0.2.2.tar.gz (6.4 kB, Source)

Built Distribution

acatome_chat-0.2.2-py3-none-any.whl (7.2 kB, Python 3)

File details

Details for the file acatome_chat-0.2.2.tar.gz.

File metadata

  • Download URL: acatome_chat-0.2.2.tar.gz
  • Upload date:
  • Size: 6.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for acatome_chat-0.2.2.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 13311cfed4af3240756f0bcffa7722425a80b4d567bae04a77ae0b26cd306faf |
| MD5 | d8514eedd590df85889de63f7bec6820 |
| BLAKE2b-256 | 59f819aa26d447d9fa97b4720df01f0314aae29c79c970ed1d04d98b407b775f |

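To check a downloaded sdist against the SHA256 digest listed for this release before installing, a sketch using coreutils `sha256sum` (on macOS, substitute `shasum -a 256`):

```shell
# Digest copied from the release page above.
EXPECTED="13311cfed4af3240756f0bcffa7722425a80b4d567bae04a77ae0b26cd306faf"

# sha256sum prints "<digest>  <filename>"; keep only the digest field.
ACTUAL="$(sha256sum acatome_chat-0.2.2.tar.gz 2>/dev/null | cut -d' ' -f1)"

if [ "$ACTUAL" = "$EXPECTED" ]; then
  echo "sha256 OK"
else
  echo "sha256 mismatch or file missing"
fi
```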

Provenance

The following attestation bundles were made for acatome_chat-0.2.2.tar.gz:

Publisher: publish.yml on retospect/acatome-chat

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file acatome_chat-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: acatome_chat-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 7.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for acatome_chat-0.2.2-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 9dceb1db6c37cb26a146fb6b43defd8448d1397cd0a17f8e5aee2d2995e311bb |
| MD5 | 9fc44756a78ef4642eaad63865c08069 |
| BLAKE2b-256 | 6a3a16866ec1414833f81b5b268a35cbf98f515066583345e4ba55e98d688066 |


Provenance

The following attestation bundles were made for acatome_chat-0.2.2-py3-none-any.whl:

Publisher: publish.yml on retospect/acatome-chat

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
