Design, launch, and track autonomous research experiments — with a paper library built in


Distillate — Orchestrate ML auto-research agents

Your research alchemist. Conjure ML experiments that run themselves. Distill insights from everything you read.

PyPI · Python 3.10+ · License: MIT · distillate.dev

[Screenshot: the Distillate desktop app]

What is Distillate?

Distillate is an open-source research platform that orchestrates autonomous research agents. It turns your research questions into agents that run ML experiments, track results, and report what they find — while keeping your paper library organized with highlights and AI summaries.

At the center is Nicolas, your research alchemist. He spawns research agents that live in tmux sessions, iteratively improving your models and reporting results. He also tends the library — your paper collection, flowing from Zotero through any reading surface (reMarkable, iPad, desktop) into structured notes.

The core loop: read papers, run experiments, see them improve on the chart, distill what you learned, repeat. What you read informs what you try. What you try informs what you read next.

$ distillate

  ─── ⚗️  Nicolas ──────────────────────────────
  Your research alchemist.
  🧪 4 experiments · 12 runs · 1 running
  📚 42 papers read · 7 in queue

> /conjure tiny-matmul --duration 30m
  🧪 Spawning research agent...
  Created distillate-xp-tiny-matmul
  Research agent spawned — 30 min budget, will report when done.

> /distill tiny-matmul
  🔬 Distilling 8 runs...
  Best: run-7 (loss 0.0023, -42% from baseline)
  Key insight: block size 64 with gradient accumulation
  outperforms larger batches on this scale.

Skills

Nicolas responds to 9 skills, organized into two roles:

The Laboratory 🧪

  Skill        Description
  /conjure     Summon a research agent — launch an experiment from a research question
  /steer       Guide a running agent — adjust goals or change direction
  /assay       Deep analysis of experiment results with cross-run comparison
  /distill     Extract insights from an experiment's session histories
  /survey      Scan all experiments for new runs and breakthroughs
  /transmute   Turn paper insights into experiment ideas

The Library 📚

  Skill        Description
  /brew        Sync papers, process highlights, refresh the library
  /forage      Discover trending papers and reading suggestions
  /tincture    Deep extraction from a single paper's highlights and notes

Quick Start

Install

pip install distillate
# or
uv pip install distillate

Requirements

  • Claude Code (claude CLI) — Distillate runs through your Claude Code subscription. No separate API key needed.
  • Zotero — for paper management (optional if you only run experiments)

Launch

distillate          # Start the Nicolas REPL
distillate --init   # Run the setup wizard (first time)
distillate --sync   # Classic sync-only workflow

Or use the desktop app for a full IDE experience.

Desktop App

The Distillate desktop app provides an IDE-style layout with four tabs:

  • Control Panel — metric chart, session timer, goal tracking, experiment overview
  • Session — live terminal attached to the running Claude Code agent
  • Results — runs grid with research insights (key breakthrough, lessons learned, dead ends)
  • Prompt — view and edit PROMPT.md with markdown rendering

New users get one-click onboarding: launch a demo experiment from the sidebar, or connect your Zotero library from the papers panel. Context-aware suggestions in the chat adapt to what you're looking at — experiment-specific actions when viewing an experiment, paper-specific actions when viewing a paper.

The desktop app connects to the same backend as the CLI — everything stays in sync.

How It Works

The core research loop:

  1. 📜 Add papers — Save papers to Zotero, read and highlight on any device. Nicolas extracts highlights, generates summaries, and builds your knowledge base.

  2. ⚗️ Conjure experiments — Describe a research question or point at a paper. Nicolas drafts the prompt, sets up a git repo, and spawns an autonomous research agent to run it.

  3. 🔬 Distill insights — As experiments run, Nicolas tracks every iteration with metrics, diffs, and decisions. Distill the results to see what worked, what didn't, and why.

  4. 🔗 Connect the dots — Link papers to experiments. When a run implements a technique from a paper, credit it with inspired_by. Use /transmute to turn paper insights into experiment ideas. What you read informs what you try next.

Every experiment lives in a git repo. Every paper lives in your Zotero library. Notes are plain markdown. There's no lock-in — Distillate enhances your existing tools.
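The on-disk format for these paper-to-experiment links isn't specified here. Purely as an illustration — every field name below except the idea of an inspired_by link is hypothetical — a run's metadata could record the connection as plain JSON in the experiment's repo, which any tool can read back:

```python
import json

# Hypothetical run metadata; only the notion of an `inspired_by` link
# comes from the docs above — field names and layout are illustrative.
record = {
    "run": "run-7",
    "metrics": {"loss": 0.0023},
    "inspired_by": "zotero-item-key-of-the-source-paper",
}

# Plain JSON (like plain markdown notes) means no lock-in:
# serialize to disk, restore anywhere.
serialized = json.dumps(record, indent=2)
restored = json.loads(serialized)
print(restored["inspired_by"])
```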

Configuration

All settings live in ~/.config/distillate/.env. See .env.example for the full list.

The setup wizard (distillate --init) walks you through connecting Zotero, choosing a reading surface, and configuring optional features.
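As a rough sketch only — the variable names below are illustrative placeholders, not the documented keys; check .env.example for the real list — a minimal ~/.config/distillate/.env might look like:

```
# Illustrative values only; see .env.example for the actual variable names.
ZOTERO_API_KEY=your-zotero-api-key
ZOTERO_USER_ID=1234567
READING_SURFACE=remarkable
```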

For advanced configuration, engagement scores, scheduling, and GitHub Actions automation — see the Power users guide.

Development

git clone https://github.com/rlacombe/distillate.git
cd distillate
uv venv --python 3.12
source .venv/bin/activate
uv pip install -e .
pytest tests/

License

MIT

Download files

Download the file for your platform.

  • Source distribution: distillate-0.42.0.tar.gz (2.2 MB)
  • Built distribution: distillate-0.42.0-py3-none-any.whl (356.8 kB)

File details

distillate-0.42.0.tar.gz

  • Size: 2.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

  Algorithm    Hash digest
  SHA256       90c4157f8d3e22108ff12f1cdac82b0d820256393c050641e620b89f3e5663d6
  MD5          1b6aafb6b69d1317d424d0019e20d223
  BLAKE2b-256  be5cd75af63afdfb46ce62ea7ca64919ec724b0ed4b80fe0ad9b348720f95341

Provenance

The following attestation bundles were made for distillate-0.42.0.tar.gz:

Publisher: publish.yml on rlacombe/distillate

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

distillate-0.42.0-py3-none-any.whl

  • Size: 356.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

  Algorithm    Hash digest
  SHA256       aacaf2f0bf28d51544039442aa1bddbbb8c335b32370cdc0ba400ee9da3cedf3
  MD5          63cf9093ab70ca35160c59375f4a0790
  BLAKE2b-256  cff7d83199357e46e87e0023f5e36e9a60f82d9f00a2eee779a9faaa90d42e70

Provenance

The following attestation bundles were made for distillate-0.42.0-py3-none-any.whl:

Publisher: publish.yml on rlacombe/distillate

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
