# Scribe
Local, privacy-preserving meeting transcription and minutes for macOS. Nothing leaves your machine.
Drop an audio or video file into the app, get a transcript and structured meeting minutes back. Record directly in the app if you prefer. Everything runs on your hardware — no cloud APIs, no data exfiltration, no subscriptions.
## What It Does
- Speech-to-text via Parakeet TDT — Nvidia's speech model running on Apple MLX (GPU-accelerated on Apple Silicon). ~600MB model, processes audio in 2-minute chunks.
- Meeting minutes via any OpenAI-compatible LLM — takes the transcript and produces attendees, agenda, decisions, action items. Works with local models (Qwen, Llama, Mistral via mlx-lm/ollama/vLLM) or remote APIs.
- MCP server for LLM tools — Claude Code, Gemini CLI, or any MCP client can transcribe files and generate minutes directly.
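
To make the chunked-transcription step concrete: long recordings are processed in fixed 2-minute windows. The sketch below shows how such window boundaries can be computed; the function name and boundary handling are illustrative assumptions, not Scribe's actual code.

```python
# Illustrative sketch: split an audio duration into fixed 2-minute windows,
# the way a chunked STT pipeline batches long recordings.
# Chunk size and boundary handling are assumptions, not Scribe's code.

def chunk_bounds(duration_s: float, chunk_s: float = 120.0) -> list[tuple[float, float]]:
    """Return (start, end) second pairs covering the full duration."""
    bounds = []
    start = 0.0
    while start < duration_s:
        bounds.append((start, min(start + chunk_s, duration_s)))
        start += chunk_s
    return bounds

print(chunk_bounds(300.0))  # a 5-minute file
# → [(0.0, 120.0), (120.0, 240.0), (240.0, 300.0)]
```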
## Install

Requirements: macOS 15+ on Apple Silicon, Python 3.12+, ffmpeg.

```shell
# Install ffmpeg if you don't have it
brew install ffmpeg

# Install the backend
pip install scribe-minutes
# or: uv pip install scribe-minutes

# Start it
scribe
# → Backend running at http://localhost:8890
```
### From source

```shell
git clone https://github.com/aaronmontgomery/scribe.git
cd scribe
pip install -e .  # or: uv pip install -e .
scribe
```
## macOS App

The native SwiftUI app provides drag-drop, recording, and a two-pane results view. Requires xcodegen.

```shell
brew install xcodegen
make app-install  # builds and copies to /Applications
```

Or build manually:

```shell
cd app
xcodegen generate
xcodebuild -project Scribe.xcodeproj -scheme Scribe_macOS -destination 'platform=macOS' build
```
## Configuration

All configuration is via environment variables. None are required — the backend works out of the box for transcription. Minutes generation needs an LLM.

| Variable | Default | Description |
|---|---|---|
| `SCRIBE_LLM_URL` | `http://127.0.0.1:18080/v1/chat/completions` | OpenAI-compatible chat endpoint for minutes |
| `SCRIBE_LLM_MODEL` | `mlx-community/Qwen3.5-35B-A3B-4bit` | Model name to request |
| `SCRIBE_LLM_HOST` | (empty) | SSH host for auto-tunnel to a remote LLM (e.g. `my-server`) |
| `SCRIBE_LLM_REMOTE_PORT` | `8080` | LLM port on the remote host |
| `SCRIBE_LLM_LOCAL_PORT` | `18080` | Local port for the SSH tunnel |
| `SCRIBE_LLM_TIMEOUT` | `300` | LLM request timeout (seconds) |
| `SCRIBE_RAG_URL` | (empty) | Optional RAG MCP endpoint for name/role correction |
| `SCRIBE_DEVONTHINK_GROUP` | (empty) | DEVONthink group UUID for export (saves to the inbox if unset) |
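
Whatever endpoint `SCRIBE_LLM_URL` points at, it receives a standard OpenAI-style chat-completions request. The sketch below shows roughly what such a payload looks like; the helper name, system prompt, and temperature are illustrative assumptions, not Scribe's actual prompt or client code.

```python
import json
import os

# Sketch of the chat-completions payload an OpenAI-compatible endpoint
# expects. The system prompt here is illustrative; Scribe's real prompt differs.
def build_minutes_request(transcript: str) -> dict:
    return {
        "model": os.environ.get("SCRIBE_LLM_MODEL",
                                "mlx-community/Qwen3.5-35B-A3B-4bit"),
        "messages": [
            {"role": "system",
             "content": "Produce meeting minutes: attendees, agenda, "
                        "decisions, action items."},
            {"role": "user", "content": transcript},
        ],
        "temperature": 0.2,
    }

payload = build_minutes_request("Alice: let's ship Friday. Bob: agreed.")
print(json.dumps(payload, indent=2))
```

Any server that accepts this shape — mlx-lm, Ollama, vLLM, or a hosted API — can act as the minutes backend.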
### Example setups

Local LLM via Ollama:

```shell
export SCRIBE_LLM_URL=http://127.0.0.1:11434/v1/chat/completions
export SCRIBE_LLM_MODEL=qwen2.5:32b
scribe
```

Remote LLM via SSH tunnel:

```shell
export SCRIBE_LLM_HOST=my-gpu-server
export SCRIBE_LLM_REMOTE_PORT=8080
scribe
# Backend auto-creates: ssh -N -L 18080:127.0.0.1:8080 my-gpu-server
```

Transcription only (no LLM needed):

```shell
scribe
# Use "Transcript Only" mode in the app, or the /transcribe API endpoint
```
## MCP Integration

LLM tools (Claude Code, etc.) can transcribe files directly.

Add to your MCP config:

```json
{
  "scribe": {
    "type": "http",
    "url": "http://127.0.0.1:8890/mcp"
  }
}
```
Tools:

- `transcribe_file(path, format)` — speech-to-text (txt, srt, vtt, json)
- `generate_minutes(path)` — transcript + structured minutes
- `batch_minutes(paths)` — process multiple files
- `get_latest_output()` — read the most recent result
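
MCP's HTTP transport speaks JSON-RPC 2.0, so a client invoking one of these tools sends roughly the request below. This is a sketch of the wire format only — a real MCP client first performs an initialize handshake, and the example file path is a placeholder.

```python
import json

# Sketch of an MCP tools/call request body (JSON-RPC 2.0). A real client
# performs an initialize handshake first; this shows only the tool invocation.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "transcribe_file",
        "arguments": {"path": "/path/to/meeting.m4a", "format": "txt"},
    },
}
print(json.dumps(request))
```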
## API

| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Backend status |
| `/transcribe` | POST | Upload file → transcript |
| `/transcribe/path?path=...` | POST | Local file → transcript |
| `/minutes` | POST | Upload file → transcript + minutes |
| `/minutes/path?path=...` | POST | Local file → transcript + minutes |
| `/batch/minutes` | POST | Multiple files → batch results |
| `/output/list` | GET | List saved outputs |
| `/output/latest` | GET | Latest output as markdown |
| `/export/devonthink` | POST | Save to DEVONthink (macOS) |
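
The path-based endpoints can be driven from any HTTP client. A minimal stdlib sketch, assuming the backend is running on its default port (the audio file path is a placeholder):

```python
import json
import urllib.parse
import urllib.request

# Minimal REST client sketch for the path-based endpoints. Assumes the
# backend is listening on localhost:8890; the audio path is a placeholder.
BASE = "http://127.0.0.1:8890"

def minutes_url(path: str) -> str:
    """Build the /minutes/path URL with the file path query-encoded."""
    return f"{BASE}/minutes/path?" + urllib.parse.urlencode({"path": path})

def request_minutes(path: str) -> dict:
    """POST the local file path to the backend and return the JSON result."""
    req = urllib.request.Request(minutes_url(path), method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

print(minutes_url("/Users/me/meeting recording.m4a"))
# → http://127.0.0.1:8890/minutes/path?path=%2FUsers%2Fme%2Fmeeting+recording.m4a
```

Query-encoding the path matters: spaces and slashes in real file paths break the request otherwise.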
## Architecture

```
┌──────────────────────────────────┐
│         SwiftUI App              │ Native macOS: drag-drop, recording,
│   Record / Open / Batch          │ Liquid Glass UI, keyboard shortcuts
└──────────┬───────────────────────┘
           │ localhost:8890
┌──────────▼───────────────────────┐
│   Python FastAPI Backend         │
│   ├─ Parakeet TDT (STT on MLX)   │ ~600MB, runs on Apple GPU
│   ├─ ffmpeg (audio extraction)   │ video → 16kHz mono WAV
│   ├─ LLM client (minutes)        │ any OpenAI-compatible endpoint
│   └─ /mcp endpoint               │ MCP server for LLM tools
└──────────────────────────────────┘
```
Three ways to use the same backend:
- SwiftUI app — drag-drop GUI
- MCP tools — Claude/Gemini/Codex transcribe via tool calls
- REST API — curl, scripts, integrations
## macOS App Features
- Drag-drop or Cmd+O for single files, Cmd+Shift+O for batch
- Built-in audio recorder (click Record, speak, get minutes)
- Liquid Glass UI (macOS 26+)
- Two-pane results: transcript left, rendered markdown minutes right
- Keyboard shortcuts: Cmd+S (save), Cmd+E (DEVONthink), Cmd+Shift+C (copy minutes), Cmd+N (new)
- Drag transcript/minutes directly into other apps
- Recent transcriptions on the home screen
- System notification when processing completes
- DEVONthink export (macOS)
## License

MIT