# voCLI

Local voice layer for AI coding tools — a 100% offline voice assistant for Claude Code.
Don't just type to your AI coding tool. Talk to it.
## What is voCLI?

You speak. Claude listens. Claude responds. You hear it.

voCLI adds a voice layer to Claude Code. Everything runs locally on your machine — no audio is sent to the cloud, ever.

```
You speak       --> Mic --> faster-whisper (STT) --> Text to Claude
Claude responds --> Kokoro TTS --> Audio plays through speakers
```
## Highlights
- Privacy-first — all audio processing stays on your machine
- Works offline — after initial model download, no internet needed
- Personalized — set your name and the assistant's name
- High-quality voice — natural-sounding Kokoro TTS with 27 voices to choose from
- Smart performance — auto-detects Apple Silicon vs Intel for optimal speed
- Remote-ready — use voice even when Claude Code runs on a remote VM
## Quick Start

### Recommended: Claude Code Plugin Marketplace

The easiest way to install voCLI:
```
# Step 1: Add the marketplace
/plugin marketplace add voCLI/voCLI

# Step 2: Install the plugin
/plugin install vocli@vocli

# Step 3: Run the setup wizard
/vocli:install

# Step 4: Start talking!
/vocli:talk
```
The install wizard walks you through everything: dependencies, models, and configuration.
### Alternative: Manual Install via UV
```shell
# Install the UV package manager (if needed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Add voCLI as an MCP server
claude mcp add --scope user vocli -- uvx --refresh vocli serve

# Restart Claude Code, then run:
/vocli:install
```
## Local Install

For running Claude Code and voCLI on the same machine (Mac or Linux with a microphone and speakers).

The `/vocli:install` wizard handles everything:
| Step | What happens |
|---|---|
| 1 | Checks Python 3.10+ and ffmpeg |
| 2 | Installs faster-whisper and kokoro-onnx |
| 3 | Downloads Kokoro voice model (~325MB) and Whisper model (~500MB) |
| 4 | Detects your CPU architecture (Apple Silicon / Intel / ARM) |
| 5 | Lets you choose model size (tiny for speed, small for accuracy) |
| 6 | Asks for your name, assistant name, and preferences |
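Step 1's checks can be approximated with a short Python sketch. This is an illustration of what the wizard verifies, not voCLI's actual code; `check_prereqs` is a hypothetical helper name:

```python
import shutil
import sys

def check_prereqs() -> list[str]:
    """Return a list of missing prerequisites (mirrors wizard step 1)."""
    problems = []
    # voCLI requires Python 3.10 or newer.
    if sys.version_info < (3, 10):
        problems.append("Python 3.10+ is required")
    # ffmpeg must be discoverable on PATH for audio handling.
    if shutil.which("ffmpeg") is None:
        problems.append("ffmpeg was not found on PATH")
    return problems

if __name__ == "__main__":
    print(check_prereqs() or "all prerequisites satisfied")
```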
**Performance note:** On Apple Silicon, voCLI automatically uses float16 for faster speech recognition. On Intel, it uses int8.
## Remote Install

For running Claude Code on a remote VM while keeping voice on your local machine.

### Why?
Remote VMs have no microphone or speakers. voCLI's remote mode runs the audio servers on your local machine and connects them to Claude Code on the VM via the network.
### How it works

```
Your Mac/Linux (local)                Remote VM
+------------------+              +------------------+
|   STT Server     |  <-- network |   Claude Code    |
|   (port 2022)    |  ----------> |  + voCLI Plugin  |
|                  |              |                  |
|   TTS Server     |  <-- network |   /vocli:talk    |
|   (port 8880)    |  ----------> |   just works!    |
|                  |              |                  |
|  Mic + Speakers  |              |   No audio deps  |
+------------------+              +------------------+
```
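If remote mode misbehaves, the first thing to check is whether the VM can actually reach those two ports. A small debugging sketch (`reachable` is a hypothetical helper, not part of voCLI's API):

```python
import socket

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the default STT and TTS ports from the VM's point of view.
for name, port in (("STT", 2022), ("TTS", 8880)):
    status = "up" if reachable("localhost", port) else "down"
    print(f"{name} server (port {port}): {status}")
```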
### Setup

Install the plugin on your remote VM, then run the remote setup wizard, `/vocli:remote-install`. It walks you through everything: starting the servers on your local machine, entering their URLs, and configuring your preferences.
**Tip:** If your VM can't reach your local machine directly, use SSH reverse port forwarding:

```shell
ssh -R 2022:localhost:2022 -R 8880:localhost:8880 your-vm
```

Then use `http://localhost:2022` and `http://localhost:8880` as the URLs.
## Commands

| Command | Description |
|---|---|
| `/vocli:install` | Install dependencies, download models, configure (local mode) |
| `/vocli:remote-install` | Set up remote mode with external STT/TTS servers |
| `/vocli:config` | Change assistant name, your name, preferences |
| `/vocli:talk` | Start a voice conversation |
## MCP Tools

voCLI runs as an MCP server with three tools:

| Tool | What it does |
|---|---|
| `talk` | Speak a message and optionally listen for a reply. Auto-starts servers if needed. |
| `status` | Check health of STT/TTS servers and show current config. |
| `service` | Start, stop, or restart STT/TTS servers. |
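Under the hood, MCP tools are invoked over JSON-RPC 2.0 with the `tools/call` method. As a rough illustration, a client request for the `talk` tool might look like the payload below; the argument names (`message`, `listen`) are assumptions for illustration, not voCLI's documented schema:

```python
import json

# Illustrative JSON-RPC 2.0 payload for an MCP tools/call request.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "talk",
        # Hypothetical arguments: speak a message, then listen for a reply.
        "arguments": {"message": "Build finished successfully", "listen": True},
    },
}
print(json.dumps(request, indent=2))
```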
## Requirements
| Requirement | Details |
|---|---|
| Python | 3.10 or higher |
| OS | macOS (Apple Silicon or Intel) or Linux |
| Disk space | ~900MB for models |
| Audio | Microphone + speakers |
## License
MIT
## Download files
### File details: vocli-0.2.4.tar.gz

- Size: 133.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f7430b52e71cf1d04f7b0ba490b75f84c9498755a9c7e6edafca65d457b49102` |
| MD5 | `bce629c74fb5c3fee24e5d92bb6aa3b4` |
| BLAKE2b-256 | `012578213374854079154771327302761a74d7d5fa4ac3d3b820f4c928a1946c` |
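To verify a downloaded file against the published SHA256 digest, a short standard-library sketch:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 64 KiB chunks so large files don't load into memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the digest published above, e.g.:
# sha256_of("vocli-0.2.4.tar.gz") == "f7430b52e71cf1d04f7b0ba490b75f84c9498755a9c7e6edafca65d457b49102"
```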
#### Provenance

The following attestation bundle was made for vocli-0.2.4.tar.gz:

- Publisher: publish.yml on voCLI/voCLI
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: vocli-0.2.4.tar.gz
- Subject digest: f7430b52e71cf1d04f7b0ba490b75f84c9498755a9c7e6edafca65d457b49102
- Sigstore transparency entry: 1280371094
- Permalink: voCLI/voCLI@5fa72d245a2082a25e48f548208a435eb18e8665
- Branch / Tag: refs/tags/v0.2.4
- Owner: https://github.com/voCLI
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5fa72d245a2082a25e48f548208a435eb18e8665
- Trigger Event: release
### File details: vocli-0.2.4-py3-none-any.whl

- Size: 22.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f7685b93d2bf7e041b0c5ac76ec2007346475422c36d4577859aec82bd6ad8eb` |
| MD5 | `c659e3d8676f31766b3e6b5357a41abd` |
| BLAKE2b-256 | `ea22730b8f08632addbe9cf09aaab67b26b6e24cc4334472f80627ff7176b686` |
#### Provenance

The following attestation bundle was made for vocli-0.2.4-py3-none-any.whl:

- Publisher: publish.yml on voCLI/voCLI
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: vocli-0.2.4-py3-none-any.whl
- Subject digest: f7685b93d2bf7e041b0c5ac76ec2007346475422c36d4577859aec82bd6ad8eb
- Sigstore transparency entry: 1280371100
- Permalink: voCLI/voCLI@5fa72d245a2082a25e48f548208a435eb18e8665
- Branch / Tag: refs/tags/v0.2.4
- Owner: https://github.com/voCLI
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5fa72d245a2082a25e48f548208a435eb18e8665
- Trigger Event: release